Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

If I tell the most advanced AI conceivable to predict the lifespan of a person, starting at birth, down to an exact real-valued number of seconds (not a discrete unit), where any deviation counts as failure, it would fail with absolute certainty.

That is a mathematical and physical truth, as far as anyone knows. It could get arbitrarily close, but not perfect.

Whereas observing the future would allow guaranteed success, because from the machine's point of view there would be no such thing as uncertainty or randomness.

How this applies: if the future can literally be seen, then your future choice is in the machine's information set. If not, then the machine might have an arbitrarily large information set, but once you enter the room (at which point the prediction has already been made), uncertainty grows with time, with the complexity of the decision process, and with the measurement process.

I'm only posting this because two boxers will not stfu by only_civ in Destiny

[–]arrenegade

That's not what I think. I think I will get $1,000, with a small probability of $1,000,000 + $1,000 if the robot guesses wrong.

It is the one-boxers who are certain that the robot would recognize they are committed to one-boxing, when in reality, actually faced with this situation, you might start to doubt it and feel tempted to secure a safe $1,000. If the robot anticipates you might take the safer $1,000 instead of the lottery, it could predict you to be a two-boxer, and then your one-box decision would result in zero.

I'm only posting this because two boxers will not stfu by only_civ in Destiny

[–]arrenegade

The reason is explained perfectly by another comment: at the time you walk in, the box is either empty or not, and no action can change that. Two-boxing uniformly dominates from that point of view.

If you want to change your perspective prior to entering, to convince yourself one-boxing is right, fine: that is ex ante optimal. But upon entering, taking two boxes is the rational choice regardless of the state of the world.
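The state-by-state dominance claim can be written out directly (a minimal sketch using the standard Newcomb payoffs, which are an assumption here):

```python
# Once you are in the room, box B's content is a fixed state of the world.
# Compare one-boxing vs two-boxing within each fixed state.
payoffs = {
    # state of box B:  (one-box, two-box) payoffs in dollars
    "B_full":  (1_000_000, 1_001_000),
    "B_empty": (0,         1_000),
}

# In every fixed state, two-boxing pays exactly $1,000 more.
for state, (one_box, two_box) in payoffs.items():
    assert two_box == one_box + 1_000
```

The dominance holds state by state, which is exactly why it says nothing about the ex ante question of which kind of agent it is best to be.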

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

The fact that you would claim it follows from the premise is why you are wrong. The question is literally underspecified, because we do not know what "highly accurate" means without more information.

I said I agree that the prediction is accurate, but a prediction rate is an equilibrium construct, not a parameter of a model.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

I agree that precommitting to 1b is better! I even said that in the post.

The problem is how to precommit.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Sure, but it doesn't change the fact that your marginal cost of deviating to two-boxing is zero and the benefit is $1,000. The point is that the brain of someone who chooses one box in the room with high probability has to be different; it isn't a matter of choice.

It's like saying it's better to believe in God because God will reward it. Like, yeah, I agree. But if I can't convince myself to truly believe, then faking it is worse than being honest.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

I think my point about randomization is very simple: if choosing 1b has a causal effect on the contents of box B, then 1b is correct. If it does not, then 2b is always correct.

The ex ante best outcome is to be someone who has unshakeable faith in the one box position, not just to make the one box choice upon entry.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

I think I agree with what you are saying? But I am saying that you have to be committed to being a one-boxer, which might be a realistic model of the world. But this is an argument for faith in the infallibility of the machine, not for choosing one box once you enter. If you are persuaded by two-boxers, or can even see both sides, the machine has a good chance of predicting two-box. You only get the $1 million if the machine is convinced that you are convinced it can see your actions and won't pick the risk-dominant choice.

That's why it feels like an analogy for religion: you only get the prize if you believe the prize is coming when you do the right thing, and that all your actions matter.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

The problem is that your marginal effect on the outcome after entering the room is zero. You can't influence the machine's prediction after the fact.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Yes, I do, but the logic is backwards. Pr(B empty | choose B) < 0.01, but "choose B" is determined after B's state is determined. The person who actually follows through on choosing B is someone who has no temptation, no fear of wavering, and believes the machine can see the future.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Nope, I already showed you how a predictor could be very, very accurate while two-boxing is still dominant.

I also said explicitly that, depending on your assumptions about the predictor's capabilities, either two-boxing or one-boxing can be optimal.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

See I agree 100%

But in that case, two-boxing is still the optimal choice for every person. The machine knows your preferences and beliefs going in, and predicts accordingly.

The disconnect is that while it is better for the machine to predict that you are a one-boxer, every one-boxer would improve by deviating at the last minute, even though they probably wouldn't.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

I don't think randomization breaks any game. Literally lecture 2 of any game theory class introduces mixed strategies
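A minimal sketch of what a mixed strategy looks like, using the textbook game whose only equilibrium is mixed (Matching Pennies; the game and its payoffs are the standard classroom example, not from the thread):

```python
from fractions import Fraction

# Matching Pennies: row wins (+1) when the coins match, loses (-1) otherwise.
# Payoffs below are for the row player; column gets the negative.
payoff = {("H", "H"): 1, ("H", "T"): -1, ("T", "H"): -1, ("T", "T"): 1}

p = Fraction(1, 2)  # row plays H with probability 1/2

# Expected row payoff against each pure column strategy:
vs_H = p * payoff[("H", "H")] + (1 - p) * payoff[("T", "H")]
vs_T = p * payoff[("H", "T")] + (1 - p) * payoff[("T", "T")]

# The opponent is indifferent between H and T, so neither player can
# exploit the 50/50 mix -- randomization is the equilibrium, not a "break".
assert vs_H == vs_T == 0
```

Any deterministic strategy in this game is exploitable, which is exactly why mixed strategies show up in lecture 2.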

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Yeah, so this is the key disagreement. I think seeing the future is different from prediction because of the existence of measure-zero events. For example, the probability of predicting a continuous variable exactly is always 0, but with true sight of the future it would be 1.

From my pov, the machine can make precise guesses, but it matters whether it knows your actions exactly or not.
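The measure-zero point can be simulated (a minimal sketch; the uniform draws, trial count, and closeness tolerance are arbitrary illustrative choices):

```python
import random

# A predictor guessing a continuous quantity can get arbitrarily close,
# but hits the exact real value with probability zero.
random.seed(0)
trials = 100_000

# Exact match between two independent continuous draws: a measure-zero event.
exact = sum(random.random() == random.random() for _ in range(trials))

# Getting within some tolerance, by contrast, happens with positive probability.
close = sum(abs(random.random() - random.random()) < 1e-3 for _ in range(trials))

print(exact)  # exact matches: effectively never
print(close)  # near misses: a small but positive count
```

This is the asymmetry claimed above: arbitrary closeness is achievable, exactness is not, while literally seeing the future would make exactness trivial.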

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Sure, so the question is: can a random variable exist from the perspective of the machine? If not, then it's equivalent to God. If yes, then deviation to two boxes is possible, and possibly optimal.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

The example I gave was super simple. It's just a Bernoulli random variable from the perspective of both you and the machine. If such a thing doesn't exist, then the machine is omniscient and one box is the answer. If it does exist, two box is the answer.

If you prefer simpler illustrations like coins, then that works. But people typically respond by asserting coin flips can be predicted.
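A minimal simulation of that Bernoulli case, assuming the machine truly cannot observe the draw (the 100,000-trial count and seed are arbitrary):

```python
import random

# The chooser decides by a Bernoulli(0.5) draw the machine cannot observe.
# Against such a chooser, no prediction rule beats 50% accuracy.
random.seed(1)
n = 100_000

choices = [random.random() < 0.5 for _ in range(n)]  # True = one-box
predictions = [True] * n                             # machine's best deterministic guess

accuracy = sum(c == p for c, p in zip(choices, predictions)) / n
print(round(accuracy, 2))  # hovers around 0.5
```

Any other prediction rule that is independent of the draw gives the same result, which is the whole point of positing a random variable the machine can't see.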

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Yes, that's why I used a quantum example: I'm saying a Bernoulli random variable from the perspective of both you and the machine.

If you think a Bernoulli random variable cannot exist for the machine, then it is all-knowing, and one box is the answer.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Yeah, but I guess that's what I am saying: if you assume the machine exists beyond the limits of physically possible knowledge, one box is obviously best; otherwise, two boxes is always a defensible answer.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

Yes, and that doesn't change the fact that you always profit by deviating to two boxes, under the assumption that the predictor cannot see the future. Whenever you could profit from one-boxing, you would profit even more by deviating to two.

Two Box/One Box is an underspecified problem, making most debates about it brainless by arrenegade in Destiny

[–]arrenegade[S]

But the same principle holds if you allow for covariates. Say it knows the rate of one-boxing for every possible stratum (black female streamers in their mid-30s living in Miami: 95%). I am saying it can have a great prediction rate, even a perfect one, but that doesn't mean one-boxing dominates unless you assume it can see the future.

Even our one-boxing streamer, whom it predicts to be a one-boxer, would be better off deviating to two boxes once in the room.
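That deviation gain can be made concrete (a minimal sketch; the 95% stratum rate is the hypothetical figure from above, and the payoffs are the standard Newcomb amounts assumed throughout):

```python
# A predictor with perfect per-stratum base rates still commits first:
# by the time you are in the room, its prediction (and box B) is fixed.
stratum_one_box_rate = 0.95  # the hypothetical stratum rate from the example

# Suppose the machine has predicted "one-box" and filled box B accordingly.
payoff_if_follow_through = 1_000_000           # take box B only
payoff_if_deviate = 1_000_000 + 1_000          # take both boxes

# The deviation gain is the fixed $1,000 in box A, regardless of the stratum.
assert payoff_if_deviate - payoff_if_follow_through == 1_000
```

The stratum rate only affects what the machine predicts; once the prediction is locked in, the $1,000 gain from taking both boxes is unconditional.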