Comments

Comment by joecode on Article upvoting · 2009-07-20T18:15:40.221Z

Rather ironic that a site dedicated to rationality should depend to a large extent on what is, from what I can tell, an irrational impulse to vote, no?

After all, what is the utility of a single vote, unless you believe that everyone will act in the same way? And group behavior is not something you can control.

I suppose it could be useful to remind yourself of the more interesting comments and commenters, at least, as you suggest.

Oh my, in a fit of irrationality, I just voted up your comment.

Comment by joecode on Newcomb's Problem and Regret of Rationality · 2009-07-20T13:10:26.468Z

I've come around to the majority viewpoint on the alien/Omega problem. It seems to be easier to think about when you pin it down a bit more mathematically.

Let's suppose the alien determines the probability of me one-boxing is p. For the sake of simplicity, let's assume he then puts the 1M into one of the boxes with this probability p. (In theory he could do it whenever p exceeded some threshold, but this just complicates the math.)

Therefore, once I encounter the situation, there are two possible states:

a) with probability p there is 1M in one box, and 1k in the other

b) with probability 1-p there is 0 in one box, and 1k in the other

So:

the expected return of two-boxing is p(1M+1k)+(1-p)1k = 1Mp + 1kp + 1k - 1kp = 1Mp + 1k

the expected return of one-boxing is 1Mp
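
To check the algebra, here is a minimal sketch of those two formulas in Python (the function names and the test values are my own illustration, not part of the problem):

```python
M, K = 1_000_000, 1_000  # the 1M and 1k payoffs

def ev_two_box(p):
    # p(1M + 1k) + (1 - p) * 1k, as above
    return p * (M + K) + (1 - p) * K

def ev_one_box(p):
    # p * 1M + (1 - p) * 0
    return p * M

# sanity check: the two-boxing formula simplifies to 1M*p + 1k
for p in (0.0, 0.25, 0.5, 1.0):
    assert abs(ev_two_box(p) - (M * p + K)) < 1e-6
```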

If the act of choosing affects the prior determination p, then the expected return calculation differs depending on my choice:

If I choose to two-box, then p ≈ 0, and I get about 1k on average.

If I choose to one-box, then p ≈ 1, and I get about 1M on average.

In this case, the expected return is higher for one-boxing.

If choosing the box does not affect p, then p is the same in both expected-return calculations. In this case, two-boxing clearly has the better expected return: it beats one-boxing by exactly 1k, whatever p happens to be.
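
Continuing the sketch, the two cases come out as claimed (again, my own illustration under the stated assumptions, not anything from the problem itself):

```python
M, K = 1_000_000, 1_000

def ev_two_box(p):
    return p * (M + K) + (1 - p) * K  # = 1M*p + 1k

def ev_one_box(p):
    return p * M

# Case 1: the choice drives p. Two-boxing pushes p toward 0,
# one-boxing pushes p toward 1, so one-boxing wins.
print(ev_two_box(0.0))  # -> 1000.0     (about 1k)
print(ev_one_box(1.0))  # -> 1000000.0  (about 1M)

# Case 2: p is fixed regardless of the choice. Two-boxing wins
# by exactly 1k, whatever p is.
for p in (0.1, 0.5, 0.9):
    assert abs((ev_two_box(p) - ev_one_box(p)) - K) < 1e-6
```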

Of course, if the determination of p is affected by the choice actually made in the future, you have a situation with reverse-time causality.

If I know that I am going to encounter this kind of problem, and it is somehow possible to pre-commit to one-boxing before the alien determines the probability p of me doing so, that certainly makes sense. But it is difficult to see why I would maintain that commitment when the choice actually presents itself, unless I actually believe this choice affects p, which, again, implies reverse-time causality.

It seems the problem has been set up in a deliberately confusing manner. It is as if the alien has simply decided to find people who are irrational and pay them 1M for it. The problem seems to encourage irrational thinking, maybe because we want to believe that rational people always win, when of course one can set up a fairly absurd situation so that they do not.