Comments

Comment by Stephen4 on Pascal's Mugging: Tiny Probabilities of Vast Utilities · 2007-10-22T23:10:00.000Z · LW · GW

Robin's anthropic argument seems pretty compelling in this example, now that I understand it. It seems a little less clear if the Matrix-claimant tried to mug you with a threat not involving many minds. For example, maybe he could claim that there exists some giant mind whose killing would be as ethically significant as the killing of 3^^^^3 individual human minds. Maybe in that case you would anthropically expect, with overwhelmingly high probability, to be a figment inside the giant mind.
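For concreteness, here is a rough back-of-the-envelope version of the anthropic penalty as I understand Robin's argument (the symbols are mine, purely for illustration): conditional on the mugger's claim being true, almost every mind in existence is one of the 3^^^^3 victims rather than the lone decision-maker being mugged, so the expected number of lives my refusal actually costs is roughly

Pr(claim) · Pr(I am the decider | claim) · 3^^^^3 ≈ Pr(claim) · (1/3^^^^3) · 3^^^^3 = Pr(claim),

and the astronomical stake is cancelled by an equally astronomical anthropic improbability.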

Comment by Stephen4 on Pascal's Mugging: Tiny Probabilities of Vast Utilities · 2007-10-21T02:48:00.000Z · LW · GW

Maybe the origin of the paradox is that we are extending the principle of maximizing expected return beyond its domain of applicability. Unlike Bayes' formula, which is an unassailable theorem, the principle of maximizing expected return is perhaps just a model of rational desire, and as such it could be wrong. When dealing with reasonably high probabilities, the model seems intuitively right. With very small probabilities it seems to be just an abstraction, and there is not much intuition to compare it to.

When a game's positive expected return comes from a big payoff at a small probability, it reduces to the intuitive case only if we have the opportunity to play the game many times, on the order of one over the payoff probability. That sort of frequentist argument seems to be where the principle comes from in the first place. But if the probability is so small that there is no possibility of playing the game that many times, then maybe a rational person simply ignores the offer rather than dutifully investing in an essentially certain loss. Of course, if we relegate the principle of maximizing expected return to a limiting case, that leaves open the question of what more general model underlies it.
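A minimal simulation (Python, with numbers I picked purely for illustration, not anything from the post) of that frequentist point: a gamble whose positive expectation comes from a rare big payoff only behaves like a good deal if you can play it on the order of 1/p times; with far fewer plays, nearly every run is just an accumulated loss, even though the expectation never changes.

```python
import numpy as np

# Illustrative numbers: pay 1 unit to play, win 20,000 units with probability
# p = 1e-4, so the expected net return per play is +1 unit.
p, payoff, cost = 1e-4, 20_000, 1.0

def outcome(n_plays, n_trials=10_000, seed=0):
    """Mean net return per play across n_trials runs, and the fraction of runs that end ahead."""
    rng = np.random.default_rng(seed)
    wins = rng.binomial(n_plays, p, size=n_trials)             # payoffs hit in each run
    net_per_play = (wins * payoff - n_plays * cost) / n_plays
    return net_per_play.mean(), (net_per_play > 0).mean()

# With ~10/p plays the expected gain dominates and most runs end ahead.
print(outcome(n_plays=100_000))
# With far fewer than 1/p plays the expectation is still +1 per play,
# but roughly 99% of runs never see a payoff and are pure losses.
print(outcome(n_plays=100))
```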

G: Sorry to put words in your mouth.

Comment by Stephen4 on Pascal's Mugging: Tiny Probabilities of Vast Utilities · 2007-10-20T22:36:00.000Z · LW · GW

G,

I was essentially agreeing with you that killing 3^^^^^3 puppies versus 3^^^^3 puppies may not be ethically distinct. I would call this scope insensitivity; my suggestion was that scope insensitivity is not necessarily always unjustified.