Question about Large Utilities and Low Probabilities

post by sark · 2011-06-24T18:33:07.708Z · LW · GW · Legacy · 9 comments

Advance apologies if this has been discussed before.

Question: Philosophy and mathematics are both fields in which we employ abstract reasoning to arrive at conclusions. Can the relative success of philosophy versus mathematics provide empirical evidence for how robust our arguments must be before we can even hope to have a non-negligible chance of arriving at correct conclusions? Considering how bad philosophy has been at arriving at correct conclusions, must our arguments not be essentially as robust as mathematical proof, i.e. correct with probability very close to 1? If so, should this not cast severe doubt on arguments showing how, in expected utility calculations, outcomes with vast amounts of utility can easily swamp the low probability of their coming to pass? Won't our estimates of such probabilities be severely inflated?
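
To make the worry concrete, here is a toy expected-utility comparison; the payoffs and probabilities below are invented for illustration, not taken from any actual argument:

    # Toy illustration: a vast utility can swamp a tiny probability,
    # and the conclusion is extremely sensitive to whether that
    # probability estimate is inflated. All numbers are made up.
    def expected_utility(prob, utility):
        return prob * utility

    sure_thing = expected_utility(1.0, 1_000)    # certain, modest payoff
    long_shot  = expected_utility(1e-10, 1e20)   # speculative, vast payoff

    print(sure_thing)   # 1000.0
    print(long_shot)    # 10000000000.0 -- the long shot swamps it

    # But if abstract arguments only succeed at roughly the rate of
    # mathematical proof, the 1e-10 may be wildly inflated. Deflate it
    # by eight orders of magnitude and the ranking flips:
    deflated = expected_utility(1e-18, 1e20)
    print(deflated)     # 100.0 -- now the sure thing wins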

Related: http://lesswrong.com/lw/673/model_uncertainty_pascalian_reasoning_and/

9 comments

comment by XiXiDu · 2011-06-25T10:22:33.550Z · LW(p) · GW(p)

See also how wrong people tend to be in guessing the truth of mathematical statements:

comment by gwern · 2011-06-24T23:51:47.579Z · LW(p) · GW(p)

It's interesting that you think there's a distinction to be made between the methods of philosophy and math, as opposed to their subject matters.

Replies from: sark
comment by sark · 2011-06-25T10:06:00.500Z · LW(p) · GW(p)

So are you suggesting their difference in success has to do with subject matter?

Replies from: gwern
comment by gwern · 2011-06-25T13:27:25.831Z · LW(p) · GW(p)

Yes.

Replies from: sark
comment by sark · 2011-06-25T14:11:26.067Z · LW(p) · GW(p)

I don't doubt that might be the ultimate cause, as different subject matters are amenable to different methods. But that does not affect the inference I want to draw here: that in doing abstract reasoning, one has to hold oneself to a ridiculously high standard of precision and rigor.

comment by Benquo · 2011-06-24T21:05:01.348Z · LW(p) · GW(p)

There is a related discussion here too: http://lesswrong.com/lw/2id/metaphilosophical_mysteries/

comment by Manfred · 2011-06-24T19:18:21.489Z · LW(p) · GW(p)

Model uncertainty only has a big effect on probabilities of the form "not (some event with probability near 1)". When talking about specific low-probability scenarios, model uncertainty just scales them; e.g. the probability that a specific god from Pascal's wager exists isn't vastly over- or underestimated if model uncertainty isn't accounted for.

Replies from: sark
comment by sark · 2011-06-25T10:05:23.752Z · LW(p) · GW(p)

Hmm, why is this the case? I think I'm missing background knowledge here.

Replies from: Manfred
comment by Manfred · 2011-06-25T19:41:55.457Z · LW(p) · GW(p)

Think of it like this: say you're flipping a coin and want the probability of heads. The only way you can think of to not get heads or tails is if an alien swaps the coin with something else when you toss it, and you assign that a tiny probability. Then you suddenly realize that there's a 1/10000 chance that the coin lands on its edge!

Now, the factor by which this changes your probability estimates for heads and tails is really small: 0.499999999999 is pretty much the same as 0.49995, so if you were betting on heads, your expected payoff would barely shiver. But if you were betting on "neither heads nor tails", your expected payoff suddenly gets multiplied by a factor of about fifty million!

The probabilities for "normal stuff" and "not normal stuff" both shift by a comparably small absolute amount. But the relative change is vastly larger for "not normal stuff"!

Now you may say: "Why does it have to be phrased like 'not normal stuff'? Why can't I just bet on something specific, like the coin landing on its edge?" This is the nature of uncertainty. Sure, after you realize the coin can land on its edge, you might bet on it. But if you had known about it beforehand in order to bet on it, it would already have been in your model! Uncertainty doesn't mean you know what's going to happen; it means you expect something to happen in a direction you didn't anticipate.
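
A quick numerical sketch of the coin example above; the 1/10000 edge probability is from the comment, while the 2e-12 "alien swap" figure is back-solved from the 0.499999999999 quoted there:

    # Before thinking of the edge, the only non-heads/tails outcome is
    # the alien swap, assigned a tiny probability.
    p_other_before = 2e-12
    p_heads_before = (1 - p_other_before) / 2   # 0.499999999999

    # After realizing the coin can land on its edge:
    p_other_after = 1e-4                        # 1/10000
    p_heads_after = (1 - p_other_after) / 2     # 0.49995

    # Absolute shifts are comparable and tiny...
    print(abs(p_heads_after - p_heads_before))  # ~5e-5
    print(abs(p_other_after - p_other_before))  # ~1e-4

    # ...but the relative change for "neither heads nor tails" is enormous:
    print(p_heads_after / p_heads_before)       # ~1.0
    print(p_other_after / p_other_before)       # ~5e7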