When people ask for your P(doom), do you give them your inside view or your betting odds?
post by Vivek Hebbar (Vivek) · 2022-03-26T23:08:17.712Z · LW · GW · 9 comments

This is a question post.
There has been some confusion about whether people are using inside views or all-things-considered betting odds when they talk about P(doom). Which do you give by default? What are your numbers for each?
Answers
answer by Dagon
Depends on who asks, why, and how much context we share. Much of the time, I give the "signaling answer", which is neither my inside/true estimate nor a wager I can make.
It's not purely signaling - I'm not JUST trying to convince them I'm smart or to donate to my preferred cause or change their behavior or whatnot. It also includes a desire to have interesting conversations and to elicit models and predictions from my discussion partners. But it's not a pure estimate of probability.
I give it somewhere around p(personal death) = 0.999, p(most humans die with me) = 0.15, and p(humans wiped out within a few hundred years) = 0.8.
answer by rohinmshah
Independent impressions (= inside view in your terminology), though my all-things-considered belief (= betting odds in your terminology) is pretty similar.
9 comments
Comments sorted by top scores.
comment by Charlie Steiner · 2022-03-27T00:02:29.332Z · LW(p) · GW(p)
Could you explain more about the difference, and what it looks like to give one vs. the other?
↑ comment by Morten Hustveit (morten-hustveit) · 2022-03-27T01:49:30.506Z · LW(p) · GW(p)
When betting, you should discount to zero any scenario in which you're unable to enjoy the reward. Put more loosely, any doom scenario that involves you personally dying should be treated as impossible, because the utility of winning in that scenario is zero.
↑ comment by Vivek Hebbar (Vivek) · 2022-03-27T12:17:12.831Z · LW(p) · GW(p)
Oh, this is definitely not what I meant.
"Betting odds" == Your actual belief after factoring in other people's opinions
"Inside view" == What your models predict, before factoring in other opinions or the possibility of being completely wrong
↑ comment by Pablo (Pablo_Stafforini) · 2022-03-27T14:10:24.417Z · LW(p) · GW(p)
Though I understood what you meant, perhaps clearer terminology is all-things-considered beliefs vs. independent impressions.
↑ comment by TLW · 2022-03-27T03:05:20.518Z · LW(p) · GW(p)
Er, treated as impossible != treated as zero utility.
↑ comment by Vaniver · 2022-03-28T17:01:28.324Z · LW(p) · GW(p)
Suppose I think the probability of me dying in a car accident is 20%, and I don't care about what happens to my wealth in that world (rather than caring about my heirs having more money). Should I buy a contract that pays out $100 if I die in a car accident at the cost of $10?
The claim is: no, because it pays out only in situations where the money is worthless to me. If you try to back out my estimate from my willingness-to-pay, it will look a lot like I think the probability of me dying in a car accident is 0%. [And the reverse contract, the "I don't die in a car accident" one, I should buy as though my price were 100%, which lets me move all of my money from worlds I don't care about to ones I do, in effect making my bet counterparties my heirs.]
You can get milder forms of this distortion from 'currency changes'. If I make a dollar-denominated bet on the relative value of the dollar and the euro, and I mostly buy things in euros, then you need to do some work to figure out what I think the real probabilities are (because if I'm willing to buy a "relative value of the dollar halves" contract at 30%, well, I'm expecting to get $50 in current value back instead of $100 in current value back).
[This is to say, I think you're right that those are different things, but the "because" statement is actually pointing at how those different things construct the conclusion.]
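The willingness-to-pay distortion in the car-accident example can be sketched numerically. This is a toy model of my own construction, not Vaniver's formalism: it assumes utility is linear in alive-world wealth and exactly zero in any world where the bettor has died.

```python
# Toy model: what probability a naive observer infers from the bettor's
# willingness to pay, when payouts can land in zero-utility worlds.

P_DEATH = 0.20  # the bettor's actual subjective probability of dying


def max_price(payout: float, pays_if_dead: bool) -> float:
    """Highest price the bettor will pay for a contract paying `payout`.

    Note that P_DEATH never enters this function: the price is driven
    entirely by which world the payout lands in, not how likely it is.
    """
    if pays_if_dead:
        # The payout arrives in a world valued at zero, while the price
        # comes out of alive-world wealth too. Worth nothing to the bettor.
        return 0.0
    # The payout arrives in the alive world; any price up to the full
    # payout leaves alive-world wealth no worse off.
    return payout


def implied_probability(payout: float, pays_if_dead: bool) -> float:
    """Probability an observer would back out from willingness-to-pay."""
    return max_price(payout, pays_if_dead) / payout


print(implied_probability(100, pays_if_dead=True))   # 0.0, not the true 0.2
print(implied_probability(100, pays_if_dead=False))  # 1.0, not the true 0.8
```

The observer's inferred probabilities (0% and 100%) are independent of the bettor's true belief (20%), which is exactly why betting odds stop tracking credences once payouts fall in worlds the bettor does not value.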
↑ comment by Matt Weatherston (matt-weatherston) · 2022-03-27T03:16:15.293Z · LW(p) · GW(p)
No one said it should be treated as zero utility.