Best effort beliefs

post by Adam Zerner (adamzerner) · 2023-10-21T22:05:59.382Z · LW · GW · 9 comments

Alice: Hey Bob, how's it going?

Bob: Pretty good. How about you?

Alice: Pretty good. What are you drinking?

Bob: A Bayeslight.

Alice: Cool. Hey bartender, lemme get one of those as well.

Alice: Hey, have you seen President Hanson's new immigration policy?

Bob: No. What is it?

Alice: He's shutting down the borders pretty hard. It's like an 8.5/10 on the left-right political axis, where 10/10 is ultra conservative and 0/10 is ultra liberal.[1]

Bob: Oh gawd. He's such an idiot.

Alice: Oh yeah? What sort of immigration policy would you implement?

Bob: Hm, I think something like a 2/10 on the left-right spectrum.

Alice: And how confident are you in that?

Bob: Probably like an 8/10.

Alice: Ok, let's call this 2/10 on the left-right spectrum belief with an 8/10 confidence a 2-8 belief. I disagree and will bet you $10 on it.

Bob: But how are we going to bet? What are we betting on? What's the resolution criteria?

Alice: Proposes some clever thing to bet on with crystal clear resolution criteria that perfectly captures everything.

Bob: Wow. Ok. Yeah, that works. But actually, after thinking about it a little more, I think I'm more like a 2.5-7. Can we bet at those odds instead?

Alice: We can. But only if we can bet $100 instead of $10.

Bob: Oh. Ummmm. Ok, I guess I'm probably more like a 3-6.5. I could do that at $100. Is that ok?

Alice: No. But I'd do 3-6.5 at $1,000. Shake on it?

Bob: Woah... Alice, this got real expensive real fast.

Alice: What's the matter Bob? Do you need to chug a few Bayeslights first?

Bob: Fuck you. I've got a family to feed over here.

Alice: Pansy. Wussy. That's a cop out and you know it.

Bob: Sigh. Ok, ok. Let me think about this. I am a Bayesian, and I do agree that Bayesians should be perfectly happy to bet on their beliefs when doing so would be +EV.

Alice: Right...

Bob: But... it's just that... well, $1,000 is kind of a lot of money.

Alice: Is it? You're semi-retired. You own your home. Your kids are in college. You have a fund to pay for college. You don't have expensive taste. You always talk about how you'd be perfectly fine retreating to some cabin in the woods somewhere.

Bob: I mean, yeah. You're right.

Alice: Ok, so what's the problem? It looks like things are pretty inelastic. Like, the amount of utility you'd gain if you won $1,000 is pretty much the same as what you'd lose if you lost $1,000. (Save for the initial short-term, Fisher-ian, wussy-pussy feelings of "Oh no, I lost $1,000. What am I going to tell my wife?")

Bob: Ok, ok. I take enough pride in my rationalism that I have to concede that you're right. I should be willing to bet $1,000 on my 3-6.5 belief. But... just... let me think about this for a second.

Alice: Hey bartender, get this guy another Bayeslight!

Bob: Fuck you again, Alice.

Alice: Hold my beer. I'm 'bout to go whup these suckas at some beer pong. Think about your true beliefs while I'm gone, Bob. And be ready to bet $1,000 on them.

Bob: Will do. You're an asshole, but thanks for pushing my epistemics in the right direction.

Alice: 😉

Bob: thinking...

Alice: Phew! Those guys are a bunch of weirdos over there. They wouldn't bet more than $20. What cowards.

Bob: Ok. So I thought about it, and I determined that my true belief is actually a 4-7. And I'm ready to bet $1,000 on it.

Alice: Cool. How's $10,000 sound?

Bob: WAIT, WHAT??? $10,000?!

Alice: What's the problem?

Bob: I thought you said $1,000?!

Alice: I did say $1,000. Now I'm saying $10,000. Problem?

Bob: WTF Alice!

Alice: Dude, you're being a pansy again. Do we really need to go through this? Why are you so comfortable betting $1,000 and not $10,000? $10,000 is not going to materially impact your life, Bob. We can discuss it, but I'm pretty sure $10,000 is still in "inelastic territory".

Bob: Sighhhhhhh

Alice: Do I need to get you another Bayeslight?

Bob: No! No more Bayeslights!

Alice: Do I need to leave you some time to think again?

Bob: No...

Alice: You know you should be perfectly willing to bet $10,000 on your stated belief with the odds I'm offering you, since you're inelastic and it's +EV to you at those odds.

Bob: Well...

Alice: Let me guess: you're not so certain about that 4-7 belief, huh?

Bob: Correct.

Alice: What do you think is going on here?

Bob: I mean, I gotta admit, I started off a little reactionary against President Hanson's immigration policy. I shouldn't have been that confident in my 2-8 belief.

Alice: Yeah. And now?

Bob: Well now I'm just not so sure about my 4-7 belief. For $1,000 I was ready to roll with it. But for $10,000? Well, I'm just thinking about how I underestimated how complicated immigration policy is. I'm thinking about all of these second and third and fourth order effects, and how it's difficult to predict what outcome any given policy will have. Before I was mostly thinking about the first-order effects, and maybe a little bit about the second-order effects, but I'm realizing that this only scratches the surface of the ultimate consequences. I also am realizing that I was focused on how the conservatives just don't understand A, B and C. But now I'm realizing that even if that's true, once you factor in A, B and C, well, there's just so many other things to factor in on top of A, B and C. D, E, F, G, etc. So I guess I'm just feeling pretty uncertain.

Alice: Cool. So what are your true beliefs?

Bob: Um, I don't know. I'd have to really sit down and think about it for a while. Probably spend a few weeks researching things and stuff.

Alice: False. You do in fact have some belief right now, with some level of confidence. Sure, if you spend six weeks researching things you'd probably revise your belief and confidence level, but that doesn't change the fact that you are not a rock and you do in fact, at this very moment, have some belief and have some level of confidence in that belief. C'mon, Bob, I thought you were a Bayesian. Do I have to Dutch book you?

Bob: Ugh. No. You don't have to Dutch book me. I know that you're right.

Alice: Ok. I won't make you bet $10,000 on it, but what would you say your true beliefs are right now?

Bob: Eh, I think I do, truly, lean slightly towards a liberal immigration policy. But I'm honestly not very confident in it. I think my true belief is something like a 4-2.

Alice: $100?

Bob: Smiles. Yes, $100.

Alice and Bob shake on it.

Bob: You are still an asshole, but this was actually a good lesson for me.

Alice: I think I'm more of a dick, but thanks.

Bob: Haha. And I guess I'm the pussy, huh?

Alice: Yup.


Bob: You're the worst. But hey, let me ask you something. How did you get that job as an oddsmaker for the Economic Policy Prediction Market? My daughter has been talking about pursuing something like that.

Alice: Some lady I met at this bar pushed me until I made a $100,000 bet with her on something related to immigration policy. Then she offered me a job. 

  1. ^

    Please suspend your disbelief. Using numbers like this probably isn't realistic, but it makes the writing easier.

9 comments

comment by JBlack · 2023-10-21T23:08:49.598Z · LW(p) · GW(p)

I have little idea what this is trying to convey, since 

"Alice: Proposes some clever thing to bet on with crystal clear resolution criteria that perfectly captures everything"

hides the actual meaning of everything that follows. Like, is Bob betting on his actual beliefs exactly matching some profile, or in some direction, or a given range, or on object-level consequences of his beliefs, or what? Me trying to guess at what is meant from the subsequent text wasn't much use, since it appears to be talking about both beliefs and consequences and seems to be inconsistent with every particular hypothesis I formed in the few minutes I was thinking about it.

On top of that, the escalation totally pinged my scam-detector and so now I'm wondering whether that's an intentional part of the story or not.

My thoughts on the last line were along the lines of "Alice related that she was behaving like a risk-seeking gullible idiot, and that's why she got the job? Confusing! It's more evidence that Alice is actually trying to scam Bob, lying about that as part of the scam to connect the same risky action she wants Bob to take with his daughter's career goals."

However that doesn't seem to be anything that matches up with the post's title, so doesn't seem likely to be anything that the author intended to convey.

Replies from: adamzerner
comment by Adam Zerner (adamzerner) · 2023-10-22T07:06:00.273Z · LW(p) · GW(p)

Here is what I was going for with this post (spoilers). I'm not sure how much of it was clear.

Oftentimes if you ask someone for their belief, they'll say A. But then if you push them on it, e.g. by asking them to bet, they'll backpedal into belief B instead of belief A (B is often more moderate than A). And then if you push them more, or ask them to bet a larger amount of money, they'll backpedal again from belief B to belief C.

What is going on here? If the person starts off with belief A but ends up with belief F (going from A, to B, to C, to D, to E, to F) after being pushed, maybe we can call belief F their "true" belief?

Nah. I don't think "true" is a good way to label it. Too generic. It kinda begs the question of "what do you mean by true?". I think "best effort belief" is a pretty good label though. After all, it's the belief that one would arrive at after making their "best effort". The type of effort they'd make if, say, $100,000 were on the line.

I also like "best effort" because it can be easily extended to phrases like "3/4 effort belief", "reactionary belief", and "extraordinary effort [LW · GW] belief".

This seems like a useful frame to look at things through. Largely because I think it makes it salient that most of our beliefs are not very high-effort. It'd be a cool experiment to try to quantify this. Eg. where you do what Alice did to Bob and see how much the subjects' confidence changes in response to the bet size.

hides the actual meaning of everything that follows

In some sense, I agree that it does. But not in an important sense.

Imagine that the framing of the conversation was even more abstract. Ie. instead of being about "immigration policy", it were about "X". Maybe Alice asks Bob how confident he is in X, he starts off by saying 95%, when offered a bet he backpedals to 90%, then 80%, so on and so forth.

Such a framing would hide something, but the important information (see spoiler above) would be retained.

On top of that, the escalation totally pinged my scam-detector and so now I'm wondering whether that's an intentional part of the story or not.

No, it was not. I tried to portray things as:

  1. Taking place in a bit of a dath ilanian [? · GW] world (President Hanson; Bayeslight; an oddsmaker being a prestigious job).
  2. Alice and Bob are longtime friends.

So then, I don't think there's anything sketchy about wanting to bet on your beliefs. I think it's rational and virtuous.

In our universe though, I agree that it would usually be sketchy, but I could imagine scenarios where it wouldn't be. For example, if I had been part of a rationalist community for many years, known someone very well like Bob knows Alice, and they wanted to bet with me like this, I wouldn't necessarily have scam-detector alarms go off. As another example, I get the sense that this sort of thing could happen in certain circles of high-level gamblers with good epistemics.

My thoughts on the last line were along the lines of "Alice related that she was behaving like a risk-seeking gullible idiot, and that's why she got the job? Confusing! It's more evidence that Alice is actually trying to scam Bob, lying about that as part of the scam to connect the same risky action she wants Bob to take with his daughter's career goals."

Yeah, that's not what I was going for. Here's what I was going for:

Betting on beliefs is a virtue that is useful on its own, causally related to other good things, and correlated with a bunch of other good things. The person who hired Alice did so for these reasons.

Replies from: JBlack
comment by JBlack · 2023-10-22T22:29:27.415Z · LW(p) · GW(p)

Even in dath ilan, having someone push to bet orders of magnitude more is sketchy. It's doubly sketchy in the case of Alice's story where it's "Some lady I met at this bar" versus someone who Bob knows, and "... that's how I got my high-status job" is triply sketchy.

Replies from: adamzerner
comment by Adam Zerner (adamzerner) · 2023-10-23T18:01:02.933Z · LW(p) · GW(p)

Perhaps. Although it feels to me like it's a reasonable thing to suspend disbelief on.

comment by Shmi (shminux) · 2023-10-22T02:38:36.064Z · LW(p) · GW(p)

A more realistic and rational outcome: Alice is indeed an ass and it's not fun to be around her. Bob walks out and blocks her everywhere. Now, dutchbook this! 

Replies from: adamzerner, Richard_Kennaway
comment by Adam Zerner (adamzerner) · 2023-10-22T07:08:17.293Z · LW(p) · GW(p)

In most contexts, yes. However, in the sort of dath ilanian context I was going for [LW(p) · GW(p)], or even in the context of a serious group of rationalists in our universe who are longtime friends, I don't think what you described is a more realistic or rational outcome.

comment by Richard_Kennaway · 2023-10-22T06:43:11.631Z · LW(p) · GW(p)

Alice: Ha ha, what a sucker Bob is! I offered him free money, given his stated beliefs, and he turned it down! I win!

comment by ProgramCrafter (programcrafter) · 2023-10-22T13:24:22.423Z · LW(p) · GW(p)

Can't Bob say "Alice is doing something strange; she is rejecting a bet of smaller value at certain odds but accepting a larger one, leaving free money on the table in the former case, and that isn't rational either"?

comment by Seth Herd · 2023-10-23T13:54:37.834Z · LW(p) · GW(p)

At low levels of betting, it's reasonable to have that money partly serve to buy you status. You're signaling your confidence in your beliefs, and therefore signaling your competence in rationality.

When the bet gets higher, it's no longer a good price for signaling. So the rational position on the bet changes.