Public Positions and Private Guts
post by Vaniver, jacobjacob · 2020-06-26T23:00:52.838Z
(Talk given on Sunday 21st June, over a Zoom call with 40 attendees. Vaniver is responsible for the talk; jacobjacob is responsible for the transcription.)
Ben Pace: Thank you everyone very much for coming. All 41 of us. This is a LessWrong event. So it is less wrong than your normal events. Jacob and I wanted to try out some online events, see what was fun. We pinged a bunch of the curated authors who write great stuff and said, "Do you also want to give a short talk?" And a bunch of them were like, "Oh that sounds nice, to actually see the people who read my stuff, rather than just imagining them."
Ben Pace: So, we're going to have five-minute talks. I'll keep time. And then we'll have some Q&A afterwards, a maximum of 10 minutes but probably shorter. And there are going to be five talks. Vaniver, if you'd like to begin?
Talk
Vaniver: Cool! Hi, I'm Vaniver. The thing I'm going to be talking about is Public Positions and Private Guts, a blog post that I wrote a while ago that got curated. It originally grew out of a series of talks given by Anna Salamon at some workshops.
Vaniver: So, what is this about? Why do we care?
Vaniver: Well, there are some things that happen sometimes, like startup founders who have an idea that they strongly anticipate will work, but they can't explain it to other people; they can't prove that the thing will work. If they could prove it, it wouldn't be a startup anymore, it would already be some mature business somewhere.
Vaniver: Similarly, you will run across PhD students who have this logical, airtight argument that they should get their PhD, and yet they're mysteriously uninterested in doing any work on their dissertation. And so there's this question of: what's up with that? Why isn't there one map of the world that goes both ways?
Vaniver: So there are these two clusters of knowledge. I'm going to talk first about a sort of communication that I think defines these clusters. For that I'm going to talk a bit about formal communication. Basically, in philosophy, there's this model of how people talk to each other: you have a shared context, where both the speaker and the audience know all the things in the shared context. The speaker will add additional facts, one at a time (like maybe I've observed a thing that people don't know about yet). And logical facts count too; if there's A in the context and also "if A, then B", asserting B is a thing that might not have been in the context yet, because the audience isn't logically omniscient.
Vaniver: And so, one of the facts about this process is, if each of these observations doesn't contradict the things that are already in the shared context, and is trivially checkable in this one step, you can end up believing the things that come out at the end, much like with a math proof, at least as much as you trust the context that you started off with.
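(A minimal sketch of this shared-context model, for readers who like code; it is not from the talk, and the SharedContext class and its one-step modus ponens check are illustrative assumptions rather than a standard formalism.)

```python
# Toy model of "formal communication": a shared context is a set of accepted
# statements. The speaker can add fresh observations directly, and can assert
# a derived statement only if it follows in a single, trivially checkable
# application of modus ponens from what's already accepted.

class SharedContext:
    def __init__(self, accepted=None, implications=None):
        # Statements both speaker and audience already take for granted.
        self.accepted = set(accepted or [])
        # Conditionals of the form (antecedent, consequent), i.e. "if A then B".
        self.implications = set(implications or [])

    def assert_observation(self, statement):
        """Speaker adds a new fact, e.g. something they directly observed."""
        self.accepted.add(statement)

    def assert_implication(self, antecedent, consequent):
        """Speaker adds a conditional 'if antecedent then consequent'."""
        self.implications.add((antecedent, consequent))

    def assert_step(self, statement):
        """Admit `statement` only if it follows in one step from the context."""
        derivable = any(
            a in self.accepted and b == statement
            for (a, b) in self.implications
        )
        if not derivable:
            raise ValueError(f"Not checkable in one step: {statement!r}")
        self.accepted.add(statement)


# Usage: the audience ends up accepting B exactly because each step was
# individually easy to check, like a line in a math proof.
ctx = SharedContext(accepted={"A"})
ctx.assert_implication("A", "B")
ctx.assert_step("B")
assert "B" in ctx.accepted
```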
Vaniver: And so an interesting fact about the sort of things you can fit into this communication style is they all have to be easily justifiable. If I have a point that I want to get across that requires five different complicated arguments to support it, unless I can go through each one of these complicated arguments in a serial fashion, you're not going to be able to build this thing using this formal communication style.
Vaniver: And so, Public Positions are these sorts of beliefs that have been optimized for justifiability or presentation. The PhD student who has worked through this logical argument with other people on why they should get their PhD: they have this public position, and they have this formal communication to back it up.
Vaniver: Private Guts, in contrast, aren't mutually exclusive with Public Positions, but they're defined in a different way, for a different purpose. They're trying to actually anticipate things about the future, and they come from the actual historical causes of the belief. So, for example, this PhD student might historically want the PhD because their family always respected education, and when they think about quitting the PhD, they can't do it because that would mean they're a “quitter”.
Vaniver: When you think about training a neural network to recognize pictures of dogs and cats, it will use lots of little pieces of information to come to its conclusion, in a way that's opaque and difficult to understand because it's not optimizing at all for understandability, it's just optimizing for the success metric of, did it correctly anticipate the thing or not?
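(Again a minimal sketch, not from the talk: the synthetic "dog vs. cat" data and the tiny numpy classifier below are illustrative assumptions, meant only to show a training objective that rewards correct anticipation and nothing else.)

```python
# A toy illustration of the point about objectives: the training loss below
# only rewards correct predictions ("did it anticipate the thing or not?");
# nothing in it rewards producing weights a human could interpret.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "dog vs. cat" data: 200 examples with 50 arbitrary features.
X = rng.normal(size=(200, 50))
true_w = rng.normal(size=50)
y = (X @ true_w > 0).astype(float)  # labels depend on many small cues at once

# A linear classifier trained by gradient descent on a purely predictive loss.
w = np.zeros(50)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probability of "dog"
    grad = X.T @ (p - y) / len(y)        # gradient of the predictive loss only
    w -= 0.5 * grad

accuracy = ((1.0 / (1.0 + np.exp(-(X @ w))) > 0.5) == y).mean()
print(f"accuracy: {accuracy:.2f}")
# The learned weights spread the decision across all 50 features; the model
# anticipates well without any of it being optimized for justifiability.
```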
Vaniver: And so, the startup founder's complicated reason for believing that their startup will work comes from this sort of thing. There are lots of little pieces that all fit together in their mind, but they can't easily explain it, or else it would already be a widespread belief that the startup would work.
Vaniver: So, anyway, a lot of CFAR-related things relate to how to build bridges between these two sorts of things, so that people who are convinced of something through a logical argument also end up feeling it in their guts. And also, people who feel a thing in their guts that they don't have a formal position for are able to figure out how to draw out these many small pieces of data and construct something that's reasonable and articulable.
Q&A
Ben Pace: All right, that's five minutes. It sounded like actually you maybe just naturally stopped?
Vaniver: Just under the wire.
Ben Pace: Cool, cool. Thanks. So the PhD guy has a bunch of formal arguments but not private guts for why he should do the thing, and he's having a hard time translating between those and having the formal argument inform his private guts. And similarly, the startup guy is having a hard time naturally turning the private guts into a formalized argument. Is that what you said? Is that accurate?
Vaniver: Yeah, I think that's my take on those examples. I tried to come up with examples that were a little different from the original historical cause of this talk, which is something like: many people would take AI safety seriously in their speech but not in their actual actions. And there's this question of, well, why is that? And it's like, oh, it's because they don't feel it coming for real, in the same way they might feel climate change is coming for real, or something.
Ben Pace: Yeah. Patrick, would you like to ask a question?
Patrick LaVictoire: Building on this, I have something to say about how to model the internal experience of this happening and what you can do about it. But I think, instead of a comment, I'm going to talk for five minutes about it and mention that your point is relevant to mine.
Vaniver: Cool!
Ben Pace: Sounds good. How does it tend to look when people successfully turn their private guts into formal arguments?
Vaniver: Yeah, so I think part of this is coming up with communication styles that aren't so much formal communication. One thing that has grown more popular over the last few years is this idea of doing double-crux, where you and another person will both try and look at your actual belief system and say “this is the thing that would change my mind about this subject that we disagree on”, and you jointly explore this together.
Vaniver: It's interesting because when you watch a public debate, often you'll find the things that are said are designed to convince you, the audience, or be broadly applicable. But when you watch a double-crux, this is the opposite of what they're doing. They're trying to focus, laser-like, on what do I, Vaniver, care about in this issue? Even if only two percent of the audience cares about this particular part of the issue, it's the bit that's crux-y for me.
Vaniver: So I think there's a way in which formal communication does actually limit the sort of things you can believe. In the same way that being relentlessly empirical about the world, instead of theoretical, means that you can only believe things that you've already seen happen in the past instead of also believing in things that you predict will happen in the future.
Ben Pace: Yeah, that makes sense. I'm just curious, have you also seen examples of people turning the explicit formal arguments into their private guts, and what that's felt like?
Vaniver: Yeah. I think I've seen some examples of that. The first thing that comes to mind is actually Robin Hanson's writing on construal level theory and the whole near/far distinction. Just having that in my mental vocabulary has, at least, made it much easier to see which beliefs I have that are just the color of my banners, or something, versus which beliefs I have that are actually about anticipating the future.
Vaniver: When I've come across something that matters to a lot of different facets of my life, like where I live, and I'm like, oh wait, there's this sort of home-town bias thing going on here, where living in this place is great because it's the place I live in. Seeing that sort of thing can help me switch to near mode and do much more of the “what are the actual factors that should matter here? Are my guts linked up with the thing they should be linked up with?”
Ben Pace: Oh, a comment from Dennis, which I think is a solid question about how this relates to Kahneman’s System 1 / System 2. I think people often think of System 1 as the implicit one that has all the gears that are not easily accessible to my conscious brain; my System 2 is this sensible, explicit reasoner who justifies his thoughts or something.
Vaniver: Yeah, so I think they're related... I'm always a little hesitant to say if something is System 1 or is System 2 because it's this technical concept from psychology that I'm wary about getting wrong. But I do think there's a way in which both System 1 and System 2 would be part of the private guts, where many of your System 2 things, they're deliberative, they're slow, but they're not necessarily optimized for justification.
Vaniver: Similarly, when you look at public positions, I think the mode of them which I talked about, which is very much formal-communication-esque, is very much this System 2 thing of: here are my deliberative reasons for the position. But I think there's an aspect to public positions which is understanding the lay of the land, knowing what the shared context is, knowing what things will and won't get you attacked, or what you will or won't have to justify. And that one feels like it's often very immediate and reactive and intuitive, and all the various other things that people say about System 1.
Vaniver: I think there's a big overlap, but there's also some bits on the diagonals.
Ben Pace: That makes sense, yeah. Thanks very much, Vaniver. We'll move onto the next one for now.