Say Wrong Things
post by G Gordon Worley III (gworley)
To become less wrong, you must give up being most right.
There are many ways you might approach being less wrong.
A popular one is to make fewer wrong statements; to say fewer wrong things.
Naively it would seem this is a recipe for success, since you just say more things that are true and right and fewer things that are false and wrong. But if Goodhart has anything to say about it, and he does, you'll find ways to maximize the measure at the expense of the original objective.
Assuming the real objective is something like "have a more complete, precise, and accurate model of the world that better predicts the outcome of subjectively unknown events", then we can quickly see the many ways Goodharting can lead us astray if we focus too much on appearing less wrong. We might:
- make fewer claims than we could, pulling us away from completeness even as we appear less wrong;
- make weaker claims than we could, pulling us away from precision;
- and, a perennial favorite, filter the claims we publicly make so we appear less wrong than we really are by hiding our least confident claims.
The first two can be corrected with better calibration, that is, by making statements with confidence intervals or likelihoods that proportionally match the observed frequency of correctness of similarly confident claims. But simply suggesting someone "be better calibrated" is not a motion they can make; it's an outcome of taking actions that increase calibration. As good a place to start as any for improving calibration is the forecasting literature, if that's what you'd like to do.
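To make "observed frequency matching stated confidence" concrete, here is a minimal, hypothetical sketch (not from the post; the function name and sample data are invented) that buckets past claims by their stated confidence and compares each bucket's stated confidence with how often those claims turned out true:

```python
from collections import defaultdict

def calibration_table(predictions):
    """Group (stated_confidence, was_correct) pairs into buckets and
    report, per bucket, the count and observed frequency of correctness."""
    buckets = defaultdict(list)
    for confidence, correct in predictions:
        buckets[round(confidence, 1)].append(correct)  # bucket to nearest 10%
    return {
        conf: (len(outcomes), sum(outcomes) / len(outcomes))
        for conf, outcomes in sorted(buckets.items())
    }

# A well-calibrated forecaster's 70% claims come true about 70% of the time.
history = [(0.7, True), (0.7, True), (0.7, False), (0.9, True), (0.9, True)]
print(calibration_table(history))
```

If your 90% bucket comes true only 60% of the time, you are overconfident at that level; if your 60% bucket comes true 90% of the time, you are pulling your punches.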
The third is more tricky, though, because it's less directly about claims being made and their probability of correctness and more about social dynamics and how you appear to other people. And for that reason it's what I want to focus on here.
Appearing Less Wrong
I've met a lot of people in my life who are experts at not looking as stupid as they are.
That's kind of harsh. Maybe a nicer way to say it is that they are experts at appearing to be better at making correct predictions about the world than they actually are.
Some of their techniques are just normal social tricks: projecting confidence, using social status, the ever-abused term "gaslighting", and other methods of getting people to believe they are right even when a more careful examination would reveal them to be mistaken. These are people we all love to hate and love when we can call them on their bullshit: overconfident academics, inflated politicians, self-important internet intellectuals, and those people whose idea of social interaction is to say "well, actually...".
But there's a way to avoid looking stupid that is more pernicious, less amenable to calling out, and that subtly drags you towards local maxima that trap you mountains and valleys away from more complete understanding. And it's to shut up and not tell anyone about your low confidence beliefs.
It is extremely tempting to do this. Among the many benefits of keeping low-probability claims to yourself:
- you have a high accuracy ratio of publicly made claims, making you look right more often when observed;
- you say only things that, even when wrong, turn out to be wrong in conservative ways that still make you look smart;
- and you accrue a reputation for being right, usually conferring social status, which can feel really good.
The only trouble is that this approach is too conservative, too modest. It's easy to justify this kind of outward modesty as keeping up appearances in a way that is instrumental to some bigger goal, and you say to yourself "I'll still make low probability claims; I'll just keep them to myself", but down that path lies shadow rationality via compartmentalization. You can try it, but good luck, because it's a dark art that hopes to do what human brains cannot, or at least cannot without some sufficiently powerful magic, and that magic traditionally comes with vows not to do it.
Meanwhile, out in the light, finding models that better predict reality sometimes requires holding beliefs that appear unlikely to be true but then turn out to be right, sometimes spectacularly so, although semper caveat: all models are wrong, but some are useful. And then you have to go all in sometimes, exploring the possibility that your 10% guess turns out to be 100% correct, minus epsilon, because if you don't do this you'll do no better than the medieval Scholastic holding to Aristotelian physics or the early 20th-century geologist ignoring the evidence for continental drift, forever locked away from taking the big risks necessary to find better, more accurate, precise, and complete models.
Okay, so let's say you are convinced not to try so hard to appear more right than you are. How do you do that?
Say It Wrong
So I suppose it's nothing much harder than just telling people your claims, even when you have low confidence in them, and seeing how they react, although depending on the circumstances you'll probably want to adequately explain your confidence level so they can update on it appropriately. The trouble is getting yourself to do that.
I can't change your mind for you, although thankfully some folks have developed some techniques that might help if you're not interested in over-solving that problem. What I can do is point out a few things that might help you see where you are being too modest, nudge you towards less modesty, and create an environment where it's safe to be less modest.
- Look for the feeling of "pulling your punches" when you are telling people your ideas.
- Ramble more and filter less.
- Alternatively, more babble, less prune.
- Worry less about how you look to others.
- Relatedly, increase your ability to generate your own esteem so you need less of it from others, giving you more freedom to make mistakes.
- Encourage others to tell you their half-baked ideas.
- When they do, be supportive.
- Take a collaborative, nurturing approach to truth seeking.
- Play party games like "what do you think is true that you think no one else here also thinks is true?" and "what's your least popular opinion?".
- And my personal favorite, write up and share your immodest, lower-confidence ideas that you think deserve exploration because they have high expected value if they turn out to be right.
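The "high expected value" framing in that last item can be made concrete with a toy calculation. This is an invented illustration, not anything from the post; the numbers are arbitrary stand-ins for the payoff of an idea being right and the small reputational cost of it being wrong:

```python
def expected_value(p_right, value_if_right, cost_if_wrong):
    """Expected value of sharing an idea: p * upside - (1 - p) * downside."""
    return p_right * value_if_right - (1 - p_right) * cost_if_wrong

# A 10% idea with a large payoff beats a safe 60% idea with a modest one.
print(expected_value(0.10, 100.0, 1.0))  # 10.0 - 0.9 = 9.1
print(expected_value(0.60, 1.0, 1.0))    # 0.6 - 0.4 = 0.2
```

The point is only that low confidence and high expected value are compatible: if the upside dwarfs the cost of being wrong, a 10% claim can be far more worth airing than a 60% one.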
I suspect that the real reason most people try too hard to appear more right than they are is fear—fear of being wrong, fear of looking stupid, fear of losing status, fear of losing prestige, fear of losing face, fear of being ostracized, fear of being ignored, fear of feeling bad, fear of feeling lesser. I see this fear, and I honor it, but it must be overcome if one wishes to become stronger. And when you fear being more wrong, you will be too careful to ever become as less wrong as you could.
To sum it all up pithily:
To become less wrong, you must give up being most right.
Comments sorted by top scores.
comment by Wei_Dai · 2019-05-25T00:38:08.580Z
Ramble more and filter less.
This has to be carefully targeted to a select group of people or the world would be drowning in noise (even more than it already is). I think many people tend to be overconfident in their own ideas and already ramble more and filter less than they ideally should. Fear of being wrong might help counteract that somewhat so I'm wary of trying to remove that counterweight in someone before making sure to fix the overconfidence problem, unless you're only targeting the intervention on people who already aren't overconfident.
↑ comment by cousin_it · 2019-05-28T12:52:03.179Z
I think both overconfidence and underconfidence are widespread, so it's hard to tell which advice would do more good. Maybe we can agree that people tend to over-share their conclusions and under-share their evidence? That seems plausible overall; advising people to shift toward sharing evidence might help address both underconfidence (because evidence feels safer to share) and overconfidence (because people will notice if the evidence doesn't warrant the conclusion); and it might help with other problems as well, like double-counting evidence due to many people stating the same conclusion.
↑ comment by Raemon · 2019-05-25T21:27:32.551Z
I think there's a weird thing where it's best to *both* ramble more but *also* prune more? (Something like: I think it'd be better if more people wrote off-the-cuff shortform thoughts, and also if more people dedicated some time to thinking through their positions more carefully.)
↑ comment by Stefan_Schubert · 2019-05-30T11:31:28.652Z
Agreed; those are important considerations. In general, I think a risk for rationalists is changing one's behaviour on complex and important matters based on individual arguments which, while they appear plausible, don't give the full picture. Cf. Chesterton's fence, naive rationalism, etc.
comment by Swimmer963 · 2019-05-25T18:14:00.634Z
I've noticed in the past that I feel aversion to saying (and especially writing down) things that "might be false" – where I'm low confidence, where I expect that I just don't have the information to do more than speculate wildly.
When I try to introspect on this, I do think some of it is fear of being wrong in public (and this feeling definitely responds to local social incentives – I'm more likely to be comfortable rambling without a filter in private with close friends, and in fact do this a bunch.)
I think there are also other pieces. I'm wary that it's hard for humans to incorporate epistemic status tags and actually take my thoughts less seriously if I say I'm very uncertain. I'm also wary of...crystallizing a theory about something, and then becoming vulnerable to confirmation bias as I try to test it against observations? (This worry is stronger in areas where it feels hard to actually get feedback from reality.) As you point out, though, the tradeoff there is giving up on improving understanding.
I suspect I'm a lot less averse to low-confidence-speculation in private than I was several years ago, and it's partly because I think it's good for developing understanding, and partly just because I feel more comfortable socially and have less anticipation of being shut down.
(Also want to note that in general I have only moderate confidence in my introspection on stuff like this, and this comment is mostly a ramble.)
comment by Dagon · 2019-05-24T23:48:07.708Z
Related: https://www.lesswrong.com/posts/YRgMCXMbkKBZgMz4M/asymmetric-justice. In common conversation, people are often judged more harshly for being wrong (or taking action with intended or unintended consequences) than for being silent (or failing to take action). If you're worried about social judgement of a peer group that doesn't already have this norm, you'll likely have to make clear when you're being intentionally wrong vs actually believing something to be the truth.
Outright lying about your beliefs can backfire, but "strong beliefs, weakly held" is extremely powerful. "I think X, but I'm not sure" works really well in some groups. "I think X' (a simplification of X)" works well in others. Rarely have I had success with "I think ~X".
So I'd recommend being willing (again, in some contexts) to make tentative and uncertain statements, in order to learn where people disagree. I'd avoid outright inflammatory known-untruths.
comment by jmh · 2019-05-28T12:14:18.994Z
As I was reading, one old saying kept popping into my head: better to say nothing and be thought a fool than to open your mouth and prove yourself a fool (or something close to that).
That does seem to be the sort of view being argued against here, and I think justifiably so in many ways. It's a gray area with no real hard lines, in my opinion.
A couple of thoughts, though. Typically we have facts and knowledge (less wrong) but hardly anything approaching complete knowledge. We will always have many opportunities to apply what we know to new areas. In some cases others have been there before, so we reinvent the wheel to some extent, which is okay. We might get pointed to that literature, but we should really not be chastised for doing original thinking ourselves even if we get it a bit wrong -- that's how we all learn.
In other cases it may well be new territory, and that means purely subjective for some period of time. The statistical testing of models, and the model development, are just ways of trying to figure out how to do something useful in a new area with what we already know. This is really where new knowledge arises, to my thinking.
Perhaps the calculation should not be the ratio of what someone gets right to what they get wrong, but rather what value came from what they got right (with some consideration for any costs from what they got wrong). It's probably safe (conservative?) to say that the world is better off due to the risk takers and not the conservative thinker wanting to be more right than wrong.
comment by Donald Hobson (donald-hobson) · 2019-05-25T12:12:36.842Z
Let's consider the different cases separately.
Case 1) Information that I know. I have enough information to come to a particular conclusion with reasonable confidence. If some other people might not have reached the conclusion, and it's useful or interesting, then I might share it. So I don't share things that everyone knows, or things that no one cares about.
Case 2) The information is available, but I have not done research and formed a conclusion. This covers cases where I don't know what's going on because I can't be bothered to find out. I don't know who won sportsball. What use is there in telling everyone my null prior?
Case 3) The information is not readily available. If I think a question is important and I don't know the answer already, then the answer is hard to get. Maybe no one knows the answer; maybe the answer is all in jargon that I don't understand. For example, "Do aliens exist?". Sometimes a little evidence is available, and speculative conclusions can be drawn. But is sharing some faint wisps of evidence, and describing a posterior that's barely been updated, saying wrong things?
On a societal level, if you set a really high bar for reliability, all you get is the vacuously true. Set too low a bar, and almost all the conclusions will be false. Don't just have a pile of hypotheses that are each at least p likely to be true, for some fixed p. Keep your hypotheses sorted by likelihood. A place for near certainties. A place for conclusions that are worth considering for the chance they are correct.
Of course, in a large answer space, where the amount of evidence available and the amount required are large and varying, the chance that both will be within a few bits of each other is small. Suppose the correct hypothesis takes some random number of bits between 1 and 10,000 to locate. And suppose the evidence available is also randomly spread between 1 and 10,000. The chance of the two being within 10 bits of each other is about 1/500.
This means that 499 times out of 500, you assign the correct hypothesis a chance of less than 0.1% or more than 99.9%. Uncertain conclusions are rare.
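That 1/500 figure is easy to sanity-check with a quick Monte Carlo simulation. This is a sketch under the comment's own assumptions (two values drawn uniformly and independently from 1 to 10,000, counting how often they land within 10 of each other); the function name is invented for the example:

```python
import random

def p_within(n_max=10_000, window=10, trials=200_000, seed=0):
    """Estimate the chance that two values drawn uniformly and
    independently from 1..n_max land within `window` of each other."""
    rng = random.Random(seed)
    hits = sum(
        abs(rng.randint(1, n_max) - rng.randint(1, n_max)) <= window
        for _ in range(trials)
    )
    return hits / trials

print(p_within())  # roughly 0.002, i.e. about 1/500
```

The exact probability works out to about 21/10,000 ≈ 1/476, so "about 1/500" checks out.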
comment by Slider · 2019-05-25T03:44:02.210Z
Sometimes it's critical that you fail, and you don't really progress before you do. In that kind of situation it can be more a question of whether you fail early or late.
An agent that is committed to learning from their mistakes always gains new capabilities when catching mistakes. Basking in how much knowledge you already have has little to no impact on ability gain, so it's pretty much a waste of time outside of emotional balancing.
Sometimes it's very important to be genuine instead of accurate. Being honest means people know to relate to you according to your true state. If you are clueless, people know to give you information. If you are on target, people might not burden you with noise.