Overconfident talking down, humble or hostile talking up

post by ozziegooen · 2018-11-30T12:41:54.980Z · 19 comments

Contents

  A Distribution of Knowledge
    Claim 1: It’s easy to judge where on the curve people are who are lower than you.
    Claim 2: It’s difficult to judge where on the curve people are who are higher than you, absent sophisticated systems to support this.
  Overconfident talking down, humble or hostile talking up
  The Economics of Knowledge Signaling

A Distribution of Knowledge

If one were to plot the distribution of the amount of knowledge different people have about, say, macroeconomics, I would suspect it to be roughly lognormal: it would have tails at both ends, but be heavily skewed to the right. Most people have almost no knowledge of macroeconomics, some have a bit, and then there is a long tail of fewer and fewer people who make up the experts.

The above graph doesn’t exactly resemble what I’d expect for macroeconomics, but it serves as a rough heuristic. The large numbers represent successive halvings of the remaining percentile (3/4th, 7/8th, 15/16th, etc.).[1]
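
To make the scale concrete, here is a minimal Python sketch. It is my own illustration, not from the post, and it assumes that level +k marks the 1 - 1/2^k quantile of a lognormal knowledge distribution, so +2 sits at the 75th percentile, +3 at the 87.5th, and so on:

    import numpy as np

    # Toy model (an assumption for illustration, not the post's data):
    # knowledge about a topic is drawn from a lognormal distribution.
    rng = np.random.default_rng(seed=0)
    knowledge = rng.lognormal(mean=0.0, sigma=1.0, size=1_000_000)

    def level_to_quantile(k: int) -> float:
        # Each +1 level halves the remaining share of the population:
        # +2 -> 3/4, +3 -> 7/8, +4 -> 15/16, ...
        return 1.0 - 0.5 ** k

    for k in range(2, 6):
        cutoff = np.quantile(knowledge, level_to_quantile(k))
        print(f"+{k}: top {0.5 ** k:.2%} of people, knowledge above {cutoff:.2f}")

Under this assumption, each +1 step halves the remaining share of the population, which is why footnote 1 compares the scale to log-odds.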

I’m going to posit the following claims:[2]

Claim 1: It’s easy to judge where on the curve people are who are lower than you.

Claim 2: It’s difficult to judge where on the curve people are who are higher than you, absent sophisticated systems to support this.

Given these, let’s imagine a few situations:

  1. Say you’re the local economist for a state government. There are some actions you would really like the government to take, even though your colleagues wouldn’t typically approve. You’re around a +4 on the macroeconomic scale, and your most knowledgeable colleagues are around a +2. Could you get away with pretending that macroeconomics, as a field, takes a very confident stance that happens to align with what you want to see happen? How would you do so?
  2. You’re a local radio intellectual who’s a +3 on the macroeconomic scale. Almost all of your listeners are below a +1.5. You’ve been essentially lying to them for some time about macroeconomic theory because it helps your political message. A professor who’s a +5 starts writing a few articles that call you out on your lies. How do you respond?
  3. You’re a college student who’s a +3 on the macroeconomic scale. Your professors are all a good deal higher and will be the primary ones evaluating your studies. You want to legitimately learn macroeconomics. How do you treat your professors?

Overconfident talking down, humble or hostile talking up

I think the answers I’d expect to these questions can be summarized in the phrase “Overconfident talking down, humble or hostile talking up.”

When you’re communicating with people who know less than you, and you have little accountability to people who know more, you generally have the option of claiming to be more knowledgeable than you are and lying in ways that are useful to you.

When you’re communicating with people who know more than you, you have two options. You can accept their greater state of knowledge, which leads you to speak more honestly about the pertinent topics. Or you can reject their credibility, claiming that they don’t really know more than you. Many of the people who know less than both of you may believe you over them.

There are many examples of this. One particularly good one may be the history of schisms in religious organizations. Religious authorities generally know a lot more about their respective religions than the majority of their followers do. Each authority has a choice: they can either accept the knowledge of the authorities above them, or reject those authorities. If they reject the authority above them, they are incentivized to discredit it and to express overconfidence in their own new beliefs. If they succeed, some followers will believe them, giving them both the presumption of expertise and the flexibility of not being accountable to other knowledgeable groups. If they defect from their previous authorities and fail, they may wind up in a very poor position, so after defecting it's very important for them to ensure that their existing audience gives them full support.

The Economics of Knowledge Signaling

In slightly more economic terms, one could say that there are strong signals going up the chain of knowledge (from the nonexperts to the experts) and weak signals going down it. The market for knowledgeable expertise is one with relatively low transparency and the usual incentives to lie and deceive, similar to a market for lemons.

I'm not claiming that all of this overconfidence and discrediting is knowingly dishonest.[3] I'm also not claiming that any of this is original; much of it is quite obvious, and parts have definitely been studied. That said, I do get the impression that the science of signaling is still pretty overlooked (much of my thinking here comes from Robin Hanson), and this is one area I think may not be well understood as a holistic economic system.

Finally, I'm reminded of the old joke:

"Once I saw this guy on a bridge about to jump. I said, “Don’t do it!” He said, “Nobody loves me.” I said, “God loves you. Do you believe in God?”
He said, “Yes.” I said, “Are you a Christian or a Jew?” He said, “A Christian.” I said, “Me, too! What franchise?” He said, “Protestant.” I said, “Me, too! Northern Baptist or Southern Baptist?” He said, “Northern Baptist.” I said, “Me, too! Northern Conservative Baptist or Northern Liberal Baptist?” He said, “Northern Conservative Baptist.” I said, “Me, too! Northern Conservative Baptist Great Lakes Region, or Northern Conservative Baptist Eastern Region?” He said, “Northern Conservative Baptist Great Lakes Region.”
I said, “Me, too!” “Northern Conservative Baptist Great Lakes Regions Council of 1879 or Northern Conservative Baptist Great Lakes Region Council of 1912?” He said “Northern Conservative Baptist Great Lakes Council of 1912.” I said, “Die, heretic!” And I pushed him over.

One may wonder what incentives lead to such heartfelt yet predictably frequent divisions.


  1. This is similar to the log-odds scale. Standard deviation could also be used, but I find it a bit unintuitive, especially for non-normal distributions.
  2. These claims mostly come from lots of anecdotal evidence, some general reasoning, and my memories of a few research studies. I’ve spent around 40 minutes attempting to locate useful studies for this post, but haven’t found them, though I’m quite sure I remember reading about related ones several years ago. If you have any recommended links, please post them in the comments.
  3. The first few chapters of The Elephant in the Brain go into this.

19 comments


comment by Noah Walton (noah-walton) · 2018-12-02T17:55:04.028Z
When you’re communicating with people who know more than you, you have two options. You can accept their greater state of knowledge, which leads you to speak more honestly about the pertinent topics. Or you can reject their credibility, claiming that they don’t really know more than you. Many of the people who know less than both of you may believe you over them.

A third option is to claim epistemic learned helplessness. You can believe someone knows more than you, but reject their claims because there are incentives to deceive. It's even possible to openly coordinate based on this. This seems like something I've seen people do, maybe even frequently. I can't think of anything specific, but one method would be to portray the more knowledgeable person as "using their power [in the form of knowledge] for evil".

comment by ozziegooen · 2018-12-02T19:50:18.104Z

It's a good point.

The options are about how you talk to others, rather than how you listen to others. So if you talk with someone who knows more than you, "humble" means that you don't act overconfidently, because they could call you out on it. It does not mean that you aren't skeptical of what they have to say.

I definitely agree that you should often start out skeptical. Epistemic learned helplessness seems like a good phrase; thanks for the link.

One specific area where I could see this coming up is when you have to debate someone you are sure is wrong but who has far more practice debating. They may know all the arguments and counter-arguments, and would destroy you in any regular debate, but that doesn't mean you should trust them, especially if you know there are better experts on the other side. You could probably find great debaters on both sides of any controversial topic.

comment by Shmi (shminux) · 2018-12-01T01:52:41.654Z
Claim 2: It’s difficult to judge where on the curve people are who are higher than you, absent sophisticated systems to support this.

Eliezer mentioned something like this in The Level Above Mine.

comment by Darmani · 2018-12-01T05:55:55.137Z

And Paul Graham in Beating the Averages: http://www.paulgraham.com/avg.html

comment by benwr · 2018-11-30T17:19:55.308Z

This nicely explains why I feel so embarrassed when I learn that someone I'm talking with is more knowledgeable than I thought. I wonder how to avoid subconscious overconfidence- / humility-projecting.

It might work to add a TAP for thinking "if this person were much more/less knowledgeable than me, would I have the same presentation in this conversation?"

comment by ozziegooen · 2018-11-30T17:37:00.473Z

That's a good point. My communication changes a lot too, and it's one reason why I'm often reluctant to explain ideas in public rather than in private; it's much harder to adjust the narrative and humility level.

comment by ozziegooen · 2018-11-30T17:53:31.940Z

To be a bit more specific: I think there are multiple reasons why you would communicate in different ways with people at different levels of knowledge. One is that you can "get away with more" around people who know less than you. But another is that you would expect people at different parts of the curve to know different things and talk in different ways, so if you optimized purely for their true learning, the results would be quite different.

comment by rk · 2018-12-01T13:35:05.801Z

As I read through, the core model fit well with my intuition. But then I was surprised when I got to the section on religious schisms! I wondered why we should model the adherents of a religion as trying to join the school with the most 'accurate' claims about the religion.

On reflection, it appears to me that the model probably holds roughly as well in the religion case as in the local radio intellectual case. Both of those are examples of "hostile" talking up. I wonder if the ways in which those cases diverge from pure information sharing explain the difference between humble and hostile.

In particular, perhaps some audiences are looking to reduce cognitive dissonance between their self-image as unbiased on the one hand and their particular beliefs and preferences on the other. That leaves an opening for someone to sell reasonableness/unbiasedness self-image to people holding a given set of beliefs and preferences.

Someone making reasonable counterarguments is a threat to what you've offered, and in that case your job is to provide refutation, counterargument, and discrediting, so that it is easy for that person's arguments to be dismissed (through a mixture of claimed flaws in their arguments and claimed flaws in the person promoting them). This would be a 'hostile' talking up.

Also, we should probably expect to find it hard to distinguish between some hostile talking-ups and overconfident talking-downs. If we could always distinguish them, hostile talking up would be a clear signal of defeat.

comment by ozziegooen · 2018-12-01T13:59:22.600Z

Good points.

I would definitely agree that people are generally reluctant to blatantly deceive themselves. There is definitely some cost to incorrect beliefs, though it can vary greatly in magnitude depending on the situation.

For instance, suppose all of your friends go to one church, and you start suspecting your local minister of being less accurate than others. If you actually don't trust them, you could either pretend you do and live as such, or be honest and possibly have all of your friends dislike you. You clearly have a strong motivation to believe something specific here, and I think incentives generally trump internal honesty.[1]

On your last point, I don't think that "hostile talking up" is what the hostile actors want to be seen as doing :) Rather, they would be trying to make it seem like the people previously above them are really below them. To them and their followers, they appear to be at the top of their relevant distribution.

1) There's been a lot of discussion recently about politics being tribal, and I think it makes a lot of pragmatic sense.

comment by Pattern · 2018-11-30T18:47:58.716Z

In your post, after where it says:

Given these, let’s imagine a few situations:

There is a list that's numbered:

1.

1.

1.

instead of:

1.

2.

3.

comment by ozziegooen · 2018-11-30T18:51:52.908Z

Dangit, fixed. I switched between markdown and the other format a few times; I think that was responsible.

comment by Decius · 2018-12-06T13:36:50.228Z

I don't think it's inherently difficult to tell the difference between someone who is speaking N levels above you and someone who is speaking N+1 levels above you. The one speaking at a higher level is going to expand on all of the things they describe as errors, giving *more complex* explanations.

The difficulty is that it's impossible to tell if someone who is higher level than you is wrong, or telling a sophisticated lie, or correct, or some other option. The only way to understand how they reached their conclusion is to level up to their level and understand it the hard way.

There's a related problem, where it's nigh impossible to tell if someone who is actually at level N but speaking at level N+X is making shit up completely unless you are above the level they are (and can spot errors in their reasoning).

Take a very simple case: A smart kid explaining kitchen appliances to a less smart kid. First he talks about the blender, and how there's an electric motor inside the base that makes the gear thingy go spinny, and that goes through the pitcher and makes the blades go spinny and chop stuff up. Then he talks about the toaster, and talks about the hot wires making the toast go, and the dial controls the timer that pops the toast out.

Then he goes +x over his actual knowledge level, and says that the microwave beams heat radiation into the food, created by the electronics, and that the refrigerator uses an 'electric cooler' (the opposite of an electric heater) to make cold that it pumps into the inside, and the insulated sides keep it from making the entire house cold.

Half of those are true explanations, and half are bluffs, but someone who barely has the understanding needed to verify the first two won't have the understanding needed to refute the last two. If someone else corrects the wrong descriptions, said unsophisticated observer would have to use things other than the explanations themselves to determine credibility (in the toy cases given, a good explanation could level the observer up enough to see the bluff, but in the case of +5 macroeconomics that is impractical). If the bluffing actor tries to refute the higher-level true explanation, they merely need to bluff more; people of a high enough level to see the bluff /already weren't fooled/, and people of a lower level see the argument settle into an equilibrium or cycle isomorphic to all parties saying "That's not how this works, that's not how anything works; this is how it works", and can only distinguish between them by things other than the content of what they say (bias, charisma, credentials, tribal affiliation, and verified track records are all within the Overton window for how to select whom to believe).

comment by John_Maxwell (John_Maxwell_IV) · 2018-12-04T01:01:37.274Z

Here's an old thread on the EA forum about how to assess expertise.

comment by cousin_it · 2018-11-30T15:11:57.881Z

I guess this applies mostly to politicized topics? I also like Robin's advice to "pull the rope sideways".

comment by ozziegooen · 2018-11-30T17:35:24.276Z

Perhaps, if you have a broad definition of politicized. To me this applies to many areas where people are overconfident (which happens everywhere): lots of entrepreneurs, academics, "thought leaders", and all the villains of Expert Political Judgment.

To give you a very different example, take a tour guide in San Francisco. They probably know way more about SF history than the people they teach. If they happen to be overconfident for whatever reason, no one is necessarily checking them. I would imagine that if they ever gave tours to SF history experts, their stated level of confidence in their statements would be at least somewhat different.

comment by ChristianKl · 2018-11-30T17:04:11.712Z

How about adding units to the graph? What's the X axis and what's the Y axis?

comment by ozziegooen · 2018-11-30T17:26:47.304Z

It's a frequency distribution ordered by amount of knowledge on the topic. The Y axis for such a distribution is frequency, but the units aren't very useful here (the shape is the important part, because it's normalized to sum to 1).

comment by rpcrpcrpc · 2018-12-01T09:10:43.543Z

I'm generally hesitant to get into this line of thinking (and others like it) because knowledge is such a thoroughly multi-dimensional space, and the ends people are looking to move towards with these kinds of models usually aren't terribly realistic.

I think the true answer is that it's both hard to know what anyone knows about a given field and it also very rarely matters. It reminds me of the talk "Superintelligence: The Idea That Eats Smart People" -- there's a habit among intellectuals, academics, and learned professionals (usually in that order) to get so caught up in their work that they think it intrinsically matters, when, really, nothing does (at least not to everyone).

You can be very "knowledgeable" in a field, double down on the wrong side of a schism, and then see years of your work become nearly worthless when your mental framework is empirically proven wrong. That work might also turn out to be useful again decades later for secondary or even unrelated reasons; when and where are you more or less knowledgeable than your peers here?

And to circle back to the Superintelligence talk: we as humans are very adept at finding ways to survive and thrive despite all kinds of uncertainty and threats, and one of the best tools we have for that is ignoring things until they're a major problem for us. In your radio intellectual example, I'd put forward that those kinds of situations arise because the presence or absence of such figureheads (or demagogues) doesn't generally matter to most people most of the time. When such people become burdensome and overbearing in their demands, they are ousted--entire governments have been bloodily overthrown from within and without for such reasons. That feels inefficient to the person who thinks such fields and their heads matter, but it's generally good enough.

My last point would just be that if it's really hard to know how much more knowledgeable than you someone is, how can you have confidence that someone knows more about specific sub-niche X than you, and not just more about overall field Y? Einstein probably knew more about physics on the whole than just about anyone outside of a group that could fit into a single lecture hall, but if he looked at a suspension bridge's plans and wanted to "make corrections", I'd probably stick with a seasoned civil engineer unless they both agreed on review. The engineer would probably know more about the physics of suspension bridges in their home country than Einstein; if the latter were able to convince me otherwise, that would be a question of social status and political skill in general.

comment by ozziegooen · 2018-12-01T12:54:51.468Z

In response to your last point: I didn't really get into differences between similar areas of knowledge in this post; it definitely becomes a messy topic. I'd definitely agree that for "making a suspension bridge", I'd look to people who seem to have knowledge of "making suspension bridges" rather than knowledge of "physics, in general."