I don't want to listen, because I will believe you

post by George3d6 · 2020-12-28T14:58:34.952Z · LW · GW · 6 comments

This is a link post for https://cerebralab.com/I_dont_want_to_listen,_because_I_will_believe_you

Contents

  I - We don't learn basic things rationally
  II - Information environment
  III - Oh god, is this really bad?
  IV - Reprise and synthesis

I often encounter a seemingly implicit belief that listening to someone's opinion or idea is something that can be done in exchange for merely the time spent doing so. This idea seemed so obviously true when I glanced upon it in my mind that I was almost certain it was wrong.

I - We don't learn basic things rationally

We sometimes model ourselves as rational actors, assimilating information to form beliefs and ideas. Under this model, we might think of ourselves as having some priors about facets of reality, listening to someone and evaluating the logical consistency of their ideas, then contextualizing those ideas based on our priors, looking into the supporting evidence (if any), then finally updating our priors and creating new ones.

Take as an example:

A: We should introduce a carbon tax in order to reduce our negative impact on the environment.

B: Hmh [I do have a fairly strong prior that industries releasing carbon-containing gases affect our environment for the worse. I also have a fairly strong prior that taxing a behavior disincentivizes it. But I also have a fairly strong prior that taxing industry is a very complex process, thus we need fairly solid data to have a high certainty about a given taxation strategy]. What you're saying seems reasonable, has any country experimented with such a tax?

A: Oh yes, dozens have, including large countries with reliable data collection, such as Sweden, Argentina, and Canada.

B: That's curious. I'll look into the data and see if and how said carbon tax was correlated with air quality, the replacement of carbon-polluting industries with more environmentally friendly forms of industry, and government investments into air-cleaning projects. [proceeds to investigate, then alters his beliefs based on the evidence]
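To make the idealized model concrete, here's a minimal sketch of the kind of Bayesian update B is supposedly performing. The numbers and the helper function are my own illustrative assumptions, not anything from the dialogue above:

```python
# A minimal sketch of the idealized belief update B performs above,
# using Bayes' rule. All numbers are illustrative assumptions.

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return the posterior P(H|E) via Bayes' rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# H: "a carbon tax reduces negative environmental impact"
prior = 0.5  # B starts out genuinely unsure

# E: data from countries that introduced the tax shows improved air quality.
# Suppose B judges this evidence twice as likely under H as under not-H.
posterior = update(prior, p_e_given_h=0.6, p_e_given_not_h=0.3)
print(f"P(H|E) = {posterior:.2f}")  # -> 0.67
```

The point of the sketch is only that each new piece of evidence nudges the belief by a bounded, reasoned amount; the rest of this section argues that this is not what actually happens.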

As far as my limited understanding of human psychology goes, I find no reason to believe the above is anywhere close to our default modus operandi.

The most closely related and scientifically measurable evidence here comes from a multitude of phenomena that fall under the umbrella term priming.

Priming our brains for specific patterns of thought seldom implicates what we'd call "reason" or even conscious thinking. Bring someone to a party where everyone speaks Spanish and they'll be primed to understand Spanish; they will still "understand" other languages, of course, but the context switching required to communicate in them will require a bit of time and effort.

I think the Wikipedia article is a good entry point, but if you want more specific examples of irrefutable priming effects, I think the strongest you'd find are those under the umbrella of repetition priming. Areas critical to our everyday life where repetition priming is an important driver of cognition are reading and face recognition. But you're likely to find repetition priming plays a useful role in movements, navigation, and memory. It constitutes the basis of large swaths of neuroscience.

Even going beyond priming, I think there's at least contentious evidence that once we hear a lie, even if it's debunked later, even if the teller then states it's a lie, even if it's prefaced as a lie... we are still quite likely to remember it as a real fact about the world.

The experiments here are a bit shabby, and I'd point you to an amazing SSC article that digs into the issue further (read up to chapter III for the relevant bits).

However, I will remind you that the researchers mentioned above are trying to determine whether or not the truth is more powerful than fiction when it's known as such. The only takeaway I want is that we are far from perfect at classifying past information as true or false, even if we "know" its truth value with 100% certainty.


Moving into the realm of hopefully-shared experiences, think about the process of language learning and schooling children.

It's quite obvious that "reason" is not very much implicated in language acquisition. Yes, it might serve to iron out grammatical details or to acquire the ability to form complex syllogisms in a language, but conscious reasoning is unrelated to the basics of learning to speak and understand. I can't tell you "how" I learned English the same way I can tell you "how" to find the area of a triangle, or "how" the Harvard computer architecture works. Even if I can come up with post-factum rules based on how I and others speak and write, I'm still not getting at the deepest part of language learning, the association of sounds with complex internal concepts.

Take the words:

Seeing these 3 constructs on the page will trigger search processes which might return associated words, events, images, feelings, etc. We have close to zero conscious control as to whether or not these thought processes begin upon seeing the words, or as to what they end up returning, what the words imprint into our thinking.

Finally, think back to the things that you were told in school. Not things you learned, not things you reasoned about, just things that you happened to read or be told. Maybe it's the history of your country, some famous book, or some facts about the natural world. These things likely have an impact upon your life, whether you like it or not, if you happen to stumble upon questions or concepts that relate to them for which you have no better reference.

If I am told in class about WW2 and the Nazis, or about Stalin and his secret services, my mind will be guided to those if I think "what's a very bad person" or "what's an example of a violent conflict between countries". This will happen long before I have enough training in ethics to understand why genocides are "worse" than voluntarily letting millions of people in Africa die from disease and starvation.

If I read Romeo and Juliet, hearing or thinking about love or dedication or family bonds or suicide might harken me back to the text. Unless I happen to also read a bunch of other things related to those concepts, my mind will anchor to the closest bit of information it can remember.

If I am told about Democracy being a system of governance that provides freedom and which came about as the peak solution to running a country based on enlightenment thinking, I am likely to remember that association.

This is not to say that the examples we are told about in school are incorrect or unideal.

It might be that Romeo and Juliet is a great template for romantic relationships (barring the suicide bit), it might be that Stalin and Hitler are great examples of mass-murderers, it might be that democracy is something that brings freedom and prosperity. I'm fairly convinced at least the latter two are true.

But I don't remember having "reasoned" to those conclusions. I didn't read 100 romance novels to compare Romeo and Juliet against. I didn't research 100 mass-murderers to compare Stalin against. I didn't read up on 100 alternative systems of governance and their effects upon freedom or prosperity.

Some people with interest in those specific topics might have, much like I read up on, say, mathematics and computer science. But nobody has the time to become an expert in everything, so we are stuck with uncritically acquired information in a lot of areas.

II - Information environment

Hence I think our information environment might dictate a lot of what we believe. This is obvious when we are kids and have comparatively few and comparatively weak priors, as well as lack most of the "reasoning" ability we have now. However, based on my understanding of priming, language interpretation, and my own experience, I'm fairly secure in speculating our information environment might be fairly relevant to how our beliefs change in adulthood. We're probably less susceptible to our information environment as adults, but we are less susceptible to learning new things in general as we age.

In particular, I think the information environment we expose ourselves to is very prone to creating beliefs in areas where we have otherwise weak priors.

If you look at most areas where quackery seems to prosper, they are those not well covered by science. Or even better, they concern themselves with things that are so subjective or poorly defined that no two men can agree on their exact properties, thus forever shrouding themselves in the impermeable armor of vaguery.

Tea-leaf readers, numerologists, astrologers, and other such charlatans make "predictions" about amorphous concepts such as "love" and "happiness" and "prosperity". Homeopathy and crystal healing address "improving the immune system" and "increasing energy levels", not curing a very specific form of cancer or treating malaria. Religion concerns itself with concepts such as good, evil, desert, and what happens to consciousness once we die.

Note: Some readers might not get the half-joke if I include cosmology and nutritional epidemiology in the above list, so I refrained from doing so

It's hard to obtain good priors in those areas, so when in need of a belief, left empty-handed by any serious scientific endeavor, our brain might turn itself to quackery. Not for lack of reasoning skills, but for the very same reasons it might turn itself to Romeo and Juliet when thinking of love at first sight, or the same reason it subconsciously affixes certain concepts to the word "MATTRESS", because our brains tend to prefer an answer, a "something", anything, to lack of knowledge.


But chief among all examples is advertising, an industry worth hundreds of billions of dollars that concerns itself mainly with some form of efficiently infecting us with false or skewed beliefs.

Granted, some ads amount to "Here's a problem you thought unsolvable; here's a solution to it", but most advertising is chiefly focused on seeding skewed beliefs rather than informing.

Ads are the prime example of thought tumors. They embed themselves into our minds, and we have little option for stopping them once they've been seen. By design, they are made to avoid the filter of reason.

Most people seem to have some understanding that ads are "bad", but they shelve this under "ads are annoying" or "my data privacy is important", not under "this is seeding false ideas into my brain and will make my map of the world less accurate, potentially snowballing into catastrophic failures of reason".

I've mainly addressed the information environment which we ourselves construct. We have a choice in installing an adblocker, we have a choice in not consuming media where "we are the product", we have a choice in not reading the blogs of quacks, we have a choice in not listening to the random garble of everyone on the internet.

But the worst of problems comes from the information we are exposed to by our social and physical environment.

For example, Tokyo has a very art-nouveau aesthetic for me, with its huge neon billboards, every square centimeter of space covered in some brightly colored poster, and ad videos blaring out from screens lodged in every nook and cranny. But that's because I can't read Japanese; I assume that, if I were able to, it would be maddening... then again, most residents seem to adopt a "headphones in + head down" posture 90% of the time, so I guess that might be a viable coping strategy (as far as I've seen, the ads seem to be fighting back by encroaching on the pavement).

Or what about the people we hang out with? At least, assuming you're a person with a fairly diverse friend group.

The "problem" with friends is that one can't really ignore them as one would an ad or news report on TV. Sure, you can get away with saying "look, this is, like, a very very boring subject and I don't want to talk about it", but if half of what someone wants to address relies on something you consider to be "likely bullshit", then there's no getting away from discussing it, it's that or toning down the friendship.

One idea that some seem to rally behind is that of "just listening" to someone, the idea that sometimes people just need an empathic fleshboard to vent upon. That would be all well and good if we were purely rational actors, but we aren't. Every time someone tells us something, especially a friend, i.e. someone we trust, a small part of our brain now believes it. Say the words and trigger the sensory cues and arrange the internal monologue in just the right order, and even the most absurd of thoughts you ever encountered might pop out.

To give a quick example of what I'm talking about, remember the carbon tax discussion? This is how I really imagine that sort of discussion:

A: We should introduce a carbon tax in order to reduce our negative impact on the environment. [Because someone told me so in a video on YouTube, and now it feels good to have a new piece of information and my brain instinctively wants to share it]

B: Hmh [I have no idea what those terms are; I've heard of "global warming" and "carbon causing climate change" a few times, so I guess they have to do with that amorphous cloud of concepts]. So you're saying [retrieve catchphrases] it's a way to fight global warming!?

A: Yeah, I think so [because I share that same blob of concepts where climate change == global warming == environment impact and where anything that affects one positively affects the other 2 the same way]

B: Oh, ok, I guess that's cool. [I will now proceed to do no further research on this because the topic doesn't interest me one bit, but whenever someone brings up environmental issues or climate change, the concept of a carbon tax, even though I'm not sure what it implies, will now pop to mind, potentially with a nice ring to it]

III - Oh god, is this really bad?

If you're with me thus far, hopefully we can agree on the broad picture painted above. If so, a few uncomfortable implications follow.

For one, the critique above seems to be damning many forms of fiction. I recently finished reading "Unsong", which, in case you aren't familiar, is a book premised on, among other things, kabbalah being real and the Names of God holding actual power.

This book includes a surprisingly large swath of the nonsensical ideas one might encounter in the real world. And it makes them seem compelling: it builds characters around these ideas that you get attached to, it weaves an amazing story around them, its ending is so touching it made me listen to American Pie 20 or 30 times over the next few days, and it brings joy and causes interesting and seemingly profound thought during and after its reading.

Daniel Dennett once compared religion to music and said that, if we found out music led to dangerous or violent behavior, we ought to discourage and ban it. This is an interesting analogy, and it served to show me a point where I break with traditional "rationalism", in that I truly believe some forms of art might have close to intrinsic value (intrinsic not in a "supernatural" sort of way but in a "Critique of Pure Reason" sort of way). I feel this way about music, and I also feel this way about some fiction, be it The Lord of the Rings or the Foundation series.

Looking at information as "harmful until proven true" leaves fiction on a perilous cliff edge, marks it off as a dirty pleasure.


Perhaps worse, this viewpoint means that certain forms of discussion and debate are dangerous. I was always of the opinion that one should, as long as they are bored, try to reasonably discuss with whoever crosses their path on the internet... in a collaborative and non-aggressive sort of way, mind you. The more misinformed the person seems, the more likely at least one of you stands to learn from that conversation.

But if we view information as harmful until proven otherwise, "getting to someone's level" could be dangerous, unless you have above-average certainty that "their level" is in an epistemically superior place to yours.


Another thing this perspective casts a shadow upon is acquiring new perspectives (no pun intended), for it seems likely that we're especially vulnerable to learning wrong things about subjects we've had little or no prior exposure to.

On the whole though, this "harmful before proven innocent" approach to information is most damaging because, if taken seriously, it provides a way to rationalize being stuck with outright psychotic beliefs.

On one hand, we've got the behavioral shifts this approach demands: shutting out unvetted sources, refusing to "just listen" to friends, and narrowing the range of people and media we engage with.

On the other hand, we've got its promised benefit: some protection against invasive, incorrect beliefs taking root.

Now, if you look at the downsides on the first hand, you will note they seem to describe a kind of cultish behavior one would associate with radical political movements and religion. I think this vague pattern matching is enough to scare me away from them.

But looking even further, accepting behavioral shifts of this sort might shift a life away from "one worth living" under many common systems of value, mine included.

Finally, depending on your starting point and the way the world progresses, "being stuck with delusional beliefs" might mean "thinking marmite tastes bad" or "thinking that a hunt for bigfoot is the best way to pursue happiness and success in life". While we might believe ourselves to be in the former case, with few and mild delusional beliefs, the intensity of that belief is probably tightly correlated with how delusional we actually are.

Granted, I do believe there are practical ways to probe for delusional beliefs, but I digress, that's a topic for another article.

IV - Reprise and synthesis

To summarize what I've proposed thus far, I've outlined what I think is an often unspoken but in hindsight obvious proposition: That the information you expose yourself to, no matter what your reason "thinks" or would think about it, might sometimes (often?) embed itself into your mind and set or even affect priors in wholly unreasonable ways.

I concluded that "protecting" yourself from coming into contact with very suboptimal or outright wrong sources of information is impossible without seemingly becoming a soulless, uncaring robot. Even then, unless you had really good priors and a map that covered a lot of territory, things might not work out.

But there might still be things that we can do to protect ourselves from some or even most such invasive incorrect beliefs without becoming an asshole or halting our engagement in public forums.

So allow me to walk close to the edge of the abyss I'm preaching caution about, by speculating about some potential solutions.


First, there's the obvious stuff, which I'm sure everybody must be doing until I realize many seemingly brilliant people aren't:

- Install an ad blocker and turn off background information dumps (TV news, radio, autoplaying videos).
- Avoid discussions that heavily relate to sex and politics, because people tend to switch to full-on tribal-signaling and tribal-propaganda.
- Learn the "establishment" position and the arguments for it before learning the "wake up sheeple" position. On the whole, I think it's safe to assume establishment positions are better than the alternatives.


As an author, what can be done to reduce the "contamination" of your audience with unwarranted beliefs happily coincides, more often than not, with what many academics and no-bullshit non-fiction writers would consider common courtesy.

Don't make appeals to emotion too strongly or too often. That's not to say one's writing shouldn't contain some jokes or at least underhanded jabs here and there, nor that some topics don't require summoning up some measure of love, anger, fear, or awe.

But if a piece is heavily affective, this probably serves to lower the reader's guard. After all, it seems common sense that reason works best in a calm and somewhat detached state.

To that, I'd add frontloading evidence and citing negative findings. There's a very obvious line between being a complete bore and hiding your sources at the bottom of the page, under the ads, with no working links to them.

I think the rule of thumb should be to pique interest and very gently hint at, but not draw, any important conclusions in the abstract. Then frontload as much evidence as possible in the first section, in order to allow the reader to "disagree" with the principal pillars upon which your opinion stands early on. Ideally, give some slight encouragement for people to read the sources, which might be as simple as adding hyperlinks in the exact spot where you cite them. This way, if the reader disagrees with the evidence presented (or is very suspicious and lacks the necessary time to read it), they can stop reading and hopefully forget all about it.

Follow that by outlining as many as you can of the edge cases and shaky premises your reader must "grant" for the point to stand. Only then venture into the chain of reasoning, the conclusion, and anything that might flow from that.

One can even encourage the reader to reason by employing micro-constructs such as "question as premise => evidence => incomplete syllogism => finalize in the conclusion"... but this is easier said than done.

Finally, I'd argue that any suggestion based on your conclusion that hasn't been experimentally proven ought to go at the very end and be clearly demarcated as speculation... much the same way this writing guide is.

On the whole, I think it serves to engage in the kinds of communities where these kinds of rules are implicitly observed. You'll notice all of these guidelines are sliding scales, so just try to aim for as high as you can manage.


As a consumer of foreign ideas, besides what I wrote above, I think a few further habits might pay dividends.


I do concede that we have an amazing ability not to believe everything we hear. But I am in doubt as to its efficacy.

You can read a lot of fiction and never spare a second thought to the rules of thaumaturgic magic or warp-plasma drives when thinking about engineering in the real world, presumably thanks to this ability to compartmentalize "fictive information" away from "real information". But it's not at all obvious to me "how" I do this; I don't perform any rituals, nor do I feel an immediate shift in state of mind when I start consuming fiction.

It's obvious that this shift is not perfect. Again, referring to the mind-killer, you will note that most political shows are a mix of three things: factual reporting, obvious jokes and hyperbole, and fiction blended in with the facts.

You needn't even limit yourself to political shows; as far as I know, this is the case with most political discussion boards (e.g. the Reddit frontpage or subreddits of less popular political factions).

Maybe I'm wrong, though; I started religiously distancing myself from anything political in the last 2 years, to the point of ignorance, but looking back in memory I can't recall a single partisan political propaganda outlet that didn't include a mix of fiction and reality. I'd love to know if someone has actually collected data on this.

These political propagandists do label fiction as such; you can't legally say "The opposition leader climbed down into the basement of his castle and murdered our sacred cows, then proceeded to murder children"... that's libel, but not if prefixed with "it's like" or suffixed with "ha ha ha". But it doesn't seem like anyone watching them shelves this under "fiction".

I've decided not to trust this ability to separate fact from fiction unless I get a significant gain of happiness or "meaning" from doing so. So, as mentioned before, something like reading Tolkien is in; falling asleep to the sounds of a soap opera is out. I think the dangerous point here is when one chooses to consume fiction instead of partaking in the sensory experience of the real world, even if said fiction is not inspiring or enjoyable. I understand that many of you might think this to be a made-up behavior, but I've personally noticed it quite a lot, and I think we can catch ourselves doing it sometimes if we are mindful.


Last but not least, I think there's an ability one might be able to acquire: to hold things as "unknown" and be comfortable with that.

I think that this whole problem might be embedded into the broader issue that we are suckers for explaining and understanding, and prefer scraps of garbage over admitting to ourselves we can't know something.

I spoke about this more broadly in "Costs and benefits of metaphysics" and I think a whole article could be dedicated to modern philosophical skepticism and a subsequent prescription to accept some things just can't be known.

However, I should stop with the prescriptive part of this article here; it's the part I have the least confidence in.

6 comments

Comments sorted by top scores.

comment by Garrett Baker (D0TheMath) · 2020-12-28T18:22:21.156Z · LW(p) · GW(p)

This advice could be beneficial to a theoretical person who felt the need to talk & hear the points given by everyone they disagreed with, about every point of disagreement, and slightly less extreme versions of this person. I’m thinking about people like Joe Rogan here, who listen to everyone, and seemingly put very little effort into making sure the arguments given by such people are valid.

I, on the other hand, am very averse to discussing fundamental disagreements or reading about why I may be wrong. Such aversion makes it difficult for me to tell when the person I'm talking to is right about a particular topic, and makes me underestimate the benefits of knowing about their position. So I don't think this advice—that is, the advice about not talking to people you disagree with—is helpful for me, or people like me. Many of the recommendations listed (turning off background info dumps, having an ad blocker, and, to a lesser extent admittedly, staying away from political discussions) I already do instinctively & automatically.

Replies from: D0TheMath
comment by Garrett Baker (D0TheMath) · 2020-12-28T18:37:46.054Z · LW(p) · GW(p)

A good example of who we should strive to be like is Julia Galef, on her podcast Rationally Speaking. Here, she'll read several books about the topics to be discussed, then talk with her interviewees, keeping the epistemic bar very high. Asking about predictions their hypotheses have made in the past, unnecessary complexities which don't seem justified, and generally applying high-quality Bayesian rationality to the points given. Neither shying away from disagreement like I would, nor talking to people with niche ideas for the sake of talking to people with niche ideas like Joe Rogan would.

comment by Timothy Johnson (timothy-johnson) · 2020-12-30T12:35:45.399Z · LW(p) · GW(p)

I don't have the philosophical sophistication to explain this as clearly as I would like, but I think fiction is valuable to the extent that it can be "more true" than a literal history.

Of course, fiction is obviously false at the most basic level, since the precise events it records never actually happened. But it can be effective at introducing abstract concepts. And except for trivia competitions, the abstract patterns are usually what's most useful anyway.

The best analogy I can think of is lifting weights. Fiction is an artificial gym that trains our minds to recognize specific patterns, much as weight lifting uses artificial movements that target specific muscle groups.

Fiction works by association, which as you suggest is how our minds tend to operate by default already. So at a minimum, wrapping ideas in a fictional story can make them more memorable. For example, people who memorize decks of cards tend to use "memory palace" techniques that associate the cards with vivid events.

The knowledge we gain from reading fiction is largely subconscious, but for me the most important part is the ability to understand how people who are different from me will think or act. This can also inspire me to think and act more like the role models I've observed.

There are other purposes in reading fiction - some fiction is meant mainly for entertainment. But I think most of what people would consider classics aim to teach something deeper. Perhaps what you experience as meaningful in reading Lord of the Rings is related to this?

Of course, there is the danger that reading bad fiction will make you less informed than you would have been otherwise. And the fact that learning occurs mostly subconsciously exacerbates this problem, since it can be difficult to counter a faulty narrative once you've read it.

But fiction seems no more dangerous to me than any other method of getting information from other people. Even sticking strictly to facts can be misleading if the facts are selectively reported (as occurs frequently in politics).

I do need to think some more about your point about how exactly to distinguish what part of a story is fictional and what can be treated as true. I don't have a clear framework for that yet, though in practice it rarely seems to be an issue. Do you have an example of a time you felt misled by a fictional story?

Overall, I think my understanding of the world, and especially of people, would be greatly impoverished without fiction.

comment by TAG · 2020-12-28T21:08:46.072Z · LW(p) · GW(p)

The more intensely we listen, the more likely that information is to get embedded.

Why shouldn't intense listening be accompanied by critical appraisal?

comment by Vanilla_cabs · 2020-12-28T16:17:02.682Z · LW(p) · GW(p)

That we are unconsciously suggestible to information is a valuable point. Now, this begs the question: why do some people leave cultish beliefs after being raised in them? It seems we are not all equally suggestible in all circumstances. It then seems to me of the greatest importance to discover what prevents or accelerates unconscious validation of information: what circumstances, what character traits? What allows people to un-validate unconsciously validated information?

Note: Some readers might not get the half-joke if I include cosmology and nutritional epidemiology in the above list, so I refrained from doing so

Not to mention belief in the Egyptian god Apophasis.

Avoid discussions that heavily relate to sex and politics, because people tend to switch to full-on tribal-signaling and tribal-propaganda

These topics most of the time are a rational minefield with little to gain. Still, nothing says there isn't or won't be a time when it's worth going there. Maybe there will be a particularly strong incentive, maybe you will be trained enough in rationality so the costs to your sanity will be lower. In any case, there seems to be a risk if all the rational actors leave such important fields to everyone else.

Learn the "establishment" position and the arguments for it before learning the "wake up sheeple" position. On the whole, I think it's safe to assume establishment positions are better than random the alternatives.

Correction by me to avoid biased word.

Everybody can agree that there were historical situations in which this was the right thing to do, and others in which this was the wrong thing to do. So the question is: how to distinguish them?

Replies from: George3d6
comment by George3d6 · 2020-12-28T17:13:16.492Z · LW(p) · GW(p)

Correction by me to avoid biased word.

The word choice was intentional, since "alternative" is a loaded word. People don't think of an insane relative's Facebook post when you say "alternative to conventional medical knowledge".

Everybody can agree that there were historical situations in which this was the right thing to do, and others in which this was the wrong thing to do. So the question is: how to distinguish them?

There obviously were, but in aggregate the establishment opinion is likely the correct one, unless you have very good proof that you are surrounded by geniuses who can give you a better take.

Also, please note that I'm calling for "starting with" the establishment view, not limiting yourself to it.

That we are unconsciously suggestible to information is a valuable point. Now, this begs the question: why do some people leave cultish beliefs after being raised in them?

What's the chance those people still hold a lot of misguided beliefs, and that leaving the cult took (and still takes) a great deal of time spent reasoning through the false beliefs they were indoctrinated with?

See the first heading: I'm not claiming truth doesn't win over falsehood in the end (or rather that more probable beliefs don't win over less probable alternatives), only that significant mental energy must be spent for such a thing to happen, and we can't do it with every single belief we have.