Beware Superficial Plausibility

post by Celer · 2021-09-28T03:58:08.961Z · LW · GW · 13 comments

Andrew Wakefield was funded by people suing vaccine manufacturers to publish some extremely damaging research. With a sample size of twelve children, he “found” that the Measles-Mumps-Rubella vaccine caused autism, kicking off the modern anti-vax movement.

In most places where I spend time, anti-vaxxers are an easy target. They’re not just wrong: they’re harming people. And when we desist, it’s because we realize that spending time mocking people who are obviously wrong is bad for us (here’s Scott for an example [LW · GW]).

“You need to do your research”: Vaccines, contestable science, and maternal epistemology is a fascinating article because it’s a sympathetic view of the “critical, postmodern, and feminist perspectives on science and knowledge production” of anti-vax mothers. The author has a Ph.D. in Communications and an MA in English Lit, Critical Theory, and Women's Studies: she's sympathetic to their concerns and, when she talks about multiple sources of knowledge and contesting scientific explanations of the world, is working from a compatible intellectual tradition. Using quotes from her article, I’m going to try to demonstrate just how divorced reasoning processes can be from outcomes: people can land on very familiar views, views I expect many of my readers share, while drawing on painfully bad epistemology.

Epistemic status: this is a qualitative study with a sample size of 50 people. Worse, it’s someone sympathetically reporting these mothers’ framings to an audience, instead of giving us access to the raw data. Take everything here with several grains of salt.

Arguments maintained the ideal of expert knowledge but challenged particular claims to expertise

Arguments from participants were not based in a contention that participants had the right to be included in the technical sphere but rather suggested that the standards of the technical sphere had become too lax. This distinction is important because it challenges the commonplace assumption that vaccine refusers believe in pseudoscience as definitive. Instead, participants in this study lacked confidence in the quality of much published research on this topic in light of traditional scientific standards.

“My big thing is when, you know, they don’t do a double-blind research on it. So it’s not like they take a test group and give them this vaccine, and take another test group and not give them a vaccine because, from what I’ve read, they feel that would be unethical to deprive a child from the vaccine. So what they do is, they test it against some different form of vaccine that’s not even the same vaccine”

“And then, like, if you read the news, you see these studies like, “Oh, twenty year study on vitamins and supplements” and they really don’t do anything. So you see all this research and studies come out on other things, but you never see long term research studies come out on vaccines”

Participants suggested that, although scientific expertise exists, education and titles did not necessarily confer it.

These are very reasonable criticisms! Anyone paying attention should lack confidence in the quality of much published research, especially anything published before 2013. Moreover, they’re criticisms that, if they were levelled against something other than vaccines, I might nod along with. But it’s not just the conclusions that matter.

The reasoning that produces them is... less good. First, there’s a strong commitment to the idea of a process of discovery, not to actually identifying a unique truth.

Participants argued that it was not the conclusions of research that matter but, rather, the process of discovery. This distinction reflected the importance of doing one’s own research, which was overwhelmingly included in participants’ advice to other mothers. This advice communicates the importance of being informed and educated, but also reflected a contention that scientific research—and the conclusion it yields—is personal. In particular, the use of the possessive noun in advice regarding research was almost ubiquitous and, I argue, crucial to the meaning that mothers ascribed to the phrase. That is, the advice to other mothers was not to “do research” but to “do your own research” or “she needs to do her research.” The possessive pronoun both indicated and constituted a broader worldview in which research and science can yield multiple and contestable truths.

“I would tell her to do her own research, to educate herself, and to really know that it is her baby and it’s her choice. It is not her doctor’s choice, it is not her mom’s choice, it’s not her neighbor’s choice, and it is not her school’s choice.”

"Honestly, you can read all the things you want to read, you can, you know, sort of like I was saying before, find both sides out there I mean, and it’s really gonna come down to your gut. You know what you know about your child, and weighing the risks and benefits in your own mind, you know, it comes down to you. You’re responsible for this child, what’s best for him or her, and you’re the only person who knows what you know."

An expectation that information be shared, and that aspiring rationalists should come to agreement, is a strong protective factor pointing people towards truth. Errors are semi-random and truth is fixed, so a moderate dose of Condorcet's Jury Theorem is helpful, even if we abandon all ideas of interpersonal knowledge comparisons.
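
To see why even a weak version of this helps, here's a minimal simulation of Condorcet's Jury Theorem (my own sketch, in Python, not something from the article): if each person independently gets a yes/no question right with probability even slightly above one half, a simple majority of the group is right much more often than any single individual.

```python
# Condorcet's Jury Theorem by simulation -- an illustrative sketch, not from the article.
# Assumption: each voter independently answers a yes/no question correctly with probability p > 0.5.
import random

def majority_correct_rate(p=0.6, voters=25, trials=100_000):
    """Estimate how often a simple majority of independent voters gets the answer right."""
    wins = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < p for _ in range(voters))
        if correct_votes > voters / 2:
            wins += 1
    return wins / trials

print(majority_correct_rate(voters=1))    # ~0.60: one person alone
print(majority_correct_rate(voters=25))   # ~0.85: a small group
print(majority_correct_rate(voters=101))  # ~0.98: a larger group
```

The catch is the independence assumption: if everyone "does her own research" from the same blogs and anecdotes, the errors are no longer semi-random and most of the benefit of aggregation disappears.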

Of course, it doesn’t stop there. They also employ the naturalistic fallacy.

In the absence of understanding about the body, the best a mother can aim for is to protect the body from the threat of taint or artificial and inevitably damaging intervention

There’s a preference for emotionally compelling anecdotes over statistical data.

“You know, mothers that have been there. There are many mothers that have actually written books, there’s a lot of blogs by moms that have lost children from reactions like that, and there are horrifying pictures. I mean, I don’t—you can’t ignore that kind of evidence.”

We can suspect a litany of other intellectual errors, of course, but those are the ones that come through even in a relatively sympathetic reading. When your ideal of truth is embodied in maternal instinct rather than in minimizing prediction error, these are the errors we should expect.

In conclusion, as you might expect of people who are very wrong about something both socially and scientifically well-established, their reasoning processes are bad. But look back to those earlier arguments. They sound so reasonable! Yeah, a lot of doctors don’t really know what they’re doing, and should not be trusted as all-knowing sources of authority. Just as reversed stupidity is not intelligence, superficial intelligence can mask all sorts of errors.

With thanks to Ozy [LW · GW], who reminded me that I should use language well and think about what I say, and Swimmer963 [LW · GW], who kindly reviewed this via Feedback [LW · GW] such that I ended up completely rewriting it. All remaining errors are my own.

13 comments

Comments sorted by top scores.

comment by Edward Pascal (edward-pascal) · 2021-09-29T16:45:26.149Z · LW(p) · GW(p)

First: It seems that medical studies and individual doctors are not very good at handling vague and low-level symptoms, syndromes, and chronic symptoms in general. Often these things can be real without being easily measurable. Maybe it is like taking your car to the mechanic and not being able to get it to make that noise it keeps making. So, for people who end up with these types of conditions, it may appear that the establishment is out of touch, untrustworthy, perhaps even conspiratorial.

The above doesn't seem controversial, but I think people who have or know someone who has those symptoms are often more sympathetic to vaccine hesitancy or even great suspicion of modern medical science in general.

Second: The truth is often very nuanced, whereas promoted consensus tends to draw with broad strokes according to what seems salient. A good example is opioids. We have passed through times of greater or lesser opiophobia, and seem to be entering a time when it will be greater. It is very bad when opioids become much more regulated, expensive, and difficult to acquire for the people who genuinely need them and cannot or will not resort to illegal solutions. Additionally, this problem will skew poorer and older, towards people who cannot afford the kind of doctors, or the time with doctors, who can advocate for them. Yet, as a public health matter, there are also a lot of addicts.

There just isn't room for a lot of nuance in the mediated """national""" """discussion.""" I think it's a pity when someone is perceptive enough to notice trends along these lines but is not in a community where they can be explored with a lot of rationality and good sense. To such people, there may appear to be a 'conspiracy.'

Third: For various reasons, these factors skew towards certain demographics more than others. I am often surprised how little social sophistication people much below my (G2) class level have. For example, I got a traffic ticket, attended court, and in a room with 500 people who didn't want the ticket 'to go on their insurance' I was the only person who knew to plead guilty to a lesser charge (below the Department of Motor Vehicles reporting threshold) and pay the full fine. They mostly pleaded nolo, which is no help in that case. I just don't get it, because I have never had that kind of background. However, without familiarity with bureaucracies and social systemic games, people just don't know what to do or how to do it. I think this contributes to the feeling that "the system can't be trusted."

Replies from: edward-pascal
comment by Edward Pascal (edward-pascal) · 2021-09-29T17:02:24.250Z · LW(p) · GW(p)

To be more explicit, I am making the case that for the people for whom vaccine hesitancy really has a lot of salience, the issues are wide, and if put into the exact same milieu, I assume most of us (even the sharper ones!) might end up with similar beliefs. I don't think fixing the reasoning errors you point out would fix things for them. As someone else has said, the priors might be too different.

Given this, I think the easiest solution would be to deal with issues of system opacity and bureaucratic walls in general as a means of increasing trust. Obviously that sounds like the hard way. In some sense, I would like to solve the socioeconomic and education issues, but most people in the Working class, rather than the Gentry like myself, actually like their education level and socioeconomic markers better than mine, so trying to intervene there seems to be a losing battle.

comment by Vladimir_Nesov · 2021-09-28T13:36:04.931Z · LW(p) · GW(p)

easy target [...] They’re not just wrong: they’re harming people.

This emphasis feels painful, like implied endorsement of an error: harm of a claim should only make it easier to argue against it via appeal to consequences.

Replies from: Celer
comment by Celer · 2021-09-28T15:17:18.241Z · LW(p) · GW(p)

I think that the reason I don't see a lot of arguments against anti-vaxxers is that I don't know of any. I think the reason that I see anti-vaxxers derided more often than average is that flat-earthers are parsed as harmless and anti-vaxxers are parsed as doing harm. I think I'm not quite following what you're saying.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2021-09-28T15:42:06.737Z · LW(p) · GW(p)

The harm should make it more valuable/appealing to argue about a claim, lending value of information to it, but shouldn't make it easier for arguments to hold. The fallacy of appeal to consequences facilitates arguing against a claim because of the harm it does, so appealing to ease of argument because of the harm is appealing to the use of appeal to consequences.

(You are certainly not doing any of that, I'm just curious about the reasons behind my own reaction to that statement, where it perplexingly was viscerally painful to read.)

Replies from: Viliam
comment by Viliam · 2021-09-30T19:32:10.160Z · LW(p) · GW(p)

The anti-vaxxers usually assume the harm argument points the other way: that vaccines cause actual harm, while the disease itself is mostly harmless (or does not exist at all). From their perspective, it is the sheeple believing whatever Big Pharma tells them who are not just wrong, they're harming children by giving them autism.

If you want to argue by appeal to harmful consequences, first you have to show those consequences. But either side will dismiss the data provided by the other side.

(Everyone is the good guy in their own story. The covid denialists also don't go like "hey, I am an asshole and I don't give a fuck whether your grandma dies". Instead they tell you how face masks suffocate the kids and prevent them from developing social skills because they can't see each other's faces. From their perspective, it's the people who want to stop covid who are the heartless bastards.)

comment by Slider · 2021-09-28T08:16:34.702Z · LW(p) · GW(p)

If you have subjective evidence you don't turn a blind eye to it in order to agree on the basis of publicly available evidence. If there is a sticker making diamonds more likely you are not free to ignore it. There might be a unique balance point on what benefits the "statistical mean person" the most, but if you have information about the individual you can do more individualised reasoning.

One could take the stance that since "truth is fixed" then either "people ought to take blood pressure medicine" or "people ought not to take blood pressure medicine". Rather, it is more natural that those that have high blood pressure take it and others don't. But then truth is not fixed.

I would rather call it personal research instead of scientific research (with scientific referring to things that can be generalised). "Doctor, it hurts when I do this", "Stop doing it then" is valid reasoning even if the reason or source is a bit nebulous. To forgo personal research would be to say that you should uncritically read the papers, take their conclusions as direct personal recommendations, and blindly follow them. Part of the utility of having strong ironclad results would be that the personal inquiry amounts to being informed about stuff so convincing that there is little left to do personally. But that ironcladness is hard and is not provided in overwhelming quantities for all stuff.

Replies from: ChristianKl
comment by ChristianKl · 2021-09-30T09:49:29.013Z · LW(p) · GW(p)

Mothers who think about whether or not to vaccinate their children usually don't have any source of evidence comparable to "Doctor it hurts when I do this".

Robin Hanson writes about how medicine is about caring, and in this case the way a mother signals she cares for her child is by doing her own personal research about vaccines. Then of course she has to talk to other mothers about it to make the signal of caring visible to them.

Unfortunately, doing such research on a routine intervention like vaccination doesn't produce any knowledge that actually allows one to make a better decision.

One member of our local LW meetup has a mother who was twice denied a COVID-19 vaccination because of her wasp allergy. That is the kind of event that actually warrants doing personal research, to get a better idea of whether or not it's reasonable for her to seek out another doctor who will vaccinate her, because it's not a standard medical issue.

Replies from: Slider
comment by Slider · 2021-09-30T20:25:25.254Z · LW(p) · GW(p)

Isn't most of the evidence that doctors use supposed to be public, if laborious to access?

How come you know the result of the research before doing it? If you don't have faith and then, as a result of the research, come to agree with the "majority" or standard view, that would produce the confidence/faith to proceed.

Of course, if there is background paranoia, that could be skewing the results, but then it moves to the question of whether that paranoia is warranted or not. As a non-expert, is it ever reasonable to fail to follow expert opinion without becoming an expert yourself?

comment by Phil · 2021-09-28T04:45:21.091Z · LW(p) · GW(p)

Why do you say research before 2013 is of lower quality?

Replies from: Celer
comment by Celer · 2021-09-28T05:01:07.480Z · LW(p) · GW(p)

There's not a hard cutoff between 2005, when Ioannidis published, and the present, but I've worked on multiple systematic reviews, going over thousands of papers, and there's a visible improvement in quality over time; 2013 seemed like a reasonable date for "attention to the replication crisis is high."

comment by TAG · 2021-09-28T16:02:30.038Z · LW(p) · GW(p)

An expectation that information be shared, and that aspiring rationalists should come to agreement,

Even if they don't share priors?

Replies from: Celer
comment by Celer · 2021-09-28T23:13:53.005Z · LW(p) · GW(p)

Priors are relative to how much evidence can be shared. There may not be agreement in a single conversation, but they should expect movement towards a common belief, though there are degenerate counter-cases. For example, perhaps both parties share a base rate and have different pieces of information that push in the same direction.
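
As a concrete sketch of that expectation (my own illustration, with made-up numbers, not anything from the thread): two agents who start with different priors on the same binary hypothesis but update on the same shared evidence both move towards it, and the gap between their beliefs narrows.

```python
# Two Bayesian agents with different priors updating on the same shared evidence.
# Illustrative sketch only; the hypothesis, likelihoods, and data are made up.
def posterior(prior, p_obs_if_true, p_obs_if_false, observations):
    """Update P(H) on a sequence of independent binary observations (1 = heads)."""
    p = prior
    for obs in observations:
        l_true = p_obs_if_true if obs else 1 - p_obs_if_true
        l_false = p_obs_if_false if obs else 1 - p_obs_if_false
        p = p * l_true / (p * l_true + (1 - p) * l_false)
    return p

# H: "this coin is biased towards heads" (P(heads) = 0.8 if H is true, 0.5 if false).
shared_evidence = [1] * 16 + [0] * 4          # 16 heads in 20 flips, seen by both agents
skeptic = posterior(0.1, 0.8, 0.5, shared_evidence)
believer = posterior(0.7, 0.8, 0.5, shared_evidence)
print(round(skeptic, 2), round(believer, 2))  # ~0.84 and ~0.99: the gap shrinks from 0.6 to ~0.15
```

(The degenerate case above is when the evidence each party brings pushes both in the same direction, so the movement isn't literally towards each other's starting point.)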