What epistemic hygiene norms should there be?

post by Kaj_Sotala · 2012-03-21T19:26:45.006Z · LW · GW · Legacy · 26 comments

The wiki entry for Epistemic Hygiene defines the term as:

Epistemic hygiene consists of practices meant to allow accurate beliefs to spread within a community and keep less accurate or biased beliefs contained. The practices are meant to serve an analogous purpose to normal hygiene and sanitation in containing disease.

The term was coined in Steve Rayhawk's and Anna Salamon's post "The ethic of hand-washing and community epistemic practice", and there have been several mentions of it around the site. But what, exactly, might good epistemic hygiene norms be?

I'm especially interested in this question in the context of meetup groups. In Less Wrong NYC: Case Study of a Successful Rationalist Chapter, Cosmos writes:

Epistemic privilege and meme-sharing: The most powerful aspect of a group of rationalists is that you have an entire class of people whose reasoning you trust. Division of labor arises naturally as each member has different interests; they all pursue a variety of skills and areas of expertise, which they can then bring back to the group. Even the lowest-level rationalists in the group can rapidly upgrade themselves by adopting winning heuristics from other group members. I cannot overstate the power of epistemic privilege. We have rapidly spread knowledge about metabolism, exercise, neuroscience, meditation, hypnosis, several systems of therapy... and don't forget the Dark Arts.

This would imply that one way a meetup group (or, for that matter, any social group) could become really successful is by adopting great epistemic hygiene norms. But unless everyone present has read most of the Sequences - which gets increasingly unlikely as the group grows - even most LW meetups probably won't spontaneously enforce such norms. Wouldn't it be great if there were an existing list of such norms that each group could look through, and then decide which ones to adopt?

Here are some possible epistemic hygiene norms that I could find / come up with:

And of course, there's a long list of norms that basically amount to "don't be guilty of bias X", e.g. "avoid unnecessarily detailed stories about the future", "avoid fake explanations", "don't treat arguments as soldiers", etc.

Which of these norms do you consider the most valuable? Which seem questionable? Do you have any norms of your own to propose?

26 comments

comment by fubarobfusco · 2012-03-21T22:51:53.358Z · LW(p) · GW(p)

Salamon and Rayhawk's "Share likelihood ratios, not posterior beliefs" suggests a few rules for updating on others' beliefs that may be relevant; most importantly (paraphrased):

  • Distinguish evidence from priors — in conversation, when saying things like "I think Jack is smart, but not extremely smart", distinguish "I don't have evidence that Jack is extremely smart" from "I do have evidence that Jack is not extremely smart". This makes it possible for the recipient of the information to combine evidence without double-counting agreed priors, e.g. the base rate of extremely smart people in the world.
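
To make the double-counting point concrete, here is a minimal sketch (not from the post; the names, prior, and likelihood ratios are invented for illustration) of combining two people's evidence by sharing likelihood ratios versus naively treating a shared posterior as fresh evidence:

```python
# Why sharing likelihood ratios avoids double-counting a shared prior.
# All numbers are invented for illustration.

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

prior = 0.10                 # agreed base rate of "extremely smart"
prior_odds = odds(prior)

alice_lr = 4.0               # Alice's evidence: likelihood ratio 4:1 in favor
bob_lr = 3.0                 # Bob's independent evidence: 3:1 in favor

# Correct: apply the shared prior once, then both likelihood ratios.
combined = prob(prior_odds * alice_lr * bob_lr)

# Wrong: Bob treats Alice's *posterior* odds (which already include the
# prior) as if they were raw evidence, so the prior gets counted twice.
alice_posterior_odds = prior_odds * alice_lr
double_counted = prob(alice_posterior_odds * prior_odds * bob_lr)

print(f"combined via likelihood ratios: {combined:.3f}")        # ~0.571
print(f"prior double-counted:           {double_counted:.3f}")  # ~0.129
```

With the prior counted twice, the shared base rate drags the estimate back down even though both people observed favorable evidence.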
comment by NancyLebovitz · 2012-03-22T01:07:14.025Z · LW(p) · GW(p)

This is something that I think overlaps a number of your suggestions, but doesn't exactly match any of them:

If you're making a generalization, check it for scope. How much knowledge of what you're generalizing about do you actually have? Could conditions have changed? How representative are the examples you're drawing your conclusions from?

Replies from: JStewart, NancyLebovitz
comment by JStewart · 2012-03-22T01:28:22.143Z · LW(p) · GW(p)

I agree. I've noticed an especially strong tendency toward premature generalization (including in myself) in response to people asking for advice. Tell people what your experiences were, not (just) the general conclusions you drew from them.

comment by NancyLebovitz · 2012-03-23T06:25:24.346Z · LW(p) · GW(p)

Another angle on scope: If there's a difference, how large is it?

This relates to a frequent annoyance: articles about differences between men and women that don't mention the size of the difference or how much overlap there might be.

comment by NaomiLong · 2012-03-21T21:14:54.306Z · LW(p) · GW(p)

Thanks for the compilation. This point was especially useful:

Before you stake your argument on a point, ask yourself in advance what you would say if that point were decisively refuted. If you wouldn't actually change your mind, search for a point that you find more convincing.

I think beginning rationalists should first look to make sure they're willing to change their mind on a subject, period.

comment by Will_Newsome · 2012-03-22T07:11:17.850Z · LW(p) · GW(p)

(It's worth noting that the "moral wrongdoing is like infectious disease" metaphor is a surprisingly deep aspect of human social psychology, even to the extent that hand-washing affects levels of risk aversion. The pervasiveness of the metaphor is especially clear in the Christian emphasis on baptism, holy water, purgatorial fires, et cetera. I take about five baths a day, and I suspect it has at least a little to do with constantly feeling guilty. Understanding the basis of this connection might help us manipulate it: "quietly going along with something even if you don't actually agree with it is dirty".)

comment by [deleted] · 2012-03-24T13:50:52.696Z · LW(p) · GW(p)
  • Don't be afraid to ask stupid questions.

I hate, hate social situations where I have to take a big status hit to ask a simple clarifying question; it would greatly promote understanding to have a social norm that dumb-sounding questions are considered acceptable.

Replies from: Vladimir_Nesov, John_Maxwell_IV, army1987
comment by Vladimir_Nesov · 2012-03-24T13:55:24.651Z · LW(p) · GW(p)

Train your social environment to expect trivial clarifying questions from you (and so to stop making inferences of any significance from individual instances of this behavior), by asking them frequently.

comment by John_Maxwell (John_Maxwell_IV) · 2012-03-28T06:16:58.546Z · LW(p) · GW(p)

I try to keep in mind something I remember reading about the eminent mathematician David Hilbert. Supposedly he would constantly and shamelessly ask questions during math presentations, and once even asked what a Hilbert space was.

comment by A1987dM (army1987) · 2012-04-21T09:23:01.365Z · LW(p) · GW(p)

I usually use “Sorry for the stupid question, but X is [answer I think most likely], isn't it?”. I will learn the same thing as if I asked “What is X?”, but I won't sound like an idiot.

comment by Daniel_Starr · 2012-03-22T12:32:05.556Z · LW(p) · GW(p)

Don't punish yourself or others for taking time to think before responding.

If you were a kid in school who was rewarded for being quick with the answers, taking time and letting others take time is a habit that takes some training.

To express it at more length:

Don't press yourself or others to have immediate perfect articulations of what they know.

If people are only respected if they speak immediately, the only thoughts that will be spoken are cached thoughts.

Be comfortable saying things like "I think I shouldn't agree or disagree right away," or "I'll need to take some time to think through what my real opinion is on that point," or "I didn't quite say what I really meant earlier." Support others who say those things.

Believe that the value of a person's thoughts is not always the same as their speed of response.

Create a favorable environment for both you and others to figure out what they really think, notice, and remember, as opposed to what they're able to articulate in the heat of a single conversation.

Replies from: Grognor
comment by Grognor · 2012-03-22T21:02:10.254Z · LW(p) · GW(p)

This is a special case of "leave a line of retreat."

No, it is not. Leaving a line of retreat is about imagining possible worlds where one of your beliefs is false.

Replies from: Daniel_Starr
comment by Daniel_Starr · 2012-03-23T00:27:20.689Z · LW(p) · GW(p)

Oops. Yah. It's related, in the sense of "lower the pressure to insist that you're right", but it's not the same. Fixed.

comment by Viliam_Bur · 2012-03-26T12:33:27.204Z · LW(p) · GW(p)

Just a data point, but I felt a need to write it:

I found here a link to an interesting article, "Multitaskers bad at multitasking". According to the article, people who 'routinely consumed multiple media such as internet, television and mobile phones' were worse at the three attention-related experiments described in the article. However, these people had self-reported as being better at multitasking.

Now here are some selected comments from below the article: (three comments from three different people)

Women make better multitaskers then most men, only because their lives depend on multitasking at home and at work...it seems the above study was just computer based, which is probably why the multitaskers didn't do so well...they were thinking of other things that they should've or could've been doing.

This is surprising. I'm fantastic at it. Perhaps the pool didn't actually include those who are actually gifted enough to multitask.

I am a multitasker. I have a conference call on mute as I type this. The advantage? Ignoring the detail allows me to focus on the big picture, ensuring my team are always realigning with the strategic objectives. I have a very competent group of focussed people. They're great with detailed tasks that require focus. They wouldn't let me near those activities as we all know I'd make mistakes. As a team we understand each other's strengths and weaknesses and it works well.

There is something valuable in these comments. First, some aspects of the work (such as focusing on the big picture) may, in a given context, be more valuable than what was tested in the experiment. Second, even if multitaskers get less utility per task, they may get more total utility, which would make multitasking a good strategy. Third, people are different, so even if the results of this experiment are relevant for the majority of people, there may be some exceptions for whom multitasking is a way to get high-quality work done.

However, all three comments have somehow missed the essence of the article. If the article says "it was experimentally shown that people who believe themselves to be good at multitasking are statistically worse at it", you can't respond just with "that's nonsense, because I believe I am good at multitasking", unless you include some convincing evidence for why your belief is more reliable than the similar beliefs of the other people who were experimentally proven wrong. (How about considering the possibility that you might be wrong, both about "being fantastic" and about "focusing on the big picture"?) And that evidence should be something better than a cached thought (women being better multitaskers) or an ad-hoc excuse (a computer-based study can't measure real multitasking).

I think we are already doing well at this on LW. I did not notice the gradual change in my epistemic hygiene expectations until I read a text containing basic mistakes which I probably would not have recognized as mistakes a few months ago. As an analogy, washing one's hands may feel like a boring ritual, until one sees other people eating with dirty hands and suddenly feels disgusted.

Replies from: David_Gerard
comment by David_Gerard · 2012-03-26T21:23:14.856Z · LW(p) · GW(p)

Yeah. I have just been metaphorically bloodying my forehead trying to explain to someone that if A says "X", B says "oh, you cherry-picked that example, so it's a bad argument", A says "no I didn't, it's from Y" and B replies "oh, you just picked Y at random, so it's a bad argument", then B has done something stupid no matter what the argument is about - and this has been impossible to get across. It felt very like arguing with a creationist. (It was similar in unlikelihood of minds being changed - one person is British, one is American and the argument was about gun control - so it was pretty much epistemic sewer diving.)

tl;dr LessWrong has made me significantly less tolerant of run-of-the-mill weapons-grade stupidity. (That should be an oxymoron, but somehow doesn't seem to be one in practice.)

comment by Daniel_Starr · 2012-03-22T12:40:37.558Z · LW(p) · GW(p)

Another one that I feel is important:

Identify the testable predictions that go along with your belief.

This not only wards off "just so stories" and confirmation bias; it also shows people how to mine their own knowledge to add weight for or against the belief you're offering. If their expertise overlaps yours at all, they probably have some knowledge to offer; if you point out the testable predictions your belief implies, they can deploy that knowledge to make both you and them smarter.

comment by Grognor · 2012-03-22T08:25:27.457Z · LW(p) · GW(p)

An important one:

Others:

Also, I rather like these.

comment by TimS · 2012-03-21T19:51:36.757Z · LW(p) · GW(p)

In terms of debate, the following have always been the most helpful to me in ensuring that I'm not believing things only because I want them to be true. I've ordered them from easiest to hardest to implement.

In discussions, presume the kinds of conditions that are the least convenient for your argument.

Even if you've been fairly mind-killed, it's pretty easy to notice when someone else raises a difficulty you haven't thought of.

Encourage "Why do I think that" monologues. You elaborate on a thing you currently believe to be true by specifying the reasons you believe it, the reasons you believe the reasons, etc and trying to dig out the whole epistemological structure

This method also has the benefit of forcing you to focus on improving the truth-quality of beliefs rather than winning the argument.

Before you stake your argument on a point, ask yourself in advance what you would say if that point were decisively refuted. If you wouldn't actually change your mind, search for a point that you find more convincing.

This is incredibly hard to implement, because if you didn't have the core belief, the supporting beliefs would never seem important.

comment by steven0461 · 2012-03-21T20:34:58.888Z · LW(p) · GW(p)

The term was coined in Steve Rayhawk's and Anna Salamon's post

No, it's a lot older.

ETA: at least, I'm pretty sure I've seen it floating around before then, possibly as "epistemological hygiene", though it doesn't seem to be used in the SL4 archives.

comment by Crux · 2012-03-22T08:48:20.496Z · LW(p) · GW(p)

Interesting. I've used the term "epistemic hygiene" plenty of times, but it looks like I've been using it incorrectly. I thought an "epistemic hygiene technique" was simply a strategy for avoiding getting epistemically messed up in some way, but it seems like what it really deals with is sound belief propagation throughout a community.

As for the question of what epistemic hygiene norms there should be, though, I wonder whether there's anything more to those norms beyond what I used to think the term meant. In other words, does being epistemically sanitary with what you write involve anything beyond simply remaining epistemically rational in what you communicate?

It seems like an interesting question -- how to keep the community memeplex sanitary, and avoid spilling harmful memes or letting destructive beliefs propagate -- but I'm not sure there's anything more to this than what we already seem to spend most of our time talking about: how to maintain your own epistemic rationality, and how to communicate efficiently and effectively by properly managing inferential distance, etc.

It certainly seems like it may be a fruitful line of inquiry, and I'll think more about it later, but for now I'm just trying to bring up the possibility that this question is of no special importance, and there's nothing to community epistemic practice besides basic individual epistemic practice coupled with the ability to communicate effectively.

comment by Luke_A_Somers · 2012-03-22T01:31:59.824Z · LW(p) · GW(p)

Only pass on ideas that you've verified yourself. (Problematic, since any given individual can only verify a tiny fraction of all of their beliefs.)

That really depends on how literally you take this, doesn't it? Taken totally literally, you don't pass on an unverified idea - but you can pass on a verifiable token referring to the idea ("I heard from so-and-so that...").

comment by timtyler · 2012-03-21T19:40:52.301Z · LW(p) · GW(p)

[admin if you (Kaj) delete your comment I will delete mine]

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2012-03-21T19:44:46.949Z · LW(p) · GW(p)

Fixed, thanks.

comment by calebp99 · 2022-02-11T12:10:56.200Z · LW(p) · GW(p)

I've been thinking about how we might improve epistemic hygiene in the EA community (particularly on the forum); this post has been useful, and I'm keen to find more content in this space.

comment by [deleted] · 2015-09-20T01:37:21.767Z · LW(p) · GW(p)

Thanks Kaj, this is my favourite article on LessWrong at the moment. I have plenty to learn.

Most valuable to me at the moment:

Before you stake your argument on a point, ask yourself in advance what you would say if that point were decisively refuted. If you wouldn't actually change your mind, search for a point that you find more convincing. (Is That Your True Rejection? at CATO Unbound)

In discussions, presume the kinds of conditions that are the least convenient for your argument. (The Least Convenient Possible World)

If people are trying to figure out the truth, don't mistake their opinions about facts for statements of values. (Levels of communication)

Only pass on ideas that you've verified yourself. (Problematic, since any given individual can only verify a tiny fraction of all of their beliefs.) (The ethic of hand-washing)

Explicitly separate “individual impressions” (impressions based only on evidence you've verified yourself) from “beliefs” (which include evidence from others’ impressions). (Naming beliefs)

Leave the other person a line of retreat in all directions, avoiding pressures that might wedge them towards either your ideas or their own. (The ethic of hand-washing)

Encourage people to present the strongest cases they can against their own ideas. (comment, Carl Shulman)

comment by A1987dM (army1987) · 2012-04-21T09:18:35.111Z · LW(p) · GW(p)

The link to “Your Rationality is My Business” is broken (by an extra lesswrong.com/ in the URL).