What's a Bias?

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2006-11-27T01:50:34.000Z · LW · GW · Legacy · 17 comments

The availability heuristic is a cognitive shortcut humans use to reach conclusions; and where this shortcut reliably causes inaccurate conclusions, we can say that an availability bias is at work. Scope insensitivity is another example of a cognitive bias.

“Cognitive biases” are those obstacles to truth which are produced, not by the cost of information, nor by limited computing power, but by the shape of our own mental machinery. For example, our mental processes might be evolutionarily adapted to specifically believe some things that aren’t true, so that we could win political arguments in a tribal context. Or the mental machinery might be adapted not to particularly care whether something is true, such as when we feel the urge to believe what others believe to get along socially. Or the bias may be a side effect of a useful reasoning heuristic. The availability heuristic is not itself a bias, but it gives rise to one: the machinery uses an algorithm (give things more evidential weight if they come to mind more readily) that does some good cognitive work but also produces systematic errors.
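(A toy sketch of that algorithm, not part of the original argument; the event types, frequencies, and "vividness" numbers are all invented for illustration. The point is that estimating frequency by ease of recall does real cognitive work, yet errs systematically wherever memorability and frequency come apart.)

```python
# Toy model of the availability heuristic. Event names, frequencies,
# and "vividness" values are all invented for illustration.
events = [
    # (name, true annual count, vividness: how readily examples come to mind)
    ("dramatic, newsworthy accident", 10, 50.0),
    ("mundane household accident", 10_000, 0.01),
]

def felt_share(name):
    """Estimate an event's relative frequency by ease of recall:
    recall weight = true frequency * vividness. The heuristic does
    real cognitive work (frequent events usually do come to mind
    more often) but errs systematically wherever vividness and
    frequency diverge."""
    weights = {n: f * v for n, f, v in events}
    return weights[name] / sum(weights.values())

total = sum(f for _, f, _ in events)
for name, freq, _ in events:
    print(f"{name}: true share {freq / total:.1%}, felt share {felt_share(name):.1%}")
```

On these made-up numbers, the mundane accident is 99.9% of what actually happens but only 16.7% of what comes to mind: a systematic error produced by machinery that is mostly useful.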

Our brains are doing something wrong, and after a lot of experimentation and/or heavy thinking, someone identifies the problem verbally and concretely; then we call it a “(cognitive) bias.” Not to be confused with the colloquial “that person is biased,” which just means “that person has a skewed or prejudiced attitude toward something.”

In cognitive science, “biases” are distinguished from errors that arise from cognitive content, such as learned false beliefs. These we call “mistakes” rather than “biases,” and they are much easier to correct, once we’ve noticed them for ourselves. (Though the source of the mistake, or the source of the source of the mistake, may ultimately be some bias.)

“Biases” are also distinguished from errors stemming from damage to an individual human brain, or from absorbed cultural mores; biases arise from machinery that is humanly universal.

Plato wasn’t “biased” because he was ignorant of General Relativity—he had no way to gather that information; his ignorance did not arise from the shape of his mental machinery. But if Plato believed that philosophers would make better kings because he himself was a philosopher—and this belief, in turn, arose because of a universal adaptive political instinct for self-promotion, and not because Plato’s daddy told him that everyone has a moral duty to promote their own profession to governorship, or because Plato sniffed too much glue as a kid—then that was a bias, whether Plato was ever warned of it or not.

While I am not averse (as you can see) to discussing definitions, I don’t want to suggest that the project of better wielding our own minds rests on a particular choice of terminology. If the term “cognitive bias” turns out to be unhelpful, we should just drop it.

We don’t start out with a moral duty to “reduce bias,” simply because biases are bad and evil and Just Not Done. This is the sort of thinking someone might end up with if they acquired a deontological duty of “rationality” by social osmosis, which leads to people trying to execute techniques without appreciating the reason for them. (Which is bad and evil and Just Not Done, according to Surely You’re Joking, Mr. Feynman, which I read as a kid.) A bias is an obstacle to our goal of obtaining truth, and thus in our way.

We are here to pursue the great human quest for truth: for we have desperate need of the knowledge, and besides, we're curious. To this end let us strive to overcome whatever obstacles lie in our way, whether we call them “biases” or not.

17 comments

Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by Robin_Hanson2 · 2006-11-27T02:06:34.000Z · LW(p) · GW(p)

We seem to mostly agree about what we are about here, but it seems damn hard to very precisely define exactly what. I guess I'll focus on coming up with concrete examples of bias and concrete mechanisms for avoiding it, and set aside for now the difficult task of defining it.

comment by pdf23ds · 2006-11-28T00:55:24.000Z · LW(p) · GW(p)

"it seems damn hard to very precisely define exactly what"

Robin, I don't see why a definition offered in terms of the origin of a phenomenon ("the shape of our mental machinery") should be any less a definition (or any less precise) than one that directly describes the characteristics of the phenomenon. Why isn't the former sufficient?

comment by Robin_Hanson2 · 2006-11-28T04:22:47.000Z · LW(p) · GW(p)

Pdf, I didn't mean to imply that Eliezer's approach was inferior to the approach I was taking, just that all the approaches run into problems when you try to become more precise.

comment by NancyLebovitz · 2008-01-31T17:42:03.000Z · LW(p) · GW(p)

Is there a well-defined difference between the shape of one's mental machinery and its limited computing power?

comment by Arandur · 2011-08-13T01:36:44.768Z · LW(p) · GW(p)

Oh, how curious. I've been reading on here a while, and I think I had previously misunderstood the adopted meaning of the word "bias"... using the term as it's socially used, that is to say, a prior reason for holding a certain belief over another due to convenience. A judge might be biased because one side is paying him; a jury member might be biased because their sister is the one on trial. Are these "mistakes"? Or do they fall under a certain type of cognitive bias that is similar among all humans? *ponder*

Replies from: MarkusRamikin
comment by MarkusRamikin · 2014-10-02T06:21:43.000Z · LW(p) · GW(p)

I would call a judge who is favoring a side because they're paying him "biased", and not "mistaken" or any such thing. But it's not a cognitive bias. The word "bias" has legitimate meanings other than what EY is saying, so it would have been clearer if the article used the term "cognitive bias" at least at the outset.

Replies from: simonthedeer
comment by simonthedeer · 2022-05-13T09:20:43.309Z · LW(p) · GW(p)

I would argue a corrupt judge only seems biased, since biased people, in my understanding, are not aware of their underlying preferences. That also might be the common ground with a cognitive bias: you are never directly aware of its presence and can only infer it by analysis.

Replies from: benjamin-kost
comment by Benjamin Kost (benjamin-kost) · 2024-07-29T20:13:43.840Z · LW(p) · GW(p)

You are confusing two definitions for the same word. The judge is biased by one definition of “bias”, but not by the other definition as used in cognitive or statistical bias.

comment by Peter Wildeford (peter_hurford) · 2011-08-17T04:51:55.922Z · LW(p) · GW(p)

Biases seem like they could be understood in terms of logical validity. Even if you reason solely from true premises, you could still adopt an invalid argument (aka a fallacy: an argument whose conclusion does not actually follow from the premises, no matter how true they are). I suggest the definition that biases are whatever cause people to adopt invalid arguments.

Replies from: viktor-riabtsev-1
comment by Viktor Riabtsev (viktor-riabtsev-1) · 2018-10-10T22:13:06.791Z · LW(p) · GW(p)

"I suggest the definition that biases are whatever cause people to adopt invalid arguments."

False or incomplete/insufficient data can cause the adoption of invalid arguments.

Contrast this with [LW · GW]:

The control group was told only the background information known to the city when it decided not to hire a bridge watcher. The experimental group was given this information, plus the fact that a flood had actually occurred. Instructions stated the city was negligent if the foreseeable probability of flooding was greater than 10%. 76% of the control group concluded the flood was so unlikely that no precautions were necessary; 57% of the experimental group concluded the flood was so likely that failure to take precautions was legally negligent. A third experimental group was told the outcome and also explicitly instructed to avoid hindsight bias, which made no difference: 56% concluded the city was legally negligent.

I.e., on average it doesn't matter if people try to avoid hindsight bias: prior knowledge of the outcome literally corresponds to the conclusion that the outcome should have been deemed very likely.

To avoid it, you literally have to INSIST on NOT knowing what actually happened, if you aim to accurately evaluate the decision-making process as it originally took place.

Or, if you do have the knowledge, you might have to force yourself to assign an extra 1 : 10 odds factor against the actual outcome (or worse) in order to compensate.
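(A minimal sketch of that compensation, with my own illustrative numbers rather than the commenter's: convert the hindsight-inflated probability to odds, apply the suggested 1 : 10 factor against the known outcome, and convert back.)

```python
def hindsight_corrected(p_felt, correction=1 / 10):
    """Deflate a hindsight-inflated probability of the known outcome.

    p_felt: how foreseeable the outcome feels once you know it happened.
    correction: extra odds factor against the outcome (the 1 : 10
    suggested above). Both numbers below are illustrative only.
    """
    odds = p_felt / (1 - p_felt)   # probability -> odds
    odds *= correction             # penalize the known outcome
    return odds / (1 + odds)       # odds -> probability

# A flood that feels 60% foreseeable after the fact comes out at ~13%:
# odds of 1.5 become 0.15, landing near the 10% negligence threshold
# used in the quoted experiment.
print(hindsight_corrected(0.60))  # ~0.13
```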

comment by thomblake · 2012-04-17T20:21:25.607Z · LW(p) · GW(p)

This definition of bias seems problematic. If a putative bias is caused by absorbed cultural mores, then supposedly it is not a bias. But that causal chain can be tricky to track down; we go on thinking something is a 'bias' until we find the black swan culture where the bias doesn't exist, and then realize that the problem was not inherent in our mental machinery. But is that distinction even worth making, if we don't know what caused the bias?

Replies from: Navanen
comment by Navanen · 2012-12-18T05:06:48.686Z · LW(p) · GW(p)

I suspect the distinction is worth making because even if we don't know what caused the bias, we can use the label of a bias "not inherent in our mental machinery" as a marker for future study of its cause.

For example, I read in a contemporary undergraduate social psychology textbook that experimental results found that a common bias affected subjects from Western cultures more strongly than it affected subjects from more interdependent cultures such as China and Japan.

[Obviously, my example is useless. I just don't have access to that book at the current moment. I will update this comment with more detail when I'm able.]

comment by Viktor Riabtsev (viktor-riabtsev-1) · 2018-10-03T20:48:32.640Z · LW(p) · GW(p)

The Simple Truth link should be http://yudkowsky.net/rational/the-simple-truth/

Replies from: habryka4
comment by habryka (habryka4) · 2018-10-03T21:06:10.011Z · LW(p) · GW(p)

Thanks, fixed!

comment by deisner · 2019-11-27T21:06:34.059Z · LW(p) · GW(p)

Typo: "and besides, were curious." ~ s/were/we're/.

Replies from: Caperu_Wesperizzon
comment by Caperu_Wesperizzon · 2022-04-17T07:05:04.351Z · LW(p) · GW(p)

I wonder when a venerable old article reaches the "any remaining bugs become features" stage.

There's still "things that arent true", instead of "things that aren't true", in the second paragraph.