Do biases matter?

post by Student_UK · 2011-02-20T01:10:24.142Z · LW · GW · Legacy · 15 comments


It occurred to me that our biases might not matter very much a lot of the time. They matter if you want to maximize your own chances of finding the truth, but not if you are interested in maximizing the chances of someone finding the truth.

Take science as an example: scientists aren't always free from biases when it comes to thinking about their own theories. They look for ways to confirm what they believe, not disconfirm it. But does that matter? As long as their rivals are looking for ways to falsify those theories, the system as a whole should work well.

In fact it might work better:

Imagine that there is a prize (a really big prize) at the end of a maze. Several of you are sent into the maze and the first person to find the prize gets to keep it. You all head down the corridor and come to three doors. There are some clues written on the wall, but before you can even begin to read them, someone dashes through door 1. Someone else follows, then another through door 2, and a few take door 3. What do you do? You could be methodical and try to solve the clues. This would maximize your chance of finding the right path. However, it would not maximize your chances of being the first to find the prize. For that, you need to pick a door and run.
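To make the tradeoff concrete, here is a minimal simulation sketch. All the parameters are invented for illustration (the post specifies none of them): three doors, five rivals who all guess and run immediately, and the assumption that solving the clues always identifies the right door but leaves you arriving after anyone who guessed it.

```python
import random

DOORS = 3        # assumed number of doors
RIVALS = 5       # assumed number of rivals who guess and run at once
TRIALS = 100_000

def trial(strategy):
    correct = random.randrange(DOORS)
    rival_doors = [random.randrange(DOORS) for _ in range(RIVALS)]
    if strategy == "run":
        # You guess too; of everyone who picked the correct door,
        # one person at random (possibly you) reaches the prize first.
        mine = random.randrange(DOORS)
        if mine != correct:
            return False
        at_door = 1 + sum(d == correct for d in rival_doors)
        return random.randrange(at_door) == 0
    else:  # "solve": you always identify the right door, but arrive last
        return not any(d == correct for d in rival_doors)

random.seed(0)
for s in ("run", "solve"):
    wins = sum(trial(s) for _ in range(TRIALS))
    print(s, wins / TRIALS)
```

Under these made-up numbers, running wins roughly 15% of the time versus roughly 13% for solving, even though solving always finds the right door: with enough rivals guessing, being right but last is worth less than being lucky and first.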

Likewise, in science, if you want the prize (a Nobel Prize, a good job, fame, or a best-selling book), you might be better off coming up with a new theory and running with it (anecdotally, this seems to be what a lot of successful scientists have done). Having lots of people making leaps in different directions might also make science progress faster overall.

What does all this mean for biases? Are they best thought of at the individual level or the group level? Is something really a bias if it is the best thing to do? Can you think of other examples where individual biases might produce better results for the group?

15 comments

Comments sorted by top scores.

comment by jsteinhardt · 2011-02-20T01:44:24.713Z · LW(p) · GW(p)

A better analogy in science would be a maze with 10^10 possible branches. Without considering which branch to take, no one is going to find the finish.

Replies from: Student_UK
comment by Student_UK · 2011-02-20T10:56:55.723Z · LW(p) · GW(p)

Sure. In reality it is still going to require some narrowing down. But once you have reduced it to a few cases the best thing might be to just guess.

comment by JenniferRM · 2011-02-21T05:51:57.118Z · LW(p) · GW(p)

Can you think of other examples where individual biases might produce better results for the group?

When a hive of ants (specifically Temnothorax curvispinosus) needs to find a new hive location, explorers will spread out looking for likely places. If an ant finds one, she will go back to the existing hive and bodily grab another ant to carry to the new location. If both ants still think it is a good place, they each grab another ant, and if everyone still agrees, the body-movers bring back 8, 16, 32, etc. ants. If a place is marginal, then only a small percentage of ants will pitch in after being physically carried, and the growth will be much slower. A social algorithm like this can select between competing possible hive locations, taking into account issues like distance (less distance meaning shorter doubling times) as well as suitability, and it requires some measure of behavioral variability among the ants. Their aggregate behavior appears to avoid some decision biases found in everything from birds to humans, partly because each ant only seems to judge the quality of one option at a time.
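Here is a toy simulation of that doubling dynamic, a rough sketch of the mechanism as described above rather than a model of real Temnothorax colonies: the colony size, quorum threshold, and acceptance probabilities are all invented.

```python
import random

# Invented acceptance probabilities: the chance that a carried ant,
# after inspecting a candidate site, agrees it is good and joins in.
SITES = {"good_site": 0.9, "marginal_site": 0.6}

def rounds_to_quorum(acceptance, colony_size=256, max_rounds=30):
    committed = 1  # the first scout who found the site
    for r in range(1, max_rounds + 1):
        # Each committed ant carries one nest-mate per round; each
        # carried ant independently accepts or rejects the site.
        carried = min(committed, colony_size - committed)
        accepted = sum(random.random() < acceptance for _ in range(carried))
        committed += accepted
        if committed >= colony_size // 2:  # quorum reached
            return r
    return None  # quorum never reached

random.seed(0)
for site, p in SITES.items():
    print(site, "reached quorum after", rounds_to_quorum(p), "rounds")
```

Better sites cross the quorum threshold in fewer rounds, so the colony commits to the better option even though no individual ant ever compares two sites directly.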

The trick is, this behavior has had many generations to be tuned by evolution. Ant hives have been solving this problem for a long time, over a large number of hives that compete with each other for food and territory, whereas there have not been many hundred-million-person groups competitively solving scientific collective action problems for thousands of generations, such that group selection could have tuned us to do it particularly well. For example, we might be well tuned for rock and spear fights between platoon-sized groups, and some of that might carry over to the effectiveness of groups smaller than 50 people, but I strongly doubt we are genetically tuned for anything that happens at the level of democratic nation states, like subsidized physics research or stock market regulation.

I can imagine one or more human planners figuring out a way for relatively uncultivated humans to be given small amounts of training and arranged into institutions that are structurally tuned to thrive on their expected random errors, but doing this would itself require substantial cognitive effort, and if the situation changed (say, memory-intensive biology turned out to deserve far more resources than working-memory-intensive theoretical physics), this would need to be recognized somehow and adjusted for via exogenous, rationally calculated effort. You'd be able to point to the person or group of people doing the modeling and optimization work implied; such people might have existed in the U.S. in the 1890s and 1940s, but if they exist for the English-speaking, internet-using world right now, I'm not familiar with their work.

In your example, the people who ran off through each door could probably have produced a better outcome for the group if they had stopped to trade cell phone numbers, agreed on a scheme for dividing the prize, and coordinated their exploration, possibly with some people in reserve doing warm-up stretches while waiting to be deployed based on discoveries relayed back by the early explorers of each path. If the maze, represented as a tree structure, had fewer leaf nodes than people and wasn't lopsided, then maybe "everyone run along a path not taken by competitors" could work, but at the very least they should agree on some condition for coming back to the entrance to strategize based on individual discoveries, or they might all end up simply getting lost.
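As a small illustration of the "fewer leaf nodes than people" case (the maze layout and explorer names here are invented): if the explorers can agree up front on distinct root-to-leaf paths, the whole maze gets covered with no duplicated effort.

```python
# A hypothetical maze as a tree: each corridor maps to its children.
MAZE = {
    "entrance": ["left", "right"],
    "left": ["left-a", "left-b"],
    "right": ["right-a"],
    "left-a": [], "left-b": [], "right-a": [],  # leaf nodes
}

def leaf_paths(node, path=()):
    """Enumerate every root-to-leaf path through the maze."""
    path = path + (node,)
    children = MAZE[node]
    if not children:
        return [path]
    return [p for child in children for p in leaf_paths(child, path)]

# With at least as many explorers as leaves, assign one full path each.
explorers = ["explorer_1", "explorer_2", "explorer_3"]
for who, route in zip(explorers, leaf_paths("entrance")):
    print(who, "->", " / ".join(route))
```

The coordination cost is a single agreement at the entrance; without it, two explorers can duplicate a path while a leaf goes unvisited, which is the "lopsided" failure mode mentioned above.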

Adaptive structure doesn't come from nowhere. If Azathoth has not sacrificed a ridiculous number of people on the altar of fitness to have made us into people who are "naturally inclined" towards success in some particular environment, then we have to sacrifice false theories on the altar of truth so that they may die in our stead.

comment by Normal_Anomaly · 2011-02-20T18:58:12.908Z · LW(p) · GW(p)

For some people some of the time, a better analogy might be that the clues are made available before anyone is allowed into the maze. In this case, it's clearly better to figure out which door is best while you're waiting. To unpack the analogy, it's useful to debias yourself now, so that if a task requiring rationality comes along later (and it will) you can do it as well as possible.

comment by falenas108 · 2011-02-20T04:46:43.558Z · LW(p) · GW(p)

Having lots of people making leaps in different directions might also make science progress faster overall.

Yes, but some of this might be in the wrong direction. We have plenty of examples where scientists have gone with incorrect theories...

Replies from: Student_UK
comment by Student_UK · 2011-02-20T10:58:26.137Z · LW(p) · GW(p)

Of course. Most of it will be in the wrong direction; that's the point. It might not be best for you, but maybe it will be the best thing for the group.

Replies from: falenas108
comment by falenas108 · 2011-02-20T13:46:10.665Z · LW(p) · GW(p)

Sorry, should have been clearer. There are examples where scientists have had incorrect theories that science has accepted, which has set back scientific progress for decades.

This may not be due to running with it; maybe they did give their ideas a great deal of thought before writing about them, so your point may still be valid.

comment by HonoreDB · 2011-02-20T03:26:35.832Z · LW(p) · GW(p)

I think it's best for the world as a whole that there are some arrogant dilettantes who just bolt for the first door they see without doing proper scholarship. I think most of us use a mixed strategy: pick a main pursuit and try to do it as well as possible, and pick a few side pursuits to idly walk through every so often, looking for low-hanging fruit that somehow everyone else missed. The moral being: indulge your urge to be irrationally contrarian sometimes, but don't let it take over your life.

comment by Manfred · 2011-02-20T23:13:14.191Z · LW(p) · GW(p)

The former president of South Africa denied that HIV caused AIDS. Biases matter.

Replies from: Student_UK
comment by Student_UK · 2011-02-21T00:52:20.099Z · LW(p) · GW(p)

OK. Clearly you only read the title and not my actual post. I didn't say that no biases matter, just that they might not always be a bad thing.

Replies from: Manfred, Desrtopa
comment by Manfred · 2011-02-21T04:08:25.734Z · LW(p) · GW(p)

Ah, sorry, I'd assumed that, though you talked about other things in your post, you still wanted attempted answers to the question in the title.

EDIT: Also note that this provides a nice test of the value of heuristics with biases. If everyone on Earth had them rather than not, would the benefit be worth more than a few million African lives?

comment by Desrtopa · 2011-02-21T01:46:52.604Z · LW(p) · GW(p)

Biases may not always be a bad thing, but you can't tell whether they're good in any specific case without comparing them objectively to an unbiased position. You can't skip straight to second-order rationality without employing first-order rationality first. If biases are bad on average, then as a rule you're better off not preserving yours.

comment by prase · 2011-02-20T20:36:00.009Z · LW(p) · GW(p)

The science analogy rests on the assumption that an unbiased scientist is unable to choose what to do. How is this assumption justified? In situations where the analogy applies, realising that you have no time to think and must pick one way at random to win the prize is a perfectly standard rational decision; no bias is involved.

Replies from: Student_UK
comment by Student_UK · 2011-02-21T00:56:10.929Z · LW(p) · GW(p)

You only have no time to think if your main priority is winning the prize. If you are interested in holding true beliefs, then you can take longer. However, our current system tends to reward those who get there first, not those who maximize their chances of being correct.

Replies from: prase
comment by prase · 2011-02-21T13:39:30.597Z · LW(p) · GW(p)

However, our current system tends to reward those who get there first, not those who maximize their chances of being correct.

That depends on the situation, and overall, your description of choosing between bias and success sounds like a false dilemma.

Your scenario assumes that 1) by reading the clues you can find out the location of the prize sooner than if you just waited until someone else found it, but 2) once you have worked out where the prize is, you would have no time to physically get to it first, while meanwhile 3) you want to get the prize and simultaneously 4) you want to learn where the prize is as soon as possible.

I wonder how frequent such situations are (I don't think the analogy applies to science), but in any case 3 and 4 are in conflict, given 1 and 2. A conflict of priorities is hardly something unexpected, and the rational advice is straightforward: decide whether you want 3 or 4 more; if 3, take one door and run; if 4, study the clues. Are you suggesting that deciding on your priorities requires bias, or that prioritising 3 is itself a bias? In either case, you would be using the word in a non-standard way. The LW wiki defines bias as a specific, predictable error pattern in the human mind. The word "error" is important. Bias isn't a catch-all category that includes everything that retards knowledge.