Identifying bias. A Bayesian analysis of suspicious agreement between beliefs and values.
post by Stefan_Schubert · 2016-01-31T11:29:05.276Z · LW · GW · Legacy · 26 comments
Here is a new paper of mine (12 pages) on suspicious agreement between belief and values. The idea is that if your empirical beliefs systematically support your values, then that is evidence that you arrived at those beliefs through a biased belief-forming process. This is especially so if those beliefs concern propositions which aren’t probabilistically correlated with each other, I argue.
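The core idea can be sketched with a toy Bayesian model (the numbers and the two-hypothesis setup are illustrative assumptions of mine, not taken from the paper): if n empirical propositions are probabilistically independent of each other, and an unbiased reasoner would end up agreeing with their values on each one only about half the time, then observing that nearly all n beliefs line up with one's values shifts the posterior strongly toward a biased belief-forming process.

```python
from math import comb

def posterior_bias(n, k, p_unbiased=0.5, p_biased=0.9, prior_bias=0.2):
    """Posterior probability that a belief-forming process is biased,
    given that k of n independent empirical beliefs agree with one's values.
    p_unbiased/p_biased: per-proposition chance of agreement under each
    hypothesis. All parameter values are illustrative, not from the paper."""
    like_biased = comb(n, k) * p_biased**k * (1 - p_biased)**(n - k)
    like_unbiased = comb(n, k) * p_unbiased**k * (1 - p_unbiased)**(n - k)
    numerator = like_biased * prior_bias
    return numerator / (numerator + like_unbiased * (1 - prior_bias))
```

With these assumed parameters, agreeing with your values on all 10 of 10 independent propositions pushes the posterior probability of bias near 1, while agreeing on only 5 of 10 leaves it well below the prior. The independence assumption is doing the work: correlated propositions would count as far fewer effective data points.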
I have previously written several LW posts on these kinds of arguments (here and here; see also mine and ClearerThinking’s political bias test) but here the analysis is more thorough. See also Thrasymachus' recent post on the same theme.
Comments sorted by top scores.
comment by OrphanWilde · 2016-02-01T21:24:11.717Z · LW(p) · GW(p)
I stopped arguing politics when I noticed that my beliefs and values had, at some point in the decade of arguing politics, converged.
I returned briefly once more because one of the forum members asked me to intervene in some ugliness that was taking place - one of the people who was nominally on my side had turned rabid in my absence - but otherwise haven't returned to that pastime.
Since then, my political beliefs have shifted (in some ways, I'm more extreme, in others, more moderate) absent the continual pressure of refinement by argument, but they've diverged from my values again.
I'm not certain whether that makes them more correct, or less, however. The winnowing of all unnecessary elements may have simply revealed the values underlying the beliefs, rather than the constant defense of them driving my values to align with my beliefs.
↑ comment by buybuydandavis · 2016-02-03T09:19:55.465Z · LW(p) · GW(p)
Having beliefs and values converge to the truth is the desired outcome. The trick is knowing if the convergence is to the truth, or just the shortest line projection between the two.
Whether in science or law, truth-producing activities tend to be adversarial. If done honestly and with commitment, with capable adversaries, that's a pretty good system. If you care enough to spend the effort, and have capable and similarly committed adversaries available, I think that's a much better recipe for coming to the truth than stewing in the juices of your own beliefs and the beliefs of your tribe.
↑ comment by OrphanWilde · 2016-02-03T13:35:03.571Z · LW(p) · GW(p)
Having beliefs and values converge to the truth is the desired outcome.
Having your beliefs converge on the truth is the desired outcome.
Values don't have a truthiness property. If your beliefs and your values converge, something else is going on.
↑ comment by buybuydandavis · 2016-02-04T00:26:29.140Z · LW(p) · GW(p)
Since you were talking about values in a political context, I assumed they were political values, which usually presuppose some facts. Policies are rarely preferred entirely deontologically.
Do you have specific examples of the values you were referring to?
↑ comment by OrphanWilde · 2016-02-04T13:09:59.558Z · LW(p) · GW(p)
Policies aren't values. Values are those things which cause an individual to choose which policies to support.
↑ comment by buybuydandavis · 2016-02-05T01:50:18.325Z · LW(p) · GW(p)
That's not helping me at all.
I know the fact value distinction. I'm asking for specific examples so I can understand how you personally apply that.
↑ comment by OrphanWilde · 2016-02-05T14:19:20.739Z · LW(p) · GW(p)
That's not helping me at all.
In order to help you, I have to know what you need help doing.
I know the fact value distinction.
You suggested political values (which I'm re-interpreting as either "value" or "policy preference") presuppose facts. I think our definitions of "value" must diverge if that is what you think is the case, and assume you are referring to policy preferences instead.
I'm asking for specific examples so I can understand how you personally apply that.
I'm not sure what you're asking for examples of, but here are some of my values:
Honesty. Correctness. Efficiency.
Here are some of my policy preferences:
Free speech (including lies or simple wrongness). Bodily autonomy (including abortion, drug usage, and sexuality). Market autonomy (that is, what is commonly referred to as capitalism).
The axiom underlying my policy preferences is autonomy and self-responsibility. My central personal value is integrity.
comment by Douglas_Knight · 2016-02-01T19:10:42.170Z · LW(p) · GW(p)
If people's beliefs cluster, then there must be a common cause of the beliefs. One possible cause is politics. But you left out another potential common cause, somewhere between your (b) and (c), which is that there is some single factual belief which causes the large number of specific factual beliefs, which, in turn, cause politics.
This is related to the existence of a left-right political spectrum. Why is politics one dimensional? Many people say that this is the result of the two party system, forcing people into coalitions. Do people wind up with factual beliefs supporting their entire party platform, including coalition partners whose interests are not the same as their own? If parties are coalitions, you might expect different coalitions in different countries. Correlations between factual beliefs might switch between countries. But I do not think this happens. And many countries, like Germany, have proportional representation systems that do not require parties to be large coalitions. Yet German politics seems pretty one-dimensional to me.
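The "common cause" mechanism in this comment can be made concrete with a small simulation (an illustrative model of my own, not from the comment): if a latent variable such as political identity partially determines several otherwise independent factual beliefs, those beliefs become correlated across the population; remove the common cause and the correlation vanishes.

```python
import random

def simulate_beliefs(n_people, common_cause_strength, seed=0):
    """Simulate two binary factual beliefs that may share a latent common
    cause (e.g. a left/right identity). Returns the Pearson correlation
    between the two beliefs across the population. Illustrative only."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n_people):
        latent = rng.random() < 0.5  # e.g. left vs right identity
        beliefs = []
        for _ in range(2):
            if rng.random() < common_cause_strength:
                beliefs.append(latent)               # inherited from the cause
            else:
                beliefs.append(rng.random() < 0.5)   # formed independently
        xs.append(beliefs[0])
        ys.append(beliefs[1])
    # Pearson correlation for the two binary belief variables
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    vx = sum((x - mx) ** 2 for x in xs) / len(xs)
    vy = sum((y - my) ** 2 for y in ys) / len(ys)
    return cov / (vx * vy) ** 0.5
```

A strong common cause (strength near 1) yields strongly correlated beliefs even though neither belief influences the other; with strength 0 the correlation is near zero. The same mechanism works whether the latent variable is party politics or, as the comment suggests, a single upstream factual belief.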
↑ comment by Viliam · 2016-02-02T10:27:12.509Z · LW(p) · GW(p)
Why is politics one dimensional? Many people say that this is the result of the two party system, forcing people into coalitions.
I believe it's the other way round. People were dividing others into "us" and "them" long before political parties were invented.
I'd say that "us" and "them" is hardcoded in people. We also have a bias to imagine that all our enemies are in some sense the same (so there is only one "them", instead of "them1", "them2", "them3"...). Most people are probably bad at imagining that more than two options are possible.
Also, there are often binary decisions to make: Someone proposes a new change of law in the parliament, do you vote "yes" or do you vote "no"?
If parties are coalitions, you might expect different coalitions in different countries. Correlations between factual beliefs might switch between countries. But I do not think this happens.
Sure it does. For example, in Slovakia, the only political party that supports legalization of marijuana and gay marriage is classified as right-wing (their political opponents love to say "extreme right-wing"), because they also happen to support the free market. If I understand it correctly, in the USA marijuana and gay marriage are generally considered left-wing issues.
↑ comment by Lumifer · 2016-02-02T19:02:02.448Z · LW(p) · GW(p)
We also have a bias to imagine that all our enemies are in some sense the same (so there is only one "them", instead of "them1", "them2", "them3"...). Most people are probably bad at imagining that more than two options are possible.
Is that a falsifiable statement and do you have support for it?
By introspection this is false for me, but then I'm not "most people". However by the same token I would be wary of sweeping generalisations about "most people".
If someone told me "all my (political) enemies are the same, no significant difference between them", I would probably consider that person pretty stupid.
↑ comment by Vaniver · 2016-02-02T19:09:51.248Z · LW(p) · GW(p)
The technical term is out-group homogeneity.
↑ comment by Douglas_Knight · 2016-02-02T18:54:45.677Z · LW(p) · GW(p)
Us vs Them would suggest two parties, not one-dimensional politics. If people are forced into coalitions, then the variation inside the coalitions should be orthogonal to the direction between the coalitions. But it seems to me that most of it is along the axis separating the parties. Maybe that's an illusion because it is salient - who is a swing voter, who can negotiate with the other party, etc. Moreover, when I look at a European country, I see a bunch of parties strung out along a single axis. This seems less likely to be an illusion, though I have less experience with European politics. How is this driven by Us vs Them?
Slovakia has a small libertarian party. So does America. The Slovak party is larger, partly because small parties are viable in a proportional system, and maybe also for other reasons reflecting a difference between the populations. That doesn't necessarily mean that the correlations between political positions are different between the two countries.
Moreover, the topic of the post is factual beliefs. Even if the structure of the coalitions causes people's policy preferences, that is a weaker claim than that it causes their factual beliefs. Maybe free-market Americans join the Republican party and come to oppose gay marriage and drug legalization, whereas free-market Slovaks join a different party with different influence. But I rather doubt that positions on gay marriage are driven by factual beliefs. Drug legalization might be, but it is probably a minor belief, a higher-order correction, compared to the important beliefs that drive their free-market positions. What I said I don't believe is that different coalitions drive the correlations between factual beliefs.
↑ comment by Douglas_Knight · 2016-02-02T19:02:05.987Z · LW(p) · GW(p)
I said that when I look at a European country, I see a bunch of parties strung out along a left-right axis. But, actually, I guess I don't see anything, I just hear people describing the parties that way. Say, a left party L, a center party C, and a right party R, allowing LC and CR coalitions. But often, when I look closely, they do seem to have exotic platforms that shouldn't rule out the LR coalition. For example, people were shocked by the British Liberal Democrats forming a coalition with the Tories, because "everyone knew" that they were a left party. (I guess everyone knew that because the Liberals and Social Democrats used to be left-wing, but somewhere along the way the Liberal positions became right-wing.) Similarly "everyone knows" that anti-immigrant parties are right-wing, and that it is completely impossible for them to form coalitions with left-wing parties, but most of them are single-issue parties with little opinion on anything else, certainly not right-wing positions. (But they can't form coalitions with anyone because they are anti-establishment.)
And, similarly, it is odd that the Slovak libertarian party is labeled "extreme right-wing", seemingly ruling out the possibility of including it in a left coalition and giving it control of civil liberties.
↑ comment by Viliam · 2016-02-03T12:10:11.967Z · LW(p) · GW(p)
And, similarly, it is odd that the Slovak libertarian party is labeled "extreme right-wing", seemingly ruling out the possibility of including it in a left coalition and giving it control of civil liberties.
Well, in Slovakia "left-wing" means communists, so the civil liberties are a right-wing topic here. The current "left-wing" topic is how we need to hire hundreds of new policemen, to protect us from the immigrant hordes.
I think that the communists in the post-communist countries are psychologically an equivalent of the religious right in the countries that didn't have communism. That's another part of what makes speaking about "left" and "right" so confusing.
↑ comment by Lumifer · 2016-02-03T16:29:28.952Z · LW(p) · GW(p)
I think the word you need is "statism" -- the belief that strong central power is the best. It is shared by e.g. communists and fascists.
comment by Jiro · 2016-02-01T02:21:40.647Z · LW(p) · GW(p)
How many beliefs concern propositions which aren't probabilistically correlated with each other, though?
Also, this seems to ignore the possibility of deriving one's values from one's beliefs.
Also, this seems to ignore the possibility that people cannot research everything themselves, and therefore have to trust others to get some of their beliefs, and that if they have reason to distrust someone who happens to be using a true belief, they can justifiably distrust true beliefs. Climate change is extremely convenient for the left to promote leftist policies, to the point where anyone on the right who doesn't know a lot of science would justifiably think "the people who believe in climate change are probably engaged in motivated reasoning".
↑ comment by gjm · 2016-02-01T13:22:56.938Z · LW(p) · GW(p)
this seems to ignore the possibility of deriving one's values from one's beliefs
The possibility (or something closely related) is raised in the section headed "Causes of agreement between political views and factual beliefs" -- it's option (b) in the list near the start of that section. But that option gets dismissed rather rapidly. (Too rapidly, I think.)
↑ comment by gjm · 2016-02-01T13:21:43.336Z · LW(p) · GW(p)
Climate change is extremely convenient for the left to promote leftist policies
In so far as this is true, it is also true that climate change denial[1] is extremely convenient for the right. (And vice versa.)
[1] I intend this here to mean simply "denying" rather than "denying in the face of what ought to be overwhelming evidence"; there doesn't seem to be a neutral way of putting it.
This is a general phenomenon: if there are rival positions X and Y on a factual matter, which if true would support rival positions P and Q on a matter of policy, then you may suspect partisans of P of bias when they assert X, but you may equally suspect partisans of Q of bias when they assert Y. So if you are not yourself very partisan, what difference should this make to your opinions about X versus Y? It should make you treat someone's opinion about X/Y as less informative in so far as they have a partisan position on P/Q that would explain it. (But bear in mind that the causation may go X/Y -> P/Q rather than the other way around, so the appropriate discounting is less than you might naively think.)
So, in this case, you might reasonably be very suspicious about American politicians' statements about climate change, because in the US the issue is very politicized. So, where else should you look?
- Outside the US. It's really only in the US that such a tight relationship between politics and opinion on climate change is present. Outside the US, even right-wing politicians (whom you might, in terms of the model here, expect to have a strong bias against belief in anthropogenic climate change) generally agree that the climate is warming because of human industrial activity, and they generally favour taking measures to reduce the warming. (But "conservative parties do not challenge coal or petroleum in countries with large reserves of these resources".)
- The opinions of actual climate scientists. Being a scientist is of course no guarantee of being unbiased, and there are ways for scientific institutions to be biased even when individual scientists are not.[2] But reducing the influence of bias is a large part of what scientific methods and institutions are for, and probably most climate scientists got into the field before it was as political as it is now. (And many of them are outside the US; see above.) Well, it appears that climate scientists are pretty close to unanimous about the reality of anthropogenic climate change.
- People whose opinions on politics and climate change don't "match". E.g., take a look at the results here. It appears that about 24% of conservative Republicans accept anthropogenic climate change, and about 19% of liberal Democrats don't. Maaaaybe that's evidence that anthropogenic climate change is real. (But you need to be careful about this sort of thing: it could be that the political bias is stronger one way than the other. For what it's worth, my guess is that on this issue the conservative->no-ACC link is stronger than the liberal->ACC link, which would mean that that discrepancy is evidence for ACC, but I don't have any very compelling arguments to support that guess.)
[2] E.g., a wealthy entity -- a government, an industry consortium, etc. -- might provide funding for climate research in ways that happen to favour employment of scientists whose views match the wealthy entity's. Those scientists may individually be perfectly unbiased, following the evidence wherever it leads, but the process that selects them may be biased.
↑ comment by Jiro · 2016-02-01T15:24:37.515Z · LW(p) · GW(p)
Climate change is extremely convenient for the left to promote leftist policies
In so far as this is true, it is also true that climate change denial[1] is extremely convenient for the right. (And vice versa.)
According to this argument, you should never think that any position of someone who disagrees with you is based on motivated reasoning. Because if the reasoning is convenient for opinion X, the opposite is always convenient for ~X. For instance, "evolution is a conspiracy of scientists" is convenient for creationists, but "evolution is not a conspiracy of scientists" is convenient for non-creationists. So if you claim the former is motivated reasoning, the creationist can claim the latter is motivated reasoning.
It doesn't really work this way; the positive claim is the one that needs to be analyzed for motivated reasoning.
↑ comment by gjm · 2016-02-01T15:58:45.284Z · LW(p) · GW(p)
you should never think that any position of someone who disagrees with you is based on motivated reasoning.
No; I didn't say that and if you think you can infer it from things I did say then at least one of us has made a serious error.
What I do claim is: (1) the mere fact that their position matches their values is not sufficient ground for thinking they're engaged in motivated reasoning; (2) motivated reasoning can happen (and does happen) on both sides of any issue and you shouldn't assume it's only on one side.
the positive claim is the one that needs to be analysed for motivated reasoning.
This is at best a heuristic (just as e.g. the notion of "the burden of proof" is). The same claim can often be cast as "positive" or "negative" without changing its content, and any claim (positive or negative) may be the result of motivated reasoning. (Stefan's paper gives right-wingers' skepticism about global warming as an example; in this thread you give left-wingers' endorsement of global warming as an example; I bet the amount of motivated reasoning going on in both cases is non-zero[1].)
[1] The amounts may be very different in the two cases.
Let's take a more careful look at your example of creationism. It is absolutely correct that "evolution is a conspiracy of scientists" is convenient for creationists, and that "evolution is not a conspiracy of scientists" is convenient for evolutionists. We can flip them around so that "positive" and "negative" change places: "evolution is an extremely well supported scientific theory and is almost certainly correct" is a positive claim. (You may notice that evolution and climate change fit the same templates.)
So the "positive versus negative" test is no use here. What else can we do? We can investigate the matter thoroughly for ourselves (in which case we no longer need to care much whether other people are engaging in motivated reasoning). We can look for less-biased populations, as I did two comments upthread, in which case we'll find plenty of devoutly religious people and non-scientists who accept evolution and very few devoutly irreligious people and scientists who reject evolution; and we'll find that in places where evolution is less "politicized" (meaning, in this case, less used as a shibboleth in arguments about religion or about the prestige of science) it's very widely accepted.
Or we can do what Stefan's paper describes (which overlaps with what I describe), and look at the extent to which people's attitudes to evolution are part of a bigger picture where whatever's hypothetically motivating them motivates other things too. Do anti-evolutionists tend also to be anti-abortion, anti-same-sex-marriage, (in the US) Republican rather than Democratic, etc.? Why, yes, they very much do, which by Stefan's heuristic suggests that their anti-evolutionism is likely to be the product of their religion. Do evolutionists tend to have positions opposite to those? Probably yes, but not to nearly the same extent.
What if we consider not religion but "prestige of science" as a possible cognition-motivator? Do evolutionists tend to accept anthropogenic climate change, quantum mechanics, general relativity, heliocentrism, etc.? Yes, but most of the things hidden under the "etc." are more or less uncontroversial, and on the actually-controversial ones (e.g., climate change) my prediction is that again we'll see more alignment with allegedly-motivating beliefs on the creationist side than on the non-creationist side.
So, it looks to me as if we can do pretty well at telling whether creationism or its reverse is more likely the result of motivated reasoning, but we need to work harder to do so than just decreeing one of them the "positive" position. Likewise with climate change. Do you disagree?
↑ comment by Jiro · 2016-02-01T16:59:27.398Z · LW(p) · GW(p)
Do anti-evolutionists tend also to be anti-abortion, anti-same-sex-marriage, (in the US) Republican rather than Democratic, etc.? Why, yes, they very much do, which by Stefan's heuristic suggests that their anti-evolutionism is likely to be the product of their religion. Do evolutionists tend to have positions opposite to those? Probably yes, but not to nearly the same extent.
1) What is the extent? "Probably not to the same extent" doesn't really help you here if you don't know what the extent is.
2) This would suggest that the arguments for same-sex marriage were motivated reasoning 20 years ago, and the arguments for atheism were motivated reasoning 100 years ago. What the heuristic is really detecting is popularity, because unpopular beliefs tend to have much higher correlations with other beliefs.
↑ comment by gjm · 2016-02-01T19:12:37.223Z · LW(p) · GW(p)
What is the extent?
One can often compare things better than one can quantify them individually. I expect there are more Baptist preachers in the United States than in Bolivia, but I couldn't tell you how many there are of either.
In the present case, it seems to me that almost all anti-evolutionists just happen to be adherents of conservative forms of religions like Christianity and Islam that involve a relatively recent divine creation, which happen also to have something of a tradition of opposing abortion and homosexuality. (The link with right-wing politics in the US is a bit arbitrary, and actually on further reflection that link may be weaker than I thought, because it's specifically white conservative Christianity in the US that goes with voting Republican.)
What the heuristic is really detecting is popularity, because unpopular beliefs tend to have much higher correlations with other beliefs.
That might be true. (It's not obvious to me that it is.) But if it is, the obvious next question is: why are unpopular beliefs more highly correlated with other beliefs? The obvious candidate answers seem to me to amount to saying that unpopular beliefs are more likely to be caused by something other than the truth. E.g., an unpopular belief may correlate with other beliefs because it's held almost exclusively by a few small groups who were convinced of it by some single person, and he convinced them of a bunch of other things too. That's not the same thing as motivated reasoning, I agree, but it's got the exact same problem: it means that those beliefs are quite likely not very rationally held.
(Only "quite likely". Sometimes the single persuasive person really has spotted something important that others have missed. But the odds aren't good, especially once the belief in question has been around for a while and others have had a chance to be persuaded of it without the founder's charisma.)
↑ comment by Jiro · 2016-02-01T20:59:31.187Z · LW(p) · GW(p)
why are unpopular beliefs more highly correlated with other beliefs?
The point is that even beliefs we consider correct were, when they were still unpopular, limited to a small group of people and highly correlated with other beliefs. That's how ideas spread. At one point, atheism was correlated with lots of radical ideas because all of society was religious, and atheism was so far from the status quo that nobody was going to be one without a type of conviction that would lead them to extremism in general (by their time's standards).
Do you think that support for gay marriage 20 years ago was rare but randomly distributed through the population?