Confabulation Bias
post by EricHerboso · 2012-09-28T01:27:19.040Z · LW · GW · Legacy · 2 comments
(Edit: Gwern points out in the comments that there is previous discussion on this study at New study on choice blindness in moral positions.)
Earlier this month, a group of Swedish scientists published a study that describes a new type of bias that I haven't seen listed in any of the sequences or on the wiki. Their methodology:
We created a self-transforming paper survey of moral opinions, covering both foundational principles, and current dilemmas hotly debated in the media. This survey used a magic trick to expose participants to a reversal of their previously stated attitudes, allowing us to record whether they were prepared to endorse and argue for the opposite view of what they had stated only moments ago.
In other words, people were surveyed on their beliefs and were immediately asked to defend them after finishing the survey. Despite having just written down how they felt, 69% did not even notice that at least one of their answers had been surreptitiously changed. Amazingly, a majority of people actually "argued unequivocally for the opposite of their original attitude".
Perhaps this type of effect is already discussed here on LessWrong, but, if so, I have not yet run across any such discussion. (It is not on the LessWrong wiki or the other wiki, for example.) This appears to be some kind of confabulation bias, where invented positions thrust upon people result in confabulated reasons for believing them.
Some people might object to my calling this a bias. (After all, the experimenters themselves did not use that word.) But I'm referring less to the trick involved in the experiment and more to the bias the experiment reveals toward our own views. This is a fine distinction, but I feel it is important for us to recognize.
When I say we prefer our own opinions, this is obvious on its face. Of course we think our own positions are correct; they're the result of our previously reasoned thought. We have reason to believe they are correct. But this study shows that our preference for our own views goes even further than this. We are biased toward our own positions to such a degree that we will verbally defend them even when we were tricked into thinking we held those positions. This is what I mean when I call it confabulation bias.
Of particular interest to the LessWrong community is the fact that those of us who are more capable of good argumentation are apparently more susceptible to this bias. This puts confabulation bias in the same category as the sophistication effect, in that well-informed people should take special care not to fall for it. (The idea that confabulation bias is more likely to occur in those of us who argue better is not shown in this study, but it seems like a reasonable hypothesis.)
As a final minor point, the effect did not disappear when the changed opinion was extreme. Participants rated their agreement or disagreement on a 1-9 scale; a full 31% of respondents who chose an extreme position (1 or 9) did not even notice when they were shown to have said the opposite extreme.
2 comments
comment by gwern · 2012-09-28T01:42:27.690Z · LW(p) · GW(p)
http://lesswrong.com/lw/elg/new_study_on_choice_blindness_in_moral_positions/ ?
Replies from: EricHerboso
↑ comment by EricHerboso · 2012-09-28T01:59:47.143Z · LW(p) · GW(p)
Thank you for pointing this out. I'm embarrassed for not noticing this in advance of writing the above.