On Expressing Your Concerns
post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-27T04:04:44.000Z · LW · GW · Legacy
The scary thing about Asch’s conformity experiments is that you can get many people to say black is white, if you put them in a room full of other people saying the same thing. The hopeful thing about Asch’s conformity experiments is that a single dissenter tremendously drove down the rate of conformity, even if the dissenter was only giving a different wrong answer. And the wearisome thing is that dissent was not learned over the course of the experiment—when the single dissenter started siding with the group, rates of conformity rose back up.
Being a voice of dissent can bring real benefits to the group. But it also (famously) has a cost. And then you have to keep it up. Plus you could be wrong.
I recently had an interesting experience wherein I began discussing a project with two people who had previously done some planning on their own. I thought they were being too optimistic and made a number of safety-margin-type suggestions for the project. Soon a fourth guy wandered by, who was providing one of the other two with a ride home, and began making suggestions. At this point I had a sudden insight about how groups become overconfident, because whenever I raised a possible problem, the fourth guy would say, “Don’t worry, I’m sure we can handle it!” or something similarly reassuring.
An individual, working alone, will have natural doubts. They will think to themselves, “Can I really do XYZ?” because there’s nothing impolite about doubting your own competence. But when two unconfident people form a group, it is polite to say nice and reassuring things, and impolite to question the other person’s competence. Together they become more optimistic than either would be on their own, each one’s doubts quelled by the other’s seemingly confident reassurance, not realizing that the other person initially had the same inner doubts.
The most fearsome possibility raised by Asch’s experiments on conformity is the specter of everyone agreeing with the group, swayed by the confident voices of others, careful not to let their own doubts show—not realizing that others are suppressing similar worries. This is known as “pluralistic ignorance.”
Robin Hanson and I have a long-running debate over when, exactly, aspiring rationalists should dare to disagree. I tend toward the widely held position that you have no real choice but to form your own opinions. Robin Hanson advocates a more iconoclastic position, that you—not just other people—should consider that others may be wiser. Regardless of our various disputes, we both agree that Aumann’s Agreement Theorem extends to imply that common knowledge of a factual disagreement shows someone must be irrational.1 Despite the funny looks we’ve gotten, we’re sticking to our guns about modesty: Forget what everyone tells you about individualism, you should pay attention to what other people think.
Ahem. The point is that, for rationalists, disagreeing with the group is serious business. You can’t wave it off with, “Everyone is entitled to their own opinion.”
I think the most important lesson to take away from Asch’s experiments is to distinguish “expressing concern” from “disagreement.” Raising a point that others haven’t voiced is not a promise to disagree with the group at the end of its discussion.
The ideal Bayesian’s process of convergence involves sharing evidence that is unpredictable to the listener. The Aumann agreement result holds only for common knowledge, where you know, I know, you know I know, etc. Hanson’s post or paper on “We Can’t Foresee to Disagree” provides a picture of how strange it would look to watch ideal rationalists converging on a probability estimate; it doesn’t look anything like two bargainers in a marketplace converging on a price.
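Here is a toy sketch of that evidence-sharing picture, with made-up names and numbers; it illustrates only the simpler point that Bayesians who pool their actual evidence (here, likelihood ratios for conditionally independent observations) jump straight to a shared posterior, rather than the full Aumann result about common knowledge of posteriors:

```python
# Toy sketch: two Bayesians share a common prior over a hypothesis H and each
# observes private, independent evidence. Exchanging the evidence itself (as
# likelihood ratios) produces one shared posterior in a single step; there is
# no bargaining-style sequence of mutual concessions.

def posterior_odds(prior_odds, *likelihood_ratios):
    """Multiply prior odds by the likelihood ratio of each piece of evidence."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_prob(odds):
    return odds / (1.0 + odds)

prior_odds = 1.0   # common prior: P(H) = 0.5
lr_alice = 4.0     # Alice's private evidence favors H by 4:1
lr_bob = 0.5       # Bob's private evidence favors not-H by 2:1

# Before talking, their individual posteriors differ:
p_alice = odds_to_prob(posterior_odds(prior_odds, lr_alice))            # 0.80
p_bob = odds_to_prob(posterior_odds(prior_odds, lr_bob))                # ~0.33

# After pooling the evidence, both compute the same posterior:
p_shared = odds_to_prob(posterior_odds(prior_odds, lr_alice, lr_bob))   # ~0.67

print(f"Alice alone: {p_alice:.2f}, Bob alone: {p_bob:.2f}, pooled: {p_shared:.2f}")
```

Note that the pooled answer need not split the difference between the two starting estimates; had both pieces of evidence pointed the same way, the shared posterior would be more extreme than either individual one, which is part of why the process doesn't look like haggling over a price.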
Unfortunately, there’s not much difference socially between “expressing concerns” and “disagreement.” A group of rationalists might agree to pretend there’s a difference, but it’s not how human beings are really wired. Once you speak out, you’ve committed a socially irrevocable act; you’ve become the nail sticking up, the discord in the comfortable group harmony, and you can’t undo that. Anyone insulted by a concern you expressed about their competence to successfully complete task XYZ will probably hold just as much of a grudge afterward if you say, “No problem, I’ll go along with the group,” at the end.
Asch’s experiment shows that the power of dissent to inspire others is real. Asch’s experiment shows that the power of conformity is real. If everyone refrains from voicing their private doubts, that will indeed lead groups into madness. But history abounds with lessons on the price of being the first, or even the second, to say that the Emperor has no clothes. Nor are people hardwired to distinguish “expressing a concern” from “disagreement even with common knowledge”; this distinction is a rationalist’s artifice. If you read the more cynical brand of self-help books (e.g., Machiavelli’s The Prince) they will advise you to mask your nonconformity entirely, not voice your concerns first and then agree at the end. If you perform the group service of being the one who gives voice to the obvious problems, don’t expect the group to thank you for it.
These are the costs and the benefits of dissenting—whether you “disagree” or just “express concern”—and the decision is up to you.
1. See "The Modesty Argument." http://lesswrong.com/lw/gr/the_modesty_argument.
38 comments
Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).
comment by manuelg · 2007-12-27T05:01:19.000Z · LW(p) · GW(p)
But isn't this just another failure mode of groups working together, which we already know is far from optimal?
Like so many of the other failure modes of groups (stupid but loud people having an over-sized influence, smart but shy people having no influence, stopping exploring the problem/solution space way too early, couching everything in weasel-words, etc.), you can do so much better with an iterative process:
1. Quick brainstorming.
2. Written summary of everything said during brainstorming.
3. All participants work on sub-problems on their own.
4. All participants present individual findings before the whole group.
5. Repeat (solo work becoming less about research and more about production as time goes on).
This gets to the heart of one thing I don't understand about "Overcoming Bias: The Blog-site". Is the idea to stamp out bias in others, or is the idea to prevent bias in ourselves?
The only people who have a chance of "overcoming bias" are the ones striving for a goal under significant constraints. Because they are the only ones willing to shoulder the burden of consistent rationality.
comment by Tristram_Brelstaff · 2007-12-27T09:31:30.000Z · LW(p) · GW(p)
So, if you are an emperor wanting honest advice on your wardrobe, then Asch's results suggest that 'planting' one or more dissenters would be a good way to get it.
↑ comment by theguyfromoverthere · 2014-06-19T18:05:49.329Z · LW(p) · GW(p)
That is very clever!
↑ comment by Autolykos · 2017-02-09T13:10:50.150Z · LW(p) · GW(p)
It's probably one of the many useful functions of the court jester :)
↑ comment by Matt Vincent (matthew-milone) · 2022-05-04T19:20:18.127Z · LW(p) · GW(p)
It's useful until the jester gains a reputation as someone whose views shouldn't be taken seriously, at which point the jester's dissent may begin to have the opposite effect.
↑ comment by Martin Randall (martin-randall) · 2024-09-26T12:32:51.161Z · LW(p) · GW(p)
This can be countered: the emperor can occasionally take the jester's side, and the jester can hide serious views behind a mask of silliness.
comment by Al_Fin · 2007-12-27T09:37:59.000Z · LW(p) · GW(p)
Progress comes from the persons willing to be different and create a new approach to solving problems--or entire new industries. Conformity is the bane of the politically correct approach to "consensus." In a conformity environment the best one can hope for is a local optimum solution that will likely be outworn quickly as reality sets in.
The rush to premature consensus destroys the possibility of achieving a global optimum.
In Russia and China one can be shot for being different. In the west, one is merely ostracized and demonized.
↑ comment by lucidfox · 2011-07-30T17:04:32.587Z · LW(p) · GW(p)
In Russia and China one can be shot for being different.
I think you might need to update your beliefs about Russia. The ones you seem to have are stuck in the 1930s-1940s.
↑ comment by Chris_Roberts · 2012-08-24T16:08:19.934Z · LW(p) · GW(p)
Maybe not shot, but still jailed:
http://news.blogs.cnn.com/2012/08/17/russian-court-to-rule-in-pussy-riot-trial/?iref=allsearch
comment by RobinHanson · 2007-12-27T14:27:47.000Z · LW(p) · GW(p)
I wonder: by introducing dissenters can one get people to disagree too much with the majority?
comment by Richard_Hollerith2 · 2007-12-27T14:39:13.000Z · LW(p) · GW(p)
Is the idea to stamp out bias in others, or is the idea to prevent bias in ourselves?
The only persons we can help to become more rational are those who have freely chosen to try to become more rational. It is impossible with our current technology to force another person to become more rational against his will.
The mere fact that you asked that question makes me a little worried about you, manuel. Consider avoiding for a few years anyone who describes their politics as "progressive" or Leftist or who frequently appeals in an unquestioning way to the ideal of "social justice", diversity or multiculturalism!
Note that the progressives and the Leftists can and do prevent people in strategic occupations such as the media and the universities from expressing certain beliefs in public, but that is different from being able to force them to become more rational.
Of course other groups try to stamp out beliefs and opinions, too, but someone who comments here was most likely led to the idea that it might be a good idea to stamp out bias in others by the Leftists or the progressives.
The only people who have a chance of "overcoming bias" are the ones striving for a goal under significant constraints.
I agree.
comment by Chris · 2007-12-27T18:18:43.000Z · LW(p) · GW(p)
I've always wondered, since I was very small, why 'The Emperor's New Clothes' as commonly told doesn't include the scene where the Emperor has the Imperial Guard clear the street with a sabre charge.
↑ comment by taryneast · 2010-12-19T14:43:42.160Z · LW(p) · GW(p)
Probably for the same reason that the proverb doesn't go:
In the land of the blind, the one-eyed man is burned at the stake for espousing the heresy of light in our kingdom of blessed darkness.
People are trying to make their own point with proverbs... and don't like them to be turned back upon themselves.
↑ comment by wizzwizz4 · 2019-07-08T17:09:10.620Z · LW(p) · GW(p)
That sounds like an H. G. Wells story (you can listen to it here).
↑ comment by tlhonmey · 2022-05-16T19:59:53.259Z · LW(p) · GW(p)
Because, once the child had said it and everyone was laughing, it was too late. Everyone knows the emperor is an idiot now; his authority is pretty well broken. If he gets violent at that point, his head will be on a spike by sundown.
Which... Not all emperors in real history have been that smart. So it could be a fitting end for the story nonetheless.
comment by manuelg · 2007-12-27T18:58:33.000Z · LW(p) · GW(p)
The mere fact that you asked that question makes me a little worried about you, manuel.
Uh, thx 4 ur concern. Kthxbye.
I call myself a liberal. Not because I act or think like most self-described liberals, but because the simple word "liberal" sends waves of debilitating revulsion through many people. Precisely the people whom I identify as having a low probability of sustaining rational thought.
I am a liberal, but I am profoundly uninterested in coercing change in the beliefs or behavior of others. I find it a full-time job to coerce change in the beliefs and behavior of myself, consistent with goals, values, responsibilities, and personal roles I choose for myself. After working on myself, there is no time or energy left to try to affect others.
Frankly, I have zero confidence in any program of coercing change in the beliefs or behavior of others, regardless of the agency or the means. The specific means always overtake whatever was the initial positive goal. And the outcome becomes waste, sin, or cruelty.
That is what I find puzzling about "Overcoming Bias: The Blog Website". It is interesting when it discusses self-disciplines that are conducive to rationality. It is puzzling when it discusses irrationality of others. Because there is no agency or means to force others to be rational.
Others delight in irrationality. Full stop.
↑ comment by taryneast · 2010-12-19T14:47:21.271Z · LW(p) · GW(p)
It is puzzling when it discusses irrationality of others. Because there is no agency or means to force others to be rational.
I think there's a hope that we'll find ways of persuading others. So it's worth tossing ideas around just on the off chance we'll find new and more potent tools for that purpose.
It also helps us feel better about our own choice ;)
comment by Richard_Hollerith2 · 2007-12-27T19:49:04.000Z · LW(p) · GW(p)
Sorry for reading your question in an uncharitable way and for lecturing you, manuel. You have made me aware that the name of this blog is less than ideal because it admits an unfortunate second interpretation (namely, "stamping out bias in others").
↑ comment by Luke_A_Somers · 2015-09-24T14:21:06.548Z · LW(p) · GW(p)
Stamping out is definitely excessively strong, but perhaps finding ways to promote social structures which in turn help promote rationality is not too much?
comment by Barkley_Rosser · 2007-12-27T20:25:40.000Z · LW(p) · GW(p)
It is well known that capable leaders consciously surround themselves with advisers who hold competing views, with at least some able to tell the leader when things are not going well. We have just been seeing a counterexample of this in an important real world position for the last seven years...
comment by Chris_Hibbert · 2007-12-28T01:08:14.000Z · LW(p) · GW(p)
In some companies I've worked for, we've found ways of running meetings that encouraged contributing information that would be considered an attack in many other companies. The particular context was code reviews, but we did them often enough that the same attitude could be seen in other design discussions. The attitude we taught the code's presenter to have was appreciation for the comments, suggestions, and actual bugs found.
The catechism we used to close code reviews was that someone would ask the presenter whether the meeting had been valuable, and the appropriate response was always "yes". The presenter could find different things to say about the value contributed by the attendees, but the catechism reinforced the point of view that improving the code is worth the time spent by the reviewers. As people got better at reviewing and being reviewed in the proper spirit, everyone who worked with us seemed to learn that finding fault with the code and explaining the problem clearly helped the company produce better products.
Once the engineers had learned how to provide constructive criticism, and others in the company learned to understand the spirit in which it was intended, it was easier to present disagreement on other subjects without needing to disagree at the end.
comment by Colin_Reid · 2007-12-29T18:03:02.000Z · LW(p) · GW(p)
Assuming spontaneous original thought is too difficult (and I doubt anything in this comment is original), how about this as a ritualised way of avoiding group-think:
A company has regular meetings to discuss its tactics. However, before the meeting, the boss tells one of the participants to be a rebel. (The others don't know who is the designated rebel at a given meeting, but it is understood that everyone will be told to play rebel sooner or later, for fairness if nothing else.) The rebel's job is to come up with persuasive arguments against the consensus position, even if it's a consensus the boss is believed to support (assuming the matter is still up for discussion). The rebel doesn't have to always take a minority position, so as not to force him into absurdities, but he has a bias in favour of rebellious behaviour because it will please the boss.
Why the secrecy? Because the uncertainty about who is the rebel creates a window for other participants to genuinely express anti-consensus opinions, something they'd otherwise be afraid to do for fear of ostracism. This is the real purpose of the rebel from the perspective of the boss.
Now the danger here is that the designated rebel will come to the meeting wearing black, so to speak, and so won't actually count for much in the social perspective of the other participants. However, the rebel has an incentive not to make it so obvious. In fact, even the would-be conformists benefit from disguising the rebel, if they think the consensus is genuinely the right position, because as soon as the rebel is unmasked, the aura of the boss is also clear to see on him, so others would be socially obliged to show him more respect. (But this could make them inclined to agree with him, which makes his job as rebel intellectually taxing, coupled with the extra pressure of increased attention, so he won't enjoy this reverence too much.) Also, the others may feel sympathy for the rebel, because it's not a role he has chosen, and the chances are that they will be called on to do the same. This sympathy also extends to possible rebels. So this will hopefully make dissent much more socially acceptable, and reduce the urge to 'destroy the traitor' by ignoring or ridiculing him.
Why have only one rebel? Because if everyone were rewarded for rebellion, it would create constant disagreement for the sake of it. (To make it clear, the boss does not automatically reward all rebellions in the meeting, only those of the designated rebel.) The designated rebel is just there to break the spell of unanimity. He is made to sacrifice much of his own freedom of action, but in a decent-sized group this is hopefully compensated for by the increased independence of the others.
Does this kind of manufactured dissent actually work in reducing bias overall? Or would the 'we hate the lone rebel' bias prove too strong to overcome, even when it's theoretically trumped by the approval of the boss?
↑ comment by danlowlite · 2010-10-29T14:18:53.311Z · LW(p) · GW(p)
I've been in a meeting where this was done and an openly designated "contrarian" was appointed. The specific instance where this was performed was a "diversity" training, so YMMV.
He didn't do anything. He was too new and high in the organization to be effective. His position, when he did speak up, made it unlikely that someone would contradict his contradictions. While eventually he became effective at his job (replacing a much-loved person, no easy task), it was still simply not like him to do this; we all saw it and he didn't work out in this contrarian role.
See also: Good Cop/Bad Cop.
comment by Ben_Jones · 2007-12-30T01:20:33.000Z · LW(p) · GW(p)
Some very good points. I think this is the same phenomenon that means that when a single person sees a traffic accident, they'll instantly phone for help and get involved. When the same accident happens in a crowded street, everyone looks at everyone else to see what to do.... I'm sure this has been mentioned in a past post though.
If only everyone had a built-in heuristic saying 'Shut your eyes! Decide what you think! Now, open your eyes and go for it!'
Eliezer - I can't count the number of times I've arrived on the scene to meet some (normally-rational) friends who, for the sake of politeness, have 'chosen' to embark on a truly silly 'agreed' course of action. This even occurs to the point where, when asked 'what were you thinking?', the response is simply to look at one another and wait for an answer...
comment by Remco · 2008-01-05T10:00:24.000Z · LW(p) · GW(p)
Another thing from software development: I know companies that try to decrease this effect when making estimates in planning, by giving every team member a stack of cards with possible guesses (1 day, 2 days, 3 days, 5 days, etc). Each member chooses a card in secret, then everybody turns theirs over at the same moment. If everybody more or less agrees, then that value is chosen. If there's disagreement, they talk more (typically people who chose outliers get to explain why), then the procedure is repeated.
First step is clearly good, in the second step conformity is probably a factor again.
Another interesting point (not for this discussion, but for software estimates :-)) is that the numbers on the cards are the Fibonacci series, and they don't actually have hours or days, but "points". The project leader has historical data that tells him how long one "point" actually took in past projects.
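A minimal sketch of how one reveal-and-repeat round of this might look (the names, the card values, and the rough "within a factor of two counts as agreement" rule are illustrative assumptions, not how any particular tool actually does it):

```python
# Toy sketch of one estimation round: everyone picks a card in secret, all
# cards are revealed at once, and the group accepts a value only when the
# estimates are already close; otherwise the outliers explain and you repeat.

CARDS = [1, 2, 3, 5, 8, 13, 21]  # Fibonacci-style "points", not hours or days

def reveal_round(estimates):
    """estimates: dict mapping member name -> card value chosen in secret."""
    lo, hi = min(estimates.values()), max(estimates.values())
    if hi <= 2 * lo:  # rough stand-in for "everybody more or less agrees"
        agreed = sorted(estimates.values())[len(estimates) // 2]  # median card
        return agreed, []
    outliers = [name for name, v in estimates.items() if v in (lo, hi)]
    return None, outliers  # no consensus: outliers explain, then everyone re-votes

first_round = {"Ann": 3, "Ben": 5, "Cat": 13}   # hypothetical first reveal
print(reveal_round(first_round))                # (None, ['Ann', 'Cat']) -> discuss, repeat

second_round = {"Ann": 5, "Ben": 5, "Cat": 8}   # hypothetical estimates after discussion
print(reveal_round(second_round))               # (5, []) -> 5 points agreed
```

The secrecy only protects the first reveal, of course; conformity can creep back in during the discussion-and-repeat step.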
↑ comment by orthonormal · 2011-12-06T16:14:25.328Z · LW(p) · GW(p)
An obvious fix is to make this a secret ballot: shuffle together the chosen answers and set aside the others. This would make it far easier to dissent. The dissenter might choose not to speak up if the group asks who the dissenter was, but the mere presence of the outlier should spur debate.
comment by taryneast · 2010-12-19T14:39:30.052Z · LW(p) · GW(p)
Machiavelli actually strongly advises that a prince should avoid flatterers (i.e. conformists). In fact, in "The Prince", he specifically says "the only way to safeguard yourself against flatterers is by letting people understand that you are not offended by the truth".
Perhaps you're confusing this advice with that of Castiglione's "The Courtier", which was aimed at a different audience. "The Courtier" has a lot of advice about how to suppress one's own thoughts and feelings so as to put on a proper show for one's peers.
Advice for princes is different to advice for their lackeys ;)
↑ comment by wedrifid · 2010-12-19T15:20:43.132Z · LW(p) · GW(p)
In fact, in "The Prince", he specifically says "the only way to safeguard yourself against flatterers is by letting people understand that you are not offended by the truth".
Or you could deliberately say stupid things sometimes and shun anyone who agrees with you. (A common courtship behavior, with the origins of the 'courtship' term being rather pertinent.)
↑ comment by taryneast · 2010-12-19T15:23:57.993Z · LW(p) · GW(p)
I like this one. It reminds me of something that I heard at a Billy Connolly show once.
He was wearing eye-wateringly lurid, stripey pants, and told us this was his "idiot detector"... anybody that tried to make fun of him, he didn't need to bother talking to.
He also recommended hideous brooches (for men) for the same purpose... e.g. three flying ducks.
↑ comment by wedrifid · 2010-12-19T15:35:58.883Z · LW(p) · GW(p)
Did he happen to mention the necessity of acquiring high status via other means (or signalling conformance to high status trends in most other areas) before trying things like this?
↑ comment by taryneast · 2010-12-19T16:09:08.656Z · LW(p) · GW(p)
Not that I recall... though I suspect that it'd actually work better as an idiot detector if you were not of high status... because you would be more likely to get an honest reaction.
If I were somebody very famous, some people might actually repress their natural tendencies to be an arse about something they didn't expect... and instead assume that I was being "avant garde", i.e. a fashion inspiration.
comment by MarkusRamikin · 2012-04-13T07:40:29.719Z · LW(p) · GW(p)
A group of rationalists might agree to pretend there's a difference, but it's not how human beings are really wired.
Is this the most that we can accomplish, agree to pretend?
comment by 1gn1t0r · 2012-08-10T07:59:26.617Z · LW(p) · GW(p)
The thought always occurred to me that The Emperor's New Clothes represents a religious allegory.
I remember as a child I always thought other people could hear and see and communicate with God and that I was the only one who could not. Hence I would always pretend to be able to see the clothes of the Emperor.
Indeed, if there are no dissenters then you don't want to be the first.
comment by Yoav Ravid · 2019-01-08T17:18:35.061Z · LW(p) · GW(p)
The hopeful thing about Asch's conformity experiments is that a single dissenter tremendously drove down the rate of conformity, even if the dissenter was only giving a different wrong answer. (...) Being a voice of dissent can bring real benefits to the group. But it also (famously) has a cost. And then you have to keep it up. Plus you could be wrong.
So a good thing to do can be to voice a disagreement with the group (even if you don't know the answer), in order to give someone else the courage to speak the truth, and then support whatever it is he said (assuming for some reason you still can't know the answer)?
That's assuming that other people aren't likely to speak up with just anything so that someone else will speak the truth, so it's probable that if someone defies the group he has a good reason to think he's right.
comment by gianlucatruda · 2021-04-07T09:55:16.582Z · LW(p) · GW(p)
There will be a Sequences Discussion Club event to talk about this post. Join us on Clubhouse tonight for a ~1h discussion. https://www.joinclubhouse.com/event/PQRv1RoA