Adversarial System Hats
post by Johnicholas · 2009-03-11T16:56:05.745Z · LW · GW · Legacy · 15 comments
In Reply to: Rationalization, Epistemic Handwashing, Selective Processes
Eliezer Yudkowsky wrote about scientists defending pet hypotheses, and prosecutors and defenders as examples of clever rationalization. His primary focus was advice to the well-intentioned individual rationalist, which is excellent as far as it goes. But Anna Salamon and Steve Rayhawk ask how a social system should be structured for group rationality.
The adversarial system is widely used in criminal justice. In the legal world, roles such as Prosecution, Defense, and Judge are all guaranteed to be filled, with roughly the same amount of human effort applied to each side. Suppose instead that individuals chose their own roles; one role might turn out to be more popular than the others. With unequal effort applied to the different sides, selecting for the positions with the strongest arguments would no longer select much for positions that are true.
One role might be more popular because of an information cascade: individuals read the existing arguments, choose a role in an effort to align themselves with the truth, and then create arguments for that position. Alternatively, a role may be popular due to status-based affiliation, or a desire to be on the "winning" side.
I'm well aware that there are vastly more than two sides to most questions. Imagine a list of rationalist roles something like IDEO's "Ten Faces".
Example rationalist roles, leaving the obvious ones for last:
- The Mediator, who strives for common understanding and combining evidence.
- The Wise, who may not take a stand, but only criticize internal consistency of arguments.
- The Perpendicularist, who strives to break up polarization by "pulling the rope sideways".
- The Advocate, who champions a controversial claim or proposed action.
- The Detractor, who points out flaws in the controversial claim or proposed action.
Because of natural group phenomena such as cascades and affiliation, achieving group rationality requires social structures that deliberately counteract those phenomena. Roles might help.
Comments sorted by top scores.
comment by Daniel_Burfoot · 2009-03-12T02:40:32.930Z · LW(p) · GW(p)
This is a nice proposal. I think the heart of the issue is that the proposer of an idea or a suggestion gets a status boost if the idea is accepted. Thus, we experience a misalignment of incentives: ideally, I want my incentive structure to push me towards the truth, but in practice I am pushed toward advocating the ideas I have proposed.
I would suggest some other roles:
The Fact Checker: this role seeks to ensure that the factual content of the proposed arguments is correct.
The Separator: the person in this role strives to separate terminal from instrumental values.
The Simplifier: the person seeks to simplify and refactor the arguments, making them as clean and direct as possible.
It would be fun to get a bunch of people, find some standard debating topic, assign roles, and see what kind of conclusions come out. In other words, to test the non-adversarial debating format and see what happens.
comment by Annoyance · 2009-03-11T19:37:42.964Z · LW(p) · GW(p)
It's an interesting idea. I don't think it's necessary, but it may be convenient.
A peculiar feature of the idea is that it may not be desirable for the individual roles to be rational. In order to best fulfill certain roles, the people playing them must play up the aspects of themselves that do what the role requires and stifle the aspects that serve the functions of other roles.
If you've taken up a role of generating new ideas, you need to lower your standards so that you're free to come up with things you wouldn't normally think of; if you're the skeptical evaluator for the group, you have to have very high standards and not offer any slack.
↑ comment by Johnicholas · 2009-03-11T20:06:34.571Z · LW(p) · GW(p)
"It may not be desirable for the individual roles to be rational."
Exactly! Group rationality may be easier to achieve by encouraging "irrational" verbal positions. Of course, playing a role doesn't mean believing it; so the actual people behind the roles would have more rational, less one-sided beliefs.
↑ comment by Annoyance · 2009-03-11T20:15:06.824Z · LW(p) · GW(p)
Okay, point taken. Individual behavior would become irrational, even if the thought processes behind the actions were perfectly rational. I will note that our beliefs tend to follow our actions. Trying to act irrationally likely has corrosive effects on our capacity to be rational unless we're very careful.
Now the question is: are there any benefits to this sort of group rationality, as opposed to simply encouraging everyone in the group to become more individually rational?
↑ comment by Johnicholas · 2009-03-11T20:28:12.133Z · LW(p) · GW(p)
I don't know, but I think the legal system might be (some) evidence in favor of using roles.
If cases were instead judged by professionals encouraged to be individually rational, it's not clear how the outcomes would differ, but there would seem to be fewer corrective mechanisms against small biases. It would be something like open-loop control.
↑ comment by Annoyance · 2009-03-11T20:30:38.353Z · LW(p) · GW(p)
I tend to view any system in which rhetoric is emphasized as a great example of how not to be rational.
Speech and debate competitions in high school only solidified that belief.
↑ comment by HCE · 2009-03-12T01:14:56.670Z · LW(p) · GW(p)
at the same time, if you're a lawyer defending someone likely to be innocent and your goal is to have him exonerated, the most rational strategy is to use whatever lawyerly wiles you have at your disposal to convince an irrational jury of his innocence. an airtight bayesian argument may not be understood or it may be understood but disregarded, whereas a persuasive story vividly told can convince a jury of almost anything.
you cannot win the game if you refuse to accept the rules, and one of the implicit rules in almost every social game is that almost all of the participants are irrational almost all of the time.
comment by pjeby · 2009-03-11T18:16:51.792Z · LW(p) · GW(p)
Sounds a bit like de Bono's Six Thinking Hats, as well.
↑ comment by Johnicholas · 2009-03-11T19:07:46.638Z · LW(p) · GW(p)
Yes, I referenced those in a previous post, and should have mentioned them again explicitly. Thanks.
comment by Scott Alexander (Yvain) · 2009-03-12T13:38:21.489Z · LW(p) · GW(p)
I'm completely uncertain whether this would work better, worse, or the same as more common methods of group decision-making. It's certainly an interesting idea.
I would offer one caution, though. I find that businesses, schools, and decision-making workshops are far too willing to accept any cute or clever or radical-sounding idea without any evidence that it works. It's easier to use such methods as a boast: "Don't say our decisions aren't rational. We care so much about being rational that we make all our decisions with special rationalist hats. If you're so rational, what do you do?" With "make decisions as well as possible based on available information" being a less acceptable answer than "have color-coded teams using the Ten Step Rationalo-X Colors Method" or whatever.
For me to use this, I would need evidence that it worked. The best evidence would come from assigning people to random groups, having one group talk a problem out informally while the other uses this hats method, and giving them problems that are difficult but have one correct answer. If the hat people come to the correct answer more often than the non-hat people, then we use hats for everything.
I don't know why people don't do this more often for the common decision-making systems proposed, but I'll bet Robin Hanson would have some choice things to say about it.
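A minimal sketch of the kind of comparison Yvain describes, assuming made-up group sizes and outcomes; the function names and success rates here are hypothetical placeholders, not part of the original proposal:

```python
import random

def assign_groups(participants, group_size):
    """Randomly split participants into groups of the given size."""
    shuffled = participants[:]
    random.shuffle(shuffled)
    return [shuffled[i:i + group_size] for i in range(0, len(shuffled), group_size)]

def accuracy(results):
    """Fraction of groups that reached the correct answer."""
    return sum(results) / len(results) if results else 0.0

participants = [f"person_{i}" for i in range(40)]
groups = assign_groups(participants, group_size=5)
hat_groups, informal_groups = groups[:4], groups[4:]

# Placeholder outcomes; a real test would record whether each group
# actually reached the correct answer on its assigned problems.
hat_results = [random.random() < 0.6 for _ in hat_groups]
informal_results = [random.random() < 0.5 for _ in informal_groups]

print("hat accuracy:", accuracy(hat_results))
print("informal accuracy:", accuracy(informal_results))
```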
comment by [deleted] · 2009-03-12T00:59:13.512Z · LW(p) · GW(p)
It seems to me it would be helpful if people would at least explicitly TAG themselves as to their role. Even if they picked the roles themselves, we could get a better overall view of the argument; if there were only one person on side B, we could pay special attention to them, and so on.
↑ comment by [deleted] · 2009-03-12T02:13:08.461Z · LW(p) · GW(p)
Here's a crude schematic of how positional tagging could be implemented on a site like this. You'd pick an existing tag, or make a new one if none of them fit your position. You could have a different position in different posts.
If your position is similar to an existing position, you can pick a color which is similar, and so on.
If you don't tag your own posts, anybody can come along and tag them for you until you do.
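A rough sketch of what that schematic might look like as a data model; the class and field names here are invented for illustration, not an actual site feature:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PositionTag:
    name: str   # e.g. "pro-hats", "anti-hats", "perpendicular"
    color: str  # similar positions can pick similar colors

@dataclass
class Comment:
    author: str
    text: str
    tag: Optional[PositionTag] = None
    tagged_by: Optional[str] = None  # who applied the current tag

    def apply_tag(self, user: str, tag: PositionTag) -> None:
        # The author's own tag is final; anyone else may tag the
        # comment until the author does.
        if self.tagged_by == self.author and user != self.author:
            return
        self.tag = tag
        self.tagged_by = user

# Usage: a reader tags someone else's comment, then the author overrides it.
c = Comment(author="alice", text="Hats would help.")
c.apply_tag("bob", PositionTag("pro-hats", "green"))
c.apply_tag("alice", PositionTag("pro-hats-with-caveats", "teal"))
```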
comment by Mestroyer · 2012-07-28T03:02:39.971Z · LW(p) · GW(p)
Can't altruistic rationalists, who want to be right as individuals but want the group to be right more than they want to be known to be right, avoid information cascades? All they have to do is form their own private opinions and, even if they are swayed toward the majority opinion by the evidence of the other rationalists' opinions, pretend to be contrarians, because of the consequences that has for the rest of the group (providing it with more information). Or at least say something like, "I accept the majority opinion because the majority accepts it and they are unlikely to be wrong, but here are some arguments against that opinion I came up with anyway."
comment by [deleted] · 2009-03-11T21:12:32.685Z · LW(p) · GW(p)
Another possible solution would be to let people choose to be part of the justice system, but NOT choose what role they play: randomly assign them to judge, prosecution, or defense. If we only need to ensure that equal effort is applied to each side, it seems like this would be enough.
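A minimal sketch of that kind of balanced random assignment, with hypothetical names; shuffling the volunteers and dealing roles round-robin keeps the effort per role roughly equal:

```python
import random

ROLES = ["judge", "prosecution", "defense"]

def assign_roles(volunteers):
    """Shuffle the volunteers and deal roles round-robin, so each
    role ends up with roughly the same number of people."""
    pool = volunteers[:]
    random.shuffle(pool)
    return {person: ROLES[i % len(ROLES)] for i, person in enumerate(pool)}

print(assign_roles(["ann", "bob", "cho", "dee", "eli", "fay"]))
```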