Some Remarks on the Nature of Political Conflict

post by Paperclip Minimizer · 2018-07-04T12:31:07.840Z · LW · GW · 6 comments

[Part of the debate around: Conflict Vs. Mistake]

[Criticizing articles like: In defence of conflict theory, Conservatives As Moral Mutants (part of me feels like the link is self-trigger-warning, but I guess I will just warn you that this is not a clever attention-grabbing title, the link means exactly what it says and argues it at some length)]

[Related to: Knowing About Biases Can Hurt People [LW · GW], Would Your Real Preferences Please Stand Up? [LW · GW], The Cowpox of Doubt, Guided By The Beauty Of Our Weapons]

[Epistemic effort: I thought of this argument and was so pleased by my own cleverness that I decided to post it.]

[Note: I have a nagging feeling I’ve spent a thousand words spelling out something completely obvious. Still, I hope there’s value in actually spelling it out.]

There has been a flurry of discussion around the nature of political conflict in the rationality movement over the last five months, sparked by a post by Scott Alexander [? · GW] on his blog Slate Star Codex drawing a dichotomy between mistake theorists, who think their political opponents are mistaken on factual policy questions, and conflict theorists, who think their political opponents are innately evil [LW · GW]. There have been a lot of good articles on the subject from every side, on both the object-level and the meta-level (well, on both the meta-level and the meta-meta-level), but also many bad ones resting on mistakes (I know, I am showing my side here).

One class of pro-conflict-theory arguments that bothers me a lot goes like this:

Mistake theory can't be the correct worldview because, for example, it's historically documented that tobacco companies hired scientists to spread misinformation about whether smoking causes cancer instead of thinking about it in a rational way.

Other historical case studies used include the rise of liberal democracy, the abolition of slavery, giving women the right to vote, the end of segregation, etc.

A scientific theory that is often used in this kind of argument is Jonathan Haidt's work on political psychology. Jonathan Haidt, with his co-conspirator Jesse Graham, created moral foundations theory, according to which morality is divided into five foundations:

  1. Care
  2. Fairness
  3. Loyalty
  4. Authority
  5. Sanctity

A shocking and unexpected discovery of moral foundations theory is that conservatives value Loyalty, Authority, and Sanctity more than liberals do. (Liberals also value Care and Fairness more than conservatives do, but this effect is orders of magnitude smaller than the other one.) Some conflict theorists, both liberal and conservative, have seized on this to claim that conflict theory is correct and those darned Blues are moral mutants who can't listen.

This is the popular understanding of moral foundations theory, anyway. In reality, this is only pluralism, the fourth claim of moral foundations theory. The four claims of moral foundations theory are¹:

  1. Nativism: There is a “first draft” of the moral mind
  2. Cultural learning: The first draft gets edited during development within a particular culture
  3. Intuitionism: Intuitions come first, strategic reasoning second
  4. Pluralism: There were many recurrent social challenges in our evolutionary history, so there are many moral foundations

The third claim is intuitionism. Social intuitionism, as a psychological theory, is older than the moral pluralism that is often equated with moral foundations theory in pop science. Jonathan Haidt wrote about it in 2001, years before he wrote about moral pluralism. Social intuitionism is a model that proposes that moral positions and judgments are²:

  1. primarily intuitive ("intuitions come first")
  2. rationalized, justified, or otherwise explained after the fact
  3. taken mainly to influence other people, and are
  4. often influenced and sometimes changed by discussing such positions with others

If you look at what you think is moral foundations theory (but is actually only moral pluralism, without the background of social intuitionism that is necessary to fully understand it), you might get the impression that people with moral intuitions different from yours hold them consciously and deliberately. The reality is much, much worse than that. Let's say Pro-Skub people value Skub and Anti-Skub people don't. Pro-Skub people don't know that their moral positions and judgments are primarily intuitive. They don't know that intuitions come first. They rationalize [? · GW] their valuing of Skub, justify it, and otherwise explain it after the fact. Similarly, Anti-Skub people will rationalize their not valuing Skub, justify it, and otherwise explain it after the fact.

This is very different from what the popular misunderstanding suggests! The popular misunderstanding suggests that you can trust your brain to be correct about the value of Skub, given that the only reason your opponents do or don't value Skub is that they have different terminal values than you. In reality, social intuitionism says that your brain is broken, that it is rationalizing its reasons to value or not value Skub, and that your opponents' brains are broken in the same way. Social intuitionism says that you can't trust your broken brain.

Rationalization is, of course, not limited to moral positions and judgments. It and its buddies, confirmation bias and motivated cognition, wander everywhere. It's not a coincidence that Motivated Stopping and Motivated Continuation [LW · GW] specifically use the example of tobacco science. But you - yes, you - aren't immune to rationalization, confirmation bias, or motivated cognition. You can't trust your brain to not do it. You can't trust your brain to not be the next conflict theorist case study.

Luckily, the fourth tenet of social intuitionism is that moral positions and judgments are often influenced and sometimes changed by discussing such positions with others. Your best way to not let your brain be the next conflict theorist case study is to deliberately exploit this as best you can. To not let your brain be the next conflict theorist case study, debate is essential. We all bring different forms of expertise to the table, and once we all understand the whole situation, we can use wisdom-of-crowds to converge on the correct answer. Who wins on any particular issue is less important than creating an environment where your brain won't be the next conflict theorist case study.

What's the worst thing you could do in your quest to not let your brain be the next conflict theorist case study? Probably treating everything as war and viewing debate as having a minor clarifying role at best. That's the best way for rationalization, confirmation bias, motivated cognition, and self-serving bias to creep in. This is how most of the conflict theorist case studies thought.

Mistake theory is the correct worldview precisely because tobacco companies hired scientists to spread misinformation about whether smoking causes cancer instead of thinking about it in a rational way.

¹: Graham, Jesse; Haidt, Jonathan; Koleva, Sena; Motyl, Matt; Iyer, Ravi; Wojcik, Sean P.; Ditto, Peter H. (2012). "Moral Foundations Theory: The Pragmatic Validity of Moral Pluralism". Advances in Experimental Social Psychology (forthcoming). Available at SSRN: https://ssrn.com/abstract=2184440

²: Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. p. 913 (Kindle ed.). ISBN 978-0307377906.

6 comments

comment by cousin_it · 2018-07-04T15:34:09.048Z · LW(p) · GW(p)

I've had similar thoughts but formulated them a bit differently. It seems to me that most people have the same bedrock values, like "pain is bad". Some moral disagreements are based on conflicts of interest, but most are importance disagreements instead. Basically people argue like "X! - No, Y!" when X and Y are both true, but they disagree on which is more important, all the while imagining that they're arguing about facts. You can see it over and over on the internet.

Importance disagreements happen because most of our importance judgments are just absorbed uncritically from other people. When many people tell you that something is important, you tend to believe it and look for more info about it, which makes it self-reinforcing. For example, people can argue about the relative importance of freedom vs. equality, but that doesn't mean anything real - they just got stuck on different importance judgments which are both self-reinforcing. That's also how echo chambers work: it can be tough to pin down their factual beliefs, but it's always easy to see the shared importance judgment that people bond over.

I'm not sure how to fight that. You could ask yourself "am I right that something is important?" and look for objective answers based on bedrock values, but that seems hard. Maybe a good start is to ask yourself "what do I think is important and why?" and then just stare at the list for a while.

Replies from: Paperclip Minimizer
comment by Paperclip Minimizer · 2018-07-04T18:32:12.357Z · LW(p) · GW(p)

I don't agree with your pessimism. To re-use your example, if you formalize the utility created by freedom and equality, you can compare both and pick the most efficient policies.
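
For what it's worth, here's a minimal sketch (in Python) of what such a formalization could look like. The policies, utility numbers, and weights are entirely made up for illustration; in practice, the hard part is agreeing on them.

    # Purely illustrative: score two hypothetical policies by a weighted sum of
    # "freedom utility" and "equality utility", then pick the higher-scoring one.
    # All numbers below are made up.

    policies = {
        "policy_a": {"freedom": 0.9, "equality": 0.3},
        "policy_b": {"freedom": 0.5, "equality": 0.8},
    }

    weights = {"freedom": 1.0, "equality": 1.0}  # how much each value matters to you

    def total_utility(policy):
        return sum(weights[value] * policy[value] for value in weights)

    scores = {name: total_utility(policy) for name, policy in policies.items()}
    best = max(scores, key=scores.get)
    print(scores)
    print("most efficient policy under these weights:", best)

Of course, this just picks whichever policy scores highest under the chosen weights, so everything still hinges on the weights themselves.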

Replies from: cousin_it
comment by cousin_it · 2018-07-05T08:37:57.625Z · LW(p) · GW(p)

Yeah, you can do that if you try. The only problem is that something like "freedom of association is important" itself feels important. The same thing happens with personal importance judgments, like "I care about becoming a published writer" or "being a good Christian matters to me". They are self-defending.

Replies from: Paperclip Minimizer
comment by Paperclip Minimizer · 2018-07-09T09:48:19.234Z · LW(p) · GW(p)

I'm not sure what you mean.

comment by totallybogus · 2018-07-05T01:18:52.330Z · LW(p) · GW(p)

It's surprising to me that people are even debating whether mistake- or conflict-theory is the "correct" way of viewing politics. Conflict theory is always true ex ante, because the very definition of politics is the stuff that people might physically fight over, in the real world! You can't get much more "conflict-theory" than that. Now of course, this is not to say that debate and deliberation might not also become important, and such practices do promote a "mistake-oriented" view of political processes. But that's a means of de-escalation and creative problem solving, not some sort of proof that conflict is irrelevant to politics. Indeed, this is the whole reason why norms of fairness are taken to be especially important in politics, and in related areas such as law: a "fair" deliberation is generally successful at de-escalating conflict, in a way that a transparently "unfair" one (perhaps due to rampant elitism or over-intellectualism) -- even one that's less "mistaken" in a broader sense -- might not be.

Replies from: Paperclip Minimizer
comment by Paperclip Minimizer · 2018-07-05T08:12:56.319Z · LW(p) · GW(p)

This isn't what "conflict theory" means. Conflict theory is a specific theory about the nature of conflict, one that says conflict is inevitable. Conflict theory doesn't simply mean that conflict exists.