Rationality Considered Harmful (In Politics)
post by The_Jaded_One · 2017-01-08T10:36:37.384Z · LW · GW · Legacy · 19 comments
Why you should be very careful about trying to openly seek truth in any political discussion
1. Rationality considered harmful for Scott Aaronson in the great gender debate
In 2015, complexity theorist and rationalist Scott Aaronson was foolhardy enough to step into the Gender Politics war on his blog, with a comment stating that the extreme feminism he had bought into made him hate himself and even seek out ways to chemically castrate himself. The feminist blogosphere got hold of this and crucified him for it, and he has written a few follow-up blog posts about it since. Recently I saw this comment by him on his blog:
2. Rationality considered harmful for Sam Harris in the islamophobia war
I recently heard a very angry, exasperated two-hour podcast by the New Atheist and political commentator Sam Harris about how badly he has been straw-manned, misrepresented and trash-talked by his intellectual rivals (whom he collectively refers to as the "regressive left"). Sam Harris likes to tackle hard questions such as when torture is justified, which religions are more or less harmful than others, the defence of freedom of speech, etc. Several times, Harris goes to the meta-level and sees clearly what is happening:
3. Rationality considered harmful when talking to your left-wing friends about genetic modification
In the SlateStarCodex comments I posted a complaint that many left-wing people were responding very personally (and negatively) to my political views.
One long-term friend openly and pointedly asked whether we should still be friends over the subject of eugenics and genetic engineering, for example altering the human germline via genetic engineering to permanently cure a genetic disease. In response to a rational argument about why some modifications of the human germline may in fact be a good thing, this friend said that "(s)he was beginning to wonder whether we should still be friends".
A large comment thread ensued, but the best comment I got was this one:
One of the useful things I have found when confused by something my brain does is to ask what it is *for*. For example: I get angry, the anger is counterproductive, but recognizing that doesn’t make it go away. What is anger *for*? Maybe it is to cause me to plausibly signal violence by making my body ready for violence or some such.
Similarly, when I ask myself what moral/political discourse among friends is *for* I get back something like “signal what sort of ally you would be/broadcast what sort of people you want to ally with.” This makes disagreements more sensible. They are trying to signal things about distribution of resources, I am trying to signal things about truth value, others are trying to signal things about what the tribe should hold sacred etc. Feeling strong emotions is just a way of signaling strong precommitments to these positions (i.e. I will follow the morality I am signaling now because I will be wracked by guilt if I do not. I am a reliable/predictable ally.) They aren’t mad at your positions. They are mad that you are signaling that you would defect when push came to shove about things they think are important.
Let me repeat that last one: moral/political discourse among friends is for "signaling what sort of ally you would be / broadcasting what sort of people you want to ally with". Moral/political discourse probably activates specially evolved brainware in human beings; that brainware has a purpose, and it isn't truthseeking. Politics is not about policy!
4. Takeaways
This post is already getting too long, so I deleted the section on lessons to be learned, but if there is interest I'll do a follow-up. Let me know what you think in the comments!
19 comments
comment by Gram_Stone · 2017-01-08T17:03:11.982Z · LW(p) · GW(p)
This post is already getting too long, so I deleted the section on lessons to be learned, but if there is interest I'll do a follow-up. Let me know what you think in the comments!
I at least would be interested in hearing anything else that you have to say about this topic. I'm not averse to private conversation on the matter either; most such conversations of mine are private.
Hypothesis: Fiction silently allows people to switch into truthseeking mode about politics.
A history student friend of mine was playing Fallout: New Vegas, and he wanted to talk to me about which ending he should choose for the game's narrative. The conversation was mostly optimized for entertaining one another, but I found that this was a situation where I could slip in my real opinions on politics without getting wide-eyed stares! Like this one:
The question you have to ask yourself is "Do I value democracy because it is a good system, or do I value democracy per se?" A lot of people will admit that they value democracy per se. But that seems wrong to me. That means that if someone showed you a better system that you could verify was better, you would say "This is good governance, but the purpose of government is not good governance, the purpose of government is democracy." (I do, however, understand democracy as a 'current best bet' or 'local maximum'.)
I have in fact gotten wide-eyed stares for saying things like that, even granting the final pragmatic injunction on democracy as a local maximum. I find that weird, because not conflating democracy with good governance seems like one of the first steps you would take towards thinking about politics clearly, not even as cognitive work but for the sake of avoiding cognitive anti-work. If you were further in the past, when the fashionable political system was not democracy but monarchy, then upon a future human revealing to you the notion of a modern democracy (which you, like many others today, consider preferable to monarchy), you would find yourself saying, regrettably, "This is good governance, but the purpose of government is not good governance, the purpose of government is monarchy."
But because we were arguing for fictional governments, I seemed to be sending an imperceptibly weak signal that I would defect in a real tossup between democracy and something else, and thus my conversation partner could entertain my opinion whilst looking through truthseeking goggles instead of political ones.
The student is one of two people with whom I've had this precise conversation, and I do mean in the particular sense of "Which Fallout ending do I pick?" I slipped this opinion into both, and both came back weeks later to tell me that they spent a lot of time thinking about that particular part of the conversation and that the opinion I shared seemed deep. If Eliezer's hypothesis about the origin of feelings of deepness is true, then this is because they were actually truthseeking when they evaluated my opinion, and the opinion really got rid of a real cached thought: "Democracy is a priori unassailable."
In the spirit of doing accidentally effective things deliberately, if you ever wanted to flip someone's truthseeking switch, you might do it by placing the debate within the context of a fictional universe.
↑ comment by The_Jaded_One · 2017-01-08T18:47:45.125Z · LW(p) · GW(p)
That's an interesting insight actually, and it dovetails with what I am saying. Politics isn't about policy. If you want to do policy, you need to talk about computer games ;)
↑ comment by The_Jaded_One · 2017-01-08T18:54:30.685Z · LW(p) · GW(p)
I at least would be interested in hearing anything else that you have to say about this topic. I'm not averse to private conversation on the matter either; most such conversations of mine are private.
Thanks! If there's a decent amount of interest I'll definitely do a follow-up. I might do one anyway; it's half-written, but I need to do some editing before I unleash the hordes on it!
comment by moridinamael · 2017-01-09T16:49:25.403Z · LW(p) · GW(p)
Not to be pedantic, but it was irrational for Aaronson to open his mouth about Feminism; Harris' approach may be rational given his stated aims, but his surprise at the reaction of his opponents is not rational; and talking about identitarian-adjacent topics like genetic modification without first carefully preparing the ground for discussion is going to be risky.
(Unfortunately) the actual rationalist-who-wins is the one who goes about his ambitions like a good Slytherin and never publicly states his beliefs.
↑ comment by The_Jaded_One · 2017-01-09T17:29:08.373Z · LW(p) · GW(p)
(Unfortunately) the actual rationalist-who-wins is the one who goes about his ambitions like a good Slytherin and never publicly states his beliefs.
I think this is mostly true, though there are a few problems with "Slytherin Rationality".
it was irrational for Aaronson to open his mouth about Feminism
Suppose modern elevator-gate-y feminism operates a bit like a mafia protection racket: they (the feminists) cream off status and money for themselves by propagating a set of ideas that are clearly ridiculous, but they keep everyone in line by threatening to doxx and shame and generally destroy the reputation of anyone who challenges them. A small group of Rebecca Watsons could dominate a much larger group of Slytherin Rationalists if all the Slytherins aren't prepared to take even a small risk to stand up for what they believe in.
talking about identitarian-adjacent topics like genetic modification without first carefully preparing the ground for discussion is going to be risky.
If you never talk about the things that you actually care about, you will never manage to find people who you want to be close friends and allies with.
You can "prepare the ground" to some extent, but really what that means is that you take the slow route to unfriending the person rather than the fast route. You want to hang around in your free time with someone who you have to constantly filter yourself around and construct elaborate lies for? I didn't think so....
Preparing the ground is probably best used on someone who you see as a means to an end, for example when you want to extract favors from them, or get money or other contacts from them.
↑ comment by plethora · 2017-01-14T22:40:07.388Z · LW(p) · GW(p)
If you never publicly state your beliefs, how are you supposed to refine them?
But if you do publicly state your beliefs, the Rebecca Watsons can eat you, and if you don't, the Rebecca Watsons can coordinate against you.
How do you solve that?
"I believe that it's always important to exchange views with people, no matter what their perspectives are. I think that we have a lot of problems in our society and we need to be finding ways to talk to people, we need to find ways to talk to people where not everything is completely transparent. ... I think often you have the best conversations in smaller groups where not everything is being monitored. That's how you have very honest conversations and how you can think better about the future." -- Thiel on Bilderberg
↑ comment by moridinamael · 2017-01-11T22:27:47.989Z · LW(p) · GW(p)
Suppose modern elevator-gate-y feminism operates a bit like a mafia protection racket: they (the feminists) cream off status and money for themselves by propagating a set of ideas that are clearly ridiculous, but they keep everyone in line by threatening to doxx and shame and generally destroy the reputation of anyone who challenges them. A small group of Rebecca Watsons could dominate a much larger group of Slytherin Rationalists if all the Slytherins aren't prepared to take even a small risk to stand up for what they believe in.
You're absolutely right. I don't know of any good coordinative solution to this that doesn't look more like people sticking their necks out and getting guillotined in sequence.
You can "prepare the ground" to some extent, but really what that means is that you take the slow route to unfriending the person rather than the fast route. You want to hang around in your free time with someone who you have to constantly filter yourself around and construct elaborate lies for? I didn't think so....
I find it's much easier to do this kind of thing in real life. If you really want to talk about some unusual belief, you can always calibrate your approach based on the initial position of the person you're trying to talk to. On the Internet (which is where this type of thing usually goes wrong) you're usually posting semi-contextualized text in public. It's almost doomed to failure.
↑ comment by The_Jaded_One · 2017-01-12T06:43:38.602Z · LW(p) · GW(p)
coordinative solution
Well, it isn't as if this is the first time ever that humans have had to coordinate on something. The usual tricks would include creating anti-SJW movements, setting up an alternative status-structure with its own reward and punishment mechanisms, and giving power and status to key people who challenge SJWs.
comment by Fluttershy · 2017-01-09T05:07:56.844Z · LW(p) · GW(p)
Most of my friends can immediately smell when a writer using a truth-oriented approach to politics has a strong hidden agenda, and will respond much differently than they would to truth-oriented writers with weaker agendas. Some of them would even say that, conditional on you having an agenda, it's dishonest to note that you believe that you're using a truth-oriented approach; in this case, claiming that you're using a truth-oriented approach reads as an attempt to hide the fact that you have an agenda. This holds regardless of whether your argument is correct, or whether you have good intentions.
There's a wide existing literature on concepts which are related to (but don't directly address) how best to engage in truth-seeking on politically charged topics. The books Nonviolent Communication, How to Win Friends and Influence People, and Impro are all non-obvious examples. I posit that promoting this literature might be one of the best uses of our time, if our strongest desire is to make political discourse more truth-oriented.
One central theme to all of these works is that putting effort into being agreeable and listening to your discussion partners will make them more receptive to evaluating your own claims based on how factual they are. I'm likely to condense most of the relevant insights into a couple posts once I'm in an emotional state amenable to doing so.
↑ comment by The_Jaded_One · 2017-01-09T07:38:17.374Z · LW(p) · GW(p)
Most of my friends can immediately smell when a writer using a truth-oriented approach to politics has a strong hidden agenda
Well then you have awesome friends and I'm jealous!
comment by NatashaRostova · 2017-01-08T22:56:10.534Z · LW(p) · GW(p)
One problem I have with communicating this is that I was only able to pick up on it after lots of academic studying (degree doesn't matter so much as having read and understood the growth of Social Science knowledge and research), and reading blogs of academics who have run into trouble for years.
Whether it's InfoProc on genetic engineering, West Hunter on evolution, SSC on feminism, and so forth.
By the time you read all this stuff and it starts coming together in your head, you realize you can't rationally discuss it with other people. If I'm at a party and someone mentions they are a feminist, I'm definitely not going to mention that I'm an anti-feminist (or try to explain why I think the entire idea of flippant 'ism identification' is broken). Even outside a party, it's a heavy discussion to bring up for no real gain.
There is no nice starting position or clear argument for why rationality is often harmful in politics without contemporary and historical examples. It takes hours of conversations with close friends who are willing to have their minds changed simply to explain that there is this entire world that no one is allowed to discuss. I have friends who get so frustrated by this that they decide to go full Alt-Right or Neoreaction, which I think is also a mistake.
One nice place to start, though, is in the past, where the institutions that many people view today as the most rational and truthful were completely wrong. That can at least plant a seed of doubt. This interview with the 20th-century journalist Malcolm Muggeridge, who traveled through the Soviet Union during the Holodomor, is one of my favorites: (http://www.ukrweekly.com/old/archive/1983/228321.shtml)
Shortly before Mr. Muggeridge's articles appeared in the Guardian, the Soviet authorities declared Ukraine out of bounds to reporters and set about concealing the destruction they had wreaked. Prominent statesmen, writers and journalists - among them French Prime Minister Edouard Herriot, George Bernard Shaw and Walter Duranty of The New York Times - were enlisted in the campaign of misinformation.
Or point to guys like Walter Duranty (https://en.wikipedia.org/wiki/Walter_Duranty).
The problem, though, is that unlike lots of EY-Rationality-Facts, you can't learn why rationality is often harmful in politics without loads of examples throughout time. And unlike cognitive biases, it's really hard to explain briefly.
↑ comment by The_Jaded_One · 2017-01-08T23:06:24.924Z · LW(p) · GW(p)
Thanks, that's an interesting perspective!
You know, it occurs to me that it would be nice to have some kind of guide to all the "forbidden knowledge" that's out there - West Hunter, HBDChick, InfoProc.
↑ comment by NatashaRostova · 2017-01-08T23:16:15.500Z · LW(p) · GW(p)
I think that's what most people who were or want to be part of the rationalist community want to work on now. That's what Scott Alexander does full-time with SSC and his comments. Even on LW, despite the weird and dated rules, everyone wants to discuss this stuff and work on slowly figuring it out. I don't think anyone really cares how a 22-year-old has reinterpreted EY's post on cognitive biases or some new version of AI risk (and I say that having put all my faith in 22-year-old engineering kids saving the world).
I'll probably just post on it more now here, and see what happens.
↑ comment by The_Jaded_One · 2017-01-08T23:31:41.563Z · LW(p) · GW(p)
Yeah, you should. I feel like knowing the key posts and ideas is helpful. For example, West Hunter has a wide range of types of posts: some are goofing off and some are really important. Same with gnxp.
comment by MrMind · 2017-01-09T11:07:00.167Z · LW(p) · GW(p)
The fact that you are met with straw-manning and tribal shaming doesn't imply that you are engaging in truth-seeking, as you seem to suggest.
Truth-seeking is not done by merely stating an opinion; it's done by engaging with people who disagree but can explain why, where you both explore your priors, collect and evaluate evidence, etc.
It is to be expected that when you send your opinion out into the wild, you will be met with both angry, blind reactions and calm, considered ones. Hopefully you will engage with the interesting ones and ignore the rest.
What I see in your post is that you don't want any negative reaction when you state an unpopular opinion, as if stating an unpopular opinion without any hidden agenda were itself truth-seeking. But that is not enough: it is how you respond to argumentative critics that qualifies you as a truth-seeker.
In an ideal world all kind of discussions would be of the rational kind, but there's a reason rationality must be made up of walled gardens.
That's why I reject your thesis: those three examples were not exemplifying paragon rational behavior, and it is not actively harmful to discuss politics rationally, unless you have a very fragile online reputation.
comment by Qiaochu_Yuan · 2017-01-12T07:27:01.935Z · LW(p) · GW(p)
Really not a fan of the title; my objection is basically the same as moridinamael's. The thing you're pointing out as a mistake is not rationality but a particular bad move in a social game, namely stating certain kinds of unpopular opinions.
One possible steelman of the point I think you're making can be found in Paul Christiano's If we can't lie to others, we will lie to ourselves. In any case, the way I get around this is by not engaging even slightly publicly in conversations where I might even have an opportunity to state my least popular opinions.
comment by Gleb_Tsipursky · 2017-01-08T13:36:00.875Z · LW(p) · GW(p)
FYI: http://lesswrong.com/r/discussion/lw/ofi/rational_politics_project/
↑ comment by The_Jaded_One · 2017-01-08T14:03:18.021Z · LW(p) · GW(p)
Sounds interesting, thanks!