Rationality Reading Group: Part G: Against Rationalization
post by Gram_Stone · 2015-08-12T22:09:42.072Z · LW · GW · Legacy · 6 comments
This is part of a semi-monthly reading group on Eliezer Yudkowsky's ebook, Rationality: From AI to Zombies. For more information about the group, see the announcement post.
Welcome to the Rationality reading group. This fortnight we discuss Part G: Against Rationalization (pp. 293-339). This post summarizes each article of the sequence, linking to the original LessWrong post where available.
G. Against Rationalization
67. Knowing About Biases Can Hurt People - Learning common biases won't help you obtain truth if you only use this knowledge to attack beliefs you don't like. Discussions about biases need to first do no harm by emphasizing motivated cognition, the sophistication effect, and dysrationalia, although even knowledge of these can backfire.
68. Update Yourself Incrementally - Many people think that you must abandon a belief if you admit any counterevidence. Instead, change your belief by small increments. Acknowledge small pieces of counterevidence by shifting your belief down a little. Supporting evidence will follow if your belief is true. "Won't you lose debates if you concede any counterarguments?" Rationality is not for winning debates; it is for deciding which side to join.
69. One Argument Against An Army - It is tempting to weigh each counterargument by itself against all supporting arguments. No single counterargument can overwhelm all the supporting arguments, so you easily conclude that your theory was right. Indeed, as you win this kind of battle over and over again, you feel ever more confident in your theory. But, in fact, you are just rehearsing already-known evidence in favor of your view. (A short numerical sketch of this point and the previous one follows this list.)
70. The Bottom Line - If you first write at the bottom of a sheet of paper, “And therefore, the sky is green!”, it does not matter what arguments you write above it afterward; the conclusion is already written, and it is already correct or already wrong.
71. What Evidence Filtered Evidence? - Someone tells you only the evidence that they want you to hear. Are you helpless? Forced to update your beliefs until you reach their position? No, you also have to take into account what they could have told you but didn't.
72. Rationalization - Rationality works forward from evidence to conclusions. Rationalization tries in vain to work backward from favourable conclusions to the evidence. But you cannot rationalize what is not already rational. It is as if "lying" were called "truthization".
73. A Rational Argument - You can't produce a rational argument for something that isn't rational. First select the rational choice. Then the rational argument is just a list of the same evidence that convinced you.
74. Avoiding Your Belief's Real Weak Points - When people doubt, they instinctively ask only the questions that have easy answers. When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts the most.
75. Motivated Stopping and Motivated Continuation - When the evidence we've seen points towards a conclusion that we like or dislike, there is a temptation to stop the search for evidence prematurely, or to insist that more evidence is needed.
76. Fake Justification - We should be suspicious of our tendency to justify our decisions with arguments that did not actually factor into making said decisions. Whatever process you actually use to make your decisions is what determines your effectiveness as a rationalist.
77. Is That Your True Rejection? - People's stated reason for a rejection may not be the same as the actual reason for that rejection.
78. Entangled Truths, Contagious Lies - Given that things are often entangled with other things in ways which we do not know, it is really really difficult to tell a perfect lie.
79. Of Lies and Black Swan Blowups - An extreme example of what happens when you get caught in a web of your own lies.
80. Dark Side Epistemology - If you want to tell a truly convincing lie to someone who knows what they're talking about, you have to lie either about lots of specific object-level facts, or about more general laws, or about the laws of thought. Many of the memes out there about how you learn things originally came from people who were trying to convince other people to believe false statements.
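For readers who want to put numbers on the two updating failures mentioned above, here is a minimal sketch in Python. The scenario, prior odds, and likelihood ratios are illustrative assumptions, not taken from the original essays; the point is only to show how honest incremental updates differ from rehearsing the same old evidence against each new counterargument.

```python
def update_odds(prior_odds, likelihood_ratio):
    """Odds-form Bayes: multiply prior odds by the likelihood ratio of one new observation."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    return odds / (1.0 + odds)

# Illustrative assumption: you start out at 4:1 odds (80%) that your theory is true.
odds = 4.0

# "Update Yourself Incrementally": three small, independent pieces of counterevidence,
# each twice as likely to be observed if the theory is false (likelihood ratio 0.5).
for lr in (0.5, 0.5, 0.5):
    odds = update_odds(odds, lr)
print(f"Honest incremental updating: {odds_to_probability(odds):.0%}")   # 33%

# "One Argument Against An Army": meeting each counterargument by rehearsing the SAME
# supporting argument (likelihood ratio 3) counts that old evidence three times over,
# so confidence goes up instead of down.
inflated = 4.0
for lr in (0.5, 0.5, 0.5):
    inflated = update_odds(inflated, lr * 3.0)  # deliberately re-using old evidence
print(f"Rehearsing old evidence each time: {odds_to_probability(inflated):.0%}")  # ~93%
```

The only difference between the two runs is whether the already-known supporting argument gets multiplied back in with every new counterargument.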
This has been a collection of notes on the assigned sequence for this fortnight. The most important part of the reading group though is discussion, which is in the comments section. Please remember that this group contains a variety of levels of expertise: if a line of discussion seems too basic or too incomprehensible, look around for one that suits you better!
The next reading will cover Part H: Against Doublethink (pp. 343-361). The discussion will go live on Wednesday, 26 August 2015, right here on the discussion forum of LessWrong.
6 comments
comment by Gram_Stone · 2015-08-12T22:16:15.377Z · LW(p) · GW(p)
Has anyone managed not to Bottom Line in their everyday thinking? I find that it's very difficult. It's so natural and it's a shortcut that I find useful more often than harmful. I wonder if it's best to flag issues where epistemic irrationality would be very bad and primarily focus on avoiding Bottom Lining at times like that. I feel that the things I'm talking about are in a different spirit than what the article originally intended, where you're not so much emotionally invested in the world being a certain way as you are, say, relying on your intuition as the primary source of evidence for the sake of saving time and avoiding false starts.
↑ comment by tailcalled · 2015-08-13T10:24:53.926Z · LW(p) · GW(p)
Well, that Bottom Line is generated by your intuition, and your intuition is probably pretty good at weighing evidence to find a good Bottom Line (with the caveat that your intuition probably does a lot more signalling than truth-seeking). In principle, though, that means that you don't have to justify that bottom line, at least not as much; instead, it would be more productive to search for flaws. The most likely flaw is that your subconscious figured "My friends like this conclusion" rather than "This conclusion is true".
↑ comment by torekp · 2015-08-15T13:44:33.353Z · LW(p) · GW(p)
This, and more. Statements don't come pre-labeled as premise or conclusion. All evidential relations are bidirectional. For example, if two statements are inconsistent, logic tells you not to accept both. But it doesn't tell you which one(s) to reject.
↑ comment by Gust · 2015-08-13T14:43:14.177Z · LW(p) · GW(p)
The way I see it, having intuitions and trusting them is not necessarily harmful. But you should recognize them for what they are: snap judgements made by subconscious heuristics that have little to do with the actual arguments you come up with. That way, you can take it as a kind of evidence/argument, instead of a Bottom Line - like an opinion from a supposed expert who tells you that "X is Y", but doesn't have time to explain. You can then ask: "is this guy really an expert?" and "do other arguments/evidence outweigh the expert's opinion?"
↑ comment by tailcalled · 2015-08-13T16:53:06.849Z · LW(p) · GW(p)
That way, you can take it as a kind of evidence/argument, instead of a Bottom Line - like an opinion from a supposed expert who tells you that "X is Y", but doesn't have time to explain. You can then ask: "is this guy really an expert?" and "do other arguments/evidence outweigh the expert's opinion?"
Note that both for experts and for your intuition, you should consider that you might end up double-counting the evidence if you treat them as independent of the evidence you have found - if everybody is doing everything correctly (which very rarely happens), you, your intuition, and the experts should all know the same arguments, and naive thinking might double- or triple-count the arguments.
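To make this double-counting concern concrete, here is a minimal numerical sketch; the single shared argument and its likelihood ratio of 4 are illustrative assumptions, not anything taken from the thread.

```python
# Illustrative assumption: one real argument, with likelihood ratio 4 in favor of a claim,
# that you, your intuition, and the expert you consult have all already absorbed.
prior_odds = 1.0   # 50% before any evidence
lr_argument = 4.0

# Counting the shared argument once, no matter how many heads it passed through:
once = prior_odds * lr_argument
print(f"Counted once:   {once / (1 + once):.0%}")      # 80%

# Naively treating "the argument", "my intuition agrees", and "the expert agrees"
# as three independent sources triple-counts the same evidence:
thrice = prior_odds * lr_argument ** 3
print(f"Counted thrice: {thrice / (1 + thrice):.0%}")   # ~98%
```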