That You'd Tell All Your Friends
post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-01T12:04:39.721Z · LW · GW · Legacy · 53 comments
Followup to: The Most Frequently Useful Thing
What's the number one thing that goes into a book on rationality, which would make you buy a copy of that book for a friend? We can, of course, talk about all the ways that the rationality of the Distant World At Large needs to be improved. But in this case - I think the more useful data might be the Near question, "With respect to the people I actually know, what do I want to see in that book, so that I can give the book to them to explain it?"
(And again, please think of your own answer-component before reading others' comments.)
53 comments
Comments sorted by top scores.
comment by [deleted] · 2009-03-01T17:17:02.241Z · LW(p) · GW(p)
deleted
↑ comment by caiuscamargarus · 2009-03-01T18:42:58.717Z · LW(p) · GW(p)
Second that. I know too many people who are unwilling to even discuss things like physicalism because they think it can only lead to overwhelming existential angst. It would be nice if the "That which can be destroyed by the truth..." principle were enough to compel people to think about things; but while there is this crippling fear of existential angst, perhaps reassurance that angst isn't permanent or even necessary is the best thing you can spread.
comment by Scott Alexander (Yvain) · 2009-03-04T11:45:42.786Z · LW(p) · GW(p)
I am likely to buy the book for my friends in any case, but I did just think of something that would give me an extra incentive. I want my friends to understand consequentialism/utility.
Most people I know are very stuck in "morality means not doing anything bad" mode. Not only would they not push the fat person off the bridge in the trolley problem, but many (perhaps most) wouldn't so much as push the lever to switch tracks. A lot of them think the whole question is some sort of reductio ad absurdum of consequentialist philosophy ("you're a consequentialist? I thought that was discredited when people showed that philosophy could be used to justify murdering someone in that trolley problem"). I've had no success arguing against these people, and one of the things that impressed me most the first time I read Overcoming Bias was that it was a community of people who used consequentialist arguments without fumbling for excuses, who thought Torture Vs. Dust Specks was a real problem instead of just "It would be wrong to inflict either of those options on people, so it's not a moral issue" (which is what my old philosophy professor said when I mentioned it to her).
I don't think Overcoming Bias ever officially argued for consequentialism, but a book aimed at the general public might have to. Eliezer is one of the most persuasive writers I've ever read, and if I thought his book could convince my friends to think consequentially I would be recommending it even more than I would otherwise.
comment by SarahNibs (GuySrinivasan) · 2009-03-03T08:21:19.190Z · LW(p) · GW(p)
Uncertainty is in the mind, not in reality. I was going to say the Mind-Projection Fallacy, but might as well make it very specific and focused. If you can make everyone I know believe intuitively that embracing "I don't know" does not cause reality to crumble... that'd be great.
We can get to "I don't currently know" later. :)
comment by Nick Hay (nickjhay) · 2009-03-02T02:28:44.393Z · LW(p) · GW(p)
Thou Art Godshatter: gives an intuitive grasp for why and how human morality is complex, but that not any complex thing will do.
↑ comment by grobstein · 2009-03-06T02:19:38.374Z · LW(p) · GW(p)
Totally agree -- it helps if you can convince them to read A Fire Upon the Deep, too. I'm not being facetious; the explicit and implicit background vocabulary seems to make it easier to understand the essays.
(EDIT: to clarify, it is not that I think Fire in particular must be elevated as a classic of rationality, but that it's part of a smart sci-fi tradition that helps lay the ground for learning important things. There's an Eliezer webpage about this somewhere.)
comment by Z_M_Davis · 2009-03-01T17:52:11.541Z · LW(p) · GW(p)
Clearly the earlier material is more important than the later. Include stuff like "The Bottom Line," "Update Yourself Incrementally," "Think Like Reality," "Conservation of Expected Evidence," "Avoiding Your Belief's Real Weak Points," the Fake Utility Functions sequence, &c. Also consider including the material between "But There's Still a Chance, Right?" through "0 and 1 are Not Probabilities," and "Mind Projection Fallacy" through "If You Demand Magic, Magic Won't Help." Ooh, and reprint the "Twelve Virtues"!
Don't mention quantum mechanics or the Singularity. Don't mention morality except for something along the lines of "Feeling Rational."
↑ comment by Vladimir_Gritsenko · 2009-03-01T22:13:57.121Z · LW(p) · GW(p)
To this I would add The Simple Truth, and perhaps a few expositions of failed intuition, a la Hindsight Devalues Science. Others have already mentioned Something to Protect and Joy in the Merely Real - as the "motivation" behind rationality. Finally, Newcomb's Problem and the Regret of Rationality, or anything that clearly separates The Way from Hollywood stereotypes.
↑ comment by rwallace · 2009-03-01T23:05:08.579Z · LW(p) · GW(p)
These are good suggestions, though if you are going to print "0 and 1 are Not Probabilities" (which makes a coherent argument even though I disagree with it), I would suggest also printing the post where you caution people against putting the label "probability estimate" on brown numbers.
comment by CannibalSmith · 2009-03-01T14:05:00.714Z · LW(p) · GW(p)
Politics is the mind-killer.
comment by imaxwell · 2009-03-02T03:22:19.665Z · LW(p) · GW(p)
Firstly, the combined ideas of "something to protect" and "rational agents should win" and "joy in the merely real." The idea that you should want to be rational not because it's inherently virtuous or important, but because it will allow you to get what you ultimately want, whatever that might be. The person I want to give this book to currently believes that rationality is defined by cold, bloodless disinterest in the world, and thus has no interest in it.
Secondly, the combined ideas of "writing the bottom line first" and "guessing the teacher's password" and "your ability to be more confused by fiction". That you cannot first choose an opinion from the ether, and then figure out how to argue for it. In the effort to correlate your beliefs with reality, it is only your beliefs that you are capable of changing. The person I want to give this book to currently believes that it is better to "win" a debate than to "lose", and thus defends his beliefs against all comers without first finding out where they come from.
Thirdly, the idea that nature is allowed to set impossibly high standards and flunk you even if you do everything humanly possible to meet them. The person I want to give this book to currently thinks that "I'm doing the best I can" is a viable excuse for failure.
Of course there should be advice on how, specifically, to be more rational, and the many failure modes possible. These three ideas are primarily the motivational ones: that rationality is necessary to anything you want to accomplish, and yet so difficult that it will take you a lifetime of effort to maintain a fighting chance of doing so. They are the ideas that, if internalized, will make people really want to try harder.
comment by komponisto · 2009-03-01T21:30:24.533Z · LW(p) · GW(p)
Just a week or two ago I found out -- to my utter astonishment -- that at least two people whom I see regularly have religious doubts about evolution -- in one case even about the age of the earth. They played various familiar cards, e.g.
-"You have faith in science [i.e. just like I have faith in religion]"
-"Evolution may work as science, but I must suspend judgement on whether it's true"
-"I believe in evolution within species"
-"How do you know the radioactive decay rates have remained constant?"
-"Are you a person of faith? [Answer: No.] You see? That's why this conversation will go absolutely nowhere." (Btw, anyone else notice how the religious are always the first ones to declare this?)
I was too shocked to be able to respond effectively. Would that I had had The Book to hand them.
↑ comment by Cameron_Taylor · 2009-03-02T06:26:20.019Z · LW(p) · GW(p)
And I hope Eliezer includes a reference to that book in his bibliography. :P
comment by badger · 2009-03-01T16:54:09.725Z · LW(p) · GW(p)
I would ideally like to see a book of concrete controversies and scenarios that illustrate the principles of rationality. I envision it almost like a set of exercises. As you've noted before, the knowledge that a problem has been solved before tends to kill motivation, so they would need to be framed in a way to elicit curiosity. Quantum mechanics and p-zombies worked well for me, but I was already a long-time traditional rationalist. Unfortunately, math tends to be scary to my friends, and concepts like p-zombies wouldn't be considered relevant to the real world. Because an intro book has a lot of ground to cover, this may not be entirely feasible, but readers should have something to try their hand at.
For material to build up to controversies, I'd start with an explanation of guessing the teacher's password. I bet the concept rings true with most people who have sat through American high schools. Science classes give lots of answers but few explanations. The distinction between real and fake explanations also allows you to hold out the promise of real explanations to build curiosity.
Next, standard examples like the application of Bayes to medical tests and the Wason selection task (on letters and numbers vs. ages and beverages) are good introductory problems. They quickly show the necessity of a strict approach to rationality.
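The medical-test problem mentioned above can be sketched in a few lines. This is a hedged illustration, not anything from the thread itself: the numbers (1% base rate, 80% sensitivity, 9.6% false-positive rate) are the standard mammography figures used in introductory treatments of Bayes' theorem, and the function name is my own.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' theorem."""
    # Total probability of a positive result, over both hypotheses.
    p_positive = prior * sensitivity + (1 - prior) * false_positive_rate
    return prior * sensitivity / p_positive

# Classic illustrative numbers: 1% base rate, 80% true-positive rate,
# 9.6% false-positive rate.
p = posterior(0.01, 0.80, 0.096)
print(f"{p:.3f}")  # roughly 0.078
```

Most people guess the answer is near 80%; the actual posterior is under 8%, which is what makes the example such an effective shock to intuition.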
Next, Bayesian scoring à la Technical Explanation. I think my friends would love this, if it were explained right. Once you have a score, the urge is to maximize it. This brings home several points: that there is a definite way to judge beliefs, the importance of making a prediction beforehand, and the balance that must be struck between vagueness and precision. Beliefs have to pay rent, otherwise they are messing up your score.
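The scoring rule alluded to can be sketched minimally. This is my own illustration of the log score from "A Technical Explanation of Technical Explanation", with made-up probabilities: each prediction earns the log of the probability it assigned to what actually happened, so vague predictions bleed points everywhere and confident wrong ones are penalized catastrophically.

```python
import math

def log_score(prob_assigned_to_outcome):
    """Log score for the probability you gave the event that occurred."""
    return math.log(prob_assigned_to_outcome)

# Three forecasters, same event occurs:
print(log_score(0.9))   # confident and right: small penalty
print(log_score(0.5))   # maximally vague: log(0.5), about -0.693
print(log_score(0.01))  # confident and wrong: huge penalty
```

The asymmetry is the point: you can only maximize this score by honestly concentrating probability where reality actually lands.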
Once predictions have been introduced, the book has to cover "the map != the territory" and the Mind Projection fallacy. Jaynes on the physics of coin-flipping did this for me. More "real-world" examples like drug testing might be useful here.
That is a lot of ground to cover, but it feels like a solid core. After that, practice on real issues (suitably divorced from ingrained beliefs) is important, both to hone skills and to make the reader feel like these principles are useful and important.
↑ comment by AnnaSalamon · 2009-03-02T02:01:13.703Z · LW(p) · GW(p)
Badger, this sounds like a book I would buy for my friends, but not like Eliezer's book (which I would also, for different reasons, buy for my friends). Do you think you could write a book like this?
↑ comment by badger · 2009-03-02T03:40:31.911Z · LW(p) · GW(p)
I admit I don't remember exactly what Eliezer's intentions for his book are, so I just outlined what I would write. If only I had time right now (although that's not saying much, considering how much time I spend on sites like this). Any thoughts on possible issues that are controversial enough to be interesting, but not enough to bring out biases and cached thoughts?
comment by rwallace · 2009-03-01T16:14:17.748Z · LW(p) · GW(p)
Tsuyoku Naritai.
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-01T16:48:22.493Z · LW(p) · GW(p)
Just to check, that's what you want me to tell all your friends, not you personally?
↑ comment by caiuscamargarus · 2009-03-01T18:47:48.505Z · LW(p) · GW(p)
Given how many people I see making the choice of being mediocre--and I say this from an elite university--I would second rwallace. Without wanting to become stronger you lose much of the incentive to be truly rational. Tsuyoku Naritai should be the first thing in your book.
↑ comment by rwallace · 2009-03-01T22:55:20.455Z · LW(p) · GW(p)
Yes. For myself, I already subscribed to that philosophy (though I am happy to see it written down in a form more eloquent than I could've expressed it myself) - the OB posts that come to mind from which I learned something I didn't previously know would be the excellent series on quantum mechanics. But that's not relevant to most people (honestly, quantum mechanics isn't of practical relevance to me either, though it is intellectually interesting). Tsuyoku Naritai is in my opinion the one thing from which most people would derive the most benefit.
comment by MixedNuts · 2011-07-19T15:17:04.426Z · LW(p) · GW(p)
Detailed explanation of why thought has rules, including minds as engines.
comment by Johnicholas · 2009-03-01T16:26:58.588Z · LW(p) · GW(p)
Science fiction with mind-blowing philosophy and hard science, preferably without in-jokes.
I might buy a concrete and practical handbook and recommend it to my friends. For example, I did that with the anti-depression handbook "Feeling Good". However, they wouldn't read it.
Greg Egan, Charlie Stross, and Nick Bostrom's Dragonology were all effective doors to interesting discussions, discussions that indicated my friends "got it". Nick Bostrom's more academic-ish stuff just bounced.
comment by Jonnan · 2009-03-06T05:04:01.290Z · LW(p) · GW(p)
I'm not sure how useful this is, and I feel odd posting it this way (Intuitive Rationality?), but there is a 'feel' to when there is a fallacy camouflaged in the discussion. If the reader could learn to pay attention to it and dig for it when they feel it, I would consider that a worthwhile book.
comment by jimmy · 2009-03-02T02:21:20.468Z · LW(p) · GW(p)
I'm putting another vote for the physics guys. I understand that the QM sequence may be too tangential to be included in a book about rationality, but all of the posts on "philosophy of science" that made MWI obviously true seem worth including.
Since I'm a physics major and my friends are too, the whole "philosophy of science" bit is what I'd loan out the book for (the bits on Bayes' law, Occam's razor, belief in the implied invisible, "the map is not the territory", minds as cognitive engines, and such).
I guess the "here's what we're trying to do" part seems more interesting than (and should be learned before) the "here's where human minds consistently don't do that" part.
↑ comment by Cameron_Taylor · 2009-03-02T06:23:58.483Z · LW(p) · GW(p)
I would LOVE to see the QM sequence in there and I'd give it to my friends just for that. At the same time I suspect it may reduce the overall impact of the book. I know a lot of intelligent people (for the most part those over 40) who would reject the book based on their disagreement, particularly since it is not Eliezer's field of expertise.
I'd let a rationalist with a Physics background fight that battle.
comment by swestrup · 2009-03-01T19:40:52.541Z · LW(p) · GW(p)
I think the explanation of how and why quantum mechanics is perfectly understandable and explainable would be the thing most likely to make me give the book to someone else. In many ways, it's also the most valuable thing that I've taken away from the Overcoming Bias site.
I started out in Physics, and while in College and University, I often encountered the "don't try to understand it -- just do the math" point-of-view and I never subscribed to it. I always felt that there should be a way to make it all make sense, but I was never capable of doing so myself.
The articles on QM from Overcoming Bias resolved many of my doubts about how things worked in this realm, while showing that everything could be described by a consistent model that actually made sense. It's hard to overstate how happy I was to see that.
comment by Gordon Seidoh Worley (gworley) · 2009-03-28T03:33:50.934Z · LW(p) · GW(p)
I still go back to saying the most important thing that you've ever written, for me personally, was CFAI section 2.2, where you argued against anthropomorphism in AI. That was the thing that cracked my mind loose and set it on its journey without feeling trapped by traditional views of rationality, intelligence, or wisdom. Not that this appeared on OB as far as I can remember, but I think it's important enough that it should appear somewhere in the book.
comment by michaelkeenan · 2009-03-02T10:06:48.410Z · LW(p) · GW(p)
I would like you to persuade my friends to hold their beliefs with much less confidence than they currently have. Overcoming Bias had this effect on me.
comment by Cameron_Taylor · 2009-03-02T06:19:40.172Z · LW(p) · GW(p)
High status makes it more difficult to understand what other people are saying.
comment by anonym · 2009-03-02T00:12:37.045Z · LW(p) · GW(p)
A friendly, concise overview of some of the most important techniques for improving thought, such as identifying cached thoughts and tabooing your words, holding off on proposing solutions, recognizing fake justification, leaving a line of retreat, righting wrong questions, identifying true sources of disagreement, etc.
This presupposes some motivating discussion of cognitive biases and why it is so unnatural to think rationally.
comment by MichaelHoward · 2009-03-01T14:11:15.203Z · LW(p) · GW(p)
Make them catch the bug. Leave the reader enthusiastic to learn more. That's the #1 thing that would make me want to buy it for others, being confident what they get from it won't be a flash in the pan.
If your book makes less money for you than its bibliography makes for Amazon, I'd call it a success.
↑ comment by MichaelHoward · 2009-03-03T00:44:32.011Z · LW(p) · GW(p)
I'm shocked and very intrigued that no-one seems to agree with this one. Maybe I need to do some updating.
What am I missing? Why isn't it a huge deal whether the readers are motivated to keep improving and getting stronger after they've finished the book?
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-03T00:57:33.567Z · LW(p) · GW(p)
It is a huge deal. But you didn't say which specific post or thing-to-learn was most effective at motivating. You just expressed a wish for something nice to happen.
↑ comment by MichaelHoward · 2009-03-03T14:07:06.849Z · LW(p) · GW(p)
I was avoiding specifics about how you'd enthuse them, thinking you'd know better which combination of points to use, but when you put it like that... fair point. I'll think about it...
comment by beoShaffer · 2011-07-19T16:47:23.641Z · LW(p) · GW(p)
Inferential Distances, so they understand (part of) why they have problems understanding other stuff I try to explain.
comment by grobstein · 2009-03-06T02:16:39.900Z · LW(p) · GW(p)
Clarity and transparency. One should be able to open the book to a page, read an argument, and see that it is right.
(Obviously this trades off against other values -- and is in some measure a deception --, but it's the kind of thing that impresses my friends.)
comment by Emile · 2009-03-02T09:08:48.263Z · LW(p) · GW(p)
One Argument against an Army, Politics is the Mind-killer and related posts. (Actually, I'm surprised those posts don't reference each other, in my mind they're pretty close.)
Politics is probably the domain where I see the most emotional thinking around me, likely because people have little incentive to be actually right, and strong social incentives to align with their peers.
comment by Gleb_Tsipursky · 2014-11-03T00:27:14.588Z · LW(p) · GW(p)
I often promote Intentional Insights by talking about updating beliefs. I highlight the value of overcoming the negative emotions associated with learning about making a mistake, and instead encourage associating positive emotions with acknowledging reality and wanting to know the truth. This lets people take the best thing that they can from the mistake and move forward to achieve more!
comment by A1987dM (army1987) · 2013-11-17T03:17:41.080Z · LW(p) · GW(p)
.
comment by steven0461 · 2009-03-02T06:43:36.570Z · LW(p) · GW(p)
If I gave the book to a friend it would probably be for carefully-argued futurism content.
↑ comment by Vladimir_Nesov · 2009-03-02T09:54:13.242Z · LW(p) · GW(p)
Don't you worry about driving your friends insane?
↑ comment by steven0461 · 2009-03-02T16:38:49.331Z · LW(p) · GW(p)
Should I? Is this a common outcome?
comment by Cameron_Taylor · 2009-03-02T06:28:21.062Z · LW(p) · GW(p)
A chapter on Science vs. Probability Theory, e.g. "Science isn't strict enough."
comment by [deleted] · 2009-03-02T08:36:52.164Z · LW(p) · GW(p)
I don't buy books, and I don't have friends.