What are the Best Hammers in the Rationalist Community?
post by Bound_up · 2018-01-24T14:43:18.611Z · LW · GW · 7 comments
I've toyed with the idea of giving an intro to rationality, hoping I can find the 20% of the material that provides 80% of the gains (or the 5% that provides 50% of the gains).
Here and there, I've also been asked what the rationality community is about, and I've struggled to give an effective summary. I usually say the 3 keys are seeing in terms of concepts rather than words (à la rationalist Taboo or hyperdimensional thingspace), fighting motivated reasoning (since most people are already strong enough to prove others wrong and just need to turn that force upon themselves), and correcting for the heuristics and biases. But are these really the top 3 most important pieces? Even if they are, what other Hammers would take 4th, 5th, and 6th place?
The Hammers and Nails post got me thinking again: What are the most powerful pieces of rationalist lore? If I had 1 day to tutor a brilliant but untrained mind, what are the most powerful techniques that can most easily be taught?
The comments under Hammers and Nails are full of people saying what their favorite and most powerful techniques are, but I'd like to make an explicit invitation here to share. What are your hammers? (see https://www.lesserwrong.com/posts/QzBuuNEqJGQFeWM4f/hammers-and-nails)
7 comments
Comments sorted by top scores.
comment by Quaerendo · 2018-01-26T22:13:05.300Z · LW(p) · GW(p)
I cannot say much about CFAR techniques, but I'd nominate the following as candidates for LW "hammers":
- Taboo your words and replace the symbol with the substance
- Dissolve the question by asking which mental algorithm is generating the feeling of a question
- Check that your belief is controlling your anticipation, and is not just there for social motives
- Rationality is not about sticking to a particular ritual of cognition, but about winning; don't lose track of your goals
- Try to be truly curious about the world; notice when you are rationalizing a belief you've become attached to
- Be wary of making something a part of your identity
- Don't treat properties of your mind as if they were properties of the external world (the map is not the territory)
- Reversed stupidity is not intelligence
- When dealing with others, keep in mind inferential distance and the typical mind fallacy
- Many human behaviors have (subconscious) signaling motives
- You are unlikely to beat the market, or the expert consensus in a scientific field
- Altruism requires one to "shut up and multiply"
Of course, the list is not exhaustive.
comment by cousin_it · 2018-01-24T16:47:40.620Z · LW(p) · GW(p)
The most important LW idea for me was the Mysterious Answers to Mysterious Questions sequence. Someone can read it from start to finish and come out a different person. Basically it teaches you how to get rid of "floating" beliefs that are underconstrained by data. My summary doesn't do it justice - Eliezer did an incredible job with that sequence; there are tons of metaphors and examples to make it stick. I reread it every couple of years as a kind of spring cleaning.
↑ comment by alkjash · 2018-01-24T17:50:36.285Z · LW(p) · GW(p)
This sequence was also quite profound to me. Maybe we should do a poll on the most powerful Sequence and reorganize them in that order.
↑ comment by habryka (habryka4) · 2018-01-24T19:36:49.576Z · LW(p) · GW(p)
I am very interested in any input on improving the structure and order of the sequences, though I also think it's a super hard problem and the correct solutions might require significant amounts of innovation and care.
comment by ChristianKl · 2018-01-25T09:48:12.087Z · LW(p) · GW(p)
Stating subjective probabilities of outcomes before the event happens.
Examples:
(1) If you ask people whether they will come to your meetup, instead of asking for a binary Yes/No, ask them for their probability of attending.
(2) Elon Musk reports that he found it easier to make courageous decisions, such as starting SpaceX and Tesla, after he had thought through the probabilities.
(3) Determining odds for bets by having both parties state their probabilities beforehand. Based on the post Even Odds, a member of our dojo created a tool that implements the math (a sketch of that math follows below).
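For reference, here is a minimal sketch of the Even Odds math as I understand it: the payment is half the difference of the two parties' Brier scores on the realized outcome, which makes each side's subjective expected gain equal. The function name and the 0.8/0.4 example figures are illustrative, not taken from the post or the dojo tool.

```python
def even_odds_payment(p_alice: float, p_bob: float, event_happened: bool) -> float:
    """Amount Bob pays Alice (negative means Alice pays Bob).

    Both parties state a subjective probability before the event; the
    payment is half the difference of their Brier scores on the realized
    outcome. Under their own beliefs, both parties expect to gain
    (p_alice - p_bob)**2 / 2 per unit stake -- the "even odds" property.
    """
    outcome = 1.0 if event_happened else 0.0
    brier_alice = (outcome - p_alice) ** 2  # lower score = better prediction
    brier_bob = (outcome - p_bob) ** 2
    return (brier_bob - brier_alice) / 2.0


# Alice says 80%, Bob says 40%:
print(even_odds_payment(0.8, 0.4, True))   # 0.16  (Bob pays Alice)
print(even_odds_payment(0.8, 0.4, False))  # -0.24 (Alice pays Bob)
# Each side's subjective expected gain: (0.8 - 0.4)**2 / 2 = 0.08
```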
comment by Ben Pace (Benito) · 2018-01-24T18:03:43.956Z · LW(p) · GW(p)
"I've also been asked what the rationality community is about, and I've struggled to give an effective summary"
Nitpick: the rationality community is not about anything simple, though the rallying flag is Rationality: A-Z. (cf. 'The Rallying Flag is not the Tribe' by Scott)
comment by ryan_b · 2018-01-24T16:34:14.486Z · LW(p) · GW(p)
It seems to me that, in the context of Hammers and Nails, rationality is a community organized around its Nails: the problems of heuristics and biases. The problem will be considered solved when we can reliably become Hammers, i.e., when we have developed successful strategies that can be applied whenever such a heuristic or bias reveals itself.
I think this point is worth distinguishing because the heuristics and biases are innate to us, so we have no choice but to become Hammers in order to think clearly and correctly about whatever object-level problem we encounter.
For the most powerful rationality technique, I nominate shut up and multiply: it sidesteps many heuristics and biases by deliberately invoking System 2, and it has a very strong track record (consider the role of explicit calculation in technology).
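As a concrete illustration of the "multiply" step, here is a toy expected-value comparison; all option names and numbers are invented for the example.

```python
# Toy "shut up and multiply" comparison: rank options by explicit
# expected value instead of gut feel. All numbers are invented.
options = {
    # name: (probability of success, lives saved on success)
    "vivid, emotionally salient rescue": (0.90, 1),
    "boring large-scale program": (0.30, 1000),
}

for name, (p_success, lives_saved) in options.items():
    expected = p_success * lives_saved  # the "multiply" step
    print(f"{name}: {expected:.1f} expected lives saved")

# System 1 favors the vivid rescue (0.9 expected lives); the explicit
# multiplication shows the boring program dominates (300.0 expected lives).
```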