[SEQ RERUN] Avoiding Your Belief's Real Weak Points
post by MinibearRex · 2011-09-17T02:17:14.944Z · LW · GW · Legacy · 11 comments
Today's post, Avoiding Your Belief's Real Weak Points, was originally published on 05 October 2007. A summary (taken from the LW wiki):
When people doubt, they instinctively ask only the questions that have easy answers. When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts the most.
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was We Change Our Minds Less Often Than We Think, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.
11 comments
comment by Shmi (shminux) · 2011-09-17T07:00:35.503Z · LW(p) · GW(p)
I'd imagine that talking to smart people who don't subscribe to the belief in question would be more productive than trying to dig for inner hurt.
comment by Swimmy · 2011-09-17T02:50:13.015Z · LW(p) · GW(p)
This article was absolutely essential in my rejecting Christianity. There had been thoughts quickly darting in and out of my head for years that I never let myself think fully: "Heaven sounds boring and I don't want to go there." "What the Bible says about killing nonbelievers is abhorrent." "What the Bible says about killing rape victims is even worse." They were quick, emotional responses. Little "Hmm, that doesn't seem right" moments that I always tucked away in short order with a "God is all-wise, stop questioning," etc. I'm not sure why, but I actually took this article's simple piece of advice: focus on what's actually painful to focus on. There's a series of dark-side strategies involved with guiding people to do exactly the opposite, "God is all-wise, you are stupid" being a small part of it.
comment by fubarobfusco · 2011-09-17T02:43:20.825Z · LW(p) · GW(p)
When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts the most. Don't rehearse standard objections whose standard counters would make you feel better. Ask yourself what smart people who disagree would say to your first reply, and your second reply. Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind. Punch yourself in the solar plexus. Stick a knife in your heart, and wiggle to widen the hole.
There are lots of things that make this especially difficult for certain sorts of beliefs. Here are a couple:
- The smart objections to your belief may be difficult to find. There may actually be lots of stupid but popular objections to your belief; so in order to find the smart objections, you may have to do a good deal of unpleasant research deep behind enemy lines. If your belief is commonly ridiculed, and those who object to it typically denounce you as a scoundrel for even entertaining it, then good luck finding smart objections to it before you become too pissed-off to reason very successfully about them.
- The smart objections to your belief may be written in a vocabulary you're unfamiliar with. For instance, if your belief is about political economy (socialism or libertarianism, let's say), you may actually need to get into some pretty serious economics to understand the smart objections. It is easy to state naïve beliefs about political economy whose best rebuttal is a technical one; sound-bite objections such as Bastiat's parable of the broken window, or Keynes' "markets can remain irrational a lot longer than you and I can remain solvent" are probably not the smartest objections.
comment by [deleted] · 2011-09-17T02:33:00.251Z · LW(p) · GW(p)
This seems like a skill that is very hard to learn and even harder to teach to others. I'm not particularly good at it, but I'm not really sure how to improve. How exactly does one get better at noticing mental blind spots, anyway? Has anyone noticed significant improvement in this area, and if so, what did you do to achieve it?
↑ comment by MinibearRex · 2011-09-17T04:33:34.297Z · LW(p) · GW(p)
I do believe that I've gotten pretty good at this, although it's not something that's easy to measure. I can remember multiple times during the past week when I almost skipped over an important objection to my then-current beliefs, but returned and focused on it (and this apparently wasn't in vain, since on three of these occasions I actually did change my opinion in some way). On the other hand, I can't easily remember the times when I didn't do this, and how often I deliberately looked at an objection I wanted to flinch away from isn't a statistic I even tracked until I decided to change it.
That being said, I think an important part of what goes through my mind at that point is a question of self-image. I do think of myself as someone who is not an intellectual coward, and someone who will work to ensure that false beliefs don't get to live comfortably in my own mind. The idea of there being some serious flaw to one of my beliefs that I am avoiding thinking about is a possibility that worries me.
If you genuinely do want to improve at this skill, it's likely that the ideal of someone who bravely confronts their own false beliefs already exists in you. In that case, if you can at some point catch yourself in the act of mentally "looking away" from a painful idea, and you can associate that feeling with a sense that this-is-not-acceptable, you may be able to make confronting it an instinctive response.
comment by Armok_GoB · 2011-09-18T19:23:10.843Z · LW(p) · GW(p)
When I really try doing this, my willpower runs out after just a few seconds, before anything has time to actually happen, and then I spend a long time sulking over how worthless I am for not being a proper rationalist and/or being BSOD'd by the paradox of not believing what I know is rational to believe.
↑ comment by Clarica · 2011-09-27T23:49:01.823Z · LW(p) · GW(p)
Are you instinctively also only choosing questions with easy answers? Or are your doubts raising a different kind of question?
↑ comment by Armok_GoB · 2011-09-28T09:19:47.625Z · LW(p) · GW(p)
I don't really know. When I search my mind right now for instances of "hard questions", it only turns up "questions where more evidence is needed" and "hard math problems", with no actual "hard questions", so I can't get any clear idea of what a hard question would look like.
comment by [deleted] · 2011-09-17T13:42:40.572Z · LW(p) · GW(p)
People can stand what is true, for they are already enduring it.
Though I like the rest of the litany, this part seems blatantly false. People deny basic truths all the time, often to their own detriment. Or am I missing something?
comment by DanielLC · 2011-09-17T06:50:18.066Z · LW(p) · GW(p)
I don't think this is a matter of training, but a matter of instinct. People don't think about the real weak points of their beliefs for the same reason they don't touch an oven's red-hot burners; it's painful.
You let go of red-hot burners because of instinct. You stop touching them because the pain trains you not to. This is a matter of training. It's just training that it's impossible not to undergo.