I Want To Believe: Rational Edition

post by 27chaos · 2014-11-18T20:00:21.531Z

Relevant: http://lesswrong.com/lw/k7h/a_dialogue_on_doublethink/

I would like this conversation to operate under the assumption that there are certain special times when it is both instrumentally and epistemically rational to convince oneself of a proposition whose truth is indeterminate. I ask for this assumption because I believe that questioning it makes it more difficult to use doublethink for productive purposes. There are many other places on this website where the ethics or legitimacy of doublethink can be debated, and I am already aware of its dangers, so please don't mention such things here.

I am hoping for some advice. "Wanting to believe" can be both epistemically and instrumentally rational, as in the case of certain self-fulfilling prophecies. If believing that I am capable of winning a competition will cause me to win, believing that I am capable of winning is rational both in the instrumental sense that "rationality is winning" and in the epistemic sense that "rationality is truth".

I used to be quite good at convincing myself to adopt beliefs of this type when they were beneficial. It was essentially automatic: I knew I had the ability, so applying it was as trivial as remembering its existence. Nowadays, however, I'm almost unable to do this at all, despite what I remember. It's causing me significant difficulties in my personal life.

How can I redevelop my skill at this technique? Practicing will surely help, and I'm practicing right now, so I'm improving already. I'll soon have the skill back stronger than ever, I'm quite confident. But are there any tricks or styles of thinking that can make it more controllable? Any mantras or essays that will help my thought become more fluidly self-directed? Or should I focus on manipulating my emotional state rather than on initiating a direct cognitive override?

I feel as though the difficulties I've been having become most pronounced when I'm thinking about self-fulfilling prophecies that do not have guarantees of certainty attached. The lower my estimated probability that the self-fulfilling prophecy will work for me, the less able I am to use the self-fulfilling prophecy as a tool, even if the estimated gains from the bet are large. How might I deal with this problem, specifically?
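For concreteness, the trade-off in this last paragraph can be written as a simple expected-value comparison. Below is a minimal sketch in Python; the function name and all numbers are hypothetical illustrations, not anything from the post itself.

```python
# Minimal sketch: adopting a self-fulfilling belief is worth it in expectation
# when its expected payoff beats its cost, even if the chance it "takes" is
# low. All numbers are hypothetical.

def belief_expected_value(p_works: float, gain_if_works: float,
                          adoption_cost: float) -> float:
    """Expected value of adopting the belief, relative to doing nothing.

    p_works: estimated probability the self-fulfilling prophecy works
    gain_if_works: payoff if it does (e.g., winning the competition)
    adoption_cost: effort/epistemic cost of maintaining the belief
    """
    return p_works * gain_if_works - adoption_cost

# Even a 10% chance of working can justify adoption when the stakes are high:
print(belief_expected_value(p_works=0.10, gain_if_works=1000.0,
                            adoption_cost=20.0))  # 80.0 > 0, so adopt
```

On this framing, the problem described above is that a low p_works makes the belief psychologically hard to adopt even when the expected value of adopting it is clearly positive.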

27 comments

Comments sorted by top scores.

comment by Unknowns · 2014-11-18T21:52:48.750Z

Trying to "convince" yourself is doing it wrong. Say it out loud, say it to yourself (and avoid saying the opposite to yourself), and perform the external actions that someone normally does who believes it.

This is how people get "belief in belief" even when it is contrary to their evidence, and it works.

comment by Richard_Kennaway · 2014-11-19T17:19:47.714Z

The lower my estimated probability that the self-fulfilling prophecy will work for me, the less able I am to use the self-fulfilling prophecy as a tool, even if the estimated gains from the bet are large. How might I deal with this problem, specifically?

By not using doublethink? I know you didn't want that questioned, but I'm going to call it into question anyway by suggesting a different method, illustrated with an example from personal experience.

The first time I considered doing a hundred-mile bicycle ride, once I had made the assessment that I could do it (I had never ridden further than 50 miles in one day before), I did not concern myself thereafter with wondering about the matter. That would be wasted motion: something which, predictably at the time you think it, will afterwards be seen to have contributed nothing to the task at hand. I had made a decision, I signed up for the event, I undertook what practice I judged useful, I turned up on the day, I completed the course. Belief that I could do it did not enter into the matter after that initial decision.

The key word is intention. This might be something similar to, or an instance of, what Brienne calls a "mental posture". It's something you do with your mind, not a proposition you believe. Decision screens off belief from action.

Replies from: Unknowns
comment by Unknowns · 2014-11-19T19:13:08.801Z

Intending to do something implies the belief that you can do it.

Replies from: TheOtherDave, Richard_Kennaway
comment by TheOtherDave · 2014-11-19T20:53:40.198Z

A great deal of complexity is buried underneath the simple word "implies" here.

A perfectly consistent and rational agent A, I agree, would likely be unable to intend to perform some task T in the absence of some reasonably high level of confidence in the proposition "A can do T," which is close enough to what we colloquially mean by the belief that A can do T. After all, such an A would routinely evaluate the evidence for and against that proposition as part of the process of validating the intention.

(Of course, such an A might attempt T in order to obtain additional evidence. But that doesn't involve the intention to do T so much as the intention to try to do T.)

The thing is, most humans don't validate their intentions nearly this carefully, and it's consequently quite possible for most humans to intend to do something in the absence of a belief that they can do it. This is inconsistent, yes, but we do it all the time.

comment by Richard_Kennaway · 2014-11-20T14:55:25.816Z

Intending to do something implies the belief that you can do it.

The direction of causality is the reverse, though, and the two are causally separated by the decision. One assesses a situation and arrives at a decision about what to do. The decision made, one acts to carry it out. The decision screens off the beliefs from the action: the action should depend only on the decision, not additionally on the belief. Thinking all the while "but what if I fail? but what if I fail?" is a useless distraction. When one realises this, one does not allow oneself to be distracted.
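One way to make "screens off" precise, borrowing the conditional-independence reading from causal models (a gloss; the comment itself doesn't commit to this formalism):

```latex
% Belief -> Decision -> Action as a Markov chain: once you condition on
% the decision, the action carries no further dependence on the belief.
P(\mathrm{action} \mid \mathrm{decision}, \mathrm{belief})
  = P(\mathrm{action} \mid \mathrm{decision})
```

On this reading, rehearsing "but what if I fail?" mid-action is an attempt to make the action depend on the belief after the decision has already absorbed everything the belief had to contribute.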

A concrete example, again drawn from specific experiences. If I decide to run to catch a train that I might miss, I do not put a mere 50% effort into it if I am only 50% sure of catching the train. On the contrary, the decision made, I will run as fast as I can, up to the point where the outcome has been placed beyond any reasonable doubt. Either I get on the train before it leaves, or the train leaves before I can get on. At that point I can stop running.

comment by Viliam_Bur · 2014-11-19T09:53:53.037Z

In these situations, people naturally exploit selection bias. Make a list of past situations where you achieved some success at something similar to the upcoming competition, with the bottom line "...therefore, I will win this competition, too."

Write it on a paper, put it on your bedroom door, and read it every morning when you wake up. Try to re-live (imagine in "near mode") the situations listed on the paper, and when you are in the right mood, imagine the victory in this competition. (What would it feel like, for your senses? What would you see? What would you hear?) Then do a small exercise, or do something that at least symbolically contributes to your victory.

comment by John_Maxwell (John_Maxwell_IV) · 2014-11-19T10:49:52.806Z

It sounds like you're talking about convincing System 1 of something, or forming an "alief": http://agentyduck.blogspot.com/2014/06/growth-mindset-forest-system-1.html

comment by lucidian · 2014-11-19T19:51:01.754Z

Read more things that agree with what you want to believe. Avoid content that disagrees with it or criticizes it.

comment by James_Miller · 2014-11-19T04:17:16.023Z

Start with an intermediate step: beliefs whose truth value depends on whether you believe them, such as placebo effects, social confidence strategies, and relaxation techniques.

comment by ChristianKl · 2014-11-18T21:42:19.926Z

It's causing me significant difficulties in my personal life.

What kind of difficulties?

Replies from: 27chaos
comment by 27chaos · 2014-11-18T21:58:47.861Z

I have social anxiety that this technique would normally help me overcome, for one. For another, I have motivational problems that this has helped with in the past.

Replies from: Brillyant
comment by Brillyant · 2014-11-18T22:53:17.167Z

In regard to social anxiety, I would say the reality is that you can overcome it, and thus "I can overcome it" isn't an indeterminate belief at all. Destroying lies about why you should be anxious might help.

I have social anxiety (and I'm in sales). One trick I've learned is this: People don't give a shit about you when you walk in a room. They aren't judging you nearly as much as you think they are. They don't care if you flub a sentence or drop your portfolio. They won't remember the negative aspects of your interactions. Just be nice. Smile. Don't try to do too much.

This trick is to counter the feeling that many with social anxiety have that others are actively judging their words and actions. It's just not true.

comment by someonewrongonthenet · 2014-11-22T22:44:13.373Z

The skill you are trying to cultivate is good for things like lying, manipulation, and bluffing, but not particularly helpful for social anxiety. Lying and bluffing in social situations is an advanced level social skill, not beginner level. At the beginner level you just want to treat it like it's a scary spider - gradually up your exposure until you feel comfortable.

Since you want an answer though - it's a game, a play, an act. Purposefully pretending and taking on roles is more effective than attempting to lie to yourself.

I don't think I can actually lie to myself on purpose about propositional statements. But emotions aren't propositional statements, and you don't really have to lie to yourself to feel emotions that you wouldn't normally feel.

Replies from: someonewrongonthenet
comment by someonewrongonthenet · 2014-11-23T23:27:08.976Z

Another thought: it helps to consider how certain things are true in a non-literal sense. It's a different sort of truth, an emotional truth. "We all must fulfill our purposes in life; God will be happy if I finish my paper" isn't literally true, but it's true in the sense that the universe will be configured in a more favorable way if you do it, and you don't have to think through the complex, nuanced, and way-too-difficult-for-lower-primates-to-understand version every single time you consider it.

If you recognize that emotional truths often correspond to instrumental rationality at the end of the day (the same way Newtonian physics isn't strictly right but still works well enough), the part of your brain that yells at inconsistencies will quieten down.

comment by Anomylous · 2014-11-20T05:42:33.685Z

I can't tell, from your post, what kind of propositions you are trying to convince yourself of. If it's an attempt to win competitions, then you're putting your effort in the wrong place. Whether you win any given competition is largely going to be determined by who else shows up to compete. Improving your chances means reducing the number of people who can reliably beat you, and that only happens through research and practice (since murdering competitors is generally seen as bad sportsmanship).

Other than that, it sounds like you've discovered the flaw in Pascal's original wager (well, one of its flaws anyway). You can decide it's rational to believe something, but actually believing it is a different matter. In religion, actual belief is key, and therefore Pascal's wager isn't going to make a lot of true converts, even though it's a beautiful piece of reasoning.

I am having a similar issue, and am currently dealing with it by developing better acting skills. As long as I do and say things consistent with the belief set I wish I had (and don't), my ends are achieved regardless of whether I actually hold that belief set. This may or may not be applicable to your situation.

Replies from: Unknowns
comment by Unknowns · 2014-11-20T06:32:13.660Z

You can actually believe it, if you want to. If you are doing and saying things consistent with it, the last step (as I posted earlier) is to say the same thing to yourself, internally, and completely avoid saying to yourself the opposite or things that imply the opposite.

It sounds like you are saying to yourself things like "I wish I believed that X, but unfortunately I know that it isn't true because of Y..." If you decide to do so, you can simply stop telling yourself things like that, and instead tell yourself things like "X. X is true. It really is."

If you do that, I assure you that you will indeed begin to believe that X. Of course, in reality even if you say you wished you had that belief set, you might not really wish you had it, given that the cost is losing the truth. So this may explain why you refuse to take this step.

comment by Artimaeus · 2014-11-19T05:53:20.323Z

I think there's a limit to how instrumental a belief like this can be; its benefit is highly context-dependent. For example, believing that you are capable of winning a boxing match might be highly instrumental when you're actually in the ring, but not before the match, when you're deciding how much money to bet on your performance (since you would bet as if your likelihood of winning were higher than it actually is).
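A quick worked version of that betting failure, with hypothetical numbers not taken from the comment: suppose your true chance of winning the match is 0.3, the instrumentally useful in-ring belief says 0.9, and you are offered an even-money bet of stake s.

```latex
% Under the inflated belief the bet looks good:
\mathbb{E}_{p' = 0.9}[\text{bet}] = 0.9\,s - 0.1\,s = 0.8\,s > 0
% but under the true probability it loses money on average:
\mathbb{E}_{p = 0.3}[\text{bet}] = 0.3\,s - 0.7\,s = -0.4\,s < 0
```

The same belief that helps in the ring costs money at the betting window, which is exactly the context-dependence described above.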

That said, I think that psychologically dealing with high-stress situations is a skill just like any other. You practice, you fuck up, you think about it, and eventually you get better.

Replies from: Lumifer
comment by Lumifer · 2014-11-19T06:51:34.811Z

dealing with high-stress situations is a skill just like any other. You practice, you fuck up, you think about it, and eventually you get better.

Well, that's the good version :-) The not-so-good version goes like this: you practice, you fuck up, you die, the end. There is also the less-awful version where you practice, you fuck up, and you never get another chance at that.

High-stress situations are often high-stress for a reason.

Replies from: fubarobfusco
comment by fubarobfusco · 2014-11-19T07:14:48.543Z

On the other hand, high-stress situations are sometimes artificially high-stress because someone else wants you to take yourself out of the running so that they can have less competition.

This is sometimes known as "psyching your opponent out."

Replies from: Lumifer
comment by Lumifer · 2014-11-19T15:30:19.994Z

Well, of course -- sometimes your System 1 is just throwing a hissy fit for no good reason. But sometimes there is a good reason.

As the saying goes, "If you never succeed on the first try, skydiving might not be for you" :-)

Replies from: Artimaeus
comment by Artimaeus · 2014-11-20T04:01:06.991Z

I suppose that is true, although I've certainly been in far more high-stress situations than life-threatening ones, and I expect the same goes for most people on this forum. But then again, I don't think that was necessarily the point you were trying to make. I didn't mean to downplay the difficulty of coping with high-stress situations; it's legitimately hard. But practice is the best way to increase your likelihood of not dying.

For example, when I was taking martial arts, I was told that the best thing you can do if you actually want to be able to defend yourself is to drill your basic punches and kicks. The idea being that if you get into an actual fight, you are so comfortable with your moves that you'll be able to execute them, despite the fact that your higher reasoning centers are all going FUCKFUCKFUCKIMUNDERATTACK.

Same principle applies in other areas. You increase your chances of success by training a couple of behaviors or thought patterns until they come so naturally that they will happen even when your brain goes into panic mode.

comment by polymathwannabe · 2014-11-18T20:56:48.989Z

If believing that I am capable of winning a competition will cause me to win, believing that I am capable of winning is rational

Self-confidence will only help you in games where other factors already favor you. Unfounded self-confidence (or any unfounded belief) is very harmful.

Replies from: maxikov, Viliam_Bur, DanielLC, 27chaos
comment by maxikov · 2014-11-19T12:22:17.988Z

Unfounded self-confidence (or any unfounded belief) is very harmful.

Citation needed. Bluffing (i.e., unfounded confidence) seems to be a very effective strategy in many games. Apparently, even in chess:

UNIDENTIFIED MALE #2: Rook to D1.

CAMPBELL: And this particular move was really bad, and so it caused us to give up the game right away.

FOO: This really bad move confused Kasparov. Murray says he heard Kasparov's team stayed up that night trying to analyze the logic behind that move - what it meant. The only thing was - there was no logic.

comment by Viliam_Bur · 2014-11-19T09:47:41.374Z

Self-confidence will only help you in games where other factors already favor you.

A charitable reading of the post is that the other factors are already okay.

Maybe not good enough to guarantee victory, but say they give a 20% chance of winning (though only if the writer believes in their victory; otherwise they cannot find enough motivation to train properly), the gains from victory are huge, and the costs of training are relatively low. Then it could be instrumentally useful to believe that victory is almost certain. (Of course, with all the caveats and proper compartmentalization: for example, the author shouldn't make bets on their victory, only believe in it while training.)

comment by DanielLC · 2014-11-19T00:59:33.403Z

Self-confidence also has beneficial social effects, since it signals that you are capable of winning. As a result, people tend to be more self-confident than can be rationally justified, and they're also built to counteract those effects, for example by being risk-averse. If you are only as confident as is rationally justified but still risk-averse, you will avoid taking actions that would benefit you.

comment by 27chaos · 2014-11-18T20:57:39.835Z

I already know this. But you weren't supposed to mention it. I'd appreciate it if you'd delete this comment.

Replies from: polymathwannabe
comment by polymathwannabe · 2014-11-18T22:37:15.617Z

I would like this conversation to operate under the assumption

You started this debate by affirming a consequent, so you don't get to complain when the whole inconsistency of the matter is remarked upon. Willful embrace of cognitive dissonance is Dark Arts. By deleting my comment, I would be implicitly helping you keep up the pretense that short-circuiting your brain is beneficial.