Don't Believe You'll Self-Deceive

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-09T08:03:20.329Z · LW · GW · Legacy · 72 comments

I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

"If you know it's double-think...

...how can you still believe it?" I helplessly want to say.

Or:

I chose to believe in the existence of God—deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

Shouldn't the gut-level realization, "Oh, wait, the sky really isn't green" follow from the realization "My map that says 'the sky is green' has no reason to be correlated with the territory"?

Well... apparently not.

One part of this puzzle may be my explanation of Moore's Paradox ("It's raining, but I don't believe it is")—that people introspectively mistake positive affect attached to a quoted belief, for actual credulity.

But another part of it may just be that—contrary to the indignation I initially wanted to put forward—it's actually quite easy not to make the jump from "The map that reflects the territory would say 'X'" to actually believing "X".  It takes some work to explain the ideas of minds as map-territory correspondence builders, and even then, it may take more work to get the implications on a gut level.

I realize now that when I wrote "You cannot make yourself believe the sky is green by an act of will", I wasn't just a dispassionate reporter of the existing facts.  I was also trying to instill a self-fulfilling prophecy.

It may be wise to go around deliberately repeating "I can't get away with double-thinking!  Deep down, I'll know it's not true!  If I know my map has no reason to be correlated with the territory, that means I don't believe it!"

Because that way—if you're ever tempted to try—the thoughts "But I know this isn't really true!" and "I can't fool myself!" will always rise readily to mind; and that way, you will indeed be less likely to fool yourself successfully.  You're more likely to get, on a gut level, that telling yourself X doesn't make X true: and therefore, really truly not-X.

If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.

If you keep telling yourself that deep down you'll know—

If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able to invest any credulity in it—

If you keep telling yourself that reflective consistency will take over and make you stop believing on the object level, once you come to the meta-level realization that the map is not reflecting—

Then when push comes to shove—you may, indeed, fail.

When it comes to deliberate self-deception, you must believe in your own inability!

Tell yourself the effort is doomed—and it will be!

Is that the power of positive thinking, or the power of negative thinking?  Either way, it seems like a wise precaution.

72 comments

Comments sorted by top scores.

comment by kurige · 2009-03-10T00:31:46.144Z · LW(p) · GW(p)

I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I realize that my views do not agree with those of the large majority who frequent LW and OB - but I'd just like to take a moment to recognize that it's a testament to this community that:

A) There have been very few purely emotional or irrational responses.

B) Of those that fall into (A), all have been heavily voted down.

comment by Tyrrell_McAllister · 2009-03-09T18:31:42.597Z · LW(p) · GW(p)

I hope that Kurige comes back to verify this, but I'll bet that when he said

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

he did not mean, "My belief isn't correlated with reality". Rather, I'll bet, he meant exactly what you meant when you said

telling yourself X doesn't make X true

By saying that his choice had no effect on reality, I expect that he meant that his control over his belief did not entail control over the subject of that belief, i.e., the fact of the matter.

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

Replies from: kurige
comment by kurige · 2009-03-10T09:15:45.846Z · LW(p) · GW(p)

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

From the original comment:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I don't have the original text handy, but a quick search on wikipedia brings up this quote from the book defining the concept:

The power of holding two contradictory beliefs in one's mind simultaneously, and accepting both of them. … To tell deliberate lies while genuinely believing in them, to forget any fact that has become inconvenient, and then, when it becomes necessary again, to draw it back from oblivion for just so long as it is needed, to deny the existence of objective reality and all the while to take account of the reality which one denies.

The first sentence and the first sentence alone is the definition I had in my mind when I wrote the comment. It has been quite a while since I last read 1984 and I had forgotten the connotation that to "double-think" is to "deny the existence of objective reality." This was not my intention at all, although, upon reflection, it should have been obvious.

This was bad homework on my part; I should have looked the quote up before writing the comment. Instead of focusing on the example of morality that I used in the original comment, I'm going to try to step back a bit and clarify my original point... Instead of blind faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

These two beliefs are not contradictory, but the complexity lies in reconciling the two.

If one does not agree with the other, then my understanding of one or the other is flawed.

Replies from: Tyrrell_McAllister, Amanojack
comment by Tyrrell_McAllister · 2009-03-10T16:48:24.293Z · LW(p) · GW(p)

Okay, so, when you say that you engage in "doublethink", do you mean that you simultaneously hold two beliefs that are currently "unreconciled", and which you don't yet know how to reconcile, but which you believe can yet be reconciled?

If that's right, then I would be curious to know more about this "unreconciled" relation. Can you give other examples of pairs of "unreconciled" beliefs that you hold?

Replies from: HughRistik
comment by HughRistik · 2009-03-10T21:17:36.206Z · LW(p) · GW(p)

I'm also having trouble seeing kurige's "doublethink."

The double-think comes into play when you're faced with non-axiomatic concepts such as morality. I believe that there is a God - and that He has instilled a sense of right and wrong in us by which we are able to evaluate the world around us. I also believe a sense of morality has been evolutionarily programmed into us - a sense of morality that is most likely a result of the formation of meta-political coalitions in Bonobo communities a very, very long time ago.

These two beliefs are not contradictory, but the complexity lies in reconciling the two.

As you observe, the beliefs are not contradictory. There are various creative ways of reconciling them, such as deism (e.g. "God started the Big Bang"). Whether these reconciliations are true, or reasonable, is another question. Yet they are internally consistent, so there is no contradiction or double-think.

Instead of blind faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

I think that this is the closest to a contradiction you have displayed. It doesn't seem like your form of religion excludes the claims of science, but your version of science may exclude the claims of religion.

If, in your view, science requires the use of Occam's Razor, and you think belief in God violates Occam's Razor (as I do), yet you continue to believe in God, then I think you would be engaging in double-think. Yet if you don't think that Occam's Razor is valid, or you don't think that belief in God violates it, then I wouldn't claim that you were engaging in double-think without additional information.

Replies from: Annoyance
comment by Annoyance · 2009-03-10T21:25:27.481Z · LW(p) · GW(p)

"There are various creative ways of reconciling them, such as deism (e.g. "God started the Big Bang"). Whether these reconciliations are true, or reasonable, is another question."

If the purported reconciliation isn't reasonable, it's not a reconciliation, just as an asserted solution to a mathematical problem that doesn't match the requirements isn't an actual solution.

If I hit you in the head with a bat, would you accept that God was responsible because your injury wouldn't have occurred if (we presume) the universe had not been set into motion?

Replies from: HughRistik
comment by HughRistik · 2009-03-10T22:58:56.794Z · LW(p) · GW(p)

Annoyance said:

If the purported reconciliation isn't reasonable, it's not a reconciliation, just as an asserted solution to a mathematical problem that doesn't match the requirements isn't an actual solution.

First, I'm not sure what you are trying to show by your analogy to a mathematical problem, or by your question.

When I say that beliefs are reconciled, I am talking about internal consistency. Belief systems can be internally consistent without being true or reasonable.

If someone believes X and Y, and they do not contradict each other, then their beliefs are reconciled and internally consistent, even if Y is false or unreasonable. (Unless they hold another belief, Z, which implies that Y is false.)

Being wrong or unreasonable is not necessarily double-think. Do you not agree?

If we take someone whose beliefs seem internally consistent, but include some that are demonstrably false or unreasonable, then we might wonder if we could dig up a contradiction in their beliefs if we dug hard enough. Take, for instance, a theist who turns out to accept Occam's Razor. In this case, the internal consistency of their beliefs falls apart.

Yet even then, this still isn't necessarily double-think. Orwell's definition requires "holding two contradictory beliefs in one's mind simultaneously." If our theist never even thought about their belief in God and how it measured up to Occam's Razor, then this would not be double-thinking; it would be lack-of-thinking.

Replies from: Annoyance
comment by Annoyance · 2009-03-11T17:32:39.801Z · LW(p) · GW(p)

"When I say that beliefs are reconciled, I am talking about internal consistency. Belief systems can be internally consistent without being true or reasonable."

They might not be true, and they might not be reasonable in regard to a framing system of beliefs and knowledge, but they DO have to be reasonable relative to each other.

Saying that God is responsible for the existence of creation does not imply that everything that happens (including evolutionary processes) was designed by God. Evolutionary development as a concept is incompatible with the concept of intentional design. The two beliefs are not compatible with each other.

Replies from: tlhonmey
comment by tlhonmey · 2022-05-12T16:47:33.095Z · LW(p) · GW(p)

So that raises an interesting question...  Because that's exactly the same as suggesting that, when a programmer uses a code generator algorithm instead of writing every line carefully himself, he somehow ceases to be the "designer" of the system.

And yet, he wrote the code generator, and he gave it the parameters and tweaked them until he got a result that was within his tolerances...

It occurs to me that it's really not possible for us to determine whether or not life on this planet was the result of an intelligently-guided design process just by looking at the results.  We'd also have to know what said hypothetical intelligence's design goals were.

While we're definitely not built the way we would choose to build ourselves given the opportunity, holding that up as proof that there was no intelligence involved at all is a pretty arrogant assertion that all "intelligent" beings must think just like humans and share our preferences...

comment by Amanojack · 2010-03-14T05:43:53.502Z · LW(p) · GW(p)

Instead of blind faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

In other words, it seems you meant "doublethink" in the collective sense based on traditional sentiment, rather than in the actual sense of a logical contradiction between any one specific religious tenet A and any one specific scientific theory B. If there are no actual contradictions, "doublethink" was just an (unfortunate) turn of phrase and there is nothing to be reconciled.

comment by Annoyance · 2009-03-09T16:06:04.911Z · LW(p) · GW(p)

"When it comes to deliberate self-deception, you must believe in your own inability!"

That is both contrary to facts, and a pretty effective way to ensure that we won't search for and find examples where we've been deceiving ourselves. Without that search, self-correction is impossible.

"Tell yourself the effort is doomed - and it will be!"

Tell yourself that victory is assured, and failure becomes certain.

Replies from: Science Dogood
comment by Science Dogood · 2023-04-07T11:40:51.520Z · LW(p) · GW(p)

I'm surprised no one responded to this in 14 years [edit: I think the Hanson and Eliezer thread below addresses it well]. I think I agree with the post that explicit self-deception doesn't work, but automatic self-deception via default selfish attention rationing happens all the time. Similarly, people can choose to be biased even if they can't directly choose beliefs, because it is necessary to have simplifying algorithms to think at all. A common example would be that all the logical razors people use are also biases, and you can explicitly choose not to rely on the razor and keep thinking.

I think this is the sort of thing one can find if one does go looking for cases of accidental self-deception; not looking can leave people in a mental trap where they think their beliefs are more rational than is justified.

comment by WKnorpp · 2009-03-09T13:52:36.553Z · LW(p) · GW(p)

One question here obviously concerns doxastic voluntarism (DV). You ask:

"If you know your belief isn't correlated to reality, how can you still believe it?"

Is this a rhetorical question aiming to assert that "if you know your belief isn't correlated to reality, you can't still believe it"?

If so, then it just isn't clear that you're right. One possibility is that DV is true (there are, of course, many reasons to believe that it is). And, if DV is true, it's likely that different people have different degrees and kinds of control over their beliefs. After all, people differ with regard to all other known cognitive skills. Some irrational folks simply might have a kind of control over their beliefs that others don't have. That's an empirical question. (Though we normally think that folks who are more rational have greater control over their beliefs.)

You might, however, mean: if you know your belief isn't correlated to reality, you shouldn't still believe it.

That's a normative claim, not an empirical, psychological one. If that's what you mean, then you're in effect expressing surprise that anyone can be that irrational. If so, I guess I'm a little surprised at your surprise. It is a fairly pure case, but it seems to me that it's not that unusual to hear things like this.

comment by Cameron_Taylor · 2009-03-09T09:53:04.704Z · LW(p) · GW(p)

When it comes to deliberate self-deception, you must believe in your own inability!

Tell yourself the effort is doomed - and it will be!

Is that the power of positive thinking, or the power of negative thinking? Either way, it seems like a wise precaution.

The positive power of negative thinking. There is a book waiting to happen. Scratch that; Google tells me the title is already taken. Either way, the idea is fascinating.

Just what is the difference between deceiving yourself and 'positive thinking'? It is clear that Eliezer advocates telling yourself things that may not actually be true. You may tell yourself "I cannot believe what I know is not true". In some cases you may know yourself well enough to estimate that there is only a 40% chance that the claim could ever reasonably qualify as true no matter how diligent your pep-talking may be, yet it may still be worth a try. On first glance that seems like it is 60% self deception. Yet there is some sort of difference.

When we go about affirming to ourselves that "I am charming, assertive, have an overwhelming instinct to maintain reflective consistency and am irresistible to the opposite sex" we are not so much lying as we are using the mechanics of our brains to alter our computational hardware to an improved state. But then, a believer could plausibly use the same defence.

Is it the potential for self-fulfillment that makes our not-quite-truths 'ok'? We know that by telling ourselves we are assertive or that we can't stand to bullshit ourselves we probably do influence these traits somewhat. Yet again, the more we know ourselves the more we are able to know just to what extent we will be able to modify our cognitive behaviors. If we know that we'll never have the desired trait to a respectable degree then we have less scope to affirm ourselves without blatant lies. Having more self-awareness would limit our options for self-improvement. Now, there may be something to that connection, but it isn't something I would want to formalise into my understanding of what constitutes 'self deception'.

Could it be that these affirmative non-truths are different because they are self-referential? When Eliezer delved into subjectivity he etched into my mind the quote from Robert Dick, "Reality is that which, when you stop believing in it, doesn't go away". We could almost argue that because we are talking about things that change based on what we believe, we are outside the scope of reality so have free rein. Almost. It still seems to me that as a statement of the state of the universe, "I can't fool myself!" may objectively be nonsense both as a current observation and as a prediction of the future and yet still be worth saying to yourself. That's right. "I can't fool myself and even though I can you'll probably believe me anyway, which helps, so knock 5% off the probability that I'll be able to believe something really idiotic. Thanks, bye."

Maybe the central difference is just that it's a "white lie". If the goal is to create the most accurate map of reality it is quite possibly the case that the optimal strategy is to believe certain false things. Try limiting yourself to only ideally rational behaviors and you may well end up less rational than if you'd taken a few liberties and made allowances for your weaknesses.

Replies from: Tyrrell_McAllister, zaph
comment by Tyrrell_McAllister · 2009-03-09T18:17:37.833Z · LW(p) · GW(p)

It is clear that Eliezer advocates telling yourself things that may not actually be true.

I don't think so. He is advocating telling yourself something on the condition that telling it to yourself causes it to be true.

It's not equivalent to telling yourself "I'm attractive to the opposite sex." Say that you doubted this prior to uttering it. Then, yes, after uttering it, you might have reason to think that it is marginally more likely to be true. But you almost certainly wouldn't be justified in believing it with high confidence. That is, you still shouldn't believe the statement, so telling it to yourself is dishonest.

In contrast, Eliezer is suggesting that perhaps regularly uttering the statement

I can't get away with double-thinking! Deep down, I'll know it's not true! If I know my map has no reason to be correlated with the territory, that means I don't believe it!

does alter you so as to make itself true. If that's right, then, conditioned on your having uttered it, you are justified in believing what you uttered, so you are not being dishonest.

It's not a matter of being outside of reality. The utterance is part of reality. That's precisely why it may have the power to cause itself to be true.

Of course, it may be that this particular statement just doesn't have that power. If the probability of that were above a certain threshold, I expect that Eliezer wouldn't advocate saying it unless it's true already.

Replies from: HalFinney, pjeby
comment by HalFinney · 2009-03-09T22:26:41.334Z · LW(p) · GW(p)

What evidence is there that yelling at yourself like this is going to make a difference? Let us imagine two kinds of people: those who cannot fall into Moore's paradox (believing the map but not the territory) and those who can. People in the first class, who are immune to the problem, will gain no benefit from reciting these mantras. People in the second class, for whom there is a real risk of making these kinds of errors, are supposed to vigorously tell themselves that there is no such risk! They are supposed to lie to themselves in the hope that the lie will become true. But why should they believe it?

And how different is this lie, really, from the wannabe god-worshiper who similarly insists to himself that he believes that god exists, even though it is not true?

I can't help wondering whether this posting is meant to be ironic. It comes perilously close to outright self-contradiction.

Replies from: AnnaSalamon
comment by AnnaSalamon · 2009-03-09T22:47:56.627Z · LW(p) · GW(p)

Hal, perhaps Eliezer's view is that there are "suggestible" portions of one's mind that it is okay to suggest things to, but there is some other, reason-capable faculty that one can and should use to form true, un-self-deceived, evidence-before-bottom-line beliefs.

Whether or not that's Eliezer's view, the above view seems right to me. It would be silly not to suggest useful frames, emotional stances, energy levels, etc. to the less rational parts of myself -- that would leave me freezing in particular, arbitrary/chance/un-useful starting states. But for the part of myself that can do full cost-benefit analyses, and math, and can assemble my best guess about the world -- misleading that part of myself would be terrifying, like putting my eyes out. (I mean, I deceive the reason-capable part of myself all the time, like most humans. But it's terrifying that I do, and I really really want to do otherwise... including by suggestibility tricks, if they turn out to help.)

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-09T23:16:52.010Z · LW(p) · GW(p)

Tyrrell and Anna have stated my views better than I'd previously gone so far as verbalizing.

There are large sectors of the mind in which belief tends to become reality, including important things like "I am the sort of person who continues even in the face of adversity" and "I do have the willpower to pass up that cookie."

But - given that you aren't actually trying to fool yourself - there's a chicken-and-egg aspect that depends on your having enough potential in this area that you can legitimately believe the statement will become true if you believe it. At that point, you can believe it and then it will be true.

There's an interesting analogy here to Löb's Theorem which I haven't yet categorized as legitimate or fake.
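[For reference, since the analogy is easy to lose: Löb's Theorem in provability logic says that if a system can prove "any proof of P would make P true", then it can already prove P. A minimal sketch of the standard statement, with \Box P read as "P is provable":]

% Löb's Theorem in provability logic:
\[ \vdash \Box(\Box P \rightarrow P) \rightarrow \Box P \]
% Reading \Box as "is believed" gives the analogy: a mind that accepts
% "if I believe X, then X will come true" is thereby licensed to believe X.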

To look at it another way, this sort of thing is useful for taking simultaneous steps of self-confidence and actual capability in cases where the two move in lockstep. Or, in the case of anti-competencies like doublethink, the reverse.

Replies from: abigailgem
comment by abigailgem · 2009-03-10T09:43:44.097Z · LW(p) · GW(p)

"I have the potential to be the sort of person who continues even in the face of adversity", or "it is more in my interests to pass up that cookie", or "I really do have a choice whether or not to pass up that cookie". That is what I would recommend.

bill, below, has mentioned "Act as if": "I choose to Act as If I can continue even in the face of adversity, and I intend in this precise moment to continue acting, even if I may just fall down again in two minutes' time".

These have the advantages of being more likely to be true.

Rambling on a little, to be the sort of person who continues in the face of adversity is Difficult, and requires practice, and that practice is very worthwhile. Stating that it is True might make you fail to do the practice, and instead beat yourself up when it appears not to be true.

comment by pjeby · 2009-03-09T19:45:38.795Z · LW(p) · GW(p)

Dishonest or not, convincing yourself that you're attractive to the opposite sex is more likely to produce a positive result. And a rationalist should win. ;-)

comment by zaph · 2009-03-09T16:56:19.839Z · LW(p) · GW(p)

Sorry for the pedantry, but I believe that's Philip K. Dick's quote.

To the "sky is green" idea, I'd counter that the verification path might not work for converting people to atheism. Mormons for instance, suggest to people they will feel a burning in their heart when they read the Book of Mormon, which proves the books veracity. You need to logically piece together that any such physical sensation wouldn't be sufficient to objectively verify anything. There isn't an easy falsification of religious/magical thinking, just following chains of inference from observation. Non-believers just make a commitment to the minimal contortion of facts to fit their paradigm. As obvious as the Silence seems to be, some people don't seem to hear it.

comment by JamesAndrix · 2009-03-11T16:40:26.467Z · LW(p) · GW(p)

I was under the impression that doublethink involved contradictory ideas; Kurige seems to be talking about descriptions that are not inherently contradictory.

On the subject of not being able to update, I know of an anorexic who claims that even if she were rail-thin, she would be a fat person in a thin body. The knowledge of thinness does not affect the internal concept of self-fatness. (probably formed during childhood)

http://lesswrong.com/lw/r/no_really_ive_deceived_myself/gl

I don't think I'd call my situation self deception. I am not making myself "believe the sky is green by an act of will." Rather, something in me says the sky is green, and is not dependent on observations of the sky at all.

No matter how much you're committed to updating your map, you'll face a conundrum when you realize you should have made your map round, and that's not something that you can trivially change about your map. You can understand and minimize the distortions, and use different projections in different situations, but you might always be stuck with a flat map. Knowing the territory is round doesn't change the experience you have of looking at a flat map.

Replies from: theotetia
comment by theotetia · 2009-03-13T04:43:40.243Z · LW(p) · GW(p)

Wow. I love the flat-vs.-round elaboration of the map metaphor. I had never thought about it that way. My thoughts just got way more interesting. Thanks.

comment by Yasser_Elassal · 2009-03-09T15:36:26.837Z · LW(p) · GW(p)

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

To be fair, he didn't say that the actual existence of God has absolutely zero effect on his decision to believe in the existence of God.

His acknowledgement that the map has no effect on the territory is actually a step in the right direction, even though he has many more steps to go.

Replies from: mamert
comment by mamert · 2019-04-24T11:11:39.981Z · LW(p) · GW(p)

My thoughts exactly. Seeing that statement, I must absolutely AGREE with the second part, and only politely point out that he should rephrase the first part, working "probability" and "working hypothesis" into it.

comment by RobinHanson · 2009-03-09T14:44:05.550Z · LW(p) · GW(p)

It seems to me you are trying to deceive yourself into thinking that you cannot comfortably self-deceive. Your effort may indeed make it harder to self-deceive, but I doubt it changes your situation all that much. Admit it, you are human, and within the usual human range of capabilities and tendencies for self-deception.

Replies from: Eliezer_Yudkowsky, Roko
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-09T16:59:15.733Z · LW(p) · GW(p)

Thus did I carefully write, "cannot deliberately self-deceive", not, "cannot self-deceive".

Replies from: RobinHanson
comment by RobinHanson · 2009-03-09T17:21:21.497Z · LW(p) · GW(p)

We have a continuum of degrees of deliberation to our actions. Even if I agree that you cannot self-deceive at the strongest degree of deliberation, that isn't in practice much of a restriction on your ability to self-deceive.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-09T18:26:48.993Z · LW(p) · GW(p)

Might seem that way to you because you don't actually go around all day saying, "And now I shall doublethink myself into believing X!" Deliberate self-deception is a subset of self-deception well worth slicing off the carcass. E.g. Utilitarian from OB.

Just because the boundary of deliberate self-deception is fuzzy, doesn't mean the boundary is not worth drawing. The more so in this particular case, as if you wonder "Is this a deliberate self-deception that I can't get away with, or a non-deliberate one that I might still be able to pull off?" it has already reached the point of being deliberate. (Repeating this to yourself will make it even more true.)

comment by Roko · 2009-03-09T14:58:18.810Z · LW(p) · GW(p)

It may be the case that you can easily self-deceive if and only if you think you can self-deceive, in which case Robin's comment is an attempt to cause Eliezer serious brain damage...

comment by Roko · 2009-03-09T12:49:01.986Z · LW(p) · GW(p)

" Kurige: One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think."

I defy the data! Have you considered the possibility that kurige is a troll? This is an exceptionally weird statement even for a Christian...

Replies from: Kevin
comment by Kevin · 2009-03-09T14:53:43.463Z · LW(p) · GW(p)

A troll seems more likely than us having a real, live Protestant Christian here... I've never heard of one making it very far on OB. The idea of Christians trying to read OB is a joke to me -- it must seem like crazy gibberish to them.

Replies from: Nick_Tarleton, CarlShulman, theotetia, Roko, ciphergoth
comment by Nick_Tarleton · 2009-03-09T16:41:21.204Z · LW(p) · GW(p)

Needlessly negatively worded generalizations about outgroups considered harmful, especially since someone might eventually want to show this site to a member of that outgroup for whatever reason.

comment by CarlShulman · 2009-03-09T17:25:46.148Z · LW(p) · GW(p)

Utilitarian is or was a Pascal's Wager Christian and OB poster, professing and following Christianity while acknowledging that the evidence suggests that it is very probably false. Kurige's Orwellian stance fits into the same mold of anomalous Christianity reshaped on the rack of contrary evidence.

comment by theotetia · 2009-03-09T18:36:08.517Z · LW(p) · GW(p)

I'm a lifelong atheist trying to see my way clear to [one of the more serious branches of] Christianity. Eliezer's posts, especially, have helped me draw my map of the strengths and limitations of rationalism. I'm not here to troll, I'm here to learn.

Replies from: Marcello, Roko
comment by Marcello · 2009-03-09T18:45:11.272Z · LW(p) · GW(p)

A phrase like "trying to see my way clear to" should be a giant red flag. If you're trying to accept something, then you must have some sort of motivation. If you have the motivation to accept something because you actually believe it is true, then you've already accepted it. If you have that motivation for some other reason, then you're deceiving yourself.

Replies from: theotetia
comment by theotetia · 2009-03-09T18:59:18.640Z · LW(p) · GW(p)

I want it because it's beautiful, but I won't take it unless it's true.

Replies from: danlowlite
comment by danlowlite · 2010-11-30T14:49:50.827Z · LW(p) · GW(p)

Truth has a beauty all its own.

Not that false things can't have beauty, but we usually call those things art.

comment by Roko · 2009-03-09T18:56:48.883Z · LW(p) · GW(p)

This comment is even more interesting than kurige's... when you said:

"trying to see my way clear to [one of the more serious branches of] Christianity."

you mean that you want to make yourself into a Christian, i.e. you want to have a particular belief? That is fascinating. I'd love to hear more about this or chat to you. I'm easy to email...

comment by Roko · 2009-03-09T16:16:06.829Z · LW(p) · GW(p)

It isn't impossible, but I feel that there is a 50/50 chance that we are being taken for a ride here... I have spoken to many Christians in my time, and never, ever have I come across a Christian who admits that they are "engaging in doublethink".

It would be interesting to think about how we could test to see whether Kurige is for real.

comment by Paul Crowley (ciphergoth) · 2009-03-09T16:22:30.523Z · LW(p) · GW(p)

Heh, we did at one point have at least one Christian reader, but they deconverted, at least in part due to what they read here. Weirdly, they still seem to treat Christianity as a proper idea worth taking seriously, but I guess cognitive dissonance takes time to wear off.

(if you're reading - hello!)

comment by Yosarian2 · 2014-01-21T01:06:04.275Z · LW(p) · GW(p)

It sounds like you don't really believe that double-think is impossible; you just have belief in belief in the impossibility of double-think, because you think that belief would be a useful one to have.

As soon as you start "trying to instill a self-fulfilling prophecy", you're going down the same road as the people who say "I believe in God because I think it's useful to have a belief in God."

To be clear, if you're trying to make it impossible for yourself to double-think by planting that thought in your head, that may be a rational strategy. But don't try to convince yourself that it's impossible for other people to double-think just because you wish that were the case; reality is what it is, not what we would like it to be.

comment by billswift · 2009-03-09T12:56:07.006Z · LW(p) · GW(p)

Maybe we need to split this into two words: "believe" for when something is not supported by fact, or is even against the evidence. I mean, I've never heard anybody say, "I believe in gravity". Maybe use the phrase "I accept" for supported ideas, as in "I accept quantum mechanics" or "I accept that god does not exist". "Accept" also seems to have less affect than "believe", which may make it easier to change your mind if the evidence changes.

Replies from: bill, Eliezer_Yudkowsky, thomblake
comment by bill · 2009-03-09T14:51:30.947Z · LW(p) · GW(p)

"Act as if" might work.

For example, I act as if people are nicer than they are (because it gets me better outcomes than other possible strategies I've tried).

This also has the benefit of clearly separating action (what we can do) from information (what we know) and preferences (what we want).

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-09T17:02:27.556Z · LW(p) · GW(p)

"I accept that..." sounds like it could be useful in a lot of cases.

Consider the more swiftly apparent incoherence:

"I accept that people are nicer than they are."

Maybe this is the word we should've been using all along!

Replies from: Document, Document
comment by Document · 2010-10-10T00:32:41.212Z · LW(p) · GW(p)

Currently wondering if synonyms for belief in different contexts should be a page on the wiki.

comment by Document · 2010-10-09T05:58:27.883Z · LW(p) · GW(p)

Other substitutes: "it's clear to me that" and "I recognize".

comment by thomblake · 2009-03-09T15:14:05.557Z · LW(p) · GW(p)

No, in ordinary English, 'believe' means believe - but it also means 'accept' or 'endorse' or various other sorts of things. If we're going to be entrusted with eradicating a common usage (ha) then I say let 'believe' only mean believe. Thus, here, the assertion "I believe X" should be taken to be equivalent to the assertion "X".

Replies from: Annoyance
comment by Annoyance · 2009-03-09T16:03:41.711Z · LW(p) · GW(p)

"Thus, here, the assertion "I believe X" should be taken to be equivalent to the assertion "X"."

We can believe something without asserting it to be true. "I assert X to be true", likewise, doesn't require that we believe X to be true. All sorts of arguments involve assertions of truth that we don't necessarily extend beyond the argument.

It's something like the empty set: when the null symbol is bracketed, as in {∅}, the result doesn't mean "the empty set". Empty brackets {}, or the null symbol ∅ by itself, means that.

Replies from: thomblake
comment by thomblake · 2009-03-10T16:05:48.771Z · LW(p) · GW(p)

Asserting something one does not believe is lying. By the principle of charity we should assume our fellows are not lying, in which case "X" implies "I believe X". Obviously, that's only halfway to equivalence.

If I were to say, "I believe that the president is John McCain", and you responded by disputing my claim that the president is John McCain, I would be out of line to respond that I had never asserted that the president is John McCain. Similarly for the exchange "I believe that Annoyance is Caledonian" "But I'm not Caledonian" "I didn't say you were".

And so they are equivalent, unless you deny the principle of charity or have a counterexample for my second point.

comment by Kate Gladstone (kate-gladstone) · 2019-07-13T06:13:18.161Z · LW(p) · GW(p)

A real-world instance of Moore’s Paradox (“It’s raining, but I don’t believe it is”) occurs several times annually at Autzen Stadium in Eugene, Oregon —

https://en.m.wikipedia.org/wiki/Autzen_Stadium

 [quote:]

Since 1990, Don Essig, the stadium's PA announcer since 1968, has declared that "It never rains at Autzen Stadium" before each home game as the crowd chants along in unison. He often prefaces it with the local weather forecast, which quite often includes some chance of showers, but reminds fans that "we know the real forecast..." or "let's tell our friends from (visiting team name) the real forecast..." If rain is actually falling before the game, Essig will often dismiss it as "a light drizzle", or "liquid sunshine" but not actual rain by Oregon standards.[60]

[60] Baker, Mark (March 6, 2010). "Still Quackin'". The Register-Guard. Archived from the original on September 23, 2010. Retrieved September 20, 2010.

[link to archive:] https://web.archive.org/web/20100923102447/http://special.registerguard.com/csp/cms/sites/web/news/cityregion/24519427-41/essig-game-court-basketball-mac.csp

comment by Decius · 2013-08-10T03:46:59.621Z · LW(p) · GW(p)

If I am capable of deliberate self deception, I want to believe that I am capable of deliberate self deception.
If I am not capable of deliberate self deception, I want to believe that I am not capable of deliberate self deception.

comment by Indon · 2013-04-03T20:04:26.433Z · LW(p) · GW(p)

I find it amusing that in this article, you are advocating the use of deliberate self-deception in order to ward yourself against later deliberate self-deception.

That said, I feel the urge to contribute despite the large time-gap, and I suspect that even if later posts revisit this concept, my contribution will be less relevant there.

"I believe X" is a statement of self-identity - the map of the territory of your mind. But as maps and territories go, self-identity is pretty special, as it is a map written using the territory, and changes in the map can affect the territory as a result - though not necessarily in the exactly intended fashion. So even if deliberate self-deception isn't possible, then some approximation of it probably is.

Moreover, I'd like to question the definition of 'belief' in this context. If we emphasize, as part of the concept, that a belief is something that affects one's actions, then there is such a thing as a false belief that someone holds: that is to say, an assumption someone intentionally makes, regardless of its truth or falsehood, that they use to guide their behavior for external reasons.

That is to say, acting, or role-playing.

I'm rather a believer in cognitive minimalism - that our brains are very uncomplex. So I would assert that the same system that we use to model others' behavior - or to play others' roles - we use for our own self-identity. So when you say, "I believe X", you're effectively saying, "I act as if X is true". And if we use the same system to act like ourselves, to model our own behavior, as we do to model or act like anyone else, then that's most of what the practical impact of a belief is.

What I'm trying to say is that the only difference between acting a certain way and believing a certain thing is that you only do the acting under certain practical conditions - the belief, insofar as a belief is different from an act, is acting in a certain way all the time, for any reason.

Replace "I believe X because..." with "I act as if X is true because..." and I don't think it's confusing anymore. Self-identity modification as a tool is pretty important to human cognition, not just for trying to convince yourself that what you don't think is true, is.

Edit: Actually, I want to amend that last part now that I think on it. I would assert that there is no difference whatsoever; that all reasonable beliefs are contingent. In fact, a big part of acting rationally is about making your beliefs contingent on the truth or falsehood of the object of the belief. Beliefs that aren't based on accuracy are still contingent, just on things like, "This is beneficial to me in some way." And really, a rational belief is similar, it just goes, "I believe X because it is accurate," with the implied addition, "and accuracy is good to have in a belief," so that boils down to a practical reason as well.

comment by christopherj · 2013-09-30T16:56:08.362Z · LW(p) · GW(p)

I chose to believe in the model of science—deliberately and consciously. This decision, however, has absolutely zero effect on the actual scientific method. I choose to believe science not because I can show it to be likely true, but simply because it is useful for making accurate predictions. I choose to reject, at least insofar as my actions go, my internal beliefs about how the world works when they conflict with the way science says the world works. I reject my intuition and all my firsthand experience that velocity is additive because relativity says it is not. I reject my intuition and firsthand experience that smaller and smaller particles act like proportionally smaller grains of sand because quantum theory says they behave like waves. I choose to fight every bias I possess as I become aware of it, though I clearly believe and act as if that bias were true when I am not fighting it.

If I cannot choose to reject old beliefs and accept beliefs I do not currently possess, how can I choose to overcome bias or become less wrong?

Replies from: hairyfigment
comment by hairyfigment · 2013-09-30T17:10:17.554Z · LW(p) · GW(p)

Not to put too fine a point on it, but you sound like you already expect science's predictions for velocity to come true before you "choose to reject old beliefs". If someone asked you beforehand to bet on whether your intuitions or science would pan out here (in those words), you'd bet on science.

I sometimes feel (less often now) that if I 'follow the rules' nothing really bad can happen to me. I try to fight this feeling because even my own sheltered life suggests its predictions would fail eventually.

ETA: alief.

comment by pjeby · 2009-03-09T19:54:23.581Z · LW(p) · GW(p)

It seems to me that you are confused.

There are two kinds of belief being discussed here: abstract/declarative and concrete/imperative.

We don't have direct control over our imperative beliefs, but can change them through clever self-manipulation. We DO have direct control over our declarative beliefs, and we can think whatever the heck we want in them. They just won't necessarily make any difference to how we BEHAVE, since they're part of the "far" or "social" thinking mechanism.

You seem to be implying that there's only one kind of belief, and that it should be subject to some sort of consistency checking. However, NEITHER kind of belief has any global or automatic consistency checking. We can stop intellectually believing that we're dumb or incompetent, for example, and still go on believing it emotionally, because although the abstract memory involved has been updated, the concrete memory hasn't.

It isn't even necessary to DO anything in order to have contradictory beliefs; it merely suffices to neglect the cross-checking, and perhaps a bit of effort to avoid thinking about the connection when somebody tries to show it to you.

And that avoidance can take place automatically, if you have a strong enough emotional reason for wanting to maintain the intellectual belief. Even among my clients who WANT to change some belief or fix some problem in their heads, the first step for me is always getting them to stop abstracting themselves away from actually looking at what they believe on the concrete/emotional level, as opposed to what they'd prefer to believe on the abstract/intellectual level.

Imagine how much harder it must be for someone who isn't TRYING to change their beliefs!

Replies from: abigailgem
comment by abigailgem · 2009-03-10T09:48:43.518Z · LW(p) · GW(p)

"The monster will get me if I make a mistake" can be a deep concrete belief, one looks at it rationally, and thinks, that is ridiculous- but getting rid of it can be hard work.

comment by azergante · 2024-02-24T12:51:06.302Z · LW(p) · GW(p)

If you know your belief isn't correlated to reality, how can you still believe it?

Interestingly, physics models (maps) are wrong (inaccurate), and people know it, but they still use them all the time because they are good enough with respect to some goal.

Less accurate models can even be favored over more accurate ones to save on computing power or reduce complexity.

As long as the benefits outweigh the drawbacks, the correlation to reality is irrelevant.

Not sure how cleanly this maps to beliefs, since one would have to be able to go from one belief to another; however, it might be possible by successively activating different parts of the brain that hold different beliefs, the way someone very angry can completely switch gears to answer an important phone call.

comment by omalleyt · 2016-09-08T18:59:04.033Z · LW(p) · GW(p)

I'm going to go off the assumption that this post is deliberate satire, and say it's brilliant.

"Even if it's not true, I'm going to decide to believe that people can't sincerely self-deceive."

comment by cleonid · 2009-03-09T16:58:02.185Z · LW(p) · GW(p)

All people have a marked preference to believe what they want to believe, especially when there are no direct costs associated with the false belief. The majority therefore prefers the belief in a charitable higher power to an uncaring universe guided solely by the laws of physics.

The fact that a minority made up of self-declared rationalists can get by without this belief may have less to do with their rationalism than with the warm feeling of superiority they feel towards the rest of mankind. This can at least in part console them for giving up religion. Personally I get my consolation from feeling superior to both groups.

comment by Cameron_Taylor · 2009-03-09T08:54:48.537Z · LW(p) · GW(p)

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

A good question. Perhaps it could be distanced a little more from the quote that precedes it? That quote by itself seems to be rational. (The irrational basis of the deliberate and conscious choice in question is nearly guaranteed, but at least it is out of context.)

comment by [deleted] · 2015-08-08T04:43:26.943Z · LW(p) · GW(p)

What you're saying has supercharged my cognitive flexibility. I never even thought to check whether my self-reported beliefs correlate with thoughts that I have positive affect towards and examine the implications!

Reminds me of Journeyman's comment on my EA article:

I don’t think EAs do a very good job of distinguishing their moral intuitions from good philosophical arguments; see the interest of many EAs in open borders and animal rights. I do not see a large understanding in EA of what altruism is and how it can become pathological. Pathological altruism is where people become practically addicted to a feeling of doing good which leads them to act sometime with negative consequences. A quote from the book in that review, which shows some of the difficulties disentangling moral psychological from moral philosophy:

comment by Jarogers326 · 2015-08-03T17:33:28.791Z · LW(p) · GW(p)

"'I chose to believe in the existence of God—deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.'

If you know your belief isn't correlated to reality, how can you still believe it?"

It's the difference between someone who's afraid of heights standing twenty feet from a cliff and standing two inches from the cliff. The former knows what will happen if he moves over and looks down; the latter is looking down and feeling the fear.

If you tell yourself you believe in a wall, then you're less likely to worry about what's on the other side.

comment by MarsColony_in10years · 2015-04-03T20:57:06.999Z · LW(p) · GW(p)

If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.

If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able to invest any credulity in it—

If you keep telling yourself that reflective consistency will take over and make you stop believing on the object level, once you come to the meta-level realization that the map is not reflecting—

Then when push comes to shove—you may, indeed, fail.

I'd phrase this differently. It's pretty clear from general evidence, not just from the examples provided, that people can readily believe contradictory things simultaneously. It's even commonly recognized, so much so that we have a well known word for it: cognitive dissonance. Psychologists use the term to denote the unpleasant feeling we get when we have two or more conflicting beliefs, ideas, or values. The human mind is incredibly plastic, and is constantly changing as we acquire new information. Given how complex the mind is, it's not terribly surprising that updating one belief may not cause a related belief to update if the mind doesn't realize the two are related. But when we eventually realize that there is a contradiction, surely there is a process for correcting it, rather than the weaker belief simply instantly disappearing. In order to infer a little about what that process might look like, we can look at how people reduce and resolve easier-to-observe types of cognitive dissonance, such as the case where someone's desires are in conflict. Wikipedia provides this example:

Cognitive dissonance theory is founded on the assumption that individuals seek consistency between their expectations and their reality. Because of this, people engage in a process called dissonance reduction to bring their cognitions and actions in line with one another. This creation of uniformity allows for a lessening of psychological tension and distress. According to Festinger, dissonance reduction can be achieved in four ways. In an example case where a person has adopted the attitude that they will no longer eat high fat food, but is eating a high-fat doughnut, the four methods of reduction would be:

  1. Change behavior or cognition ("I will not eat any more of this doughnut")

  2. Justify behavior or cognition by changing the conflicting cognition ("I'm allowed to cheat every once in a while")

  3. Justify behavior or cognition by adding new cognitions ("I'll spend 30 extra minutes at the gym to work this off")

  4. Ignore or deny any information that conflicts with existing beliefs ("This doughnut is not high fat")

What might we get if we tried to generalize this to cases where beliefs are in conflict, rather than desires? Here's my guess:

  1. Change behavior or cognition ("My belief X is wrong.")

  2. Justify behavior or cognition by changing the conflicting cognition ("Well, X2 can still be true even if X1 isn't.")

  3. Justify behavior or cognition by adding new cognitions ("X may conflict with Y, but Z can fix the issue." or "If Z is true or the mind/world works like Z, then the apparent conflict between X and Y is explained away!")

  4. Ignore or deny any information that conflicts with existing beliefs ("Everyone has the right to their own opinion. Right or wrong, I prefer to believe X", "My belief X doesn't ACTUALLY conflict with Y", "You just can't compare X and Y", or "You can't apply logic to X".)

I wonder if all that is needed to make it easier to choose option 1 is for options 2-4 to become stigmatized. Aka, if every time I am naturally inclined to choose option 4 I am reminded of all the discussion on LessWrong about trying not to be an option 4 person, and I naturally identify with the option 1 crowd and want to be more like the option 1-ers I admire, then will my gut impulse be more likely to be option 1? Or is changing one's mind destined to always be a struggle of intellect vs impulse?

comment by jooyous · 2013-01-10T04:59:37.798Z · LW(p) · GW(p)

I thought about believing that people are nicer than they really are before reading this and the previous article, and I was worried I had done that thing where I believed I succeeded in deceiving myself. Then I unpacked it to be "it is beneficial to act like you expect the next person you meet to be nice, because if you believe that they are likely to turn out mean then you will start acting as if you expect them to be a jerk, which is more likely to make them act like a jerk; therefore just act as if you already think they're nice, but be prepared to appropriately react to evidence that they're a jerk if they present it." Which I think is straight-forward and not contradictory, right? Because it doesn't tell me to believe anything that conflicts with reality, it just tells me how to act.

I'm curious if this maps at all onto the existence of God. Does acting like you believe God exists cause you to do certain good things that you wouldn't do otherwise?

Replies from: Qiaochu_Yuan
comment by Qiaochu_Yuan · 2013-01-10T05:07:58.768Z · LW(p) · GW(p)

it is beneficial to act like you expect the next person you meet to be nice

Are you suggesting a strategy different from "default to acting nice to people"? You can justify this strategy without phrasing it in terms of acting as if you have a belief you don't have.

Does acting like you believe God exists cause you to do certain good things that you wouldn't do otherwise?

Probably, but as someone who reads LW, you hopefully recognize that you can just do those things anyway without making any statements about your beliefs.

Replies from: jooyous
comment by jooyous · 2013-01-10T05:15:36.138Z · LW(p) · GW(p)

Are you suggesting a strategy different from "default to acting nice to people"?

Oops, sorry! I am suggesting the strategy "continue meeting strangers and being nice to them" for the problem of finding nice people. As opposed to "after meeting 5 jerks in a row, conclude that everyone is a jerk and hide from humans forever."

You can justify this strategy without phrasing it in terms of acting as if you have a belief you don't have.

Exactly! And I think I phrased it more or less this way when I computed it for personal use. But I keep encountering people who try to argue that there's no point of meeting the next person because the past 5 people they've talked to turned out to be jerks. And I think arguing with those people turned my argument into "Well you shouldn't BELIEVE the next person is going to be a jerk because that's probably skewing your data." Which isn't quite what I meant; it just got stuck in my head in that flawed form. I wasn't trying to get them to believe their data away; I was trying to get them to act nice in spite of it. =P

Does acting like you believe God exists cause you to do certain good things that you wouldn't do otherwise?

Yeah, I was trying to sorta-direct the comment at the person mentioned in the body of the post who is probably long gone by now. I was wondering if there was a "useful action" component in their desire to keep a God node around in their head that they consciously keep from melting away.

comment by BlindDancer · 2011-04-03T14:13:00.762Z · LW(p) · GW(p)

See... beliefs are emotional statements rooted heavily in cultural heritage and instinct. Overcoming them is difficult. So for example no matter how hard I stand in the cockpit screaming at myself that I'm doing something stupid, I still react with a fear response to frightening images shown on a movie screen.

Though I guess the problem here is a definitional one. You define belief a bit more narrowly than I do, so I'm quibbling. I feel the need to bring this up (for your consideration), but I'm not going to pursue it. I'm probably being stupid even bringing it up.

comment by EmbraceUnity · 2009-06-21T23:05:54.314Z · LW(p) · GW(p)

Plenty of people, including myself, seem to understand that they are risk-averse, and yet fail to seek risk-neutrality.

comment by Mike Bishop (MichaelBishop) · 2009-03-10T05:28:01.877Z · LW(p) · GW(p)

Tell yourself the effort is doomed - and it will be!

@Eliezer: People are going to misinterpret this far too frequently. Add an addendum to the post to clarify it.