Belief in Self-Deception

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-05T15:20:27.590Z · LW · GW · Legacy · 114 comments

I spoke yesterday of my conversation with a nominally Orthodox Jewish woman who vigorously defended the assertion that she believed in God, while seeming not to actually believe in God at all.

While I was questioning her about the benefits that she thought came from believing in God, I introduced the Litany of Tarski—which is actually an infinite family of litanies, a specific example being:

  If the sky is blue
      I desire to believe "the sky is blue"
  If the sky is not blue
      I desire to believe "the sky is not blue".

"This is not my philosophy," she said to me.

"I didn't think it was," I replied to her.  "I'm just asking—assuming that God does not exist, and this is known, then should you still believe in God?"

She hesitated.  She seemed to really be trying to think about it, which surprised me.

"So it's a counterfactual question..." she said slowly.

I thought at the time that she was having difficulty allowing herself to visualize the world where God does not exist, because of her attachment to a God-containing world.

Now, however, I suspect she was having difficulty visualizing a contrast between the way the world would look if God existed or did not exist, because all her thoughts were about her belief in God, but her causal network modelling the world did not contain God as a node.  So she could easily answer "How would the world look different if I didn't believe in God?", but not "How would the world look different if there was no God?"
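
To picture the distinction, here is a toy sketch in Python (the node names and structure are invented purely for illustration, not taken from anything she said): a counterfactual query is only well-posed for nodes the model actually contains.

  # Toy sketch: a causal model as a plain dict mapping each node to the
  # set of nodes that directly influence it. In this hypothetical
  # world-model, "believes_in_god" is a node, but "god" itself is not.
  her_model = {
      "believes_in_god": set(),
      "feels_comforted": {"believes_in_god"},
      "attends_synagogue": {"believes_in_god"},
  }

  def can_ask_counterfactual(model, node):
      # "How would the world look different if NODE were different?"
      # is only answerable if NODE appears in the model at all.
      return node in model

  print(can_ask_counterfactual(her_model, "believes_in_god"))  # True
  print(can_ask_counterfactual(her_model, "god"))              # False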

She didn't answer that question, at the time.  But she did produce a counterexample to the Litany of Tarski:

She said, "I believe that people are nicer than they really are."

I tried to explain that if you say, "People are bad," that means you believe people are bad, and if you say, "I believe people are nice", that means you believe you believe people are nice.  So saying "People are bad and I believe people are nice" means you believe people are bad but you believe you believe people are nice.

I quoted to her:

  "If there were a verb meaning 'to believe falsely', it would not have any
  significant first person, present indicative."
          —Ludwig Wittgenstein

She said, smiling, "Yes, I believe people are nicer than, in fact, they are.  I just thought I should put it that way for you."

  "I reckon Granny ought to have a good look at you, Walter," said Nanny.  "I reckon
  your mind's all tangled up like a ball of string what's been dropped."
          —Terry Pratchett, Maskerade

And I can type out the words, "Well, I guess she didn't believe that her reasoning ought to be consistent under reflection," but I'm still having trouble coming to grips with it.

I can see the pattern in the words coming out of her lips, but I can't understand the mind behind them on an empathic level.  I can imagine myself into the shoes of baby-eating aliens and the Lady 3rd Kiritsugu, but I cannot imagine what it is like to be her.  Or maybe I just don't want to?

This is why intelligent people only have a certain amount of time (measured in subjective time spent thinking about religion) to become atheists.  After a certain point, if you're smart, have spent time thinking about and defending your religion, and still haven't escaped the grip of Dark Side Epistemology, the inside of your mind ends up as an Escher painting.

(One of the other few moments that gave her pause—I mention this, in case you have occasion to use it—is when she was talking about how it's good to believe that someone cares whether you do right or wrong—not, of course, talking about how there actually is a God who cares whether you do right or wrong, this proposition is not part of her religion—

And I said, "But I care whether you do right or wrong.  So what you're saying is that this isn't enough, and you also need to believe in something above humanity that cares whether you do right or wrong."  So that stopped her, for a bit, because of course she'd never thought of it in those terms before.  Just a standard application of the nonstandard toolbox.)

Later on, at one point, I was asking her if it would be good to do anything differently if there definitely was no God, and this time, she answered, "No."

"So," I said incredulously, "if God exists or doesn't exist, that has absolutely no effect on how it would be good for people to think or act?  I think even a rabbi would look a little askance at that."

Her religion seems to now consist entirely of the worship of worship.  As the true believers of older times might have believed that an all-seeing father would save them, she now believes that belief in God will save her.

After she said "I believe people are nicer than they are," I asked, "So, are you consistently surprised when people undershoot your expectations?"  There was a long silence, and then, slowly:  "Well... am I surprised when people... undershoot my expectations?"

I didn't understand this pause at the time.  I'd intended it to suggest that if she was constantly disappointed by reality, then this was a downside of believing falsely.   But she seemed, instead, to be taken aback at the implications of not being surprised.

I now realize that the whole essence of her philosophy was her belief that she had deceived herself, and the possibility that her estimates of other people were actually accurate, threatened the Dark Side Epistemology that she had built around beliefs such as "I benefit from believing people are nicer than they actually are."

She has taken the old idol off its throne, and replaced it with an explicit worship of the Dark Side Epistemology that was once invented to defend the idol; she worships her own attempt at self-deception.  The attempt failed, but she is honestly unaware of this.

And so humanity's token guardians of sanity (motto: "pooping your deranged little party since Epicurus") must now fight the active worship of self-deception—the worship of the supposed benefits of faith, in place of God.

This actually explains a fact about myself that I didn't really understand earlier—the reason why I'm annoyed when people talk as if self-deception is easy, and why I write entire blog posts arguing that making a deliberate choice to believe the sky is green is harder to get away with than people seem to think.

It's because—while you can't just choose to believe the sky is green—if you don't realize this fact, then you actually can fool yourself into believing that you've successfully deceived yourself.

And since you then sincerely expect to receive the benefits that you think come from self-deception, you get the same sort of placebo benefit that would actually come from a successful self-deception.

So by going around explaining how hard self-deception is, I'm actually taking direct aim at the placebo benefits that people get from believing that they've deceived themselves, and targeting the new sort of religion that worships only the worship of God.

Will this battle, I wonder, generate a new list of reasons why, not belief, but belief in belief, is itself a good thing?  Why people derive great benefits from worshipping their worship?  Will we have to do this over again with belief in belief in belief and worship of worship of worship?  Or will intelligent theists finally just give up on that line of argument?

I wish I could believe that no one could possibly believe in belief in belief in belief, but the Zombie World argument in philosophy has gotten even more tangled than this and its proponents still haven't abandoned it.

I await the eager defenses of belief in belief in the comments, but I wonder if anyone would care to jump ahead of the game and defend belief in belief in belief?  Might as well go ahead and get it over with.

114 comments


comment by Tyrrell_McAllister · 2009-03-05T22:29:00.603Z · LW(p) · GW(p)

I don't know how well you know this person, so my advice may be unnecessary. But your post gives me the impression that you need to be much more careful about speculating on how her mind works. I think that it's a red flag when you write first that

I can see the pattern in the words coming out of her lips, but I can't understand the mind behind them on an empathic level. I can imagine myself into the shoes of baby-eating aliens and the Lady 3rd Kiritsugu, but I cannot imagine what it is like to be her.

. . . and then proceed to make apparently confident declarations about how her mind works, such as

I now realize that the whole essence of her philosophy was her belief that she had deceived herself, and the possibility that her estimates of other people were actually accurate, threatened the Dark Side Epistemology that she had built around beliefs such as "I benefit from believing people are nicer than they actually are."

She has taken the old idol off its throne, and replaced it with an explicit worship of the Dark Side Epistemology that was once invented to defend the idol; she worships her own attempt at self-deception. The attempt failed, but she is honestly unaware of this.

As you yourself have observed, we largely understand other people by taking a portion of our own black-box mind, plugging in a few explicit settings (such as beliefs or experiences), letting the model run for a bit, and seeing what pops out. In particular, to understand how another person makes judgments, we collect their evinced beliefs, try to twiddle some dials until our model expresses the same beliefs, and then let it run for a bit. We then try to peer into the model as best we can, getting as good a picture of its inner workings as introspection allows us. We then take this picture as our hypothesis about how the other person thinks.

But the first quote above is strong evidence that your mind works differently from hers in some highly relevant respects. Therefore, you should be highly skeptical that what is going on in her mind resembles what it took to make the model of her in your own mind match her utterances. But you give me the impression that you haven't been sufficiently skeptical of the match between her mind and your model of it. I think that this has led you astray on several points.

For example, based on what you've written, I don't think that you're using the right model to understand what was going on in her mind when she said, "I believe that people are nicer than they really are." You were led to this confusion because she was not using the word "believe" in the way that you, and your model of her, do. You are using "belief" to mean a feature of a model of how the world is. But that, I expect, is not what she meant. Thus, your remarks here --

I tried to explain that if you say, "People are bad," that means you believe people are bad, and if you say, "I believe people are nice", that means you believe you believe people are nice. So saying "People are bad and I believe people are nice" means you believe people are bad but you believe you believe people are nice.

-- were irrelevant because they do not apply to the sense of the word "believe" that she was using.

For what it's worth, in my model of her, when she said "I believe that people are nicer than they really are," she meant, "When I reflect on my emotional attitude towards people, I see that this attitude is of the sort that, in the absence of its actual cause, could have been caused by a falsely high belief (in your sense) about peoples' niceness."

The actual cause for her emotional attitude is perhaps her "religion". Or perhaps it is something else. Perhaps she has no idea what the actual cause is, or perhaps she thinks she does, but she doesn't really. But none of this implies that she was attributing to herself the belief that people are nicer than she actually believes them to be (where, here, I'm using "belief" in your sense.)

Her utterance seems analogous to someone who walks out of an optometrist's office after having his pupils dilated and says, "Because of those drops the optometrist gave me, I believe the sun is brighter than it really is." If we heard this, we shouldn't conclude that he believes something contradictory, or that he has incorrect beliefs about his beliefs. His word "belief" in this case probably does not mean "best guess about how things really are." Rather, it's a clumsy way to say that some qualities of his experience of the world are as if he had a certain belief (in the sense normally understood). He does not mean to imply that he has any wrong beliefs (in the conventional sense). It would be a mistake to say that his subjective experience of the light is in any way erroneous. After all, it accurately reflects the fact that he had those drops put in his eyes.

Similarly, your interlocutor's statement that she "believes" that people are nicer than they really are referred to a particular quality of her emotional attitude towards them, not to a belief (in your sense) about how they are. In particular, it didn't imply any expectation about how they would behave. That, I expect, is why she was initially taken aback when you asked, "So, are you consistently surprised when people undershoot your expectations?" The problem wasn't, as you appear to think, that she had prevented her own mind from drawing obvious conclusions. The problem was that you (because of her confusing wording) were speaking of her so-called "belief" as though it were a belief in the normal sense, something that should lead to certain expectations about other peoples' actions. But I expect that it wasn't any such thing, notwithstanding her unfortunate choice of words.

Replies from: Eliezer_Yudkowsky, billswift, DimitriK
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-06T00:02:26.944Z · LW(p) · GW(p)

For what it's worth, in my model of her, when she said "I believe that people are nicer than they really are," she meant, "When I reflect on my emotional attitude towards people, I see that this attitude is of the sort that, in the absence of its actual cause, could have been caused by a falsely high belief (in your sense) about peoples' niceness."

An interesting hypothesis, Tyrrell; but she explicitly explained to me about how, if you think people are nicer than they really are, then this makes you happier.

Replies from: Tyrrell_McAllister
comment by Tyrrell_McAllister · 2009-03-06T09:15:27.179Z · LW(p) · GW(p)

You're right to call it a mere hypothesis. I hope that I made its tentative nature clear.

But that explanation of hers seems to me to be consistent with my hypothesis. No surprise, because it was part of the data that I was trying to fit when I constructed it.

I would be curious to know more about how she responded when you asked her, "So, are you consistently surprised when people undershoot your expectations?" Did she have anything more to say after repeating the question?

Replies from: Yasser_Elassal
comment by Yasser_Elassal · 2009-03-06T19:52:47.404Z · LW(p) · GW(p)

My hypothesis is that she simply meant, "It makes me happy to pretend that people are nicer than they really are."

comment by billswift · 2009-03-06T00:34:13.009Z · LW(p) · GW(p)

I think the first time Eliezer said he couldn't get into her mind, he meant that he couldn't understand the psychological state she needed to be in to make that statement. The second time - where he was writing about what she believed - he was discussing her apparent epistemological state.

There are significant differences between the two for observers. I can almost never understand someone else's psychological state, but I can often figure out what they are talking about and how they got there epistemologically - that is, what could have caused their stated beliefs.

comment by DimitriK · 2014-11-16T18:05:27.501Z · LW(p) · GW(p)

When I read "i believe people are nicer than they really are" I got the impression her meaning was along the lines of "people are nicer inside than their actions. On reflection, this might be because that's what I believe. It ties in to fundamental attribution error. Peoples actions are based so much on environment and circumstance that if you had a way to truly look into a person I think you'd see a better person than you would have guessed if you only looked at their actions. Most people don't see themselves as evil. They do things we see as evil but in their heads they are doing what they think is good.

I'd be interested in hearing what exactly she said that brought on your analysis, Eliezer. I realise it was a long time ago, and I'm not likely to get a reply anyway, but it seems likely to me that her statement came from an intuitive belief in the fundamental attribution error. I know I held that belief long before I first encountered it in HPMoR, so it's possible for her too.

Replies from: Tyrrell_McAllister
comment by Tyrrell_McAllister · 2014-11-17T00:42:52.949Z · LW(p) · GW(p)

I think that there's a better chance that he'll see your comment if you reply directly to the post rather than to another comment. At least, I think that that's how it works.

comment by pjeby · 2009-03-06T04:55:13.303Z · LW(p) · GW(p)

I'll go one step further and defend belief in belief, infinitely regressed. ;-) As you point out, the placebo effect here is simply the expectation of a positive result -- and it applies equally at any level of recursion here.

Humans only need a convincing argument for predicting a positive result, not a rational proof of that prediction! Once the positive result is expected, we get positive emotions activated every time we think of anything linked to that result, leading to self-fulfilling prophecies on every level.

This being the case, one might question whether it's rational to disbelieve in belief, if you have nothing equally beneficial to replace it with.

When it comes to external results, sure, it makes sense to have greater prediction accuracy. But for interior events -- like confidence, creativity, self-esteem, etc. -- biasing one's predictions positively is a significant advantage, as it stabilizes what would otherwise be an unstable system of runaway feedback loops.

People whose systems are negatively biased, on the other hand, can get seriously stuck. They basically hit one little setback and become paralyzed because of runaway negative self-fulfilling prophecy.

(I've been such a person myself, and I've worked with/on many of them. Indeed, it was noticing that other, far less "rational" and "intelligent" individuals were much more confident, calm, and successful than I was, that led me to start seriously investigating the nature of mind and beliefs in the first place, and to begin noting the distinctions between people I dubbed "naturally successful" and those I considered "naturally struggling".)

comment by findis · 2012-12-29T19:29:34.276Z · LW(p) · GW(p)

I await the eager defenses of belief in belief in the comments, but I wonder if anyone would care to jump ahead of the game and defend belief in belief in belief? Might as well go ahead and get it over with.

My boyfriend was once feeling a bit tired and unmotivated for a few months (probably mild depression), and he also wanted to stop eating dairy for ethical reasons. He felt that his illness was partly mentally generated. He decided that he was allergic to dairy, and that dairy was causing his illness. Then he stopped eating dairy and felt better!

He told me all this, and also told me that he usually believes he is actually allergic to dairy, and it is hard to remember that he is not. When someone asks how he knows he is allergic to dairy, he says something plausible and false ("The doctor ran blood tests") and believes it if he doesn't stop and think too much.

He believes he is not allergic to dairy, but he believes he believes he is allergic to dairy? Belief-in-belief. But he recognizes this and explained it to me -- so that's a belief-in-belief-in-belief? But it helped him get over his mental illness and stop eating dairy... that's winning.

In general I would say a belief-in-belief is useful if you decide some behaviors are desirable, but some false model of the world better motivates you to behave properly. Belief-in-belief-in-belief is useful if you know too much to think both "Z is true" and "I believe not-Z". Then you tell yourself you have a belief-in-belief.

Disclaimer: This is weird to me and I don't really understand how he pulls it off.

comment by Marcello · 2009-03-05T18:23:36.571Z · LW(p) · GW(p)

If I had been talking to the person you were talking to, I might have said something like this:

Why are you deceiving yourself into believing Orthodox Judaism as opposed to something else? If you, in fact, are deriving a benefit from deceiving yourself, while at the same time being aware that you are deceiving yourself, then why haven't you optimized your deceptions into something other than an off-the-shelf religion by now? Have you ever really asked yourself the question: "What is the set of things that I would derive the most benefit from falsely believing?" Now if you really think you can make your life better by deceiving yourself, and you haven't really thought carefully about what the exact set of things about which you would be better off deceiving yourself is, then it would seem unlikely that you've actually got the optimal set of self-deceptions in your brain. In particular, this means that it's probably a bad idea to deceive yourself into thinking that your present set of self deceptions is optimal, so please don't do that.

OK, now do you agree that finding the optimal set of self deceptions is a good idea? OK, good, but I have to give you one very important warning. If you actually want to have the optimal set of self deceptions, you'd better not deceive yourself at all while you are constructing this set of self deceptions, or you'll probably get it wrong, because if, for example, you are currently sub-optimally deceiving yourself into believing that it is good to believe X, then you may end up deceiving yourself into actually believing X, even if that's a bad idea. So don't self deceive while you're trying to figure out what to deceive yourself of.

Therefore, to the extent that you are in control of your self deceptions (which you do seem to be), the first step toward getting the best set of self deceptions is to disable them all and begin a process of sincere inquiry as to what beliefs it is a good idea to have.


And hopefully, at the end of the process of sincere inquiry, they discover the best set of self deceptions happens to be empty. And if they don't, if they actually thought it through with the highest epistemic standards, and even considered epistemic arguments such as honesty being one's last defence, slashed tires, and all that.... Well, I'd be pretty surprised, but if I were actually shown that argument, and it actually did conform to the highest epistemic standards.... Maybe, provided it's more likely that the argument was actually that good, as opposed to my just being deceived, I'd even concede.

Disclaimer: I don't actually expect this to work with high confidence, because this sort of person might not actually be able to do a sincere inquiry. Regardless, if this sort of thought got stuck in their head, it could at least increase their cognitive dissonance, which might be a step on the road to recovery.

Replies from: Roko, Eliezer_Yudkowsky, pre
comment by Roko · 2009-03-05T18:30:37.447Z · LW(p) · GW(p)

"Disclaimer: I don't actually expect this to work with high confidence, because this sort of person might not actually be able to do a sincere inquiry."

  • well exactly... If the person were thinking rationally enough to contemplate that argument, they really wouldn't need it.

I have never successfully converted a religious person to atheism, but my ex-girlfriend did. I am a more rational person than her, I know more philosophy, I have earnestly tried many times, she just did this once, etc. How did she do it? The person in question was male and his religion forbade him from sex outside marriage. Most people are mostly ruled by their emotions.

Replies from: Marcello, Annoyance, eirenicon
comment by Marcello · 2009-03-06T00:47:39.927Z · LW(p) · GW(p)

well exactly... If the person were thinking rationally enough to contemplate that argument, they really wouldn't need it.

My working model of this person was that the person has rehearsed emotional and argumentative defenses to protect their belief, or belief in belief, and that the person had the ability to be reasonably rational in other domains where they weren't trying to be irrational. It therefore seemed to me that one strategy (while still dicey) to attempt to unconvince such a person would be to come up with an argument which is both:

  • Solid (Fooling/manipulating them into thinking the truth is bad cognitive citizenship, and won't work anyway because their defenses will find the weakness in the argument.)

  • Not the same shape as the argument their defenses are expecting.

Roko: How is your working model of the person different from mine?

Replies from: Roko
comment by Roko · 2009-03-06T16:17:46.572Z · LW(p) · GW(p)

My working model of a religious person such as the above is that they assess any argument first and foremost on the basis "will accepting this argument cause me to have to abandon my religious belief?". If yes, execute "search for least implausible counterargument".

As such, no rational argument whose conclusion obviously leads to the abandonment of religion will work. However, rational arguments that can be accepted on the spot without obviously threatening religion, and which lead via hard-to-predict emotional channels to the weakening and defeat of that belief might work. It is my suspicion that persuading someone to change their mind on a really important issue almost always works like this.

comment by Annoyance · 2009-03-05T19:35:46.158Z · LW(p) · GW(p)

"she just did this once, etc. How did she do it? "

By appealing to a non-rational or irrational argument that would lead the person to adopt rationality.

Arguing rationally with a person who isn't rational that they should take up the process is a waste of time. If it would work, it wouldn't be necessary. It's easy to say what course should be taken with a rational person, because rational thought is all alike. Irrational thought patterns can be nearly anything, so there's no way to specify an argument that will convince everyone. You'd need to construct an argument that each person is specifically vulnerable to.

Replies from: billswift, David_Gerard
comment by billswift · 2009-03-06T00:49:40.466Z · LW(p) · GW(p)

The problem is that you often don't know, until you actually start arguing with them, whether they are irrational or just confused and misled.

George H. Smith has a pretty good essay about arguing with people to convert them to rationality, "Atheism and the Virtue of Reasonableness". For example, he advocates the "Presumption of Rationality" - you should always presume your adversary is rational until he demonstrates otherwise. I don't know if the essay is on-line or not; I read it as the second chapter of "Atheism, Ayn Rand, and Other Heresies."

comment by David_Gerard · 2011-02-22T11:24:45.265Z · LW(p) · GW(p)

Irrational thought patterns can be nearly anything, so there's no way to specify an argument that will convince everyone. You'd need to construct an argument that each person is specifically vulnerable to.

Irrational thought patterns can be nearly anything, but of course they strongly tend to form around standard human cognitive biases. This saves a great deal of time.

comment by eirenicon · 2009-03-05T19:17:39.565Z · LW(p) · GW(p)

"Most people are mostly ruled by their emotions."

To be more specific, most men, for a considerable portion of their lives, are mostly ruled by their sex drives.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-05T18:54:33.401Z · LW(p) · GW(p)

To be clear, she never did say, "I am deceiving myself" or "I falsely believe that there is a God".

Replies from: Marcello
comment by Marcello · 2009-03-06T00:16:49.797Z · LW(p) · GW(p)

I stand corrected. I hereby strike the first two sentences.

comment by pre · 2009-03-05T20:00:09.287Z · LW(p) · GW(p)

I would expect a reply along the lines of: It's precisely because I can't trust my own reasoning when deciding which false beliefs I should have that I accept those which are handed down. I pick Judaism because it's the oldest and thus has shown through memetic competition that it's the strongest set of false beliefs one could have.

Or... "I pick Christianity because it's the most popular and has therefore proven itself memetically competitive."

I have a lot of friends who think "it's old therefore it must be good to have survived this long" about Tarot and eastern religions etc.

Personally I'd wanna eliminate the false beliefs even if it cost me my mojo, but that's a different set of priorities I guess.

Replies from: David_Gerard
comment by David_Gerard · 2011-02-22T11:16:38.019Z · LW(p) · GW(p)

I have a lot of friends who think "it's old therefore it must be good to have survived this long" about Tarot and eastern religions etc.

In fact, the argument from tradition is considered very strong in alternative medicine in particular and New Age culture in general, even if whatever it is was actually made up^W^Wrediscovered last week.

comment by RobinHanson · 2009-03-05T16:29:59.032Z · LW(p) · GW(p)

Consistent consciously intended self-deception may be hard. But our minds are designed to produce self-deceptions all the time without us noticing. Just don't look behind the curtain and "let it be", "go with the flow" etc. and you can be as self-deceived as most folks.

comment by kilobug · 2011-09-15T13:24:12.986Z · LW(p) · GW(p)

"I believe that people are nicer than they really are." That part made me ponder. Because, actually, it's something I believe, too. So I froze for a while, and looked at that belief. Do I have Escher loops in my belief networks ? Well, maybe, I'm far from being a perfect bayesian, but I can't allow myself to stop here.

My first justification for that thought was: I don't refer to the same thing in the two parts of the sentence. A bit like "sound" can refer to acoustic vibrations, or to a perception, and if you switch from one to the other within the same sentence, you can make a sentence that seems self-contradicting but is still valid.

"People" is a vast group. Nice or not is a characteristic of a person. So, to attribute "niceness" to people in general, you have to make an aggregate value. There are many ways to make an aggregate value, for example, mean and median. So that sentence could mean something like "I believe the median people to be nicer than the average people" (implying a minority of very un-nice people who drag the average down, but don't change the median).
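
To make that concrete with toy numbers (invented purely for illustration): a single very un-nice outlier can pull the mean well below the median.

  from statistics import mean, median

  # Toy niceness scores for ten people, invented for illustration;
  # one very un-nice outlier drags the mean down without moving the median.
  niceness = [7, 7, 8, 8, 8, 8, 9, 9, 9, -20]

  print(mean(niceness))    # 5.3 - the average is pulled down by the outlier
  print(median(niceness))  # 8.0 - the typical (median) person still scores high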

But then I thought "Hey, stop. You're trying to find excuses here. That's not really what you meant with that sentence, or you would have said it clearly. Don't find yourself excuses, just face the fact you were doing knots with your believes."

So I tried to dig into where this knot could have come from. And I think I found it; it's linked to the first excuse, but not as simple. The problem comes from the fact that I use different algorithms to evaluate the niceness of a single person I'm interacting with (be it a friend or family member, or just a passerby asking me "what time is it?") and to evaluate the overall niceness of "people" in general.

When I evaluate the niceness of people in general, I think about the horrors of history, about the Milgram experiment, about the crimes the news loves to report, the scary statistics about the number of husbands who hit their wives... And also about the "heroes", those who risked their lives to hide Jews they didn't know during WW2, those who ran into a burning house to save their neighbor. That gives me a mixed image of people in general, neither very nice nor very un-nice, capable of the best and of the worst.

When I evaluate niceness in one single person interacting with me, I tend instead to recall my own interactions with individual people. And among those, I have a few bad memories (like once being mugged for my money and cell phone) but mostly good ones; whether by luck or by selective memory, most of the interactions with others I can remember were positive. So when I interact with a new person, I assume there is a high chance of that person being "nice", even if I have a more mixed view of humans in general.

That probably comes from deeper, evolutionary-psychology reasons: the individuals you interact with are your tribe, so they are friendly; people in general are other tribes, not so friendly. But I'm not well versed enough in evolutionary psychology to go further down that line. Anyway, that's, I think, where the contradiction comes from. It may be partly justified by the fact that the median is higher than the average, if it really is (of which I have no factual evidence, only a vague feeling). But it mostly comes from using two different algorithms, which should, in a well-calibrated brain, lead to the same result, but which for many reasons (all the biases, imperfect knowledge, ...) just don't.

Replies from: Oscar_Cunningham
comment by Oscar_Cunningham · 2011-09-15T14:10:05.458Z · LW(p) · GW(p)

"I believe the median people to be nicer than the average people"

But if you'd actually meant this you'd have just said "The median people are nicer than the average people". Saying "I believe the median people to be nicer than the average people" would indicate that you didn't believe it but did believe you believed it.

Replies from: wedrifid, None
comment by wedrifid · 2011-09-15T15:55:17.202Z · LW(p) · GW(p)

But if you'd actually meant this you'd have just said "The median people are nicer than the average people". Saying "I believe the median people to be nicer than the average people" would indicate that you didn't believe it but did believe you believed it.

I don't quite agree there. Saying "I believe the median people to be nicer than the average people" indicates that you believe that you believe it but it doesn't indicate that you don't actually believe it. You could say it is neutral with respect to whether or not you actually believe it but not that it indicates outright that you don't.

Replies from: Oscar_Cunningham
comment by Oscar_Cunningham · 2011-09-15T16:30:29.778Z · LW(p) · GW(p)

Indeed, but it does hint that you don't actually believe it, otherwise you would have said the simpler thing.

Replies from: thomblake
comment by thomblake · 2011-09-15T16:38:41.358Z · LW(p) · GW(p)

I disagree. In general, saying "I believe x" is evidence that you believe x, and therefore cannot be evidence that you do not believe x. I would be interested to see evidence that people usually use "I believe x" in such a way that it can be taken as evidence that one does not believe x.

I believe that people usually use "I believe x" instead of "x" in cases where they want to stress the possibility, however small, that they are wrong. Usual caveats for religious and "I believe in" statements, as well as unrelated senses of 'believe', apply.

Replies from: kilobug
comment by kilobug · 2011-09-15T17:45:03.875Z · LW(p) · GW(p)

Yes, that distinction definitely applies to me. Usually when I say "X" it means "I believe X with almost certainty" and when I say "I believe X" indicates that there is some doubt still, maybe a 90% confidence, but not a 99% confidence.

But in that specific case, as Misha said, I didn't need to actually believe it - it was a belief in belief in my chain of thoughts, an attempt to rationalize the initial mistake that, on further analysis, turned out not to be its real cause. Having it as a real belief or not wouldn't change the reasoning.

comment by [deleted] · 2011-09-15T14:58:36.569Z · LW(p) · GW(p)

And this is, in fact, part of kilobug's point.

But then I thought "Hey, stop. You're trying to find excuses here. That's not really what you meant with that sentence, or you would have said it clearly. Don't find yourself excuses, just face the fact you were doing knots with your believes."

(while we're on the subject, the plural of belief is "beliefs", contrary to all reason)

comment by pure-awesome · 2012-06-09T11:10:44.751Z · LW(p) · GW(p)

"I wish I could believe that no one could possibly believe in belief in belief in belief..."

You wish you could believe, Eliezer? Is this a deliberate stroke of irony, or a subconscious hint at the fact that you do have an empathic understanding of the thought processes behind tailoring your own beliefs?

Replies from: hannahelisabeth
comment by hannahelisabeth · 2012-11-12T09:49:56.294Z · LW(p) · GW(p)

I think the idea behind this is that he wishes reality played out in such a way that, to a rational observer, it would engender belief. It's a roundabout way of saying "I wish reality were such that..."

comment by Psy-Kosh · 2009-03-05T22:28:46.650Z · LW(p) · GW(p)

Hrm... While on the one hand I can look at her position and basically react with a "your mind is entirely alien to me", on the other hand, I can actually imagine being in that state.

That does NOT mean, of course, that it is a reasonable state to be in, but it does seem to be the sort of state that my mind can support.

I guess the basic key is that human minds aren't necessarily naturally consistent. So we can end up in actual inconsistent states. Including states a bit confused about consistency itself.

A bit more of a personal example would be a state I sometimes recall having been in, in the past, and have certainly seen in others: when one might say something like, oh, I dunno, "scientifically, the universe is about 13.7 billion years old and Earth is about 4.5 billion years old," and, of course, "the world was created about 6000 years ago."

As near as I can tell, what happens is that we almost imagine the "scientific world" and the "religious world" as parallel universes that... are actually the same one, so mentally we keep track of it by keeping track of different things.

The way this works is that someone might manage to end up in a state where they completely fail to really face the question of "okay, but if you rewind time a bit, will you see the universe poofing into existence 6000 years ago, or can you go farther back, etc? ie, what ACTUALLY happened in ACTUAL REALITY?"

Then, when facing that question, all sorts of Escher mentality stuff starts forming as a defense. But what I think initially happens, at least in part, is sort of mentally tracking those as being about different subjects, rather than contradictory statements about the same thing. So that one will end up, with "science glasses", visualizing prehistoric humans doing stuff tens of thousands of years ago, while etc etc etc...

At least, that's my own, partly introspective model of what's going on here, of how people can end up in these states.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-06T00:05:03.797Z · LW(p) · GW(p)

I think that people who had actual mental models of the world would notice a contradiction that large.

People who profess two different beliefs may not see a contradiction. It's just good to profess one, and also good to profess the other, for different reasons. They aren't visualizing a world that, at one time or another, needs to either poof or go on. They're visualizing that "science" and "religion" both seem like good groups to join.

Replies from: Psy-Kosh, Swimmer963
comment by Psy-Kosh · 2009-03-06T01:13:03.462Z · LW(p) · GW(p)

I think that may be part of it, but I'm also thinking back a bit to when I was more religious, and so on, and also thinking about how some people I know seem to talk, and as near as I can tell, there really does seem to be a bit of that.

I'm claiming they're visualizing a world that goes "poof, 'LET THERE BE LIGHT!'" AND visualizing a world that goes farther back, and somehow doing some form of funny doublethink, thinking of those as different worlds that are both in some sense true, while some aspect of them treats those not as contradictory models, but almost as, well, different worlds. ie, two different "truths" ("but what is truth?" :))

That is, simply holding the contradiction in place, having two "models" - not along the lines of two competing models, but (though they don't actually notice it) imagining them more as parallel worlds that, depending on circumstances, they'll consider either one or the other to be "this world".

They would (usually, see somewhat below) not ever actually say, or even notice, that they're thinking that way. In other words, I'd expect that if you asked such a person something like "do you believe in a set of parallel realities, one in which the world was spoken into existence ~6000 years ago, and another about 13.7 billion years old, or at least certainly older than 6000 years", they'll probably give you funny looks. But I think, without them noticing, something like that is going on in how it's being stored.

And I can speak from personal experience about some of the REALLY weird stuff I used to think in terms of, so it's in part a "pay no attention to the contradiction behind the curtain" situation.

Heck, sometimes when I bring various contradictions up, I'll get responses like "this isn't a debate class" or "this isn't a court room and you're not a lawyer", and basically have it laughed off like that from some family members. (and, of course, the infamous "in your opinion" fully general retort to any position you don't like. :))

I'm not saying this is all of it, but it sure seems to me that something like this is going on in some cases. It may also be what underlies stuff like "I believe people are nicer than they are". That is, statements like that may partly cash out to "I have a couple different models of people, one of which says they're nicer than the other. I hold both of these at the same time, but I call one my belief, and one the actual situation"

At least, when I try to imagine being in a mental state that could provoke me to utter such a statement, ie, when I try to simulate that state on myself, that seems to be what the result "looks like."

Oh, that bit from earlier, well... sometimes it's made a bit explicit.

I've come across some bits of occult philosophy that basically talk about how there can be many histories that are "true" (no, not in the sense a physicist might talk about interference), and they'll explicitly say stuff like "there's the actual historical history, but that's not the only 'true' one..."

But also just from introspection, well, it does feel to me that in the past I would be in such a state, having multiple models that I wasn't treating as competing so much as treating as, well, simply true, in different senses.

The Escher mental tangle can get REALLY strange. :)

comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-02-22T16:25:03.972Z · LW(p) · GW(p)

From what I've seen, fundamentalist Christians (this is the only group I've had a chance to speak to) often see the contradiction, and are PROUD of their ability to believe on 'pure faith' despite it. As if it's some kind of accomplishment to say 'wow, god is so powerful that he can even overcome THAT'.

I don't know how far they carry through in creating mental models of the world, but I know that their expectations of a world with God in it are VERY different from a world without god, i.e. the node is included in their models. This is a particular religious group where receiving "prophetic words" and visions is common, and the people I knew based their expectations on what "God" said to them in these visions. And were sometimes sorely disappointed, but their 'faith' never seemed to be affected.

At the start, they seemed as alien to me as the woman you're describing seemed to you. After befriending some people in this group, I started to understand the geometry of their minds a little bit more. This was nearly a year ago, though, so I have trouble explaining the insights I've had because my mind has gone back to 'how could anybody be that STUPID?'

comment by [deleted] · 2012-08-13T01:32:23.762Z · LW(p) · GW(p)

I am so much a one-level person that my sense of social insincerity has atrophied.

Rational straight man syndrome. So much a truth-finder you forget how to not speak the truth.

Replies from: algekalipso
comment by algekalipso · 2013-05-02T07:45:55.330Z · LW(p) · GW(p)

This seems to be associated with higher-than-average testosterone levels. If you inject testosterone into a random man, he will be very prone to not lying and to being overly straightforward.

Replies from: None
comment by [deleted] · 2013-05-03T11:47:18.528Z · LW(p) · GW(p)

Sources?

Maybe I should get a blood test.

Replies from: algekalipso, tondwalkar
comment by tondwalkar · 2013-05-23T02:23:18.991Z · LW(p) · GW(p)

I suffer the same symptom. (and have an excessive amount of body hair, not that that's more than negligibly indicative of high testosterone levels)

What's the cheapest/easiest way to get tested? (more out of curiosity than anything else)

Replies from: None
comment by [deleted] · 2013-05-24T00:20:38.753Z · LW(p) · GW(p)

If I understand it correctly, you can go to your physician and ask for it. The test itself is quick, requires a blood sample, and I don't think it is very expensive.

comment by Roko · 2009-03-05T18:00:49.109Z · LW(p) · GW(p)

"And so humanity's token guardians of sanity (motto: "pooping your deranged little party since Epicurus") must now fight the active worship of self-deception - the worship of the supposed benefits of faith, in place of God."

  • As I keep saying, helping people to overcome biases (such as the above) is a lot easier if there are psychologically viable places for people to jump to once they've overcome their bias.

You should have spent much more of your time in this debate convincing your tangled friend that, if she were to abandon her religious belief (or belief in belief, or whatever), she would still be able to feel good about herself and good about life; that life would still be a happy meaningful place to be.

Maybe she has a massive internal guilt complex and thinks of herself as a bad person, and she thinks that only religion can help her with this. Maybe she is frightened that atheism will lead to nihilism.

Replies from: PeteG
comment by PeteG · 2009-03-05T18:55:40.808Z · LW(p) · GW(p)

"You should have spent much more of your time in this debate convincing your tangled friend that, if she were to abandon her religious belief (or belief in belief, or whatever), she would still be able to feel good about herself and good about life; that life would still be a happy meaningful place to be."

I don't think Eliezer cared so much about correcting someone's one wrong belief as about correcting the core that makes many such beliefs persist. Would he really have helped her if all his rational arguments failed, but his emotional one succeeded? My guess is that it wouldn't be a win for him or her.

Replies from: Roko
comment by Roko · 2009-03-05T19:06:15.742Z · LW(p) · GW(p)

Well that depends on whether your aim is to make people have correct beliefs, or whether you want to make people have correct beliefs by following the ritual of rational argument... and I think that EY would claim to be aiming for the former.

Replies from: Annoyance
comment by Annoyance · 2009-03-05T19:14:57.259Z · LW(p) · GW(p)

What use is it to have correct beliefs if you don't know they're correct?

If the belief cannot be conveniently tested empirically, or it would be useless to do so, the only way we can know that our belief is correct is by being confident of the methodology through which we reached it.

Replies from: Cameron_Taylor
comment by Cameron_Taylor · 2009-03-05T19:45:54.665Z · LW(p) · GW(p)

What use is it to have correct beliefs if you don't know they're correct?

When I'm fleeing through an ancient temple with my trusty whip at my side, and I come to a fork in the road, I'll take the path I believe leads to safety. This will turn out to be a wise choice, because the other one would lead me to a pit full of snakes, falling boulders and almost certainly walls that slowly but surely move closer and closer. That's the sound of inevitability.

I naturally prefer having enough evidence to be confident in my beliefs. Given time I would definitely look up the trusty map I was given of the doom-riddled temple. I'd also get someone else to go through ahead of me just to make sure. However, my beliefs will inevitably determine what decisions I make.

To be honest, I am a little confused about what that question means. It makes no sense to me, although I can see that someone could conceivably wrangle their mind into that incoherent state. If they believe, but apparently don't know that they believe, then I assume that all their decisions are made in accordance with that belief but that they will describe their belief as though they are not confident in it.

Replies from: Annoyance
comment by Annoyance · 2009-03-05T19:52:57.361Z · LW(p) · GW(p)

"I naturally prefer to have a high level of confidence in my beliefs."

Doesn't that depend on how reliable those beliefs are?

If you're fleeing through the temple pursued by a boulder, you don't want to dither at an intersection, so whichever direction you think you should go at one moment should be constant. But there's no reason why your confidence should be high to avoid dithering; you need merely be stable.

"'ll take the path I belief leads to safety. This will turn out to be a wise choice"

If, and only if, your belief is correct. If your belief is wrong your choice is a disastrous one. Rationality isn't about being right or choosing the best course, it's about knowing that you're right and knowing which is the best course to choose.

Replies from: Cameron_Taylor
comment by Cameron_Taylor · 2009-03-05T19:58:09.375Z · LW(p) · GW(p)

Thanks Annoyance, I replaced 'have a high level of confidence' with 'having enough evidence to be confident'. That makes my intended meaning clearer.

Replies from: Annoyance
comment by Annoyance · 2009-03-05T20:10:56.240Z · LW(p) · GW(p)

Then I think I agree with you, mostly. If time or a similar limited resource makes rigorous justification too expensive, we shouldn't require it. But whatever we do accept should be minimally justified, even if it's just "I have no idea where to go so I'll pick at random".

I wouldn't look at the map if I were running from the boulder. But I would have looked at it before entering the temple, and you can bet I'd be trying very hard to retrace my steps on the way out, unless I thought I could identify a shortcut. Even then I might not take the gamble.

comment by artsyhonker · 2011-02-22T11:17:42.233Z · LW(p) · GW(p)

As a theist, I don't believe in God because I perceive some positive benefit from that belief. My experiences and perceptions point to the existence of God. Of course those experiences and perceptions may be inaccurate and are subject to my own interpretations, so I can't claim that my beliefs are rational. I accept on an intellectual level that my belief could be wrong. This doesn't seem to enable me to stop believing.

However, I am involved in a religious community because there are positive benefits -- chiefly that of being able to compare notes with other people who share my irrational belief in God and my desire to do good work in the world. I can see that there might be positive benefits in religious communities for non-theists, though I don't really see the point.

Replies from: TheOtherDave, Swimmer963, prase, CuSithBell, None
comment by TheOtherDave · 2011-02-22T15:06:39.484Z · LW(p) · GW(p)

I know several non-theists, including atheists, who belong to religious communities because they value the benefits that such belonging provides. It helps, of course, that they belong to the kinds of religious communities that welcome people like them.

Replies from: taryneast, Swimmer963
comment by taryneast · 2011-03-29T15:07:23.945Z · LW(p) · GW(p)

There are also plenty of non-religious communities that one can belong to. These also provide the "benefits of belonging" without having to be the odd one out (ie the person that doesn't actually follow the one major point of the community itself). Therefore I agree with artsyhonker in not seeing the point. I'd only consider it the rational move if there were no such other communities nearby (or none that were attractive).

Replies from: TheOtherDave
comment by TheOtherDave · 2011-03-29T18:25:55.516Z · LW(p) · GW(p)

Sure. That makes sense, and if it weren't for my actual experience with people who do seem to get benefits from that group membership that they consider worthwhile, despite also being members of other communities, I would agree with this wholeheartedly.

Of course, it's certainly possible that they're all merely confused and not actually getting benefits they value, or that they could be getting all the same benefits from their other groups and somehow don't realize it.

Replies from: taryneast
comment by taryneast · 2011-03-29T19:18:10.347Z · LW(p) · GW(p)

Ah - no - you miss what I was trying to say. They definitely get benefits - not at all confused. I'll try and give an example to explain what I mean - and I'll leave religion out of it for the moment.

Let's say that near to me are the local football club and the local wildlife-walks group. Both of them have thriving communities and are welcoming and interesting people. Thus if I join either one I will be assured of the benefits of belonging to a community.

But let's say that I happen to have absolutely no passion for football, but really enjoy wildlife walks.

So the rational move for me would be to join the wildlife group over the football club - not because there are no benefits to the football club, but because I would get even more out of being in a group where I share passions and interests with the majority of members.

This is kinda what I was driving at. There's nothing wrong with an atheist joining a local Christian group to gain the benefits of community... but if there's another local group that has the same sense of community - but founded around a principle that the atheist actually shares... then they'll probably get even more out of it.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-03-29T20:36:23.068Z · LW(p) · GW(p)

If, in that situation, I observed you evaluating both groups and choosing to join the football club, that observation would increase my confidence that you are obtaining something of value from the football club that you aren't getting elsewhere, even if I have no clue what that might be.

Replies from: taryneast
comment by taryneast · 2011-03-30T10:34:54.177Z · LW(p) · GW(p)

Yup, no argument here. I would be curious to know what it was.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-03-30T13:58:32.773Z · LW(p) · GW(p)

(nods) Me too. The impression I've gotten from conversations with my non-theist friends who belong to religious communities is that they provide a more close-knit and mutually committed community than their secular equivalents. This is especially relevant for those with children.

Replies from: taryneast
comment by taryneast · 2011-03-31T11:15:56.079Z · LW(p) · GW(p)

Yes, I've found that most (but not all) hobby-based communities tend to be fairly loosely constructed. People are expected to hang around for a few years, perhaps, but not really to contribute more than just some passing time.

Exceptions I've found to this rule are: ethnic/expat groups, parenting support groups, and (strangely) some geeky groups: SF/F (in certain cities), and the SCA.

The latter was my biggest surprise, when I joined. There are third-generation SCAdians... some of whom have a fourth generation on the way.

Replies from: wedrifid
comment by wedrifid · 2011-03-31T13:41:50.096Z · LW(p) · GW(p)

SCA?

Replies from: taryneast, Risto_Saarelma
comment by taryneast · 2011-03-31T15:39:11.092Z · LW(p) · GW(p)

The Society for Creative Anachronism

AKA an excuse to have fun dressing up and feasting the night away after a day of hand-to-hand fighting (if that's your wont)... along with a zillion other interesting things to learn and do, with the only caveat being a well-meaning attempt at remaining within the time period of "fall of the roman empire up to and including the early renaissance" (oh, and don't take "renn faire" as a good example... in the SCA everybody is a participant, not a spectator).

Replies from: wedrifid
comment by wedrifid · 2011-03-31T15:49:42.759Z · LW(p) · GW(p)

The hand to hand combat is tempting.

Replies from: taryneast
comment by taryneast · 2011-03-31T17:05:29.282Z · LW(p) · GW(p)

Yep - it brings in most of the (male) converts... whereas the feasting/dancing/singing/cooking is what usually tempts in us womenfolk. This means that it's not only appealing to the geeky types, but actually has an amazingly good gender balance. It also means that you can bring your SO and they will actually have something to do. This is a benefit of community-building not to be overlooked. :)

Replies from: gwern
comment by gwern · 2011-03-31T17:34:13.559Z · LW(p) · GW(p)

So, it's kind of like anime conventions and cosplay then.

Obviously we need to work out how to integrate costumes or cooking into LessWrong meetups...

Replies from: RobinZ, taryneast
comment by RobinZ · 2011-03-31T18:18:42.872Z · LW(p) · GW(p)

Nutrition?

comment by taryneast · 2011-04-01T13:53:46.594Z · LW(p) · GW(p)

:)

Obviously the costumes need integrated paperclips...

comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-02-22T16:29:15.960Z · LW(p) · GW(p)

I was one of those people for a while. I was accepted, I think, because the particular group I hung out with had an overwhelming need to convert people, and couldn't resist a juicy atheist/agnostic specimen like me.

I also sing in a church choir, which is kind of similar except that it's explicit I'm there for the musical education and not the religion.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-02-22T16:54:38.293Z · LW(p) · GW(p)

the particular group I hung out with had an overwhelming need to convert people, and couldn't resist a juicy atheist/agnostic specimen like me.

Ah, that's unfortunate.

As far as I can tell, the religious communities my atheist/agnostic church-going friends belong to consider them full-fledged members of the community no more in need of alteration than anybody else, which seems like a much more honest arrangement.

Though, of course, I have no way of knowing for sure.

comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-02-22T16:32:10.313Z · LW(p) · GW(p)

I've found the same thing–if you want to actually accomplish good things in the world, it seems more rational to attach yourself to a religious community than not. I have my own reasons for believing that it's morally right to help others, but a lot of the non-religious/atheist people my age haven't really thought about this at all, and religious people my age tend to be VERY involved.

comment by prase · 2011-02-22T12:59:32.978Z · LW(p) · GW(p)

I accept on an intellectual level that my belief could be wrong. This doesn't seem to enable me to stop believing.

Of course it doesn't. To accept that your belief can be wrong isn't the same as accepting that it is wrong. The former is a complete triviality (if a person doesn't accept that his particular belief can be wrong, even in principle, either the belief is not a real belief, or the person is seriously irrational). The latter not only may enable you to stop believing, but should force you to do so.

Of course those experiences and perceptions may be inaccurate and are subject to my own interpretations, so I can't claim that my beliefs are rational.

As is true for any experiences of any person - and still, a lot of people strive to have rational beliefs. Your formulation, though, seems to imply that you happily accept being irrational, which leads me to ask: why? Is it because you think that rationality (however you define it) isn't always the best way to arrive at true beliefs? Or because you don't always mind having false beliefs? Or some other reason?

comment by CuSithBell · 2011-02-22T16:08:23.192Z · LW(p) · GW(p)

If it's okay with you, would you mind describing these experiences / perceptions and how they led to your particular beliefs? I'd be quite interested in hearing.

comment by [deleted] · 2014-06-13T06:50:10.040Z · LW(p) · GW(p)

Mundus vult decipi, ergo decipiatur. ("The world wants to be deceived, so let it be deceived.")

comment by Ryan · 2009-03-05T23:10:02.144Z · LW(p) · GW(p)

I know some people who are like the woman you describe; my own folks might be like that to some extent. I became an atheist pretty early on. So I'm not sure that adults who believe in belief are likely to pass that along to their kids, even if they try. In my case, I put on a show for a while, but when I stopped it was no big deal.

If these people are able to agree with a scientific worldview and not be obstructionist on things like stem cell research, but simply want to add "and I believe there is a god" to the end of it, fine. Seems like a natural step towards the end of belief in god entirely.

comment by Emile · 2009-03-05T16:09:22.502Z · LW(p) · GW(p)

To further illustrate the point that self-deception isn't easy: if you believe you're shy, you can't just make yourself believe you're not shy.

Maybe you can make yourself believe that you believe you're not shy, but I don't think you'll reap many benefits from the placebo effect - you'll still get nervous when you want to speak up or go talk to a girl you don't know or whatnot. You can't argue yourself logically into self-confidence.

Replies from: Cameron_Taylor
comment by Cameron_Taylor · 2009-03-05T18:32:47.954Z · LW(p) · GW(p)

It does tend to be counterproductive to directly convince yourself you are not shy. I know I wouldn't have much luck just willing myself to believe I was self-confident. You can, however, argue yourself into self-confidence if you do so indirectly.

One way to argue yourself into self-confidence is to identify sources of bias, noticing irrational thought patterns that lead to the conclusion that you are shy. This is the cornerstone of Rational Emotive Behavioral Therapy, and of Cognitive Behavioral Therapy in general. For example, you may observe that you have overgeneralised from one particular incident where you hesitated from nerves. One incident is clearly insufficient evidence. You may also observe that your thinking is being distorted towards pessimism simply because you slept too few hours the previous night!

The other obvious way to persuade yourself that you are not shy is simply to realise that you have just brought the situation into your self-awareness. Once the thought is brought to the conscious level, it can be simple to consider the situation from a different perspective, perhaps rationally evaluating the risks and rewards. That helps release some of the anxiety. The key there is that you aren't forcing belief that you are self-confident, but convincing yourself that self-confidence is the rational state to be in. Belief in that self-confidence follows naturally.

Now, this is all well and good for managing social anxiety and certainly a useful tool for improving our dating game. But how exactly does it relate to the quest for belief in belief?

If you can use sound arguments and evidence to change beliefs towards a desired belief in a belief, then you can almost certainly use bogus arguments and fictional evidence to grant yourself a belief that you believe something stupid. It's hard to force a belief that you're not shy, or a belief that you believe in a God. However, the application of focussed rational thought helps with the former, while the latter is handled by the unconscious irrational wriggling that humans are so talented at.

comment by thomblake · 2009-03-05T15:33:44.351Z · LW(p) · GW(p)

Why destroy placebo effects? According to some stuff Robin Hanson points to, it seems that most of medicine might consist of placebos. Aren't you fighting what wins in favor of the truth?

Replies from: ciphergoth, Annoyance
comment by Paul Crowley (ciphergoth) · 2009-03-05T22:13:15.413Z · LW(p) · GW(p)

There is evidence that placebos work even if you know that they contain no active ingredients, so we may be spared this interesting dilemma!

comment by Annoyance · 2009-03-05T19:38:30.204Z · LW(p) · GW(p)

Why would we regard an effective placebo as a victory? Why would we want our enemies to profit?

I can think of all sorts of reasons to oppose the existence of a type of person who is made more fit by delusion. Simple eugenics combined with long-term thinking would seem to suggest that we should encourage the destruction of such people.

Replies from: thomblake
comment by thomblake · 2009-03-05T20:12:03.600Z · LW(p) · GW(p)

If you regard those who are not rational as 'our enemies', then I suppose that reasoning holds.

  1. A Utilitarian, considering what's best in the long term, would certainly prefer people who've managed to be made more fit by the truth - delusion is clearly more costly, ceteris paribus.

  2. Anyone who accepts an egoistic ethics should accept that the mere fact of their being 'enemies' is enough reason to want them less fit.

  3. Kantians value truth for obvious reasons. Lying is probably the only act to which Kant successfully applied the categorical imperative.

Of course, a certain sort of Altruist might think that making people feel nice now is worth... well, they'd probably stop thinking at that point.

But even given all this, as it turns out I'm one of the ordinary humans that's aided by placebos, and don't regard humans as the 'enemy'. So I'm in favor of placebos, for now. Though I'm doubly in favor of altering human cognitive architecture so that the truth works even better.

comment by DPiepgrass · 2019-11-10T19:01:37.115Z · LW(p) · GW(p)

This is why intelligent people only have a certain amount of time (measured in subjective time spent thinking about religion) to become atheists.

Just a data point: I spent over twenty (20) years thinking multiple hours every week about subjects related to my religion. I was deeply confused, but I needed too badly for it to be true to go earnestly looking for evidence that it was false. Which reminds me of another Yudkowsky quote:

Existential depression has always annoyed me; it is one of the world's most pointless forms of suffering.

If my religion was false, not only would it mean that the people around me were horrifyingly delusional for believing it, but it would also mean that the wonderful future I was told about would be replaced with the utter destruction of my soul—and everyone's soul—at death.

"The telestial kingdom is so great, if we knew what it was like we would kill ourselves to get there." - Joseph Smith, Apocryphal (the telestial kingdom was the lowest tier of the afterlife, i.e. hell)

As the years passed, very slowly and inevitably, I lost faith. But why did it take me over 20 years between the onset of doubt and my decision to leave the religion? It's easy to yell "confirmation bias", but everyone has that. I think the real problem is that in all that time, no one gave me a link to cesletter.org. I heard lots of atheists hurling cheap insults at believers, belittling them, talking about how obvious it was that they were right and we were wrong. I heard precious few people making strong but fair and compassionate arguments of the sort I needed to hear.

Replies from: erioire
comment by ErioirE (erioire) · 2024-02-05T18:15:26.465Z · LW(p) · GW(p)

I know it's been 4 years since your comment, but if I'm reading this many years later there will be others later still.

Another former Mormon here. I also encountered the infuriating prevalence of destructive criticism.

Also of note is the toxicity of places like r/exmormon. A significant portion of those who frequent exmo-specific groups tend to be those who are angry, bitter, and still blame the church for everything bad in their life even decades after leaving. Those with a more healthy outlook tend to move on and find better things to do. Those with a less healthy outlook also seem to be more likely to produce Mormon-critical media and infect others with their own biases, despite having otherwise valid criticism.

Back when I was a questioning member, encountering exmo groups was counter-productive because it only served to feed the confirmation bias of "wow, all these ex-mormons sure are miserable, just like I've been told!"

comment by Arandur · 2011-08-01T02:27:42.166Z · LW(p) · GW(p)

It sounds to me that she simply is using a different definition of "to believe". If she says "I believe people are nicer than they are," I think she means something like, "I choose to act as if people are nicer than they really are, because it is consonant with my sense of morality to do so." It's choosing to give people the benefit of the doubt, knowing they probably don't deserve it.

Replies from: Zuckaschnegge
comment by Zuckaschnegge · 2012-04-27T07:19:49.150Z · LW(p) · GW(p)

I would much rather think of it the other way around. As far as I know, the average person is exactly as nice as the average person is. However, when she said she believed people are nicer than they actually are, I guess it is because her estimation of the average niceness of a person is biased: she actually falsely believes people are worse than they are. This might well be some kind of defense mechanism she developed. Of course, if you expect worse than average, the chances of being positively surprised are way higher than the other way around.

comment by haig · 2009-03-06T12:45:30.014Z · LW(p) · GW(p)

Placebo effects from 'belief in (false) beliefs' only work as long as self-deception is maintainable.

I think the point at which self-deception ceases to work is when you can consciously be aware of it breaking your causal models of the world. Highly intelligent people (or anyone, for that matter) cannot continue to deceive themselves into believing in god, or unregulated markets, or whatever complex concept you pick, if you explicitly show how it breaks a model they cannot disagree with. Controversial topics of the day, like belief in god or public policy, are not single data points under contention, but tangled balls of causation that must be dealt with in a somewhat parallel fashion: you see the big picture and say, wait a minute, that cannot fit unless this, and this, and this, and finally you reach a dead end and have to relinquish the starting belief. The more abstract or complex a concept is, the easier it is to keep deceiving yourself about it.

The limits of working memory play a role here, and if we are to truly be less wrong, we not only have to overcome biases, but we need to amplify our rational intelligence by using tools designed for these specific purposes. What if beliefs such as 'a personal god exists' were as hard to believe in as 'the sky is green'? What if it were explicitly laid out in front of someone that they absolutely could not hold a belief in something, because of all the cascading links it breaks in a world model that is confirmed to be 'reality'?

I want to work on such tools.

comment by cleonid · 2009-03-06T00:21:51.575Z · LW(p) · GW(p)

Voltaire, using rationalist arguments, concluded that “if God did not exist, it would be necessary to invent him”. So could it be that adhering to facts in all situations is essentially an irrational position?

Consider the following statements:

1) Rational humans (unlike rational AI) should aim to be happy.

2) Rational humans should not believe fanciful notions unsupported by empirical evidence.

3) Empirical studies (e.g. http://www.lifesitenews.com/ldn/2008/mar/08031807.html) suggest that humans who believe in such notions are more likely to be happier.

The consequence of the above statements seems to be that a rational human should reject rationality.

Does anyone see flaws in this reasoning?

Replies from: MinibearRex
comment by MinibearRex · 2011-03-31T22:51:33.868Z · LW(p) · GW(p)

3) Empirical studies (e.g. http://www.lifesitenews.com/ldn/2008/mar/08031807.html) suggest that humans who believe in such notions are more likely to be happier.

Are they actually happier, or do they just believe that they're happier? ;)

Replies from: Zuckaschnegge
comment by Zuckaschnegge · 2012-04-27T07:31:36.902Z · LW(p) · GW(p)

What would the actual difference be? You have a subjective view of your emotions (and of anything else, for that matter), so believing you are happy would be the same as being happy, as long as you are not aware of the fact that you are only believing in your happiness.

Replies from: alex_zag_al, MinibearRex
comment by alex_zag_al · 2017-11-05T23:13:59.943Z · LW(p) · GW(p)

I think that someone who merely believed they were happy, and then experienced real happiness, would not want to go back.

comment by MinibearRex · 2012-04-28T22:08:46.237Z · LW(p) · GW(p)

I suspect that there is a difference, but I'm not extremely confident of this. It seems to me that a noticeable fraction of the people I've encountered over my life are in decidedly sub-optimal situations, and could with relative simplicity change to a more optimal lifestyle, yet are convinced that their own lifestyle is the best thing ever.

comment by nazgulnarsil · 2009-03-05T21:25:14.563Z · LW(p) · GW(p)

This is a perfect example of the web that builds itself around even one confusion of a value statement and a factual statement. I fear we all have these lurking.

comment by PeterL (peter-loksa) · 2024-02-25T15:13:44.911Z · LW(p) · GW(p)

I agree with both the "emotion" and "pretend" hypotheses. It is (according to my world view) extremely difficult to feign emotions you do not possess. Thus, the easiest way to pretend your beliefs might be to manipulate your own emotions.

comment by raptortech97 · 2012-04-19T21:04:08.946Z · LW(p) · GW(p)

I benefit from believing people are nicer than they actually are.

I empathize with her here. I believe that it is to my advantage to act towards people the way I would act if they were nicer than they actually are. I'll try to parse that out. Let's say Alice is talking to Bob. Cindy, at a different time, also talks to Bob. Bob is a jerk; we assume he is not nice.

  • Alice honestly expects that Bob is nicer than he actually is, and accordingly she is nice to Bob.
  • Cindy honestly expects that Bob is exactly as nice as he actually is, and accordingly she is dismissive of Bob.

I expect that Bob will be nicer towards Alice than towards Cindy. (Warning: This is starting to feel like a belief, suggesting that it is actually a belief in belief.) My theory is that I should act like Alice. Of course, there are alternatives, like simply being nice to people.

I hope this comment made sense to you. I know I'm pretty confused about it myself now.

Replies from: hannahelisabeth
comment by hannahelisabeth · 2012-11-12T09:57:42.791Z · LW(p) · GW(p)

I think when you parse this out, you realize that there are a lot of other factors at play here; it's not just a "belief in belief" thing.

Treating someone nicely has an influence on how they subsequently treat you and others. So it's not so much that you're believing someone is nice when they're not, it's that you're believing that they do not have a fixed property state of "niceness", that it is variable dependent on conditions, which you can then manipulate to promote niceness, for the benefit of yourself and others.

None of this is belief in belief. When you look closer, you see that you are comparing two different things: how nice Bob has been in the past, and how nice Bob will be in the present/future depending on what type of environment he is in. You are thus modifying your behavior on the assumption that your contribution to the environment can make it such that Bob will be nice, or at least nicer. And there is evidence to support this assumption, so it's not irrational to expect Bob to be(come) nice when you treat him nicely accordingly.

It's just misleading to phrase it as "I benefit from believing people are nicer than they are," because what you mean by the first "are" (will be) is not the same as what you mean by the second "are" (have been).

Replies from: Peterdjones
comment by Peterdjones · 2012-11-12T10:29:18.752Z · LW(p) · GW(p)

I don't think that would mislead most people, since most people can handle context and don't expect ordinary English phraseology to conform to logical rigour.

Replies from: hannahelisabeth
comment by hannahelisabeth · 2012-11-13T15:52:23.816Z · LW(p) · GW(p)

My point was that it's misleading to those trying to interpret it directly as a logical statement, which is what Eliezer seemed to be doing. I'm sure there are lots of people who could read that sentiment and understand the meaning behind it (or at least a meaning; some people interpret it differently than others). It's certainly possible to comprehend (obviously, otherwise I wouldn't have been able to explain it), but the meaning is nevertheless in an ambiguous form, and it did confuse at least some people.

comment by Olle · 2009-03-05T21:11:13.203Z · LW(p) · GW(p)

I believe the following five things.

(1) Barcelona will not win the Champions League.

(2) Manchester U will not win the Champions League.

(3) Chelsea will not win the Champions League.

(4) Liverpool will not win the Champions League.

(5) I falsely believe one of the statements (1), (2), (3) and (4).

This seems to me like a reasonable counterexample to Wittgenstein's doctrine.

Replies from: Eliezer_Yudkowsky, Olle, thomblake
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-06T00:07:51.611Z · LW(p) · GW(p)

You need to work with probabilities, and then make statements about your expected Bayes-score instead of truth or falsity; then you'll be consistent. I have a post on this but I can't remember what it's called.
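(As a minimal sketch of the move Eliezer is gesturing at, not the content of the unnamed post: assume, as Olle clarifies below, a degree of belief of 0.8 in each statement. The logarithmic Bayes-score awards log(p) when an event you assigned probability p happens and log(1 - p) when it doesn't, and your expected score is maximized by reporting your honest probability, so probabilistic statements stay consistent where flat assertions cannot.)

  import math

  # Sketch of a logarithmic Bayes-score (an assumed illustration). If your
  # true degree of belief is p and you report q, your expected score is
  # p*log(q) + (1 - p)*log(1 - q), which is maximized at q = p.
  def expected_log_score(p: float, q: float) -> float:
      return p * math.log(q) + (1 - p) * math.log(1 - q)

  p = 0.8  # honest degree of belief in, e.g., "Barcelona will not win"
  for q in (0.6, 0.7, 0.8, 0.9, 0.99):
      print(f"report {q:.2f} -> expected score {expected_log_score(p, q):.4f}")
  # The expected score peaks at q = 0.8: reporting honest probabilities,
  # rather than asserting each statement as flatly true, leaves nothing
  # inconsistent to assert.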

comment by Olle · 2009-03-06T07:30:41.652Z · LW(p) · GW(p)

topynate: It was only for reasons of space that I listed five events with probability 0.8 each, rather than 1000 events with probability 0.999 each; the modification is obvious.

Eliezer: Point taken.

comment by thomblake · 2009-03-05T21:15:29.562Z · LW(p) · GW(p)

I think Wittgenstein's point was that you're using 'believe' in a strange way. I have no idea what you meant by the above comment; you're effectively claiming to believe and not believe the same statement simultaneously.

If you're using paraconsistent logic, you should really specify that before making a point, so the rest of us can more efficiently disregard it.

Replies from: Olle, Peterdjones
comment by Olle · 2009-03-05T21:28:51.876Z · LW(p) · GW(p)

I judge each of the four teams to have probability 0.2 of winning the Champions League. Their victories are mutually exclusive. Hence I judge each of statements (1)-(5) to have probability 0.8.
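(A minimal sketch of that arithmetic in Python, under the assumed distribution just stated: each named team wins with probability 0.2, leaving 0.2 for any other team.)

  # Assumed distribution from the comment above: the four named teams at
  # 0.2 each, with 0.2 left over for "some other team" winning.
  outcomes = {
      "Barcelona": 0.2,
      "Manchester U": 0.2,
      "Chelsea": 0.2,
      "Liverpool": 0.2,
      "other": 0.2,
  }
  teams = ["Barcelona", "Manchester U", "Chelsea", "Liverpool"]

  # Statements (1)-(4): "team X will not win the Champions League."
  for team in teams:
      p_not_win = sum(pr for o, pr in outcomes.items() if o != team)
      print(f"P({team} does not win) = {p_not_win:.1f}")  # 0.8 each

  # Statement (5): one of (1)-(4) is false exactly when one of the four
  # named teams wins.
  p_5 = sum(outcomes[t] for t in teams)
  print(f"P(one of (1)-(4) is false) = {p_5:.1f}")  # 0.8

Each of the five beliefs individually gets probability 0.8, yet the conjunction of all five is impossible: if (1)-(4) all hold, (5) fails. That is the preface-paradox structure that Eliezer's probabilistic reply above dissolves.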

Replies from: topynate
comment by topynate · 2009-03-06T00:29:06.459Z · LW(p) · GW(p)

Hm. Wittgenstein requires that the meaning be "indicative". In English, the indicative mood is used to express statements of fact, or ones held to be very probable. They don't necessarily have to be true or probable, of course, but they express beliefs of that nature. You say "I believe X" when you assign a probability of at least 0.8 to X; 0.8 is probable, but not very probable. Would you state baldly "Barcelona will not win the Champions League", given your probabilities? I doubt it. When you say instead "I believe Barcelona will not win the Champions League", you could equally say "Barcelona will probably not win the Champions League." But this isn't in the indicative mood; rather, it is in something called the potential/tentative mood, which has no special form in English but does in some other languages, e.g. darou in Japanese (which has quite a complex system for expressing probability). It's better to just state your degree of belief as a numeric probability.

comment by Peterdjones · 2011-05-19T15:35:06.323Z · LW(p) · GW(p)

He is illustrating that "belief" has more than one meaning, for all that he hasn't clarified the meanings.

A candidate theory would be belief-as-cold-hard-fact versus beliefs-as-hope-and-commitment.

Consider a politician fighting an election. Even if the polls are strongly against them, they can't admit that they are going to lose as a matter of fact, because that will make the situation worse. They invariably refuse to admit defeat. That is irrational if you treat belief as a solipsistic, passive registration of facts, but makes perfect sense if you recognise that beliefs do things in the world and influence other people. If one person commits to something, others can, and that can lead to it becoming a fact.

Treating people as nicer than they are might make them nicer than they were.

Replies from: Peterdjones
comment by Peterdjones · 2012-11-12T10:51:16.395Z · LW(p) · GW(p)

Of course, if "belief" does have these two meanings, the argument against dark side epistemology largely unravels...

comment by Zuckaschnegge · 2012-04-27T07:48:03.407Z · LW(p) · GW(p)

(One of the other few moments that gave her pause—I mention this, in case you have occasion to use it—is when she was talking about how it's good to believe that someone cares whether you do right or wrong—not, of course, talking about how there actually is a God who cares whether you do right or wrong, this proposition is not part of her religion—

And I said, "But I care whether you do right or wrong. So what you're saying is that this isn't enough, and you also need to believe in something above humanity that cares whether you do right or wrong." So that stopped her, for a bit, because of course she'd never thought of it in those terms before. Just a standard application of the nonstandard toolbox.)

What I think about here is that, whether or not you care about whether she does right or wrong, to her you are an outsider: one who does not know everything she knows, who has no insight into what she thinks about the things she does, no insight into what she actually intends to do. So in other words, you have no real way of judging her actions to be right or wrong. The only way for her to imagine someone overseeing her actions is to actually believe in an omniscient god. I'm an atheist, but I still believe there are good things and bad things for me to do (it might not be a rational thought, but I think of it as a necessary one). In other words, my conscience is the being overlooking my doings.

So my guess here would be that she might give her conscience a name and form it in a way that fits in with other people's consciences (in other words, any religious group whatsoever). To her, god might well be her conscience with a name atheists don't like to hear.

comment by MarkusRamikin · 2011-07-18T19:06:44.258Z · LW(p) · GW(p)

< "Pooping your deranged little party since Epicurus."

I love that. Did you pick it up somewhere or do I credit you with it?

comment by Annoyance · 2009-03-05T19:30:49.074Z · LW(p) · GW(p)

If you recognize that, in certain terms, believing certain things has positive instrumental results even if they're not true, why can't you simply abolish the false beliefs and just create those results directly?

Human brains are (loosely speaking) Universal Turing Machines - they can emulate any computation. So if we're looking for a particular set of results, we're not tied to any invalid route for reaching them. There's always a valid path that gets us where we want to be.

Replies from: thomblake, billswift, Eliezer_Yudkowsky
comment by thomblake · 2009-03-05T19:59:51.397Z · LW(p) · GW(p)

Human brains are (loosely speaking) Universal Turing Machines

You'd have to be speaking very loosely for that comparison to be correct. Unless you're talking about creating posthumans, we're tied to all sorts of non-universal cognitive architecture. You go to war with the brain you have, not the brain you want.

Replies from: Annoyance, Swimmer963
comment by Annoyance · 2009-03-05T20:06:30.541Z · LW(p) · GW(p)

But those good ol' frontal lobes permit universal computation. We can do it. We're just not very good at it.

If you can emulate arithmetic, the only limit is memory capacity. Ignore that issue, and you're a UTM.

Replies from: thomblake
comment by thomblake · 2009-03-05T20:16:19.451Z · LW(p) · GW(p)

I suppose I should grant that - the principle of charity does not permit me to assume anyone thought there was an equivalent to an infinite tape in reality.

comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-02-22T16:34:24.195Z · LW(p) · GW(p)

I've thought a lot about this question. How about this: a small portion of our brain is dedicated to universal computation, and the rest is dedicated to shortcuts/heuristics that allow us to actually function.

comment by billswift · 2009-03-06T00:58:54.473Z · LW(p) · GW(p)

Not just loosely speaking - the brain IS a Universal Turing Machine. Or at least as much of one as currently exists - the key property is the universality of computation; the infinite tape is a visualization mechanism.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-06T00:15:20.700Z · LW(p) · GW(p)

Is that you, Caledonian? (Said without looking at email address.)

comment by alexpunct · 2012-11-01T17:07:47.724Z · LW(p) · GW(p)

I believe this will be the next form of religion.