The Meditation on Curiosity

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-10-06T00:26:28.000Z · LW · GW · Legacy · 101 comments

The first virtue is curiosity.

—“The Twelve Virtues of Rationality”

As rationalists, we are obligated to criticize ourselves and question our beliefs . . . are we not?

Consider what happens to you, on a psychological level, if you begin by saying: “It is my duty to criticize my own beliefs.” Roger Zelazny once distinguished between “wanting to be an author” versus “wanting to write.” Mark Twain said: “A classic is something that everyone wants to have read and no one wants to read.” Criticizing yourself from a sense of duty leaves you wanting to have investigated, so that you’ll be able to say afterward that your faith is not blind. This is not the same as wanting to investigate.

This can lead to motivated stopping of your investigation. You consider an objection, then a counterargument to that objection, then you stop there. You repeat this with several objections, until you feel that you have done your duty to investigate, and then you stop there. You have achieved your underlying psychological objective: to get rid of the cognitive dissonance that would result from thinking of yourself as a rationalist, and yet knowing that you had not tried to criticize your belief. You might call it purchase of rationalist satisfaction—trying to create a “warm glow” of discharged duty.

Afterward, your stated probability level will be high enough to justify your keeping the plans and beliefs you started with, but not so high as to evoke incredulity from yourself or other rationalists.

When you’re really curious, you’ll gravitate to inquiries that seem most promising of producing shifts in belief, or inquiries that are least like the ones you’ve tried before. Afterward, your probability distribution likely should not look like it did when you started out—shifts should have occurred, whether up or down; and either direction is equally fine to you, if you’re genuinely curious.

Contrast this to the subconscious motive of keeping your inquiry on familiar ground, so that you can get your investigation over with quickly, so that you can have investigated, and restore the familiar balance on which your familiar old plans and beliefs are based.

As for what I think true curiosity should look like, and the power that it holds, I refer you to “A Fable of Science and Politics” in the first book of this series, Map and Territory. The fable showcases the reactions of different characters to an astonishing discovery, with each character’s response intended to illustrate a different lesson. Ferris, the last character, embodies the power of innocent curiosity: which is lightness, and an eager reaching forth for evidence.

Ursula K. Le Guin wrote: “In innocence there is no strength against evil. But there is strength in it for good.”1 Innocent curiosity may turn innocently awry; and so the training of a rationalist, and its accompanying sophistication, must be dared as a danger if we want to become stronger. Nonetheless we can try to keep the lightness and the eager reaching of innocence.

As it is written in “The Twelve Virtues of Rationality”:

If in your heart you believe you already know, or if in your heart you do not wish to know, then your questioning will be purposeless and your skills without direction. Curiosity seeks to annihilate itself; there is no curiosity that does not want an answer.

There just isn’t any good substitute for genuine curiosity. A burning itch to know is higher than a solemn vow to pursue truth. But you can’t produce curiosity just by willing it, any more than you can will your foot to feel warm when it feels cold. Sometimes, all we have is our mere solemn vows.

So what can you do with duty? For a start, you can try to take an interest in your dutiful investigations—keep a close eye out for sparks of genuine intrigue, or even genuine ignorance and a desire to resolve it. This goes right along with keeping a special eye out for possibilities that are painful, that you are flinching away from—it’s not all negative thinking.

It should also help to meditate on “Conservation of Expected Evidence.” For every new point of inquiry, for every piece of unseen evidence that you suddenly look at, the expected posterior probability should equal your prior probability. In the microprocess of inquiry, your belief should always be evenly poised to shift in either direction. Not every point may suffice to blow the issue wide open—to shift belief from 70% to 30% probability—but if your current belief is 70%, you should be as ready to drop it to 69% as raise it to 71%. You should not think that you know which direction it will go in (on average), because by the laws of probability theory, if you know your destination, you are already there. If you can investigate honestly, so that each new point really does have equal potential to shift belief upward or downward, this may help to keep you interested or even curious about the microprocess of inquiry.
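
To make the 70%/69%/71% arithmetic concrete, here is a minimal Python sketch (my own illustration, not part of the original post; the likelihood numbers are invented) showing that the probability-weighted average of the possible posteriors equals the prior:

    # Conservation of Expected Evidence: before you look at a piece of
    # evidence E, the expected value of your posterior equals your prior.
    # All numbers below are made up for illustration.

    prior = 0.70               # current belief P(H)
    p_e_given_h = 0.60         # hypothetical likelihood P(E | H)
    p_e_given_not_h = 0.40     # hypothetical likelihood P(E | not-H)

    # Marginal probability of observing E.
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

    # Bayes' rule: posterior if E is observed, and if it is not.
    posterior_if_e = p_e_given_h * prior / p_e
    posterior_if_not_e = (1 - p_e_given_h) * prior / (1 - p_e)

    # Weight each possible posterior by how likely that observation is.
    expected_posterior = p_e * posterior_if_e + (1 - p_e) * posterior_if_not_e

    print(round(posterior_if_e, 3))      # 0.778 -- belief shifts up
    print(round(posterior_if_not_e, 3))  # 0.609 -- belief shifts down
    print(round(expected_posterior, 3))  # 0.7   -- equals the prior

If you already know which way the numbers will come out before looking, one of those likelihoods is doing no work, and by the same laws of probability you could have updated already.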

If the argument you are considering is not new, then why is your attention going here? Is this where you would look if you were genuinely curious? Are you subconsciously criticizing your belief at its strong points, rather than its weak points? Are you rehearsing the evidence?

If you can manage not to rehearse already known support, and you can manage to drop down your belief by one tiny bite at a time from the new evidence, you may even be able to relinquish the belief entirely—to realize from which quarter the winds of evidence are blowing against you.

Another restorative for curiosity is what I have taken to calling the Litany of Tarski, which is really a meta-litany that specializes for each instance (this is only appropriate). For example, if I am tensely wondering whether a locked box contains a diamond, then rather than thinking about all the wonderful consequences if the box does contain a diamond, I can repeat the Litany of Tarski:

If the box contains a diamond,
I desire to believe that the box contains a diamond;
If the box does not contain a diamond,
I desire to believe that the box does not contain a diamond;
Let me not become attached to beliefs I may not want.

Then you should meditate upon the possibility that there is no diamond, and the subsequent advantage that will come to you if you believe there is no diamond, and the subsequent disadvantage if you believe there is a diamond. See also the Litany of Gendlin.

If you can find within yourself the slightest shred of true uncertainty, then guard it like a forester nursing a campfire. If you can make it blaze up into a flame of curiosity, it will make you light and eager, and give purpose to your questioning and direction to your skills.

1Ursula K. Le Guin, The Farthest Shore (Saga Press, 2001).

101 comments

Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by Robin_Hanson2 · 2007-10-06T00:36:17.000Z · LW(p) · GW(p)

This is especially well written, btw.

comment by Selfreferencing · 2007-10-06T03:15:46.000Z · LW(p) · GW(p)

It just seems so old-fashioned to think that it is courageous to be willing to doubt any of your beliefs. Here's a nice reflection on the matter with regard to the epistemic propriety of religious belief:

http://comp.uark.edu/~senor/wrong.html

comment by g · 2007-10-06T11:37:27.000Z · LW(p) · GW(p)

An OB post from November 2006 is a useful counterpoint to van Inwagen's paper, and there's been other discussion of van Inwagen's claims, generally in the context of the Aumann agreement theorem.

I think van Inwagen is wrong; if he really considers that David Lewis's disagreement with his position has enough evidential force that his continued holding of it is ill-supported by the evidence, then he should stop holding it.

van Inwagen doesn't really argue against this; he just says that it seems obvious to him that he's entitled to hold whatever opinions he finds himself holding, with whatever confidence he finds himself attaching to them. And in one sense he certainly is entitled to; he is also entitled to believe that the government of the USA has been taken over in secret by alien lizard-creatures. But he isn't entitled to do that and still be thoroughly rational.

Well, van Inwagen does offer one pseudo-argument: he "doesn't want to be forced" to adopt a position of "general philosophical skepticism", which he thinks accepting Clifford's rule of evidence would commit him to. Well, OK, but it seems a bit poor for a philosopher to be so openly embracing wishful thinking. I don't want to be a philosophical skeptic, and neither do other philosophers; therefore philosophical skepticism must be rejected (bah!); therefore Clifford's rule of evidence must be rejected.

Someone with more intellectual self-respect would say not "I must have some mysterious incommunicable philosophical insight unavailable to Lewis" but "I think Lewis is missing these specific points, and here is why he is wrong in what he's said about them".

I have a dark suspicion that deep down, van Inwagen is a general philosophical skeptic (after all, he's said that adopting a policy of basing one's beliefs on the evidence would lead to that position), but he finds it more congenial to go on making confident assertions on the basis of insufficient evidence.

comment by g · 2007-10-06T11:38:08.000Z · LW(p) · GW(p)

Incidentally: whether something "seems old-fashioned" has very little to do with whether it's true.

Replies from: pnrjulius
comment by pnrjulius · 2012-06-09T00:34:34.703Z · LW(p) · GW(p)

The Earth revolves around the Sun? Why, how old-fashioned!

comment by Selfreferencing · 2007-10-06T22:21:01.000Z · LW(p) · GW(p)

G,

Welp, I've only been reading this blog for 2007. Silly me. I just read the post and all the comments. I have to say that Philip Bricker has the upper hand.

Bricker suggested the option that you advocate, by the way. But he dismisses it. Here's why, I think: If you suspend judgment in response to reasonable disagreement, you're going to have to suspend judgment about basically all philosophical theses. By doing so, you're going to run yourself into quite a few problems.

Note: By 'old-fashioned', I meant that the view advocated in the post relies on epistemological ideas that most epistemologists reject. I sure hope that has something to do with whether it's true. Although, maybe it doesn't.

comment by g · 2007-10-07T00:10:49.000Z · LW(p) · GW(p)

I've only been reading OB for a month or thereabouts myself, but I had a little trawl through the archives looking for interesting things.

If epistemologists-as-a-class take any particular stand on whether a general willingness to doubt all one's beliefs is courageous, then that's the first I've heard of it. But I'm not an expert on epistemology, still less on epistemologists, so maybe that wouldn't be too surprising. Anyway: What epistemological ideas, generally rejected by epistemologists these days, are being relied on by those who say things like "It is courageous to be prepared to revise any of your ideas, if the balance of evidence turns out to be against them"?

(I expect a lot of epistemologists would insist that you probably have some ideas for which you'll never be able to find yourself in that position, because they're so firmly built into the structure of your brain or of the reasoning processes you're using. But that's quite separate from whether a willingness to doubt anything you do get good evidence against is either courageous or wise, and doesn't seem to me to have anything much to do with what Eliezer is saying here.)

Isn't your explanation of why Bricker dismisses "the option [I] advocate" just "If I adopt this policy then I'll have to do a lot of judgement-suspending, and I don't want to"? Or does he (or do you) have some specific problems in mind, that one would run into by doing this? (Being uncertain about some questions one would rather be confident about isn't, in my view, a "problem".)

For the avoidance of doubt: I am not proposing (though I think there are contributors here who would) that when considering any philosophical problem it's illegitimate to have opinions of one's own that differ from the majority view among philosophers. (Or among the very best philosophers, or whatever.) But I do think it's a sign of something probably wrong if you find yourself in disagreement with others who (at least on the face of it) are better placed to understand the matter clearly, and don't have anything to say in favour of your position other than that it seems right to you. Because when you do that, you're basically appealing to the quality of your intuition, and ex hypothesi those disagreeing others have intuitions likely to be at least as good as yours.

comment by MrHen · 2010-01-29T19:24:51.684Z · LW(p) · GW(p)

When you're really curious, you'll gravitate to inquiries that seem most promising of producing shifts in belief, or inquiries that are least like the ones you've tried before. Afterward, your probability distribution likely should not look like it did when you started out - shifts should have occurred, whether up or down; and either direction is equally fine to you, if you're genuinely curious.

Strangely, following this behavior leads me to attack my most "rational" beliefs. If I am holding an irrational belief I find it less likely that it will shift. The way I have to dig these out is to keep hacking away at the foundations that built the irrational beliefs. If my inner wannabe rational is using A, B, or C to defend an irrational belief, I need to start firing at A, B, C. This leads me down a path of silly beliefs until I finally find something that is likely to change. I am not arguing that this is good or correct; on the contrary, it is the source of many, many problems with my Map.

Going after the irrational beliefs directly doesn't do anything. They are in their little walled areas and are immune to mere arguments and inquiries. I have to knock down the walls first.

Instead of halting all development until I get the walls down I let my curiosity roam in the free territories, allowing it to grow stronger. It gains ground and traction and I can already see its effect on the walls around my evil, cherished beliefs.

All this being said, I get the feeling that something is Terribly Wrong when I start poking around on the map and asking questions about the territory. These feelings are not being repressed and one day I expect to turn around and wonder how the wall was able to stand so long.

Replies from: JGWeissman
comment by JGWeissman · 2010-01-29T19:40:09.416Z · LW(p) · GW(p)

Is it a fair summary that you have theistic beliefs now, but you expect that in the future you will not have these theistic beliefs, and that your modified beliefs without theism will better correspond with reality?

If so, I would suggest as an exercise, that you consider how you would explain to a theist who expects to maintain his own theistic beliefs, why you expect to lose yours.

Replies from: MrHen
comment by MrHen · 2010-01-29T20:13:21.194Z · LW(p) · GW(p)

I don't expect to lose mine. How could I? If I thought I would lose them I would have an area with more promise of shifting beliefs. I can imagine scenarios where I would lose my beliefs but that is completely different than predicting the loss. If I actually thought I would lose my beliefs I would be attacking them voraciously.

[...] and that your modified beliefs without theism will better correspond with reality?

I assume that if I did lose my theism that it would only happen in a circumstance where the new beliefs better corresponded with reality. Essentially:

  • I have not always built beliefs within the confines of the Map/Territory and Beliefs are Predictors of Reality concepts
  • I now build beliefs with those concepts
  • Old beliefs not based on those concepts are still in the network
  • To replace an old belief with a new belief, the new belief must use the new concepts

So, if I replaced Theism with Atheism, Atheism had better match reality or I have not improved my belief making process from years ago when I was putting beliefs all over the place because it seemed like a good idea. Atheism isn't attractive in and of itself. If it were, I would be starting at the bottom line.

What good is it to believe the Truth if you are believing incorrectly?

That being said, even though I don't expect to lose my theism, it sure as hell* better update once Curiosity gets ahold of it. I don't expect my beliefs to stay the same but I am unable to predict where they are going to end up.

* Hehe, that's funny, given the context

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-01-29T20:27:53.772Z · LW(p) · GW(p)

If you don't expect to lose it, why are you so scared of critically examining it?

Replies from: Unknowns, MrHen
comment by Unknowns · 2010-01-29T20:32:41.352Z · LW(p) · GW(p)

He may be scared of losing it but not expect that, just as someone can be scared of ghosts without actually expecting to meet one.

comment by MrHen · 2010-01-29T20:38:06.282Z · LW(p) · GW(p)

Good question.

I don't feel as if I am scared of losing it to critical examination. I more feel like critical examination isn't going to do anything useful at this point. But I will have to think more about that and get back to you because I am catching a lot of invalid and doublethinky thoughts running through my head.

If I don't post a response by the end of tomorrow, start pestering me because I apparently decided to avoid the topic. I don't trust my future self enough to follow through on this.

Replies from: Blueberry
comment by Blueberry · 2010-01-29T21:09:44.247Z · LW(p) · GW(p)

I am catching a lot of invalid and doublethinky thoughts running through my head.

I'd love to know what they are, if you'd be willing to catch them and write them down.

Replies from: MrHen
comment by MrHen · 2010-01-29T23:02:35.903Z · LW(p) · GW(p)

Here's a mind dump. I don't have a lot of time right now, but here goes.

If you don't expect to lose it, why are you so scared of critically examining it?

Err... I'm not scared?
Then examine it.
No. I decided not to do that.
Why?
Hmm... what have I said on that subject...

If I am holding an irrational belief I find it less likely that it will shift.

Going after the irrational beliefs directly doesn't do anything. They are in their little walled areas and are immune to mere arguments and inquiries. I have to knock down the walls first.

Okay, sure that makes sense, but what if the wall is merely a creation of fear?
Okay, do I have any fear of changing away from Theism?
I want to say no...
But I have to say yes because I feel fear.
What is the fear of?
Potentials:

  • Fear of losing a belief
  • Fear of social implications
  • Fear of the unknown
  • Fear of judgement/punishment
  • Fear of being wrong
  • Fear of admitting mistakes

Let's go down the list: Fear of losing a belief.
I don't fear losing a belief.
A belief or any belief?
Mmm... most beliefs? I don't know.
Can I think of a belief I would fear losing?
Can I think of a belief I don't fear losing?
Sure, that's easy.
Then name it.
Uh... I guess I need a list of beliefs...

  • My name is my name
  • 2 + 2 = 4
  • The show tonight will be a success
  • I am getting more rational

The first two have no fear.
The third has more emotional attachment, but I don't fear losing that belief. I'd rather the show tonight be a success, but losing that belief doesn't scare me.
The last... well, it's true or not. I would rather lose that belief if it were incorrect so I could change what I needed to become more rational. So no, I don't fear losing it.
Is it more accurate to say that I fear keeping it when I shouldn't?
Yes.
Is this a good fear?
Yes, in as much as fear can ever be good.
Can I think of a more valid fear?
We are getting off subject.
Okay. Do I fear losing Theism?
Which part?
All of it.
Uh... I don't see how that can happen as of yet.
So? It doesn't matter if you can imagine it. Does it scare you?
This wasn't the original question:

If you don't expect to lose it, why are you so scared of critically examining it?

Okay. But this answer matters.
Why?
Because it eliminates a potential cause for being scared of critically examining it.
Okay, what are the other causes?

  • Fear of losing Theism
  • Time wasted on other things
  • Fear of confirming Theism and dealing with the social consequences
  • Preemptive rejection of Rationality and/or Reality

Okay. So do I fear losing Theism?
I don't know.
You don't know or you don't want to know?
Well, what would be the point in not wanting to know?

  • Meta-belief
  • Belief in belief
  • Convenient ignorance

(Ooh, Convenient Ignorance may be a good subject for a top-level post...)
Okay... so do I believe in my belief of Theism?
Sure, in the sense that I believe I believe in Theism.
Is that the same thing?
Err... no, I guess not.
So, do I believe in my belief?
What is the definition again?

You can much more easily believe that it is proper, that it is good and virtuous and beneficial, to believe that the Ultimate Cosmic Sky is both perfectly blue and perfectly green. Dennett calls this "belief in belief".

Okay, no, I do believe Theism.
Do you believe in your belief of Theism?
I don't think so, since I don't begrudge others their disbelief.
You match the description: "It is good and virtuous and beneficial to believe God exists."
Only in the sense that if it is true it is good to believe.
So if it wasn't true, you wouldn't want to believe?
Correct.
So go find out if it is true.
Yeah, okay, show me how.
Critically examine it.
I can't.
Why not?
There is a wall. That belief isn't accessible through critical examination.
If it were, would you examine it?
I don't know.
You don't know, or you don't want to know?
What difference does it make if I can't examine it anyway?
Because you may be able to examine it and you are lying to yourself about not being able to.
Oh.

And that's all the time I have. I'll try to add more tomorrow. If there is a better place to do this or people would rather me post a summary I am more than willing to comply.

EDIT: Part 2. (It isn't as interesting.)

Replies from: Alicorn, Blueberry, Eliezer_Yudkowsky, CronoDAS, byrnema, MrHen, pjeby, AngryParsley
comment by Alicorn · 2010-01-29T23:16:07.565Z · LW(p) · GW(p)

I find this really interesting to read and would love to see more, although it's kind of carriage-return intensive and might be better hosted offsite somewhere. I can offer space if you don't have a place to put it.

Replies from: CronoDAS, MrHen
comment by CronoDAS · 2010-01-29T23:27:33.652Z · LW(p) · GW(p)

Me too.

comment by MrHen · 2010-01-30T07:04:17.753Z · LW(p) · GW(p)

I just posted it. Thanks for the offer, though.

comment by Blueberry · 2010-01-29T23:31:26.606Z · LW(p) · GW(p)

I'd love to read more, and I'm especially curious what it would mean to you to no longer identify as a theist, and how that would feel. I'm also curious about the last two:

Fear of confirming Theism and dealing with the social consequences

Preemptive rejection of Rationality and/or Reality

Thanks for posting this!

Replies from: MrHen
comment by MrHen · 2010-01-30T07:13:10.096Z · LW(p) · GW(p)

I'd love to read more, and I'm especially curious what it would mean to you to no longer identify as a theist, and how that would feel.

It is a complicated feeling. It is hard to adequately explain without delving into detailed explanations of (a) my particular beliefs, (b) the society of friends and family I inhabit, and (c) a heck of a lot of personal history. I am not ready to deal with all of that here. I suspect bits and pieces will leak out.

The one thing I will say now is that it would completely wreck almost every aspect of my life. I have everything invested in this.

I'm also curious about the last two:
Fear of confirming Theism and dealing with the social consequences

Since, at this point, I don't have much to think that critical examination will lead to me dropping Theism, it is still possible that it will strengthen Theism. I don't think it is more likely but I expect it would provoke a stronger reaction than my confession did.

Preemptive rejection of Rationality and/or Reality

If I really were scared enough to dodge critical examination I would be smart enough to drop anything that threatened a critical examination. As in, it wouldn't be given a foothold. I have enough power over my beliefs to choose what I want to believe. Right now, Rationality has my attention. If it scared me enough I would just leave and never return.

This hasn't happened and I do not expect it to happen. But if the situation were that dire, I would want to hold off on the critical examination until it was less scary.

For that to even make sense you have to give me the benefit of the doubt in terms of how I argue with myself. I don't expect it to translate well into another person's belief system. Also, it is very late... so... I don't promise anything and reserve the right to recant tomorrow. :)

Replies from: Blueberry, ciphergoth, Eliezer_Yudkowsky
comment by Blueberry · 2010-02-01T19:29:16.833Z · LW(p) · GW(p)

The one thing I will say now is that it would completely wreck almost every aspect of my life. I have everything invested in this.

Wow. Then it's not at all surprising you feel this way. You've left out a lot of details of your life, so I can't really comment on specifics (though please feel free to share them if you're ever ready to do so here). But given that, it's going to be almost impossible for you to change that belief.

I'm very confident that a detailed, unbiased examination of your theistic beliefs would reveal that there's no evidence for them and you hold them for social reasons. Do you agree? That being the case, you may not want to try to engage in this kind of examination right now. It sounds like you need time to think about what you really want in your life, and what kind of life you want to lead, independent of your beliefs about theism. Do you want to uproot your life right now?

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-02-01T19:51:38.297Z · LW(p) · GW(p)

Blueberry, the human species has got to do this sometime. Please don't get in the way.

Replies from: Blueberry
comment by Blueberry · 2010-02-01T19:57:30.557Z · LW(p) · GW(p)

I agree that humanity needs to do this sometime, and I agree that MrHen needs to do this sometime.

I don't know enough about MrHen's situation to know whether it's in his best interest to suddenly uproot himself from every aspect of his life right now, or whether there are ways of creating support networks and easing the transition that would help him. I'm not saying he should hide from the truth; I'm saying he may need to lay the groundwork for finding the truth first.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-02-01T20:07:37.462Z · LW(p) · GW(p)

AFAIK these things just get more difficult the longer you put them off. This is the usual rule, and it's also the usual rule that people are heavily motivated on a cognitive level to find excuses to let things slide. Someone wrote about this very eloquently - I'm not sure who, possibly Tim Ferris or Robert Greene - with the notion that "hoping" things will get better isn't really hope so much as a form of passivity, motivated more by fear of action and change than any positive hope. Any delay of this sort should have a definite deadline attached to it.

Replies from: AdeleneDawner
comment by AdeleneDawner · 2010-02-01T20:20:43.195Z · LW(p) · GW(p)

I've found a definite (and not necessarily complete) list of steps to be useful in the absence of a deadline, and I think that's what Blueberry was getting at: MrHen might be best served by adding things to his to-do list that answer the question "what things do I need to do to get my personal life arranged in such a way that I would be able to be 'out' as an atheist without major repercussions?"

Replies from: vividhkothari@gmail.com
comment by Vivid Dylan (vividhkothari@gmail.com) · 2022-02-16T15:44:06.005Z · LW(p) · GW(p)

I've found a definite (and not necessarily complete) list of steps to be useful in the absence of a deadline

 

Can I have that list? Can you ask the AdeleneDawner of 12 years ago if she still has it?

comment by Paul Crowley (ciphergoth) · 2010-02-01T22:38:05.907Z · LW(p) · GW(p)

In that case, you probably shouldn't think about whether or not there is a God just now.

Rather, you should first think about what you're going to do if you conclude there isn't. In your case, the line of retreat is rather more literal for you than it is for other people. Who would you bring in on your thinking before it had reached a conclusion, to let them know you're really wondering? What would you do to make the best of the situation, given how much you have invested? You'll find it very hard to think about this rationally until you can really face the thought of it going either way.

Replies from: MrHen, MrHen
comment by MrHen · 2010-02-01T22:54:06.446Z · LW(p) · GW(p)

You'll find it very hard to think about this rationally until you can really face the thought of it going either way.

Yeah. This is a hard mental exercise... and this area of thought experiments encounters a lot of resistance. Something is actively blocking this area and that is Very Bad. I have a hunch about what it is but don't know how to explain it well.

Hmm...

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-02-02T00:20:44.724Z · LW(p) · GW(p)

But don't delay. Whichever conclusion you come to, I can't imagine you would ever turn around and think "I'm really glad I spent so long putting off really thinking hard about that". You won't enjoy it, and you're unlikely to see it as time well spent.

I'm not saying rush to a conclusion; I am saying rush to thought.

Replies from: MrHen
comment by MrHen · 2010-02-02T01:26:22.293Z · LW(p) · GW(p)

Agreed. Today is not the day, however, due to other circumstances. If I don't have at least two plausible options for both of the following questions by Saturday, February 6th, feel free to pester me.

  • Who would you bring in on your thinking before it had reached a conclusion, to let them know you're really wondering?

  • What would you do to make the best of the situation, given how much you have invested?

Replies from: MrHen
comment by MrHen · 2010-02-08T06:50:12.805Z · LW(p) · GW(p)

Answer is up, one day late.

comment by MrHen · 2010-02-08T06:49:41.643Z · LW(p) · GW(p)

Who would you bring in on your thinking before it had reached a conclusion, to let them know you're really wondering?

If it came to the point where I began expecting to drop Theism I would tell my wife, my brother, and probably a good friend of mine in Minnesota. My wife because it affects her, my brother because he would probably have advice on how to deal with switching, and my friend because he has always had good advice before. And he's the one I feel I could actually talk to about the subject.

What would you do to make the best of the situation, given how much you have invested?

Given the option, I would leave my current city and go back to school. I suppose everything else revolves around the conversation I have with my wife. I would prefer to stay together but I honestly don't know what would happen. I don't see us splitting up, but I am not confident in this.

As for personal and non-social impacts, I would start over again. I would take the beliefs I have built in the journey to dropping Theism and continue the process. I expect I would continue acting relatively the same but with an attempt at slowly replacing all of the habits and rituals I have grown accustomed to having.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-02-08T08:38:15.419Z · LW(p) · GW(p)

Thanks for thinking about this and answering. I hope that you're talking to these people now about the overall journey that you're on with respect to rationality, whether or not you raise the specific subject of theism. I think you'll have an easier conversation if you talk to them about the journey as it's going on than if you suddenly find yourself having arrived at somewhere that was not where you set off before those closest to you knew you were even setting out.

Replies from: MrHen
comment by MrHen · 2010-02-08T14:32:19.057Z · LW(p) · GW(p)

Actually, I find it hard to talk about rationality because everyone I would want to talk to about it would think it was completely obvious. I talk about biases and the like, and particular examples, but the basic concepts tend to get responses like, "Well... yeah? And?"

EDIT: Note that this is somewhat of a self-fulfilling prophecy. The people I would want to talk to about it are the most likely to have already thought about these subjects.

Replies from: Kevin
comment by Kevin · 2010-02-08T14:38:45.899Z · LW(p) · GW(p)

How about talking about the solution to determinism versus free will, or "if a tree falls in the woods does it make a sound?"

Replies from: byrnema, MrHen
comment by byrnema · 2010-02-08T14:47:21.300Z · LW(p) · GW(p)

I had a snark here that I thought was amusing for like 2 minutes, and then I started to feel guilty. Taken out.

comment by MrHen · 2010-02-08T14:54:32.209Z · LW(p) · GW(p)

The solution? Everyone would get the concept of the topics involved. Most of them would get bored and move the conversation along.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-02-02T01:40:03.155Z · LW(p) · GW(p)

I should also mention that, judging from the stories I've heard, it's a lot easier to talk about your doubts with your spouse while they're still doubts. I presume you have a wife and kids and parents and siblings and local community who are all deeply religious? I don't know about the others, but the sooner you start talking to your wife about your doubts, the more likely you are to stay together as you go down whatever path you go down.

Replies from: MrHen
comment by MrHen · 2010-02-02T02:35:07.996Z · LW(p) · GW(p)

This is good advice. Thank you.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-01-29T23:32:06.536Z · LW(p) · GW(p)

I value this data. Keep commenting.

Replies from: MrHen
comment by MrHen · 2010-01-30T07:13:56.238Z · LW(p) · GW(p)

I am glad. What do you find most valuable about it? Is there a way I could make it more valuable?

comment by CronoDAS · 2010-01-29T23:44:16.282Z · LW(p) · GW(p)

"Theism" is something of a catch-all term that can include lots of different things. I think that it is indeed possible that our universe has a Creator, but I'll bet my immortal soul that the God of Abraham isn't it. ;)

Maybe you could simply pin down your beliefs instead of "critically examining" them?

Replies from: Alicorn, Kevin, MrHen
comment by Alicorn · 2010-01-29T23:48:20.143Z · LW(p) · GW(p)

It's poor form to bet things you can't pony up if you lose!

comment by Kevin · 2010-01-30T03:43:01.910Z · LW(p) · GW(p)

And even with powers well short of a universe-creator's, a Type 2+ civilization should be able to seed new planets with life, which is one of the more important powers of the monotheistic God.

comment by MrHen · 2010-01-30T07:17:39.603Z · LW(p) · GW(p)

Maybe you could simply pin down your beliefs instead of "critically examining" them?

For me, "pinning down" means fine tuning definitions. This and "critically examining" use the same toolset. I essentially see them as one and the same. If I am mucking around and bothering with those pesky definitions I am going to see the inconsistencies.

I can describe how I act and that is how I generally translate my old belief system. Rationality encourages beliefs as predictors, and I am taking newly forming beliefs and entering them that way. The data hasn't come back from those beliefs yet, but I am eagerly awaiting it.

comment by byrnema · 2010-01-30T04:16:18.166Z · LW(p) · GW(p)

All that sounds like natural rambling free-association to me, and more like fear of double-think than any actual double-think.

Are you reluctant to "critically examine" your beliefs because it just sounds like a lot of work? (Counselors will say, 'let's work on this' and then an hour later, when you feel like an exposed mess of emoted goo, they'll say, 'OK, see you next week.')

Given that you're comfortable with your beliefs, perhaps you're reluctant to expose your beliefs because it'll be like throwing them to the wolves. If not indiscriminate slaughter (no offense to the more militant atheists here), it'll still be something like 12 to 1.

Well, if you ever decide to do this, if it helps, I offer to help you defend your views to the extent that I can competently do so.

Replies from: MrHen
comment by MrHen · 2010-01-30T07:24:55.160Z · LW(p) · GW(p)

All that sounds like natural rambling free-association to me, and more like fear of double-think than any actual double-think.

For me, free association clears up doublethink. If I write my thought into a sentence, the sentence has a strict meaning in the English language. I can write the other side of doublethink as a second sentence and let them duke it out over a conversation with myself.

Also, by the time I had responded with the rambling I had mostly sorted out the initial emotional response. I was very surprised that I had one. (It wasn't big; but any at all is a BIG RED FLAG.)

Are you reluctant to "critically examine" your beliefs because it just sounds like a lot of work? (Counselors will say, 'let's work on this' and then an hour later when you feel like an exposed mess of emoted goo and they'll say, 'OK, see you next week.')

No. At least, not how I think of "a lot of work." I certainly avoid some topics because they are a lot of work but this isn't one of them.

Given that you're comfortable with your beliefs, perhaps you're reluctant to expose your beliefs because it'll be like throwing them to the wolves. If not indiscriminate slaughter (no offense to the more militant atheists here), it'll still be something like 12 to 1.

Nah. I am reluctant to expose my beliefs because that is a lot of work. I am too verbose for my own good and have a hard time not responding to every single comment or question.

Well, if you ever decide to do this, if it helps, I offer to help you defend your views to the extent that I can competently do so.

Hmm... how is this different than the clever arguer in The Bottom Line? Honestly, I won't need help defending my views. If I cannot defend them, why should you? The goal in talking about my beliefs wouldn't be defense and offense oriented (at least, not for me). Seeking the truth is not (or shouldn't be) a war.

Replies from: byrnema
comment by byrnema · 2010-01-30T13:32:25.859Z · LW(p) · GW(p)

OK, you don't sound afraid or like you'll want help.

You seem more self-possessed than I am. (This could be related to gender.) When I was arguing for theism, I felt like the inferential distance was great and that there were too many angles to parry at once. I would have been grateful for an interpreter/mediator.

I was most uncomfortable when people speculated about my motives, often with motives I couldn't relate to. I felt more flubbed by identity issues than atheist arguments (which I find I like well enough when they're relevant).

Seeking the truth is not (or shouldn't be) a war.

I think there is one, out there. A war of world views. LW is a sandbox where we can see how different angles and themes will play out once physical materialism becomes more mainstream.

Hmm... how is this different than the clever arguer in The Bottom Line?

My impression of the origin of due process is that the designers of the legal system were well aware of "the clever arguer" and thought the only remedy was to even the playing field.

Replies from: MrHen
comment by MrHen · 2010-01-30T16:40:50.221Z · LW(p) · GW(p)

You seem more self-possessed than I am. (This could be related to gender.) When I was arguing for theism, I felt like the inferential distance was great and that there were too many angles to parry at once. I would have been grateful for an interpreter/mediator.

I wouldn't sell your gender short. I have been doing this sort of arguing for a long time so I kind of know what to expect. The idea of an interpreter is actually significantly more interesting to me than a defender. Perhaps I misunderstood your original intent.

I was most uncomfortable when people speculated about my motives, often with motives I couldn't relate to. I felt more flubbed by identity issues than atheist arguments (which I find I like well enough when they're relevant).

I can understand that. I think I am approaching this from a different angle than you did; we'll see how it goes. :)

I think there is one, out there. A war of world views. LW is a sandbox where we can see how different angles and themes will play out once physical materialism becomes more mainstream.

I think people are fighting each other and they keep trying to dig up a war so they can tell other people to fight for them. Christianity loves to talk about this war of ideas. I am not convinced such a war needs to exist and have decided not to partake. When it comes to the bottom line, I choose what I believe. I take the evidence and come to a conclusion and move forward. The war just isn't interesting to me.

My impression of the origin of due process is that the designers of the legal system were well aware of "the clever arguer" and thought the only remedy was to even the playing field.

Near the end of What Evidence Filtered Evidence?, EY says something similar.

My impressions of the community so far have been good. The vague confession didn't really draw a lot of heat and people were very kind when asking for more details. So all signs point to good things ahead.

That being said, I would still love your input when the time comes. I just don't want you to feel like you have to pick sides. I'm not picking a side and it'll be my beliefs on the table.

comment by MrHen · 2010-01-30T07:02:30.547Z · LW(p) · GW(p)

Okay, I finished it tonight. I should warn you that the rest of this is significantly less entertaining. It is longer and less focused/more rambling. Since I read all of your replies it was hard to keep you guys out of my head... there is one part I self-censor and a few places I drift off track. There were a few interruptions as well. They are easily marked. As it is with interruptions, things don't pick up exactly where they left off. (At least one had extremely unfortunate timing.)

If there was a spoiler tag so I could auto-collapse this that would be great. If not, such a feature would be nifty. (Or possibly auto-collapsing comments after a certain length.)

Hopefully someone gets some use out of this. There is a single paragraph summary near the end if all you care about is the result.

It may take a few edits to find all the formatting typos. If you notice one let me know.


There is a wall. That belief isn't accessible through critical examination.
If it were, would you examine it?
I don't know.
You don't know, or you don't want to know?
What difference does it make if I can't examine it anyway?
Because you may be able to examine it and you are lying to yourself about not being able to.
Oh.

So... am I able to examine the wall around Theism?
Let's start with Theism. Ignore the wall.
Okay, but first we need to decide how much of this is public.
Hmm... okay. What wouldn't be?
Event X.
Okay... anything else?
Specific beliefs, I suppose.
Okay, start with Theism. What in Theism is private?
Should we even bother keeping this private?
Honestly, this is a waste of time. Why is Theism inaccessible?
Because of event X.
And that's it? Is that the only thing?
Well, yeah.
So imagine event X disappearing. It is gone; event X never happened.
Okay...
Are you scared?
No.
Why not?
Well, event X is why my emotions are even here... without X, why would I fear anything?
Okay. Imagine event X and still believing in Theism. Is it possible?
Huh. Okay, that will take a while.
No rush.
...
No. It doesn't make sense.
Why not?
Undoing event X precludes abandoning Theism.
No it doesn't; it is just the most likely result of Theism if you undo event X.
Well, okay, sure, but if I undo X and keep Theism...
It would suck.
It would suck.
So... what does that say about Theism?
Nothing. It says something about X.
Bah, we are way off topic.
And we did this once.
Okay, starting again, why is Theism inaccessible?
Man, this sucks. I don't see how we can do this without talking about X.
X doesn't matter.
Yes it does. And no one is going to want to read this.
So? This isn't for them. It's for you and they asked for it.
They didn't ask for this-
Anyway, this is irrelevant. Stay on topic.
The topic is X!
No it isn't. The topic is Theism.
...
I don't even know how to explain X-
Theism!
Grr...
If you tried, right now, to critically examine Theism without undoing X, what would happen?
[interruption from wife]


We still aren't getting anywhere. If you tried, right now, to critically examine Theism without undoing X, what would happen?
*sigh*
Okay, are all areas of Theism inaccessible?
No.
Name an area that is accessible.
The omni- attributes.
So critically examine those.
Here?
Well, no. But does it make you scared?
No.
Have you critically examined them?
Yeah. But not a whole lot.
Why not?
Because they don't matter that much.
Matter... how?
My behaviors won't change.
Why not?
Because I don't treat God as if he has any of those attributes.
Why not?
Because they failed the critical examination.
Okay... so how much have you examined?
Enough to know I cannot proceed unless I deal with X.
Argh!
Look, it's not my fault. You know why.
Yeah, but how do we tell them that?
We don't. Why do we need to tell them anything?
...
No, seriously, we don't need to tell them anything. And none of this has anything to do with fearing critical examination.
So do you fear critical examination?
Not the examination I have done.
Can you do more?
Absolutely.
Then why don't you?
Because my tools suck. I want better tools.
And when you get better tools?
Then I work on the framework of belief.
And then?
I make sure the new beliefs coming in are solid and useful.
And then?
Then I look at my old beliefs.
Which ones?
The ones affecting everyday behavior; then the ones affecting monthly choices, yearly, and so on.
Why not start with the bigger ones?
Because they are built on smaller ones.
Really?
Uh, yes?
How do you know?
Where is this going?
Answer the question.
Hmm...
...
Okay, something has to drive the bigger choices.
Like Theism.
No, Theism is a bigger belief.
That's what I meant.
Oh, okay. Yeah, like Theism. Theism is something that affects a larger scope of actions than others.
So why focus on the small stuff?
Because the small stuff is easier to attach to Reality.
Okay, that makes sense. Give me an example.
Assuming my tools work well, the way I spend my daily time.
Sure, that makes sense. And then?
The subjects to spend the time on.
Okay.
I suspect that Theism will hit at this point.
Right. And are you scared of that?
No.
Why not?
Because it is so far out I cannot predict anything about it. Even if I feared losing Theism, I have no reason to think I will drop theism from critical examination.
Okay. But do you fear losing Theism?
Well, sure. What was the original question?

If you don't expect to lose it, why are you so scared of critically examining it?

[interrupted by the show]


Okay, so I fear losing Theism but the remaining question is whether I am scared of critically examining it.
First, do I even accept the first part of this question: "If you don't expect to lose it..."
Yes, I said that clearly.
So if you were to lose it, would it be through critical examination?
Yeah, probably.
So critical examination is the most likely way to lose Theism.
Yes.
And I fear losing Theism.
In the sense that I fear not having it.
So the most likely path to this end is through critical examination.
Yes.
Does that make you fear critical examination?
No. If anything, I fear what it might do.
Would that prevent you from the examination?
If the fear was strong enough... sure.
Is it strong enough?
No. I have critically examined areas of my Theism.
But those really weren't core aspects. They would never attack Theism, only particular beliefs inside of Theism.
Which brings us back to the wall around Theism.
Right, so we are back in the same place.
Well, what have we learned?

  • I fear not having Theism for various reasons
  • I am not ready to critically examine Theism
    -- Event X
    -- Higher priorities (better tools, incoming beliefs, beliefs that are "closer" to Reality)
  • Theism will eventually be critically examined
  • When this happens, I do not expect Theism to fall
  • If Theism is untrue I will want to know it is untrue
  • I still fear not having Theism even if it is untrue
  • The fear has little to do with belief and more to do with the fallout of not believing

[interrupted]


So the direct answer to the question is that I am not critically examining Theism because (a) I don't expect significant progress and (b) doing other things will likely improve my ability to critically examine things which will eventually be useful with Theism.

Followup questions for a future time:

  • Completing analysis of the list of potential fears. I only looked at one.
  • Looking at the list of reasons I might fear critical examination. I ended up taking a completely different route to the conclusion... so most of this was extraneous.
  • Convenient Ignorance is still an interesting topic. Is there a full post here?
  • How does Belief in Belief work with beliefs that are self-referential and dictate morality? Should it be a red flag when a belief includes the clause, "And believing this belief is good"? Hunches say yes.
  • I didn't really define the wall around Theism.
  • At some point, I will probably need to explain and define Event X. I expect this to be troublesome and slightly awkward. I apologize if this vagueness annoys you; I do not apologize for being vague.
  • This question was never directly answered: "If you tried, right now, to critically examine Theism without undoing X, what would happen?" It would be good to revisit.
  • The actual priority list could use a good examination.
  • This sentence may be touching a bigger topic: "Because the small stuff is easier to attach to Reality." Something connected to that would provide enough material for a full post. It is likely someone has already posted it... so start with a search.
  • In the meantime, whilst not examining Theism, what is the correct way to act?
Replies from: JGWeissman, tlhonmey
comment by JGWeissman · 2010-02-01T22:07:48.236Z · LW(p) · GW(p)

I still fear not having Theism even if it is untrue

Why?

How has this affected your thinking?

Replies from: MrHen
comment by MrHen · 2010-02-01T22:25:39.021Z · LW(p) · GW(p)

There are impacts from not having Theism. The most obvious are social. Most of the others are easy enough to deal with. There is also a really, really vague one that I haven't figured out how to talk about yet.

Sorry there isn't more information being offered here.

I don't understand your second question.

Replies from: JGWeissman
comment by JGWeissman · 2010-02-01T22:37:30.957Z · LW(p) · GW(p)

I don't understand your second question.

Would your belief in theism be different if you did not have a fear of losing the belief even if not true? To what extent does this fear compete with your desire for accurate beliefs?

Replies from: MrHen
comment by MrHen · 2010-02-01T22:47:15.095Z · LW(p) · GW(p)

Ah, okay. Bullet point answers:

  • IF Assuming Theism was not true
  • AND Assuming no fear of losing Theism if Theism was not true
  • THEN I would drop Theism as soon as I convinced myself it wasn't true

Other variations on the above format:

  • IF Assuming Theism was not true
  • AND Assuming there is fear of losing Theism if Theism was not true
  • THEN I would drop Theism as soon as I convinced myself it wasn't true
  • ONLY IF I overcame my fear of losing Theism.

I would expect convincing myself Theism isn't true would be harder than overcoming my fear of losing Theism. This leads into your question:

To what extent does this fear compete with your desire for accurate beliefs?

You are implying a scenario more like the following:

  • IF Assuming Theism was not true
  • AND Assuming there is fear of losing Theism if Theism was not true
  • THEN I would convince myself Theism wasn't true
  • ONLY IF I overcame my fear of losing Theism.

Which is a subtle but important difference. I like to think that my fear wouldn't cloud my ability to perceive the truth... but I don't actually know how to verify that. Signs seem to point the exact opposite way, in fact.

I suppose one solution would be to lessen my fear of losing Theism, which seems to be the route pjeby suggested in another comment.

comment by tlhonmey · 2022-05-18T21:08:50.854Z · LW(p) · GW(p)

One thing that occurs to me while reading this is that for most people, their religion consists nearly entirely of cached beliefs.  Things they believe because they were told, not because they derived the result themselves.

This makes any truly critical examination of one's religious beliefs rather a daunting task.  To start with, you're going to have to recompute potentially thousands of years of received wisdom for yourself.  That's...  A lot of work.  There's a reason we cache beliefs, otherwise it would take a lifetime just to be minimally educated.

And then there's the bigger one that I think most of the other commenters have glossed over.  Recomputing your religion into self-consistency can be scary because if you recognize that previous generations were no less intelligent and no less searching for truth than you are, then there is a not-insignificant chance that your recalculations will introduce more errors than they correct.  That would be bad.

On the other hand, if nobody ever grinds through all the equations again, then any bad values that slipped in somewhere never get caught.  At some point the balance of old mistakes vs. potential new mistakes tips in your favor.

My personal strategy is that, when there's a contradiction, recompute until it is resolved without creating any new ones.  If you can't, flag it as a hole in your model and keep your eyes open for a better fit.  

Obviously, any religion that prohibits honest questioning should be laughed off the face of the Earth.

comment by pjeby · 2010-02-01T20:58:29.856Z · LW(p) · GW(p)

I find this self-dialog very interesting; in certain aspects it resembles the sort of self-dialog I teach people to use to throw off more mundane fears and mental/emotional blocks, outdated moral injunctions, etc.

There are a few places in what you're doing where a more focused approach would be helpful, though. For example, I would define an outcome and a test procedure: what are you attempting to change, and how will you know if you changed it? This alone will help you trim distractions a bit.

Also, generally speaking, the key to getting rid of an irrational belief is to clearly identify the past negative consequences associated with disbelief in that belief. Your expectations of what will happen in the future are usually either an irrelevant speculative extrapolation by your logical mind, or a simple projection from emotional memory... And it's the latter category that's relevant, as long as you focus on identifying the "near", sensory memory of the events your future prediction is based on.

In particular, you are looking for memories involving the loss of either personal status/significance, the loss of connective bonding, the loss of perceived safety, or the loss of available novelty/stimulation, (with these latter two being far less common), associated with either you or someone else failing to believe (or act upon) the belief in question.

The neurological phenomenon known as "reconsolidation" explains why access to the original memory is useful; it's simplest to remove an emotional attachment to a thought or belief by reinterpreting the original memory that triggers the emotions, rather than to build elaborate reroutings of thought "downstream" of the source.

Once you've identified the specific memory you're using to form your emotional/intuitive judgment (creating the fear), you can use further questioning to cast doubt on your original interpretation of events, consider other possible interpretations, wonder whether the situation is different, etc... and in the process, this sets up alternative lines of thought linked from the original memory, allowing you to have a different emotional probability distribution, so to speak.

I'm being necessarily terse here, as I know of at least two whole books that have been written on minor variations of this basic process: "Loving What Is" by Byron Katie, and "Recreating Your Life" by Morty Lefkoe, each proposing a different sequence and set of questions, but essentially following the same general process I've just described. I've also done workshops on my own set of variations, with slightly different scopes of applicability than either of their methods.

Either book, however, is quite good with respect to having lots of example dialogues that show how to apply their processes in practice, and either one would, I think, be helpful in focusing your approach to this or any other attempt to change an emotional belief or judgment.

Replies from: MrHen
comment by MrHen · 2010-02-01T21:58:26.040Z · LW(p) · GW(p)

There are a few places in what you're doing where a more focused approach would be helpful, though. For example, I would define an outcome and a test procedure: what are you attempting to change, and how will you know if you changed it? This alone will help you trim distractions a bit.

Agreed. I posted Easy Predictors as an attempt to get input from the community about easy-to-test predictor beliefs, but didn't get much of a response. I am keeping track of smaller things that have quick turnaround times to see if it is possible to do this sort of thing informally.

This does not apply to outcomes of belief creation, however. Is there a good way to test things like that? Or am I misinterpreting your suggestion? Or... ?

The rest of your comment is interesting to me because it directly focuses on the prediction of trauma due to dropping Theism (and related subjects). I hadn't really thought about the details of the fallout beyond key trouble spots. Is this a fair two-sentence reduction of your suggestions?

Looking at similar past events that carry the same emotional trauma due to dropped beliefs can give me the ability to question the validity of my fear of the future by comparing and contrasting the differences. In addition, this process may reveal a solution to the projected trauma by preventing it from happening or weakening its impact.

Am I close?

Replies from: orthonormal, pjeby
comment by orthonormal · 2010-02-02T01:56:04.778Z · LW(p) · GW(p)

Mr. Hen, I'm going to break custom and say something that may be regarded as poisoning the well. It's my conclusion that P.J. Eby is more or less a quack trying to drum up support for his psychological services, and that (in such an important matter as this) you shouldn't be trying to understand his jargon, let alone trying to take his advice.

His persistent trumpeting of perceptual control theory, which couples grandiose claims of precision with a complete lack of experimental support, is telling, and it's not the only red flag I've seen...

Replies from: MrHen, pjeby, wedrifid
comment by MrHen · 2010-02-02T02:42:58.463Z · LW(p) · GW(p)

I am still willing to at least listen and dialog with pjeby, but I find it interesting that this comment is at +3 so quickly. Thank you for the warning (and concern). It does have an impact. (The karma swing helped.)

comment by pjeby · 2010-02-02T04:10:21.287Z · LW(p) · GW(p)

trying to drum up support for his psychological services

Right, that's why I recommended two books written by other people. You have brilliantly exposed my clever scheme:

  1. Offer assistance, while recommending books by other authors
  2. ????
  3. Profit!!!
comment by wedrifid · 2010-02-02T09:29:51.693Z · LW(p) · GW(p)

Mr. Hen, I'm going to break custom and say something that may be regarded as poisoning the well.

I should note, now that the parent is at -1, that my vote does not represent disapproval of well poisoning, just disagreement in this instance. Pjeby's practical advice seems well founded to me and I believe it will benefit those willing to receive it.

I probably agree with you when it comes to the rigid use of PCT models and of his custom jargon. I find PJ's practical experience more useful than his abstract theorizing. I would not vote except that, as you say, the matter is important. Even more so when someone's reputation is at stake.

comment by pjeby · 2010-02-02T04:01:18.677Z · LW(p) · GW(p)

This does not apply to outcomes of belief creation, however. Is there a good way to test things like that? Or am I misinterpreting your suggestion? Or... ?

I mean that if you're going to go digging around your head to change something, it would be best to have a criterion by which you can judge whether or not you've succeeded. Otherwise, you can rummage around in there forever. ;-)

An example criterion in this case might be "Thinking about not believing in God no longer causes an emotional reaction, as evidenced by my physical response to a specific thought about that."

Defining a test in this way -- i.e., observing whether your (repeatable) physical reaction to a thought has changed -- allows you to determine whether any particular approach has succeeded or failed. I suggested the two books I did because I have found it relatively easy to produce such repeatable, testable results with their techniques, once I got the hang of paying attention to my sensory responses to the questions asked, and ignoring my logical/abstract ones. (Since changing one's logical beliefs isn't the hard part.)

The rest of your comment is interesting to me because it directly focuses on the prediction of trauma due to dropping Theism (and related subjects). I hadn't really thought about the details of the fallout beyond key trouble spots. Is this a fair two-sentence reduction of your suggestions?

Looking at similar past events that carry the same emotional trauma due to dropped beliefs can give me the ability to question the validity of my fear of the future by comparing and contrasting the differences. In addition, this process may reveal a solution to the projected trauma by preventing it from happening or weakening its impact.

No, what I'm saying is that your projection is based on some specific, sensory experience(s) you had, like for example your parents speaking disparagingly about atheists, or other non-followers of your parents' belief system. At some point, to feel threatened by being outcast, you had to learn who the outgroups were, and this learning is primarily experiential/emotional, rather than intellectual, and happens on a level that bypassed critical thought (e.g. because of your age, or because of the degree of emotion in the situation).

Identifying this experience and processing it through critical thought weakens the emotional response triggered by the thought, and then gives you the ability to think rationally about the subject again... thereby leading to potential solutions. Right now, the fear response paralyzes your critical and creative thinking, making it very hard to see what solutions may be in front of you.

IOW, your prediction of trauma comes from a past trauma -- our brains don't come with a built-in prior probability distribution for what beliefs will cause people to like or not like us. ;-) If you want to switch off the fear, you have to change the prediction, which means changing the probability data in your memory... which means accessing and reinterpreting the original sensory experience data.

In order to find this information, you focus on the sensory portion of your prediction, prior to verbalization. That is, when you ask, "What bad thing is going to happen?" refrain from verbalizing and pay attention to the images, feelings, and general impressions that arise. Then, let your mind drift back to when you first saw/felt/experienced something like that.

A recent personal example: I discovered yesterday that the reason I never gave my software projects a "1.0" version is that I was afraid to declare anything "finished" or "complete"... but the specific reason was that when I did chores as a kid, or cleaned my room, my mother found faults and yelled at me. Emotionally, I learned that as long as someone else could possibly find a way to improve it, I was not allowed to call it "finished", or I would be shamed (status reduction).

Until I uncovered this specific way in which I came by my emotional response, all my conscious efforts to overcome this bad habit were without effect. The emotion biased my conscious thoughts in such a way that I really and truly sincerely believed that my projects were not "finished"... because the definition I was unconsciously using for "finished" didn't allow me to be the one who declared them so.

But having specifically identified the source of this learning, it was trivial to drop the emotional response that drove the behavior... and immediately after doing so, I realized that there were a wide variety of other areas in my life affected by this bias, that I hadn't noticed before.

Most psychological discussion of fears tends to focus on the abstract level, i.e. obviously I was afraid to declare things finished, for "fear of criticism". But that abstract knowledge is almost entirely useless for actually changing the feelings, and therefore removing the bias. Mostly, what such abstract knowledge does is sometimes allow people to spend a lifetime trying to work around or compensate for their feeling-driven biases, rather than actually changing them.

And that's why I urge you to focus on specific sensory experience information in your dialoging, and treat all abstract, logical, or verbally sophisticated thoughts that arise in response to your questions as being lies, rumor, and distraction. If your logical abstract thoughts were actually in charge of your feelings, you'd already be done. Save 'em till the bias has been repaired.

Replies from: pjeby, MrHen, wedrifid
comment by pjeby · 2010-02-02T04:09:14.864Z · LW(p) · GW(p)

IOW, your prediction of trauma comes from a past trauma -- our brains don't come with a built-in prior probability distribution for what beliefs will cause people to like or not like us. ;-) If you want to switch off the fear, you have to change the prediction, which means changing the probability data in your memory... which means accessing and reinterpreting the original sensory experience data.

Btw, the Iowa Gambling Task is an example of a related kind of unconscious learning that I'm talking about here. In it, people learn to feel fear about choosing cards from a certain deck, long before their conscious mind notices or accounts for the numerical probabilities involved. Then, their conscious minds often make up explanations which have little if any connection to the "irrational" (but accurate) feeling of fear.

So if you seem to irrationally fear something, it's an indication that your subconscious picked up on raw probability data. And this raw probability data can't be overridden by reasoning unless you integrate the reasoning with the specific experiences, so that a different interpretation is applied.

For example, suppose there's someone who always looks away from you and leaves the room when you enter. You begin to think that person doesn't like you... and then you hear they actually have a crush on you. You have the same sensory data, but a different interpretation, and your felt-response to the same thoughts is now different. Voila... memory reconsolidation, and your thoughts are now biased in a different, happier way. ;-)

comment by MrHen · 2010-02-02T04:40:43.543Z · LW(p) · GW(p)

No, what I'm saying is that your projection is based on some specific, sensory experience(s) you had, like for example your parents speaking disparagingly about atheists, or other non-followers of your parents' belief system. At some point, to feel threatened by being outcast, you had to learn who the outgroups were, and this learning is primarily experiential/emotional, rather than intellectual, and happens on a level that bypassed critical thought (e.g. because of your age, or because of the degree of emotion in the situation).

Okay, that makes sense. My initial reaction is that the fear has less to do with people's reactions to me and more with the amount of change in the actions I take. Their responses to these new actions are more severe than the reactions I'd expect simply from my dropping Theism.

But the more I think about it the more I think that this is just semantics. I'll give your suggestion a shot and see what happens. I am not expecting much but we'll see. The main criticism that I have at this point is that my "fears" are essentially predictions of behavior. I do not consider them irrational fears...

So if you seem to irrationally fear something, it's an indication that your subconscious picked up on raw probability data. And this raw probability data can't be overridden by reasoning unless you integrate the reasoning with the specific experiences, so that a different interpretation is applied.

Ah, okay, this part relates to the trigger of dealing with the initial reaction to the questions being asked.

My personal solution for this style of fear (which is separate from the fear of future social reactions, a distinction I understand may not have been obvious) is the same as my pattern of behavior relating to pain tolerance. It goes away if I focus on it in just the right way.

By the end of the week I expect to be able to return to the topic without any overt hindrances. I take this to mean either that the fear is gone or that I am so completely self-deluded that the magic question no longer means the same thing as it did when it was first asked. I prefer to think it is the former.

Replies from: pjeby
comment by pjeby · 2010-02-02T05:06:32.018Z · LW(p) · GW(p)

My initial reaction is that the fear has less to do with people's reactions to me and more with the amount of change in the actions I take. Their responses to these new actions are more severe than the reactions I'd expect simply from my dropping Theism.

I was just giving an example. The key questions are:

  1. What is the trigger stimulus? and
  2. What is the repeatable, observable reaction you wish to change?

In what you said above, the trigger is "thinking about what I'd do if I were not a theist", and you are using the word "fear" to describe the automatic reaction.

I'm saying that you should precisely identify what you mean by "fear" - does your pulse race? Palms sweat? Do you clench your teeth, feel like you're curling into a ball, what? There are many possible physical autonomic reactions to the emotion of fear... which one are you doing automatically, without conscious intent, every time you contemplate "what I'd do if I were not a theist"?

This will serve as your test - a control condition against which any attempted change can be benchmarked. You will know you have arrived at a successful conclusion to your endeavor when the physiological reaction is extinguished - i.e., it will cease to bias your conscious thought.

I consider this a litmus test for any psychological change technique: if it can't make an immediate change (by which I mean abrupt, rather than gradual) in a previously persistent automatic response to a thought, it's not worth much, IMO.

But the more I think about it the more I think that this is just semantics.

Focus on what the stimulus and response are, and that will keep you from wandering into semantic questions... which operate in the verbal "far" mind, not the nonverbal "near" mind that you're trying to tap into and fix.

This is one of those "simple, but not easy" things... not because the technique itself is hard to do, but because it's hard to stop doing the verbal overshadowing part.

We all get so used to following our object-level thoughts, running in the emotionally-biased grooves laid down by our feeling-level systems, that the idea of ignoring the abstract thoughts to look at the grooves themselves seems utterly weird, foreign, and uncomfortable. It is, I find, the most difficult part of mindhacking to teach.

But once you get used to the idea that you simply cannot trust the output of your verbal mind while you're trying to debug your pre-verbal biases, it gets easier. During the early stages though, it's easy to be thinking in your verbal mind that you're not thinking in your verbal mind, simply because you're telling yourself that you're not... which in hindsight should be a really obvious clue that you're doing it wrong. ;-)

Bear in mind that your unconscious mind does not require complex verbalizations (above simple if-then noun-verb constructs) to represent its thought processes. If you are trying to describe something that can't be reduced to "(sensory experience X) is followed by (sensory experience Y)", you are using the wrong part of your brain - i.e., not the one that actually contains the fear (or other emotional response).

comment by wedrifid · 2010-02-02T09:35:20.847Z · LW(p) · GW(p)

IOW, your prediction of trauma comes from a past trauma -- our brains don't come with a built-in prior probability distribution for what beliefs will cause people to like or not like us.

The brain doesn't need past trauma in this instance. Our brains do come with a built-in prior probability distribution for what will happen when you become an apostate, rejecting the beliefs of the tribe in which you were raised.

Replies from: pjeby
comment by pjeby · 2010-02-02T14:44:40.405Z · LW(p) · GW(p)

Our brains do come with a built-in prior probability distribution for what will happen when you become an apostate, rejecting the beliefs of the tribe in which you were raised.

Ahem. We are adaptation executers, not fitness maximizers. Our brains come with a moral mechanism that's been shaped by that probability distribution, but they don't come with that specific prediction built in at an object level.

Instead, we simply learn what behaviors cause shaming, denunciation, etc., and this then triggers all the conscious shame/guilt/etc., as well as the idealizing, moralizing, punishing others, and punishing of non-punishers... with all of these actions being more highly-motivated in cases where the behavior is desirable to the individual involved.

Professing or failing to profess certain beliefs is just one minor case of "behavior" that can be regulated by this mechanism. I have not observed anything that suggests there is a mechanism specific to religious beliefs or even beliefs per se, distinct from other kinds of behavior. There is little difference between an injunction to say some belief is true or good, and an injunction to always say thank you, or to never brag about yourself. (Or my recently discovered injunction not to say something is finished!)

All of these are just examples of verbal behavior that can be regulated by the same mechanism. (In any case, MrHen has already pointed out that the fear is less about his stating new beliefs than it would be about acting on them.)

Anyway, it seems to me that we have only one "moral injunction" apparatus that is applied generically, and the feelings that it generates do not contain any information about being kicked out of the tribe or failure to mate, etc. Instead, the memory of a shaming event is itself the bad prediction or negative reinforcer. Adaptation execution FTW, or more like FTL in this case at least.

Replies from: wedrifid
comment by wedrifid · 2010-02-02T15:45:31.849Z · LW(p) · GW(p)

Adaptation execution FTW, or more like FTL in this case at least.

That isn't the issue here. Yes, adaptation execution, Woohoo!! Obviously the probability distribution for expected consequences isn't built in to the amygdala.

I nevertheless assert that the universal human aversion to changing our fundamental signalling beliefs is more than just Mommy Issues filtered through PCT. Human instinctive responses are sophisticated and a whole lot of them are built in, no shaming required. We're scared of spiders, snakes and apostasy. They're adaptations right there in the DNA.

Replies from: pjeby
comment by pjeby · 2010-02-02T16:43:54.516Z · LW(p) · GW(p)

We're scared of spiders, snakes and apostasy.

Er, research please. Everything I've seen shows that even monkeys have to learn to fear snakes and spiders - it has to be triggered by observing other monkeys being afraid of them first.

I nevertheless assert that the universal human aversion to changing our fundamental signalling beliefs is more than just

Occam's razor says you're more likely to be wrong than I am: a general purpose mechanism for conditioning verbal behavior is more than sufficient to produce the results we observe, especially if you consider internal verbal thinking a form of verbal behavior -- which it pretty plainly is.

For example, this provides a simpler mechanism for "belief in belief", than your proposal of a distinct mechanism. It allows us to "believe" - i.e. consistently say we believe (even to ourselves on the inside), when in fact we don't.

[edited to delete unfair rhetoric of my own]

Mommy Issues filtered through PCT.

FWIW I said nothing about PCT, nor did I say that a parent had to be the one delivering the shame. If your own personal bias about me is such that you can't avoid engaging in this type of rhetoric, perhaps you should consider giving yourself some cooling off time before you reply.

Replies from: Cyan, wedrifid
comment by Cyan · 2010-02-02T16:56:58.889Z · LW(p) · GW(p)

I'll gently ignore the part where I've logged a lot more time with a lot more people, working on this type of belief than you have, making testable behavior changes.

Proslepsis!

Replies from: ciphergoth, pjeby
comment by Paul Crowley (ciphergoth) · 2010-02-02T17:19:33.450Z · LW(p) · GW(p)

Now now, you can't have points for that twice!

Replies from: Cyan
comment by Cyan · 2010-02-02T17:20:44.218Z · LW(p) · GW(p)

But it worked so well the first time! Aww.

comment by pjeby · 2010-02-02T18:40:07.595Z · LW(p) · GW(p)

Oops. I actually intended to delete that, because I felt it was the same sort of unfair rhetoric as I was accusing wedrifid of. Thanks for bringing it to my attention.

comment by wedrifid · 2010-02-03T00:55:01.981Z · LW(p) · GW(p)

Er, research please. Everything I've seen shows that even monkeys have to learn to fear snakes and spiders - it has to be triggered by observing other monkeys being afraid of them first.

I was quoting Steven Pinker but my copy is an audio book so I can't give you the specific references to the study he mentions. A simple google search brings up plenty of references. (Google gives popularised summaries. Follow the links provided therein to find the actual research.)

Your claim mentions 'everything you have seen'. Given that contradictory reports are so freely available, and given your confidence in the model you are asserting, I would have expected you to have a somewhat broader exposure to the relevant science.

For example, this provides a simpler mechanism for "belief in belief", than your proposal of a distinct mechanism. It allows us to "believe" - i.e. consistently say we believe (even to ourselves on the inside), when in fact we don't.

Skinner had a similar 'simple' theory. But he was wrong. Not wrong because the mechanisms he described weren't important parts of human psychology but wrong because he asserted them to the exclusion of all else.

I'll gently ignore the part where I've logged a lot more time with a lot more people, working on this type of belief than you have, making testable behavior changes.

I believe you can make testable behavior changes and your work with clients impresses me. I also believe you could change people to be less afraid of, for example, heights. Nevertheless, I would not necessarily believe your report on how these anxieties came into being. People can be afraid of heights even if they didn't make a habit of falling off cliffs in their childhood.

If your own personal bias about me is such that you can't avoid engaging in this type of rhetoric, perhaps you should consider giving yourself some cooling off time before you reply.

I have a strong bias for you, PJ, in all but your tendency to be quite rigidly minded when it comes to forcing reality into your simple models. I allow myself to vocally reject the parts of your comments that I disagree with because that way I will not be dismissed as a fanboy when I speak in your defense. You aren't, for example, a quack, and your advice, experience, and willingness to share it are invaluable. I also, for what it is worth, find PCT to be a useful way of describing the dynamics of human behavior much of the time.

Replies from: pjeby
comment by pjeby · 2010-02-03T05:54:47.610Z · LW(p) · GW(p)

I was quoting Steven Pinker but my copy is an audio book so I can't give you the specific references to the study he mentions. A simple google search brings up plenty of references. (Google gives popularised summaries. Follow the links provided therein to find the actual research.)

Perhaps I'm missing something, but I don't see where it says we're all automatically afraid of snakes. I have seen research that monkeys have an inbuilt ability to learn to fear snakes, but the mechanism has to be switched on via learning, and my understanding is that humans are the same way... unless you are arguing that individual variations in fear of snakes are purely determined by genetics.

[Edit to add: one of the first papers you linked to includes this quote: "For studies of captive primates, King did not find consistent evidence of snake fear." And the second page goes on to describe the very "they have to learn to fear snakes" research that I previously spoke of.]

Given that contradictory reports are so freely available, and given your confidence in the model you are asserting, I would have expected you to have a somewhat broader exposure to the relevant science.

I think perhaps we are miscommunicating: I do not deny that primate brains contain snake detectors. I do deny that said detectors are unaffected by learning: humans and monkeys can and do learn which snakes to fear, or not fear.

Skinner had a similar 'simple' theory. But he was wrong. Not wrong because the mechanisms he described weren't important parts of human psychology but wrong because he asserted them to the exclusion of all else.

We seem to be miscommunicating again. What mechanism is it that you think I am asserting "to the exclusion of all else"? The model I personally use contains several mechanisms, and the moral injunctions aspect I spoke of here is only one such mechanism. It is certainly not the only relevant mechanism in human behavior, even in the relatively narrow field of applicability where I use it.

People can be afraid of heights even if they didn't make a habit of falling off cliffs in their childhood.

I don't do classical phobia work, actually, so I wouldn't have a valid opinion on that one, one way or the other. ;-)

Nevertheless, I would not necessarily believe your report on how these anxieties came into being.

It's certainly true that, in order to reach scientific standards, I would need to find a way to double-blindly substitute a placebo version of childhood memories for the real thing, to prove that it's the modification of the memory that makes it work. (I have occasionally tested single-blind placebo substitutions on other things, but not this, as I have no idea what I could substitute.)

Mainly, what I do to test alternative hypotheses regarding a change technique is to see what parts of it I can remove, without affecting the results. Whatever's left, I assume has some meaning. (Side note: most published descriptions of actually-working self-help techniques contain superfluous steps, that, when removed, tend to make each technique sound like a mere minor variation on one of a handful of major themes... which I expect to correspond to mechanisms in the brain.)

In the instant discussion of moral injunctions, examining the memory of the learning or imprint experience appears to be indispensable, and therefore I conclude (hypothesize, if you prefer) that these memories are an integral part of the process of formation of moral injunction-regulated behavior.

I have a strong bias for you PJ, in all but your tendency to be quite rigidly minded when it comes to forcing reality into your simple models.

FWIW, I do not claim universal applicability of my models outside their target domain. However, within that target domain, most discussions here tend to have only vaporous speculation weighing against many, many tests and observations. When someone proposes a speculative and more complex model than one I am already using, I want to see what their model can predict that mine cannot, or vice versa.

If you have a more parsimonious model for "belief in belief" than simple moral injunctions regarding spoken behavior, I'd love to see it. But since "belief in belief" cleanly falls out as a side effect of my model, I don't see a reason to go looking for a more complicated, special-purpose belief module, just because there could be one. Should I encounter a client who needs a belief-in-belief fixed, and find that my existing model can't fix it, then I will have reason to go looking for an updated model.

Now, when I do see a more parsimonious model here than one I'm already using, I adopt it wholeheartedly. For all that people seem to frame me as having brought PCT to Lesswrong.com, the reverse is actually true:

lesswrong is where I heard about PCT in the first place!

And I adopted it because it fit very neatly into my existing model... it was as though my model was a graph with lots of edges, but no nodes, and PCT gave me a paradigm for what I should expect "nodes" to look like. (And incorporating it into my model also subsequently allowed me to discover a new kind of "edge" that I hadn't spotted previously.)

So actually, I don't consider PCT to be a comprehensive model in itself either, because it lacks the "edges" that my own model contains!

Which makes it a bit frustrating any time anyone acts as though I 1) brought PCT to LW, and 2) think it's a cure-all or even a remotely complete model of human behavior... it's just better than its competitors, such as the Skinnerian model you mentioned.

I allow myself to vocally reject the parts of your comments that I disagree with because that way I will not be dismissed as a fan boy when I speak in your defense.

Great. I would appreciate it, though, if you would not use boo lights like "mommy issues" and "PCT" (which, sadly, seems to have become one around these parts), especially when the first is a denigratory caricature and the second not even relevant. (Moral injunctions are an "edge" in my own model, not a "node" from PCT.)

Replies from: wedrifid
comment by wedrifid · 2010-02-03T07:21:21.827Z · LW(p) · GW(p)

I think perhaps we are miscommunicating: I do not deny that primate brains contain snake detectors. I do deny that said detectors are unaffected by learning: humans and monkeys can and do learn which snakes to fear, or not fear.

I agree on this note. I do not agree that Occam suggests that fear of snakes, spiders and heights is the sole result of learned associations. I also do not agree that aversion to fundamental belief switching is purely the result of learning from trauma.

Replies from: pjeby
comment by pjeby · 2010-02-03T16:43:30.003Z · LW(p) · GW(p)

I do not agree that Occam suggests that fear of snakes, spiders and heights is the sole result of learned associations. I also do not agree that aversion to fundamental belief switching is purely the result of learning from trauma.

Of course not. I never claimed they were. I only make the claim that learning is an essential component of the moral injunction mechanism. You have to learn which beliefs not to switch, at the very least!

I've also described a variety of apparently built-in behaviors triggered by the mechanism: proselytizing, gossip, denouncing others, punishing non-punishers, feelings of guilt, etc. These are just as much built-in mechanisms as "snake detectors"... and monkeys appear to have some of them.

What I say is that, just like the snake detectors, these mechanisms require some sort of learning in order to be activated... and that evolutionarily, applying these mechanisms to behavior would be of primary importance; applying them to beliefs would have to come later, after language.

And at that point, it's far more parsimonious to assume evolution would reuse the same basic behavior-control mechanism, rather than implementing a new one specifically for "beliefs"... especially since, to the naive mind, "beliefs" are transparent. There's simply "how things are".

To an unsophisticated mind, someone who thinks things are different than "how things are" is obviously either crazy, or a member of an enemy tribe.

Not an "apostate".

Most of the behavior mechanisms involved are there for the establishment and maintenance of tribe behavioral norms, and were later memetically co-opted by religion. I quite doubt that religion or anything we'd consider a "belief system" (i.e., a set of non-reality-linked beliefs used for signalling) was what the mechanism was meant for.

IOW, ISTM the support systems for reality-linked belief systems had to have evolved first.

This is not a claim of exclusivity of mechanism, so I don't really know where you're getting that from. I'm only saying that I don't see the necessity for an independent belief-in-belief system to evolve, when the conditions that make use of it would not have arrived until well after a "group identity behavioral norms control enforcement" system was already in place, and the parsimonious assumption is that non-reality-linked beliefs would be at most a minor modification to the existing system.

Replies from: wedrifid
comment by wedrifid · 2010-02-03T17:37:31.932Z · LW(p) · GW(p)

To an unsophisticated mind, someone who thinks things are different than "how things are" is obviously either crazy, or a member of an enemy tribe.

Not an "apostate".

No. I'm talking about apostasy. I'm not talking about someone who is crazy. I am not talking about a member of an enemy tribe. I am talking about someone from within the tribe who is, or is considering, changing their identifying beliefs to something that no longer matches the in-group belief system. This change in beliefs may be to facilitate joining a different tribe. It may be a risky play at power within the tribe. It may be to splinter off a new tribe from the current one.

Since we are talking in the context of religious beliefs the word apostate fits perfectly.

Replies from: pjeby
comment by pjeby · 2010-02-03T18:02:36.578Z · LW(p) · GW(p)

I am talking about someone from within the tribe who is, or is considering, changing their identifying beliefs to something that no longer matches the in-group belief system. This change in beliefs may be to facilitate joining a different tribe. It may be a risky play at power within the tribe. It may be to splinter off a new tribe from the current one.

In order for any of those things to be advantageous (and thus need countermeasures), you first have to have tribes... which means you already need behavior-based signaling, not just non-reality-linked "belief" signaling.

So I still don't see why postulating an entirely new, separate mechanism is more parsimonious than assuming (at most) a mild adaptation of the old, existing mechanisms... especially since the output behaviors don't seem different in any important way.

Can you explain why you think a moral injunction of "Don't say or even think bad things about the Great Spirit" is fundamentally any different from "Don't say 'no', that's rude. Say 'jalaan' instead," or "Don't eat with your left hand, that's dirty?"

In particular, I'd like to know why you think these injunctions would need different mechanisms to carry out such behaviors as disgust at violators, talking up the injunction as an ideal to conceal one's desire for non-compliance, etc.

Replies from: wedrifid, Cyan
comment by wedrifid · 2010-02-03T19:02:50.582Z · LW(p) · GW(p)

If I were God I would totally refactor the code for humans and make it more DRY.

Replies from: pjeby
comment by pjeby · 2010-02-03T19:29:50.146Z · LW(p) · GW(p)

If I were God I would totally refactor the code for humans and make it more DRY.

You seem to be confusing "simplicity of design" with "simplicity of implementation". Evolution finds solutions that are easily reached incrementally -- those which provide an advantage immediately, rather than requiring many interconnecting pieces to work. This makes reuse of existing machinery extremely common in evolution.

It is also improbable that any selection pressure for non-reality-based belief-system enforcement would exist, until some other sort of reality-based behavioral norms system existed first, within which pure belief signaling would then offer a further advantage.

Ergo, the path of least resistance for incremental implementation simplicity, supports the direction I have proposed: first behavioral enforcement, followed by belief enforcement using the same machinery -- assuming there's actually any difference between the two.

I could be wrong, but it's improbable, unless you or someone else has some new information to add, or some new doubt to shed upon one of the steps in this reasoning.

Replies from: wedrifid
comment by wedrifid · 2010-02-03T19:49:56.007Z · LW(p) · GW(p)

You seem to be confusing "simplicity of design" with "simplicity of implementation". Evolution finds solutions that are easily reached incrementally -- those which provide an advantage immediately, rather than requiring many interconnecting pieces to work. This makes reuse of existing machinery extremely common in evolution.

I'm not and I know.

I could be wrong, but it's improbable, unless you or someone else has some new information to add, or some new doubt to shed upon one of the steps in this reasoning.

Earlier in this conversation you made the claim:

Er, research please. Everything I've seen shows that even monkeys have to learn to fear snakes and spiders - it has to be triggered by observing other monkeys being afraid of them first.

This suggested that if "everything you have seen" didn't include the many contrary findings then either you hadn't seen much or what you had seen was biased.

I really do not think new information will help us. Mostly because approximately 0 information is being successfully exchanged in this conversation.

Replies from: pjeby
comment by pjeby · 2010-02-03T20:00:55.548Z · LW(p) · GW(p)

This suggested that if "everything you have seen" didn't include the many contrary findings then either you hadn't seen much or what you had seen was biased. I really do not think new information will help us.

I still don't see what "contrary" findings you're talking about, because the first paper you linked to explicitly references the part where monkeys that grow up in cages don't learn to fear snakes. Ergo, fear of snakes must be learned to be activated, even though there appears to be machinery that biases learning in favor of associating aversion to snakes.

This supports the direction of my argument, because it shows how evolution doesn't create a whole new "aversive response to snakes" mechanism, when it can simply add a bias to the existing machinery for learning aversive stimuli.

In the same way, I do not object to the idea that we have machinery to bias learning in favor of mouthing the same beliefs as everyone else. I simply say it's not parsimonious to presume it's an entirely independent mechanism.

At this point, it seems to me that perhaps this discussion has consisted entirely of "violent agreement", i.e. both of us failing to notice that we are not actually disagreeing with each other in any significant way. I think that you have overestimated what I'm claiming: that childhood learning is an essential piece in moral and other signaling behavior, not the entirety of it... and I in turn may have misunderstood you to be arguing that an independent inbuilt mechanism is the entirety of it.

When in fact, we are both saying that both learning and inbuilt mechanisms are involved.

So, perhaps we should just agree to agree, and move on? ;-)

Replies from: wedrifid
comment by wedrifid · 2010-02-04T02:49:34.083Z · LW(p) · GW(p)

We differ in our beliefs on what evidence is available. I assert that it varies from 'a bias to learn to fear snakes' to 'snake naive monkeys will even scream with terror and mob a hose if you throw it in with them'. This depends somewhat on which primates are the subject of the study.

It does seem, however, that our core positions are approximately compatible, which leaves us with a surprisingly pleasant conclusion.

Replies from: pjeby
comment by pjeby · 2010-02-04T17:41:23.698Z · LW(p) · GW(p)

We differ in our beliefs on what evidence is available. I assert that it varies from 'a bias to learn to fear snakes' to 'snake naive monkeys will even scream with terror and mob a hose if you throw it in with them'. This depends somewhat on which primates are the subject of the study.

We also disagree in how much relevance that has to the position you've been arguing (or at least the one I think you've been arguing).

I've seen some people claim that humans have only two inborn fears (loud noises and falling) on the basis that those are the only things that make human babies display fear responses. Which, even if true, wouldn't necessarily mean we didn't have instinctive fears kick in later in life!

And that's why I don't think any of that is actually relevant to the specific case; it's really the specifics of the case that count.

And in the specific case of beliefs, we don't get built-in protein coding for which beliefs we should be afraid to violate. We have to learn them, which makes learning an essential piece of the puzzle.

And from my own perspective, the fact that there's a learned piece means that it's the part I'm going to try to exploit first. If it can be learned, then it can be unlearned, or relearned differently.

As I said in another post, I can't make my brain stop seeking SASS (status, affiliation, safety, and stimulation). But I can teach it to interpret different things as meaning I've got them.

Clearly, we can still learn such things later in life. After all, how long did it take most contributors' brains to learn that "karma" represents a form of status, approval, or some combination thereof, and begin motivating them based on it?

Replies from: wedrifid
comment by wedrifid · 2010-02-05T01:47:52.963Z · LW(p) · GW(p)

We also disagree in how much relevance that has to the position you've been arguing (or at least the one I think you've been arguing).

That being, "We don't need a past traumatic experience to have an aversive reaction when considering rejecting the beliefs of the tribe in which we were raised."

I agree with the remainder of your post and, in particular, this is exactly the kind of reasoning I use when working out how to handle situations like this:

And from my own perspective, the fact that there's a learned piece means that it's the part I'm going to try to exploit first. If it can be learned, then it can be unlearned, or relearned differently.

Replies from: pjeby
comment by pjeby · 2010-02-05T06:21:31.909Z · LW(p) · GW(p)

That being, "We don't need a past traumatic experience to have an aversive reaction when considering rejecting the beliefs of the tribe in which we were raised."

I don't recall claiming that a traumatic experience was required. Observing an aversive event, yes. But in my experience, that event could be as little as hearing your parents talking derisively about someone who's not living up to their norms... not too far removed, really, from seeing another monkey act afraid of a snake.

Aversion, however (in the form of a derogatory, shocked, or other emotional reaction), seems to be required in order to distinguish matters of taste ("I can't believe she wore white after Labor Day") from matters of import ("I can't believe she spoke out against the One True God... kill her now!"). We can measure how tightly a particular belief or norm is enforced by the degree of emotion used by others in response to either the actual situation or the described situation.

So it appears that this is where we miscommunicated or misunderstood: I interpreted you to be saying that aversive learning was not required, while you appear to have interpreted me as saying that some sort of personal trauma directly linked to the individual belief is required.

It's true that most of the beliefs I work with tend to be rooted in direct personal experience, but a small number are based on something someone said about something someone else did. Even there, though, the greater the intensity of the emotion surrounding the event (e.g. a big yelling fight or people throwing things), the greater the impact.

Like other species of monkeys, we learn to imitate what the monkeys around us do while we're growing up; we just have language and conceptual processing capabilities that let us apply our imitation to more abstract categories of behavior than they do, and learn from events that are not physically present and happening at that moment.

comment by Cyan · 2010-02-03T19:24:23.460Z · LW(p) · GW(p)

In fairness, the "left hand" thing has to do with toilet hygiene pre-toilet-paper, so at one time it had actual health implications.

Replies from: pjeby
comment by pjeby · 2010-02-03T19:33:55.464Z · LW(p) · GW(p)

In fairness, the "left hand" thing has to do with toilet hygiene pre-toilet paper, so at one time it had actual health implications.

That's why I brought it up - it's in the category of "reality-based behavior norms enforcement", which has much greater initial selection pressure (or support) than non-reality-based behavior norms enforcement.

Animals without language are capable of behavioral norms enforcement, even learned norms enforcement. It's not parsimonious to presume that religion-like beliefs would not evolve as a subset of speech-behavior norms enforcement, in turn as a subset of general behavior norms enforcement.

[Edit: removed "enfrorcement" typo]

Replies from: Cyan
comment by Cyan · 2010-02-03T19:47:20.558Z · LW(p) · GW(p)

I guess I was just pointing out that it seemed to be in a different category ("reality-based behavior norms enforcement" is as good a name as any) than the other examples.

comment by AngryParsley · 2010-02-02T02:04:22.051Z · LW(p) · GW(p)

There is a wall. That belief isn't accessible through critical examination.

Are you talking about separate magisteria or something? How does one get correct beliefs without examining evidence and understanding arguments?

Replies from: MrHen
comment by MrHen · 2010-02-02T03:01:49.218Z · LW(p) · GW(p)

No. This is not separate magisteria.

Okay, I guess the first point is that "belief" for a majority of my belief network is not Predictor-based. It is Action-based. The concept of separate magisteria applies to a Predictor-based belief system such as the Map/Territory concept promoted here. An Action-based belief system has trouble with the concept of magisteria.

The whole system is ridiculously complicated because I never bothered to sit down and sort it out. Theism is behind a wall of beliefs built on a system completely incompatible with Predictor based beliefs. "Incompatible," here, means "untranslatable."

If I am not making sense I can try another path of explanation. I am typing up a full explanation now, actually, so... yeah.

comment by dlthomas · 2012-04-29T15:20:13.876Z · LW(p) · GW(p)

[Y]ou should be as ready to drop it to 69% as raising it to 71%.

No, you should be as ready to drop it to 69% as raise it to ~70.98%. With rounding, obviously, the above isn't numerically wrong, but that's not my objection: it encourages the reader to think of probability updates in percentages as additive, which is wrong.

(edited: fixed my wrong numbers...)
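
To see why the symmetric counterpart of dropping to 69% is ~70.98% rather than 71%, here is a minimal Python sketch of the odds arithmetic (the 70% prior is taken from the quoted line; the helper names `to_odds` and `to_prob` are just illustrative):

```python
def to_odds(p):
    return p / (1 - p)

def to_prob(odds):
    return odds / (1 + odds)

prior = 0.70
down = 0.69

# Likelihood ratio of whatever evidence would drop 70% to 69%.
lr_down = to_odds(down) / to_odds(prior)

# Equally strong evidence in the other direction is the reciprocal ratio.
up = to_prob(to_odds(prior) / lr_down)

print(up)  # ~0.7098, i.e. ~70.98%, not 71%
```

Updates are multiplicative in odds, so evidence of equal strength up and down lands asymmetrically when expressed in percentage points.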

Replies from: TheOtherDave, ciphergoth, waveman
comment by TheOtherDave · 2012-04-29T16:04:04.256Z · LW(p) · GW(p)

Yes, yes, yes, yes, yes. Speaking as someone who keeps making this mistake despite knowing better, I appreciate the attempt to discourage me from it.

comment by Paul Crowley (ciphergoth) · 2013-03-03T07:47:01.343Z · LW(p) · GW(p)

Your numbers are still wrong I'm afraid - guessing you mean ~70.98%...

Replies from: dlthomas
comment by dlthomas · 2013-07-31T04:32:53.489Z · LW(p) · GW(p)

Yes, fixed.

comment by waveman · 2015-08-06T01:48:54.189Z · LW(p) · GW(p)

I take your point about ratios, but there is a bigger issue. In many cases the expected change in probability is not symmetrical or uniform.

From the article on conservation of expected evidence: "If you expect a strong probability of seeing weak evidence in one direction, it must be balanced by a weak expectation of seeing strong evidence in the other direction. "

Say I believe that the Sun goes around the Earth. Any given new piece of evidence is likely not to change my probability much at all, but there is a slight chance that a new piece of evidence will radically change it. It is the weighted probabilities of these changes in probability that need to balance.

Example, many people who lost their religious faith suddenly came upon a piece of evidence that caused a drastic change in their probability estimate for the existence of God. [in part this may be due to biases such as ignoring contrary evidence, but not entirely.]

Imagine my wife buys a lottery ticket. My estimate of her chance of winning is very low. My wife runs into the room looking excited and brandishing the ticket; my estimate suddenly goes up a lot. Then when I check the numbers it goes up a lot more. On the other hand, if I see the ticket crumpled up in the garbage bin, my estimate goes down only a little (from 1/1,000,000 to 1/1,000,000,000).
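
As a minimal sketch of how those weighted shifts balance, here is some Python using purely hypothetical numbers for the lottery example (the probability of the "excited wife" observation and the posterior it produces are illustrative assumptions, not figures from the comment above):

```python
# Hypothetical numbers, purely for illustration.
prior_win = 1e-6           # prior probability the ticket wins

p_excited = 1e-7           # weak expectation of seeing strong evidence...
post_excited = 0.5         # ...which would shift the probability up a lot

p_not_excited = 1 - p_excited   # strong probability of seeing weak evidence
# Conservation of expected evidence pins down the remaining posterior:
post_not_excited = (prior_win - p_excited * post_excited) / p_not_excited

print(post_not_excited)    # ~9.5e-7: only a small downward shift from 1e-6

# The posteriors, weighted by how likely each observation is, average
# back to the prior.
assert abs(p_excited * post_excited
           + p_not_excited * post_not_excited - prior_win) < 1e-12
```

The large upward jump is rare and the small downward nudge is common, which is exactly the asymmetry the quoted passage describes.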

comment by [deleted] · 2015-07-17T01:51:29.266Z · LW(p) · GW(p)

If the box contains a diamond, I desire to believe that the box contains a diamond;
If the box does not contain a diamond, I desire to believe that the box does not contain a diamond;
Let me not become attached to beliefs I may not want.

—The Meditation on Curiosity

Only when I'm planning for things that are contingent upon facts related to the physical world.

comment by M Chalk (m-chalk) · 2022-10-31T12:04:40.085Z · LW(p) · GW(p)

Hey, sorry if someone in the comments already addressed this but where does Tarski actually pose this litany?

comment by Eric Covert (eric-covert-1) · 2024-05-31T17:15:56.531Z · LW(p) · GW(p)

If in your heart you believe you already know, or if in your heart you do not wish to know, then your questioning will be purposeless and your skills without direction. Curiosity seeks to annihilate itself; there is no curiosity that does not want an answer.

 

I always found this exemplified in the concept of the "empty cup" from various Middle Eastern philosophies. A "full cup" is a heart that believes it already knows.

comment by Ben Pace (Benito) · 2024-09-17T23:52:06.929Z · LW(p) · GW(p)

There just isn’t any good substitute for genuine curiosity. A burning itch to know is higher than a solemn vow to pursue truth. But you can’t produce curiosity just by willing it, any more than you can will your foot to feel warm when it feels cold. Sometimes, all we have is our mere solemn vows.

So what can you do with duty? For a start, we can try to take an interest in our dutiful investigations—keep a close eye out for sparks of genuine intrigue, or even genuine ignorance and a desire to resolve it. This goes right along with keeping a special eye out for possibilities that are painful, that you are flinching away from—it’s not all negative thinking.

This post focuses on the internal cultivation of curiosity (which it does fantastically, and which is why it has such a strong reputation for being so widely loved). But as I read it, my thoughts move more naturally toward policy- or project-level changes. Some potential examples:

  • During the course of my work, if I'm not spending 5% of my efforts "just finding things out because I want to know the answer," then I should take this as a red flag that I need to change something in order to allow my natural curiosity to be present and helping.
  • Every day, ask myself "what's something about the world I really just want to know by the end of the day?" and sit with it until my curiosity overcomes me and picks something. I might be busy, and so not commit to putting in the work to find out, but I should at least know what it is I'm curious about.
  • During the course of my personal life, also keep track of how regularly I'm pursuing knowledge for its own sake. Is it too low? Then it's time to find a way to get it higher.

I don't know how well I'm doing on this at the minute. I'd like to reflect on it more.