Emotional tools for the beginner rationalist

post by Gleb_Tsipursky · 2015-10-09T05:01:57.952Z · LW · GW · Legacy · 43 comments

Something I haven't seen discussed much is what kind of emotional tools would be good for beginner rationalists. I'm especially interested in this topic because, as part of my broader project of spreading rationality to a wide audience and thus raising the sanity waterline, I come across a lot of people who are interested in becoming more rational but have difficulty facing the challenges of the Valley of Bad Rationality. In other words, they have trouble acknowledging their own biases and faults, facing the illusions within their moral systems and values, letting go of cached patterns, updating their beliefs, etc. Many thus abandon their aspiration toward rationality before they get very far. I think this is a systematic failure mode of many beginner aspiring rationalists, so I wanted to start a discussion about what we can do about it as a community.

 

Note that this emotional danger does not feel intuitive to me, and likely not to many of you either. In a Facebook discussion with Viliam Bur, he pointed out that he did not experience the Valley. I personally did not experience it that much either. However, based on the evidence from the Intentional Insights outreach efforts, assuming others skip the Valley is a typical mind fallacy: it affects many, though far from all, aspiring rationalists. So we should make an effort to address it in order to raise the sanity waterline effectively.

 

I'll start by sharing what I found effective in my own outreach efforts. First, I found it helpful to frame the aspiration toward rationality not as a search for a perfect and unreachable ideal, but as constant improvement from the baseline where all humans start toward something better. I highlight the benefits people get from this improved mode of thinking, to prime them to focus on their current selves and detach from their past selves. I highlight the value of self-empathy and self-forgiveness for holding mistaken views, and encourage people to think of themselves as becoming more right, rather than less wrong :-)

 

Another thing I found helpful was to provide new aspiring rationalists with a sense of community and social belonging. Joining a community of aspiring rationalists who are sensitive to a newcomer's emotions, and who help that newcomer deal with the challenges s/he experiences, is invaluable for overcoming the emotional strains of the Valley. Especially useful is having people trained in coaching/counseling serve as mentors for new members, guiding their intellectual and emotional development alike. I'd suggest that every LW meetup group consider instituting a system of mentors who can provide both emotional and intellectual support for new members.

 

Now I'd like to hear about your experiences traveling the Valley, and what tools you and others you know used to manage it. Also, what are your ideas about useful tools for that purpose in general? Look forward to hearing your thoughts!

 

 

 

43 comments


comment by masters02 · 2015-10-10T19:14:55.148Z · LW(p) · GW(p)

I certainly had a huge emotional problem with being wrong. Three years ago when I was a Muslim, I had a considerably stronger attachment to my beliefs than to reality and truth. As far as I was concerned, my beliefs were the truth (haha) and I could never have distinguished between the two. In fact, everyone I knew was exactly the same, if not worse.

What helped me was having role models that showed me a completely different way of life (many hats off to Dawkins, Pinker, Buffett, Munger, Krauss, et al). I watched them for hours and hours in countless interviews, debates and discussions. Of course, youtube videos didn't make me feel judged, and I think that was important for me at that time. They all fascinated me, and as I observed them, I began modelling some of their thought processes and philosophies. Eventually, I felt an emotional attachment towards reality and felt smug whenever I could openly admit that I was wrong. Now, I would feel like an emotionally-fragile dumbass if I couldn't admit to being wrong and subsequently change my mind.

Ladies and gentlemen, I didn't know it would be so great on this side.

Replies from: None, Viliam, Gleb_Tsipursky
comment by [deleted] · 2015-10-12T11:07:19.088Z · LW(p) · GW(p)

I'll echo this as a relative beginner. Being wrong still feels like a kick in the stomach - even more so when I'm wrong in front of someone else. I actually transitioned to the night shift recently, and being able to troubleshoot technical issues without my coworkers around has helped a lot. Unfortunately, while my technical skills are improving, I have no reason to confront my self-consciousness on a daily basis. So my collaboration skills are still poor.

comment by Viliam · 2015-10-11T10:26:51.517Z · LW(p) · GW(p)

Congratulations!

Please correct me if I'm wrong, but it seems to me that you probably did care about the truth even years ago, but you assumed that the Muslim story did correspond to how things really are. Or am I wrong here? Did you already see the mismatch between belief and reality, and just tell yourself "whatever; the belief is what matters"?

Replies from: masters02
comment by masters02 · 2015-10-11T10:53:49.242Z · LW(p) · GW(p)

Why thank you.

You're correct. I did think that the Muslim story was the truth. There were times when I was forced to face a mismatch between my beliefs and reality (evolution is an obvious one), but as you can imagine, I was a pro at rationalising things away. And when rationalisations didn't suffice, I simply put it down to my ignorance and didn't bother pursuing it. And to consolidate all my irrational behaviour, I had enormous social proof that this was the right thing to do.

I should have made it clear that I had no respect whatsoever for 'evidence'. I laugh thinking about it now, but I would openly use/deny evidence whenever it was convenient for me. I would interpret and reinterpret the Quran so that it made sense to me. Talk about cognitive dissonance.

Replies from: Viliam
comment by Viliam · 2015-10-11T15:22:39.213Z · LW(p) · GW(p)

So what exactly was the thing that at the beginning made you watch Dawkins et al. on YouTube? Was it something like "Oh, these people have weird (and obviously wrong) beliefs, but they seem so sure and talk about many things... let's watch how far they can get before they get hit with evidence they can't process?"

Essentially, what motivated you to spend your time watching someone you believed was wrong?

comment by Gleb_Tsipursky · 2015-10-10T19:45:53.238Z · LW(p) · GW(p)

Nice, sounds like you have a powerful story to share! Would you be willing to record a video of yourself sharing your story? That might be something we could put up on the Intentional Insights website and YouTube channel to help other people who are struggling with the same dynamic. Email me at gleb@intentionalinsights.org to talk more about this.

Replies from: masters02
comment by masters02 · 2015-10-11T08:21:37.267Z · LW(p) · GW(p)

I appreciate the opportunity, but I am very much a private person. Looks like an interesting website; I'll be checking it out.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-10-11T16:52:22.371Z · LW(p) · GW(p)

Totally understand that, and thanks for the nice words about the website. It's our broader project of expanding rationality to a wide audience.

comment by Kaj_Sotala · 2015-10-09T17:29:18.935Z · LW(p) · GW(p)

On emotional tools in general -

there are actually a number of schools of thought that teach what might be called a rationalist approach to your emotions, i.e. seeing that your emotions are a map that's good to distinguish from the territory, and giving you tools for both seeing the distinction and evaluating the map-territory correspondence better.

1) In cognitive behavioral therapy, there is the "ABC model": Activating Event, Belief, Consequence. Idea being that when you experience something happening, you will always interpret that experience through some (subconscious) belief, leading to an emotional consequence. E.g. if someone smiles at me, I might either believe that they like me, or that they are secretly mocking me; two interpretations that would lead to very different emotional responses. Once you know this, you can start asking yourself the question of "okay, what belief is causing me to have this emotional reaction in response to this observation, and does that belief seem accurate?".

2) In addition to seeing your emotional reactions as something that tell you about your beliefs, you can also see them as something that tells you about your needs. This is the approach taken in Non-Violent Communication, which has the four-step process of Observation, Feeling, Need, Request. The four-step process is most typically discussed as something that's a tool for dealing with interpersonal conflict, as in "when I see you eating the foods I put in the fridge, I feel anxious, because I need the safety of being able to know whether I have food in stock or not; could you please ask before eating my food in the future?". However, it's also useful for dealing with personal emotional turmoil and figuring out what exactly is upsetting you in general, or for dealing with internal conflict.

3) In both CBT and NVC, an important core idea is that they teach you to distinguish between an observation and an interpretation, and that it's the interpretations that cause your emotional reactions. (For anyone curious, the more academic version of this is appraisal theory; the paper "When are emotions rational?" is relevant.) However, the NVC book, while being an excellent practical manual, does not do a very good job of explaining the theoretical reasons for why it works, which sometimes causes people to arrive at interpretations of NVC that lead them to behave in socially maladaptive ways. For this reason, it might be a good idea to first read Crucial Conversations, which covers a lot of similar ground but goes into more theory about the "separating observations and interpretations" thing. Then you can read NVC after you've gotten the theory from CC. (CC doesn't talk as much about needs, however, so I do still recommend reading both.)

4) It's fine to say "okay, if you're having an emotional reaction you're having difficulty dealing with, try to figure out the beliefs and needs behind it and see what they're telling you and whether you're holding any incorrect beliefs"! But it's a lot harder to actually apply that when you're in an emotionally charged situation. That's where the various courses teaching mindfulness come in - mindfulness is basically the ability to step back a little from your emotions and thoughts, observe them as they are without getting swept up in them, and then evaluate them critically if needed. You'll probably need a lot of practice in various mindfulness exercises in order to get the techniques from CBT, NVC, and CC to live up to their full potential.

5-6) An important idea that's been implied in the previous points, but not entirely spelled out, is that your emotions are your friends. They communicate information about your subconscious assessments of the world, as well as about your various needs. A lot of people have a somewhat hostile approach to their emotions, trying to at least control and get rid of their negative emotions. But this is bound to lead to internal conflict; and various studies indicate that a willingness to accept negative emotions and pain will actually make them much less serious. In my personal experience, once you get into the habit of asking your emotions what they're telling you and then processing that information in an even-handed way, those negative emotions will often tend to go away after you've processed the thing they were trying to tell you. By "even-handed" I mean that if you're feeling anxious because you're worried about some unpleasant thing X being true, then you actually look at the information suggesting that X might be true and consider whether it's the case, rather than trying to rationalize a conclusion for why X wouldn't be true. Your subconscious will know, and keep pestering you. Some of CFAR's material, such as aversion factoring, points this way; also Acceptance and Commitment Therapy, as elaborated on in Get Out of Your Mind and Into Your Life, seems to be largely about this, though I've only read about the first 30% so far.

Some of my earlier posts on these themes: suffering as attention-allocational conflict, avoid misinterpreting your emotions.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-10-10T17:28:58.114Z · LW(p) · GW(p)

Thanks for these suggestions, I will try to adopt some of them in my rationality outreach. The idea about emotions being your friends, and the CBT techniques, are going to be especially useful to beginner rationalists. I'll check out your earlier posts as well; thanks for including them.

comment by MathiasZaman · 2015-10-09T12:32:36.741Z · LW(p) · GW(p)

The rationalist tumblr sphere helped me a lot. It's a lot more approachable for newcomers than this site is and has a very low barrier for making low-effort, high-emotion posts, which was something I could totally use assistance with at the time. It also helped that I could see rationalist practices and their results in (more-or-less) real time, which gave me highly available examples (I've always learned better with good, tangible examples) and showed me that rationality could be practised by "real" people, rather than mythical figures like Jeffreyssai, the Defence Professor or Eliezer Yudkowsky.

Yudkowsky's fiction also helped because it provided easy-to-read content that teaches the basics. For the same reasons, I Shall Wear Midnight (by Terry Pratchett) was useful.

Replies from: None, Gleb_Tsipursky
comment by [deleted] · 2015-10-12T11:02:58.827Z · LW(p) · GW(p)

I didn't know that there was a rationalist tumblr sphere. I should look into that.

Replies from: MathiasZaman
comment by MathiasZaman · 2015-10-12T17:52:57.682Z · LW(p) · GW(p)

This would be a good place to start looking. It's a list that holds most of the (self-proclaimed) rationalists on tumblr, although I can't guarantee the quality or level of activity of each tumblr. Notable absences are Scott's tumblr and theunitofcaring.

comment by Gleb_Tsipursky · 2015-10-10T17:29:35.097Z · LW(p) · GW(p)

Good idea about the Tumblr sphere and the fiction, thanks!

comment by ChristianKl · 2015-10-09T08:24:50.453Z · LW(p) · GW(p)

Note that this emotional danger does not feel intuitive to me, and likely not to many of you either. In a Facebook discussion with Viliam Bur, he pointed out that he did not experience the Valley. I personally did not experience it that much either.

I'm not sure that lack of noticing effects like this is an indication that they aren't there. From the outside perspective, you developed a depression strong enough to affect your work after being exposed to rationality.

The prevalence of depression in this community, as seen via the LW census, is also higher than the baseline.

Replies from: MathiasZaman, Viliam
comment by MathiasZaman · 2015-10-09T12:22:52.757Z · LW(p) · GW(p)

To be fair, the LW census also shows an average IQ that is significantly higher than the baseline, and we know intelligence and depression to be correlated.

But intuitively (e.g. without any evidence) I can understand why this community could have a higher-than-baseline level of depression, apart from the intelligence issue. Stuff like: "If you aren't winning, you aren't being rational?", "If you are rational, then why aren't you sitting on a giant pile of utility/money," and "heroic responsibility" (everything wrong with the world is your fault) can be overwhelming, especially if you are (as many newcomers to this community are) a slightly above average student with no real money and (possibly) no real plan for getting tons of money. It doesn't even need to be that particular situation. Every time someone has a bad period (for whatever reason), those memes will make them think it's their fault and their responsibility to fix it, which (probably) isn't conducive to mental health.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-10-10T17:41:20.795Z · LW(p) · GW(p)

Good points, and that's why I highlight the benefits of orienting toward improvement instead of perfectionism and having a supportive community.

comment by Viliam · 2015-10-09T15:12:43.224Z · LW(p) · GW(p)

I'm not sure that lack of noticing effects like this is an indication that they aren't there.

I'm aware of the possibility, and I have also mentioned it in the facebook debate. Or, more likely, I have problems finding the right words to express what I want to say:

I had situations where I didn't know something, when I forgot things, when I believed information that was wrong, etc. Lots of them. Still doing it. Most likely always will.

In the past (before finding LW) I have repeatedly experimented with belief in belief (because I wanted the placebo effects or social approval), but those experiments were always half-assed and very short-lived; they felt incompatible with my personality. I couldn't stop being aware that I was merely acting.

I also fail a lot at instrumental rationality. I am aware of what I should do... and I somehow just don't do it.

But I don't remember having a situation where I enjoyed being wrong or didn't care about being wrong, like described here and here. That just feels completely strange to me. I have trouble empathising with people who, upon learning that they were wrong, just don't give a fuck.

Therefore -- that's why I mentioned it in the debate -- I have no clue about what to tell them to help them change their ways. I have never been there (as far as I know), and I have no idea what it feels like to be there. So I have no model that would help me test which ideas might be attractive enough to draw a person out of there.

EDIT: I feel like I should add so many disclaimers here. I am happy that at least Gleb understands what I was trying to say.

Of course there are situations where you want to keep a map despite knowing it is not correct, when it is a useful simplification, like Newtonian physics. I am talking about people whose maps are not even approximately correct, but they still keep them because... I am only guessing here... they still provide emotional comfort.

I don't feel comfortable with having an obviously wrong map, even if it would be socially approved. I have trouble belonging to most groups, because sooner or later there is a shared group map you have to accept. For example, having a political opinion (in the sense of: completely buying a standardized map) feels like insanity; on the same level as belonging to a cult. (I am strongly sympathetic to the libertarian ethics of not initiating force. That doesn't convince me that the best way to organize a society is to dismantle all states and let the warlords fight it out in the "free market".)

There may also be unlucky situations where I am wrong, other people are right, but they lack the right words to convince me (sometimes because they themselves believe the right thing for the wrong reasons, e.g. because it is a standard belief in their social group). But I don't have an epistemic strategy for avoiding such situations without making things worse on average; of course believing everything wouldn't be an improvement.

Etc.

Replies from: Gleb_Tsipursky, ChristianKl, ChristianKl, Lumifer
comment by Gleb_Tsipursky · 2015-10-10T17:38:10.999Z · LW(p) · GW(p)

Viliam, I indeed do understand what you're saying. Having a belief that I know is wrong is anathema to me.

But I think you and I, and probably many Less Wrongers, are on the far end of the spectrum of having a strong emotional valuation of having true beliefs, and there are so many people who give much less of a fuck about that than we do. Moreover, they have a strong emotional value of being attached to their beliefs.

That's why the project of spreading rationality is hard - only a small subset of the population has that strong intuitive value. This is why I'm posting here about the challenge I run into with Intentional Insights: how do we expand that subset by equipping those who want to learn the truth with the emotional tools they need to do so?

Replies from: Viliam
comment by Viliam · 2015-10-11T10:20:44.920Z · LW(p) · GW(p)

As a very rough intuitive model, we could divide people into three rationality stages:

  • R0 -- does not care about having true beliefs
  • R1 -- cares about having true beliefs, but does not know the rationality techniques
  • R2 -- cares about having true beliefs and knows the rationality techniques

I can imagine moving people from R1 to R2. More or less, you give them the Sequences to read, and connect them with the rationalist community. At least that is what worked for me. No idea about R0 though, and they happen to be a vast majority of the population.

(There is even the technical problem of how to most effectively find R1 people in the general population. Is there a better method than making a website and hoping that they will find it?)

Another problem is that if we succeed in making LW-style rationality more popular, we will inevitably get another group growing:

  • R3 -- does not care about having true beliefs, but learned about the rationality techniques and keywords, and uses them selectively
Replies from: Kaj_Sotala, Gleb_Tsipursky, entirelyuseless
comment by Kaj_Sotala · 2015-10-12T11:46:47.139Z · LW(p) · GW(p)

I think "cares / does not care about having true beliefs" is too coarse: the actual question is, in which domains do people care about true beliefs?

Most people care about having true beliefs when it actually lets them achieve things. Few parents would prefer a false belief that their child is safe, to the true belief that their child is in danger, if the true belief allowed them to get the child out of danger. The issue is just that when we talk about things like evolution or religion, it genuinely does not matter what your beliefs are, or if it does, "false" beliefs often allow you to achieve things better.

Think of beliefs as tools. People will care about having the right tool if they couldn't get the job done otherwise, but if the wrong tool still lets them get something done, they don't care. Except for some weird "rationalist" guys who insist that you should have the right tools for their own sake, because there's a theoretical chance that having the wrong tool for some problem might cause you trouble, perhaps.

If it helps, think of it as physicist/mathematician thing. A physicist might calculate something using a way that's not quite correct and would drive the mathematician up a wall. While the physicist is like, hey, my result and method are good enough to do the job I care about, so so what if I never proved all of my assumptions.

If you want to get people to actually care and think about the truth in more domains, you need to give them habits of thought that do that in one domain, and see if it'd transfer to some other domain. E.g. this is the approach that CFAR settled on:

...the sea change that occurred in our thinking might be summarized as the shift from, "Epistemic rationality is about whole units that are about answering factual questions" to there being a truth element that appears in many skills, a point where you would like your System 1 or System 2 to see some particular fact as true, or figure out what is true, or resolve an argument about what will happen next.

  • We used to think of Comfort Zone Expansion[6] as being about desensitization. We would today think of it as being about, for example, correcting your System 1's anticipation of what happens when you talk to strangers.
  • We used to think of Urge Propagation[6] as being about applying behaviorist conditioning techniques to yourself. Today we teach a very different technique under the same name; a technique that is about dialoging with your affective brain until system 1 and system 2 acquire a common causal model of whether task X will in fact help with the things you most care about.
  • We thought of Turbocharging[6] as being about instrumental techniques for acquiring skills quickly through practice. Today we would also frame it as, "Suppose you didn't know you were supposed to be 'Learning Spanish'. What would an outside-ish view say about what skill you might be practicing? Is it filling in blank lines in workbooks?"
  • We were quite cheered when we tried entirely eliminating the Bayes unit and found that we could identify a dependency in other, clearly practical, units that wanted to call on the ability to look for evidence or identify evidence.
  • Our Focused Grit and Hard Decisions units are entirely "epistemic" -- they are straight out just about acquiring more accurate models of the world. But they don't feel like the old "curse of epistemic rationality" units, because they begin with an actual felt System 1 need ("what shall I do when I graduate?" or similar), and they stay in contact with System 1's reasoning process all the way through.

When we were organizing the UK workshop at the end of 2014, there was a moment where we had the sudden realization, "Hey, maybe almost all of our curriculum is secretly epistemic rationality and we can organize it into 'Epistemic Rationality for the Planning Brain' on day 1 and 'Epistemic Rationality for the Affective Brain' on day 2, and this makes our curriculum so much denser that we'll have room for the Hamming Question on day 3." This didn't work as well in practice as it did in our heads (though it still went over okay) but we think this just means that the process of our digesting this insight is ongoing.

We have hopes of making a lot of progress here in 2015. It feels like we're back on track to teaching epistemic rationality - in ways where it's forced by need to usefully tackle life problems, not because we tacked it on. And this in turn feels like we're back on track toward teaching that important thing we wanted to teach, the one with strategic implications containing most of CFAR's expected future value.

Similarly Venkat:

I have never met anybody who has changed their reasoning first and their habits second. You change your habits first. This is a behavioral conditioning problem largely unrelated to the logical structure and content of the behavior. Once you’ve done that, you learn the new conscious analysis and synthesis patterns.

This is why I would never attempt to debate a literal creationist. If forced to attempt to convert one, I’d try to get them to learn innocuous habits whose effectiveness depends on evolutionary principles (the simplest thing I can think of is A/B testing; once you learn that they work, and then understand how and why they work, you’re on a slippery slope towards understanding things like genetic algorithms, and from there to an appreciation of the power of evolutionary processes).

People come to consider beliefs true if those beliefs work in giving them rewards. This is similarly the case for meta-beliefs, like "having true beliefs is important" - people come to believe that true beliefs are important if they frequently work for acquiring more accurate beliefs, and this lets them perform better. If you want to make people adopt that metabelief, come up with habits that explicitly cause them to acquire more true beliefs, and which also help them move forward, and get them to adopt those habits.

Replies from: Viliam
comment by Viliam · 2015-10-13T08:03:32.248Z · LW(p) · GW(p)

Most people care about having true beliefs when it actually lets them achieve things.

Here I have a general feeling that any true belief may be useful in the future, and any false belief may be harmful in the future. I feel that the world is connected. (As the most obvious example, a belief in the supernatural in any area implies a belief in the supernatural in general, which in turn influences all areas of life.)

Maybe "the world is connected" is one of the unspoken premises for rationality. If you don't have it, any rationality technique will be merely something you use inside the lab.

(Of course, not everything is equally likely to be useful, so I try to get more info in some areas and ignore other areas. But I would still feel bad about making false beliefs even in the less important areas. If I don't feel certain about my knowledge somewhere, and don't have time to improve the knowledge, I update to "don't know".)

comment by Gleb_Tsipursky · 2015-10-11T17:13:45.642Z · LW(p) · GW(p)

Nice typology! Let's dive into this a little deeper.

I agree that we can't do anything with R0.

I think many people belong to R1, but there is a huge spectrum along which they place a value on having true beliefs. At the far end of the spectrum are people like you and me, and I think most Less Wrongers, before we learned about rationality - we already cared a lot about having true beliefs. For us, being given the Sequences and connected with the rationalist community was sufficient. We can call people like that R1.999, to indicate that maybe 1 out of 1000 people is like that. That's a rough Fermi estimate, and I may be optimistic (I have a personal optimism bias issue), but let's go with that for the sake of the discussion.

Now what about the people who range from R1.001 to R1.998? This is the whole point of the Intentional Insights project - how do we move these people further up the sanity waterline spectrum? The challenge is that these people's emotional intuitions do not line up with truth-seeking. So to get them into rational thinking, we need to increase their positive emotions around rational thinking, decrease their negative emotions about letting go of their current beliefs, and even before that bring rationality to their attention.

To do so, we at InIn do several things:

1) Increase the emotional intuitive valuation they place on rational thinking. To do so, here are active steps we are taking: making engaging videos and blogs that say "yay rational thinking, you should have warm fuzzies around it and value it emotionally to reach your own goals."

2) Decrease the negative emotions they have around letting go of their past beliefs. That's been a challenge, and one of the reasons I wrote this discussion post. I listed above some things that worked for us. We also write blogs highlighting people's personal stories about updating their beliefs, to make this appear more doable and cognitively easy.

3) Bringing this information to people's attention. The way we do this is through our website, through collaborating with a wide variety of reason-oriented groups, and through publishing articles and doing interviews in prominent media venues.

So those encompass what I think it takes to move R1 to R2. I also agree about the dangers of R3, which is why it's important to get people into a community with more advanced rationalists; otherwise they might just remain half a rationalist.

comment by entirelyuseless · 2015-10-11T13:03:14.644Z · LW(p) · GW(p)

I doubt there can literally be someone who "does not care about having true beliefs." No matter how false and irrational someone's beliefs are, he still wants those beliefs to be true, so he still wants true beliefs. What happens is this:

Some people want to believe the truth. Position X seems likely to be true. So they want to believe X.

Other people want to believe X. If X is true, that would be a reason to believe it. So they want X to be true.

The first people will be in your categories R1 and R2. The second people will be in your category R0, in the sense that what is basically motivating them is the desire to believe a concrete position, not the desire to believe the truth. But they also have the desire to believe the truth. It is just weaker than their desire to believe X.

But as you say, if someone wants something more than the truth, he wants that more than the truth. No argument is necessarily going to change his desires.

comment by ChristianKl · 2015-10-10T15:01:18.192Z · LW(p) · GW(p)

In the past (before finding LW) I have repeatedly experimented with belief in belief (because I wanted the placebo effects or social approval), but those experiments were always half-assed and very short-lived; they felt incompatible with my personality. I couldn't stop being aware that I was merely acting.

Normal people don't experiment with belief in belief. They just have it.

Wikipedia writes about Athenian slaves:

Slaves could not own property, but their masters often let them save up to purchase their freedom,[97] and records survive of slaves operating businesses by themselves, making only a fixed tax-payment to their masters.

Replies from: Viliam
comment by Viliam · 2015-10-11T10:28:23.277Z · LW(p) · GW(p)

Normal people don't experiment with belief in belief. They just have it.

Yes... and I envied them. :D

I suspect that if there is a parallel universe where I got religious, the proper strategy would have been to find a sufficiently intelligent clever arguer (someone like Chesterton, but with 50 more IQ points), or more likely, a group of Chesterton-level clever arguers I could spend a lot of time with, and thus have social proof for their rationalizations. (Something like Dark CFAR.)

Replies from: ChristianKl
comment by ChristianKl · 2015-10-11T11:25:18.088Z · LW(p) · GW(p)

If I wanted to make someone religious I would give them experiences that aren't easily reconciled with their previous world view and then provide a religious belief system that can explain those experiences.

It's not easy to sustain being an atheist when you have a vision of Jesus rising from the cross. It's not that hard to theoretically accept that the human mind can produce visions at random, but it's another issue not to take one's own experience too seriously.

comment by ChristianKl · 2015-10-10T12:00:28.352Z · LW(p) · GW(p)

Did you have 1-on-1 interactions with people who you believe didn't care at all about whether their beliefs are true?

Replies from: Viliam, Gleb_Tsipursky, entirelyuseless
comment by Viliam · 2015-10-11T10:46:41.907Z · LW(p) · GW(p)

A few times I got a reaction like: "I don't want to hear your facts!" which I translated as: "If there is a part of reality that doesn't match my map, I don't want to know about that part."

The part "your facts" is already weird. As if saying that different people live in different realities, and I don't want my reality to become contaminated by your reality (which could happen if I start to observe your reality too close or under your guidance). But of course we are talking about maps here. So basically "your facts" means: "There is only my map and your map, and I am not interested in your map." So it's not like I don't want my map to correspond to the territory, but rather like there is no territory that could judge my map and find it wanting. There are only maps, and of course your map is going to differ from my map, but if you insist on me looking at your map, that is merely an aggression, a status move.

(I can even see how our educational system contributes to this feeling that it's maps all the way down. Most of what happens in schools is students copying the teachers' maps. But I digress.)

EDIT: Another example, maybe better. There are people who love to tell "their opinions" on theory of relativity, quantum physics, evolution, whatever. But if you suggest that they read a textbook, or a popular science book on the topic, to fix at least their most obvious misconceptions, they proudly refuse. They prefer their original bullshit interpretation, even if there is an option to fix the obvious mistakes and improve their bullshit to make it more credible (which IMHO should be preferable even for people who like their own bullshit theories).

Replies from: ChristianKl, VoiceOfRa
comment by ChristianKl · 2015-10-11T11:25:12.897Z · LW(p) · GW(p)

As if saying that different people live in different realities, and I don't want my reality to become contaminated by your reality (which could happen if I start to observe your reality too close or under your guidance). But of course we are talking about maps here.

There are various new agey people who would disagree with you on that.

But if you suggest that they read a textbook, or a popular science book on the topic, to fix at least their most obvious misconceptions, they proudly refuse.

Most people don't read textbooks. A sizeable portion of people doesn't even read any books once they've left school.

If you disagree with a religious person and they tell you that you just have to read the bible or another religious book and then you would understand, that likely wouldn't be enough either to get you to read the book.

Replies from: entirelyuseless
comment by entirelyuseless · 2015-10-11T12:23:52.981Z · LW(p) · GW(p)

Yes, and in fact telling someone, "I disagree with you but I don't have time to explain why, read this book to discover the truth," will often come across as being arrogant, since the person doesn't want to spend a lot of time explaining things, but he wants the other person to spend a lot of time reading a book.

comment by VoiceOfRa · 2015-10-12T23:07:47.652Z · LW(p) · GW(p)

A few times I got a reaction like: "I don't want to hear your facts!"

I think that's more a case of people becoming jaded from constantly being presented with "facts" that are false or at least highly misleading backed by arguments too clever for them to refute.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2015-10-13T13:06:42.414Z · LW(p) · GW(p)

I think that's more a case of people becoming jaded from constantly being presented with "facts" that are false or at least highly misleading backed by arguments too clever for them to refute.

I'm sure that you will never be guilty of such a presentation.

comment by Gleb_Tsipursky · 2015-10-10T17:39:31.085Z · LW(p) · GW(p)

I very much did have those interactions, especially with religious people about religion. They specifically denied truth/reason as having any value, and specifically oriented to faith as the thing one must have.

Replies from: ChristianKl
comment by ChristianKl · 2015-10-10T20:17:40.174Z · LW(p) · GW(p)

Truth and reason are not the same thing. If you believe that the truth is that god works in mysterious ways that aren't decipherable by humans, reason loses its value.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-10-11T16:56:09.959Z · LW(p) · GW(p)

Sure, I agree that truth and reason are not the same thing. I meant to indicate that I heard both types of comments, and often together, from religious people - that the truth as determined by science, reason, and logic does not have value in comparison to personal felt experience.

Replies from: ChristianKl
comment by ChristianKl · 2015-10-11T21:28:12.072Z · LW(p) · GW(p)

I think most of those people consider personal felt experience to show the truth.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-10-11T22:26:22.001Z · LW(p) · GW(p)

Yup, I hear you. I think this is a matter of semantics - I am using the word truth as it is generally understood on Less Wrong, meaning the truth of reality as indicated by concrete sensory experience; the closer to the senses, the better.

Replies from: ChristianKl
comment by ChristianKl · 2015-10-11T22:54:30.133Z · LW(p) · GW(p)

I think the question of whether someone wants to have correct beliefs is quite distinct from whether they believe that reason is a method that's useful for finding the truth.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-10-12T03:34:19.037Z · LW(p) · GW(p)

Yes, I agree that these are distinct things.

comment by entirelyuseless · 2015-10-10T14:33:51.494Z · LW(p) · GW(p)

I think it's probably impossible not to care at all whether your beliefs are true, but some people care a lot more than others. And I have had a number of people who told me to "forget about arguments" because I came to a conclusion that they didn't want me to believe.

That is not caring about truth in an effective sense, even if strictly speaking they still want their beliefs to be true, and in that sense they care about the truth of their beliefs.

comment by Lumifer · 2015-10-09T15:48:46.356Z · LW(p) · GW(p)

I am talking about people whose maps are not even approximately correct, but they still keep them because... I am only guessing here... they still provide emotional comfort.

Two points. First, a very important word here is "matters". A lot of maps don't matter. If I believe that there are adepts meditating in secret caves in Tibet and they have direct access to the Akashic records and so can see into the future and into the past -- so what? Does that affect my life in any way? (note, by the way, the difference between "could matter" and "does matter").

Second, an incorrect map is also known as "fiction". That makes for an interesting connection to the parallel thread about the use{full|less}ness of fiction.

Replies from: Viliam
comment by Viliam · 2015-10-09T19:33:54.887Z · LW(p) · GW(p)

I enjoy fiction. Also, when I talk e.g. with religious people, I imagine that we are all talking about some imaginary world; then it doesn't bother me that their arguments do not apply to our world. I can discuss the Bible the same way I can discuss Tolkien, and sometimes it's fun. Only when people remind me that they actually believe the elves are real does it get weird.

Your first example... that's also in the weird territory. I could enjoy it as fiction. I don't see any other use for it. -- Is it just an aesthetic difference?