Non Polemic: How do you personally deal with "irrational" people?

post by Emiya (andrea-mulazzani) · 2020-11-02T13:44:24.841Z · LW · GW · 5 comments

This is a question post.

Contents

  Answers
    23 lsusr
    18 Daniel Kokotajlo
    9 aa.oswald
    8 Viliam
    7 ChristianKl
    6 Vladimir_Nesov
    5 waveman
    4 Gunnar_Zarncke
    4 Stuart Anderson
    4 remizidae
    3 cousin_it
    2 TheFishBowl
    2 BladeDoc
    1 Productimothy
None
5 comments

I'm finally managing to finish my "basic" training in rationality, by which I mean finishing "Rationality: A-Z" (I had studied the first half years ago, but I foolishly stopped when I got to the part about reductionism, which was unbelievably stupid of me even with all the reasons that led me to do so). I plan to continue studying even more material once I'm done with it, to train myself in instrumental rationality and everything else I can find, to make myself as smart as I can possibly be. I'm very satisfied with my progress: the first half of the Sequences helped me improve tremendously years ago, and now I can see myself improving again.

 

But, even while I am still at what I think is just the beginning of my improvement, I'm noticing more and more a rather serious problem.

To put it politely, I hate how people think, now.

 

I know it's really unfair because I didn't know any better mere weeks ago, and years ago I was a good textbook example of an intelligent person who'd keep mainly using his intelligence to rationalise whatever questionable decisions he made, but I just can't help it.

 

I notice leaps of logic, cognitive missteps and dumb conclusions from people who are considered smart, deep and expert while they talk on the radio or in other media, and I get angry.

I notice idiotic ideas, as well as practices of thought that are the cognitive equivalent of shooting yourself in both knees, spreading inside ideologies I deeply care about, because the evils they fight are very real and demonstrated by science. But now I can see how all the truth is hopelessly getting mixed up with stuff that's just stupid or wrong, and that the intelligent people who once introduced me to these ideologies are absolutely incapable of judging and criticising any bad idea that comes from their own side, and I get livid.

Half the time I hear someone talking, I have to choose between politely tearing apart the majority of what they said while growing more and more annoyed, or just shutting off my attention and thinking about something else while pretending to listen.

And all this is just when I have to deal with intelligent people. 

I can't comprehend how a stupid person thinks unless I stop thinking of him as an actual human being, switch off my empathy completely, and model him as a badly designed computer program with a bunch of floating beliefs in his memory and no analytical or critical skill whatsoever. If I try doing it the intuitive way, using empathy and telling my brain to think like him, my brain just keeps running out of suspension of disbelief: I can't avoid thinking that, no matter how much I believed that political party/religion/philosophy X was right, I'd still recognise the blatantly idiotic part of it as a very, very stupid idea the first time I heard it. Even before rationality I was never stupid enough to believe something that was plain dumb even at surface level, so I can't even understand why he's doing what he's doing, forget predicting it.

 

And all this is really starting to weigh on me. I think my mood has changed for the worse in the last few weeks.

If you have read HPMOR, I think I'm starting to feel like Professor Quirrell, and my brain has started to actually think the words "avada kedavra" when I hear something particularly stupid and hateful. I wouldn't do it even if I could get away with it, but, emotion-wise, I have to consciously remind myself of reasons why killing someone that stupid wouldn't be a net positive gain for mankind and wouldn't just spare us a waste of oxygen. The me of several years ago would have just smirked and nodded at this kind of thought, but I want to be smarter than the old me, and smarter than Professor Quirrell as well.

 

I'm sorry if that was longer and more emotional than strictly necessary; I wanted to communicate exactly how I feel, and I really needed to say these things to someone. I'll try to go straight to the point now.

I think that rationality is completely worth it. I don't regret studying it at all, I don't want anyone to think that I regret it or to suggest not studying it, and I will continue to move forward and improve myself. But I also think that the smart thing to do is to look for ways to cheat and avoid paying this "price" as well.

 

So, what I want to know is: 

  1. Did other people who already learned rationality go through this as well?
  2. If so, does it continue, or do you eventually just get used to other people being insane and stop minding it emotionally that much? I can't remember being this annoyed at people when I had read the first half of the Sequences.
  3. Do you know of, or have you tried, any particular strategy for not being annoyed by, or feeling... disinterested in, other people? If so, did it work? Could you suggest any material that explains it in more detail?
  4. What do you currently do when you have to deal with the kind of problem I have described? (If your answer to this is similar to 3, you can just skip it.)
  5. Can you suggest any material or strategy for effectively modelling and predicting stupid people's behaviour?

And, on a side note:

6. Can you recommend any reading material or training that you think made you smarter or better at predicting the world or other people? I have checked some of the posts about this on this site but still thought it was worth asking. If you know of posts or lists about this, linking them would also be a huge help.

 

Thanks to everyone who chooses to answer this; I'll really appreciate any help and information I can get.

 

Edit 04/11/2020: I stress-tested some of the advice I could apply right away, by watching a 45-minute video of interviews conducted at a mass protest of Covid-19 deniers.

I got angry about twice, and got a really odd look from the person who was with me because I said out loud something about the most annoying kitten I ever saw, but I have to say my mood was a lot better than when I usually tried to just not get angry at people.

What seemed to work the most was:

This question has been really useful to me already; I expect its usefulness will shoot up a lot further as I read the materials people suggested to me.

I really wish to thank everyone for the excellent advice, and please do feel free to still post advice on 6. if you wish to!

Answers

answer by lsusr · 2020-11-03T21:27:16.058Z · LW(p) · GW(p)

Just as intelligence is orthogonal to morality [LW · GW], the intrinsic value of a human being is orthogonal to that human being's intelligence. I don't judge other people for being stupid any more than I would judge a dog for being stupid. We are all just animals. I love dogs and people for being exactly what we are.

I went through a cynicism phase similar to what you seem to be going through. I realize, looking back, that my disdain was connected to having low status myself. These days, now that I have high status, I think of dumb people more like kittens and less like bad guys.

If you think you are smarter than other people then either you are wrong or you are right. If you are wrong then you should change your mind. If you are right then you live in an extremely inefficient world and can make a killing [LW · GW]. The antidote to stupid words is intelligent action. If you're not winning then you're doing rationality wrong.

In the land of the blind, the one-eyed person is dictator. It's good to be the dictator. If you're not dictator then either you are blind or you do not live in the land of the blind.

Can you recommend any reading material or training that you think made you smarter or better at predicting the world or other people?

  • Abstain from stupid media [LW · GW] like news, Facebook and videogames.
  • Learn to use Anki spaced repetition software.
  • Teach yourself to read and write Chinese. (This is my favorite antidote for thinking you're smarter than other people.) Then read The Art of War in its original language.
  • Complete a college degree in physics.
  • Complete a college degree in mathematics.
  • Learn economics, especially microeconomics.
  • Read all of Paul Graham's articles.
  • Teach yourself computer science and machine learning.
  • Start a tech company.
  • Start a non-tech enterprise.
  • Get in shape by lifting weights.
  • Learn history. Make sure you cover at least three major civilizations (China, the Islamic World and Europe are a good place to start). This helps with perspective.
  • Read ethnographies on pastoralism and hunter-gatherers. Two excellent books are Arabian Sands by Wilfred Thesiger and Nisa by Marjorie Shostak. This helps you understand what people were designed for.
  • Learn the basics of evolutionary biology.
  • Acquaint yourself with the research on IQ and the Big 5 personality traits.
  • Take a long-distance trip with $100 in your pocket, earning the money you need to survive en route.
  • Teach classes.
comment by Emiya (andrea-mulazzani) · 2020-11-04T03:00:43.388Z · LW(p) · GW(p)

This is... an impressive list. I really mean it.

Some items are pretty much exactly what I need for my goals, and if I had a lot of time I could try a lot more. 

Sadly, I need to get as smart as I can really fast. I do now have a lot of things that are going into my "first century of life" list, though.

 

I don't judge other people for being stupid any more than I would judge a dog for being stupid.

It's funny, I got to a similar moral conclusion about an hour before reading it in your answer. 

 

I think of dumb people more like kittens and less like bad guys.

This is an extremely useful way to think about it.

 

If you are right then you live in an extremely inefficient world and can make a killing.

I have had an insistent feeling about this for a while, but I just had vague ideas I couldn't focus on or test. This seems an extremely good point from which to start thinking about it.

 

In the land of the blind, the one-eyed person is dictator. It's good to be the dictator. If you're not dictator then either you are blind or you do not live in the land of the blind.

I guess it's not really relevant, but this is the first time someone has managed to describe my exact feelings about this. Thank you.

(Not that I want to literally be a dictator; I'm stating it out loud just so I don't risk being misunderstood by someone else who hasn't had my exact thoughts.)

 

Teach yourself to read and write Chinese. (This is my favorite antidote for thinking you're smarter than other people.) Then read The Art of War in its original language.

I had tried to idly learn Japanese as a pastime; it took me around five days to realise it was just wasted time if I couldn't dedicate some serious effort to it. I think I was told by friends that Chinese is substantially harder. Could you give me an estimate of how much The Art of War loses when read in a good translation?

Replies from: lsusr
comment by lsusr · 2020-11-04T05:56:01.736Z · LW(p) · GW(p)

When translated into English, The Art of War loses almost as much as Romeo and Juliet loses when translated into Japanese.

If you can't read it in Chinese then this is the best translation I know of.

comment by Nacruno96 · 2020-11-05T23:54:02.550Z · LW(p) · GW(p)

Which gives the person who is asking nothing. "Just do what is fun for you" would be better advice.

answer by Daniel Kokotajlo · 2020-11-02T16:31:18.622Z · LW(p) · GW(p)

A wise friend once said to me something like this:

"You could look at all the stuff that's happening in the world, and all the things people are saying and doing, and be like 'They're all monkeys! Monkeys in suits! AAaaaagh!' However, you could also look and say: 'Wow, look at what the monkeys built! It's so cool that they got even this far!"

When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization. Otherwise, civilization would have happened earlier, to our hominid ancestors. We're just monkeys that have learned some cool tricks.

The next thing to remember, of course, is that you're a monkey too. You may be teaching yourself some cool rationality stuff, but you are still a monkey, and if you aren't careful you'll get arrogant/overconfident or some other such problem.

...practices of thought that are the cognitive equivalent of shooting yourself in both knees, spreading inside ideologies I deeply care about, because the evils they fight are very real and demonstrated by science. But now I can see how all the truth is hopelessly getting mixed up with stuff that's just stupid or wrong, and that the intelligent people who once introduced me to these ideologies are absolutely incapable of judging and criticising any bad idea that comes from their own side ...

I sympathize with this bit especially. My reaction tends to be more cosmic horror than anger/frustration though. I tried to express it here [LW · GW].

comment by [deleted] · 2020-11-03T07:18:59.535Z · LW(p) · GW(p)

"When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization. Otherwise, civilization would have happened earlier,"

I actually profoundly disagree with this both empirically and theoretically.

Civilizations are not some kind of natural inevitable 'next step' that must happen when you have a smart animal.  They are a thing that CAN happen in the context of a smart animal that is capable of inventing agriculture.  But there are other prerequisites.

I find persuasive the argument that complex culture is a thing that can happen in dense enough human populations, running away as it further densifies the population.  The idea is that in a low-density human population, ideas sometimes fail to percolate down the generations, while in a dense enough social network innovations stick down the generations more frequently because losses are less likely.  It is possible that you can reach a 'tipping point' in a dense enough population, at which point the ability to pass on new innovations allows a denser population still and further accumulation of complex culture.

There is a bit of a case study in Tasmania.  The native Tasmanian population had continuity with the aboriginal Australian population before the end of the ice age, when the two landmasses were united.  Ten thousand years later, upon European contact, the Aboriginal Australians maintained oral culture of events and places tens of thousands of years back, and had kept and expanded upon the toolset that existed in the united landmass... while the Tasmanians, with a smaller social network and a less dense population on that land, had lost large numbers of tools and skills including the ability to produce fire de novo (while still being able to propagate it).  

I think Neanderthals are also likely evidence pointing in this direction.  Their brains were more or less the same size as ours, and they had a common ancestor with us a full 500,000 years ago.  But they lived in the frozen wastes of ice age Europe, in small isolated subpopulations if the homozygosity of the Neanderthal paleogenomes is to be believed, with LOTS of small subpopulation bottlenecks.  That's a perfect recipe for repeatedly losing your complex material culture down the generations.

Empirically, human brain size has also been on a downtrend for the past fifteen thousand years as agriculture and civilization have spread.  It is a simpler environment, with fewer complex things you need to interact with on their own terms and significantly worse nutrition, so we give up some small fraction of our highly expensive intelligence over long periods of time.

Replies from: daniel-kokotajlo
comment by Daniel Kokotajlo (daniel-kokotajlo) · 2020-11-03T17:03:53.458Z · LW(p) · GW(p)

Good points. I think I agree with everything you said, so I'm confused as to why we disagree. I guess your model is that we got intelligence + rationality first, and then civilization came later when we had population density, and therefore we might have more intelligence + rationality than we need to sustain civilization. The fact that brain size has been shrinking supports this; maybe we were more rational 15,000 years ago, or at least more intelligent.

I think my claim is still true though -- it does seem like civilization would collapse if we got significantly dumber or less rational. I guess I had been meaning "hovering around bare minimum level" more loosely than you.

Replies from: daniel-kokotajlo
comment by Daniel Kokotajlo (daniel-kokotajlo) · 2020-11-04T11:47:35.689Z · LW(p) · GW(p)

I think I concede that my argument was shaky and that we probably aren't at the bare minimum level for reasons you mention. But I still think we are close, for a loose definition of close.

comment by Emiya (andrea-mulazzani) · 2020-11-02T18:16:19.816Z · LW(p) · GW(p)

When you think about it, because of the way evolution works, humans are probably hovering right around the bare-minimal level of rationality and intelligence needed to build and sustain civilization.

I think this will be a really helpful thought to keep in mind, thank you.

 

The next thing to remember, of course, is that you're a monkey too.

Also helpful; I think I was starting to think of myself as being done with all the basic biases.

 

My reaction tends to be more cosmic horror than anger/frustration though. I tried to express it here.

I guess I could try to see it that way; at least I wouldn't be angry at people who actually helped me improve back in the past.

I sometimes thought about what happened to them in terms of: "That stupid way of thinking got them; it will mess them up more and more if they don't get rid of it."

It seems better to try seeing them as infected by a bad, harmful meme than to just get angry at them because they're suddenly being stupid.

answer by aa.oswald · 2020-11-02T16:22:49.585Z · LW(p) · GW(p)

When a person changes their way of thinking radically, it is normal for them to want to tell everybody about it. This happens even if the change is what people here might consider irrational - think becoming religious. There's even a Wiktionary phrase for it, "passion of a convert".

So, the first thing I would say to your anger phase is, "Don't worry, you'll get over it."

If you want to speed up getting over it, it might be useful to practice two things. The first is to really focus on personal improvement and realize you're still a newb. The second is to deeply empathize with why other people do and believe the things they do, and realize that you were that way even a few weeks, months, years ago.

A sophomore in engineering can't feel angry that an undecided freshman doesn't know calculus. A senior in aerospace engineering can't feel angry that a senior in mechanical engineering doesn't know anything about wing design. Who are you to get angry that a person hasn't memorized yourbias.is when you can't even differentiate the Many Worlds interpretation from the Copenhagen interpretation?

Everybody is still building out their map, and just because you've luckily found yourself on a piece of elevated territory and are able to make a better map doesn't mean those with a lesser view are worse.

Secondly, it would help to read about how people come to their world views, and also specifically about how people came to the rat community. Basically, read people's personal "testimonies" and you'll find that a lot of it is driven by a mixture of personal and cultural facts. Also read testimonies of people who converted into different religions, or even of people who didn't convert at all.

For example, I have a Jehovah's Witness friend. She got very close to deconversion 10 years ago, to the point of listing out reasons that the JWs were wrong. Yet, last I saw on Instagram, she was going to the JW headquarters and performing missionary work. Her family, her extended family, and most of her friends were all religious. Can I really be angry that her brain said, "Am I going to believe what gives me massive amounts of comfort, or am I going to believe something that could literally cause my death?"

As far as books, I would encourage reading Jonathan Haidt's The Righteous Mind. The book attempts to look at the evolutionary background for humans' moral systems, and is very good at injecting a large dose of empathy into its readers.  

comment by Emiya (andrea-mulazzani) · 2020-11-03T13:43:22.780Z · LW(p) · GW(p)

So, the first thing I would say to your anger phase is, "Don't worry, you'll get over it."

That's a relief.

A sophomore in engineering can't feel angry that an undecided freshman doesn't know calculus.

Yeah, I usually try to think like that; what I felt lately was more like... finding out that your calculus professor doesn't actually know how to do calculus, or that the freshmen in a scientific faculty can't actually manage to understand simple Aristotelian logic...

Usually I get most of my annoyance from listening to supposed experts who are making evident mistakes, or from listening to people who are particularly stupid.

 

Everybody is still building out their map, and just because you've luckily found yourself on a piece of elevated territory and are able to make a better map doesn't mean those with a lesser view are worse.

A really... sobering way to look at it, thank you. 

I had been trying to be as smart as I could for years even before finding rationality, but finding something that good, which jumpstarted my accuracy and intelligence a lot, was sheer luck.

Also, I didn't really do anything to be born with above-average intelligence, and I didn't do anything to be raised in a home where education was highly valued, so I guess that even trying to be smarter isn't such an obvious idea to have.

I guess we could call it the self-made-man fallacy: if you saw hard work pay off for you, you feel like everyone else ought to just try, and it would work for them too; but you don't notice the strokes of luck you had, or that you still had an advantage to start with.

And I knew all this stuff already, but... I don't know, I guess I still felt as if certain things were so obvious that anyone not figuring them out would have no excuse, since those were things I always knew; so I've allowed my emotional response to be shaped by how this feels from the inside.

 

Can I really be angry that her brain said, "Am I going to believe what gives me massive amounts of comfort, or am I going to believe something that could literally cause my death?"

Your friend isn't the kind of person I'd have got mad at, at least if I knew what you know about the things that trapped her into staying... which, I just realised, is the correspondence bias word for word.

If I can't see why people are missing the obvious truth (though I don't consider dropping religion obvious; I know it can be pretty hard), I might just not know enough about how they learned to think, or what harmful meme got to their cognition before I met them, or what they think would happen if they didn't believe what they believe... Even pure cognitive laziness has to be caused by something. I shouldn't just have written "dumb" on my model of their cognitive processes, as if dumbness were a simple mystical essence with no moving parts; I should have known better.

I've read about this exact mistake so many times that it's not even funny. I had to force myself to spell it out here, even though I know it's really a good thing that I've noticed it and am admitting it, because getting the basics wrong feels just so embarrassing.

 

Asking this question was extremely useful to me, it seems. I'll check out the book; it seems pretty much what I was looking for.

answer by Viliam · 2020-11-03T00:05:01.176Z · LW(p) · GW(p)

Reading Less Wrong made me unable to enjoy debating politics. Now the average online debate seems like a competition over who is most stupid. When Facebook shows me a news article with more than 100 comments and I read a few of them, I feel dirty.

My recommended first help would be: think less about the stupidity of other people, and more about your own. (Applying my lesson to myself: why am I clicking the "comments" link when I see there are more than 100 comments? And why am I even browsing Facebook in the first place?) If you are so rational, why aren't you winning more? Yeah, some things in life depend on the cooperation of others. But some other things don't -- have you already maximized those? Why not? Did you already clean up your room?

And my point here is not that if you focus on improving yourself, miracles are going to happen just because you read the Sequences. It's just that focusing on improving yourself has a chance to lead to something useful, unlike complaining about the stupidity of others.

Most people simply don't care about their sanity. It is a fact about your environment; deal with it. To a certain degree, this is about "near" vs "far" thinking (Robin Hanson writes a lot about it); people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote. They survive because they do not try to connect these two parts; it is as if they live in two completely different universes at the same time.

When you think about incentives, here is the reason: in the "near" mode you are rewarded or punished by the natural consequences of your actions; in the "far" mode you are rewarded or punished by the social consequences of your statements. Thus it makes sense to act reasonably in your everyday life, and spout exactly the type of crazy bullshit that gets rewarded in a given social situation. On average. Sometimes following the socially approved action (using homeopathy for an actual illness, or not wearing a face mask in the COVID-19 situation) gets you killed. But historically, way more people got killed because they pissed off their neighbors by openly disagreeing with them about something; and it didn't matter who was actually right.

I kinda see people on a scale, roughly separated into three groups. On one extreme, wannabe rationalists. Those are my tribe. On the other extreme, the actively irrational; the kind that not only believes something crazy, but won't shut up about it. Those I consider hopeless. But between them, and I think it might be the majority of the population, are people who kinda try to do their best -- sometimes impressively, sometimes their best is not very good; who have some bullshit in their heads because their environment put it there, but are not actively promoting it, merely unable to clean it up; and who are able to see and listen. With those, I need to find the safe set of conversation topics, and remain there most of the time, sometimes gently probing the boundaries. There is this "agree to disagree" bullshit, which would be intellectually lazy and kinda offensive against your fellow rationalists, but is a great peace-keeping tool between different tribes.

I never try to convert people. I explain, sometimes I nudge. If there is no reaction, I stop.

I am bad at predicting stupid people. I mean, I can vaguely predict that they will most likely "do something stupid", but it is hard to make specific predictions. People are usually driven by emotions: they defend what they like, and attack what they dislike. They like things that make them feel good, and dislike things that make them feel bad (e.g. being told they are wrong about something). But in real-life situations, multiple forces act upon them at the same time, and I can't predict which effect will prevail.

comment by Emiya (andrea-mulazzani) · 2020-11-03T09:43:07.495Z · LW(p) · GW(p)

My recommended first help would be: think less about the stupidity of other people, and more about your own.

This is generally good advice, and I do need to be more mindful of my own stupidity, but my problem isn't that I go searching for other people's stupidity so I can get angry at them; it's more that... I'm getting more and more annoyed every time I accidentally bump into it, and I'm trying to avoid reacting by shutting off everything and everyone. Though some of the advice I'm receiving looks helpful for not doing that.

 

... people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote.

But historically, way more people got killed because they pissed off their neighbors by openly disagreeing with them about something; and it didn't matter who was actually right.

I guess that could explain the lack of critical sense they show about stuff they aren't experts on. I've never cared about simply agreeing with other people's ideas if they didn't seem right to me at first sight, and I usually thought I was the one who knew best (even when deeply wrong about it), so that's not a factor my brain considers when trying to simulate other people. Thank you for this useful insight.

 

There is this "agree to disagree" bullshit, which would be intellectually lazy and kinda offensive against your fellow rationalists, but is a great peace-keeping tool between different tribes.

I hadn't thought of it that way. I was refusing to "agree to disagree" as if refusing were a moral rule, but I should stick with it if I see no chance I can actually persuade someone. To be more precise, I had figured out that between non-rationalists it was often better to agree to disagree, since arguing would be a lost cause, but I thought I just couldn't do that, no matter who I was talking to.

I'm still a bit queasy about apparently supporting bad epistemology, so I think I'll try to state it like "We can't both be right, but I guess talking about it won't lead us anywhere, so let's just forget about it".

Replies from: Viliam
comment by Viliam · 2020-11-03T17:07:12.142Z · LW(p) · GW(p)

"We can't both be right, but I guess talking about it won't lead us anywhere, so let's just forget about it"

Yep. Let's not fight about it.

I would say that even among rationalists, it may be sometimes useful to settle for: "logically, at least one of us must be wrong... but finding out which one would probably be too costly, and this topic is not that important".

Replies from: andrea-mulazzani
comment by Emiya (andrea-mulazzani) · 2020-11-03T18:06:48.174Z · LW(p) · GW(p)

Ironically I understood the "too costly" logic between rationalists pretty fast, since I've witnessed arguments being dissolved or hitting an objectively hard barrier to overcome really fast. 

When I'm dealing with non-rationalists, instead, I kinda have the impression that agreement is just around the corner.

"I understood your point of view and I have changed mine if I was doing a mistake. If we are still talking it means I figured out what mistake you are doing, why can't you just understand what I'm saying or tell me the part you aren't understanding, I'm doing my best to explain and I've been  honest with you..." 

That's the sensation I usually feel when I care enough to argue about something and don't write the effort off as hopeless from the start. But it's just that, what I feel; it's clearly not easy at all for someone to do all of a sudden what I specifically trained myself to do.

comment by lsusr · 2020-11-03T21:33:25.780Z · LW(p) · GW(p)

To a certain degree, this is about "near" vs "far" thinking (Robin Hanson writes a lot about it); people usually behave quite reasonably in their everyday lives, and say utterly crazy bullshit about anything abstract or remote.

When you think about incentives, here is the reason: in the "near" mode you are rewarded or punished by the natural consequences of your actions; in the "far" mode you are rewarded or punished by the social consequences of your statements.

This is very good.

answer by ChristianKl · 2020-11-04T15:23:17.162Z · LW(p) · GW(p)

I notice leaps of logic, cognitive missteps and dumb conclusions from people who are considered smart, deep and expert while they talk on the radio or in other media, and I get angry.

If you are listening to an expert on the radio or similar mainstream media, the expert is giving you a dumbed-down argument for the position he's holding.

In an interview you have cycles of the expert making a complex claim and then the interviewer telling them: "Can you say this in a more concise way?"

If the expert doesn't really get it, they might also be told: "Part of our audience is housewives who never went to college and who listen to our program while doing the dishes; can you make your point in a way that the housewife also understands while she does the dishes?" (This example is recounted from memory from https://media.ccc.de/v/24c3-2334-de-die_wahrheit_und_was_wirklich_passierte/related )

The argument that the same expert would make when sitting down with colleagues, where the expert can have an off-the-record conversation, will be more nuanced and complex than the argument the expert gives on the radio.

If you hear an obviously flawed argument on the radio, you shouldn't jump to the conclusion that the expert making it is stupid, but rather that they are just not in a position to give you the nuanced argument.

comment by Vladimir_Nesov · 2020-11-04T17:13:30.049Z · LW(p) · GW(p)

"Can you say this in a more concise way?"

"No."

(When talking to non-experts, most points should become less concise than when talking to other experts, because to meaningfully communicate anything to a non-expert, you also have to communicate the necessary prerequisites that other experts already know.)

Replies from: ChristianKl
comment by ChristianKl · 2020-11-05T10:42:08.251Z · LW(p) · GW(p)

It's a valid stance to take, but it's the stance that gets the journalist to ask some other expert who's willing to be concise. The people you hear interviewed are generally willing to play the journalists' game.

As a news consumer, it's useful not to have misconceptions about what kind of information you are exposed to.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2020-11-05T12:28:13.185Z · LW(p) · GW(p)

Exactly, that's what makes the question as you formulated it funny. It's not a question, or even a request. It's a non-negotiable demand. If you don't concede, the whole deal is off. Yet not conceding is often the only reasonable thing to do, so it's a demand to be unreasonable masquerading as a question, because don't be rude.

comment by Emiya (andrea-mulazzani) · 2020-11-04T15:52:10.376Z · LW(p) · GW(p)

I hadn't thought about this possibility. 

I remember having noticed people explaining badly things I knew were actually right and had better proofs than what was being explained. If I hadn't known about the evidence already, I wouldn't have noticed they were misrepresenting the position; but the subjects I get angry about are rarely the kind of thing where the background knowledge is so complex you can't explain it properly to a layman.

Some of what I got angry about were just plainly stupid ideas; it doesn't look plausible that the people talking had better reasons to support them and just weren't saying so. It clearly looked as if someone was trying to be clever rather than trying to think about the evidence, and those people were the experts of their side, not the village fools.

But I did get angry at people who were quoting research and studies I hadn't read because, from the way they explained said studies, it was clear to me that the research was plain rubbish, conceived by someone who just doesn't understand what research and evidence are.

But it's indeed possible that the people quoting it had just made a mess and understood nothing... 

I always tried to avoid being misled just because I didn't understand passages of the two-minute version of an idea; if I wasn't understanding why they had said a thing, I'd go read more on it.

I never thought that people could go as far as completely botching the two-minute version they had explained to me, even when they are supposed to have studied it, but it's clearly possible, even if not so likely.

I'll have to remember to check the original sources when I really should get something right.

Replies from: ChristianKl
comment by ChristianKl · 2020-11-04T20:48:18.882Z · LW(p) · GW(p)

I remember a TV interview I did with a friend on Quantified Self. One of the elements was my friend measuring stress with an emWave2. In the process of dumbing down the complexity of what we were doing to make it TV-compatible, my friend in the end said that he was measuring heart rate with the emWave2 to measure stress.

The thing the emWave2 actually measures is heart rate variability, but there was no time to explain what heart rate variability is. A viewer who actually understood the subject matter would rightfully find it strange that my friend said he measures heart rate for stress, but for the average viewer that inaccuracy wasn't a big deal.

Complexity reduction like that happens when focusing on expressing oneself in a way that works on TV and the radio.
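
To make the distinction concrete: heart rate is just beats per minute, while heart rate variability looks at beat-to-beat fluctuations. Here is a minimal Python sketch of one common HRV measure (RMSSD), using made-up RR intervals; the numbers and variable names are purely illustrative, not anything the emWave2 actually reports.

    import math

    # Made-up RR intervals: seconds between consecutive heartbeats.
    rr = [0.80, 0.78, 0.83, 0.79, 0.85, 0.77]

    # Heart rate is just the mean beat frequency, in beats per minute.
    heart_rate = 60 / (sum(rr) / len(rr))

    # RMSSD, a common HRV measure: root mean square of successive differences.
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))

    print(f"heart rate ~ {heart_rate:.1f} bpm")  # ~74.7 bpm
    print(f"RMSSD ~ {rmssd * 1000:.1f} ms")      # ~53.9 ms

Two recordings can have the same heart rate but very different RMSSD, which is why "measuring heart rate" was an inaccurate simplification.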

Replies from: andrea-mulazzani
comment by Emiya (andrea-mulazzani) · 2020-11-05T00:39:03.600Z · LW(p) · GW(p)

I see, thank you for this example. 

I'll remember to prepare the dumbed-down explanations in advance; in my plans I'll have to communicate a lot in the future.

answer by Vladimir_Nesov · 2020-11-02T16:19:05.178Z · LW(p) · GW(p)

I experienced something similar with spelling mistakes for a while. The solution was to explicitly conceptualize text-on-the-page as separate from idealized-text, so that the mistakes could be imagined to be blissfully absent in the idealized text.

The issue is that when you notice a bug, there is an urge to fix it that demands satisfaction. Sometimes, there is an actual plan that fixes the bug, but intuition won't come up with it [LW · GW], so deliberative thought needs to help. When fixing the bug is not on the table, it might suffice to just carefully formulate what's known about it, perhaps writing up some notes.

For people, productive activities include charity and steelmanning: figuring out why a behavior actually happens and how to channel its purpose better.

comment by Emiya (andrea-mulazzani) · 2020-11-03T13:46:32.106Z · LW(p) · GW(p)

Thanks for the link; the chewing example does feel similar to my experience, I will try to think about that.

answer by waveman · 2020-11-03T07:22:23.279Z · LW(p) · GW(p)

It is kind of a meme that people learn about rationality and then observe how irrational everyone else is. It is a lot easier to observe others' irrationality than your own. But probably one's own irrationality is more important.

1. So, work on your own irrationality first before focusing on others' limitations. 

2. As for dealing with other people's irrationality, see (1). 

3. Finally, people are going to do what they want to do. With some very rare people you can introduce them to rationality things and they might change. With most, they cannot or don't want to be rational. This is the reality that you need to rationally deal with. 

4. Also be aware that full rationality is not possible, in the sense that you cannot do all the calculations needed to behave totally rationally. You need to employ all sorts of heuristics and shortcuts. My computational capacity is limited, as is my memory. Gathering data is costly. Time is short. So tolerate other people who deal with this in ways you might not prefer.

comment by Emiya (andrea-mulazzani) · 2020-11-03T14:23:14.821Z · LW(p) · GW(p)

It is kind of a meme that people learn about rationality and then observe how irrational everyone else is. It is a lot easier to observe others' irrationality than your own. But probably one's own irrationality is more important.

I've just felt how much this is true by thinking about some of the answers I got. 

There really is a huge difference between just "knowing" something (I'd have known this even before being told in these replies) and actually realising that I was making stupid mistakes in how I thought about this very subject.

I would have agreed with points 1 and 2 right away, and I wrote this question with 3 firmly in mind, so I thought I was being really rational about this whole issue: I actually knew I was just supposed to search for a way to solve how I felt about it, not to magically expect people to change overnight. And I had still overlooked a mistake I was making in how I thought about "dumb people", one that was causing most of my negative feelings.

 

About 4: heuristics, shortcuts and not wanting to think about something were all things I understood and tolerated in other people.

I felt angry when 

1) facing the sheer, total lack of judgement that some people show in areas where I felt they should at least try to have some, and 

2) facing the more questionable approaches to finding truth that supposedly smart experts use while talking about stuff they are thought to know; the kind of stuff you find in magical theories and psychoanalysis, only apparently it has been creeping into all types of modern humanistic fields. Before, I could just vaguely recognise that something was wrong with how they reached a conclusion; as soon as I finished reading about reductionism, and about all useful parts of cognition having to be Bayesian at some level, I could suddenly give that a name, understand exactly what they were doing wrong, and put it into words. So I suddenly got a lot more annoyed at them, as if it was the mistake that had just gotten dumber rather than me getting smarter.

comment by lsusr · 2020-11-03T22:01:42.456Z · LW(p) · GW(p)

I like how much your answer bears resemblance to advice on other subjects unrelated to rationality.

answer by Gunnar_Zarncke · 2020-11-03T21:57:01.658Z · LW(p) · GW(p)

Related: The treacherous path to rationality [LW · GW]

I think that people don’t want to use explicit reason. And if they want to, they fail. And if they start succeeding, they’re punished. And if they push on, they get scared. And if they gather their courage, they hurt themselves. And if they make it to the other side, their lives enriched and empowered by reason, they will forget the hard path they walked and will wonder incredulously why everyone else doesn’t try using reason for themselves.

Maybe your question is addressed by this part:

People just like their friends. It simply feels right. It's what everyone does. The way out of the valley [of disintegration] is not to reject this impulse [...] but to integrate your deep and sophisticated friend-liking mental machinery with your explicit rationality and everything else.

The way to progress in rationality is not to use explicit reason to brute-force every problem but to use it to integrate all of your mental faculties: intuition, social cognition, language sense, embodied cognition, trusted authorities, visual processing… The place to start is with the ways of thinking that served you well before you stumbled onto a rationalist blog or some other gateway into a method and community of explicit reasoners.

comment by Emiya (andrea-mulazzani) · 2020-11-04T02:12:35.734Z · LW(p) · GW(p)

I was really puzzled reading that post; to me, learning rationality always felt wonderful. My first round with it was like suddenly noticing I had been living in a really small cage inside my head, and now I could open the door, get out, walk on my own legs for the first time, and then run. Now that I'm finally managing to continue, I feel like the rest of the world just gets clearer and clearer to understand, even if I got these negative emotions as side effects.

I can only assume I was the ideal subject to learn it; when I stumbled into it, I was managing to sabotage myself at everything relevant I tried to do, in an obstinate attempt to not risk gaining any possible disconfirmation about my intelligence.

I had written more about it, but then I realised I should just write a coming-of-age post or postmortem about this.

 

Back to the subject:

And if they make it to the other side, their lives enriched and empowered by reason, they will forget the hard path they walked and will wonder incredulously why everyone else doesn’t try using reason for themselves.

I guess this kinda describes what happened to me; it wasn't exactly a perilous path, but I did put in a lot of work.

The way to progress in rationality is not to use explicit reason to brute-force every problem but to use it to integrate all of your mental faculties: intuition, social cognition, language sense, embodied cognition, trusted authorities, visual processing… The place to start is with the ways of thinking that served you well before you stumbled onto a rationalist blog or some other gateway into a method and community of explicit reasoners.

I'm really unsure about how I could try to integrate my intuitions into my explicit reason; at first sight they seem like incompatible processes, since you can't really understand why you are having a particular intuition (if the post uses "intuitions" to mean the kind of ideas or judgements you can't explain at an explicit level).

Or is the suggestion to apply explicit reason to check whether the initial suggestions my intuitions give me make sense?

So far I haven't managed to intentionally use intuition to solve a single relevant problem. I think my mind mostly uses intuition when I don't have the time to make all the calls by explicit reason, or for pre-selecting good ideas and pointing out possible mistakes that I then examine.

All in all, I don't think I trust my intuitions much, because explicit reason improved my performance a lot and I feel very nervous about going with something I can't make sense of.

If anyone has thoughts on this or suggestions I'd love to hear them. The other mental faculties mentioned seem easier to integrate with explicit reason.

 

People just like their friends. It simply feels right. It's what everyone does. The way out of the valley [of disintegration] is not to reject this impulse [...] but to integrate your deep and sophisticated friend-liking mental machinery with your explicit rationality and everything else.

I know; I wrote this question also because I didn't want to risk feeling angry or disinterested toward my friends. Even if I know they are relatively "crazy", I don't feel at all like I shouldn't be friends with them anymore...

I guess it would be a good idea to remind myself to notice and appreciate what I like about them and the warm things they do, even if they aren't at all related to being smart or rational.

Replies from: Vladimir_Nesov, Gunnar_Zarncke
comment by Vladimir_Nesov · 2020-11-04T02:49:58.276Z · LW(p) · GW(p)

you can't really understand why you are having a particular intuition

Intuition is distilled [LW · GW] deliberation. Deliberation is a sequence of intuitive steps, amplified [LW · GW] intuition. A given intuition is formed by (and stands for) the dataset that trains it, the habits of deliberative thought on its specific topic.

comment by Gunnar_Zarncke · 2020-11-04T19:06:00.979Z · LW(p) · GW(p)

I didn't intend to imply that learning rationality has to feel difficult or hard. It sure didn't for me, as my path started early and I had a lot of support. But I guess it can be challenging in some circumstances.

Replies from: andrea-mulazzani
comment by Emiya (andrea-mulazzani) · 2020-11-05T00:35:26.884Z · LW(p) · GW(p)

I understand; what I meant was that I initially felt confused reading the post you linked, since that one did imply that a lot of people find it hard.

But having thought about it, it seems likely that a lot of people would find themselves in those challenging circumstances.

answer by Stuart Anderson · 2020-11-03T06:10:07.279Z · LW(p) · GW(p)

-

comment by gilch · 2020-11-03T18:23:09.473Z · LW(p) · GW(p)

The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.

—George Bernard Shaw

I'm all for having an accurate map, and that does mean updating that map. But don't let that stop you from trying to alter the territory—and actually fixing problems.

If the world fails to meet your expectations, sometimes the problem is with the world.

Replies from: stuart-anderson
comment by Stuart Anderson (stuart-anderson) · 2020-11-04T00:08:22.510Z · LW(p) · GW(p)

-

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2020-11-04T02:12:42.179Z · LW(p) · GW(p)

Caring about things is by definition an emotive act.

I strongly disagree (about "by definition"; it's of course a popular sense of the word). Operationalization of caring is value, preference. It's channeled by decision making, and deliberative thought is capable of taking over decision making. As such, it may pursue an arbitrary purpose that a person can imagine. A purpose not derived from emotion in any way might be thought to be an incorrect idealization of preference, but even a preference ultimately grounded in emotion will be expressed by decisions that emotions are occasionally incapable of keeping up with.

Replies from: stuart-anderson
comment by Emiya (andrea-mulazzani) · 2020-11-03T12:40:20.023Z · LW(p) · GW(p)

When the world fails to meet your expectations then the problem isn't with the world.

This situation is an issue of emotional regulation (at the very least). I can recommend DBT as efficacious there.

Yes, that's exactly how I was looking at it, though I guess I didn't do a very good job of explaining that in my question.

I mean, I still think the current lack of rationality in the world is a big problem, but it's not like I expect people to do better any time soon; I was just looking for ways to avoid feeling the way I feel when I'm reminded of that.

I'll look into DBT and try your advice, thanks.

Replies from: stuart-anderson
answer by remizidae · 2020-11-02T15:41:55.907Z · LW(p) · GW(p)

Maybe it would help if you realized that most people most of the time are not interested in being explicitly rational. They’re focused on something else: often they’re focused on building relationships, or getting a task done, or enjoying themselves. Maybe you could try focusing on those things too, especially the relationship-building bit, instead of choosing between “tearing apart” or ignoring what they say.

Also, I don’t know how old you are, but I’ve noticed that the people I interact with have gotten more congenial over time. As a child/teen/college student, many of my interactions were with nonchosen family or classmates. Now most of my interactions are with chosen family, friends, or workmates filtered to be more like me.

Oh, and since you mention being annoyed by “experts” on the radio, maybe...don’t listen to the radio or other media. You probably don’t need to do that, you’re not getting any relationship-building benefits out of it, and it’s annoying you.

comment by Emiya (andrea-mulazzani) · 2020-11-02T18:32:57.139Z · LW(p) · GW(p)

Maybe it would help if you realized that most people most of the time are not interested in being explicitly rational.

I'm afraid that's the main reason I'm getting angry at them: the utter lack of even trying to be intelligent when they have to choose what to do or believe.

I never get angry at people for enjoying something stupid, or feel like they should treat each other as robots, or get angry because they just follow (non-evil) instructions; that I can understand.

I get angry only when it involves something where they really, really should try to get it right, and they don't even have excuses like being under stress or pressure, and they still don't try even just a little.

 

Though, now that I spell it out in more detail, I realise (well, remember, more than realise; I already knew that) that people could focus on those other aspects you mention even when they shouldn't.

This was helpful, I hadn't noticed that I needed a more complex model of "being dumb" both to model people and to not get angry at them.

 

Unfortunately, I had already selected what I'm exposed to, in relationships and media, as much as I could a while ago; it wouldn't be easy to make a second selection.

answer by cousin_it · 2020-11-03T16:33:24.212Z · LW(p) · GW(p)

One aspect of intelligence/rationality is estimating the productiveness of a conversation before it happens. Another is expressing your views in a way that sounds palatable even to those who disagree. Another is recognizing that on any given topic there are more knowledgeable people than you, and seeking them out. Another is directing most of your effort and emotion toward things you can influence. You can't learn these things from a book though, you have to practice them.

comment by Emiya (andrea-mulazzani) · 2020-11-03T17:58:01.281Z · LW(p) · GW(p)

I think I'm doing more or less okay in most of these (still room for a lot of improvement of course), my problem seems to be focused around:

1)

Another is expressing your views in a way that sounds palatable even to those who disagree.

I can only do this if I understand how someone thinks, and I have to get a better model of how the people I usually wouldn't really want to talk to think. (For my goals, I need to be able to influence those kinds of people as well.)

2)

Another is directing most of your effort and emotion toward things you can influence.

I'm pretty good at avoiding wasted efforts, not good at all at directing my emotions toward things that I can actually influence.

You can't learn these things from a book though, you have to practice them.

I'd say books make a stronger starting point for practice, but it hadn't occurred to me that I could just try and choose not to get angry. I've managed to regulate my emotions this way before, though I never tried it on quick emotions...

I've gotten advice that should make not feeling angry easier, but I will also give practice a shot.

I should really make a habit of focusing my emotions and attention on stuff that actually matters; failing to do so has screwed me up in a lot of ways already. Thanks.

answer by TheFishBowl · 2020-11-05T18:59:00.611Z · LW(p) · GW(p)

"I know it's really unfair because I didn't know any better mere weeks ago, and years ago I was a good textbook example of an intelligent person who'd keep mainly using his intelligence to rationalize whatever questionable decisions he made, but I just can't help it."

If we approach this through an economist's lens, the situation seems to change slightly. To an economist, a rational actor is someone or something that acts in her own self-interest. Acting in one's own self-interest means making decisions where the foreseen benefits outweigh the foreseen costs. This means that even someone who is addicted to a hard drug and continues to use that drug is acting rationally, as long as the benefits of continuing to use said hard drug outweigh the costs for that particular person.

Societally, the values may not align, but that doesn't mean they are irrational. It just means that our foreseen benefits and costs are different from theirs. If you want to go down that rabbit hole, look into behavioral economics. They like to claim that people can act irrationally.

comment by Emiya (andrea-mulazzani) · 2020-11-06T11:37:49.486Z · LW(p) · GW(p)

I understand what you mean, but, under that lens, I'd be using "irrational" to describe thought processes that negatively affect attempts to estimate the foreseen benefits and costs of a decision, or that cause people to connect their foreseen benefits with actions that have no real reason to lead to those benefits.

Also, the way the word is used on this site, "rationality" is also the art of managing not to have your short-term benefits get in the way of the real long-term benefits you'd rather choose, and of choosing which foreseen benefits and costs should matter most to you.

So the addict would be irrational if in his decision he considered only "pleasure from drug, +10 utility" and "temporary pain from stopping, -50 utility", rather than also "likelihood of slow disintegration of my life, 20% chance of -1000 utility" and "slowly decreasing effects from the drug and likelihood of increasing future difficulty in obtaining it, -80% of the future 'pleasure from drug, +10 utility'" -- because a) he's not considering the third factor at all since it's not certain, or b) he's considering only the first two because they are temporally closer, or c) he thinks that what he has seen happen to every drug addict he knows who is further along the temporal curve won't hold true for him, without having a good reason to think so. (This is not a good model of drug addiction, I think; I was just trying to describe what I mean.)
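
To make the arithmetic concrete, here is a toy expected-utility comparison in Python using the made-up numbers above. All values and probabilities are purely illustrative, and the model deliberately lumps the long-term terms together; it is a sketch of the kind of calculation meant, not a real model of addiction.

    # Made-up utilities from the toy example above.
    pleasure = 10          # immediate pleasure from the drug
    withdrawal = -50       # temporary pain from stopping
    ruin = -1000           # slow disintegration of life
    p_ruin = 0.20          # assumed chance of ruin if use continues
    tolerance_loss = 0.80  # assumed fraction of future pleasure lost

    # The "irrational" calculation weighs only the first two terms.
    naive_continue = pleasure    # +10
    naive_stop = withdrawal      # -50

    # A fuller calculation also counts the uncertain long-term terms.
    full_continue = pleasure * (1 - tolerance_loss) + p_ruin * ruin  # 2 - 200 = -198
    full_stop = withdrawal                                           # -50

    print(naive_continue > naive_stop)  # True: continuing "wins" naively
    print(full_continue > full_stop)    # False: stopping wins on the full model

The point is not the particular numbers but which terms enter the sum: dropping the uncertain or temporally distant terms flips the decision.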

 

Still, this isn't the kind of irrationality I was getting mad at. I think I was mostly getting mad at irrational decisions and thoughts that didn't have evident "strong causes"; addiction and social pressure are strong causes.

 

Going down that rabbit hole was already in my plans; I'll check out behavioural economics.

answer by BladeDoc · 2020-11-02T16:13:15.569Z · LW(p) · GW(p)

Not only are people not rational, most do not WANT to be rational, and they value many other things above rationality. Remember that the story arcs of characters like Spock, Data, and Sheldon do not celebrate their becoming more rational.

comment by Emiya (andrea-mulazzani) · 2020-11-03T13:52:42.718Z · LW(p) · GW(p)

That's an interesting thought. I was aware that most fiction keeps telling people that Kirk beats Spock, but I hadn't noticed that even the character arcs of rational or smart characters are almost never about getting smarter or improving their minds...

I think I've seen that in only a very few manga; even those with a genius main character never show him getting more rational or smarter at all. He's just a genius from start to finish, and any progress he makes is usually in other aspects of himself...

It seems this is really something that's lacking in our culture.

answer by Productimothy · 2023-01-30T18:41:17.739Z · LW(p) · GW(p)

You may be confused by some of my response. I'm well aware it deviates substantially from your inquiry; there is just substantial back-end stuff that I think would help your autonomy to improve more efficiently at anything.

In Eliezer's "12 Virtues of Rationality", read the last virtue: the nameless virtue of the void. Take what follows as a guide for approaching what he writes.

You appear to be approaching these problems with a vague mainframe, possibly even approaching rationality as a whole with a vague superframe. When you ask for advice and sources to help, you think you want the subframes, which will fit onto your vague mainframe. While that will correlate with better decisions and will eventually lead to a clear mainframe, it will not nail them as efficiently or as expansively as you could by deliberating the other way around (recall the effect of skimming a book before reading it, or of defining the purpose before acting, versus reading the book before skimming it, or acting without purpose).

To devise a mainframe, though, you do need some knowledge: both knowledge about how best to make a schema and general knowledge about your area of improvement. Very quickly, you will find yourself scaffolding a formalization of the outer boundaries of what you, and rationality, currently know.

This principle can be applied to learning efficiency, rationality, or anything cognitive. This is how the mind works most naturally. It is what top thinkers are actually doing; it is how some people see the world more clearly than others. It is how you prevent yourself from creating sub-optimal circumstances out of your own confusion and ignorance. It is not widespread, and much less so brought to application. There are tools and decisions that arise from it.

If you do not have a clear and accurate model on which to assess yourself, you cannot expect to understand the beat of a situation, you will not respond in the best way pragmatically possible, and your improvement will be drastically slower. You will be guessing about what exactly constitutes your insufficiency, and thus will not target your limiting attributes as well.

This is to aid you in constructing a proper mainframe for your specific inquiry:

When you feel emotional tension, there are two options: you can change yourself or you can change others. Pragmatically, you cannot often change others. It is the job of your short-term advocate to choose, and it is the job of your long-term advocate to build the prior knowledge required to assess whether it can (or should) be done.

With tension, there is some underlying value you are predisposed to assume. You can change this emotional tension from within the experience by changing the lens through which you are viewing it. Or you can train the predisposition, which means internalizing the general features of the desirable kind of lens-change.

Both are indispensable for a bounded rationalist. Training the predisposition means you can make better decisions across more instances, more quickly, and with less cognitive effort. And being able to change your lens in real time is a good patch wherever your predisposition is insufficient. This autonomy can be defined as a controller of predispositions.

You do not want to eradicate emotional tension; you merely want to get rid of the unhelpful tension. Tension within can be extremely useful because it forces thoughts and behaviors to occur. We just want those thoughts and behaviors to be aligned with wider knowledge and purpose. My wider purpose, through my bottlenecked knowledge, is in short to minimize human suffering while maximizing sustainability.

Don't let these simple words fool you; there is great complexity to what they actually mean and how they may be applied. Applied abstract thinking seems to be the foundation of all decision-making; this is what rationality is in thought and action. Abstraction omits details and thus inherently comes out more correct. Only after practice and targeted training can one refine his abstractions down to subsets of abstractions, and further still.

I recommend these two as the strongest sources that brought me to the above propositions:
ICanStudy ("chunkmapping" is what they call the efficient frame-making; I cannot think of a more efficient and pragmatic way to organize a schema. Principles: Video 1, Video 2.)
and Jordan Peterson's 2017 lecture series, Personality and Its Transformations.

5 comments


comment by gilch · 2020-11-02T19:04:43.741Z · LW(p) · GW(p)

Have you seen Street Epistemology yet? It's an effective way of leading irrational people to notice their own contradictions, but it does take some patience.

Replies from: andrea-mulazzani
comment by Emiya (andrea-mulazzani) · 2020-11-03T13:45:37.155Z · LW(p) · GW(p)

I'll give that a look, maybe I could try that on the people I care about.

Replies from: gilch
comment by gilch · 2020-11-04T05:03:31.176Z · LW(p) · GW(p)

Let us know how it goes.

comment by Nacruno96 · 2020-11-06T00:10:09.247Z · LW(p) · GW(p)

I have the same problem, but I kind of focus on my goals and don't care so much about what other people say, do, or recommend. I also doubt that learning about rationality changed you; it was caring about rationality. I cared about it quite deeply most of the time, and I was a bit like that all along. Find people like yourself, and if there are no people like yourself, just do what you enjoy. To a certain extent you can even enjoy irrational people; they often have some semblance of humor. Also, we are probably not totally rational ourselves.

Replies from: andrea-mulazzani
comment by Emiya (andrea-mulazzani) · 2020-11-06T11:46:11.763Z · LW(p) · GW(p)

I know I'm not totally rational; most of my anger was coming from my own cognitive missteps, ones I had been explicitly warned against while studying rationality. I also knew that I wasn't perfect before. I think my anger came out when I witnessed rationality dropping to levels that seemed unjustifiably low to me (because I was failing to understand what could be messing up other people's cognitive skills, or how they could have had so few to start with).

I can hear my thoughts, and I can see they have changed. I'm performing all kinds of mental operations that I wasn't doing before, so the way my brain produces beliefs has changed.

Also, I already cared about rationality a lot before going into the second phase of my training. I don't think I care more now; it's just that I know more about it and can see irrationality more clearly.

 

You are right that caring so much about rationality was the fuel for my change; I wouldn't have applied myself this hard if I didn't care so much about finally having a way to be a lot smarter.