I'm becoming intolerant. Help.

post by loup-vaillant · 2011-06-30T15:30:47.970Z · LW · GW · Legacy · 85 comments


Basically, I cannot stand people who will not bow to the Truth.

I have always had this trait, but I have noticed lately that it is becoming worse, and that it has consequences. Ironically, the main trigger seems to be the sequences: they gave me a confidence that sometimes frightens me. There are multiple manifestations:

The closest semi-famous embodiment of this character trait I can think of is Xah Lee. I like much of his writing, but he can be very blunt, sometimes to the point of insult.

Needless to say, I do not endorse all these changes. The problem is, while I know I should calm down, I just can't lose when I'm confident truth is on my side. I'm not even sure I should. (Note however that I'm rather good at losing to evidence.)

So, what do you think? What should I do? Thanks.

85 comments


comment by Vladimir_M · 2011-06-30T18:11:25.672Z · LW(p) · GW(p)

The best cure against such prideful attitudes is to ask yourself what you have to show in terms of practical accomplishments and status if you're so much more rational and intellectually advanced than ordinary people. If they are so stupid and delusional as to deserve such intolerance and contempt, then an enlightened and intellectually superior person should be able to run circles around them and easily come out on top, no?

Now, if you actually have extremely high status and extraordinary accomplishments, then I guess you can justify your attitudes of contemptuous superiority. (Although an even higher status is gained by cultivating attitudes of aristocratic generosity and noblesse oblige.) If not, however, and if you're really good at "losing to evidence," as you put it, this consideration should be enough to make your attitudes more humble.

Replies from: scientism, gimpf
comment by scientism · 2011-07-01T02:06:13.525Z · LW(p) · GW(p)

I don't think it follows from being "more rational and intellectually advanced" that you would be more accomplished and have higher status. This is especially true if you're surrounded by incompetents. For example, how would a rational person achieve high status if the majority of people making status judgments are irrational? To "run circles around them" by exploiting their foolishness would require such a high level of understanding of human psychology that it far outstrips merely being "more rational and intellectually advanced." It's quite possible (perhaps likely) that greater intelligence and rationality would be a huge detriment in a society of incompetents. This would be true until science progressed to the point that we had a complete enough understanding of psychology to exploit or reform them. There's nothing in rationality that makes a rational person automatically able to understand and exploit the irrational.

Replies from: None, Nisan, prase
comment by [deleted] · 2011-07-01T09:34:46.968Z · LW(p) · GW(p)

For example, how would a rational person achieve high status if the majority of people making status judgments are irrational?

  1. Analyse their status assessing mechanism.
  2. Find exploits and hacks.
  3. Munchkin ...uh... I mean optimize away.
  4. (profit?)
comment by Nisan · 2011-07-02T05:12:43.977Z · LW(p) · GW(p)

The kind of "rationality" we're talking about is the kind that lets you win. If I notice that there are people who have more money than me, are happier than me, have better friends and friendships than me, are more able to achieve their goals — who acquired all those things through virtue or behaving a particular way, and not by chance — and if I haven't bothered to determine what those virtues and behaviors are, and whether they tend to actually work, and how I can implement them myself — why, then, I'm not such a hotshot rationalist after all.

I'd agree that mere Traditional Rationality may not help one get ahead.

Replies from: scientism
comment by scientism · 2011-07-02T18:01:55.479Z · LW(p) · GW(p)

I'm talking about winning. In practical terms, I don't think anyone is going to succeed in making great strides in success or status in general society without acquiring a great deal of knowledge about human psychology, so I doubt that winning will look like social or economic success in the short term. I think when you speak of those "who acquired all those things through virtue or behaving a particular way, and not by chance" you betray a false dichotomy. Those aren't the only two options. The third option is that the game is rigged.

We live in a society that intentionally confines "winning" to a small, highly controlled, socially maligned group so that its fruits can be exploited by the larger majority who are unconcerned with such things. It's more than an issue of what individuals do and do not do; our society and its institutions are designed to reward and punish behaviour in a way that's at odds with rationality. Doing well in our society is indeed a product of "behaving in a particular way" in the most general sense of that term, but it is not a matter of anyone doing anything one could simply learn to do or do better.

The only way to find success in our situation is by understanding human psychology at a deep level and having a much fuller operational understanding of it than we have now. It would be either a process of extreme reform (i.e., replacing the whole of society) or one of exploitation and subterfuge (essentially treating people as a means to an end).

Replies from: Nisan
comment by Nisan · 2011-07-03T04:44:49.738Z · LW(p) · GW(p)

We live in a society that intentionally confines "winning" to a small, highly-controlled, socially maligned group so that its fruits can be exploited by the larger majority who are unconcerned with such things.

This sentence makes me think that we're probably talking about entirely different things. I indicated in the grandparent that I consider people who are wealthy and happy and who have good friends to be "winning"; I don't believe such people are maligned; surely the opposite is the case. Perhaps we're talking about different things.

Replies from: scientism
comment by scientism · 2011-07-03T14:24:02.221Z · LW(p) · GW(p)

Presumably you consider "winning" a self-directed act, so not everybody who is wealthy, happy and has good friends is necessarily a winner; they may instead have gotten lucky or been favoured in a rigged game. Furthermore, the majority of people, even the ones who have lived arguably self-directed lives, did not do so in the methodical way we're proposing. Living a good life is, typically, a non-transferable skill. "Winning", as I interpret it, is about creating a transferable skill for achieving such goals. It's about identifying the things you, or somebody like you, need to do to achieve such goals in a systematic way. What you, as a rationalist, would do in order to win is not necessarily the same as what those people you hold up as exemplars of the state you hope to achieve have done. People who live good lives aren't maligned, that's true, but people who pursue goals in a systematic, transferable way are maligned.

I agree that ultimately the outcome should be to be wealthy, happy, socially successful, etc. I simply disagree on how easy that is. If somebody came to me and told me they've made great strides in rationality, I wouldn't expect them to be rich and happy and to have the best of friends. I wouldn't expect them to have made great scientific breakthroughs. I'd expect them to have something to show for it but I'd expect that it would be a modest accomplishment and likely only recognised by their peers. I suspect if we took a poll on Less Wrong - if we could agree on who are the best rationalists and tallied up their achievements - those achievements would be in line with my expectations.

Why is this? Because most success in our society is much more like, say, becoming a successful politician than becoming a successful athlete. One might suppose that I can become a successful athlete, given the right genes, simply by training hard and being acknowledged for my skill. But to become a successful politician I'd have to pretend to be somebody I'm not almost every waking moment of my life. That's what I mean by committing subterfuge in a rigged game. Many rationalists appear to think everything in life is like becoming a successful athlete - I can just look at what other people do and do that or do it better - but I think that's wrong. Almost everything is like becoming a successful politician. I need to look at what other people do, figure out what is salient in their behaviour and then find a way to exploit it to my own ends in an environment where rationality (i.e. systematic, transferable pursuit of a goal) is maligned or even punished, and that's a hard problem. That's a problem that involves considerably more advanced knowledge of human psychology than we have now.

comment by prase · 2011-07-01T10:39:22.833Z · LW(p) · GW(p)

You are right that being more rational doesn't automatically imply the ability to exploit others. But this may miss the point. Loup-vaillant isn't saying that he is more rational than others; he's saying that he feels superior because of it. Although rationality and intelligence aren't directly linked to status, the superiority feeling is part of the status game, and feeling superior when one's status is about average can be viewed as a sort of false belief.

Replies from: scientism
comment by scientism · 2011-07-01T16:00:25.296Z · LW(p) · GW(p)

There's more than one status game though. For example, a high status scientist might nonetheless have a low social status generally, especially if his or her high status is in an esoteric field. Wouldn't it make sense for a rational person to reject the status judgments of the irrational and instead look for status among his peers? We might expect loup-vaillant to have some accomplishments that would set him apart from the irrational masses in the eyes of his peers - that he's not all talk - but I doubt this would set the bar very high. He'd then be free to feel superior to most people.

Replies from: prase
comment by prase · 2011-07-01T21:30:38.270Z · LW(p) · GW(p)

It's possible to feel superior to most people because you can recite the Koran by heart even if you are a homeless beggar. It's possible to feel superior because you can solve a Rubik's cube faster than anybody else. There will always be some peers who would award you high status for unusual accomplishments. If that's what you want, there are hardly any objections to be made. But from my experience, the superiority feeling quickly fades away when I realise that it is based on a status game which the "inferiors" don't wish to participate in.

comment by gimpf · 2011-06-30T22:25:06.593Z · LW(p) · GW(p)

I did not interpret his article as "I am superior to all", but as "Help, I act as if I am superior to all!". I probably got that totally wrong, though, as I do most of the time.

Replies from: loup-vaillant
comment by loup-vaillant · 2011-06-30T23:10:01.343Z · LW(p) · GW(p)

Unfortunately, those two are related. My acting superior generally comes from a genuine feeling of being right. And it happens often enough to raise alarm bells even I can hear.

comment by fubarobfusco · 2011-06-30T17:41:24.145Z · LW(p) · GW(p)

Slytherin answer: If you're surrounded by idiots, figure out how idiots work and use them to your advantage; ideally in ways that they don't even recognize. Getting irritated at them for being idiots is like getting irritated at a cat for not being a dog — it's bad instrumental rationality; the irritation doesn't help you accomplish your goals. They may be idiots, and you can't fix that; but you can treat them nicely enough that they won't get in your way and may even be useful to you. Find ways to practice this.

Hufflepuff answer: Sounds like you need the company of other rationalists. Does your area have a LW meetup yet? Meanwhile, try to consider the obstacles, distractions, and other cognitive interference that these other folks might be dealing with. Find ways to sympathize — after all, you're not perfect, either. (And for that matter, if religionists are so wrong, why does going to church make so many of them so happy? They must be right about something.)

Replies from: atucker
comment by atucker · 2011-06-30T22:44:52.684Z · LW(p) · GW(p)

Other Hufflepuff answer: Aww. Maybe I should find a way to be nicer to them so that I can help them find their mistakes in ways that don't make them think I dislike them. I wonder if having more accurate beliefs in some areas would actually hurt them...

Gryffindor answer: And that's why you must be strong to help save the world without their help.

Ravenclaw answer: Some people just don't care. If you want to talk about the truth, talk to other truthseekers, not other people. Non-truthseekers can still be fun, but you don't have to talk to them about your beliefs.

Replies from: fubarobfusco
comment by fubarobfusco · 2011-07-01T00:48:56.277Z · LW(p) · GW(p)

Your Huffle-fu is stronger than mine. Seeking a rationalist meetup is Ravenclaw.

Replies from: atucker
comment by atucker · 2011-07-01T03:50:01.340Z · LW(p) · GW(p)

I really like the idea of answering based on Hogwarts houses.

Replies from: nazgulnarsil
comment by nazgulnarsil · 2011-07-01T20:57:19.017Z · LW(p) · GW(p)

it's really a variation on internal families.

Replies from: atucker
comment by atucker · 2011-07-01T22:23:04.408Z · LW(p) · GW(p)

True, but it has the added advantage of common knowledge as to what the different aspects refer to.

comment by Zed · 2011-06-30T16:35:22.789Z · LW(p) · GW(p)

Very interesting, because my exposure to LW (and the sequences in particular) had the opposite effect. I'm now better at dealing with others and with dealing with stupidity in general.

My slightly exaggerated thought process used to be: "I'm clearly right about this, so I'll just repeat and rephrase my arguments until they figure out they're wrong and I'm right. If they don't understand it they're hopeless and I'll just "flip the bit" on them and move on with my life."

The problem, of course, is that the strategy is ineffective, and using an ineffective strategy again and again is not rational at all. So I would say the correct strategy is to ask yourself: "Given my understanding of the sequences and of human psychology, what line of argumentation is going to be most effective?". In this situation you probably want to leave a line of retreat and you probably want to make an effort to close the inferential gap.

If you're right (in a "facts are on my side" kind of way) you can usually force people to give in but at what cost? Resentment and burned bridges. You might win the battle, but you'll lose the war.

PS: Insulting your opponent, although an understandable outlet of your frustration, is a form of defecting from the positive-sum game of a civil discussion. I remind myself of this whenever I feel the impulse to insult.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-06-30T20:25:59.028Z · LW(p) · GW(p)

PS: Insulting your opponent, although an understandable outlet of your frustration, is a form of defecting from the positive-sum game of a civil discussion.

It's not always positive-sum (or even often, if you pick random interlocutors). Your time spent arguing can easily be worth more than what the other person gains. Insulting probably doesn't help though.

Replies from: CaveJohnson
comment by CaveJohnson · 2011-07-01T09:43:55.953Z · LW(p) · GW(p)

Your time spent arguing can easily be worth more than what the other person gains. Insulting probably doesn't help though.

One of the nice things about being part of the academic establishment is that it's other people's duty to explain things that are already covered. Except when public relations are concerned. *shudders*

comment by Pavitra · 2011-07-01T00:08:40.273Z · LW(p) · GW(p)

Hang out with people who are smarter than you are, so that you get lots of practice being the one who's wrong in an argument.

Remember that when you are right, your goal is not to emit true statements but to cause the other person to believe true ideas. The default implicit model of argument is that if they don't get it at first, you just have to hit harder; try instead to think of convincing someone as navigating a maze or solving a puzzle, a complex and delicate process that may require lots of backtracking if you mess up.

Replies from: Vaniver, Zetetic
comment by Vaniver · 2011-07-01T06:17:53.143Z · LW(p) · GW(p)

Hang out with people who are smarter than you are, so that you get lots of practice being the one who's wrong in an argument.

This seems useless at learning how to deal with people who are wrong, and instead reinforces the "life is an academic debate" meme.

Replies from: Pavitra
comment by Pavitra · 2011-07-02T06:41:35.599Z · LW(p) · GW(p)

You shouldn't expect any single strategy to solve all possible problems. The course of action I recommended is, I believe, a good way to fix the problem described in the original post. If you also want to solve this different problem, you will probably have to also take a corresponding different course of action.

comment by Zetetic · 2011-07-01T02:32:18.879Z · LW(p) · GW(p)

Hang out with people who are smarter than you are, so that you get lots of practice being the one who's wrong in an argument.

That could be easier said than done if you live in a fairly isolated environment. I would love to find a group of people who are all smarter than me and would want to hang out and debate various topics, but I have no clue where to go. My university doesn't seem to have very many people that I can chat with, and even the people at the graduate mathematics courses I took at one of the neighboring universities were not very stimulating for the most part. I think this is largely due to the quality of the schools, but there is little I can do about that (at least that I know of) for the time being beyond waiting for grad school and chatting online; this might be the only non-specialist community site where I find multiple people who consistently know more than I do.

Replies from: Pavitra
comment by Pavitra · 2011-07-01T04:20:51.870Z · LW(p) · GW(p)

Hanging out on the internet can work for this. LW is my personal smarter-than-me hangout of choice.

(It would probably be a good idea to list some other candidates, to help maintain the metaphorical biodiversity of the meme pool. Unfortunately, I can't think of any. Suggestions?)

comment by Vladimir_Nesov · 2011-06-30T20:38:43.758Z · LW(p) · GW(p)

I emotionally/connotationally associate the condition of not thinking clearly with poverty. A person can be born in unfavorable conditions, in which case it might be almost impossible to get into a better situation without substantial help, or it might take a lot of luck, or significant ability.

Since there is already a well-absorbed set of emotional connotations with the condition of poverty (low-status but leaning toward status-agnostic; a burden for others not in this condition; unfair, deserving of compassion and help; theoretical possibility of full recovery) that seems to match what one would wish to associate with people not thinking clearly, you could just transfer these intuitions by associating the categories in your thinking.

We also need a productive charity, to make use of comparative advantage.

comment by Emile · 2011-06-30T16:02:42.343Z · LW(p) · GW(p)

I'm more tolerant of religion than I was a few years ago, mostly because once I got an idea of all the other ways humans (including myself!) are irrational, singling out those who hold incorrect opinions on something irrelevant like metaphysics is a bit unfair.

Ways humans tend to be irrational: choosing a career based on very little information (the thought of how many well-off teenagers in western countries know more about the World of Warcraft gameplay system (or its equivalent) than about the costs and benefits of the various career paths they could choose is depressing); pretty much any strongly-held opinion on politics that isn't backed by some serious scholarship or experience; opinions on what others think of you and how much that matters; opinions on what kinds of things are good and bad; buying unneeded stuff; getting into debt; moral panics; drugs ...

Next to those, does it matter if somebody incorrectly thinks the Bible was divinely inspired, or that we get reincarnated after death, as long as he's being a reasonable, civilized human being (and not a fanatical nut)? That'd be a good reason to ignore their opinions on abstract intellectual subjects, but not a reason to think very harshly of them.

Replies from: Morendil
comment by Morendil · 2011-07-01T10:13:09.677Z · LW(p) · GW(p)

People don't just hold these beliefs about death and the Bible and the bearded guy in the sky (who loves kids dunked in water at birth by priests more than he loves other living things).

They also often send their kids to sunday school to contaminate them with these beliefs, instead of waiting for their kids to grow up enough to adopt the beliefs if they make sense to them.

Replies from: Emile
comment by Emile · 2011-07-01T12:16:05.096Z · LW(p) · GW(p)

If sending the kids to Sunday school makes the lives of the kids better than not sending them to Sunday school, then why not? There may be better things to have your kids do on Sunday, but it's probably better than having them watch TV all day.

(I've never been to Sunday school, but the people I know who did don't seem particularly worse off)

Replies from: sketerpot, Morendil
comment by sketerpot · 2011-07-01T23:40:19.234Z · LW(p) · GW(p)

I've been to sunday school, at several churches, when I was a child. I also "taught" sunday school when I was a teenager. In all cases, it was a glorified daycare blessed with the halo effect of God: a way to make parents feel virtuous about leaving their kids somewhere for an hour on Sunday while they have coffee and cookies. This was perhaps valuable as parental stress relief, but it wasn't a particularly great thing for the people actually in sunday school. If anything, it was kind of boring, and got everybody fidgety from being cooped up in a room.

So, yeah, if you're looking for things to do with children on a Sunday morning, may I suggest hiking, or reading, or playing somewhere, or anything but sunday school? It's not horrible, but I would characterize it as intensely meh.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2011-07-02T18:55:45.709Z · LW(p) · GW(p)

Training kids to tolerate intensely meh experiences, especially when there's no obvious gain from them, may be unhealthier than is obvious. At least in my case, I think it's done a lot to build a habit of killing time.

comment by Morendil · 2011-07-01T13:08:44.943Z · LW(p) · GW(p)

It's not clear to me that sunday school systematically makes kids' lives any better, and the epistemic danger seems real enough.

For instance, the guilt-trip nature of the doctrine of "original sin" strikes me as a clear harm when inflicted upon children, who do not have the intellectual resources to receive it critically.

It's one thing to tolerate people who choose to have certain beliefs. It's another, more difficult, thing to tolerate people who actively foist these beliefs onto the more vulnerable.

comment by prase · 2011-06-30T18:07:00.526Z · LW(p) · GW(p)

Well, the effect LW had on me was the opposite. Many arguments have subtle sides which are hidden at first sight, and much of this I have realised by reading LW. It can happen that it's me who misses the point, and that is very unpleasant after having argued the point passionately. And even if I am right and the opponent is wrong, I know that the path to the truth isn't usually simple and short. I used to have beliefs which today I see as clearly wrong. I am fairly confident that today I have beliefs which I would find wrong in the future, and which other, better-informed people consider wrong even today. If I don't want to call my past self a moron (I certainly don't) and don't want to be called a moron by wiser people, I should be quite careful in putting the moron label onto others.

So, what to do if you want to be more tolerant, for example, when you meet a religious believer? My advice is based on things that usually help me:

  • Try to remember that you were effectively an agnostic not long ago, and if your interlocutor deserves to be called a moron for believing in God, you deserved to be called at least a half-moron for being agnostic about the question. Perhaps you wouldn't like that conclusion.
  • Try to remember when you have changed your mind about other issues you were pretty certain before. That helps to understand that being mistaken isn't necessarily the same as being an idiot. (If you can't remember a single instance when you have abandoned a strongly held belief, take it as evidence that you do something wrong.)
  • Think about high-status intelligent religious people.
  • Remember that abstract reasoning about abstract issues like (some modern weak version of) religion or free will isn't a hallmark of practical rationality and intelligence. Being right about abstract problems that don't affect their life is not every person's priority.
  • Before the debate starts, suppose that the interlocutor has a really clever logically consistent theology supported by clever convoluted intuition pumps and that you would need to destroy dozens of Plantingaesque proofs of God to make a good counter-argument. In most of the actual cases your opponent's argument would rather be "but what keeps the atoms together, if not Jesus", but it doesn't matter. A long debate against a skilled theologian demagogue should be such an unpleasant experience that even the slightest chance of that happening should convince you to drop the topic as soon as possible.
  • In the beginning of any debate, try to think what you want to achieve. The opponent saying "oops, you were right and I was wrong, I deconvert now"? That's not going to happen, and it's a rationalist's obligation not to engage in futile actions.

Modification from religion to Bayesianism or other issues is pretty straightforward.

And whether you indeed should calm down? If your debating style even remotely resembles that of Xah Lee (I don't have that impression from reading your comments), you should. I clicked on your link to XL's rant against Wikipedia. After having read the first sentence, I thought "what a self-conscious jerk", and at the end of the whole thing I was fairly certain that I am going to ignore all his opinions. If you want to actually communicate with others - rather than just voice your opinions - you should avoid that style at all costs.

Replies from: loup-vaillant
comment by loup-vaillant · 2011-06-30T23:43:40.169Z · LW(p) · GW(p)

If you can't remember a single instance when you have abandoned a strongly held belief…

Ouch. I can't. Even reading the whole of the sequences didn't trigger any feeling of updating. I learned quite a few things, and it just made sense as a whole. But nowhere did I see something that made me jump "wait, what?", followed by the mandatory Oops.

I probably should take ideas I disagree with as seriously as possible. Surely there is one that will change my mind?

Replies from: prase, fubarobfusco
comment by prase · 2011-07-01T10:27:09.441Z · LW(p) · GW(p)

The updating needn't necessarily be instant, it can take months or years. For me, it is never an instant change. Not much "wait, what?!", it's rather more like "this can't be true, let's try to find a counter-argument", followed by "I can't find a satisfactory counterargument, so there may be some merit in that" after some time gap. But after that, I am able to see that I don't anymore hold a belief X which I was ready to defend fiercely a while ago.

comment by fubarobfusco · 2011-07-01T00:52:35.193Z · LW(p) · GW(p)

Was that "ouch" an oops, or a wince?

Replies from: loup-vaillant
comment by loup-vaillant · 2011-07-01T08:09:37.461Z · LW(p) · GW(p)

A wince. I noticed my failure to update a while ago. (Or at least my failure to notice updating. That doesn't feel likely, but I've seen my mother do it, saying "of course" instead of "oops". It could let me update, which is good, but it wouldn't get rid of the "I've been right all along" feeling, which is bad.)

comment by jsteinhardt · 2011-06-30T17:49:21.909Z · LW(p) · GW(p)

Can you elaborate on what you mean when you say you regard anyone who isn't Bayesian as moronic? I'm not sure what it means to "be Bayesian".

Replies from: BenLowell, loup-vaillant
comment by BenLowell · 2011-06-30T23:41:12.962Z · LW(p) · GW(p)

Here is an article written for you! What is Bayesianism? My personal struggle is where this differs from 'clear-headedness.' I think that much of this website is geared towards helping us get closer to the ideal Bayesian, though the connections are not mentioned specifically.

Can anyone give an example of where they explicitly used Bayesian reasoning? It makes sense that it is right, but ... it seems unlike other things on this website that can be transferred into skills or habits. My guess is that having a deeper understanding of Bayesian probability would help with understanding what evidence is and how much confidence should be placed in what.

A separate confusion of mine is that, in Eliezer's explanation of Bayes' theorem, I was able to do the math problems correctly and so I didn't make whatever the usual mistake was. Because of this, I have knowledge of the right way to solve probability problems (at least if I spend a long time thinking about them), but I never went down the wrong path and got slapped with an Incorrect Answer. That doesn't mean I won't notice a mistake, but I think that learning things the wrong way first helps you understand why they are wrong later. So my confusion is that I am never very confident as to whether I am doing things the "Bayesian way" or not. I've found that the Law of Conservation of Expected Evidence has been the most helpful in understanding the consequences of Bayesian reasoning, beyond solving math problems.

Edited for clarity.

Replies from: Kutta, jsteinhardt
comment by Kutta · 2011-07-01T12:26:25.932Z · LW(p) · GW(p)

My awareness of Bayesian reasoning doesn't quite enable me to use it explicitly with success most of the time, or maybe the successes are not vivid and spectacular enough to be noticed, but it does make me aware of Bayes-stupid inferences committed by me and others.

Just yesterday my father proclaimed that a certain beggar who tends to frequent our street with a kid or two and claims to be homeless is a liar, because, well, he's not really homeless, since he is also often seen in the company of drunkards, and he probably drags the kids around for show and they aren't even his. I asked my dad whether the beggar's claim of homelessness makes him more or less likely to be homeless. He said less likely, but after that he denied that the beggar's failure to claim so would make him more likely to be homeless.
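A minimal sketch, in Python, of the probability bookkeeping behind this exchange; the prior and likelihoods are invented for illustration and don't come from the story. It shows the conservation-of-expected-evidence point: if hearing the claim lowers the probability of homelessness, not hearing it must raise it.

```python
# Toy numbers (assumptions, not data) for the beggar example above.
p_homeless = 0.3            # prior P(H): the beggar is homeless
p_claim_given_h = 0.6       # P(claims to be homeless | homeless)
p_claim_given_not_h = 0.9   # P(claims to be homeless | not homeless), e.g. as a pitch

# Total probability of hearing the claim.
p_claim = p_claim_given_h * p_homeless + p_claim_given_not_h * (1 - p_homeless)

# Bayes' theorem, conditioning on hearing the claim and on not hearing it.
p_h_given_claim = p_claim_given_h * p_homeless / p_claim
p_h_given_no_claim = (1 - p_claim_given_h) * p_homeless / (1 - p_claim)

print(round(p_h_given_claim, 3))     # 0.222: the claim lowers the probability...
print(round(p_h_given_no_claim, 3))  # 0.632: ...so its absence must raise it

# The posteriors, weighted by how likely each observation was, average back to the prior.
assert abs(p_claim * p_h_given_claim + (1 - p_claim) * p_h_given_no_claim - p_homeless) < 1e-9
```

The specific numbers don't matter; whenever one observation lowers the probability of a hypothesis, its absence necessarily raises it relative to the prior.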

Replies from: Emile
comment by Emile · 2011-07-01T13:01:19.260Z · LW(p) · GW(p)

I'm not sure I understand - why would he deny that the beggar's failure to claim so would make him less likely to be homeless? I have trouble imagining how the conversation you're describing went.

Replies from: Kutta
comment by Kutta · 2011-07-01T15:32:55.086Z · LW(p) · GW(p)

make him less likely to be homeless?

Uh, I mixed up a less likely and a more likely. Corrected.

Replies from: Emile
comment by Emile · 2011-07-01T15:34:45.709Z · LW(p) · GW(p)

In that case:

He said more likely, but after that he denied that the beggar's failure to claim so would make him more likely to be a homeless.

... the first bit should probably be "He said less likely", in which case what you say makes much more sense.

comment by jsteinhardt · 2011-07-01T06:09:51.281Z · LW(p) · GW(p)

I personally feel like a deeper understanding of Bayesian probability has mainly just helped me to formalize things that are already obvious (the goal being to replicate what is obvious to humans in a computer, e.g. computer vision, robotics, AI, etc.). There have been few instances where it has actually helped me weigh evidence more effectively. But maybe I am missing some set of practical techniques.

Also, I was unable to parse the final paragraph that you wrote, would you mind re-stating it?

comment by loup-vaillant · 2011-06-30T23:23:24.231Z · LW(p) · GW(p)

I basically mean using probability theory when you deal with your own beliefs. With the understanding that you only have partial (and flawed) information about the world. Understanding what is evidence, what counts as evidence to you (that last one depends on the relationship between your prior knowledge and the piece of evidence you look at).

And most of all, understanding that Occam's Razor (or Solomonoff induction / Kolmogorov complexity) isn't just a fancy trick to force atheism and many-worlds down people's throats.

That said, my knowledge still feels flaky. I may be a bit under-educated by my own standard.
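For readers wondering what this can look like in practice, here is a toy sketch of the odds form of Bayes' theorem, with numbers invented purely for illustration; it is one simple way to do the kind of belief bookkeeping described above, not a claim about how anyone in this thread actually does it.

```python
# Treat a belief as odds and update it with a likelihood ratio (odds form of Bayes).
# All numbers below are assumptions made up for the example.

def update_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Posterior odds = prior odds * P(evidence | hypothesis) / P(evidence | not-hypothesis)."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds: float) -> float:
    return odds / (1.0 + odds)

prior_odds = 1 / 4        # you start out thinking the claim is 20% likely (1:4 odds)
likelihood_ratio = 3.0    # the observation is three times as likely if the claim is true

posterior = odds_to_probability(update_odds(prior_odds, likelihood_ratio))
print(round(posterior, 2))  # 0.43: modest evidence produces a modest shift, not certainty
```

Weak evidence multiplies the odds a little, strong evidence multiplies them a lot; no finite piece of evidence pushes them all the way to zero or infinity.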

Replies from: jsteinhardt
comment by jsteinhardt · 2011-07-01T06:05:39.623Z · LW(p) · GW(p)

What does it mean to "use probability theory to deal with your beliefs"? How do you use probability, and how does it change your conclusions?

Replies from: Will_Newsome
comment by Will_Newsome · 2011-07-01T07:43:44.256Z · LW(p) · GW(p)

This question might be worth a discussion post. I constantly use visuospatial and kinesthetic qualia when thinking, which to a non-negligible extent draw on intuitions begotten from understanding the basic concepts of algorithmic probability theory and its relations: information theory, probability theory, computer science, and statistical mechanics. That said, I almost never pull out pen and paper, and when I do pull out pen and paper it's to help structure my Fermi calculations, not to plug numbers into Bayes' theorem. It seems obvious to me both that there are large benefits to having Bayes-influenced intuitions firing all the time and also that there are few benefits of even remembering how to actually write out Bayes' theorem.

Edited to separate the following trivial factoid from the above less trivial factoids: (Formal use of Bayes is pretty popular among, and abused by, Christian apologists. User:lukeprog would know more about that though.)

Replies from: Dorikka, jsteinhardt, NancyLebovitz
comment by Dorikka · 2011-07-02T00:48:54.591Z · LW(p) · GW(p)

(Formal use of Bayes is pretty popular among, and abused by, Christian apologists. User:lukeprog would know more about that though.)

This doesn't seem to belong here. My guess is that you're just inserting a fact for general knowledge because you found it interesting, but it looks like an argument of the form "X does Y, X tends to exhibit low levels of rationality, so don't do Y", which is fallacious. I might remove it for its mind-killing potential.

Replies from: Will_Newsome
comment by Will_Newsome · 2011-07-02T04:47:41.590Z · LW(p) · GW(p)

I praise your right view, and will edit my comment accordingly.

comment by jsteinhardt · 2011-07-01T15:06:29.784Z · LW(p) · GW(p)

I would be interested in reading such a post (it seems like it might even be worth a top-level post depending on how much you have to say).

comment by NancyLebovitz · 2011-07-02T18:59:15.709Z · LW(p) · GW(p)

Could you give an example of using visuospatial and kinesthetic qualia when thinking?

Replies from: Will_Newsome
comment by Will_Newsome · 2011-07-02T20:10:41.379Z · LW(p) · GW(p)

Long-winded reply: I think it's not uncommon for folk to have kinesthetically experienced conceptual aesthetics or decision-making processes. "That doesn't feel quite right" is commonly heard, as is the somewhat-ambiguous "Sorry, I just don't feel like going out tonight". Anyhow, others' apparent confidence in seemingly inelegant ontologies very distinctly activates a lot of my thinking qualia: for example, when I hear a person resignedly accepting a uniform prior over vaguely defined objects in the mathematical universe hypothesis. The really fundamental feeling there is that it just doesn't fit... I picture it in my head as an ocean of improper prior-ness flooding the Earth because some stupid primordial being didn't have enough philosophical aesthetics to realize that the mathematics shouldn't look like that, objectively speaking. And it feels... it just feels wrong. Often an idea or a hypothesis feels grinding, and sometimes it feels awkward, but most of the time things just feel not-right, inharmonious, off-kilter, dukkha. Sorry, very little sleep this week, not particularly coherent.

Edit: Not sure if this matters at all, but I think that I wouldn't be able to do clear timeful/timeless reasoning if I didn't have access to those intuitions. I also doubt that I could grok the concepts of statistical mechanics. That said, I really don't understand things like algebra or geometry... it must be something to do with implicit movement; static things just don't work. (Edit: Mixtures and measures, logarithms, symmetric limits, proportionality, physical dimensionality, raw stuff of creation, creation self-similarity, causal fluid, causal structure... it's like crack.) I think that's why I love ambient/timeless control so much: it lets me think about Platonic objects using my flow-structure intuitions, which is cool 'cuz the Forms are so metaphysically appealing. I'm getting an fMRI and doing a whole bunch of cognitive tests soon; maybe that'll give a hint.

comment by Normal_Anomaly · 2011-06-30T17:25:02.159Z · LW(p) · GW(p)

Your experience is interesting. I find that while I have started looking on more things as more insane than before, it has made me less argumentative and more tolerant. My thought process is something like, "So many people are so wrong on so much stuff that my trying to help them usually won't make a difference. They never had a chance to become right because they were exposed to all the wrong memes. So I'll stop trying to improve other people's thinking except where I think it might actually work, and then I'll be nice so that I don't lose a rare chance to help somebody." The eventual effect is that I see more of the irrationality around me, but feel less need to do anything about it.

I don't know if trying to emulate my reaction sounds like something you'd want, and I'm even less confident that it will work, but I find that just seeing how other people deal with something can give me ideas about how I should do so.

comment by Perplexed · 2011-07-01T14:25:40.338Z · LW(p) · GW(p)

What should I do?

Step 1: Stop being frustrated with them for not knowing/understanding. Instead, direct your frustration at yourself for not successfully communicating.

Step 2: Come to know that the reason for your failure to communicate is not a lack of mastery over your own arguments. It is a lack of understanding of their arguments. Cultivate the skill of listening. Ask which school of martial arts presents the best metaphor for your current disputation habits. Which school best matches the kind of disputation you would like to achieve?

Step 3: In the course of learning to listen, you may also learn other things. The people you are talking to are probably not idiots. Sometimes they will be right and you will be wrong. Notice those occasions. Examine, analyze, and cherish those occasions. Come to see them as winning. After all, you came out of that dispute having gained something useful (knowledge). Your teacher, on the other hand, gained nothing but ephemeral status.

comment by Thomas · 2011-06-30T17:46:07.744Z · LW(p) · GW(p)

It may very well be that this intolerance of yours has nothing to do with this site. You would have become intolerant anyway, only toward a slightly different set of beliefs.

Replies from: loup-vaillant
comment by loup-vaillant · 2011-06-30T23:51:21.008Z · LW(p) · GW(p)

Quite possible. Another possibility is that I at last found a tribe I can identify with. Also, reading the sequences didn't trigger updating. I either learned or readily agreed. That may have spoiled me a little.

comment by saturn · 2011-06-30T17:42:35.664Z · LW(p) · GW(p)

You can't expect to win a singlehanded fight to protect the entire world from its own stupidity. You need to choose your battles.

comment by CronoDAS · 2011-07-01T01:41:39.844Z · LW(p) · GW(p)

Against stupidity, the gods themselves contend in vain.

-- Friedrich Schiller

comment by JoshuaZ · 2011-06-30T21:33:00.761Z · LW(p) · GW(p)

I know I've believed some pretty stupid stuff that only seemed dumb in retrospect. I've found that keeping this in mind helps one be more tolerant of people believing in stupid things. Would you be intolerant of yourself from two years ago or five years ago? If you had a time machine, how would you treat your past self?

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-06-30T23:44:22.945Z · LW(p) · GW(p)

This is at best an intuition transfer tool, given without a reason that justifies its use, and one that happens not to apply to myself. I would easily bite the bullet and say that Nesov_2008 was a crackpot and Nesov_2006 was a shallow naive fool. There seems to be no common reason to tolerate interaction with either of them beyond necessity. (A desire to help, to make better, is a different question. It doesn't need to involve valuating the damaged condition as better than it is.)

Replies from: Perplexed
comment by Perplexed · 2011-07-01T14:04:24.052Z · LW(p) · GW(p)

I would easily bite the bullet and say that Nesov_2008 was a crackpot and Nesov_2006 was a shallow naive fool.

Ah. But would you make the obvious predictions about the opinion Nesov_2013 and Nesov_2015 will have regarding Nesov_2011?

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-07-01T15:20:42.192Z · LW(p) · GW(p)

You are being overly cryptic (obvious predictions?). Judgments like this are not relative. I don't think I'm currently anywhere close to that much trouble, and haven't been since about summer 2010 (2009 was so-so, since I held some known-to-be-confused hypotheses I entertained then at an unjustified level of confidence, and was too quick to make cryptic statements that weren't justified by much more detailed thoughts). I'm currently confused about some important questions in decision theory, but that's the state of my research, and I understand the scope of my confusion well enough.

Replies from: Perplexed
comment by Perplexed · 2011-07-01T15:54:41.883Z · LW(p) · GW(p)

My implicit point was this: Nesov_2006 probably did not realize that Nesov_2006 was a fool and Nesov_2008 probably did not judge himself to be a crackpot. Therefore, a naive extrapolation ("obvious prediction") suggests that Nesov_2011 has some cognitive flaws which he doesn't yet recognize; he will recognize this, though, in a few years.

JoshuaZ, as I understood him, was suggesting that one improves one's tolerance by enlarging one's prior for the hypothesis that one is currently flawed oneself. He suggests the time-machine thought experiment as a way of doing so.

You, as I understood you, claimed that JoshuaZ's self-hack doesn't work on you. I'm still puzzled as to why not.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-07-01T16:04:11.913Z · LW(p) · GW(p)

My implicit point was this: Nesov_2006 probably did not realize that Nesov_2006 was a fool and Nesov_2008 probably did not judge himself to be a crackpot.

To a significant extent, both would agree with my judgment. The problem was not so much inability to see the presence of a problem, they just didn't know its nature in enough detail to figure out how to get better. So the situation is not symmetric in the way you thought.

See The Modesty Argument for a discussion of why I won't believe myself crazy just because there are all those crazy people around who don't believe themselves crazy.

comment by jsalvatier · 2011-06-30T16:48:41.365Z · LW(p) · GW(p)

Perhaps you should solidify in your mind whether you think it's a good thing or a bad thing on net. Come up with ways in which it could be a good thing and ways in which it could be a bad thing. One particular way that it could be a bad thing is that you dramatically underestimate inferential distance, so it's much harder to actually change people's minds than it feels (there's a reason the sequences are long; they had more design time go into them than whatever you come up with on the fly). This means that if there are any social drawbacks to arguing with people, they can easily outweigh the benefits of improving their thought.

Replies from: torekp
comment by torekp · 2011-07-05T00:41:37.385Z · LW(p) · GW(p)

I'd like to echo jsalvatier's first point, and add one plea in "favor" of intolerance. Namely, tolerate your intolerance.

What this means in practice is roughly, instead of thinking "I'm acting/feeling intolerant - I'm a bad person," try "I'm acting/feeling intolerant - let me note the context, and the results so far. Let me think about what to do next." Try some of the alternative, more-tolerant responses suggested by LWers, and note their results too.

Keep separate in your mind your thoughts versus emotions versus behavior. You can have intolerant thoughts and tolerant behavior, which might (or might not) give you all the benefits you seek from tolerance. Emotions are sort of a middle ground, since they tend to be harder to keep private, but are often less salient to others than your behavior.

comment by Dorikka · 2011-06-30T15:59:25.194Z · LW(p) · GW(p)

Would it help to do a cost-benefit analysis of being more tolerant vs. the status quo? I've found that the amount of enlightenment that I can give certain people is small enough that I lose more utility through the emotional impact of the argument than I gain through giving them knowledge.

comment by Zetetic · 2011-06-30T21:45:20.582Z · LW(p) · GW(p)

I now tend to regard anyone who isn't Bayesian as either uneducated or moronic. Same thing about materialist reductionism, only with a slightly lower confidence.

To be blunt, that is a bit strange. In my experience you are far more likely to find a material reductionist than you are to find a Bayesian. This leads me to think that you might be too withdrawn, which might be causing you to have a poor model of what ideas other people are likely to hold, and adding to your general sense of misanthropy. Of course, I'm generalizing from only a few examples (I have talked with others who have this problem) so you should take that into consideration.

Are you in college? If so, perhaps you would do well to hang around the science buildings and look around for science or atheism or freethought-centric groups that meetup on some regular basis (assuming there are no LW meetups near you, as is the case with me). Could you give a brief outline of your current situation? Maybe that would help us to help you.

Replies from: loup-vaillant
comment by loup-vaillant · 2011-07-01T00:07:17.221Z · LW(p) · GW(p)

My slightly lower confidence doesn't flow from popularity, but from the fact that Bayesianism is more meta than materialist reductionism. Along with the current state of the art of science, it causes my belief in a reductionistic world. Without it, I would be allowed to believe in souls. But it would be harder to abandon Bayesianism if I discovered that we do have immaterial souls.

I currently work at a programming shop. I intend to do a thesis soon. I live in France, far from Paris (or I promise I would have gone to that meetup not long ago).

Replies from: Zetetic
comment by Zetetic · 2011-07-01T02:20:38.932Z · LW(p) · GW(p)

I suppose I thought it was strange because I was a reductionist long before I knew about Bayesianism; I've always had an interest in science and I always gave scientific theories precedence (though when I was very young this was more out of my recognition that science had the authority on truth rather than a rational dissection of epistemology). I read A.J. Ayer and Karl Popper before I read Jaynes (unfortunately, I really wish that it had been Jaynes I read first).

I'm still an undergraduate and I live in the U.S., so I'm afraid that I can offer little in the way of insight. I could perhaps share my experience, though: I can tell you that I often have similar feelings. I do not live near any meetups and none of my friends share any interest in mathematics, the sciences or rationality. I do have one friend who is very intellectual, but he's a soft science type who, again, doesn't share any of my specific interests.

comment by djcb · 2011-07-02T10:27:42.029Z · LW(p) · GW(p)

Suppose you hear someone stating that yesterday it was 7°C and today it is 14°C, so it's "twice as warm".

When I hear that, I cringe a bit, but these days (older, wiser, milder) I think the better thing to do is just to smile lightly or something. The 'higher-status behavior' is not to always try to 'score', but instead to ignore it, unless there is some direct negative effect.
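A small sketch of why the ratio is meaningless, using only the two temperatures from the comment: Celsius and Fahrenheit are interval scales with arbitrary zero points, so ratios change with the scale; only an absolute scale such as Kelvin supports "X times as warm" statements.

```python
# The same two temperatures, compared as ratios on three scales.

def to_kelvin(t_celsius: float) -> float:
    return t_celsius + 273.15

def to_fahrenheit(t_celsius: float) -> float:
    return t_celsius * 9 / 5 + 32

yesterday, today = 7.0, 14.0

print(today / yesterday)                                          # 2.0 on the Celsius scale
print(round(to_fahrenheit(today) / to_fahrenheit(yesterday), 2))  # 1.28 on the Fahrenheit scale
print(round(to_kelvin(today) / to_kelvin(yesterday), 3))          # 1.025 on the absolute scale
```

So "twice as warm" is an artifact of where the Celsius zero happens to sit; in absolute terms today is only about 2.5% warmer.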

comment by Friendly-HI · 2011-06-30T21:13:31.985Z · LW(p) · GW(p)

I know exactly how you feel.

As far as I'm concerned, recognizing that I could be that completely oblivious and ignorant person if I was subjected to a different personal experience from my current one helps a lot to not think significantly less of them.

Actually, I once was that ignorant person. So I try to imagine how someone would have needed to talk to me, in order to convince me of something, back when I was an ignorant superstitious fool myself. It's not easy, that's for sure. Try to thoroughly imagine how you would talk to someone whom you love and respect, but who unfortunately holds some highly irrational beliefs... how would you talk with your parents or your kids about this? Imagine it vividly and carry that attitude and style of "soft" arguing over to people whom you aren't as close to. It takes some effort to cultivate that attitude, but I'm trying because both emotionally and intellectually I know it's the right thing to do.

comment by dspeyer · 2011-07-03T03:24:31.033Z · LW(p) · GW(p)

Another approach is to contemplate the various virtues that people can have, and consider their relative importance. You might need to do this as a sort of regular meditation.

As an off-the-cuff exercise, how would you sort the following by importance: rationality, creativity, knowledge, diligence, empathy^1, kindness, honor, and generosity^2? Does how you act correspond to how you answer? If not, make a practice of reminding yourself.

You may also find it useful to enumerate the virtues of the specific people who are annoying you. If you cannot think of any, stop associating with them. If the thought of not associating with them is unpleasant, examine that unpleasantness to discover their virtues.

1= Empathy is a talent for understanding others, which may or may not result in being kinder to them.

2 = Generosity should be taken in the broadest sense: determination to help others despite costs to oneself, and may or may not involve giving material possessions.

comment by mutterc · 2011-07-02T19:05:14.537Z · LW(p) · GW(p)

I know this sounds snarky, but it's serious: Are you married?

Ideally a life partner will share many of your values, but no two people share all values, and you'll need to respect the ones that differ. (Even if you're both Bayesian, in areas where you have different values/axioms you will not necessarily agree.)

Replies from: loup-vaillant
comment by loup-vaillant · 2011-07-08T13:12:06.089Z · LW(p) · GW(p)

I live with my SO. As far as I can tell, she hasn't completely abandoned belief in belief. She also doesn't seem to accept Occam's Razor (seemingly because it "doesn't interest" her), and uses that to reject many-worlds. Or maybe many-worlds sounds absurd to her, and she only rejects Occam's Razor by contraposition.

Anyway, all this has been a source of significant tension, which has now subsided (I hope). The factual disagreements remain, though. Lesson learned: "Thou will not convince everyone".

As for our values, I haven't noticed any significant divergence yet.

comment by byrnema · 2011-06-30T20:31:38.056Z · LW(p) · GW(p)

I wonder if it's not a problem with compartmentalization? Because in many contexts, these issues about truth needn't be in the forefront. In contexts where issues about truth are at the forefront, where people are having intellectual discussions or making decisions, it is often more contextually appropriate to be argumentative.

Maybe your concern about intolerance is a warning that you need more interactions of the former type for a better balance in your life. That is, interactions that are social and comfortable and bring back your sense of humor and comradery towards other people.

comment by zntneo · 2011-06-30T17:59:16.255Z · LW(p) · GW(p)

I think I can relate quite a bit. It is absolutely infuriating when someone doesn't care to try to be rational. I am always having to explain to people why I care about what is true; the question has become like nails on a chalkboard to me. The thing that has helped me mildly is remembering that most people do not have any education in what it means to be rational; they have not even been introduced to the concept (other than Hollywood rationality, which is almost as irritating). I also remember that at one time I was kind of like them, which makes it so that I tend to educate them about it (although I think of myself as a teacher/mentor when I do).

comment by Peterdjones · 2011-06-30T17:39:49.848Z · LW(p) · GW(p)

Find some really intolerant people to hang out with. Objectivists would be ~~food~~ good. (But that was an interesting idea for a while).

Replies from: Dorikka
comment by Dorikka · 2011-06-30T17:44:39.438Z · LW(p) · GW(p)

I'm failing to parse your comment.

Replies from: Normal_Anomaly, NancyLebovitz, GuySrinivasan
comment by Normal_Anomaly · 2011-06-30T19:04:12.621Z · LW(p) · GW(p)

I think Peterdjones means that Objectivists would be good people for Loup-Vaillant to hang out with, to teach him what it feels like to be subjected to obnoxious argumentation and make him realize on a gut level that it doesn't help and causes needless unhappiness. I suspect from personal experience that it would backfire--I tend to act more like the people I hang out with, and I was a lot more obnoxious when I spent time on Pharyngula before coming here--but YMMV.

At least, that makes more sense to me than Peterdjones actually wanting Loup-Vaillant to eat Objectivists.

comment by NancyLebovitz · 2011-06-30T22:03:41.189Z · LW(p) · GW(p)

I interpret it as "Objectivists would be food" (food is struck out to indicate a mixture of humor and hostility): a good opportunity to dump anger and feel like you're winning arguments, or should be.

comment by SarahSrinivasan (GuySrinivasan) · 2011-06-30T18:56:49.105Z · LW(p) · GW(p)

Notice your confusion.

Replies from: None
comment by [deleted] · 2011-06-30T19:20:54.547Z · LW(p) · GW(p)

Is this phrase accompanied by an article? If so could someone link it to me please?

edit: I managed to find "I notice that I am confused". I don't know how to delete comments.

Replies from: Alicorn
comment by Alicorn · 2011-06-30T19:27:40.929Z · LW(p) · GW(p)

You can't delete comments right now; the functionality has been replaced by "retraction" as of the recent site update.
You can also edit your comment down to nothing.