Intelligence-disadvantage

post by casebash · 2014-03-16T07:14:57.212Z · LW · GW · Legacy · 24 comments

While LessWrong contains a large amount of high-quality material, most of the rationality advice isn't actually targeted at our core audience. The focus seems to be more on irrational things that people in general do, rather than irrational things that smart people do. (Sidenote: if we wanted to create a site focused on spreading general rationality, we'd need to simplify the discussion, remove a lot of the maths/controversial ideas, and add some friendly images. Does such a site exist?)

This has led to a number of comments questioning the real-world value of having read the Sequences. If your average person had the patience to read through the core sequences and understand them, they'd find them extremely valuable. They'd gain a glimpse into a new way of thinking, and even though they still might not appear very logical to most Less Wrongers, they'd be much better off than when they started.

On the other hand, most Less Wrongers already know the basics of logic. That's not to say that we don't act extremely irrationally much of the time, just that going over the basics of logic again probably provides minimal benefit. What is needed is something specifically targeted at the kinds of irrational mistakes and beliefs that intelligent people make. I would argue that if this existed as a sequence, it would be the most important sequence on the entire site. But since I lack that level of writing ability, I'm not going to attempt such a project myself. Instead, I've created this post where we can list articles or ideas that should be part of such a sequence, in the hope that someone else might pick it up.

Here are some examples of mistakes that intelligent people make:

Adopting a fixed instead of a growth mindset - shying away from challenges, convincing ourselves that we are just naturally bad at non-intellectual things and shouldn't focus on them

Directly pointing out people's flaws

Overthinking issues that are really very simple

Counter-signalling by ignoring the value of fashion, money, being liked

Valuing intelligence above all other qualities

Rigidly adhering to rules

Expecting other people to always be rational

Not considering popularity as a signal that is worth understanding

Overvaluing being right

I'm sure there are plenty more. Any other suggestions or relevant articles?

24 comments

Comments sorted by top scores.

comment by ChrisBillington · 2014-03-16T07:48:17.821Z · LW(p) · GW(p)

I believe CFAR workshops address a lot of these issues; a huge focus of them is the interplay between high-level, logically thought-out cognition (System 2) and lower-level, intuitive thinking (System 1). One of the major points was that System 1 is actually very useful for providing information and making decisions, so long as you ask the question right. I think smart people tend to under-utilise System 1, often ignoring their gut feeling when it is providing useful information.

To use your fashion example: if I consider dressing up nicely, part of me says "This is all pointless, it's just signalling!" and wants me to squash the little voice telling me that actually, I value looking nice and should just give in to that desire, because there are actual upsides and no downsides!

Again, I found that the CFARian way is to provisionally accept your terminal goals and gut feelings as legitimate, and to go about satisfying them rather than criticising them too much (criticising them being a job for epistemic rationality, CFAR being more about instrumental rationality).

I'm hesitant to link you to Julia Galef's "The Straw Vulcan" talk, since everything you're writing seems so in line with it that I suspect you've already seen it! But if you haven't, it's incredibly relevant to this topic.

Another thing I'm reminded of is the post Reason as a Memetic Disorder. Basically, sometimes there are good cultural practices that smart people fail to see the logic behind (because it's subtle, or because it's inconsistent with some false belief they have), and so they drop the practice, to their detriment. Less smart people keep doing it, since they're happy to simply conform without having reasons for everything.

comment by michaelkeenan · 2014-03-17T04:20:43.121Z · LW(p) · GW(p)

I suspect that the Sequences don't seem so useful to you for the reason Scott Alexander pointed out in his recent five-year Less Wrong retrospective: having read the Sequences, people find that the concepts now seem obvious, though they recall finding them revelatory at the time. There's something about learning an epistemological framework that makes it difficult to remember what it was like to think in any other terms. And yet I did once argue about ways in which words were wrong, fail to take into account the outside view, and waver between utilitarianism and deontology without even knowing those words. And I still suffer from scope insensitivity and the typical mind fallacy and a host of other issues.

The Sequences are for smart people. Rationality for not-as-smart people is: use contraception; quit smoking; astrology doesn't work; Cosmo's not a reliable source; don't get into credit card debt for trivial things; try to be more mindful about your hedonic or status treadmill.

Replies from: casebash
comment by casebash · 2014-03-19T04:20:50.132Z · LW(p) · GW(p)

Scope insensitivity, utilitarianism vs. deontology, etc. are all good things to know, but they wouldn't place in a list of the biggest mistakes smart people make.

comment by mare-of-night · 2014-03-16T12:26:34.203Z · LW(p) · GW(p)

Outside of LessWrong, the [Five Geek Social Fallacies](http://www.plausiblydeniable.com/opinion/gsf.html) come to mind.

Replies from: Nornagest
comment by Nornagest · 2014-03-16T23:18:02.788Z · LW(p) · GW(p)

That's less "smart people" and more "people who've grown up in a certain dysfunctional social environment", but there's a lot of overlap, yes. At least in the States.

comment by Mestroyer · 2014-03-16T11:03:43.506Z · LW(p) · GW(p)

Overthinking issues that are really very simple

Counter-signalling as a smart-person mistake

Valuing intelligence above all other qualities

Rigidly adhering to rules -- compare the two endings of "Three Worlds Collide" and the decision by which they diverge.

Expecting other people to always be rational

Got nothing for the last two. I don't think the last one is a mistake that very many people at all make. (I think being right about things has surprising benefits well past the point that most people can see it having benefits).

Other smart person mistake covering posts that spring to mind: http://lesswrong.com/lw/dxr/epiphany_addiction/ http://lesswrong.com/lw/j8/the_crackpot_offer/

And a lot of the general mistakes that LessWrong warns against are just person mistakes, rather than smart person or normal person mistakes. [edit: grammar]

Replies from: pianoforte611
comment by pianoforte611 · 2014-03-16T15:24:57.425Z · LW(p) · GW(p)

Adding to this:

Directly pointing out people's flaws

I also don't see the problem with valuing being right highly. I can see the problem with letting people know that you are right too much.

comment by seez · 2014-03-16T09:28:37.845Z · LW(p) · GW(p)

Here are some examples of mistakes that intelligent people make

Looks like you mean "here are some examples of mistakes people on LessWrong still make."

Highly intelligent people such as great artists and writers, successful politicians and lawyers and drug kingpins, often depend on continued popularity, value social signaling extremely highly, know most people aren't rational, and don't rigidly follow rules.

However, I think it is interesting to consider whether there are qualities that are associated with intelligence, either biologically or through the way intelligent people are socialized, that are unintelligent in themselves. It seems like this is true with rationality; perhaps something about thinking rationally also causes people to, say, undervalue popularity to an irrational extent. I'm having a harder time thinking of examples that seem as likely with intelligence in general, although that's made trickier by having a vaguer definition of intelligence than rationality.

Replies from: satt
comment by satt · 2014-03-17T00:13:08.097Z · LW(p) · GW(p)

It seems like this is true with rationality; perhaps something about thinking rationally also causes people to, say, undervalue popularity to an irrational extent.

Rather than A causing B, I reckon it's more that A & B have a common cause, C: psychological traits that make someone more interested in thinking rationally tend to make them less interested in popularity.

comment by pianoforte611 · 2014-03-16T15:19:54.827Z · LW(p) · GW(p)

I agree that LW isn't good at spreading general rationality to the general public, and I'm not even sure that would be a good idea, since partial rationalists are great at pointing out errors in other people and coming across as arrogant jerks. However, I don't agree that the errors LW discusses don't happen to intelligent people. I have been guilty of all of the errors listed here.

comment by Gunnar_Zarncke · 2014-03-16T08:49:02.437Z · LW(p) · GW(p)

Part of the sequence could be about finding our place in society. Links about that:

This gives a solid picture of the value of traditions and customs. In a way, this is a social analog to cognitive System 1. The analogy goes as follows: System 2 - reason = well-defined formal structures; System 1 - intuition = vague or unjustified customs and traditions.

Replies from: Slackson
comment by Slackson · 2014-03-16T21:38:03.430Z · LW(p) · GW(p)

System 1 is the intuitive one, system 2 is the formal reasoning.

Replies from: Gunnar_Zarncke
comment by Gunnar_Zarncke · 2014-03-16T21:42:43.161Z · LW(p) · GW(p)

fixed.

comment by niceguyanon · 2014-03-17T16:22:23.387Z · LW(p) · GW(p)

The focus seems to be more on irrational things that people do, rather than irrational things that smart people do.

Help me understand where you are coming from, because I'd argue the opposite. Of the 9 articles or ideas that you want to see on LW, you listed 2 posts, Mestroyer and others listed 4 more, plus a few posts germane to the idea of smart-people failure modes. Intellectual hipsters and metacontrarianism is an all-time top post.

Advice about akrasia, motivation, identity and happiness is for the benefit of people in general, and if anything most posts are tailored with smart people in mind. I might agree with you if I started seeing posts about wearing seat belts and using contraception, but instead we have posts like the curse of identity and strategic choice of identity. Reflection on identity is epistemic rationality for smart people.

Replies from: casebash
comment by casebash · 2014-03-19T04:24:04.431Z · LW(p) · GW(p)

"Of the 9 articles or ideas that you want to see on LW, you listed 2 posts, Mestroyer and others listed 4 more, plus a few posts germane to the idea of smart people fail modes" - There are quite a few posts, but they are scattered over the whole site rather than collected in a sequence or on a single page.

comment by A1987dM (army1987) · 2014-03-16T14:03:52.612Z · LW(p) · GW(p)

Countersignalling is not necessarily a mistake, depending on how much the relevant people already know about you.

comment by ChristianKl · 2014-03-17T17:11:54.436Z · LW(p) · GW(p)

Fashion is a very interesting topic. I would like to have a CFAR zip hoodie that looks good to wear at events in the startup scene or when interacting with other programmers.

It could have the CFAR logo on the front and the CFAR mission statement at the back:

What if we could shrug off our feelings of defensiveness, and honestly evaluate the evidence on both sides of an issue before deciding which legislation to pass, what research to fund, and where to donate to do the most good?

Wear the right size, but you don't have to walk around with a Nike logo.

comment by ChristianKl · 2014-03-17T16:34:09.642Z · LW(p) · GW(p)

On the other hand, most Less Wrongers already know the basics of logic. That's not to say that we don't act extremely irrational much of the time, but just that going over the basics of logic again probably provides minimal benefit.

Logic is about the categories of true and false; Bayesianism is basically about the fact that 0 (false) and 1 (true) are not probabilities. It comes out of rejecting the "basics of logic" as Aristotle formulated them.

Going over the basics of reasoning is very useful because we aren't really clear about what they are. That's one of the core purposes of LessWrong: it exists to explore how rationality actually works. Yes, frequently those discussions won't produce knowledge that's useful for daily life, but if we can get clearer about how rationality works through discussing it in detail, we can advance the field.

For Salsa congresses there's a saying: the beginners go to the intermediate classes, the intermediates take the advanced classes, and the advanced folks take the beginner classes to brush up on their basics.

In Zen there's the concept of the beginner's mind: constantly staying at the level where you think about the basics.

comment by polymathwannabe · 2014-03-17T00:20:47.311Z · LW(p) · GW(p)

Sidenote: If we wanted to create a site focused on spreading general rationality, then we'd need to simplify the discussion, remove a lot of the maths/controversial ideas and add in some friendly images. Does such a site exist?

You're describing RationalWiki.

Replies from: Will_Newsome
comment by Will_Newsome · 2014-03-17T03:30:58.372Z · LW(p) · GW(p)

RationalWiki is for spreading scientism and leftism, not rationality. But if the existence of RationalWiki is enough reason for people to refrain from making some new hoi polloi monstrosity, then by all means just point everyone to RationalWiki.

Replies from: polymathwannabe
comment by polymathwannabe · 2014-03-17T21:14:47.820Z · LW(p) · GW(p)

Wow. I shudder to think how far to the right you must be to believe RationalWiki is leftist.

Replies from: Will_Newsome
comment by Will_Newsome · 2014-03-17T21:45:06.493Z · LW(p) · GW(p)

I was a minor contributor to RationalWiki before I'd ever heard of LessWrong, many years ago. At the time I was a typical atheist scientismist leftist. (I can still recall the flush of indignation I felt whenever anyone dared offer opinions that differed from those of leftist orthodoxy. Mindkiller indeed.) In fact, I found RationalWiki by Googling for clever and contemptuous retorts to the ridiculous arguments made by my quick-thinking Christian creationist conservative friend. Which is what RationalWiki's original purpose was: a reaction to Conservapedian shenanigans and to the sociopolitical currents that allowed Conservapedia to exist in the first place.

P.S. Heil Hitler

Replies from: polymathwannabe
comment by polymathwannabe · 2014-03-17T22:34:29.188Z · LW(p) · GW(p)

Thanks for sharing the back story. This updates my estimate of your political position closer to the middle.

On a side note, do your Christian friends know of your notion of superhuman gods? What do they think about it?

Replies from: Will_Newsome
comment by Will_Newsome · 2014-03-20T23:15:29.468Z · LW(p) · GW(p)

(Vladimir_Nesov, I think this 5 karma hit thing shouldn't apply to all comments in a thread stemming from a highly downvoted comment; it discourages resolution of confusions and breeds resentment toward LW as a community, which occasionally leads to publicity problems in other fora. Which I don't really care about but I bet you do.)

This updates my estimate of your political position closer to the middle.

I am probably somewhere in the middle. I don't have very developed views about politics; my take on political discourse is more sociological than anything. I don't think that I have enough information or wisdom to have justified political opinions. Mostly my feelings are, people suck at reasoning about politics, right-leaning people suck in a quaint mostly ineffective way, left-leaning people suck in a really dangerous way, Marxists are really smart and I don't understand them yet but I get the feeling they're engaged in some fucked up casuistry, reactionaries should stick to deconstruction of leftism (which is great! they should team up with Marxists!) and stop making fools of themselves with impossible policy recommendations and myopic political theory, libertarians are largely philosophically misguided which is a problem since they reason from ethics so much, I have to have contempt for liberaltarians because otherwise people might mistake me for a stereotype of that group to which I am most related, et cetera.

To me having confident political opinions is like going into some rainforest ecosystem where everything is strangling everything else and being like, ya know what, I'm definitely going to take sides with the lizards in this scenario, they seem most righteous, fuck the snakes and the birds and the trees and the bugs. I mean, liking lizards is cool, go ahead, but when it comes to 'alright let's cull the snake population, they're keeping down the lizards', at that point it's like people are almost trying to shoot themselves in the feet.

On a side note, do your Christian friends know of your notion of superhuman gods? What do they think about it?

I had two close seriously Christian friends in high school, one a conservative and ideological Christian, the other more of a nuanced philosopher. I barely ever talk to the former, and the latter may have changed his views since high school and is reticent to talk about them. So, no, they're not familiar with my notions. I befriended a Mormon girl within the last year and she and I mostly talk about the problem of discernment. Her perspective is more 'you can go with your instinct about whether an experience confirms the interpretational framework in which you experience it', whereas mine is more 'basically everyone is heavy-handed in their interpretation of these things, the phenomena seem to be purposefully baffling in a way that makes correct interpretation nearly impossible' (which is sort of my perspective on everything, but it applies triply to topics as tricky as the supernatural). So, she thinks I should have more faith or something. Other than that, I haven't talked much with Christian people about the supernatural (and I'd argue Mormonism barely counts as Christianity from a theological perspective and perhaps also an anthropological one). Mostly I stick to theology, where in some ways I feel more confident I know what I'm talking about.