Why Don't Rationalists Win?

post by Adam Zerner (adamzerner) · 2015-09-05T00:57:28.156Z · LW · GW · Legacy · 118 comments

Contents

    Epistemic
    Altruistic
    Success
    Happiness
    Social
    Failure?
  A LOT

Here are my thoughts on the "Why don't rationalists win?" thing.

Epistemic

I think it's pretty clear that rationality helps people do a better job of being... less wrong :D

But seriously, I think that rationality does lead to very notable improvements in your ability to have correct beliefs about how the world works. And it helps you to calibrate your confidence. These abilities are useful. And I think rationality deserves credit for being useful in this area.

I'm not really elaborating here because I assume that this is something that we agree on.

However, I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway), and that this may feed the "why don't rationalists win?" thing. I think a big reason for this lack of progress is that a) we think about really, really, really difficult things! And b) we beat around the bush a lot. Big topics are often brought up, but I rarely see people say, "Ok, this is a huge topic, so in order to make progress we're going to have to sit down for many hours and be deliberate about it. But I think we could do it!" Instead, these conversations seem to be just people having fun, procrastinating, and never investing enough time to make real progress.

Altruistic

I also think that rationality is doing a great job of helping people be better at altruism.

For people with altruistic goals, rationality is helping them achieve those goals, and I think it's doing a really good job of it. But I also think it doesn't quite feel like the gains being made here are that big. I think a major reason for this is that the gains are:

  1. High level.
  2. Likely to be realized far in the future.
  3. The sort of thing you don't personally experience (think: buying a poor person lunch vs. donating money to people in Africa).

But we all know that (1), (2), and (3) don't actually make the gains smaller; they just make them feel smaller. I get the impression that the fact that the gains feel smaller results in an unjustified increase in the "rationalists don't win" feeling.

Success

I get the impression that lack of success plays a big role in the "why don't rationalists win?" thing.

I guess an operational definition of success for this section could be something like "professional and financial success, personal goals, being awesome...".

I don't know much about this, but I would think and hope that rationality helps people to be notably more successful than they otherwise would be. I don't think rationality is at the point yet where it could make everyone millionaires (metaphorically and/or literally). But I think that a) it could get there, and b) we shouldn't trivialize the fact that it does (I'm assuming) make people notably more successful than they otherwise would be.

But still, I think that there are a lot of other factors that determine success, and given their difficulty/rarity, even with rationality in your toolbox you won't achieve that much success without the following.

  1. Plain old hard work. I'm a huge believer in working smart, but I also think that given a pretty low and relatively sufficient level of smartness in your work, it's mostly a matter of how hard you work. You may ask yourself, "Take someone who studies really hard but is lacking big time when it comes to rationality - wouldn't they fail to be successful?". I think an important (and sad) point to make is that at this point in history, you could be very successful with domain-specific knowledge but no rationality. And so people who work really hard but don't have an ounce of rationality often end up being very good at what they do, and very successful. I think we'll reach a point where things progress enough that rationality does in fact become necessary (the people with domain-specific knowledge but no rationality will fail).
  2. Aptitude/starting early. I'm not sure to what extent aptitude is actually a thing. I sense that a big part of it is simply how early on you started, when your brain was at that "sponge stage". Regardless, aptitude/starting early seems to be pretty important. Someone who works hard but started too late will certainly be at a disadvantage.
  3. Opportunity. In one sense, not much will help you if you have to work 3 jobs to survive (you won't have much time for self-improvement or other necessary investments of time). In another sense, there's the idea that "you are who you surround yourself with". So people who are fortunate enough to grow up around other smart and hard-working people will have had the opportunity to be socially pressured into doing the same. I think this is very underrated, but also very much something you can overcome. In yet another sense, some people are extremely fortunate and are born into a situation where they have a lot of money and connections.
  4. Ambition/confidence. Example: imagine a web developer who has rationality + (1) + (2) + (3) but doesn't have (4). He'll probably end up being a good web developer, but he might not end up being a great one. The reason is that he might not have the ambition or confidence to think to pursue certain skills. He may think, "that stuff is for truly smart people; I'm just not one of those people". And he may not have the confidence to pursue the more general and wide-ranging goal of being a great software engineer. He may not have the confidence to learn C and other such things. Note that there's a difference between not having the confidence to try, and not having the confidence to even think to try. I think that the latter is a lot more common, and blends into "ambition territory". On that note, this hypothetical person may not think to pursue innovative ideas, or get into UX, or start a startup and do something bigger.
My point in this section is that rationality can help with success, but 1-4 are also extremely important, and probably act as a limiting factor for most of us (I'd guess that most people here are rational enough that 1-4 act as the barrier to their success, and marginal increases in rationality probably won't have too big a marginal impact).

(I also bet that 1-4 are an incomplete list and that there are important things I'm missing.)

Happiness

I get the impression that lack of happiness plays a big role in the "why don't rationalists win?" thing.

Luke talked about the correlates of happiness in How to Be Happy:

Factors that don't correlate much with happiness include: age,[7] gender,[8] parenthood,[9] intelligence,[10] physical attractiveness,[11] and money[12] (as long as you're above the poverty line). Factors that correlate moderately with happiness include: health,[13] social activity,[14] and religiosity.[15] Factors that correlate strongly with happiness include: genetics,[16] love and relationship satisfaction,[17] and work satisfaction.[18]

One thing I want to note is that genetics seem to play a huge role, and that plus the HORRIBLE hedonic adaptation thing makes me think that we don't actually have that much control over our happiness.

Moving forward... and this is what motivated me to write this article... the big determinants of happiness seem like things that are sort of outside rationality's sphere of influence. I don't believe that, and it kills me to say it, but I thought it'd make more sense to say it first and then amend it (a writing technique I'm playing around with and am optimistic about). What I really believe is:

Social

Socially, LessWrong seems to be a rather large success to me. My understanding is that it started off with Eliezer and Robin just blogging... and now there are thousands of people having meet-ups across the globe. That amazes me. I can't think of any examples of something similar.

Furthermore, the social connections LW has helped create seem pretty valuable to me. There seem to be a lot of us who are incredibly unsatisfied with normal social interaction, or sometimes just plain old don't fit in. But LW has brought us together, and that seems incredible and very valuable to me. So it's not just "it helps you meet some cool people". It's "it's taken people who were previously empty, and has made them fulfilled".

Still though, I think there's a lot more that could be done. Rationalist dating website?* Rationalist pen pals (something that encourages the development of deeper 1-on-1 relationships)? A more general place that "encourages people to let their guard down and confide in each other"? Personal mentorship? This is venturing into a different area, but perhaps there could be some sort of professional networking?

*As someone who constantly thinks about startups, I'm liking the idea of "dating website for social group X that has a hard time relating to the rest of society". It could start off with X = 1, and expand, and the parent business could run all of it.

Failure?

So, are we a failure? Is everything moot because "rationalists don't win"?

I don't think so. I think that rationality has had a lot of impressive successes so far. And I think that it has

A LOT

of potential (did I forget any other indicators of visual weight there? it wouldn't let me add color). But it certainly hasn't made us superhumans. I get the impression that because rationality has so much promise, we hold it to a crazy high standard and sometimes lose sight of the great things it provides. And then there's also the fact that it's only, what, a few decades old?


(Sorry for the bits of straw manning throughout the post. I do think that it led to more effective communication at times, but I also don't think it was optimal by any means.)

118 comments

Comments sorted by top scores.

comment by Fluttershy · 2015-09-05T02:24:09.911Z · LW(p) · GW(p)

I've never really understood the "rationalists don't win" sentiment. The people I've met who have LW accounts have all seemed much more competent, fun, and agenty than all of my "normal" friends (most of whom are STEM students at a private university).

I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway)

There have been plenty of Gwern-style research posts on LW, especially given that writing research posts of that nature is quite time-consuming.

Replies from: lmm, TheAncientGeek
comment by lmm · 2015-09-12T18:04:41.011Z · LW(p) · GW(p)

I went to an LW meetup once or twice. With one exception the people there seemed less competent and fun than my university friends, work colleagues, or extended family, though possibly more competent than my non-university friends.

Replies from: lahwran, drethelin
comment by the gears to ascension (lahwran) · 2015-09-15T21:59:48.941Z · LW(p) · GW(p)

That was also true for me until I moved to the bay. I suspect it simply doesn't move the needle much, and it's just a question of who it attracts.

comment by drethelin · 2015-09-12T20:26:12.603Z · LW(p) · GW(p)

I have the opposite experience! Most people at LW meetups I've been to have tended to be successful programmers, or people with (or working on) things like math PhDs. Generally more socially awkward, but that's not a great proxy for "competence" in this kind of crowd.

Replies from: None
comment by [deleted] · 2015-09-13T02:08:14.182Z · LW(p) · GW(p)

Do you think this was caused by their rationality? It seems more likely to me that these people are drawn to rationality because it validates how they already think.

Replies from: lahwran
comment by the gears to ascension (lahwran) · 2015-09-15T22:02:05.216Z · LW(p) · GW(p)

What you just said doesn't make sense. "Rationality", as formally defined by this community, refers to "doing well" (which I contest, but whatever); therefore, the question is not "was it caused by their rationality?", but "was it caused by a lack of rationality?", or perhaps "was their lack of rationality caused by using LW techniques?".

Replies from: None
comment by [deleted] · 2015-09-16T20:06:24.345Z · LW(p) · GW(p)

Defining rationality as winning is useless in most discussions. Obviously what I was referring to is rationality as defined by the community, e.g. "extreme epistemic rationality".

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-16T20:47:59.174Z · LW(p) · GW(p)

The community defines rationality as epistemic rationality AND as winning, and not noticing the difference between the two leads to the idea that rationalists ought to win at everything... that the winningness of instrumental rationality and the universality of ER can be combined.

comment by TheAncientGeek · 2015-09-05T13:41:14.121Z · LW(p) · GW(p)

There are plenty of researchers who have never heard of lesswrong, but who manage to produce good work in fields lesswrong respects. So have they...

A) overcome their lack of rationality,

...or...

B) learnt rationality somewhere else?

And, if the answer is B, what is LW adding to rationality? The assumption that rationality will make you good at a range of things that aren't academic or technical?

Replies from: None, adamzerner
comment by [deleted] · 2015-09-07T02:27:25.982Z · LW(p) · GW(p)

The answer is, of course, (B), but what LW adds is a common vocabulary and a massive compilation of material in one place. Most people who learn how to think from disparate sources have a hard time codifying what they understand or teaching it to others. Vocabulary and discourse help immensely with that.

So, for instance, I can neatly tell people, "Reversed stupidity is not intelligence!", and thus save myself incredible amounts of explanation about how real-life issues are searches through spaces for tiny sub-spaces encoding solutions to your problem, such that "reversing" some particularly bad solution hasn't done any substantial work locating the sub-space you actually wanted.
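(A minimal sketch of that search-space point, with toy numbers I've made up; the bit-string setup is mine, not the commenter's. Negating an arbitrary bad candidate just yields another arbitrary candidate, no closer to the tiny target region.)

```python
import random

random.seed(0)

# Illustrative setup: candidate solutions are 40-bit strings, and
# exactly one string in the whole space is the solution we want.
N_BITS = 40
target = [random.randint(0, 1) for _ in range(N_BITS)]

def hamming(a, b):
    # How many bits two candidates differ in.
    return sum(x != y for x, y in zip(a, b))

# A "particularly bad solution" is, from the search's point of view,
# just an arbitrary point in the space.
bad = [random.randint(0, 1) for _ in range(N_BITS)]

# "Reversing" the bad solution: flip every bit.
reversed_bad = [1 - x for x in bad]

print(hamming(bad, target))           # roughly 20 of 40 bits wrong
print(hamming(reversed_bad, target))  # still roughly 20 of 40 bits wrong
```

Since the reversal's distance from the target is exactly 40 minus the original's, flipping a typical, near-random bad guess leaves you about as far away as before; the reversal has done no real search work.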

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-07T09:34:25.529Z · LW(p) · GW(p)

It only creates a common vocabulary amongst a subculture. LW vocabulary relabels a lot of traditional rationality terms.

Replies from: Vaniver, None
comment by Vaniver · 2015-09-07T13:48:13.253Z · LW(p) · GW(p)

LW vocabulary relabels a lot of traditional rationality terms.

Has anyone put together a translation dictionary? Because it seems to me that most of the terms are the same, and yet it is common to claim that relabeling is common without any sort of quantitative comparison.

Replies from: Good_Burning_Plastic, Viliam, Jiro, satt, btrettel
comment by Good_Burning_Plastic · 2015-09-11T08:22:45.506Z · LW(p) · GW(p)

Huh, lemme do it.

Schelling fence → bright-line rule

Semantic stopsign → thought-terminating cliché

Anti-inductiveness → reverse Tinkerbell effect

"0 and 1 are not probabilities" → Cromwell's rule

Tapping out → agreeing to disagree (which sometimes confuses LWers when they take the latter literally (see last paragraph of linked comment))

ETA (edited to add) → PS (post scriptum)

That's off the top of my head, but I think I've seen more.

Replies from: ScottL, None, Vaniver, TheAncientGeek
comment by ScottL · 2015-09-11T12:39:36.180Z · LW(p) · GW(p)

Thanks for this. Let me know if you have any others and I will add them to this wiki page I created: Less Wrong Canon on Rationality. Here are some more that I already had.

  • Fallacy of gray → Continuum fallacy
  • Motivated skepticism → disconfirmation bias
  • Marginally zero-sum game → arms race
Replies from: Jiro
comment by Jiro · 2015-09-11T15:48:14.853Z · LW(p) · GW(p)
comment by [deleted] · 2015-09-19T17:03:45.304Z · LW(p) · GW(p)

Funging Against → Considering the alternative

Akrasia → Procrastination/Resistance

Belief in Belief → Self-Deception

Ugh Field → Aversion to (I had a better fit for this but I can't think of it now)

comment by Vaniver · 2015-09-11T14:16:10.163Z · LW(p) · GW(p)

Thanks for the list!

I am amused by this section of Anti-Inductiveness in this context, though:

Not that this is standard terminology - but perhaps "efficient market" doesn't convey quite the same warning as "anti-inductive". We would appear to need stronger warnings.

comment by TheAncientGeek · 2015-09-13T14:49:23.044Z · LW(p) · GW(p)

Instrumental/terminal = hypothetical/categorical

rationalist taboo = unpacking.

Replies from: None
comment by [deleted] · 2015-12-02T07:46:12.222Z · LW(p) · GW(p)

Instrumental and terminal are pretty common terms. I've seen them in philosophy and business classes.

comment by Viliam · 2015-09-10T20:32:02.749Z · LW(p) · GW(p)

Has anyone put together a translation dictionary?

It has been debated many times on LW whether LW needlessly invents new words for already-existing terms, or whether the new words label things that are not considered elsewhere.

I don't remember the outcomes of those debates. It seems to me they usually went like this:

"LW invents new words for many things that already have standard names."
"Can you give me five examples?"
"What LW calls X is called Y everywhere else." (provides only one example)
"Actually X is not the same concept as Y."
"Yes it is."
"It is not."
...

So I guess at the end both sides believe they have won the debate.

comment by Jiro · 2015-10-12T14:43:05.882Z · LW(p) · GW(p)

I just ran into this one because it came up in a reddit thread: in this post Eliezer uses the term "catgirl" to mean a non-sentient sexbot. While that isn't a traditional rationality term, I think it fits the spirit of the question (and predictably, many people responded to the reddit thread using the normal meaning of "catgirl" rather than Eliezer's).

comment by btrettel · 2015-09-11T20:19:35.082Z · LW(p) · GW(p)

RationalWiki discusses a few:

Another problem of LessWrong is that its isolationism represents a self-made problem (unlike demographics). Despite intense philosophical speculation, the users tend towards a proud contempt of mainstream and ancient philosophy[39] and this then leads to them having to re-invent the wheel. When this tendency is coupled with the metaphors and parables that are central to LessWrong's attraction, it explains why they invent new terms for already existing concepts.[40] The compatibilism position on free will/determinism is called "requiredism"[41] on LessWrong, for example, and the continuum fallacy is relabeled "the fallacy of gray." The end result is a Seinfeldesque series of superfluous neologisms.

In my view, RationalWiki cherry picks certain LessWrongers to bolster their case. You can't really conclude that these people represent LessWrong as a whole. You can find plenty of discussion of the terminology issue here, for example, and the way RationalWiki presents things makes it sound like LessWrongers are ignorant. I find this sort of misrepresentation to be common at RationalWiki, unfortunately.

Replies from: Kawoomba
comment by Kawoomba · 2015-09-11T20:56:48.898Z · LW(p) · GW(p)

Their approach reduces to an anti-epistemic affect-heuristic, using the ugh-field they self-generate in a reverse affective death spiral (loosely based on our memeplex) as a semantic stopsign, when in fact the Kolmogorov distance to bridge the terminological inferential gap is but an epsilon.

Replies from: Good_Burning_Plastic, XFrequentist, nyralech
comment by Good_Burning_Plastic · 2015-09-18T20:30:13.812Z · LW(p) · GW(p)

You know you've been reading Less Wrong too long when you only have to read that comment twice to understand it.

comment by XFrequentist · 2015-09-12T19:08:50.014Z · LW(p) · GW(p)

I got waaay too far into this before I realized what you were doing... so well done!

Replies from: Kawoomba
comment by Kawoomba · 2015-09-12T20:22:30.098Z · LW(p) · GW(p)

What are you talking about?

comment by nyralech · 2015-09-13T17:00:20.113Z · LW(p) · GW(p)

I'm afraid I don't know what you mean by Kolmogorov distance.

comment by [deleted] · 2015-09-07T14:04:02.435Z · LW(p) · GW(p)

Well yes. And I fully support LW moving towards more ordinary terminology. But it's still good to have someone compiling it all together.

comment by Adam Zerner (adamzerner) · 2015-09-05T14:29:39.170Z · LW(p) · GW(p)

I feel rather confident in saying that it's (A). I think that domain specific knowledge without rationality can actually lead to a lot of success.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-05T15:50:21.610Z · LW(p) · GW(p)

You actually think someone irrational can do maths, science or engineering?

Replies from: adamzerner, None
comment by Adam Zerner (adamzerner) · 2015-09-05T15:52:01.790Z · LW(p) · GW(p)

Yes, absolutely. Aren't there a bunch of examples of STEM PhDs having crazy beliefs like "evolution isn't real"?

Actually, that example in particular was about a context outside the laboratory, but I also think that people can be irrational inside the laboratory and still be successful. Ex. they might do the right things for the wrong reasons. Ex. "this is just the way you're supposed to do it". That approach might lead to success a lot of the time, but it isn't a true model of how the world works.

(Ultimately, the point I'm making is essentially the same as that Outside The Laboratory post.)

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-06T09:04:55.305Z · LW(p) · GW(p)

Yes, absolutely. Aren't there a bunch of examples of STEM PhDs having crazy beliefs like "evolution isn't real"?

Not a very big bunch.

Since (instrumental) rationality is winning, and these people are winning within their own domains, they are being instrumental rationalists within them. So the complaint that they are not rational enough amounts to the complaint that they are not epistemic rationalists, i.e. they don't care enough about truth outside their domains. But why should they? Epistemic rationality doesn't deliver the goodies, in terms of winning... that's the message of your OP, or it would be if you distinguished ER and IR.

Those who value truth for its own sake, the Lovers of Wisdom, will want to become epistemic rationalists, and may well acquire the skills without MIRI or CFAR's help, since epistemic rationality is not a new invention. The rest have the problem that they are not motivated, not the problem that they are not skilled (or, rather, that they lack the skills needed to do things they are not motivated to do).

I can see the attraction in "raising the rationality waterline", since people aren't completely pigeonholed into domains, and do make decisions about wider issues, particularly when they vote.

MIRI conceives of raising the rationality waterline in terms of teaching skills, but if it amounts to supplementing IR with ER, and it seems that it does, then you are not going to do it without making it attractive. If you merge ER and IR, then it looks like raising the waterline could lead to enhanced winning, but that expectation just leads to the disappointment you and others have expressed. That cycle will continue until "rationalists" realise rationality is more than one thing.

Replies from: None
comment by [deleted] · 2015-09-19T17:10:39.735Z · LW(p) · GW(p)

Or, you could just assume that it wouldn't make sense for Adam Zerner to define winning as failing, so he was referring to rationality as the set of skills that LW teaches.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-20T11:46:51.854Z · LW(p) · GW(p)

It's not like winning <--> losing is the only relevant axis.

Replies from: None
comment by [deleted] · 2015-09-21T00:06:01.804Z · LW(p) · GW(p)

It's definitely relevant to this reasoning:

"Since (instrumental) rationality is winning, and these people are winning within their own domains, they are being instrumental rationalists within them."

Just assume Adam Zerner wasn't talking about instrumental rationality in this case.

comment by [deleted] · 2015-09-06T20:33:18.755Z · LW(p) · GW(p)

Heck, I'll bite the bullet and say that applied science, and perhaps engineering, owe more to 'irrational' people. (Not a smooth bullet, to be sure.)

My grandfather worked with turbines. He viewed the world in a very holistic manner, one I can't reconcile with rationalism no matter how hard I try. He was an atheist who had no problems with his children being Christians (that I know of). His library was his pride; yet he had made no provisions for its fate after his death. He quit smoking after his first heart attack, but went on drinking. He thought electrons' orbits were circular, and he repaired circuitry often. He preferred reading a book during dinner to listening to us talk or teaching us table manners.

And from what I heard, he was not half bad at engineering.

comment by pjeby · 2015-09-05T02:21:26.646Z · LW(p) · GW(p)

The big reason? Construal theory, or as I like to call it, action is not an abstraction. Abstract construal doesn't prime action; concrete construal does.

Second big reason: the affect (yes, I do mean affect) of being precise is very much negative. Focusing your attention on flaws and potential problems leads to pessimism, not optimism. But optimism is correlated with success; pessimism is not.

Sure, pessimism has some benefits in a technical career, in terms of being good at what you do. But it's in conflict with other things you need for a successful career. TV's Dr. House is an extreme example, but most real people are not as good at the technical part of their job as House, nor is the quality of their results usually as important.

Both of these things combine to create the next major problem: a disposition to non-co-operative behavior, aka the "why can't our kind get along?" problem.

Yes, not everyone has these issues, diverse community, etc. But, as a stereotypical and somewhat flippant summary, the issue is that simply by the nature of valuing truth -- precise truth, rather than the mere idea of truth -- one is treating it as being more important than other goals. That means it's rather unlikely that a person interested in it will be sufficiently interested in other goals to make progress there. I would expect it more likely that a person who is not naturally inclined towards rationalism would be able to put it to good use than someone who's just intellectually interested in rationalism as a conversation topic or as an ideal to aspire to.

To put it another way, if you already have "something to protect", such that rationality is a means towards that end, then rationality can be of some value. If you value rationality for its own sake, well, then that is your goal, and so you can perhaps be called "successful" in relation to it, but it's not likely that anyone who doesn't value rationality for its own sake will consider your accomplishments impressive.

So, the truth value of "rationalists don't win" depends on your definition of "win". Is it "win at achieving their own, perhaps less-than-socially-valued goals"? Or "win at things that are impressive to non-rationalists"? I think the latter category is far less likely to occur for those whose terminal values are aimed somewhere near rationality or truth for its own sake.

Replies from: TheAncientGeek, 27chaos, entirelyuseless
comment by TheAncientGeek · 2015-09-05T11:44:14.319Z · LW(p) · GW(p)

So, the truth value of "rationalists don't win" depends on your definition of "win"

Or the definition of rationalism. Maybe epistemic rationalism never had much to do with winning.

Replies from: ragintumbleweed, satt
comment by ragintumbleweed · 2017-02-25T00:08:57.436Z · LW(p) · GW(p)

Epistemic rationality isn’t about winning?

Demonstrated, context-appropriate epistemic rationality is incredibly valuable and should lead to higher status and -- to the extent that I understand Less Wrong jargon -- “winning.”

Think about markets: If you have accurate and non-consensus opinions about the values of assets or asset classes, you should be able to acquire great wealth. In that vein, there are plenty of rationalists who apply epistemic rationality to market opinions and do very well for themselves. Think Charlie Munger, Warren Buffett, Bill Gates, Peter Thiel, or Jeff Bezos. Winning!

If you know better than most who will win NBA games, you can make money betting on the games. E.g., Haralabos Voulgaris. Winning!

Know what health trends, diet trends, and exercise trends improve your chances for a longer life? Winning!

If you have an accurate and well-honed understanding of what pleases the crowd at Less Wrong, and you can articulate those points well, you’ll get Karma points and higher status in the community. Winning!

Economic markets, betting markets, health, and certain status-competitions are all contexts where epistemic rationality is potentially valuable.

Occasionally, however, epistemic rationality can be demonstrated in ways that are context-inappropriate – and thus lead to lower status. Not winning!

For example, if you correct someone’s grammar the first time you meet him or her at a cocktail party. Not winning!

Demonstrate that your boss is dead wrong in front of a group of peers in a way that embarrasses her? Not winning!

Constantly argue about LW-type topics with people who don’t like to argue? Not winning!

Epistemic rationality is a tool. It gives you power to do things you couldn't do otherwise. But status games require a deft understanding of when it is and is not appropriate to demonstrate the greater coherence of one's beliefs with reality to others (which itself strikes me as a form of epistemic rationality applied to social awareness). Those who get it right are the winners. Those who do not are the losers.

Replies from: Lumifer, TheAncientGeek
comment by Lumifer · 2017-02-27T16:23:40.355Z · LW(p) · GW(p)

Epistemic rationality is a tool.

Well, technically speaking, it isn't. It is the propensity to select courses of action which will most likely lead to the outcomes you prefer. Correcting grammar on the first date is not a misapplication of epistemic rationality; it just is NOT epistemically rational (assuming reasonable context, e.g. you are not deliberately negging and you are less interested in grammar than in this particular boy/girl).

Epistemic rationality doesn't save you from having bad goals. Or inconsistent ones.

ETA: Ah, sorry. I had a brain fart and was writing "epistemic rationality" while meaning "instrumental rationality". So, er, um, disregard.

Replies from: jwoodward48, TheAncientGeek
comment by jwoodward48 · 2017-03-03T00:25:12.401Z · LW(p) · GW(p)

(I recognize that you meant instrumental rationality rather than epistemic rationality, and have read the comment with that in mind.)

Epistemic rationality is not equivalent to "being a Spockish asshole." It simply means that one values rationality as an end and not just a means. If you do not value correcting people's grammar for its own sake, then there is no reason to correct someone's grammar. But that is an instrumental statement, so I suppose I should step back...

If you think that epistemic and instrumental rationality would disagree at certain points, try to reconsider their relationship. Any statement of "this ought to be done" is instrumental. Epistemic only covers "this is true/false."

Replies from: Lumifer
comment by Lumifer · 2017-03-03T15:36:48.526Z · LW(p) · GW(p)

Epistemic rationality is not equivalent to "being a Spockish asshole."

Yes, of course. Notably, epistemic rationality only requires you to look for and to prefer truth. It does not require you to shove the truth you found into everyone else's face.

If you think that epistemic and instrumental rationality would disagree at certain points

One can find edge cases, but generally speaking if you treat epistemic rationality narrowly (see above) I would expect such a disagreement to arise very rarely.

On the other hand there are, as usual, complications :-/ For example, you might not go find the truth because doing this requires resources (e.g. time) and you feel these resources would be better spent elsewhere. Or if you think you have difficulties controlling your mind (see the rider and the elephant metaphor) you might find useful some tricks which involve deliberate denial of some information to yourself.

comment by TheAncientGeek · 2017-02-28T11:31:15.726Z · LW(p) · GW(p)

It is the propensity to select courses of action which will most likely lead to the outcomes you prefer.

So how does it differ from instrumental rationality?

Replies from: Lumifer, Elo
comment by Lumifer · 2017-02-28T15:42:11.713Z · LW(p) · GW(p)

See ETA to the comment.

comment by Elo · 2017-02-28T11:41:24.589Z · LW(p) · GW(p)

I think this is a bad example. The example seems like an instrumental example. Epistemic alone would have you correct the grammar because that's good epistemics. Instrumental would have you bend the rules for the other goals you have on the pathway to winning.

Replies from: jwoodward48, hairyfigment
comment by jwoodward48 · 2017-03-03T00:15:09.607Z · LW(p) · GW(p)

"See ETA to the comment." Lumifer meant instrumental rationality.

Replies from: Elo
comment by Elo · 2017-03-03T00:19:36.226Z · LW(p) · GW(p)

Comment was before his eta. Ta.

Replies from: jwoodward48
comment by jwoodward48 · 2017-03-03T00:27:11.100Z · LW(p) · GW(p)

Hmm? Ah, I see; you think that I am annoyed. No, I only quoted Lumifer because their words nearly sufficed. Rest assured that I do not blame you for lacking the ability to gather information from the future.

comment by hairyfigment · 2017-03-03T01:29:27.157Z · LW(p) · GW(p)

How could correcting grammar be good epistemics? The only question of fact there is a practical one - how various people will react to the grammar coming out of your word-hole.

comment by TheAncientGeek · 2017-02-25T19:09:45.604Z · LW(p) · GW(p)

Epistemic rationality isn’t about winning?

Demonstrated, context-appropriate epistemic rationality is incredibly valuable and should lead to higher status and -- to the extent that I understand Less Wrong jargon -- “winning.”

Valuable to whom? Value and status aren't universal constants.

You are pretty much saying that the knowledge can sometimes be instrumentally useful. But that does not show epistemic rationality is about winning.

The standard way to show that instrumental and epistemic rationality are not the same is to put forward a society where almost everyone holds to some delusory belief, such as a belief in Offler the Crocodile god, and awards status in return for devotion. In that circumstance, the instrumental rationalist will profess the false belief, and the epistemic rationalist will stick to the truth.

In a society that rewards the pursuit of knowledge for its own sake (which ours does sometimes), the epistemic rationalist will get rewards, but won't be pursuing knowledge in order to get rewards. If they stop getting the rewards they will still pursue knowledge... it is a terminal goal for them... that is the sense in which ER is not "about" winning and IR is.

Epistemic rationality is a tool.

ER is defined in terms of goals. The knowledge gained by it may be instrumentally useful, but that is not the central point.

Replies from: ragintumbleweed
comment by ragintumbleweed · 2017-02-27T21:03:32.321Z · LW(p) · GW(p)

You are pretty much saying that the knowledge can sometimes be instrumentally useful. But that does not show epistemic rationality is about winning.

What I'm saying is that all things being equal, individuals, firms, and governments with high ER will outperform those with lower ER. That strikes me as both important and central to why ER matters.

I believe you seem to be saying high ER or having beliefs that correspond to reality is valuable for its own sake. That Truth matters for its own sake. I agree, but that's not the only reason it's valuable.

In your society with Offler the Crocodile God, yes, irrational behavior will be rewarded.

But the society where devotion to Offler is rewarded over engineering prowess will have dilapidated bridges, or no bridges at all. Even in the Offler society, medicine based on science will save more lives than medicine based on Offler's teachings. The doctors might be killed by the high priests of Offler for practicing that way, but it's still a better way to practice medicine. Those irrational beliefs may be rewarded in the short term, but they will make everyone's life worse off as a result. (Perhaps in the land of Offler's high priests, clandestine ER is the wisest approach.)

If the neighboring society of Rational-landia builds better bridges, has better medical practices, and creates better weapons with sophisticated knowledge of projectile physics, it will probably overtake and conquer Offler's people.

In North Korea today, the best way to survive might be to pledge complete loyalty to the supreme leader. But the total lack of ER in the public sphere has set it back centuries in human progress.

NASA wasn't just trying to figure out rocket science for its own sake in the 1960s. It was trying to get to the moon.

If the terminal goal is to live the best possible life ("winning"), then pursuing ER will be incredibly beneficial in achieving that aim. But ER does not obligate those who seek it to make it their terminal goal.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2017-02-28T10:24:08.957Z · LW(p) · GW(p)

What I'm saying is that all things being equal, individuals, firms, and governments with high ER will outperform those with lower ER.

That is probably true, but not equivalent to your original point.

I believe you seem to be saying high ER or having beliefs that correspond to reality is valuable for its own sake.

I am not saying it is objectively valuable for its own sake. I am saying an epistemic rationalist is defined as someone who terminally, ie for its own sake, values knowledge, although that is ultimately a subjective evaluation.

If the terminal goal is to live the best possible life ("winning"), then pursuing ER will be incredibly beneficial in achieving that aim. But ER does not obligate those who seek it to make it their terminal goal.

It's defined that way!!!!!

Replies from: ragintumbleweed
comment by ragintumbleweed · 2017-02-28T17:55:22.777Z · LW(p) · GW(p)

Forgive me, as I am brand new to LW. Where is it defined that an epistemic rationalist can't seek epistemic rationality as a means of living a good life (or for some other reason) rather than as a terminal goal? Is there an Académie française of rationalists that takes away your card if you use ER as a means to an end?

I'm working off this quote from EY as my definition of ER. This definition seems silent on the means-end question.

Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed "truth" or "accuracy", and we're happy to call it that.

This definition is agnostic on motivations for seeking rationality. Epistemic rationality is just seeking truth. You can do this because you want to get rich or get laid or get status or go to the moon or establish a better government or business. People's motivations for doing what they do are complex. Try as I might, I don't think I'll ever fully understand why my primate brain does what it does. And I don't think anyone's primate brain is seeking truth for its own sake and for no other reasons.

Also, arguing about definitions is the least useful form of philosophy, so if that's the direction we're going, I'm tapping out.

But I will say that if the only people the Académie française of rationalists deems worthy of calling themselves epistemic rationalists are those with pure, untainted motivations of seeking truth for its own sake and for no other reasons, then I suspect that the class of epistemic rationalists is an empty set.

[And yes, I understand that instrumentality is about the actions you choose. But my point is about motivations, not actions.]

Replies from: TheAncientGeek, Elo
comment by TheAncientGeek · 2017-03-22T12:38:43.822Z · LW(p) · GW(p)

Forgive me, as I am brand new to LW. Where is it defined that an epistemic rationalist can't seek epistemic rationality as a means of living a good life (or for some other reason) rather than as a terminal goal?

From the wiki:-

Epistemic rationality is that part of rationality which involves achieving accurate beliefs about the world. ..... It can be seen as a form of instrumental rationality in which knowledge and truth are goals in themselves, whereas in other forms of instrumental rationality, knowledge and truth are only potential aids to achieving goals.

comment by Elo · 2017-02-28T20:36:43.996Z · LW(p) · GW(p)

ER vs IR. I am not sure what your question is.

I think of ER as sharpening the axe. not sure how many trees I will cut down or when, but with a sharp axe I will cut them down swiftly and with ease. I think of IR as actually getting down to swinging the axe. Both are needed. ER is a good terminal goal because it enables the other goals to happen more freely. Even if you don't know the other goals, having a sharper axe helps you be prepared to cut the tree when you find it.

comment by satt · 2015-09-13T12:43:21.584Z · LW(p) · GW(p)

Upvoted, but I want to throw in the caveat that some baseline level of epistemic rationalism is very useful for winning. Schizophrenics tend to have a harder time of things than non-schizophrenics.

comment by 27chaos · 2015-09-06T09:13:04.359Z · LW(p) · GW(p)

That is a limitation of looking at this community specifically, but the general sense of the question can also be approached by looking at communities for specific activities that have strong norms of rationality.

I think most of the time rationality is not helpful for applied goals because doing something well usually requires domain specific knowledge that's acquired through experience, and yet experience alone is almost always sufficient for success. In cases where the advice of rationality and experience conflict, oftentimes experience wins even if it should not, because the surrounding social context is built by and for the irrational majority. If you make the same mistake everyone else makes you are in little danger, but if you make a unique mistake you are in trouble.

Rationality is most useful when you're trying to find truths that no one else has found before. Unfortunately, this is extremely difficult to do even with ideal reasoning processes. Rationality does offer some marginal advantage in truth seeking, but because useful novel truths are so rare, most often the costs outweigh the benefits. Once a good idea is discovered, oftentimes irrational people are simply able to copy whoever invented the idea, without having to bear all the risk involved with the process of the idea's creation. And then, when you consider that perfect rationality is beyond mortal reach, the situation begins to look even worse. You need a strategy that lets you make better use of truth than other people can, in addition to the ability to find truth more easily, if you want to have a decent chance to translate skill in rationality into life victories.
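(To make the shape of that cost-benefit claim concrete, here is a toy expected-value sketch; every number in it is invented purely for illustration, not an estimate of anything real.)

```python
# Toy expected-value comparison of original truth-seeking vs. copying.
value_of_truth = 100.0   # payoff from a useful novel truth
search_cost = 5.0        # cost of the risky discovery process

p_ordinary = 0.01        # chance an ordinary searcher finds one
p_rational = 0.03        # the same search with a marginal reasoning edge

ev_ordinary = p_ordinary * value_of_truth - search_cost   # -4.0
ev_rational = p_rational * value_of_truth - search_cost   # -2.0

# The copier pays no search cost: they adopt an idea only after someone
# else has found it and demonstrated that it works.
copy_cost = 0.5
ev_copier = value_of_truth - copy_cost  # 99.5, conditional on a discovery existing

print(ev_ordinary, ev_rational, ev_copier)
```

On these made-up numbers, the marginal reasoning edge shrinks the expected loss of original truth-seeking without changing its sign, while the copier's position dominates whenever a discovery already exists; that is the structure of the argument above.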

Replies from: None
comment by [deleted] · 2015-09-07T02:30:51.382Z · LW(p) · GW(p)

In cases where the advice of rationality and experience conflict

What is "rationality" even supposed to be if not codified and generalized experience?

comment by entirelyuseless · 2015-09-05T15:14:37.187Z · LW(p) · GW(p)

Yes. This is much like I said in my comment: people from Less Wrong are simply much more interested in truth in itself, and as you say here, there is little reason to expect this to make them more effective in attaining other goals.

comment by Gunnar_Zarncke · 2015-09-05T10:10:42.080Z · LW(p) · GW(p)

What is the observed and self-reported opinion on LW about "rationalists don't win"? Let's poll! Please consider the following statements (use your definition of 'win'):

I don't win: [pollid:1023]

Rationality (LW-style) doesn't help me win (by my definition of 'win'): [pollid:1024]

Rationality (LW-style) doesn't help people win (by my definition of 'win'): [pollid:1025]

I think rationalists on average don't win more than average people (by my definition of 'win'): [pollid:1026]

I think the public (as far as they are aware of the concept) thinks that rationalists don't win (by their standards): [pollid:1027]

Clarification: 'Don't win' is intended to mean 'lose' thus a negative effect of rationality. If you think that rationality has no effect or is neutral with regard to winning, then please choose the middle option.

Replies from: Good_Burning_Plastic, EngineerofScience, None
comment by Good_Burning_Plastic · 2015-09-05T17:37:17.732Z · LW(p) · GW(p)

Clarification: 'Don't win' is intended to mean 'lose' thus a negative effect of rationality. If you think that rationality has no effect or is neutral with regard to winning, then please choose the middle option.

I wish more surveys had such clarifications. I can never tell whether "Strongly disagree" with "X should be more than Y" means "I strongly believe X should be less than Y", "I strongly believe X should be about the same as Y", or "I strongly believe it doesn't matter whether X is more or less than Y" (and, as a result, what I should pick if my opinion is one of the latter two).

comment by EngineerofScience · 2015-09-20T19:35:05.007Z · LW(p) · GW(p)

I'm not sure how effective this is, considering most people who would see this are rationalists, and people like to think well of themselves.

comment by [deleted] · 2015-09-06T20:01:51.397Z · LW(p) · GW(p)

You know, it's hard for me to simultaneously think of someone as winning and not a rationalist, not to mention always correcting the result by 'my definition'. I could say that confirmation bias is at fault, but really... shouldn't we just dissolve the questions? :) I mean, suppose I do know a person who has trouble letting sunk costs go, and has probably firmly forgotten about Bayes' theorem, and uses arguments as soldiers on occasion... But she is far more active than me; she keeps trying out new things, seeking out jobs even beyond her experience, etc. Should I consider her rational? I don't know. Brave, yes. Rather smart, yes. Winning, often. But rational?

Replies from: Viliam, lahwran, TheAncientGeek, None
comment by Viliam · 2015-09-07T08:57:20.322Z · LW(p) · GW(p)

it's hard for me to simultaneously think of someone as winning and not a rationalist

Winning a lottery? (Generalize it to include genetic lottery etc.)

Replies from: lmm
comment by lmm · 2015-09-12T18:07:27.585Z · LW(p) · GW(p)

Many stories I've seen of lottery winners involve losing the money quickly through bad investments and/or developing major life issues (divorce, drug addiction).

Replies from: WalterL
comment by WalterL · 2015-09-29T20:19:21.976Z · LW(p) · GW(p)

I think there's an element of rubbernecking there. The general feeling of the mob is that lottery = tax on stupidity. We are smart to not play the lottery. Story of a winner challenges general feeling, mob feels dumb for not buying winning ticket. Unmet need exists for story to make mob happy again.

General form of story is that lottery money is evil money. Lottery winners, far from being better than you, dear reader, are actually worse! They get divorced, they squander the money! Lawsuits!!

No one wants to read about the guy who retires and pays off his credit cards. No story there. But there are a lot of lotteries, so there will be an idiot somewhere you can use to reassure your viewers that they are double smart for not being rich.

Replies from: Jiro
comment by Jiro · 2015-09-30T06:10:23.815Z · LW(p) · GW(p)

The entire world is a tax on stupidity.

Replies from: jwoodward48
comment by jwoodward48 · 2017-03-03T00:18:35.479Z · LW(p) · GW(p)

Sounds meaninglessly deep to me.

Replies from: Jiro
comment by Jiro · 2017-03-05T06:56:14.633Z · LW(p) · GW(p)

It isn't. It's meant to point out that calling something a 'tax on stupidity" is itself meaninglessly deep-sounding. Intelligence is used for pretty much everything; calling something a tax on stupidity says nothing more about it than "it's part of the world".

comment by the gears to ascension (lahwran) · 2015-09-15T21:57:12.867Z · LW(p) · GW(p)

In my conversations with LW and CFAR community folks, they seem to consider "rationality" to be strictly equal to "winning" - unless I ask them directly if that's true. I think they really could benefit from clearer and simpler words, rather than naming fucking everything after their favorite words.

comment by TheAncientGeek · 2015-09-07T09:27:34.282Z · LW(p) · GW(p)

Does "rational" have to have meaning? Is that not a way of dissolving the question.

comment by [deleted] · 2015-09-07T02:24:44.008Z · LW(p) · GW(p)

I mean, suppose I do know a person who has trouble letting sunk costs go, and has probably firmly forgotten about Bayes' theorem, and uses arguments as soldiers on occasion... But she is far more active than me; she keeps trying out new things, seeking out jobs even beyond her experience, etc. Should I consider her rational? I don't know. Brave, yes. Rather smart, yes. Winning, often. But rational?

Winning = rational, rational = winning. If you define rational as something other than "the intellectual means of winning", there's no point other than a religious fetish for a theorem that's difficult to compute with.

Replies from: None, lmm
comment by [deleted] · 2015-09-07T04:41:41.563Z · LW(p) · GW(p)

Then how does one understand 'rationalists don't win'? 'Rationalists expect to win and fail, just like, for example, XYZ-ists do; only rationalists have trained themselves to recognize failure and in this way can still salvage more, and so don't lose as completely (though we have no actual measure, because XYZ-ists will still think they have won)'? :)

Replies from: None, lahwran
comment by [deleted] · 2015-09-07T14:07:29.560Z · LW(p) · GW(p)

No, I'd understand it as more like, "Calling oneself a 'rationalist' or 'aspiring rationalist' isn't correlated with object-level winning".

comment by the gears to ascension (lahwran) · 2015-09-15T21:58:41.171Z · LW(p) · GW(p)

The point of the "rationalists win" thing was to define rationality as winning. Which, among other things, makes it very unclear why the word "intelligence" is different. Everyone seems to insist it is in fact different when I ask, but nobody can explain why, and the inductive examples they give me collapse under scrutiny. what?

Replies from: hamnox
comment by hamnox · 2015-09-18T17:29:08.534Z · LW(p) · GW(p)

Pretty sure inductive examples of intelligence fail because we really are pointing at different things when we say it.

Some mean "shows a statistically higher base rate for acquiring mental constructs (ideas, knowledge, skills)" when they say it. This usage tends to show up in people who think that model-building and explicit reasoning are the key to winning. They may try to tack this consideration onto their definition of intelligence in some way.

Some try to point at the specific differences in mental architecture they think cause people to use more or fewer mental constructs, like working memory or ability to abstract. This usage tends to show up in people who are trying to effect useful changes in how they or others think. They may notice that there's a lot of variation in which kind of mental constructs are used, and try to single out the combination that is most important to winning.

There's also the social stereotype of who has a preference for "doing" and experiencing vs. who is drawn to "thinking" and planning. People who think "doing" or having a well-integrated System 1 is the key to winning may favor this definition, since it neatly sidesteps away from the stupid argument over definitions the thinkers are having. I like to use it in conversations because it's loose enough to kinda encapsulate the other definitions — which role you think you fit is going to correlate with which you use more, which itself correlates with what your natural abilities lend themselves to. I'm less likely to talk past people that way.

But it's also because of this last interpretation that I point blank refuse to use intelligence as a synonym for rationality. The word 'rational' comes with just as many shades of denying emotion and trusting models over intuition, but they're at least framed as ignoring extraneous factors in the course of doing what you must.

comment by lmm · 2015-09-12T18:08:45.349Z · LW(p) · GW(p)

I want to talk about the group (well, cluster of people) that calls itself "rationalists". What should I call it if not that?

Replies from: lahwran
comment by the gears to ascension (lahwran) · 2015-09-15T21:58:59.563Z · LW(p) · GW(p)

CFAR community, or LW community, depending on which kind of person you mean.

comment by ZoltanBerrigomo · 2015-09-11T23:59:42.178Z · LW(p) · GW(p)

Side-stepping the issue of whether rationalists actually "win" or "do not win" in the real world, I think a priori there are some reasons to suspect that people who exhibit a high degree of rationality will not be among the most successful.

For example: people respond positively to confidence. When you make a sales pitch for your company/research project/whatever, people like to see that you really believe in the idea. Often, you will win brownie points if you believe in whatever you are trying to sell with nearly evangelical fervor.

One might reply: surely a rational person would understand the value of confidence and fake it as necessary? Answer: yes to the former, no to the latter. Confidence is not so easy to fake; people with genuine beliefs either in their own grandeur or in the greatness of their ideas have a much easier time of it.

Robert Kurzban's book Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind is essentially about this. The book may be thought of as a long-winded answer to the question "Why aren't we all more rational?" Rationality skills seem kinda useful for bands of hunter-gatherers to possess, and yet evolution gave them to us only in part. Kurzban argues, among other things, that those who are able to genuinely believe certain fictions have an easier time persuading others, and are therefore likely to be more successful.

Replies from: ChristianKl, adamzerner
comment by ChristianKl · 2015-09-19T18:13:32.638Z · LW(p) · GW(p)

For example: people respond positively to confidence. When you make a sales pitch for your company/research project/whatever, people like to see that you really believe in the idea. Often, you will win brownie points if you believe in whatever you are trying to sell with nearly evangelical fervor.

CFAR's Valentine manages to be highly charismatic. He also goes out of his way to tell people not to believe him too much, and to say explicitly that he's not certain.

In http://lesswrong.com/lw/mp3/proper_posture_for_mental_arts/ he suggests:

Based on my current fledgling understanding, this seems to look something like taking a larger perspective, like imagining looking back at this moment 30 years hence and noticing what does and does not matter. (I think that's akin to tucking your hips, which is a movement in service of posture but isn't strictly part of the posture.) I imagine this is enormously easier when one has a well-internalized sense of something to protect.

Having this strong sense of something worth protecting seems to be more important than believing that individual ideas are necessarily correct.

You don't develop a strong sense of something worth protecting by doing debiasing techniques, but at the same time it's a major part of rationality!CFAR and rationality!HPMOR. And there are people in this community plagued by akrasia who don't have that strong sense on an emotional level.

comment by Adam Zerner (adamzerner) · 2015-09-12T01:37:29.536Z · LW(p) · GW(p)

I agree with your point about the value of appearing confident, and that it's difficult to fake.* I think it's worth bringing up, but I don't think it's a particularly large component of success. It depends on the field, but I still don't think there are many fields where it's a notably large component (maybe sales?).

*I've encountered it. I'm an inexperienced web developer, and people sometimes tell me that I should be more confident. At first this hurt me very slightly. Almost negligibly. Recently, I've been extremely fortunate to get to work with a developer who also reads LW and understands confidence. I actually talked to him about this today, and he mirrored my thoughts: with most people, appearing more confident might benefit me, but with him it makes sense to be honest about my confidence levels (like I have been).

Replies from: ZoltanBerrigomo
comment by ZoltanBerrigomo · 2015-09-12T06:23:05.449Z · LW(p) · GW(p)

Not sure... I think confidence, sales skills, and the ability to believe in and get passionate about BS can be very helpful in much of the business world.

comment by entirelyuseless · 2015-09-05T15:08:50.697Z · LW(p) · GW(p)

Human beings are not very interested in truth in itself. They are mostly interested in it to the extent that it can accomplish other things.

Less Wrongers tend to be more interested in truth in itself, and to rationalize this as "useful" because being wrong about reality should lead you to fail to attain your goals.

But normal human beings are extremely good at compartmentalization. In other words they are extremely good at knowing when knowing the truth is going to be useful for their goals, and when it is not. This means that they are better than Less Wrongers at attaining their goals, because the truth does not get in the way. And their errors do not hinder their goals, for the most part, since they know when they need the truth and look for it on those occasions. Less Wrong's rationalization for how being interested in the truth in itself will help you attain your goals is simply wrong. Not caring about the truth can get you to your goals anyway, and better than caring about the truth, if you make sure to care about the truth exactly on the occasions when you need it, and only then.

However, it is possible to hold Less Wrong's position without any rationalization, by saying that the truth really is that important in itself. In this case reaching the truth is winning, regardless of what else happens.

Replies from: None
comment by [deleted] · 2015-09-07T02:22:04.055Z · LW(p) · GW(p)

But normal human beings are extremely good at compartmentalization. In other words they are extremely good at knowing when knowing the truth is going to be useful for their goals, and when it is not. This means that they are better than Less Wrongers at attaining their goals, because the truth does not get in the way.

If you really believe this, I'd love to see a post on a computational theory of compartmentalization, so you can explain for us all how the brain performs this magical trick.

Replies from: entirelyuseless, TheAncientGeek, PhilGoetz, TheAncientGeek
comment by entirelyuseless · 2015-09-07T03:34:37.337Z · LW(p) · GW(p)

I'm not sure what you mean by "magical trick." For example, it's pretty easy to know that it doesn't matter (for the brain's purposes) whether or not my politics is objectively correct; for those purposes it mainly matters whether I agree with my associates.

Replies from: None
comment by [deleted] · 2015-09-07T14:10:27.207Z · LW(p) · GW(p)

it's pretty easy to know that it doesn't matter (for the brain's purposes) whether or not my politics is objectively correct

Bolded the part I consider controversial. If you haven't characterized what sort of inference problem the brain is actually solving, then you don't know the purposes behind its functionality. You only know what things feel like from the inside, and that's unreliable.

Hell, if normative theories of rationality were more computational and less focused on sounding intellectual, I'd believe in those a lot more thoroughly, too.

comment by TheAncientGeek · 2015-09-19T07:50:51.043Z · LW(p) · GW(p)

If you have some sort of distributed database with multiple updates from multiple sources, it's likely to get into an inconsistent state unless you take measures to prevent that. So the way to achieve the magic of compartmentalised "beliefs" is to build a system like that, but not bother to add a consistency layer.
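To make the analogy concrete, here is a minimal Python sketch (the class and all the names are made up for illustration, not taken from any real system): a belief store that keeps a separate compartment per context and never runs a consistency check across compartments, which is the analogue of a multi-writer database with no reconciliation layer.

```python
class CompartmentalizedBeliefs:
    """Toy multi-writer store with no consistency layer."""

    def __init__(self):
        self.compartments = {}  # context -> {claim: belief}

    def update(self, context, claim, belief):
        # Each "writer" (social context) updates only its own compartment.
        self.compartments.setdefault(context, {})[claim] = belief

    def query(self, context, claim):
        # Reads are served from whichever compartment is active;
        # no operation ever compares compartments against each other.
        return self.compartments.get(context, {}).get(claim)

beliefs = CompartmentalizedBeliefs()
beliefs.update("among my allies", "our policy works", True)
beliefs.update("planning my finances", "our policy works", False)

# The store is globally inconsistent, but no query ever notices,
# because nothing spans more than one compartment.
print(beliefs.query("among my allies", "our policy works"))       # True
print(beliefs.query("planning my finances", "our policy works"))  # False
```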

comment by PhilGoetz · 2015-09-18T21:58:20.102Z · LW(p) · GW(p)

Perhaps he will, if you agree to also post your computational theory of how the brain works.

If you don't have one, then it's unreasonable to demand one.

Replies from: None
comment by [deleted] · 2015-09-18T23:26:52.786Z · LW(p) · GW(p)

Perhaps he will, if you agree to also post your computational theory of how the brain works.

That was several months ago.

Replies from: PhilGoetz
comment by PhilGoetz · 2015-09-24T21:26:25.546Z · LW(p) · GW(p)

Nice! I'll bookmark that.

comment by TheAncientGeek · 2015-09-07T09:38:33.351Z · LW(p) · GW(p)

You think there is no evidence that it does?

Replies from: None
comment by [deleted] · 2015-09-07T14:04:54.148Z · LW(p) · GW(p)

You only really understand something when you understand how it's implemented.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-07T18:34:22.962Z · LW(p) · GW(p)

Whatever. The statement "But normal human beings are extremely good at compartmentalization" has little to do with understanding or implementation, so you would seem to be changing the subject.

Replies from: None
comment by [deleted] · 2015-09-07T20:05:00.902Z · LW(p) · GW(p)

Well no. I'm saying that folk-psychology has been extremely wrong before, so we shouldn't trust it. You invoke folk-psychology to say that the mind uses compartmentalization to lie to itself in useful ways. I say that this folk-psychological judgement lacks explanatory power (though it certainly possesses status-attribution power: low status to those measly humans over there!) in the absence of a larger, well-supported theory behind it.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-08T08:57:51.893Z · LW(p) · GW(p)

Is it better to assume non-compartmentalisation?

Replies from: None
comment by [deleted] · 2015-09-08T12:41:04.972Z · LW(p) · GW(p)

No, it's better to assume that folk-psychology doesn't accurately map the mind. "Reversed stupidity is not intelligence."

Your statement is equivalent to saying, "We've seen a beautiful sunset. Clearly, it must be a sign of God's happiness, since it couldn't be a sign of God's anger." In actual fact, it's all a matter of the atmosphere scattering light from a giant nuclear-fusion reaction, and made-up deities have nothing to do with it.

Just because a map seems to let you classify things, doesn't mean it provides accurate causal explanations.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-12T06:49:35.667Z · LW(p) · GW(p)

If we don't know enough about how the mind works to say it is good at compartmentalisation, we also don't know enough to say it is bad at compartmentalisation.

Your position requires you to be noncommittal about a lot of things. Maybe you are managing that.

The sunset case isn't analogous, because there we have the science as an alternative.

Replies from: Jiro
comment by Jiro · 2015-09-12T16:07:11.288Z · LW(p) · GW(p)

I wouldn't be able to tell if someone is a good mathematician, but I'd know that if they add 2 and 2 the normal way and get 5, they're a bad one. It's often a lot easier to detect incompetence, or at least some kinds of incompetence, than excellence.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-12T17:51:31.323Z · LW(p) · GW(p)

Is compartmentalisation supposed to be a competence or an incompetence, or neither?

Replies from: None
comment by [deleted] · 2015-09-12T20:03:41.913Z · LW(p) · GW(p)

Personally, I don't think "compartmentalization" actually cuts reality at the joints. Surely the brain must solve a classification problem at some point, but it could easily "fall out" that your algorithms simply perform better if they classify things or situations between contextualized models - that is, if they "compartmentalize" - than if they try to build one humongous super-model for all possible things and situations.
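A toy illustration of that point (purely hypothetical data and setup; plain least-squares fits stand in for whatever the brain actually does): when the input-output relationship flips between two contexts, two small per-context models fit far better than one pooled super-model.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)

# Context A: y rises with x. Context B: y falls with x.
context = rng.integers(0, 2, size=200)          # 0 = A, 1 = B
slope = np.where(context == 0, 2.0, -2.0)
y = slope * x + rng.normal(0, 0.1, size=200)

def mse_of_linear_fit(xs, ys):
    # Ordinary least squares for y = a*x + b; return mean squared error.
    a, b = np.polyfit(xs, ys, 1)
    return np.mean((a * xs + b - ys) ** 2)

pooled = mse_of_linear_fit(x, y)                # one humongous super-model
split = np.mean([mse_of_linear_fit(x[context == c], y[context == c])
                 for c in (0, 1)])              # "compartmentalized" models

print(f"super-model MSE: {pooled:.3f}")         # ~1.3 (bad)
print(f"per-context MSE: {split:.3f}")          # ~0.01 (good)
```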

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-13T13:37:50.771Z · LW(p) · GW(p)

But you don't have proof of that theory, do you?

Replies from: None
comment by [deleted] · 2015-09-14T13:48:35.950Z · LW(p) · GW(p)

Your original thesis would support that theory, actually.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2015-09-15T10:40:33.682Z · LW(p) · GW(p)

I haven't made any object-level claims about psychology.

comment by Epictetus · 2015-09-19T20:00:30.676Z · LW(p) · GW(p)

Let's say I wanted to solve my dating issues. I present the following approaches:

  1. I endeavor to solve the general problem of human sexual attraction, plug myself into the parameters to figure out what I'd be most attracted to, determine the probabilities that individuals I'd be attracted to would also be attracted to me, then devise a strategy for finding someone with maximal compatibility.

  2. I take an iterative approach: I devise a model this afternoon, test it this evening, then analyze the results tomorrow morning and make the necessary adjustments.

Which approach is more rational? Given sufficient time, Approach 1 will yield the optimal solution. Approach 2 has to deal with the problem of local maxima and in the long run is likely to end up worse than Approach 1. An immortal living in an eternal universe would probably say that Approach 1 is vastly superior. Humans, on the other hand, will die well before Approach 1 bears fruit.
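A small sketch of that trade-off (an illustrative toy, not anything from the comment above; the objective function is invented): Approach 2 behaves like greedy hill-climbing, which settles on whichever peak is nearest, while Approach 1, played here by an exhaustive search, finds the global peak, given enough time.

```python
import numpy as np

def attractiveness(x):
    # Invented objective: a small peak near x = -1, a tall one near x = 2.
    return np.exp(-(x + 1) ** 2) + 2 * np.exp(-((x - 2) ** 2) / 0.5)

def hill_climb(x, step=0.05, iters=1000):
    # Approach 2: keep a small change only if it helps (local iteration).
    for _ in range(iters):
        for candidate in (x + step, x - step):
            if attractiveness(candidate) > attractiveness(x):
                x = candidate
    return x

grid = np.linspace(-5, 5, 100_001)
global_best = grid[np.argmax(attractiveness(grid))]  # Approach 1: exhaustive
local_best = hill_climb(x=-1.5)                      # starts near the small peak

print(f"Approach 1 finds x = {global_best:.2f}")  # ~2.00, the tall peak
print(f"Approach 2 finds x = {local_best:.2f}")   # ~-1.00, stuck on the local one
```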

While rationality can lead to faster improvement using Approach 2, a rationalist might try Approach 1, whereas a non-rationalist is unlikely to use Approach 1 at all.

Simple amendments to the general problem such as "find the best way to get the best date for next Saturday" will likely lead to solutions making heavy use of deception. If you want to exclude the Dark Arts from the solution space, then that's going to limit what you can accomplish. The short-term drawbacks of insisting on truth and honesty are well-documented.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2015-09-20T00:57:34.025Z · LW(p) · GW(p)

How would you do (1) without making hypotheses and testing them, i.e. (2)?

Replies from: Viliam
comment by Viliam · 2015-09-21T08:49:28.420Z · LW(p) · GW(p)

Reading a book... debating with other smart people in a web forum... reading another book... trying to solve the problems in your map alone before you even touch the territory...

Seems to me this is what people often do when they try to do (1).

comment by SecondWind · 2015-09-07T19:29:05.746Z · LW(p) · GW(p)

"Local rationalist learns to beat akrasia using this one weird trick!"

comment by Gunnar_Zarncke · 2015-09-05T09:58:35.460Z · LW(p) · GW(p)

"rationalists don't win"

Depends on what 'win' means. If (epistemic) rationality helps with a realistic view of the world, then it also means looking behind the socially constructed expectations of 'life success'. I think these are memes that our brains pattern-match to something more suitable in an ancestral environment. The hedonic treadmill and the Peter Principle ensue. I think that a realistic view of the world has helped me evade these expectations and live an unusually fulfilled, interesting life. I'd call that a private success. Not a public one. With my abilities I surely could have met higher expectations ('get rich', 'famous scientist', ...), but as the OP notes: this would have meant hard work in any case, and thus trade-offs in other places, e.g. family. There are always trade-offs. Rationality makes you more aware of that. And it helps with finding a Goldilocks solution somewhere away from dead-beat extremes. Or maybe I'm just lazy but smart at rationalizing it :-)

comment by CAE_Jones · 2015-09-19T00:15:04.484Z · LW(p) · GW(p)

My life got worse after I found LessWrong, but I can't really attribute that to a causal relationship. I just don't belong in this world, I think.

I can imagine LW-style rationality being helpful if you're already far enough above baseline in enough areas that you would have been fairly close to winning regardless. (I am now imagining "baseline" as the surface of liquids in Sonic the Hedgehog 1-3. If I start having nightmares including the drowning music, ... I'll... ... have a more colorful way to describe despair to the internet, I guess.)

comment by ladycarstairs · 2015-10-01T00:14:05.024Z · LW(p) · GW(p)

I agree. First off, I think it has a lot to do with a person's overall definition of 'win'. In the eyes of 'outside society', rationalists don't win. I believe that is because, as you said, if you look at things overall, you don't see an influx of success for the people who are rationalists. That isn't to say that they don't win, or that rationalism is pointless and doesn't offer up anything worthwhile. That would be a lie. I think that rationalists have a better grip on the workings of the world and therefore know what to do and how to achieve success, or at least know the theoretical way to victory. The catch is that we have not yet (though we will someday, when it becomes absolutely necessary to be a rationalist, or at least dabble in rationalism) achieved a success large enough to be noted by society as a success. I believe that we could, with the right amount of hard work and effort, gain a victory broad enough that it can't be put down or denied. A victory so large that it will be seen as a victory and not just a minor win. Overall, the question 'why don't rationalists win?' seems to depend on the 'winner's' definition of victory/success. And until we can all 1) achieve a success so large it is counted by everyone as a success, or 2) decide on a set definition of success and reach that, rationalists are going to appear to the world as if we have not won, and may never win.

comment by cameroncowan · 2015-09-26T05:10:25.428Z · LW(p) · GW(p)

We are the people who knew too much.....

comment by EngineerofScience · 2015-09-20T19:34:00.756Z · LW(p) · GW(p)

I am not so sure that rationalists don't win. Rather, "winning" (i.e. starting a company, being a celebrity, etc.) is rare, and rationalists are rare, so very few of the people who win are rationalists, even if each individual rationalist has a better chance of winning.

comment by Ixiel · 2015-09-19T11:36:56.926Z · LW(p) · GW(p)

So you say altruism is something to "assume that we mostly agree on, and thus not really elaborate", and I know the sentiment is sometimes that it's like jazz and pornography, but FWIW I'd be curious about an elaboration. I don't think that particular prejudice is a big part of rationalist failures, but raising the possibility of it being a part is interesting to me.

Replies from: adamzerner
comment by Adam Zerner (adamzerner) · 2015-09-19T14:32:46.922Z · LW(p) · GW(p)

I just meant that rationalists overwhelmingly seem to have altruistic goals.

I'm not sure what you meant with "jazz and pornography".

Replies from: ChristianKl
comment by ChristianKl · 2015-09-19T18:00:11.405Z · LW(p) · GW(p)

A popular definition of pornography is "I know it when I see it." There seems to be a similar sentiment about jazz.

comment by the gears to ascension (lahwran) · 2015-09-15T21:54:33.225Z · LW(p) · GW(p)

What you Less Wrong folks call "rationality" is not what everyone else calls "rationality" - you can't say "I also think that rationality is doing a great job in helping people"; that either doesn't make sense or is a tautology, depending on your interpretation. Please stop saying "rationality" and meaning your own in-group thing; it's ridiculously off-putting.

Also, my experience has been that CFAR-trained folks do sit down and do hard things, and that people who are only familiar with LW just don't. It has also been my experience that they don't do enough hard things to just "win", in the sense defined here, and that the difference between "winning" and not is actually not easily exploitable with slightly more intelligent macro-scale behavior. The branching points that differentiate between the winning and losing paths are the exploitable points - things like deciding whether or not to go to college, or whether to switch jobs - and they're alright at choosing between those, but so are other people. CFAR-trained folks are typically reasonably better than equivalently intelligent folks who have had the same experience so far, but not dramatically so.

Replies from: PhilGoetz
comment by PhilGoetz · 2015-09-18T21:54:18.117Z · LW(p) · GW(p)

you can't say "I also think that rationality is doing a great job in helping people"; that either doesn't make sense or is a tautology,

He may have meant that he thinks rationality is effective for altruists.