Desire is the direction, rationality is the magnitude

post by So8res · 2015-04-05T17:27:57.757Z · LW · GW · Legacy · 13 comments

What follows is a series of four short essays that say explicitly some things that I would tell an intrigued proto-rationalist before pointing them towards Rationality: AI to Zombies (and, by extension, most of LessWrong). For most people here, these essays will be very old news, as they talk about the insights that come even before the sequences. However, I've noticed recently that a number of fledgling rationalists haven't actually been exposed to all of these ideas, and there is power in saying the obvious.

This essay is cross-posted on MindingOurWay.


A brief note on "rationality":

It's a common trope that thinking can be divided up into "hot, emotional thinking" and "cold, rational thinking" (with Kirk and Spock being the stereotypical offenders, respectively). The tropes say that the hot decisions are often stupid (and inconsiderate of consequences), while the cold decisions are often smart (but made by the sort of disconnected nerd that wears a lab coat and makes wacky technology). Of course (the trope goes) there are Deep Human Truths available to the hot reasoners that the cold reasoners know not.

Many people, upon encountering one who says they study the art of human rationality, jump to the conclusion that these "rationalists" are people who reject the hot reasoning entirely, attempting to disconnect themselves from their emotions once and for all, in order to avoid the rash mistakes of "hot reasoning." Many think that these aspiring rationalists are attempting some sort of dark ritual to sacrifice emotion once and for all, while failing to notice that the emotions they wish to sacrifice are the very things which give them their humanity. "Love is hot and rash and irrational," they say, "but you sure wouldn't want to sacrifice it." Understandably, many people find the prospect of "becoming more rational" rather uncomfortable.

So heads up: this sort of emotional sacrifice has little to do with the word "rationality" as it is used in Rationality: AI to Zombies.

When Rationality: AI to Zombies talks about "rationality," it's not talking about the "cold" part of hot-vs-cold reasoning; it's talking about the reasoning part.

One way or another, we humans are reasoning creatures. Sometimes, when time pressure is bearing down on us, we make quick decisions and follow our split-second intuitions. Sometimes, when the stakes are incredibly high and we have time available, we deploy the machinery of logic, in places where we trust it more than our impulses. But in both cases, we are reasoning. Whether our reasoning be hot or cold or otherwise, there are better and worse ways to reason.

(And, trust me, brains have found a whole lot of the bad ones. What do you expect, when you run programs that screwed themselves into existence on computers made of meat?)

The rationality of Rationality: AI to Zombies isn't about using cold logic to choose what to care about. Reasoning well has little to do with what you're reasoning towards. If your goal is to enjoy life to the fullest and love without restraint, then better reasoning (while hot or cold, while rushed or relaxed) will help you do so. But if your goal is to annihilate as many puppies as possible, then this-kind-of-rationality will also help you annihilate more puppies.

(Unfortunately, this usage of the word "rationality" does not match the colloquial usage. I wish we had a better word for the study of how to improve one's reasoning in all its forms that didn't also evoke images of people sacrificing their emotions on the altar of cold logic. But alas, that ship has sailed.)

If you are considering walking the path towards rationality-as-better-reasoning, then please, do not sacrifice your warmth. Your deepest desires are not a burden, but a compass. Rationality of this kind is not about changing where you're going, it's about changing how far you can go.

People often label their deepest desires "irrational." They say things like "I know it's irrational, but I love my partner, and if they were taken from me, I'd move heaven and earth to get them back." To which I say: when I point towards "rationality," I point not towards that which would rob you of your desires, but rather towards that which would make you better able to achieve them.

That is the sort of rationality that I suggest studying, when I recommend reading Rationality: AI to Zombies.

13 comments

comment by Shmi (shminux) · 2015-04-05T18:39:19.980Z · LW(p) · GW(p)

I like your write-up, very clear and accessible. You certainly have a gift for popularization, not just research. A rare combination.

I would just note upfront that

Reasoning well has little to do with what you're reasoning towards.

and

Rationality of this kind is not about changing where you're going, it's about changing how far you can go.

are white lies, as you well know. It's not unusual, in the process of reasoning about how best to achieve your goal, to find that the goal itself shifts or evaporates.

"How to best serve God" may result in deconversion.

"How to make my relationship with partner a happy one" may result in discovering that they are a narcissistic little shit I should run away from. Or that both of us should find other partners.

"How to help my neighborhood out of poverty" might become "How to make the most money" in order to donate as much as possible.

This goal-evaporation danger is rarely as extreme, but it is ubiquitous. Every goalpost shifts when you optimize your shot hard enough. Your analogy

Your deepest desires are not a burden, but a compass

is very apt: following it strictly helps you reach your destination, but that doesn't mean the destination is where you expected, or holds what you expected. Or that the journey won't get you killed.

In this essay you talk about Instrumental Rationality as if it were separate from Epistemic. It is not. The dangers of good reasoning ought to be noted upfront, the way MoR!Harry did to Draco, only more so. Hopefully you already plan to talk about it in one of your remaining three essays.

Replies from: So8res, fubarobfusco
comment by So8res · 2015-04-05T23:02:38.593Z · LW(p) · GW(p)

Thanks!

white lies

You've caught me :-)

My stance on terminal values is "it's possible to be wrong about what you deeply desire." The person who deconverted through trying to figure out how to better serve God likely did so in the process of realizing they had deeper humanitarian values. Similarly with the person who tried to help their neighborhood out of poverty and became an EA.

This is in part why I said that reasoning well has "little" (instead of "nothing") to do with what you're reasoning towards. Similarly, in "it's not about changing where you're going," I had no intention of equating "where you're going" with "where you think you're going" :-)

However, I agree that the default apparent connotation contains one doozy of a white lie.

comment by fubarobfusco · 2015-04-08T22:59:02.473Z · LW(p) · GW(p)

Every goalpost shifts when you optimize your shot hard enough.

Optimize hard enough for "get the ball into the goal as fast as possible" and you explode the ball and drive its husk through the bodies of the defending team, and you don't get asked to play football any more.

comment by 27chaos · 2015-04-05T23:32:51.292Z · LW(p) · GW(p)

(And, trust me, brains have found a whole lot of the bad ones. What do you expect, when you run programs that screwed themselves into existence on computers made of meat?)

Fairly flippant for what's supposed to be a primer.

The rationality of Rationality: AI to Zombies isn't about using cold logic to choose what to care about. Reasoning well has little to do with what you're reasoning towards. If your goal is to enjoy life to the fullest and love without restraint, then better reasoning (while hot or cold, while rushed or relaxed) will help you do so. But if your goal is to annihilate as many puppies as possible, then this-kind-of-rationality will also help you annihilate more puppies.

This seems like bad salesmanship. There's no need to say things like "rationality is compatible with puppy annihilation" in a paragraph that's supposed to reassure people that rationality is not immoral or hostile to human emotions. Instead, I would say you should do the opposite, and tell people that if they really love someone, it is actually rational to want to make sacrifices for them, because when you weigh everything up, their happiness matters more to you than your own.

So, whereas the message of this post can be summarized as "rationality is neutral", I think the post's message should be closer to "rationality is good - it lets you do things like love EVEN BETTER".

Replies from: So8res, Furslid
comment by So8res · 2015-04-06T00:15:53.361Z · LW(p) · GW(p)

Thanks! These are good suggestions. I intended for these essays to be geared more towards old-friends-who-are-intrigued-by-this-rationality-business, and didn't intend it to be preface-for-the-book-type-material. I agree that skewing more towards your message would be the right thing to do in the latter context, though :-)

Replies from: 27chaos, evand
comment by 27chaos · 2015-04-06T00:52:23.254Z · LW(p) · GW(p)

Okay, gotcha.

comment by evand · 2015-04-12T17:44:42.811Z · LW(p) · GW(p)

I would find the latter more useful if I were going to send other people these links.

These are good essays. Thanks!

comment by Furslid · 2015-04-06T16:55:24.881Z · LW(p) · GW(p)

I actually like that line. There are a lot of people and organizations that are portrayed as rational and evil. Walmart sacrificing all soft values to maximize profit and the robot overlords systematically destroying or enslaving humanity are also views of rationality. They can be used as objections as much as Spock can. This quick joke shows that problems like this are considered, even if they aren't dealt with in depth here.

Replies from: Kindly
comment by Kindly · 2015-04-06T18:03:26.006Z · LW(p) · GW(p)

Part of it might just be the order. Compare that paragraph to the following alternative:

The rationality of Rationality: AI to Zombies isn't about using cold logic to choose what to care about. Reasoning well has little to do with what you're reasoning towards. If your goal is to annihilate as many puppies as possible, then this kind of rationality will help you annihilate more puppies. But if your goal is to enjoy life to the fullest and love without restraint, then better reasoning (while hot or cold, while rushed or relaxed) will also help you do so.

comment by ChristianKl · 2015-04-07T15:06:37.291Z · LW(p) · GW(p)

(Unfortunately, this usage of the word "rationality" does not match the colloquial usage. I wish we had a better word for the study of how to improve one's reasoning in all its forms that didn't also evoke images of people sacrificing their emotions on the altar of cold logic. But alas, that ship has sailed.)

In academia that study is called "decision science".

Over time I'm becoming more skeptical of this blanket approach where we consider a single mode of thinking the ultimate one.

I can be in a mode where I think about what I'm going to say in a reflective way before I say it. I can also be in a mode where I speak without such filtering.

Sometimes, when the stakes are incredibly high and we have time available, we deploy the machinery of logic

In academia there is debate about unconscious vs. conscious thought and which is better. While the results are mixed, it doesn't seem clear that having time available means the time is usually well spent on the machinery of logic.

Sometimes it feels to me like using that machinery is about not trusting your brain to do its work unconsciously in the background. Like the brain doesn't work when it's not micromanaged.

Many think that these aspiring rationalists are attempting some sort of dark ritual to sacrifice emotion once and for all, while failing to notice that the emotions they wish to sacrifice are the very things which give them their humanity.

It's not as if no member of this community goes down that path. Many PUAs use rejection therapy in a way that numbs certain emotions. Similar things pop up from time to time.

comment by kilobug · 2015-04-07T08:46:14.974Z · LW(p) · GW(p)

Nicely put for an introduction, but of course things are in reality not as clear-cut: "rationality" can also change the direction, and "desire" the magnitude.

  1. Rationality can make you realize some contradictions between your desires, and force you to change them. It can also make you realize that what you truly desire isn't what you thought you desired. Or it can make you desire whole new things that you didn't initially believe to be possible.

  2. Desire will affect the magnitude because it'll affect how much effort you put into your endeavor. With things like akrasia and procrastination around, if you don't have a strong desire to do something, you are much less likely to do it, especially if there is an initial cost. That's what Eliezer calls "something to protect".

Of course, those two are mostly positive feedbacks between rationality and desire, but there can also be negative feedbacks between the two, usually due to human imperfections.

comment by [deleted] · 2015-04-11T00:43:16.761Z · LW(p) · GW(p)

I object to the concept that there's no such thing as hot reasoning. It's basically everything I do. I reason because I care.