Summary of "The Straw Vulcan"

post by alexvermeer · 2011-12-26T16:29:46.539Z · LW · GW · Legacy · 27 comments

Contents

  What is real rationality?
  “Straw Vulcan” Rationality Principles
    Straw Vulcan Principle #1: Being rational means expecting other people to be rational too.
    Straw Vulcan Principle #2: Being rational means never making a decision until you have all the information.
    Straw Vulcan Principle #3: Being rational means never relying on intuition.
    Straw Vulcan Principle #4: Being rational means not having emotions.
    Straw Vulcan Principle #5: Being rational means valuing only quantifiable things, like money, efficiency, or productivity.
  The Main Takeaway
27 comments

Followup to: Communicating rationality to the public: Julia Galef's "The Straw Vulcan"

[Image: The Straw Vulcan]

I wrote a summary of Julia Galef's "The Straw Vulcan" presentation from Skepticon 4. Note that it is written in my own words, but all of the ideas should be credited to Julia and her presentation (unless I unintentionally misrepresent any of them!).

---

The classic Hollywood example of rationality is the Vulcans from Star Trek. They are depicted as an ultra-rational race that has purged all emotion from their lives.

But is this truly rational? What is rationality?

A “Straw Vulcan”—an idea originally defined on TV Tropes—is a straw man used to show that emotion is better than logic. Traditionally, you have your ‘rational’ character who thinks perfectly ‘logically’ but ends up running into trouble and failing to achieve what they set out to do.

These characters have a sort of fake rationality. They don’t fail because rationality failed, but because they aren’t actually being rational. Straw Vulcan rationality is not the same thing as actual rationality.

What is real rationality?

There are two different concepts that we refer to when we use the word ‘rationality’:

1. The method of obtaining an accurate view of reality. (Epistemic Rationality) — Learning new things, updating your beliefs based on the evidence, being as accurate as possible, being as close to what is true as possible, etc.

2. The method of achieving your goals. (Instrumental Rationality) — Whatever your goals are, be they selfish or altruistic, there are better and worse ways to achieve them, and instrumental rationality helps you figure this out.

These two concepts are obviously related. You want a clear model of the world to be able to achieve your goals. You also may have goals related to obtaining an accurate model of the world.

How do these concepts of rationality relate to Straw Vulcan rationality? What is the Straw Vulcan conception of rationality?

“Straw Vulcan” Rationality Principles

Straw Vulcan Principle #1: Being rational means expecting other people to be rational too.

Galef uses an example from Star Trek where Spock, in an attempt to protect the crew of the crashed ship, decides to show aggression against the local aliens so that they will be scared and run away. Instead, they are angered by the display of aggression and attack even more fiercely, much to Spock’s dismay and confusion.

But this isn’t being rational! Spock’s model of the world is badly distorted by his expectation that everyone else will be as rational as he is. Real rationality would require you to try to understand all aspects of the situation and act accordingly.

Straw Vulcan Principle #2: Being rational means never making a decision until you have all the information.

This seems to assume that the only important criterion for making decisions is that you make the best one given all the information. But what about things like time and risk? Surely those should factor into your decisions too.

We know intuitively that this is true. If you want a really awesome sandwich you may be willing to pay an extra $1.00 for some cheese, but you wouldn’t pay $300 for a small increase in the quality of a sandwich. You want the best possible outcome, but this requires simultaneously weighing various things like time, cost, value, and risk.
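In decision-theoretic terms (my gloss, not Galef's), an upgrade is worth it only when its marginal value to you exceeds its marginal cost. With made-up numbers for the sandwich:

$$\underbrace{\$3}_{\text{enjoyment the cheese adds}} > \underbrace{\$1}_{\text{price of the cheese}}, \qquad \underbrace{\$5}_{\text{enjoyment the upgrade adds}} \ll \underbrace{\$300}_{\text{price of the upgrade}}$$

The first trade passes the test and the second fails it, even though both "improve" the sandwich.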

What is the most rational way to find a partner? Take this example from Gerd Gigerenzer, a well-respected psychologist, describing how a rationalist would find a partner:

“He would have to look at the probabilities of various consequences of marrying each of them—whether the woman would still talk to him after they’re married, whether she’d take care of their children, whatever is important to him—and the utilities of each of these…After many years of research he’d probably find out that his final choice had already married another person who didn’t do these computations, and actually just fell in love with her.”

But clearly this isn’t optimal decision making. The rational thing to do isn’t to merely wait until you have as much information as you can possibly have. You need to factor in things like how long the research is taking, the decreasing number of available partners as time passes, etc.
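Galef doesn't formalize this, but decision theory has a classic toy model of exactly this tradeoff: the "secretary problem" from optimal stopping theory. The Python sketch below is my illustration, not part of the talk; the function name is invented, while the ~37% cutoff is the standard textbook result. It shows a decision rule that deliberately stops gathering information:

```python
import random

def choose_with_cutoff(candidates, look_fraction=0.37):
    """Toy optimal-stopping rule: look at the first ~37% of candidates
    without committing, then accept the first one who beats everyone
    seen so far. The point is that a good decision rule budgets a
    limited amount of time for gathering information before acting."""
    cutoff = max(1, int(len(candidates) * look_fraction))
    best_seen = max(candidates[:cutoff])
    for score in candidates[cutoff:]:
        if score > best_seen:
            return score  # commit; waiting longer only shrinks the pool
    return candidates[-1]  # pool exhausted: settle for the last option

# Simulation: this rule lands on the single best candidate in roughly
# 37% of trials, despite never seeing "all the information" up front.
random.seed(0)
trials = 10_000
wins = 0
for _ in range(trials):
    pool = [random.random() for _ in range(100)]
    if choose_with_cutoff(pool) == max(pool):
        wins += 1
print(f"Best candidate chosen in {wins / trials:.0%} of trials")
```

The design choice worth noticing: the rule accepts that it will often miss the very best option, because searching longer has costs of its own.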

Straw Vulcan Principle #3: Being rational means never relying on intuition.

Straw Vulcan rationality says that anything intuition-based is illogical. But what is intuition?

We have two systems in our brains, which have been unexcitingly called System 1 and System 2.

System 1—the intuitive system—is the older of the two and allows us to make quick, automatic judgments using shortcuts (i.e. heuristics) that are good enough most of the time, all while requiring very little of your time and attention.

System 2—the deliberative system—is the newer of the two and allows us to do things like abstract hypothetical thinking and make models that explain unexpected events. System 2 tends to do better when you have more resources and more time and worse when there are many factors to consider and you have limited time.

Take a sample puzzle: A bat and ball together cost $1.10. If the bat costs $1 more than the ball, how much does the ball cost?

When a group of Princeton students was given this question, about 50% of them got it wrong. The correct answer is $0.05: the bat then costs $1.05, for a total of $1.10. The wrong answer of $0.10 is generated effortlessly by our System 1, and our System 2 accepts it without question.
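For the curious, the algebra takes one line. Let $x$ be the ball's price, so the bat costs $x + 1.00$:

$$x + (x + 1.00) = 1.10 \;\Rightarrow\; 2x = 0.10 \;\Rightarrow\; x = 0.05$$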

Your System 1 is prone to biases, but it is also incredibly powerful. Our intuition tends to do well with purchasing decisions or other choices about our personal lives. System 1 is also very powerful for an expert. Chess grandmasters can glance at a chessboard and say, “white checkmates in three moves,” because of the vast amount of time and mental effort they have spent playing chess and building up a mental knowledge base about it.

Intuition can be bad and less reliable when it is based on something not relevant to the task at hand, or when you don’t have expert knowledge of the topic. Your opinions of AI may be heavily influenced by sci-fi movies that have little basis in reality.

The main thing to take away from this System 1 and 2 split is that both systems have strengths and weaknesses, and rationality is about finding the best path—using both systems at the right times—to epistemic and instrumental rationality.

Being “too rational” usually means you are using your System 2 brain intentionally but poorly. For example, teenagers were criticized in an article for being “too rational” because they could reason themselves into things like drugs and speeding. But this isn’t a problem with being too rational; it’s a problem with being very bad at System 2 reasoning!

Straw Vulcan Principle #4: Being rational means not having emotions.

Straw Vulcan rationalists display a characteristic relationship between rationality and emotion, as when Spock is excited to see that Captain Kirk isn’t dead and then quickly covers up his emotions. The simplistic Hollywood portrayal of emotions and rationality is as follows:

[Diagram: a goal leads through rational thinking to action, with emotion interfering from the outside.]

Note that emotions can get in the way of taking action on our goals. For example, anxiety causes us to overestimate risks; depression causes us to underestimate how much we will enjoy an activity; and feeling threatened or vulnerable causes us to exhibit more superstitious behavior and makes us more likely to see patterns that don’t exist.

But emotions are also important for making the decisions themselves. Without emotional desires we would have no reason to have goals in the first place. You would have no motivation to choose between a calm beach and a nuclear waste site for your vacation. Emotions are necessary for forming goals; rationality is lame without them!

[Galef noted in a comment that the intended meaning is in line with “Emotions are necessary for forming goals among humans, rationality has no normative value to humans without goals.”]

This leaves us with a more accurate portrayal of the relationship between emotions and rationality:

[Diagram: emotions feed into the formation of goals, which rational thinking then turns into action.]

How do emotions make us irrational? Emotions can be epistemically irrational if they are based on a false model of the world. You can be angry at your husband for not asking how your presentation at work went, but then upon reflection realize that you never told him about it, so he had no way of knowing it happened. Your anger was based on a false model of reality.

Emotions can be instrumentally irrational if they get in the way of you achieving your goals. If you feel things are hopeless and there are no ways to change the situation, you may be wrong about that. Your emotions may prevent you from taking necessary actions.

Our emotions also influence each other. If you have a desire to be liked by others and a desire to sit on a couch all day, these desires may conflict with each other, and you may run into problems.

We can also change our emotions. For example, cognitive behavioral therapy has many exercises and techniques (e.g. Thought Records) for changing your emotions by changing your beliefs.

Straw Vulcan Principle #5: Being rational means valuing only quantifiable things, like money, efficiency, or productivity.

If it isn’t concrete and measurable then there is no reason to value it, right? Things like beauty, love, or joy are just irrational emotions, right?

What are the problems with this? For starters, money can’t be valuable in and of itself, because it is only a means to obtain other valued things. Also, there is no reason to assume that money and productivity are the only things of value.

The Main Takeaway

Galef finishes off with this final message:

“If you think you’re acting rationally but you consistently keep getting the wrong answer, and you consistently keep ending worse off than you could be, then the conclusion you should draw from that is not that rationality is bad, it’s that you’re bad at rationality.”

In other words, you’re doing it wrong!

[Image: "You're Doing It Wrong!"]

First three images are from measureofdoubt.com > The Straw Vulcan: Hollywood’s illogical approach to logical decisionmaking.
You're Doing It Wrong image from evilbomb.com.

27 comments

Comments sorted by top scores.

comment by Swimmy · 2011-12-27T02:27:52.007Z · LW(p) · GW(p)

I think you should change "principle" to "myth." You don't want to ruin the flow of the article; people who aren't reading carefully (which is a whole lot of people) are going to scroll through, read the bold, and think you are advising such things.

Replies from: alexvermeer
comment by alexvermeer · 2011-12-27T14:47:00.160Z · LW(p) · GW(p)

That crossed my mind while writing, but I didn't want to stray too far from the wording in the presentation. I just changed it to "Straw Vulcan Principle #x". Is that a good compromise?

Replies from: Swimmy
comment by Swimmy · 2011-12-27T18:25:17.620Z · LW(p) · GW(p)

I think that works.

comment by KPier · 2011-12-26T19:52:40.740Z · LW(p) · GW(p)

You need to factor in things like how long the research is taking, the decreasing number of available females as time passes, etc.

Not to be picky, but could we say "available partners"? Please?

Otherwise very nice job, and upvoted.

Replies from: alexvermeer
comment by alexvermeer · 2011-12-26T21:47:23.675Z · LW(p) · GW(p)

Absolutely. The original example was explicitly "male looks for female", but there's no reason for the summary to keep that. Fixed, and thanks.

Replies from: KPier
comment by KPier · 2011-12-26T23:35:45.135Z · LW(p) · GW(p)

Thanks!

The example didn't bother me, but when it switched to second person ("you need to factor in...") the continued gendering seemed unnecessary.

comment by lukeprog · 2011-12-27T01:24:34.204Z · LW(p) · GW(p)

An even shorter version is Why Spock is Not Rational.

comment by dlthomas · 2011-12-26T18:44:33.827Z · LW(p) · GW(p)

Was Spock meant to actually "be rational"? Re-watching the show recently, "Spock really, really wants to think of himself as rational" seems a much better description.

Replies from: Normal_Anomaly, jmh
comment by Normal_Anomaly · 2011-12-26T19:49:55.668Z · LW(p) · GW(p)

I haven't watched the show, but I've sometimes seen essays from people saying that Kirk, Spock, and Bones represent "body, mind, and spirit." And whatever the creators' intentions, there does seem to be a popular misconception that rationalists or rational people or both act like Spock.

Replies from: dlthomas, Eugine_Nier
comment by dlthomas · 2011-12-26T20:23:13.414Z · LW(p) · GW(p)

I agree that there is a popular conception as you say, but I think Spock works more effectively as a warning against rational attire as opposed to rationality. I don't actually know the creators' intentions. I just think that when Spock admonishes Kirk for his illogical play in making the winning move in a chess game early on, it's plain enough what's up - although maybe it's my trouble imagining a rational theory of chess wherein the correct move is one other than the one that puts your opponent in checkmate.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2011-12-27T03:21:01.696Z · LW(p) · GW(p)

I can't find any authoritative discussion of Spock's intended purpose. I asked someone who's seen the show in as non-loaded a way as I could, and ey said that Spock was generally intended to be perceived as rational, and that the chess games in particular are often a metaphor for the action of the episode. McCoy and Spock often function as Kirk's System 1 and System 2, giving him advice that he combines into an instrumentally rational decision. I agree that Spock is often a good example of what not to do.

comment by Eugine_Nier · 2011-12-27T02:48:04.826Z · LW(p) · GW(p)

there does seem to be a popular misconception that rationalists or rational people or both act like Spock.

I suspect there is a reasonable amount of truth to this belief. At least I suspect Spock was a reasonable caricature of the type of self-proclaimed "rational people" prevalent during the 50s and 60s.

comment by jmh · 2020-01-23T12:59:14.471Z · LW(p) · GW(p)

That's an interesting take. I think one might view the character as managing emotion by deep suppression, which leads to a purely analytic view of the world: distanced and a bit dispassionate about "life values", and so fully comfortable with the types of calculations that amount to "it is fine to kill 100 people if 1,000 are saved".

As you note, that is not actually rational, but it might produce the appearance of rationality, given the calculated maximizing/optimizing type of decision-making that replaces (suppresses?) the emotional response and any empathy for life in specific, individual cases.

I do think the theme of rational calculation versus human emotion, how the two are often interdependent, and other such explorations were present throughout most episodes (I'm not sure the movies captured this as well).

This makes me think a third form of rationality is perhaps worth noting -- though one could certainly argue it can be subsumed under one or both of the others. I think emotional rationality is perhaps worth keeping in mind -- but one could also view this more as emotional maturity, which is then achieved via epistemic and instrumental rationality.

But then, considering the diagram where emotion interferes with the rational relation between goal and action, the post seems to be taking a bit of the same approach as the Vulcans (and so, I think, missing something about being rational). While I agree that emotions can induce us to do imprudent things we regret later, I am not sure the extreme view that emotions conflict with or oppose rationality is right. I think more thought in that area would be fruitful for those seeking to understand the decision-making processes that have grown in their own heads and how to improve those processes. (I know there have been some posts in this area on LW, but I have no time to search and link them here.)

This said, I have found this a good post to read and appreciate it having been shared.


Replies from: MoritzG
comment by MoritzG · 2020-01-23T17:21:06.319Z · LW(p) · GW(p)

The way you commented, it is not clear what you are referring to. I did not understand your comment because I did not get "where you were coming from".

Replies from: jmh
comment by jmh · 2020-01-23T18:41:23.099Z · LW(p) · GW(p)

First, was Spock rational, or did he just want to think himself rational? I am not completely sure that was the underlying character trait of Vulcans in the show -- though I also agree that much can support it. It seems their history was one of excessive passion, apparently at an uncontrollable and very destructive level. Their solution seems to have been to suppress their emotions, and so the passion, which then left the purely intellectual response to the external world and to their own thinking/decisions.

Since I don't see emotion and rationality as either opposites or necessarily antagonistic to one another, I wonder if considering rationality through a third lens -- epistemic, instrumental, and emotional -- might help lead to better decision-making than placing them in opposition. Principle #4 gets at this, with the diagrams showing them as opposed but the argument questioning that approach. (I actually missed this bit in my first comment.)

comment by Turgurth · 2013-07-18T01:38:51.909Z · LW(p) · GW(p)

To add to Principle #5, in a conversational style: "if something exists, that something can be quantified. Beauty, love, and joy are concrete and measurable; you just fail at it. To be fair, you lack the scientific and technological means of doing so, but - failure is failure. You failing at quantification does not devalue something of value."

comment by DanielLC · 2011-12-26T23:53:02.234Z · LW(p) · GW(p)

A “Straw Vulcan”—an idea that originally comes from TV Tropes

It doesn't come from TV Tropes. TV Tropes catalogs ideas that already exist. I'd suggest saying that it's a term that was originally defined by TV Tropes.

Replies from: alexvermeer
comment by alexvermeer · 2011-12-27T00:30:36.060Z · LW(p) · GW(p)

That's a good point. I like your wording. Fixed, and thanks.

comment by Normal_Anomaly · 2011-12-26T17:22:01.031Z · LW(p) · GW(p)

Good presentation of good ideas. Thanks for summarizing the talk for those of us who couldn't go to Skepticon. One typo: in the second paragraph after the quote in principle 3, the bat costs $1.05, not %.05.

Replies from: alexvermeer
comment by alexvermeer · 2011-12-26T17:35:41.244Z · LW(p) · GW(p)

Fixed. Thanks :)

Replies from: FiftyTwo
comment by FiftyTwo · 2011-12-27T12:24:57.738Z · LW(p) · GW(p)

Emotions can be instrumentally irrational if they get in the way of you achieving your goals. If you feel things are hopeless and there are no ways to change the situation, you may be wrong about that. Your emotions may prevent your from taking necessary actions.

Another typo, 'your' should be 'you' I think.

Replies from: alexvermeer
comment by alexvermeer · 2011-12-27T14:48:28.798Z · LW(p) · GW(p)

Fixed, thanks.

comment by MoritzG · 2020-01-23T12:16:51.783Z · LW(p) · GW(p)

Straw Vulcan is an example of an attack by two of the three types of thinkers on another.

The moral-thinkers try to show their superiority. In Star Trek this is ever-present. In all the stories, morality and principles always win over rational compromise. The captains usually favor the best possible short-term outcome over risk minimization and the long term. As it is fiction, this always works out.

The three thinking types were formalized/categorized (to my knowledge) by Venkatesh Rao of ribbonfarm.

https://fs.blog/venkatesh-rao/

Venkatesh Rao: The Three Types of Decision Makers [The Knowledge Project Ep. #7]

I can hardly express how useful I found this to make sense of the world.

comment by Mets · 2013-10-31T18:24:14.684Z · LW(p) · GW(p)

You could argue rationality isn't desirable by showing instrumental and epistemic rationality can conflict. The idea that having more self-confidence than warranted can lead to better results (even if not as great as expected) is an example.

I agree that Spock isn't rational according to either definition though.

comment by duckduckMOO · 2011-12-31T02:05:20.203Z · LW(p) · GW(p)

“If you think you’re acting rationally but you consistently keep getting the wrong answer, and you consistently keep ending worse off than you could be, then the conclusion you should draw from that is not that rationality is bad, it’s that you’re bad at rationality.

This is waaaaayyyyy too blanket. There are potentially limitless reasons you could consistently end worse off than you could be. Any situation where you need to come up with an answer more quickly than you are capable of will get you the wrong answer pretty consistently if you are rational (because your best shot is to guess). Those are good warning signals but not specifically of lacking rationality.

I'm specifically thinking epistemic rationality of the not-wearing-paradigm-glasses/suspending judgement variety can be very bad for you in the short term but good for you in the long term.

Replies from: army1987
comment by A1987dM (army1987) · 2012-06-09T15:47:03.493Z · LW(p) · GW(p)

Any situation where you need to come up with an answer more quickly than you are capable of will get you the wrong answer pretty consistently if you are rational (because your best shot is to guess).

In situations where spending too much time to choose is worse than choosing sub-optimally in a short time, then guessing is rational. It's addressed by SVP#2 in the post. Being “rational” in your sense of the word in such a situation is failing the twelfth virtue.

Replies from: duckduckMOO
comment by duckduckMOO · 2012-06-10T16:38:35.124Z · LW(p) · GW(p)

That is what I was saying: sometimes the rational course of action, which is to guess in situations like that, will get you the wrong answer pretty consistently. I wasn't saying that course of action is irrational.

I assume you read "the wrong answer" as referring to the choice to guess rather than the outcome of the guess.