[Question] What's your Elevator Pitch For Rationality?

post by atucker · 2011-09-06T21:43:21.967Z · LW · GW · Legacy · 38 comments

Contents

  My Current Pitch:
  My Thoughts on That:

You're talking with someone you like, and they ask you what you mean by rationality, or why you keep going to LessWrong meetups. Or you meet someone who might be interested in the site.

What do you say to them? If you had to explain to someone what LW-style rationality is in 30 seconds, how would you do it? What's your elevator pitch? Has anyone had any success with a particular pitch?

My Current Pitch:

My current best pitch, made up on the spot without any forethought, basically consists of:

"Basically, our brains are pretty bad at forming accurate beliefs, and bad in fairly systematic ways. I could show you one, if you want."

Playing the triplet game with them, then revealing that the rule is just that the numbers need to be ascending (see the sketch below)

Upon failure, "Basically, your brain just doesn't look for examples that disprove your hypothesis, so you didn't notice that it could have been a more general rule. There are a bunch of others, and I'm interested in learning about them so that I can correct for them."
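For concreteness, here is a minimal sketch of the triplet game as an interactive script. This is a hypothetical illustration in Python; the function names and prompts are my own, and the hidden rule is the "ascending" rule described above.

```python
# Minimal sketch of the triplet (2-4-6) game described above.
# The hidden rule is simply "the numbers are strictly ascending";
# players usually hypothesize something narrower, like "each number doubles".

def fits_rule(triplet):
    """The hidden rule: the three numbers are strictly ascending."""
    a, b, c = triplet
    return a < b < c

def play():
    print("I have a rule about triplets of numbers. (2, 4, 6) fits it.")
    print("Propose triplets; I'll answer yes or no. Type 'guess' when ready.")
    while True:
        line = input("> ").strip()
        if line.lower() == "guess":
            print("The rule was just: the numbers are ascending.")
            break
        try:
            triplet = tuple(int(x) for x in line.replace(",", " ").split())
            if len(triplet) != 3:
                raise ValueError
        except ValueError:
            print("Please enter three numbers, e.g. '3 6 12'.")
            continue
        print("yes" if fits_rule(triplet) else "no")

if __name__ == "__main__":
    play()
```

The instructive failure mode is that most players only propose triplets that fit their current hypothesis (positive tests), so they never see a "no" before guessing.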

My Thoughts on That:

It's massively effective at convincing people that cognitive biases exist (when they're in the 80% who fail, which has always been the case for me so far), but pretty much entirely useless as a rationality pitch. It doesn't explain at all why people should care about having accurate beliefs; it takes it as a given that that would be important.

It's also far too dry and unfun (compared to, say, Methods), and has the unfortunate side effect of making people feel like they've been tricked. It does make the whole thing look non-cultish, though.

I suspect that other people can do better, and I'll comment later with one that I actually put thought into. There's a pretty good chance that I'll use a few of the more upvoted ones and see how they go over.

38 comments

Comments sorted by top scores.

comment by [deleted] · 2011-09-07T03:54:34.372Z · LW(p) · GW(p)

I explain to the elevator that there is no need to be afraid of heights; rationally, the chances of malfunction are negligible. More generally, examining one's own biases and neuroses and correcting for them will help ver achieve ver goal of helping people move between floors in an energy-efficient way.

comment by AnnaSalamon · 2011-09-07T00:27:37.242Z · LW(p) · GW(p)

I haven't tested this, but maybe:

"People seem to be craziest about the questions that matter most for them, like their girlfriends or boyfriends, whether it was actually a mistake to enroll in their PhD program, etc. I'd like to learn how to not be that way."

comment by ArisKatsaris · 2011-09-07T08:47:27.731Z · LW(p) · GW(p)

I'd start by telling them about the commuting paradox (link grabbed from Yvain's post about rational house buying): how people end up making themselves miserable by not properly estimating/valuing their own preferences.

That's a concrete example of something that negatively affects hundreds of millions of people worldwide, and its applicability to real life is much better understood than that of the triplet game.

comment by Raemon · 2011-09-06T23:46:48.069Z · LW(p) · GW(p)

I only tried the triplet pitch once, and they got the answer, but didn't feel like "oh, I had a bias," they just felt like it was a trick question. Then I generalized from one example and stopped using it.

I've stopped pitching 'Rationality' per se, but when people ask and seem plausibly interested, I say "Rationality is basically the study of making good decisions." If they inquire further, I think the new intro to Less Wrong is approximately right, although it doesn't quite translate into conversational speech.

Replies from: Bugmaster, sixes_and_sevens
comment by Bugmaster · 2011-09-07T04:19:13.627Z · LW(p) · GW(p)

they got the answer, but didn't feel like "oh, I had a bias," they just felt like it was a trick question.

I haven't tried the triplet game on anyone yet, but this is the reaction I generally get in response to similar problems. In my (entirely anecdotal) experience, people are unable or unwilling to view rationality as a generally applicable principle. Instead, they treat it as a one-off tool that was designed to apply to a narrow set of specific problems.

"For example" -- people would say -- "you could use rationality to get a better price on your mortgage, or to demonstrate that Wiccans can't really affect reality through spells. But you couldn't use it to determine whether your homeopathic remedy really works, or whether your aunt Helga really does have prophetic dreams, or whether Christians can affect reality through prayer. These questions are altogether different from mortgage/Wicca/whatever questions, as everyone knows".

I don't think this kind of cognitive bias can be defeated by a 30-second pitch. In fact, I doubt it can be defeated at all.

comment by sixes_and_sevens · 2011-09-07T08:59:47.015Z · LW(p) · GW(p)

I only tried the triplet pitch once, and they got the answer, but didn't feel like "oh, I had a bias," they just felt like it was a trick question. Then I generalized from one example and stopped using it.

I put a poll on my blog isomorphic to the Allais Paradox, and I ought to have seen it ahead of time, but it's alarming the lengths to which some people will go to rationalise their decisions. With one respondent I whittled the scenario down to the point where he obstinately claimed his choice between A and B would change given identical odds but a different method of randomisation.

This was one of a few efforts that basically put me off trying to recruit for rationality.
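For readers unfamiliar with the Allais Paradox, here is a short sketch of the standard setup with expected values computed in Python. These are the classic textbook payoffs, not necessarily the numbers used in the blog poll above.

```python
# The classic Allais gambles, as (probability, payoff-in-dollars) pairs.
# These are the standard textbook numbers, not those from the poll above.
gambles = {
    "1A": [(1.00, 1_000_000)],
    "1B": [(0.89, 1_000_000), (0.10, 5_000_000), (0.01, 0)],
    "2A": [(0.11, 1_000_000), (0.89, 0)],
    "2B": [(0.10, 5_000_000), (0.90, 0)],
}

def expected_value(gamble):
    return sum(p * payoff for p, payoff in gamble)

for name, gamble in gambles.items():
    print(f"{name}: EV = ${expected_value(gamble):,.0f}")

# Most people pick 1A over 1B, yet 2B over 2A. But 2A and 2B are just
# 1A and 1B with an identical 89% chance of $1M removed from both, so
# a consistent preference should rank both pairs the same way; choosing
# 1A and 2B together violates the independence axiom.
```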

comment by Craig_Heldreth · 2011-09-07T15:58:23.433Z · LW(p) · GW(p)

This might be a question with no good answer. I am reminded of Chomsky's complaint in the video version of Manufacturing Consent that ideas like his can never get a hearing on television because they cannot be condensed into meaty meaningful sound bites.

The two data points worthy of consideration as I see it:

  1. Elevator pitches work best for popular culture like Lady Gaga or Harry Potter. I do not think it is a coincidence that a big fraction of the new people in Less Wrong introduction threads state they came to the site through Harry Potter and the Methods of Rationality. Maybe pitch only the story and not the transcendental critique?

  2. The only successful advertising I have seen for any similar product is the Teaching Company. So if I wanted to write an elevator pitch for rationality or for Less Wrong I would first study the Teaching Company advertisements very closely.

comment by Morendil · 2011-09-07T09:46:17.892Z · LW(p) · GW(p)

Define success? ;)

I like to show people things like the Spinning Dancer and the "this is an attention test" video. I think of them as an invitation to reflect on exactly what kind of beings we are.

comment by Hyena · 2011-09-07T07:57:53.133Z · LW(p) · GW(p)

Not to troll, but if we're assuming someone needs LessWrong's services, shouldn't we create the least rational pitch we can, deploying the deepest of Darkest Arts and when they say "save me brother!" reply "it's good that you've seen the light, now we can work on not being blinded by it"?

Replies from: jsalvatier, Logos01, ArisKatsaris
comment by jsalvatier · 2011-09-07T13:34:12.691Z · LW(p) · GW(p)

I'd be concerned about getting a culty reputation (well, more).

comment by Logos01 · 2011-09-08T12:07:13.935Z · LW(p) · GW(p)

Poor rationality skills do not necessarily translate to an incapacity for rationality.

In my case I'd just say "I prefer to be less wrong. I use the best tools to get the best outcomes I can. Wouldn't you want to?" (Usually I time this for after I've given extensive advice to someone based on just that.)

comment by ArisKatsaris · 2011-09-07T08:47:31.391Z · LW(p) · GW(p)

Not to troll, but if we're assuming someone needs LessWrong's services, shouldn't we create the least rational pitch we can

...um, no? Same way that one shouldn't try to cure people's headaches by banging them on the head with a hammer?

Replies from: lessdazed
comment by lessdazed · 2011-09-07T09:54:52.350Z · LW(p) · GW(p)

Not downvoted, because by saying "shouldn't we" Hyena didn't perfectly hold off on proposing solutions and raise his or her ideas a bit more abstractly first, and harsh responses to that aren't terrible.

But Hyena was the first to raise an excellent point, so your response is far too strong, I think.

Phrasing it as a question was certainly enough for Hyena to get an upvote from me; it's a middle ground between "There are advantages and disadvantages of using the Dark Arts that we should discuss," and "Let's deploy the deepest Darkest Art we can!"

Replies from: Bugmaster
comment by Bugmaster · 2011-09-08T09:38:06.263Z · LW(p) · GW(p)

Sorry, I'm new on the site, so I'm missing some of the jargon. What are these "Dark Arts" of which you speak? The reason I ask (besides my everlasting hunger for power, mwa ha harghble) is that you seem (to my newbie eyes) to be claiming to possess some set of conversational techniques that will make almost anyone believe almost anything. I have heard such claims in the past, and they have all failed spectacularly, so now I'm more than a little wary of them.

Then again, I could be completely mistaken about what you mean by "Dark Arts"; if so, I apologize.

Replies from: ArisKatsaris, lessdazed, Xachariah
comment by ArisKatsaris · 2011-09-08T09:43:27.883Z · LW(p) · GW(p)

Looking for the term in the search engine will lead you to a good description at the wiki: http://wiki.lesswrong.com/wiki/Dark_arts

Basically, any technique that seeks to persuade by exploiting (or even amplifying), rather than correcting, the cognitive biases of others.

comment by lessdazed · 2011-09-08T12:56:26.211Z · LW(p) · GW(p)

Holding off on proposing solutions.

Dark Arts. And here and here.

One issue is the matter of "persuasion" and "manipulation". Some people see them as words describing things that are different in kind, others see them as words describing different areas of a continuum.

See my comments here. These are some of the more common things meant by the term.

claiming to possess some set of conversational techniques that will make almost anyone believe almost anything.

I think similar-sounding claims come from people claiming to be far better at manipulation than others as a means of selling you the knowledge. For this to be plausible, the skill has to come from a few simple key insights that apply universally.

The claim here is different: it's that for each person, there are ways to manipulate them beyond persuading them, or more generally influencing them as they would wish to be influenced. Since we are not trying to sell a simple technique that always does this, the claim is far less ambitious: it isn't that manipulation is something so simple it's easy to buy and learn, and so universal that you don't need anything else. The claim is similar in that it is about people being manipulable, but the discussion is about the morality and efficacy of consciously pushing those levers at all. Sellers of manipulation have to claim it works every time, or nearly so; the discussion here is relevant if one tactic works once in a hundred tries. And the consensus here is that yes, people are somewhat manipulable, and there are many tactics.

Replies from: Bugmaster
comment by Bugmaster · 2011-09-09T02:11:15.691Z · LW(p) · GW(p)

Thanks lessdazed and others, that was very informative. In retrospect, I totally should've searched the wiki, but I kind of forgot this site had a wiki -- sorry about that.

I can see at least one problem with using the Dark Arts to persuade people to learn about rationality: breach of trust. If your target person ever finds out that you manipulated him -- as he is in fact likely to do, assuming he actually does learn more about rationality thanks to your successful manipulation attempt -- you will lose his trust, possibly forever. As a result, he may come to view rationality as a sort of seedy mind-game that evil people (such as, in his newly acquired opinion, yourself) play on each other for sport, and not as a set of generally useful mental techniques.

comment by Xachariah · 2011-09-08T10:16:39.150Z · LW(p) · GW(p)

A simple way of describing the Dark Arts is as the mirror image of rationality: knowledge of our irrationality, used for evil.

For example, Eliezer writes about how our brains literally believe everything they're told, and are unable to filter out falsehoods while distracted.

The Less Wrong thing to do is to say, "Oh, better pay attention when untrue things are being said, so my brain can classify them properly."

The Dark Arts thing to do is to say, "I'd better distract people when I lie to them, so that even if they know I'm lying, they will still believe me subconsciously."

comment by Paul Crowley (ciphergoth) · 2011-09-07T07:37:46.435Z · LW(p) · GW(p)

Very good question!

I'm not sure it's even worth trying for a 30-second pitch. My pitches for topics on this site generally take around three minutes. I use anchoring as my cognitive bias - specifically the "anchoring Gandhi's birth date on ludicrously far-off dates" example - and say things like "it's not that we fail to hit the target - it's that all our darts fall on the same side of it" to explain systematic bias, refer to "the mathematics of consistent decision making" and then say "it's crazy that we rely on our brains so much but don't take the time to learn about the ways in which they systematically fail".

comment by Nisan · 2011-09-07T06:12:23.092Z · LW(p) · GW(p)

I tell one of the stories from this post about the Planning Fallacy. It's a concrete example of a familiar and relevant way we fail to make good decisions, which has an easy fix.

comment by [deleted] · 2011-09-07T04:13:57.759Z · LW(p) · GW(p)

.

comment by atucker · 2011-09-07T01:20:02.069Z · LW(p) · GW(p)

My thinking-about-it-for-5-minutes pitch:

"I'm thrown in a game where nobody's told me the rules. There's no victory condition, but there's some stuff that I want to do. My most important piece is myself, so I'd like to figure out how to use it.

I currently think that my brain is pretty bad at doing a lot of important things (changing my mind, actually deciding to do things), so I'd like to get better at that. I want to learn more about how the world works, as well as how I work and what I want so that I can do things that will actually get me what I actually want, rather than just kind of doing things that occasionally kind of work for no particularly good reason."

comment by [deleted] · 2011-09-07T08:38:14.078Z · LW(p) · GW(p)

I have not spent much time on this site, so I may have an incorrect understanding of rationality. However, I see rationality more as a vehicle for pursuing and understanding truth. The first argument is to convince people to value truth; the next step would be to present rationality as a different method of thinking, one better able to pursue it. Convincing someone to value truth is its own battle, especially with folks who hold the postmodern belief that their own perception is valuable simply because they perceive it. Simply put, if someone does value truth, introducing rationality should follow easily. If someone does not value truth, then they will not accept rationality.

Replies from: lessdazed, falenas108
comment by lessdazed · 2011-09-08T13:25:29.723Z · LW(p) · GW(p)

The first argument is to convince people to value truth

One can value it as a means or an end.

Replies from: None
comment by [deleted] · 2011-09-09T04:20:02.344Z · LW(p) · GW(p)

I'm confused by your response. You've used a lot of pronouns, so in this context, I'm interpreting your sentence as rationality being a means to the end of truth. However, because of the pronouns, your sentence brings to mind the question: Can rationality be used as a means to ANY end?

If a person values personal happiness, can a rationalist present rationality as a way to be happy? If a person values a successful, blissful marriage, can a rationalist present rationality as a means to love your wife? And (just for the sake of testing the extremes) can rationality be a means to knowing God more deeply?

Replies from: lessdazed
comment by lessdazed · 2011-09-09T05:13:08.479Z · LW(p) · GW(p)

You've used a lot of pronouns

I failed to communicate, sorry, I will try again:

One can value rationality/(systematically believing true things and trying to shed false beliefs) as a means or an end.

If a person values personal happiness, can a rationalist present rationality as a way to be happy?

Not exactly

I mean that to care about truth you have to have something to protect. You have to care about what's true because you desperately want to actually achieve a goal, rather than fitting in with the people who talk about achieving the goal.

Replies from: None
comment by [deleted] · 2011-09-09T06:07:01.602Z · LW(p) · GW(p)

If rationality requires truth, and truth requires a motivation, can rationality exist as a motivation on its own? To me, it seems not.

I think my wording of the second sentence you quoted actually sabotaged the question I was really asking. Can rationality give a person happiness given that's their goal?

Replies from: lessdazed
comment by lessdazed · 2011-09-09T06:22:41.769Z · LW(p) · GW(p)

If rationality requires truth, and truth requires a motivation, can rationality exist as a motivation on its own?

It logically can exist as a motivation on its own, but far more people think they have such a motivation than actually do. Even if one feels that one seeks truth for its own sake, it's probably not true.

I think I remember that Nietzsche did not believe it was possible.

Can rationality give a person happiness given that's their goal?

Rationality gives people different things depending on the person and their environment. The best way to predict what would happen in a hypothetical scenario is to be rational. Being able to predict things accurately probably causes more happiness than it prevents, for most people. But this is a mild side effect of rationality; things designed specifically around happiness would have a better chance of affecting it (I suspect most basically fail, though there are a few gems).

My view, which others such as Eliezer do not share, is that rationality is much more related to losing than to winning. Rationality prevents people from making mistakes; this is only equivalent to winning and positively creating success if one goes on a significant not-losing streak.

So I'd say that if you are happy naturally, and unhappy when bad things happen to you, it will probably help a lot. If you are naturally unhappy, and need good things to happen to be happy, it won't make you happy at all; it will only lessen the frequency and severity of failures and problems. It helps one's net happiness but doesn't make one happy.

comment by falenas108 · 2011-09-08T13:12:32.674Z · LW(p) · GW(p)

You may want to read http://lesswrong.com/lw/go/why_truth_and/ for an understanding of what this site thinks about that.

Replies from: None
comment by [deleted] · 2011-09-09T04:50:57.572Z · LW(p) · GW(p)

If I'm understanding empiricism correctly, rationalists value truth because it allows them to properly function in their world. I'm confused. Is a rationalist's success more important than the truth which gives them success?

Replies from: lessdazed
comment by lessdazed · 2011-09-09T06:00:25.803Z · LW(p) · GW(p)

Rifle scopes do not help snipers shoot guns. They help snipers know where to aim to hit a target. If the military cut all funding for scopes, it would still be physically possible to perform all the actions that would have been chosen had they had the equipment. It's even physically possible to shoot more accurately by firing unaimed shots than by firing aimed shots.

However, that would be a stupid idea. It's stupid because the odds are not better for a random shot than for an aimed shot.

Likewise, rationalists want to win, to hit the target. Sometimes we reason that for an individual shot, it feels like we would do better by not aiming. We check our reasoning over and over, but the output is "It is slightly better to not aim than aim here, this is an exception to the usual rule." In such cases, we aim anyway.

One problem with trying to believe false things is that those things can corrupt other beliefs and areas of study where we need truth and can't afford to be wrong. We can do better by relentlessly seeking truth, even when it seems like it would somewhat be better not to know.

Opinions may differ for cases where it seems extremely important to avoid the truth.

In short, we seek truth not for its own sake, but to win, and still seek it when it seems falsehood would probably better help us win, because that seeming is unreliable and usually wrong.

Likewise for killing people to accomplish a goal; the cases are analogous.

I say "we" but in truth only speak for myself.

Replies from: None
comment by [deleted] · 2011-09-09T06:28:29.983Z · LW(p) · GW(p)

Thank you! I think I have a better understanding of rationality now.

comment by Jonathan_Graehl · 2011-09-06T23:25:18.772Z · LW(p) · GW(p)

Re: triplets, it also occurs to me that folks have a prior over triplet-rules that disfavors rules containing most or even half of all triples (for any bound on the integers), and favors "interesting" rules (typically of the sort thought to appear on IQ tests). This is still no excuse, however :)
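To put a rough number on this: under the hypothetical assumption that triples are drawn uniformly from {1, ..., n}^3, a quick sketch shows the "ascending" rule admits about a sixth of all triples, far broader than "interesting" rules like doubling:

```python
# Fraction of ordered triples from {1, ..., n} that are strictly
# ascending: C(n, 3) / n**3, which tends to 1/6 as n grows.
from math import comb  # requires Python 3.8+

for n in (10, 100, 1000):
    print(n, comb(n, 3) / n**3)  # prints 0.12, 0.1617, then values nearing 1/6
```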

comment by XFrequentist · 2011-09-08T20:44:39.843Z · LW(p) · GW(p)

For an acquaintance that is generally interested in similar themes but unfamiliar with LW:

Thinking and deciding affect how likely we are to succeed at anything that we care about, but we've learned that people generally suck at this and fail in predictable ways. There's lots of science on this, and I'm sure you've even noticed it yourself.

Example/anecdote (personal and tailored to their interests if possible)

There's lots of info on these kinds of errors, but I find them really slippery to identify and correct in myself. LessWrong is a community designed to figure out how to do better, and design strategies to help people apply these lessons to daily life. I've found it really useful, you should check it out.

comment by nazgulnarsil · 2011-09-08T07:50:50.022Z · LW(p) · GW(p)

"Basically your brain is trolling you."

comment by DanielLC · 2011-09-07T03:54:54.927Z · LW(p) · GW(p)

This reminds me of trying to find an elevator pitch for utilitarianism, or more specifically, for donating large amounts to a good charity. All I can think of is "Sell your house or hundreds of innocent people will die", but I have a feeling that won't work.

comment by Oscar_Cunningham · 2011-09-06T23:14:42.687Z · LW(p) · GW(p)

Maybe instead of playing the triplets "trick" on people, you could relate a story from your own life about how you made a mistake that rationality could have let you avoid.

Replies from: Oscar_Cunningham
comment by Oscar_Cunningham · 2011-09-07T13:58:38.324Z · LW(p) · GW(p)

I just realised that I fail massively at not immediately proposing solutions. Could everyone downvote the parent please?