Elevator pitches/responses for rationality / AI

post by lukeprog · 2012-02-02T20:35:23.304Z · LW · GW · Legacy · 69 comments

I'm trying to develop a large set of elevator pitches / elevator responses for the two major topics of LW: rationality and AI.

An elevator pitch lasts 20-60 seconds, and is not necessarily prompted by anything, or at most is prompted by something very vague like "So, I heard you talking about 'rationality'. What's that about?"

An elevator response is a 20-60 second, highly optimized response to a commonly heard sentence or idea, for example, "Science doesn't know everything."

 

Examples (but I hope you can improve upon them):

 

"So, I hear you care about rationality. What's that about?"

Well, we all have beliefs about the world, and we use those beliefs to make decisions that we think will bring us the most of what we want. What most people don't realize is that there is a mathematically optimal way to update your beliefs in response to evidence, and a mathematically optimal way to figure out which decision is most likely to bring you the most of what you want, and these methods are defined by probability theory and decision theory. Moreover, cognitive science has discovered a long list of predictable mistakes our brains make when forming beliefs and making decisions, and there are particular things we can do to improve our beliefs and decisions. [This is the abstract version; probably better to open with a concrete and vivid example.]
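For readers who want the concrete version, here is a minimal sketch (in Python, with made-up numbers) of the kind of optimal update the pitch gestures at: Bayes' theorem applied to a medical test. The scenario and all figures are illustrative assumptions, not anything from the post.

```python
# Minimal Bayesian-updating sketch with illustrative numbers.
# Suppose 1% of patients have a disease, and a test catches it 90% of
# the time (sensitivity) while false-alarming on 5% of healthy patients.

prior = 0.01                 # P(disease)
p_pos_given_d = 0.90         # P(positive | disease)
p_pos_given_healthy = 0.05   # P(positive | healthy)

# Bayes' theorem: P(disease | positive) =
#   P(positive | disease) * P(disease) / P(positive)
p_pos = p_pos_given_d * prior + p_pos_given_healthy * (1 - prior)
posterior = p_pos_given_d * prior / p_pos

print(f"P(disease | positive test) = {posterior:.3f}")  # ~0.154
```

Even a good test moves a 1% prior only to about 15%, which is exactly the sort of result untrained intuition gets wrong.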

"Science doesn't know everything."

As the comedian Dara O'Briain once said, science knows it doesn't know everything, or else it'd stop. But just because science doesn't know everything doesn't mean you can fill the gaps with whatever theory most appeals to you. Anybody can do that, with any crazy theory they like.

"But you can't expect people to act rationally. We are emotional creatures."

But of course. Expecting people to be rational is irrational. If you expect people to usually be rational, you're ignoring an enormous amount of evidence about how humans work.

"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."

But of course. You have to weigh the cost of new information against the expected value of that information. Sometimes it's best to just act on the best of what you know right now.
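To make that trade-off concrete, here is a minimal sketch of a value-of-information calculation. All the numbers (project cost, payoff, success probability) are hypothetical illustrations, not figures from the post.

```python
# Minimal value-of-information sketch with hypothetical numbers.
# A project costs 50 to attempt and pays 100 if it succeeds (60% chance).

p_success = 0.6
cost = 50.0
payoff = 100.0

# Act on current information: we commit either way.
ev_act_now = p_success * payoff - cost          # 10.0

# Act after (perfect) information: attempt only when it will succeed.
ev_with_info = p_success * (payoff - cost)      # 30.0

voi = ev_with_info - ev_act_now                 # 20.0
print(f"Buy the information only if it costs less than {voi}")
```

If gathering the information costs more than its expected value (here, 20), acting right away really is the rational move.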

"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"

But of course. We even have lots of data on which situations are conducive to intuitive judgment, and which ones are not. And sometimes, it's rational to use your intuition because it's the best you've got and you don't have time to write out a bunch of probability calculations.

"But I'm not sure an AI can ever be conscious."

That won't keep it from being "intelligent" in the sense of being very good at optimizing the world according to its preferences. A chess computer is great at optimizing the chess board according to its preferences, and it doesn't need to be conscious to do so.

 

Please post your own elevator pitches and responses in the comments, and vote for your favorites!

 

69 comments

Comments sorted by top scores.

comment by Vladimir_Nesov · 2012-02-03T00:17:22.262Z · LW(p) · GW(p)

What most people don't realize is that there is a mathematically optimal way to update your beliefs in response to evidence, and a mathematically optimal way to figure out which decision is most likely to bring you the most of what you want

You've said something similar in a recent video interview posted on LW, and it made me cringe then, as it does now. We don't know of such optimal ways in the generality the context of your statement suggests, and any such optimal methods would be impractical even if known, which again conflicts with the context. Similarly, turning to the interview, SingInst's standard positions on many issues don't follow from formal considerations such as logic and decision theory; there is no formal theory that represents them to any significant extent. If there is strength to the main arguments that support these positions, it doesn't currently take that form.

Replies from: lukeprog, Dr_Manhattan, siodine
comment by lukeprog · 2012-02-05T21:17:17.899Z · LW(p) · GW(p)

Fair enough. My statement makes it sound like we know more than we do. Do you like how I said it here, when I had more words to use?

comment by Dr_Manhattan · 2012-02-04T00:39:43.818Z · LW(p) · GW(p)

It made me cringe as well, but more because it will make people hug the opposite wall of the proverbial elevator, not because such methods have been conclusively shown to be impractical - http://decision.stanford.edu/.

comment by siodine · 2012-02-04T00:15:53.456Z · LW(p) · GW(p)

I think Ian Pollock more effectively got at what Luke is trying to communicate.

comment by Shmi (shminux) · 2012-02-02T21:21:44.881Z · LW(p) · GW(p)

First, a general comment on your versions, sorry: you tend to use big words, scientific jargon, and too few examples.

Compare your pitch with the following (intentionally oversimplified) version:

"So, I hear you care about rationality. What's that about?"

It's about figuring out what you really want and getting it. If you are at a game, and it's really boring, should you walk out and waste what you paid for the tickets? If you apply for a position and don't get it, does it help to decide that you didn't really want it, anyway? If you are looking to buy a new car, what information should you take seriously? There are many pitfalls on the road to making a good decision; rationality is a systematic study of the ways to make better choices in life. Including figuring out what "better" really means for you.

Replies from: RobertLumley, 911truther
comment by RobertLumley · 2012-02-02T22:01:05.508Z · LW(p) · GW(p)

First, a general comment on your versions, sorry: you tend to use big words, scientific jargon, and too few examples.

And doing that is going to instantly turn people off.

Replies from: None
comment by [deleted] · 2012-02-02T22:22:32.550Z · LW(p) · GW(p)

.

comment by 911truther · 2012-02-02T21:39:20.260Z · LW(p) · GW(p)

It's about figuring out what you really want and getting it. If you are at a game, and it's really boring, should you walk out and waste what you paid for the tickets? If you apply for a position and don't get it, does it help to decide that you didn't really want it, anyway? If you are looking to buy a new car, what information should you take seriously? There are many pitfalls on the road to making a good decision; rationality is a systematic study of the ways to make better choices in life. Including figuring out what "better" really means for you.

Makes it sound great, but what are the real-world benefits? I've been rational for years and it hasn't done anything for me.

Replies from: Eliezer_Yudkowsky, None, dbaupp, faul_sname
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2012-02-03T05:29:39.003Z · LW(p) · GW(p)

15 comments and -120 karma? Okay, at this point I may begin an immune response against trolling (deleting further comments, possibly past comments, as and when I get around to seeing that they were made).

I also remind everyone: Please do not respond at length to trolls; attention stimulates their reward centers.

Replies from: RobertLumley, Armok_GoB, Risto_Saarelma, 911truther
comment by RobertLumley · 2012-02-03T12:31:18.620Z · LW(p) · GW(p)

I'm not so sure he's a troll. He very well might be, but at least he made this comment which is at 4 karma right now. His more recent comments seem better than his previous ones, too. p(troll) seems pretty high, but not so high that I would support a ban, comment deletions, etc. at this point.

Replies from: Viliam_Bur, Larks, Risto_Saarelma
comment by Viliam_Bur · 2012-02-03T13:39:31.921Z · LW(p) · GW(p)

Most of his comments essentially say "you are wrong." Occasionally he was right about that; many times he was wrong. He probably knows a lot of facts about many topics, and he expresses himself with very high certainty; unfortunately, the quality of his comments does not match this certainty, and he seems very immune to feedback. In his mind, low karma just proves he is right.

He is very negative towards others. Almost all his comments contain something like: "Your work is wrong." "I never said anything like this" "I never flamed anyone." "spelled wrong" "I have no such delusions." "it hasn't done anything for me." "it's definitely going to do more harm than good." "I already explained why it's not possible." "There is practically no chance" "It's a misconception" "This idea is based on a whole range of confusions and misunderstandings" "just another example of people not understanding" It's like his only point in discussions is to show that everyone else is wrong, but it's often him who is wrong. Did he make some useful contribution? I don't see any.

And then the -- "You are trying to submit too fast. try again in %i minutes." and "You do not have enough karma to downvote right now. You need 1 more point." -- just make me want to scream. (Though the fact that he does not have enough karma to downvote makes me happy. I guess he was going to downvote those who disagree with him. I am happy that LW karma system does not allow him to make dozen sockpuppet accounts and start a downvoting war.)

Maybe the guy is not having fun, maybe that's just what he honestly is... but anyway, his comments seem optimized to create mental suffering in others, certainly in me. I have left websites where people like this became frequent. If this kind of behavior becomes tolerated on LW, I will either write some GreaseMonkey plugin that removes all his comments from the page, or I will simply stop reading LW. In theory I am reading this site for information, not for positive emotion, but I am just a human... if this site gives me negative emotions too often, I will stop reading it.

I tried to give him the benefit of the doubt, and answered his comment seriously, but now I feel it was totally not worth doing. This is my worst experience on LW so far. Though this mostly means that I have not had bad experiences on LW so far. :) But I prefer it to stay this way.

Replies from: RobertLumley
comment by RobertLumley · 2012-02-03T14:04:34.008Z · LW(p) · GW(p)

I tend to agree with you. I think I just have a higher threshold for banning. As such, I would like to see him actively ignore our suggestions before we entirely dismiss him, and I'm not sure that's something he's done yet.

comment by Larks · 2012-02-12T21:26:45.691Z · LW(p) · GW(p)

Less Wrong isn't some kind of human right that we need to go beyond reasonable doubt to withdraw from someone; it's an online community run by an enlightened dictator, and if you want to keep your well kept garden, you have to accept some collateral damage.

Replies from: Maelin
comment by Maelin · 2012-02-14T06:30:18.994Z · LW(p) · GW(p)

I am extremely wary of this kind of thinking. Partly because using power is a slippery slope to abusing power, and each time you use the banhammer on a maybe-troll it gets a little bit easier to use it on the next maybe-troll.

Not just because of that, but also because when other people come to a community full of self-purported rationalists, and they see someone who does not obviously and immediately pattern match as a troll receiving the banhammer for presenting community-disapproved opinions in what seems superficially to be an adequately calm and reasonable manner, that sets off the 'cult' alarms. It makes us look intolerant and exclusionary, even if we aren't.

It's fine for places like the SA forums to throw the banhammer around with reckless abandon, because they exist only for fun. But we have higher goals. We have to consider not just keeping our garden tidy, but making sure we don't look like overzealous pruners to anybody who has a potentially nice set of azaleas to contribute.

Replies from: Larks
comment by Larks · 2012-02-15T14:03:30.032Z · LW(p) · GW(p)

Slippery slopes work in both directions. Each time you don't strike down injustice, it becomes a bit easier to walk by the next time. I'd sooner have Marginal Value > Marginal Cost than Marginal Value < Marginal Cost and a lower Average Value.

Bad impressions work in both directions. When other people come to a community full of self-purported rationalists, and they see someone presenting stupid, low-status, incendiary comments and being treated as worthy of respect, it makes LW look stupid, low-status and incendiary because of the Representativeness Heuristic.

Obviously there is a continuum between anarchy and banning everything, and both extremes are local minima. The issue is to judge the local gradient.

Replies from: Maelin
comment by Maelin · 2012-02-16T00:02:26.180Z · LW(p) · GW(p)

Upvoted for valid point. I agree, but I think there is enough of a difference between 'being treated as worthy of respect' and 'not being banned' that we can probably ride in the middle ground comfortably without any significant image damage.

On consideration, though... maybe I'm prejudiced against banning because of the sense of finality of it. I guess it's not hard to make a new account.

I'm still opposed to deleting past comments though, because deleted comments make a mess of the history.

comment by Risto_Saarelma · 2012-02-04T12:46:15.194Z · LW(p) · GW(p)

I'm not so sure he's a troll. He very well might be, but at least he made this comment which is at 4 karma right now.

This is how trolling works.

Replies from: RobertLumley
comment by RobertLumley · 2012-02-04T15:46:44.930Z · LW(p) · GW(p)

Well, he hasn't commented recently, so I'm guessing he either took our advice and made a new account, or just left the site, neither of which I would attribute to troll behaviour. (Or Eliezer is deleting his posts as promised, which would, obviously, weaken that hypothesis.)

comment by Armok_GoB · 2012-02-03T20:18:02.035Z · LW(p) · GW(p)

I say just ban him.

comment by Risto_Saarelma · 2012-02-03T07:59:30.874Z · LW(p) · GW(p)

I wonder if downvotes have gone from a punishment to a reward at this point.

comment by 911truther · 2012-02-03T05:58:22.305Z · LW(p) · GW(p)

I hope you'll treat me fairly as a person and actually read and try to understand my comments instead of jumping to conclusions based on my "score".

Replies from: Emile
comment by Emile · 2012-02-03T15:13:26.093Z · LW(p) · GW(p)

Your best way to be taken seriously would be just to create a new account without making any reference to this one, and, well, not act like a troll.

Replies from: Dr_Manhattan
comment by Dr_Manhattan · 2012-02-04T00:44:12.305Z · LW(p) · GW(p)

Huh. Come to think of it, on the Internet there IS a second chance to make a first impression (a good argument for always using handles). Noted.

comment by [deleted] · 2012-02-03T01:11:25.426Z · LW(p) · GW(p)

.

comment by dbaupp · 2012-02-03T02:11:06.839Z · LW(p) · GW(p)

Are you enjoying wasting your time on this website?

You have 15 comments and a grand total of -120 karma. That is a strong indication that you are doing something wrong. To save you some time: the standard response is "I'm being censored! You're an Eliezer-cult! All these downvotes are just because you're scared of the Truth!".

Please don't use it, because it is not true: in two links you've already seen, people call Eliezer out on mistakes and give nuanced responses to "Yay for Eliezer/rationality/SI!"-type posts. Part of the reason I like LW is precisely because people do disagree, but there are almost never flame wars: the disagreement means that people actually think about what they believe and even change their minds!

What you are doing is not fitting into the community norms of discussion, like doing research and linking/referring to specific sources (anyone can say "I've done research!", but that doesn't mean that you have). (I'll pre-empt another common whinge: yes, in most cases, Wikipedia is an acceptable reference to use on LW.)

The parent comment might not be particularly bad; but your history (and your username) puts you very close to "troll", and that makes the parent comment look like a pattern-matched response (rather than a genuine question), which is the reason I downvoted.

Replies from: 911truther
comment by 911truther · 2012-02-03T02:44:21.941Z · LW(p) · GW(p)

To save you some time: the standard response is "I'm being censored! You're an Eliezer-cult! All these downvotes are just because you're scared of the Truth!".

I never said anything like this and I never invoked Eliezer. I don't understand why you're telling me off for something I didn't do. Look at my post history if you don't trust me.

What you are doing is not fitting into the community norms of discussion, like research and linking/referring to specific sources

It only makes sense to do so when making a claim. Yet people on this site have refused to back up their own claims with citations because apparently "I'm not worth bothering with".

but there are almost never flame wars

I never flamed anyone. The only guy who is calling people names ("troll", for example) is you (well, now that you've done it, others are following your lead too, well done..).

Are you enjoying wasting your time on this website?

Not really, I didn't expect to get rejected so harshly. I've read all the sequences twice and been rational for years, so I don't know what the problem is. What's the point of all this meta discussion? Why is everyone trying to drag me into these meta discussions and brand me as a troll after I passed 100 downvotes? We should get back onto the actual topic.

You are trying to submit too fast. try again in 6 minutes.

Replies from: RobertLumley, pedanterrific, dbaupp
comment by RobertLumley · 2012-02-03T03:17:17.364Z · LW(p) · GW(p)

One of the problems is that you say things like "I've been rational for years". Sorry. No, you haven't. EY hasn't been rational for years. You may have been an aspiring rationalist, but that's a far cry from actually being rational. When you say things like that it is extremely off-putting because it sounds self-congratulatory. That's something this community struggles with a lot, and we typically heavily downvote things of that kind because they send very bad signals about what this website is. Beyond that, when it's said by someone with the username "911truther", it implies an element of "You're not rational unless you're a truther too", which, mean it or not, is how it comes across.

Secondly, and this relates, your username. It's inherently political, which brings up all of our opposition to politics every time you make a post. That's not a good thing, and it will be very difficult for anyone on this site to take you seriously. If two different people wrote two articles of exactly equal caliber, and one was named BobSmith, and the other was named Obama2012, I would anticipate at least 2-3 times the upvoting on the former and 2-3 times the downvoting on the latter. And 9/11 is so much more of a polarizing issue. The vast, vast majority of people here disagree with you. But roland, despite being wildly downvoted every time he brings up 9/11, actually manages positive karma, because it's not inherently brought up every time he posts. I cannot recommend strongly enough that you delete your account and create a new username if you wish to continue on this site. If you're a 9/11 truther, I would not suggest lying about that, but choosing that as the phrase by which you identify yourself is not an effective strategy for being taken seriously on this site.

Thirdly, the great-grandparent to this isn't a terrible comment. I agree with you there. I likely would have upvoted it had it been made by a different username, since I didn't think it deserved that level of downvoting (but not because I thought it was particularly wonderful in and of itself).

comment by pedanterrific · 2012-02-03T17:19:28.055Z · LW(p) · GW(p)

Yet people on this site have refused to back up their own claims with citations because apparently "I'm not worth bothering with".

I found this claim difficult to believe, so I looked it up. For the record:

911truther: Freezing things makes water expand and burst the fragile parts of your brain.

gwern: Freezing canard: proof you have not read the cryonics literature. Instant downvote.

911truther: If "the cryonics literature" (presumably explaining why freezing does not destroy the brain) actually exists why don't you link to it?

gwern: Because spending the time to look up references solid enough that they cannot be glibly rejected indicates that I think someone is worth educating, that I can educate them, or it's a sign of respect.

None of those three are true. So if you think you are right, you are free to bring your own references to the table.

Replies from: praxis, TimS
comment by praxis · 2012-02-05T00:47:31.984Z · LW(p) · GW(p)

I do wish we could discourage the attitude displayed here by gwern. It's pure ego to respond in this way to someone you deem a "troll". It certainly won't change their mind, and it will only spur them to comment more. Either ignore them completely after downvoting, or be polite in your reply. One might justify these posts as important to make sure that 911truther knows why he's being downvoted, but the aggression in them is entirely counter-productive and, frankly, is quite rude.

For the record, I do think people are a little over-eager to accuse someone of being a "troll" (I think it is much more probable that 911truther is simply ignorant) although I think moderation is warranted in this case.

comment by TimS · 2012-02-03T17:32:19.027Z · LW(p) · GW(p)

Was this before or after the other links in other conversations?

comment by dbaupp · 2012-02-03T03:26:57.569Z · LW(p) · GW(p)

I never said anything like this and I never invoked Eliezer. I don't understand why you're telling me off for something I didn't do. Look at my post history if you don't trust me.

I know you didn't invoke Eliezer, but that is a common statement by people who find themselves downvoted a lot, so I was pre-empting it (if you were not going to say that, I apologise, and that sentence should be considered removed from my quote; the rest still stands). The only reason I said it was because I looked at your post history and saw this one:

[...] If you look at my user page (http://lesswrong.com/user/911truther) it's blatantly obvious that someone is systematically downvoting everything I post multiple times. I don't claim to be persecuted but clearly there is an attempt to censor me. Frankly it just proves that I'm right, if I was wrong people could easily disprove me.

For the rest:

  • People have been providing links and citations to back up their claims. (Several of the replies in this thread)
  • I wasn't implying that you flamed anyone, just that dissent is part of this website, and it is treated with respect.
  • Dismissing accusations of "troll" with uncheckable and irrelevant claims of rationality is not the right way to do it.
comment by faul_sname · 2012-02-03T01:21:32.585Z · LW(p) · GW(p)

Rational compared to who?

comment by daenerys · 2012-02-02T22:01:06.465Z · LW(p) · GW(p)

First, thanks to lukeprog for posting this discussion post. The Ohio Less Wrong group has been discussing elevator pitches, and the comments here are sure to help us!

I often end up pitching LW stuff to people who are atheists, but not rationalists. I think this type of person is a great potential "recruit", because they WANT a community, but often find the atheistic community a little too "patting ourselves on the back"-ish (as do I). My general pitch is that Less Wrong is like the next step: "Yeah, we're all (mainly) atheists, but now what??"

Here's an example from a recent facebook comment thread:

Other person- What exactly do atheist groups do? I went to a couple of meetings of [Freethought Group] here at [Local big university], but it turned out to be exactly like Sunday school, except instead of reading Bible verses, everyone talked about why religion was terrible. It's not exactly what I'm all about.

Me- Yeah, I hate "Rah rah, Atheism!" stuff too. I know [Person A] and [Person B] from lesswrong.com. I like the site because it's like... "Yeah, we've all got the atheism stuff figured out. Let's move on and see where we can go from there."

Then I point them to Methods of Rationality, and hopefully now to our meetups.

Replies from: lukeprog, rysade
comment by lukeprog · 2012-02-03T00:04:27.347Z · LW(p) · GW(p)

Coming up with elevator pitches/responses strikes me as a great activity to do at LW meetups.

Replies from: None
comment by [deleted] · 2012-02-03T01:14:19.826Z · LW(p) · GW(p)

.

Replies from: Dr_Manhattan
comment by Dr_Manhattan · 2012-02-04T01:27:41.943Z · LW(p) · GW(p)

If there is interest in some discussion logs to analyse, I'm having a lengthy FB thread with a fairly intelligent theist I knew from rabbinical seminary. I don't think his arguments are particularly good, and I'm not great at arguing either, though I hope my content is a bit more convincing despite its lack of style. I do not expect to change his mind - he holds a rabbinical position and the chances of him changing his mind are near zero - but there are some observers I care about, and this is an exercise in rationality for me. I can anonymize and post it if people find this kind of thing interesting; I would certainly appreciate some feedback.

Replies from: pedanterrific, None
comment by pedanterrific · 2012-02-04T01:56:16.575Z · LW(p) · GW(p)

Well, I would find it interesting, but as a point of order: maybe you should let him know you're doing this (even anonymized) so he can get help from a gang of his friends too?

Replies from: Dr_Manhattan
comment by Dr_Manhattan · 2012-02-04T02:16:26.423Z · LW(p) · GW(p)

I have no intention to have this turn into a public debate out of a Facebook thread. This is a chance to improve my rationality and argumentation skills.

Replies from: pedanterrific
comment by pedanterrific · 2012-02-04T02:24:06.525Z · LW(p) · GW(p)

Yes... I took "there are some observers I care about" plus "I would appreciate some feedback" to mean 'I'd like some debate advice (which I will be applying)'. If that's not getting help from a gang of your friends, I don't know what is.

Replies from: Dr_Manhattan
comment by Dr_Manhattan · 2012-02-04T02:47:38.806Z · LW(p) · GW(p)

You're correct, it's a side benefit, but having a thread evolve into some kind of public debate looks silly. If public debate on such issues is desired, there are orders-of-magnitude better ways of doing it than this.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2012-02-04T15:26:10.821Z · LW(p) · GW(p)

I don't think pedanterrific is planning to have a bunch of LWers start commenting on the thread in support of atheism. I think he's expecting a bunch of LWers to give you advice in this thread, which you will then use in your own posts. And he thinks the rabbi should be given an opportunity to ask his own community for similar advice. To use a boxing metaphor, nobody else is going to start fighting, but you're going to have more coaches and your opponent should too.

Replies from: Dr_Manhattan
comment by Dr_Manhattan · 2012-02-04T15:36:08.369Z · LW(p) · GW(p)

I got that, but having to tell him "there are a bunch of people helping me, so bring your friends" seems awkward in the context. I'd rather not have the help and just let people view the log as a post-mortem, for improving my rationality. Another part of it is the fact that I'm actually doing okay in the argument (I think), and "calling for help" would look like, or could be spun as, a weakness.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2012-02-04T15:44:51.362Z · LW(p) · GW(p)

Okay then! That makes sense. Also, I support posting the log when the argument is done; I'd enjoy reading it and would be happy to comment.

Replies from: Dr_Manhattan
comment by Dr_Manhattan · 2012-02-04T16:18:29.882Z · LW(p) · GW(p)

The third, compromise, option would be, if I end up using a suggestion from LW, to say "(I got this argument from talking it over with a friend)", though I'm not sure that goes far enough to satisfy the standards of a fair fight people want to see.

comment by [deleted] · 2012-02-04T04:48:00.827Z · LW(p) · GW(p)

.

comment by rysade · 2012-02-04T10:34:54.218Z · LW(p) · GW(p)

I too am a member of the Ohio Less Wrong group. I was quite surprised to see this topic come up in Discussion, but I approve wholeheartedly.

My thoughts on the subject are leaning heavily towards the current equivalent of an 'elevator pitch' we have already: the Welcome to Less Wrong piece on the front page.

I particularly like the portion right at the beginning, because it grabs onto the central reason for wanting to be rational in the first place. Start with the absolute basics for something like an elevator pitch, if you ask me.

Thinking and deciding are central to our daily lives. The Less Wrong community aims to gain expertise in how human brains think and decide, so that we can do so more successfully.

I might cut out the part about 'human brains' though. Talk like that tends to encourage folks to peg you as a nerd right away, and 'nerd' has baggage you don't want if you're introducing an average person.

comment by siodine · 2012-02-02T23:32:30.498Z · LW(p) · GW(p)

Possible absolute shite ahead (I went the folksy route):

"So, I hear you care about rationality. What's that about?"

It's about being like Brad Pitt in Moneyball. (Oh, you didn't see it? Here's a brief spoiler-free synopsis.) It's the art of seeing how others, and even you yourself, are failing, and then doing better.

"Science doesn't know everything."

Oh, yeah, I completely agree. But, it does know a helluva lot. It put us on the moon, gave us amazing technology like this [pull out your cellphone], and there's every reason to think it's going to blow our minds in the future.

"But you can't expect people to act rationally. We are emotional creatures."

Yeah, no that's true. We've recently seen all kinds of bad decisions--housing crisis and so on. But that's all the more reason to try and get people to act more rationally.

"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."

Yeah, true... true. Still, we can prepare in advance for those situations. For example, you might have reason to believe that you're going to start a new project at your job. That's going to involve a lot of decisions, and any poor decision at such an early stage can magnify as time goes by. That's why you prepare the best you can for those quick decisions that you know you'll be making.

"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"

Yeah, intuitions are just decisions based on experience. I remember reading that chess masters, y'know, like Bobby Fischer or Kasparov, don't even deliberate on their decisions, they just know; whereas chess experts, a level below master, do deliberate. But to get to that level of mastery, you need tens of thousands of hours of practice, man. Only a few of us are lucky enough to have that kind of experience in even a very narrow area. If you're something like an intermediate chess player in an area with a bunch of skilled chess players, your intuition is going to suck.

"But I'm not sure an AI can ever be conscious."

Maybe not, but that's not really important. Did you hear about Watson? That machine that beat those Jeopardy players? They're saying Watson could act as a medical diagnostician, like House, and do a better job of it. Not only that, but it'd be easier than playing Jeopardy... isn't that crazy?

Replies from: None, Desrtopa
comment by [deleted] · 2012-02-03T01:04:50.888Z · LW(p) · GW(p)

.

comment by Desrtopa · 2012-02-03T04:35:47.390Z · LW(p) · GW(p)

Oh, yeah, I completely agree. But, it does know a helluva lot. It put us on the moon, gave us amazing technology like this [pull out your cellphone], and there's every reason to think it's going to blow our minds in the future.

I like the others, but I think the problem with this one is that it doesn't provide them with any reason why they shouldn't just fill the gaps in whatever science knows now with whatever the hell they want.

comment by Raelifin · 2012-02-03T00:38:17.153Z · LW(p) · GW(p)

The elevator pitch that got me most excited about rationality is from Raising the Sanity Waterline. It only deals with epistemic rationality, which is an issue, and it is, admittedly, best suited to people who belong to a sanity-focused minority, like atheism or something political. It was originally phrased with regard to religion, so I'll keep it that way here, but it can easily be tailored.

"What is rationality?"

Imagine you're teaching a class to deluded religious people, and you want to get them to change their mind and become atheists, but you absolutely cannot talk about religion in any way. What would you do? You'd have to go deeper than talking about religion itself. You'd have to teach your students how to think clearly and actually reevaluate their beliefs. That's (epistemic) rationality.

"Why is rationality important? Shouldn't we focus on religion first?"

By focusing on rationality itself you not only can approach religion in a non-threatening way, but you can also align yourself with other sane people who may care about economics or politics or medicine. By working together you can get their support, even though they may not care about atheism per se.

comment by JonathanLivengood · 2012-02-03T06:42:21.299Z · LW(p) · GW(p)

"But you can't expect people to act rationally. We are emotional creatures."

Yes, we are emotional creatures. But being emotional is not incompatible with being rational! In fact, being emotional sometimes makes us more rational. For example, anger can inhibit some cognitive biases, and people who sustain damage to "emotional" areas of their brains do not become more rational, even when they retain memory, logical reasoning ability, and facility with language. What we want to do is make the best possible use of our available tools -- including our emotional tools -- in order to get the things that we really want.

Replies from: Dorikka
comment by Dorikka · 2012-02-03T13:50:17.404Z · LW(p) · GW(p)

Remember that your links don't work in speech. :D

Replies from: JonathanLivengood
comment by JonathanLivengood · 2012-02-03T18:24:39.211Z · LW(p) · GW(p)

Clearly right. I had thought about carrying around hard-copies of papers in a backpack so that I could hand them out as I mention them, but ... ;)

comment by RobertLumley · 2012-02-02T22:05:12.898Z · LW(p) · GW(p)

One of the most difficult arguments I've had to make is convincing people that they can be more rational. Sometimes people have said that they're simply incapable of assigning numbers and probabilities to beliefs, even though they acknowledge that doing so is superior for decision making.

Replies from: None, daenerys
comment by [deleted] · 2012-02-02T22:25:22.503Z · LW(p) · GW(p)

.

Replies from: drethelin, badger
comment by drethelin · 2012-02-02T22:35:35.224Z · LW(p) · GW(p)

This. I'm skeptical of almost every numerical probability estimate I hear unless the steps are outlined to me.

Replies from: Postal_Scale, Giles
comment by Postal_Scale · 2012-02-03T23:36:39.392Z · LW(p) · GW(p)

No joke intended, but how much more skeptical are you, percentage-wise, of numerical probability estimates than vague, natural language probability estimates? Please disguise your intuitive sense of your feelings as a form of math.

Ideally, deliver your answer in a C-3PO voice.

Replies from: drethelin
comment by drethelin · 2012-02-04T01:07:27.146Z · LW(p) · GW(p)

40 percent.

comment by Giles · 2012-03-13T19:44:20.115Z · LW(p) · GW(p)

This may be one reason why people are reluctant to assign numbers to beliefs in the first place. People equate numbers with certainty and authority, whereas a probability is just a way of saying how uncertain you are about something.

When giving a number for a subjective probability, I often feel like it should be a two-dimensional quantity: probability and authority. The "authority" figure would be an estimate of "if you disagree with me now but we manage to come to an agreement in the next 5 minutes, what are the chances of me having to update my beliefs versus you?"
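For concreteness, here is a toy sketch of what reporting such a two-dimensional quantity might look like. The field names and the example figures are my own hypothetical illustration of the idea, not an API anyone in the thread proposes.

```python
from dataclasses import dataclass

@dataclass
class BeliefReport:
    probability: float  # how likely I think the claim is
    authority: float    # if we disagree now but reach agreement in 5 minutes,
                        # the chance that *you* end up updating rather than me

# Hypothetical examples: a weakly held guess vs. a robust, well-grounded belief.
stranger_is_lying = BeliefReport(probability=0.5, authority=0.2)
sun_rises_tomorrow = BeliefReport(probability=0.999, authority=0.95)
```

The point of the sketch is just that the two numbers can vary independently: you can report the same probability while holding it loosely or holding it firmly.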

comment by badger · 2012-02-03T01:20:30.902Z · LW(p) · GW(p)

Techniques for probability estimates by Yvain is the best we have.

comment by daenerys · 2012-02-02T23:00:49.434Z · LW(p) · GW(p)

I agree that it can be difficult convincing people that they can be more rational. But I think starting new people off with the idea of assigning probabilities to their beliefs is the wrong tactic. It's like trying to get someone who doesn't know how to walk, to run a marathon.

What do you think about starting people off with the more accessible ideas on Less Wrong? I can think of things like the sunk cost fallacy, not arguing things "by definition", and admitting to a certain level of uncertainty. I'm sure you can think of others.

I would bet that pointing people to a more specific idea, like those listed above, would make them more likely to feel like there are actual concepts on LW that they personally can learn and apply. It's sort of like the "Shock Level" theory, but instead it's "Rationality Level":

Rationality Level 0 - I don't think being rational is at all a good thing. I believe 100% in my intuitions!
Rationality Level 1 - I see how being rational could help me, but I doubt my personal ability to apply these techniques.
Rationality Level 2 - I am trying to be rational, but rarely succeed. (This is where I would place myself.)
Rationality Level 3 - I am pretty good at this whole "rationality" thing!
Rationality Level 4 - I Win At Life!

I bet with some thought, someone else can come up with a better set of "Rationality Levels".

comment by Kyre · 2012-02-04T13:56:45.720Z · LW(p) · GW(p)

"So, I hear you care about rationality. What's that about?"

Rationality is about improving your thinking so that you make better decisions. You know, sometimes you make decisions that turn out badly because there is some piece of knowledge or information that you really needed but didn't have. But sometimes it turns out that even with the same information you can make a better choice if you think about things differently. In the narrow sense, rationality is getting your brain to make the best use of the information you have to make the best choice. In the wider sense, rationality is about filling your brain up with the best information in the first place.

That might just sound like common sense - that people should think carefully about things - but it turns out that there are a whole lot of really common mistakes that people don't realize that they are making, and it really is possible to learn better patterns of thinking that let you make better decisions.

comment by Micaiah_Chang · 2012-02-08T07:00:17.397Z · LW(p) · GW(p)

I'm not sure if this deserves its own article, so I'm posting it here: What would be an interesting cognitive bias / debiasing technique to cover in a [Pecha Kucha](http://www.pecha-kucha.org/what) style presentation for a college writing class?

Given the format, it should be fairly easy to explain (I have less time than advertised, only 15 slides instead of 20!). So far, I've thought about doing the planning fallacy, the representativeness heuristic, or the disjunction fallacy. All three are ones I can already speak casually about, and none leaps out at me as empowering motivated cognition (...a topic which would empower it, huh).

I would personally like to do Bayes' Theorem, but I can't (1) think of a way to compress it down to five minutes, or (2) think of a way for other people to help me compress it down to five minutes without also omitting the math.

Downvote if this is off topic. If you downvote for another reason, please tell me why, because I'll just assume it's an off-topic downvote!

comment by juliawise · 2012-02-05T05:12:00.808Z · LW(p) · GW(p)

It's about figuring out the mistakes that people tend to make, so you can avoid making them. ("Like what?") Like people aren't good at changing their minds. They only want to think about information that supports what they already believe. But really, I should look at all the information that comes my way and decide - is my old belief really true? Or should I change my mind based on the new information I got?

comment by NexH · 2012-02-05T09:58:54.205Z · LW(p) · GW(p)

"But you can't expect people to act rationally. We are emotional creatures."

This may be difficult to answer appropriately without knowing what the hypothetical speaker means by "emotions" (or "expect", for that matter). But the phrase seems to me like a potentially cached one, so ve may not know it either.

A possible elevator response below:

Rationality is not Vulcan-like behavior; you don't have to renounce your emotions in order to act rationally. Indeed, for most people, many emotions (like affection, wonder, or love) are very valuable, and applied rationality is knowing how to obtain and protect what is truly precious to you.
What is important is to rationally understand how your emotions affect your judgment, so you can consciously avoid or dampen unwanted emotional reactions that would otherwise have undesirable consequences for you.

comment by khafra · 2012-02-03T14:04:48.156Z · LW(p) · GW(p)

Does Sark's recent tweet, "Intuitions are machines, not interior decoration," work as an elevator pitch, or is it too opaque for a non-LWer? Or is it too short? Maybe it's a fireman's pole pitch.

comment by Normal_Anomaly · 2012-02-04T15:17:38.456Z · LW(p) · GW(p)

I find the conscious AI response to be the most compelling. Now that I think about it, that's more evidence for the usefulness of concrete examples.

comment by Desrtopa · 2012-02-03T19:18:20.555Z · LW(p) · GW(p)

"Science doesn't know everything."

Yes, but science is all about using whatever methods work to produce more new knowledge all the time. All the new knowledge we can produce with mechanisms that we know are actually trustworthy will eventually become part of science, and the only stuff that's ultimately going to get left out is information we can only generate through means we know aren't reliable at producing truth.

comment by Nick_Beckstead · 2012-02-03T01:26:36.254Z · LW(p) · GW(p)

Biases from Wikipedia's list of cognitive biases. Cue: example of the bias; Response: name of the bias, pattern of reasoning of the bias, normative model violated by the bias.

Edit: put this on the wrong page accidentally.