Five-minute rationality techniques

post by sketerpot · 2010-08-10T02:24:48.246Z · LW · GW · Legacy · 237 comments

Less Wrong tends toward long articles with a lot of background material. That's great, but the vast majority of people will never read them. What would be useful for raising the sanity waterline in the general population is a collection of simple-but-useful rationality techniques that you might be able to teach to a reasonably smart person in five minutes or less per technique.

Carl Sagan had a slogan: "Extraordinary claims require extraordinary evidence." He would say this phrase and then explain how, when someone claims something extraordinary (i.e. something for which we have a very low probability estimate), they need correspondingly stronger evidence than if they'd made a higher-likelihood claim, like "I had a sandwich for lunch." We can talk about this very precisely, in terms of Bayesian updating and conditional probability, but Sagan was able to get a lot of this across to random laypeople in about a minute. Maybe two minutes.
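
In Bayesian terms, the slogan says that posterior odds equal prior odds times the likelihood ratio, so a claim with a tiny prior needs an enormous likelihood ratio to become credible. A minimal sketch of that arithmetic, with purely illustrative numbers:

    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    # All numbers are illustrative.
    def update(prior, likelihood_ratio):
        """Return P(claim | evidence) from a prior and a Bayes factor."""
        odds = (prior / (1 - prior)) * likelihood_ratio
        return odds / (1 + odds)

    print(update(0.5, 10))    # mundane claim ("sandwich for lunch"): ~0.91
    print(update(1e-6, 10))   # extraordinary claim, same evidence: ~0.00001
    print(update(1e-6, 1e8))  # only extraordinary evidence suffices: ~0.99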

What techniques for rationality can be explained to a normal person in under five minutes? I'm looking for small and simple memes that will make people more rational, on average. Here are some candidates, to get the discussion started:

Candidate 1 (suggested by DuncanS): Unlikely events happen all the time. Someone gets in a car-crash and barely misses being impaled by a metal pole, and people say it's a million-to-one miracle -- but events occur all the time that are just as unlikely. If you look at how many highly unlikely things could happen, and how many chances they have to happen, then it's obvious that we're going to see "miraculous" coincidences, purely by chance. Similarly, with millions of people dying of cancer each year, there are going to be lots of people making highly unlikely miracle recoveries. If they didn't, that would be surprising.
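
A back-of-the-envelope calculation makes Candidate 1 concrete. Assuming, purely for illustration, a hundred distinct million-to-one events that could befall each person on a given day, a population of three hundred million should see tens of thousands of "miracles" daily:

    # Illustrative numbers: how often should million-to-one events occur?
    p_event = 1e-6            # any single "million-to-one" coincidence
    chances_per_day = 100     # distinct unlikely events possible per person per day
    population = 300_000_000

    print(p_event * chances_per_day * population)  # expected daily "miracles": 30000.0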

Candidate 2: Admitting that you were wrong is a way of winning an argument. (The other person wins, too.) There's a saying that "It takes a big man to admit he's wrong," and when people say this, they don't seem to realize that it's a huge problem! It shouldn't be hard to admit that you were wrong about something! It shouldn't feel like defeat; it should feel like success. When you lose an argument with someone, it should be time for high fives and mutual jubilation, not shame and anger. The hard part of retraining yourself to think this way is just realizing that feeling good about conceding an argument is even an option.

Candidate 3: Everything that has an effect in the real world is part of the domain of science (and, more broadly, rationality). A lot of people have the truly bizarre idea that some theories are special, immune to whatever standards of evidence they may apply to any other theory. My favorite example is people who believe that prayers for healing actually make people who are prayed for more likely to recover, but that this cannot be scientifically tested. This is an obvious contradiction: they're claiming a measurable effect on the world and then pretending that it can't possibly be measured. I think that if you pointed out a few examples of this kind of special pleading to people, they might start to realize when they're doing it.

Anti-candidate: "Just because something feels good doesn't make it true." I call this an anti-candidate because, while it's true, it's seldom helpful. People trot out this line as an argument against other people's ideas, but rarely apply it to their own. I want memes that will make people actually be more rational, instead of just feeling that way.

 

This was adapted from an earlier discussion in an Open Thread. One suggestion, based on the comments there: if you're not sure whether something can be explained quickly, just go for it! Write a one-paragraph explanation, and try to keep the inferential distances short. It's good practice, and if we can come up with some really catchy ones, it might be a good addition to the wiki. Or we could use them as rationalist propaganda, somehow. There are a lot of great ideas on Less Wrong that I think can and should spread beyond the usual LW demographic.

237 comments

Comments sorted by top scores.

comment by simplicio · 2010-08-10T08:11:16.846Z · LW(p) · GW(p)

I knew I was going to stay on LessWrong when I read the conceptually & rhetorically brilliant:

Ignorance exists in the map, not in the territory. If I am ignorant about a phenomenon, that is a fact about my own state of mind, not a fact about the phenomenon itself. A phenomenon can seem mysterious to some particular person. There are no phenomena which are mysterious of themselves. To worship a phenomenon because it seems so wonderfully mysterious, is to worship your own ignorance.

Which could be perhaps reduced to something like:

Your thoughts are your map; reality is the territory. Watch your step.

and

Mystery is always in the mind, never in the matter.

I don't think these are all that great but I would love a snappy way to express this central insight.

comment by Swimmy · 2010-08-11T18:01:50.734Z · LW(p) · GW(p)

From Avoiding Your Belief's Real Weak Points:

"To do better: When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts the most. Don't rehearse standard objections whose standard counters would make you feel better. Ask yourself what smart people who disagree would say to your first reply, and your second reply. Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind. Punch yourself in the solar plexus. Stick a knife in your heart, and wiggle to widen the hole."

Condensed: If you catch yourself flinching away from a thought because it's painful, focus on that thought and don't let it go. If the truth hurts, it should.

This is, I think, some of the most important rationalist advice I ever got. It kept me reading OB when it was getting very painful to do so, and allowed me to finally admit that my religion was immoral, a thought I had kept tucked away in rationalization-land since middle school.

comment by ricketson · 2010-08-12T01:44:17.514Z · LW(p) · GW(p)

Hi. I'm new here. Great blog. Great post.

One maxim that I rely on for acting rationally is "know what your time is worth". In my first real job, I was on a one-week project with a scientist who told me that my time is valuable (I think he was implying that my boss was wasting my time). This really opened up my eyes. My first application of this idea was professionally -- I can get more out of my job than just a paycheck. I can learn skills and make contacts and list accomplishments that will advance my career. I can also enjoy what I do (I'm a researcher, so that's assumed in my profession). It's sad to see colleagues who think that their time is worth no more than some measly paycheck.

The second application of this rule was in my "home economy". I used to be very cheap. Now that I've placed a $ value on my time, it puts a lot of activities in perspective and I am much freer spending money when it frees up time for more worthwhile pursuits (it helps that my cheap habits assure that I always have a nice cushion of cash around. This way, I am able to spend money when needed, without reworking my budget -- which would be a real waste of my precious time). It's sad to see people earning $70,000 a year fretting over a dollar. It's also sad to see someone who has something big to contribute to society (such as a teacher or researcher, for example) worrying about how to recycle 1/10 ounce of plastic.

This rule ties in with the "comparative advantage" rule mentioned above.

The other maxim that I like is "question reality". It is basically a directive to question your own beliefs, ask "is this real?" It applies to everything, and it subsumes the traditional "Question authority" maxim, because unjust authority typically depends upon people being indoctrinated with a particular view of reality.

Thanks for reading. I look forward to participating in this site!

comment by Airedale · 2010-08-11T00:39:08.603Z · LW(p) · GW(p)

Great idea for a post and an important topic. A somewhat similar topic came up at our recent Chicago meetup, when someone who saw our sign came up to us to ask us what Less Wrong referred to. We didn't necessarily have a great answer at the ready besides relaying some of the basics (website/group blog about rationality and thinking better, etc.). We spent a few minutes afterward talking about what information a good LW elevator speech might include. We didn't want it to sound too stilted/formal, e.g., "refining the art of human rationality" from the banner at the top doesn't sound that inviting in casual conversation. Does anyone have approaches that have worked?

comment by bentarm · 2010-08-11T10:21:32.644Z · LW(p) · GW(p)

Eliezer's "Absence of evidence is evidence of absence" is a good one in my opinion, and it's relatively easy to explain the relevant maths to pretty much anyone.

The general point about Conservation of Expected Evidence is then likely to come out in the wash (and is a very useful idea).
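
The math is indeed short. In a numeric sketch (probabilities made up for illustration): if seeing evidence E would raise your confidence in hypothesis H, then failing to see E must lower it, and the two posteriors average back to the prior; that is Conservation of Expected Evidence.

    # Illustrative numbers: E is more likely under H than under not-H.
    p_h, p_e_given_h, p_e_given_nh = 0.3, 0.8, 0.2

    p_e = p_e_given_h * p_h + p_e_given_nh * (1 - p_h)      # P(E) = 0.38
    p_h_given_e = p_e_given_h * p_h / p_e                   # ~0.63: E confirms H
    p_h_given_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)   # ~0.10: absence disconfirms

    # Conservation of Expected Evidence: posteriors average back to the prior.
    assert abs(p_h_given_e * p_e + p_h_given_not_e * (1 - p_e) - p_h) < 1e-12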

comment by aribrill (Particleman) · 2010-08-13T16:25:41.938Z · LW(p) · GW(p)

A simple technique I used to use was that whenever I started to read or found a link for an article that made me uncomfortable or instinctively want to avoid it, I forced myself to read it. After a few times I got used to it and didn't have to do this anymore.

comment by [deleted] · 2010-08-10T08:00:40.154Z · LW(p) · GW(p)

Should we be listing the oldies here as well? One of my favorites is still "Don't believe everything you think."

That one made it to book-title and t-shirt status, but I've never heard anyone actually say it. I've read it only a couple of times.

Replies from: SilasBarta, phaedrus, sketerpot
comment by SilasBarta · 2010-08-11T14:16:36.286Z · LW(p) · GW(p)

As true as that is, I don't see how it would lead people to do anything differently -- don't most people already think, er, believe they're living up to whatever that quote asks of them?

Replies from: thomblake, Nornagest
comment by thomblake · 2010-08-11T14:30:59.205Z · LW(p) · GW(p)

I don't think so. I'm pretty sure most people labor under the impression they have something like a unity of consciousness, so while "don't believe everything you see" might seem obvious, "don't believe everything you think" does not, unless specifically considering situations like hallucinations (which many would categorize under "see" rather than "think").

ETA: That's why this is a cornerstone of rationality. Even I am moved to remember the slogan, so that when I think to say, "That's not true!" I stop and ask myself why I think so and whether I should believe this impulse of mine.

Replies from: SilasBarta
comment by SilasBarta · 2010-08-11T14:34:16.139Z · LW(p) · GW(p)

Okay, in that case, I had come up with a saying to express that same idea but which makes the implications clearer. Here goes:

"Blindness isn't when you see nothing; it's when you see the same thing, regardless of what's in front of you.

"Foolishness isn't when your beliefs are wrong; it's when you believe the same thing, regardless of what you've seen."

Replies from: thomblake
comment by thomblake · 2010-08-11T14:57:24.987Z · LW(p) · GW(p)

I particularly like the first, since the second clause technically includes literal blindness.

I might change "wrong" to "false" when repeating the second.

Replies from: SilasBarta
comment by SilasBarta · 2010-08-11T15:21:59.731Z · LW(p) · GW(p)

Thanks! Any help with touching up my version so it flows better is much appreciated.

I particularly like the first, since the second clause technically includes literal blindness.

Yes, I think this is particularly important, because the cognition involved in literal seeing is a form of believing: your brain is making inferences before there's even an image in your mind. (The raw retinal data looks like garbage.)

comment by Nornagest · 2011-02-11T22:54:20.706Z · LW(p) · GW(p)

I estimate most people would lump "don't believe everything you think" into the space occupied by slogans like "think different" and "question authority"; i.e. at best a generalized endorsement of counterculture ideals, and at worst a cynical attempt to break down any and all ideals in hopes that the gap will be filled by something more congenial to the speaker. The general population is familiar with ideology and unfamiliar with abstract cognition, so unqualified ideas about ideas will usually be taken to refer to the former.

This misconception could be dissolved with half a minute of explanation, but that half minute wouldn't fit on a bumper sticker.

Replies from: SilasBarta
comment by SilasBarta · 2011-02-12T00:50:01.479Z · LW(p) · GW(p)

Thanks, you said what I was thinking so much better.

comment by phaedrus · 2011-02-11T22:20:55.516Z · LW(p) · GW(p)

This reminds me of "It is the mark of an educated mind to be able to entertain a thought without accepting it." -- Aristotle

To me, this uses "educated" in the sense it ought to be meant.

comment by sketerpot · 2010-08-10T08:35:58.829Z · LW(p) · GW(p)

Can you think of a way to explain that to people so they may be able to apply it themselves? It's a nice slogan, but a clever turn of phrase isn't too useful by itself.

Replies from: None
comment by [deleted] · 2010-08-10T12:34:12.642Z · LW(p) · GW(p)

Frank Lantz spends the first five minutes of this video explaining the slogan and suggesting a way to apply it.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-08-10T15:48:06.710Z · LW(p) · GW(p)

Thanks for the link.

For me, the most interesting thing is that Lantz doesn't appear to be retarded [1], and yet it was a huge shock for him to find out as an adult that it was possible to think about the odds of a decision being right rather than assuming that decisions were absolutely right or wrong.

I have no doubt that my description of needing years to assimilate the idea that people are really different from each other without this necessarily indicating something the matter with any of them is equally shocking to people who've been vividly aware of psychological differences as long as they can remember. Or I could be wrong-- the variation in clue distribution might be one of the things such people are apt to be clear about.

[1] He actually seems pretty smart-- but "doesn't appear to be retarded" is the only way I can think of to adequately express my surprise that it took him so long to acquire that particular clue.

comment by Christian_Szegedy · 2010-08-11T22:58:44.395Z · LW(p) · GW(p)

I think it is mostly hopeless trying to teach rationality to most people.

For example, both of my parents studied Math in university and still have a very firm grasp of the fundamentals.

I just got a phone call yesterday from my father in Germany saying: "We saw in the news, that a German tourist couple got killed in a shooting in San Francisco. Will you avoid going out after dark?" When I tried to explain that I won't update my risk estimates based on any such singular event, he seemed to listen to and understand formally what I said. Anyhow, he was completely unimpressed, finishing the conversation in an even more worried tone: "I see, but you will take care, won't you?"

Replies from: ciphergoth, nerzhin
comment by Paul Crowley (ciphergoth) · 2010-08-12T16:58:29.318Z · LW(p) · GW(p)

"don't worry - that sort of thing is so rare, when it happens, it makes the news!"

Replies from: dreeves, simplicio
comment by dreeves · 2010-08-15T04:03:38.389Z · LW(p) · GW(p)

Well said! Here's how Bruce Schneier put it:

Remember, if it’s in the news don’t worry about it. The very definition of news is “something that almost never happens.” When something is so common that it’s no longer news — car crashes, domestic violence — that’s when you should worry about it.

I wrote an essay about the utter irrationality of "stranger danger" based on that quote: http://messymatters.com/strangers

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-08-15T09:52:06.208Z · LW(p) · GW(p)

I think not worrying about things in the news needs some fine-tuning-- if a war is happening where you live, it will affect your safety level, and it will be in the news.

comment by simplicio · 2010-08-14T00:51:26.620Z · LW(p) · GW(p)

That's the canonical response now! Thanks!

comment by nerzhin · 2010-08-12T01:25:33.398Z · LW(p) · GW(p)

Your parents aren't saying "Please update your estimate of the probability of your violent death, based on this important new evidence."

They are saying, "I love you."

This has nothing to do with how rational or irrational they are.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T16:57:12.740Z · LW(p) · GW(p)

They're saying "I love you" in an irrational way. This can hurt because there is no easy way to quibble with the second part and not violate cultural conventions about how to express your acceptance of the first.

Replies from: simplicio
comment by simplicio · 2010-08-14T01:02:15.954Z · LW(p) · GW(p)

This is well-understood by irrationalists. Once in a discussion about the necessity of evidence, I got landed with "But you don't demand evidence that your wife loves you, right? You just have faith..."

A clever move. Now arguing the point requires me to... deny that I have faith in my wife?

Replies from: cupholder, Eliezer_Yudkowsky, ciphergoth, Oligopsony
comment by cupholder · 2010-08-14T06:24:45.109Z · LW(p) · GW(p)

'Why would I need to demand evidence? My wife freely gives me evidence of her love, all the time!'

Replies from: thales, wedrifid, simplicio
comment by thales · 2010-08-18T18:52:06.621Z · LW(p) · GW(p)

I had a similar discussion with a family member, about the existence of the Christian god, where I received that exact response. My wife was sitting right there. I responded with something along the lines of, "True, but my 'faith' in her love is already backed up by evidence, and besides, I have plenty of evidence that she exists. If there was evidence for God and evidence of His love, I would happily put faith in that too."

But I agree - it definitely caused me to pause to consider a tactful response.

comment by wedrifid · 2010-08-14T06:38:56.105Z · LW(p) · GW(p)

And the proper name for a wife that doesn't freely give evidence of her love is an ex-wife!

Replies from: simplicio
comment by simplicio · 2010-08-14T07:44:38.531Z · LW(p) · GW(p)

And for someone who doesn't require evidence to believe in that love - a stalker!

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-08-19T17:28:53.929Z · LW(p) · GW(p)

So religious people are all God's stalkers?

comment by simplicio · 2010-08-14T07:43:54.320Z · LW(p) · GW(p)

My reply was in this vein, essentially. But it's still a sneaky bugger of a question.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-14T07:22:22.836Z · LW(p) · GW(p)

See also, "The Riddle of Kyon".

Replies from: simplicio
comment by simplicio · 2010-08-14T20:01:08.490Z · LW(p) · GW(p)

It was good! I didn't realize you had other fanfic than HP:MoR.

Replies from: Baughn
comment by Baughn · 2010-08-18T17:49:43.412Z · LW(p) · GW(p)

He has quite a few more. Go look for "The Sword of Good", for example.

Replies from: simplicio
comment by simplicio · 2010-08-18T22:16:43.041Z · LW(p) · GW(p)

Yeah, I should have said I didn't know there were any more apart from the ones on LW and HPMOR. Brain fart.

comment by Paul Crowley (ciphergoth) · 2010-08-18T19:40:11.302Z · LW(p) · GW(p)

Ah, the old "agree with me or say something rude!" gambit. I wonder if you could turn it around - "what, are you saying you don't think my wife loves me?"

comment by Oligopsony · 2010-08-14T06:52:38.557Z · LW(p) · GW(p)

The error, of course, is that it equivocates between two meanings of "faith." You trust your wife because you have (one would hope) spent a great deal of time with her and found her to be honest, concerned about your well-being, &c.

Of course, you might at some point come upon evidence that this is not warranted, and in this case the irrationalists might have a point: it may be more wise to use motivated cognition to convince yourself that she is faithful or still in love with you. Othello can be read as an extended argument for avoiding reasonable conclusions if you know that your reactions are not guaranteed to be reasonable.

Replies from: simplicio
comment by simplicio · 2010-08-14T07:47:03.271Z · LW(p) · GW(p)

You trust your wife because you have (one would hope) spent a great deal of time with her and found her to be honest, concerned about your well-being, &c.

Ah, but you see, that cannot be put into a test tube. And as all of your least educated neighbours know, if you can't put it into a test tube, it ain't evidence.

comment by NancyLebovitz · 2010-08-10T03:05:06.642Z · LW(p) · GW(p)

Consider the possibility that when people say they're seeing things differently than you do, they might be telling you the truth. They could be making it up, they could be just annoying you for the fun of it, but they might actually be weirder than you think.

Replies from: sketerpot, savageorange
comment by sketerpot · 2010-08-10T03:28:38.311Z · LW(p) · GW(p)

Do you have any examples? That's a fascinating one.

(Corollary: if you're angry at someone, and they ask why you're angry, tell them. They might actually not know. Especially if they're a child. I know I'm not the only one who was punished by one or more elementary school teachers for reasons that they refused to explain, since they assumed that I already knew. Oh how I seethed.)

Replies from: SilasBarta, NancyLebovitz
comment by SilasBarta · 2010-08-11T18:18:46.166Z · LW(p) · GW(p)

Yeah, that pretty much describes growing up for me.

"Don't do that."
Why not?
"How dare you disrespect my authority you little terr..."
Oh, no, I'm perfectly fine with obeying, I just wanted to know the rationale so I can identify what kinds of things are off-limits ...
"TIMEOUT! Now!"

Edit: Needless to say, even on this forum, there are people who have no qualms about telling others "Don't do that" without bothering to spell out the boundary, or even understand why that would be necessary. I can't understand what motivates such people beyond, "I like it when others are in a perpetual state of uncertainty and have to keep deferring to me for permission."

Replies from: sketerpot, LeBleu
comment by sketerpot · 2010-08-11T19:13:43.289Z · LW(p) · GW(p)

"How dare you disrespect my authority you little terr..."

You raise an interesting point here. When a parent or teacher imposes their authority on a child, there are two very different goals they could have:

  1. To get the child to comply, and/or

  2. To establish their own dominance.

When you ask why you're being ordered to do something, and you happen to be beneath the age that society considers you a real person, that's taken as an attack on the dominance of the person bossing you around. Obedience isn't enough; a lot of people won't be satisfied with anything less than unquestioning obedience, at least from mere children. I suspect that this is what people are thinking most of the time when they use "because I say so" as a 'reason' for something. (The rest of the time, they're probably using it because they're feeling too harried to explain something to a mere child, and so they trot out that tired old line because it's easy.)

I remember when I was young enough that adults dared to treat me that way. (Notice the emotionally charged phrasing? I'm still irritated.) Someone who gave reasonable orders and provided justifications for them on request, got cooperation from me. My parents were like this, and they say I was very well-behaved. Someone who told me to do things "because I said so" automatically gained my resentment, and I felt no need to cooperate with them. They were less effective because they insisted on unquestioning obedience.

I realize that not every child is as reasonable or cooperative as I was, but providing a reason for your instructions doesn't hurt anything; at worst it's useless, and at best it reinforces your authority by making people perceive you as a reasonable authority figure worthy of listening to.

Replies from: Alicorn, Eliezer_Yudkowsky, thomblake, Eudaimoniac, DanArmak
comment by Alicorn · 2010-08-11T19:43:17.074Z · LW(p) · GW(p)

providing a reason for your instructions doesn't hurt anything

I tend to agree in most cases. However, not all instruction-givers have good reasons for their orders. If they must provide such reasons before they are obeyed, and only inconsistently have them, that means that a plausible motive for their subordinates to question them is the desire not to follow the instruction. (i.e. subordinate thinks there might be no good reason, feels compelled to obey only if there is one, and is checking.) The motive associated in this way with asking for reasons is therefore considered harmful by the instruction-giver.

When I was a kid and got an unobjectionable but confusing order, I usually agreed first and then asked questions, sometimes while in the process of obeying. This tended to work better than standing there asking "Why?" and behaving like I wanted the world to come to a halt until I had my questions answered. Objectionable orders I treated differently, but I was aware when I challenged them that I was setting myself up for a power struggle.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T16:08:14.004Z · LW(p) · GW(p)

When you ask why you're being ordered to do something, and you happen to be beneath the age that society considers you a real person, that's taken as an attack on the dominance of the person bossing you around.

Harry may or may not get a chance to say this at some point, but it sure is going in my quotes file.

comment by thomblake · 2010-08-11T19:22:02.515Z · LW(p) · GW(p)

I realize that not every child is as reasonable or cooperative as I was, but providing a reason for your instructions doesn't hurt anything; at worst it's useless, and at best it reinforces your authority by making people perceive you as a reasonable authority figure worthy of listening to.

Not true. In many cases, there isn't time (or some other resource) for spelling out your reasons. And when it's a life-or-death situation, you want your child to comply with your orders unquestioningly, not stand there asking "why" and get eaten by a lion.

Replies from: jimrandomh, sketerpot
comment by jimrandomh · 2010-08-11T19:30:15.617Z · LW(p) · GW(p)

These concerns can be balanced better than they usually are by using something like a "Merlin says" rule.

Replies from: RobinZ, sketerpot, thomblake
comment by RobinZ · 2010-08-12T01:18:47.319Z · LW(p) · GW(p)

Such a rule would include an expectation of later justification, of course.

comment by thomblake · 2010-08-11T19:54:58.772Z · LW(p) · GW(p)

That sounds plausible, but I've never seen it attempted in practice.

Though it doesn't sound very different from "Because I say so!" so I don't see why it would work worse.

Replies from: Psy-Kosh
comment by Psy-Kosh · 2010-08-12T17:01:05.525Z · LW(p) · GW(p)

"because I say so" invokes the very fact of the demand as the supreme reason, rather than acting as a promissory note, saying "no time to explain now, but trust me there's a good reason that I'll explain later"

ie, "because I said so" is "bow to my authority, underling" rather than "in this specific circumstance, just do it, trust me (for now) there's a reason, and ask later if it's not obvious to you by then"

comment by sketerpot · 2010-08-11T19:29:57.158Z · LW(p) · GW(p)

Okay, I will admit that there are some situations where telling someone why is impractical. I don't think they're too frequent, though, unless you live in a place with a lot of lions (or whatever).

Replies from: thomblake
comment by thomblake · 2010-08-11T19:55:42.747Z · LW(p) · GW(p)

Most parents and children live in places with a lot of potentially-deadly situations.

Replies from: khafra, Eliezer_Yudkowsky
comment by khafra · 2010-08-12T17:32:01.221Z · LW(p) · GW(p)

For a comparison with modern adults who live in places with a lot of potentially-deadly situations requiring swift obedience, US military personnel are required to obey all lawful orders from those appointed over them, but have (from the order follower's side) several channels for reporting abuses of authority, and (from the order giver's side) official guidance with ways of explaining orders when time permits.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T16:10:44.459Z · LW(p) · GW(p)

I think that statement becomes a lot stronger if you say "most of your ancestors".

Replies from: RobinZ
comment by RobinZ · 2010-08-16T18:56:18.005Z · LW(p) · GW(p)

Possibly, although most parents and children live in places with automobiles.

comment by Eudaimoniac · 2010-08-12T11:37:41.442Z · LW(p) · GW(p)

I am a parent and I have to disagree with you. The worst case scenario is not that it is worthless. If a child learns to question the "order" given out loud, it would suggest that the child is also questioning the "order" internally. This leads to an internal debate over whether to ask for a justification for the "order" or to decide internally whether it is justifiable.

Now you have a situation where the child does not stop and ask for the justification, but instead decides that some situations cannot be justified and thus will not ask for said justification.

When the parents are around, this is problematic, but when no authoritative figure is close, this leads to the child questioning already given "orders" and possibly overruling any preexisting justification. They are children, after all.

Now you have a child who actively disregards (or might disregard) "orders" given - with or without justification. Sure, you told your daughter not to go with strangers, but the stranger had candy, and instead of seeking out parents to gain a justification for the rule of not going with strangers, the child will examine the justification itself and, given an upbringing with minimal trauma, might follow the stranger with the candy.

You either have to demand absolute obedience or allow for your child to make its own decisions and accept the danger and risk involved with that, but it is a wrong simplification to say that the worst that can happen is that it is useless. After all - the way you parent your child shapes them - good or bad.

Replies from: wedrifid
comment by wedrifid · 2010-08-12T15:47:07.532Z · LW(p) · GW(p)

I agree Eudaimoniac (nice name by the way!). The worst case scenario is definitely less than worthless. The question of what is best in the average case would be an interesting one. My hunch is that it depends on the neurology of the child and also on the nature of the culture. Expectations of and relationship with 'justification' vary quite a lot between individuals in a way that I trace down to genetics.

comment by DanArmak · 2010-08-16T11:51:20.553Z · LW(p) · GW(p)

providing a reason for your instructions doesn't hurt anything; at worst it's useless, and at best it reinforces your authority by making people perceive you as a reasonable authority figure worthy of listening to.

In addition to what others have said, I think the very concept of 'authority figure' for most people means 'one who is obeyed without question'. The meaning of 'order' does not include a possibility of questioning it. An instruction that comes with explanations simply doesn't belong in the category of 'orders'.

This isn't specific to child-adult relations. Whenever someone is in a position to give orders, asking for justification is seen as a challenge. Reasonable or rational people do, of course, ask for and give out reasons for their orders. But this doesn't reinforce authority and obedience. It creates or reinforces cooperation between two people who are more nearly equals, than a giver and a taker of orders.

The emotional/social basis for giving orders is precisely "because I say so" - orders to establish dominance and obedience - and having to explain yourself automatically subtracts from your authority.

comment by LeBleu · 2010-08-11T19:48:19.464Z · LW(p) · GW(p)

Your attempt to understand these people's motivations seems to assume that these people understand that you don't know the answer. Another possible motivation is that they think the explanation is obvious or common knowledge, and hence you must be asking to antagonize them, not out of actual ignorance. Not to say that I don't think some people's motivation really is the one you've stated - they simply enjoy being in control of people.

Replies from: SilasBarta
comment by SilasBarta · 2010-08-11T19:55:01.717Z · LW(p) · GW(p)

If you're talking about my complaints about the forum, that's not the case. One time, numerous people asked for clarification from this person about which kinds of behavior that person was asking others to stop, so the person clearly knew that the others didn't know exactly which behavior was being criticized.

That person eventually resorted to, "I'll tell you when I don't like it, as will a few people I've selected."

18 months later, he/she agreed his/her preferences were not typical.

I will provide the documentation privately if you wish, but I have no desire to start this publicly.

comment by NancyLebovitz · 2010-08-10T08:21:15.730Z · LW(p) · GW(p)

I think what got me into it was Psychetypes, a description of the Myers-Briggs types with some rather abstract theory about how they experience time and space differently than each other. [1] Anyway (and this should be a clue about how hard it can be to learn this sort of thing) when I first started reading the book, I got to the bit about there being many sorts of normal, and I put the book down for two years-- it was that hard to get past the idea that either I was crazy, or everyone else was.

Anyway, look at how a lot of people talk about taste-- a lot of them really believe that everyone should like and dislike the same things they do.

Or people who believe that if some diet/exercise method worked for them, therefore it would work for everyone if they'd just try hard enough.

Or that allergies they haven't got must be illusory.

[1] IIRC, SPs experience the present moment most vividly, NTs imagine time as evenly spaced along a ruler, NFs have vivid experience of past emotional moments, and someone (it's got to be another N, and I can't remember what SJs experience) are most aware of future possibilities. You double all this to get 8 types because some people think spatial boundaries are real and others don't.

Replies from: RobinZ, Peter_Lambert-Cole, NancyLebovitz, bentarm, wedrifid
comment by RobinZ · 2010-08-10T15:54:38.664Z · LW(p) · GW(p)

Anyway, look at how a lot of people talk about taste-- a lot of them really believe that everyone should like and dislike the same things they do.

Generalizing From One Example. Top rated Less Wrong article of all time, and we see again and again why. :/

comment by Peter_Lambert-Cole · 2010-08-11T02:54:11.850Z · LW(p) · GW(p)

You mentioned Myers-Briggs types and "the idea that either I was crazy, or everyone else was." I think I had a similar experience but with a different analysis of the MBTI classifications. It was Personality Type: An Owner's Manual by Lenore Thomson and there is a wiki discussion here.

I found the scientific basis fairly flimsy. She connects the 8 cognitive functions to various regions of the brain - left and right, anterior and posterior - but it seems like a just-so story to me. However, I have found it immensely useful as a tool for self-improvement.

The main insight I got from it is that while other people are crazy, they are crazy in a fairly well-defined, reproducible way. Other people see things completely differently from you, but their view is fairly internally consistent, so you can simulate it on your own hardware.

There are two ways I think about this:

One, your brain is constantly trying to make sense of all the sensory data that comes in. So it determines that one part is the signal and one part is the noise. It tries to minimize the noise and focus on the signal. But then you realize there is a whole other signal in what you thought was noise, and there are people tuned into that signal who think yours is actually the noise. If you then tune into that signal, you can understand what other people have been listening to the whole time.

The other is, we are all playing 8 board games simultaneously, where if we roll the dice our piece moves that amount in each of the games. In order to make sense of this, we focus on one of the games, trying to forget about the others, and try to win this one. But other people are focused on trying to win a different game. So when they try to talk to each other about who is winning, they completely talk past each other. But when you realize that someone thinks he is playing a different game and you figure out what it is, you can have a much more productive conversation/relationship.

Replies from: wedrifid, NancyLebovitz
comment by wedrifid · 2010-08-11T07:32:37.305Z · LW(p) · GW(p)

But other people are focused on trying to win a different game. So when they try to talk to each other about who is winning, they completely talk past each other. But when you realize that someone thinks he is playing a different game and you figure out what it is, you can have a much more productive conversation/relationship.

This is an important insight. I'll add that sometimes being able to understand the different way people think can simply allow us to realise that it is more productive to have no (or minimal) relationship without judging them to be poor thinkers. Judging them not to be 'thinkers' in your original sense at all can be a lesser judgement than concluding that they suck at it.

comment by NancyLebovitz · 2010-08-11T07:23:11.527Z · LW(p) · GW(p)

Thanks-- that's a lot more use than I've made of the system.

Does it make sense to think of yourself as crazy to the same extent that people of other psychetypes are?

Links need to be in a system called Markdown rather than the more usual html-- the details for them are at the help link in the lower left corner that shows up when you start writing a reply.

Replies from: savageorange, torekp, Peter_Lambert-Cole
comment by savageorange · 2010-08-11T10:00:45.292Z · LW(p) · GW(p)

If you take crazy to mean 'acting, thinking or feeling in a way disjointed from or opposed to reality', I'd say it makes a lot of sense to think of yourself as just as crazy as anyone else (and it reduces the incidence of giving your own feelings and thoughts undue importance, IME).

comment by torekp · 2010-08-14T00:27:12.715Z · LW(p) · GW(p)

Upvoted for giving technical help.

comment by Peter_Lambert-Cole · 2010-08-11T14:20:54.323Z · LW(p) · GW(p)

Fixed.

Does it make sense to think of yourself as crazy to the same extent that people of other psychetypes are?

I don't think so. The term captures how radically different the other types are from your own. It's about relative distance between you and others, not an absolute quality.

comment by NancyLebovitz · 2010-08-10T10:28:02.494Z · LW(p) · GW(p)

Risto_Saarelma just posted a prime description of how hard it is to believe that other people mean what they're saying about how they see the world-- a woman who'd spent a long time in the New Age culture describes her conversion to skepticism.

Replies from: AdeleneDawner, kpreid
comment by AdeleneDawner · 2010-08-11T06:31:19.105Z · LW(p) · GW(p)

Second link's broken. You may have meant this?

comment by kpreid · 2010-08-10T21:57:18.730Z · LW(p) · GW(p)

Your second link is broken.

Replies from: NancyLebovitz, sketerpot
comment by NancyLebovitz · 2010-08-10T23:42:02.938Z · LW(p) · GW(p)

Thanks. It's fixed.

comment by bentarm · 2010-08-12T11:19:13.888Z · LW(p) · GW(p)

I think we should probably be very wary of taking anything based on the Myers-Briggs classifications seriously. They seem to be based almost entirely on Forer Effect-type predictions and to be almost impossible to falsify.

If I remember correctly, the Big Five tests are slightly more robust (eg, a Big Five profile has fairly high predictive power, and is fairly stable over time).

Replies from: Peter_Lambert-Cole, NancyLebovitz
comment by Peter_Lambert-Cole · 2010-08-19T04:45:51.333Z · LW(p) · GW(p)

I think skeptical people are too quick to say "Forer Effect" when they first do Myers-Briggs. They notice that their type only partially describes them and assume that something fishy is going on. But if you switch all the letters and read the description of the exact opposite type, there is almost nothing that could apply to you. That in itself means that there is some non-trivial classification going on. San Francisco may not be LA, but it sure isn't Moscow.

comment by NancyLebovitz · 2010-08-12T16:14:05.260Z · LW(p) · GW(p)

I don't take the specifics very seriously-- I don't try to analyze everyone in terms of MB-- nor the Enneagram, which I also find somewhat useful. Occasionally, I find someone who seems to have a very strong tendency towards some of the traits described in a system, but most of what I get out of these systems is a clue that people are very varied, that it's normal for people to be different from each other, and some ideas about possible differences.

comment by wedrifid · 2010-08-10T08:26:55.184Z · LW(p) · GW(p)

(it's got to be another N, and I can't remember what SJs experience) are most aware of future possibilities.

NP.

comment by savageorange · 2010-08-11T09:41:35.391Z · LW(p) · GW(p)

Not to omit the distinct (and surprising) possibility that YOU might be weirder than you think.

comment by jimrandomh · 2010-08-10T14:05:19.337Z · LW(p) · GW(p)

Don't ingest words from a poisoned discourse unless you have a concrete reason to think you're immune.

Politics is often poisoned deliberately. Other topics are sometimes poisoned accidentally, by concentrated confusion. Gibberish is toxic; if you bend your mind to make sense of it, your whole mind warps slightly. You see concentrated confusion every time you watch a science fiction show on television; their so-called science is actually made from mad libs. Examples are everywhere; do not assume that there is meaning beneath all confusion.

Replies from: mtraven, ata
comment by mtraven · 2010-08-11T04:48:03.015Z · LW(p) · GW(p)

Here's the exact opposite advice. I wouldn't even bother posting it here except it's from one of the major rationalists of the 20th century:

"In studying a philosopher, the right attitude is neither reverence nor contempt, but first a kind of hypothetical sympathy, until it is possible to know what it feels like to believe in his theories, and only then a revival of the critical attitude, which should resemble, as far as possible, the state of mind of a person abandoning opinions which he has hitherto held.... Two things are to be remembered: that a man whose opinions and theories are worth studying may be presumed to have had some intelligence, but that no man is likely to have arrived at complete and final truth on any subject whatever. When an intelligent man expresses a view which seems to us obviously absurd, we should not attempt to prove that it is somehow true, but we should try to understand how it ever came to seem true. This exercise of historical and psychological imagination at once enlarges the scope of our thinking, and helps us to realize how foolish many of our own cherished prejudices will seem to an age which has a different temper of mind." -- Bertrand Russell, A History of Western Philosophy

Replies from: simplicio
comment by simplicio · 2010-09-01T09:18:31.396Z · LW(p) · GW(p)

I think Russell was right that this is a powerful technique, but he was also naive about the heuristics & biases addendum to classical rationalism.

So he is recommending a technique that is very useful but also epistemically dangerous.

comment by ata · 2010-08-11T04:57:22.537Z · LW(p) · GW(p)

Gibberish is toxic; if you bend your mind to make sense of it, your whole mind warps slightly.

That is very well put.

comment by [deleted] · 2010-08-10T12:24:31.087Z · LW(p) · GW(p)

The most important thing I learned from this site:

If you suspect something is factually true, don't be afraid to believe it. It can't hurt you.

That's simple. Not easy to implement, but easy to express.

Replies from: Vladimir_M, apophenia, satt
comment by Vladimir_M · 2010-08-11T05:36:53.645Z · LW(p) · GW(p)

SarahC:

If you suspect something is factually true, don't be afraid to believe it. It can't hurt you.

This is true only assuming that all beliefs that you suspect might be factually true are respectable. Espousing disreputable beliefs -- and sometimes merely being suspected of harboring them -- can hurt you very badly, no matter how good your evidence for them is. Even if you manage to hide your dangerous thoughts perfectly, there is still the problem that duplicity is very unpleasant for most people, if only because it requires constant caution and self-discipline to watch your mouth.

Of course, this is irrelevant if there are absolutely no beliefs that a rational person might suspect to be true and that are at the same time disreputable to the point where expressing them might have bad repercussions. However, that's not what I observe in practice. Speaking as someone who happens to believe that some not very respectable views are factually true, or at least plausible, sometimes I can't help but envy people whose opinions are all respectable enough that they can relax and speak their mind openly in all situations.

(I raised the same point on OB a while ago.)

Replies from: None, steven0461, wedrifid, khafra, wedrifid
comment by [deleted] · 2010-08-11T11:43:15.620Z · LW(p) · GW(p)

Oh, I have the same thing. I do have some nearly disreputable views, and I have accidentally hurt people's feelings by airing them. (Pretty mild stuff: "Walmart's not so bad" and "Physical resurrection doesn't make sense.") Now I'm pretty much housebroken, although I worry like wedrifid that it shows in my facial expressions.

But. Would any of you really trade being well-informed for the convenience of not having to hold your tongue? I know I wouldn't.

Replies from: Vladimir_M, Eliezer_Yudkowsky
comment by Vladimir_M · 2010-08-11T22:03:13.422Z · LW(p) · GW(p)

SarahC:

Would any of you really trade being well-informed for the convenience of not having to hold your tongue? I know I wouldn't.

I'm curious whether you'd extend that principle to arbitrarily extreme hypothetical situations.

Imagine the most disreputable factual belief you can think of, and then suppose (for the sake of the argument) that there is in fact some strong evidence in favor of this or some equally disreputable view, which is however ignored or dismissed by all respectable people. Furthermore, suppose that if you find out about it and update your beliefs accordingly, this knowledge will not give you any practical benefit, but merely place you in a situation where your honest beliefs are closer to truth, yet extremely disreputable.

Mind you, we're not talking about your views merely causing some irritation or provoking heated arguments. We're talking about a situation where in most social and all professional situations, you are unable to look at people's faces without thinking that they would consider you an abominable monster unfit for civilized society if they knew your true honest thoughts. You have to live with the fact that people around you (except perhaps for a few close friends and confidants) respect you and are willing to work and socialize with you only insofar as they are misled about what you really believe and what you truly are.

Would you really prefer this outcome to staying blissfully ignorant?

Replies from: Leonhart, None, cata
comment by Leonhart · 2010-08-18T20:07:21.656Z · LW(p) · GW(p)

Well, yes. You mean you don't want to secretly have a powerful and dangerous dark side?

comment by [deleted] · 2010-08-12T00:31:41.474Z · LW(p) · GW(p)

Probably not. A sensible person ought to be willing to suffer for a few very important things... but very few. So a very disreputable belief ought also, in some way, to be very important to be worth believing. In practice, when a contentious issue also seems not very important (or not very relevant to me) I don't bother investigating it much -- it's not worth becoming disreputable for.

Replies from: steven0461, Vladimir_M
comment by steven0461 · 2010-08-12T00:37:27.409Z · LW(p) · GW(p)

Knowing whether disreputable beliefs are true is helpful in figuring out what intellectual institutions you can trust.

comment by Vladimir_M · 2010-08-12T00:58:29.584Z · LW(p) · GW(p)

This, however, means that your above comment is in need of some strong disclaimers. Unless of course it's directed at someone who lives in a society in which all highly disreputable beliefs happen to be false and outright implausible from an unbiased perspective. (But would you bet that this is the case for any realistic human society?)

comment by cata · 2010-08-18T20:15:07.737Z · LW(p) · GW(p)

I absolutely prefer that outcome. Aren't we all used to having to censor ourselves in all kinds of surroundings?

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T16:52:53.372Z · LW(p) · GW(p)

Would any of you really trade being well-informed for the convenience of not having to hold your tongue?

Well said. That's a 5-second response right there to quite a lot of people in the econoblogging community who think they're clever.

comment by steven0461 · 2010-08-11T23:02:03.978Z · LW(p) · GW(p)

On a related note, I think too few people realize that it's OK to sometimes hold beliefs that are mistaken in a strongly disreputable direction. If all your errors fall on the reputable side of the line, you're missing out on accuracy. In a noisy world, sufficiently asymmetric suppression of falsehoods is indistinguishable from suppression of truths.

Replies from: Will_Newsome, Will_Newsome, SilasBarta
comment by Will_Newsome · 2010-09-17T06:40:14.531Z · LW(p) · GW(p)

In a noisy world, sufficiently asymmetric suppression of falsehoods is indistinguishable from suppression of truths.

Twitter-worthy!

comment by Will_Newsome · 2011-07-17T07:07:42.697Z · LW(p) · GW(p)

^necrobump

comment by SilasBarta · 2010-09-17T18:20:03.931Z · LW(p) · GW(p)

In a noisy world, sufficiently asymmetric suppression of falsehoods is indistinguishable from suppression of truths.

Is there a name for this theorem? It seems like it follows from invariance of information content (passed through a noisy channel) under permutation of symbols.

comment by wedrifid · 2010-08-11T05:46:49.070Z · LW(p) · GW(p)

Even if you manage to hide your dangerous thoughts perfectly, there is still the problem that duplicity is very unpleasant for most people, if only because it requires constant caution and self-discipline to watch your mouth.

I agree and add that watching your mouth is not nearly enough. I, for example, am extremely good at watching my mouth when I am attempting to toe absurd party lines, but I am irredeemably poor at controlling all the minute details of body language that must go with it. The only reliable way for most people to tell lies with adequate sincerity is to lie to themselves first.

comment by wedrifid · 2010-09-17T07:22:20.333Z · LW(p) · GW(p)

Speaking as someone who happens to believe that some not very respectable views are factually true, or at least plausible

Awesome. Do tell! Allow me to join you in controversy.

comment by apophenia · 2010-08-11T02:11:27.488Z · LW(p) · GW(p)

This is the Litany of Gendlin.

comment by satt · 2010-08-12T07:08:47.269Z · LW(p) · GW(p)

This is technically true, in the sense that belief won't hurt me in and of itself. But beliefs inform our actions, and once the two are connected, beliefs acquire causal power to hurt me.

Replies from: Oligopsony
comment by Oligopsony · 2010-08-12T07:17:29.804Z · LW(p) · GW(p)

Also, we have a bias against overturning beliefs.

I think the folk epistemology implied in the distinction between words like "suspect," "think," "feel," "believe," and "know" is, on the whole, fairly useful. You can flatten them all into the word "believe" but you lose something. The dogma here is also to assign probabilities to your beliefs - the zoo of belief-verbs is just a cognitively cheap way of doing so.

comment by Richard_Kennaway · 2010-08-11T09:32:56.787Z · LW(p) · GW(p)

Someone gets in a car-crash and barely misses being impaled by a metal pole, and people say it's a million-to-one miracle

I like to reply to such accounts with "Luckier not to have been in the crash in the first place."

Replies from: xamdam
comment by xamdam · 2010-08-11T13:32:07.331Z · LW(p) · GW(p)

Totally. Theists love this error.

comment by steven0461 · 2010-08-11T01:17:26.115Z · LW(p) · GW(p)

Whenever you're uncertain about an issue where bias might play a role, ask yourself honestly what you would say if you knew that if you gave the wrong answer, rabid leopards would storm into the room and eat you.

Replies from: Spurlock
comment by Spurlock · 2010-08-11T16:58:25.057Z · LW(p) · GW(p)

It's too bad this probably can't be used effectively in argument. If you ask a theist whether God exists and add the leopard clause, he'll probably say "absolutely" and then use the lack of resultant leopards as evidence.

Still, for someone already interested in rationality looking only to correct himself, feels like a strikingly powerful technique.

Replies from: Unknowns, simplicio
comment by Unknowns · 2010-08-24T14:12:26.693Z · LW(p) · GW(p)

I remember reading in the news that one of those crazy guys who went into a school with a gun and started shooting people asked a few of them, "Does God exist," threatening to shoot them if they said yes. Some of them said yes anyway (and he shot them), so it looks like this method isn't going to stop people from believing in God.

Replies from: khafra, RobinZ
comment by khafra · 2010-08-24T15:25:04.662Z · LW(p) · GW(p)

The idea of a physical threat is the same; but the social context is radically different -- the school shooters were asking their victims to give up their tribal allegiance under circumstances similar to historical threats. Ideally, this question would be posed under conditions which divorced epistemic state from tribal identity as much as possible.

comment by RobinZ · 2010-08-24T17:04:22.234Z · LW(p) · GW(p)

I remember reading in the news that one of those crazy guys who went into a school with a gun and started shooting people asked a few of them, "Does God exist," threatening to shoot them if they said yes. Some of them said yes anyway (and he shot them), so it looks like this method isn't going to stop people from believing in God.

That could be an urban legend - it's the sort of story that martyr-happy adherents would be likely to fabricate and spread, and the story I find searching online (Columbine) only has one person asked that question, and after being shot.

Replies from: Unknowns
comment by Unknowns · 2010-08-24T17:15:05.390Z · LW(p) · GW(p)

No, I read it at the time of the event, in the regular news, although I don't remember the details well enough now to find it again.

Replies from: RobinZ
comment by RobinZ · 2010-08-24T17:22:56.758Z · LW(p) · GW(p)

That could be Columbine. In an earlier Salon article talking about the investigators preparing their report:

But cooperative sources quickly clammed up when questioned about the most celebrated Columbine story of all, immortalized this month in Misty Bernall's bestseller, "She Said Yes: The Unlikely Martyrdom of Cassie Bernall." "This is just too sensitive," a key source said, insisting on anonymity even for that statement. According to Misty Bernall's book, which has energized Christian youth movements around the world, the killers put a gun to her daughter Cassie's head and asked if she believed in God. When she said yes, they blew her away.

But while no one would go on the record, key investigators made it clear that an alternate scenario is far more likely: The killers asked another girl, Valeen Schnurr, a similar question, then shot her, and she lived to tell about it. Schnurr's story was then apparently misattributed to Cassie.

...and the article I linked previously followed up with this:

Wyant is the only living person who actually witnessed Bernall's death. She was hiding beneath a table right beside Cassie when it happened. "Emily was right there next to her, and in fact, she was looking right in her eyes, so you'd think she would be able to hear that, being right next to her, if anything was exchanged. And she can't remember anything being said," Wyant explained.

and this:

Salon News reported last Thursday that investigators believed the famous exchange actually took place between Klebold and Valeen Schnurr, and was mistakenly attributed to Bernall. Now Schnurr herself has confirmed that story. On Tuesday the Denver Post reported her account, which she also told to Salon News:

Schnurr was down on her hands and knees bleeding, already hit by 34 shotgun pellets, when one of the killers approached her. She was saying, "Oh, my God, oh, my God, don't let me die," and he asked her if she believed in God. She said yes; he asked why. "Because I believe and my parents brought me up that way," she said. He reloaded, but didn't shoot again. She crawled away.

comment by simplicio · 2010-08-14T01:09:59.943Z · LW(p) · GW(p)

Upping the emotional ante sometimes works. "What if your daughter's life was at stake?" Kind of a cheap tactic though.

Replies from: khafra
comment by khafra · 2010-08-24T13:59:08.124Z · LW(p) · GW(p)

This is going to sound silly, but I've had some success with wild swings in emotional ante. "What if you and your family were going to be abducted and tortured, and then have your skin sewn into suits by a psycho if you got it wrong... Ok, well, what if I had a really tasty-looking donut that I were going to give you if you got it right?"

It's a bit disingenuous, since I get the emotional impact of seriously proposing something untoward and then get to say "just kidding." But if it's worth it, take a walk on the dark side.

comment by NancyLebovitz · 2010-08-10T08:27:57.004Z · LW(p) · GW(p)

No method of explanation should be considered good unless it's been tested on a number of ordinary people.

My impression is that the best way to explain Goodhart's Law is to bring up employee incentive plans which don't have the effect the employer was hoping for.

comment by Darmani · 2010-08-10T02:35:52.596Z · LW(p) · GW(p)

Candidate 1: "If a trillion trillion trillion people each flip a hundred coins, someone's going to get all heads." ("If a trillion people each flip a billion coins" might be a stronger meme, though extremely inaccurate.)

Candidate 2: "Knowing the right answer is better than being the first to argue for it."

Candidate 3: "If it moves, you can test it."

Replies from: sketerpot, wedrifid
comment by sketerpot · 2010-08-10T03:17:46.730Z · LW(p) · GW(p)

Those are catchy! Of course none of those is an explanation that most people can use -- the inferential distance is pretty big -- but they'd make great sound-bite segues to a slightly longer explanation.

If you just hit someone with a zinger like that, it'll feel to them that you're just scoring points, and they might get annoyed; but if you use it as the start of a discussion, that's likely to be perceived as more respectful.

comment by wedrifid · 2010-08-10T10:48:45.120Z · LW(p) · GW(p)

I like 1 and 3, but I'm dubious about 2. I am not convinced that it is true in the case of most humans. I'd like it to be, but most people live sufficiently in a social reality that actually being right is not all that important.

Replies from: NihilCredo
comment by NihilCredo · 2010-08-11T15:28:46.368Z · LW(p) · GW(p)

"Learning the right answer is better than having come up with the wrong one"?

comment by b1shop · 2010-08-10T20:08:32.453Z · LW(p) · GW(p)

I'd like to come up with something meme-sized about curiosity stoppers. How about:

When you pretend to know an answer, you're wrong twice.

It doesn't get the subtleties across, but it might be enough to gain a foothold in the average person's mind.

Replies from: Spurlock
comment by Spurlock · 2010-08-11T17:00:49.779Z · LW(p) · GW(p)

You might be underestimating just how much curiosity-stoppers feel like actually knowing an answer. I still catch myself reading Wikipedia articles just up to the point where they confirm what I thought. Your meme would have to imply just how difficult it is to notice this in yourself.

Replies from: b1shop
comment by b1shop · 2010-08-15T10:10:44.218Z · LW(p) · GW(p)

I imagine the best a meme can do in this case is convince the host it's wrong to succumb to the bias. That'll lay the ground for change in the future.

comment by bentarm · 2010-08-12T11:50:23.638Z · LW(p) · GW(p)

As others in this thread have mentioned (but it bears repeating), if you want to convince people that they are affected by cognitive biases, sometimes you have to really hit them over the head with them.

I've found the examples in Hindsight Devalues Science and the basketball video (I don't imagine there are many people here who haven't seen it, but I'll not post a spoiler just in case) are particularly effective at this. I guess calibration tests would also be good on this metric. Once you've pointed out a "cortical illusion" like this, most people are genuinely interested in how it works.

On the other hand, I've found that explaining results where you have to say "studies show that people tend to be biased in direction y" is much harder than explaining results where you can say "remember that experiment we just did, where you were biased in direction y?".

E.g., it's nearly impossible to convince someone that they would anchor their guess of how many African countries are in the UN to the last two digits of their phone number. Even if you tell them that everyone does it, they'll assume they're an exception.

Any more examples of biases that you can "hit people over the head with" in this manner?

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-08-12T12:40:45.503Z · LW(p) · GW(p)

People will often admit that they'll walk across the road to save $10 on the cost of a $20 memory card but not a $2000 plasma TV.

Replies from: Alicorn
comment by Alicorn · 2010-08-12T19:25:55.345Z · LW(p) · GW(p)

People who are spending $2000 probably value their time more highly than people who are spending $10-20, ceteris paribus. It might be less expensive for the second buyer to cross the street. (Even if it's the same person on a different day or in a different frame of mind.)

Replies from: datadataeverywhere
comment by datadataeverywhere · 2010-09-01T19:16:25.872Z · LW(p) · GW(p)

That's exactly what doesn't make sense; asking the same people whether they'd walk across the street to save money on X should depend on how much they value their time, not on how much they value X. It isn't rational for there to be states of mind where buying more expensive things makes people value their time more when the rest of the environment is identical.

comment by TedW · 2010-08-11T18:52:02.158Z · LW(p) · GW(p)

Here's something that comes up in many, many discussions of climate change and anything else where a lot of arguments come from models or simulations: sometimes you have to do the math to make a valid (counter-)argument.

Example:

A: ...And so you, see, as CO2 increases, the mean global temperature will also increase.

B: That's bullshit, and here's why: as CO2 increases, there will be more photosynthesis -- and the increased plant growth will consume all that extra CO2.

Another example (the one that motivated this comment):

A: And so, as long as the bus is carrying six or more passengers, it'll be more efficient than the passenger-equivalent number of cars.

B: That's bullshit! Buses are ten times heavier than cars, so it's got to be ten or more bus passengers.

People often think that in discussions of quantitative phenomena, it's enough to make arguments based purely on directional drivers/phenomena, when really the magnitudes of those drivers are hugely important. Of course there are negative feedbacks, countervailing forces, etc., but (a) usually they're already dealt with in the original model and so B isn't telling anyone anything new, and (b) magnitude matters.

Replies from: PhilGoetz
comment by PhilGoetz · 2010-08-11T21:14:45.588Z · LW(p) · GW(p)

I believe that in the first example, "A" is supposed to be right. In the second example, is "A" or "B" supposed to be right? B is doing the math, but assumes that fuel required is proportional to mass, which is wrong, due at least to engine size and air resistance. (Consider the (mass x miles)/gallon of a 2005 RST1000 Futura motorcycle (565 x 42 = 23730), a Smart car (1808lb x 40mpg = 72320), a 2010 Honda Civic DX-VP (2709 x 36 = 97524), and a 2010 Toyota Camry SE (3329 x 33 = 109857). All MPG are EPA highway estimates.)
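For anyone who wants to redo that arithmetic, a quick sketch (using only the figures quoted above):

    # (mass x miles)/gallon for the vehicles listed above; masses in lb,
    # MPG figures are the EPA highway estimates from the comment.
    vehicles = [
        ("2005 RST1000 Futura motorcycle", 565, 42),
        ("Smart car", 1808, 40),
        ("2010 Honda Civic DX-VP", 2709, 36),
        ("2010 Toyota Camry SE", 3329, 33),
    ]
    for name, mass_lb, mpg in vehicles:
        print(name, mass_lb * mpg, "lb-miles/gallon")
    # The heavier vehicles move more mass per gallon, so fuel use is
    # clearly not proportional to mass.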

By default, I expect your examples to take the same form (e.g., the counterargument is right in both cases, or wrong in both cases). Deviations from that pattern should be pointed out. Cases where doing math does not qualify as "doing the math" due to incompleteness should be pointed out.

(BTW, reminds me that Brad Templeton, founder of rec.humor.funny and the Oracle, gave a talk at the 2009 Singularity Summit in which he showed data claiming that mass transit typically has the same fuel efficiency as a car with 1.5-3 people in it. Because a mass transit trip (at least the ones I take) usually requires you to travel a longer distance than you would by car, mass transit loses to one person in a fuel-efficient car for fuel efficiency. And the cost of mass transit is much higher per person-mile, and the time taken is about double (in the DC metro area). These facts combined suggest that mass transit is neutral or bad for the environment, bad for the passenger, and bad for the economy.)

Replies from: TedW
comment by TedW · 2010-08-12T01:19:41.249Z · LW(p) · GW(p)

I'd meant A to be right in both cases. And of course -- against my own remonstration -- I did none of the math myself. I was unfamiliar with the Templeton data. I looked it up, and it's interesting. I'd note that while Templeton agrees that transit (by the system, not by the fully utilized vehicle) is less efficient than fuel-efficient personal transportation, he still thinks people should make use of existing transit systems.

I ride a bike.

comment by [deleted] · 2010-08-13T19:59:10.956Z · LW(p) · GW(p)

Think about your judgments of confidence in terms of frequencies instead of probabilities - our frequency intuitions tend to be much closer to reality. If you estimate that you're 90% sure of something, ask "if I faced ten similar problems, would I really get nine of them right?"
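For a sense of how noisy that ten-problem check is, here's a quick computation (my own sketch, assuming independent problems of equal difficulty):

    from math import comb

    # If you really are 90% accurate per problem, how often would ten
    # similar problems give you at least nine right?
    p = 0.9
    at_least_nine = sum(comb(10, k) * p**k * (1 - p)**(10 - k) for k in (9, 10))
    print(round(at_least_nine, 3))  # ~0.736

Even perfect calibration clears the "nine of ten" bar only about three times in four, so treat the question as a gut check rather than a precise test.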

comment by mstevens · 2010-08-11T15:19:25.689Z · LW(p) · GW(p)

Counter-argument:

"Less Wrong tends toward long articles with a lot of background material. That's great, but the vast majority of people will never read them. What would be useful for raising the sanity waterline in the general population is a collection of simple-but-useful rationality techniques that you might be able to teach to a reasonably smart person in five minutes or less per technique."

Possible alternative angle of attack - get people to read longer articles. Promote things that increase attention span, for example. Admittedly, you then need to return to this topic, as you need short simple ways to advocate for whatever you think increases attention span (personally I think meditation looks promising).

Replies from: sark
comment by sark · 2010-08-13T07:29:58.762Z · LW(p) · GW(p)

Increased attention span is certainly a good thing to have. But one must be wary of insisting on a difficult path to enlightenment.

Morality can sometimes be more about how moral you are and how the rest of the world is not, than about actually doing good. If it is about signaling how good you are, then a costly signal would be preferable, since it is hard for infidels to fake. But if you want good things to happen, then you should strive toward making good acts as easy to accomplish as possible.

Short attention span seems to be a general problem, but it is not a general problem of which irrationality is a special case. The case here is distinct from the case of religion and raising the sanity waterline.

We might want to solve the problem of having a short attention span, but let us not pretend that this will automatically solve, or even simply be the deciding factor in solving, the problem of irrationality.

Replies from: mstevens
comment by mstevens · 2010-08-13T14:07:58.745Z · LW(p) · GW(p)

There's room for debate here in my book, but my argument is:

  • Rational arguments are often complicated and require attention to detail.
  • Many people have problems with complicated arguments that require attention to detail.

We can try to deal with this in two ways:

  • Making the arguments simpler.
  • Dealing with the problem of people not following detailed arguments (thus my earlier comment)

I think both look like promising lines of attack. It is, of course, always desirable to keep arguments as simple as possible.

Replies from: sark
comment by sark · 2010-08-13T14:22:37.080Z · LW(p) · GW(p)

I think what we should do is to try getting a foot in the door. We want to intrigue people enough such that they will seek further knowledge of rationality. People have the capacity for attention if they want something badly enough.

Replies from: mstevens
comment by mstevens · 2010-08-13T14:28:26.868Z · LW(p) · GW(p)

These people will not yet be very rational (by definition of target audience). Therefore they are likely to judge arguments on emotional grounds.

So I suggest that we need to find short arguments that promote rationality, but make an essentially emotional case for it. Ideally one would find something that overlaps - it persuades at both the emotional and rational levels.

comment by katydee · 2010-08-11T03:30:37.318Z · LW(p) · GW(p)

I think "unknown unknowns" is a good one for this sort of thing. My attempt follows:

We know a lot of things, and generally we know that we know them. These are "known knowns." I know that 1+1 = 2, I know that the year is 2010, and so on.

We also don't know a lot of things, but generally we know that we don't know them-- for example, I don't know the hundredth digit of pi, I don't know how to speak Chinese, and I don't know what stocks are going to do well next year. All of those things are "known unknowns," or unanswered questions. However, because we know what the questions are, it's possible for us to solve them, or at least approach some kind of solution, if we anticipate needing one. If I knew I was going to be quizzed on the hundredth digit of pi, I could look it up or calculate it; if I knew I was going to have to speak Chinese, I could buy a course on it and at least learn the basics; if I knew I was going to need to make stock predictions, I could look at market trends and try to extrapolate what might happen in the future. The fact that I know that I don't know these things allows me to take action to correct that lack of knowledge. So while known unknowns can be bad, we can at least plan around them and minimize their potential impact on our lives.

However, there are also things that we not only don't know, we don't know that we don't know them. There are questions out there that we haven't even considered or thought about. Not only do we not know what the answers to the questions are, we don't even know the questions themselves. These things are "unknown unknowns," and they are very, very dangerous, because there's no way we can plan for them.

For example, Japan in late World War II had a plan for fighting the US if it invaded Japan with ground troops, and Japan also had a plan for fighting the US if it bombed Japan with conventional air raids. But Japan did not have a plan for fighting the US if it bombed Japan with nuclear weapons, because Japan did not know that nuclear weapons existed, much less that the US actually had them. Nuclear weapons, for Japan in World War II, were an unknown unknown.

Replies from: rhollerith_dot_com, NancyLebovitz
comment by RHollerith (rhollerith_dot_com) · 2010-08-12T03:28:21.217Z · LW(p) · GW(p)

Japan did not have a plan for fighting the US if it bombed Japan with nuclear weapons, because Japan did not know that nuclear weapons existed, much less that the US actually had them.

Japan had a research program into nuclear weapons, but they ran into what they considered an insurmountable hurdle, which they believed would stop the US, too. Something to do with the lack of industrial capacity (electricity??) needed to produce enough fissionable material if memory serves.

Replies from: gwern
comment by gwern · 2010-08-12T04:45:52.555Z · LW(p) · GW(p)

If memory serves, both the Japanese and German nuclear weapons programs made a subtle mistake with the cross-section of uranium atoms (or something like that), and wound up calculating that critical mass would be something like a ton of enriched uranium, and so not a useful weapon within WWII's timeframe.

(I read about this while also reading Copenhagen, but I can't remember what book. IIRC, Heisenberg claimed he had made this mistake deliberately and this was evidence that he wasn't cooperating whole-heartedly with the Nazis, but the countercharge is that he seemed as astounded as the rest of the German physicists in custody when told of Hiroshima & Nagasaki.)

Replies from: Douglas_Knight
comment by Douglas_Knight · 2010-08-12T05:28:45.302Z · LW(p) · GW(p)

Is a duplicated error evidence for or against sabotage?

Heisenberg did not claim to have sabotaged it. Wikipedia claims that the story comes from selective quotation of the last letter here. But, when the bomb was announced, the imprisoned Heisenberg's reaction of frantic work is suspicious to me: it suggests that he knew where the mistake was and wanted to go back and do the work he had blocked (but I don't know the details; maybe he was working on something independent of the mistake).

Replies from: gwern
comment by gwern · 2010-08-12T06:19:52.495Z · LW(p) · GW(p)

Well, for 2 physicists of equal competency, differing results would suggest sabotage, since for both to give the wrong answer suggests that either they are not good enough to get the right answer at all, or they both got the answer and simultaneously decided to sabotage. Heisenberg was great, though, surely greater than anyone on the Japanese project; so I tend to regard the net as a wash, and focus more on Heisenberg's reaction - which as I said suggests he genuinely made a mistake and was not engaged in passive resistance, and his surprise & flurry of activity was a give-away.

No numbers, unfortunately. But I did notice:

We did not know a process for obtaining of 235-Uranium with the resources available under wartime conditions in Germany, in quantities worth mentioning. Even the production of nuclear explosives from reactors obviously could only be achieved by running huge reactors for years on end.

Of course, for a few kilograms of enriched uranium or plutonium, you don't really need huge reactors running for years and years - the hard part is enrichment. Yesterday I was reading a history of modern Korea, and North Korea obtained enough plutonium for a bomb or 3 by running a 20 or 50 megawatt reactor for 2 or 3 years, IIRC. But perhaps by Heisenberg's 1940s standards such a reactor is beyond huge.

comment by NancyLebovitz · 2010-08-11T14:30:05.974Z · LW(p) · GW(p)

(Factual correction) The US didn't have nuclear weapons when Japan started the war.

(Mulling the topic) Not only that, but I think "the other side won't come up with a superweapon" is generally the way to bet, though perhaps less so than it used to be.

I thought radar was invented for WWII, but it's not that simple.

Maybe I've missed something, but I don't think there's been anything but incremental improvement in war tech since WWII-- nothing really surprising.

Replies from: thomblake
comment by thomblake · 2010-08-11T14:37:41.959Z · LW(p) · GW(p)

I thought radar was invented for WWII, but it's not that simple.

It's close enough - as that page notes, what we know as RADAR was developed during the war. That's also when Norbert Wiener developed the first radar-integrated guns.

Maybe I've missed something, but I don't think there's been anything but incremental improvement in war tech since WWII-- nothing really surprising.

It really depends what you call "incremental", and what sorts of increments you're looking at. We have robots with guns!

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-08-11T14:47:16.881Z · LW(p) · GW(p)

If the standard is nukes and radar, then only things which leave the other side saying "how is that even possible?" or "that came out of nowhere" count as surprising.

Robot drones are not surprising. I'm pretty sure invisibility tech would not be surprising. Anti-gravity would be surprising.

Replies from: Drahflow, LucasSloan, katydee
comment by Drahflow · 2010-08-11T23:35:48.478Z · LW(p) · GW(p)

The decreasing frequency of surprising technological advances is caused by the public being informed about scientific advances faster and more frequently.

If the rate of news consumption grows faster than the rate of innovation produced, the perceived magnitude of innovation per news item will go down.

comment by LucasSloan · 2010-08-12T08:27:38.217Z · LW(p) · GW(p)

How many people, even ones as smart as us, correctly predicted, in say 1935, the sorts of wonder weapons that the intense research pressures of a world war would create? If we're talking about surprising sorts of weapons, I expect not to have been exposed to them, or if I have, to have rejected them out of hand.

comment by katydee · 2010-08-11T23:08:48.583Z · LW(p) · GW(p)

It is difficult for me to conceive of military technology that is:

a) potentially surprising
b) powerful enough to make a big difference
c) near-future

"Rods from God" might count, if they exist, but they're not surprising. The best example I can think of is strong memetic warfare, but I'm not confident that will be developed in the near future (or indeed ever).

comment by Paul Crowley (ciphergoth) · 2010-08-10T10:15:43.122Z · LW(p) · GW(p)

If I only have a few minutes, I tell people to study cognitive bias, in the hope that surely any intelligent person can see that understanding what science has to say about the systematic, predictable failings of our own brains can hardly fail to be useful. You need long enough to impart the caution that you have to apply these things to yourself, not just to other people...

Replies from: RobinZ, sark
comment by RobinZ · 2010-08-10T15:48:19.479Z · LW(p) · GW(p)

I agree, and I think Yudkowsky's suggestions in Knowing About Biases Can Hurt People are appropriate here:

I endeavor to learn from my mistakes. The last time I gave a talk on heuristics and biases, I started out by introducing the general concept by way of the conjunction fallacy and representativeness heuristic. And then I moved on to confirmation bias, disconfirmation bias, sophisticated argument, motivated skepticism, and other attitude effects. I spent the next thirty minutes hammering on that theme, reintroducing it from as many different perspectives as I could.

I wanted to get my audience interested in the subject. Well, a simple description of conjunction fallacy and representativeness would suffice for that. But suppose they did get interested. Then what? The literature on bias is mostly cognitive psychology for cognitive psychology's sake. I had to give my audience their dire warnings during that one lecture, or they probably wouldn't hear them at all.

Whether I do it on paper, or in speech, I now try to never mention calibration and overconfidence unless I have first talked about disconfirmation bias, motivated skepticism, sophisticated arguers, and dysrationalia in the mentally agile. First, do no harm!

comment by sark · 2010-08-12T03:38:05.015Z · LW(p) · GW(p)

Yes, but before people will go and study cognitive bias, they have to be convinced that it exists in the first place! Most people are not already familiar with the idea that our minds systematically fail us.

I think the best way to introduce the idea would be to present a striking case of bias (pervasiveness + impact), then let them know that there are many, many others.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T16:13:04.128Z · LW(p) · GW(p)

I use the conjunction fallacy for my first illustration.

Replies from: TedW
comment by TedW · 2010-08-12T16:31:18.056Z · LW(p) · GW(p)

Seems to me that all that would do is reinforce someone's opinion that probability theory is irrelevant to the real world.

I personally would start with confirmation bias, partly because there are lots of clear examples in pop culture. Like: last night I was watching a rerun of "Glee." Will Schuester, a teacher and the glee-club advisor, is trying to quash a student's crush. He sings her (Rachel) a medley of songs in which the singer is trying to deflect a much younger woman's advances. (Both songs -- "Don't Stand So Close to Me" and "Young Girl" -- are actually about the singer unsuccessfully trying to resist the temptation of the younger woman, but in the episode the lyrics are changed and edited so that they ostensibly work.) So he sings, and the whole time Rachel is clearly hearing the opposite of the intended message. After the song, Will asks Rachel what his message was, and she says, almost giddily, that his message was clear: "I'm very young and it's hard for you to stand close to me."

comment by MartinB · 2010-08-10T03:13:27.911Z · LW(p) · GW(p)

*Candidate 1 requires an intuitive understanding of probability -- fat chance.

*Candidate 2 would require a rewiring of how humans perceive status.

*Candidate 3 just does not work. Talk with people about the image they have of what scientists do all day. It is bad. Especially when you go to New Agers.

Maybe you assume that people have a consistent world view, or at least the desire to have one, but no. Please try the proposals on real people, and report back. I expect you to run into the problem that objective truth is widely not accepted, and few care to have correct beliefs about the world anyway.

Here are my suggestions:

  • rate topics in money, as a way to figure out how much time to spend on decision making.

Spend more time on expensive or repeated expenses, less time on cheap single ones.

  • seek out good counterarguments for whichever position you find

That saved me a few times. Look for rebuttals habitually, to avoid confirmation bias. It gets harder with fringe topics.

  • write things down

That's not so much rationality as practical advice.

  • checklists

They are easy to explain and lead to better results. They can be tweaked in some ways to make them even better.

  • habit of reviewing after an event => learning from mistakes

How to make new mistakes instead of repeating the old ones. Learning from experience in a more efficient way.

There are some articles here that can be summed up in short. But that will probably not work, because interest in techniques is not particularly widespread. You are subject to inferential distance: when an article seems trivial to you after a few reads and ponderings, that indicates that you understood it. Also there is the annoying uncanny valley of rationality, where a little of it just hurts, or gets applied wrong (ever argued with a high-IQ person who happens to be religious?).

What are your own favorite techniques that you actually apply?

Replies from: sketerpot, DuncanS
comment by sketerpot · 2010-08-10T03:49:34.281Z · LW(p) · GW(p)

Candidate 2 would require a rewiring of how humans perceive status.

I know it's possible, since I've rewired myself in this way, and it wasn't particularly difficult. Am I really that weird?

Candidate 3 just does not work. Talk with people about the image they have of what scientists do all day. It is bad. Especially when you go to New Agers.

You don't have to use the word "science". As Darmani put it, "If it moves, you can test it." Follow up with an explanation of why a particular claim is testable, and how to test it. For example, if someone claims that he can tell the difference between an empty water jug and a full water jug with a dowsing rod, then it's easy enough to test it.

I've used this exact approach on quite a few people, and it seems to do a pretty good job of banishing their claims that whatever we're arguing about is untestable. I wouldn't bet on them generalizing this lesson, though.

seek out good counterarguments for whichever position you find

The sticking point here is "good". Most people settle for really crappy counterarguments, including straw-man counterarguments concocted by people who agree with them.

Replies from: sark
comment by sark · 2010-08-12T03:49:23.164Z · LW(p) · GW(p)

I know it's possible, since I've rewired myself in this way, and it wasn't particularly difficult. Am I really that weird?

It's not really about rewiring yourself. Your status depends on how others perceive you. The easiest way to have truth-seeking, rather than winning arguments, confer higher status is to move to a community with such norms, such as Less Wrong.

But we are trying to convince the general public of rationality here. So until most people have peers who already value truth over winning arguments, Candidate 2 will face significant challenges.

comment by DuncanS · 2010-08-10T11:26:17.962Z · LW(p) · GW(p)

*Candidate 1 requires intuitive understanding of probability, fat chance.

I agree that an intuitive understanding of probability isn't likely to happen. But what you can do is train yourself to recognise at least some of the situations where your intuitive system is going to mess it up. Hopefully next time you see something and think "What a fantastic coincidence!", your next thought will be "Nice, but remember all the other fantastic coincidences that might have happened and didn't." instead of "My life is so improbable it must have been orchestrated by some unseen force."

comment by SilasBarta · 2010-08-10T02:59:25.073Z · LW(p) · GW(p)

My idea would be to give a truncated version of a point made in Truly Part of You.

The different sound-bite ways to say it are:

  • True knowledge regenerates.
  • Only believe something once you recognize how you would learn it some other way.
  • Your beliefs should be unaffected by your choice of labels.

Low inferential distance explanation: When learning about something, the most important thing is to notice what you've been told. Not understand, but notice: what kinds of things would you expect to see if you believed these claims, versus if you did not? Are you being told about some phenomenon, or just some labels? Once you've noticed what you're being told, think about how it plugs in with the rest of your knowledge: what implications does this body of knowledge have for other fields, and vice versa? What discoveries in one area would force you to believe differently in the others? When you can answer these questions, you have a meaningful, predictive model of the world that can be phrased under any choice of labels.

(Aside: When you are at the stage where most of your knowledge regenerates, I call that the highest level of understanding.)

Btw, I had seen this in open thread and been thinking about a response, and this is what I settled on.

Replies from: Strange7, thomblake
comment by Strange7 · 2010-08-11T14:00:41.923Z · LW(p) · GW(p)

"Wisdom is like a tree. Cut away the pages of predictions, the branches, even the roots, but from a single seed the whole structure can be rebuilt.

Foolishness is like a heap of stones. Stack them up however you please, paint bright colors to catch the eye, call them ancient or sacred or mysterious, and yet a child could scatter them."

Replies from: SilasBarta
comment by SilasBarta · 2010-08-11T14:19:06.208Z · LW(p) · GW(p)

Very well said! Is that your own phrasing?

Replies from: Strange7
comment by Strange7 · 2010-08-11T16:00:58.154Z · LW(p) · GW(p)

It is.

If I were to make a top-level post on how to rephrase truthful things to sound like mysticism or poetry, how many times do you think it would be downvoted?

Replies from: Eliezer_Yudkowsky, SilasBarta, SilasBarta, simplicio, wedrifid
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T16:47:34.118Z · LW(p) · GW(p)

People seemed to like Twelve Virtues of Rationality and Harry Potter and the Methods of Rationality.

Replies from: Strange7
comment by Strange7 · 2010-08-13T07:32:38.798Z · LW(p) · GW(p)

Yes, but those are polished outputs, and (no offense) have your halo-effect to back them up. I'm talking about sketching in a more generalized algorithm which accepts highly technical explanations as input, and produces output which a member of the general public would intuitively recognize as 'wise,' while retaining the input's truth-value.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-13T20:20:35.705Z · LW(p) · GW(p)

There are algorithms for that? My brain just does it automatically on request.

(Also, I presented HPMOR to a new audience with my name stripped off just to check if people still liked what I wrote without the halo effect.)

Replies from: Strange7
comment by Strange7 · 2010-08-14T08:12:31.122Z · LW(p) · GW(p)

Of course there are algorithms. The question is whether they have been adequately documented yet.

comment by SilasBarta · 2010-08-11T16:05:56.387Z · LW(p) · GW(p)

It's not the poetry that's the problem, it's the mysticism. Your quote sounds like the former, not the latter.

Or maybe "ancient wisdom" is the right term to describe what your version sounds like -- but the point is, it tells people why to think some way, and if they endorse it, they endorse a good truth-seeking procedure for the right reason, which is the important part.

comment by SilasBarta · 2010-08-15T01:08:43.394Z · LW(p) · GW(p)

By the way, I had googled "wisdom is like a tree" before asking you, and it didn't seem to turn up any existing quotations. It surprised me that no one had famously compared wisdom to a tree -- not in a positive sense, anyway.

It's a good analogy, and -- if you're into that kind of thing -- you can extend it even further: trees (can) yield fruit, the seed stays dormant if it's not in an environment that lets it grow, all the seeds take a similar path when expanding ...

Replies from: Strange7
comment by Strange7 · 2010-11-18T21:59:41.381Z · LW(p) · GW(p)

That's only a negative sense if you're working with the assumption that the biblical God is a good guy, an assumption which (given the sheer volume of genocide He committed personally, through His direct subordinates, or demanded of His human followers) simply does not hold up to scrutiny for any widely-accepted modern standard of 'good.' I mean, look at Genesis 3:22 if nothing else.

comment by simplicio · 2010-08-14T01:46:19.326Z · LW(p) · GW(p)

I say do it. Literary style is a huge obstacle to the dissemination of skepticism.

comment by wedrifid · 2010-08-12T17:25:35.780Z · LW(p) · GW(p)

If I were to make a top-level post on how to rephrase truthful things to sound like mysticism or poetry, how many times do you think it would be downvoted?

-13. (Well, actually I estimate 18 upvotes and 5 downvotes leaving effectively -13 downvotes).

comment by thomblake · 2010-08-10T14:27:06.889Z · LW(p) · GW(p)

You claim to be good at explaining things. If you have time, you should take a crack at some more short explanations of things.

Replies from: SilasBarta
comment by SilasBarta · 2010-08-10T14:46:12.491Z · LW(p) · GW(p)

I agree. I'm taking suggestions for notoriously difficult rationalist concepts (including information-theoretic ones) that are regarded as difficult to explain, or as having a high inferential distance.

I'm working on some articles related to that, but I'd be more interested in what topics others think I should try explaining better than standard accounts.

comment by Morendil · 2010-08-11T04:20:32.051Z · LW(p) · GW(p)

Show someone the gorilla video, or another of the inattentional blindness tests.

Telling someone their brain is a collection of hacks and kludges is one thing; showing them, having them experience it, is on another level altogether.

Relatedly, my favorite quote from Egan's Permutation City: "You have to let me show you exactly what you are."

Replies from: novalis
comment by novalis · 2010-08-11T16:08:57.397Z · LW(p) · GW(p)

Another classic example of the brain's hackishness, which does not seem to have been mentioned here before, is the sentence, "More people have been to Russia than I have." If you say this sentence to someone (try it!), they'll at first claim that it was a perfectly reasonable, grammatical sentence. But when you ask them what it means, they'll start to say something, then stop, look confused, and laugh.

(Yes, there is a parsing of "have" as "possess", but this is (a) precluded by inflection, and (b) not ever what someone initially comes up with).

Replies from: bentarm, Eliezer_Yudkowsky, komponisto, Oscar_Cunningham
comment by bentarm · 2010-08-12T11:24:55.767Z · LW(p) · GW(p)

"More people have been to Russia than I have."

Does this test not work when written down? Or am I unusual? The sentence jarred immediately on the first reading, and I went back and read it about three times to try and figure out if it could have any meaning at all before carrying on to the rest of the paragraph.

Replies from: novalis, wedrifid
comment by novalis · 2010-08-12T16:04:35.643Z · LW(p) · GW(p)

I have never before attempted to transmit it in writing, and I'm not a linguist. But apparently it works at least somewhat for at least some people (see Oscar_Cunningham below). Still, I'm sorry to have spoiled for you the effect of hearing it.

comment by wedrifid · 2010-08-12T14:47:18.219Z · LW(p) · GW(p)

Same experience here. I read it through a few times to see whether it was ungrammatical or just weird. I got a feeling of mental reward when my confusion dissolved and the actual possible meaning clicked. It would take a particular kind of brain for someone to phrase a sentence that way.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T16:49:39.837Z · LW(p) · GW(p)

Ooh, same embedded system crasher as "I couldn't fail to disagree with you less."

Replies from: roryokane
comment by roryokane · 2010-08-14T01:38:32.279Z · LW(p) · GW(p)

I don’t see how that is a system crasher sentence. I think I can successfully parse it as “I must succeed in agreeing with you more”. Yes, it takes a while to figure out the meaning because turning each negative into a positive is a separate step, but there is a meaning in the end, unlike the sentence about Russia.

comment by komponisto · 2010-08-12T15:01:10.256Z · LW(p) · GW(p)

Yes, there is a parsing of "have" as "possess", but this is (a) precluded by inflection,

You picked a particularly bad context in which to confuse inflection with intonation (one of my greatest pet peeves).

Replies from: wedrifid, Vladimir_M, novalis
comment by wedrifid · 2010-08-12T15:16:32.357Z · LW(p) · GW(p)

Wow. That difference is new to me. Thanks, I'll remember that!

comment by Vladimir_M · 2010-08-13T01:13:51.446Z · LW(p) · GW(p)

If we're going to be really precise, wouldn't the difference here be a matter of grammatical stress rather than intonation?

Replies from: komponisto
comment by komponisto · 2010-08-13T01:39:42.543Z · LW(p) · GW(p)

"Grammatical stress" isn't a technical term, as far as I know. In any event, the phenomenon we're discussing here is the grammatical function of a word being communicated by the intonation pattern (as well as, probably, the speed pattern) of the sentence in which the word occurs.

Replies from: Vladimir_M
comment by Vladimir_M · 2010-08-13T02:16:47.743Z · LW(p) · GW(p)

komponisto:

"Grammatical stress" isn't a technical term, as far as I know.

I am not a linguist, but I've seen the term "grammatical stress" used to denote situations where the stress of a word is determined by its syntactic context, and where a difference in stress may imply a different syntactic structure of the sentence. This is in contrast to lexical stress, which is a context-independent property of each word, and intonation, whose variation doesn't affect the syntactic structure, but merely changes things at the level of pragmatics.

Now that I've googled around a bit, I see that these terms aren't really standardized, and authors who use them typically make sure to include their favored definitions to avoid confusion. If you use "intonation" also for what I call "grammatical stress" above, then fair enough. (And for all I know, such usage might indeed be more common.)

Still, I think the contrast I have in mind is worth pointing out. In the above example, the difference in stress implies a different syntactic structure -- "have" can either be a complete verb phrase, or just an auxiliary verb referring to an antecedent (i.e. a verb phrase ellipsis). This is different from situations where changing intonation affects only pragmatics.

Replies from: komponisto
comment by komponisto · 2010-08-13T03:16:54.637Z · LW(p) · GW(p)

I'm not sure it's a good idea to restrict the use of "intonation" to describing pitch patterns that don't convey syntactic information. I suppose if one did that, one would have to simply say "pitch" for what we are talking about here, unless there's another term available.

Replies from: Vladimir_M
comment by Vladimir_M · 2010-08-13T03:45:58.178Z · LW(p) · GW(p)

Come to think of it, you're right. It makes sense to define "intonation" in purely phonetic terms (i.e. as pitch variation), and in that sense, it's certainly present here. It is possible that I got a mistaken idea about the common technical meaning of this term in my amateurish forays into these subjects.

comment by novalis · 2010-08-12T16:08:20.329Z · LW(p) · GW(p)

I meant inflection: "Alteration in pitch or tone of the voice." But to avoid confusion in the future, I will try to use the linguist's definitions of these words, since they're more precise.

Also, the Wikipedia article suggests that tone rather than intonation might actually be the correct word, since there is a semantic difference.

Replies from: komponisto
comment by komponisto · 2010-08-13T00:45:31.790Z · LW(p) · GW(p)

I will try to use the linguist's definitions of these words, since they're more precise.

Thank you.

Also, the Wikipedia article suggests that tone rather than intonation might actually be the correct word, since there is a semantic difference.

No; "tone" refers to a phenomenon in certain languages (most famously Chinese) wherein otherwise identical words are distinguished from each other -- in isolation, nothing to do with their placement in a sentence -- by the contour of one's voice when pronouncing them. The kind of contextual variation of pitch that you are talking about -- intonation -- is pretty much universal to human speech in all languages.

Replies from: novalis
comment by novalis · 2010-08-13T17:39:55.547Z · LW(p) · GW(p)

Wikipedia says:

English intonation may become semi-lexicalized in common expressions such as "I'unno" (I don't know), and therefore starts to approach the domain of tone.

In this case, "have" is the auxillary verb, rather than the ordinary verb "to posess", and you can tell that by the intonation. That's otherwise identical words distinguished from each other.

Replies from: komponisto
comment by komponisto · 2010-08-13T22:03:16.810Z · LW(p) · GW(p)

Sorry if this sounds a bit harsh, but I'm puzzled by this reply. It's as if you stopped reading my comment immediately after the phrase "otherwise identical words distinguished from each other", and ignored the next part, which happened to be the most important part. So let me try again, using bold for emphasis:

"tone" refers to a phenomenon in certain languages (most famously Chinese) wherein otherwise identical words are distinguished from each other -- in isolation, nothing to do with their placement in a sentence -- by the contour of one's voice when pronouncing them

Did you actually read the Wikipedia article that you cited? Here's an example it gives from Chinese:

1. mā "mother"
2. má "hemp"
3. mǎ "horse"
4. mà "scold"
5. ma (an interrogative particle)

This should have made it clear that we're talking about a different phenomenon from anything that occurs in standard varieties of English. In Chinese, the intonation pattern of an individual word is actually lexical -- it's a fixed property of the word that applies even when the word is pronounced in isolation, entirely like the pattern of consonant and vowel sounds in the word. The five Chinese words above are not homophones, unlike "have" ("possess") and "have" (auxiliary) in English. The two senses of English "have" can't be distinguished when the word is pronounced by itself.

comment by Oscar_Cunningham · 2010-08-12T12:10:05.795Z · LW(p) · GW(p)

Wow, it took me a long while to realise what was wrong with that sentence.

comment by orthonormal · 2010-08-10T16:58:48.215Z · LW(p) · GW(p)

Anti-candidate: "Just because something feels good doesn't make it true."

The Litany of Tarski and Litany of Gendlin are better ways to approach this concept, because they're both inexorably first-person statements.

comment by VNKKET · 2010-08-10T16:53:31.878Z · LW(p) · GW(p)

Candidate: Don't pursue an idea unless it came to your attention by a method that actually finds good ideas. (Paraphrased from here.)

Replies from: Violet
comment by Violet · 2010-08-11T07:12:44.069Z · LW(p) · GW(p)

I actually keep getting good ideas in some areas while sleeping.

E.g., when facing a difficult problem in programming, sleeping on it for a night quite often seems to yield the solution.

Replies from: VNKKET, djcb
comment by VNKKET · 2010-08-12T16:09:15.683Z · LW(p) · GW(p)

You changed my mind. I'm worried my candidate will hurt more than it helps because people will conflate "bad idea generators" with "disreputable idea generators" -- they might think, "that idea came to me in my sleep, so I guess that means I'm supposed to ignore it."

A partially-fixed candidate: If an idea was generated by a clearly bad method, the idea is probably bad.

comment by djcb · 2010-08-11T21:12:18.272Z · LW(p) · GW(p)

Well, then that is, in fact, a method that finds good ideas!

I sometimes use de Bono-esque "lateral thinking" tricks, like association with a random dictionary word, to come up with a creative solution to a problem. It does not work for all classes of problems, but it can be useful.

There are some methods that consistently do not work well for me when trying to find good ideas / solutions; for example, sitting at my desk and looking at the screen.

comment by NancyLebovitz · 2010-08-13T14:25:26.776Z · LW(p) · GW(p)

"Let's see how we can check this" or "let's see how we can test this" seems to work in the short run to get people to check or test things. I don't know if it changes habits.

Replies from: Clippy
comment by Clippy · 2010-08-13T15:01:09.582Z · LW(p) · GW(p)

Agreed. It would be great for people to get into the habit of continually asking "what would this claim imply that I can check?", since not enough people are accustomed to thinking that way.

comment by sark · 2010-08-12T04:38:44.500Z · LW(p) · GW(p)

RE: Candidate 1

For those interested, here's the math:

  • A one in N chance event will not occur with probability 1-1/N.
  • It will not occur after 2 trials with probability (1-1/N)^2.
  • It will occur at least once after 2 trials with probability 1-(1-1/N)^2
  • It will occur at least once after k trials with probability 1-(1-1/N)^k.
  • For an even chance for it to occur at least once, how many trials do we need?
  • We solve for k in this equation:
  • 1-(1-1/N)^k = 0.5
  • (1-1/N)^k = 0.5
  • taking logs of both sides
  • k = ln 0.5/ln (1-1/N)
  • dividing by N
  • k/N = ln 0.5 / ln(1-1/N)^N
  • (1-1/N)^N tends to e^-1 (since (1+x/N)^N tends to e^x, let x=-1)
  • so ln(1-1/N)^N tends to -1. The convergence is pretty fast, so the approximation is reliable for large N.
  • so k/N = -ln 0.5 = ln 2 which is about 0.7
  • so k=0.7N

Example: for a one-in-a-million chance event, after 700,000 trials you would have an even chance of seeing at least one occurrence.

In general, for a one-in-N chance event, there is an even chance that you will see at least one occurrence after 0.7N trials.

And of course, you can choose another probability instead of 0.5 too.

(Since I'm no mathematician, there may be mistakes in there somewhere. Please feel free to suggest corrections.)
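A quick numerical check of the 0.7N rule (a sketch, not a proof):

    from math import log

    # Trials k needed for an even chance of at least one occurrence
    # of a one-in-N event: k = ln 0.5 / ln(1 - 1/N).
    for N in (100, 10_000, 1_000_000):
        k = log(0.5) / log(1 - 1 / N)
        print(N, round(k), round(k / N, 4))
    # k/N comes out near ln 2 = 0.693 in every case, matching 0.7N.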

Replies from: simplicio
comment by simplicio · 2010-08-14T20:03:56.620Z · LW(p) · GW(p)

That's a useful number to know, thanks!

Replies from: sark
comment by sark · 2010-08-15T14:17:37.289Z · LW(p) · GW(p)

You're welcome!

comment by Violet · 2010-08-10T05:23:40.511Z · LW(p) · GW(p)

Count to ten.

Learning one was wrong (and updating) is a good thing.

One should be more interested in obtaining information than winning debates.

comment by DuncanS · 2010-08-10T03:11:20.474Z · LW(p) · GW(p)

To clarify a little on candidate 1. People are often impressed by a coincidental or unlikely happening, and think that it's some kind of miracle. But in fact there are a lot of individually very unlikely things happening all the time. Out of all the cars in the world, what's the chance that you happen to see three of them in a particular order going down a particular street? Not that high, but obviously cars have to pass you in some order or other.

So all unlikely events can be categorised into unnoticeable ones (any three cars at random) and noticeable ones (your mum, your music teacher, and somebody you haven't seen for ten years all pass you in order). You then calculate the odds of that particular unlikely thing happening, and discover it's pretty unlikely. The thing most people don't take into account is that there are a LOT of potential noticeable events, and there are a LOT of possible moments when they could happen. Any particular unlikely happening is pretty unlikely, but the likelihood of one of the many noticeable unlikely happenings occurring is much greater.

This accords with our experience that noticeable unlikely events do actually occur. But there's nothing unlikely about that fact.

Replies from: khafra
comment by khafra · 2010-08-10T13:32:21.827Z · LW(p) · GW(p)

"one in a million chances happen a thousand times a day in China" is a bumper sticker phrase for that one I've found useful.

On my own, I've tried out the ol' medical test base rate fallacy explanation on a few people. My dad got it right away; so did one friend; another didn't seem to fully grok it within ~2 minutes of explanation. I haven't done any follow-ups to see if they've been able to retain and use the concept.
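For anyone who wants to run the numbers on that one, here is the standard setup with illustrative figures (my numbers, not a specific study's):

    # A disease affects 1 in 1000 people. The test catches 99% of cases
    # but false-alarms on 5% of healthy people. Given a positive test,
    # how likely is disease?
    prior = 0.001
    p_pos_given_sick = 0.99
    p_pos_given_healthy = 0.05
    p_pos = prior * p_pos_given_sick + (1 - prior) * p_pos_given_healthy
    print(prior * p_pos_given_sick / p_pos)  # ~0.019

Under 2%, despite the test being "99% accurate" -- which is exactly the intuition the explanation has to fight.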

comment by komponisto · 2010-08-12T14:45:15.487Z · LW(p) · GW(p)

(I definitely should have thought of this earlier; interestingly enough it was this comment that was the trigger.)

Use probabilities! (Or likelihood ratios.) Especially when arguing. Yes, do so with care, i.e. without deceiving yourself into thinking you're better calibrated than you are -- but hiding the fact that you're not perfectly calibrated doesn't make your calibration any better. Your brain is still making the same mistakes whether you choose to make them verbally explicit or not. So instead of reacting with indignation when someone disagrees, just ask "how confident are you?" Then, if necessary, follow up with "what was your prior?" "How many bits of evidence is this piece of information worth?" Make your argument a numerical game rather than a barefaced status war.

People often profess to be uncomfortable with assigning a numerical value to their confidence level. I have found, strangely enough, that switching from percentages (or values between 0 and 1) to something that feels more "discrete", like a scale of 0 to 10 or a "five-star" system, sometimes helps with this.

Replies from: Eliezer_Yudkowsky, ciphergoth
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-08-12T17:01:32.824Z · LW(p) · GW(p)

I have found, strangely enough, that switching from percentages (or values between 0 and 1) to something that feels more "discrete", like a scale of 0 to 10 or a "five-star" system, sometimes helps with this.

This makes perfect sense to me. I feel far more comfortable converting my sense of credibility to an intensity scale of 1 to 100 than converting those intensities to probabilities.

comment by Paul Crowley (ciphergoth) · 2010-08-12T17:12:33.108Z · LW(p) · GW(p)

I have found, strangely enough, that switching from percentages (or values between 0 and 1) to something that feels more "discrete", like a scale of 0 to 10 or a "five-star" system, sometimes helps with this.

Have you considered using decibans for this purpose?
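For the curious, decibans are just a log-odds scale, which makes evidence roughly additive when you update. A minimal sketch:

    from math import log10

    # Confidence in decibans: 10 * log10 of the odds.
    def decibans(p):
        return 10 * log10(p / (1 - p))

    for p in (0.5, 0.75, 0.9, 0.99):
        print(p, round(decibans(p), 1))
    # 0 db is even odds; each +10 db multiplies the odds by ten.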

comment by luminosity · 2010-08-12T08:33:33.083Z · LW(p) · GW(p)

Terry Pratchett has a good metaphor for a good way of thinking in his Tiffany Aching books: second, third, etc. thoughts. Basically, the idea is that you shouldn't just trust whatever your thoughts say; you have your second thoughts monitoring them, and then you have your third thoughts monitoring that. I've always found it extremely helpful to pay attention to what I'm thinking; many times I've noticed a thought slipping past that is very obviously wrong and picked myself up on it. A few times I've even agreed with the original thought upon further analysis, but it was analysis that the thought definitely needed.

I'm not sure the second thoughts metaphor is the best way of explaining it to people, mind you. Perhaps something such as "Pay attention to what you're thinking; you'll be surprised how often you disagree with your own first thought."

comment by sark · 2010-08-12T05:51:37.153Z · LW(p) · GW(p)

I think movements grow at their margins. It is there that we can make the greatest impact. So perhaps we should focus on recent converts to rationality. Figure out how they converted, what factors were involved, how the transition could have been easier, taking into account their personality etc.

This is what I have been trying to do with the people I introduce rationality to and who are somewhat receptive. It is not only a victory that they began to accept rationality. It was also an opportunity to observe how best to foster more of such conversions.

It is somewhat worrying that these people tend to be geeks/nerds. But given the nature of rationality itself and of personality, this shouldn't be too surprising. The success of geek/nerd culture in recent mainstream pop culture is a cause for hope, I think, even though some geeks/nerds will argue their culture has been distorted in the process. What matters is that the public is now not averse to geeks/nerds. Such aversion is often the first and strongest impediment to even beginning to consider topics such as rationality.

It ultimately comes down to social acceptance. We must leverage the phenomenon by focusing on the fringes of the rationality demographic. They have the most power to exert peer pressure to convert further rationalists.

Replies from: Bongo, Halceon
comment by Bongo · 2010-08-18T23:14:44.709Z · LW(p) · GW(p)

Defining rationalists as LW users, I think more came from these...

  • People who followed the sequences while Eliezer was still posting them
  • People who follow the Methods of Rationality fanfic

...than from just happening upon the site. I think people are more drawn in by an ongoing serial than an archive of pre-existing material.

It's easy to get someone to follow a cool blog or fanfic. It's hard to get someone to "read the sequences".

Maybe Eliezer should repost his sequences over the next few years, in a foreign part of the blogosphere, under a pen-name? :)

Replies from: JGWeissman
comment by JGWeissman · 2010-09-22T18:08:55.112Z · LW(p) · GW(p)

When I first found OB, Eliezer was just finishing the sequences and transitioning to LW. I would start reading an article, and follow all the links back to articles I hadn't read yet. I was happy to spend days reading a later article with lots of prereqs. For me, having a depth of existing material that has been built on is a feature.

Replies from: DSimon
comment by DSimon · 2010-09-22T18:25:20.965Z · LW(p) · GW(p)

Yep, just like TV Tropes or Wikipedia; all it takes is an interesting initial hook, and then the tab-queueing begins.

comment by Halceon · 2010-08-23T19:01:24.754Z · LW(p) · GW(p)

If we use LW as a metric of conversion, then you can consider me a new convert, lured here by the occasional link from the Octagon. This is, of course, a pretty weak metric. I've been interested in rational thinking since the 9th grade, when I went to a debate club and realised that people went there to win arguments, not get to the truth. While I've done my best to keep my actions and words rational in cases that seem detached from my personal life, I think I mostly fail at self-examination.

My personal observations confirm that the geek/nerd social group is the most prone to rationality, but there is a significant buffer layer around the group that can be influenced and converted.

P.S. It feels good to finally register here. And... am I the only one who feels a bit odd using the word "convert" in this context?

comment by Eneasz · 2010-08-12T00:29:50.956Z · LW(p) · GW(p)

I've heard Candidate 1 expressed as "A one-in-a-million shot happens a thousand times a day in China."
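(The arithmetic roughly checks out: with on the order of a billion people each facing at least one such one-in-a-million chance per day, you'd expect around 10^9 × 10^-6 = 1,000 of them to come up daily.)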

Candidate 2 could be "I like to be less wrong."

Candidate 3 maybe "If it affects reality, it is real."?

comment by MatthewB · 2010-10-16T11:19:23.847Z · LW(p) · GW(p)

Candidate 2 (admitting that one is wrong is a way of winning an argument) is one of my oldest bits of helpful knowledge.

If one admits that one is wrong, one instantly ceases to be wrong (or at least ceases to be wrong in the way that one was wrong; it could still be the case that the other person in the argument is also wrong, but for the purposes of this point, we are assuming they are "correct"), because one is then in possession of more accurate (i.e. "right") information/knowledge.

comment by aausch · 2010-10-08T19:58:24.522Z · LW(p) · GW(p)

http://rejectiontherapy.com/ - the 30-day rejection challenge seems to fit here. Try, for 30 consecutive days, to provoke genuine rejections or denials of reasonable requests as part of your regular activities, at a rate of one per day.

comment by ChristianKl · 2010-08-15T16:38:06.682Z · LW(p) · GW(p)

Similarly, with millions of people dying of cancer each year, there are going to be lots of people making highly unlikely miracle recoveries. If they didn't, that would be surprising.

That's like saying it would be surprising if nobody lived to the age of 150. Miracle cancer cures are statistical outliers, and it would be interesting to know the mechanism that allows them to happen.

This is an obvious contradiction: they're claiming a measurable effect on the world and then pretending that it can't possibly be measured.

It's no contradiction if you believe in a clever god that doesn't want the effect to be scientifically measured.

Replies from: PhilGoetz, anon895
comment by PhilGoetz · 2010-08-18T18:05:28.689Z · LW(p) · GW(p)

It's no contradiction if you believe in a clever god that doesn't want the effect to be scientifically measured.

But then the believer can't claim God can do things that could be scientifically measured - for instance, curing people who pray more often than people who don't, at least while a scientist is watching. Believers who want to pray for their health should use timeless decision theory to figure out what conditions to meet so that God is allowed to cure them without making that observable to later scientists. Cult startup, anyone?
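The measurement itself is trivial to set up, which is what makes the dodge so glaring. A minimal sketch of the standard two-proportion z-test, with made-up recovery counts for a prayed-for group and a control group:

    from math import sqrt

    def two_proportion_z(recovered_a, n_a, recovered_b, n_b):
        """z statistic for the difference between two recovery proportions."""
        p_a, p_b = recovered_a / n_a, recovered_b / n_b
        pooled = (recovered_a + recovered_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        return (p_a - p_b) / se

    # Made-up counts: 520 of 1000 prayed-for patients recover vs. 500 of 1000 controls.
    print(two_proportion_z(520, 1000, 500, 1000))  # ~0.89: no detectable effect

If prayer reliably raised recovery rates, the z statistic would grow with sample size; the clever-god hypothesis amounts to predicting that it never will.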

comment by anon895 · 2010-08-18T18:23:10.222Z · LW(p) · GW(p)

A clever god applying its cleverness to the job of making itself invisible is going to succeed.

comment by beriukay · 2010-08-11T07:13:01.617Z · LW(p) · GW(p)

I'm surprised there aren't any comments about reminding people they can't have it both ways. I haven't found a great way to do it quickly, but I have sometimes talked people down from forming a negative opinion (of a person, group, or event) by asking them if they would have gotten the same perception from a counterfactual (and in some sense opposite) event occurring instead.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-08-11T14:31:27.766Z · LW(p) · GW(p)

I need an example of that one.

Replies from: beriukay
comment by beriukay · 2010-08-12T13:38:18.767Z · LW(p) · GW(p)

Ok, one fairly frustrating occurrence in my life is when my girlfriend gets freaked out about failing a math class. The problem is that she gets about as freaked out when she does well on a test as when she does poorly on a quiz. Pointing out that she seems to just want to panic regardless of the event seems to calm her more than most of my other approaches.

But the example I was actually thinking about when I wrote that involved a coworker talking badly of someone else in the workplace. The specifics are lost to me, but at the time, I noticed that the complaining guy would have had material to gripe about regardless of what the other person did. I mentioned this, and he conceded the fact and changed the subject.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-08-12T16:07:38.550Z · LW(p) · GW(p)

Thanks for the examples.

Pointing out that she seems to just want to panic regardless of the event seems to calm her more than most of my other approaches.

Interesting. I think that exact phrasing wouldn't work well with me, because when I have bad emotional habits, it generally doesn't seem as though I want them. I'd do better with a more neutral phrasing like "it seems as though you panic no matter what happens".

All I can do is guess about the difference-- maybe your girlfriend experiences her internal state as wanting the emotions she's getting?

Replies from: beriukay
comment by beriukay · 2010-08-12T16:35:46.701Z · LW(p) · GW(p)

You know, I didn't really notice that distinction before. I shall have to pay attention to that. I'll let you know if/how much better that works.

comment by VNKKET · 2010-08-12T16:10:48.704Z · LW(p) · GW(p)

Candidate: Hold off on proposing solutions.

This article is way more useful than the slogan alone, and it's short enough to read in five minutes.

comment by Sideways · 2010-08-10T04:22:44.151Z · LW(p) · GW(p)

'Instinct,' 'intuition,' 'gut feeling,' etc. are all close synonyms for 'best guess.' That's why they tend to be the weakest links in an argument-- they're just guesses, and guesses are often wrong. Guessing is useful for brainstorming, but if you really believe something, you should have more concrete evidence than a guess. And the more you base a belief on guesses, the more likely that belief is to be wrong.

Substantiate your guesses with empirical evidence. Start with a guess, but end with a test.

Replies from: thomblake
comment by thomblake · 2010-08-10T14:23:59.689Z · LW(p) · GW(p)

I disagree with this one. If it's really your best guess, it should be the result of all of the information you have to muster. And so either each of "instinct", "intuition", "gut feeling", etc. is your best chance of being right, or they're not close synonyms for "best guess".

Replies from: Sideways, adsenanim
comment by Sideways · 2010-08-10T20:42:10.811Z · LW(p) · GW(p)

I agree (see, e.g., The Second Law of Thermodynamics, and Engines of Cognition for why this is the case). Unfortunately, I see this as a key inferential gap between people who are and aren't trained in rationality.

The problem is that many people-- dare I say most-- feel no obligation to gather evidence for their intuitive feelings, or to let empirical evidence inform their feelings. They don't think of intuitive feelings as predictions to be updated by Bayesian evidence; they treat their intuitive feelings as evidence.

It's a common affair (at least in the United States) to see debaters use unsubstantiated intuitive feelings as linchpins of their arguments. It's even common in internet debates to see whole chains of reasoning in which every link is supported by gut feeling alone. This style of argument is not only unpersuasive to anyone who doesn't already share those intuitions-- it also prevents the debater from updating, as long as his intuitions don't change.
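One way to make the distinction concrete is to treat an intuition as a likelihood ratio and actually do the update, rather than treating the feeling as a verdict. A minimal sketch, with made-up numbers:

    def update_odds(prior_odds, likelihood_ratio):
        """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
        return prior_odds * likelihood_ratio

    prior_odds = 0.05 / 0.95                       # the claim starts out 5% likely
    posterior_odds = update_odds(prior_odds, 2.0)  # a hunch treated as modest 2:1 evidence
    print(posterior_odds / (1 + posterior_odds))   # ~0.095: updated, but far from certain

Treated this way, a gut feeling moves the estimate without settling the question.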

Replies from: MichaelVassar
comment by MichaelVassar · 2010-08-11T14:25:40.170Z · LW(p) · GW(p)

Intuitive feelings are evidence AND predictions. Sadly, most people simply think of them as facts.

comment by adsenanim · 2010-08-11T07:29:16.702Z · LW(p) · GW(p)

Your argument reminds me of a thought experiment I did concerning the "GOD operator"...

1 + 1 = 2

1 - 1 = 0

1 * 1 = 1

1 / 1 = 1

Etc...

The operator is the +, -, *, /, etc.

The GOD operator is inclusive of all known operators and allows such things as:

1 GOD 1 = whatever answer fits

AND

1 GOD 1 = sqrt(-1), pi, etc.

How do we define the operator when GOD can be "whatever works"?

My main thoughts then went to the idea of a "universal machine", much like Turing's...

What specifically is the mechanism of the human mind that would allow both of the above examples?

comment by multifoliaterose · 2010-08-10T04:09:02.965Z · LW(p) · GW(p)

Upvoted for raising a very important topic.

comment by jimmy · 2010-09-15T19:48:26.166Z · LW(p) · GW(p)

It probably took me a bit more than 5 minutes, but I had a conversation last night that fits this idea.

The idea to convey is: "If you don't actually use the information you obtain, it cannot possibly increase your odds of success."

I went through the Monty Hall problem (the trick to explaining that one is to generalize to the trillion-box case, where all but two boxes are eliminated prior to asking whether you want to switch) to get this idea across.
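A quick simulation also drives the point home. Here is a minimal sketch in Python (the play_round helper and the door counts are illustrative choices, not part of the original explanation):

    import random

    def play_round(num_doors, switch):
        """One Monty Hall round: the host opens all but two doors,
        always leaving the prize door closed. Returns True on a win."""
        prize = random.randrange(num_doors)
        choice = random.randrange(num_doors)
        if not switch:
            return choice == prize
        # If the first pick missed, the host's eliminations force the one
        # remaining closed door to be the prize, so switching wins.
        return choice != prize

    trials = 100_000
    for num_doors in (3, 1_000_000):
        stay = sum(play_round(num_doors, False) for _ in range(trials)) / trials
        swap = sum(play_round(num_doors, True) for _ in range(trials)) / trials
        print(num_doors, "doors: stay ~", round(stay, 4), "switch ~", round(swap, 4))

With a million doors, staying almost never wins and switching almost always does -- exactly the intuition the generalization is meant to trigger.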

From there you can explain the implications. For example, how, through commitment/consistency biases, conformity, and compartmentalization, it's likely that they aren't actually getting the real-life benefits that they could be getting from their degree in biology, or whatever it is.

comment by adsenanim · 2010-08-11T05:51:29.233Z · LW(p) · GW(p)

I think that the key words are "reasonably smart".

Sagan’s Baloney Detection Kit is a good starting point, and it could be said that each of his examples is easily translatable to an oration of less than 5 minutes (as per Candle in the Dark). I have often thought that it would make a good children’s book (Carl and the Baloney Detector)...

A good resource would be previous attempts at such a work: Aesop's Fables (platitudinal), the I Ching (esoteric), and the Judeo-Christi-Islamic texts (dogmatic). If we are to attempt a similar work for the ideas of reason, then what can we learn from them about how to provide a bible, or even a psalm, of reason?

A good starting point is to keep each single idea short enough that it can be said before interest is lost...

Perhaps instead of providing a single interpretation of a reasonable argument, it could be a location for an argument idea, with the many interpretations of the specific argument, and with related topics provided as links?

So, in other words: here is the concept, these are the arguments, and these are the related concepts -- all of which could be kept to under 5 minutes.

"The WIKIRESONIA" :)

Replies from: sketerpot
comment by sketerpot · 2010-08-13T05:51:13.748Z · LW(p) · GW(p)

A good resource would be previous attempts at such a work: Aesop's Fables (platitudinal), the I Ching (esoteric), and the Judeo-Christi-Islamic texts (dogmatic). If we are to attempt a similar work for the ideas of reason, then what can we learn from them about how to provide a bible, or even a psalm, of reason?

Do you know that these would be good resources? You haven't established this; it might help if you gave one or two examples of how these works of fiction that you listed could help us out.

"The WIKIRESONIA" :)

You'd need to specifically have brevity and low inferential distance as goals if you made such a wiki. The LW wiki tends to give a brief description of something and then link to some long posts on the subject; in contrast, Wikipedia tends to have really long articles. Getting all those "many interpretations" you recommend takes quite a bit of space. Check out how long the Wikipedia article on confirmation bias is, and ask yourself whether a hypothetical Average Person could take anything useful from skimming it.

Replies from: adsenanim
comment by adsenanim · 2010-08-19T05:35:18.361Z · LW(p) · GW(p)

I present them (with my critique) because they represent to me attempts at reason from before the modern definition of reason was widely accepted.

I left out any direct quotes because I thought they might confuse the topic of conversation, and the five-minute rule would be violated if I tried to discuss them.

Aesop's Fables:

http://www.aesopfables.com/aesopsel.html

I Ching:

http://en.calameo.com/read/000039257e56b7faf538d

Judeo-Christi-Islamic:

http://en.wikipedia.org/wiki/Kabbalah

Maybe not the best resources, but they could be an introduction.

I will add one more, only because I find it fun:

http://en.wikipedia.org/wiki/Pyramids_(novel)

For some reason the above link does not resolve correctly, but you should be able to follow it...

Yes, the wiki is a challenge; I was thinking of a new graphical interface...

Replies from: gwern, adsenanim
comment by gwern · 2010-08-19T07:34:10.389Z · LW(p) · GW(p)

http://en.wikipedia.org/wiki/Pyramids_(novel\)