What are your benchmarks of rationality?

post by DataPacRat · 2012-01-29T03:57:13.506Z · LW · GW · Legacy · 14 comments


What quick-and-easy rules of thumb do you tend to use to gauge how rational someone else is? How accurate do you think those rules are, and can you think of any way they might be improved?


For some examples of what I mean, one of the benchmarks I use is the basic skeptics' list: astrology, chiropractic, little green men abducting cattle and performing anal probes, Nessie. Another is the denialist checklist: holocaust denial, moon landing denial, global warming denial. Another is supernaturalism in general: creationism, intercessory prayer, magick, psychics, curses, ghosts, and such. If I find out that someone I know believes in any of that, my estimation of how well they can consider things rationally goes down. Theism... well, I've gotten used to pretty much everyone around me being theistic, so that's kind of the baseline I assume; when I learn someone is an atheist, my estimation of their rationality tends to go *up*.

Do you have any items which make you think someone is even further along the path of rationality than simply not falling for logical fallacies?

14 comments

Comments sorted by top scores.

comment by hamnox · 2012-01-29T06:04:43.820Z · LW(p) · GW(p)

Here's a big one for me: whether or not someone shows a rudimentary understanding of how their own brain can mislead them.

It's easy enough to see it working in teenagers. They're the ones who realize that their emotions are going to be completely out of whack, that their judgment may not in fact be 10x better than that of every adult around them, and who proceed to compensate for it where they can.

It's not the same as knowing when to shout "Anchoring!" or "Sunk Cost Fallacy!"; that's just knowing the password. It's a matter of being aware that your brain can think and feel things without consulting you, and that not all of the things it thinks for you are good or right.

Replies from: J_Taylor
comment by J_Taylor · 2012-01-29T07:53:54.647Z · LW(p) · GW(p)

A fascinating thing about teenagers: in their youth, they are exposed to many success stories, and those stories often involve successful adults bragging about the mistakes they made as teenagers. I wonder how many teens update on this sort of evidence. I know I certainly did.

comment by AlexSchell · 2012-01-29T15:30:18.893Z · LW(p) · GW(p)

A necessary condition is having an intuitive understanding of how to properly reply to hypotheticals or thought experiments in a discussion. If someone keeps focusing on convenient possible worlds, they probably have a long way to go.

Having such an understanding is not a sufficient condition for rationality, because many philosophy majors internalize these habits of thought, and I don't think that being a philosophy major is strongly indicative of rationality.

comment by Alex_Altair · 2012-01-29T04:21:31.277Z · LW(p) · GW(p)

I was hoping this would be about judging one's own rationality. Judging others seems to be a lot easier.

comment by Risto_Saarelma · 2012-01-29T04:38:51.060Z · LW(p) · GW(p)

Stuff that's just identification with an established subculture like New Agers or conspiracy cranks mostly shows that the person likes to identify with that subculture. It's a lot trickier to find stuff that you'd hope people would get right but which isn't strongly tied to a specific subculture's attire.

comment by APMason · 2012-01-29T04:33:40.121Z · LW(p) · GW(p)

Well, beyond the list given in the OP (whose items, while they are in fact necessary conditions for rationality, seem to me not even to constitute a "lowest standard"; they're the surface-level attributes adopted almost automatically upon entering the sceptic community), I tend to use people's reaction when I say "everyone should be immortal". Strangely enough, it does seem like you need an abnormal clarity of thought to reliably come to the right conclusion about death.

Replies from: Prismattic, Kaj_Sotala, daenerys, DataPacRat
comment by Prismattic · 2012-01-29T05:18:19.753Z · LW(p) · GW(p)

Beware inferential distance. "Everyone should be immortal" includes a lot of unstated assumptions that the person you say it to may not be aware of. They could easily think you mean "Everyone should basically be as they are now, except live forever", which would mean either Malthusian misery or draconian restrictions on reproduction. Unless you have already discussed transhumanism with them, this is a terrible benchmark.

Replies from: APMason
comment by APMason · 2012-01-29T05:24:57.167Z · LW(p) · GW(p)

That's not the way I usually phrase it - I don't know how that would fit into a conversation anyway. I was just summarising the subject matter. Sorry for the confusion.

comment by Kaj_Sotala · 2012-01-29T19:01:24.379Z · LW(p) · GW(p)

"Everyone should be immortal" is a claim about values, not facts. There's no such thing as "the right conclusion about death".

comment by daenerys · 2012-01-29T05:54:33.015Z · LW(p) · GW(p)

I tend to use their reaction when I say "everyone should be immortal". Strangely enough it does seem like you need an abnormal clarity of thought to reliably come to the right conclusion about death.

I see this general idea espoused by rationalists rather often. But despite my months on here, I have yet to come around to agreeing with it.

Aumann's Agreement Theorem (sketched roughly below) leaves us with three options:

  1. The vast majority of LW-ers are irrational (I rather doubt it)
  2. I am not as rational as I would like to be (I'm sure of it)
  3. We do not have common priors (I do think that most anti-deathists are very privileged in terms of intelligence, wealth, stability, etc.)
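(To sketch the theorem being invoked, roughly and in schematic notation: if two Bayesian agents share a common prior $P$, and their posterior probabilities for an event $A$ given their respective private information $\mathcal{I}_1$ and $\mathcal{I}_2$ are common knowledge, then those posteriors must be equal:

$$P(A \mid \mathcal{I}_1) = P(A \mid \mathcal{I}_2)$$

Persistent disagreement therefore means some premise fails: one of us isn't updating rationally, covering options 1 and 2, or the priors differ, which is option 3.)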
comment by DataPacRat · 2012-01-29T04:49:44.429Z · LW(p) · GW(p)

not even constitute a "lowest standard"

There are a great many people who don't meet the complete standard - in fact, the great majority of people don't; and it seems worthwhile to be able to differentiate between a reasonably rationalist deist and a Californian cloud-cuckoo-lander.

Of course, any way to differentiate amongst people who do meet the 'lowest standard' is valuable, as well.

Replies from: Steven_Bukal
comment by Steven_Bukal · 2012-01-29T07:09:14.562Z · LW(p) · GW(p)

I believe APMason's point is that your benchmarks are testing for anti-non-mainstreamism.

comment by Manfred · 2012-01-29T06:54:47.970Z · LW(p) · GW(p)

What they do when they're wrong about something immediately available to them (misreading a map, say, not being wrong about global warming).

comment by adamisom · 2012-01-31T02:39:38.166Z · LW(p) · GW(p)

1) Why was this post downvoted?

2) I've realized that I take the far easier path: I simply downgrade my model of someone's rationality based on what I judge to be irrationality. The benchmark of a rational person is thus simply an apparent absence of such things.

For example, I notice belief in belief a lot, and it leads to confused minds: I'm thinking of one person who rationalizes incessantly. Or I may notice insufficiently clear thinking, often related to the use of passwords, along with implicit disagreement with the creed that what can be destroyed by the truth should be. I'm thinking of one of my professors, who is clearly opposed to reductionist cognitive psychology for what seems to me primarily wishful thinking rather than good reason, and who approvingly notes that some cognitive processes can be 'emergent', and so on.