When It's Not Right to be Rational
post by Annoyance · 2009-03-28T16:15:15.367Z · LW · GW · Legacy · 22 comments
By now I expect most of us have acknowledged the importance of being rational. But as vital as it is to know what principles generally work, it can be even more important to know the exceptions.
As a process of constant self-evaluation and self-modification, rationality is capable of adopting new techniques and methodologies even if we don't know how they work. An 'irrational' action can be rational if we recognize that it functions. So in an ultimate sense, there are no exceptions to rationality's usefulness.
In a more proximate sense, though, does it have limits? Are there ever times when it's better *not* to explicitly understand your reasons for acting, when it's better *not* to actively correlate and integrate all your knowledge?
I can think of one such case: It's often better not to look down.
People who don't spend a lot of time living precariously at the edge of long drops don't develop methods of coping. When they're unexpectedly forced to such heights, they often look down. When they do, subcortical instincts are activated that cause them to freeze and panic, overriding their conscious intentions. This tends to prevent them from accomplishing whatever goals brought them to that location, and in situations where balance is required for safety, the panic instinct can even cause them to fall.
If you don't look down, you may know intellectually that you're above a great height, but at some level your emotions and instincts aren't as strongly triggered. You don't *appreciate* the height on a subconscious level, and so while you may know you're in danger and be appropriately nervous, your conscious intentions aren't overridden. You don't freeze. You can keep your conscious understanding compartmentalized, not bringing to mind information which you possess but don't wish to be aware of.
The general principle seems to be that it is useful to avoid fully integrated awareness of relevant data when acknowledging that data would undermine your ability to regulate your emotions and instincts. If they run amok, your reason will be unseated. Careful application of doublethink, and careful avoidance of emotionally-charged facts that aren't absolutely necessary for responding appropriately to the situation, is probably the best course of action.
If you expect that you're going to be dealing with heights in the future, you can train yourself not to fall into vertigo. But if you don't have opportunities to train away your reactions, not looking down is the next best thing.
22 comments
comment by Cameron_Taylor · 2009-03-29T05:59:51.025Z · LW(p) · GW(p)
When It's Not Right to be Rational
Never. You show me a counterexample and I'll show you an irrational decision. If you are trying to explain a subtle difference between two similar concepts, try not to conflate the two. Conflating them sounds more 'deep' and insightful, but it is just confusing.
There are certainly places where it is instrumentally rational for a bounded entity to apply certain limits to their epistemic rationality. Most obvious is that it is an effective signalling tool.
If someone has a system of deception that is believed to be either ineffective or expensive to apply, then benefit can be gained by demonstrating certain incorrect beliefs. For example, if a male is able to demonstrate that he believes he is the strongest, most productive, most aggressive, or highest-status male when in fact he is not, that gives a clear message to observers. It shows that he is able to maintain such flawed beliefs without the other males killing him.
More generally, demonstrable false beliefs are a commitment and a signal thereof:
- A commitment to a tribe
- A commitment away from a course of action that you don't want your flawed instrumental-rationalist system to have access to in the future (hide the password to LeechBlock)
- A commitment that they will not (know how to) defect in certain prisoners' dilemmas (see the sketch after this list)
- A commitment to certain patterns of behavior that must have had costs to maintain (as shown above).
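A minimal sketch of the defection-commitment point, as a sequential trust game solved by backward induction. The payoff numbers and names here are my own illustrative assumptions, not anything from the comment:

```python
# Illustrative sketch, not a general claim: a trustee who demonstrably
# *cannot* defect does better than one who merely promises not to.
# Payoffs are hypothetical, ordered as (truster, trustee).
TRADE_COOPERATE = (2, 2)   # both gain from the exchange
TRADE_DEFECT = (-1, 3)     # trustee exploits the truster
NO_TRADE = (0, 0)          # truster declines to interact

def trustee_payoff(can_defect: bool) -> int:
    """Backward induction on the two-stage game."""
    if can_defect:
        # A free trustee would defect once trusted (3 > 2), so a
        # rational truster refuses to trade in the first place.
        return NO_TRADE[1]
    # A trustee visibly committed to cooperation gets trusted.
    return TRADE_COOPERATE[1]

print(trustee_payoff(can_defect=True))   # 0
print(trustee_payoff(can_defect=False))  # 2 -- the 'flawed belief' pays
```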
↑ comment by thomblake · 2009-04-02T20:27:52.227Z · LW(p) · GW(p)
Never. You show me a counter example and I'll show you an irrational decision.
So what you're saying is either that (1) 'right' and 'rational' are analytically equivalent, or (2) that your belief is unfalsifiable.
(1) begs the question, and (2) shows that your belief is held irrationally.
comment by Lightwave · 2009-03-28T20:43:58.622Z · LW(p) · GW(p)
I think you are using "rational" with two different meanings. If looking down will cause you to freeze and panic, then the rational thing is not to look down. If knowledge of the fact you're taking sugar pills destroys the placebo effect, then the rational thing is not to know you're taking sugar pills (assuming you've exhausted all other options). It's either that, or directly hacking your brain.
A better way to describe this might be to call these phenomena "irrational feelings", "irrational reactions", etc. The difference is, they're all unintentional. So while you're always rational in your intentional actions, you can still be unintentionally affected by some irrational feelings or reactions. And you correct for those unintentional reactions (which supposedly you can't just simply remove) by changing your intentional ones (i.e. you intentionally and rationally decide not to look down, because you know you will otherwise be affected by the "irrational reaction" of panicking).
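One way to make this correction explicit: treat the involuntary reaction as a predicted consequence of the intentional action and choose by expected utility. A toy sketch in that spirit; the probability and utility numbers are invented for illustration:

```python
# Toy sketch: the 'irrational reaction' (panic) is not chosen, but it
# is predictable, so the intentional choice can route around it.
P_PANIC_IF_LOOK = 0.8            # assumed chance of freezing if you look down
U_TASK_DONE, U_FROZEN = 10, -50  # assumed utilities of the outcomes

def expected_utility(look_down: bool) -> float:
    p_panic = P_PANIC_IF_LOOK if look_down else 0.0
    return p_panic * U_FROZEN + (1 - p_panic) * U_TASK_DONE

# The rational intentional action accounts for the irrational reaction.
print(expected_utility(True))    # -38.0
print(expected_utility(False))   # 10.0 -> don't look down
```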
↑ comment by Annoyance · 2009-03-29T00:31:02.605Z · LW(p) · GW(p)
Ah, but you can't choose not to know about the sugar pills. At most, you can choose not to investigate a therapy that seems to be working.
But in terms of developing and extending your powers of rationality, you can't embrace a delusion while at the same time working to be a better rationalist. You have to decide between the spurious benefits of a possible placebo, and being rational.
Since the placebo effect has mostly to do with how you feel about how you feel, it wouldn't be very important in any case.
↑ comment by Matt_Simpson · 2009-03-29T06:26:30.215Z · LW(p) · GW(p)
Let's just be clear: you are very near equivocating on 'rational'. There are two basic definitions, though it may be natural to add more for some purposes. Essentially what you are pointing out is that sometimes it's instrumentally rational to be epistemically irrational.
I don't see much of a problem with this. As rationalists, we primarily want to be instrumentally rational. Scratch that, it's the only thing we want (intrinsically). Being epistemically rational just happens to be the best way to achieve our ends in a large percentage of cases. It also may have a direct component in our utility function, but that's another issue.
↑ comment by Annoyance · 2009-03-30T15:56:38.985Z · LW(p) · GW(p)
There is another definition, one better than either of those two, not only because it is more useful but because it is generally used and recognized.
With sufficiently limited resources, it can be rational (in that sense) to be irrational.
↑ comment by Matt_Simpson · 2009-03-30T18:58:36.458Z · LW(p) · GW(p)
I think you forgot to mention what that definition is.
↑ comment by Annoyance · 2009-03-29T17:48:32.066Z · LW(p) · GW(p)
"As rationalists, we primarily want to be instrumentally rational. Scratch that, it's the only thing we want (intrinsically)."
No. I'm not sure why you believe that our wants are outside the domain of rationality's influence, but they are not.
↑ comment by Matt_Simpson · 2009-03-29T22:47:09.971Z · LW(p) · GW(p)
The only thing we want is to get the things that we want in the most efficient way possible. In other words, to be instrumentally rational.
↑ comment by Annoyance · 2009-03-30T15:57:16.283Z · LW(p) · GW(p)
If what we want is to reach our wants without using the most efficient way possible, what method should we use?
↑ comment by Matt_Simpson · 2009-03-30T18:56:10.513Z · LW(p) · GW(p)
Efficiency, at least the way I'm using the term, is relative to our values. If we don't want to use the most efficient method possible to achieve something, then something about that method causes it to have a negative term in our utility function which is just large enough to make another alternative look better. So then it really isn't the most efficient alternative we have.
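In symbols (my notation, not anything from the comment): if V scores the outcome and c collects every cost of the method, including felt aversion, then

```latex
U(m) = V\bigl(o(m)\bigr) - c(m), \qquad
m^{*} = \operatorname*{arg\,max}_{m} U(m)
```

Any distaste for a method shows up in c(m), so the 'most efficient' method by this definition already nets that distaste out.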
comment by Kaj_Sotala · 2009-03-28T18:52:08.623Z · LW(p) · GW(p)
There's also the placebo effect, which can be useful at times.
comment by timtyler · 2009-03-28T17:18:18.444Z · LW(p) · GW(p)
Self-deception has done us proud. Without it ... we might still be running naked through the forest. ... Self-deception was a splendid adaptation in a world populated by nomadic bands armed with sticks and stones (Smith, 2004) [...] admittedly, bias and self-deception do produce many personal and social benefits. For example, by over-estimating our ability we not only attract social allies, we also raise our self-esteem and happiness (Taylor, 1989), and motivate ourselves to excel (Kitcher, 1990) [...]
comment by Rune · 2009-03-28T18:45:10.263Z · LW(p) · GW(p)
From Scott Aaronson's lecture notes:
"Or take another example: a singles bar. The ones who succeed are the ones best able to convince themselves (at least temporarily) of certain falsehoods: "I'm the hottest guy/girl here." This is a very clear case where irrationality seems to be rational in some sense."
↑ comment by gjm · 2009-03-29T17:59:32.524Z · LW(p) · GW(p)
Speaking of Scott Aaronson, his little fiction on more or less this topic is a delightful (and slightly disturbing) read.
↑ comment by anonym · 2009-03-29T01:30:14.274Z · LW(p) · GW(p)
That's a bad example.
In a singles bar, people don't respond to the beliefs of other people, but to their behavior. The success goes to those who behave appropriately, not to those who hold certain beliefs.
You might say that the best way to behave appropriately is to deceive yourself into thinking you're actually the hottest, but that is not what is going on in this case either. Offer the person a million dollars if they can correctly answer whether, on average, the members of the appropriate sex in the bar rated them higher than anybody else. Unless they actually are extremely attractive, and probably the most attractive person in the bar, their answer will be 'no'.
↑ comment by SoullessAutomaton · 2009-03-29T02:35:50.704Z · LW(p) · GW(p)
You might say that the best way to behave appropriately is to deceive yourself into thinking you're actually the hottest
I think a more emotionally neutral term for this technique would be something like "Method Acting".
↑ comment by anonym · 2009-03-29T06:36:19.536Z · LW(p) · GW(p)
"Method acting" is a very nice metaphor for what's actually going on: filling the mind with and identifying with a personality/character to the exclusion of normal thought processes, in order to more perfectly portray that other personality/character.
It's not self-deception though, no more than a child engrossed in pretending to be a dog is engaged in self-deception.
comment by [deleted] · 2009-03-28T17:08:55.872Z · LW(p) · GW(p)
The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the light into the peace and safety of a new dark age.