Why *I* fail to act rationally

post by bentarm · 2009-03-26T03:56:57.416Z · LW · GW · Legacy · 23 comments


There is a lot of talk here about sophisticated rationality failures - priming, overconfidence, etc. There is much less talk about what I think is the more common reason people fail to act rationally in the real world - the one most people outside this community would agree is the most common failure mode - acting emotionally. (pjeby has just begun to discuss this, but I don't think it's the main thrust of his post.)

While there can be sound evolutionary reasons for having emotions (the thirst for revenge as a Doomsday Machine being the easiest to understand), and while we certainly don't want to succumb to the fallacy that rationalists are emotionless Spock-clones, I think overcoming (or at least being able to control) emotions would, for most people, be a more important first step towards acting rationally than overcoming biases.

If I could avoid saying things in anger that I'll regret later, avoid putting down colleagues out of jealousy, avoid procrastinating out of laziness, and avoid shrinking from correct decisions out of fear, I think this would do a lot more to make me into a winner than if I could figure out how to correctly calibrate my beliefs about trivia questions, or even get rid of my unwanted Implicit Associations.

So the question: do we have good techniques for preventing our emotions from making bad decisions for us? Something as simple as "count to ten before you say anything when angry" is useful if it works. Something as sophisticated as "become a Zen Master" is probably unattainable, but might at least point us in the right direction - and then there's everything in between.

23 comments

Comments sorted by top scores.

comment by Paul Crowley (ciphergoth) · 2009-03-26T08:41:51.891Z · LW(p) · GW(p)

I'm afraid I'm repeating myself when I say this, but there is already a school of rationalism that discusses exactly this question, and it's called cognitive behavioural therapy, or CBT. As far as I can tell, CBT is exactly the process of using our capacity for rational introspection to improve our mental health, and it is the most empirically effective talking therapy there is.

Replies from: anonym, bentarm, Annoyance
comment by anonym · 2009-03-26T18:50:05.051Z · LW(p) · GW(p)

It is worth repeating. CBT includes many powerful techniques for understanding why we have the automatic thoughts and reactions we do, and how to change the ones that, upon reflection, interfere with achieving our goals. It addresses the original poster's concerns exactly.

comment by bentarm · 2009-03-26T23:55:44.396Z · LW(p) · GW(p)

ok, that's great. It sounds like 'something in the middle', so what should I do? I don't have any diagnosable psychological illness, or even any problems a psychiatrist would be interested in, but I do sometimes have emotional reactions that I'd like to control.

Is CBT aimed specifically at helping people with psychological conditions? Or does it have useful elements that perfectly healthy people can use to help them get over perfectly normal problems? And how can I find out about them?

Replies from: Cameron_Taylor
comment by Cameron_Taylor · 2009-03-28T04:24:18.049Z · LW(p) · GW(p)

Is CBT aimed specifically at helping people with psychological conditions? Or does it have useful elements that perfectly healthy people can use to help them get over perfectly normal problems? And how can I find out about them?

I suggest that perfectly healthy people don't have problems, otherwise they wouldn't be perfectly healthy. As for whether you can find CBT useful without subjecting yourself to labels or stigma: absolutely. In fact, having a brain that isn't acutely disabled by any particular problem can make the techniques somewhat easier to apply.

I recommend *Change Your Thinking* as a useful introduction.

comment by Annoyance · 2009-03-26T19:10:01.105Z · LW(p) · GW(p)

Yes!

It's the only empirically effective talking therapy there is.

Depending on how the standards are set, it's also the only effective psychiatric intervention, period. Manipulating symptoms is nice but not nearly enough.

Replies from: thomblake, Cameron_Taylor, ciphergoth
comment by thomblake · 2009-04-02T17:15:10.640Z · LW(p) · GW(p)

The jury's still out, but EMDR seems promising - it's questionable whether the eye movements are necessary, but it seems to perform as well as CBT.

Replies from: pjeby
comment by pjeby · 2009-04-02T17:52:10.082Z · LW(p) · GW(p)

Actually, in one study, TFT beat out EMDR, but then one of the researchers came up with a hypothesis to explain the effectiveness of TFT, EMDR, TIR, and the NLP V/KD technique... and designed something even better:

After the research study was over, there was much persuasive argument from each of the proponents of the brief therapy methods represented. In a later NLP workshop, Ed Reese challenged me to test the hypothesis of pattern destabilization. I proposed that any stimuli capable of affecting a perturbation in visual, auditory, and kinesthetic modes simultaneously would prove to be as effective in eliminating a traumatic experience as TFT, even without the use of their complex algorithms. The stimuli that I proposed to test the hypothesis with was a game readily found in all children's toy stores called Simon.


comment by Cameron_Taylor · 2009-03-28T04:27:46.612Z · LW(p) · GW(p)

it's also the only effective psychiatric intervention, period.

Absolutely not!

I notice you've acknowledged at least mood stabilisers for bipolar disorder, which I was about to shout out. If you went easy on the 'period' claim, I'd have to concur that many psychiatric interventions are of dubious merit and come with side effects that are not always adequately accounted for.

comment by Paul Crowley (ciphergoth) · 2009-03-26T20:34:15.259Z · LW(p) · GW(p)

The research shows that not all of the efficacy of the drugs is down to the placebo effect.

Replies from: Annoyance
comment by Annoyance · 2009-03-26T20:54:49.013Z · LW(p) · GW(p)

Certainly drugs have effects. Whether the effects of the drugs are really a help is questionable.

There are a few conditions that people usually just can't cope with without drugs, even though the drugs have serious downsides. Lithium is a godsend for manic depression, despite being quite dangerous - but considering how destructive repeated cycling is to people's lives, it's worth the risk.

comment by mattnewport · 2009-03-26T05:42:32.043Z · LW(p) · GW(p)

This touches on what for me is one of the big open questions about what it means to act rationally. I question the common position that the kinds of 'irrational' decisions you describe are actually all that irrational. Many such decisions seem to be rational decisions for an agent with a high time preference at the moment of decision. They may seem irrational from the perspective of a future self who looks back on them when dealing with the consequences, but I see the problem as one of conflicting interests between present and past/future selves more than one strictly of rationality. As the recent post discussed, rationality doesn't provide goals; it only offers a system for achieving them. Many apparently irrational decisions are, I suspect, rational responses to short term goals that conflict with longer term goals.

If I decide to eat a chocolate bar now to satisfy a current craving, I am not really acting irrationally. I have a powerful short term drive to eat chocolate, and there is nothing irrational in my actions to satisfy that short term goal. Later on I may look at the scales and regret eating the chocolate, but that reflects either a conflict between short term and long term goals or a conflict between the goals of my present self and my past self (really just alternative ways of looking at the same problem). It is not a failure of rationality in terms of short term decision making; it is a problem of incentives not aligning across time frames, or between present and future selves. In order to find solutions to such dilemmas it seems more useful to look to micro-economics and the design of incentive structures that align incentives across time scales than to ways to improve the rationality of decisions. The steps I take to acquire chocolate are perfectly rational; the problem is with the conflicts in my incentive structure.
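
One way to make this concrete (my illustration, not part of the original comment): under hyperbolic discounting, the same two options can rank differently depending on when the choice is evaluated, so the present self and the advance-planning self each choose 'rationally' by the discount curve they sit on at that moment. A minimal Python sketch, with every number invented for illustration:

```python
# Minimal sketch: preference reversal under hyperbolic discounting.
# All rewards, delays, and the discount rate k are illustrative assumptions.

def present_value(reward: float, delay: float, k: float = 1.0) -> float:
    """Hyperbolically discounted value of a reward `delay` time units away."""
    return reward / (1.0 + k * delay)

chocolate = 5.0     # small, immediate pleasure
diet_payoff = 20.0  # larger payoff from sticking to the diet
diet_delay = 30.0   # the health benefit is far away

# Choosing at the moment of temptation (chocolate available right now):
print(present_value(chocolate, delay=0))             # 5.00
print(present_value(diet_payoff, delay=diet_delay))  # ~0.65 -> eat the bar

# Making the same choice two weeks (14 time units) in advance:
print(present_value(chocolate, delay=14))                 # ~0.33
print(present_value(diet_payoff, delay=diet_delay + 14))  # ~0.44 -> plan to diet
```

The reversal is exactly the conflict described above: neither self computes anything wrong; they are just at different points on the same discount curve.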

Replies from: cousin_it, anonym, Furcas
comment by cousin_it · 2009-03-26T20:30:41.710Z · LW(p) · GW(p)

Many such decisions seem to be rational decisions for an agent with a high time preference at the moment of decision.

Emotions can also have a lower time preference than your conscious self. For example, a surge of anger can make you stand up to a bully and win you much more than the present confrontation - long-term self-respect and the respect of others - even if you eventually "lose" this particular conflict. My subconscious is always tracking the intangible "social" terms of my long-range utility function, and over the years I've come to appreciate that.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2009-03-27T09:01:58.101Z · LW(p) · GW(p)

I'd describe that as a situation where your long-term interests and your very-short-term interests gang up on your short- to medium-term interests.

Replies from: cousin_it
comment by cousin_it · 2009-03-27T14:36:01.793Z · LW(p) · GW(p)

A great description - funny how it applies to other emotional acts, such as cheating on your spouse (increasing reproductive chances while risking the comfort of family life). It might be enlightening to think of some emotions as optimizations for the very long term - for you and all your descendants (which makes sense, as emotions were created by evolution) - and the rational mind as optimizing for the short to medium term.

Replies from: BeanSprugget
comment by BeanSprugget · 2020-09-01T18:29:42.740Z · LW(p) · GW(p)

It doesn't optimize for "you"; it optimizes for the gene that increases the chance of cheating. The "future" has very little "you" in it.

comment by anonym · 2009-03-26T18:42:21.037Z · LW(p) · GW(p)

The irrational aspect is not that there is a conflict between the short term and the long term, or that you act based on short-term consequences. It is that, while contemplating the chocolate bar, you rationally weigh the short-term, medium-term, and long-term consequences, conclude that not eating it is preferable, and then eat it anyway.

comment by Furcas · 2009-03-26T06:21:25.063Z · LW(p) · GW(p)

This is something I've pondered myself. I think you're at least partly right, but I'm not entirely certain.

Let's say that the desire you usually feel in this regard is to not gain weight. What if, while experiencing the craving, your current desire to eat a chocolate bar doesn't reflect a temporary change in your incentive structure, but instead reflects a temporary distortion of your mental map of reality? For example, your certainty that eating the chocolate bar will make you gain weight might have decreased from 80% to 1%. If the truth is that eating the chocolate bar will in fact make you gain weight, you will therefore be less rational while experiencing this craving than before (or after).
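
To make that concrete (my numbers, not Furcas's): if the craving temporarily rewrites the probability estimate rather than the values, the expected-value comparison flips even though the utilities stay fixed. A minimal sketch with invented utilities:

```python
# Minimal sketch: a craving distorting the map (probability), not the values.
# Both utility numbers are invented for illustration.

pleasure = 5.0           # utility of eating the bar now
weight_gain_cost = 10.0  # disutility if it really does make you gain weight

def expected_value_of_eating(p_gain_weight: float) -> float:
    return pleasure - p_gain_weight * weight_gain_cost

print(expected_value_of_eating(0.80))  # -3.0 -> don't eat (sober estimate)
print(expected_value_of_eating(0.01))  #  4.9 -> eat (craving-distorted estimate)
```

On this view the craving doesn't change what you want; it changes what you believe, which is a failure of epistemic rationality rather than a mere conflict of goals.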

Replies from: mattnewport
comment by mattnewport · 2009-03-26T07:01:18.421Z · LW(p) · GW(p)

I suspect there's a bit of both going on, but I'm fairly sure it's not as dramatic a discounting as an 80% to 1% change (I realize your numbers were only illustrative of the idea). My feeling, based on introspection of the decision-making process when making a choice that favours short term gain over the more 'rational' longer term choice, is that I am still fully aware of the negative consequences; I just discount them heavily.

If there's one area where my judgements are distorted, it is in my estimate of how likely I am to be able to 'make up' for present choices in the future. I think this is a fairly universal phenomenon, and it also reflects conflicts between present and future selves - I may eat the chocolate bar and commit my future self to exercise or a healthier eating regime, but I am far too trusting of my future self and consistently underestimate his incentive to renege on any commitments I attempt to bind him to in the present.

In my personal history I have an unusually explicit example of present/future self conflict. When I was at university I made short term decisions which I explicitly justified, to myself and others, on the grounds that my future self would have to pay for them - and that I anticipated my future self being the kind of person my present self would have no qualms about taking advantage of. I was aware that political views tend to move further to the right with age, as best expressed by the line "Show me a young conservative and I'll show you someone with no heart. Show me an old liberal and I'll show you someone with no brains.", and as a young liberal I anticipated an older and wealthier conservative self who might not believe in wealth transfer. By taking out student loans I could commit my future self to a wealth transfer that suited my purposes at the time but that my future self would likely not approve of.

As it turns out, I was at least partially right about where my political views would move (though of course if I met my younger self now I would attempt to point out the many rational reasons why my views now are in fact more correct, and the many ways in which his understanding was overly simplistic). Overall, however, I don't begrudge my younger self the choices he made, though that may only be because the commitments did not prove overly burdensome.

comment by infotropism · 2009-03-26T23:52:43.394Z · LW(p) · GW(p)

There's a difference between correcting your behaviour to adapt to what you'd judge, intellectually, to be rational, and correcting what generates your behaviour.

It's the same difference that exists between running a compiled program and an interpreted one (or at least back when 'interpreted' meant it would run really slowly).

The second may not be possible in quite a few cases, either. The first is probably possible, but unless it becomes a learned reflex, you'll always have to expend some mental energy on it (http://en.wikipedia.org/wiki/Ego_depletion). In the end, that may work against your objective of acting more in accordance with your best rational judgement.

The best idea I can come up with is: be honest with yourself. There's always a reason why you'd act against your judgement. What is that reason, and why is it stronger?

Where does the strength that your rational decisions may possess come from? From your desire to be rational? From the belief that it is going to be more efficient - the expected better payoff you'll attain by acting rationally, as opposed to having no strategy and following your impulses more or less blindly?

If that strength isn't enough, but you still "want" to act rationally, then maybe you could search inside yourself for the correct feeling - the one you know you'd feel if you were to have the rational reaction. You are angry at someone? Why? How would you feel if you weren't? Can you step back, examine yourself as if you were a third party, and decide that, after all, the satisfaction of that anger wouldn't be a great loss to let go of in favour of a different feeling? Or would you rather keep that satisfaction after all? The same goes for most other similar issues. There's a reason why you act one way and not another. Find it, see if it's really worth it, see how else you could feel, pause for a moment, and see if you could - and would want to - feel like that after all. Then decide for yourself.

comment by byrnema · 2009-03-26T17:41:33.752Z · LW(p) · GW(p)

I really like Matt's point that not all undesired behaviors are irrational. Rather, they reflect conflicts of interest within yourself, either at a single time or across different points in time. It makes sense that we would have such conflicts, since we are very complex systems trying to optimize several things simultaneously.

In a stereotype of rationality, rational people are seen as being without emotions or any physical senses, like computers or robots. Unlike computers and robots, though, people are human beings with organic bodies. I think it is a mistake to discount the importance of having physical bodies which place demands on our utility functions. Matt gave the example of wanting some carbs. My thesis in this comment is that perhaps all irrational behaviors that are not due to faults in logic or incompletely considered information are the "fault of" our physical bodies. Everyone knows that if we don't feel well, it changes everything. Many people can't think rationally if they're too hungry.

Setting aside the broad category of undesired behaviors that are really examples of the conflicts of interest Matt described, I asked myself: at what other times does emotion cause me to act irrationally? These would have to be examples where I behave in a way that I really don't prefer (i.e., not just due to a conflict of interest) but am unable to make decisions in the way that I do prefer because of my emotions.

I can think of many, many examples! In these examples, my emotions hold sway and cause me to act in ways that I do not wish - not even at that time. But then, are these cases not just another example of the influence of a physical body? Perhaps you have a different view, but I think of emotions that I cannot control as being physically based. If I could just turn off the surge of hormones in my body, then I could behave normally and rationally.

I would be interested in ways (mind over matter? psychology? cognitive behavioural therapy, as ciphergoth mentioned?) to have more control over these hormones when some control is needed.

comment by Furcas · 2009-03-26T04:35:21.266Z · LW(p) · GW(p)

Visualizing the consequences of whatever it is I'm trying not to do usually works for me.

comment by CannibalSmith · 2009-03-26T10:28:35.035Z · LW(p) · GW(p)

Sleep on it. Take a nap.