What are the optimal biases to overcome?

post by aaronsw · 2012-08-04T15:04:14.699Z · LW · GW · Legacy · 70 comments

If you're interested in learning rationality, where should you start? Remember, instrumental rationality is about making decisions that get you what you want -- surely there are some lessons that will help you more than others.

You might start with the most famous ones, which tend to be the ones popularized by Kahneman and Tversky. But K&T were academics. They weren't trying to help people be more rational; they were trying to prove to other academics that people were irrational. The result is that they focused not on the most important biases, but on the ones that were easiest to prove.

Take their famous anchoring experiment, in which they showed that the spin of a roulette wheel affected people's estimates of the percentage of African countries in the UN. The idea wasn't that roulette wheels causing biased estimates was a huge social problem; it was that no academic could possibly argue that this behavior was somehow rational. They thereby scored a decisive blow for psychology against economists claiming we're just rational maximizers.

Most academic work on irrationality has followed in K&T's footsteps. And, in turn, much of the stuff done by LW and CFAR has followed in the footsteps of this academic work. So it's not hard to believe that LW types are good at avoiding these biases and thus do well on the psychology tests for them. (Indeed, many of the questions on these tests for rationality come straight from K&T experiments!)

But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude their biggest problem is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences. Usually their biggest problems are far more quotidian and commonsensical.

Take Eliezer. Surely he wanted SIAI to be a well-functioning organization. And he's admitted that lukeprog has done more to achieve that goal of his than he has. Why is lukeprog so much better at getting what Eliezer wants than Eliezer is? It's surely not because lukeprog is so much better at avoiding Sequence-style cognitive biases! lukeprog readily admits that he's constantly learning new rationality techniques from Eliezer.

No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends. As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, "it was a gap in general rationality."

So if you're interested in closing the gap, it seems like the skills to prioritize aren't things like the commitment effect and the sunk cost fallacy, but stuff like "figure out what your goals really are", "look at your situation objectively and list the biggest problems", "when you're trying something new and risky, read the For Dummies book about it first", etc. For lack of better terminology, let's call the K&T stuff "cognitive biases" and this stuff "practical biases" (even though it's all obviously both practical and cognitive, and "biases" is kind of a negative way of looking at it).

What are the best things you've found on tackling these "practical biases"? Post your suggestions in the comments.

70 comments

Comments sorted by top scores.

comment by Vladimir_M · 2012-08-05T08:15:38.920Z · LW(p) · GW(p)

Basically, the problem is that K&T-style insights about cognitive biases -- and, by extension, the whole OB/LW folklore that has arisen around them -- are useless for pretty much any question of practical importance. This is true both with regard to personal success and accomplishment (a.k.a. "instrumental rationality") and with regard to pure intellectual curiosity (a.k.a. "epistemic rationality").

From the point of view of a human being, the really important questions are worlds apart from anything touched by these neat academic categorizations of biases. Whom should I trust? What rules are safe to break? What rules am I in fact expected to break? When do social institutions work as advertised, and when is there in fact conniving and off-the-record tacit understanding that I'm unaware of? What do other people really think about me? For pretty much anything that really matters, the important biases are those that you have about questions of this sort -- and knowing about the artificial lab scenarios where anchoring, conjunction fallacies, etc. are observable won't give you any advantage there.

Note that this applies to your biases about abstract intellectual topics just as much as to your practical life. Whatever you know about any such topic, you know largely ad verecundiam from the intellectual authorities you trust, so that chances are you have inherited their biases wholesale. (An exception here is material that stands purely on rigorous internal logical evidence, like mathematical proofs, but there isn't much you can do with that beyond pure math.) And to answer the question of what biases might be distorting the output of the official intellectual authorities in the system you live under, you need to ask hard questions about human nature and behavior akin to the above listed ones, and accurately detect biases far more complex and difficult than anything within the reach of the simplistic behavioral economics.

Of course, the problem you ultimately run into is that such analysis, if done consistently and accurately, will produce results that clash with the social norms you live under. Which leads to the observation that some well-calibrated instinctive bias towards conformity is usually good for you.

Replies from: MichaelVassar, CarlShulman, Will_Newsome, whowhowho
comment by MichaelVassar · 2012-08-05T17:55:15.809Z · LW(p) · GW(p)

I really like this post. Could you make the link go both ways?
That said, I think you are overstating your case.
Also, if you figure out what local social norms are and that the stories are BS, you can accommodate the norms and ignore the stories internally. You can also optimize separate internal stories and external ones, or alternatively, drop out of the official story entirely and just be some guy who hangs around and is fun to talk to and mysteriously seems to always have enough money for his needs (the secret being largely that one's needs turn out to be very cheap to fulfill, even extravagantly, if optimized for directly, and money is likewise easy to get if optimized for directly). If you aren't dependent on others, don't compete, don't make demands, and are helpful and pleasant, you can get away with not conforming.

Replies from: Vladimir_M, None
comment by Vladimir_M · 2012-08-11T00:15:45.756Z · LW(p) · GW(p)

Could you make the link go both ways?

Sure.

comment by [deleted] · 2012-08-06T18:47:00.884Z · LW(p) · GW(p)

If you aren't dependent on others, don't compete, don't make demands, and are helpful and pleasant, you can get away with not conforming.

If this isn't a joke, how does it balance VM's overstatement?

Replies from: MichaelVassar
comment by MichaelVassar · 2012-08-09T18:29:52.294Z · LW(p) · GW(p)

It's an alternative to having a well-calibrated bias towards conformity.

comment by CarlShulman · 2012-08-12T20:08:16.712Z · LW(p) · GW(p)

Basically, the problem is that K&T-style insights about cognitive biases -- and, by extension, the whole OB/LW folklore that has arisen around them -- are useless for pretty much any question of practical importance.

I agree for most topics, but there are applied cases of clear importance. Investment behavior provides particularly concrete and rich examples, which are a major focus for the K&T school, and "libertarian paternalists" inspired by them: index funds as preferable to overconfident trading by investors, setting defaults of employee investment plans to "save and invest" rather than "nothing," and so forth. Now, you can get these insights packaged with financial advice in books and the like, and I think that tends to be more useful than general study of biases, but the insights are nonetheless important to the tune of tens or hundreds of thousands of dollars over a lifetime.
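
A rough back-of-the-envelope sketch in Python of how fees and trading drag compound into the kind of lifetime difference being claimed; the contribution, return, and fee figures below are invented for illustration, not taken from the comment.

    # Illustrative only: compare a low-cost index fund with higher-cost active
    # trading over a working lifetime. All figures are assumptions.
    def final_balance(annual_contribution, gross_return, annual_cost, years):
        """Future value of yearly contributions at a net annual return."""
        net = gross_return - annual_cost
        balance = 0.0
        for _ in range(years):
            balance = balance * (1 + net) + annual_contribution
        return balance

    index = final_balance(10_000, 0.07, 0.001, 40)  # ~0.1% index-fund expense ratio
    active = final_balance(10_000, 0.07, 0.02, 40)  # ~2% fees plus trading drag

    print(f"Index fund:     ${index:,.0f}")
    print(f"Active trading: ${active:,.0f}")
    print(f"Difference:     ${index - active:,.0f}")  # hundreds of thousands of dollars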

comment by Will_Newsome · 2012-08-11T00:48:53.880Z · LW(p) · GW(p)

are useless for pretty much any question of practical importance

Worse than useless: they give the illusion of insight.

(And I feel like many comments on this post are sort of exemplary of that problem—as you put it in a different context, the equivalent of magic healing crystals are being talked about in a frighteningly serious manner.)

comment by whowhowho · 2013-02-02T16:41:30.267Z · LW(p) · GW(p)

By Django, that needed saying!

comment by John_Maxwell (John_Maxwell_IV) · 2012-08-05T00:30:07.382Z · LW(p) · GW(p)

Here are some tentative guesses about this whole rationality and success business.

Let's set aside "rationality" for a minute and talk about mental habits. Everyone seems to agree that having the right habits is key to success, perhaps most famously the author of The 7 Habits of Highly Effective People. But if you look at the 7 habits that Covey identifies ("Be Proactive", "Begin with the End in Mind", "Put First Things First", "Think Win/Win", "Seek First to Understand, Then Be Understood", "Synergize", and "Sharpen the Saw"), they don't look too much like what gets discussed on Less Wrong. So what gives?

I think part of the problem is the standard pattern-matching trap. Perhaps books like Covey's genuinely do address the factors that the vast majority of people need to work on in order to be more successful. But analytical folks tend not to read these books because

  • they're part of a genre that's sullied its reputation by overpromising
  • even when they don't overpromise, analytical people are rarely part of the target audience, and the books do things like give incorrect folk explanations for stuff that actually happens to work (but the analytical people don't try the stuff because they can tell the stated explanation for why it works is bogus)
  • they tend to distrust their emotions, and a good part of how the books work, when they do, is by manipulating your emotions to elevate your mood and make it easier for you to get to work or implement changes
  • analytical people typically manage to figure out stuff up to the level discussed in popular books for themselves, and don't do further careful study because they've written off the genre

So in the same way that pure math grad students are smarter than psychology grad students, even though good psychology research is probably higher-value than good pure math research, Less Wrong has focused on a particular set of mental habits that have the right set of superficial characteristics: mental habits related to figuring out what's true. But figuring out what's true isn't always that important for success. See Goals for which Less Wrong does (and doesn't) help. (Although the focus has gradually drifted towards more generally useful mental habits since the site's creation, I think.)

A big problem with addressing these more generally useful habits through the internet is that people who get good enough at applying them are liable to decide that surfing the internet is a waste of time and leave the conversation. I'm quite interested if anyone has any suggestions for dealing with this problem.

So when Holden Karnofsky says something like "rationality is a strong (though not perfect) predictor of success", maybe he is claiming that mental habits that make you better at figuring out what's true are actually quite useful in practice. (Or maybe by "rationality" he means "instrumental rationality", in which case his statement would be true by definition.) Perhaps the reason Stephen Covey doesn't write about that stuff is because it's too advanced or controversial for him or his audience?

(Disclaimer: I haven't read The Seven Habits of Highly Effective People, although I did read the version for teenagers when I was a teenager.)

Replies from: aaronsw, timtyler
comment by aaronsw · 2012-08-05T11:43:04.409Z · LW(p) · GW(p)

I really enjoyed The Seven Habits of Highly Effective People. (By contrast, I tried reading some @pjeby stuff yesterday and it had all the problems you describe cranked up to 11, and I found it incredibly difficult to keep reading.)

I don't think the selection bias thing would be a problem if the community was focused on high-priority instrumental rationality techniques, since at any level of effectiveness becoming more effective should be a reasonably high priority. (By contrast, if the community is focused on low-priority techniques it's not that big a deal (that was my attitude toward OvercomingBias at the beginning) and when it gets focused on stuff like cryo/MWI/FAI I find that an active turnoff.)

I think there's a decent chance epistemic rationality, ceteris paribus, makes you less likely to be traditionally successful. My general impression from talking to very successful people is that very few of them are any good at figuring out what's true; indeed, they often seem to have set up elaborate defense mechanisms to make sure no one accidentally tells them the truth.

Replies from: pjeby
comment by pjeby · 2012-08-07T02:09:26.985Z · LW(p) · GW(p)

I tried reading some @pjeby stuff yesterday and it had all the problems you describe cranked up to 11

Technically, John was describing the problems of analytical readers, rather than the problems of self-help writers. ;-)

I have noticed, though, that some of my early writing (e.g. 2010 and before) is very polarizing in style: people tend to either love it or hate it, and the "hate it" contingent seems larger on LW than anywhere else.

However, most of the people who've previously said on LW that they hate my writing, seemed to enjoy this LW post, so you may find something of use there.

Replies from: David_Gerard
comment by David_Gerard · 2012-08-07T11:36:30.374Z · LW(p) · GW(p)

It's the ...

INFOMERCIAL STYLE!

... of formatting. Doesn't work for everyone ;-)

comment by timtyler · 2012-08-05T12:15:59.912Z · LW(p) · GW(p)

So when Holden Karnofsky says something like "rationality is a strong (though not perfect) predictor of success", maybe he is claiming that mental habits that make you better at figuring out what's true are actually quite useful in practice. (Or maybe by "rationality" he means "instrumental rationality", in which case his statement would be true by definition.)

Of course, instrumental rationality is not a perfect predictor of success either. There are always stochastic factors with the potential to lead to bad outcomes. How strong a predictor it is depends on the size of such factors.

comment by David_Gerard · 2012-08-04T17:15:30.237Z · LW(p) · GW(p)

Eliezer noted (in a comment on a blog post I made about Nonprofit Kit For Dummies) that he did in fact buy the book and try to apply it. This suggests the difference was in fact Luke, and that we need How To Be Lukeprog For Dummies, which he is of course posting piecemeal ;-)

Replies from: lukeprog
comment by lukeprog · 2012-08-05T01:54:49.966Z · LW(p) · GW(p)

Eliezer's comment doesn't say he tried to apply the lessons in Nonprofit Kit for Dummies, though some of it he clearly did — e.g. filing the necessary paperwork to launch a 501c3!

Anyway, reading a how-to book doesn't help much unless you actually do what the book recommends. That's why it's such an important intervention to figure out How To Actually Do The Stuff You Know You Should Be Doing — also known as How to Beat Procrastination.

But the anti-akrasia techniques we've uncovered so far don't work for everyone, and there are other factors at play. For example, since a young age Eliezer has become cognitively exhausted rather quickly. He has spent years trying different things (diet, exercise, context changes, vitamins, etc.) but still hasn't found an intervention that lets him do cognitive work for as long as I can. (Luckily, the value of an hour of cognitive work from Eliezer is much higher than the value of an hour of cognitive work from me.)

Also, there was no time in history when it made sense for Eliezer Yudkowsky to spend his time doing Nonprofit Kit for Dummies stuff. (But it would have made sense, I think, for Eliezer to try harder to find someone who could do non-profit management better, or to try harder to find someone who could execute that search more effectively. This is the kind of thing I meant by mentioning a potential "gap in general rationality.")

P.S. Eliezer's memory of reading Nonprofit Kit for Dummies "before starting the Singularity Institute in 2000" must be mistaken. The first edition of Nonprofit Kit for Dummies wasn't published until 2001.

Replies from: Pablo_Stafforini
comment by Pablo (Pablo_Stafforini) · 2012-08-07T17:11:38.446Z · LW(p) · GW(p)

reading a how-to book doesn't help much unless you actually do what the book recommends. That's why it's such an important intervention to figure out How To Actually Do The Stuff You Know You Should Be Doing — also known as How to Beat Procrastination.

I'm confused. You seem to be suggesting that procrastination is one of the main "biases" we need to overcome (or, as I would put it, that the ability to beat procrastination is one of the main "practical skills" we need to develop). But aaronsw disagrees that this is what you yourself believe: "As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, 'it was a gap in general rationality.'" (emphasis added) Could you clarify?

comment by Dreaded_Anomaly · 2012-08-04T18:07:47.693Z · LW(p) · GW(p)

But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude the issue is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences.

I very often conclude that people are suffering from the planning fallacy.

Replies from: RomeoStevens, Decius
comment by RomeoStevens · 2012-08-04T20:40:46.122Z · LW(p) · GW(p)

Not falling prey to the planning fallacy is the most obvious and quantifiable result from applying rationality techniques in my day to day life.

comment by Decius · 2012-08-04T22:01:18.069Z · LW(p) · GW(p)

I very often conclude that people are suffering from the planning fallacy.

How often is that the reason that they aren't making progress toward their goals?

Replies from: Dreaded_Anomaly
comment by Dreaded_Anomaly · 2012-08-05T21:31:56.492Z · LW(p) · GW(p)

How often is that the reason that they aren't making progress toward their goals?

Very often. Any project that goes "over budget" - that's the planning fallacy. On a smaller scale, any meeting which goes too long or has too many scheduled presentations (90% of the meetings I've attended) - that's the planning fallacy. The people who plan meetings or budget projects are aiming for the meetings to end on time and the projects to be completed within their budgets, but they're not meeting those goals.

Replies from: Decius
comment by Decius · 2012-08-06T01:25:45.253Z · LW(p) · GW(p)

So... if there is a 10% chance that there will be a 25% cost overrun, and a 90% chance that the unexpected expenses will fall within the contingency budget, should the budget be 125% of projected cost, or 102.5% of projected cost? If there are 6 items on the agenda, and a 95% chance that each of them will take 5 minutes but a 5% chance that they will take 15 minutes, how long should the meeting be scheduled for?
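
(A quick expected-value sketch of the two questions above, in Python, using only the hypothetical figures given in the comment:)

    # Expected-value sketch of the hypothetical figures above; illustrative only.

    # Budget: 10% chance of a 25% overrun, 90% chance costs stay at baseline.
    projected = 1.0
    expected_cost = 0.90 * projected + 0.10 * projected * 1.25
    print(f"Expected cost: {expected_cost:.1%} of projection")  # 102.5%
    # Budgeting 102.5% matches the expectation but still gets blown ~10% of the
    # time; budgeting 125% always covers the overrun at the price of idle slack.

    # Meeting: 6 agenda items, each 5 minutes with p=0.95 or 15 minutes with p=0.05.
    items = 6
    expected_minutes = items * (0.95 * 5 + 0.05 * 15)
    print(f"Expected length: {expected_minutes:.0f} minutes")  # 33
    p_any_long = 1 - 0.95 ** items
    print(f"P(at least one item runs long): {p_any_long:.0%}")  # ~26%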

Keep in mind that meetings will expand to fill the allocated time, even if they are completed before then, and projects will tend to use their entire budget if possible.

Granted, some people budget without a 'contingency' line item, but budgeting for the expected serious cost increase doesn't significantly reduce the odds of going over budget, because the frequency of a serious overrun is so low that the expected cost is much smaller than the actual cost should one occur.

Expecting all projects to complete on time and within budget, now that IS a planning fallacy.

Replies from: handoflixue
comment by handoflixue · 2012-08-06T19:54:37.789Z · LW(p) · GW(p)

So... if there is a 10% chance that there will be a 25% cost overrun, and a 90% chance that the unexpected expenses will fall within the contingency budget, should the budget be 125% of projected cost, or 102.5% of projected cost?

Which is entirely the wrong way to go about the problem. If this project is critical, and its failure will sink the company, you really, really want to be in a position to handle the 25% cost overrun. If you have ten other identically-sized, identically-important projects, then the 102.5% estimate is probably going to give you enough of a contingency to handle any one of them going over budget (but what is your plan if two go over budget?)

Thinking in terms of statistics, without any actual details attached, is one of the BIG failure modes I see from rationalists - and one that laypeople seem to avoid just fine, because to them the important thing is that Project X will make or break the company.

Keep in mind that meetings will expand to fill the allocated time, even if they are completed before then, and projects will tend to use their entire budget if possible.

I'd suggest that this is a solvable problem - I've worked in multiple offices where meetings routinely ended early. Having everyone stand helps a lot. So does making them a quick and daily occurrence (it becomes routine to show up on time). So does having a meeting leader who keeps things on-topic, understands when an issue needs to be "taken offline" or researched and brought up the next day, etc..

Replies from: sakranut, Decius
comment by sakranut · 2012-08-06T20:13:22.020Z · LW(p) · GW(p)

If this project is critical, and its failure will sink the company, you really, really want to be in a position to handle the 25% cost overrun

So, to refine Decius' formula from above, you'd want to add in a variable which represents expected marginal utility of costs.

Thinking in terms of statistics, without any actual details attached, is one of the BIG failure modes I see from rationalists

I don't think the problem here is thinking in terms of statistics; I think that the problem is attempting to use a simple model for a complicated decision.

[edited for grammar]

Replies from: handoflixue
comment by handoflixue · 2012-08-06T20:40:51.550Z · LW(p) · GW(p)

I don't think the problem here is thinking in terms of statistics; I think that the problem is attempting to use a simple model for a complicated decision.

Both geeks and laypeople seem to use overly simple models, but (in my experience) they simplify in DIFFERENT ways: Geeks/"rationalists" seem to over-emphasize numbers, and laypeople seem to under-emphasize them. Geeks focus on hard data, while laypeople focus on intuition and common sense.

Replies from: fubarobfusco
comment by fubarobfusco · 2012-08-06T21:39:18.802Z · LW(p) · GW(p)

"Intuition and common sense" sound more like styles of thought process, not models. The models in question might be called "folklore" and "ordinary language" — when thinking "intuitively" with "common sense", we expect the world to fit neatly into the categories of ordinary language, and for events to work out in the way that we would find plausible as a story.

comment by Decius · 2012-08-06T21:02:29.566Z · LW(p) · GW(p)

If you have a project which will bankrupt the company if it fails, then it does not have a budget. It has costs. If you have multiple such projects, such that if any one of them fails, the company goes bankrupt, then they all have costs instead of budget.

Note that I'm assigning such a large negative value to bankruptcy that it is only trivially worse to be bankrupt with a large amount of debt than to be bankrupt with a smaller amount of debt. If the sunk cost fallacy applies, then there is a fate significantly worse than cancelling the project due to cost overruns: funding the project more and having it fail.

Tricks to avoid long meetings are different than figuring out how long a meeting will last.

Replies from: handoflixue
comment by handoflixue · 2012-08-06T21:36:59.738Z · LW(p) · GW(p)

Tricks to avoid long meetings are different than figuring out how long a meeting will last.

Hence it was instead a response to the idea that meetings will expand to fill their schedule - if you don't solve that, then scheduling is that much less reliable.

If you have a project which will bankrupt the company if it fails, then it does not have a budget.

Yes it does; even if the budget is "100% of the company resources", that's still a constraint. Given that the odds of success probably drop drastically if you stop providing payroll, paying rent, etc., it's constrained further. It might also be the case that spending (say) 10% of your resources elsewhere will double your profits on success, but you have a corresponding 10% chance of failure because of it.

90% chance of a major success vs 10% chance of bankruptcy is not necessarily a trivial decision.

comment by [deleted] · 2012-08-05T09:19:01.536Z · LW(p) · GW(p)

I really don't like the word 'bias', especially in combination with 'overcoming'. It implies that there's an ideal answer being computed by your brain, but that it has a bias added to it, which you can overcome to get the correct answer. Much more plausible is that you do not have the answer at all, and you substitute some more or less flawed heuristic. And if you just overcome this heuristic you will get dumber.

Replies from: Swimmer963
comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2012-08-06T03:49:35.054Z · LW(p) · GW(p)

I think your point is a good one. However, I don't think you're disagreeing with the main point of the post, since 'flawed heuristics' are more an example of the traditional 'biases' studied by researchers, which aaronsw is indeed claiming aren't that important for improving general rationality. The "not getting around to reading a 'nonprofits for dummies' book" isn't an example of overcoming a heuristic and becoming dumber, it's an example of having a knowledge/common sense gap and not applying any kind of heuristic at all. "Always read the relevant 'for dummies' book first if you want to start working on a project" is a heuristic, which is probably biased in itself, but which most people don't follow when they would be better off following it.

Also, I think there is more subtlety to 'overcoming bias' than just not using that heuristic anymore (and maybe being dumber). Heuristics exist because they are useful in most circumstances, but they occasionally fail massively when subjected to new and unexpected types of situations. Realizing that thinking happens in the form of heuristics, and then trying to notice when you're in a situation where you wouldn't expect the heuristic to apply, can help with the problem of being overconfident on a given problem. Recognized ignorance is preferable to being very certain of an answer that is likely wrong, in terms of not making decisions that will blow up in your face.

Replies from: None
comment by [deleted] · 2012-08-06T06:54:23.097Z · LW(p) · GW(p)

There may be more subtlety in the ideal, but I fail to see it in practice, and least of all do I see any sign of lower overconfidence.

comment by Robert Miles (robert-miles) · 2012-08-04T16:10:27.517Z · LW(p) · GW(p)

The difference in optimisation targets between LW and H&B researchers is an important thing to point out, and probably the main thing I'll take away from this post.

Biases can:

  • Be interesting to learn about
  • Serve an academic/political purpose to research
  • Give insight into the workings of human cognition
  • Be fun to talk about
  • Actually help you achieve your goals, if you understand them

And the correlations between any 2 of these things need not be strong or positive.

Is it the halo effect if we assume that a more interesting bias will better help us achieve our goals?

comment by Manfred · 2012-08-05T06:56:35.037Z · LW(p) · GW(p)

No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends.

There's a similar principle that I use sometimes when solving physics problems, and when building anything electronic. It's called "Do it the Right Way."

Most of the time, I take shortcuts. I try things that seem interesting. I want to rely on myself rather than on a manual. I don't want to make a list of things to do, but instead want to do things as I think of them.

This is usually fine - it's certainly fast when it works, and it's usually easy to check my answers. But as I was practicing physics problems with a friend, I realized that he was terrible at doing things my way. Instead, he did things the right way. He "used the manual." Made a mental list. Followed the list. Every time he made a suggestion, it was always the Right Way to do things.

With physics, these two approaches aren't all that far apart in terms of usefulness - though it's good to be able to do both. But if you want to do carpentry or build electronics, you have to be able to do things the Right Way.

Replies from: robert-miles
comment by Robert Miles (robert-miles) · 2013-03-07T12:01:40.620Z · LW(p) · GW(p)

To add to that, if you want to Do Things The Right Way, don't use a mental list, use a physical list. Using a checklist is one of the absolute best improvements you can make in terms of payoff per unit of effort. The famous example is Gawande, who tested using a "safe surgery" checklist for surgeons, which resulted in a 36% reduction in complications and a 47% fall in deaths.

comment by RHollerith (rhollerith_dot_com) · 2012-08-04T23:27:32.448Z · LW(p) · GW(p)

Gerald Weinberg is a celebrated author of computer and management books. And for many years he was a management consultant. Often he would get a consulting gig at an organization he had consulted for in the past. The unhealthy organizations, he observed, had the same (crushing) worst problem during his second gig that they had during his first gig, whereas the better organizations tended to have lots of little problems, which he took as a sign that the organization was able to recognize their worst problems and slowly or quickly shrink them.

I am not sure because I do not have access to the book, but that is probably from the chapter or section "Rudy’s Law of Rutabagas" from Weinberg's book The Secrets of Consulting: A Guide to Giving and Getting Advice Successfully.

What Weinberg did after he stopped doing management consulting, by the way, is to run workshops on improving what we would call individual and team rationality, and he maintained that people learned the skills he taught a lot better in the right kind of interpersonal situations (e.g., workshops) than they did from written materials.

Hope that helps someone.

comment by aaronsw · 2012-08-04T14:33:58.125Z · LW(p) · GW(p)

Use direct replies to this comment for suggesting things about tackling practical biases.

Replies from: aaronsw, D_Malik, aaronsw, D_Malik, aaronsw
comment by aaronsw · 2012-08-04T22:17:57.254Z · LW(p) · GW(p)

Carol Dweck's Mindset. While it unfortunately has the cover of a self-help book, it's actually a summary of some fascinating psychology research which shows that a certain way of conceptualizing self-improvement tends to make people unusually effective at it.

Replies from: Pablo_Stafforini, Dorikka, Pablo_Stafforini
comment by Dorikka · 2012-08-08T02:42:22.177Z · LW(p) · GW(p)

Reviews seem to indicate that the book can and should be condensed into a couple quality insights. Is there any reason to buy the actual book?

Replies from: aaronsw
comment by aaronsw · 2012-08-08T16:31:16.932Z · LW(p) · GW(p)

The main insight of the book is very simple to state. However, the insight was so fundamental that it required me to update a great number of other beliefs I had, so I found being able to read a book's worth of examples of it being applied over and over again was helpful and enjoyable. YMMV.

comment by Pablo (Pablo_Stafforini) · 2012-08-07T16:57:46.639Z · LW(p) · GW(p)

I took a look at Mindset. The book seemed to me extremely repetitive and rambling. Its teachings could be condensed in an article ten or fifteen times shorter. Fortunately, this Stanford Magazine piece seems to accomplish something close to that. So, read the piece, and forget the book.

comment by D_Malik · 2012-08-05T04:29:34.143Z · LW(p) · GW(p)

Set a ten-minute timer and make a list of all the things you could do that would make you regret not doing them sooner. And then do those things.

I have a pretty long list like this that I try to look at every day, but I can't post it for the next two weeks for a complicated, boring reason.

Replies from: aaronsw
comment by aaronsw · 2012-08-19T13:14:56.453Z · LW(p) · GW(p)

It's been two weeks. Can you post it now?

Replies from: None
comment by [deleted] · 2012-08-25T06:02:42.186Z · LW(p) · GW(p)

Indeed I can, and thank you for reminding me: D_Malik d_livers.

I didn't see your comment earlier because I switched accounts to stop using a pseudonym (as you can see), and I haven't been browsing the internet much lately because I'm doing my anki backlog, which I have because I was away from home for three weeks doing SPARC and other things, which, together with the fact that my anki "ideas" deck was corrupt (because I copied it over to my ipad before first closing anki) and the fact that I couldn't de-corrupt it on my ipad and didn't have my laptop with me, made me unable to post it at the time of the grandparent comment.

comment by aaronsw · 2012-08-04T14:38:23.151Z · LW(p) · GW(p)

lukeprog's writings, especially Build Small Skills in the Right Order.

comment by D_Malik · 2012-08-05T04:51:35.171Z · LW(p) · GW(p)

Buy some nicotine gum and chew that while doing useful stuff, like working out, doing SRS reviews, thinking really hard about important things, etc..

Of course you should read up on nicotine gum before you do this. Start here.

Replies from: niceguyanon
comment by niceguyanon · 2012-08-08T01:41:35.171Z · LW(p) · GW(p)

I am curious about using nicotine as a low-cost way to improve performance and build positive habits for exercise. However, as an ex-tobacco smoker (4 years), I am very wary of my interest in nicotine, because I suspect that my interest is based on latent cravings. After reading about the positive effects of nicotine, all I could think about was taking a pull of an e-cig; I didn't give any thought at all to gums or patches, which should be a warning sign.

I am quite conflicted about this — I am very certain I would not go back to smoking tobacco, but I see myself using e-cigs as a daily habit rather than to promote habit learning on skills and activities that I want.

Replies from: None
comment by [deleted] · 2012-08-25T08:25:52.816Z · LW(p) · GW(p)

I think you should be careful and stick to gum or lozenges (or maybe patches) if you do nicotine at all.

Chewing a 4mg nicorette (gradually, as per the instructions) produces blood concentrations of nicotine about 2/3 that of a cigarette. If you cut a 4mg nicorette into 4 pieces like I do, and only take 1 piece per 30 minutes, that's even less. It's not enough to produce any sense of visceral pleasure for me (in a study on gwern's page, people couldn't distinguish between 1mg nicotine and placebo), but I think it's still enough to form habits. I don't think you should use nicotine as a way of "rewarding" things (by producing noticeable pleasure).

Maybe you could get someone else to dish out nicotine only when you're doing things you want to reinforce? That way it'll be harder for you to relapse.

(I'm D_Malik; I didn't see your post earlier because I changed usernames.)

Replies from: niceguyanon
comment by niceguyanon · 2012-10-18T04:29:11.203Z · LW(p) · GW(p)

Update

I eventually purchased Walgreens-branded 21mg 24-hour release patches, which I cut into 4 equal doses. I use them for days when I go weight lifting or bouldering. I feel a noticeable alertness when I use the patches. I did not notice any increased desire to smoke, and have no noticeable cravings on days when I am off the patch. I decided to stay away from any instant forms of nicotine such as e-cigs or gum.

comment by aaronsw · 2012-08-04T14:34:19.177Z · LW(p) · GW(p)

Ray Dalio's "Principles". There's a bunch of stuff in there that I disagree with, but overall he seems pretty serious about tackling these issues -- and apparently has been very successful.

comment by Pablo (Pablo_Stafforini) · 2012-08-04T17:35:53.455Z · LW(p) · GW(p)

I think that you are using the word 'bias' somewhat idiosyncratically here, and that this might be causing some people to have a hard time understanding the main point of this post, which (if I may) I would summarize as follows:

Many people in this community seem to believe that, when we do not get what we want, this is primarily because we are afflicted by one or more cognitive biases, such as anchoring or scope insensitivity. But this is not so. The main source of practical irrationality is lack of certain practical skills or habits, like "figuring out what your goals really are" or "looking at your situation objectively and listing the biggest problems". What are the best ways to develop these skills?

Replies from: Cyan, Viliam_Bur
comment by Cyan · 2012-08-05T00:22:29.398Z · LW(p) · GW(p)

I can vouchsafe that the June CFAR minicamp covered a lot of material on figuring out what your goals really are.

Replies from: Pablo_Stafforini
comment by Pablo (Pablo_Stafforini) · 2012-08-05T01:21:59.424Z · LW(p) · GW(p)

You may want to move this comment to the appropriate thread.

Replies from: Cyan
comment by Cyan · 2012-08-05T03:52:44.312Z · LW(p) · GW(p)

The material is still a work-in-progress, so minicampers have been asked not to make it public.

comment by Viliam_Bur · 2012-08-09T15:06:57.758Z · LW(p) · GW(p)

More meta: "Believing that not achieving our goals is caused by cognitive biases, when it is actually caused by a lack of skills and habits" is a cognitive bias, isn't it?

It only needs some name that would make it easier to remember. Something like: "Nerd Over-Thinking Fallacy".

Replies from: wedrifid, nshepperd
comment by wedrifid · 2012-08-09T15:12:16.011Z · LW(p) · GW(p)

It only needs some name that would make it easier to remember. Something like: "Nerd Over-Thinking Fallacy".

Nerd over-thinking is a different problem, and occurs even when the nerds in question don't necessarily believe that the overthinking is useful.

comment by nshepperd · 2012-08-09T17:34:23.222Z · LW(p) · GW(p)

More meta: "Believing that not achieving our goals is caused by cognitive biases, when it is actually caused by a lack of skills and habits" is a cognitive bias, isn't it?

Nope.

comment by Vaniver · 2012-08-07T17:00:50.332Z · LW(p) · GW(p)

As much as I think lukeprog - EY comparisons are informative, I wonder if the difference is just different amounts of energy. I hear that lukeprog is working 60 hour weeks and that EY had trouble being productive for more than 4 hours a day, and looking for citations I noticed this down the page.

That can't explain everything - there's another comment that comes to mind that I'm having difficulty finding, in which one of lukeprog's hacks dramatically increased EY's writing output - but it seems like it's part of a complete explanation.

Replies from: MileyCyrus
comment by MileyCyrus · 2012-08-10T08:51:05.984Z · LW(p) · GW(p)

So how does one get more energy?

Replies from: Vaniver
comment by Vaniver · 2012-08-10T16:45:21.033Z · LW(p) · GW(p)

Various diet and exercise changes seem to improve energy, but like mood and intelligence I suspect the range one can inhabit varies based on biological factors that are mostly beyond individual control.

Replies from: gwern
comment by gwern · 2012-08-10T18:12:27.177Z · LW(p) · GW(p)

Indeed. Conscientiousness is a pretty durable personality trait (as are all of the Big Five, and to make things worse, they tend to be 20-50% heritable too!). This is why I've spent so much time looking into stimulants: 'use the Try Harder, Luke!' doesn't work very well. (Unless your last name is Muehlhauser, I suppose.)

comment by shminux · 2012-08-06T06:20:15.320Z · LW(p) · GW(p)

My guess would be that risk analysis and mitigation would be one of the more useful positive techniques in practical rationality. I wish every organization with executive officers had a CRO (chief risk officer) position. Of course, a person like that would be highly unpopular, as they would be constantly asking some very hard questions. Imagine that it is you against Murphy. What can go wrong? What are the odds of its going wrong? What are the odds of you mis-estimating that it will go wrong? What has gone wrong in the past? What are the potential mitigation steps? What are the odds of the mitigation steps themselves going wrong? Basically, a CRO would ensure that an organization is (almost) never blindsided, except maybe for true black swans. Otherwise the most that can happen is "a failure mode we described has occurred; we should now review, possibly update, and implement the risk mitigation steps outlined". The standard business plan is certainly not a substitute for something like that.

Most companies do not do nearly enough risk analysis and management, possibly because the CEOs are required to be optimistic, and neither the CEO nor the board are personally responsible for failures. The worst that can happen is that they are booted out and get a golden parachute.

comment by DuncanS · 2012-08-05T00:16:56.458Z · LW(p) · GW(p)

My top 2....

Looking at unlikely happenings more sensibly. Remembering that whenever something really unlikely happens to you, it's not a sign from the heavens. I must remember to take into account the number of other unlikely things that might have happened instead that I would also have noticed, and the number of things that happen in a typical time. In a city of a million people, meeting a particular person might seem like a one in a million chance. But if I know a thousand people in the city, and walk past a thousand people in an hour, the chance of bumping into one of my friends is pretty good.
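
(A quick sketch of the arithmetic behind that intuition, in Python, using the figures from the paragraph above:)

    # 1,000 friends in a city of 1,000,000; walking past 1,000 people in an hour.
    p_friend = 1_000 / 1_000_000            # chance any one passer-by is a friend
    encounters = 1_000
    p_at_least_one = 1 - (1 - p_friend) ** encounters
    print(f"{p_at_least_one:.0%}")          # ~63%, so hardly a sign from the heavens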

The other one? We're all too optimistic about our own abilities. Most of the time that's pretty benign, but it's a good thing to remember when considering employing yourself as a stock picker, gambling advisor, or automobile driver. We're actually much more average than we think, most of the time.

comment by gwern · 2012-08-04T19:38:16.784Z · LW(p) · GW(p)

http://lesswrong.com/lw/ahz/cashing_out_cognitive_biases_as_behavior/ may be of relevance. The single strongest correlation with various unhappy behaviors or outcomes (the Decision Outcomes Inventory, DOI) in Bruine de Bruin 2007 (weaker than the overall correlation with succumbing to various fallacies, though!) was 'Applying Decision Rules':

Applying Decision Rules asks participants to indicate, for hypothetical individual consumers using different decision rules, which of five DVD players they would buy (e.g., “Lisa wants the DVD player with the highest average rating across features,” describing an equal weights rule). Each consumer chooses from a different set of five equally priced DVD players with varying ratings of picture quality, sound quality, programming options, and brand reliability (from 1 [very low] to 5 [very high]). The decision rules are taken from Payne, Bettman, and Johnson (1993) and include elimination by aspects, satisficing, lexicographic, and equal weights rules. The present task uses more complex rules than the Y-DMC, which, in pretests, proved too easy for adults. Performance is measured by the percentage of items for which the correct DVD players are chosen, given the decision rule to be applied.

Seems somewhat reasonable to me - if you can't even shop well, you're probably going to overspend or buy unsatisfactory stuff or just junk, behavior which will cost you over your entire life.
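
A minimal sketch of the kind of rule application the task describes, showing just the equal-weights rule; the DVD players and their ratings below are invented for illustration, not taken from the actual test items.

    # Hypothetical DVD players rated 1 (very low) to 5 (very high) on four features.
    players = {
        "A": {"picture": 4, "sound": 2, "programming": 3, "reliability": 5},
        "B": {"picture": 3, "sound": 4, "programming": 4, "reliability": 4},
        "C": {"picture": 5, "sound": 3, "programming": 2, "reliability": 2},
    }

    def equal_weights_choice(options):
        # "Lisa wants the DVD player with the highest average rating across features."
        return max(options, key=lambda name: sum(options[name].values()) / len(options[name]))

    print(equal_weights_choice(players))  # "B" has the highest average rating here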

comment by novalis · 2012-08-04T19:43:54.752Z · LW(p) · GW(p)

As I recall, experiments show that people who learn about anchoring are still susceptible to anchoring, but people who learn about the sunk cost fallacy are less likely to throw good money after bad.

Replies from: gwern, Error
comment by gwern · 2013-01-17T20:37:26.246Z · LW(p) · GW(p)

Sunk cost training has mixed results: http://www.gwern.net/Sunk%20cost#fnref37

Replies from: novalis
comment by novalis · 2013-01-18T04:53:55.181Z · LW(p) · GW(p)

Thanks. I guess it should be no surprise that people's behavior on quizzes bears very little relation to their behavior in real life. This shows that CFAR probably ought to find some way to test the effectiveness of their training other than by a written test.

comment by Error · 2013-01-17T19:44:18.402Z · LW(p) · GW(p)

Supportive anecdote: Since I started reading here, I've started to consciously see and avoid the sunk costs fallacy.

(as opposed to several other biases that I now see and recognize, but still don't avoid, apparently because I am an idiot.)

comment by beoShaffer · 2012-08-07T18:39:03.765Z · LW(p) · GW(p)

This obviously doesn't help right here and now, but I would like to point out that CFAR is in a good position to investigate this question experimentally. We'll have to wait awhile to be sure, but it looks like they have developed decent debiasing procedures and life outcome measures. I'm also guessing that they can't train people against every bias in a single retreat. Thus they can include different biases in different curricula and compare their practical effects.

comment by John_Maxwell (John_Maxwell_IV) · 2012-08-05T00:38:15.344Z · LW(p) · GW(p)

It's a good point that academics are likely to focus on those biases that are likely to be easy to prove, not those that are likely to be important to fix. But I'd expect the most important biases to also manifest themselves in a big way and in lots of different scenarios, and therefore be relatively easy to prove.

comment by [deleted] · 2014-06-13T08:38:11.573Z · LW(p) · GW(p)

''"A large body of evidence[1][2][3][4][5][6][7][7][8][9][10] has established that a defining characteristic of cognitive biases is that they manifest automatically and unconsciously over a wide range of human reasoning, so even those aware of the existence of the phenomenon are unable to detect, let alone mitigate, their manifestation via awareness only."''

AFAIK, currently, none of them. The entire effort is futile, and the introductory paragraph to LessWrong appears self-defeating in light of this. I think there is far more to this place than cognitive bias mitigation.