[Draft] Productive Use of Heuristics and Biases
post by wattsd · 2012-08-26T18:08:15.564Z · LW · GW · Legacy · 18 comments
The main point
If we are naturally biased and limited in how rational our decision making can be, it might be more effective to improve intuition and manage biases rather than trying to completely eliminate them and focus on unbiased thinking. I'm not advocating irrationality; instead, I'm interested in how to deal with bounded rationality. How can we improve our ability to satisfice? How can we productively use and develop our intuition?
Caveats
While I intend for this post to be informative, it may not be, and I suspect I'll learn more from writing it and from feedback than others will get from reading it. There is much on LessWrong I haven't read, so I apologize if this is something that was discussed extensively before. If that is the case, I'd appreciate links to those discussions in the comments. One other note: I realize intuition can be interpreted as a dirty word, with good reason. Here, please interpret intuition as a rational heuristic guided by experience. Heuristics are tools; they can be effective when used properly.
Background
This post was prompted by some casual research on expertise and decision making I've been doing for a couple of months now. Along the way I came across the work of Herbert Simon and Gary Klein. Simon's ideas of bounded rationality and satisficing have come up in discussion here before, but I haven't seen any discussion of Klein, who is an active researcher in the area of intuitive decision making. Simon proposes that rationality in decision making is limited by the constraints of time, available information, and brain/computational capabilities (bounded rationality). Rather than make perfectly rational decisions, we make the best decisions we can given those constraints (satisficing). Klein's work is primarily focused on how experts actually make decisions in the real world. Put another way, his work is focused on how experts satisfice.
One of Klein's early research projects was to help the military improve decision making. His proposal was to study firefighters, figuring their experience would be a useful analog to military experience (high pressure, life/death, time...). The major finding was that they did not follow the standard model of good decision making: they did not evaluate multiple options. Instead, they believed they operated by feel. Generally, the firefighters had one option they were considering. After further dialogue, Klein realized that they were forming mental models of the situations based on recognition/experience, which he calls recognition-primed decision making. Using those mental models, they would mentally simulate what might occur if they followed their first instinct and go from there. A similar process was seen in other areas of expertise as well. For example, expert chess players aren't evaluating more options or seeing farther ahead than novices; they just have better recognition/heuristics developed through experience. One potential problem with using chess as an analog for other skills is that the rules are clearly defined, and all of the information you need is on the board (or in your opponent's head...). The real world isn't quite so clean, which might imply that rationality is generally more limited/bounded than in the case of chess.
From what I've been reading, I guess I'm skeptical about efforts to be more rational in everyday life. Certainly it is possible, but it may be unnatural enough to be impractical. Even then, it will still be bounded. If this is the case, it might be more effective to train in some mix of intuitive and rational decision making, while understanding the limits of both. While I haven't read it yet, I believe Daniel Kahneman refers to intuition and deliberation as System 1 and System 2 in "Thinking, Fast and Slow". While Kahneman has seemed skeptical of our ability to improve decision making and reduce bias in the interviews I've read, Klein is a bit more optimistic.
Methods for improving intuition I’ve found so far
(1-5 from Klein/Fadde - Deliberate Performance, 6-7 from Klein - The Power of Intuition)
Intuition comes from experience/expertise. Expertise is developed through deliberate practice. The methods here are intended to accelerate the learning process.
1. Estimation
This is fairly simple: estimate something and compare the estimate to reality. How long do you expect a project to take? Think it through, discuss with others, and record your estimate and reasoning. Later, after you've completed the project, go back and look at your prediction. Where did you go wrong?
One of the keys to effective and efficient learning is timely feedback. If your project will last months, you’ll probably want to do other estimates in the meantime, preferably with tighter feedback loops. An example given in the paper is giving a presentation. How long do you expect discussion to take on each point? What questions will be asked? After the presentation, compare the prediction to reality. For what it’s worth, I believe Peter Drucker also recommends something similar in his article “Managing Oneself”. If I remember correctly, Drucker recommends keeping a decision journal, where you record the decisions you make and return later to see what you got right or wrong (and why).
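A decision journal can be as light as an append-only log. Here is a minimal sketch in Python; the file name, fields, and error scoring are my own choices for illustration, not something prescribed by Klein or Drucker. The point is just that the prediction and the reasoning get written down before the outcome is known:

```python
import csv
import datetime
import os

JOURNAL = "decision_journal.csv"  # hypothetical file name
FIELDS = ["date", "decision", "estimate", "reasoning", "actual", "error_pct"]

def record_estimate(decision, estimate, reasoning):
    """Log a prediction before you know the outcome."""
    new_file = not os.path.exists(JOURNAL)
    with open(JOURNAL, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": datetime.date.today().isoformat(),
            "decision": decision,
            "estimate": estimate,
            "reasoning": reasoning,
            "actual": "",
            "error_pct": "",
        })

def review(decision, actual):
    """Close out a prediction and compute how far off you were."""
    with open(JOURNAL, newline="") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        if row["decision"] == decision and row["actual"] == "":
            estimate = float(row["estimate"])
            row["actual"] = str(actual)
            row["error_pct"] = f"{100 * (actual - estimate) / estimate:.1f}"
    with open(JOURNAL, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

# Example: estimate a project at 10 days, later discover it took 16.
record_estimate("ship feature X", 10, "similar to feature Y, which took 8 days")
review("ship feature X", 16)  # logs a +60% error to learn from
```

A paper notebook works just as well; the software only makes the review step harder to skip.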
2. Extrapolation
In "Sources of Power", Klein discusses an example of extrapolation. Engineers are trying to estimate the cost and time required to build various components of airplanes that haven't yet been built. Since some of the parts don't yet exist, the engineers need a way to produce reasonable estimates. To do this, they try to find an analog. Are there any parts that are similar? How long did those take to produce and how much did they cost? It may take combining several analogs to come up with an estimate. The idea is to use what you have and know to model what you don't have and don't know. Again, compare the extrapolation to the actual results once they are available, so you can improve next time.
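As a toy illustration (my own, not Klein's; the parts, costs, and similarity weights are invented), combining several analogs might look like a similarity-weighted average:

```python
# Estimate the cost of a new part from similar existing parts.
# Each analog carries its known cost and a judgment of how similar it is (0-1).
analogs = [
    {"part": "wing rib (model A)", "cost": 12_000, "similarity": 0.8},
    {"part": "wing rib (model B)", "cost": 15_000, "similarity": 0.5},
    {"part": "tail rib",           "cost": 9_000,  "similarity": 0.3},
]

def estimate_from_analogs(analogs):
    """Similarity-weighted average of the analogs' costs."""
    total_weight = sum(a["similarity"] for a in analogs)
    return sum(a["cost"] * a["similarity"] for a in analogs) / total_weight

print(f"Estimated cost: ${estimate_from_analogs(analogs):,.0f}")  # $12,375
# Later, compare this number to the actual cost and adjust your
# similarity judgments; that comparison is where the learning happens.
```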
3. Experimentation
At its most basic, experimentation can be just trying something and seeing what happens. A better way is to form a hypothesis (estimation/prediction/extrapolation) prior to trying. This promotes better learning, because you have a concrete expectation that is either verified or not. If not, you've learned something new. Perhaps you'll even be surprised, and surprises can lead you in new directions.
A more personal example: For the longest time, I had trouble with directions where I currently live. I’d look at a map, figure out how to get somewhere, and try not to vary things too much so I wouldn’t get lost. I rapidly improved by using a GPS, and ignoring it until I needed it. I’d have an idea of how to get there (from looking at maps), but I wouldn’t have it down 100% or I’d try to improvise. The GPS allowed me to experiment without the risk of getting lost, and it provided instant feedback.
4. Explanation
This section is rather short, so I am not entirely sure if I understand it correctly. I think the intent is to try to make sense of what you’ve seen and learned from the other three Es.
One way to do this might be to use what Scott Young calls the Feynman Technique (Learn Faster with the Feynman Technique):
Step 1. Choose the concept you want to understand.
Take a blank piece of paper and write that concept at the top of the page.
Step 2. Pretend you’re teaching the idea to someone else.
Write out an explanation of the topic, as if you were trying to teach it to a new student. When you explain the idea this way you get a better idea of what you understand and where you might have some gaps.
Step 3. If you get stuck, go back to the book.
Whenever you get stuck, go back to the source material and re-learn that part of the material until you get it enough that you can explain it on paper.
Step 4. Simplify your language.
The goal is to use your words, not the words of the source material. If your explanation is wordy or confusing, that’s an indication that you might not understand the idea as well as you thought – try to simplify the language or create an analogy to better understand it.
5. Feedback/Coaching (Emulation?)
Feedback is critical. While you might not have a coach at work, you can still find someone to emulate. Assuming they are available for questions, you can try to predict what they would do in a situation. When something does not go as you'd expect, explain your thinking and ask for feedback. In my experience, other people are busy with their own work, so coaching/mentorship takes a backseat to more urgent matters. Plus, some people just don't want to be bothered. In that case, I think the best thing to do is to get good at asking effective questions and to be well prepared.
6. Premortem
Imagine that the effort you are planning has already failed completely, and that you are now conducting a post mortem. What went wrong? The reasoning behind this technique is that people don't want something to fail (or don't want to look like they want it to fail); assuming it has already failed reduces that bias. Apparently the idea for the technique came from Mitchell, Russo, and Pennington, "Back to the Future: Temporal Perspective in the Explanation of Events". That paper doesn't entirely support the technique: it found that the method produces more reasons, and more episodic ones, but not necessarily better ones (the authors were unable to judge the value of the reasons given). Klein uses this technique regularly in meetings; the general impression is that it reduces confidence in the plan, as intended. From there, the group tries to prepare for the potential problems.
In a sense, this is similar to red teams but easier to implement and less resource intensive.
7. Identify Decisions/Decision Making Exercises/Decision Making Critiques
In "The Power of Intuition", Klein advocates identifying decisions where problems have occurred. When reviewing each decision, note what makes it difficult, what kinds of errors are often made, how an expert might approach it differently than a novice would, and how the decision can be practiced in a way that provides feedback.
Those decisions are then turned into scenarios which can be repeatedly practiced (typically in groups). Start by describing the events that led to the decision. The players are then told what they are trying to achieve, the context, and the constraints. Try to include a visual representation whenever possible.
After the exercise, critique the decision and the process used to make it. Start with a timeline and identify key judgments. For each of the key judgments, note why it was difficult, how you were interpreting the situation, what cues/patterns you should have been picking up, why you chose to do what you did, and what you would’ve done differently with the benefit of hindsight.
Concluding thoughts
Is there interest in this topic on LW? I'm not denying that relying on intuition alone can be dangerous, but I am very skeptical that focusing on reducing bias alone will lead to better decisions. In some cases, it may be better to admit that biases are affecting a decision. One other thing to note is that bias and mistakes are inevitable. A lot of the LW rationality posts I've seen focus on reducing or eliminating these mistakes. That is certainly a valid goal (at least the reduction is), but it isn't enough. Choice/information overload can affect decisions, as can blood sugar levels (Sweet Future: Fluctuating Blood Glucose Levels May Affect Decision Making) and having to go to the bathroom (Tuk, M., et al. (2011). Inhibitory Spillover: Increased Urination Urgency Facilitates Impulse Control in Unrelated Domains. Psychological Science.).
Mistakes will happen, so you’ll have to do your best to learn from them and reduce their cost/make recovery easier. At the same time, good heuristics give you a better starting point to apply the rationality techniques. They are complementary. Worry about reducing bias after you have come up with something using expertise/intuition.
Again, this is a draft, written mostly from memory of what I’ve been reading. The primary sources were “Sources of Power” and “The Power of Intuition” (both by Klein), “The Cambridge Handbook of Expertise”, and scattered readings on/by Simon. Also of interest are this interview with Daniel Kahneman and Gary Klein (free registration required unfortunately) and this paper on deliberate performance by Peter Fadde and Klein.
18 comments
comment by Oscar_Cunningham · 2012-08-26T18:59:52.035Z · LW(p) · GW(p)
Nice post! You didn't explicitly ask for criticism, but I'm going to give some anyway:
I think the standard font-size on LessWrong is smaller. Most people would prefer it if you used that.
I think there's definitely interest on LessWrong for improving intuition, but I would frame it as "Training intuition to make its judgements more rational" rather than (as your post leans towards) "Forget rationality and harness our natural biases!". This is mostly just a terminological difference.
The System 1/System 2 distinction is really between System 1 being (fast, intuitive, subconscious) and System 2 being (slow, deliberative, conscious). Around these parts, the word "rationality" tends to be used to mean something like "succeeding by using any and all means". Under this definition, rationality can use both System 2 and System 1 type thinking. Thus I believe your post could be improved by taking the sentences where intuition is being contrasted with "rationality" and replacing the word "rationality" with something like "deliberate thought" or "System 2".
As I say above, this is really just a terminological difference, but I think that making it will clarify some of the ideas in the post. In particular, I think that the main content of the post (the seven ways of improving our heuristic judgements), is really useful instrumental rationality, but that the introduction and conclusion hide it in poorly backed up statements about how reducing bias is less important than using good heuristics. I find it strange that these things are even presented in contrast; the techniques you give for improving intuition are techniques for reducing bias. The intuitive judgements become more accurate (i.e. less biased) than they were before.
Some bits of the two "Concluding thoughts" paragraphs seem especially washy. A general sentiment of "System 1 should work in harmony with System 2" sounds nice, but without any data to back it up it could just be complete bollocks. Maybe we should all be using System 1 all the time. Or maybe there are some activities where System 1 wins and some where System 2 wins. If so, which activities are which? Are firefighters actually successful decision makers?
One final thought: Do the seven methods really focus on System 1 only? Many of them seem like general purpose techniques, and in particular I think that 4, 5, 6, and 7 are actually more System 2.
↑ comment by wattsd · 2012-08-28T04:12:17.377Z · LW(p) · GW(p)
After doing a bit more reading here and thinking about your comments, I think I'll focus on the 7 methods and eliminate much of the low-quality fluff that makes up the intro/conclusion for the next version.
I think some of my confusion was due to unsubstantiated assumptions about the standard views of LessWrong. What I've been thinking of as bias is closer to inductive bias than the standard definition, which refers to error patterns. I then interpreted rationality as "overcoming bias". Inductive bias can be useful, and the idea of overcoming bias of that type seemed to be taking things too far. That doesn't seem to be what anyone is actually advocating here, though.
Again, thanks.
↑ comment by wattsd · 2012-08-26T20:05:50.220Z · LW(p) · GW(p)
Thanks for the comments, criticism is welcomed.
I think the standard font-size on LessWrong is smaller. Most people would prefer it if you used that.
Apologies for the font size, I was editing in Google Docs rather than the post editor...
As I say above, this is really just a terminological difference, but I think that making it will clarify some of the ideas in the post. In particular, I think that the main content of the post (the seven ways of improving our heuristic judgements), is really useful instrumental rationality, but that the introduction and conclusion hide it in poorly backed up statements about how reducing bias is less important than using good heuristics. I find it strange that these things are even presented in contrast; the techniques you give for improving intuition are techniques for reducing bias. The intuitive judgements become more accurate (i.e. less biased) than they were before.
I admit, terminology is an issue. I perhaps bit off a bit more than I can chew for a first post. I'll try to fix that.
One final thought: Do the seven methods really focus on System 1 only? Many of them seem like general purpose techniques, and in particular I think that 4, 5, 6, and 7 are actually more System 2.
From the way Klein describes them, they are meant to accelerate expertise. If my interpretation is correct, they use System 2 to develop System 1 for the next scenario. I think part of the problem with how I'm describing this is that experience, which is instrumental in developing expertise, develops intuition. Intuition can either help or hurt. Sometimes we won't know which until after a decision has been made; other times we might be able to prevent mistakes by running through a checklist of cognitive biases. In the former case, the methods should help next time. In the latter case, you need something (from System 1, for example) to run through the checklist. The checklist on its own isn't very useful.
Again, thanks for the feedback.
↑ comment by Oscar_Cunningham · 2012-08-26T22:16:16.241Z · LW(p) · GW(p)
If my interpretation is correct, they use System 2 to develop System 1 for the next scenario.
Good point.
↑ comment by Bruno_Coelho · 2012-08-27T00:31:13.345Z · LW(p) · GW(p)
Up. Small steps in goal achievement are good if you already have a minimal list to fulfil. Think about biases when cookies are nearby.
comment by lukeprog · 2012-08-26T22:49:01.042Z · LW(p) · GW(p)
To make the post more exciting, you may wish to review the list of tips in Rhetoric for the Good.
comment by NancyLebovitz · 2012-08-26T22:02:33.174Z · LW(p) · GW(p)
Any thoughts about protecting intuition? Some types of useful intuition come from experience, but there are people (the outrage industry, advertising) trying to hijack other people's intuition by supplying large quantities of emotionally intense simulated experience.
↑ comment by wattsd · 2012-08-26T22:59:50.932Z · LW(p) · GW(p)
Something like this was discussed by Kelly McGonigal in "The Willpower Instinct". A couple of things that might help:
Avoidance - Complete avoidance is probably impossible, but you might try limiting your exposure to such things, particularly when you are vulnerable to making poor decisions. The old advice "don't go to the store when you're hungry" might be related to low glucose levels (which affect decision making).
Controlled exposure w/ reflection - I remember wanting toys when I was younger based on what was shown in commercials. After a couple of disappointments, I got a little better at resisting the ads. That said, I could probably use some recalibration...
All in all, mindfulness and an information diet. I've seen this particular field (ads, store layouts, etc.) referred to as choice architecture; perhaps you could do some choice architecture of your own to guard against the times when your defenses are down. Essentially, develop good routines, make good choices ahead of time, and stick to them.
comment by [deleted] · 2012-08-26T21:04:29.967Z · LW(p) · GW(p)
If we are naturally biased and limited in how rational our decision making can be, it might be more effective to improve intuition and manage biases rather than trying to completely eliminate them and focus on unbiased thinking.
Improving intuition is beyond me. Buckminster Fuller claimed it was key, and you might get some information out of his works. The limits of rationality, as a skill and as a trend, are at least good to know about as something that exists.
But when it comes to identifying error (bias) as a tool for lessening error I don't know that you can do better than Karl Popper. I started with 'Conjectures and Refutations' but also like 'In Search of a Better World' and, well, most of his books.
comment by buybuydandavis · 2012-08-26T20:03:31.379Z · LW(p) · GW(p)
I'd focus more on getting effective than on reducing or managing biases. The latter often serves the former, but I think the answers come fairly directly if you start there.
I see most of the methods you list as general process improvement methods whether or not they actually improve intuition.
An alternative to improving your intuition and removing your biases would be to find other and better processes and tools to rely on. And then actually use them.
That last part is probably the main failing. We all have a boatload of good ideas that would make us a zillion times more effective if we actually used them.
How often do you plan? How often do you monitor your plans? Measure the results? Provide summary statistics? Do you avail yourself of any tools to do this?
For probably a couple of decades now, I've wanted some planning software where I input goals, utilities, activities, and results, and the software plans my day, makes suggestions, tracks progress on those goals, and charts overall utility.
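A minimal sketch of that kind of tracker could be this small (the goals, utility weights, and scoring rule here are all invented for illustration; a real version would persist its data and do actual planning):

```python
# A bare-bones version of the planner described above: goals with
# utility weights, activities logged against them, and a daily score.
goals = {
    "health":  {"weight": 0.4, "minutes": 0},
    "writing": {"weight": 0.4, "minutes": 0},
    "admin":   {"weight": 0.2, "minutes": 0},
}

def log_activity(goal, minutes):
    """Record time spent toward a goal."""
    goals[goal]["minutes"] += minutes

def daily_utility():
    """Weighted sum of time spent; crude, but it makes the day visible."""
    return sum(g["weight"] * g["minutes"] for g in goals.values())

log_activity("health", 30)
log_activity("writing", 90)
print(f"Today's utility score: {daily_utility():.0f}")  # 0.4*30 + 0.4*90 = 48
```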
I bet that pencil and paper process monitoring would be a huge advance. Yet I don't do it. I don't need a lot of fancy research about process monitoring to improve; I need to do it.
I would think this is true for most everyone here. We indulge our enjoyment of thinking, and feel justified in doing so, when it really isn't much different than watching porn. Mental masturbation. Wank wank wank, instead of getting things done. It's actually a bit worse than porn: when we mentally wank, we feel justified, we feel we're accomplishing something respectable, and for the most part, society agrees.
↑ comment by wattsd · 2012-08-26T20:13:04.911Z · LW(p) · GW(p)
An alternative to improving your intuition and removing your biases would be to find other and better processes and tools to rely on. And then actually use them.
I think that is part of what I was attempting to get at, though I probably didn't do a very good job. In a sense we are biased to use certain processes or tools. The only way to change those "default settings" is to deliberately practice something better, so that when the time comes, you'll be ready.
↑ comment by buybuydandavis · 2012-08-26T21:59:27.623Z · LW(p) · GW(p)
The time has always come. The time is now.
Any particular skill is small potatoes compared to establishing a general continuing practice of planning, monitoring, and executing.
↑ comment by billswift · 2012-08-27T00:31:28.822Z · LW(p) · GW(p)
In some places the "deliberate practice" idea breaks down, and choosing and decision making is one of them. There is no way to "practice" them except by actually making choices and decisions; separating practice from normal execution is not possible.
↑ comment by wattsd · 2012-08-27T01:21:59.325Z · LW(p) · GW(p)
I agree that the only way to practice decisions is to make them, but I think there is more to it than that. The deliberate part of deliberate practice is that you are actively trying to get better. The deliberate performance paper I linked to touches on this a bit, in that deliberate practice is challenging for professionals and that something else might work better (they advocate the first 5 methods in that paper).
Beyond making decisions, you need to have an expectation of what will happen, otherwise hindsight bias is that much harder to overcome. It's the scientific method: hypothesis->test->new hypothesis. Without defining what you expect ahead of time, it is much easier to just say "Oh yeah, this makes sense" and normalize without actually improving understanding.
↑ comment by billswift · 2012-08-27T01:49:31.705Z · LW(p) · GW(p)
I don't disagree with anything in this comment; I was just pointing out that "deliberate practice" has several requirements, including practice being separate from execution, that make it less usable, or even totally unusable, for some areas, such as decision making and choosing. The other main requirements are that it has a specific goal, that it should not be enjoyable, and, as you pointed out, that it is challenging. Another thing, not part of the original requirements but encompassed by them, is that you are not practicing when you are in "flow".
↑ comment by buybuydandavis · 2012-08-26T20:04:43.590Z · LW(p) · GW(p)
For probably a couple of decades now, I've wanted some planning software where I input goals, utilities, activities, and results, and the software plans my day, makes suggestions, tracks progress on those goals, and charts overall utility.
What's out there like this?
↑ comment by wedrifid · 2012-08-27T02:59:24.040Z · LW(p) · GW(p)
For probably a couple of decades now, I've wanted some planning software where I input goals, utilities, activities, and results, and the software plans my day, makes suggestions, tracks progress on those goals, and charts overall utility.
What's out there like this?
Personal Assistants.
↑ comment by wattsd · 2012-08-26T20:23:49.338Z · LW(p) · GW(p)
tracks progress on those goals, and charts overall utility.
I don't think it works very well for what you are envisioning, but something like spaced repetition software might help.
With SRS, the idea is that the software tries to figure out when you are going to forget something and prompts you at that time, when the reminder will be most effective.
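For the curious, the scheduling rule at the heart of most SRS tools is surprisingly small. Below is a sketch of an SM-2-style update (the published algorithm behind early SuperMemo; Anki uses a variant). This is a from-memory paraphrase for illustration, not any particular tool's exact code:

```python
def sm2_update(quality, repetitions, interval, easiness):
    """One review step of an SM-2-style scheduler.

    quality: self-rated recall from 0 (blackout) to 5 (perfect).
    Returns the updated (repetitions, interval_days, easiness).
    """
    if quality < 3:
        # Failed recall: restart the repetition sequence, easiness unchanged.
        return 0, 1, easiness
    if repetitions == 0:
        interval = 1
    elif repetitions == 1:
        interval = 6
    else:
        interval = round(interval * easiness)
    # Easy items stretch their intervals out; hesitant recalls shrink
    # the multiplier, so hard items come back sooner.
    easiness = max(1.3, easiness + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return repetitions + 1, interval, easiness

# A card rated 4 ("correct, after some hesitation") on its third review:
reps, interval, ef = sm2_update(quality=4, repetitions=2, interval=6, easiness=2.5)
print(reps, interval, ef)  # 3 15 2.5, i.e. next review in about 15 days
```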