How You Make Judgments: The Elephant and its Rider

post by lukeprog · 2011-04-15

Contents

  Attribute substitution
  Supervision of intuitive judgments
  Conclusion
  Notes
  References

Part of the sequence: Rationality and Philosophy

Whether you're doing science or philosophy, flirting or playing music, the first and most important tool you are using is your mind. To use this tool well, it helps to know how it works. Today we explore how your mind makes judgments.

From Plato to Freud, many have remarked that humans seem to have more than one mind.1 Today, detailed 'dual-process' models are being tested by psychologists and neuroscientists:

Since the 1970s dual-process theories have been developed [to explain] various aspects of human psychology... Typically, one of the processes is characterized as fast, effortless, automatic, nonconscious, inflexible, heavily contextualized, and undemanding of working memory, and the other as slow, effortful, controlled, conscious, flexible, decontextualized, and demanding of working memory.2

Dual-process theories for reasoning,3 learning and memory,4 decision-making,5 belief,6 and social cognition7 are now widely accepted to be correct to some degree,8 with researchers currently working out the details.9 Dual-process theories even seem to be appropriate for some nonhuman primates.10

Naturally, some have wondered if there might be a "grand unifying dual-process theory that can incorporate them all."11 We might call such theories dual-system theories of mind,12 and several have been proposed.13 Such unified theories face problems, though. 'Type 1' (fast, nonconscious) processes probably involve many nonconscious architectures,14 and brain imaging studies show a wide variety of brain systems at work at different times when subjects engage in 'type 2' (slow, conscious) processes.15

Still, perhaps there is a sense in which one 'mind' relies mostly on type 1 processes, and a second 'mind' relies mostly on type 2 processes. One suggestion is that Mind 1 is evolutionarily old and thus shared with other animals, while Mind 2 is recently evolved and particularly developed in humans. (But not fully unique to humans, because some animals do seem to exhibit a distinction between stimulus-controlled and higher-order controlled behavior.16) But this theory faces problems. A standard motivation for dual-process theories of reasoning is the conflict between cognitive biases (from type 1 processes) and logical reasoning (type 2 processes).17 For example, logic and belief bias often conflict.18 But both logic and belief bias can be located in the pre-frontal cortex, an evolutionarily new system.19 So either Mind 1 is not entirely old, or Mind 2 is not entirely composed of type 2 processes.

We won't try to untangle these mysteries here. Instead, we'll focus on one of the most successful dual-process theories: Kahneman and Frederick's dual-process theory of judgment.20


Attribute substitution

Kahneman and Frederick propose an "attribute-substitution model of heuristic judgment" which claims that judgments result from both type 1 and type 2 processes.21 The authors explain:

The early research on judgment heuristics was guided by a simple and general hypothesis: When confronted with a difficult question, people may answer an easier one instead and are often unaware of the substitution. A person who is asked "What proportion of long-distance relationships break up within a year?" may answer as if she had been asked "Do instances of failed long-distance relationships come readily to mind?" This would be an application of the availability heuristic. A professor who has heard a candidate’s job talk and now considers the question "How likely is it that this candidate could be tenured in our department?" may answer the much easier question: "How impressive was the talk?" This would be an example of one form of the representativeness heuristic.22

Next: what is attribute substitution?

...whenever the aspect of the judgmental object that one intends to judge (the target attribute) is less readily assessed than a related property that yields a plausible answer (the heuristic attribute), individuals may unwittingly substitute the simpler assessment.22

For example, one study23 asked subjects two questions among many others: "How happy are you with your life in general?" and "How many dates did you have last month?" In this order, the correlation between the two questions was negligible. If the dating question was asked first, however, the correlation was .66. The question about dating frequency seems to evoke "an evaluation of one's romantic satisfaction" that "lingers to become the heuristic attribute when the global happiness question is subsequently encountered."22

Or, consider a question in another study: "If a sphere were dropped into an open cube, such that it just fit (the diameter of the sphere is the same as the interior width of the cube), what proportion of the volume of the cube would the sphere occupy?"24 The target attribute (the volumetric relation between cube and sphere) is difficult to assess intuitively, and it appears that subjects sought out an easier-to-assess heuristic attribute instead, substituting the question "If a circle were drawn inside a square, what proportion of the area of the square does the circle occupy?" The mean estimate for the 'sphere inside cube' problem was 74%, almost identical to the mean estimate of the 'circle inside square' problem (77%) but far larger than the correct answer for the 'sphere inside cube' problem (52%).
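The size of the error is easy to check. For a cube of side d, the inscribed sphere fills π/6 of the volume, while for a square of side d the inscribed circle fills π/4 of the area. Here is a minimal sketch in Python verifying those ratios (the variable names are mine, not from the study):

```python
import math

d = 1.0  # side length of the cube / square; the ratios don't depend on it

# Inscribed sphere: radius d/2, volume (4/3) * pi * (d/2)**3 = (pi/6) * d**3
sphere_in_cube = (4 / 3) * math.pi * (d / 2) ** 3 / d ** 3

# Inscribed circle: radius d/2, area pi * (d/2)**2 = (pi/4) * d**2
circle_in_square = math.pi * (d / 2) ** 2 / d ** 2

print(f"sphere in cube:   {sphere_in_cube:.1%}")    # ~52.4%
print(f"circle in square: {circle_in_square:.1%}")  # ~78.5%
```

Notice that the subjects' mean estimate for the circle-in-square problem (77%) tracks the true two-dimensional answer (~79%) reasonably well; the bias appears when that easy two-dimensional assessment gets substituted for the much smaller three-dimensional one (~52%).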

Attribute substitutions like this save on processing power but introduce systematic biases into our judgment.25

Some attributes are always candidates for the heuristic role in attribute substitution because they play roles in daily perception and cognition and are thus always accessible: cognitive fluency, causal propensity, surprisingness, mood, and affective valence.26 Less prevalent attributes can become accessible for substitution if recently evoked or primed.27 
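To make the mechanism concrete, here is a toy sketch of attribute substitution in code. This is an illustrative caricature, not Kahneman and Frederick's formal model; all names, numbers, and the effort threshold below are invented:

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    value: float        # the answer this assessment would return
    difficulty: float   # how much effort it takes to carry out

def judge(target: Assessment, heuristic: Assessment, effort: float) -> float:
    """If the target attribute is too hard to assess with the effort available,
    unwittingly substitute the easier, related heuristic attribute."""
    if target.difficulty <= effort:
        return target.value      # deliberate assessment of the target attribute
    return heuristic.value       # fast substitution of the heuristic attribute

# "What proportion of the cube's volume does the sphere fill?" is hard to assess;
# "What proportion of the square's area does the circle fill?" is easy.
target = Assessment(value=0.52, difficulty=5.0)
heuristic = Assessment(value=0.79, difficulty=1.0)

print(judge(target, heuristic, effort=2.0))  # 0.79 -- the substituted answer
```

Priming, in this picture, amounts to temporarily making a particular heuristic attribute easier to access, so that it becomes the one most likely to be substituted.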

 

Supervision of intuitive judgments

Intuitive judgments, say Kahneman and Frederick, arise from processes like attribute substitution, of which we are unaware. They "bubble up" from the unconscious, after which many of them are evaluated and either endorsed or rejected by type 2 processes.

You can feel the tension28 between type 1 and type 2 processes in your own judgment when you try the Stroop task. Try to name the ink color of each word in a list of colored words, and you will find that you pause a bit when a word names a different color than the one it is printed in - for example, the word 'green' printed in red ink. Your unconscious, intuitive judgment uses an availability heuristic to suggest that the word is shown in green, but your conscious type 2 processes quickly correct the unconscious judgment and conclude that it is written in red. You have no such momentary difficulty when the word 'blue' is printed in blue ink.

In many cases, type 2 processes have no trouble correcting the judgments of type 1 processes.29 But because type 2 processes are slow, they can be interrupted by time pressure.30 On the other hand, biased attribute substitution can sometimes be prevented if subjects are alerted to the possible evaluation contamination in advance.31 (This finding justifies a great deal of material on Less Wrong, which alerts you to many cognitive biases - that is, possible sources of evaluation contamination.)

Often, type 2 processes fail to correct intuitive judgments, as demonstrated time and again in the heuristics and biases literature.32 And even when type 2 processes do correct an intuitive judgment, the feeling that the intuitive judgment is correct may remain. Consider the famous Linda problem, in which subjects read a description of Linda as an outspoken, socially conscious philosophy major and then judge 'feminist bank teller' to be more probable than 'bank teller'. Knowledge of probability theory does not extinguish the feeling (from type 1 processes) that Linda must be a feminist bank teller. As Stephen Jay Gould put it:

I know [the right answer], yet a little homunculus in my head continues to jump up and down, shouting at me, "But she can’t just be a bank teller; read the description!"33
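The homunculus is fighting a simple fact of probability: a conjunction can never be more probable than either of its conjuncts. A quick illustration (the numbers are made up; only the inequality matters):

```python
# Conjunction rule: P(A and B) <= P(A), no matter how well the description fits.
p_bank_teller = 0.05             # invented P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # invented P(feminist | bank teller)

p_feminist_bank_teller = p_bank_teller * p_feminist_given_teller

assert p_feminist_bank_teller <= p_bank_teller
print(p_feminist_bank_teller)  # ~0.015, necessarily no larger than 0.05
```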

 

Conclusion

Kahneman and Frederick's dual-process theory appears to be successful in explaining a wide range of otherwise puzzling phenomena in human judgment.34 The big picture of all this is described well by Jonathan Haidt, who imagines his conscious mind as a rider upon an elephant:

I'm holding the reins in my hands, and by pulling one way or the other I can tell the elephant to turn, to stop, or to go. I can direct things, but only when the elephant doesn't have desires of his own. When the elephant really wants to do something, I'm no match for him.

...The controlled system [can be] seen as an advisor. It's a rider placed on the elephant's back to help the elephant make better choices. The rider can see farther into the future, and the rider can learn valuable information by talking to other riders or by reading maps, but the rider cannot order the elephant around against its will...

...The elephant, in contrast, is everything else. The elephant includes gut feelings, visceral reactions, emotions, and intuitions that comprise much of the automatic system. The elephant and the rider each have their own intelligence, and when they work together well they enable the unique brilliance of human beings. But they don't always work together well.35

 

Next post: Your Evolved Intuitions

Previous post: Philosophy: A Diseased Discipline

 

 

Notes

1 Plato divided the soul into three parts: reason, spirit, and appetite (Annas 1981, ch. 5). Descartes held that humans operate on unconscious mechanical processes we share with animals, but that humans' additional capacities for rational thought separate us from the animals (Cottingham 1992). Leibniz said that animals are guided by inductive reasoning, which also guides 'three-fourths' of human reasoning, but that humans can also partake in 'true reasoning': logic and mathematics (Leibniz 1714/1989, p. 208; Leibniz 1702/1989, pp. 188-191). Many thinkers, most famously Freud, have drawn a division between conscious and unconscious thinking (Whyte 1978). For a more detailed survey, see Frankish & Evans (2009). Multiple-process theories of mind stand in contrast to monistic theories of mind, for example: Johnson-Laird (1983); Braine (1990); Rips (1994). Note that dual-process theories of mind need not conflict with massively modular views of human cognition like Barrett & Kurzban (2006) or Tooby & Cosmides (1992): see Mercier & Sperber (2009). Finally, note that dual-process theories sit comfortably with current research on situated cognition: Smith & Semin (2004).

2 Frankish & Evans (2009).

3 Evans (1989, 2006, 2007); Evans & Over (1996); Sloman (1996, 2002); Stanovich (1999, 2004, 2009); Smolensky (1988); Carruthers (2002, 2006, 2009); Lieberman (2003; 2009); Gilbert (1999).

4 Sun et al. (2009); Eichenbaum & Cohen (2001); Carruthers (2006); Sherry & Schacter (1987); Berry & Dienes (1993); Reber (1993); Sun (2001).

5 Kahneman & Frederick (2002, 2005).

6 Dennett (1978, ch. 16; 1991); Cohen (1992); Frankish (2004); Verschueren et al. (2005).

7 Smith & Collins (2009); Bargh (2006); Strack & Deutsch (2004). 

8 Evans (2008); Evans & Frankish (2009). Or as Carruthers (2009) puts it, "Dual-system theories of human reasoning are now quite widely accepted, at least in outline."

9 One such detail is: When and to what extent does System 2 intervene in System 1 processes? See: Evans (2006); Stanovich (1999); De Neys (2006); Evans & Curtis-Holmes (2005); Finucane et al. (2000); Newstead et al. (1992); Evans et al. (1994); Daniel & Klaczynski (2006); Vadeboncoeur & Markovits (1999); Thompson (2009). Other important open questions are explored in Fazio & Olson (2003); Nosek (2007); Saunders (2009). For an accessible overview of the field, see Evans (2010).

10 Call & Tomasello (2005).

11 Evans (2009).

12 Dual-process and dual-system theories of the mind suggest multiple cognitive architectures, and should not be confused with theories of multiple modes of processing, or two kinds of cognitive style. One example of the latter is the supposed distinction between Eastern and Western thinking (Nisbett et al. 2001). Dual-process and dual-system theories of the mind should also be distinguished from theories that posit a continuum between one form of thinking and another (e.g. Hammond 1996; Newstead 2000; Osman 2004), since this suggests there are not separate cognitive architectures at work.

13 Evans (2003); Stanovich (1999, 2009); Evans & Over (1996); Smith & DeCoster (2000); Wilson (2002).

14 Evans (2008, 2009); Stanovich (2004); Wilson (2002).

15 Goel (2007).

16 Toates (2004, 2006).

17 Evans (1989); Evans & Over (1996); Kahneman & Frederick (2002); Klaczynski & Cottrell (2004); Sloman (1996); Stanovich (2004).

18 Evans et al. (1983); Klauer et al. (2000).

19 Evans (2009); Goel & Dolan (2003).

20 Those who prefer video lectures to reading may enjoy a 2008 lecture on judgment and intuition, to which Kahneman contributed: 1, 2, 3, 4.

21 They use the terms 'system 1' and 'system 2' instead of 'type 1' and 'type 2'. Their theory is outlined in Kahneman & Frederick (2002, 2005).

22 Kahneman & Frederick (2005).

23 Strack et al. (1988).

24 Frederick & Nelson (2007).

25 Cognitive biases particularly involved in attribute substitution include the availability heuristic (Lichtenstein et al. 1978; Schwarz et al. 1991; Schwarz & Vaughn 2002), the representativeness heuristic (Kahneman & Tversky 1973; Tversky & Kahneman 1982; Bar-Hillel & Neter 1993; Agnoli 1991), and the affect heuristic (Slovic et al. 2002; Finucane et al. 2000).

26 Cognitive fluency: Jacoby & Dallas (1981); Schwarz & Vaughn (2002); Tversky & Kahneman (1973). Causal propensity: Michotte (1963); Kahneman & Varey (1990). Surprisingness: Kahneman & Miller (1986). Mood: Schwarz & Clore (1983). Affective valence: Bargh (1997); Cacioppo et al. (1993); Kahneman et al. (1999); Slovic et al. (2002); Zajonc (1980, 1997).

27 Bargh et al. (1986); Higgins & Brendl (1995). Note also that attributes must be mapped across dimensions on a common scale, and we understand to some degree the mechanism that does this: Kahneman & Frederick (2005); Ganzach and Krantz (1990); Stevens (1975).

28 Also see De Neys et al. (2010).

29 Gilbert (1989).

30 Finucane et al. (2000).

31 Schwarz & Clore (1983); Schwarz (1996).

32 Gilovich et al. (2002); Kahneman et al. (1982); Pohl (2005); Gilovich (1993); Hastie & Dawes (2009).

33 Gould (1991), p. 469.

34 See the overview in Kahneman & Frederick (2005).

35 Haidt (2006), pp. 4, 17.

 

References

Agnoli (1991). Development of judgmental heuristics and logical reasoning: Training counteracts the representativeness heuristic. Cognitive Development, 6: 195–217.

Annas (1981). An introduction to Plato's republic. Oxford University Press.

Bar-Hillel & Neter (1993). How alike is it versus how likely is it: A disjunction fallacy in probability judgments. Journal of Personality and Social Psychology, 41: 671–680.

Bargh (1997). The automaticity of everyday life. Advances in social cognition, 10. Erlbaum.

Bargh, Bond, Lombardi, & Tota (1986). The additive nature of chronic and temporary sources of construct accessibility. Journal of Personality and Social Psychology, 50(5): 869–878.

Bargh (2006). Social psychology and the unconscious. Psychology Press.

Barrett & Kurzban (2006). Modularity in cognition: Framing the debate. Psychological Review, 113: 628-647.

Berry & Dienes (1993). Implicit learning. Erlbaum.

Braine (1990). The 'natural logic' approach to reasoning. In Overton (ed.), Reasoning, necessity and logic: Developmental perspectives. Psychology Press.

Cacioppo, Priester, & Berntson (1993). Rudimentary determinants of attitudes: II. Arm flexion and extension have differential effects on attitudes. Journal of Personality and Social Psychology, 65: 5–17.

Call & Tomasello (2005). Reasoning and thinking in nonhuman primates. In Holyoak & Morrison (eds.), The Cambridge Handbook of Thinking and Reasoning (pp. 607-632). Cambridge University Press.

Carruthers (2002). The cognitive functions of language. Behavioral and Brain Sciences, 25: 657-719.

Carruthers (2006). The architecture of the mind. Oxford University Press.

Carruthers (2009). An architecture for dual reasoning. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 109-128). Oxford University Press.

Cohen (1992). An essay on belief and acceptance. Oxford University Press.

Cottingham (1992). Cartesian dualism: Theology, metaphysics, and science. In Cottingham (ed.), The Cambridge companion to Descartes (pp. 236-257). Cambridge University Press.

Daniel & Klaczynski (2006). Developmental and individual differences in conditional reasoning: Effects of logic instructions and alternative antecedents. Child Development, 77: 339-354.

De Neys, Moyens, & Vansteenwegen (2010). Feeling we're biased: Autonomic arousal and reasoning conflict. Cognitive, affective, and behavioral neuroscience, 10(2): 208-216.

Dennett (1978). Brainstorms: Philosophical essays on mind and psychology. MIT Press.

Dennett (1991). Two contrasts: Folk craft versus folk science and belief versus opinion. In Greenwood (ed.), The future of folk psychology: Intentionality and cognitive science (pp. 135-148). Cambridge University Press.

De Neys (2006). Dual processing in reasoning: Two systems but one reasoner. Psychological Science, 17: 428-433.

Eichenbaum & Cohen (2001). From conditioning to conscious reflection: Memory systems of the brain. Oxford University Press.

Evans (1989). Bias in human reasoning: Causes and consequences. Erlbaum.

Evans (2003). In two minds: Dual-process accounts of reasoning. Trends in Cognitive Sciences, 7: 454-459.

Evans (2006). The heuristic-analytic theory of reasoning: Extension and evaluation. Psychonomic Bulletin and Review, 13: 378-395.

Evans (2007). Hypothetical Thinking: Dual processes in reasoning and judgment. Psychology Press.

Evans (2008). Dual-processing accounts of reasoning, judgment and social cognition. Annual Review of Psychology, 59: 255-278.

Evans (2009). How many dual-process theories do we need? One, two, or many? In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 33-54). Oxford University Press.

Evans (2010). Thinking Twice: Two minds in one brain. Oxford University Press.

Evans & Over (1996). Rationality and Reasoning. Psychology Press.

Evans & Frankish, eds. (2009). In Two Minds: Dual Processes and Beyond. Oxford University Press.

Evans & Curtis-Holmes (2005). Rapid responding increases belief bias: Evidence for the dual-process theory of reasoning. Thinking & Reasoning, 11: 382-389.

Evans, Barston, & Pollard (1983). On the conflict between logic and belief in syllogistic reasoning. Memory & Cognition, 11: 295-306.

Evans, Newstead, Allen, & Pollard (1994). Debiasing by instruction: The case of belief bias. European Journal of Cognitive Psychology, 6: 263-285.

Fazio & Olson (2003). Implicit measures in social cognition research: Their meaning and uses. Annual Review of Psychology, 54: 297-327.

Finucane, Alhakami, Slovic, & Johnson (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13: 1-17.

Frankish (2004). Mind and Supermind. Cambridge University Press.

Frankish & Evans (2009). The duality of mind: An historical perspective. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 1-29). Oxford University Press.

Frederick & Nelson (2007). Attribute substitution in the estimation of volumetric relationships: Psychophysical phenomena underscore judgmental heuristics. Manuscript in preparation. Massachusetts Institute of Technology.

Ganzach & Krantz (1990). The psychology of moderate prediction: I. Experience with multiple determination. Organizational Behavior and Human Decision Processes, 47: 177–204.

Gilbert (1989). Thinking lightly about others: Automatic components of the social inference process. In Uleman & Bargh (eds.), Unintended thought (pp. 189–211). Guilford Press.

Gilbert (1999). What the mind's not. In Chaiken & Trope (eds.), Dual-process theories in social psychology (pp. 3–11 ). Guilford Press.

Gilovich (1993). How we know what isn't so. Free Press.

Gilovich, Griffin, & Kahneman, eds. (2002). Heuristics and biases: the psychology of intuitive judgment. Cambridge University Press.

Goel (2007). Cognitive neuroscience of deductive reasoning. In Holyoak & Morrison (eds.), The Cambridge Handbook of Thinking and Reasoning (pp. 475-492). Cambridge University Press.

Goel & Dolan (2003). Explaining modulation of reasoning by belief. Cognition, 87: B11-B22.

Gould (1991). Bully for brontosaurus: Reflections in natural history. Norton.

Hammond (1996). Human judgment and social policy. Oxford University Press.

Haidt (2006). The happiness hypothesis: Finding modern truth in ancient wisdom. Basic Books.

Hastie & Dawes, eds. (2009). Rational Choice in an Uncertain World, 2nd ed. Sage.

Higgins & Brendl (1995). Accessibility and applicability: Some 'activation rules' influencing judgment. Journal of Experimental Social Psychology, 31: 218–243.

Jacoby & Dallas (1981). On the relationship between autobiographical memory and perceptual learning. Journal of Experimental Psychology: General, 110: 306–340.

Johnson-Laird (1983). Mental Models. Cambridge University Press.

Kahneman & Tversky (1973). On the psychology of prediction. Psychological Review, 80: 237–251.

Kahneman et al. (1999). Economic preferences or attitude expressions? An analysis of dollar responses to public issues. Journal of Risk and Uncertainty, 19: 203–235.

Kahneman & Frederick (2002). Representativeness revisited: Attribute substitution in intuitive judgment. In Gilovich, Griffin, & Kahneman (eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 49-81). Cambridge University Press.

Kahneman & Frederick (2005). A model of heuristic judgment. In Holyoak & Morrison (eds.), The Cambridge Handbook of Thinking and Reasoning (pp. 267-294). Cambridge University Press.

Kahneman & Miller (1986). Norm theory: Comparing reality with its alternatives. Psychological Review, 93: 136–153.

Kahneman & Varey (1990). Propensities and counterfactuals: The loser that almost won. Journal of Personality and Social Psychology, 59(6): 1101–1110.

Klaczynski & Cottrell (2004). A dual-process approach to cognitive development: The case of children's understanding of sunk cost decisions. Thinking and Reasoning, 10: 147-174.

Kahneman, Slovic, & Tversky, eds. (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.

Klauer, Musch, & Naumer (2000). On belief bias in syllogistic reasoning. Psychological Review, 107: 852-884.

Leibniz (1702/1989). Letter to Queen Sophie Charlotte of Prussia, on what is independent of sense and matter. In Leibniz, Philosophical essays (pp. 186-192). Hackett.

Leibniz (1714/1989). Principles of nature and grace, based on reason. In Leibniz, Philosophical essays (pp. 206-213). Hackett.

Lichtenstein, Slovic, Fischhoff, Layman, & Combs (1978). Judged frequency of lethal events. Journal of Experimental Psychology: Human Learning and Memory, 4(6): 551-578.

Lieberman (2003). Reflective and reflexive judgment processes: A social cognitive neuroscience approach. In Forgas, Williams, & von Hippel (eds.), Social judgments: Implicit and explicit processes (pp. 44-67). Cambridge University Press.

Lieberman (2009). What zombies can't do: A social cognitive neuroscience approach to the irreducibility of reflective consciousness. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 293-316). Oxford University Press.

Mercier & Sperber (2009). Intuitive and reflective inferences. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 149-170). Oxford University Press.

Michotte (1963). The perception of causality. Basic Books.

Newstead (2000). Are there two different types of thinking? Behavioral and Brain Sciences, 23: 690-691.

Newstead, Pollard, & Evans (1992). The source of belief bias effects in syllogistic reasoning. Cognition, 45: 257-284.

Nisbett, Peng, Choi, & Norenzayan (2001). Culture and systems of thought: Holistic vs analytic cognition. Psychological Review, 108: 291-310.

Nosek (2007). Implicit-explicit relations. Current Directions in Psychological Science, 16: 65-69.

Osman (2004). An evaluation of dual-process theories of reasoning. Psychonomic Bulletin and Review, 11: 988-1010.

Pohl, ed. (2005). Cognitive illusions: a handbook on fallacies and biases in thinking, judgment and memory. Psychology Press.

Reber (1993). Implicit learning and tacit knowledge. Oxford University Press.

Rips (1994). The psychology of proof: Deductive reasoning in human thinking. MIT Press.

Saunders (2009). Reason and intuition in the moral life: A dual-process account of moral justification. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 335-354). Oxford University Press.

Schwarz, Bless, Strack, Klumpp, Rittenauer-Schatka, & Simons (1991). Ease of retrieval as information: Another look at the availability heuristic. Journal of Personality and Social Psychology, 61: 195–202.

Schwarz & Clore (1983). Mood, misattribution, and judgments of well-being: Informative and directive functions of affective states. Journal of Personality and Social Psychology, 45(3): 513–523.

Schwarz (1996). Cognition and communication: Judgmental biases, research methods, and the logic of conversation. Erlbaum.

Schwarz & Vaughn (2002). The availability heuristic revisited: Ease of recall and content of recall as distinct sources of information. In Gilovich, Griffin, & Kahneman (eds.), Heuristics & biases: The psychology of intuitive judgment (pp. 103–119). Cambridge University Press.

Sherry & Schacter (1987). The evolution of multiple memory systems. Psychological Review, 94: 439-454.

Sloman (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119: 1-23.

Sloman (2002). Two systems of reasoning. In Gilovich, Griffin, & Kahneman (eds.), Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press.

Slovic, Finucane, Peters, & MacGregor (2002). Rational Actors or Rational Fools: Implications of the Affect Heuristic for Behavioral Economics. Journal of Socio-Economics, 31: 329-342.

Smith & Collins (2009). Dual-process models: A social psychological perspective. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 197-216). Oxford University Press.

Smith & DeCoster (2000). Dual-process models in social and cognitive psychology: Conceptual integration and links to underlying memory systems. Personality and Social Psychology Review, 4: 108-131.

Smith & Semin (2004). Socially situated cognition: Cognition in its social context. Advances in experimental social psychology, 36: 53-117.

Smolensky (1988). On the proper treatment of connectionism. Behavioral and Brain Sciences, 11: 1-23.

Stanovich (1999). Who is rational? Studies of individual differences in reasoning. Psychology Press.

Stanovich (2004). The robot's rebellion: Finding meaning in the age of Darwin. University of Chicago Press.

Stanovich (2009). Distinguishing the reflective, algorithmic, and autonomous minds: Is it time for a tri-process theory? In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 55-88). Oxford University Press.

Strack & Deutsch (2004). Reflective and impulsive determinants of social behavior. Personality and Social Psychology Review, 8: 220-247.

Strack, Martin, & Schwarz (1988). Priming and communication: The social determinants of information use in judgments of life satisfaction. European Journal of Social Psychology, 18: 429–442.

Stevens (1975). Psychophysics: Introduction to its perceptual, neural, and social prospects. Wiley.

Sun (2001). Duality of mind: A bottom-up approach towards cognition. Psychology Press.

Sun, Lane, & Mathews (2009). The two systems of learning: An architectural perspective. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 239-262). Oxford University Press.

Toates (2004). 'In two minds' - consideration of evolutionary precursors permits a more integrative theory. Trends in Cognitive Sciences, 8: 57.

Toates (2006). A model of the hierarchy of behaviour, cognition, and consciousness. Consciousness and Cognition, 15: 75-118.

Thompson (2009). Dual-process theories: A metacognitive perspective. In Evans & Frankish (eds.), In Two Minds: Dual Processes and Beyond (pp. 171-195). Oxford University Press.

Tooby & Cosmides (1992). The psychological foundations of culture. In Barkow, Cosmides, & Tooby (eds.), The Adapted Mind (pp. 19-136). Oxford University Press.

Tversky & Kahneman (1973). Availability: a heuristic for judging frequency and probability. Cognitive Psychology, 5(2): 207–232.

Tversky & Kahneman (1982). Judgments of and by representativeness. In Kahneman, Slovic, & Tversky (eds.), Judgment under uncertainty: Heuristics and biases (pp. 84–98). Cambridge University Press.

Vadeboncoeur & Markovits (1999). The effect of instructions and information retrieval on accepting the premises in a conditional reasoning task. Thinking & Reasoning, 5: 97-113.

Verschueren, Schaeken, & d'Ydewalle (2005). A dual-process specification of causal conditional reasoning. Thinking & Reasoning, 11: 239-278.

Whyte (1978). The unconscious before Freud. St. Martin's Press.

Wilson (2002). Strangers to ourselves: Discovering the adaptive unconscious. Belknap Press.

Zajonc (1980). Feeling and thinking: Preferences need no inferences. American Psychologist, 35(2): 151–175.

Zajonc (1997). Emotions. In Gilbert, Fiske, & Lindzey (eds.), Handbook of social psychology (4th ed., pp. 591–632). Oxford University Press.

19 comments

Comments sorted by top scores.

comment by CronoDAS · 2011-04-15T18:15:10.529Z · LW(p) · GW(p)

That bit about the sphere inside the cube reminds me of something that I once heard in math class.

First, a bit of terminology: in some branches of mathematics, it's useful to make a distinction between "sphere" and "ball". A sphere only includes the points on the surface of the object in question, while a ball includes both the surface and the inside of the sphere. So it's a ball, not a sphere, that's being dropped into the cube.

Now, you notice how a three-dimensional ball takes up less of the space inside of the cube than a two-dimensional ball (a circle) does of a two-dimensional cube (a square)? Well, if you try the same thing with 4-dimensional objects, the 4-dimensional ball will fill even less space. In fact, if you take the limit of the volume of the unit ball as the number of dimensions goes to infinity, it actually goes to zero. Therefore, infinite-dimensional balls contain no space whatsoever.

So, as my math professor wisecracked, if you ever meet a man from the 23rd dimension, his balls are very small.

Replies from: lukeprog
comment by lukeprog · 2011-04-15T21:02:28.919Z · LW(p) · GW(p)

I will admit, you made me laugh.

comment by shokwave · 2011-04-15T02:12:32.536Z · LW(p) · GW(p)

My immediate reaction to this is wanting to couple it with this and, to stretch the metaphor, have the rider modify the environment so that the elephant's independent desires happen to be for things the rider desires as well.

Hmm. This is really interesting.

Replies from: listic, leoxagy12
comment by listic · 2011-04-17T21:02:50.677Z · LW(p) · GW(p)

For this, one would at least need to know the elephant's desires. That can't be easy.

comment by leoxagy12 · 2011-04-17T04:33:06.876Z · LW(p) · GW(p)

Seems like a good idea to me.

Replies from: listic
comment by listic · 2011-04-17T21:02:28.161Z · LW(p) · GW(p)

For this, one would at least need to know the elephant's desires. That can't be easy.

comment by CuSithBell · 2011-04-20T15:59:19.199Z · LW(p) · GW(p)

Making judgments is like riding an elephant - if you do it poorly, you might get crushed by an elephant.

comment by MinibearRex · 2011-04-15T02:24:52.079Z · LW(p) · GW(p)

In many cases, type 2 processes have no trouble correcting the judgments of type 1 processes. But because type 1 processes are slow, they can be interrupted by time pressure.

I've always thought of type 1 processes as fast. Is this a mistake on my part or a typo?

Replies from: lukeprog
comment by lukeprog · 2011-04-15T03:00:41.770Z · LW(p) · GW(p)

Fixed, thanks.

comment by moridinamael · 2011-04-15T14:43:21.295Z · LW(p) · GW(p)

I became aware of the elephant-and-rider metaphor a while ago, perhaps due to one of your posts. Since that time, I have attempted to take advantage of the insight by considering what else it could mean.

For example, the rider can "see farther" but the elephant perhaps can "see more clearly what is nearby." By this I mean only that feelings which have no obvious explanations often come from flashes of intuition about people, ideas or situations which your conscious mind would never have noticed.

In other words, the unconscious mind seems to be the seat of our various pattern matching algorithms, which leads us to make logical errors at times, but may also lead us to infer things about the motives or mental states of other humans or give us a "gut feeling" that some situation is unsafe, when the conscious mind would otherwise have blissfully ignored the danger.

This isn't really in contradiction to what you just wrote; the main idea is that half of training the elephant may be listening to the elephant.

Replies from: atucker, Davorak, Armok_GoB
comment by atucker · 2011-04-15T17:25:42.999Z · LW(p) · GW(p)

"gut feeling" that some situation is unsafe, when the conscious mind would otherwise have blissfully ignored the danger.

I think of my conscious mind as being useful in areas that I'm focusing on and paying attention to, and almost useless everywhere that I'm not.

My subconscious and automatic responses, on the other hand, seem to be much better at dealing with things that I'm not noticing (like breathing, keeping my feet moving in a reasonable manner, etc). However, I don't know enough neuroscience to know if my subconscious actually focuses on some things and not others.

comment by Davorak · 2011-04-16T19:28:27.330Z · LW(p) · GW(p)

The biggest insight I think most people can take away from the metaphor is that they can train themselves. Often people are overly optimistic about their ability to use their "will power" to change or stop a behavior at any moment they wish: "I can stop smoking any time I want," "I will eat this treat now but I will exercise later," etc. Often, the more productive method is to use type 2 processes to train your elephant so less "will power" is required.

comment by Armok_GoB · 2011-04-21T21:37:01.650Z · LW(p) · GW(p)

This makes me wonder if my experience with actual horse riding is useful for managing my own brain. It feels quite plausible that it might be so.

comment by Jonathan_Graehl · 2011-04-15T04:33:20.781Z · LW(p) · GW(p)

Fantastic citations. It took me a few moments to accept that belief bias is probably fast (type-1). You strike me as a precise and careful writer. Much appreciated.

I feel like I should have already noticed attribute substitution as a frequent cause of mistakes (in myself and others). I'll enjoy being at least briefly more aware of the tendency.

comment by Davorak · 2011-04-16T19:32:42.699Z · LW(p) · GW(p)

I have often remarked to people that their conscious self is not the dictator of their own mind and at best only has a vote. Most react with confusion and denial, with only an occasional "of course." Thank you for introducing me to the research performed in this area.

comment by Dreaded_Anomaly · 2011-04-15T02:35:19.043Z · LW(p) · GW(p)

This was a fascinating read. The results you quoted from various attribute substitution studies were particularly striking. Thanks for posting!

comment by Entraya · 2013-12-09T16:30:00.401Z · LW(p) · GW(p)

Now that we're speaking in metaphors, I will thank you for these low-hanging fruits of the tree of wisdom. I will familiarize myself with the website before I begin gnawing my way up the trunk. This is likely going to be the most incomprehensible and difficult thing to learn since I quit German classes. It's really quite interesting to know the details of how you actually think. Once you're aware of it, you may begin to notice it in effect, and that's a good standpoint to have if you want to change it. Just like the way you'd be suspicious of things set up in anything resembling the Milgram experiment, once you've learned about it. Keeping these things in mind, and being aware of how you think, what you think, and why you think what you think, is tremendously valuable, and a simple thing to do once it becomes habit.

comment by Dreaded_Anomaly · 2011-04-23T03:20:48.536Z · LW(p) · GW(p)

A new article in Mother Jones, The Science of Why We Don't Believe Science, discusses this phenomenon and references Haidt. (The article does frame the issue in terms of the stereotypical American political dichotomy, but it's worth reading nonetheless.)

Also, I'm wondering why this post never got promoted to the front page. It's part of a sequence, addresses human cognitive function... it seems like a prime candidate to me.

comment by Johnny · 2011-04-15T09:58:24.436Z · LW(p) · GW(p)

Great post. I remember being very struck by the idea of the rider and the elephant when I read Jonathan Haidt's Happiness Hypothesis.

I think it's a useful framework to think about procrastination and akrasia as well, something that you've written about in the past. I often think about my fears and desires as a disobedient elephant that I have to constantly look after.