Evolutionary Psychology
post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-11-11T20:41:03.000Z · LW · GW · Legacy · 42 comments
Like "IRC chat" or "TCP/IP protocol", the phrase "reproductive organ" is redundant. All organs are reproductive organs. Where do a bird's wings come from? An Evolution-of-Birds Fairy who thinks that flying is really neat? The bird's wings are there because they contributed to the bird's ancestors' reproduction. Likewise the bird's heart, lungs, and genitals. At most we might find it worthwhile to distinguish between directly reproductive organs and indirectly reproductive organs.
This observation holds true also of the brain, the most complex organ system known to biology. Some brain organs are directly reproductive, like lust; others are indirectly reproductive, like anger.
Where does the human emotion of anger come from? An Evolution-of-Humans Fairy who thought that anger was a worthwhile feature? The neural circuitry of anger is a reproductive organ as surely as your liver. Anger exists in Homo sapiens because angry ancestors had more kids. There's no other way it could have gotten there.
This historical fact about the origin of anger confuses all too many people. They say, "Wait, are you saying that when I'm angry, I'm subconsciously trying to have children? That's not what I'm thinking after someone punches me in the nose."
No. No. No. NO!
Individual organisms are best thought of as adaptation-executers, not fitness-maximizers. The cause of an adaptation, the shape of an adaptation, and the consequence of an adaptation, are all separate things. If you built a toaster, you wouldn't expect the toaster to reshape itself when you tried to cram in a whole loaf of bread; yes, you intended it to make toast, but that intention is a fact about you, not a fact about the toaster. The toaster has no sense of its own purpose.
But a toaster is not an intention-bearing object. It is not a mind at all, so we are not tempted to attribute goals to it. If we see the toaster as purposed, we don't think the toaster knows it, because we don't think the toaster knows anything.
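The adaptation-executer/fitness-maximizer distinction can be made concrete with a toy sketch (not from the original post; the `Toaster` and `Optimizer` classes and their numbers are purely illustrative):

```python
# An adaptation-executer runs a fixed rule laid down by its "designer";
# it does not represent or pursue the designer's goal.
class Toaster:
    def act(self, bread_slices):
        # Fixed behavior: toast at most two slices, whatever was intended.
        return min(bread_slices, 2)

# A fitness-maximizer explicitly represents a goal and searches for
# the option that best achieves it.
class Optimizer:
    def __init__(self, goal):
        self.goal = goal  # an explicit objective function

    def act(self, options):
        return max(options, key=self.goal)

toaster = Toaster()
print(toaster.act(8))  # 2 -- the toaster doesn't "want" more toast

maximizer = Optimizer(goal=lambda x: x)  # prefers as much as possible
print(maximizer.act([1, 2, 8]))  # 8 -- behavior tracks the represented goal
```

The toaster's "purpose" lives entirely in its designer; only the optimizer carries a goal around inside itself.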
It's like the old Stroop test of being asked to say the color of the letters in "blue". It takes longer for subjects to name this color, because of the need to untangle the meaning of the letters from the color of the letters. You wouldn't have similar trouble naming the color of the letters in "wind".
But a human brain, in addition to being an artifact historically produced by evolution, is also a mind capable of bearing its own intentions, purposes, desires, goals, and plans. Both a bee and a human are designs, but only a human is a designer. The bee is "wind", the human is "blue".
Cognitive causes are ontologically distinct from evolutionary causes. They are made out of a different kind of stuff. Cognitive causes are made of neurons. Evolutionary causes are made of ancestors.
The most obvious kind of cognitive cause is deliberate, like an intention to go to the supermarket, or a plan for toasting toast. But an emotion also exists physically in the brain, as a train of neural impulses or a cloud of spreading hormones. Likewise an instinct, or a flash of visualization, or a fleetingly suppressed thought; if you could scan the brain in three dimensions and you understood the code, you would be able to see them.
Even subconscious cognitions exist physically in the brain. "Power tends to corrupt," observed Lord Acton. Stalin may or may not have believed himself an altruist, working toward the greatest good for the greatest number. But it seems likely that, somewhere in Stalin's brain, there were neural circuits that reinforced pleasurably the exercise of power, and neural circuits that detected anticipations of increases and decreases in power. If there were nothing in Stalin's brain that correlated to power - no little light that went on for political command, and off for political weakness - then how could Stalin's brain have known to be corrupted by power?
Evolutionary selection pressures are ontologically distinct from the biological artifacts they create. The evolutionary cause of a bird's wings is millions of ancestor-birds who reproduced more often than other ancestor-birds, with statistical regularity owing to their possession of incrementally improved wings compared to their competitors. We compress this gargantuan historical-statistical macrofact by saying "evolution did it".
Natural selection is ontologically distinct from creatures; evolution is not a little furry thing lurking in an undiscovered forest. Evolution is a causal, statistical regularity in the reproductive history of ancestors.
And this logic applies also to the brain. Evolution has made wings that flap, but do not understand flappiness. It has made legs that walk, but do not understand walkyness. Evolution has carved bones of calcium ions, but the bones themselves have no explicit concept of strength, let alone inclusive genetic fitness. And evolution designed brains themselves capable of designing; yet these brains had no more concept of evolution than a bird has of aerodynamics. Until the 20th century, not a single human brain explicitly represented the complex abstract concept of inclusive genetic fitness.
When we're told that "The evolutionary purpose of anger is to increase inclusive genetic fitness," there's a tendency to slide to "The purpose of anger is reproduction" to "The cognitive purpose of anger is reproduction." No! The statistical regularity of ancestral history isn't in the brain, even subconsciously, any more than the designer's intentions of toast are in a toaster!
Thinking that your built-in anger-circuitry embodies an explicit desire to reproduce, is like thinking your hand is an embodied mental desire to pick things up.
Your hand is not wholly cut off from your mental desires. In particular circumstances, you can control the flexing of your fingers by an act of will. If you bend down and pick up a penny, then this may represent an act of will; but it is not an act of will that made your hand grow in the first place.
One must distinguish a one-time event of particular anger (anger-1, anger-2, anger-3) from the underlying neural circuitry for anger. An anger-event is a cognitive cause, and an anger-event may have cognitive causes, but you didn't will the anger-circuitry to be wired into the brain.
So you have to distinguish the event of anger, from the circuitry of anger, from the gene complex which laid down the neural template, from the ancestral macrofact which explains the gene complex's presence.
If there were ever a discipline that genuinely demanded X-Treme Nitpicking, it is evolutionary psychology.
Consider, O my readers, this sordid and joyful tale: A man and a woman meet in a bar. The man is attracted to her clear complexion and firm breasts, which would have been fertility cues in the ancestral environment, but which in this case result from makeup and a bra. This does not bother the man; he just likes the way she looks. His clear-complexion-detecting neural circuitry does not know that its purpose is to detect fertility, any more than the atoms in his hand contain tiny little XML tags reading "<purpose>pick things up</purpose>". The woman is attracted to his confident smile and firm manner, cues to high status, which in the ancestral environment would have signified the ability to provide resources for children. She plans to use birth control, but her confident-smile-detectors don't know this any more than a toaster knows its designer intended it to make toast. She's not concerned philosophically with the meaning of this rebellion, because her brain is a creationist and denies vehemently that evolution exists. He's not concerned philosophically with the meaning of this rebellion, because he just wants to get laid. They go to a hotel, and undress. He puts on a condom, because he doesn't want kids, just the dopamine-noradrenaline rush of sex, which reliably produced offspring 50,000 years ago when it was an invariant feature of the ancestral environment that condoms did not exist. They have sex, and shower, and go their separate ways. The main objective consequence is to keep the bar and the hotel and condom-manufacturer in business; which was not the cognitive purpose in their minds, and has virtually nothing to do with the key statistical regularities of reproduction 50,000 years ago which explain how they got the genes that built their brains that executed all this behavior.
To reason correctly about evolutionary psychology you must simultaneously consider many complicated abstract facts that are strongly related yet importantly distinct, without a single mixup or conflation.
42 comments
Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).
comment by Tom_McCabe2 · 2007-11-11T21:31:53.000Z · LW(p) · GW(p)
This level confusion also seems to show up whenever people talk about "free will"- a computer was programmed by us, but its code can still do things that we never designed it for. Evolution sure as heck never designed people to make condoms and birth control pills, so why can't a computer do things we never designed it to do?
Replies from: pnrjulius
↑ comment by pnrjulius · 2012-05-05T21:37:54.983Z · LW(p) · GW(p)
Are bugs free will?
Replies from: Luaan
↑ comment by Luaan · 2012-10-11T15:25:22.079Z · LW(p) · GW(p)
If we define "free will" as any action that is not the intended behaviour of the original designer, then yes. And it actually fits the bill relatively well, IMO - it is an emergent behaviour that (usually) arises when unexpected values appear somewhere in the code. And just like with us, the behaviour is deterministic, and at the same time pretty much impossible to predict in some cases :D
Multi-threading issues are a nice example - everything works very well in isolation, and breaks down in a real production environment.
comment by Matthew2 · 2007-11-11T22:14:29.000Z · LW(p) · GW(p)
The part about the ontological distinctiveness between cognitive and evolutionary causes reminds me of my old English professor who mixed the two. While I knew it was wrong, I didn't have a label. He believed that nature had a kind of memory through natural selection.
comment by Boris · 2007-11-11T22:26:19.000Z · LW(p) · GW(p)
You say "the neural circuitry of anger is a reproductive organ as surely as your liver" and "the evolutionary purpose of anger is to increase inclusive genetic fitness."
I don't believe you have enough evidence to assert these statements. All you know is that "angry ancestors had more kids" but you DON'T know that it's as a result of the anger. It could have happened that, say, the same ancestors that could run faster also happened to have the capacity for anger. As a result of their faster running, they reproduced/survived, and so did anger.
I liken this to classic studies on the effects of divorce on children. Of course, kids end up worse off with parents that divorce, but all else equal, divorce may very well be GOOD for the kid. Similarly, although here angry ancestors did have more kids, anger may very well be BAD for reproduction/survival. I'm sure there's also a good cynical example, too, like that the reason the dollar was the dominant currency through the 20th century was because it was green.
Replies from: pnrjulius, Technoguyrob, ThisIsYourSignal
↑ comment by pnrjulius · 2012-05-05T21:43:53.799Z · LW(p) · GW(p)
It's possible that anger was a byproduct of something else which is adaptive (certainly such evolutionary byproducts exist)... but it seems pretty unlikely in this case. Anger is a rather complicated thing that seems to have its own modular brain systems; it doesn't seem to be a byproduct of anything else.
↑ comment by robertzk (Technoguyrob) · 2013-09-22T20:20:54.749Z · LW(p) · GW(p)
The possibility of an "adaptation" being in fact an exaptation or even a spandrel is yet another reason to be incredibly careful about importing teleology into a discussion of evolutionarily-derived mechanisms.
↑ comment by ThisIsYourSignal · 2019-12-24T20:19:18.096Z · LW(p) · GW(p)
Yes dude! That's showing rigor. And PnrJulius' comment that Boris' comment "seems unlikely" is precisely the soft-serve sludge that rigorous thinkers like Boris here have to slog against day in and day out. Boo Julius, boo. Yay Boris, yay.
And then a roar of the crowd for TechnoGuyRob, who takes the long pass from Boris and dunks on Julius in a way J's grandbabies gonna feel when he writes "The possibility of an 'adaptation' being in fact an exaptation or even a spandrel is yet another reason to be incredibly careful about importing teleology into a discussion of evolutionarily-derived mechanisms."
It's problematic how stoked this exchange makes me. I'ma say it will not prove adaptive.
comment by Barkley_Rosser · 2007-11-11T22:28:26.000Z · LW(p) · GW(p)
Oh, tsk tsk. But women with "creationist" brains just don't have the sort of one night stands implied by your story, at least not as often as the ones with "evolutionary" brains, :-).
Replies from: pnrjulius
comment by Barkley_Rosser · 2007-11-11T22:30:40.000Z · LW(p) · GW(p)
OTOH, if they are creationists who have been reading too much Stephen Jay Gould, who knows what sorts of trouble they might get into. They might even tragically start selecting partners on multi-levels, while disobeying the correct equations, :-).
comment by Tiiba2 · 2007-11-11T22:52:02.000Z · LW(p) · GW(p)
My name is Tiiba, and I approve of this post.
That said, I have a question. Your homepage says:
"Most of my old writing is horrifically obsolete. Essentially you should assume that anything from 2001 or earlier was written by a different person who also happens to be named "Eliezer Yudkowsky". 2002-2003 is an iffy call."
Well, as far as I can tell, most of your important writing on AI is "old". So what does this mean? What ideas have been invalidated? What replaced them? Are you secretly building a robot?
Replies from: JohnWittle
↑ comment by JohnWittle · 2013-02-06T07:22:52.779Z · LW(p) · GW(p)
I have information from the future!
EY says it best in The Sheer Folly of Callow Youth, but essentially EY once thought, "If there is truly such a thing as moral value, then a superintelligence will likely discover what the correct moral values are. If there is no such thing as moral value, then the current reality is no more valuable than the reality where I make an AI that kills everyone. Therefore, I should strive to make an AI regardless of ethical problems."
Then in the early 2000s he had an epiphany. The mechanics of his objection had to do with disproving the first part of the argument, that a superintelligence would automatically do the 'right' thing in a universe with ethics. This is because you could build an AI 'foo' which was a superintelligence, and an AI 'bar' which was 'foo' except with a little gnome who sat at the very beginning of the decision algorithm and changed all of the goals from "maximize value" to "minimize value". This proves that it is possible for two superintelligences to do two completely different things, therefore an AI must be a Friendly AI in order to do the 'right' thing. This is when he realized how close he had come to perhaps causing an extinction event, and realized how important the FAI project was. (It was also when he coined the term FAI to begin with.)
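The "gnome" argument above can be put into a toy sketch (this is an illustrative reconstruction, not EY's actual formalism; the action set and utility values are made up):

```python
# Two equally capable agents: the same decision algorithm, but the second
# has its objective negated -- the "little gnome" sitting at the start of
# the decision procedure. Same intelligence, opposite behavior.
def best_action(actions, utility):
    # The shared "superintelligent" core: pick the highest-utility action.
    return max(actions, key=utility)

actions = ["protect humans", "do nothing", "kill everyone"]
utility = {"protect humans": 10, "do nothing": 0, "kill everyone": -10}.get

print(best_action(actions, utility))                # protect humans
print(best_action(actions, lambda a: -utility(a)))  # kill everyone
```

Since both agents run the identical search, intelligence alone cannot determine what is pursued; the goals have to be specified separately.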
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-11-11T22:55:37.000Z · LW(p) · GW(p)
Tiiba, please re-ask on this month's Open Thread and I'll delete the comment here.
comment by Gray_Area · 2007-11-11T23:27:54.000Z · LW(p) · GW(p)
Tom McCabe: "Evolution sure as heck never designed people to make condoms and birth control pills, so why can't a computer do things we never designed it to do?"
That's merely unpredictability/non-determinism, which is not necessarily the same as free will.
comment by billswift · 2007-11-12T05:14:05.000Z · LW(p) · GW(p)
"That's merely unpredictability/non-determinism, which is not necessarily the same as free will."
Prove it; at least give a reasonable definition of free will that doesn't include "unpredictability/non-determinism". For that matter, how about a definition of "unpredictability/non-determinism".
Free will and, usually, non-determinism are among the big ideas everyone talks about without having any idea what they're talking about.
comment by Pete_Carlton · 2007-11-12T06:24:19.000Z · LW(p) · GW(p)
"To reason correctly about evolutionary psychology you must simultaneously consider many complicated abstract facts that are strongly related yet importantly distinct, without a single mixup or conflation."
Sure, but after a while this just becomes a habit and I don't think it's more difficult than, say, organic chemistry. But without some practice or exposure, it is deeply counterintuitive. It's also probably encroaching on some sacred territory. You can subject some atrocious things like infanticide and homicidal rampages to evolutionary explanations. I don't think anyone's closed the book on any of these, but in all these cases I think EP has an interesting perspective. Generally, though, people don't even want to think about it. People probably resist thinking along these lines because of the perceived violation of their freedom or morality (which violation, as you say, is an illusion).
comment by Tiiba2 · 2007-11-12T07:23:22.000Z · LW(p) · GW(p)
I define free will as an ability to create and implement plans that move the world toward a goal. That seems to fit the way the term is used with regard to humans.
You can do something willfully (construct and implement a plan with a goal in mind), be coerced (construct and implement a plan that achieves something other than your own goals), be restrained (construct a plan that could achieve a goal, and then not implement it), or be manipulated (implement a plan that you did not construct, and whose goal you might not understand; you might or might not like the result).
Most programs implement plans that they did not create to achieve goals they don't understand in a world of which they don't know. But I think that if a machine can create and carry out plans, it has a degree of freedom.
comment by Gray_Area · 2007-11-12T09:11:20.000Z · LW(p) · GW(p)
billswift said: "Prove it."
I am just saying 'being unpredictable' isn't the same as free will, which I think is pretty intuitive (most complex systems are unpredictable, but presumably very few people will grant them all free will). As far as the relationship between randomness and free will, that's clearly a large discussion with a large literature, but again it's not clear what the relationship is, and there is room for a lot of strange explanations. For example some panpsychists might argue that 'free will' is the primitive notion, and randomness is just an effect, not the other way around.
comment by guest2 · 2007-11-12T19:38:46.000Z · LW(p) · GW(p)
"You can subject some atrocious things like infanticide and homicidal rampages to evolutionary explanations. I don't think anyone's closed the book on any of these, but in all these cases I think EP has an interesting perspective."
Huh? We know that group selection can lead to cannibalism, so analyzing infanticide and homicidal rampages would seem rather trivial. On the other hand, it's rather surprising that purely evolutionary mechanisms would lead to something as complex as our psychology and sense of morality. From a rational Bayesian perspective, how likely is it that an evolved adaptation executor would be able to formulate optimization criteria, while lacking an intuitive understanding of inclusive fitness?
Replies from: pnrjulius
↑ comment by pnrjulius · 2012-05-05T21:48:24.691Z · LW(p) · GW(p)
Right, this is a bit of a problem. Why do we have these complicated brains that work toward their own goals? This seems counter-productive to the goal of maximizing fitness by executing adaptations... but maybe it has other advantages we've not yet understood.
Replies from: Luaan
↑ comment by Luaan · 2012-10-11T15:39:00.878Z · LW(p) · GW(p)
Is the ability to plan really so special?
When an animal goes out of its nest, forages for food and then returns, isn't that the same planning we exhibit too? And now add that humans are omnivorous and acted both as pack hunters and as gatherers; suddenly complexity arises that requires you to be able to plan not only for yourself, but also as part of your group - these 10 guys will go hunt that mammoth, while these 5 will go gather berries and these 5 will make some new spears. Simply through the requirement for group interaction, you have another mechanism for the development of plans, psychology, and morality.
And that's kind of the point, isn't it? Who says our psychology and morality are a direct product of biological evolution? Biological evolution only gave us the tools (a brain capable of forming plans); morality is a social behaviour that evolved alongside it, led by intelligent designers (us) - with groups dividing on various issues, some of them surviving, some not; some of them spreading their ideas further, some not. We have long since taken over the reins of our development, even though we still move within certain constraints imposed on us, with varying flexibility (e.g. the ability to suppress our anger).
I think this is quite apparent when you look at animals bred in isolation or in different conditions; sure, there's some behaviour based on genetics, but it obviously isn't everything.
comment by Pete_Carlton · 2007-11-12T20:35:17.000Z · LW(p) · GW(p)
What are you saying - that EP has closed the book on them?
My point about infanticide etc. was that EP has bigger problems for becoming generally accepted than how difficult it is to reason about - problems having to do with a perceived removal of agency from human beings.
Anyway, it doesn't strike me as surprising that purely evolutionary mechanisms led to our psychology, and especially not our sense of morality. Are these things much more complex than any other animal behavior we're happily willing to concede to evolution?
comment by jesus_christ · 2008-05-19T09:05:13.000Z · LW(p) · GW(p)
it wasn't evolution that did it. it was my daddy peace dudes, seeyas all soon chris-dawg taken from the right (hand-side)
comment by ThrustVectoring · 2010-11-05T20:14:23.219Z · LW(p) · GW(p)
Another part of the evolutionary psychology puzzle is the distinction of levels, or what Yudkowsky calls the hands vs fingers.
It's not that anger causes reproduction. It's the anger circuitry, or the ability to be angry, that causes better reproductive success. The former statement doesn't even make sense in terms of evolutionary psychology, while the latter is fairly obvious (namely, that a lack of anger leads to not fighting one's way out of a reproductively hopeless situation; anger circuitry forces action in reproductive dead-ends).
comment by matteri · 2011-04-05T16:14:06.768Z · LW(p) · GW(p)
"Anger exists in Homo sapiens because angry ancestors had more kids. There's no other way it could have gotten there."
This is not entirely true - as Boris seems to have noticed. More generally: anything that purely helps survival is certainly more likely to propagate through a species. However, other traits might propagate too, such as any that are either: a) neither useful nor a burden, or b) a negative byproduct of something useful, one that doesn't outweigh the useful part.
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-04-05T18:32:56.160Z · LW(p) · GW(p)
Is it relevant that humanity doesn't have competent competition?
I wonder how we'd be doing if we were up against coyotes with thumbs.
Replies from: moshez, matteri
↑ comment by moshez · 2011-04-05T18:36:37.540Z · LW(p) · GW(p)
Um...Neanderthals had thumbs, and fairly large brains. We pretty much wiped them out. If they weren't "competent competition", I'm not sure what you'd call "competent" (unless it would have been some species that wiped us out, who would be here having the exact conversation, or something so delicately balanced that I doubt would ever happen).
Replies from: NancyLebovitz, David_Gerard
↑ comment by NancyLebovitz · 2011-04-05T18:55:04.323Z · LW(p) · GW(p)
We haven't wiped out coyotes, so they might be more competent competitors (even without thumbs) than Neanderthals.
↑ comment by David_Gerard · 2011-04-05T23:20:33.578Z · LW(p) · GW(p)
I thought they were subsumed into the European branch of Cro-Magnon (us).
Replies from: moshez
↑ comment by moshez · 2011-04-05T23:26:43.539Z · LW(p) · GW(p)
Controversial: http://en.wikipedia.org/wiki/Neanderthal_admixture_theory -- but in any case, 1%-4% of the genome? That's close enough to extinction...if coyotes interbred with dogs, and lots of household dogs had 1%-4% coyote DNA in them, but there would be no coyotes in the wild, I'd treat it as "extinct enough for me." :)
↑ comment by matteri · 2011-04-05T22:51:02.730Z · LW(p) · GW(p)
First of all; I don't see any apes or monkeys competing with us presently. Also, we are an evolved species. There have certainly been competitors along the way - perhaps said monkeys or apes and most certainly neanderthals as moshez mentioned. We've won though; that is hardly arguable.
Replies from: Desrtopa
comment by MrHen · 2011-07-26T17:50:17.308Z · LW(p) · GW(p)
I don't understand the point of this post. I mean, I understand its points, but why is this post here? Is it trying to point out that: (a) intent and reality are not always -- and usually aren't -- entangled? (b) Reality happened and our little XML-style purpose tags are added after the fact?
It seems odd to spend so much time saying, "Humans reproduced successfully. Anger exists in humans." If the anger part is correlated to the reproduction part it seems fair to ask, "Why did anger help reproduction?" This is a different question than, "What is the purpose of anger?" Is this difference what the article was pointing out?
To reason correctly about evolutionary psychology you must simultaneously consider many complicated abstract facts that are strongly related yet importantly distinct, without a single mixup or conflation.
How is this different from any other topic?
To reason correctly about computer science you must simultaneously consider many complicated abstract facts that are strongly related yet importantly distinct, without a single mixup or conflation.
To reason correctly about Starcraft II you must simultaneously consider many complicated abstract facts that are strongly related yet importantly distinct, without a single mixup or conflation.
The idea of special-casing evolutionary psychology is where I feel I am losing the plot.
Replies from: wedrifid, JGWeissman
↑ comment by JGWeissman · 2011-07-26T18:42:56.178Z · LW(p) · GW(p)
Evolutionary psychology is related to the study of cognitive biases, so being able to reason about it well is important. It is also easily observable that people make the mistakes this post warns against.
When discussing goal systems and terminal values, people with a confused view of evolutionary psychology tend to suggest that we should try to maximize inclusive genetic fitness, and this post discusses the confusion which leads to that common mistake.
And Eliezer has also drawn examples from computer science; I don't think he is favoring evolutionary psychology. It is not surprising that some posts focus on a subtopic of rationality or a specific domain of its application.
comment by pnrjulius · 2012-05-05T21:42:17.545Z · LW(p) · GW(p)
The part where this gets difficult is understanding why we evolved to have conscious intentions in the first place. What purpose does it serve to make us actually want things, rather than simply act as if we wanted them? Why aren't we like toasters?
This also gets at one of the reasons why I think it's a fool's errand to try to make the singularity with a non-sentient AI. If it were possible to make that level of intelligence without consciousness (and do so efficiently and cheaply), surely natural selection would have done so? Instead it made us sentient; this suggests that sentience is a useful thing to have.
Replies from: army1987
↑ comment by A1987dM (army1987) · 2013-02-16T15:30:13.962Z · LW(p) · GW(p)
What purpose does it serve to make us actually want things, rather than simply act as if we wanted them?
For certain values of “act as if”, what you're asking is why we aren't p-zombies.
Replies from: Capla
comment by lesswronguser123 (fallcheetah7373) · 2024-04-12T16:33:54.278Z · LW(p) · GW(p)
Like "IRC chat"
I don't think that aged well :)
comment by lesswronguser123 (fallcheetah7373) · 2024-04-12T18:16:14.787Z · LW(p) · GW(p)
I found a related article, written prior to this one, on this topic, which seems to expand on the same thing.