Humans Shouldn't Make Themselves Smarter?

post by Ronny Fernandez (ronny-fernandez) · 2011-12-11T12:00:55.418Z · LW · GW · Legacy · 21 comments

Just thought you guys should know about this: some work arguing that humans should not enhance their intelligence with technology, and that superintelligence would probably never evolve.

21 comments

Comments sorted by top scores.

comment by lessdazed · 2011-12-11T14:33:59.964Z · LW(p) · GW(p)

Actual paper title from scientific journal: Why Aren't We Smarter Already? Evolutionary Trade-Offs and Cognitive Enhancements

Corresponding article headline: Human Brains Unlikely to Evolve Into a 'Supermind' as Price to Pay Would Be Too High

Actual paper title from scientific journal: Influence of Incubation Temperature on Morphology, Locomotor Performance, and Early Growth of Hatchling Wall Lizards (Podarcis muralis)

Projected future article headline: Killer 'Godzilla' Lizard Race Larger than Skyscrapers Unlikely to Arise because Global Warming Heats Eggs. All Forms of Genetic Engineering Therefore Impossible

comment by Scott Alexander (Yvain) · 2011-12-11T13:05:03.897Z · LW(p) · GW(p)

Walking on land is probably impossible, Pre-Cambrian researchers announced, since even if we did evolve some sort of "legs" our gills would be unable to extract oxygen from the environment.

comment by lessdazed · 2011-12-11T14:46:12.059Z · LW(p) · GW(p)

"Today we're at the beach, and yesterday we climbed the tallest mountain in the world," proclaimed the researchers over the phone during their celebratory Florida vacation. "Sugarloaf Mountain), 95 meters above sea level. No matter which direction by the compass you walk from the summit, it's down!"

comment by ZankerH · 2011-12-11T12:29:10.130Z · LW(p) · GW(p)

The main argument appears to be that, on average, higher intelligence implies a higher rate of mental disorders such as autism and Asperger's syndrome. I don't see how this relates to humans "making themselves smarter" - presumably, if we have the technology to improve our brains, we'll of course also be able to get rid of the nasty side effects introduced by the alien god that's been improving them for us thus far.

Replies from: DavidAgain, erratio
comment by DavidAgain · 2011-12-11T14:01:59.325Z · LW(p) · GW(p)

It's also that things which improve one aspect of mental performance are likely to harm another. This isn't massively surprising to me: you'd expect that if upping a single hormone level or whatever simply improved performance overall, evolution would have 'found' it already. But presumably the same is true of giving performance-enhancing drugs to less intelligent animals - or, for that matter, giving people steroids etc. to increase their physical performance.

But just because drugs that make you run faster might lower your life expectancy, that doesn't mean our current running speed is the best evolution or technology can achieve. The problem is that any complex adaptation, like intelligence, sits at a 'sweet spot' in the sense that a random massive change in a single factor will make it less successful. That doesn't mean that evolution, or potentially much more sophisticated technological enhancement, can't improve matters.

Also, the 'something's going to get worse' principle only holds if what we consider bad is the same as what evolution selects against. It could in principle be possible for humans to become much more intelligent by losing something that made them capable of defending themselves, reproducing, making allies, or whatever. If our aims differ from what benefits our genes' survival, we may well be able to improve on nature, as we do with artificial sweeteners, sex with condoms, and other cunning tricks.

Replies from: munkeegutz, red75
comment by munkeegutz · 2011-12-13T00:36:43.661Z · LW(p) · GW(p)

There's also the issue of "local maxima": it's possible (probable) that there are evolutionary paths to smarter humans, but the intermediate steps perform poorly, so selection resists moving through them.
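
To make that concrete, here's a toy sketch (an invented one-dimensional fitness landscape, nothing to do with any real model of cognition): a climber that only accepts immediately fitness-improving steps gets stranded on a lower peak, because every route to the higher peak crosses a low-fitness valley.

```python
import math

def fitness(x: float) -> float:
    """Toy landscape: a low peak near x=2 and a higher peak near x=8."""
    return 3 * math.exp(-(x - 2) ** 2) + 5 * math.exp(-(x - 8) ** 2)

def hill_climb(x: float, step: float = 0.1, max_iters: int = 10_000) -> float:
    """Greedily take whichever neighbouring step improves fitness, if any."""
    for _ in range(max_iters):
        best = max((x, x - step, x + step), key=fitness)
        if best == x:      # no neighbouring step is an improvement:
            return x       # stuck, possibly on a mere local maximum
        x = best
    return x

start = 1.0                # begins in the basin of the lower peak
peak = hill_climb(start)
print(f"stuck at x={peak:.2f}, fitness={fitness(peak):.2f}")
# -> stuck near x=2 (fitness ~3), never reaching the higher peak near
#    x=8 (fitness ~5), because the valley in between repels the climber.
```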

comment by red75 · 2011-12-12T07:17:45.634Z · LW(p) · GW(p)

Diversity of a population plays a role too. If I'm well below Feynman level (and I am), then there's a possibility that I can slightly improve my cognitive abilities without any negative consequences.

My experience with nootropics (racetams) seems to support this, insofar as anecdotal evidence can support anything.

comment by erratio · 2011-12-11T14:46:10.931Z · LW(p) · GW(p)

Of course, that assumes that autism should be considered a mental disorder. Many of those on the autism spectrum don't, whereas most of those with depression or high levels of anxiety do consider their condition to be a disorder. It looks a lot like status quo bias to me: "if being more intelligent would make our minds qualitatively different, then we shouldn't try to be more intelligent."

Replies from: Logos01, ZankerH
comment by Logos01 · 2011-12-11T15:18:05.359Z · LW(p) · GW(p)

Many of those on the autism spectrum don't, whereas most of those with depression or high levels of anxiety do consider their condition to be a disorder.

As a high-functioning autist, I would love for there to be a higher representation of fellow HFAs in the population. Our learning functions would still differ from the baseline population (as it currently exists), but... I feel the world would be a better place if HFAs made up as much as 10% of the population. Beyond that, I'm uncertain.

Replies from: fubarobfusco
comment by fubarobfusco · 2011-12-11T22:37:04.546Z · LW(p) · GW(p)

It seems to me that some of the "high-functioning" / "low-functioning" autism distinction actually has to do with the comorbidity of various other disorders and disabilities, as well as with the quality of schooling and other care. There seem to be a number of autistic folks whose lives are complicated by PTSD from bad psychiatric care, institutionalization, abusive schooling situations, etc. Presumably, if ASD were more common and better understood, these would be less likely.

comment by ZankerH · 2011-12-11T15:20:30.471Z · LW(p) · GW(p)

Then again, defining disorders by self-report isn't that much more accurate than going with "any mental condition considered weird by society".

Replies from: erratio
comment by erratio · 2011-12-11T16:26:43.005Z · LW(p) · GW(p)

"any mental condition considered weird by the society".

But that is how a lot of mental disorders are defined. See: attempts to medicalise non-heterosexuality.

comment by gwern · 2011-12-11T14:44:16.703Z · LW(p) · GW(p)

It's not really interesting - from the summaries, it isn't adding anything new to Algernon's Law except perhaps some more detailed examples.

(Nonetheless, I did request a copy. Might be useful.)

comment by Manfred · 2011-12-11T13:42:24.035Z · LW(p) · GW(p)

On the evolution-of-intelligence bit, he's probably right.

On the enhancement-of-intelligence part, he's not... entirely wrong. Given a fixed energy budget (i.e., the evolved environment), it seems reasonable that you can't improve the brain very much with gross chemical intervention (which is what he's talking about, in context). Of course, the energy budget isn't necessarily fixed, but it's still an interesting constraint.

Replies from: DavidAgain
comment by DavidAgain · 2011-12-11T17:07:17.926Z · LW(p) · GW(p)

Surely this depends on what you mean by 'improve the brain'. You might be able to make it better at things you consider important by undermining things that the evolutionary environment deems important.

Replies from: rwallace
comment by rwallace · 2011-12-12T09:47:21.234Z · LW(p) · GW(p)

You could also trade off things that were more important in the ancestral environment than they are now. For example, social status (to which the neurotypical brain devotes much of its resources) is no longer the evolutionary advantage that it used to be.

Replies from: gwern
comment by gwern · 2011-12-14T19:26:36.969Z · LW(p) · GW(p)

You two realize you are just reinventing Bostrom's EOCs (evolutionary optimality challenges), right?

People, I wrote a thorough essay all about this! If I left something out, just tell me - you don't need to reinvent the wheel!

(This goes for half the comments on this page.)

comment by djcb · 2011-12-11T20:18:56.594Z · LW(p) · GW(p)

Hmmm... the research smells a bit of status quo bias, no? I always think of Richard Feynman when thinking about super-intelligent people who are not socially awkward -- if we could raise average intelligence to his level, wouldn't that improve the world?

Whether there's an evolutionary (i.e., reproductive) advantage is not so clear - but humans are not limited by that.

comment by gwern · 2012-02-12T20:46:49.628Z · LW(p) · GW(p)

Copy jailbroken: http://www2.warwick.ac.uk/fac/sci/psych/people/academic/thills/thills/hillspublications/hillshertwig2011cdps.pdf

I've read it, and it doesn't cite Bostrom; as one would expect, this means it's pretty useless, a retread of Bostrom's paper. The main value of the paper, for me, is that it includes one or two useful examples I hadn't covered, and some simple math models showing how U-shaped curves can fall out of optimizing for multiple properties.
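
For a flavor of the math, here's a minimal sketch of my own (a made-up benefit/cost pair, not the model from the paper): give a trait a concave benefit and a convex cost, and net fitness peaks at an interior trait level, so pushing the trait in either direction from that optimum makes things worse.

```python
import math

# Toy model (illustrative only, not the paper's): trait level x buys a
# benefit with diminishing returns but incurs an accelerating cost.
def benefit(x: float) -> float:
    return math.log(1 + x)       # concave: each increment helps less

def cost(x: float) -> float:
    return 0.1 * x ** 2          # convex: each increment hurts more

def net(x: float) -> float:
    return benefit(x) - cost(x)  # peaks where the two pressures balance

# Locate the interior optimum on a coarse grid over x in (0, 10].
xs = [i / 100 for i in range(1, 1001)]
best = max(xs, key=net)
print(f"optimum at x = {best:.2f}, net fitness = {net(best):.3f}")
# -> peaks near x = 1.79; both x = 0 and large x do strictly worse,
#    one way a U-shaped (inverted-U) curve can fall out of optimizing
#    for multiple properties at once.
```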

EDIT: I emailed a link to Bostrom to the main author, who replied:

I think the arguments are actually quite similar, but from a slightly different perspective. We're both arguing that enhancement is possible, but that an understanding of the evolutionary and cognitive constraints is needed. We further add a bit on the kinds of domains where such trade-offs are most likely.

comment by mwengler · 2011-12-12T16:39:00.915Z · LW(p) · GW(p)

Seems stupid to me. If you look at human height, you see the problems 7-foot-tall humans have. And yet we have the giraffe...