Epistemic status: Practising thinking aloud. There might be an important question here, but I might be making a simple error.
There is a lot of variance in general competence between species. Here is the standard Bostrom/Yudkowsky graph to display this notion.
There's a sense that while some mice are more genetically fit than others, they're broadly all just mice, bound within a relatively narrow range of competence. Chimps should not be worried about most mice, in the short or long term, but nor should they especially worry about peak mice - there's no incredibly strong or cunning mouse they ought to look out for.
However, my intuition is very different for humans. While I understand that humans are all broadly similar, that a single human cannot have a complex adaptation that is not universal, I also have many beliefs that humans differ massively in cognitive capacities in ways that can lead to major disparities in general competence. The difference between someone who understands calculus and someone who does not is the difference between someone who can build a rocket and someone who cannot. And I've tried to teach people that kind of math - sometimes succeeding, and sometimes failing to teach even basic fractions.
I can try to operationalise my hypothesis: if average human intelligence were lowered to an IQ of 75 in present-day society, that society could not have built rockets or done much of its other engineering and science.
(Sidenote: I think the hope of iterated amplification is that this is false. That if I have enough humans with hard limits to how much thinking they can do, stacking lots of them can still produce all the intellectual progress we're going to need. My initial thought is that this doesn't make sense, because there are many intellectual feats like writing a book or coming up with special relativity that I generally expect individuals (situated within a conducive culture and institutions) to be much better at than groups of individuals (e.g. companies).
This is also my understanding of Eliezer's critique: that while it's possible to get humans with hard limits on cognition to make mathematical progress, it's by running an algorithm on them that they don't understand, not an algorithm that they do understand - and only if they understand it do you get nice properties about them being aligned in the same way you might feel many humans are today.
It's likely I'm wrong about the motivation behind Iterated Amplification though.)
This hypothesis doesn't imply that someone who can do successful abstract reasoning is strictly more competent than a whole society of people who cannot. The Secret of Our Success talks about how smart modern individuals stranded in forests fail to develop basic food preparation techniques that other, more primitive cultures had built.
I'm saying that a culture with no people who can do calculus will in the long run score basically zero against the accomplishments of a culture with people who can.
One question is why we're in a culture so precariously balanced on this split between "can take off to the stars" and "mostly cannot". An idea I've heard is that a culture easily able to reach technological maturity will come later than a culture barely able to reach it, because evolution works over much longer time scales than culture + technological innovation. As such, if you observe yourself to be in a culture that is able to reach technological maturity, you're probably "the stupidest such culture that could get there, because if it could be done at a stupider level then it would've happened there first."
As such, we're a species where, if we try as hard as we can - if we take brains optimised for social coordination and make them do math - we can just about reach technological maturity (i.e. build nanotech, AI, etc.).
That may be true, but the question I want to ask about is what is it about humans, culture and brains that allows for such high variance within the species, that isn't true about mice and chimps? Something about this is still confusing to me. Like, if it is the case that some humans are able to do great feats of engineering like build rockets that land, and some aren't, what's the difference between these humans that causes such massive changes in outcome? Because, as above, it's not some big complex genetic adaptation some have and some don't. I think we're all running pretty similar genetic code.
Is there some simple amount of working memory that's required to do complex recursion? Like, 6 working memory slots makes things way harder than 7?
I can imagine that there are many hacks, and not a single thing. I'm reminded of the story of Richard Feynman learning to count time, where he'd practice being able to count a whole minute. He'd do it while doing the laundry, while cooking breakfast, and so on. He later met the mathematician John Tukey, who could do the same, but they had some fierce disagreements. Tukey said you couldn't do it while reading the newspaper, and Feynman said he could. Feynman said you couldn't do it while having a conversation, and Tukey said he could. They then both surprised each other by doing exactly what they said they could.
It turned out Feynman was hearing numbers being spoken, whereas Tukey was visualising the numbers ticking over. So Feynman could still read at the same time, and his friend could still listen and talk.
The idea here is that if you're unable to use one type of cognitive resource, you may make up for it with another. This is probably the same situation as when you make trade-offs between space and time in computational complexity.
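The computational version of that trade-off can be made concrete with a toy sketch (illustrative only, not a model of cognition): caching intermediate results spends memory to save time, while recomputing them spends time to save memory.

```python
from functools import lru_cache

def fib_slow(n):
    # O(1) extra space, exponential time: every subproblem is recomputed
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # O(n) space buys linear time: each subproblem is solved only once
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

# Same answer either way; only the resource profile differs.
assert fib_slow(20) == fib_fast(20) == 6765
```

Feynman and Tukey were arguably doing the analogous swap with auditory versus visual resources.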
So I can imagine different humans finding different hacky ways to build up the skill to do very abstract truth-tracking thinking. Perhaps you have a little less working memory than average, but you have a great capacity for visualisation, and primarily work in areas that lend themselves to geometric / spatial thinking. Or perhaps your culture can be very conducive to abstract thought in some way.
But even if this is right I'm interested in the details of what the key variables actually are.
A mouse brain has ~75 million neurons, a human brain ~85 billion. The standard deviation of human brain size is ~10%. If we think of that as a proportional rather than absolute increase in the number of neurons, that's ~74 standard deviations of difference. The correlation between number of neurons and IQ in humans is only ~0.3, but that's still a massive difference. Total neurons/computational capacity does show a pattern somewhat like that in the figure. Chimps' brains are ~3x smaller than humans', ~12 standard deviations.
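A quick sanity check on that arithmetic (treating each ~10% standard deviation as a multiplicative step, so the number of SDs is the gap in log-space divided by log(1.1)):

```python
import math

mouse, human = 75e6, 85e9   # approximate neuron counts from above
chimp = human / 3           # chimp brains ~3x smaller than human

def sds_apart(small, large, sd=0.10):
    """How many ~10% proportional steps separate two neuron counts."""
    return math.log(large / small) / math.log(1 + sd)

print(round(sds_apart(mouse, human)))  # 74
print(round(sds_apart(chimp, human)))  # 12
```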
Selection can cumulatively produce gaps that are large relative to intraspecific variation (one can see the same relationships even more blatantly considering total body mass). Mice do show substantial variation in maze performance, etc.
And the cumulative cognitive work that has gone into optimizing the language, technical toolkit, norms, and other elements of human culture and training is immensely beyond that of mice (and note that human training of animals can greatly expand the set of tasks they can perform, especially with some breeding to adjust their personalities to be more enthusiastic about training). Humans with their language abilities can properly interface with that culture, dwarfing the capabilities both of small animals and of people in smaller, earlier human cultures with less accumulated technology or economies of scale.
Hominid culture took off enabled by human capabilities [so we are not incredibly far from the minimum needed for strongly accumulating culture, the selection effect you reference in the post], and kept rising over hundreds of thousands and millions of years, at accelerating pace as the population grew with new tech, expediting further technical advance. Different regions advanced at different rates (generally larger connected regions grew faster, with more innovators to accumulate innovations), but all but the smallest advanced. So if humans overall had lower cognitive abilities there would be slack for technological advance to have happened anyway, just at slower rates (perhaps manyfold), accumulating more by trial and error.
Human individual differences are also amplified by individual control over environments, e.g. people who find studying more congenial or fruitful study more and learn more.
First, humans aren't at equilibrium; as you point out, our environment has shifted much more quickly than evolution has time to catch up with. So we should expect that many analyses that make sense at equilibrium aren't correctly describing what's happening now.
Second, while it seems like "humans are very different yet mice are all the same," this is often because it's easy to track the differences in humans but difficult to track the differences in mice. What fraction of mice become parents (a decent proxy for the primary measure of success, according to evolution)? Would it look like the core skills of being a mouse (finding food, evading predators, sociability, and so on) have variance comparable to the human variation in intelligence? What fraction of humans become parents?
Third, while we have some evidence that humans are selected for intelligence (like the whole skull/birth canal business), intelligence is just one of many traits that are useful for humans, and we don't have reason to believe this is the equilibrium that would result if intelligence were the only determinant of fitness. Consider Cochran et al on evidence for selection for intelligence for Ashkenazi Jews; they estimate that parents had perhaps a 1 IQ point edge over non-parents for the last ~500 years (with lower estimates on the heritability of intelligence having to only slightly increase that number).
Fourth, rapid population growth generally amplifies variance along dimensions that aren't heavily selected for if the population growth is accomplished in part by increasing the number of parents.
Half formed thoughts towards how I think about this:
Something like Turing completeness is at work, where our intelligence gains the ability to loop in on itself, and build on its former products (e.g. definitions) to reach new insights. We are at the threshold of the transition to this capability, half god and half beast, so even a small change in the distance we are across that threshold makes a big difference.
From "complex adaptations require a sequence of simpler adaptations", we cannot conclude "that a single human cannot have a complex adaptation that is not universal"! What you are missing is physical barriers that existed for large parts of human history and prehistory, such as the Sahara Desert, that largely prevented any significant amount of gene flow. While individual early European explorers could cross the Sahara Desert and might even have survived for a short while, they would face considerable dangers on the other side: tropical diseases such as malaria to which their bodies and immune systems in particular were poorly adapted, no modern science-based medicine to treat them with and no easy way to communicate with locals, unfamiliar wild animals such as the very dangerous hippopotamus, and hostile African tribes.
And yes, African tribes would probably have been highly suspicious of, and hostile towards, white outsiders - not least because of their "strange-coloured" skin, strange clothes, and incomprehensible language. Even today, albino Africans are discriminated against in Africa, and sometimes even killed for their flesh to be used by witch-doctors. The notion that we should care about all humans equally, because racial and national differences are superficial at most, is quite a modern phenomenon. There is little sign of these kinds of ideas in most of human history, although Islam approaches this idea with the idea of the Ummah (brotherhood of all Muslim men and sisterhood of all Muslim women) and Christianity gets even closer with the idea of people of other races (the Good Samaritan, for example) being "your neighbour" who should be treated "as yourself". Both were quite radical breaks from the more tribal pasts that had preceded them - and we can see that in evidence in the Old Testament. The God of the Old Testament is portrayed as snuffing out entire rival tribes - or in one case, the Great Flood story, the entire outside world. But of course, the period I am referring to was before European imperialism spread Christianity in Africa. (And in practice, Christian ideals of treating others as neighbours were not in evidence in the Crusades, or in the many expulsions of Jews from European countries.)
One cause of variation in intelligence is injuries, such as head injuries. (It is a little-known fact that this can also cause personality changes, such as increased risk-taking - so there is an intriguing hypothesis that the prevalence of both diagnosed and undiagnosed blunt-force head trauma caused by male students playing American Football may have led to some men randomly obtaining the "gift" of risk-taking, leading to greater US entrepreneurship, and thus contributed to America's impressive economic performance. My confidence in this hypothesis is very low, however, because I have only seen it mentioned on Twitter.)
However, it is noteworthy that while injuries - such as brain injuries caused by temporary loss of oxygen during a badly-managed birth, or by blunt-force trauma - can cause a reduction in IQ, the "moron level" or "idiot level" - the IQ level at which someone is considered developmentally disabled - can differ by population group. It is an unpalatable fact that a white European with an IQ of 60, and the typical impairments going along with that, would probably be considered developmentally disabled, while a pygmy hunter-gatherer with an IQ of 60 - normalised to the same scale as the European - might not. The pygmy, in line with other traditional hunter-gatherer peoples that survive today, perhaps does not read or write or count above ten in their native language, and spends a lot of time hunting, gathering, or trying to sell what they have hunted, so the cognitive demands of their daily life are arguably somewhat lower; but this difference in assessments is mainly because the IQ averages of the two groups are significantly different. Why might we expect this to be so? As already noted, different population groups were almost completely separated by physical barriers, such as the Sahara Desert and the ocean around Australia, for long periods of time. They also faced different selection pressures - another factor making substantial group differences plausible from an evolutionary perspective. Recall that the majority of experts believe that homo sapiens originated in Africa; if that's true, all races have ancestors there, so white and Asian people must have had ancestors who lived in Africa, some of whom migrated out at some point. (And by the way, we have research on how each population group on earth got where they reside today, including Native Americans and Australian Aborigines.)
The development of farming - obviously a technological factor, but one that was sort of necessitated by migrating into harsher environments with harsher winters - may have exacerbated these selection pressures.
Hunter-gathering clearly requires a certain amount of intelligence, but perhaps less so than farming. Farming, particularly in challenging environments with long, heavy winters with snow on the ground, requires more long-term thinking and values, and rewards intelligence that produces innovations yielding more food more reliably. In the Malthusian era that comprised most of human history and prehistory, those who were able to have, feed, and protect more children because of their superior food-obtaining abilities would often have passed on more copies of their genes, while those who had less success at doing so would have passed on fewer copies, or even none at all. This is so even though such technological innovations will have spread over time, thanks to the one intelligence characteristic that dramatically differs between humans and all other species - the ability to learn from and teach each other. (Note that many humans throughout Malthusian times never had any surviving children - whether due to infertility, extreme poverty, or injuries acquired from war or other violence.)
But the group differences don't stop there. Certain specific population groups, such as Ashkenazi Jews in Europe, the Igbo tribes in Nigeria, and - as recent indirect findings by Gregory Clark have indicated - in modern times, the Southern English (which includes Northern English individuals who migrated, or whose ancestors migrated, to the South of England), seem to have higher average intelligence than the broader groups they live next to, despite not having been physically separated from other groups in quite the same way. On the other end of the intelligence spectrum, though this is doubtless somewhat embarrassing for at least some people with such ancestry, there seem to be correlations in humans between brain size and IQ, between height and IQ, and between height and brain size, and the observed intelligence of certain population groups with shorter average height seems to bear this correlation out. (However, it is important to note that women are not substantially less intelligent than men on average, despite being significantly shorter on average. I would speculate that perhaps women have a similar total number of neurons to men, but packed in more densely on average - in line with how certain other parts of their bodies are often genetically programmed to be smaller than, larger than, or structurally different to men's. Animal research has established that an underlying factor influencing intelligence on an inter-species level is number of neurons, rather than brain size per se - which explains why whales and elephants are not more intelligent than humans.)
Why might this be so? For Ashkenazi Jews - that is, Jews of proximate European ancestry - we don't know for sure. However, one plausible hypothesis holds that the regulatory environment over the course of hundreds of years of living in European nations, where Jewish people were frequently the only people allowed to levy interest-bearing loans due to the then-in-force Christian prohibition on usury, together with high-pressure sexual selection and notably arranged marriages, would have selected for successful and intelligent people and their genes.
Note that this probably seems very implausible from the perspective of the idea, common among non-scientists in the West, that evolution takes a very, very long time to do anything, especially to produce complex traits - a conception of evolution which has probably been influenced by the highly influential essays of the anti-hereditarian (i.e. opposed to the idea that there are genetically-caused differences in average IQ between groups) Stephen Jay Gould. But actually we have an existence proof that selection doesn't have to take a long time to produce changes: the human development of dog breeds using artificial selection, of cows bred to have unnaturally large udders to maximise milk production, and so on, and scientific research on species with much shorter generation lengths than our own bears this out as well. Even Gould himself was a proponent of the theory of punctuated equilibrium, which posits long periods where nothing much happens, punctuated by periods of comparatively rapid change. And if you think about it, an incredibly large number of adaptations were required to turn simple microorganisms into modern humans - both ones currently known to science and presumably a large number not yet known, because of our comparative lack of understanding of the human brain - so it cannot be the case that each and every one of those changes took millions of years: there simply wouldn't have been enough time.
In addition, being able to memorise and interpret large portions of the scriptures was highly valued, which would plausibly have selected for verbal abilities, which are, anecdotally, extraordinarily high among Ashkenazi Jews (I know an Ashkenazi Jewish man who speaks several languages fluently - including English, which is not his first language, with flawless pronunciation). Indeed, not only would more successful Jewish financiers, religious scholars and rabbis on average have had more children - less successful Jewish men, if they reproduced at all, might simply have out-married from the group, and therefore their heirs would not have been considered Jewish. This would be despite having the same proportion of Jewish blood - 50% - as a child of a Jewish female and a Christian convert, who would be considered Jewish under traditional Jewish law, and would of course be raised as part of the Jewish community. So the whole traditional demarcation between Jewish and non-Jewish people, defined by "mother is Jewish", technically makes little sense from a modern scientific perspective with the benefit of our knowledge of how genetics works, but it could help to form part of the story of how the average intelligence of Jewish people living in Europe apparently rose more than that of their non-Jewish peers, despite not-inconsiderable intermarriage in both directions over the centuries (which is probably why Jewish and non-Jewish people of European ancestry are often difficult to distinguish visually, in the absence of clothing or grooming cues in the case of Orthodox Jewish men).
Less is known about the history of the Igbo tribes, who were historically farmers, and even less about the past of individual hunter-gatherer tribes.
In England, the South is more prosperous due to the very economically-important city of London, in modern times one of the world's top financial centres, but which has also been a key global city for centuries. So there would have been strong economic incentives to migrate for work, or a better or more reliable income, over the course of a number of centuries - especially in centuries past when the equivalents of "welfare" (in British English, "benefits") were slim or inadequate compared to today, or when there was no year-round work available in the local village, with its relatively simple economy. I don't know how open to migration the English are on average, but Americans are unusually prone to migrate for a better job (another possible reason for unusually high salaries in America, I would speculate), and many Americans have English ancestry, of course. So perhaps selection effects alone could explain the apparent average IQ differences within England.
Hunter-gathering probably needed skills that are harder to replace with AI *for the common individual*, even if farmer societies can use their greater numbers to accumulate more technology. This is because farmer societies move around less and have more specialized labor, making life require narrower tasks and less general problem solving. The latter is what we call "intelligence". The 21st century is reversing this trend rapidly with automation.
Farming existed almost all over the world for long enough for natural selection to matter, but not long enough for our DNA and cultures to completely "forget" life as a hunter-gatherer. Modern society is (most likely) a mix of ~10% hunter-gatherer phenotypes to ~90% farmer. The farmer phenotype has a lot less interest in asking "why". The "intelligence" difference between a person who has asked "why" since age 3 and one who hasn't is a result of massive differences in training. Same with any other talent, such as violin.
No non-human animal species with well-studied intelligence underwent a sudden transition in the window of ~10,000 years ago that drastically changed the skills needed to get by.
we cannot conclude "that a *single* human cannot have a complex adaptation that is not universal"
You seem to have missed an important word, so I bolded it for you. A reproductively isolated population is a very different case. For example, a bunch of finches got stuck in the Galapagos a few million years ago; you might have heard of them.
I have my own (unoriginal) answer, outlined in this post.
In short, I think it's important to distinguish learning ability and competence. The reason why Magnus Carlsen towers above me in chess is because he has more practice, not because he's superintelligent.
However, I also think that even putting this distinction aside, we shouldn't be surprised by large variation. If we think of the brain like a machine, and cognitive deficits as broken parts in the machine, then it is easy to see how distributed cognitive deficits can lead to wide variation.
Is there some simple amount of working memory that's required to do complex recursion? Like, 6 working memory slots makes things way harder than 7?
My theory is that 4 is enough, and the extra slots many people have are just useful overkill: they're there because they don't hurt anything, and extra memory makes things easier and faster.
So, first, note that for computing pairwise operations, you only need a stack as high as 3 (so long as additional future inputs can come from somewhere else). If you've ever worked with a stack-based calculator, like one that supports reverse polish notation (RPN), or a stack-based computer language, you probably found this out empirically, but you might have also known it from the fact that early stack-based calculators had only 3 registers and that was good enough.
Well, almost. With 3 registers you can only carry forward a single term between operations and only perform scalar operations. With 4 registers to construct the stack you can do pairwise operations of two variables or carry forward an additional term. But more importantly in practice having only 3 registers is annoying and although you can theoretically do whatever you want it requires careful ordering of operations to avoid a stack overflow. With 4 you rarely have to think ahead, which is nice: you just perform the operations and having the extra register lets you get into and out of near overflows without actually running out of space and having to start over.
More registers are nice, but registers cost money, and for a long time stack-based calculators settled on using 4 registers because it was the best balance of cost, functionality, and flexibility. Again, 3 was enough, but annoying enough to work with that everyone was happy to pay for 4, but few were willing to pay for more.
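A minimal sketch of what a fixed-depth stack machine feels like (hypothetical Python, not modelling any particular calculator): with a depth limit of 4 you can evaluate ordinary pairwise expressions comfortably, but an expression ordered to pile up too many pending terms overflows.

```python
def eval_rpn(tokens, max_depth=4):
    """Evaluate reverse Polish notation on a bounded stack,
    mimicking a 4-register calculator."""
    ops = {'+': lambda a, b: a + b,
           '-': lambda a, b: a - b,
           '*': lambda a, b: a * b,
           '/': lambda a, b: a / b}
    stack = []
    for tok in tokens:
        if tok in ops:
            b, a = stack.pop(), stack.pop()   # operators shrink the stack
            stack.append(ops[tok](a, b))
        else:
            if len(stack) == max_depth:       # no room for another operand
                raise OverflowError("stack overflow: too many pending terms")
            stack.append(float(tok))
    return stack[-1]

# (3 + 4) * (2 + 5): peak depth 3, fine on a 3- or 4-register machine
print(eval_rpn("3 4 + 2 5 + *".split()))  # 49.0
```

With `max_depth=4`, an input like `1 2 3 4 5 + + + +` overflows, which is exactly the "careful ordering of operations" annoyance described above.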
Now, does this mean 4 is enough for complex recursion? I mean, sure, so long as you are tail call optimizing. More just makes life easier and means you don't have to recurse. Why wouldn't you want to do that, though?
Well, doing all this assumes you can perform the complex mental operations you want as pairwise operations. And maybe you can do that, but also maybe you don't know how, so you end up needing more working memory to deal with trying to perform operations you can't perform "natively", yet.
And why think of the mind as performing mental operations over working memory at all or that you can develop access to more powerful operations that let you do more with the same memory? That's a long topic, but I'd recommend this paper as a starting point that melds well with the viewpoint I've expressed here.
I have seen articles that track the IQ of countries in the Northern Hemisphere vs countries along the equator. The reported IQ of people in colder countries is significantly higher than along the equator. Maybe historically people who lived in cold climates had to struggle to survive against the climate, and this caused them to exercise a larger fraction of the brain in order to survive?
You have to struggle even harder to survive in hotter temperatures - or at least that is how I feel. Nowadays, I associate higher temperatures with an invitation to leisure and laziness somehow. It also happens that the countries along the equator are somewhat less developed, or have been dealing with political, climate, economic, and health issues for most of their existence.
In terms of countries near the equator, are you seeing cause or effect? Equatorial regions should have been able to live off the land (eating and using plants that they did not have to grow). In northern countries, people either hunted to survive or became agriculture-based societies. Either way they would have put in more planning and effort. Repeating this for a few hundred years might result in more creativity and reasoning ability.
The sample size isn't big enough: nearby countries are too strongly coupled with each other. Regions are closer to independent. But the only major regions were Europe, the Middle East, East Asia, India, West Africa, East Africa, South Africa, and several in the New World. It's hard to form statistics around such a small number.
Of these, four enjoyed "top of the world" civilization status at some point in time: the Middle East + North Africa, Europe, China, and India. Mesoamerica lived independently until it got destroyed by Europe, so it is hard to place in a global hierarchy. This list is pretty random; there is no evidence for or against an "avoidance of the equator".
And it keeps moving. After the industrial revolution, the "top of the world" in terms of innovation moved from England to New York (Edison days) and most recently into the Silicon Valley.
But why does the king of the hill keep changing? What breaks the technology/resource-extraction/warfare feedback loop? It's basically crumbling infrastructure combined with regulatory capture at all levels of institutions. The trouble is that the timescales are now so fast that Silicon Valley is already elderly. There are 5 gas-tank apps. Many startups use the same recycled formula - social media, blockchains, "Uber for X" - rather than addressing new problems. We already have to go to wherever is "next" (which may or may not be in the same physical area) if we want to use our skills in a young, blossoming community.
I'd say that intelligence variations are more visible in (modern) humans, not that they're necessarily larger.
Let's go back to the tribal environment. In that situation, humans want to mate, to dominate/be admired, to have food and shelter, and so on. Apart from a few people with mental defects, the variability in outcome is actually quite small - most humans won't get ostracised, many will have children, only very few will rise to the top of the hierarchy (and even there, tribal environments are more egalitarian than most, so the leader is not that much different from the others). So we might say that the variation in human intelligence (or social intelligence) is low.
Fast forward to an agricultural empire, or to the modern world. Now the top minds can become god emperor, invading and sacking other civilizations, or can be part of projects that produce atom bombs and lunar rockets. The variability of outcomes is huge, and so the variability in intelligence appears to be much higher.
the question I want to ask about is what is it about humans, culture and brains that allows for such high variance within the species, that isn't true about mice and chimps?
Some points to consider:
1. Has it been demonstrated that variation in intelligence is that much greater for humans than for mice or chimps? This may be true, but you didn't indicate any references.
Whereas I could imagine a test for chimp intelligence, and even timed maze experiments on mice, the concept of what we mean by intelligence becomes increasingly attenuated as we deal with ever simpler life forms; so that, at some point, and maybe even quite early, experts will begin disagreeing on what they are trying to measure.
2. Modern-day humans have a big advantage not only over other animals, but also over our cognitively equivalent ancestors of 12+ thousand years ago. Thanks to the invention of culture, we pass knowledge to our offspring, meaning that knowledge can be accumulated from generation to generation. Variation in cognitive performance isn't only a consequence of variation in intelligence, but also reflects large differences in the quality of acculturation.
3. I wonder if your decision to compare interspecies variations in intelligence follows from a mistaken analogy. Consider that intelligence is a human specialty. Other species have their own specialties. For instance, maybe we should be comparing variations in human intelligence with variations in the maximum speed of healthy adult cheetahs. (I wonder whether anyone has ever done this?)
4. The idea that we can assign a number to the variation in human intelligence is suspect. True, we can claim that the standard deviation in IQ is 15% of the average IQ. But it doesn't follow that a +1-sigma individual is 15% smarter than an average individual, because the IQ scale itself is arbitrary, and intelligence has never been defined apart from performance on the test. To put it another way, a 1-sigma variation in intelligence was set to 15 IQ points purely for convenience. We might just as well have set the mark at 900 IQ points, but that wouldn't mean that a +1-sigma individual was then ten times as intelligent as average.
Compare the situation with the cheetahs, where a statement like "the ratio between the standard deviation in maximum running speeds and the average individual's maximum running speed is 0.15" really means something in terms of performance that can be measured with a marked-out track and a stopwatch.
What would be needed to put IQ and maximum speed on a par would be credible results showing that a certain superiority in IQ is closely connected with a certain improved ability to raise fertile offspring to maturity, which is the definition of evolutionary success.
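The arbitrariness of the IQ scale in point 4 can be made concrete with a small sketch (the numbers are illustrative, not drawn from any dataset): an IQ score is just an affine transform of a standard score, so "percent above average" comparisons depend entirely on the chosen mean and standard deviation.

```python
# An IQ score is an affine transform of a z-score: IQ = mean + sd * z.
# Ratios between IQ values therefore depend on the arbitrary choice of
# mean and sd, so "15% smarter" is not a scale-independent claim.
def iq_from_z(z, mean=100, sd=15):
    return mean + sd * z

z = 1.0  # the same +1-sigma individual, expressed on two scales
conventional = iq_from_z(z, mean=100, sd=15)   # 115.0
alternative = iq_from_z(z, mean=100, sd=900)   # 1000.0

# "Percent above average" changes by an order of magnitude with the
# scale choice, though the person's rank in the population is identical.
print(conventional / 100)  # 1.15
print(alternative / 100)   # 10.0
```

The same z-score maps to "15% above average" or "10x average" depending on an arbitrary convention, which is the commenter's point: the ratio carries no information about performance.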
I think intelligence is much like homosexuality ...
... in that, it mostly benefits the tribe/gene-pool, but not the individual.
Being of average intelligence, you are more intelligent than a good portion of the population, and that helps you, just as being below average might be a hindrance in some situations. But being far more intelligent than average does not help the individual much.
One does not have to be intelligent to profit from the intelligence of others. "We flew to the moon." No, *we* did not. We did not discover antibiotics, but we have much greater breeding success because of them.
The fact that someone does not understand calculus, does not imply that they are incapable of understanding calculus. They could simply be unwilling. There are many good reasons not to learn calculus. For one, it takes years of work. Some people may have better things to do. So I suggest that your entire premise is dubious - the variance may not be as large as you imagine.
Personally, I learned a semester worth of calculus in three weeks for college credit at a summer program (the Purdue College Credit Program circa 1989, specifically) when I was 16. Out of 20ish students (pre-selected for academic achievement), about 15% (see note 1) aced it while still goofing around, roughly 60% got college credit but found the experience difficult, and some failed. Two years later, my freshman roommate (note 2) took the same Purdue course over 16 weeks and failed it. The question isn't "why don't some people understand calculus", but "why do some people learn it easily while others struggle, often failing".
Note 1: This wasn't a statistically robust sample. "About 15%" means "Chris, Bill, and I".
Note 2: That roommate wanted to be an engineer and was well aware that he could only achieve that goal by passing calculus. He was often working on his homework at 1:30 am, much to my annoyance. He worked harder on that course than I had, despite being 18 years old and having a (presumably) more mature brain.
I learned a semester worth of calculus in three weeks
I'm assuming this is a response to my "takes years of work" claim. I have a few natural questions:
1. Why start counting time from the start of that summer program? Maybe you had never heard of calculus before that, but you had been learning math for many years already. If you learned calculus in 3 weeks, that simply means that you already had most of the necessary math skills, and you only had to learn a few definitions and do a little practice in applying them. Many people don't already have those skills, so naturally it takes them a longer time.
2. How much did you learn? Presumably it was very basic, I'm guessing no differential equations and nothing with complex or multi-dimensional functions? Possibly, if you had gone further, your experience might have been different.
3. Why does speed even matter? The fact that someone took longer to learn calculus does not necessarily imply that they end up with less skill. I'm sure there is some correlation but it doesn't have to be high. Although slow people might get discouraged and give up midway.
My point isn't that there is no variation in intelligence (or potential for doing calculus), but that there are many reasons why someone would overestimate this variation and few reasons to underestimate it.
1) True, but by the time that roommate took the class, he had math foundations comparable to what I had had when I took it. Considering the extra years, arguably rather more. (Upon further thought, I realized that I had taken the class in 1988, at the age of 15.)
2) That was first-semester calc, Purdue's Math 161 class (for me and the roommate). Intro calc. Over the next two years I took two more semesters of calc, one of differential equations, and one of matrix algebra. By the time I met my freshman roommate (he was a bit older than me) and he started the calc class, I'd had five semesters of college math (which was all I ever took b/c I don't enjoy math). Also, that roommate was a below-average college student, but there are people in the world with far less talent than he had.
3) Because time is the only thing you can't buy. Time in college can be bought, but not cheaply even then. I got through school with good grades and went on to grad school as planned; his plans didn't work out. Of course time marched on and I had failures of my own.
I agree that there's more to success than one particular kind of intelligence. Persistence, looks, money, luck, and other factors matter. But my roommate's calculus aptitude was a showstopper for his engineering ambitions, and I don't think his situation was terribly uncommon.
As such, if you observe yourself to be in a culture that is able to reach technological maturity, you're probably "the stupidest such culture that could get there, because if it could be done at a stupider level then it would've happened there first."
Who first observed this? I say this a lot, but I'm now not sure if I first thought of it or if I'm just quoting well-understood folklore.
As for individual intelligence gaps, specifically in animals, I would say they are still pretty comparable to humans'. The difference is the intelligence level each species is working with. For instance, I've met a lot of dogs. Some of them actually surprise me with how smart they are, and other dogs are just dumb but adorable animals. Same for mice and other rodents. And same for humans.
Humans won the evolution lottery in terms of industrial and cognitive ingenuity, and when we compare ourselves to other animals it seems like there is a smaller gap between different individuals' intelligences within a species. But it probably scales down the same way.
As for the rest of it, I broadly agree. Thinking about how different potentially technologically mature species get there was quite fun as well.
Every dominant culture has essentially vanished in time, whereas the few cultures that still exist outside industry and technology have persisted much as they did since long before the fall of Rome. Many would say that survival and replication is the greatest accomplishment of all. Rockets are impressive but the resources necessary for their production are being consumed much faster than they can be replenished. The gene pool necessary to support those who understand calculus - to produce them and feed them, and clothe them, and carry away their waste will, in all likelihood, lead to the demise of the culture. There will be isolated tribes of hunter gatherers until the sun engulfs them all.
Summary of the 10 findings in the linked paper (Plomin et al 2016):
They show that all of these have large effect sizes and are well replicated, except where noted below. I notice that the authors cite themselves a lot as support for many of these claims. I am not an expert in any of this, so if they're trying to push controversial ideas as widely accepted, I wouldn't be able to see through it.
1. Significant genetic influence is ubiquitous in cognitive and psychological traits. Intelligence has about 50% heritability. Twin studies show intelligence correlation about 0.85 in identical twins vs 0.6 in fraternal twins.
2. Although basically all psychological traits have some heritability (typically 30-50%), none of them come close to 100% heritability. Contrast this with physical traits like height, which has about 90% heritability.
3. Heritability of complex traits is caused by many genes of small effect that add up. Example: the tendency for open-field activity in mice shows a linear response to selection pressure over 30 generations, rather than the clear separation that would occur if it were controlled by just a few genes. Genome-wide association studies examine hundreds of thousands or millions of nucleotides covering most of the genome to detect population associations between a single-nucleotide polymorphism and a trait. They generally find that even the most significant genetic variants by themselves have tiny effects (far less than 1% of variation).
4. Correlations between traits are usually caused largely by genetics. For example, the strong correlation between types of intelligence (R=0.76 between reading and math) is due more to genetics than environment (the reading/math correlation is about 64% genetic). Anxiety and depression are correlated entirely for genetic reasons (they are affected by all the same genes). The schizophrenia/bipolar connection is largely genetic too, as is neuroticism/depression. Another finding (not yet replicated) is that the correlation of 0.3 between exercise behavior and attitudes toward exercise is 70% genetic; I interpret this to mean that most of the genetic influence on exercise behavior is caused by the influence of those same genes on attitudes toward exercise.
5. Counterintuitively, heritability of intelligence increases linearly throughout development (from 41% at age 9 to 66% at age 17 in one twin study, and maybe as high as 80% in adulthood).
6. Stability of traits from age to age is largely due to genetics; changes that occur with age are largely environmental. So then how does the heritability of intelligence increase over time? The authors suggest "genetic amplification": genetic nudges early in development get magnified as time goes by, perhaps due to genotype-environment correlation (kids choose or create environments that match their propensities). Some evidence supports this idea, but it may vary depending on the culture. The authors do not seem to consider genetic amplification one of the "replicated findings" noted in the title.
7. Most measures of the 'environment' show significant genetic influence. This is a generalization of the genotype-environment correlation noted above for intelligence. Parenting, social support, and some life events seem to be causally affected by a child's genetics (not just correlated); this can be shown in twin studies. Same goes for school and work environments. Heritability averages 0.27. This again varies with culture; parenting is more affected by the child's genetics in Japan than in Sweden. A child's genetics have even been shown to have some effect on the family's socioeconomic status.
8. Most associations between environmental measures and psychological traits are significantly mediated by genetics. Since genetic factors affect environmental measures as well as behavioral measures, we should not assume that correlations between parenting and children’s behavior are caused entirely by the environmental effect of parenting on children’s behavior. For instance, correlation between a child's developmental index and measures of their home environment is stronger for genetically related families (0.44) than adopted families (0.29). So, much of what appears to be the effect of parenting on behavior is actually effect of the parents' and child's shared genetics on both the behavior and the environment. Disentangling genetic and environmental influences is important because it allows us to tailor interventions more effectively.
9. Most environmental effects are not shared by children growing up in the same family: salient experiences are specific to each child. Similarity among siblings is mainly due to shared genetics. Non-shared environment has a bigger effect on phenotypic variance than shared environment does. Shared environment between siblings (including going to the same schools) accounts for 10-15% of variance in academic achievement. Shared environment's effect on intelligence decreases after adolescence. Specific non-shared environmental effects are hard to identify, and are likely due to additive effects of many seemingly inconsequential experiences.
10. Abnormal is normal: quantitative genetic methods suggest that common psychological disorders are the extremes of the same genetic factors responsible for heritability throughout the distribution. Reading disabilities, for instance, have been shown to have strong "group heritability," indicating a genetic link between the disorder and normal variation in quantitative measures of reading ability. This is supported by the finding that many genes of small effect determine heritability of traits (finding 3); polygenic scores that sum these effects are normally distributed. An interesting exception involves severe intellectual disability (IQ < 70), which this type of analysis suggests is etiologically distinct from the normal distribution of intelligence (no significant group heritability).
The authors suggest that the above findings have replicated because: the controversy of the nature/nurture debate has motivated bigger and better studies; behavioral genetics has historically used better statistical methods than much of psychology, partly because studies often have to be observational rather than experimental; focusing on the net effects of genetics and environment is more reliable than studying specific genes (polygenic scores work better); there are better incentives and opportunities (data) for replications; and because genetic effect sizes are larger than other factors studied in psychology (e.g. sex differences generally account for less than 1% of variance on psychological traits). Many of these advantages cannot easily transfer to other fields.