The Simple Math of Everything
post by Eliezer Yudkowsky (Eliezer_Yudkowsky)
I am not a professional evolutionary biologist. I only know a few equations, very simple ones by comparison to what can be found in any textbook on evolutionary theory with math, and on one memorable occasion I used one incorrectly. For me to publish an article in a highly technical ev-bio journal would be as impossible as corporations evolving. And yet when I'm dealing with almost anyone who's not a professional evolutionary biologist...
It seems to me that there's a substantial advantage in knowing the drop-dead basic fundamental embarrassingly simple mathematics in as many different subjects as you can manage. Not, necessarily, the high-falutin' complicated damn math that appears in the latest journal articles. Not unless you plan to become a professional in the field. But for people who can read calculus, and sometimes just plain algebra, the drop-dead basic mathematics of a field may not take that long to learn. And it's likely to change your outlook on life more than the math-free popularizations or the highly technical math.
Not Jacobian matrices for frequency-dependent gene selection; just Haldane's
calculation of time to fixation. Not quantum physics; just the wave
equation for sound in air. Not the maximum entropy solution using
Lagrange multipliers; just Bayes's Rule.
The Simple Math of Everything, written for people who are good at math, might not be all that weighty a volume. How long does it take to explain Bayes's Rule to someone who's good at math? Damn would I like to buy that book and send it back in time to my 16-year-old self. But there's no way I have time to write this book, so I'm tossing the idea out there.
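To show how little space Bayes's Rule actually takes up, here is the whole rule applied to a made-up diagnostic-test example. The 1% base rate, 90% sensitivity, and 5% false-positive rate are invented for illustration; only the rule itself is standard:

```python
# Bayes's Rule: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical numbers: a condition with a 1% base rate, a test with
# 90% sensitivity and a 5% false-positive rate.
prior = 0.01           # P(condition)
sensitivity = 0.90     # P(positive | condition)
false_positive = 0.05  # P(positive | no condition)

# P(positive), by the law of total probability
p_positive = sensitivity * prior + false_positive * (1 - prior)

# P(condition | positive)
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # 0.154, not 0.9
```

The rule fits on one line, and the surprising part (a positive test still leaves roughly an 85% chance of no condition) comes entirely from the base rate.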
Even in reading popular works on science, there is yet power. You don't want to end up
like those poor souls in that recent interview (which I couldn't find via Google)
where a well-known scientist in field XYZ thinks the universe is 100
billion years old. But it seems to me that there's substantially more
power in pushing until you encounter some basic math. Not complicated
math, just basic math. F=ma is too simple, though. You should take the highest low-hanging fruit you can reach.
Yes, there are sciences whose soul is not in their math, yet which are
nonetheless incredibly important and enlightening. Evolutionary
psychology, for example. But even there, if you kept pushing until you
encountered equations, you would be well-served by that
heuristic, even if the equations didn't seem all that enlightening
compared to the basic results.
I remember when I finally picked up and started reading through my copy of the Feynman Lectures on Physics, even though I couldn't think of any realistic excuse for how this was going to help my AI work, because I just got fed up with not knowing physics. And - you can guess how this story ends - it gave me a new way of looking at the world, which all my earlier reading in popular physics (including Feynman's QED) hadn't done. Did that help inspire my AI research? Hell yes. (Though it's a good thing I studied neuroscience, evolutionary psychology, evolutionary biology, Bayes, and physics in that order - physics alone would have been terrible inspiration for AI research.)
In academia (or so I am given to understand) there's a huge pressure
to specialize, to push your understanding of one subject all the way
out to the frontier of the latest journal articles, so that you can
write your own journal articles and get tenure. Well, one may certainly have to learn the far math of one field, but why avoid the simple math of others? Is it too embarrassing to learn just a little math, and then stop? Is there an unwritten rule which says that once you start learning any math, you are obligated to finish it all? Could that be why the practice isn't more common?
I know that I'm much more embarrassed to know a few simple equations of physics, than I was to know only popular physics. It feels wronger to know a few simple equations of evolutionary biology than to know only qualitative evolutionary biology. Even mentioning how useful it's been seems wrong, as if I'm boasting about something that no one should boast about. It feels like I'm a dilettante - but how would I be diletting less if I hadn't studied even the simple math?
Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).
comment by Pete_Carlton ·
2007-11-17T23:17:59.000Z · LW(p) · GW(p)
But there's no way I have time to write this book, so I'm tossing the idea out there.
Would you have time to start a wiki whose purpose was to be edited into a book, coauthored by dozens of contributors, who can explain the basic simple math of their field to non-math-phobic laypeople? (This is different from just scraping Wikipedia; these would be targeted articles, perhaps some invited ones...) Of course that could end up taking more time due to the infamous herding cats problem. But I'd love to have that book to read on the BART train.
comment by Psy-Kosh ·
2007-11-18T00:46:27.000Z
Pete: I was just thinking the same thing, that we ought to start a wiki to do this project. Questions do come up, though, like "where ought one draw the line between the simple and nonsimple?" This question relates even to billswift's comment about the name.
For instance, in physics, ought we include Hamilton's equations/the Hamiltonian? There's certainly understanding to be found by considering a system in those terms. But deriving those and so on is probably a bit deeper than what one might want to consider "easy math"... or maybe not. They are in some ways the starting point that leads to the deep stuff.
There are probably analogous questions in other fields. So we have to decide what we're going to consider the "easy" math.
↑ comment by taryneast ·
2010-12-19T13:49:14.102Z
"where ought one draw the line between the simple and nonsimple"?
My suggestion would be not to draw the line... but to grade things on how hard they are (fundamental, basic, intermediate...).
That way, anybody can start, and can stop at any time they want to...
comment by Robin_Hanson2 ·
2007-11-18T00:50:33.000Z
I remember reaching exactly this point and making exactly this wish many years ago. I tried to learn as many fields as I could by reading introductory textbooks, and most of those texts avoid any math. A text willing to use simple math could have taught me a lot more, a lot faster. My theory was that too few people both can handle simple math and want to learn many fields, so the market couldn't support such a book. But I'd love to be shown wrong.
comment by Miguel2 ·
2007-11-18T01:22:55.000Z
That's a GREAT idea. I've been trying to do the same as Robin, but the availability of good textbooks is somewhat limited where I live (and they're quite expensive to import). A volume containing the introductory math for many fields would make things much easier, and I'd certainly be buying it.
comment by Felix2 ·
2007-11-18T01:34:50.000Z
Is a Wiki separate from Wikipedia needed?
Similar problem: one thing I often run into on Wikipedia is entries that use the field's particular mathematical notation for no reason other than that its symbols and expressions are the jargon of the field. They get in the way of understanding what the entry is saying.
Another similar problem: there seem to be academic papers with practical applications that are nonetheless written as unclearly as possible, perhaps to take on that "important" sheen, perhaps simply because the authors are so deep in their own jargon that they assume all readers know everything they know. Consider papers in the AI field. :)
comment by Douglas_Knight3 ·
2007-11-18T05:27:57.000Z
I don't think most people feel more ashamed of knowing a little than knowing nothing; they just don't try.
But, Eliezer's shame reminds me of the story where Feynman is having trouble learning something, and his wife tells him to read like a beginner again. I believe it is a common speculation that people avoid learning new things to avoid feeling like a beginner.
comment by michael_vassar3 ·
2007-11-18T06:16:05.000Z
My guess is that most people simply don't know that knowing the math is important to understanding a subject. Until you have some technical understanding of a subject it may seem that a non-technical understanding is all there is.
comment by Eric_B. ·
2007-11-18T11:09:59.000Z
I am also a lurker/admirer of this site and I would love to have such a book!
I will be watching this topic and the wikipedia linked to, hoping something comes of it. Eventually I will put up simple neuroscience equations.
comment by billswift ·
2007-11-18T13:55:08.000Z
A little learning is not a dangerous thing to one who does
not mistake it for a great deal.
William A White
Quoted in Ronald Gross's Independent Scholar's Handbook.
Which, unfortunately, is not particularly useful for technical fields.
comment by mtraven ·
2007-11-18T18:02:57.000Z
Didn't Stephen Hawking say that his publisher told him that every equation he put in his book would halve its sales? That's why real math doesn't make it into most popular science books, and one reason there's a band-gap between narrative science and professional texts. It would be nice to have it filled, I agree.
comment by can_kurtulus ·
2007-11-18T19:18:47.000Z
There are some laudable attempts at such a book; the first one that comes to mind is The Computational Beauty of Nature. Although it covers only a few fields, it's still a great book for the "not-afraid-of-a-few-basic-equations" crowd. Wish there were more books like that.
comment by Steve2 ·
2007-11-19T01:41:28.000Z
A little knowledge can be more dangerous - and embarrassing - than complete ignorance.
Yes. As a math professor, I sort of agree and sort of disagree with this post. On the one hand, people have lots of misunderstandings about math, as people like John Allen Paulos have written. But on the other hand, it's NOT true that everything has a simple mathematical model. Often mathematical models that might be useful in physics are not especially useful elsewhere, and even more often the most important thing is not the model's predictions, but the errors.
Look at the Social Security model, for example. It's incredibly unreliable, because it makes long-term predictions based on a single parameter (average growth of GNP) which is assumed to be constant over 40 years. And varying this single, widely fluctuating number shifts the predictions by 10-20 years.
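The sensitivity Steve describes is itself a piece of embarrassingly simple math. With invented growth rates (the 1.5% and 3.0% below are hypothetical, not the actual Social Security Administration assumptions), compounding over 40 years gives:

```python
# How much does the assumed growth rate matter over a 40-year horizon?
# The 1.5% and 3.0% rates are hypothetical, chosen only to show the effect.
years = 40
low = 1.015 ** years   # 1.5% annual growth, compounded
high = 1.030 ** years  # 3.0% annual growth, compounded
print(round(low, 2), round(high, 2))  # 1.81 vs 3.26
```

A 1.5-point disagreement in one assumed parameter compounds into a factor-of-1.8 disagreement in projected output, which is exactly how a 40-year forecast comes to drift by decades.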
But the problem is that a few people think they know the math here and think they understand the situation completely because of it. In fact they know a tiny bit of math (or trust that other people know the math), and end up doing incredibly stupid things because of it. If they actually knew more, they would be a lot more careful with things like personal accounts and such. Instead we trust a few political appointees, process a couple of the numbers involved, and base everything on that.
And if you disagree with me about personal accounts for Social Security, and just think I'm a liberal who shouldn't be taken seriously, consider the Doomsday argument (http://en.wikipedia.org/wiki/Doomsday_argument) instead. It uses statistics (which most people don't understand) to make a trivial prediction with absurd consequences, and it gets taken seriously. People with a little understanding of statistics will take it seriously, but people who actually understand the limitations of statistics will realize it's ridiculous.
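For readers who haven't met it, the arithmetic behind the Doomsday argument (in Gott's 95% form) really is trivial, which is part of the point being argued here; the 60-billion figure is the usual rough estimate of humans born to date:

```python
# Gott-style Doomsday argument: with 95% confidence, assume you are not
# among the first 5% of all humans who will ever be born.
births_so_far = 60e9  # rough conventional estimate of humans born so far
upper_bound_total = births_so_far / 0.05
print(upper_bound_total)  # 1.2e12: at most ~1.2 trillion humans, ever
```

Whether that one-line calculation reveals a deep truth or commits a subtle statistical error is precisely what this thread goes on to dispute.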
↑ comment by Emile ·
2011-10-31T09:45:26.659Z
But the problem is that a few people think they know the math here and think they understand the situation completely because of it. In fact they know a tiny bit of math (or trust that other people know the math), and end up doing incredibly stupid things because of it.
Agreed, but people with enough experience of the limits of simple mathematical models in one field are less likely to make that mistake in other fields.
A hypothetical "The Simple Maths of Everything" textbook should include warnings about the limits of the models, and a few memorable examples of how those models go wrong.
comment by Chris2 ·
2007-11-19T02:51:40.000Z
Steve, would you care to elucidate what's ridiculous about the Doomsday argument? I'd be especially interested in an explanation based on the "limitations of statistics" as opposed to a hand-waving argument. The Doomsday argument strikes most people as absurd on its face, and yet it's surprisingly resistant to refutation. My own opinion is that it's not absurd at all, and is among the ideas that reveal a deep truth about reality.
comment by Caledonian2 ·
2007-11-19T04:11:01.000Z
Well, the obvious point is that the Copernican Principle is frequently wrong. The Anthropic Principle does a fairly good job at pointing out the weaknesses of the CP, to start with, and remembering that all else is rarely equal takes care of most of the rest.
comment by douglas ·
2007-11-19T07:59:10.000Z
The math of a subject is only valuable when one understands the basic terminology of the subject. As Chris points out, knowing when to use statistics (the basic assumptions and what the word applies to) makes something like the Doomsday Argument good for a laugh. It is ridiculous.
On evolutionary biology--
Evolution is defined as "any change in the frequency of alleles within a gene pool from one generation to the next."
This frequency changes with each birth. So to make the definition into regular English we could say
Evolution is defined as "living things reproduce" (the fact of evolution).
In modern evolutionary genetics, natural selection is defined as "the differential reproduction of genotypes (individuals of some genotypes have more offspring than those of others)".
In English- some cats have more babies than other cats.
So the statement "It is a fact that some cats have more babies than other cats," would be the proof of evolution by natural selection as the terms are currently defined.
Doesn't that help more than a mathematical equation?
↑ comment by gershom ·
2011-11-12T08:32:45.989Z
Evolution is defined as "any change in the frequency of alleles within a gene pool from one generation to the next." This frequency changes with each birth. So to make the definition into regular English we could say Evolution is defined as "living things reproduce" (the fact of evolution).
This doesn't follow.
comment by g ·
2007-11-19T10:04:34.000Z
Douglas, if all you say is "some cats have more babies than other cats" then you have missed out the key element of heritable variation and therefore haven't said anything about evolution by natural selection.
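g's point can be turned into exactly the kind of simple math the post advocates. Here is a minimal one-locus haploid-selection sketch; the starting frequency and selection coefficient are illustrative, and the update rule p' = p(1+s)/(1+sp) is the standard single-locus recursion:

```python
# One-locus haploid selection: a fitter allele at frequency p gains
# ground each generation via p' = p*(1+s) / (1 + s*p).
# Differential reproduction does nothing without heritable variation;
# here the variation IS the pair of competing alleles.
p, s = 0.01, 0.01  # starting frequency, 1% fitness advantage
generations = 0
while p < 0.99:
    p = p * (1 + s) / (1 + s * p)
    generations += 1
print(generations)  # ~924 generations to near-fixation with these numbers
```

This is the flavor of the Haldane-style "time to fixation" arithmetic the post holds up as the embarrassingly simple math of evolutionary biology: a weakly favored allele still sweeps, but it takes on the order of a thousand generations.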
comment by Me2 ·
2007-11-19T10:50:09.000Z
If what you're proposing is something like an "Advanced Mathematical Principles for Dummies", I think you have a great idea.
You say you don't have the time, but you could probably gather a few people to put something together: 4-5 people writing two chapters each. The "Dummies" folks would probably publish something like that. I'd consider buying it.
comment by Rob_Sayers ·
2007-11-19T12:57:50.000Z
I've been reading a book similar to what you have in mind, I think: Mathematics: From the Birth of Numbers (http://www.amazon.com/Mathematics-Birth-Numbers-Jan-Gullberg/dp/039304002X). It starts very basic but covers all sorts of advanced topics, and it's designed for someone with no higher math background. I'm about 1/4 of the way through it and so far very impressed.
↑ comment by emhs ·
2013-12-05T22:18:18.448Z
First off, that book looks wonderful. It looks, just from the description, like it goes deeper into math itself, rather than covering the math of other fields. As delightful as math can be, I'd be much more interested in a primer on the math of all sorts of other things.
comment by Drake ·
2007-11-19T17:16:28.000Z
The dangers of a "little learning" are easily offset by pointing out the ways the relevant "simple math" fails in a given case. Cf. Feynman's (for example) use of analogies. He'd state the analogy, then point out the ways in which the analogy is wrong or misleading, the specific features that fail to map, etc. This strategy gets you the pedagogical benefits of structure mapping while minimizing the risk (that Bill Swift warns against, supra) that a little learning will be mistaken for a great deal.
comment by g ·
2007-11-20T10:55:11.000Z
Douglas, I'm not saying that there are cats that don't have heritable variation, any more than you're saying that there are cats that don't have varying numbers of offspring. I'm saying that the fact that cats have heritable variation is just as relevant to evolution as the fact that their number of offspring varies.
comment by Kat ·
2007-11-20T20:38:46.000Z
What I find embarrassing about knowing just a little bit about a subject is that outside of a formal class, there are few places to talk about it, particularly with people who will bring you further toward understanding what you've learned. If you learn a little of the mathematics of a subject, you're not interesting to the specialists, and most others won't be interested in the subject at all.
It seems easier to find a community around learning things that are less academic subjects, where you'll generally learn them in an informal structure anyhow -- cooking, crafts, foreign languages.
(I do like the idea of The Simple Math of Everything...)
comment by William_Newman ·
2007-11-20T21:48:36.000Z
If you ever get as seriously curious about electronics as you were about physics, look at Horowitz and Hill, The Art of Electronics. Very very useful for someone who already knows the math and wants to understand electronics principles and the practicalities of one-off discrete circuit design.
comment by AnthonyC ·
2011-03-29T16:59:30.287Z
I agree about the usefulness of a basic technical understanding of as many fields as possible.
As for the push to specialize in academia, well, it's complicated. I'm not a professor, I'm a grad student, but here's my experience. If you're in one of the relatively "pure" disciplines (physics, computer science, and so on), the push to specialize is very real, as is the push to focus on what everyone else (including granting agencies) thinks is "hot."
But there is a lot of multi-disciplinary work going on, an increasing amount really. The trouble is that it quickly becomes a new discipline in its own right. My alma mater now has 5 different biology majors, each of them interdisciplinary in interesting ways. My own field, materials science, encompasses the study of solids and liquids: metals, alloys, ceramics, oxides, semiconductors, polymers, and even biological materials. It can't be done unless you understand organic and inorganic chemistry, crystallography (applied group theory, really), physics (classical: strain fields, shearing forces; and quantum: Bloch waves, electronic band structure), and enough computer science to write some basic simulations.
You end up with professors working in fields that didn't exist when they started out. So they keep taking classes and reading each other's books.
comment by Yoav Ravid ·
2019-05-27T11:15:43.418Z
Has there been any progress towards this idea? I too think it would be a fantastic book and would love to read it.
edit: I see there's a wiki page regarding this idea, with some links