Why did computer science get so galaxy-brained?

post by NicholasKross · 2021-12-27T08:50:13.579Z

This is a question post.

Not a joke question.

Was it early military funding? Then why didn't nuclear power become The Big Smart Field/Industry? Where's the Google of nuclear fusion? Why isn't a double-digit percentage of LW users nuclear physicists?

Was it von Neumann and other smart scientists' early involvement? Then again, why didn't nuclear (or some obscure area of math besides CS) get big?

Was it universal applicability? Perhaps. (Then again, they used to think we'd have cars with fusion reactors in them... and why didn't steam power turn into a superweapon program?)

Cold War paranoia/global-impact/apocalyptic mindset? Rise of world population alerting people to scale? Availability of the Commodore 64 to children?

My main guess is "universalizability", but I'd also like to know about historical factors and/or other structural features unique to the field that may have caused this.

Answers

answer by lsusr · 2021-12-27T09:04:37.846Z

There are two questions here. One is about computer science vs the rest of math. The other is about computer science vs nuclear energy. I'm going to answer the computer science vs nuclear energy question.

The reason software advances faster than nuclear power is cost. Nuclear reactors are big, expensive, dangerous and highly regulated. Digital computers are small, cheap, safe and unregulated.

The reason we need more programmers than nuclear scientists is complexity. Computers must be programmed differently for every application. Nuclear reactors don't need to be.

  • Nuclear energy has just one use: energy. Basically all of our machines are powered by electricity. If you build a nuclear reactor that can produce electricity then you're done. You don't have to design a special reactor just for farm equipment.
  • Digital computers are control systems. Different control systems must be programmed differently for each application. The instructions you give to a robot car are different from the instructions you give to a robot surgeon.

comment by NicholasKross · 2021-12-28T01:05:35.091Z

Good point; the control-system aspect is another side of the universalizability thing.

answer by davidad · 2021-12-27T17:25:02.450Z

In terms of "what economic sector is most aimed-at by top STEM students at top US universities," before the Internet sector it was investment banking, and before investment banking it was management consulting. If the 1967 film "The Graduate" is to be believed, back then it was materials science ("plastics"). I think this is most influenced by the 90th or 95th percentile early-career salary in a sector combined with the size of demand for such high-end workers in that sector. Software Developer is a Pareto-optimal point on those dimensions; per BLS statistics from 2020, the 90th percentile salary is $170,100 and there are over 147,000 jobs paying that much or more. Compare that with Mathematician, where the 90th percentile salary is almost exactly the same ($170,150) but there are only 246 such jobs, or Nuclear Engineer, where the 90th percentile salary is $185,550 but there are only 1,570 such jobs.
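
A minimal sketch of that Pareto comparison, using only the BLS 2020 figures quoted above (the `occupations` dictionary and `dominates` helper are illustrative names, not from the original answer):

```python
# Occupation -> (90th-percentile salary, number of jobs at or above it),
# per the BLS 2020 figures cited above.
occupations = {
    "Software Developer": (170_100, 147_000),
    "Mathematician": (170_150, 246),
    "Nuclear Engineer": (185_550, 1_570),
}

def dominates(a, b):
    """True if a is at least as good as b on both dimensions and strictly
    better on at least one (i.e., a Pareto-dominates b)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# An occupation is Pareto-optimal if no other occupation dominates it.
frontier = [
    name for name, stats in occupations.items()
    if not any(dominates(other, stats)
               for other_name, other in occupations.items() if other_name != name)
]
print(frontier)  # ['Software Developer', 'Nuclear Engineer']
```

Mathematician drops off the frontier (Nuclear Engineer beats it on both salary and job count), while Software Developer stays on it with roughly a hundredfold more high-paying jobs than Nuclear Engineer.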

The natural next question is why the economy has so much demand for high-priced software developers. I think you're onto something with "universalizability". There's tremendous potential economic value in computing infrastructure (namely, the digital-circuitry and digital-communications technological-bases), which can be manifested in a bunch of different ways, each of which needs the specific attention of dedicated software developers. Those people need to be moderately talented and moderately educated, but only very slightly experienced, to catalyze the realization of a lot of economic value. That's a recipe for a large pool of high-paying new-grad jobs—currently more so than any other occupation—and that's what is attracting a substantial fraction of the human resource flow from top universities.

I don't think it has much to do with early founders or funders. As another commenter pointed out, it's about the economics: the tools and materials that are necessary inputs to the Internet industry keep getting cheaper, the potential economic value of that technology keeps getting greater, and the human skills and training required to turn those inputs into those outputs keep being moderate (high enough to support a high wage, but low enough that young people can see how they'd get there within their time horizon).

comment by NicholasKross · 2021-12-28T01:05:02.376Z

Good, thorough, and basic. Thank you!

answer by Dagon · 2021-12-27T16:49:13.789Z

Note: I'm not sure what "galaxy-brained" means, so I'm not sure what aspect of software eating the world (can't find a good free link; the phrase is from a 2011 WSJ op-ed by Marc Andreessen) surprises you.

I think it's mostly because we live in a mechanistic universe, and being able to calculate/predict things with a fair amount of precision is incredibly valuable for almost all endeavors.  I doubt it's path-dependent (doesn't matter who invented it or which came first), more that software is simply a superset of other things.

BTW, this ship has sailed, but it still bugs me when people mix up "computer science" with "software development and usage".  They're not the same at all.  I suspect you're conflating the science behind nuclear power with the actual industry of power generation in the same way, which also makes no sense.

Academic science and math research remains a tiny part of the knowledge-based workforce.  Industrial use and application of that science is where almost all of the action is.  THAT distinction has good reason - there are many many more people who can understand and use a new formulation of knowledge than who can discover and formalize one.

comment by NicholasKross · 2021-12-28T01:10:20.427Z

Counterpoint: knowing nuclear physics helps at least somewhat with nuclear power generation. Same with academic CS and real-life software-engineering problem-solving.

"Galaxy-brained" in this context is a little hard to define, but I'd extrinsically define it as "Cold War paranoia giant datacenters complicated plots Death Note Greyball anything HPMOR!Harry comes up with sweaty palms any evil plot that makes you go 'DAMN that was clever and elegant'". (I may eventually create a post or website fleshing out this idea cluster in more detail).

answer by lincolnquirk · 2021-12-27T18:16:03.053Z

I think there's something about programming that attracts the right sort of people. What could that be? Well, programming has very tight feedback loops, which make it fun. You can "do a lot": one's ability to gain power over the universe, if you will, is quite high with programming. I'd guess a combination of these two factors.

comment by NicholasKross · 2021-12-28T01:06:56.167Z

I think the feedback loop is underrated (see also the same question recast as "Why did video games get so advanced compared to consumer/B2B/AI software for a long time?"). GPUs started out as gaming hardware partly because games are fun to play (and making them is, if not nearly as fun as playing them, at least potentially much more fun than making other types of software).
