CronoDAS's Shortform

post by CronoDAS · 2024-03-17T05:23:06.375Z · LW · GW · 10 comments

comment by CronoDAS · 2024-03-17T05:23:06.480Z · LW(p) · GW(p)

My father thinks that ASI is going to be impractical to achieve with silicon CMOS chips because Moore's law is eventually going to hit fundamental physical limits - such as feature sizes approaching the width of individual atoms - and the hardware required to create it would end up "requiring a supercomputer the size of the Empire State Building and consuming as much electricity as all of New York City".

Needless to say, he has very long timelines for generally superhuman AGI. He doesn't rule out that another computing technology could replace silicon CMOS; he just doesn't think ASI would be practical unless that happens.

My father is usually a very smart and rational person (he is a retired professor of electrical engineering) who loves arguing, but I suspect that he is seriously overestimating the computing hardware it would take to match a human brain. Would anyone here be interested in talking to him about it? Let me know and I'll put you in touch.

Update: My father later backpedaled and said he was mostly making educated guesses on limited information, that he really doesn't know very much about current AI, and that he isn't interested enough to talk to strangers online about it - he's in his 70s and figures that if AI does eventually destroy the world, it probably won't be in his lifetime. :/

Replies from: quetzal_rainbow, ryan_greenblatt, lahwran, carl-feynman
comment by quetzal_rainbow · 2024-03-17T05:32:15.714Z · LW(p) · GW(p)

You can mention Portia, which can emulate mammalian predators' behavior using a much smaller brain.

Replies from: lahwran
comment by the gears to ascension (lahwran) · 2024-03-17T08:16:32.620Z · LW(p) · GW(p)

which? https://en.wikipedia.org/wiki/Portia

Replies from: niplav, quetzal_rainbow
comment by niplav · 2024-03-17T09:18:49.770Z · LW(p) · GW(p)

Portia spiders.

comment by quetzal_rainbow · 2024-03-17T09:28:19.340Z · LW(p) · GW(p)

I mean spiders.

comment by ryan_greenblatt · 2024-03-18T04:13:35.501Z · LW(p) · GW(p)

This report by Joe Carlsmith on How Much Computational Power Does It Take to Match the Human Brain? [LW · GW] seems relevant.

I think this is a sufficient crux, i.e. his views imply disagreement with this report.

The main issue with this report is that it doesn't seriously take into account memory bandwidth constraints (from my recollection), but I doubt this affects the bottom line that much.

comment by the gears to ascension (lahwran) · 2024-03-17T08:53:11.496Z · LW(p) · GW(p)

requiring a supercomputer the size of the Empire State Building and consuming as much electricity as all of New York City

why does he think that is unlikely to occur? such things seem on the table. existing big supercomputers are very, very big already. I've asked several search engines and AIs and none seem able to give a straight answer about exactly how big a datacenter housing one of these would be, but claude estimates:

Frontier: 5,000-8,000 square feet (70% confidence)
Eagle: 6,000-9,000 square feet (70% confidence)

comment by Carl Feynman (carl-feynman) · 2024-03-18T00:57:07.882Z · LW(p) · GW(p)

I’d be delighted to talk about this.  I am of the opinion that existing frontier models are within an order of magnitude of a human mind, with existing hardware.  It will be interesting to see how a sensible person gets to a different conclusion. 

I am also trained as an electrical engineer, so we’re already thinking from a common point of view.

Replies from: CronoDAS, CronoDAS
comment by CronoDAS · 2024-03-18T19:05:16.172Z · LW(p) · GW(p)

I brought it up with him again, and my father backpedaled and said he was mostly making educated guesses on limited information, that he really doesn't know very much about current AI, and that he isn't interested enough to talk to strangers online - he's in his 70s and figures that if AI does eventually destroy the world, it probably won't be in his lifetime. :/

comment by CronoDAS · 2024-03-18T04:23:50.909Z · LW(p) · GW(p)

He might also argue "even if you can match a human brain with a billion-dollar supercomputer, it still takes a billion-dollar supercomputer to run your AI, and you can make, train, and hire an awful lot of humans for a billion dollars."
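
For a rough sense of scale, here is a minimal back-of-envelope sketch of that comparison; every dollar figure below is an illustrative assumption of mine, not a number anyone in the thread gave.

```python
# Back-of-envelope sketch (all figures are illustrative assumptions, not from
# the thread): amortize a $1B supercomputer over a guessed lifetime, add a
# guessed operating cost, and compare against fully loaded human labor cost.

supercomputer_cost = 1_000_000_000   # USD, assumed purchase price
amortization_years = 5               # assumed useful lifetime
annual_power_and_ops = 50_000_000    # USD/year, assumed power + operations
worker_cost_per_year = 200_000       # USD/year, assumed fully loaded cost

annual_ai_cost = supercomputer_cost / amortization_years + annual_power_and_ops
equivalent_workers = annual_ai_cost / worker_cost_per_year
print(f"one brain-scale AI costs about {equivalent_workers:,.0f} worker-years per year")
```

On those made-up numbers, the machine has to be worth on the order of a thousand skilled employees per year just to break even, which is the shape of the argument he would be making.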