Comments sorted by top scores.

comment by mathemajician · 2010-02-18T13:21:14.857Z · LW(p) · GW(p)

A whole community of rationalists and nobody has noticed that his elementary math is wrong?

1.9 gigaFLOPS doubled 8 times is around 500 gigaFLOPS, not 500 teraFLOPS.

Big difference, and one that trashes his conclusion.
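As a quick sanity check of that arithmetic (a minimal Python sketch; the 1.9 gigaFLOPS and 504 teraFLOPS figures are the ones quoted in this thread):

```python
import math

cray2_flops = 1.9e9                       # Cray 2, ~1.9 gigaFLOPS (1985)
print(f"{cray2_flops * 2**8:.3g}")        # 4.86e+11: ~500 gigaFLOPS, not 500 teraFLOPS

# Doublings actually needed to reach Ranger's 504 teraFLOPS:
print(math.log2(504e12 / cray2_flops))    # ~18, not 8
```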

Replies from: RobinZ, PhilGoetz
comment by RobinZ · 2010-02-18T15:35:14.059Z · LW(p) · GW(p)

*headdesk*

Yet another reason to favor exponential notation, I think.

comment by PhilGoetz · 2010-02-18T16:01:16.739Z · LW(p) · GW(p)

Whoa!

You are right!

I originally excluded the modern supercomputers, because they are a different category of beast - not really "computers", but networks of computers. Then I included Ranger when I saw its hardware costs were only $30 million.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-02-17T19:52:51.536Z · LW(p) · GW(p)

I certainly hope that Moore's Law will end, but I don't think we could get that lucky.

Replies from: RobinZ, PhilGoetz
comment by RobinZ · 2010-02-17T20:50:32.119Z · LW(p) · GW(p)

Besides there being some further room to reduce element size, there are 3D chip stacks - between these two factors, there should be at least a couple more possible doublings of processor power down the line. I haven't run the math to estimate the theoretical limits.
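To put rough numbers on that (a back-of-the-envelope sketch; the 4x remaining linear shrink and the 8-layer stack are illustrative assumptions, not figures from the comment):

```python
import math

# Density scales roughly with the inverse square of feature size, so a
# linear shrink by a factor s contributes log2(s**2) doublings; stacking
# k layers contributes log2(k) more.
linear_shrink = 4   # assumed remaining linear shrink factor (hypothetical)
layers = 8          # assumed layers per 3D stack (hypothetical)

doublings = math.log2(linear_shrink**2) + math.log2(layers)
print(doublings)    # 4 + 3 = 7 doublings under these assumptions
```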

Replies from: SilasBarta, Jack
comment by SilasBarta · 2010-02-17T21:49:25.368Z · LW(p) · GW(p)

Well, others have computed the limits. The 3rd reference links to a full article deriving the limits permitted by the laws of physics.

Seth Lloyd calculated the computational abilities of an "ultimate laptop" formed by compressing a kilogram of matter into a black hole of radius 1.485 × 10^-27 meters, concluding that it would only last about 10^-19 seconds before evaporating due to Hawking radiation, but that during this brief time it could compute at a rate of about 5 × 10^50 operations per second, ultimately performing about 10^32 operations on 10^16 bits.
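Lloyd's numbers are internally consistent, as a quick check shows (a minimal Python sketch; the constants are the standard physical values):

```python
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

# Schwarzschild radius of one kilogram compressed into a black hole:
r = 2 * G * 1.0 / c**2
print(f"{r:.4g} m")    # ~1.485e-27 m, matching the figure above

# Total operations = rate * lifetime:
print(5e50 * 1e-19)    # 5e+31, i.e. about 10^32 operations
```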

Replies from: RobinZ
comment by RobinZ · 2010-02-18T04:09:49.231Z · LW(p) · GW(p)

You're not going to get that with silicon technology in the next twenty years, though - that's the more urgent question.

Replies from: SilasBarta
comment by SilasBarta · 2010-02-18T04:20:41.864Z · LW(p) · GW(p)

I agree -- I was just answering the question of what the ultimate, inviolable limits are, to establish when the improvements have to stop.

comment by Jack · 2010-02-17T22:43:13.502Z · LW(p) · GW(p)

I recall hearing that 3D chips would have some rather serious cooling issues but I suppose that isn't an obviously unsolvable problem.

Replies from: RobinZ
comment by RobinZ · 2010-02-18T04:08:29.274Z · LW(p) · GW(p)

"Rather serious cooling issues" is an accurate characterization, but current electronic packaging technology does very little with direct liquid cooling - there's room to take the heat out if we can crack the theoretical challenges. You would only need to boil a few grams of liquid per second to take off kilowatts of power.

comment by PhilGoetz · 2010-02-17T20:08:18.339Z · LW(p) · GW(p)

It will eventually "end" if we count switching to any non-semiconductor technology as ending it. I don't have a strong opinion as to how long it will last.

comment by knb · 2010-02-18T02:57:34.181Z · LW(p) · GW(p)

> 22 years later, in 2007, Ranger, at 504 teraFLOPS, is a mere 8 doublings beyond the Cray 2, at a price of $60M.
>
> Edited: And then, in 2008, Roadrunner, with a top speed of 1700 teraflops, a mere tripling in one year.

:3

Where you measure from matters a lot.

Replies from: orthonormal
comment by orthonormal · 2010-02-18T06:56:12.615Z · LW(p) · GW(p)

Er, that's less than 2 doublings.

Replies from: ShardPhoenix
comment by ShardPhoenix · 2010-02-18T09:12:13.144Z · LW(p) · GW(p)

In half the expected time for 1 doubling.
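Both corrections check out (a quick Python verification; the ~2-year doubling period is the conventional Moore's-law figure, not a number from this thread):

```python
import math

ranger, roadrunner = 504e12, 1700e12   # FLOPS, 2007 and 2008
print(math.log2(roadrunner / ranger))  # ~1.75 doublings, i.e. less than 2

# And the one year between the two machines is half the ~2-year
# doubling period usually quoted for Moore's law.
```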

comment by NancyLebovitz · 2010-02-18T10:35:05.012Z · LW(p) · GW(p)

It depends on what sort of thing Moore's Law is. Maybe we'd be better off if it were called Moore's Intriguing Observation.

There's economic pressure to make information processing cheaper, faster, and more efficient, but that's only important if the improvements are physically possible at present levels of knowledge.

There's economic pressure to improve batteries, but improvements aren't happening nearly as fast.

Replies from: Jack
comment by Jack · 2010-02-18T11:32:32.512Z · LW(p) · GW(p)

> It depends on what sort of thing Moore's Law is. Maybe we'd be better off if it were called Moore's Intriguing Observation.

I don't understand this distinction. Moore's Law appears to take the form of a law.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-02-18T14:38:47.348Z · LW(p) · GW(p)

Moore's Law takes the form of a law, but it doesn't have the weight of observation behind it that the law of gravity does.

Moore's Law started taking effect at some point. It's very likely that it will come to an end. If we had a bunch of independent intelligent species and knew their computation development curves, we'd have something a lot more like a Law.

comment by dclayh · 2010-02-17T23:36:16.277Z · LW(p) · GW(p)

Some confusion seems to be arising over what "change" means. Let X denote the quantity whose change (in absolute terms) we are referring to. Is X

  1. The number of transistors on a chip (call it N)?
  2. log2(N)?
  3. The utility produced by those chips (which seems like it ought to be proportional to something between N and log2(N))?

or something else?

If Kurzweil is thinking about (1) and you are thinking about (2), then you could easily both be correct (i.e. more transistors are being added to chips this year than ever before, even though the exponent governing the growth is lower).

Put another way: the rate of change may be at historic highs even if the rate of change of the rate of change is not.
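A toy illustration of that distinction (made-up growth rates, not Kurzweil's figures): even if the growth exponent is falling, the absolute yearly increase keeps setting records, because it is applied to an ever larger base.

```python
# Growth rate decays from 60% toward 40% per year, yet the absolute
# increase in N rises every year because N itself keeps compounding.
n, rate = 1.0, 0.60
for year in range(10):
    increase = n * rate
    print(f"year {year}: N={n:8.2f}  increase={increase:7.2f}  rate={rate:.2f}")
    n += increase
    rate = max(0.40, rate - 0.03)
```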

Replies from: PhilGoetz
comment by PhilGoetz · 2010-02-18T00:18:56.111Z · LW(p) · GW(p)

> Put another way: the rate of change may be at historic highs even if the rate of change of the rate of change is not.

Yes. But I don't think that's compatible with his argument. He posits, basically, that progress is locked in a feedback loop, and the "rate of change of the rate of change" is proportional to the rate of change. The situation you just described is therefore impossible in his model.
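Spelling that out (a minimal formalization of the argument as stated; $r$ and $k$ are editorial notation, not the post's): let $r(t)$ be the rate of change of progress. The feedback-loop model says

$$\frac{dr}{dt} = k\,r(t),\ k > 0 \quad\Longrightarrow\quad r(t) = r_0 e^{kt}, \qquad \frac{dr}{dt} = k\,r_0 e^{kt},$$

so the rate and its rate of change are fixed multiples of the same exponential; one cannot be at a historic high while the other is not.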

comment by ShardPhoenix · 2010-02-17T22:41:22.060Z · LW(p) · GW(p)

I'm not sure who or what you're even arguing against here.

Replies from: PhilGoetz
comment by PhilGoetz · 2010-02-17T23:18:41.724Z · LW(p) · GW(p)

Thanks for being up-front - it was lousy. It didn't say what I was really trying to say. I rewrote it.

comment by CarlShulman · 2010-02-17T21:27:34.969Z · LW(p) · GW(p)

I think Kurzweil talks primarily about price-performance, rather than transistor numbers or clock speeds.

Replies from: PhilGoetz
comment by PhilGoetz · 2010-02-17T22:02:37.559Z · LW(p) · GW(p)

Okay. Price-performance is a less-precise measure, because we can't compare prices across decades accurately. I think you'd get similar results; though if they differed, I'd expect them to make the pre-Moore's change look even faster. (The development cost of the ENIAC - $6M in 2008 dollars, according to Wikipedia - was small compared to the per-unit costs of later large computers. That's much less in inflation-adjusted dollars than the IBM 7090 ($3M 1960) or Cray 2 ($25M 1985) cost per computer.)
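A rough version of that comparison in constant dollars (a sketch; the CPI multipliers are approximate round figures for converting to 2008 dollars, not numbers from the comment):

```python
# Approximate conversion to 2008 dollars via rounded CPI multipliers.
machines = [
    ("ENIAC (development)", 6.0, 1.0),   # already in 2008 dollars
    ("IBM 7090 (per unit)", 3.0, 7.3),   # $3M in 1960 dollars
    ("Cray 2 (per unit)",  25.0, 2.0),   # $25M in 1985 dollars
]
for name, cost_millions, multiplier in machines:
    print(f"{name}: ~${cost_millions * multiplier:.0f}M (2008)")
```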

Replies from: mattnewport
comment by mattnewport · 2010-02-17T22:18:40.738Z · LW(p) · GW(p)

Isn't Kurzweil's primary claim that technological progress has always been an exponential and that Moore's law is just the best known instance of a broader phenomenon?

Replies from: MatthewB, PhilGoetz
comment by MatthewB · 2010-02-18T10:23:27.260Z · LW(p) · GW(p)

I've noticed that many people miss Kurzweil's actual claims. For instance, I keep encountering the misconception that Kurzweil claims technical advances will become infinite, which is just silly; he never claims this. I have talked briefly with him about it, and he says that the exponential climb will probably give way to a new paradigm that changes the way things are done, rather than continuing to infinity (or approaching an asymptote).

comment by PhilGoetz · 2010-02-17T22:35:33.064Z · LW(p) · GW(p)

Yes, that's true.

I guess my point is confused. Let me reformulate... done.

comment by PhilGoetz · 2010-02-18T16:03:00.269Z · LW(p) · GW(p)

I'm going to delete this post. I wasn't happy with it even before mathemajician pointed out the important error in it.

comment by MatthewB · 2010-02-18T10:19:02.245Z · LW(p) · GW(p)

According to the guy from Intel (Justin Rattner) at the 2008 Singularity Summit, Moore's Law ended in 2005/06. The discrete transistor is a thing of the past, according to his talk.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2010-02-18T11:25:29.876Z · LW(p) · GW(p)

Can you give more details of what he says stopped? Clock speeds stopped getting faster around then or earlier, but the result was to expand into multi-core chips and graphics coprocessors.

comment by MatthewB · 2010-02-18T03:41:27.668Z · LW(p) · GW(p)

Couple of things... A leap ahead in computing would not necessarily mean that Moore's Law was not descriptive of the event. It could still follow the same exponential trend, yet look like a giant leap forward.

It is likely that Moore's Law will continue due to economic pressure to find newer and faster ways to compute. This may not have to do with shrinking transistor sizes, but may well involve other forms of computation or architecture for chips.