[Cross-post]The theoretical computational limit of the Solar System is 1.47x10^49 bits per second.

post by William the Kiwi · 2023-10-17T16:06:34.258Z · LW · GW · 10 comments


Cross-posted from the EA Forum. Link: The theoretical computational limit of the Solar System is 1.47x10^49 bits per second. — EA Forum (effectivealtruism.org) [EA · GW]

 

Part 1

The limit is based on a computer operating at the Landauer Limit, at the temperature of the cosmic microwave background, powered by a Dyson sphere operating at the efficiency of a Carnot engine. [EDIT: this proposed limit is too low, as the Landauer Limit can be broken; it should now be read as a lower bound.]

 

Relevant equations

Carnot efficiency               η = 1 − (Tc/Th)

Landauer limit                   E = Kb·Tc·ln(2)

Bit rate                                  R = P·η/E

 

Relevant values

Boltzmann constant [Kb] (J K-1)                                                    1.38E-23

Power output of the sun [P] (W)                                                3.83E+26

Temperature of the surface of the sun [Th] (K)                      5.78E+03

Temperature of cosmic microwave background [Tc] (K)     2.73

 

Calculations

Carnot efficiency              η = 1 − (Tc/Th)

                                                η = 1 − (2.73/5.78E+03)

                                                η = 1.00

 

Landauer limit                   E = Kb·Tc·ln(2)

                                                E = 1.38E-23 × 2.73 × 0.693

                                                E = 2.61E-23 joules per bit

 

Bit rate                                 R = P·η/E

                                                R = 3.83E+26 × 1.00/2.61E-23

                                                R = 1.47E+49 bits per second
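
For readers who want to check the arithmetic, here is a minimal Python sketch of the Part 1 calculation (variable names are mine, not part of the original post):

```python
import math

# Relevant values (SI units), as given in the post
k_B = 1.38e-23  # Boltzmann constant, J/K
P   = 3.83e26   # power output of the Sun, W
T_h = 5.78e3    # temperature of the surface of the Sun, K
T_c = 2.73      # temperature of the cosmic microwave background, K

eta = 1 - T_c / T_h            # Carnot efficiency
E   = k_B * T_c * math.log(2)  # Landauer limit, J per bit erased
R   = P * eta / E              # bit rate, bits per second

print(f"eta = {eta:.6f}")      # ~0.999528, shown as 1.00 at 3 sig figs
print(f"E   = {E:.3e} J/bit")  # ~2.61e-23
print(f"R   = {R:.3e} bit/s")  # ~1.47e+49
```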

 

Notes

Numbers are shown rounded to 3 significant figures; full values were used in the calculations.

 

 

Part 2

The theoretical computational limit of the Solar System is 22 orders of magnitude above the estimated computational ability of all living humans. This estimate is based on the number of synapses in the human brain, the update rate of those synapses, and the number of humans alive. It is only an approximation and should be used with caution.

The purpose of this post was to show that the limit of computation, and therefore of intelligence, is far above that of all humans combined.

 

Relevant equations

Bit rate of all humans                     Rhumans = Nsyn·Rsyn·Nhumans

Comparative rate                              Rc = Rmax/Rhumans

 

Relevant values

Number of synapses in the human brain [Nsyn]                     2.50E+14

Synaptic update rate [Rsyn] (Hz)                                                   500

Number of humans alive [Nhumans]                                             8.07E+09

Theoretical computational limit [Rmax] (bit s-1)                      1.47E+49 

 

Calculation

Bit rate of all humans                     Rhumans = Nsyn·Rsyn·Nhumans

                                                                Rhumans = 2.50E+14 × 500 × 8.07E+09

                                                                Rhumans = 1.01E+27 bits per second

Comparative rate                              Rc = Rmax/Rhumans

                                                                Rc = 1.47E+49/1.01E+27

                                                                Rc = 1E22
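
As in Part 1, a minimal Python sketch of this comparison (variable names are mine):

```python
N_syn    = 2.50e14  # synapses in a human brain
R_syn    = 500      # synaptic update rate, Hz
N_humans = 8.07e9   # humans alive
R_max    = 1.47e49  # theoretical computational limit from Part 1, bit/s

R_humans = N_syn * R_syn * N_humans  # ~1.01e+27 bit/s
R_c      = R_max / R_humans          # ~1e+22

print(f"R_humans = {R_humans:.3e} bit/s")
print(f"R_c      = {R_c:.0e}")
```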

 

Notes

Numbers are shown rounded to 3 significant figures; full values were used in the calculations. The final result is rounded to one significant figure due to low confidence in the synaptic update rate.

Synaptic update rate estimated from the roughly 2 millisecond refractory period of a neuron: Rsyn ≈ 1/(2E-03 s) = 500 Hz.

10 comments


comment by Joseph Van Name (joseph-van-name) · 2023-10-22T17:13:19.218Z · LW(p) · GW(p)

I forgot to mention another source of difficulty in getting the energy efficiency of the computation down to Landauer's limit at the CMB temperature.

Recall that the Stefan-Boltzmann equation states that the power being emitted from an object by thermal radiation is equal to P = AεσT⁴. Here, P stands for power, A is the surface area of the object, ε is the emissivity of the object (ε is a real number with 0 ≤ ε ≤ 1), T is the temperature, and σ is the Stefan-Boltzmann constant, σ ≈ 5.67×10⁻⁸ W m⁻² K⁻⁴.

Suppose therefore that we want a Dyson sphere with radius R that maintains a temperature of 4 K, which is slightly above the CMB temperature. To simplify the calculations, I am going to ignore the energy that the Dyson sphere receives from the CMB, so that I obtain a lower bound for the size of our Dyson sphere. Let us assume that our Dyson sphere is a perfect emitter of thermal radiation, so that ε = 1.

Earth's surface has a temperature of about 288 K. In order to have a temperature of 4 K, our Dyson sphere needs to receive (4/288)⁴ ≈ 3.7×10⁻⁸ times the energy per unit of area. This means that the Dyson sphere needs to have a radius of about 72² ≈ 5,200 astronomical units (recall that the distance from Earth to the sun is 1 astronomical unit).

Let us do more precise calculations to get a more exact radius of our Dyson sphere. 

Setting 4πR²σT⁴ = L with L = 3.83×10²⁶ W and T = 4 K gives R = √(L/(4πσT⁴)) ≈ 1.45×10¹⁵ m, which is about 15 percent of a light-year. Since the nearest star is 4 light-years away, by the time that we are able to construct a Dyson sphere with a radius that is about 15 percent of a light-year, I think that we will be able to harness energy from other stars such as Alpha Centauri.
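
A short Python sketch of this power-balance calculation (my restatement of the comment's arithmetic, using the solar power value from the post):

```python
import math

L     = 3.83e26   # power output of the Sun, W
sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T     = 4.0       # target Dyson-sphere temperature, K
LY    = 9.461e15  # one light-year, m

# Power balance for a perfect emitter (eps = 1): L = 4*pi*R^2 * sigma * T^4
R = math.sqrt(L / (4 * math.pi * sigma * T**4))

print(f"R = {R:.2e} m ≈ {R / LY:.2f} light-years")  # ~1.45e15 m ≈ 0.15 ly
```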

The fourth power in the Stefan-Boltzmann equation makes it hard for cold objects to radiate heat.

comment by Vladimir_Nesov · 2023-10-17T17:06:27.853Z · LW(p) · GW(p)

If you wait for the cosmic background radiation to cool down[1], you get much more total computation out of the same matter. The rate of computation doesn't seem particularly important. The amount of stuff in a Hubble volume might be decreasing over time, in which case computing earlier allows more communication with distant galaxies. But given the guess about the effect size of waiting on total compute, computing locally in the distant future still buys more total compute than making use of distant galaxies earlier.


  1. I don't buy the Fermi paradox angle in the paper, obviously the first thing you do is grab all the lightcone you can get your von Neumann probes on, and prepare the matter for storage in a way that's less wasteful than the random stuff that's happening in the wild. ↩︎

Replies from: Algon, William the Kiwi
comment by Algon · 2023-10-17T18:33:35.926Z · LW(p) · GW(p)

That paper is wrong. There are other systems which are not near maximal-entropy states, and computer-generated entropy can be transferred to them adiabatically at a rate of 1 bit of negentropy to erase one bit of error.

As to the post we're commenting on, the sun probably isn't the best configuration of matter to use as a power source. But this calculation seems like a reasonable lower bound.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2023-10-17T19:14:27.500Z · LW(p) · GW(p)

The critique just says that you can get the same advantage even without waiting, while the relevant surprising part of the original claim is that there is a large advantage to be obtained at all, compared to the Landauer limit at the modern background radiation temperature, so this application of the Landauer limit doesn't actually bound available compute.

The part of the paper I appealed to is exploratory engineering, a design that is theoretically possible but not trying to be something worthwhile when it becomes feasible in practice. This gives lower bounds on what's possible, by sketching particular ways of getting it, not predictions of what's likely to actually happen [? · GW]. The critique doesn't seem to take issue with this aspect of the paper.

Replies from: Algon
comment by Algon · 2023-10-17T20:03:04.637Z · LW(p) · GW(p)

Ah, thanks. I should've noticed that.

comment by William the Kiwi · 2023-10-17T18:23:58.053Z · LW(p) · GW(p)

Yeah, you could, but you would be waiting a while. Your reply and 2 others have made me aware that this post's limit is too low.

[EDIT: spelling]

comment by Joseph Van Name (joseph-van-name) · 2023-10-17T23:16:32.847Z · LW(p) · GW(p)

This post uses the highly questionable assumption that we will be able to produce a Dyson sphere that can maintain a temperature at the level of the cosmic microwave background before we will be able to use energy-efficient reversible computation to perform operations that cost much less than kT ln(2) energy. This post also makes the assumption that we will achieve computation at the level of about kT ln(2) per bit deletion before we will be able to achieve reversible computation. And it gets difficult to overcome thermal noise at an energy level well above kT ln(2), regardless of the type of hardware that one uses. At best, this post is an approximation for the computational power of a Dyson sphere that may be off by some orders of magnitude.

Replies from: William the Kiwi
comment by William the Kiwi · 2023-10-18T11:36:21.021Z · LW(p) · GW(p)

This post makes a range of assumptions, and looks at what is possible rather than what is feasible. You are correct that this post is attempting to approximate the computational power of a Dyson sphere and compare this to the approximation of the computational power of all humans alive. After posting this, the author has been made aware that there are multiple ways to break the Landauer Limit. I agree that these calculations may be off by an order of magnitude, but this being true doesn't break the conclusion that "the limit of computation, and therefore intelligence, is far above all humans combined".

comment by Dalcy (Darcy) · 2023-10-17T16:22:36.381Z · LW(p) · GW(p)

The limit's probably much higher with sub-Landauer thermodynamic efficiency.

Replies from: William the Kiwi
comment by William the Kiwi · 2023-10-17T18:22:27.013Z · LW(p) · GW(p)

I just read the abstract. Storing information in momentum makes a lot of sense, as we know it is a conserved quantity, though it is practically challenging. But yes, this does move the theoretical limit even further away from all humans combined.