[Link] Study: no big filter, we're just too early

post by polymathwannabe · 2015-10-21T13:13:31.361Z · LW · GW · Legacy · 45 comments

"Earth is one of the first habitable planets to form - and we're probably too early to the party to get a chance to meet future alien civilisations."

 

http://www.sciencealert.com/earth-was-one-of-the-first-habitable-planets-in-the-universe-and-most-are-yet-to-be-born-study-finds

45 comments

Comments sorted by top scores.

comment by [deleted] · 2015-10-22T02:06:32.271Z · LW(p) · GW(p)

The conclusions of this paper do not take into account empirical results indicating that most baryonic mass will probably never form stars if the trends that have held over the history of the universe so far continue. I'm fairly convinced that conclusions drawn on this basis do not resemble what will actually happen in our universe. Here are links to a previous time I examined this study on this site and found it wanting:

http://lesswrong.com/lw/mpa/september_2015_media_thread/cpz6

http://lesswrong.com/lw/mpa/september_2015_media_thread/crhi

In short, they don't take into account that star formation often completely shuts down in galaxies over time despite plenty of gas still being around, especially as galaxies merge into large ones (an ongoing process), or that if you look empirically at star formation rates over the history of the universe, we are probably already in the latter fraction of stars that will ever form, because the formation rate keeps declining precipitously. Their conclusion that we are early (8th percentile) in planet-formation order is based on the fact that something like 8% of baryonic mass in galaxies has become star systems [EDIT: Oops, more than 8% of gas - they have a metallicity cutoff that excludes gas that formed stars early in the history of most galaxies], not on a projection of how much WILL eventually form stars; it uses uniformitarian assumptions rather than what I would consider more realistic models.

When you look at the empirical data, our position in planet-formation order seems likely unremarkable, and probably somewhere not far from the middle.

Replies from: passive_fist
comment by passive_fist · 2015-10-23T00:57:41.832Z · LW(p) · GW(p)

I don't see why you think they didn't take those factors into account; the article clearly says that:

The researchers also used the data to predict that future Earths are more likely to appear inside galaxy clusters and dwarf galaxies, which have yet to use up all their gas for building stars and accompanying planetary systems. Our Milky Way Galaxy, on the other hand, is all tapped out.

In addition, your claim that "Their conclusion that we are early (8th percentile) in planet-formation order is based on the fact that something like 8% of baryonic mass in galaxies has become star systems" doesn't seem to be true; their conclusion instead seems to be more nuanced and based on taking into account metallicity and empirical rates of habitable planet formation: http://mnras.oxfordjournals.org/content/454/2/1811

Replies from: None
comment by [deleted] · 2015-10-24T00:52:36.466Z · LW(p) · GW(p)

What you say after the quote is correct, and I will edit my parent comment accordingly: they do include metallicity cutoffs, which decrease the contribution from star formation in the early universe and from most of the star formation of old giant ellipticals.

However, they do pretty much explicitly state that their conclusions are based on all potentially star-forming gas within the dark matter halos of galaxies eventually forming stars. This is not a good assumption, since galaxy quenching may be more or less permanent, and if you integrate fits of measured star formation rates over the history of the universe into the future, they converge towards total final numbers of stars that aren't THAT much larger than today's. They do not take into account the frequent shutdown of star formation after galaxy mergers bring spirals and dwarf galaxies together into large galaxies, other less dramatic quenching events that apparently happen while remaining poorly understood, the presence of many galaxies with large amounts of gas that have nonetheless failed to form stars for many gigayears (http://www.dailygalaxy.com/my_weblog/2014/02/giant-elliptical-galaxies-why-are-they-red-and-dead.html , http://mnras.oxfordjournals.org/content/439/3/2291.full.pdf), or the empirical studies showing that universal rates of star formation are declining very rapidly (I could point to a couple of papers, but this one, http://arxiv.org/abs/1006.5751 , has the prettiest graph I've seen in its figure 1 even though it isn't exactly ABOUT star formation rate). See the second comment I link to.


EDIT: I've gone back into the empirical fits of universal star formation rates found in two recent papers that actually have equations, and after doing some very bothersome math to convert between redshift and universe age, it would appear that the fits not only both converge to a finite number of stars going forward in time but also agree that in terms of TOTAL stars today the universe is between 85 and 95% of the way through the total complement that will ever exist.

The Sun then shows up at the ~75th percentile of stars in star-formation order for both fits (as in, by the end of time 75% of all stars will have formed before it did). Unfortunately, I am unprepared to rigorously normalize this with metallicities on my own - to deal with the fact that the huge early batches of stars 10+ gigayears ago were probably unsuitable for terrestrial planets - without putting way more time into the effort than I am currently likely to have.
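
A rough sketch of this kind of calculation in Python, for anyone who wants to reproduce it. It uses the Madau & Dickinson (2014) fit to the cosmic star-formation-rate density and astropy's Planck15 cosmology for the redshift-to-age conversion, rather than the fits from the two papers above, so the numbers will differ; treat it as illustrative only:

```python
# Sketch only: integrate a published fit to the cosmic star-formation-rate density
# over cosmic time and ask what fraction of the stellar mass formed so far was
# already in place when the Sun formed (~4.57 Gyr ago). Uses the Madau & Dickinson
# (2014) fit, not the fits referenced above, so numbers will differ.
import numpy as np
from astropy.cosmology import Planck15 as cosmo
from astropy import units as u

def sfr_density(z):
    """Madau & Dickinson (2014) fit, in Msun / yr / comoving Mpc^3."""
    return 0.015 * (1 + z) ** 2.7 / (1 + ((1 + z) / 2.9) ** 5.6)

# Grid from high redshift to today, converted to cosmic time.
z = np.linspace(10, 0, 2000)
t = cosmo.age(z).to(u.yr).value      # age of the universe at each z, in years
rho_dot = sfr_density(z)             # Msun / yr / Mpc^3

# Cumulative stellar mass formed per comoving Mpc^3 up to each epoch
# (ignoring stellar mass loss, which roughly cancels in the ratio).
cumulative = np.concatenate(
    [[0.0], np.cumsum(0.5 * (rho_dot[1:] + rho_dot[:-1]) * np.diff(t))])

t_sun = cosmo.age(0).to(u.yr).value - 4.57e9   # cosmic time of the Sun's formation
frac_before_sun = np.interp(t_sun, t, cumulative) / cumulative[-1]
print(f"Fraction of stars formed to date that predate the Sun: {frac_before_sun:.2f}")
```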


EDIT 2: I can do a little VERY naive normalization. Read at your own risk:

The study that gives the ~85 percent figure for currently existing stars (Yüksel et al 2008) also gives the Sun's position as about the 85th percentile among currently existing stars and the ~75th percentile among total stars ever. The linked study that started this whole conversation (Behroozi & Peeples 2015) gives the Earth's location as about the 50th percentile amongst currently existing terrestrial planets after their metallicity normalization. If we assume the star formation rate since the Earth's formation is roughly fixed relative to the terrestrial planet formation rate (heavy elements having polluted most places), then we get that the Earth formed after 0.5 / (1+0.5*0.15/0.1) = 29% of terrestrial planets.

Examining the other paper (Sobral et al 2012) - which indicates that 95% of eventual stars already exist and that the Sun is currently at the 82nd percentile, eventually the 75th - in the same way indicates that the Earth formed after 0.5/(1+0.5*0.05/0.13) = 42% of terrestrial planets. I trust the other number a bit better, given that it has a better estimate of star formation rates longer ago. I also caution that nobody really understands the cutoffs for terrestrial planet formation, that there could be other important factors, and that these numbers only mean so much.
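
A quick check of the arithmetic in the two expressions above, taking the inputs at face value:

```python
# Reproduces the two naive-normalization numbers above, using the inputs as written.
earth_percentile_yuksel = 0.5 / (1 + 0.5 * 0.15 / 0.1)    # ~0.29
earth_percentile_sobral = 0.5 / (1 + 0.5 * 0.05 / 0.13)   # ~0.42
print(f"{earth_percentile_yuksel:.0%}, {earth_percentile_sobral:.0%}")  # 29%, 42%
```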

There's been some rounding here and there as I did all these calculations. May redo them later, in the hopes of making sure I didn't accidentally tune these numbers or propagate errors.

comment by AABoyles · 2015-10-21T17:04:12.626Z · LW(p) · GW(p)

This research doesn't imply the non-existence of a Great Filter (contra this post's title). If we take the Paper's own estimates, there will be approximately 10^20 terrestrial planets in the Universe's history. Given that they estimate the Earth preceded 92% of these, there currently exist approximately 10^19 terrestrial planets, any one of which might have evolved intelligent life. And yet, we remain unvisited and saturated in the Great Silence. Thus, there is almost certainly a Great Filter.
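
Making the arithmetic explicit, using the paper's two headline numbers as quoted above:

```python
# Arithmetic behind the estimate above, taking the paper's headline numbers as given.
total_terrestrial_planets = 1e20      # estimated over the universe's whole history
fraction_after_earth = 0.92           # fraction of those that form after Earth did
existing_now = total_terrestrial_planets * (1 - fraction_after_earth)
print(f"{existing_now:.0e} terrestrial planets already formed")   # ~8e18, i.e. ~10^19
```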

Replies from: jacob_cannell, MaximumLiberty
comment by jacob_cannell · 2015-10-21T18:54:34.112Z · LW(p) · GW(p)

And yet, we remain unvisited and saturated in the Great Silence. Thus, there is almost certainly a Great Filter.

Do you really think the probability that aliens have visited our system over its history is less than, say, 10^-9?

The 10^19 or so planets that could have independently evolved civilizations generate an overwhelming prior that we are not the first. It would take extremely strong evidence to overcome this prior. So from a Bayesian view, it is completely unreasonable to conclude that there is any sort of Filter - given the limits of our current observations.

We have no idea whether we have been visited or not. The evidence we have only filters out some very specific types of models for future civs - such as aliens which colonize most of the biological habitats near stars. The range of models is vast and many (such as cold dark models where advanced civs avoid stars) remain unfiltered by our current observations.

Replies from: passive_fist, AABoyles
comment by passive_fist · 2015-10-23T01:03:54.689Z · LW(p) · GW(p)

Taking the Bayesian view further, our posterior is proportional to the prior times the likelihood of our observations. You're right that the prior must consist of very strong belief in the existence of aliens. However, an expanding alien civilization would be a very large, obvious, and distinctive spectacle, and we have seen no evidence of that so far. Thus it is not clear what our posterior belief must be.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-23T16:25:17.765Z · LW(p) · GW(p)

An expanding stellavore civ would be very obvious, and the posterior for that possibility is thus diminished.

However there are many other possibilities. An expanding cold dark civ would be less obvious, and in fact we could already be looking at it.

There are also the transcendent models, where all expansion is inward and post-singularity civs rather quickly exit the galaxy in some manner - perhaps through new universe creation. That appears to be possible as far as physics is concerned, and it allows for continued exponential growth rather than the unappealing cubic growth you can get from physical expansion. Physical expansion would amount to enormous stagnation from our current growth perspective.

After updating on our observations the standard stellavore model becomes low probability relative to other future civ models.

Replies from: passive_fist
comment by passive_fist · 2015-10-23T18:37:08.095Z · LW(p) · GW(p)

Why couldn't a civilization lead to both expanding and universe-exiting threads of evolution? Taking life on Earth as an analogy, it's clear that life expands to fill all niches it can. A particular thread of evolution won't stop occurring just because another thread has found a more optimal solution. In other words, it's not a depth-first search, it's a breadth-first search. Unless there's a good reason for a civilization to not expand into space, it will probably expand into space.

It would seem very strange, then, that no expanding interstellar civilization has occurred.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-23T19:28:53.572Z · LW(p) · GW(p)

Why couldn't a civilization lead to both expanding and universe-exiting threads of evolution? Taking life on Earth as an analogy, it's clear that life expands to fill all niches it can.

Sure, but we are uncertain about everything, including what the niches for postbiological civs are. Physics suggests that computation is ultimately primarily entropy/temperature limited (rather than energy limited), and thus the niches for advanced civs could be in the cold dark interstellar material (which we know is more plentiful than the hot bright stuff). We don't see stellavores for the same reasons that humanity isn't interested in colonizing deep sea thermal vents (or underwater habitats in general).

So the stars could be the past - the ancient history of life, not its far future.

Unless there's a good reason for a civilization to not expand into space, it will probably expand into space.

In the cold dark models, the galaxy is already colonized, and the evidence is perhaps already in front of us...

In this model the physical form of alien civs is likely to be in compact cold objects that are beyond current tech to image directly. The most likely chance to see them is during construction, which would be more energetically expensive and thus could take place near a star - perhaps the WTF star is a civ in transition to elder status.

The Wow! signal was an alien radar ping, similar to what aliens would see from the radar pings that we use for planetary radar imaging with Arecibo.

Aliens most likely have already visited sol at various points, but for them it is something like the ocean floor is to us - something of minor interest for scientific study.

On that note, it's starting to look like the emDrive and kin are real. If that is true, it is additional evidence for aliens. Why? Because the earliest and most credible modern UFO reports - such as the Kenneth Arnold sighting - are most consistent with craft that is vaguely aerodynamic but does not rely on aerodynamic principles for thrust. The Arnold report contains rather specific details of the craft's speed and acceleration, lack of contrail, etc. As we learn more about future engineering capabilities for atmospheric craft, that report could become rather strong evidence indeed (or not).

Unless there's a good reason for a civilization to not expand into space, it will probably expand into space.

In the transcendent models, civs use all available resources to expand inward, because that allows for continued exponential growth. Transcendent civs don't expand outward because it is always an exceptionally poor use of resources. Notice that this is true today - we could launch an interstellar colony ship for some X trillions, but spending those resources on Moore's Law is vastly preferred. In the transcendent model, this just continues to be true indefinitely - likely ending in hard singularities, strange machines that create new universes, etc.

Finally, the various alien civs are not really statistically independent, even if they developed independently. Our uncertainty is at the model level, in terms of how physics and future engineering work. The particular instance variables of each civ don't matter so much. So if the cold dark model is correct, all civs look like that; if the transcendent model is correct, all civs look like that; etc.

Replies from: passive_fist
comment by passive_fist · 2015-10-23T21:36:40.035Z · LW(p) · GW(p)

In the cold dark models, the galaxy is already colonized, and the evidence is perhaps already in front of us...

The hypothesis that dark matter could be comprised of cold clumps of matter has been considered (these objects are called MACHOs) and as far as I know this hypothesis has been largely ruled out as they have properties that aren't consistent with how dark matter actually behaves.

I also think you're making an unfounded assumption here - that advanced civilizations could be stealthy. But what we know suggests that there ain't no stealth in space. There are a number of difficulties in keeping large energy-consuming objects cold, and even if you succeeded in keeping the brains themselves cold, the associated support equipment and fusion reactors that you mention would be pretty hot. And the process of constructing the brains would be very hot.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-23T22:02:05.252Z · LW(p) · GW(p)

The hypothesis that dark matter could be comprised of cold clumps of matter has been considered (these objects are called MACHOs) and as far as I know this hypothesis has been largely ruled out as they have properties that aren't consistent with how dark matter actually behaves.

Unrelated. There is baryonic and non-baryonic dark matter. Most of the total dark matter is currently believed to be non-baryonic, but even leaving that aside, the amount of baryonic dark matter is still significant - perhaps on par with or greater than the baryonic visible matter. Most important of all is the light/dark ratio of heavier-element baryonic matter and smaller planets/planetoids. There are some interesting new results suggesting most planets/planetoids are free-floating rather than bound to stars (see links in my earlier article - "nomads of the galaxy" etc).

There is a limit to how big a giant computing device can get before gravitational heating makes the core unusable - the ideal archilect civ may be small, too small to detect directly. But perhaps they hitch rides orbiting larger objects.

Also, we don't know enough about non-baryonic dark matter/energy to rule it out as having uses or a relation to elder civs (although it seems unlikely, but still - there are a number of oddities concerning the whole dark energy inflation model).

I also think you're making an unfounded assumption here - that advanced civilizations could be stealthy. . ..There are a number of difficulties in keeping large energy-consuming objects cold,

Well, we are talking about hypothetical post-singularity civs...

There doesn't appear to be any intrinsic limit to computational energy efficiency with reversible computing, and the practicality of advanced quantum computing appears to depend on how close one can get to absolute zero and how long that can be maintained for coherence.

So at the limits, computational civs approach CMB temperature and use negligible energy for computation. At some point it becomes worthwhile to spend some energy to move away from stars.

Any model makes some assumptions based on what aspects of engineering/physics we believe will still hold into the future. The article you linked makes rather huge assumptions - alien civs need to travel around in ships, ships can only move by producing thrust, etc. Even then, from what I understand, detecting thrust is only possible at in-system distances, not light-year distances.

The cold dark alien model I favor simply assumes advanced civs will approach physical limits.

Replies from: passive_fist
comment by passive_fist · 2015-10-23T22:35:43.853Z · LW(p) · GW(p)

The CMB temperature (2.7 K) is still very warm in relative terms, and it's hard to see how effective large-scale quantum computing could be done at that temperature (current crude quantum computers operate at millikelvin temperatures and still have only very minuscule levels of coherence). The only way to get around this is to either use refrigeration to cool down the system (leading to a very hot fusion reactor and refrigeration equipment) or make do with 2.7 K, which would probably lead to a lot of heat dissipation.

You would absorb a large amount of entropy from the CMB at this temperature (about 1000 terabytes per second per square meter); you'd need to compensate for this entropy to keep your reversible computer working.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-24T00:47:06.940Z · LW(p) · GW(p)

The CMB is just microwave radiation right? So reflective shielding can block most of that. What are the late engineering limits for microwave reflective coatings? With superconducting surfaces, metamaterials, etc?

Some current telescopes cool down subcomponents to very low temperatures without requiring large fusion reactors.

If the physical limits of passive shielding are non-generous, this just changes the ideal designs to use more active cooling than they otherwise would and to limit the ratio of quantum computing stuff to other stuff - presumably there is always some need for active cooling and that is part of the energy budget, but that budget can still be very small, and the final device temperature could even be less than the CMB temperature.

Replies from: passive_fist
comment by passive_fist · 2015-10-24T01:44:06.831Z · LW(p) · GW(p)

The CMB is just microwave radiation right? So reflective shielding can block most of that.

I'm afraid it can't. The 'shielding' itself would soon reach equilibrium with the CMB and begin emitting at 2.7 K. It makes no difference what it's made of. You can't keep an object cooler than the background temperature indefinitely without expending energy. If you could, you would violate conservation of energy.

And, again, the process of generating that energy would produce a lot of heat and preclude stealth.

Some current telescopes cool down subcomponents to very low temperatures without requiring large fusion reactors.

But the bulk of the telescope is never cooler than (or even as cold as) the background. JWST, for instance, is designed for a 50 K operating temperature (which emits radiation at about 100,000 times the background level according to the Stefan-Boltzmann law).
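
(That factor comes straight from the T^4 scaling: (50 / 2.7)^4 ≈ 1.2 × 10^5, assuming the same emissivity and comparing per unit area.)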

If the physical limits of passive shielding are non-generous, this just changes the ideal designs to use more active cooling than they otherwise would and limit the ratio of quantum computing stuff to other stuff

Again, this would just make the problem worse, as a decrease in entropy in one part of the system must be balanced by a larger increase in entropy elsewhere. I'm talking about the possibility of stealth here (while maintaining large-scale computation).

but that budget can still be very small and the final device temperature could even be less than CMB.

This is a non-obvious statement to me. It seems that a computation on the level you're describing (much larger in scale than the combined brainpower of current human civilization by orders of magnitude) would require a large amount of mass and/or energy and would thus create a very visible heat signature. It would be great if you could offer some calculations to back up your claim.

Replies from: Wei_Dai, jacob_cannell
comment by Wei Dai (Wei_Dai) · 2015-10-24T10:06:07.747Z · LW(p) · GW(p)

Years ago I had the idea that advanced civilizations can radiate waste heat into black holes instead of interstellar space, which would efficiently achieve much lower temperatures and also avoid creating detectable radiation signatures. See http://www.weidai.com/black-holes.txt and my related LW post.

Replies from: None, jacob_cannell
comment by [deleted] · 2015-10-25T17:53:13.435Z · LW(p) · GW(p)

The recent news about KIC 8462852 immediately reminded me of your old txt file article. I'm really curious what you think about the recent information given how much you seem to have thought about advanced civs.

comment by jacob_cannell · 2015-10-24T16:49:16.598Z · LW(p) · GW(p)

It's an interesting idea.

Stable black holes seem difficult to create though - they require a lot of mass. Could there be a shortcut?

comment by jacob_cannell · 2015-10-24T16:38:10.632Z · LW(p) · GW(p)

I'm afraid it can't. The 'shielding' itself would soon reach equilibrium with the CMB and begin emitting at 2.7 K.

EDIT: After updating through this long thread, I am now reasonably confident that the above statement is incorrect. Passive shielding in the form of ice can cool the Earth against the Sun's irradiance to a temperature lower than the black-body temperature, and there is nothing special about the CMB irradiance. See the math here at the end of the thread.

Sure - if it wasn't actively cooled, but of course we are assuming active cooling. The less incoming radiation the system absorbs, the less excess heat it has to deal with.

It makes no difference what it's made of. You can't keep an object cooler than the background temperature indefinitely without expending energy. If you could, you would violate conservation of energy.

Sure you need to expend energy, but obviously the albedo/reflectivity matters a great deal. Do you know what the physical limits for reflectivity are? For example - if the object's surface can reflect all but 10^-10 of the incoming radiation, then the active cooling demands are reduced in proportion, correct?

I'm talking about the possibility of stealth here (while maintaining large-scale computation).

I'm thinking just in terms of optimal computers, which seems to lead to systems that are decoupled from the external environment (except perhaps gravitationally), and thus become dark matter.

would require a large amount of mass and/or energy and would thus create a very visible heat signature.

The limits of reversible computing have been discussed in the lit, don't have time to review it here, but physics doesn't appear to impose any hard limit on reversible efficiency. Information requires mass to represent it and energy to manipulate it, but that energy doesn't necessarily need to be dissipated into heat. Only erasure requires dissipation. Erasure can be algorithmically avoided by recycling erased bits as noise fed into RNGs for sampling algorithms. The bitrate of incoming sensor observations must be matched by an outgoing dump, but that can be proportionally very small.

Replies from: passive_fist
comment by passive_fist · 2015-10-24T20:05:51.009Z · LW(p) · GW(p)

I think you're still not 'getting it', so to speak. You've acknowledged that active cooling is required to keep your computronium brain working. This is another way of saying you expend energy to remove entropy from some part of the system (at the expense of a very large increase in entropy in another part of the system). Which is what I said in my previous reply. However you still seem to think that, given this consideration, stealth is possible.

By the way, the detection ranges given in that article are for current technology! Future technology will probably be much, much better. It's physically possible, for instance, to build a radio telescope consisting of a flat square panel array of antennas one hundred thousand kilometers on a side. Such a telescope could detect things we can't even imagine with current technology. It could resolve an ant crawling on the surface of Pluto or provide very detailed surface maps of exoplanets. Unlike stealth, there is no physical limit that I can think of to how large you can build a telescope.

but physics doesn't appear to impose any hard limit on reversible efficiency

Not theoretically, no. However, at any temperature higher than 0 K, purely reversible computing is impossible. Unfortunately there is nowhere in the universe that is that cold, and again, maintaining this cold temperature requires a constant feed of energy. These considerations impose hard, nonzero limits on power consumption. Performing meaningful computations with arbitrarily small power consumption is impossible in our universe.

You're repeatedly getting very basic facts about physics and computation wrong. I love talking about physics but I don't have the time or energy to keep debating these very basic concepts, so this will probably be my last reply.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-24T22:23:07.176Z · LW(p) · GW(p)

I think you're still not 'getting it', so to speak.

No - because you didn't actually answer my question, and you are conflating the reversible computing issue with the stealth issue.

I asked:

Do you know what the physical limits for reflectivity are? For example - if the object's surface can reflect all but 10^-10 of the incoming radiation, then the active cooling demands are reduced in proportion, correct?

The energy expended and entropy produced for cooling is proportional to the incoming radiation absorbed, correct? And this can be lowered arbitrarily with reflective shielding - or is that incorrect? This has nothing whatsoever to do with stealth; the context of this discussion concerns only optimal computers.

Not theoretically, no. However, at any temperature higher than 0 K, purely reversible computing is impossible.

Don't understand this - the theory on reversible computing says that energy expenditure is proportional to bit erasure, plus whatever the implementation efficiency is. The bit erasure cost varies with temperature, sure, but you could still theoretically have a reversible computer working at 100 K.

You seem to be thinking that approaching zero energy dissipation requires zero temperature - no. Low temperature reduces the cost of bit erasure, but bit erasure itself can also be reduced to arbitrarily low levels with algorithmic-level recycling.

These considerations impose hard, nonzero limits on power consumption.

Which are?

You're repeatedly getting very basic facts about physics and computation wrong.

Such as? Am I incorrect in the assumption that the cost of active cooling is proportional to the temperature or entropy to remove and thus the incoming radiation absorbed - and thus can be reduced arbitrarily with shielding?

Replies from: passive_fist
comment by passive_fist · 2015-10-24T23:06:24.929Z · LW(p) · GW(p)

Which are?

  • External surface area of computer = A.
  • Background temperature = T ~ 2.7 K.
  • Stefan-Boltzmann constant: σ
  • Thermal power absorbed by system: P = σAT^4 (J/s)
  • Entropy absorbed by system: X = P / (T k_B log(2)) (bits/s)
  • Minimal amount of energy required to overcome this entropy: k_B T X * log(2) -- this happens to be equal to P.

Limit: External surface area of computer times σT^4.

As for active cooling, I think the burden of proof here is on you to present a viable system and the associated calculations. How much energy does it take to keep, e.g., a sphere of a certain radius cold?
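
Plugging numbers into the formulas above, purely as a sanity check - the surface area A = 1 m^2 is an arbitrary illustrative choice, not a claim about any particular design:

```python
# Evaluate the limit above: thermal power and entropy absorbed from a 2.7 K background
# by a perfectly absorbing surface of area A, and the matching minimal energy cost.
# A = 1 m^2 is purely illustrative.
import numpy as np

sigma = 5.670e-8      # Stefan-Boltzmann constant, W / (m^2 K^4)
k_B = 1.381e-23       # Boltzmann constant, J / K
T = 2.725             # CMB temperature, K
A = 1.0               # external surface area, m^2 (illustrative)

P = sigma * A * T**4                      # thermal power absorbed, W
X = P / (k_B * T * np.log(2))             # entropy absorbed, bits / s
E_min = k_B * T * np.log(2) * X           # minimal energy to overcome it, W (equals P)

print(f"P = {P:.3e} W, X = {X:.3e} bits/s, E_min = {E_min:.3e} W")
```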

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-26T17:46:58.745Z · LW(p) · GW(p)

The thermal power you quoted is the perfect black body approximation. For a grey body, the thermal power is:

P = eσAT^4

where e is the material-specific emissivity coefficient (σ as above), and the same rule holds for absorption.

You seem to be implying that for any materials, there is a fundamental physical law which requires that absorption and emission efficiency is the same - so that a reflector which absorbs only e% of the incoming radiation is also only e% efficient at cooling itself through thermal emission.

Fine - even assuming that is the case, there doesn't seem to be any hard limit to reflective efficiency. A hypothetical perfect whitebody which reflects all radiation perfectly would have no need of cooling by thermal emission - you construct the object (somewhere in deep space away from stars) and cool it to epsilon above absolute zero, and then it will remain that cold for the duration of the universe.

There is also ongoing research into zero-index materials that may exhibit 'super-reflection'.

If we can build superconductors, then super-reflectors should be possible for advanced civs - a superconductor achieves a state of perfect thermal decoupling for electron interactions, suggesting that exotic material states could achieve perfect thermal decoupling for photon interactions.

So the true physical limit is for a perfect white body with reflectivity 1. The thermal power and entropy absorbed is zero, no active cooling required.

Furthermore, it is not clear at all that reflection efficiency must always equal emission efficiency.

Wikipedia's article on the Stefan-Boltzmann Law hints at this:

Wavelength- and subwavelength-scale particles,[1] metamaterials,[2] and other nanostructures are not subject to ray-optical limits and may be designed to exceed the Stefan–Boltzmann law.

What do you make of that?

Also - I can think of a large number of apparent counter-examples to the rule that reflection and emission efficiency must be tied.

How do we explain greenhouse warming of the Earth, snowball Earth, etc.? The temperature of the Earth appears to depend mainly on its albedo, and the fraction of incoming light reflected doesn't appear to be intrinsically related to the fraction of outgoing light, with separate mechanisms affecting each.

Or just consider a one-way mirror: it reflects light in one direction, but is transparent in the other. If you surround an object in a one-way mirror (at CMB infrared/microwave wavelengths) - wouldn't it stay very cold as it can emit infrared but is protected from absorbing infrared? Or is this destined to fail for some reason?

I find nothing in the physics you have brought up to rule out devices with long term temperatures much lower than 2.7K - even without active cooling. Systems can be out of equilibrium for extremely long periods of time.

Replies from: passive_fist
comment by passive_fist · 2015-10-26T22:18:23.359Z · LW(p) · GW(p)

Again, you're getting the fundamental and basic physics wrong. You've also evaded my question.

There is no such thing as a perfect whitebody. It is impossible. All those examples you mention are for narrow-band applications. Thermal radiation is wideband and occurs over the entire electromagnetic spectrum.

The piece in the wikipedia article links to papers such as http://arxiv.org/pdf/1109.5444.pdf in which thermal radiation (and absorption) are increased, not decreased!

Greenhouse warming of the Earth is an entirely different issue and I don't see how it's related. The Earth's surface is fairly cold in comparison to the Sun's.

One-way mirrors do not exist. http://web.archive.org/web/20050313084618/http://cu.imt.net/~jimloy/physics/mirror0.htm What are typically called 'one-way mirrors' are really just ordinary two-way partially-reflective mirrors connecting two rooms where one room is significantly dimmed compared to the other.

I find nothing in the physics you have brought up to rule out devices with long term temperatures much lower than 2.7K - even without active cooling.

Well, firstly, you have to cool it down to below 2.7 K in the first place. That most certainly requires 'active cooling'. Then you can either let it slowly equilibrate or keep it actively cold. But then you have to consider the Carnot efficiency of the cooling system (which dictates that energy consumption goes up as e/T_c, where T_c is the temperature of the computer and e is the energy dissipated by the computer). So you have to consider precisely how much energy the computer is going to use at a certain temperature and how much energy it will take to maintain it at that temperature.

EDIT: You've also mentioned in that thread you linked that "Assuming large scale quantum computing is possible, then the ultimate computer is thus a reversible massively entangled quantum device operating at absolute zero." Well, such a computer would not only be fragile, as you said, but it would also be impossible in the strong sense. It is impossible to reach absolute zero because doing so would require an infinite amount of energy: http://io9.com/5889074/why-cant-we-get-down-to-absolute-zero . For the exact same reason, it is impossible to construct a computer with full control over all the atoms. Every computer is going to have some level of noise and eventual decay.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-27T01:12:24.519Z · LW(p) · GW(p)

Again, you're getting the fundamental and basic physics wrong. You've also evaded my question.

Show instead of tell. I didn't yet answer your question about the initial energy cost of cooling the sphere because it's part of the initial construction cost, and you haven't yet answered my questions about reflectivity vs emission and how it relates to temperature.

There is no such thing as a perfect whitebody. It is impossible.

Says what law - and more importantly - what is the exact limit then? Perfect super-conductivity may be impossible but there doesn't appear to be an intrinsic limit to how close one can get, and the same appears to apply for super-reflection. This whole discussion revolves around modeling technologies approaching said limits.

All those examples you mention are for narrow-band applications. Thermal radiation is wideband and occurs over the entire electromagnetic spectrum.

This helps my case - the incoming radiation is narrow-band microwave from the CMB. The outgoing radiation can be across the spectrum.

The piece in the wikipedia article links to papers such as http://arxiv.org/pdf/1109.5444.pdf in which thermal radiation (and absorption) are increased, not decreased!

If the 'law' can be broken by materials which emit more than the law allows, this also suggests the 'law' can be broken in other ways, as in super-reflectors.

One-way mirrors do not exist.

Ok.

Greenhouse warming of the Earth is an entirely different issue and I don't see how it's related. The Earth's surface is fairly cold in comparison to the Sun's.

If the Earth's equilibrium temperature varies based on the surface albedo, this shows that reflectivity does matter, and it suggests a hypothetical super-reflector shielding against the CMB microwaves could lead to lower-than-CMB temperatures (because snow cover leads to lower equilibrium temperatures for the Earth than a black body at the same distance from the Sun would have).

Well, firstly, you have to cool it down to below 2.7K in the first place.

Do you? I'm not clear on that - you haven't answered the Earth counterexample, which seems to show that even without active cooling, all it takes is albedo/reflectivity for an object's equilibrium temperature to be lower than that of a black body in the same radiation environment. Is there something special about low temps like 2.7 K?

That most certainly requires 'active cooling'. Then you can either let it slowly equilibrate or keep it actively cold. But then you have to consider the carnot efficiency of the cooling system (which dictates energy consumption goes up as e/Tc, where Tc is the temperature of the computer and e is the energy dissipated by the computer).

Apparently coherence in current quantum computers requires millikelvin temperatures, which is why I'm focusing on the limits approaching 0 K. And from what I understand this is fundamental - the limits of computing involve long-lived, large-scale coherent states that are only possible at temperatures approaching 0.

If we weren't considering quantum computing, then sure, I don't see any point to active cooling below 2.7 K. The energy cost of bit erasures is ~C*T_c for some constant C, but the cooling cost goes as e/T_c. So this effectively cancels out - you don't get any net energy efficiency gain from cooling below the background temperature. (Of course, access to black holes much colder than the CMB changes that.)
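
A back-of-the-envelope version of that cancellation, under the idealized assumptions of a Landauer-limited erasure cost and Carnot-limited refrigeration against a 2.7 K background (sketch only):

```python
# Sketch of the cancellation claim above: the Landauer cost of erasing a bit at T_c,
# plus the Carnot work needed to pump that heat out to the 2.7 K background, comes
# out independent of T_c. Idealized assumptions only.
import numpy as np

k_B = 1.381e-23   # Boltzmann constant, J / K
T_h = 2.725       # background (CMB) temperature, K

for T_c in [2.7, 1.0, 0.1, 0.01]:
    E_erase = k_B * T_c * np.log(2)               # Landauer cost of one erasure at T_c
    W_pump = E_erase * (T_h - T_c) / T_c          # Carnot work to reject that heat to T_h
    total = E_erase + W_pump                      # total cost per erased bit
    print(f"T_c = {T_c:5.2f} K: total = {total:.3e} J per bit  "
          f"(k_B * T_h * ln2 = {k_B * T_h * np.log(2):.3e} J)")
```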

Well, such a computer would not only be fragile, as you said, but it would also be impossible in the strong sense. It is impossible ..

Yes - but again we are discussing limits analysis where said quantities approach zero, or infinity or whatever.

Replies from: passive_fist
comment by passive_fist · 2015-10-27T02:47:11.851Z · LW(p) · GW(p)

Says what law

You can trivially prove this for yourself. High-energy gamma rays cannot be completely reflected by matter. All thermal radiation contains some high-energy gamma rays. Thus no material can perfectly reflect thermal radiation. QED.

This helps my case - the incoming radiation is narrow-band microwave from the CMB

No it's not. CMB radiation spans the entire EM spectrum. Thermal radiation is almost the exact opposite of narrow-band radiation.

If the 'law' can be broken by materials which emit more than the law allows

It's not really broken though. It's just that radiation in these materials happens through mechanisms beyond conventional blackbody radiation. A common LED emits radiation far in excess of its thermal radiation. This doesn't mean that Stefan-Boltzmann is 'broken', it just means that an extra emission mechanism is working. A mechanism that requires free energy to run (unlike normal thermal radiation which requires no free energy). And sure enough, if you read that paper the extra mechanism requires extra free energy.

But you can't use an extra emission mechanism to reduce the emitted radiation.

all it takes is albedo/reflectivity for an object's equilibrium temperature to be lower than that of a black body in the same radiation environment.

You keep making this same mistake. Thermal equilibrium temperature does not depend on surface reflectivity. https://www.researchgate.net/post/Is_it_possible_to_distinguish_thermal_bodies_in_equilibrium/1

This is a very basic physics error.

If we weren't considering quantum computing,

It makes no difference what type of computing you're considering. I suggest reading http://arxiv.org/pdf/quant-ph/9908043.pdf

Specifically, the limiting factor is not temperature at all but the error rate of your computer hardware, quantum or not. The ultimate limit to efficiency is set by the error rate, not the temperature to which you can cool the system.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-27T05:39:00.680Z · LW(p) · GW(p)

High-energy gamma rays cannot be completely reflected by matter.

For any system, even an exotic one? By what law? A simple Google search seems to disagree - gamma rays are reflected today, in practice (albeit with difficulty and inefficiency), by multilayer reflectors.

No it's not. CMB radiation spans the entire EM spectrum.

The vast majority of the energy is at microwave frequencies, but fine, yes, there is always some emission at higher frequencies - practical shielding would be complex and multilayer.

You keep making this same mistake. Thermal equilibrium temperature does not depend on surface reflectivity.

You keep bringing this up, but you can't explain how it applies to some basic examples such as the Earth. How do you explain the fact that the temperature of planets such as Earth and Venus varies greatly and depends mostly on their albedo? Is it because the system is not in equilibrium? Then who cares about equilibrium? It almost never applies.

If the earth/sun system is not in equilibrium, then my hypothetical reflective object somewhere in deep space receiving radiation only from the CMB is certainly not in equilibrium either.

And finally the universe itself is expanding and is never in equilibrium - the CMB temperature is actually decaying to zero over time.

Until I see a good explanation of planetary albedo and temperature, I can't take your claim of "basic physics mistake" seriously.

It makes no difference what type of computing you're considering. I suggest reading http://arxiv.org/pdf/quant-ph/9908043.pdf

Read that, of course, and I'd recommend some of Mike Frank's stuff over it. Obviously the energy cost of bit erasure is the same for all types of computing. Quantum computing is different only in having much lower error/noise/temperature tolerances due to decoherence issues.

The ultimate limit to efficiency is set by the error rate, not the temperature at which you can cool the system to.

These are directly linked.

Heat is just thermal noise. And noise and errors are fundamentally the same - uncertainty over states that can explode unless corrected. The error rate for the most advanced computers is absolutely limited by thermal noise (and quantum noise).

This is trivially obvious at extremes - i.e., the error rate of a computer at 10,000 K is 100% for most materials. The lowest error rates are only achievable by exotic matter configurations at very low temperatures.

The idealized perfect computer is one with zero entropy - i.e., every quantum state stores meaningful information, and every transition at every time step is a planned computation.

Looking at it another way, using devices and transitions larger than the absolute physical limits is just an easy way to do error correction to handle thermal noise.

Replies from: passive_fist
comment by passive_fist · 2015-10-27T07:00:01.720Z · LW(p) · GW(p)

I still can't understand why you think the Earth system is representative here... Are you asking why the Earth isn't the same temperature as the Sun? Or the same temperature as the background of space? Because if you removed either one, it would equilibrate with the other. But you're proposing to put your system in deep space where there is only the background. If you did that to the Earth, you'd find it would very rapidly equilibrate to close to 2.7 K, and the final temperature is independent of surface albedo.

Albedo has no relationship to the final temperature, only to the speed at which equilibrium is reached.

Again, I don't feel like I have to 'explain' anything here... perhaps you could explain, in clearer terms, why you think it bears any relationship to the system we are discussing?

Read that, of course, and I'd recommend some of Mike Frank's stuff over it.

It's great that you've read those; unfortunately, it seems you haven't understood them at all.

These are directly linked.

Not in the way you probably think. Error rate depends on hardware design as well as temperature. You're confusing a set of concepts here. As errors are generated in the computation, the entropy (as measured internally) will increase, and thus the heat level will increase. If this is what you are saying, you are correct. But the rate of generation of these errors (bits/s) is not the same as the instantaneous system entropy (bits) - they're not even the same unit! You could have a quantum computer at infinitesimally low temperature and it would still probably generate errors and produce heat.

This is really just another way of saying that your computer is not 100% reversible (isentropic). This is because of inevitable uncertainties in construction (is the manufacturing process that created the computer itself a perfectly error-free computer? If so, how was the first perfectly error-free computer constructed?), uncertainties in the physics of operation, and inevitable interaction with the outside world. If you claim you can create a perfectly isentropic computer, then the burden of proof is on you to demonstrate such a system. You can't expect me to take it on faith that you can build a perfectly reversible computer!

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-27T18:20:04.521Z · LW(p) · GW(p)

I still can't understand why you think the Earth system is representative here.... are you asking why the Earth isn't the same temperature as the Sun? Or the same temperature as the background of space?

I honestly can't understand how you can't understand it. :)

1. Take a spherical body and place it in a large completely empty universe. The body receives zero incoming radiation, but it emits thermal radiation until it cools to zero or something close to that - agreed? (quantum noise fluctuations or virtual particles perhaps impose some small nonzero temp, not sure) Albedo/reflectivity doesn't matter because there is no incoming radiation. Materials with higher emissivity will cool to zero faster.

2. Spherical body in an empty universe that contains a single directional light source that is very far away. The light source is not affected by the body in any significant way and does not prevent the body from emitting radiation. The source and the body will never reach equilibrium on the timescales we care about. The body absorbs radiation according to its albedo and the incoming flux. The body emits radiation according to its temperature and emissivity. It will evolve to a local equilibrium temperature that depends on these parameters.

3. The Earth-Sun system - it is effectively equivalent to 2. The Sun is not infinitely far away, but as far as the Earth's temperature is concerned the Sun is just a photon source - the Earth has no effect on the Sun's temperature, and the objects are not in equilibrium.

We have hard data for situation 3, which shows that the balance between incoming radiation absorbed and outgoing radiation emitted can differ for a complex composite object based on albedo/reflectivity vs emissivity. The end result is that the object's local equilibrium temperature depends on these material parameters and can differ significantly from the black-body temperature for the same incoming irradiance conditions.

4. Object in deep space. It receives incoming radiation from the CMB - which is just an infinite omnidirectional light source like 3. The directionality shouldn't change anything, the energy spectrum shouldn't change anything, so it's equivalent to 3 and 2. The object's resting temperature can be lower than the CMB blackbody 'temperature' (which after all isn't the temp of an actual object, it's tautologically just the temperature of a simple blackbody absorbing/emitting the CMB).

So what am I missing here? - seriously - I'm still waiting to see how #4 could possibly differ from #3.

Whatever principle it is that allows the earth's resting temp to vary based on surface albedo can be exploited to passively cool the earth, and thus can be exploited to passively cool other objects.

Google is now good enough that it gets some useful hits for "temperature lower than the CMB". In particular, on this thread from ResearchGate I found some useful info. Most of the discussion is preoccupied with negative temperatures, but one or two of the replies agree with my interpretation and they go unchallenged:

Rüdiger Mitdank · Humboldt-Universität zu Berlin:

The cosmic Background Radiation is in a very good approximation a black Body Radiation. Every Body which is in a thermal Equilibrium with this Radiation source has this temperature. If you have another Radiation sources, usually hotter bodies like suns, the temperature increases. If due to the surface reflectivity the Absorption is low, the Body temperature approximates to a lower value, that Emission and Absorption are equal. Therefore it might be possible, that bodies consisting of ice or snow and having a clean surface, have a temperature below cosmic Background temperature. This occurs only far away from any other Radiation source. I would look for comets out of our sun system.

Because if you remove any one, it would equilibrate with the other. But you're proposing to put your system in deep space where there is only the background. If you did that to Earth, you'd find it would very rapidly equilibrate to close to 2.7 K, and the final temperature is irrespective of surface albedo.

The final temp for the earth in the (earth, sun, background) 'equilibrium' does depend on the surface albedo.

Error rate depends on hardware design as well as temperature. You're confusing a set of concepts here. As errors are generated in the computation, the entropy (as measured internally) will increase, and thus the heat level will increase. If this is what you are saying, you are correct.

I recommended Frank's work because it has the clearest unifying explanations of computational entropy/information. A deterministic computer is just an approximation - real systems are probabilistic (and quantum), and eventually we will move to those models of computation. The total entropy is always conserved, with some of the entropy budget being the usable computational bits (qubits) and some being unknown/error bits such as thermal noise (but this generalization can also cover quantum noise). The 'erasure' of a bit really is just intentional randomization.

The idea of a hard error comes from the deterministic approximation, which assumes that the state of every bit is exactly known. In a probabilistic circuit, we instead have a distribution over bit states, and circuit ops transform these distributions.

This is really just another way of saying that your computer is not 100% reversible (isentropic). This is because of inevitable uncertainties in construction (is the manufacturing process that created the computer itself a perfectly error-free computer? If so, how was the first perfectly error-free computer constructed?), uncertainties in the physics of operation,

Uncertainty in the construction can be modeled as a learning/inference problem. Instead of simple deterministic circuits, think of learning probabilistic circuits (there are no 'errors' so to speak, just distributions and various types of uncertainty). As inference/learning reduces uncertainty over variables of interest, reversible learning must generate an equivalent amount of final excess noise/garbage bits. Noise bits in excess of the internal desired noise bit reserve would need to be expelled - this is the more sophisticated form of cooling.

and inevitable interaction with the outside world.

Each device has an IO stream that is exactly bit-conserved and thus reversible from its perspective. The same principle that applies to each local circuit element applies to each device.

The final limitation is incoming entropy - noise from the outside world. This inflow must be balanced by a matching bit outflow from the internal noise bit reserves. This minimal noise flow (temperature) places ultimate limits on the computational capability of the system in terms of SNR and thus (analog/probabilistic) bit ops.

Replies from: passive_fist
comment by passive_fist · 2015-10-27T21:56:41.711Z · LW(p) · GW(p)

For the moment I'm just going to ignore everything else in this debate (I have other time/energy committments...) and just focus on this particular question, since it's one of the most fundamental questions we disagree on.

You are wrong, plain and simple. Rüdiger Mitdank is also wrong, despite his qualifications (I have equivalent qualifications, for that matter). Either that or he has failed to clearly express what he means.

If it were true that you could maintain an object colder than the background without consuming energy (just by altering surface absorption!), then you could have a free energy device. Just construct a heat engine with one end touching the object and the other end being a large black radiator.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-27T23:14:15.494Z · LW(p) · GW(p)

If it were true that you could maintain an object colder than the background without consuming energy (just by altering surface absorption!),

Yeah - several examples from Wikipedia for the temperature of a planet indicate that albedo and emissivity can differ (it's implied on this page, directly stated on this next page).

Here under effective temperature they have a model for a planet's surface temperature where the emissivity is 1 but the albedo can be greater than 0.

Notice that if you plug an albedo of 1 into that equation, you get a surface temperature of 0 K!

The generalized Stefan-Boltzmann law is thus the local equilibrium where the irradiance/power absorbed equals the irradiance/power emitted:

J_a = J_e

J_a = J_in * (1 - a)

J_e = e σ T^4

T = (J_in (1 - a) / (e σ))^(1/4)

J_in is the incoming irradiance from the light source, a is the material albedo, e is the material emissivity, σ is the Stefan-Boltzmann constant, and T is the temperature.

This math comes directly from the Wikipedia page; I've just converted from power units to irradiance, replacing the star's irradiance term of L/(16 π D^2) with a constant for an omnidirectional light source (the CMB).
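
To make that concrete, here is the cited effective-temperature formula evaluated for the Earth-Sun case with standard assumed values (solar luminosity, Earth-Sun distance, Bond albedo ~0.3, emissivity 1); it reproduces the familiar ~255 K figure, which is all I'm relying on here:

```python
# Evaluate the planetary effective-temperature formula referenced above for the
# Earth-Sun case: T = (L * (1 - a) / (16 * pi * sigma * D^2))^(1/4), emissivity 1.
# Standard values; the point is just that albedo enters the equilibrium temperature.
import numpy as np

sigma = 5.670e-8        # Stefan-Boltzmann constant, W / (m^2 K^4)
L_sun = 3.828e26        # solar luminosity, W
D = 1.496e11            # Earth-Sun distance, m
albedo = 0.3            # Earth's Bond albedo (approximate)

T_eff = (L_sun * (1 - albedo) / (16 * np.pi * sigma * D**2)) ** 0.25
print(f"Effective temperature with albedo {albedo}: {T_eff:.0f} K")    # ~255 K

T_blackbody = (L_sun / (16 * np.pi * sigma * D**2)) ** 0.25
print(f"With albedo 0 (black body): {T_blackbody:.0f} K")              # ~278 K
```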

In retrospect, one way I could see this being wrong is if the albedo and emissivity are always required to be the same for a particular wavelength. In the Earth example, the relevant albedo is for high-energy photons from the Sun, whereas the relevant emissivity is for lower-energy infrared. Is that your explanation?

then you could have a free energy device. Just construct a heat engine with one end touching the object and the other end being a large black radiator.

Hmm perhaps, but I don't see how that's a 'free' energy device.

The 'background' is a virtual/hypothetical object anyway - the CMB actually is just a flux of photons. The concept of temperature for photons and the CMB is contrived - defined tautologically based on an ideal black body emitter. The actual 'background temperature' for a complex greybody in the CMB depends on albedo vs emissivity - as shown by the math from wikipedia.

One can construct a heat engine to extract solar energy using a reflective high-albedo object (the low-temperature reservoir) and a low-albedo black object. Clearly this energy is not free; it comes from the Sun. There is no fundamental difference between photons from the Sun and photons from the CMB, correct?

So in theory the same principle should apply, unless there is some QM limitation at low temperatures like 2.7 K. Another way you could be correct is if the low CMB temperature is somehow 'special' in a QM sense. I suggested that earlier but you didn't bite. For example, if the CMB represents some minimal lower barrier for emittable photon energy, then the math model I quoted from Wikipedia breaks down at these low temperatures.

But barring some QM exception like that, the CMB is just like the sun - a source of photons.

Replies from: passive_fist
comment by passive_fist · 2015-10-27T23:46:27.259Z · LW(p) · GW(p)

I've never seen someone so confused about the basic physics.

Let's untangle these concepts.

Effective temperature is not actual temperature. It's merely the temperature of a blackbody with the same emitted radiation power. As such, it depends on two assumptions:

  1. The emitted power is thermal in origin,
  2. The emission spectrum is the ideal blackbody spectrum.

Of course if these assumptions aren't true then the temperature estimate is going to be wrong. Going back to my LED example, a glowing LED might have an 'effective temperature' of thousands of degrees K. This doesn't mean anything at all.

The source of your confusion could be that emitted and received radiation sometimes have different spectra. This is indeed true. It's true of the Earth, for instance. But at equilibrium, absorption and emission are exactly equal at all wavelengths. Please read this: https://en.wikipedia.org/wiki/Kirchhoff%27s_law_of_thermal_radiation

Notice that if you plug in an albedo of 1 into that equation, you get a surface temperature of 0K!

Irrelevant, as I said. (The concept of albedo isn't very useful for studying thermal equilibrium, I suggest you ignore it)

The generalized stefan boltzmann law is thus the local equilibrium where irradiance/power absorbed equals irradiance/power emitted:

Yes, this is the definition of being at the same temperature, if you didn't know. (Assuming, of course, that the radiation is thermal in origin and radiation is the only heat transfer process at work, which it is in our example). If you disagree with this then you are simply wrong by definition and there is nothing more to say.

You seem to think that temperature is some concept that exists outside of thermal equilibrium. This is a very common mistake. Temperature is only defined for a system at thermal equilibrium, and when two objects are in thermal equilibrium with one another, they are by definition at the same temperature. It does not matter at all how fast their atoms are moving or what they are made of.

The concept of temperature for photons and the CMB is contrived - defined tautologically based on an ideal black body emitter.

No it's not. It's based on analysis of the spectrum, which is almost perfectly the spectrum of an ideal black body.

The 'background' is a virtual/hypothetical object anyway

The physics would be exactly the same if it were an actual sheet of black material at 2.7 K covering the universe.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-28T01:39:08.502Z · LW(p) · GW(p)

The source of your confusion could be that emitted and received radiation sometimes have different spectra. This is indeed true. It's true of the Earth, for instance. But at equilibrium, absorption and emission are exactly equal at all wavelengths. Please read this: https://en.wikipedia.org/wiki/Kirchhoff%27s_law_of_thermal_radiation

That was indeed a source of initial confusion as I stated above, and I read Kirchhoff's law. I said:

if the albedo and emissivity are always required to be the same for a particular wavelength. In the earth example the albedo of relevance is for high energy photons from the sun whereas the relevant emissivity is lower energy infrared. Is that your explanation?

However, this still doesn't explain why passive temps lower than 2.7K are impossible. Passive albedo cooling works for the earth because snow/ice is highly reflective (an inefficient absorber/emitter) at the higher frequencies where most of the sun's energy is concentrated, yet is still an efficient absorber/emitter at the lower infrared frequencies - correct?

Now - what prevents the same principle for operating at lower temps? If ice can reflect efficiently at 500 nm and emit efficiently at 10um, why can't some hypothetical object reflect efficiently at ~cm range CMB microwave and emit efficiently at even lower frequencies?

You said: "But at equilibrium, absorption and emission are exactly equal at all wavelengths."

But clearly this isn't the case for the sun-earth system - and the law, according to wikipedia, is wavelength-dependent. So I don't really understand your sentence.

You are probably going to say ... 2nd law of thermodynamics, but even assuming that said empirical law is actually axiomatically fundamental, I don't see how it automatically rules out these scenarios.

Notice that if you plug in an albedo of 1 into that equation, you get a surface temperature of 0K!

Irrelevant, as I said. (The concept of albedo isn't very useful for studying thermal equilibrium, I suggest you ignore it)

This isn't an explanation, you still haven't explained what is different in my examples.

The physics would be exactly the same if it were an actual sheet of black material at 2.7 K covering the universe.

Sure - but notice that it's infinitely far away, so the concept of equilibrium goes out the window.

Also - aren't black holes an exception? An object using a black hole as a heat sink could presumably achieve temps lower than 2.7K.

Replies from: passive_fist
comment by passive_fist · 2015-10-28T03:52:34.680Z · LW(p) · GW(p)

why can't some hypothetical object reflect efficiently at ~cm range CMB microwave and emit efficiently at even lower frequencies?

I never said it can't. Such a material is definitely possible. It just couldn't passively reach lower temperature than the background. Assuming it's far out in space, as it cooled down to 2.7 K, it would eventually reach the limit where its absorption and emission at all frequencies equalled the background (due to Kirchhoff's law), and that's where the temperature would stay. If it started out at a lower temperature (due to being cooled beforehand) it would absorb thermal radiation until its temperature equalled the background (again, this is directly due to how we define temperature), and again that's where it would stay.

If you have a problem with that, take it up with Kirchhoff, not me :)

The Earth system is completely different here: the Earth is not in thermal equilibrium with the Sun, nor with the background, and the Sun is not in thermal equilibrium with the background. There is a net transfer of thermal energy occurring from the sun to the earth to the background (yes, the sun is heating up the background -- but not by much). And 'net transfer of energy' means no equilibrium. Sources that use 'thermal equilibrium' for the relationship between the Sun and the Earth are using the term loosely and incorrectly.

The situation here is far from equilibrium because of the massive amounts of energy that the sun is putting out. This is the very opposite of 'passive' operation.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-28T16:40:08.207Z · LW(p) · GW(p)

EDIT: fixed equation

why can't some hypothetical object reflect efficiently at ~cm range CMB microwave and emit efficiently at even lower frequencies?

I never said it can't. Such a material is definitely possible. It just couldn't passively reach lower temperature than the background.

The 'background' is just an incoming flux of photons. If you insist that this photon flux has a temperature, then it is obviously true that an object can have a lower temperature than this background flux, because an icy earth can have a lower temp than its background flux. As you yourself said earlier, temperature is only defined in terms of some equilibrium condition, and based on the equations that define temperature (below), the grey body 'temperature' for an irradiance distribution can differ from the black body temperature for the same distribution.

The math allows objects to shield against irradiance and achieve lower temps.

The general multispectral thermal emission of a grey body is just the black body emission function (planck's law) scaled by the wavelength/frequency dependent emissivity function for the material (as this is how emissivity is defined).

ε_λ B_λ(T)

The outgoing thermal emission power for a grey body is thus:

J_e = ∫ ε_λ B_λ(T) dλ

The object's temperature is stable when the net energy emitted equals the net energy absorbed (per unit time), where the net absorption is just the irradiance scaled by the emissivity function.

So we have:

∫ ε_λ B_λ(T) dλ = ∫ ε_λ E_λ dλ

Where E_λ is the incoming irradiance distribution as a function of wavelength, B_λ(T) is the black body (Planck) emission function, and the rest should be self-explanatory.

I don't have a math package handy to solve this for T. Nonetheless, given the two inputs: the material's emissivity and the incoming irradiance (both functions of wavelength), a simple numerical integration and optimization can find T solutions.
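For concreteness, a minimal sketch of that kind of numerical solve (Python and SciPy assumed; the emissivity profiles and the diluted solar-like irradiance are invented placeholders, and the geometry is idealized to a flat Lambertian surface):

```python
# Sketch of solving  int ε_λ B_λ(T) dλ = int ε_λ E_λ dλ  for T numerically.
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import brentq

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(lam, T):
    """Black body spectral radiance B_lambda(T) in SI units."""
    x = H * C / (lam * KB * T)
    with np.errstate(over='ignore'):
        return (2 * H * C**2 / lam**5) / np.expm1(x)

lam = np.logspace(-7, -1, 4000)   # 0.1 micron to 10 cm, log-spaced

def equilibrium_T(emissivity_fn, irradiance_fn, T_lo=0.01, T_hi=500.0):
    """Find T where emitted power (ε-weighted Planck exitance) matches absorbed power."""
    eps = emissivity_fn(lam)
    absorbed = trapezoid(eps * irradiance_fn(lam), lam)
    net = lambda T: trapezoid(eps * np.pi * planck(lam, T), lam) - absorbed
    return brentq(net, T_lo, T_hi)

# Invented inputs for illustration: a 5772 K spectrum diluted to ~340 W/m^2 total,
# a flat grey emissivity, and a step emissivity that is reflective below 4 microns.
dilution = 340.0 / trapezoid(np.pi * planck(lam, 5772.0), lam)
solar_like = lambda l: dilution * np.pi * planck(l, 5772.0)
grey = lambda l: np.full_like(l, 0.9)
icy = lambda l: np.where(l < 4e-6, 0.1, 0.9)

print(equilibrium_T(grey, solar_like))  # ~278 K: a flat emissivity cancels out of the balance
print(equilibrium_T(icy, solar_like))   # much lower: rejects most short-wave input, radiates freely in the IR
```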

The only inputs to this math are the incoming irradiance distribution and the material's emissivity/absorptivity distribution. There is no input labeled 'background temperature'.

Assuming I got it right, this should model/predict the temps for earth with different ice/albedo/greenhouse situations (ignoring internal heating sources) - and thus obviously also should allow shielding against the background! All it takes is a material function which falls off heavily at frequencies before the mean of the input irradiance distribution.

This math shows that a grey body in space can have an equilibrium temperature less than a black body.

A hypothetical ideal low temp whitebody would have an e function that is close to zero across the CMB frequency range, but is close to 1 for frequencies below the CMB frequency range. For current shielding materials the temp would at best be only a little lower than the CMB black body temp due to the 4th root, but still - for some hypothetical material the temp could theoretically approach zero as emissivity across the CMB range approaches zero. Put another way, the CMB doesn't truly have a temperature of 2.7K - that is just the black body approx temp of the CMB.

If your position is correct, something must be wrong with this math - some extra correction is required - what is it?

Assuming it's far out in space, as it cooled down to 2.7 K, it would eventually reach the limit where its absorption and emission at all frequencies equalled the background (due to Kirchhoff's law), and that's where the temperature would stay.

No - that isn't what the math says - according to the functions above which define temperature for this situation. As I pointed out earlier 2.7K is just the black body approx temp of the CMB (defined as the temp of a black body at equilibrium with the CMB!), and the grey body temp can be lower. Kirchhoff's law just says that the material absorptivity at each wavelength equals the emissivity at that wavelength. The wikipedia page even has an example with white paint analogous to the icy earth example.

There is a net transfer of thermal energy occurring from the sun to the earth to the background (yes the sun is heating up the background -- but not by much though). And 'net transfer of energy' means no equilibrium. Sources that use 'thermal equilibrium' for the relationship between the Sun and the Earth are using the term loosely and incorrectly.

None of this word level logic actually shows up in the math. The CMB is just an arbitrary set of photons. Translating the actual math to your word logic, the CMB set can be split as desired into subsets based on frequency such that a material could shield against the higher frequencies and emit lower frequencies - as indicated by the math. There is thus a transfer between one subset of the CMB, the object, and another CMB subset.

Another possible solution in your twisted word logic: there is always some hypothetical surface at zero that is infinitely far away. Objects can radiate heat towards this surface - and since physics is purely local, the code/math for local photon interactions can't possibly 'know' whether or not said surface actually exists. Or replace surface with vacuum.

For your position to be correct, there must be something extra in the math not yet considered - such as some QM limitation on low emission energies.

Replies from: passive_fist
comment by passive_fist · 2015-10-28T22:10:42.506Z · LW(p) · GW(p)

You're abusing the math here. You've written down the expanded form of the Stefan-Boltzmann equation, which assumes a very specific relationship between temperature and emitted spectrum (which you say yourself). Then you write the temperature in terms of everything else in the equation, and assume a completely different emitted spectrum that invalidates the original equation that you derived T from in the first place.

What you're doing isn't math, it's just meaningless symbolic manipulation, and it has no relationship to the actual physics.

If you insist that this photon flux has a temperature,

Of course it does - this is a very very common and useful concept in physics - and when you say it doesn't this betrays lack of familiarity with physics. Photon gas most certainly has temperature, in exactly the same way as a gas of anything else has temperature. Not only that, but it also has pressure and entropy.

In fact, in some situations, like an exploding hydrogen bomb, a photon gas has considerable temperature and pressure, far in excess of say the temperature in the center of the sun or the pressure in the center of the Earth.

https://en.wikipedia.org/wiki/Photon_gas

You say yourself that you assume 'local equilibrium with it's incoming irradiance'. Firstly, you're using the word 'local' incorrectly. I assume you mean 'equilibrium at each wavelength'. If so, this is the very definition of being at the same temperature as the incoming irradiance (assuming everything here is thermal in nature) and, again, there is nothing more to say. http://physics.info/temperature/

I'm sorry that you're so insistent on your incorrect viewpoint that you're not even willing to listen to the obvious facts, which are really very simple facts.

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-28T23:35:38.651Z · LW(p) · GW(p)

You're abusing the math here. You've written down the expanded form of the Stefan-Boltzmann equation, ..

EDIT: I fixed the equations above, replaced with the correct emission function for a grey body.

Yes I see - the oT^4 term on the left needs to be replaced with the black body emission function of wavelength and temperature - which I gather is just Planck's Law. Still, I don't (yet) see how that could change the general conclusion.

Here is the corrected equation:

∫ ε_λ B_λ(T) dλ = ∫ ε_λ E_λ dλ

Where E_λ is the incoming irradiance distribution as a function of wavelength, and the rest should be self-explanatory. For any incoming irradiance spectrum, the steady state temperature will depend on the grey body emissivity function and in general will differ from that of a black body.

Photon gas most certainly has temperature, in exactly the same way as a gas of anything else has temperature.

I'm aware that the concept of temperature is applied to photons - but given that they do not interact this is something very different than temperature for colliding particles. The temperature and pressure in the examples such as the hydrogen bomb require interactions through intermediaries.

The definition of temperature for photon gas in the very page you linked involves a black body model due to the lack of photon-photon interactions - supporting my point about photon temps such as the CMB being defined in relation to the black body approx.

You say yourself that you assume 'local equilibrium with it's incoming irradiance'. Firstly, you're using the word 'local' incorrectly

I meant local in the physical geometric sense - as in we are modelling only a local object and the space around it over a smallish timescale. The meaning of local equilibrium should thus be clear - the situation where net energy emitted equals net energy absorbed. This is the same setup as the examples from wikipedia.

I'm sorry that you're so insistent on your incorrect viewpoint that you're not even willing to listen to the obvious facts, which are really very simple facts.

Actually I've updated numerous times during this conversation - mostly from reading the relevant physics. I've also updated slightly on answers from physicists which reach the same conclusion as mine.

I've provided the radiosity equations for a grey body in outer space where the temperature is driven only by the balance between thermal emission and incoming irradiance. There are no feedback effects between the emission and the irradiance, as the latter is fixed - and thus there is no thermodynamic equilibrium in the Kirchhoff sense. If you still believe that you are correct, you should be able to show how that math is wrong and what the correct math is.

This grey body radiosity function should be able to model the temp of say an icy earth, and can show how that temp changes as the object is moved away from the sun such that the irradiance shifts from the sun's BB spectrum to that of the CMB.

We know for a fact that real grey body objects near Earth can have local temps lower than the black body temp for that same irradiance. The burden of proof is now on you to show how just changing the irradiance spectrum can somehow lead to a situation where all possible grey body materials have the same steady state temperature.

You presumably believe such math exists and that it will show the temp has a floor near 2.7K for any possible material emissivity function, but I don't see how that could possibly work.

I assume you mean 'equilibrium at each wavelength'

No. In retrospect I should not have used the word 'equilibrium'.

As you yourself said, the earth is not in thermodynamic equilibrium with the sun, and this is your explanation as to why shielding works for the earth.

Replace the sun with a distant but focused light source, such as a large ongoing explosion. The situation is the same. The earth is never in equilibrium with the explosion that generated the photons.

The CMB is just the remnant of a long gone explosion. The conditions of thermodynamic equilibrium do not apply.

If you are correct then you should be able to show the math. Provide an equation which predicts the temp of an object in space only as a function of the incoming (spectral) irradiance and that object's (spectral) emissivity.
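As a concrete version of that request, the same balance equation can be evaluated numerically for a background-only irradiance. A minimal sketch, assuming the incoming irradiance from an isotropic 2.725 K field is E_λ = π B_λ(2.725 K) (my simplifying assumption for the geometry) and an invented step-function emissivity that is nearly non-emissive across the CMB band:

```python
# Same balance equation, but with the incoming irradiance set to an isotropic
# 2.725 K black-body field (E_lambda = pi * B_lambda(2.725 K), assumed geometry).
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import brentq

H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(lam, T):
    x = H * C / (lam * KB * T)
    with np.errstate(over='ignore'):
        return (2 * H * C**2 / lam**5) / np.expm1(x)

lam = np.logspace(-5, 1, 6000)                 # 10 microns to 10 m
cmb_like = lambda l: np.pi * planck(l, 2.725)  # assumed incoming irradiance

# Invented 'shield' material: nearly non-emissive across the CMB band (~1 mm peak),
# emissive only at wavelengths longer than 10 cm.
shield = lambda l: np.where(l < 0.1, 1e-3, 0.9)

def equilibrium_T(emissivity_fn, irradiance_fn):
    eps = emissivity_fn(lam)
    absorbed = trapezoid(eps * irradiance_fn(lam), lam)
    net = lambda T: trapezoid(eps * np.pi * planck(lam, T), lam) - absorbed
    return brentq(net, 0.1, 100.0)

print(equilibrium_T(shield, cmb_like))  # ~2.725 K
```

Because the same ε_λ weights both sides of the balance when E_λ is itself a 2.725 K Planck spectrum, T = 2.725 K satisfies the equation for any non-negative emissivity profile, so the numerical root lands on the background temperature no matter how the step function is chosen.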

comment by AABoyles · 2015-10-21T19:11:43.863Z · LW(p) · GW(p)

The Great Filter isn't an explanation of why life on Earth is unique; rather, it's an explanation of why we have no evidence of civilizations that have developed beyond Kardashev I. So, rather than focusing on the probability that some life has evolved somewhere else, consider the reason that we apparently don't have intelligent life everywhere. THAT's the Great Filter.

Replies from: OrphanWilde
comment by OrphanWilde · 2015-10-21T19:22:52.646Z · LW(p) · GW(p)

"They do exist, but we see no evidence" is an alternative theory to the Great Filter, and I believe what Jacob Cannell is using wrt the cold dark model.

comment by MaximumLiberty · 2015-10-21T22:45:07.545Z · LW(p) · GW(p)

Surely "unvisited" is insignificant. There's no current science suggesting any means of faster-than-light travel. So, if you assume that extraterrestrial life would have lifespans grossly similar to terrestrial lifespans, we ought to remain unvisited.

"Saturated in the Great Silence" seems like a far more significant point.

Replies from: passive_fist, Will_BC, jacob_cannell
comment by passive_fist · 2015-10-23T01:14:28.372Z · LW(p) · GW(p)

So, if you assume that extraterrestrial life would have lifespans grossly similar to terrestrial lifespans, we ought to remain unvisited.

Human beings spread all over the globe on foot 75000-15000 years ago, despite the fact that no single human probably walked all the way from Africa to Australia. It's a fairly trivial assumption that an expanding interstellar civilization would not be limited by the lifespan of its inhabitants.

The galaxy may be big, but it is very small compared to the time-scales involved here. At walking pace (~5 km/h), you could travel 61 light years during the time from when the milky way formed up to now. At speeds easily achievable using chemical propulsion (~15 km/s), you could travel the circumference of the milky way --- twice!
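A quick back-of-the-envelope check of those two figures (a sketch with round numbers; the galaxy's age and a ~50,000 light year disc radius are approximations):

```python
# Back-of-the-envelope check of the two travel figures above (round numbers).
LY_KM = 9.461e12          # kilometres per light year
AGE_YR = 13.0e9           # rough age of the Milky Way, years
HOURS_PER_YR = 8766.0
SECONDS_PER_YR = 3.156e7

walking_ly = 5.0 * AGE_YR * HOURS_PER_YR / LY_KM       # 5 km/h for the galaxy's lifetime
chemical_ly = 15.0 * AGE_YR * SECONDS_PER_YR / LY_KM   # 15 km/s for the galaxy's lifetime
circumference_ly = 2 * 3.14159 * 50_000                # disc of ~50,000 ly radius

print(round(walking_ly))                         # ~60 light years
print(round(chemical_ly / circumference_ly, 1))  # ~2 circumferences
```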

comment by Will_BC · 2015-10-22T13:29:54.365Z · LW(p) · GW(p)

http://www.fhi.ox.ac.uk/intergalactic-spreading.pdf

You didn't actually do the math on that. According to this paper by the Future of Humanity Institute (Nick Bostrom's group), if life evolved to the point of interstellar travel 3 billion years ago and could travel at 50% of c, then you would expect it to travel not just to this galaxy, but the nearest million. If you go back five billion years and assume travel speeds of 99% of c, it could reach a billion galaxies. 75% of stars in the Milky Way that could support life are older than our Sun. It really is an enigma.
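For a rough sense of scale, the naive light-travel reach of those two scenarios works out as follows (a sketch that ignores cosmic expansion, which the paper itself handles properly):

```python
# Naive light-travel reach of the two scenarios cited (ignores cosmic expansion).
reach_50pct_3gyr = 0.50 * 3e9   # light years covered at 0.5c over 3 billion years
reach_99pct_5gyr = 0.99 * 5e9   # light years covered at 0.99c over 5 billion years
print(f"{reach_50pct_3gyr:.1e} ly, {reach_99pct_5gyr:.1e} ly")
# ~1.5e9 and ~5e9 light years, versus ~2.5e6 light years to Andromeda -
# hence reaching far beyond our own galaxy.
```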

comment by jacob_cannell · 2015-10-21T23:41:02.319Z · LW(p) · GW(p)

So, if you assume that extraterrestrial life would have lifespans grossly similar to terrestrial lifespans, we ought to remain unvisited.

Are we talking about civilization/life lifespan or individual organism lifespan?

Civilizations can send out long lived probes, and individual lifespans are somewhat irrelevant, especially for post-biological civilizations.

If life is as plentiful as it appears to be, then due to the enormous numbers we should expect to have been visited in our history unless there is a lot of future filtering somewhere in (num civs) x (avg civ 'active' lifespan) x (fraction of civs that explore).

FTL travel isn't necessary at all. The natural easy way to travel around the solar system is to use gravitational assists, which allows for travel at speeds on the order of the orbital speeds. The sun orbits the galaxy at a respectable speed of around 251 km/s, or roughly 0.1% of c, and some stars such as Scholz's Star travel in the opposite direction. So it should only take about a million years for even a slow-expanding civ to expand out 1,000 lyrs. And very small scout probes could more easily travel at faster speeds.

Basically we should expect the galaxy to be at least fully visited, if not colonized, by at most one galactic year after the birth of the first elder space civ. Earth is only about 18 galactic years old, whereas the galaxy is about 54 galactic years old.
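Checking those numbers (a rough sketch; one galactic year is taken here as ~250 million years, an approximation):

```python
# Rough check: 1,000 light years at the sun's orbital speed, and ages in galactic years.
LY_KM = 9.461e12
SECONDS_PER_YR = 3.156e7
GALACTIC_YEAR_YR = 2.5e8     # ~250 million years per galactic orbit (approximate)

crossing_yr = 1_000 * LY_KM / 251.0 / SECONDS_PER_YR  # years to cover 1,000 ly at 251 km/s
print(f"{crossing_yr:.1e}")                           # ~1.2 million years

print(round(4.5e9 / GALACTIC_YEAR_YR))    # Earth: ~18 galactic years old
print(round(13.5e9 / GALACTIC_YEAR_YR))   # Milky Way: ~54 galactic years old
```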

comment by Diadem · 2015-10-21T15:00:43.016Z · LW(p) · GW(p)

The article is unclear in its terms. At the top it says "92 percent of the Universe's habitable planets have yet to be born" and at the bottom it says "Earth is in the first 8 percent". Those two statements can only both be true if no habitable planets were formed between the formation of the earth and now (which is, of course, not the case). If the former is correct, Earth might be significantly higher than top 8%.

I still don't see how this escapes the Fermi paradox though. Even if we're top 1%, that still means there must be a great, great many potential alien civilizations out there. A factor of 100 isn't going to significantly affect that conclusion.

comment by Gunnar_Zarncke · 2015-10-21T14:16:52.627Z · LW(p) · GW(p)

Actual paper: http://hubblesite.org/pubinfo/pdf/2015/35/pdf.pdf

Note that "Earth is one of the first habitable planets to form" seems to be an overstatement. It is one of the first 8% to form. But nonetheless...

Replies from: jacob_cannell
comment by jacob_cannell · 2015-10-21T21:49:10.820Z · LW(p) · GW(p)

Even ignoring uncertainty in their estimate, being in the first 8% is hardly statistically unusual or 'early'. It's just what we'd expect from the mediocrity principle. Unusual would be say 10^-20, like some physical constants which are apparently under significant observational selection pressure and have extremely improbable values.

Knowing that we are not improbably early does perhaps suggest that some alien models are unlikely - for example it is unlikely that aliens are very common and very aggressive/expansive in our region of the multiverse, because if that was the case then observers like us would always tend to find ourselves on an unusually early planet. But we aren't.