The Negentropy Cliff

post by mephistopheles · 2023-08-17T17:08:20.962Z · LW · GW · 10 comments

Something that is not often discussed explicitly, but factors into the different intuitions people have about P(Doom), is how close to optimal biology and humans are in terms of harnessing negative entropy. This consideration applies equally to nanobots, ASI, and artificial life in general.

Let's consider grey goo first: the race to turn all resources into copies of yourself has been going on for a few billion years and is quite competitive. In order to supplant organic life, nanobots would have to either surpass it in Carnot efficiency or (more likely) utilise a source of negative entropy thus far untapped. An example of this previously happening is photosynthesis, which opened up sunlight as a free-energy source no prior replicator could tap.
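For reference, the relevant thermodynamic ceiling here is the Carnot bound: no engine extracting work between a hot reservoir at temperature $T_h$ and a cold one at $T_c$ can beat

$$\eta_{\text{Carnot}} = 1 - \frac{T_c}{T_h}.$$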

If, in the design space of replicators, we are at a local (metastable) optimum, and the ability to consume negative entropy falls off a cliff somewhere reachable by synthetic but not organic life, we will get outcompeted quickly. So, are we stumbling in the dark next to a civilisation-swallowing precipice? Would an ASI need to discover new physics, or are there already examples of negentropy sources that it could use better than biology does?
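As a toy sketch of that picture, here is a purely illustrative one-dimensional "design space" (none of the numbers mean anything physical): greedy local search gets pinned to the nearby peak even though a much higher one sits a short jump away.

```python
# Toy picture of a metastable design space: incremental (evolutionary)
# search climbs the nearest fitness peak and stays there, even though a
# far higher peak exists across a valley. All values are illustrative.
import numpy as np

def fitness(x):
    # Two peaks: a modest local optimum near x = 1, a much higher one near x = 4.
    return np.exp(-(x - 1.0) ** 2) + 5.0 * np.exp(-(x - 4.0) ** 2)

def hill_climb(x, step=0.05, iters=200):
    # Greedy local search: accept a move only if fitness strictly improves.
    for _ in range(iters):
        for dx in (step, -step):
            if fitness(x + dx) > fitness(x):
                x += dx
                break
    return x

x_stuck = hill_climb(0.0)  # incremental search from an "organic" starting point
print(f"stuck at x = {x_stuck:.2f} with fitness {fitness(x_stuck):.2f}")
print(f"global peak near x = 4.00 with fitness {fitness(4.0):.2f}")
```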

10 comments

comment by johnswentworth · 2023-08-18T01:24:05.665Z · LW(p) · GW(p)

The very large majority of the sun's energy output currently just propagates through space unimpeded. Even restricting to Earth's land surface, vast tracts of desert have ample unharnessed solar negentropy.
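For concreteness, a rough back-of-the-envelope (standard textbook constants; only the order of magnitude matters):

```python
# Fraction of the Sun's total output that Earth intercepts at all;
# everything else radiates past us into space.
import math

R_EARTH = 6.371e6   # Earth radius, m
D_SUN = 1.496e11    # Earth-Sun distance, m
L_SUN = 3.828e26    # solar luminosity, W

# Earth's disc area over the full sphere at 1 AU.
frac = math.pi * R_EARTH**2 / (4 * math.pi * D_SUN**2)
print(f"fraction intercepted by Earth: {frac:.1e}")   # ~4.5e-10
print(f"power intercepted: {frac * L_SUN:.1e} W")     # ~1.7e17 W
```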

Replies from: mephistopheles
comment by mephistopheles · 2023-08-19T13:09:20.357Z · LW(p) · GW(p)

Agreed! I don't see a quick "hack" that would enable an ASI to harness that negentropy, though. The "cliff" I had in mind would look like creating a sizable competitive advantage with only small-scale control.

E.g., the first bacterium to perform photosynthesis changed a single molecule locally and had an advantage that its competitors could neither match nor negate. I wonder how an ASI could pull the rug from beneath us in a comparable manner.

comment by anithite (obserience) · 2023-08-18T03:57:33.348Z · LW(p) · GW(p)

In order to supplant organic life, nanobots would have to either surpass it in Carnot efficiency or (more likely) utilise a source of negative entropy thus far untapped.

Efficiency leads to victory only if violence is not an option. Animals are terrible at photosynthesis but survive anyway by taking resources from plants.

A species can invade and dominate an ecosystem by using a strategy that has no current counter. It doesn't need to be efficient. Intelligence allows for playing this game faster than organisms bound by evolution. Humans can make vaccines to fight the spread of a virus despite viruses being one of the fastest-adapting threats.

Green goo is plausible [LW · GW] not because it would necessarily be more efficient, but because it would be using a strategy the existing ecosystem has no defenses against (i.e., it's an invasive species).

Likewise, an AGI that wants to kill all humans could win even if it required 100x more energy per human-equivalent instance, provided it can execute strategies we can't counter. Just being able to copy itself and work with the copies is plausibly enough to allow world takeover with enough scaling [LW · GW].
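For scale, a toy doubling calculation (the doubling time is an arbitrary assumption, not a forecast):

```python
# If each instance can stand up one new copy per doubling period,
# growth is exponential; the specific period below is purely assumed.
DOUBLING_DAYS = 30  # illustrative assumption

instances, days = 1, 0
while instances < 8_000_000_000:  # one instance per human, just for scale
    instances *= 2
    days += DOUBLING_DAYS
print(f"{instances:,} instances after {days} days ({days / 365:.1f} years)")
```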

Replies from: mephistopheles, M. Y. Zuo
comment by mephistopheles · 2023-08-19T13:21:01.443Z · LW(p) · GW(p)

Of course! The way I think of it, violence amounts to using other lifeforms as a source of negentropy.

I like the invasive species argument; I agree that we would be very vulnerable to an engineered pathogen.

comment by M. Y. Zuo · 2023-08-18T13:27:57.835Z · LW(p) · GW(p)

Likewise, an AGI that wants to kill all humans could win even if it required 100x more energy per human-equivalent instance, provided it can execute strategies we can't counter. Just being able to copy itself and work with the copies is plausibly enough to allow world takeover with enough scaling [LW · GW].

We haven't done that against ants, even though the difference is way more than 100x.
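For scale, a Kleiber's-law back-of-the-envelope (the constant is roughly calibrated to mammals and the ant mass is an assumption; only the order of magnitude matters):

```python
# Kleiber's law: basal metabolic rate B ~ B0 * M**(3/4).
B0 = 3.4                        # W / kg**0.75, rough mammalian calibration
HUMAN_KG, ANT_KG = 70.0, 3e-6   # ~70 kg human, ~3 mg ant (assumed)

human_w = B0 * HUMAN_KG ** 0.75
ant_w = B0 * ANT_KG ** 0.75
print(f"human ~{human_w:.0f} W, ant ~{ant_w * 1e3:.2f} mW, "
      f"ratio ~{human_w / ant_w:.0e}")  # ~3e5, far beyond 100x
```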

Replies from: Lichdar, obserience
comment by Lichdar · 2023-08-18T17:21:14.610Z · LW(p) · GW(p)

We do have world takeover relative to ants; it's just that our desire to wipe out all ants is not that high.

Replies from: M. Y. Zuo
comment by M. Y. Zuo · 2023-08-18T17:41:48.153Z · LW(p) · GW(p)

Not really? Ants have more biomass than humans, and are likely to outlast us.

comment by anithite (obserience) · 2023-08-20T15:16:27.520Z · LW(p) · GW(p)

If we wanted to kill the ants, or almost any other organism in nature, we mostly have good enough biotech. For anything biotech can't kill, we can manipulate the environment to kill them all.

Why haven't we? Humans are not sufficiently unified+motivated+advanced to do all these things to ants or other bio life. Some of them are even useful to us. If we sterilized the planet we wouldn't have trees to cut down for wood.

Ants specifically are easy.

Gene drives allow for targeted elimination of a species: carpet-bomb their gene pool with replicating selfish genes. That's if an engineered pathogen isn't enough. Biotech will only get better.
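For intuition about why drives are so potent, here is a minimal deterministic model of a homing drive under random mating (the conversion efficiency and starting frequency are illustrative; resistance alleles, fitness costs, and population structure are all ignored):

```python
# A heterozygote carrying a homing drive converts its wild-type allele
# with efficiency c, so it transmits the drive with probability (1 + c) / 2.
# Under random mating the allele frequency follows p' = p * (1 + c * (1 - p)).
def drive_frequency(p0=0.01, c=0.9, generations=15):
    p = p0
    for g in range(1, generations + 1):
        p = p * (1 + c * (1 - p))
        print(f"gen {g:2d}: drive allele frequency = {p:.3f}")

drive_frequency()  # climbs from 1% to near-fixation in roughly a dozen generations
```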

What about bacteria living deep underground? "We haven't exterminated all the bacteria in hard-to-reach places, so humans are safe" would be a tenuous but logical extension of your argument.

If biotech is not enough, shape the environment so they can't survive in it. Trees don't do well in a desert. If we spent the next hundred years adapting current industry to space and building enormous mirrors, we could barbecue the planet. It would take time, but that would be the end of all Earth-based biological life.

comment by Shmi (shminux) · 2023-08-18T01:41:35.261Z · LW(p) · GW(p)

As mentioned in another comment, efficiency is not the bottleneck. You don't have to rely on the solar radiation incident on Earth to increase entropy for fun and profit. There are other energy sources around: we are not using fission much, we are not spending nearly enough resources to figure out fusion, geothermal is all but untapped, and there are countless other sources as well. And that's just on Earth.
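For rough scale (approximate, widely cited figures; exact values vary by source):

```python
# Order-of-magnitude comparison of some terrestrial energy flows, in TW.
flows_tw = {
    "solar flux intercepted by Earth": 170_000,
    "Earth's internal (geothermal) heat flow": 47,
    "current human primary energy use": 19,
}
for name, tw in flows_tw.items():
    print(f"{name}: ~{tw:,} TW")
```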

comment by Lichdar · 2023-08-18T17:08:51.802Z · LW(p) · GW(p)

I think that even if AI proves strictly incapable of surviving over the long run due to various efficiency constraints, that has no bearing on its ability to kill us all.

A paperclip maximizer that eventually runs into a halting problem as it tries to paperclip itself may very well have killed everyone by that point.

I think the term for this is "minimum viable exterminator."