Is this a Pivotal Weak Act? Creating bacteria that decompose metal

post by doomyeser · 2024-09-11T18:07:19.385Z · LW · GW · 1 comment

This is a question post.


This has been haunting my mind for a while and I would appreciate feedback on it!

In his infamous article "AGI Ruin: A List of Lethalities" [LW · GW], Eliezer defines a "pivotal weak act" and gives a heuristic proof that no such thing can exist.

TL;DR: I think his proof is wrong and that there is a counterexample. I believe creating bacteria that decompose metal and silicon (or any other superset of the materials GPUs consist of) would constitute a pivotal weak act.

Long Version:
In his article, Eliezer outlines several hopes held by people who claim AGI won't be that bad, or any problem at all, and then cruelly squashes them. One of those hopes is the possibility of executing a "pivotal weak act": a small group of people executes some action X that prevents AGI from being built. For example, a group that is privy to the dangers of AGI could command a friendly AGI to "burn all GPUs", and then we are good. Eliezer argues that any AGI powerful enough to prevent or at least indefinitely postpone unaligned AGI (pivotal) must itself be powerful enough that it needs to be aligned (not weak), which we don't know how to do. I believe his proof is false.

Definition:
A Pivotal Weak Act would be some action or event A such that
1. A happening or being executed prevents the advent of an unaligned AGI indefinitely, or at least delays it for a very long time (Pivotal)
2. A does not itself pose a significant X-risk for humanity as a whole (Weak)
3. A is realistically achievable with technology attainable in the coming decades (Realism)

Furthermore, it is not required that
- A is in any way related to or facilitated by an AI system
- A has no collateral damage
- A is moral, legal or anywhere near the Overton Window
- A is achievable today with current technology

 
I think that the following is an example of a Pivotal Weak Act.

Creating bacteria that decompose metal (and spreading them worldwide)

This is pivotal, since it is a special case of "Burning all GPUs".
It is (likely) weak, since there are uncontacted tribes in the Amazon that certainly live without (forged) metal and would barely even notice.
I am not sure how realistic it is, because I have no knowledge of microbiology, but it does not seem to be sci-fi, as something like it already appears to be going on:

https://www.bam.de/Content/EN/Standard-Articles/Topics/Environment/Biocorrosion/mic-microbiologically-influenced-corrosion.html

Caveats:
- There is a Weakness vs. Pivotality tradeoff, since bacteria don't spread as quickly as viruses. They would have to be specially engineered, and that can be risky. The more natural the bacteria, the weaker but also the less pivotal the act.
- This is not advocacy, since it would likely kill a lot of people, but I hope that there is renewed interest in discussing the idea of Pivotal Weak Acts, and I am sure that some smart people out there can come up with better ideas and scenarios.

But didn't Eliezer prove that there are no Pivotal Weak Acts?
I believe Eliezer has made an error in his proof. I will restate the proof as I understood it and highlight the error.

Eliezer takes a look at the act of "Burning all GPUs" (BAG) and states that this is a slight overestimation of the complexity needed for an act to be pivotal. I agree so far. In order to prevent AGI, it is necessary to prevent large GPU clusters from running potentially dangerous training algorithms. To achieve that, there are these scenarios, in ascending complexity:
1. Let the clusters be assembled, but make sure nobody runs dangerous algorithms on them (unrealistic)
2. Have GPUs exist, but prevent them from being assembled into large enough clusters (I guess this is the current policy goal)
3. Have GPUs not exist.

Then he states that "A GPU-burner is also a system powerful enough to, and purportedly authorized to, build nanotechnology, so it requires operating in a dangerous domain at a dangerous level of intelligence and capability". Thus, since the BAG scenario is basically the minimum complexity necessary, and that is already difficult to align, there can be no Pivotal Weak Acts.

But, as is clear, he assumes incorrectly that the pivotal act has to be carried out by some sort of AI system, and he also seems to suggest that the goal of the GPU-burner is to "Burn only GPUs" (BOG), while the task BAG is only a slight overestimate when it is taken to mean "Burning (at least) all GPUs" as opposed to "Burning (only) all GPUs".

I believe that this switch in what is meant by "Burning all GPUs", together with the assumption that an AI system needs to execute it, is what incorrectly leads him to conclude that there is no Pivotal Weak Act. The gap opening up between "Burning only the GPUs" and "Burning any superset of GPUs not containing all of humanity" is where the possibilities lie.

Answers

answer by faul_sname · 2024-09-11T21:18:37.348Z · LW(p) · GW(p)

"Create bacteria that can quickly decompose any metal in any environment, including alloys and including metal that has been painted, and which also are competitive in those environments, and will retain all of those properties under all of the diverse selection pressures they will be under worldwide" is a much harder problem than "create bacteria that can decompose one specific type of metal in one specific environment", which in turn is harder than "identify specific methanobacteria which can corrode exposed steel by a small fraction of a millimeter per year, and find ways to improve that to a large fraction of a millimeter per year."

Also it seems the mechanism is "cause industrial society to collapse without killing literally all humans" -- I think "drop a sufficiently large but not too large rock on the earth" would also work to achieve that goal; you don't have to do anything galaxy-brained.

comment by RHollerith (rhollerith_dot_com) · 2024-09-11T21:32:49.545Z · LW(p) · GW(p)

Because of the difficulty of predicting a "safe upper bound" on the size of rock below which the risk of human extinction stays within acceptable limits, I prefer the idea of destroying all the leading-edge fabs in the world, or of reducing the supply of electricity worldwide to levels low enough that the AI labs cannot compete for electricity with municipalities (who need some electricity just to maintain social order) or with the basic infrastructure required just to keep most people alive. As if either of those two outcomes weren't hard enough to achieve, we would also have to maintain such an improved state of affairs (i.e., no leading-edge fab capability, or severely degraded electricity generation capability) for long enough (probably at least a century, in my estimation) for some other, less drastic way of protecting against reckless AI development to come into being.

Neither OP's metal-eating bacteria, the large rock from space, nor either of the two interventions I just described is feasible enough to be worth thinking about much, IMO (and, again, the large rock from space carries a much higher extinction risk than the other three).

Replies from: doomyeser
comment by doomyeser · 2024-09-16T10:07:25.847Z · LW(p) · GW(p)

Thank you for your answer!

Not only would you have to maintain such a state, you would also have to induce it in the first place. Neither the "correctly" sized rock nor the electricity reduction is something you can just do without a government consensus that would enable you to prescribe and enforce a moratorium, which would be the preferred and most humane solution anyway.

Another humane idea I had, though likely just as unrealistic, would be to simply buy up most of the GPUs, or the resources used to produce them, making it economically unviable to build large GPU clusters. I doubt that all the EA and LessWrong-doomer money in the world would suffice for that, though.

Replies from: rhollerith_dot_com
comment by RHollerith (rhollerith_dot_com) · 2024-09-18T14:51:13.875Z · LW(p) · GW(p)

buy up most of the GPUs or the resources used to produce them

That would backfire, IMHO. Specifically, GPUs would become more expensive, but that would last only as long as it takes for the GPU producers to ramp up production (which is very unlikely to take more than 5 years), after which GPU prices would go lower than they would've gone if we hadn't started buying them up (because of better economies of scale).

GPUs and the products and services needed to produce GPUs are not like the commodity silver, where, if you buy up most of the silver, the economy probably cannot respond promptly by producing a lot more silver. If, in contrast, you could make leading-edge fabs blow up, that would make GPUs more expensive permanently (by reducing investment in fabs), or at least it would if you could convince investors that leading-edge fabs are likely to continue to blow up.

comment by doomyeser · 2024-09-16T10:01:44.663Z · LW(p) · GW(p)

Yes, this kind of "preemptive teardown" is the underlying mechanism; however, "taking the semiconductors out of the equation" seemed to be a more indefinite approach than the things you outlined, as well as most of the other "non-galaxy-brained" things I have thought about.

answer by titotal · 2024-09-12T09:37:00.507Z · LW(p) · GW(p)

Building a bacterium that eats all metals would be world-ending: most elements on the periodic table are metals. If you engineer a bacterium that eats all metals, it would eat things that are essential for life and kill us all.

Okay, what about a bacterium that only eats "stereotypical" metals, like steel or iron? I beg you to understand that you can't just sub in different periodic-table elements and expect a bacterium to work the same. There will always be some material that the bacteria wouldn't work on and that computers could still be made with. And even making a bacterium that only works on one material, but is able to spread over the entire planet, is well beyond our abilities.

I think List of Lethalities is nonsense for other reasons, but he is correct that trying to do a "pivotal act" is a really stupid plan.

comment by doomyeser · 2024-09-16T10:16:06.810Z · LW(p) · GW(p)

Thank you for your answer!

As I said, I don't know much about microbiology and chemistry, so I cannot challenge anything you said. It is consistent with the other answers as well, which tell me that there is no way of doing it.
I was queasy about it anyway, since teleporting our technology back to the Stone Age and condemning 99.9% of humanity to death is quite extreme.

answer by Lalartu · 2024-09-12T03:54:33.552Z · LW(p) · GW(p)

No, that is not realistic. The bacteria described in the article don't really eat iron; they just make corrosive chemicals as metabolic waste. They rely on other sources of energy (sulfates or organic compounds). Metal-eating bacteria (those which derive energy by reducing metals) exist, but they require metals dissolved in water; eating solid metal doesn't work chemically.

Generally, I think Eliezer's definition of a weak pivotal act doesn't include civilizational collapse, because there are multiple obvious ways to cause that which don't kill all humans.

comment by doomyeser · 2024-09-16T10:13:41.532Z · LW(p) · GW(p)

Thank you for your answer!

There might be more obvious ways, but the ones I can think of, at least, are either only temporary and local disruptions or cannot be executed by a small coordinated group.

I understand that LW and the AI safety community do not want to be associated with terrorism and ending civilisation; however, Eliezer has talked about blowing up AI labs, so I am uncertain where the line here would be.

I accept that "preemptively destroying civilization" might be excluded from the definition of PWAs, but is that something that is discussed at all on LW or in the AI safety community? It seems to me that if you believe with 99.99% probability that AGI will kill us, or worse, torture us, then it should be on the table.
