Preparing for the apocalypse might help prevent it
post by Ocracoke · 2022-08-25T00:18:58.832Z · LW · GW · 1 comment
When I tell people that I think there is a decent chance that an unaligned AGI will bring about the apocalypse within the next 20 years or so, they tend not to take it too seriously. Often that's because they think I would act differently if I really assigned a high probability to this.
When I say apocalypse, I consider it more likely that the development of AGI will lead to major disruptions such as war, civil unrest, or the breakdown of supply chains or basic infrastructure. I can also imagine an AGI actively trying to kill all humans, but I consider that less likely. Even if it actively tried to disempower humans, it wouldn't need to kill most of us; it would just have to destroy or control our institutions and infrastructure.
Because of that, I think it would be prudent to learn how to survive without supply chains, electricity, and so on: basically, to become a prepper. Not only would that increase my own chances of survival in a range of scenarios, it would also be an honest signal to others that I'm serious about the potential threat. I would be putting my money where my mouth is.
I'm surprised that, despite the high probability many people here assign to AGI takeover scenarios, there is little talk about how we might improve our survival chances if we can't avoid bad outcomes. I suspect this is partially because being a prepper carries certain connotations, like being a conspiracy theorist or having wonky ideas. AGI takeover is also a wonky idea, but a different kind of wonky, and we don't want to be grouped with people we think shouldn't be taken seriously.
So maybe the problem with this approach is that it would make others take AGI worries less seriously.
Or maybe the reason there isn't much talk about preparing for bad AGI outcomes is a sense that this can wait until the threat is more imminent?
1 comment
comment by JBlack · 2022-08-25T08:05:41.033Z · LW(p) · GW(p)
The main paths seem to be either that AGI will not actually be that dangerous, or that AGI will be lethal no matter how we prepare. If AGI is relatively slow and tops out at collapsing civilization without extinction, I expect there to be quite a lot of time to prepare. I don't assign any significant probability to AGI being dangerous enough to collapse civilization with little warning, yet in such a way that preparing for it now would help.
In principle, the best way to prepare for such a scenario would be to invest in AI technology so that I ride the productivity curve upward before the disaster strikes and have a lot more wealth to start material preparations later. But this would actively make the existential risks worse. I believe that increasing one's personal expected utility at the cost of increasing the risk of human extinction is massively immoral, and I refuse to do that.
So while I do have some preparations for ordinary disasters that almost certainly will happen anyway, these plans do not have much room for AGI disaster.