Seven Apocalypses

post by scarcegreengrass · 2016-09-20T02:59:20.173Z · LW · GW · Legacy · 16 comments


0: Recoverable Catastrophe

An apocalypse is an event that permanently damages the world. This scale is for scenarios that are much worse than any normal disaster. Even if 100 million people die in a war, the rest of the world can eventually rebuild and keep going.


1: Economic Apocalypse

The human carrying capacity of the planet depends on the world's systems of industry, shipping, agriculture, and organization. If the planet's economic and infrastructural systems were destroyed, we would have to rely on more local farming, and we could not support as high a population or standard of living. Rebuilding the world economy could also prove very difficult if the Earth's mineral and fossil fuel resources are already depleted.


2: Communications Apocalypse

If large regions of the Earth become depopulated, or if sufficiently many humans die in the catastrophe, regions and continents could become isolated from one another. In this scenario, globalization is reversed by obstacles to long-distance communication and travel: telecommunications, the internet, and air travel are no longer common. Humanity is reduced to multiple, isolated communities.


3: Knowledge Apocalypse

If the loss of human population and institutions is so extreme that a large portion of human cultural or technological knowledge is lost, it could reverse one of the most reliable trends in modern history: the steady accumulation of knowledge. Some innovations and scientific models could take millennia to redevelop from scratch.


4: Human Apocalypse

Even if the human population were to be violently reduced by 90%, it's easy to imagine the survivors slowly resettling the planet, given the resources and opportunity. But a sufficiently extreme transformation of the Earth could drive the human species completely extinct. To many people, this is the worst possible outcome, and any further developments are irrelevant next to the end of human history.


5: Biosphere Apocalypse

In some scenarios (such as the physical destruction of the Earth), one can imagine the extinction not just of humans, but of all known life. Only astrophysical and geological phenomena would be left in this region of the universe. In this timeline we are unlikely to be succeeded by any familiar life forms.


6: Galactic Apocalypse

A rare few scenarios have the potential to wipe out not just Earth, but all nearby space as well. This usually comes up in discussions of hostile artificial superintelligence, or of very destructive chain reactions in exotic matter. However, the nature of cosmic inflation and of extraterrestrial intelligence is still unknown, so it's possible that some phenomenon would ultimately interfere with the destruction.


7: Universal Apocalypse

This form of destruction is thankfully exotic. People discuss the loss of all of existence in connection with topics like false vacuum bubbles, simulationist termination, solipsistic or anthropic observer effects, Boltzmann brain fluctuations, time travel, and religious eschatology.


The goal of this scale is to give a little more resolution to a speculative, unfamiliar space, in the same sense that the Kardashev Scale provides a little terminology for talking about the distant topic of interstellar civilizations. In x-risk conversations it can be important to distinguish between disasters and truly worst-case scenarios. Even if some of these scenarios are unlikely or impossible, they are nevertheless discussed, and terminology can be useful to facilitate conversation.

16 comments


comment by James_Miller · 2016-09-20T03:52:21.187Z · LW(p) · GW(p)

"A Disneyland with no children" apocalypse where optimization competition eliminates any pleasure we get from life.

A hell apocalypse, where large numbers of sentient lifeforms are condemned to very long-term suffering, possibly in a computer simulation.

Replies from: cousin_it, scarcegreengrass
comment by cousin_it · 2016-09-21T10:38:05.898Z · LW(p) · GW(p)

Your first option fits somewhere between 4 and 5. Your second option fits at the end of the scale and I'm not sure why it wasn't included in the OP.

comment by scarcegreengrass · 2016-09-20T13:44:56.292Z · LW(p) · GW(p)

Yeah, i was thinking about the latter (like Pascal's Mugging) but i think it might be too exotic to fit into a linear scale.

comment by David Althaus (wallowinmaya) · 2016-09-29T16:30:29.984Z · LW(p) · GW(p)

I don't understand why you exclude risks of astronomical suffering ("hell apocalypses").

Below you claim that those risks are "Pascalian" but this seems wrong.

comment by turchin · 2016-09-20T20:30:24.085Z · LW(p) · GW(p)

We could add here "Qualia apocalypses" - humans are alive but become p-zombies, maybe after a botched uploading.

Intelligence apocalypses - humans go extinct, but no other form of intelligence appears. Or humans survive, but their creative intelligence is permanently damaged and IQ never rises above 80, maybe because of global contamination by arsenic.

Gene allele apocalypses - many interesting alleles in the human genome disappear. The survivors look like humans, but many interesting traits are lost.

Primate apocalypses - all great apes go extinct, including humans; new intelligence could appear on Earth only 10 million years from now, or more.

Mammal apocalypses.

Vertebrate apocalypses.

Values apocalypses - human values erode and are replaced by other values, like Nazism. Probably this has happened several times in history.

Evolution apocalypses - evolution just ends; humans exist almost forever, but nothing new happens: no super-AI, no star travel. Just the end of complexity growth. AI may appear, but it will be as boring as Windows 7.

Individuality apocalypses - humans all become very similar to each other. This has already happened with globalisation.

Children apocalypses - humans just stop reproducing above the replacement rate.

Art apocalypses - humans lose interest in the arts, or the ability to create really interesting art. Some think this has already happened.

Wireheading-euphorium-superdrug apocalypses - new forms of brain stimulation completely distract humans from real life. ~~They spend all their time on FB.~~

Wrong upgrade apocalypses - basic human drives are edited at birth so that people are not aggressive, but they also lose interest in space exploration (S. Lem wrote a novel about this).

Replies from: scarcegreengrass
comment by scarcegreengrass · 2016-09-21T13:59:09.459Z · LW(p) · GW(p)

These are very interesting, particularly the Values Apocalypse. I'd be curious to draw up a longer and more detailed spectrum. I limited this one to seven to keep it low-resolution and memorable.

Replies from: turchin
comment by turchin · 2016-09-21T14:16:17.564Z · LW(p) · GW(p)

Did you see my map, Typology of x-risks? http://lesswrong.com/lw/mdw/a_map_typology_of_human_extinction_risks/ I am interested in creating maps that cover all topics related to x-risks.

Replies from: scarcegreengrass
comment by scarcegreengrass · 2016-09-21T17:26:17.067Z · LW(p) · GW(p)

Oh, no i haven't seen this one! I'll check it out.

What software do you use to make these?

Replies from: turchin
comment by turchin · 2016-09-21T23:48:09.242Z · LW(p) · GW(p)

Hand drawing in Adobe InDesign.

comment by Jude_B · 2016-09-28T17:34:34.540Z · LW(p) · GW(p)

Thanks for this summation.

Maybe we can divide item 7 into an "our universe apocalypse" and an "everything that (physically) exists apocalypse," since the two might not be equal.

Of course, there might be things that exist necessarily and thus cannot be "apocalypsed out", and it would also be strange if the principle that brought our universe into existence could only operate once.

So while it might be possible to have a Multiverse apocalypse, I think there will always be something (physical) in existence (though I don't know if this thought can really comfort us if we get wiped out...).

By the way, how do you (up)vote here?

Cheers

Replies from: scarcegreengrass
comment by scarcegreengrass · 2016-09-30T17:19:39.913Z · LW(p) · GW(p)

The upvote button for comments is in the lower left of the comment. The upvote button for posts is harder to find: it's at the bottom left of the post, above the text box for commenting.

Also, there could be a rule where only accounts with positive karma (i.e., not brand-new accounts) can upvote. I'm not sure.

(Slow response because i am also learning site features: Didn't see the 'letter' icon under my karma score.)

Replies from: Jude_B
comment by Jude_B · 2016-09-30T19:00:02.987Z · LW(p) · GW(p)

Thanks for the reply.

Yes, I think you're right and I still don't have enough karma points.

Well, I guess I will have to owe you an upvote in the meantime...

Thanks

comment by siIver · 2016-09-20T10:47:12.246Z · LW(p) · GW(p)

Can you elaborate a bit on what exactly your intention is?

Specifically, is this meant to be a scale of severity categories with one example for each, or is it meant as an exhaustive list of all relevant apocalyptic scenarios put into a ranking?

Replies from: scarcegreengrass
comment by scarcegreengrass · 2016-09-20T13:42:36.004Z · LW(p) · GW(p)

A scale of roughly-ordered tiers. It's a shorthand for expressing the level of devastation in far-future failure modes.

comment by tukabel · 2016-09-23T06:42:53.221Z · LW(p) · GW(p)

Where should one put the current civilization-destroying socialist catastrophe? Economic? Knowledge? Human? Hope it's recoverable.

Maybe it's the worst current risk, because of sheer politically fed irrationality (on a scale not seen since the religion-dumbed dark ages): from political correctness and general statism, to Keynesian-inspired megadebts, to political "scientific consensus" (e.g. on global warming), etc. Actually, it's worth saying that it's quite popular in modern pseudoreligious movements to use the word "science" as a true religious power word to justify anything (Marxists, Bolsheviks, Nazis... coincidentally, all socialist branches).