Quotes on Existential Risk

post by lukeprog · 2012-04-28T02:01:33.295Z · 15 comments

Similar to the AGI Quotes thread and the monthly Rationality Quotes threads, this is a thread for memorable quotes about existential risk.

Unlike those threads, please feel free to quote posts and comments from Less Wrong or Overcoming Bias.

15 comments

comment by lukeprog · 2012-04-28T02:14:31.469Z

I believe that if we destroy mankind, as we now can, this outcome will be much worse than most people think. Compare three outcomes:

(1) Peace.
(2) A nuclear war that kills 99% of the world’s existing population.
(3) A nuclear war that kills 100%.

(2) would be worse than (1), and (3) would be worse than (2). Which is the greater of these two differences? Most people believe that the greater difference is between (1) and (2). I believe that the difference between (2) and (3) is very much greater. … The Earth will remain habitable for at least another billion years. Civilization began only a few thousand years ago. If we do not destroy mankind, these few thousand years may be only a tiny fraction of the whole of civilized human history. The difference between (2) and (3) may thus be the difference between this tiny fraction and all of the rest of this history. If we compare this possible history to a day, what has occurred so far is only a fraction of a second.

Derek Parfit, 1984, Reasons and Persons
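As a quick check of Parfit's closing analogy, here is a minimal sketch in Python; the round figure of 5,000 years for civilization so far is an assumption for illustration, not Parfit's own number:

```python
# If "at least another billion years" of possible civilized history were compressed
# into a single 24-hour day, how much of that day has already elapsed?
civilization_so_far_years = 5_000         # assumed round figure for "a few thousand years"
possible_history_years = 1_000_000_000    # "at least another billion years"
seconds_per_day = 24 * 60 * 60

elapsed_fraction = civilization_so_far_years / possible_history_years
print(elapsed_fraction * seconds_per_day)  # about 0.43 seconds, i.e. "only a fraction of a second"
```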

Replies from: gjm, army1987
comment by gjm · 2012-04-28T08:28:46.819Z

I don't at all disagree with Parfit's assessment that the difference between (2) and (3) is much greater than the difference between (1) and (2) (and, incidentally, Reasons and Persons is an excellent book, though rather dense), but I wonder whether he's right that "most people" -- by which I think he probably really means something like "most people who would ever consider such a question in the first place" -- think otherwise.

Replies from: Alejandro1
comment by Alejandro1 · 2012-04-28T08:32:13.421Z

Maybe he means that most people (out of everyone, including those who never have considered the question) would give that answer if asked out of the blue.

Replies from: gjm
comment by gjm · 2012-04-28T20:03:28.395Z

Yes, perhaps he does, but if so I don't know why he's paying any attention -- in a quite technical work of philosophy aimed at professional philosophers -- to the question of what "most people" would unreflectively say to a question of that sort.

comment by A1987dM (army1987) · 2012-04-28T08:55:36.242Z

I'm not sure I'd agree. “Civilization began only a few thousand years ago”, all right, but during those few thousand years the Earth wasn't full of radioactive waste.

comment by Rain · 2012-04-28T03:01:40.355Z

Destroying the Earth is harder than you may have been led to believe.

You've seen the action movies where the bad guy threatens to destroy the Earth. You've heard people on the news claiming that the next nuclear war or cutting down rainforests or persisting in releasing hideous quantities of pollution into the atmosphere threatens to end the world.

Fools.

The Earth is built to last. It is a 4,550,000,000-year-old, 5,973,600,000,000,000,000,000-tonne ball of iron. It has taken more devastating asteroid hits in its lifetime than you've had hot dinners, and lo, it still orbits merrily. So my first piece of advice to you, dear would-be Earth-destroyer, is: do NOT think this will be easy.

This is not a guide for wusses whose aim is merely to wipe out humanity. I (Sam Hughes) can in no way guarantee the complete extinction of the human race via any of these methods, real or imaginary. Humanity is wily and resourceful, and many of the methods outlined below will take many years to even become available, let alone implement, by which time mankind may well have spread to other planets; indeed, other star systems. If total human genocide is your ultimate goal, you are reading the wrong document. There are far more efficient ways of doing this, many of which are available and feasible RIGHT NOW. Nor is this a guide for those wanting to annihilate everything from single-celled life upwards, render Earth uninhabitable or simply conquer it. These are trivial goals in comparison.

This is a guide for those who do not want the Earth to be there anymore.

-- Sam Hughes, Preamble to How to destroy the Earth

Replies from: army1987
comment by A1987dM (army1987) · 2012-04-28T12:54:32.002Z

Rendering Earth uninhabitable would be as bad an existential risk as destroying it altogether, though.

comment by Tuxedage · 2012-04-28T14:12:06.958Z

There is a saying in heuristics and biases that people do not evaluate events, but descriptions of events - what is called non-extensional reasoning. The extension of humanity's extinction includes the death of yourself, of your friends, of your family, of your loved ones, of your city, of your country, of your political fellows. Yet people who would take great offense at a proposal to wipe the country of Britain from the map, to kill every member of the Democratic Party in the U.S., to turn the city of Paris to glass - who would feel still greater horror on hearing the doctor say that their child had cancer - these people will discuss the extinction of humanity with perfect calm. "Extinction of humanity", as words on paper, appears in fictional novels, or is discussed in philosophy books - it belongs to a different context than the Spanish flu. We evaluate descriptions of events, not extensions of events. The cliché phrase end of the world invokes the magisterium of myth and dream, of prophecy and apocalypse, of novels and movies. The challenge of existential risks to rationality is that, the catastrophes being so huge, people snap into a different mode of thinking. Human deaths are suddenly no longer bad, and detailed predictions suddenly no longer require any expertise, and whether the story is told with a happy ending or a sad ending is a matter of personal taste in stories.

.

No more than Albert Szent-Györgyi could multiply the suffering of one human by a hundred million can I truly understand the value of clear thinking about global risks. Scope neglect is the hazard of being a biological human, running on an analog brain; the brain cannot multiply by six billion. And the stakes of existential risk extend beyond even the six billion humans alive today, to all the stars in all the galaxies that humanity and humanity's descendants may some day touch. All that vast potential hinges on our survival here, now, in the days when the realm of humankind is a single planet orbiting a single star. I can't feel our future. All I can do is try to defend it.

-- Eliezer Yudkowsky, Cognitive biases potentially affecting judgment of global risks

I hope I am allowed to quote EY. I personally thought this was a very well written and beautiful quote.

Replies from: pedanterrific, ciphergoth
comment by pedanterrific · 2012-04-28T18:20:07.561Z

First reaction: EY wrote entire long paragraphs without any italics? So I looked up the paper. It should be "The *extension* of" and "phrase *end of the world* invokes". Apparently he toned it down.

comment by Paul Crowley (ciphergoth) · 2012-04-28T15:03:32.278Z

This is two quotes; quite a bit of text separates these two paragraphs.

comment by Tuxedage · 2012-04-28T14:34:05.259Z

Suppose you have a moral view that counts future people as being worth as much as present people. You might say that fundamentally it doesn't matter whether someone exists at the current time or at some future time, just as many people think that from a fundamental moral point of view, it doesn't matter where somebody is spatially---somebody isn't automatically worth less because you move them to the moon or to Africa or something. A human life is a human life. If you have that moral point of view that future generations matter in proportion to their population numbers, then you get this very stark implication that existential risk mitigation has a much higher utility than pretty much anything else that you could do. There are so many people that could come into existence in the future if humanity survives this critical period of time---we might live for billions of years, our descendants might colonize billions of solar systems, and there could be billions and billions of times more people than exist currently. Therefore, even a very small reduction in the probability of realizing this enormous good will tend to outweigh even immense benefits like eliminating poverty or curing malaria, which would be tremendous under ordinary standards.

-- Nick Bostrom
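A rough, illustrative expected-value calculation of the kind Bostrom sketches here; the population figure, risk reduction, and baseline benefit below are assumed round numbers for illustration, not Bostrom's:

```python
# If future people count equally, a tiny reduction in extinction probability applied to
# an enormous potential population can outweigh a very large but certain benefit today.
potential_future_lives = 10**16      # assumed: descendants across billions of solar systems
risk_reduction = 10**-6              # assumed: a one-in-a-million drop in extinction probability
certain_benefit_lives = 10**9        # assumed: roughly a billion lives improved with certainty

expected_future_lives = potential_future_lives * risk_reduction
print(expected_future_lives)                            # 1e10 lives in expectation
print(expected_future_lives > certain_benefit_lives)    # True: the small risk reduction dominates
```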

comment by RolfAndreassen · 2012-04-28T18:39:37.206Z

The Earth is just too small and fragile a basket for the human race to keep all its eggs in.

-- Robert Heinlein

comment by lukeprog · 2012-04-28T02:07:12.229Z

[Our] situation was well characterized by the mathematician John von Neumann: "Technological possibilities are irresistible to man. If man can go to the moon, he will. If he can control the climate, he will." ...

There is a simple way of establishing the downright absurdity... of accepting such technological compulsiveness; and that is to carry von Neumann's dictum to its logical conclusions: If man has the power to exterminate all life on earth, he will. Since we know that the governments of the United States and Soviet Russia have already created nuclear, chemical, and bacterial agents in the massive quantities needed to wipe out the human race, what prospects are there of human survival, if this practice of submitting to extravagant and dehumanized technological imperatives is "irresistibly" carried to its final stage?

Lewis Mumford, 1970, "The Technological Imperative"

comment by Paul Crowley (ciphergoth) · 2012-04-28T14:28:02.264Z

All else being equal, not many people would prefer to destroy the world.

-- Eliezer Yudkowsky, Cognitive biases potentially affecting judgment of global risks

comment by daenerys · 2012-04-28T02:52:46.639Z

"If a man with this tragic sense approaches, not fire, but another manifestation of original power, like the splitting of the atom, he will do so with fear and trembling. He will not leap in where angels fear to tread, unless he is prepared to accept the punishment of the fallen angels.

Neither will he calmly transfer to the machine made in his own image the responsibility for his choice of good and evil, without continuing to accept a full responsibility for that choice.

Norbert Wiener, 1950, The Human Use of Human Beings