Toby Ord’s ‘The Precipice’ is published!

post by matthew.vandermerwe · 2020-03-08T21:20:20.653Z · LW · GW · 2 comments

Contents

    How to get it
    What you can do
  Summary of the book
    Part One: The Stakes
    Part Two: The Risks
    Part Three: The Path Forward

[x-posted from EA Forum]

The Precipice: Existential Risk and the Future of Humanity is out today. I’ve been working on the book with Toby for the past 18 months, and I’m excited for everyone to read it. I think it has the potential to make a profound difference to the way the world thinks about existential risk.

How to get it

What you can do

Summary of the book

Part One: The Stakes

Toby places our time within the broad sweep of human history: showing how far humanity has come in 2,000 centuries, and where we might go if we survive long enough. He outlines the major transitions in our past—the Agricultural, Scientific, and Industrial Revolutions. Each was characterised by a dramatic increase in our power over the natural world, and together they have yielded massive improvements in living standards. During the twentieth century, with the detonation of the atomic bomb, humanity entered a new era: we gained the power to destroy ourselves, without the wisdom to ensure that we don't. This is the Precipice, and how we navigate this period will determine whether humanity has a long and flourishing future, or no future at all.

Toby introduces the concept of existential risk—risks that threaten to destroy humanity's longterm potential. He shows how the case for safeguarding humanity from these risks draws support from a range of moral perspectives. Yet the problem remains grossly neglected—humanity spends more each year on ice cream than on protecting our future.

Part Two: The Risks

Toby explores the science behind the risks we face. In Natural Risks, he considers threats from asteroids and comets, supervolcanic eruptions, and stellar explosions. He shows how we can use humanity's 200,000-year history to place strict bounds on how high the natural risk could be. In Anthropogenic Risks, he looks at risks we have imposed on ourselves in the last century: nuclear war, extreme climate change, and environmental damage. In Future Risks, he turns to threats on the horizon from emerging technologies, focusing in detail on engineered pandemics, unaligned artificial intelligence, and dystopian scenarios.
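To see the shape of this survival-based argument, here is a minimal sketch in Python (my own illustration of its general form, with an arbitrary 5% cutoff, not a calculation from the book): if natural extinction risk were a constant p per century, the chance of surviving all 2,000 centuries of our history would be (1 − p)^2000, so any p that makes that survival wildly improbable can be ruled out.

```python
# Illustrative track-record bound on natural extinction risk.
# This shows the argument's general shape only, not the book's exact numbers.

CENTURIES_SURVIVED = 2000  # roughly 200,000 years of Homo sapiens

def survival_probability(per_century_risk: float) -> float:
    """Chance of surviving every century, assuming constant, independent risk."""
    return (1 - per_century_risk) ** CENTURIES_SURVIVED

# A 1% per-century natural risk would make our survival astronomically unlikely:
print(f"{survival_probability(0.01):.2e}")  # ~1.86e-09

# Largest per-century risk still leaving, say, a 5% chance of the
# observed survival (an arbitrary illustrative cutoff):
cutoff = 0.05
bound = 1 - cutoff ** (1 / CENTURIES_SURVIVED)
print(f"{bound:.5f}")  # ~0.00150, i.e. below roughly 1 in 700 per century
```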

Part Three: The Path Forward

Toby surveys the risk landscape and gives his own estimates for each risk. He also provides tools for thinking about how these risks compare and combine, and how to prioritise between them. He estimates that nuclear war and climate change each pose more risk than all the natural risks combined, and that risks from emerging technologies are higher still. Altogether, Toby believes humanity faces a 1 in 6 chance of existential catastrophe in the next century. He argues that it is in our power to end these risks today, and to reach a place of safety. He outlines a grand strategy for humanity, provides actionable policy and research recommendations, and shows what each of us can do. The book ends with an inspiring vision of humanity's potential, and what we might hope to achieve if we navigate the risks of the next century.
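A note on "combine": independent risks do not simply add, since an existential catastrophe can only strike once; the total is 1 − ∏(1 − pᵢ). Here is a minimal sketch, with placeholder probabilities chosen only to respect the ordering described above (they are my illustrative assumptions, not Ord's published estimates):

```python
# Combining independent existential risks: total = 1 - prod(1 - p_i).
# All probabilities below are illustrative placeholders, not Ord's estimates.
from math import prod

per_century_risks = {
    "natural (all combined)": 0.0001,
    "nuclear war":            0.001,
    "extreme climate change": 0.001,
    "engineered pandemics":   0.03,
    "unaligned AI":           0.10,
}

total = 1 - prod(1 - p for p in per_century_risks.values())
print(f"combined risk this century: {total:.3f}")  # ~0.129 with these numbers
```

The combined figure comes out slightly below the naive sum of the entries, and a fuller tally over all the risks the book considers is what yields Toby's overall 1-in-6 figure.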

2 comments


comment by Sherrinford · 2020-03-11T20:45:37.321Z · LW(p) · GW(p)

Does an epub version exist?

comment by Jose Miguel Cruz y Celis (jose-miguel-cruz-y-celis) · 2022-06-05T00:55:42.187Z · LW(p) · GW(p)

I'm very much aligned with the version of utilitarianism that Bostrom and Ord generally put forth, but a question came up in a conversation about this philosophy and its view of sustainability. As a thought experiment: what would be consistent with this philosophy if we discovered that a very clear way to minimize the existential risk due to X required the genocide of half, or some other significant subset, of the population?