FLI Podcast: The Precipice: Existential Risk and the Future of Humanity with Toby Ord

Toby Ord’s “The Precipice: Existential Risk and the Future of Humanity” has emerged as a new cornerstone text in the field of existential risk. The book presents the foundations and recent developments of this budding field from an accessible vantage point, providing an overview suitable for newcomers. For those already familiar with existential risk, Toby brings new historical and academic context to the problem, along with central arguments for why existential risk matters, novel quantitative analysis and risk estimations, deep dives into the risks themselves, and tangible steps for mitigation. “The Precipice” thus serves as both a tremendous introduction to the topic and a rich source of further learning for existential risk veterans. Toby joins us on this episode of the Future of Life Institute Podcast to discuss this definitive work on what may be the most important topic of our time.

Topics discussed in this episode include:
- An overview of Toby's new book
- What it means to be standing at the precipice and how we got here
- Useful arguments for why existential risk matters
- The risks themselves and their likelihoods
- What we can do to safeguard humanity's potential

You can find the page for this podcast here: https://futureoflife.org/2020/03/31/he-precipice-existential-risk-and-the-future-of-humanity-with-toby-ord/

Timestamps:
0:00 Intro
03:35 What the book is about
05:17 What does it mean for us to be standing at the precipice?
06:22 Historical cases of global catastrophic and existential risk in the real world
10:38 The development of humanity’s wisdom and power over time
15:53 Reaching existential escape velocity and humanity’s continued evolution
22:30 On effective altruism and writing the book for a general audience
25:53 Defining “existential risk”
28:19 What is compelling or important about humanity’s potential or future persons?
32:43 Various and broadly appealing arguments for why existential risk matters
50:46 Short overview of natural existential risks
54:33 Anthropogenic risks
58:35 The risks of engineered pandemics
01:02:43 Suggestions for working to mitigate x-risk and safeguard the potential of humanity
01:09:43 How and where to follow Toby and pick up his book

This podcast is possible because of the support of listeners like you. If you found this conversation to be meaningful or valuable, consider supporting it directly by donating at futureoflife.org/donate. Contributions like yours make these conversations possible.

About the Podcast

The Future of Life Institute (FLI) is a nonprofit working to reduce global catastrophic and existential risk from powerful technologies. In particular, FLI focuses on risks from artificial intelligence (AI), biotechnology, nuclear weapons, and climate change. The Institute's work is made up of three main strands: grantmaking for risk reduction, educational outreach, and advocacy within the United Nations, US government, and European Union institutions. FLI has become one of the world's leading voices on the governance of AI, having created one of the earliest and most influential sets of governance principles: the Asilomar AI Principles.