Mar 2020
3h 14m

#72 - Toby Ord on the precipice and huma...

Rob, Luisa, and the 80,000 Hours team
About this episode
This week Oxford academic and 80,000 Hours trustee Dr Toby Ord released his new book The Precipice: Existential Risk and the Future of Humanity. It's about how our long-term future could be better than almost anyone believes, but also how humanity's recklessness is putting that future at grave risk — in Toby's reckoning, a 1 in 6 chance of being extinguished this century.

I loved the book and learned a great deal from it (buy it here, US and audiobook release March 24). While preparing for this interview I copied out 87 facts that were surprising, shocking or important. Here's a sample of 16:

1. The probability of a supervolcano causing a civilisation-threatening catastrophe in the next century is estimated to be 100x that of asteroids and comets combined.

2. The Biological Weapons Convention — a global agreement to protect humanity — has just four employees, and a smaller budget than an average McDonald’s.

3. In 2008 a 'gamma ray burst' reached Earth from another galaxy, 10 billion light years away. It was still bright enough to be visible to the naked eye. We aren't sure what generates gamma ray bursts but one cause may be two neutron stars colliding.

4. Before detonating the first nuclear weapon, scientists in the Manhattan Project feared that the high temperatures in the core, unprecedented for Earth, might be able to ignite the hydrogen in water. This would set off a self-sustaining reaction that would burn off the Earth’s oceans, killing all life above ground. They thought this was unlikely, but many atomic scientists feared their calculations could be missing something. As far as we know, the US President was never informed of this possibility, but similar risks were one reason Hitler stopped…

N.B. I've had to cut off this list as we only get 4,000 characters in these show notes, so:

Click here to read the whole list, see a full transcript, and find related links.

And if you like the list, you can get a free copy of the introduction and first chapter by joining our mailing list.

While I've been studying these topics for years and known Toby for the last eight, a remarkable amount of what's in The Precipice was new to me.

Of course the book isn't a series of isolated amusing facts, but rather a systematic review of the many ways humanity's future could go better or worse, how we might know about them, and what might be done to improve the odds.

And that's how we approach this conversation, first talking about each of the main threats, then how we can learn about things that have never happened before, then finishing with what a great future for humanity might look like and how it might be achieved.

Toby is a famously good explainer of complex issues — a bit of a modern Carl Sagan character — so as expected this was a great interview, and one which Arden Koehler and I barely even had to work for.

Some topics Arden and I ask about include:

• What Toby changed his mind about while writing the book
• Are people exaggerating when they say that climate change could actually end civilisation?
• What can we learn from historical pandemics?
• Toby’s estimate of the risk of unaligned AI causing human extinction in the next century
• Is this century the most important time in human history, or is that a narcissistic delusion?
• Competing visions of humanity's ideal future
• And more.

Get this episode by subscribing: type '80,000 Hours' into your podcasting app. Or read the linked transcript.

Producer: Keiran Harris.
Audio mastering: Ben Cordell.
Transcriptions: Zakee Ulhaq.

Up next
Jul 8
#220 – Ryan Greenblatt on the 4 most likely ways for AI to take over, and the case for and against AGI in <8 years
Ryan Greenblatt — lead author on the explosive paper “Alignment faking in large language models” and chief scientist at Redwood Research — thinks there’s a 25% chance that within four years, AI will be able to do everything needed to run an AI company, from writing code to design…
2h 50m
Jun 24
#219 – Toby Ord on graphs AI companies would prefer you didn't (fully) understand
The era of making AI smarter just by making it bigger is ending. But that doesn’t mean progress is slowing down — far from it. AI models continue to get much more powerful, just using very different methods, and those underlying technical changes force a big rethink of what comin…
2h 48m
Jun 12
#218 – Hugh White on why Trump is abandoning US hegemony – and that’s probably good
For decades, US allies have slept soundly under the protection of America’s overwhelming military might. Donald Trump — with his threats to ditch NATO, seize Greenland, and abandon Taiwan — seems hell-bent on shattering that comfort. But according to Hugh White — one of the world'…
2h 48m
Recommended Episodes
Apr 2020
Toby Ord: What are the odds civilisation will survive the century?
This week we talk to the philosopher Toby Ord about the end of civilisation as we know it. Ok, it’s not all doom and gloom. As Toby says, he’s an optimistic person, but in his new book The Precipice (£25, Bloomsbury) he explains why we’re at a point in time where we, as a species…
35m 35s
Jun 2023
848: Yuval Noah Harari | Peering into the Future of Humanity
Yuval Noah Harari (@harari_yuval) is a historian and the bestselling author of Sapiens: A Brief History of Humankind, Homo Deus: A Brief History of Tomorrow, and 21 Lessons for the 21st Century. His latest book, Unstoppable Us, Volume 1: How Humans Took Over the World, is out now…
1h 11m
Jun 2023
AI and human extinction
In the headlines this week eminent tech experts and public figures signed an open letter that read “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.” One of the signatories was Geoffrey Hi…
27m 44s
May 2024
The Infinite Monkey's Guide To… The Future
We know the universe is rapidly expanding but what happens if other galaxies disappear from view? That’s what Eric Idle wants to know as he ponders the future and what it holds in store. Solar scientist Lucie Green says this is not worth dwelling on because we’ll all be wiped out…
19m 43s
Nov 2023
#151 — Will We Destroy the Future?
Sam Harris speaks with Nick Bostrom about the problem of existential risk. They discuss public goods, moral illusions, the asymmetry between happiness and suffering, utilitarianism, “the vulnerable world hypothesis,” the history of nuclear deterrence, the possible need for “turnk…
1h 33m
Apr 2023
Superintelligence by Nick Bostrom - Book Summary and Review | Free Audiobook
Show notes | PDF & Infographic | Free Audiobook | What happens when artificial intelligence surpasses human intelligence? This is the world that Nick Bostrom explores in his book, Superintelligence. Read 1 million books in minutes. For free. Get the PDF, infographic, full ad-free…
19m 57s
Oct 2021
Ep182 - John Lennox | 2084: Artificial Intelligence and the Future of Humanity
This week, mathematician and emeritus Oxford University Professor John Lennox discusses his book, “2084: Artificial Intelligence and the Future of Humanity.” You don't have to be a computer scientist to get involved in the discussion about where artificial intelligence and technol…
1h 1m
Sep 2023
Future Shock 2023, Part 2
In the 1970 book “Future Shock,” futurist Alvin Toffler outlined a vision of post-industrial society in which rapid technological and social changes outstrip the average human’s ability to cope. More than half a century later, how does this idea hold up and are contemporary human…
52m 43s
Oct 2022
Will Humans Ever Become Extinct?
Greg and Bella explore the science of extinction with the help of dinosaur enthusiast Mr Yates. Together, they’ll look at some famous recent extinctions, find out what role conservation plays in the preservation of species, and explore the major mass extinction events that shaped…
43m 7s