David talks to Martin Rees about how we should evaluate the greatest threats facing the human species in the twenty-first century. Does the biggest danger come from bio-terror or bio-error, climate change, nuclear war or AI? And what prospects does space travel provide for a post-human future?
Talking Points:
Existential risk is risk that cascades globally and causes a severe setback to civilization. We are now so interconnected and so empowered as a species that humans could be responsible for this kind of destruction.
There are four categories of existential risk: climate change, bio-terror/bio-error, nuclear weapons, and AI/new technology.
These threats are human-made. Solving them is also our responsibility.
Mentioned in this episode:
Further Learning:
And as ever, recommended reading curated by our friends at the LRB can be found here: lrb.co.uk/talking