Sep 24, 2017
We have a guest on the show today – Phil Torres. Phil Torres is an author, Affiliate Scholar at the Institute for Ethics and Emerging Technologies, former contributor at the Future of Life Institute, and founding Director of the X-Risks Institute. He has published in Bulletin of the Atomic Scientists, Skeptic, Free Inquiry, The Humanist, Journal of Future Studies, Bioethics, Journal of Evolution and Technology, Foresight, Erkenntnis, and Metaphilosophy, as well as popular media like Time, Motherboard, Salon, Common Dreams, Counterpunch, Alternet, The Progressive, and Truthout.
I was absolutely delighted that he agreed to be interviewed for
a show like ours, and so I urge you to seek out his website –
risksandreligion.org – and buy one of his books. There's "The End:
What Science and Religion Tell Us about the Apocalypse",
which is on my shelf already, and, forthcoming, we have
"Morality, Foresight, and Human Flourishing", which is going
to act as an introduction to the whole field of existential risks,
which people have been thinking about for quite some time now.
So I would urge you all, if you’re interested in this topic – that
of risks to the entire human species, which I think we can agree
affects us all – to buy one of those books.
This is the first part of our conversation, which touches on what
is meant by an existential risk, some specific examples from the
modern world in terms of nuclear proliferation and nuclear
accidents; transhumanism; and how our societies and institutions
can deal with existential risks more effectively. We talk about the
field in general and how we can hope to think more constructively
about the end of the world – without waving a 'The End is Nigh'
sign! The second part, which focuses on AI, will be released
shortly.
Follow Phil @xriskology and the show @physicspod.