AI: Integral to the future or existential risk? (or both) - conversations on current evolution with Daniel Thorson
Podcast | Accidental Gods
Publisher | Accidental Gods
Media Type | audio
Categories Via RSS | Nature, Philosophy, Science, Society & Culture
Publication Date | Aug 02, 2023
Episode Duration | 01:17:46

How dangerous is AI? Are Large Language Models likely to subvert our children? Is Generalised AI going to wipe out all life on the planet? I don't know the answers to these questions. It may be that nobody does, but this week's guest was my go-to when I needed someone with total integrity to help unravel one of the great existential crises of our time: to lay it out as simply as we can without losing the essence of its complexity, to help us see the worst cases - and their likelihood - and the best cases, and then to navigate a route past the first and on to the second.

Daniel Thorson is an activist - he was active in the early days of the Occupy movement and in Extinction Rebellion. He is a lot more technologically literate than I am - he was involved in the early days of Buddhist Geeks. He is a soulful, thoughtful, heartful person who lives at and works with the Monastic Academy for the Preservation of Life on Earth in Vermont. And he's host of the podcast Emerge: Making Sense of What's Next.

So in all ways, when I wanted to explore the existential risks - and perhaps the potential - of Artificial Intelligence with someone I could trust, and whose views I could bring to you unfiltered, Daniel was my first thought, and I'm genuinely thrilled that he agreed to come back onto the podcast to talk about what's going on right now.

My first query was triggered by the interview with Eliezer Yudkowsky on the Bankless podcast - Eliezer talked about the dangers of Generalised AI, or Artificial General Intelligence, AGI, and the reasons why it was so hard - he would say impossible - to align the intentions of a silicon-based intelligence with our human values, even if we knew what they were and could define them clearly. 

Listening to that was what prompted me to write to Daniel. Since then, I've listened many times to two of Daniel's own recent podcasts: one with the educational philosopher Zak Stein on the dangers of AI tutors, and one with Jill Nephew, founder of Inqwire, a Public Benefit Company on a mission to help the world make sense. The Inqwire technology is designed to enhance and accelerate human sensemaking abilities. Jill is also host of the Natural Intelligence podcast and has clearly thought deeply about the nature of intelligence, the human experience, and the neurophysiology and neuropsychology of our interactions with Large Language Models.

I've linked all three of these podcasts below and absolutely recommend them if you want more depth than we offer here. What Daniel and I tried to do today was lay things out in very straightforward terms: it's an area fraught with jargon, belief systems and assumptions, and we wanted to strip those away where we could and acknowledge them where we couldn't - to lay out where we are, what the worst cases are, and what the best case is, given that we have to move forward with technology (switching it all off seems not to be an option), and how we might move from the worst case to the best.

With the latter in mind, I've included a link to Daniel's new project, the Church of the Intimate Web, which aims to connect people with each other. I've also - because it seems not everyone listens to the very end of these podcasts - included a link to our Accidental Gods membership programme, where we aim to help people connect to the wider web of life. I definitely see these two as interlinked and mutually compatible.

So - trigger warning - a lot of this has not yet impinged on public awareness, and most of us don't realise how close we are to some very dangerous edges. This podcast leads us up to the edge so we can look over. We do it as gently as we can, but still, you'll want to be resourced and resilient before you listen.

The Emerge Podcast https://www.whatisemerging.com/emergepodcast
Emerge with Zak Stein https://podcasts.apple.com/gb/podcast/emerge-making-sense-of-whats-next/id1057220344?i=1000610403148
Emerge with Jill Nephew https://podcasts.apple.com/gb/podcast/emerge-making-sense-of-whats-next/id1057220344?i=1000613784941

Bankless with Eliezer Yudkowsky https://podcasts.apple.com/gb/podcast/bankless/id1499409058?i=1000600575387

The Church of the Intimate Web https://tome.app/the-church-of-the-intimate-web/the-church-of-the-intimate-web-a-response-to-the-global-intimacy-disorder-clhgc8h1l1b2p5k3z9ppbitfy
Accidental Gods Membership https://accidentalgods.life/join-us/

The Soul's Code by James Hillman https://uk.bookshop.org/p/books/the-soul-s-code-james-hillman/1563087?ean=9780553506341
