60. Rob Miles - Why should I care about AI safety?
Publisher | The TDS team
Media Type | audio
Categories Via RSS | Technology
Publication Date | Dec 02, 2020
Episode Duration | 00:45:29

Progress in AI capabilities has consistently surprised just about everyone, including the very developers and engineers who build today’s most advanced AI systems. AI can now match or exceed human performance in everything from speech recognition to driving, and one question that’s increasingly on people’s minds is: when will AI systems be better than humans at AI research itself?

The short answer, of course, is that no one knows for sure — but some have taken educated guesses, including Nick Bostrom and Stuart Russell. One common hypothesis is that once AI systems are better than humans at improving their own performance, we can expect at least some of them to do so. In the process, these self-improving systems would become even more powerful than they were previously — and therefore, even more capable of further self-improvement. With each additional self-improvement step, gains in a system’s performance would compound. Where this all ultimately leads, no one really knows, but it’s safe to say that if there’s a good chance we’re going to be creating systems capable of this kind of feat, we ought to think hard about how we should be building them.

This concern, among many others, has led to the development of the rich field of AI safety, and my guest for this episode, Robert Miles, has been popularizing AI safety research for more than half a decade through two very successful YouTube channels, Robert Miles and Computerphile. He joined me on the podcast to discuss how he’s thinking about AI safety, what AI means for the course of human evolution, and what our biggest challenges will be in taming advanced AI.
