Google Eats Rocks + A Win for A.I. Interpretability + Safety Vibe Check
Podcast | Hard Fork
Publisher | The New York Times
Media Type | Audio
Categories Via RSS | Technology
Publication Date | May 31, 2024
Episode Duration | 01:19:20

This week, Google found itself in more turmoil, this time over its new A.I. Overviews feature and a trove of leaked internal documents. Then Josh Batson, a researcher at the A.I. startup Anthropic, joins us to explain how an experiment that made the chatbot Claude obsessed with the Golden Gate Bridge represents a major breakthrough in understanding how large language models work. And finally, we take a look at recent developments in A.I. safety, after Casey’s early access to OpenAI’s new souped-up voice assistant was taken away for safety reasons.

Guests:

  • Josh Batson, research scientist at Anthropic

Additional Reading: 

We want to hear from you. Email us at hardfork@nytimes.com. Find “Hard Fork” on YouTube and TikTok.

“Pass me the nontoxic glue and a couple of rocks, because it’s time to whip up a meal with Google’s new A.I. Overviews.”

