Spotlight: The Three Rules of Humane Tech
Media Type | audio
Publication Date | Apr 06, 2023
Episode Duration | 00:22:17

In our previous episode, we shared a presentation Tristan and Aza recently delivered to a group of influential technologists about the race happening in AI. In that talk, they introduced the Three Rules of Humane Technology. In this Spotlight episode, we’re taking a moment to explore these three rules more deeply in order to clarify what it means to be a responsible technologist in the age of AI.

Correction: Aza mentions infinite scroll being in the pockets of 5 billion people, implying that there are 5 billion smartphone users worldwide. The number of smartphone users worldwide is actually 6.8 billion now.

 

RECOMMENDED MEDIA 

We Think in 3D. Social Media Should, Too
Tristan Harris writes about a simple visual experiment that demonstrates the power of one’s point of view

Let’s Think About Slowing Down AI

Katja Grace’s piece about how to avert doom by not building the doom machine

If We Don’t Master AI, It Will Master Us

Yuval Harari, Tristan Harris, and Aza Raskin call upon world leaders to respond to this moment at the level of the challenge it presents in this New York Times opinion piece

 

RECOMMENDED YUA EPISODES 

The AI Dilemma

Synthetic Humanity: AI & What’s At Stake

 

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
