When Facial Recognition Tech Is Wrong
Publication Date: Mar 18, 2022
Episode Duration: 00:38:01
Like a lot of tech solutions to complex problems, facial recognition algorithms aren't perfect. But when the technology is used to identify suspects in criminal cases, those flaws in the system can have catastrophic, life-changing consequences. People can get wrongly identified, arrested, and convicted, often without ever being told they were ID’d by a computer. It’s especially troubling when you consider false identifications disproportionately affect women, young people, and people with dark skin—basically everyone other than white men.
This week on Gadget Lab, WIRED senior writer Khari Johnson joins us to talk about the limits of facial recognition tech, and what happens to the people who get misidentified.
Show Notes:
Read Khari’s stories about how facial recognition tech has led to wrongful arrests that derailed people’s lives. Here’s Lauren’s story about Garmin’s Fenix smartwatch. (And here’s WIRED’s review of the latest model.) Arielle’s story about the wave of shows about Silicon Valley tech founders is here.
Recommendations:
Khari recommends hoagies. Lauren recommends Garmin smartwatches. Mike recommends the show The Dropout on Hulu.
Khari Johnson can be found on Twitter @kharijohnson. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight. Bling the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our theme music is by Solar Keys.