AI systems can be highly susceptible to bias. The potential for serious consequences is huge, especially when it comes to the AI increasingly used in police departments. We’ll hear from a man wrongfully arrested due to facial recognition technology, and from a lawyer researching how “dirty data” has corrupted predictive policing algorithms. We’ll also hear from the creator of one of these predictive policing systems, and an AI researcher who says de-biasing AI isn’t enough.