AI for Evil
Publisher | Dan Bowen
Media Type | audio
Categories Via RSS | Education, Technology
Publication Date | Nov 06, 2019
Episode Duration | 00:32:59

This week Dan and Ray go in the opposite direction from the last two episodes. After talking about AI for Good and AI for Accessibility, this week they dig into the ways AI can be used to disadvantage people and skew decisions. Often the line between 'good' and 'evil' is very fine, and the same artificial intelligence technology can serve either, depending on the witting (or unwitting) decisions of the people using it.

During the chat, Ray discovers that Dan is more of a 'Dr Evil' than he'd previously thought, and together they find that people perceive 'good' and 'evil' differently when it comes to AI's use in education. This episode is much less focused on the technology itself, and instead spends its time on the outcomes of using it.

Ray mentions the "MIT Trolley Problem", which is actually two things. The Trolley Problem, the work of English philosopher Philippa Foot, is a thought experiment in ethics about deciding whether to divert a runaway tram. The MIT Moral Machine, which builds on this work, applies the same kind of judgement to driverless cars: the website asks you to make the moral decisions and weigh up the consequences. It's a great activity for colleagues and for students, because it leads to a lot of discussion.

Two other links mentioned in the podcast are the CSIRO Data61 discussion paper, part of the consultation about AI ethics in Australia (downloadable here: https://consult.industry.gov.au/strategic-policy/artificial-intelligence-ethics-framework/), and the Microsoft AI Principles (available here: https://www.microsoft.com/en-us/AI/our-approach-to-ai).
