Timnit recently completed her postdoc in the Fairness, Accountability, Transparency, and Ethics (FATE) group at Microsoft Research, New York. Prior to that, she was a PhD student at the Stanford Artificial Intelligence Lab, studying computer vision under Fei-Fei Li. She also co-founded Black in AI, an organization that works to increase diversity in the field and to reduce the negative impact of racial bias in the training data used for machine learning models.
She was born and raised in Ethiopia. As an ethnic Eritrean, she was forced to flee the country at age 15 because of the war between Eritrea and Ethiopia, and she was eventually granted political asylum in the United States. “This is all very related to the things I care about now because I can see how division works,” she explains during a conversation with Margot Gerritsen, Stanford professor and host of the Women in Data Science podcast. “Things that may seem little, like visas, really change people's lives.”
Last year, half of the Black in AI speakers could not attend NeurIPS because of visa issues, she said. “And in that 20 seconds, that visa denial, it feels like the whole world is ending for you because you have an opportunity that's missed… Not being able to attend these conferences is much more important than people know.”
Through her work with Black in AI, she has learned that the most important thing we can do is empower people from marginalized communities, which is why diversity, inclusion, and ethics are not at all separate. It’s essential to have a wider group of people determining where AI technology goes and what research questions we pursue. She says industry has been fairly receptive to her proposals around norms, process, and transparency because they are easier to operationalize. Addressing problems like racism and sexism, however, requires a fundamental shift in culture.
She has also seen the potential for unintended consequences in AI research. Her PhD thesis at Stanford used Google Street View imagery to predict income, race, education level, and voting patterns at the zip code level. She later saw follow-up research using a similar methodology to determine what kind of insurance people should have. “And that is very scary to me. I don't think we should veer off in that direction using Google Street View.” She wishes researchers could attach an addendum to earlier work describing what they have learned and how they intend the work to be used. Timnit is currently working on large-scale computer vision analysis of society using publicly available images, and she says it’s critical that she also spend a lot of time thinking about the consequences of this research.
RELATED LINKS
Connect with Timnit Gebru on Twitter (@TimnitGebru) and LinkedIn
Read more about Google AI and Black in AI
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Find out more about Margot on her personal website