Two Ethiopians recently filed a lawsuit against Meta, Facebook's parent company, alleging that the company not only allowed hate speech to spread online during the country's recent civil war but actively prioritized it. Facebook's content moderation practices have been under scrutiny for years, particularly after whistleblower Frances Haugen revealed internal documents showing that Facebook was well aware its practices for finding and removing hate speech in a number of countries, including Ethiopia, were severely lacking. Will the recent peace deal also mean a reckoning for the social media giant?
We speak with Berhan Taye, a Practitioner Fellow at the Digital Civil Society Lab of Stanford University, researching digital rights and social justice. She's based in Nairobi, Kenya.
Meta answered The Takeaway's request for comment with the following statement:
"We have strict rules which outline what is and isn’t allowed on Facebook and Instagram. Hate speech and incitement to violence are against these rules and we invest heavily in teams and technology to help us find and remove this content. Our safety and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions. We employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya."