Friday, 3 May 2019

The Human Cost of Content Management

What, if anything, should be banned from online media? And who should review violent and explicit content to decide whether it's fit for the public? Thousands of people around the world work long, difficult hours as content moderators for sites like Facebook, Twitter, and YouTube. They are guided by complex and shifting guidelines, and their work can lead to psychological trauma. The practice of content moderation also raises questions about censorship and free expression online.

In this IRL episode, host Manoush Zomorodi talks with a forensic investigator who compares her work solving disturbing crimes with the work done by content moderators. We hear the stories of content moderators working in the Philippines, as told by the directors of a new documentary called The Cleaners. Ellen Silver from Facebook joins us to outline Facebook's content moderation policies. Kalev Leetaru flags the risks of relying on artificial intelligence to clean the web. And Kat Lo explains why this work is impossible to get exactly right.
