Relying on AI to moderate content is an imperfect solution. But companies also have to address the trauma that human moderators face.
3/25/2022
Moderating content on social media platforms is key to mitigating the spread of everything from conspiracy theories to horrific photos and videos of mass violence. Companies like Facebook use a combination of tools to search out and remove posts that violate their rules, including AI and human content moderators.
TikTok, like Facebook, employs humans to view and take down problematic posts. On Thursday, two former content moderators filed a federal lawsuit against the company, alleging negligence and a failure to protect workers from emotional trauma. They are seeking class-action status.
The two women behind the suit, Ashley Velez and Reece Young, both worked as TikTok moderators on contract through third-party companies, but claimed TikTok controlled the day-to-day rules of their work. They also alleged that the videos and photos they were required to review, combined with strict schedules, consistent 12-hour days and aggressive quotas, left them traumatized.
The allegations echo claims in a lawsuit that Facebook (now Meta) settled for $52 million in May 2020. In that case, 11,250 moderators accused the company of similar abuses. The experiences these workers describe sound ghastly: One of the women suing TikTok recalled watching videos of children being abused, for example. Moderators in the Facebook suit referenced watching videos of suicide.
But relying solely on AI to censor content is also an imperfect solution. One study showed that AI models are more likely to flag tweets written by Black people as offensive. Indigenous activists say they've been censored by racially biased algorithms, too. The known potential for bias also makes content-moderating algorithms easier to politicize, inflamed by national polarization so that different social media platforms become known as left- or right-wing. TikTok creators complain of getting their content removed for the wrong reasons so often that entire how-to videos and articles are dedicated to ways of appealing those decisions.
TikTok is known for providing its content moderators better pay and benefits than many of its competitors, and it has been on a hiring spree to bring content moderation in-house. The lawsuit filed by contractors suggests that move may have been a wise one. But to succeed, TikTok must figure out both how to better protect its content moderators and how to keep its moderation strategy agile, proactive and accurate.
Protocol link: https://www.protocol.com/bulletins/tiktok-content-moderation-lawsuit