Some former TikTok content moderators have sued the company, alleging it failed to adequately support them in the deeply disturbing task of removing offensive videos from the social network. NPR first reported on the lawsuit, which was filed in federal court on Thursday.
Plaintiffs Ashley Velez and Rhys Young moderated TikTok content under contract through third parties: Canadian technology firm Telus International and New York-based Atrium. Velez and Young are seeking class-action status, which would allow other TikTok content moderators who claim to have been harmed by the company's practices to join the suit.
The lawsuit alleges that TikTok and ByteDance violated California labor law by failing to provide adequate mental health support to Velez and Young, despite exposing them daily to what it describes as "unusually dangerous activities." It also alleges that the companies forced moderators to review large volumes of extreme content to meet quotas, then made them sign non-disclosure agreements barring them from discussing what they saw.
"Defendants failed to provide a safe workplace for thousands of contractors who are the gatekeepers between unfiltered, hateful and unwanted content uploaded to the app and the millions of people who use the app every day," the lawsuit says. It claims that despite the psychological risks of prolonged exposure to such harmful content, TikTok and ByteDance did not take "appropriate steps" to help moderators cope with the extreme material they reviewed.
The lawsuit describes how both plaintiffs spent up to 12 hours a day watching extreme, disturbing material, including "child sexual abuse, rape, torture, bestiality, beheadings, suicides and murders." Beyond the graphic content, the lawsuit describes how Velez and Young were repeatedly exposed to hate speech and conspiracy theories, which also took a toll on their mental well-being. Another TikTok content moderator, Candy Frazier, filed a similar lawsuit in December, though NPR reports that case is no longer pending.
The new TikTok case follows in the footsteps of a class-action lawsuit that the same legal team filed against Facebook in 2018. Two years later, the company settled that suit, agreeing to pay $52 million to more than 11,000 moderators whose mental health had suffered because of the content they had to screen on a daily basis.