A former content moderator for TikTok has filed a lawsuit against the platform, alleging that parent company ByteDance provides inadequate safeguards to protect moderators’ mental health against a near-constant onslaught of traumatic footage.
In a proposed class-action lawsuit filed in the US District Court for the Central District of California, Candie Frazier says she spent 12 hours a day moderating videos uploaded to TikTok for a third-party contracting firm named Telus International. In that time, Frazier says she witnessed “thousands of acts of extreme and graphic violence,” including mass shootings, child rape, animal mutilation, cannibalism, gang murder, and genocide.
Frazier says that in order to deal with the huge volume of content uploaded to TikTok daily, she and her fellow moderators had to watch between three and ten videos simultaneously, with new videos loading at least every 25 seconds. Moderators are only allowed to take one 15-minute break in the first four hours of their shift, with additional 15-minute breaks every two hours thereafter. The lawsuit says ByteDance monitors performance closely and “heavily punishes any time taken away from watching graphic videos.”
TikTok failed to meet industry standards for content moderation, the lawsuit claims
The lawsuit states that TikTok and its partners have failed to meet industry-recognized standards intended to mitigate the harms of content moderation. These include offering moderators more frequent breaks, psychological support, and technical safeguards like blurring or reducing the resolution of videos under review.
As a result of her work, Frazier says she has suffered “severe psychological trauma including depression and symptoms associated with anxiety and PTSD.” The lawsuit says Frazier has “trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind. She has severe and debilitating panic attacks.”
The testimony in Frazier’s lawsuit echoes reports from content moderators working (usually indirectly) for other big tech companies like Facebook, YouTube, and Google. Over the past few years, the terrible working conditions facing these moderators, a labor force crucial to maintaining the profitability of some of the world’s biggest companies, have come under increasing scrutiny. Reports like Frazier’s, though, suggest that despite the extra attention, working conditions for moderators remain incredibly challenging.
Frazier’s lawsuit was filed by California’s Joseph Saveri Law Firm, which filed a similar lawsuit in 2018 on behalf of moderators reviewing content for Facebook. That case resulted in a $52 million settlement paid by Facebook to its content moderators. The Verge has reached out to ByteDance for comment and will update this story if we hear back.