In a nutshell: A TikTok moderator has sued the platform and parent company ByteDance for failing to provide support to workers who are exposed to graphic content, including child pornography, beheadings, and school shootings.
Bloomberg reports that in her proposed class-action lawsuit, content moderator Candie Frazier says she had to screen videos involving cannibalism, crushed heads, suicides, and a fatal fall from a building.
The complaint states that although TikTok was among several social media firms that helped develop guidelines—such as providing psychological support and limiting shifts to four hours—designed to help moderators cope with exposure to child pornography, the company failed to implement them.
The suit also states that TikTok moderators work 12-hour shifts with just a one-hour lunch break and two 15-minute breaks, and that they must watch hundreds of videos each day. Frazier’s lawyers said the employees are permitted no more than 25 seconds per video and often view three to ten videos simultaneously.
Frazier, a Las Vegas resident, says she now suffers from post-traumatic stress disorder as a result of viewing the graphic videos. According to the complaint, she has trouble sleeping and experiences horrific nightmares when she does sleep.
Frazier is seeking compensation for psychological injuries and a court order requiring the company to set up a medical fund for moderators.
Moderators suing companies for allegedly causing them PTSD isn't new. A content moderator employed by Facebook contractor Pro Unlimited sued the social network in 2018 after "constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace" resulted in the disorder. There was also the case of a YouTube moderator who sued the Google-owned firm in 2020 after developing symptoms of PTSD and depression, the result of reviewing thousands of disturbing videos.