
Former content moderators sue TikTok over exposure to graphic videos

Updated March 30, 2022 - 1:52 pm

During 12-hour shifts for seven months, Ashley Velez watched video after video of graphic violence and child abuse as a content moderator for TikTok.

On Thursday, the Las Vegas resident and another former content moderator filed a federal class-action lawsuit against TikTok and its owner, Bytedance Inc., alleging that the company violated California’s unfair competition law by failing to provide a safe environment for employees.

“Defendants rely on content moderators to ensure that TikTok is free from graphic and objectionable content,” the lawsuit states. “Defendants monitor and control content moderators’ day to day work and provide the software that allows content moderators to do their jobs. Therefore, Defendants are required under the law to pay for the harm caused by requiring content moderators to review and remove graphic and objectionable content.”

Bytedance did not reply to a request for comment on Tuesday.

Steve Williams, an attorney with the Joseph Saveri Law Firm in San Francisco, which represents the plaintiffs, said that in a previous class-action lawsuit filed by his firm, Facebook agreed to pay $52 million in a settlement with content moderators who said they experienced psychological trauma and post-traumatic stress disorder after viewing and removing offensive and disturbing videos for the social media platform.

‘An uphill battle’

But Williams said litigation regarding content moderators is still relatively new.

“There’s no precedent for it,” he said. “The defendants are gigantic, powerful companies with really big law firms, so it’s an uphill battle.”

According to their lawsuit, Velez and Nashville, Tennessee, resident Reece Young were hired by separate third-party companies to work as content moderators for TikTok. Both performed the same tasks and were subject to monitoring and discipline by the company, the lawsuit states. Velez worked for the company from May to November 2021, while Young was a content moderator for about 11 months starting in 2021.

The moderators usually were given about 25 seconds to watch each video during their 12-hour shifts, which included a one-hour break and two 15-minute breaks. The company withheld pay for any time moderators spent off the video-watching software outside of their allotted breaks, and moderators often watched multiple videos at once to keep up with quotas, the suit alleges.

Many of the videos showed “extreme and graphic violence,” the plaintiffs’ lawyers wrote.

“Plaintiff Young saw a thirteen-year-old child being executed by cartel members, bestiality, and other distressing images,” the lawsuit states. “Plaintiff Velez saw bestiality and necrophilia, violence against children, and other distressing imagery.”

The content moderators also were repeatedly exposed to “conspiracy theories” regarding the COVID-19 pandemic, Holocaust denials, hate speech and manipulated videos of elected officials, according to the lawsuit.

“Plaintiffs have sought counseling on their own time and effort due to the content they were exposed to while providing content moderation services for TikTok because they are not provided adequate prophylactic measures before exposure nor appropriate ameliorative measures after exposure,” the document alleges.

The lawsuit asks for a court order to compensate content moderators who were exposed to graphic content, to provide moderators with “tools, systems, and mandatory ongoing mental health support,” and to provide mental health screening and treatment to current and former moderators.

In addition, the lawsuit alleges that the training employees went through did not adequately prepare them for the content they would see, and that nondisclosure agreements prevented moderators from speaking with anyone about the videos.

Industry standards

TikTok and Bytedance also are accused of failing to meet industry standards for mitigating harm to content moderators.

The Technology Coalition, which includes tech companies such as Google, Facebook and TikTok, recommends limiting time that employees spend viewing disturbing material to “no more than four consecutive hours,” the lawsuit states.

It also recommends that companies limit the amount of time employees are exposed to child sexual abuse imagery, provide mandatory group and individual counseling, and allow moderators to opt out of viewing child sexual abuse imagery.

The National Center for Missing and Exploited Children also recommends that companies alter graphic images by changing the color or resolution, blurring or superimposing a grid over the image, changing the direction or reducing the image’s size, or muting audio, according to the lawsuit.

“Defendants failed to implement the aforementioned standards as a member of the Technology Coalition,” according to the complaint. “Instead, Defendants impose productivity standards and quotas on their content moderators that are irreconcilable with applicable standards of care.”

Contact Katelyn Newberg at knewberg@reviewjournal.com or 702-383-0240. Follow @k_newberg on Twitter.
