Meta Faces Allegations of Inflicting “Lifelong Trauma” on Content Moderators in Kenya


Campaigners have accused Meta, the parent company of Facebook, of causing “potentially lifelong trauma” to hundreds of content moderators in Kenya. The allegations follow the diagnosis of over 140 moderators with post-traumatic stress disorder (PTSD) and other mental health conditions.

According to sources cited by the Leaders team, the diagnoses, made by Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital in Nairobi, were filed with Kenya's Employment and Labour Relations Court on December 4. The medical reports were submitted by the law firm Nzili and Sumbi Associates as part of an ongoing lawsuit against Meta and its former subcontractor, Samasource Kenya (now known as Sama).

The Toll of Moderation Work

Content moderators, often hired through third-party firms in developing countries, are tasked with filtering out disturbing material on social media platforms. Critics have long warned of the mental health toll this work takes on moderators, who are frequently exposed to graphic content.

Dr. Kanyanya reported that moderators routinely reviewed “extremely graphic content,” including videos of gruesome murders, suicides, sexual violence, and child abuse. Among the 144 moderators who underwent psychological assessments, 81% were diagnosed with “severe” PTSD.

In one account included in the filings, a former moderator described vivid flashbacks, nightmares, and paranoia that led to frequent emotional breakdowns. Another said they developed trypophobia, an aversion to clusters of small holes or bumps, after reviewing an image of maggots crawling out of a decomposing hand.

Legal Action Against Meta and Samasource

The class-action lawsuit grew out of a 2022 case brought by a former Facebook moderator who claimed they were unlawfully dismissed by Samasource Kenya after protesting unfair working conditions. The current suit represents moderators who worked for Samasource between 2019 and 2023, many of whom were made redundant last year after raising concerns about their pay and working conditions.

Foxglove, a UK-based nonprofit supporting the case, alleged that all 260 moderators employed at Samasource Kenya’s Nairobi hub were fired in retaliation for speaking out.

Martha Dark, co-executive director of Foxglove, condemned Meta’s practices, stating, “Moderating Facebook is dangerous, even deadly, work that inflicts lifelong PTSD on almost everyone who does it. Facebook is responsible for the potentially lifelong trauma of hundreds of young people.”

Meta’s Response

According to the same sources, Meta declined to comment on the medical findings due to the ongoing litigation but emphasized its commitment to supporting content moderators. A spokesperson said that Meta's contracts with third-party firms include provisions for counseling, training, and fair pay, and that moderators have access to tools that blur or alter graphic content to reduce its impact.

Sama, the outsourcing firm previously contracted by Meta, did not respond to requests for comment.

A Broader Industry Issue

This case is not the first legal challenge brought by content moderators against social media companies. In 2021, a TikTok moderator sued the platform, alleging that the job caused her to develop psychological trauma.

Campaigners argue that if similar diagnoses were found in other industries, those responsible would face severe legal and professional consequences. As the legal battle unfolds, it underscores the urgent need for greater accountability and safeguards for those working on the frontlines of content moderation.
