
FACEBOOK MODERATORS DIAGNOSED WITH SEVERE PTSD AFTER EXPOSURE TO GRAPHIC CONTENT


A disturbing case has emerged from Kenya, where more than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder (PTSD) after being exposed to graphic and traumatic material as part of their job. These individuals worked long hours at a facility in Kenya, tasked with moderating disturbing social media content for Meta, Facebook’s parent company.

The moderators, employed by Samasource Kenya, an outsourcing firm contracted by Meta, were subjected to images and videos that included graphic depictions of murders, suicides, child sexual abuse, and terrorism. This exposure has led to widespread mental health issues, including PTSD, generalized anxiety disorder (GAD), and major depressive disorder (MDD).

The diagnoses were disclosed by Dr. Ian Kanyanya, a prominent mental health specialist and the head of the mental health services department at Kenyatta National Hospital in Nairobi. Following assessments of the employees, Dr. Kanyanya determined that a significant number were experiencing severe mental health issues.

The case has now become a central focus of a lawsuit filed against Meta and Samasource Kenya, as former moderators seek redress for the mental health damage caused by their work.

From 2019 to 2023, the moderators worked in shifts of eight to ten hours daily, tasked with reviewing content uploaded to Facebook, particularly posts from Africa. The work was grueling and emotionally taxing, with moderators exposed to horrific content on a daily basis.

According to the lawsuit, some of the material was so extreme that moderators would faint, vomit, scream, or run away from their desks. The content they had to review included depictions of necrophilia, bestiality, self-harm, and terrorist attacks.

The plaintiffs in this case are a group of over 190 former moderators who claim they were subjected to hazardous working conditions, poor pay, and inadequate mental health support. Of the 144 moderators examined by Dr. Kanyanya, 81% were found to have severe or extremely severe symptoms of PTSD.

Many of these individuals reported experiencing debilitating anxiety, depression, and flashbacks long after leaving their jobs. Some moderators described how their mental health issues negatively impacted their personal lives, including the breakdown of marriages, the loss of sexual intimacy, and strained relationships with family members.

In addition to PTSD and other mental health conditions, the plaintiffs reported substance abuse problems, with at least 40 of them misusing alcohol, cannabis, cocaine, amphetamines, and prescription drugs. The emotional toll of the work also led to feelings of paranoia and fear, particularly among those who reviewed content related to terrorist groups.

Several moderators feared for their safety, believing they were being targeted by the groups they were monitoring. Some even reported fearing for their lives, believing that they would be hunted and killed if they returned home after their shifts.

The case has garnered significant attention as it sheds light on the dark side of social media content moderation. While companies like Meta rely on large teams of content moderators to filter out harmful material and ensure the safety of their platforms, the psychological toll on these workers has often gone unnoticed.

Moderators are typically underpaid and overworked, with little regard for the long-term mental health consequences of their jobs. According to the lawsuit, the moderators in this case were paid roughly one-eighth of what their counterparts in the United States earned for the same work.

The working conditions at the facility were described as harsh and dehumanizing. Moderators were reportedly required to review a constant stream of images in a cold, warehouse-like environment under bright lights, which further strained their mental health. Their activities were closely monitored, with supervisors tracking their every move, including how long they spent on each piece of content.

Workers were often expected to meet strict quotas, which added to the stress of the job. Medical reports filed with the Nairobi Employment and Labour Relations Court paint a chilling picture of the moderators’ experiences, highlighting the severe toll that such work can have on mental health.

In addition to the mental health issues, the plaintiffs have made several other serious allegations, including accusations of human trafficking, modern slavery, and unlawful redundancy. The lawsuit claims that the workers were not adequately compensated for the risks they faced and were subjected to unfair employment practices. The moderators were employed under third-party contracts, which are often less secure than direct employment arrangements. Furthermore, the workers had limited access to healthcare and counseling services despite the extreme nature of their work.

Meta and Samasource Kenya have declined to comment on the lawsuit, citing the ongoing litigation. However, Meta has previously stated that it takes the welfare of its content reviewers seriously. In response to concerns about the impact of content moderation on mental health, Meta has implemented some measures to reduce the exposure of its moderators to graphic content. These include blurring images, muting sounds, and rendering visuals in monochrome.

Meta has also stated that it offers content moderators access to counseling services and private healthcare. However, critics argue that these measures are insufficient to protect workers from the extreme psychological damage caused by their work.

Martha Dark, the founder and co-executive director of Foxglove, a UK-based nonprofit organization supporting the case, has criticized Meta for failing to address the mental health crisis among its content moderators. “The evidence is indisputable: moderating Facebook is dangerous work that inflicts lifelong PTSD on almost everyone who moderates it,” Dark said. While content moderation is crucial to keeping users safe, it carries significant psychological risks that must be acknowledged and addressed. The situation in Kenya is just one example of a broader issue affecting content moderators worldwide.
