Meta Faces Lawsuit Over PTSD Claims by Kenyan Content Moderators
Campaigners allege that content moderation work for Meta has caused severe mental health issues, including PTSD, among more than 140 moderators in Kenya, allegations now central to ongoing litigation over the workers' conditions. Dr. Ian Kanyanya's findings reveal the traumatic effects of content moderation, prompting calls for greater accountability in the industry.
Recent allegations against Meta, Facebook’s parent company, reveal that over 140 content moderators in Kenya have been diagnosed with PTSD and other mental health disorders. Dr. Ian Kanyanya from Kenyatta National Hospital prepared these assessments, which contribute to an ongoing lawsuit against Meta and Samasource Kenya, the outsourcing firm involved in content moderation. Critics have long raised concerns about the psychological toll this line of work exerts on moderators, particularly as they are exposed to highly disturbing content daily.
Meta has refrained from commenting on the medical findings, asserting its commitment to supporting content moderators and pointing to contractual provisions with its outsourcing partners covering counselling and fair treatment. However, the fact that moderators cannot avoid exposure to extreme content raises serious ethical questions about the oversight of mental health in high-stress jobs within the tech industry.
Kanyanya highlighted that the moderators were exposed daily to graphic material including gruesome violence, suicides, and sexual abuse. Approximately 81% of the 144 participants he evaluated exhibited severe PTSD symptoms. This lawsuit follows previous claims from ex-moderators alleging unfair labor practices and terminations linked to workplace advocacy, pointing to systemic issues surrounding content moderation roles.
Significantly, last year's layoff of all 260 Samasource Kenya moderators, after they voiced concerns about their working conditions, adds to these troubling accounts. The situation has drawn the attention of advocacy organizations such as Foxglove, which is supporting affected individuals in their legal battles against Meta and the external contractors involved.
Personal accounts from moderators reveal a deeply troubling impact of their work. One described experiencing severe nightmares and disassociation linked to their duties, while another cited a phobia stemming from particularly gruesome images encountered during moderation. Foxglove’s co-executive director, Martha Dark, asserted that “moderating Facebook is dangerous, even deadly, work that inflicts lifelong PTSD on almost everyone who moderates it.”
Such allegations underscore the need for greater accountability and safeguarding measures in digital content management. The pattern of trauma among content moderators extends beyond Facebook and has been reported on other platforms, such as TikTok, revealing an industry-wide problem. Reform in the treatment and welfare of the people responsible for curating online content has never been more urgent.
The issue of mental health among content moderators, especially in developing countries, has sparked significant concern in recent years. These individuals are tasked with reviewing often distressing user-generated content to maintain community standards on social media platforms. Given the gravity of the material, the psychological consequences can be profound, leading to conditions like PTSD. Legal actions against tech giants have surfaced as employees advocate for better mental health protections and work conditions.
The situation surrounding the content moderators in Kenya raises pressing questions about the ethical responsibilities of tech companies toward their employees. The alarming statistics regarding PTSD diagnoses indicate a severe need for comprehensive support and safeguards for those in such vulnerable positions. Moving forward, it is critical for tech companies to evaluate their treatment policies and ensure the well-being of the individuals they employ to moderate content.
Original Source: www.cnn.com