Growing Concerns over the Tech Giant’s Treatment of Moderators
Meta, the parent company of Facebook, Instagram, and WhatsApp, is facing a new set of lawsuits in Ghana over the effects of extreme content on the mental health of its subcontracted content moderators. The lawsuits, brought as a class action by former moderators, allege that the company failed to provide adequate mental health support, safe working conditions, and fair treatment, even though the work exposed them daily to violent and traumatic content and offered no meaningful measures to address the harm of engaging with such material.
The lawsuits, filed in the High Court of Ghana this week, come amid growing international scrutiny of how content moderators are treated. The plaintiffs maintain that Meta and its subcontracting partners engaged in extreme exploitation by neglecting the welfare of employees who deal daily with graphic violence, extreme abuse, and sexual abuse material.
Psychological Issues Among Moderators
According to the legal filings, several moderators developed serious mental health problems, including post-traumatic stress disorder (PTSD), anxiety, and depression, after being required to sift through thousands of graphic and disturbing posts. Some claim they were not given sufficient psychological support or rest periods, that what counselling was offered was limited to short breaks, and that their concerns were either disregarded or met with pressure that ended in dismissal.
“I was treated as a cog in a machine. A machine that had to devour the worst of humanity day after day without any complaints. When a machine malfunctions, it at least gets repaired. In my case, I was treated like I was weak,” said one former moderator, who spoke to the media on condition of anonymity.
Commenting on the moderators’ allegations of psychological harm, Dr. Mensah argues: “This is not simply a case of workplace trauma. There is a social responsibility here. Prolonged exposure to traumatic content has serious consequences for emotional health. You cannot reduce this to workplace safety; it is a public health issue.”

Allegations Against Meta’s Outsourcing Practices
The plaintiffs’ filings focus not only on Meta itself but on the wider outsourcing industry, including the regional partner firms that controlled the hiring, training, and management of moderators. According to the filings, relying on these companies makes it easy for Meta to deflect blame for poor working conditions.
Specifically, the plaintiffs accuse Meta of failing to exercise proper oversight of working conditions and tools, imposing unbearable review quotas, and doing too little to help employees cope with extreme content. Some moderators report having to review hundreds of posts in a single shift, a workload they say left no room for any reasonable work-life balance.
One trainer employed by one of Meta’s outsourcing companies told the court that a workplace culture had taken hold in which “empathy” was treated as inefficiency.
Prior Global Complaints Resurface
In recent years, many social media companies have come under fire for their algorithms and the poor moderation practices that accompany them. Beyond the United States, legal action has also been taken in Ireland and Kenya, where moderators allege that their mental well-being has been put at constant risk by Meta’s negligence.
Former moderators in Kenya have alleged discriminatory contract terms, disputes over sick leave, and unfair compensation. That a similar case has now emerged in Ghana suggests the problems are not isolated, but reflect how the company conducts itself around the world.
“This corporate exploitation is a manifestation of dehumanization happening globally, not just in Africa,” says Michael Adeyemi, a Lagos-based rights attorney. “The infrastructure makes it easy for corporations to take advantage of employees because there are limited legal barriers in place.”
Meta’s Response
“Part of providing industry-leading support means giving content reviewers access to essential mental wellness services and protective industry practices,” a Meta spokesperson said in a statement to global media. “Our responsibility to content reviewers, and our partners’ obligations to take care of reviewers’ needs, are not optional. We will address the lawsuits with the seriousness they deserve.”
Critics argue that the narrative Meta puts forward fails to capture the reality faced by workers on the ground. A growing body of worker testimony, set against a backdrop of leaked internal documents, paints a troubling picture of a workplace where moderators receive minimal support and those suffering from mental health issues face active stigmatization.
“Meta talks a good game on paper,” said Samuel Obeng, executive director of the Ghana Labour Rights Council. “But those platitudes make no difference if they amount to a marketing gimmick. Companies claiming to provide mental health support need to back that claim up in how they actually treat workers.”
Growing Calls for Regulatory Action
Regulatory oversight of Meta’s operations in Africa has been limited. The lawsuits have intensified debate over how much latitude tech companies should be allowed in African countries. These concerns are now squarely before Ghanaian lawmakers, who until now have shown only passing interest in drafting laws specifically to govern content moderation work.
“We won’t allow international businesses to come to our country, take advantage of our people, and leave them shattered,” said Nana Asante, a Member of Parliament for the Eastern Region. “We will seek legislation that ensures these businesses operate under proper ethical frameworks.”
The case has also drawn concern from international watchdog organizations. Amnesty International has backed the moderators, stating that Meta must adopt “unambiguous, enforceable human rights policies” for every worker, regardless of geographical location.
Broader Consequences for Leading Technology Corporations
The Ghanaian lawsuits may mark the beginning of something bigger for leading United States technology corporations. Companies like Meta promise to safeguard the safety and welfare of their users; critics, however, contend that these corporations exploit legal gaps in Global South countries to cut costs without facing any ramifications.
Experts say these practices will not change without substantial action from regulators and the courts.
“Multinational corporations have learned that in places where labor regulations are lenient or enforcement is inconsistent, they can engage in conduct that would be deemed illegal in other countries,” explained Dr. Amina Sekou, a sociologist focusing on labor and technology in West Africa. “These cases are a shock to the system for companies that have shown no regard for the people doing this work.”
Looking Ahead
Ghanaian courts are expected to consider preliminary motions in the coming months. Legal analysts believe the rulings could set defining precedents not only in Ghana but across the African continent.
If the plaintiffs prevail, Meta could be compelled to change how content moderation is carried out in Africa, adopt stricter protections for workers’ mental health, and face substantial compensation payouts to the employees affected.
For now, the former moderators remain hopeful that the courts will rule in their favor. “We are not only pursuing a victory for ourselves,” said one plaintiff. “We are doing this for every single person who comes after us.”
As the lawsuits progress, scrutiny will focus on whether tech companies are willing to protect the workers who keep their platforms running.