Meta ordered to provide mental health care to moderators in Kenya
After hearing testimonies of a troubled work environment, a court ruled that Meta must provide "proper medical, psychiatric, and psychological care."
After hearing testimonies of a troubled work environment, a Kenyan employment court has ruled that Meta must provide "proper medical, psychiatric, and psychological care" to a group of moderators in Nairobi.
Judge Byram Ongaya's directive was part of a larger interim judgment that reinstated the moderators' employment after they sued Meta in March for what they called a "sham" mass redundancy.
Facebook's content moderation for eastern and southern Africa is headquartered in the Kenyan capital, Nairobi, where Meta also faces another lawsuit brought by two individuals and a rights group accusing the company of inadequately handling hateful content, particularly content related to the war in Ethiopia's northern Tigray region.
In one of the cases, a petitioner claims that his father, an ethnic Tigrayan, was targeted with racist messages on Facebook that the company failed to act on before he was murdered in November 2021.
The petitioners are requesting a compensation fund of 200 billion Kenyan shillings ($1.6 billion) intended for victims of hate incited on the social media platform.
Around 260 screeners at Facebook's moderation center in Nairobi were laid off early this year when the internet giant shifted its moderation contract from the US firm Sama, with which it had worked since 2019, to the European firm Majorel. Sama attributed its decision to discontinue moderation services and part ways with Meta to a difficult economic situation and changing business requirements.
However, moderators allege they were given "varying" and "confusing" justifications for the mass layoffs, and believe the move was an attempt to cover up mounting worker concerns about poor pay and a lack of mental health care. The court ruled that Meta and Sama were "restrained from terminating the contracts" until the outcome of the case contesting the dismissals' validity.
According to one testimony, “I remember my first experience witnessing manslaughter on a live video … I unconsciously stood up and screamed. For a minute, I almost forgot where I was and who I was. Everything went blank.”
“I’ve seen stuff that you’ve never seen, and I’d never wish for you to see,” Frank Mugisha, a 33-year-old moderator from Uganda, told The Guardian.
Many felt they didn't fully understand what they were getting into when they took the position. Some claimed they were led to believe they would be doing customer service work, only to find themselves sifting through horrific content on tight deadlines.
Discontent allegedly began to emerge at the Nairobi hub after a former moderator, Daniel Motaung, filed a lawsuit against Meta and Sama last year, accusing them of unacceptable working conditions, union-busting, and exposure to explicit content without sufficient psychological care. The moderators made similar accusations in the March complaint.
According to the moderators, their work, which forces them to sift through the internet's darkest material for long stretches, commonly causes mental health issues such as depression, anxiety, post-traumatic stress disorder, and suicidal thoughts. In written submissions to the court, some said they had become "desensitized" to graphic imagery, nudity, and self-harm.
“It alters the way you think and react to things,” Mugisha said. “We may be able to find other jobs, but would we be able to keep them? I don’t know. We don’t interact normally anymore.”
A group of moderators told The Guardian that at least one person had attempted suicide amid the redundancy and mental health struggles.
Nathan Nkunzimana, one of the moderators leading the suit, said that future moderators should not have to go through what they have.
The group has previously met to discuss plans to unionize, a move that Cori Crider of the UK-based tech nonprofit Foxglove called "significant."
Crider applauded the judgment, calling it a "watershed" not "just against Facebook" but against all social media companies. "The days of hiding behind outsourcing to exploit your key safety workers are over," she said.
Meta's lawyers have stated that the business intends to appeal.
The labor tribunal upheld its prior rulings, barring Meta and Sama from proceeding with any layoffs and preventing the tech giant from switching from Sama to Majorel until a final judgment on the matter is issued.
The outcome of the lawsuit will be highly significant for both parties and may have ramifications for how Meta and its nearly 15,000 moderators worldwide operate.
The case comes in the wake of the 2021 scandal in which whistleblower Frances Haugen alleged that Facebook executives were well aware of the platform's capacity to incite hate speech and damage users' mental health.