The Guardian - UK
World
Caroline Kimeu

‘A watershed’: Meta ordered to offer mental health care to moderators in Kenya

Content moderators demonstrate at the offices of Sama in Nairobi last month. Photograph: Daniel Irungu/EPA

Meta has been ordered to “provide proper medical, psychiatric and psychological care” to a group of moderators in Nairobi following a ruling in a Kenyan employment court that heard harrowing testimony about the distressing nature of their work.

The instruction by judge Byram Ongaya formed part of a broader interim ruling that saw the moderators’ jobs restored after they sued Meta in March for what they termed a “sham” mass redundancy.

About 260 screeners at Facebook’s moderation hub in Nairobi were declared redundant early this year as the tech giant switched its moderation provider from the US company Sama, with which it had worked since 2019, to the European firm Majorel. Sama attributed its decision to end its moderation services and part ways with Meta to a challenging economic climate and new business needs.

But moderators claim they were given “varying” and “confusing” reasons for the mass layoffs, and believe it was an effort to suppress growing worker complaints over low pay and lack of mental health support. The court ruled that Meta and Sama were “restrained from terminating the contracts” pending the outcome of the lawsuit challenging the legality of the dismissal.

The court heard testimony about the day-to-day trauma of the moderators’ work. “I remember my first experience witnessing manslaughter on a live video … I unconsciously stood up and screamed. For a minute, I almost forgot where I was and who I was. Everything went blank,” read one of the written testimonies.

“I’ve seen stuff that you’ve never seen, and I’d never wish for you to see,” Frank Mugisha, 33, a moderator from Uganda, told the Guardian.

Daniel Motaung, who has taken Meta to court in Nairobi. Photograph: Courtesy of Foxglove

Many said they did not fully understand what they were signing up for when they took the job. Some alleged they were led to believe they were taking customer service roles, only to end up sifting through gruesome content to tight deadlines.

Discontent at the Nairobi hub reportedly began to surface after a former moderator, Daniel Motaung, filed a case against Meta and Sama last year, accusing them of unreasonable working conditions, union-busting and exposure to graphic content at work with inadequate psychological support. The group of moderators in the March lawsuit have made similar claims.

The moderators say that mental health struggles, such as depression, anxiety, post-traumatic stress disorder and suicidal ideation, are a standard outcome of their work, which requires them to sift through dark material on the internet for long hours. In written submissions to the court, some said they had become “desensitised” to graphic content, nudity and self-harm.

“It alters the way you think and react to things,” said Mugisha. “We may be able to find other jobs, but would we be able to keep them? I don’t know. We don’t interact normally any more.”

At least one member of the group has attempted to take their own life, reportedly owing to worries over the redundancy and declining mental health, fellow moderators told the Guardian.

Speaking at a meeting last month where the moderators announced plans to unionise, Nathan Nkunzimana, one of the moderators behind the suit, said: “Future moderators should not go through what we have.”

The move to unionise is significant, said Cori Crider, of UK-based tech nonprofit Foxglove. Tight non-disclosure agreements and the secrecy surrounding their work had “put a chill on organising”, she said. The firing of Motaung – which followed his efforts to unionise – also spread fear among the moderators, some of whom had travelled from neighbouring countries to take up their jobs, and whose immigration status was tied to the role.

Foxglove has claimed that moderators who lobbied for better conditions at Sama were in effect “blacklisted” from applying for the new roles at Majorel. Meta and Majorel previously declined to comment on ongoing litigation. Sama did not respond to requests for comment.

The next sitting in the moderators’ case, on Thursday, follows a ruling by Kenya’s employment tribunal last week that Meta, not Sama, was the “primary” or “principal” employer of the content moderators in the Nairobi hub, with Sama acting only as an agent, since all the work the screeners did was provided by and directly served Meta. The court also said that, on an initial assessment, no proper justification for the redundancy had been shown.

“Today’s ruling is a watershed, and not just against Facebook,” said Crider, lauding the interim judgment. “All social media giants should take note – the days of hiding behind outsourcing to exploit your key safety workers are over.”

Meta’s lawyers have indicated that the company intends to appeal.

The employment tribunal confirmed its previous orders, blocking Meta and Sama from proceeding with any layoffs, and the tech giant’s switch from Sama to Majorel, until a final ruling on the case is made. The outcome of the case will have significant implications for both sides, and may have knock-on effects on how the tech giant works with its roughly 15,000 content moderators worldwide.
