The Guardian - UK
World
Deepa Parent and Katie McQue

‘I log into a torture chamber each day’: the strain of moderating social media

Workers in the IT hub of Hyderabad in India said moderating sexual and abusive content had left them depressed, distressed and struggling to sleep. Photograph: Bloomberg/Getty Images

“I had to watch every frame of a recent stabbing video … It will never leave me,” says Harun*, one of many moderators reviewing harmful online content in India, as social media companies increasingly move the challenging work offshore.

Moderators working in Hyderabad, a major IT hub in south Asia, have spoken of the strain on their mental health of reviewing images and videos of sexual and violent content, sometimes including trafficked children.

Many social media platforms in the UK, European Union and US have moved the work to countries such as India and the Philippines.

While OpenAI, creator of ChatGPT, has said artificial intelligence could be used to speed up content moderation, it is not expected to end the need for the thousands of human moderators employed by social media platforms.

Content moderators in Hyderabad say the work has left them emotionally distressed, depressed and struggling to sleep.

“I had to watch every frame of a recent stabbing video of a girl. What upset me most is that the passersby didn’t help her,” says Harun.

“This is one video that affected me like never before. You may have seen the video in its blurred version, but I had to watch it completely and clearly. It will never leave me.”

Harun, 22, was one of eight moderators who agreed to speak to the Guardian on condition of anonymity. Their average pay was less than £8 a day. While those interviewed say they were encouraged to seek help in stressful situations, only one of the moderators found the support helpful.

“The wellness coaches exist as a formality because these global companies need to tick off a legal clause probably. When we get overwhelmed after reviewing a sensitive video, they simply move us to a less sensitive queue until we feel better,” says Harun.

Akash* was 21 when he graduated with a bachelor’s degree. Having had mobility issues since birth, he often felt disheartened when he received rejections from companies based in Hyderabad.

“No one would hire me because I’m disabled. I was desperate to be independent and earn a living,” he says. “When I was told by a headhunter for disabled people that there was a vacancy for a moderator, I applied, expecting a rejection.

“I was elated when I finally found the job. Since then I’ve realised this is not what I signed up for. It feels as if I log into a torture chamber each day.”

Akash says his disability has forced him to work in isolation, with all communication with the client and management taking place online.

“There have been instances when I’ve flagged a video containing child nudity and received continuous calls from my supervisors. I am not sure if it’s because I’m from a society where semi-naked pictures of minors are frowned upon, but it’s just not acceptable to me.

“Most of these half-naked pictures of minors are from the US or Europe, but over the past three years, I’ve received multiple warnings from my supervisors not to flag these videos.

“One of them asked me to ‘man up’ when I complained that these videos need to be discussed in detail because perverts online are leaving predatory paedophilic comments on videos of children.”

Neha* has worked as a moderator reviewing sensitive content for more than four years. She says her “blood boiled” when bosses told her and other moderators in a team meeting to “relax and not ‘take it to heart’ when dealing with content like sexual violence”.

“I just want to work in a company that doesn’t give paedophiles a platform,” she says. “I’m disgusted.

“I have stopped complaining now and I am looking for a new job. At least if they were paying better for the gore we’re dealing with every other day, I would have the money to pay for therapy sessions.”

Another moderator, Rishi*, 23, who works specifically on content from the EU, says: “I had an argument with my supervisors when I mentioned that we need to ban videos of sickos killing and pretending to eat animal carcasses.

“These videos are rare, but why let something that sadistic on our client’s platform? Unless the intention is to gain more revenue via such videos?

“There’s a need for strict regulation, especially on livestream apps. We are managing content to protect people’s mental health at the risk of our emotional wellbeing.”

Harun says: “All they’re doing is creating a young workforce that’s depressed. It’s like the whole world’s trash is being dumped in India.”

Martha Dark, a director at Foxglove, a non-profit organisation campaigning on technology and social justice, says moderators need better protection from “life-changing trauma”.

“Content moderators take huge risks to protect us, fighting every day against a torrent of violence and child abuse. There is no social media without them,” she says.

“Cut-rate content moderation doesn’t just fail workers; it fails all of us by making social media a more dangerous place.”

Dark adds that AI “is not and may well never be able” to moderate content effectively. “The only real solution is to invest in the content moderators who already do the work.”

* Names have been changed
