The Independent UK
Nicole Wootton-Cane

Why UK TikTok employees are considering legal action as whistleblowers speak out

TikTok is facing possible legal action from two of its employees over plans to axe hundreds of jobs from its UK online safety team.

In June 2025, the social media giant said it would be expanding its “investment and presence” in the UK, proudly declaring: “What underpins our continued growth is our deep commitment to safety”.

Two months later, TikTok said it would be slashing its London-based trust and safety team, including content moderators - the people responsible for assessing and removing dangerous and graphic videos from the platform.

TikTok said the move was part of a “wider global reorganisation” - a shift away from human moderation, and towards using automated systems, including artificial intelligence (AI).

But employees see it differently. They claim the redundancies were announced just seven days before a ballot was due to take place among workers on whether to unionise.

Two moderators have now sent a legal letter to TikTok claiming they were unfairly dismissed because of their efforts to unionise.

The legal threat comes against a backdrop of growing concern over TikTok’s trust and safety operations. In November, Dame Chi Onwurah, chair of the government’s science, innovation and technology committee, said TikTok’s staffing cuts posed a “real risk to the lives of TikTok users” and questioned its ability to keep them safe.

Three whistleblowers also told The Independent that a culture of “pressure” at the platform was leaving workers with health problems related to excessive screen time, making them feel unable to take breaks if they wanted to achieve high grades, and putting “stress” on them to meet “arbitrary” targets.

Here’s why TikTok could end up in a UK court, and why the dispute could affect the videos you see on your screen.

Why is TikTok facing possible legal action?

Two content moderators have sent a legal letter to TikTok laying out the terms of a potential legal case.

TikTok has been given a month to respond to the letter (Reuters)

The claim is brought on grounds of unlawful detriment and automatic unfair dismissal. It centres on a vote planned by TikTok’s London-based trust and safety team on whether to form a union with the UTAW (United Tech & Allied Workers), a branch of the Communication Workers Union (CWU).

In the letter, it is claimed that a vote was planned for 29 August - but seven days before, on 22 August, workers were told hundreds of jobs would be scrapped as part of a “restructuring exercise”.

At the time, TikTok said the move was part of plans to “strengthen” its global trust and safety operations amid a shift towards “technological advancements” including AI.

But unions and campaigners were quick to accuse the company of “bare-faced union busting”, claiming the true purpose of the redundancies was to stop workers from unionising.

In the legal letter sent to TikTok, seen by The Independent, the moderators argue that the actions of the company are unlawful.

They are asking an employment tribunal to order that their roles be reinstated should they be made redundant, and to award compensation for lost pay. In a statement, TikTok called the claims “baseless”.

‘You are pitted against each other’

Content moderators at TikTok say they were hoping to unionise in a bid to tackle what they claim is a “stressful” and “high pressure” environment where targets are strictly tracked and employees are “pitted against each other”.

One content moderator said they had at one time been expected to watch 1,200 videos a day (AP)

One moderator told The Independent that at one point she had been expected to watch 1,200 videos a day.

Moderators work under TikTok’s “utilisation rate”, a software-tracked metric that pushes them to work faster and penalises them if they do not move their mouse or interact with the screen within fixed time limits, regardless of the specific content they are reviewing.

They are then graded, and bonuses are determined by these performance statistics, whistleblowers said.

“It’s very stressful,” one moderator said. “We have to focus every single day, every single hour, we always have to be on target.”

She added that some people she knew had stopped taking breaks in order to achieve a higher grade.

These strict screen-facing targets have left workers feeling their brains are being “burned” by the work, according to lawyers.

One worker told The Independent she had been forced to seek help from a neurologist for headaches caused by her work.

What happens next?

Lawyers have given TikTok one month to respond to the letter, with a reply anticipated towards the end of January.

What happens next depends on how TikTok chooses to respond. The social media giant may choose to enter into negotiations with the moderators, taking discussions behind closed doors.

Alternatively, it may decide to fight the claims in court, taking the case to an employment tribunal.

What are the consequences for online safety?

Employees have been sounding the alarm about the impact the job cuts could have on online safety since the redundancies were announced.

Speaking to The Independent, whistleblowers said they did not believe automated moderation was yet “ready” to do their job.

Dame Chi Onwurah has said the cuts pose a ‘real risk to the lives of TikTok users’ (PA Archive)

One employee said he sees TikTok’s AI get things wrong “all the time”.

“We have different models that are supposed to identify guns, for example,” he said. “Sometimes they do, but they make a lot of mistakes, because if you do a gun shape with your fingers, it will identify it like a gun.”

He said a model that should identify blood would often pull out “stains” or marks on a wall.

In comments sections, he said, AI was often unable to identify “cat and mouse” games in which users communicate in certain emojis to “trick the system” and violate the platform’s community guidelines.

Whistleblowers also said they had particular concerns about children’s safety. One said “90 per cent” of the content she reviews on the app is from children, many of whom are clearly under the age of 13, TikTok’s minimum age requirement.

Most of these videos, she said, have been flagged for references to suicide - for example, children discussing wanting to kill themselves.

“When we moderate, I think you have to use human emotion,” she said. “I can do that. Personally, I don’t think that is something AI can do.”

TikTok said 91 per cent of the content removed for violating its Community Guidelines is identified and taken down by automation, and 99 per cent of that content is proactively removed before anybody reports it.

Moderators also expressed concern that where human moderation remains, jobs will be outsourced to low-paid employees in countries such as Morocco, Kenya, and Ghana. They fear those hired may lack the contextual and cultural understanding needed to moderate videos effectively.

Since the redundancy announcement, The Independent has seen screenshots of job advertisements on TikTok’s internal platform for moderators in Morocco - a country where experts say many social media platforms employ moderators.

TikTok said it continues to have a number of trust and safety roles in a variety of locations.

Tom Hegarty, head of communications at Foxglove, an independent non-profit organisation that supports tech workers in legal cases, said the case is “really important because content moderators are how people are kept safe online”.

A TikTok spokesperson said: “We once again strongly reject these baseless claims.

“These changes were part of a wider global reorganisation, as we evolve our global operating model for Trust and Safety with the benefit of technological advancements to continue maximising safety for our users.”

If you are experiencing feelings of distress, or are struggling to cope, you can speak to the Samaritans, in confidence, on 116 123 (UK and ROI), email jo@samaritans.org, or visit the Samaritans website to find details of your nearest branch.

If you are based in the USA, and you or someone you know needs mental health assistance right now, call or text 988, or visit 988lifeline.org to access online chat from the 988 Suicide and Crisis Lifeline. This is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week. If you are in another country, you can go to www.befrienders.org to find a helpline near you.
