The Independent UK
Tara Cobham

Children exposed to guns, self-harm and misogyny ‘within minutes of creating social media profiles’

British children are potentially being exposed to guns, self-harm, misogyny and sex within minutes of creating social media profiles, a new study has revealed.

Researchers found that a profile designed to be similar to that of a young teenager was shown an extremely distressed woman being rescued from a torture scene within seconds of signing up to TikTok, while another was shown graphic gun content on YouTube.

Technology firms have been accused of designing systems that can expose young people to “deeply harmful” content on social media, with “powerful, unregulated algorithms built to maximise engagement at any cost”.

Campaigners are reiterating calls for the government to raise the age of social media access to 16, warning that “every day of delay leaves thousands more children exposed to harm and exploitation”. Actress Natalie Cassidy has called the findings “every parent’s worst nightmare”.

The experiment, carried out by the Big Tech’s Little Victims campaign, set out to uncover what children are shown by algorithm-driven platforms when they sign up to social media at 13 – the current legal age of access.

Four fictional profiles were created on TikTok, Instagram, Snapchat and YouTube, based on typical 13-year-old girls and boys in the UK, using common interests such as gaming, beauty, music and sport. Researchers then used each platform for up to 30 minutes a day, scrolling as a child would.

Over the course of a week, the findings show the profiles were served hundreds of pieces of concerning content, including material glamourising guns and knives, explicit references to sex and pornography, promotion of extreme fitness regimes and diets, and content encouraging misogyny, isolation, self‑harm and even suicide.

On average across the week, it was found that the profiles were served concerning content within just three minutes of logging on, and for every minute spent scrolling, they were shown one piece of harmful or inappropriate content. In some sessions, harmful material was the very first thing served, while algorithmic content loops made it difficult – and sometimes impossible – to escape escalating harm, according to the study. In one session on Snapchat, researchers say 86 pieces of concerning content were flagged in 30 minutes of scrolling.

Daniel Kebede, general secretary of the National Education Union (NEU), which leads the campaign, said: “What this experiment shows is shocking, but not surprising. Children are being exposed to deeply harmful content on social media, even when platforms know their age. This is not accidental – it is how these systems are designed.

“At 13, children’s minds are still developing, yet they are being targeted by powerful, unregulated algorithms built to maximise engagement at any cost. Teachers see the impact every day, with rising misogyny, worsening concentration and pupils arriving at school exhausted by what they have been exposed to online. While parents are left to manage the fallout at home alone.

“That is why the government must act now and raise the age of social media access to 16. Every day of delay leaves thousands more children exposed to harm and exploitation.”

The experiment also revealed clear differences across platforms and genders.


It was found that concerning and harmful content appeared more frequently and escalated most sharply on TikTok and Snapchat, while content on Instagram was mostly age-appropriate. An adult researcher using Snapchat said they were forced to step away because the self-harm and suicidal ideation content was so extreme.

In the experiment, girls were disproportionately served extreme body‑focused and sexualised content, including “thinspiration” and body shaming, often alongside content promoting self‑harm and suicidal ideation. Researchers say TikTok served the girl profiles content on extreme health, fitness or diet in 92 per cent of the sessions, and sexualised content in 83 per cent of the sessions.

Meanwhile, the findings show boys were funnelled towards violence, misogyny and radicalisation, with repeated exposure to weapons, hostile content about women, racist and anti‑immigration narratives, and figures linked to extremist or conspiratorial views – such as Tommy Robinson and Andrew Tate. Across TikTok and YouTube, boys’ profiles were served content featuring hate speech or racist narratives in 77 per cent of sessions, according to the research, while on TikTok alone, misogynistic content appeared in 85 per cent of boys’ sessions, compared with only 13 per cent of girls’ sessions.

Mental health distress was found to be a common thread across all profiles, with content of this nature appearing in 74 per cent of the TikTok, Snapchat and YouTube sessions.

Natalie Cassidy is an ambassador for the Big Tech’s Little Victims campaign (Marc Humphreys/PA)

Cassidy, Big Tech’s Little Victims campaign ambassador, said: “Your children viewing content like this is every parent’s worst nightmare. You assume that if a platform knows your child’s age, it will protect them – but this experiment shows that simply isn’t happening.

“Parents cannot monitor every scroll, every video, every algorithmic decision. We need the government to step in and put children’s safety before Big Tech’s profits.”

TikTok said it limits content that may not be appropriate for under-18s, sets age limits for certain features – such as requiring users to be 16 or older for their videos to appear in the ‘For You’ feed or to use direct messages – and applies more restrictive privacy settings by default for younger users. It added that it is reviewing the findings of the experiment.

A Snapchat spokesperson said: “Snapchat is a visual communications app, built to encourage real conversations with real friends, and, unlike other platforms, we don't apply an algorithm to a feed of unvetted or unmoderated content, so this type of harmful content has no place on Snapchat and when we find it, we quickly take action.” The company is understood to be contacting the NEU about its findings.

Meta, the company that owns Instagram, said it has introduced updates to Teen Accounts on Instagram, which place under-18s into age-appropriate content settings by default, automatically limiting exposure to sensitive or mature content across recommendations, search, accounts and AI experiences, with stricter options also available for parents.

A UK government spokesperson said: “The law is clear. Under the Online Safety Act, platforms must protect children from harmful content, including violent and pornographic content and material encouraging self-harm.

“Those who do not act will face enforcement action from Ofcom. The regulator has the full backing of the government in going after those who fail to comply with UK law.”

The Independent has approached YouTube for comment.
