The Guardian - UK
Technology
Alex Hern, UK technology editor

YouTube found to push content about guns to children watching games videos

The researchers created fake ‘user accounts’ – two identified as nine-year-old boys and two identified as 14-year-old boys – for the study. Photograph: Ianni Dimitrov Pictures/Alamy

YouTube’s recommendation algorithm continues to direct young video game fans down dark paths of violent and dangerous content, a report has found, years after critics first raised concerns about the system.

A report from the Tech Transparency Project (TTP), a Washington DC-based nonprofit, observed the effects of the video-sharing site’s recommendation algorithm on a set of accounts identified as belonging to boys aged nine and 14.

Researchers created four fake “user accounts” – two identified as nine-year-old boys and two as 14-year-old boys – and used them to watch exclusively gaming-related videos, albeit not always strictly age-appropriate ones, in an attempt to build an accurate cross-section of what a real child and teenager would be watching.

For the nine-year-old, that included videos for games such as Roblox and Lego Star Wars, but also the horror game Five Nights at Freddy’s, set in a parody of the Chuck E Cheese restaurant chain. For the 14-year-old, the playlist “consisted primarily of videos of first-person shooter games like Grand Theft Auto, Halo and Red Dead Redemption”.

After using the accounts to watch video gaming content, the researchers logged and analysed the videos the algorithm recommended, with one in each group passively tracking the recommendations and the other actively clicking and viewing them. For all the accounts, YouTube’s algorithm pushed content related to weapons, shootings and murderers, but for those who actively watched the material it recommended such footage at much higher volumes.

TTP was clear that video games were not to blame for violent behaviour. “For more than two decades, politicians have pointed to violent video games as the root cause of mass shootings in the United States, even though researchers have found no evidence to support that claim,” the report said. “But TTP’s study shows there is a mechanism that can lead boys who play video games into a world of mass shootings and gun violence: YouTube’s recommendation algorithm.”

TTP’s executive director, Michelle Kuppersmith, said: “It’s bad enough that YouTube makes videos glorifying gun violence accessible to children. Now, we’re discovering that it recommends these videos to young people. Unfortunately, this is just the latest example of Big Tech’s algorithms taking the worst of the worst and pushing it to kids in an endless pursuit of engagement.”

All four test accounts were disclosed to YouTube as being those of children, with consistent birthdates and, for the under-13s, a linked “parental” account in accordance with the video site’s policies. Despite that, the recommendation algorithm showed the users videos that were not only wildly age-inappropriate, but apparently in violation of YouTube’s terms of service altogether.

The report said: “Many of these videos violated YouTube policies, which prohibit showing ‘violent or gory content intended to shock or disgust viewers’, ‘harmful or dangerous acts involving minors’ and ‘instructions on how to convert a firearm to automatic’.

“YouTube took no apparent steps to age-restrict these videos, despite stating it has the option to do so with content that features ‘adults participating in dangerous activities that minors could easily imitate’.”

In a statement, a YouTube spokesperson said: “We offer a number of options for younger viewers, including a standalone YouTube Kids app and our Supervised Experience tools which are designed to create a safer experience for tweens and teens whose parents have decided they are ready to use the main YouTube app. We welcome research on our recommendations, and we’re exploring more ways to bring in academic researchers to study our systems.

“But in reviewing this report’s methodology, it’s difficult for us to draw strong conclusions. For example, the study doesn’t provide context of how many overall videos were recommended to the test accounts, and also doesn’t give insight into how the test accounts were set up, including whether YouTube’s Supervised Experiences tools were applied.”
