The Independent UK
Technology
Andrew Griffin

More than half of teens have used AI to make sexual images of people, study finds

More than half of teens have made sexual images of people with artificial intelligence, a new study suggests.

As generative AI has grown, so has its use for “nudification”, where users instruct artificial intelligence tools to generate images of what real people might look like without clothes. But it has remained largely unclear how common such images – which are illegal in much of the world, whether or not they depict a real person – actually are.

Now a new study suggests that the majority of young people have both made and received such images.

And more than a third of those studied reported that someone had made non-consensual nude images of them.

“Teens are no longer just digital natives but AI-natives. ‘Nudification’ and GenAI apps are their new ‘sexting’, only with more challenging issues surrounding consent,” said Chad Steel, of George Mason University, who conducted the new research.

In the study, researchers surveyed 557 English-speaking US residents aged between 13 and 17. Participants completed an anonymous questionnaire asking how they had used artificial intelligence for sexual exploitation.

Of those in the survey, 55.3 per cent said they had used tools to create such an image of themselves or others, and 54.4 per cent said they had received such an image. Some 36.3 per cent said that someone had made a sexual image of them without their consent, and 33.2 per cent said that at least one such image had been non-consensually distributed.

The results were largely similar across the demographics that took part in the study, though male participants were more likely to create sexual images of themselves or others, with or without consent.

Those who had been victims of AI-powered sexual exploitation reported harms similar to those seen in other kinds of child sexual exploitation: fear and hypervigilance about who might have seen the images, avoidance of social media, a sense of powerlessness to stop the abuse, and a general dehumanisation, which together permanently disrupted their lives.

The work is reported in a new paper, ‘Prevalence of generative artificial intelligence sexualized image usage by adolescents in the United States’, published in the journal PLOS One.
