The Independent UK
Bryony Gooch

Reports of image-based sexual abuse rise by 120 per cent in five years

  • Reports of image-based sexual abuse and explicit deepfakes to the Metropolitan Police have more than doubled in five years, with complaints of non-consensual intimate image abuse rising by 120 per cent.
  • The increase is largely attributed to the rapid spread of AI-powered 'nudification' tools, which make it easy to create and share realistic, sexually explicit images of people without their consent.
  • Under the new Crime and Policing Bill, social media platforms will be required to remove reported non-consensual intimate images within 48 hours or face substantial fines or being blocked in the UK.
  • The legislation will also ban nudification tools used for AI deepfakes, extend the reporting period for victims to three years, and introduce prison sentences for offenders and developers of such tools.
  • Experts and the Metropolitan Police are urging social media platforms to adopt a more comprehensive approach, combining prevention, consent education and detection, to combat this escalating issue.
