- Reports of image-based sexual abuse and explicit deepfakes to the Metropolitan Police have more than doubled in the last five years, with a 120 per cent rise in non-consensual intimate image abuse complaints.
- The increase is largely attributed to the rapid rise of AI-powered 'nudification' tools, which facilitate the creation and sharing of realistic, sexually explicit images without consent.
- Under the new Crime and Policing Bill, social media platforms will be required to remove reported non-consensual intimate images within 48 hours or face substantial fines or having their services blocked in the UK.
- The legislation will also ban nudification tools used to create AI deepfakes, extend the reporting window for victims to three years, and introduce prison sentences for offenders and for developers of such tools.
- Experts and the Metropolitan Police are urging social media platforms to adopt a more comprehensive approach, one that integrates prevention, consent, and detection, to combat this escalating problem effectively.