The most extreme form of child sexual abuse material accounted for a fifth of such content found online last year, according to a report.
Category A material represented 20% of illegal images discovered online last year by the Internet Watch Foundation (IWF), a UK-based body that monitors the distribution of child sexual abuse material (CSAM). It found more than 51,000 instances of such content, a category covering the most severe imagery, including rape, sadism and bestiality.
The IWF annual report said the 2022 total for category A imagery was double the figure in 2020 and the increase was partly due to criminal sites selling videos and images of such abuse.
“We have seen criminals looking to exploit more and more insidious ways to profit from the abuse of children,” said Susie Hargreaves, the chief executive of the IWF. “I don’t think I can overstate the harm being done here. These are real children, and the suffering inflicted on them is unimaginable.
“They are being raped, and subjected to sexual torture, and criminals are making money off the back of that. It is truly appalling.”
The IWF said the number of webpages dedicated to making money off CSAM had more than doubled since 2020, with nearly 29,000 such pages identified last year. One content analyst at the IWF said child sexual abuse content was being treated as a “commodity” on some sites.
The IWF said it took action over more than 250,000 webpages last year, an increase of 1% on 2021, with three-quarters of them containing self-generated imagery, where the victim is manipulated into recording their own abuse before it is shared online.
The NSPCC, a child protection charity, said the figures were “incredibly concerning”. It said the government must update the online safety bill to ensure senior managers could be held liable for the presence of CSAM on their sites.
Under the bill, tech executives face the threat of a two-year jail term for failing to protect children from online harm. However, that provision covers content such as material promoting self-harm and eating disorders; it does not extend to CSAM.
The security minister, Tom Tugendhat, said companies offering end-to-end encrypted services – in which only the sender and recipient can see the content – must build in safety features that help detect abuse.
“Companies need to ensure that features such as end-to-end encryption have the necessary safety features built in so that they do not blind themselves to abuse occurring on their platforms,” he said.
The encrypted messaging service WhatsApp said last month that it would leave the UK rather than accept weakened encryption. It and other encrypted services are concerned by provisions in the bill that could force them to apply content moderation policies that would amount to circumventing end-to-end encryption.