Police are still blaming child victims of sexual grooming gangs for the attacks they suffer, an official report has found.
The inquiry by HM Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) comes more than a decade after scandals in Rotherham and Rochdale came to light, revealing failings by the authorities that left gangs of men free to attack vulnerable young girls.
The report found that while the situation had improved, progress was slow, and warnings from other official bodies had not been heeded. It also dismissed claims that one ethnic group posed more of a danger to children than any other.
The inspectorate said: “In 2013, the home affairs committee was able to report that child sexual exploitation was a ‘large-scale, nationwide problem’, which was increasing.
“With such a stark warning, we expected to find, 10 years later, that the police and other organisations had a greater understanding of the problem and had developed effective responses to protect children.
“In many respects, we were disappointed. We found that an accurate view of group-based child sexual exploitation still wasn’t available to the police service, data collection was unreliable, and intelligence gathering wasn’t prioritised.”
The report found that examples of group-based child sexual exploitation were missed, with some cases handled by non-specialist officers who were less likely to know what to look for. It also said law enforcement lacked a clear definition of group-based child sexual exploitation.
In one case, key evidence from mobile phones was not examined for a year. In another, a child and her friend who were being exploited by a 30-year-old man were themselves initially arrested before officers reversed course and began treating them as victims.
HMICFRS said that in three of the six forces it inspected, it found a dozen instances of victim blaming, which it said can stem from a poor culture within a force rather than failings by individual officers.
Examples included police staff saying about one child victim that “concerns [were] raised [due] to her general proclivity with older men”. In another instance, a missing child was described as “medium-risk due to age – streetwise and tends to return the next day”. In another case, a child was described as “putting herself in precarious situations”, while another child was described as a “difficult victim to engage with”.
The inspectorate said: “Victim-blaming language indicates that some police personnel don’t understand the vulnerability of children. It means that responses to protect and help them are at times inadequate and risk is missed.”
The report dismissed claims that attackers were likely to be predominantly from one ethnic group: “Any public perception that those responsible are predominantly from the Pakistani or south Asian community may be influenced by national media coverage of some of the cases … Furthermore, we didn’t find that this public perception was supported by the 27 group-based child sexual exploitation investigations we examined during the inspection.”
Wendy Williams, the lead inspector, said: “It cannot be overstated how complex and challenging these crimes can be to prevent and investigate, and the police can’t tackle them alone.
“Police and law enforcement bodies have improved how they support victims and understand their needs. However, the pace of change needs to increase, and this starts with understanding the problem. We found that the police, law enforcement bodies and the government still didn’t have a full understanding of the nature or scale of these crimes.”
Meanwhile, police chiefs have criticised decisions by large tech firms to roll out end-to-end encryption, which law enforcement believes will help shield serious criminals, including sexual predators.
The National Police Chiefs’ Council (NPCC) said it received a “staggering” number of reports of child sexual exploitation every month, uncovering 800 suspects and identifying 1,200 children as potential victims. A large number of these reports came from Meta-owned sites and apps including WhatsApp, Instagram and Facebook.
Ian Critchley, a deputy chief constable and lead for child protection at the NPCC, said: “The introduction of Meta’s new end-to-end encryption will have a dangerous impact on child safety. Meta will no longer be able to see messages from online groomers which contain child sexual abuse material and therefore they won’t be able to refer it to the police.
“There is a moral responsibility on media companies to ensure this does not happen.”
Meta has previously said it is developing “robust safety measures to prevent, detect and combat abuse while maintaining online security”.