The Guardian - UK
Technology
Rachel Hall

‘Urgent clarity’ sought over racial bias in UK police facial recognition technology

A facial recognition system deployed by the Metropolitan police at Oxford Circus in London on 13 May. Photograph: Leon Neal/Getty Images

The UK’s data protection watchdog has asked the Home Office for “urgent clarity” over racial bias in police facial recognition technology before considering its next steps.

The Home Office has admitted that the technology was “more likely to incorrectly include some demographic groups in its search results”, after National Physical Laboratory (NPL) testing of its application within the police national database.

The report revealed that the technology, which is intended to be used to catch serious offenders, is more likely to incorrectly match black and Asian people than their white counterparts.

In a statement responding to the report, Emily Keaney, deputy commissioner of the Information Commissioner’s Office (ICO), said the watchdog had asked the Home Office “for urgent clarity on this matter” so that it could “assess the situation and consider our next steps”.

Those next steps could range from enforcement action, such as a legally binding order to stop using the technology or a fine, to working with the Home Office and police to make improvements.

Keaney said: “Last week we were made aware of historical bias in the algorithm used by forces across the UK for retrospective facial recognition within the police national database.

“We acknowledge that measures are being taken to address this bias. However, it’s disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.

“While we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust. The ICO is here to support and assist the public sector to get this right.”

Police and crime commissioners said publication of the NPL’s finding “sheds light on a concerning inbuilt bias” and urged caution over plans for a national expansion, which could see cameras placed at shopping centres, stadiums and transport hubs without adequate safeguards in place.

The findings were released on Thursday, hours after Sarah Jones, the policing minister, had described the technology as the “biggest breakthrough since DNA matching”.

Facial recognition technology scans people’s faces and cross-references the images against watchlists of known or wanted criminals. It can be applied to live footage of people passing cameras, comparing their faces with those on wanted lists and enabling officers to target individuals as they walk past mounted cameras.

Police officers can also retrospectively run images of suspects through police, passport or immigration databases to identify them and check their backgrounds.
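
To illustrate the general principle at work, the sketch below shows how a face-matching search against a watchlist is typically structured: each face is reduced to a numerical embedding, and a probe image is compared against every watchlist entry, with anything scoring above a similarity threshold returned as a candidate match. This is a minimal, hypothetical illustration only; the embedding model, threshold and function names are assumptions, not the actual system used with the police national database.

```python
# Hypothetical sketch of watchlist matching via face embeddings.
# The random vectors stand in for the output of a real face encoder;
# nothing here reflects the NPL-tested system's actual design.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def search_watchlist(probe: np.ndarray,
                     watchlist: dict[str, np.ndarray],
                     threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return watchlist identities scoring above the threshold, best first."""
    hits = [(name, cosine_similarity(probe, emb))
            for name, emb in watchlist.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)


# Illustrative 128-dimensional embeddings for a 1,000-entry watchlist.
rng = np.random.default_rng(0)
watchlist = {f"subject_{i}": rng.normal(size=128) for i in range(1000)}
probe = rng.normal(size=128)
print(search_watchlist(probe, watchlist))
```

The threshold is the “setting” referred to in the NPL findings below: lowering it returns more candidate matches, which raises the chance of catching a genuine suspect but also raises the rate of false positives.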

Analysts who examined the police national database’s retrospective facial recognition tool at a lower setting found that “the false positive identification rate (FPIR) for white subjects (0.04%) is lower than that for Asian subjects (4.0%) and black subjects (5.5%)”.

The testing found that the number of false positives for black women was particularly high. “The FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%),” the report said.
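
To make those percentages concrete, the arithmetic below assumes the standard biometric definition of FPIR, the share of searches for people not on the watchlist that nonetheless return a match; the article does not quote the NPL report’s exact methodology, and the counts used are purely illustrative.

```python
# Illustrative arithmetic only: the definition of FPIR as
# false positives / non-mated searches is the standard biometric
# convention, assumed here rather than quoted from the NPL report.
def fpir(false_positives: int, non_mated_searches: int) -> float:
    """False positive identification rate, as a percentage."""
    return 100.0 * false_positives / non_mated_searches


# With 10,000 searches of people who are NOT on the watchlist,
# the reported rates would correspond to roughly:
print(fpir(4, 10_000))    # 0.04% -> white subjects
print(fpir(400, 10_000))  # 4.0%  -> Asian subjects
print(fpir(550, 10_000))  # 5.5%  -> black subjects
print(fpir(990, 10_000))  # 9.9%  -> black female subjects
```

On this reading, a black woman searched against the database would be wrongly flagged roughly one time in ten at the tested setting, against about one in 2,500 for a white subject.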

Responding to the report, a Home Office spokesperson said the department took the findings “seriously”, and had already taken action, including procuring and testing a new algorithm “which has no statistically significant bias”.

“Given the importance of this issue, we have also asked the police inspectorate, alongside the forensic science regulator, to review law enforcement’s use of facial recognition. They will assess the effectiveness of the mitigations, which the National Police Chiefs’ Council supports,” the spokesperson said.
