Far-right riots may not be the calmest moment for a reasoned debate about the regulation of big tech, but the eruption of racist violence in England and Northern Ireland raises urgent questions about the responsibilities of social media companies and about how the police use facial recognition technology. While social media isn’t the root of these riots, it has allowed inflammatory content to spread like wildfire and helped rioters coordinate. Keir Starmer has pledged to address online impunity and to increase the use of live facial recognition technology. We should applaud the first pledge but be very wary of the second.
Both technologies have profound implications for democratic accountability. Take social media. We are used to the adage that what’s illegal in the real world is illegal in the virtual universe too. But in practice, years of cuts to the justice system have left it ill-equipped to deal with the growth of racist and inflammatory content online. Explicit threats and incitements to violence have gone unpoliced, while dangerous and deliberate misinformation has spread. Both can be weaponised by hostile actors, including enemy states.
The elephant in the room is the wealth, power and arrogance of the big tech emperors. Silicon Valley billionaires are richer than many countries. Some believe they can buy current and recently retired politicians, and see themselves as above both democracy and the law. That mature modern states should allow them unfettered freedom to regulate the content they monetise is a gross abdication of duty, given their vast financial interest in stoking insecurity and division. Is Elon Musk the legitimate arbiter of whether Mr Yaxley-Lennon’s social media broadcasts are a threat to public order and security in the UK or anywhere else? Isn’t it time we had judicial processes that would allow for the removal of particular material and the proportionate blocking of certain users?
Similar questions arise at the sharp end of law enforcement over the use of facial recognition. In recent years, this technology has been used on our streets without any significant public debate or any parliamentary authorisation. We wouldn’t dream of allowing telephone taps, DNA retention or even stop and search and arrest powers to be so unregulated by law, yet this is precisely what has happened with facial recognition, thanks to a Conservative government that was casual about both the law and the doling out of public money to private contractors.
Our facial images are gathered en masse via CCTV cameras, the passport database and the internet. At no point were we asked about this, nor is there any statutory legal basis for it. People are placed on “watchlists” with no statutory criteria. These lists typically include victims and vulnerable people alongside “people of interest” and convicted criminals. Individual police forces have entered into direct contracts with private companies of their choosing, making opaque arrangements to trade our highly sensitive personal data with firms that use it to develop proprietary technology. And there is no specific law governing how the police, or private companies such as retailers, are authorised to use this technology.
Facial recognition is incredibly intrusive and capable of achieving complete societal surveillance. It can also create significant injustice, particularly when it is deployed in real time rather than after a crime has been captured on camera. Live facial recognition depends on cameras in public places that capture everyone who passes by. This increases the risk of false positive matches, where members of the public are incorrectly identified and flagged to police or security staff: even a highly accurate system scanning thousands of innocent faces for a handful of suspects will inevitably generate false alarms, and the rarer genuine matches are, the greater the proportion of alerts that will be wrong. Unless the person flagged can then persuade an officer that they are not the person on the list, they risk an intrusive stop and search, or even arrest.
This poses a huge threat to everyone’s civil liberties, but there are further concerns about racial bias. Experts at Big Brother Watch believe the inaccuracy rate for live facial recognition since the police began using it is around 74%, and there are many legal cases pending over false positive identifications. One involved a black youth worker who was falsely matched and detained in the street even after he had identified himself to the police. Another involved a black teenager who was searched and ejected from a shop after being told she had been banned; again, the shop’s facial recognition camera had misidentified her. As the post office operators will testify, such are the human dangers of believing in the infallibility of the machine.
The EU’s AI act, which came into force on 1 August, bans the use of live facial recognition in all but extreme circumstances (such as an imminent terrorist threat). Why doesn’t the UK have something similar? If we want to lead the world in exciting new technologies, we must surely seek to regulate them as our international partners do. The police and the government have so far been silent on this legal black hole.
When corporate lobbyists descend on Liverpool for the first party conference of a new Labour government this September, ministers must not be deterred from making big tech more fairly accountable. Nor should companies be given yet more unaccountable contracts for intrusive technology that replaces legal code with opaque computer code. Total societal surveillance is a dangerous and poor substitute for intelligence-led community cooperation and policing. VIP lanes and cosy chats with billionaires at Bletchley Park must be replaced by parliamentary process and primary legislation. Not only would this protect our liberties under the law, but it could also go some way towards rebuilding people’s trust in politics.
Shami Chakrabarti is a lawyer, Labour member of the House of Lords and the author of Human Rights: The Case for the Defence