Across the United States, police departments are increasingly using AI facial recognition software to identify crime suspects. The technology can compare faces captured in surveillance footage against millions of driver's license photos. But it has drawn serious concerns from experts, who warn that it is more likely to make errors when identifying people of color, with severe real-life consequences.
The case of a Detroit homeowner who was misidentified by facial recognition software illustrates the system's faults. She was falsely accused of carjacking and robbery, arrested, and spent 11 hours in jail. The false identification came from the software matching an eight-year-old mugshot of her to a carjacking suspect in a video. Despite the mishap, the local police department defended the technology and blamed its officers for the error.
Matthew Gombolay, a Georgia Tech professor and researcher, describes the technology as 'not ready for deployment'. A startling Georgetown Law report found that, as of 2016, roughly one in two American adults was included in a law enforcement face recognition network without their knowledge or consent.
Notably, the technology's accuracy drops significantly when identifying people of color compared with white people. A 2019 U.S. government study found that some facial recognition algorithms were up to 100 times more likely to misidentify people of color. Calvin Lawrence, an AI specialist, raises a further concern: the technology is not fair to everyone. He notes that the algorithms can be biased because Black people are overrepresented in mugshot databases, while the datasets most developers use to train the software skew toward white faces.
As the technology continues to spread, it is crucial to address these misidentifications and ensure the tools deliver accurate and fair results. The Detroit homeowner's case led to a $25 million lawsuit, a costly illustration of the price of getting AI facial recognition wrong.