The Chicago PD recently upped their facial recognition game with the introduction of new software and, as always, people are up in arms about it. This report from CBS Chicago is rife with complaints about possible invasions of privacy, racism and all the usual kvetching. But the fact is, the technology is working. The big sticking point here is that the police are using Clearview AI, which we’ve covered here before. The fact that the company uses a massive database containing images scraped from various social media sites has some people worried that the Police State is coming for us all.
The department has used some form of facial recognition tech since at least 2013. They say it’s used within the law, and never used alone to make an arrest, but there are still privacy concerns.
The technology is called Clearview AI.
The controversial new facial recognition software relies on billions of photos scraped from social media and websites.
These are photos the company grabs without permission.
Both the police and City Hall have done their best to allay the concerns of Chicago residents, though that has done little to quiet the outrage machine. They’ve addressed most of the common complaints about facial recognition software that crop up in public debate.
First of all, they’re not using Clearview AI in “real-time” to randomly scan crowds and identify people. The system is only employed after an actual crime is reported or suspected and there is a video or still photograph available showing the suspect’s face. Even then, given the high rate of misidentifications produced by most software of this type, they never issue a warrant based solely on a match generated by the system. A human investigator always has to confirm the match. While the software makes mistakes, a trained law enforcement official can almost always quickly eliminate false positives.
The payoff is significant in terms of effectiveness. In the bad old days, a witness to a crime would need to sit for hours or days poring over books full of mugshots, relying only on their memory. This system takes an image and compares it to a massive database of mugshots and other image sources, coming up with possible matches in a matter of seconds. That can mean the difference between catching a criminal near the scene of the crime and losing them in the crowd, perhaps permanently.
It’s not as if Chicago doesn’t need more help in solving crimes. While nowhere near as bad as Baltimore, Chicago has suffered from a surge of shootings and murders in recent years that only began to level off in 2019. On average, 74% of these murders go unsolved. Between 2007 and 2017 there were 5,534 homicides in Chicago and just 26% of them resulted in a conviction.
There has yet to be a single report of a case in Chicago where someone incorrectly identified as a suspect by facial recognition was subsequently charged, let alone convicted. So if this software gives the cops a leg up on the bad guys and isn’t leading to hundreds of innocent people being rounded up, why not let them at least give it a try? Everyone values their privacy, but sometimes these sorts of protests can be taken too far.