San Francisco preparing another blow to their own police

As technology moves forward, criminals often take advantage of cutting-edge tools (hacking and fraud being two of the most common examples) and law enforcement is left to play catch-up. But sooner or later, the government begins to take advantage of new technology in the battle against crime. One example is facial recognition software that can, in some cases, help identify a suspect and bring them in more quickly. But that won’t be happening in San Francisco, assuming a new measure currently coming up for a vote is signed into law. It would ban any use of facial recognition tools by municipal or county law enforcement. (CBS San Francisco)

The city of San Francisco is on the verge of becoming the first in the nation to ban the use of facial recognition technology for law enforcement.

On Monday, the rules committee of the San Francisco Board of Supervisors voted unanimously to pass the “Stop Secret Surveillance Ordinance,” which would bar city and county law enforcement agencies from using facial recognition systems.

“Facial recognition allows the government to know where we walk, what stores we visit, even if we’ve gone to a protest or a place of worship,” said Matt Cagle of the ACLU of Northern California.

The ACLU is fighting facial recognition software everywhere it’s being employed because Big Brother shouldn’t have a record of where you travel in public or something. They also (correctly) cite the fact that testing of facial recognition software has produced some seriously sketchy results in the past. That second point is a legitimate concern until the software is significantly improved. We recently looked at Amazon’s facial recognition product and how it was pretty good at correctly identifying white males, but in a significant number of test runs it was unable to determine that a black woman was even female. Similarly, another test found the software had mistakenly matched 28 members of Congress to mug shots of criminals.

But even recognizing that the software has a ways to go in terms of accuracy, what’s really the downside here? If it makes a mistake in identifying someone, the law enforcement officer who shows up to check will quickly determine they’ve got the wrong person and get back to the search. One area where the software is far more reliable (though still not 100%) is in reading automobile license plates. If you can quickly identify all of the vehicles that were at the scene of a crime or leaving the area shortly thereafter, you can really narrow down the search.

In the end, we’re talking about video captures of people when they are out in public, not in their homes, offices or private spaces. Your legal expectation of privacy is vastly lower when you step outside, and a human being could spot you near the scene of a crime and relay that information to the police just as easily as a camera can. Why San Francisco wants to continue making it harder for its police and sheriff’s deputies to do their jobs is a mystery.