Amazon sells facial recognition tools to cops and people are freaking out

Hey, are you using that wild Amazon product known as Rekognition? Me neither. And unless you’re running an online company of some significant size, most people probably aren’t. First of all, it’s pretty expensive if you put it to any extensive use, and it’s rather specific in what it does. It’s a facial recognition program listed under Amazon’s Artificial Intelligence offerings (gulp), and it can scan social media or other video feeds and pick out individual faces from both still pictures and video.

You know who some of their biggest customers are, right? Law enforcement. And that has certain online privacy advocates up in arms and demanding that the e-commerce giant stop selling it to the cops. (Associated Press)

Amazon’s decision to market a powerful face recognition tool to police is alarming privacy advocates, who say the tech giant’s reach could vastly accelerate a dystopian future in which camera-equipped officers can identify and track people in real time, whether they’re involved in crimes or not.

It’s not clear how many law enforcement agencies have purchased the tool, called Rekognition, since its launch in late 2016 or since its update last fall, when Amazon added capabilities that allow it to identify people in videos and follow their movements almost instantly.

The Washington County Sheriff’s Office in Oregon has used it to quickly compare unidentified suspects in surveillance images to a database of more than 300,000 booking photos from the county jail — a common use of such technology around the country — while the Orlando Police Department in Florida is testing whether it can be used to single out persons-of-interest in public spaces and alert officers to their presence.

So the ACLU and other civil rights groups are going public and demanding that Amazon stop selling this powerful facial recognition software to the police. Not to everyone, mind you… just to law enforcement. That’s a tricky proposition because such groups normally demand that the government either start doing something or stop some activity they consider harmful. But Amazon isn’t the government. They’re a private business entity selling a product that has apparently not been deemed illegal or dangerous in any fashion that would cause the government to restrict its sale.

Looked at in that light, Amazon is pretty much free to ignore them unless they can come up with some sort of court order forcing the company to cease and desist. But since the product would still be available to the general public under the terms of these demands, it’s tough to see a judge making that call.

The bigger issue here is the reason the ACLU gives for wanting to keep police from having this tool. They claim that Rekognition could allow the government to “easily build a system to automate the identification and tracking of anyone.” Um… isn’t that the point? And if you’re just wandering around minding your own business, why would the police want to track you to begin with? Sure, this software probably looks like something out of the Tom Cruise movie Minority Report, but technology is continually reshaping how our society operates.

I keep coming back to the same type of crime scenario when considering these online privacy questions. Imagine that some creep is out there using this software (or something like it) to stalk his ex-girlfriend. When he finally puts his plan into action and throws her in the trunk of his car, somebody has to call the police. Wouldn’t you like the cops to be able to feed that guy’s picture into their system and have it spit out the location of his car minute by minute? Of course you would, at least if the victim was your relative or friend.

But somehow privacy advocates still view this as “a bad thing,” as though it’s worth getting a bunch of people killed so long as you can continue to make the job of law enforcement harder. And for what? Because you imagine that Big Brother is stalking you every time you leave your house? This makes no sense to me.