Up until now, facial recognition software has largely remained in the vaults of some of the tech giants like Amazon, Microsoft and Google. This has alarmed many privacy advocates while simultaneously promising improvements in the effectiveness of law enforcement and efforts to stop terrorist plots. But what if this technology moves out of those controlled confines and spreads throughout the private sector? What if it becomes embedded in something as simple as an app that anyone can download on their phone?

This is apparently no longer a “what if” scenario. A company called Clearview AI has developed just such an app: the user takes a picture of someone with their phone, and the app runs a search against a massive database of images. The results can potentially provide a name and perhaps even an address or other personal information. If you thought privacy advocates were upset before, this is going to take it to an entirely new level. (CNET)

What if a stranger could snap your picture on the sidewalk then use an app to quickly discover your name, address and other details? A startup called Clearview AI has made that possible, and its app is currently being used by hundreds of law enforcement agencies in the US, including the FBI, says a Saturday report in The New York Times.

The app, says the Times, works by comparing a photo to a database of more than 3 billion pictures that Clearview says it’s scraped off Facebook, Venmo, YouTube and other sites. It then serves up matches, along with links to the sites where those database photos originally appeared. A name might easily be unearthed, and from there other info could be dug up online.
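Clearview hasn’t published how its matching works, but systems like this generally follow the same broad pattern: convert each face photo into a numeric “embedding” vector, then compare the query photo’s vector against the stored ones and return the closest matches along with where each stored photo came from. The sketch below is illustrative only — the database entries, URLs, threshold, and hand-made vectors are all invented stand-ins, not anything from Clearview:

```python
import math

# Toy illustration of the general face-matching technique. Real systems
# produce embeddings with a trained neural network; here, short hand-made
# vectors stand in for precomputed face embeddings.

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_matches(query, database, threshold=0.9):
    """Return (source_url, score) pairs above the threshold, best first."""
    hits = []
    for source_url, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score >= threshold:
            hits.append((source_url, score))
    return sorted(hits, key=lambda hit: hit[1], reverse=True)

# Hypothetical database mapping a scraped photo's source URL to its embedding.
db = {
    "https://example.com/profile/alice": [0.9, 0.1, 0.2],
    "https://example.com/profile/bob":   [0.1, 0.9, 0.3],
}

# Embedding of the freshly snapped photo: very close to "alice".
query = [0.88, 0.12, 0.21]
print(find_matches(query, db))
```

The key point for the privacy debate is the last step: the match comes back with a link to the page the photo was scraped from, which is what turns an anonymous face into a name.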

I first became aware of this new app via a female Twitter user who was clearly upset over the news. (Language warning)

I can’t say I’m unsympathetic to her fears. Nobody wants to pick up a stalker, and that’s a concern that’s particularly applicable to women. But at least for the moment the app isn’t available to the general public. Police forces in multiple locations are testing it out, however. And the company says that it expects the app to be available to everyone sooner or later.

So is this something the government needs to prevent? Congress has already held hearings exploring just such questions about facial recognition technology, including several last year. But what justification would they use to clamp down on this sort of software?

It’s not illegal to take pictures of other people with your phone when you’re out in the public square. (Once you walk out your front door, your expectation of privacy drops to almost zero, aside from places like hotel rooms, bathrooms, showers, or locker rooms.) The image library the company searches for matches is drawn largely from social media — pictures that, for the most part, people willingly shared with the public. Precisely what harm is being caused if an app matches the images up?

The real crime takes place if you use the information to begin stalking someone and threatening or injuring them. But that was already illegal anyway. Banning this sort of software effectively makes criminals out of anyone with a phone, based on the assumption that you would use the app with ill intent.

I’m not saying it’s not disturbing. It definitely is. But like every other major technological advancement, we find ourselves dealing with something that might be both a blessing and a curse. The same software and tools that allow you to build broad, sweeping communities to engage with online also open you up to intrusive collisions with bad actors. Every time you post a selfie, you risk exposing details about your location and habits that determined stalkers could make use of. All this software is doing is streamlining the process.

For better or worse, the facial recognition genie is already out of the bottle. And to think that it wasn’t going to eventually seep out into the public domain was shortsighted. Of course, you could choose to throw away your devices and stay largely off the grid. But that’s a trade that most younger people simply won’t be prepared to make.