The state of Utah has contracted a new AI company to combine, compile and analyze all manner of data applicable to law enforcement and first responder operations, using newly emerging artificial intelligence to locate potential crimes or emergencies in “real-time” (or as close as possible, anyway). The name of the company is Banjo, and they have been awarded a five-year contract to tap into traffic cameras around the state, CCTV feeds, the audio from 911 call centers and more. Like some other new tech startups we’ve covered here in the past, Banjo also “scrapes” social media outlets with location identification and collects images of people posted online.
The end purpose is for the system to identify “emergency events” and not only alert law enforcement or other first responders as applicable, but to match up available resources so they can be deployed more quickly. We’re not just talking about crimes, but also traffic accidents, fires, flooding or anything else you could imagine. But as you would expect, the usual suspects are completely up in arms over this “creeping surveillance state” where everyone will be under the thumb and watchful eye of Big Brother. At Vice, Jason Koebler describes the various functions and paints a dystopian picture of a future that’s only a few steps away from Minority Report.
The company, called Banjo, says that it’s combining this data with information collected from social media, satellites, and other apps, and claims its algorithms “detect anomalies” in the real world.
The lofty goal of Banjo’s system is to alert law enforcement of crimes as they happen. It claims it does this while somehow stripping all personal data from the system, allowing it to help cops without putting anyone’s privacy at risk. As with other algorithmic crime systems, there is little public oversight or information about how, exactly, the system determines what is worth alerting cops to…
Privacy experts are unsure how Banjo can be doing anything other than applying machine learning to a terrifying amount of data to create a persistent panopticon pointed at everyone who lives in Utah.
For those not familiar, the “panopticon” is a design concept incorporated into the construction of various prisons. I’ve watched a couple of documentaries on the subject and it’s really quite fascinating. They built tall, circular prisons with multiple levels of cells all facing toward a central guard tower. That way, all of the prisoners would be visible to the jailers at all times. There was one built in Illinois, but the rest were all in other countries.
The usual reminders about these sorts of complaints all apply. First of all, the state of Utah and any private sector entities voluntarily participating in the program were already collecting all of this data. Banjo isn’t putting up new cameras or microphones in people’s cars or homes. The cameras, phone lines and all the rest were already in place and collecting data. The problem is that there is so much data that it’s impossible to monitor it all and immediately find problems before they get out of hand. That’s what the AI is doing here.
As for the collection of images from social media and other sources, those are pictures and details that people willingly paste onto the web themselves. Nobody is making you do it, and anyone, including law enforcement, is free to look at them. Once again, Banjo is just accelerating the process.
And what do we get in return? In one example cited by authorities, the Utah Attorney General’s office worked with law enforcement to set up a simulated child abduction drill involving more than 100 officers and all of the normal law enforcement resources. It took them over eight hours to eventually locate the child. They then ran the same simulation using Banjo, and the system found the abducted child in 27 seconds and directed police to the location.
Other examples abound, such as the ability to spot buildings that have caught fire and route the closest firefighting assets toward the event before anyone could even dial 911. The system is still far from foolproof, but the possibilities seem amazing.
Now consider that child abduction test above and ask yourself one question. Would you rather see that child returned to his or her mother by police in a matter of minutes, or is having your selfie from spring break show up in the Banjo database too high a price to pay?
For some additional reading on facial recognition and the frightening police state we’re apparently all living under now, check out this article about a group of artists in London. They’ve started painting their faces in bizarre patterns in an attempt to thwart all of the facial recognition software being used across the pond. Hey… suit yourself. You can start wearing huge red clown noses for all I care.