Any time the terror alert status bubbles up to red or pink or whatever it currently is, the government looks for more and better ways to gather information. This month there are moves afoot to get the manufacturers of mobile phones and other high-end communications devices to open up their security protocols and allow law enforcement to dredge up whatever data may be found. So far the tech giants have been blocking any such moves, but the pressure is definitely being turned up. (The Hill)
Pressure is rising on Apple, Google and other technology companies to allow law enforcement and intelligence agencies access to encrypted phones and other devices.
In the wake of the coordinated terrorist attacks on Paris, CIA Director John Brennan, Sen. Dianne Feinstein (D-Calif.) and other critics are amplifying their arguments against Silicon Valley.
The rise of commercial encryption technology, they say, risks shielding terrorists from surveillance — raising the bar for law enforcement to thwart future attacks.
“We in many respects have gone blind as a result of the commercialization and the selling of devices that cannot be accessed either by the manufacturer or, more importantly, by us in law enforcement, even equipped with the search warrants and judicial authority,” New York City Police Commissioner William Bratton said Sunday on “Face the Nation.”
If this were as simple as saying, hey… we just want to catch the bad guys, it would be a no-brainer. Let’s crack into the phones of the terrorists and their domestic sympathizers and lock them up. But nothing’s ever that simple, is it?
People are already worried about the security of their phones, particularly given how integrated this technology is in each and every aspect of our lives. (Especially for younger citizens.) We don’t just call folks… we do our shopping and banking on our phones. We maintain our contact lists, which are themselves rich sources of data for those with malicious intent. So the first big problem that comes to mind is the hackers. It’s all well and good for the government to ask for a “back door” into the phones which would, we are assured, only be cracked open by John Law with an appropriate court order if probable cause existed to suspect we were up to no good. But the hackers are always better at all of this than the government is. If a back door exists, why would Americans believe for a moment that the hackers wouldn’t be sneaking in a few weeks later? Massive data breaches have already taken place at major commercial retailers, and naughty pictures of celebrities and regular folks alike have already been stolen. That’s a pretty big trade-off to ask for.
The second, more traditional concern of privacy advocates is that the government, having been made aware of such a back door into our iPhones, will be snooping around on anyone and everyone it doesn’t particularly care for, not just the guy sending money to ISIS. Political “enemies lists” are a longtime concern, and recent events at the IRS and other agencies haven’t done much to quell those fears.
So what do we do? Apple and Google don’t want to hand the keys over to the feds, but they will do so if the laws of the land force their hand. Do we just use our phones less, or do we blithely assume that if we’re not doing anything wrong we have nothing to worry about? The tech genie is out of the bottle, so going back to clunky, rotary dial phones on the kitchen table isn’t really much of an option for the majority of Americans. But if we deny the government the access it’s looking for, we leave an open channel for terrorist communications in place.
I don’t have an answer for this one. I’m more curious to hear what your thoughts are. Do you want your phone to have a “back door” that the government can peek into when warranted?