FBI chief rips Apple, Google for adding unbreakable encryption to their smart phones

If I’m not mistaken, Android phones already have this capability. What has the feds exercised now is the fact that encryption will soon be the default option on Android phones and on Apple’s new iPhone, rather than something users have to switch on themselves.

Has there ever been a technology capable of protecting information so securely that law enforcement couldn’t get to it, even if they knew where it was? I feel like there must be obvious examples but I’m blanking on them. People have been using codes since the beginning of time but until recently breaking those codes was a matter of humans matching wits. We’re now at the point where computers are capable of generating codes so long and complex that it would take other computers ages to guess every possible password permutation.
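To put some rough numbers on "ages": here's a back-of-the-envelope sketch (my arithmetic, not anything from the article) of how long exhausting a key space takes, assuming an attacker who can test a billion keys per second.

```python
# Rough brute-force math: how long to try every possible key,
# assuming an attacker testing one billion (1e9) keys per second?

def brute_force_years(key_bits: int, guesses_per_second: float = 1e9) -> float:
    """Years needed to exhaust a key space of 2**key_bits at the given rate."""
    seconds = 2 ** key_bits / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

for bits in (40, 128, 256):
    print(f"{bits}-bit key: ~{brute_force_years(bits):.3g} years to exhaust")
```

A 40-bit key (1990s-era export crypto) falls in under an hour at that rate, while a modern 128- or 256-bit key takes more years than the universe has existed. That gap is the whole story: it's no longer humans matching wits, it's arithmetic nobody can outrun.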

“There will come a day when it will matter a great deal to the lives of people . . . that we will be able to gain access” to such devices, [FBI director James] Comey told reporters in a briefing. “I want to have that conversation [with companies responsible] before that day comes.”

Comey added that FBI officials already have made initial contact with the two companies, which announced their new smartphone encryption initiatives last week. He said he could not understand why companies would “market something expressly to allow people to place themselves beyond the law.”

“Apple will become the phone of choice for the pedophile,” said John J. Escalante, chief of detectives for Chicago’s police department. “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”

A few of the comments in our Headlines thread on this have me thinking that not everyone understands what this means. One reader wrote defiantly that if the feds, i.e. the NSA, want to know what’s on their phone, they can go ahead and get a search warrant. But … that’s Comey’s point: The new technology would render even valid search warrants useless. Currently, if the cops want to know what’s on someone’s phone, they can serve Apple or Google with a warrant and the company can unlock the phone on the back end. With the new encryption, the company won’t be able to do that; the code is so hard to break that only its owner would know how to unlock it. Terrorists, pedophiles, you name it: As long as they keep their sensitive info on the phone’s hard drive and don’t stupidly upload it to an external server like “the Cloud,” there’d be no way for cops to reach it.
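The reason the company can't comply even if it wants to comes down to where the key lives. Here's a simplified illustration (not Apple's or Google's actual scheme, just the general idea): the key that encrypts the phone's storage is derived on the device from the user's passcode plus a random per-device salt, so the vendor never possesses anything a warrant could extract.

```python
import hashlib
import os

# Simplified sketch of passcode-derived device encryption (illustrative
# only; real phones entangle the passcode with a hardware key too).
# The vendor never sees the passcode or the derived key, so serving the
# vendor with a warrant yields nothing that can decrypt the device.

salt = os.urandom(16)  # random salt stored on the device; useless by itself

def derive_key(passcode: str) -> bytes:
    # Many PBKDF2 iterations make each passcode guess deliberately slow,
    # which also throttles brute-force attempts against short passcodes.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

owner_key = derive_key("4921")          # only the owner knows "4921"
print(derive_key("4921") == owner_key)  # right passcode reproduces the key
print(derive_key("0000") == owner_key)  # any other guess yields a different key
```

Only the correct passcode regenerates the key; there's no master copy sitting on a server in Cupertino or Mountain View for police to demand.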

Law prof Orin Kerr wrote a long post about this last week, noting that he couldn’t see any reason for the new technology except to thwart lawful search warrants that have been obtained in full compliance with the Fourth Amendment. (He also imagined a case where cops might need information stored on the phone of a murder victim to solve the case but wouldn’t be able to touch it because of the encryption.) He walked that back a bit in a later post after people pointed out that a phone that can’t be broken by the company that built it also can’t be broken by hackers. It’s fully secure from prying eyes of all kinds — possibly including the NSA, although who knows what capabilities they’ve developed — so long as you’re not sharing the information on it with an outside server, which will be increasingly important as people start putting more and more of their vital data (including health data and financial data) on their phones. As one Headlines commenter put it, if the FBI has a problem with this they should direct their complaints not to Apple and Google but to the NSA. After all, it’s public alarm over mass surveillance and consumer upset over telecoms’ collusion with the feds that created the demand that the companies are now trying to meet with the new encryption tech. Most people, I’d bet, support making smart phones searchable if the FBI has a warrant. But since the only options at this point seem to be “searchable by the NSA” and “searchable by no one,” go figure that there’s a market for the latter.

Two points here. One: It could be that Comey’s worry is overblown simply because so much of our most sensitive data is in fact also stored on external servers. E-mails, texts, tweets, Facebook updates, on and on — it’s all landing on telecoms’ hard drives, where the FBI and NSA can hoover it up. It’s hard to believe that, as health tracking apps become more popular, the data they generate will remain completely contained on your phone and not be transmitted to some third party as well. Super-encryption might make the FBI’s job harder by forcing them to go knocking on multiple telecoms’ doors to get the info they could extract in 10 minutes if they had your phone, but it probably won’t thwart most investigations outright. Two: It may be that courts will decide that suspects can be compelled to unlock their phones pursuant to a subpoena. That was a hot topic in the first Kerr post that I linked above. The Fifth Amendment privilege against self-incrimination protects you from having to divulge the information stored in your brain, but what about the information stored on your phone? I’ll leave it to legal eagles to argue that out but do note that Kerr thinks people can be compelled to unlock their phones on pain of contempt of court if they refuse. If the Supreme Court agrees, it will have created a legal backdoor to replace the technological backdoor that Apple and Google closed. It used to be that cops could make the telecoms unlock your phone; now they’ll make you do it instead.