Apple's encryption surrender


Apple portrays its new child porn detection program as necessary to protect kids, but it also gives governments and hackers new avenues to spy on people. Jazz wrote about this earlier, but it’s an extremely disappointing surrender on encryption from a company previously focused on user security.


Two parts of Apple’s announcement raise privacy concerns. The first is the tech giant’s installation of a review module on individual devices in hopes of finding potential child pornography.

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result,” said Apple’s summary of the plan, which also claims there’s a minuscule chance of a false positive. “The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
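To make the mechanics a little more concrete, here’s a rough Python sketch of the match-then-voucher step as I read Apple’s summary. The hash function, the hash database, and the “encrypted” fields are all stand-ins I made up for illustration; the real system uses a perceptual NeuralHash and genuine private set intersection cryptography, not a plain SHA-256 lookup.

```python
import hashlib
import json
import os

# Hypothetical stand-in for the on-device flow Apple describes. The real
# design uses a perceptual hash and private set intersection; this uses a
# plain digest and an in-memory set purely to show the shape of the step.
KNOWN_CSAM_HASHES = {"placeholder-hash"}  # on a real device: a blinded database

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of the image."""
    return hashlib.sha256(image_bytes).hexdigest()

def make_safety_voucher(image_bytes: bytes) -> dict:
    """Build the metadata uploaded to iCloud Photos alongside the image."""
    digest = image_hash(image_bytes)
    matched = digest in KNOWN_CSAM_HASHES
    # In the real protocol the match result is encrypted so that neither the
    # device nor the server can read it on its own; this dict is only a
    # conceptual placeholder for that encrypted payload.
    return {
        "voucher_id": os.urandom(8).hex(),
        "encrypted_match_result": matched,   # would be ciphertext, not a bool
        "encrypted_image_metadata": digest,  # would also be ciphertext
    }

if __name__ == "__main__":
    voucher = make_safety_voucher(b"example image bytes")
    print(json.dumps(voucher, indent=2))
```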

So much for security. Hacker groups would only need to compromise the private set intersection technology and, voilà, they’d have access to every individual’s Apple product. Hackers found a flaw in the “Find My iPhone” software to obtain passwords and nudes in 2014. This new program potentially allows similar access to iMessages, not just iCloud images, despite Apple’s promises to the contrary.

The second issue is the reliance on an algorithm, then a human, to review possible matches to known child porn. The tech giant vows no one will take a peek at images unless they are flagged against the known Child Sexual Abuse Material (CSAM) hashes.

“Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images,” the Tim Cook-led company professes. “Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.”
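That “threshold” language is worth unpacking. One standard way to get the property Apple describes, where nothing can be decrypted until enough matches pile up, is a threshold secret-sharing scheme. The sketch below uses textbook Shamir secret sharing to illustrate the idea; it is my own illustration of the concept, not Apple’s actual construction, and the key, threshold, and share counts are invented.

```python
import random

# Illustration of a (t, n) threshold scheme: any t shares recover the secret,
# fewer than t reveal essentially nothing about it.
PRIME = 2**127 - 1  # prime field large enough for the demo secret

def make_shares(secret: int, threshold: int, count: int) -> list[tuple[int, int]]:
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    account_key = 0xDEADBEEF  # pretend per-account voucher decryption key
    shares = make_shares(account_key, threshold=30, count=100)
    # Each flagged voucher would carry one share; with fewer than 30 matches
    # the key, and therefore the voucher contents, stays unreadable.
    assert recover(shares[:30]) == account_key
    assert recover(shares[:29]) != account_key
    print("threshold property holds")
```

The obvious catch, and the reason for the questions below, is that whoever controls the threshold and the shares controls when those vouchers become readable.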


This raises several important questions. Who at Apple reviews each image, and what security procedures are in place to prevent a hack of Apple’s infrastructure? What’s the likelihood of threshold or safety voucher manipulation to get access to the photos without disabling the account or sending NCMEC a report? Could accounts be held for ransom by hackers who then lock Apple out of restoring them?

It’s also not known how people could appeal a suspension or who determines whether appeals are valid. Twitter’s and Facebook’s problems with account locks are well known. How will Apple prevent a similar situation from happening?

There’s no doubt Apple factored the government into this announcement. It’s possible, albeit a bit cynical, that the company considered certain provisions in the For The People Act, particularly Section 2401-4211. The bill puts new rules on tech companies regarding online advertising, expanding the definition of political advertising and requiring the disclosure of non-profit donors. Let’s say someone downloads a meme or image they see from a political non-profit flagged by the government under the For The People Act. The government, specifically the Federal Election Commission, could instruct tech companies to limit its distribution through texts or social media posts using the same technology previously used in the hunt for child porn, all under the guise of “protecting elections.” EFF raised similar concerns, without mentioning the new rules under the For The People Act, and noted how certain images could be added to the terrorist database. Eugene Volokh at Reason worried about Fourth Amendment problems, particularly whether the program violates the right to be “secure” in one’s “papers.”


Perhaps it’s time to return to my old Motorola StarTAC for communication. Glibness aside, this new move by Apple is disappointing and will probably lead to abuse at some point in the near future. It needs to be scrapped. The good news is that people can use more secure methods of encryption, including apps like Signal.
