Earlier this month, we looked at Apple’s announcement that they would begin scanning users’ photos and encrypted messages, seeking to detect child pornography and report users trading in such images and videos. While nearly everyone (hopefully) agrees that cracking down on people who traffic in the sexual abuse of children is a worthy goal, concerns were immediately raised over the level of invasive scrutiny Apple would be applying to its users. There were also questions about the volume of data that would be harvested and where it might all end up.
This week, a coalition of nearly 100 policy groups and privacy advocates from around the globe signed off on a letter asking Apple to put the brakes on this plan. They expressed concerns over the likelihood that such a sweeping net would end up censoring protected speech and damaging privacy rights. Ironically, they also cited the possibility that such a policy could wind up hurting vulnerable children rather than protecting them. (Reuters)
More than 90 policy and rights groups around the world published an open letter on Thursday urging Apple (AAPL.O) to abandon plans for scanning children’s messages for nudity and the phones of adults for images of child sex abuse.
“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter, which was first reported by Reuters.
The largest campaign to date over an encryption issue at a single company was organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT).
Rather than conceding, Apple responded by saying once again that they have built “protections” into the system that will prevent abuse of it. They also don’t seem terribly concerned over the weakening of encryption protocols as long as they are the ones weakening them.
But how could this plan actually endanger children rather than protect them? The groups signing off on the letter expressed concerns over a filter that would blur images containing nudity on the accounts of minors, only allowing them to view the content if their parent or guardian approved it. They say that this could create danger for children in less tolerant households even if the content they were accessing was strictly educational in nature.
I suppose that could be true if the child were seeking medical information. But how much medical imagery involving nudity is out there and publicly available? Even if it’s far more than I would have guessed, the majority of nude images on the internet tend to have at least some erotic connotations.
The other major stumbling block has nothing to do with protecting children. Advocates are concerned that this change will introduce a back door to Apple’s famous end-to-end encryption regimen. And once the back door is installed, the government could compel providers to give it access. That sounds like a more viable objection if your primary interest is in blocking the government from any access to digital records. But it doesn’t address instances where law enforcement may require such data to solve crimes, including child abuse, when no other avenue of evidence collection is available.