Apple forges ahead with plan to scan iCloud photos for child sexual abuse material

Written by Osano Staff | August 17, 2021

"Apple's compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company's leadership in privacy and security.   the Electronic Frontier Foundation on Apple's plan to fight child abuse media 

Recently, Apple announced a new plan to help it detect child pornography online, or what's called Child Sexual Abuse Material (CSAM). The announcement triggered an onslaught of criticism, partly because of the plan's significant privacy implications and partly because Apple fumbled the reveal, as the company would later admit.

Apple worked with child safety experts to develop a three-pronged approach. First, families that opt in will see on-device warnings in Messages that alert children and their parents before sensitive content is shared. Specifically, if a child age 12 or younger tries to send a message that Apple's iMessage algorithm deems to be "sensitive," the system will warn the child. If the child continues to engage, it will send a warning message to the parents.
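
To show the shape of that two-step check, here is a minimal Swift sketch of the gating flow as the paragraph above describes it. It is not Apple's actual Messages implementation; the classifier result, the age cutoff and the SendDecision type are illustrative assumptions.

```swift
// A minimal sketch of the described warning flow, NOT Apple's actual
// Messages implementation. The classifier flag, age cutoff and the
// SendDecision type are illustrative assumptions.

enum SendDecision {
    case sendNormally   // nothing sensitive detected, or not an opted-in child account
    case warnChild      // first step: show the on-device warning to the child
    case notifyParents  // child chose to proceed anyway, so parents are warned too
}

func decide(imageFlaggedSensitive: Bool,
            childAge: Int,
            childChoseToProceed: Bool) -> SendDecision {
    // The extra checks only apply to sensitive content on an opted-in
    // child account (age 12 or younger, per the article's description).
    guard imageFlaggedSensitive, childAge <= 12 else { return .sendNormally }
    // The child is warned before anything is shared.
    guard childChoseToProceed else { return .warnChild }
    // If the child continues to engage, the parents are notified.
    return .notifyParents
}

// Example: a flagged image from a 10-year-old who taps through the warning.
print(decide(imageFlaggedSensitive: true, childAge: 10, childChoseToProceed: true))  // notifyParents
```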

Second, updates to "Siri" and Apple's search function will "intervene" when users search for CSAM-related topics. 

And most controversially, with upcoming iOS updates, Apple plans to do on-device scanning of users' iCloud photos to hunt for CSAM. The underlying approach is not new; it resembles PhotoDNA, a hash-matching technology Microsoft developed more than a decade ago with a professor now at UC Berkeley. Technically, here's how it works: Apple's iOS will compare a user's photos against a database of known child sex abuse material. The system uses what Apple calls a "NeuralHash," a process that analyzes an image and assigns it a unique number. A photo is flagged as a "match" if its number corresponds to one derived from known child sex abuse material. The National Center for Missing & Exploited Children (NCMEC) maintains the database of known images that users' photos are compared against. Once a user's photos are flagged, a human reviewer will confirm the photos and, if the match is accurate, report them to NCMEC. To guard against false positives, Apple said it will only flag an account if the algorithm detects 30 or more CSAM matches.
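
To make the threshold logic concrete, here is a minimal Swift sketch of the general hash-matching idea: compare each photo's hash against a set of known hashes and flag the account only once a minimum number of matches accumulates. It is not Apple's NeuralHash or its private set intersection protocol; the hash values, the knownCSAMHashes set and the matchThreshold constant are illustrative placeholders.

```swift
import Foundation

// A minimal sketch of threshold-based hash matching, NOT Apple's actual
// NeuralHash or its cryptographic matching protocol. All values below are
// illustrative placeholders.

// Hypothetical perceptual hash: in the real system a neural network maps an
// image to a number designed to survive resizing or recompression. Here we
// simply stand in a 64-bit value per photo.
typealias PerceptualHash = UInt64

// Hashes of known CSAM supplied by a clearinghouse such as NCMEC (placeholders).
let knownCSAMHashes: Set<PerceptualHash> = [0xDEAD_BEEF, 0xFEED_FACE]

// Apple said an account is only flagged after a minimum number of matches.
let matchThreshold = 30

func shouldFlagAccount(photoHashes: [PerceptualHash]) -> Bool {
    // Count how many of the user's photo hashes appear in the known-CSAM set.
    let matchCount = photoHashes.filter { knownCSAMHashes.contains($0) }.count
    // Only accounts at or above the threshold are surfaced for human review.
    return matchCount >= matchThreshold
}

// Example: an account with only a couple of matches stays below the threshold.
let sampleHashes: [PerceptualHash] = [0xDEAD_BEEF, 0x1234_5678]
print(shouldFlagAccount(photoHashes: sampleHashes))  // false
```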

While many platforms already scan their databases for CSAM, Apple is choosing to do the scan on the device itself, and that choice is what has critics incredulous. Privacy advocates, cybersecurity experts and cryptographers sounded off on Twitter immediately following the news, and debates have stretched across threads for days. The concerns mainly center on the potential for mission creep, the possibility that U.S. or foreign governments might someday gain access to the data, and the precedent of decrypting formerly encrypted data.

The Center for Democracy and Technology (CDT) and the Electronic Frontier Foundation (EFF) filed protest letters demanding Apple scrap its plan. 

"Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world," said CDT's Greg Nojeim, co-director of CDT's Security & Surveillance Project. "Apple should abandon these changes and restore its users' faith in the security and integrity of their data on Apple devices and services."

In response to the backlash, Apple's senior vice president of software engineering, Craig Federighi, told The Wall Street Journal it was clear "a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing."

And in a post-announcement explainer on Frequently Asked Questions, Apple said, "CSAM detection in iCloud Photos is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States." 

But that hasn't silenced concerns about the potential for the matching system to be used in alternative and harmful ways, including by governments that could seek access. Former Facebook Chief Security Officer Alex Stamos has been very vocal about his disappointment with Apple's plans. In an interview with The Markup's Julia Angwin, Stamos said he wouldn't go so far as to call it a "backdoor," but said Apple's plan to do device-side scanning "has created the possibility of a new type of surveillance becoming popular globally."

Like Stamos, EFF said it's also concerned about the potential for mission creep. Now that Apple has decided to do device-side scanning, it opens up a new world of possibilities. Sure, right now, the algorithm is trained to look for CSAM content. But what if a government approached Apple and asked it to train the algorithm to look for something else? 

"The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers," EFF said. 

Apple, in its FAQ, said it "would refuse such demands and our system has been designed to prevent that from happening. Apple's CSAM detection capability is built solely to detect known CSAM images ... We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it." 

Apple's Federighi also said the system will be protected by "multiple levels of auditability."

But EFF said, in effect, let's call a spade a spade. 

"Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," EFF wrote.  

An end to end-to-end encryption? 

In addition to mission creep, Apple's critics are upset about what they see as a plan to decrypt formerly encrypted data.

What's always at stake when we're talking about fighting crime is privacy. Criminals work by keeping their communications private, and that's something we all want: the ability to communicate with others without someone else reading or listening over our shoulders. It's also an essential part of communication between doctors, journalists, political refugees and others. Sometimes the difference between life and death is the ability to talk in secret.

That's part of the reason why encryption is so important. If you're new to the concept: encryption is the scrambling of text to make it unreadable to anyone other than the intended recipients. Add a third party into the mix, and the conversation is no longer safe from that third party.
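
For readers who want to see the idea in code, here is a minimal sketch using Apple's CryptoKit framework to encrypt and decrypt a message with a shared key. It only illustrates the "scrambling" step; real end-to-end messaging, including iMessage, layers key agreement and per-device keys on top of primitives like this, so this is not Apple's protocol.

```swift
import CryptoKit
import Foundation

// A minimal sketch of symmetric encryption with CryptoKit, only to illustrate
// the idea of scrambling text so that no one without the key can read it.
// This is not Apple's iMessage protocol.

let key = SymmetricKey(size: .bits256)            // known only to the endpoints
let message = Data("meet at the usual place".utf8)

do {
    // Encrypt: a server relaying the sealed box sees only ciphertext.
    let sealed = try AES.GCM.seal(message, using: key)

    // Decrypt: only a party holding the key recovers the plaintext.
    let decrypted = try AES.GCM.open(sealed, using: key)
    print(String(decoding: decrypted, as: UTF8.self))  // "meet at the usual place"
} catch {
    print("crypto error: \(error)")
}
```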

The Electronic Frontier Foundation called Apple's CSAM plan a "backdoor to your private life."

Despite messages passing through a server, an end-to-end encrypted message will not allow the server to know the contents of a message. When that same server has a channel for revealing information about the contents of a significant portion of messages, that's not end-to-end encryption. -- EFF

The CDT's Nojeim said images that used to be protected by end-to-end encryption "will now be searched routinely using algorithms that have not been revealed to the public. And users who expect privacy in the photos they take and share with their iPhones can no longer have that expectation when those photos are backed up to iCloud. Instead, they should know that Apple will scan those photos." 

In addition to public backlash, Apple is also facing criticism from within the company itself. Reuters reported on Aug. 12 that employees posted more than 800 messages on an internal Slack channel raising concerns that "the feature could be exploited by repressive governments looking to find other material for censorship or arrests."

For his part, Stamos said he hopes Apple will hit pause on its plans and spend time consulting civil society groups like the EFF and the ACLU to find a solution that's less privacy-invasive.

"What I really would like is, if you're going to do any kind of [device] scanning, it has to be on behalf of the user," Stamos told The Markup's Angwin. "It should not be ever seen as being against the user's interests or for the benefit of law enforcement. That should be the guiding principle for any future work here."

It's unclear whether Apple feels the same way, but it doesn't look good.