Apple has responded to criticism that its new anti-child abuse measures will infringe on users’ privacy.
The backlash follows the company’s announcement last week that it plans to introduce a new means of scanning for child abuse in images uploaded to iCloud.
More than 5,000 people and organisations - including whistleblower Edward Snowden and the Center for Democracy and Technology - signed an open letter calling on Apple to reverse its decision, arguing that it opened a "backdoor" to spying on people, particularly by authoritarian governments.
The company has been forced to defend the new technology and has pledged not to expand it further.
"Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud," the company said in a new FAQ on its website.
"We will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report".
How does the controversial system work?
Law enforcement officials maintain a database of known child sexual abuse images and translate those images into "hashes" - numerical codes that positively identify an image but cannot be used to reconstruct it.
Apple will implement a similar database using a technology called "NeuralHash", which is designed also to catch edited versions of known images. That database will be stored on iPhones.
When a user uploads an image to Apple's iCloud storage service, the iPhone will create a hash of the image to be uploaded and compare it against the database.
The company says the system is designed so that there is less than a one-in-one-trillion chance per year of incorrectly flagging a given account.
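The matching process described above can be sketched in code. The snippet below is not Apple's NeuralHash, whose details are proprietary; it is a simplified "average hash" in plain Python, written to illustrate the general idea that similar images produce similar hashes, and that matching compares hashes against a database rather than raw images. All function names and the tiny 8x8 "images" are illustrative assumptions.

```python
# Toy illustration of perceptual hashing and database matching.
# NOT Apple's NeuralHash - a simplified "average hash" for illustration only.

def average_hash(pixels):
    """Hash an 8x8 grayscale image given as 64 values (0-255).

    Each bit records whether a pixel is brighter than the image's mean,
    so small edits (brightness shifts, mild noise) flip few bits."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count how many bits differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_database(image_hash, known_hashes, threshold=5):
    """Flag an image if its hash is close to any hash in the database."""
    return any(hamming_distance(image_hash, h) <= threshold
               for h in known_hashes)

# A "known" image and a slightly brightened copy hash alike...
original = [i % 256 for i in range(64)]
edited = [min(p + 10, 255) for p in original]
database = [average_hash(original)]
print(matches_database(average_hash(edited), database))    # True
# ...while an unrelated image does not match.
unrelated = [255 - (i % 256) for i in range(64)]
print(matches_database(average_hash(unrelated), database)) # False
```

Because matching tolerates a few differing bits, an edited copy still triggers a match while unrelated images do not - which is the property Apple says NeuralHash provides, at far higher accuracy than this toy.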
Why has there been criticism?
Critics of the plan say that it is an intrusion of users’ privacy and could set a dangerous precedent "where our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse and unreasonable expansion of the scope of surveillance".
In an open letter published on Friday, critics voiced concerns that the tech giant’s plans would "undermine fundamental privacy protections" on Apple products.
"While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products," they wrote.
They also demanded that "Apple Inc.'s deployment of its proposed content monitoring technology is halted immediately" and that the company "issue a statement reaffirming their commitment to end-to-end encryption and to user privacy".
Apple’s response to its critics
However, in Apple’s new FAQ document, the company claims that CSAM detection in iCloud Photos does not send information to Apple about "any photos other than those that match known CSAM images".
The company also reiterated that the system does not work for users who have iCloud Photos disabled, and that it does not scan the private photo library stored on the device itself.
Despite the company's assurances that it will not bow to pressure from governments to expand the technology, Apple has made concessions to governments in the past.
In Saudi Arabia, Apple sells iPhones with FaceTime disabled as the country does not allow encrypted phone calls. In January 2020, Reuters reported that the company had scrapped plans for encrypting data backups after pressure from the FBI.
Dr Nadim Kobeissi, a researcher in security and privacy issues, is critical of Apple’s response to the concerns.
"Asking people to disable iCloud Photos in 2021 is not realistic, and Apple knows this," he said.
"Everyone depends strongly on iCloud Photos not just for sync, but as a critical backup feature for what is often years and years of important photos".