Apple's plans to scan iPhones for child abuse images spark concern among its own staff - reports

By Reuters

According to sources within the company, staff have been taking to Slack to voice their concerns over customers' privacy.


A backlash over Apple's move to scan iPhones and computers in the US for child sex abuse images has grown to include employees speaking out internally.

It's a notable turn within a company famed for its secretive culture, and it comes as leading technology policy groups intensify their protests.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. 

Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate are surprising, the workers said. 

Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Pushback within 'secretive' Apple

Though coming mainly from employees outside of lead security and privacy roles, the pushback marks a shift for a company where a strict code of secrecy around new products colors other aspects of the corporate culture.

Apple rolled out Slack a few years ago, and teams have adopted it more widely during the pandemic, two employees said. As workers used the app to maintain social ties during the work-from-home era, sharing recipes and other light-hearted content, more serious discussions also took root.

In the Slack thread devoted to the photo-scanning feature, some employees have pushed back against criticism, while others said Slack wasn't the proper forum for such discussions.

Core security employees did not appear to be major complainants in the posts, and some of them said that they thought Apple's solution was a reasonable response to pressure to crack down on illegal material.

Other employees said they hoped that the scanning is a step toward fully encrypting iCloud for customers who want it, which would reverse Apple's direction on the issue a second time.

A letter of protest

Last week's announcement is drawing heavier criticism from past outside supporters who say Apple is rejecting a history of well-marketed privacy fights.

They say that while the US government can't legally scan wide swathes of household equipment for contraband or make others do so, Apple is doing it voluntarily, with potentially dire consequences.

People familiar with the matter said a coalition of policy groups is finalising a letter of protest to send to Apple within days, demanding a suspension of the plan.

The Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT) both released newly detailed objections to Apple's plan this week.

"What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in," CDT project director Emma Llanso said in an interview. 

"It seems so out of step from everything that they had previously been saying and doing".

Apple declined to comment for this story. It has said it will refuse requests from governments to use the system to check phones for anything other than illegal child sexual abuse material.


Apple policy could be 'forcibly changed'

Outsiders and employees pointed to Apple's stand against the FBI in 2016, when it successfully fought a court order to develop a new tool to crack into a terrorism suspect's iPhone. 

At the time, the company said that such a tool would inevitably be used to break into other devices for other reasons.

But Apple was surprised that its stance was not more popular at the time, and the global tide since then has been toward more monitoring of private communication.


With less publicity, Apple has made other technical decisions that help authorities, including dropping a plan to encrypt widely used iCloud backups and agreeing to store Chinese user data in that country.

A fundamental problem with Apple's new plan to scan for child abuse images, critics said, is that the company is making cautious policy decisions that it can be forced to change now that the capability exists, in exactly the way it warned would happen if it broke into the terrorism suspect's phone.


Apple says it will scan only in the United States, with other countries to be added one by one, only when images are set to be uploaded to iCloud, and only for images that have been identified by the National Center for Missing and Exploited Children and a small number of other groups.

But any country's legislature or courts could demand that any one of those elements be expanded, and some of those nations, such as China, represent enormous and hard-to-refuse markets, critics said.

Police and other agencies will cite recent laws requiring "technical assistance" in investigating crimes, including in the United Kingdom and Australia, to press Apple to expand this new capability, the EFF said.

"The infrastructure needed to roll out Apple’s proposed changes makes it harder to say that additional surveillance is not technically feasible," wrote EFF General Counsel Kurt Opsahl.
