
Here's how the EU's plan to tackle online child abuse could impact your privacy

By Alice Tidey
Mobile phone app logos for, from left, Facebook, Instagram and WhatsApp. - Copyright AP Photo/Richard Drew

Privacy activists are sounding the alarm over the European Commission's plans to clamp down on online child abuse, warning that it would usher in "mass surveillance" in the bloc.

The EU executive's Better Internet for Kids strategy, unveiled on Wednesday, calls for stronger safeguards to protect children from harmful content online and from being preyed upon.

Margrethe Vestager, Executive Vice-President for a Europe fit for the Digital Age, assured in a statement that the strategy is "in line with our core values and digital principles", while her colleague, Commissioner for the Internal Market Thierry Breton, stressed that the EU now "call[s] upon industry to play its part in creating a safe, age-appropriate digital environment for children in respect of EU rules."

Niels Van Paemel, policy advisor at Child Focus Belgium, told Euronews that the NGO is "very pleased that the Commission is taking the fight against CSAM, Child Sexual Abuse Material, to the next level."

"It's great that right now we see industry, that they are being reminded of their responsibilities. We are moving away from voluntary action, that's how it was in the past but that didn't work. Now social media platforms are forced to proactively look for reports and remove possible exploitation," he explained. 

Problematic content they detect will then be flagged to a soon-to-be-created EU expertise centre as well as national authorities, which Van Paemel said would make the fight against CSAM more transparent as well as enhance cooperation between member states' organisations and law enforcement.  

'Clearly undermines end-to-end encryption'

But privacy rights experts and activists are much more critical of the Commission's plan, which obliges service providers to detect, report and remove child sexual abuse material, a task previously carried out on a voluntary basis.

It also demands that they monitor encrypted content. End-to-end encryption ensures that only the sender and the recipient of a communication can access its content. Tech companies, including Meta - the parent company of Facebook - and Apple, have for years resisted authorities' demands that they create so-called backdoors to encrypted services.
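The trust model behind end-to-end encryption can be illustrated with a deliberately simplified sketch. Real systems such as the Signal protocol used by WhatsApp rely on authenticated public-key cryptography; the toy one-time pad below is not real-world crypto, but it shows the key point: the key exists only on the two endpoints, so the server relaying the message sees only ciphertext it cannot read.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy one-time pad: XOR each message byte with a key byte.
    # The key must be as long as the message and never reused.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"hello"
key = secrets.token_bytes(len(message))  # held only by sender and recipient

ciphertext = encrypt(key, message)  # this is all the relaying server ever sees
assert decrypt(key, ciphertext) == message  # only a key holder recovers the text
```

A mandated "backdoor" would mean some third party, not just the two endpoints, can also recover the plaintext, which is exactly the property critics say cannot be added without weakening the scheme for everyone.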

But the Commission argues that "if such services were to be exempt from requirements to protect children and to take action against the circulation of child sexual abuse images and videos via their services, the consequences would be severe for children."

For Zach Meyers, Senior Research Fellow at the Centre for European Reform (CER) think tank, the Commission's plan "clearly undermines end-to-end encryption."

"Once a “backdoor” to undermine encryption exists, that will create both new security vulnerabilities for hackers, and inevitable political pressure to expand the “backdoor” so that it covers more than just child sexual abuse material (CSAM) over time," Meyers added. 

This could lead to some companies shelving end-to-end encrypted services altogether in order to comply with the EU's legislation. 

It is also a head-scratcher for industry players, as the bloc is expected to soon give the final green light to two important pieces of legislation -- the Digital Markets Act and the Digital Services Act -- which will, in part, regulate tech companies' access to and use of personal data.

Throughout its negotiations with the EU Council on these two key pieces of legislation, the European Parliament has insisted that end-to-end encryption be protected.

Then there is the fact that detecting grooming is much harder than spotting harmful images and videos, which can largely be done with artificial intelligence tools.

According to Meyers, "detecting “grooming” can only be effectively undertaken by scanning texts between individuals. A high degree of human intervention is necessary because understanding the context, and whether the recipient of the messages is a child, is critical."

'EU would become a world leader in generalised surveillance'

Interinstitutional negotiations on these proposals are likely to focus heavily on these two issues. 

German MEP and civil rights activist Dr. Patrick Breyer (Pirate Party) has decried the legislation as a "mass surveillance plan" and a "spying attack on our private messages and photos by error-prone algorithms" which he described as "a giant step towards a Chinese-style surveillance state."

“Organised child porn rings don’t use email or messenger services, but darknet forums. With its plans to break secure encryption, the EU Commission is putting the overall security of our private communications and public networks, trade secrets and state secrets at risk to please short-term surveillance desires. Opening the door to foreign intelligence services and hackers is completely irresponsible," he added in a statement. 

He argued to Euronews that "when it comes to private communications, it must be limited to suspects and require a judicial order" and flagged that "the hash database [in which known child abuse material is stored] currently used for matching is so flawed that up to 86% of reports are not even criminally relevant."
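The "hash database" matching Breyer refers to can be sketched in a few lines, with an important caveat: real systems such as Microsoft's PhotoDNA use perceptual hashes designed to survive resizing and re-encoding, whereas the plain cryptographic hash used here for illustration only catches byte-identical copies. The database entry below is a made-up placeholder.

```python
import hashlib

# Placeholder database of known-material hashes (contents invented for this sketch).
known_hashes = {
    hashlib.sha256(b"known-file-bytes").hexdigest(),
}

def matches_database(file_bytes: bytes) -> bool:
    """Flag a file if its hash appears in the known-material database."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

assert matches_database(b"known-file-bytes")        # exact copy: flagged
assert not matches_database(b"known-file-bytesX")   # one byte differs: missed
```

The second check shows why exact hashing alone is inadequate and why perceptual hashing plus human review are used instead; both of those, in turn, can misfire, which is the error rate Breyer's 86% figure points to.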

A collective of 35 civil society organisations had already urged the Commission, back in March, when the proposal was originally meant to be unveiled before being twice-delayed, to "ensure that people’s private communications do not become collateral damage".

European Digital Rights (EDRi), one of the signatories of the statement, added that "this law would make the EU a world leader in the generalised surveillance of whole populations". It also cast doubt on whether the legislation would actually make much of a difference in tackling the dissemination of child abuse material.

"Real criminals can easily circumvent this legislation by just moving to self-hosted messengers, the dark web or other jurisdictions," Thomas Lohninger, Executive Director of epicenter.works and Vice-President of EDRi, told Euronews on Wednesday.

"The only ones whose messages will in the end be surveilled are normal European citizens, journalists, doctors, lawyers and whistleblowers. If this proposal goes through, the days in which the EU was leading on data protection are over," he added. 

Europe is a CSAM hub

The Commission has sought to brush aside these concerns. Commissioner for Home Affairs Ylva Johansson told Euronews that the bloc's executive has "listened to those concerns" around privacy.

"We have set up very clear safeguards," she said, so that "detection will only be allowed when there is a detection order, and there needs to be a prior consultation with the data protection authorities".

In its communication, the Commission also said that it is closely working with industry, civil society organisations, and academia to "support research that identifies technical solutions to scale up and feasibly and lawfully be implemented by companies to detect child sexual abuse in end-to-end encrypted electronic communications in full respect of fundamental rights."

Time is now of the essence for the EU institutions to find compromises as a temporary law allowing tech companies to voluntarily scan their users' content to report CSAM is due to expire in six months. Failure to strike a deal would mean online platforms would no longer have a legal basis to carry out this work and may choose to stop rather than risk being exposed to legal proceedings. 

According to the Internet Watch Foundation's annual report, published last month, 252,194 URLs (webpages) were confirmed last year as containing child sexual abuse imagery, having links to the imagery, or advertising it, a 64% increase from 2020.

The European region accounted for 72% of the reports assessed by the NGO.