MEPs call for crackdown on child sexual abuse without ‘mass surveillance’

Rapporteur Javier Zarzalejos (left) following the LIBE vote on laying down rules to prevent and combat child sexual abuse. Copyright Emilie GOMEZ/European Union 2023 - Source: EP
By Mared Gwyn Jones

The European Parliament on Tuesday adopted its draft position on plans to crack down on child sexual abuse online, calling for the new EU rules to avoid "mass surveillance" or "scanning" of the internet.


An overwhelming majority of 51 out of 54 members of the justice committee spanning all political groups backed the position following what lead negotiator Javier Zarzalejos described as a "sensitive, complex and controversial" process.

The committee's position needs to be endorsed by the plenary next week before negotiations with EU member states can open.

In May 2022, the European Commission proposed using emerging technologies to scan end-to-end encrypted messages on platforms such as Meta's WhatsApp in order to detect, report and remove child sexual abuse material (CSAM).

The proposal sparked a bitter row, pitting privacy lobbies against advocates of children’s rights, and saw EU Home Affairs Commissioner Ylva Johansson respond to allegations of undue influence.

Critics cited a major infringement of fundamental rights to privacy online, as well as concerns that the technologies were not mature enough to pinpoint CSAM without falsely flagging millions of pieces of legal content and wrongly incriminating users.

The parliament's draft compromise requires digital platforms to mitigate the risks of their services being used for online sexual abuse or for grooming children. It also allows judicial authorities to issue so-called "detection orders" to digital platforms, obliging them to use emerging technologies - such as perceptual hashing - to detect CSAM. But in contrast to the Commission's proposal, these orders would need to be targeted, time-limited and used as a "last resort" where there are "reasonable grounds of suspicion".
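
As a rough illustration of how perceptual hashing works in general (not of the specific technology the regulation would mandate): an image is reduced to a short fingerprint that changes little under resizing, re-compression or minor edits, so near-duplicates of already-known material can be matched without comparing images pixel by pixel. The Python sketch below uses a simple difference hash (dHash) with the Pillow library; the function names and parameters are illustrative assumptions, not anything drawn from the proposal itself.

```python
from PIL import Image  # Pillow

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a difference hash (dHash): a 64-bit fingerprint that changes
    little when an image is resized, re-compressed or slightly edited."""
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            # Encode whether brightness increases or decreases between neighbours.
            bits = (bits << 1) | (left > right)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Bits that differ between two fingerprints; a small distance suggests
    the images are visually near-identical."""
    return bin(a ^ b).count("1")

# Usage (hypothetical files): a distance of only a few bits indicates a near-duplicate.
# print(hamming_distance(dhash("original.jpg"), dhash("resized_copy.jpg")))
```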

Speaking to Euronews following the vote, Zarzalejos said the parliament had managed to walk the tightrope between safeguarding children online and protecting the fundamental right to digital privacy.

"This is about striking the right balance between the protection of children and at the same time to provide a legal framework in which privacy and data protection is ensured," Zarzalejos said. "It has some complexities from a legal point of view, but also in terms of the intrusiveness of technologies that should be deployed to detect and remove child sexual abuse material."

"This balance has been achieved. And I think that the broad support that this file has obtained is very telling about the spirit of compromise and the importance that all the political groups have attached to this proposal," he added

The parliament has also backed the proposal to set up an EU Centre for Child Protection to help implement the new rules in collaboration with the competent national authorities and Europol, the Hague-based EU law enforcement agency.

The centre would help develop detection technologies, conduct investigations and issue fines against platforms where necessary. 

MEPs also proposed to create a new consultative forum to ensure the voices of victims are heard.

"The EU Centre will be extremely important as a pivotal institution, and for the first time, the victims and survivors will be recognised in a consultative forum within the European Centre. So I think that all in all, this file will provide the necessary tools to be both legally sound and at the same time effective," Zarzalejos said.

'Targeted' detection orders

The highly contested detection orders included in the Commission's proposal would oblige digital messaging services to deploy client-side scanning (CSS) technology to tap into users' encrypted messages.
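
For illustration only, the self-contained Python sketch below shows the matching step that client-side scanning implies: a fingerprint is computed on the user's device and compared against a database of known fingerprints before a message is encrypted and sent. The hash values, threshold and names here are assumptions invented for the example; real deployments rely on proprietary perceptual hashes and curated hash databases.

```python
# Hypothetical database of known-material fingerprints and an assumed matching threshold.
KNOWN_HASHES: set[int] = {0x9F3A_5C21_D4B0_7E66}
MATCH_THRESHOLD = 5  # maximum number of differing bits still treated as a match (assumed)

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def should_flag(fingerprint: int) -> bool:
    """Flag content whose perceptual fingerprint is close to a known entry."""
    return any(hamming_distance(fingerprint, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)

# Example: a fingerprint differing by one bit from the known entry is flagged.
print(should_flag(0x9F3A_5C21_D4B0_7E67))   # True
print(should_flag(0x0000_0000_0000_0000))   # False
```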

In a damning opinion on the proposal published last May, the legal service of the Council of the EU raised "serious legal concerns" about detection orders and their potential "serious interference with fundamental rights," as enshrined in EU law.

The parliament's draft position calls for encrypted communications to be excluded from the scope of detection orders. CSS technology puts the "integrity and confidentiality" of encrypted communications at risk, the compromise text says.

MEPs have also significantly limited the scope of detection orders to court-issued orders in situations of reasonable suspicion.

European digital rights group EDRi, which has staunchly opposed the Commission's proposal, welcomed the parliament's compromise.


"Civil liberties MEPs have rightly recognised that no one will be safe online if the EU breaks encryption," Ella Jakubowska, senior policy advisor at EDRi, said.

"This is a critical step towards ensuring that internet regulation is based on evidence and in legal and technical reality, not on promises from AI companies," Jakubowska added.

But children's rights defenders have accused the parliament of short-sightedness in watering down the Commission's ambitions. According to ECPAT International - a global platform to end child sexual exploitation - limiting detection orders to targeted suspects will enable perpetrators to continue to abuse under the radar of law enforcement.

Amy Crocker, head of child protection and technology at ECPAT, said the parliament's draft position is "an alarming setback to children's safety online."

"It starkly contradicts the expectations of European citizens and, more critically, actively undermines the safety of our children in digital spaces. It is a decision that favours bureaucracy over the welfare of children," she said.


Grooming could go undetected

To tackle the online grooming of minors, MEPs on Tuesday backed calling on digital services targeting children to require user consent for unsolicited messages, to offer blocking and muting options, and to boost parental controls.

But the Commission had wanted to go further, using AI-based language models to detect behavioural patterns that could amount to child grooming in order to catch online predators.

ECPAT said: "Deciding against detecting grooming means that we give up on the possibility of preventing future harm from happening in the first place."

A new report published in October by the WeProtect Global Alliance suggests social gaming platforms are becoming dangerous new environments for adult-child interaction where conversations can escalate into high-risk grooming situations within just 19 seconds, with an average grooming time of 45 minutes.

But according to Zarzalejos, new risk mitigation measures for platforms will help protect children from such dangers.


"Let me make it clear that grooming is within the scope of the regulation. There will be mitigation measures particularly addressed to prevent grooming," he said.

"The only change that we have agreed upon is to take grooming out of the detection audit," he added.
