
EU plans to fight child sexual abuse online with new law obliging tech firms to report offences

Swedish EU commissioner Ylva Johansson plans to introduce new regulations on tackling child sexual abuse. Copyright STEPHANIE LECOCQ / POOL / AFP
By Tom Bateman with Reuters

The EU's home affairs commissioner plans to propose new legislation that would oblige social media platforms and internet service providers to report child sexual abuse.


Social media platforms could be forced to do more to tackle child sexual abuse online, under new European Union plans expected to be announced in the coming months.

The rules would replace current interim legislation that allows the voluntary reporting of child sexual abuse material (CSAM) with a legal obligation to recognise, report and remove it.

"I will propose legislation in the coming months that will require companies to detect, report, and remove child sexual abuse," EU home affairs commissioner Ylva Johansson told Germany's Welt am Sonntag newspaper on Sunday.

"A voluntary report will then no longer be sufficient," she said.

Meta, the parent company of Facebook, Instagram, and WhatsApp, would be particularly affected by any change in regulations, Johansson told the paper. The company currently accounts for around 95 per cent of child sexual abuse notifications.

The EU's current rules on reporting CSAM leave it up to social media platforms and messenger services to decide whether or not to follow up on suspected instances of users committing child sexual abuse offences.

The voluntary nature of the rules led to a six-month period last year during which companies stopped reporting CSAM for fear of breaching privacy regulations introduced at the end of 2020.

Reporting resumed after members of the European Parliament passed temporary legislation that allows platforms and internet service providers to use technologies that scan images and text for the hashes of known child sexual abuse images, as well as for newly produced sexual abuse material.
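To give a rough sense of the hash-matching approach described here, the sketch below shows the idea in simplified Python. It is not the technology platforms actually deploy: real systems rely on perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas this illustration uses an exact cryptographic hash, and the function names and the sample hash value are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known abuse images, as would be
# supplied by a clearinghouse. The value here is simply the SHA-256
# digest of the bytes b"test", used for illustration only.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_bytes(data: bytes) -> str:
    """Return the SHA-256 digest of an uploaded file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes) -> bool:
    """Flag content whose hash matches the known-image database.

    In a real deployment a match would trigger a report to the
    authorities; a miss would pass through for further checks
    (e.g. classifiers for newly produced material).
    """
    return hash_bytes(data) in KNOWN_HASHES
```

The key property of this design is that the platform never needs to store or inspect the original known images: comparing fixed-length digests is enough to recognise previously catalogued material.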

"Data processed to detect child sexual abuse online is limited to what is necessary and is stored no longer than what is strictly necessary. Processing must be subject to human oversight and if necessary also to human review. These safeguards answer important concerns of the Parliament," EU commissioner Johansson told MEPs in Strasbourg at the time.

A pandemic of child sexual abuse

The problem of child sexual abuse imagery is a growing one. In June 2020, the EU law enforcement agency Europol published a report stating that the COVID-19 pandemic had led to a "surge" in the distribution of CSAM online, with some EU member states seeing as much as a 25 per cent rise in reported cases.

In 2020, internet service providers and social media platforms in the EU filed 22 million reports of child sexual abuse, Johansson told Welt am Sonntag. This is thought to be just a fraction of the actual number of incidents.

In her comments to the paper, Johansson said the fight against child abuse should be better coordinated and called for the establishment of a specialist European centre to improve prevention, law enforcement, and victim support.

Most web pages hosting CSAM are based in Europe, according to a 2020 report by the Internet Watch Foundation, a non-profit organisation that specialises in tracking down and removing child sexual abuse imagery from the internet.

Four EU member states - France, Latvia, Luxembourg, and the Netherlands - appear among the top ten countries hosting child sexual abuse URLs, the report said. Of those, the Netherlands was by far the leader, playing host to over 117,000 CSAM web pages.

