Instagram to blur nudity as part of new features to protect minors from 'sextortion'

The Instagram logo is seen on a phone. Copyright Michael Dwyer/AP Photo
By Euronews

Social media companies have come under fire for not doing enough to protect minors on their platforms.


Instagram will begin testing new features that aim to "protect young people from sextortion and intimate image abuse".

Sexual extortion, or "sextortion", is a form of blackmail that involves coercing someone into sharing sexually explicit photos or videos and then using them to threaten or exploit the victim.

One of the new tools will detect and blur images that contain nudity and prompt people to think twice before sending them. The feature will be turned on for all Instagram users under 18.

When the "nudity protection" tool is on, users will receive a message urging them to "take care when sharing sensitive photos". 

They can also withdraw photos they have already sent that contain nudity.

When someone receives a nude photo, it will be blurred with a warning that the picture "may contain nudity".

The tool uses machine learning to analyse whether the image contains nudity.

"Because the images are analysed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won’t have access to these images – unless someone chooses to report them to us," Instagram, owned by Mark Zuckerberg's Meta, said in a blog post.
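The flow Meta describes can be sketched in outline: a classifier running locally on the phone scores each image, and anything above a threshold is blurred and flagged before it is displayed or sent, with no upload needed for the analysis. The sketch below is a hypothetical illustration, not Meta's implementation: `nudity_score` is a stand-in stub for a trained on-device model, and the blur is a simple 3x3 box filter over a grayscale pixel grid.

```python
# Hypothetical sketch of on-device image screening; the real system runs a
# trained neural network locally. `nudity_score` here is only a stub.

def nudity_score(pixels):
    """Stand-in for an on-device ML classifier; returns a score in [0, 1]."""
    # Placeholder heuristic (mean brightness); a real model would be learned.
    flat = [p for row in pixels for p in row]
    return sum(flat) / (255 * len(flat))

def box_blur(pixels):
    """Blur a grayscale image (list of rows of ints) with a 3x3 box filter."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [pixels[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out

def screen_image(pixels, threshold=0.5):
    """Return (image, warning) after purely local analysis.

    Because the check runs on the device itself, nothing is uploaded for
    scoring, which is why it can also work in end-to-end encrypted chats.
    """
    if nudity_score(pixels) >= threshold:
        return box_blur(pixels), "Take care when sharing sensitive photos"
    return pixels, None
```

The key design point the blog post makes is the locality of the check: the score is computed before the image enters the encrypted channel, so the platform never sees the plaintext unless the recipient reports it.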

The other features include new notifications sent to people who may have interacted with an account removed for sextortion, as well as hiding the message button on young people's profiles from accounts potentially engaging in sextortion.

Susie Hargreaves, CEO of the UK-based Internet Watch Foundation (IWF), said in a statement provided to Euronews Next that sexual extortion can have "horrendous repercussions" and that the non-profit organisation applauds "any efforts by tech companies to safeguard" children using social media.

"However, while the new tool is a welcome move by Meta, any potential benefits will be undermined by its decision to roll out end-to-end encryption on its messaging channels," Hargreaves added.

"By doing so, it is willfully turning a blind eye to child sexual abuse. More can, and should, be done to protect the millions of children who use Meta’s services".

Recent increases in sextortion

In a report released in January, the Network Contagion Research Institute said financial sextortion was growing at an alarming rate.

The report added that Instagram was the most common platform for targeting victims because blackmailers could get personal information about people quickly.

Snapchat, meanwhile, was the most frequently used platform to coerce people into sending compromising photos.

The European Commission sent requests to Meta and Snap, the parent company of Snapchat, in 2023 to provide more information on measures they have taken to protect minors, which is a requirement under the Digital Services Act.

Social media companies have been repeatedly criticised for failing to protect children online.

At a US Senate hearing earlier this year, Meta CEO Mark Zuckerberg apologised to parents who said social media had harmed their children.


He said no one "should go through what you and your families have suffered".

EU institutions, meanwhile, are considering new laws on child sexual abuse content that would make it mandatory for companies to prevent the dissemination of this type of material. The proposals have been subject to debate, however, over privacy concerns.

This week, EU lawmakers voted to extend until 2026 a current exemption to privacy rules that allows platforms to detect child sexual abuse material.

