The Almendralejo case: When AI deepfakes are used to undress teenagers | Euronews Tech Talks Podcast

Euronews Tech Talks is the podcast delving into the impact of new technologies on our lives.
By Marta Rodriguez Martinez and Marian Rosado

Almendralejo in southern Spain became one of the first global examples of just how alarming sexual harassment using AI deepfake technology can be.


It was a quiet Sunday when journalist Marian Rosado, scrolling through Instagram, stumbled upon an unusual live broadcast.

The well-known gynaecologist Miriam Al Adib was revealing a very personal and shocking situation: her 14-year-old daughter had just told her that somebody had used an app to take her photo from social media and make it look like she was naked.

This was just the beginning of a scandal that placed a quiet town in southern Spain in the spotlight.

Over the next three episodes, the team of Euronews Tech Talks will delve into the world of AI deepfake technology.

We aim to uncover its scope, address strategies for mitigating its risks, and explore methods for educating society to recognise the dangers and protect themselves. 

The impact of fake AI nudes

Almendralejo, a tranquil town with 35,000 residents nestled in the Spanish region of Extremadura near the Portuguese border, isn't the type of place that typically captures national, let alone international, attention.

However, in September, just after the school break, Almendralejo made headlines.

Dozens of local teenagers reported receiving AI-generated naked images of themselves on their mobile phones.

In the real photos, the teenagers were fully clothed. These images, stolen from their Instagram accounts, were altered using an artificial intelligence (AI) app and then circulated in WhatsApp groups.

Despite the artificial nature of the nudity, the distress felt by the girls upon seeing these images was very real, as reported by their mothers.

What adds to the unsettling nature of this story is that the perpetrators of this sexual harassment were also teenagers known to the girls.

'Undress any picture with AI for free'

Equally disturbing is how easily these images were created. 

Deepfakes, a form of synthetic media utilising AI, typically involve complex processes like deep learning.

However, these teenagers weren't AI experts. They simply paid €10 to obtain 25 hyper-realistic naked images of their peers using the Clothoff app.

Available for free download and promoted with the slogan "Undress any picture with AI for free", the app enables users to digitally undress anyone in their phone's picture gallery.

When approached for clarification on their rules, Clothoff emphasised age verification and obtaining consent.

While they claimed to take age verification seriously, they didn't disclose their methods due to "security reasons".

Regarding consent, they asserted having strict policies but didn't specify them, relying on users to follow the guidelines.


A study by Sensity AI revealed that 96 per cent of deepfake images consist of sexually explicit pictures of non-consenting women. 

A Europol report estimates that within three years, approximately 90 per cent of online content may be AI-generated.

Journalist • Marta Rodriguez Martinez

