Nude deepfakes: Is the EU doing enough to tackle the issue?

Screen with a sexually explicit deepfakes site open on a computer. Copyright Inês Pereira
By Ines Trindade Pereira
This article was originally published in Portuguese

The artificial intelligence tools used to digitally undress people in images without their consent were created to target women. These nude deepfakes don't work on men.


The manipulation of images and videos to create sexually explicit content is closer to being considered a criminal offence in all European Union countries.

The first directive on violence against women will go through its final approval stage in April 2024.

Through artificial intelligence programmes, these images are being manipulated to undress women without their consent.

But what will change with this new directive? And what happens if women living in the European Union are the victims of manipulation carried out in countries outside the European Union?

The victims

Websites that allow sexual deepfakes to be created are just a click away on any search engine and free of charge.

Creating a sexual deepfake takes less than 25 minutes and costs nothing, using only a photograph in which the face is clearly visible, according to the 2023 State of Deepfakes study.

The study, which analysed a sample of more than 95,000 deepfake videos, found a 550% increase in their number between 2019 and 2023.

According to Henry Ajder, an AI and deepfakes expert, those who use these stripping tools seek to "defame, humiliate, traumatise and, in some cases, [obtain] sexual gratification".

And it's important to state that these synthetic stripping tools do not work on men. They are explicitly designed to target women. So it's a good example of a technology which is explicitly malicious. There's nothing neutral about that.
Henry Ajder
AI and Deepfakes expert

The creators of nude deepfakes look for their victims' photos "anywhere and everywhere".

"It could be from your Instagram account, your Facebook account, your WhatsApp profile picture," says Amanda Manyame, Digital Law and Rights Advisor at Equality Now.

Prevention

When women come across nude deepfakes of themselves, questions about prevention arise.

However, the answer is not prevention, but swift action to remove them, according to a cybersecurity expert.

"I'm seeing that trend, but it's like a natural trend any time something digital happens, where people say don't put images of you online. But if you want to push the idea further, it's like saying don't go out on the street because you could have an accident," explains Rayna Stamboliyska.

"Unfortunately, cybersecurity can't help you much here because it's all a question of dismantling the dissemination network and removing that content altogether," the cybersecurity expert adds.

Currently, victims of nude deepfakes rely on a patchwork of existing rules, such as the European Union's privacy law, the General Data Protection Regulation (GDPR), and national defamation laws, to protect themselves.

When faced with this type of offence, victims are advised to take a screenshot or video recording of the content and use it as evidence to report it to the social media platform itself and the police.

The Digital Law and Rights Advisor at Equality Now adds: "There is also a platform called StopNCII, or Stop Non-Consensual Intimate Images, where you can report an image of yourself and the website then creates what is called a 'hash' of the content. AI is then used to automatically have the content taken down across multiple platforms."
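The hashing mechanism Manyame describes can be illustrated with a minimal sketch in Python. This uses a plain SHA-256 digest purely as a stand-in; services like StopNCII generate the hash on the user's own device, so the image itself is never uploaded, and their actual hashing schemes (which can also match altered copies) are not reproduced here:

```python
import hashlib

def content_hash(image_bytes: bytes) -> str:
    """Return a hexadecimal SHA-256 digest of the raw image bytes.

    Illustrative stand-in only: real matching services use their own
    (often perceptual) hashing schemes that can also recognise
    re-encoded or slightly altered copies of an image.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# The victim's device computes the hash; only the digest is shared.
reported_hash = content_hash(b"...raw bytes of the reported photo...")

# A participating platform can hash new uploads and compare digests
# without ever seeing the original reported image.
uploaded_hash = content_hash(b"...raw bytes of the reported photo...")
print(uploaded_hash == reported_hash)  # an exact copy yields the same hash
```

The key design point is that a hash is one-way: platforms can detect matching content from the digest alone, without the sensitive image ever being stored or transmitted.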

Global trend

With this proposed new directive to combat violence against women, all 27 member states will have the same set of laws criminalising various forms of cyber-violence, such as sexually explicit deepfakes.


However, reporting this type of offence can be a complicated process.

"The problem is that you might have a victim who is in Brussels. You've got the perpetrator who is in California, in the US, and you've got the server, which is holding the content in maybe, let's say, Ireland. So, it becomes a global problem because you are dealing with different countries," explains Amanda Manyame.

Faced with this situation, Evin Incir, the S&D MEP and co-author of the new directive, explains that "what needs to be done in parallel with the directive" is to increase cooperation with other countries, "because that's the only way we can also combat crime that does not see any boundaries."

Incir also admits: "Unfortunately, AI technology is developing very fast, which means that our legislation needs to keep up too. So we will need to revise the directive soon. It is an important step for the current situation, but we will need to keep up with the development of AI."

