'Living a lifelong sentence': How AI is trapping women in a deepfake porn hell

The dark side of AI: The women trapped in a deepfake porn nightmare
Copyright Canva
By Imane El Atillah

Using AI tools, photos and videos can easily be doctored to create "deepfake porn" of a person. Victims say the damage is irreparable.


Noelle Martin was just 18 when she discovered that pornographic pictures of her were being circulated online. She had no memory of taking, let alone sharing, any intimate images. Yet the face in those pictures was unmistakably hers - the body wasn't.

She had become a victim of what would later be known as "deepfakes". The pornographic pictures had been manipulated to look like her using images she had shared on her personal social media accounts.

"This is a lifelong sentence," Martin told Euronews Next. "It can destroy people's lives, livelihoods, employability, interpersonal relationships, romantic relationships. And there is very, very little that can be done once someone is targeted".

Deepfakes are digitally altered videos or images created to depict someone in fake scenarios. While deepfake technology can theoretically be used for more lighthearted, satirical or well-intentioned purposes, a 2019 Deeptrace Labs report found that 96 per cent of deepfake content online is non-consensual pornography.

Just this week, a 22-year-old man in New York was sentenced to six months in jail for posting deepfake porn photos of former school classmates using teenage pictures of them taken from their social media accounts.

Fighting for justice

Back in 2013, when the images first surfaced, Martin tried to limit the damage by going to the police and asking for them to be taken down. There was nothing they could do, however, because at the time there were no laws against the dissemination of intimate images in Australia, where she lives.

"Even if you can try and take things down, if you're a victim of this, you’ve still got issues to do with holding whoever is responsible to account. Because taking things down from public sites, from these websites or from wherever they're hosted is only one element to this," Martin said.


"If you don't hold a perpetrator accountable or whoever is responsible, accountable, they can still continue to do this," she added.

Martin went on to pursue a career in law and spent years pushing for legislation against the non-consensual sharing of intimate images in Australia. In 2018, her efforts helped usher in new laws criminalising the distribution of non-consensual intimate images or “revenge porn” in her country.

Ten years after her own traumatic experience, Martin, now a 28-year-old lawyer and legal researcher at the University of Western Australia, is still advocating against the sharing of non-consensual sexual content, especially material created using deepfake technology.

"Ultimately what we're trying to do as the survivor community is push for laws to be established, push for countries around the world to criminalise this issue, push for the infrastructure to be developed to ensure that victims can seek help," she said. 

The rise of porn deepfakes

Porn deepfakes have circulated online for years, but recent major advances in artificial intelligence (AI) and the growing accessibility of the technology have made it easier than ever to create disturbingly realistic - and deceptive - sexual material.

When deepfakes first emerged, they were used mainly to create pornographic videos featuring the faces of celebrities.

The term deepfakes itself surfaced in 2017 when a user with that alias on the social platform Reddit became known for making and sharing doctored pornographic videos of female celebrities including Scarlett Johansson and Taylor Swift.

More recently, porn deepfakes made the headlines in January when a popular Twitch streamer, Brandon Ewing - who goes by “Atrioc” online - was caught with a deepfake porn website open on his browser during a livestream on the platform. The site featured doctored images of fellow Twitch streamers and friends.

Ewing later issued a tearful apology and halted his streaming activity for several weeks, while Twitch updated its community guidelines to ban deepfake porn on the platform.

In his apology, Ewing said he got “morbidly curious” after seeing an ad for deepfake pornography on a popular porn website, so he clicked and paid to access the videos.

Female Twitch streamers saw their names dragged into the controversy; many said they had been unaware such content even existed online.


The Twitch scandal caused a spike in online searches for the phrases "deepfake porn", "porn deepfake" and "deepfakes", according to Google Trends data.

Sophie Compton, co-founder of the coalition My Image My Choice, which campaigns against intimate image abuse, said the controversy shows users of deepfake websites are part of the problem and should be held accountable too.

"It's not just the creators, it's all of the users. And these are ordinary boys. This is people in our schools, and probably some women as well. But it is predominantly men. And this is classmates, friends, brothers, boyfriends," Compton told Euronews Next.

“They might not realise that their participation is also part of the problem. And it's helping to validate something that is extremely, extremely damaging to women”.  

How deepfakes are used against their victims

While porn deepfakes of celebrities have existed for years, recent progress in AI means that today virtually anyone can become a victim of such content.


The technology can also be used to discredit or blackmail people in an attempt to silence or undermine them.

"There is a new deepfake porn website popping up every month it seems, and these are developing into commercialised businesses," Compton said, adding these were opening with apparent impunity.

"People are kind of branding as a deepfaker and they're accepting Bitcoin and they're accepting PayPal and they've got a tagline and a strapline and a logo".


Whether the motivation behind creating and sharing porn deepfakes is to embarrass, defame or harass victims, or simply to make money, the content often mirrors some men's fantasies of degrading and objectifying women through explicit imagery, whether real or fake.

"From the perspective of a person who is the target of this, even if you can tell that something is fake or even if it's not completely realistic, the damage is still irreparable," Martin told Euronews Next.


"If a deepfake is created of you, and even if you can kind of tell, people will still associate whatever is depicted of you with you. You still can be misappropriated. You can still have your reputation and name and image dragged through the mud".

Rana Ayyub, an Indian investigative journalist and contributor to the Washington Post, was the victim of a smear campaign in which porn deepfakes of her were used in an attempt to silence her. 

She detailed her harrowing experience in a column where she explained how being outspoken and seen as "anti-establishment" while being a Muslim woman made her a target of hate on social media.

However, the online abuse she was accustomed to suddenly felt very real when a fake porn video of her went viral and left her so shocked she couldn’t even face seeing her family.

"The entire country was watching a porn video that claimed to be me and I just couldn’t bring myself to do anything," she recounted.


"From the day the video was published, I have not been the same person. I used to be very opinionated, now I’m much more cautious about what I post online. I’ve self-censored quite a bit out of necessity," she wrote.

"It is a very, very dangerous tool and I don’t know where we’re headed with it".

Do any laws and regulations protect us from porn deepfakes?

Campaigners say it’s currently very easy to publicly humiliate a person by creating deepfakes of them, and that this lack of regulation and accountability is failing victims.

"The message that is being sent to the people that are making these forms is no one is coming for you. What you're doing is not illegal and there's no consequences," Compton said.

The European Union has proposed new regulations tackling harmful online content, but these have yet to prove they can fully protect people from non-consensual porn deepfakes.


For instance, the Artificial Intelligence Act, which is currently being discussed in the European Parliament, requires creators to clearly disclose when content has been artificially generated or manipulated. But campaigners say disclaimers would hardly deter creators of deepfakes or bring justice to victims once the content is out there and forever linked to them.

However, another piece of EU legislation in the works is the Directive on Gender-Based Violence, proposed in 2022. This bill would criminalise the non-consensual sharing of intimate images, and perpetrators could face jail time. Article 7b even refers specifically to deepfakes.

The role of tech companies and online platforms

In the meantime, victims are pinning their hopes on online platforms updating their policies to crack down on deepfake pornographic content.

OpenAI, the company behind the image-generating tool DALL-E, has already removed explicit content from its training data, and filters requests to block the creation of images of celebrities and politicians.

Another popular AI model, Midjourney, blocks certain keywords and encourages users to flag problematic images to moderators.


TikTok has also mandated that deepfakes or manipulated content must be labelled as fake or altered, and has banned deepfakes of private figures and young people.

When it updated its community guidelines, Twitch warned that the intentional promotion, creation or sharing of deepfake porn would lead to an instant ban. Even showing a glimpse of such content - even if it’s intended to express outrage - “will be removed and will result in an enforcement,” the company wrote in a blog post.

Meanwhile, Meta, OnlyFans and Pornhub have all started participating in "Take It Down", a new tool that allows teens to report explicit images and videos of themselves and have them removed from the internet.

"Law enforcement, governments, regulators need to come together in order to deal with this because if we just have domestic countries around the world dealing with it independently, they can only do so much. This is a global issue that needs a global solution," Martin said.

"I would also say that victims and survivors need to be at the table whenever decisions are made about this, because they are so often excluded from decisions that are made about us, for us, but without us," she added.
