
European Commission, Interpol and 100 others call to outlaw AI nudification tools

Major global organisations call for crackdown on AI nudification apps
By Indrabati Lahiri

A number of major global organisations, including Amnesty International and the European Commission, have pushed for AI nudification tools to be banned.

More than 100 major humanitarian and child protection organisations are calling for urgent action against AI nudification apps and tools. The coalition includes Amnesty International, the European Commission, Interpol, Safe Online, Save the Children and other child protection experts and human rights advocates.

The movement comes after the backlash over Grok nudification, in which users asked Elon Musk's AI chatbot Grok to remove clothing from digital photographs of women.

It started with the "put her in a bikini" trend, which altered pictures to show women in bikinis, but quickly devolved into increasingly sexualised imagery.

These non-consensual, fake pictures were then posted publicly on the social media platform X for millions of people to view. Grok is estimated to have generated around three million non-consensual nude pictures so far.

Pictures created by AI nudification tools have increasingly been connected to blackmail, coercion and child sexual abuse material, with most of the victims being women and children.

Serious threat to child safety and human dignity

The global coalition has highlighted that these nudification apps and tools pose a significant and unacceptable threat to child safety and human dignity.

“Nudifying tools have created an unprecedented threat to our children. AI, the technology that should expand human potential, is being weaponised against children,” Marija Manojlovic, head of Safe Online, said in a statement.

“We minimise harm by calling it ‘online’, as if it is somehow less serious than what happens in the physical world, but the trauma is real,” she added.

Despite often being marketed as “adult” applications, these tools have increasingly been used to generate illegal sexual pictures of children without any kind of consent, effective barriers or accountability.

“Tech companies have the ability to detect and block nudified content of children. The distribution of child sexual abuse material is illegal in every jurisdiction and tech platforms should be brought in line with other creation and distribution channels,” Manojlovic said.

“It’s frankly shocking that these platforms are monetised and aren’t required to report offenders, or work with industry partners to cut off payment flows - these are safeguarding tools that are used in the real world and need to be applied to online platforms.”

There has been growing pressure to outlaw AI nudifying technologies, with advocates arguing that they serve no legitimate purpose.

As such, the coalition is pushing for these technologies to be blocked, with developers and platforms being held accountable.
