Mastodon has child sexual abuse content problem, researchers say

The logo of Mastodon reflected on a smartphone screen. Copyright JOEL SAGET / AFP, Euronews edit
By Lauren Chadwick

In just two days, researchers at Stanford University found hundreds of posts on Mastodon with child sexual abuse content.


Mastodon, a social media platform that has become popular as a possible alternative to Twitter, is teeming with child sexual abuse content, according to a new report from Stanford University researchers.

In just two days, the researchers found 112 instances of child sexual abuse material out of roughly 325,000 analysed posts.

They also found "554 instances of content identified as sexually explicit with highest confidence by Google SafeSearch," according to the report from Stanford's Internet Observatory.

It took the researchers only about five minutes to find the first instance of child sexual abuse content.

Unlike the platforms run by larger companies such as Facebook, Twitter and YouTube, Mastodon is a decentralised social media site.

This means it is made up of multiple independently run servers, known as instances, on which users can create their own accounts. Each instance sets its own code of conduct and regulations.

These are enforced "locally and not top-down like corporate social media, making it the most flexible in responding to the needs of different groups of people," Mastodon states.

The Stanford researchers analysed the top 25 Mastodon instances as determined by their total user count. Media found on those instances was submitted to PhotoDNA and Google's SafeSearch for analysis.
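
The report does not publish the researchers' code, but as a rough illustration of what such a check involves, the Python sketch below submits a single image to Google's SafeSearch detection through the Cloud Vision API and reads back its likelihood ratings. The file name and the flagging threshold are assumptions added for illustration, not details taken from the study.

```python
# Illustrative sketch only: one way an image could be screened with Google SafeSearch
# via the Cloud Vision API. This is not the researchers' actual pipeline; the file
# path and the flagging threshold below are assumptions for demonstration.
from google.cloud import vision

LIKELIHOODS = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY")

def safesearch_ratings(image_path: str) -> dict:
    """Return SafeSearch likelihood ratings for one image file."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    annotation = client.safe_search_detection(image=image).safe_search_annotation
    return {
        "adult": LIKELIHOODS[annotation.adult],
        "racy": LIKELIHOODS[annotation.racy],
        "violence": LIKELIHOODS[annotation.violence],
    }

if __name__ == "__main__":
    ratings = safesearch_ratings("sample_post_image.jpg")  # hypothetical file name
    # Treat "LIKELY" or "VERY_LIKELY" adult ratings as high-confidence explicit content.
    if ratings["adult"] in ("LIKELY", "VERY_LIKELY"):
        print("Flagged as sexually explicit with high confidence:", ratings)
```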

They also looked at the larger Fediverse, which is a group of decentralised social media platforms including Mastodon, Bluesky, Pleroma, and Lemmy.

On the Fediverse, the researchers found 713 uses of the top 20 child sexual abuse-related hashtags on posts containing media, along with 1,217 posts without media that used those hashtags.

Decentralised social media presents safety moderation problems

"At a time when the intersection of moderation and free speech is a fraught topic, decentralised social networks have gained significant attention and many millions of new users," Stanford researchers David Thiel and Renée DiResta wrote in the report published on Monday and first reported in the Washington Post.

They say that this "decentralised" approach to social media presents challenges for safety as there is no central moderation team to remove child abuse images.

"While Mastodon allows user reports and has moderator tools to review them, it has no built-in mechanism to report CSAM (child sexual abuse material) to the relevant child safety organisations," Thiel and DiResta write.

"It also has no tooling to help moderators in the event of being exposed to traumatic content—for example, grayscaling and fine-grained blurring mechanisms."

Mastodon has risen in popularity since Elon Musk's takeover of Twitter.

Last week, founder and CEO Eugen Rochko said the platform's monthly active user count was up to 2.1 million, which is "not far off" from its last peak.

Thiel and DiResta argue that while decentralised social media could help to "foster a more democratic environment," it will need to solve safety problems in order to "prosper".
