Instagram bans graphic self-harm images after British teenager's suicide

By Alice Cuddy


Social media giant Instagram has announced that it is banning all graphic images of self-harm from its platform as part of efforts to “support and protect the most vulnerable” users.

The move comes after the father of British teenager Molly Russell said the site was partly to blame for her suicide in 2017. He explained that following the 14-year-old's death, the family found distressing material about suicide and depression on her account on the popular photo-sharing app.

“We will not allow any graphic images of self-harm, such as cutting on Instagram – even if it would previously have been allowed as admission,” Instagram head Adam Mosseri said in a blog post on Thursday.

Instagram’s previous policy allowed graphic images when people shared them as admissions of self-harm, in the hope that doing so would help them get the support they needed.

But going forward, the company said it needs to “do more to consider the effect of these images on other people who might see them.”

Mosseri added that non-graphic self-harm-related content, such as healed scars, would no longer be shown in hashtags or the Explore tab and would not be recommended by the site, but could still be posted.

“We are not removing this type of content from Instagram entirely, as we don’t want to stigmatise or isolate people who may be in distress and posting self-harm-related content as a cry for help,” he explained.

One of the things Instagram changed on Thursday was the ability for some of this content to be recommended, “particularly the non-graphic stuff,” social media consultant Matt Navarra said.

"So anything that's graphic to do with self-harm is an outright ban and anything that’s less offensive or non-graphic stuff will be removed from the ability to be discovered in discovery or within hashtag searches or search in general and the recommended engine, which is part of the algorithm," he said. 

Mosseri also said the company was looking into ways of directing people who post or search for self-harm-related content to organisations that could help.

In its own statement, Facebook, which owns Instagram, said it had spoken to more than a dozen experts from around the world about ways of improving its policy.

Based on the expert feedback, the company said it would also begin enforcing a policy of banning “graphic cutting images even in the context of admission”.

UK health secretary Matt Hancock welcomed the move on Thursday.

The minister last month warned companies of legal action if they failed to remove inappropriate content.

He said he was "glad" of Instagram's action but said he would "keep working to make the internet safe for all".

Your view | What do you think? Do you support Instagram's move? Or will it isolate vulnerable people? Let us know in the comments below.
