A 14-year-old girl took her own life in 2017 after viewing graphic content on social media. Now, a British coroner has ruled that tech giants played a role in her death.
When 14-year-old Molly Russell died in 2017, her father spent countless hours scouring his daughter's social media trying to understand why she took her life.
A few days after her death, he found an email in Molly's inbox from Pinterest, a social media and image-sharing platform.
The email suggested suicide-themed content to his daughter, including an image of a girl self-harming captioned, “I can’t tell you how many times I wish I was dead”.
Ian Russell says he was “shocked” to see Molly had received messages such as “10 depression pins you might like".
It became clear that Molly Russell had been engaging with graphic content promoting suicide and self-harm, often recommended to her by the algorithms of the social media platforms she was using, such as Pinterest and Instagram.
For the first time, a British coroner has ruled that social media contributed to the suicide of a teenager.
“She died from an act of self-harm while suffering from depression and the negative effects of online content,” stated London coroner Andrew Walker.
Last week, Andrew Walker wrote a letter to tech giants such as Meta (the parent company of Facebook and Instagram), Snapchat, Pinterest, and Twitter, issuing six recommendations, including separating platforms for adults and children.
A coroner can write a report following an inquest if it appears there is a risk of future deaths occurring for similar reasons.
All parties must respond by 8 December with details of the actions they propose to take or explain why no action will be proposed.
'A clarion call around the world for litigation pending'
The decision has sparked a wave of hope for many victims' families around the world.
Matthew P. Bergman is an attorney and founder of the Social Media Victims Law Center, which works in multiple countries to hold tech giants accountable for harm caused to children.
"Molly Russell's case is incredibly important because it's the first time a social media platform has been adjudicated to have caused the death of a child," he told Euronews.
"Russell’s death was not a coincidence nor an accident. It’s a direct result of design decisions that Meta made to maximise user engagement over safety, and its failure to provide any safeguards for vulnerable kids."
"This is going to be a clarion call around the world to litigation pending," said Bergman.
One of the main lawsuits brought by the Social Media Victims Law Center concerns Selena Rodriguez.
The American 11-year-old was so addicted to Instagram and Snapchat that she experienced severe sleep deprivation and depression, eventually leading to her suicide.
UK Online Safety Bill: Can platforms be held accountable?
The inquest into Molly Russell's death has renewed pressure on the UK government to introduce the long-awaited Online Safety Bill.
One of the provisions will obligate tech giants to remove content deemed harmful, with a particular emphasis on children's well-being.
Companies that breach the bill could face fines of up to 10 per cent of their global turnover, imposed by Ofcom, the country's communications watchdog.
But the bill has proved controversial, with critics arguing it could lead to significant censorship.
Tory MPs such as David Davis have argued the bill could make tech firms “inevitably err on the side of censorship".
Prime Minister Liz Truss has said she wants to “make sure free speech is allowed” when the bill comes back on the table.
Other critics believe that the wording of the bill is still too vague, especially when it comes to the definition of harmful content.
"One of my concerns is how this bill will be interpreted in the future. We need to make sure we are striking a balance between freedom of expression and personal autonomy," said Dr Laura Higson-Bliss, an expert on social media and criminal law at Keele University.
She told Euronews that the vagueness and the fact that there is no similar legislation in other countries could lead to "issues for the courts and even law enforcement on how this bill could be interpreted, if passed."
During Molly Russell's inquest in late September, representatives from Meta and Pinterest gave evidence.
Meta executive Elizabeth Lagone said that she believed it was "safe for people to be able to express themselves."
However, she agreed that two of the posts shown to the court would have violated Instagram's policies and offered an apology.
'Companies prefer engagement over safety'
"My concern is that we will take a step back on our progress around conversations regarding mental health," Dr Higson-Bliss explained.
"If Meta or Twitter have to take down anything related to suicide or self-harm, it’s as if they’re saying that what you’re feeling is not OK."
But Pinterest's Judson Hoffman admitted that the image-sharing platform was "not safe" when the teenager used it.
In a statement provided to Euronews, a Pinterest spokesperson said that, "to help ensure the safety of our Pinners, we’ve strengthened our policies and enforcement practices around self-harm content ... and will continue to work with experts in the mental health field."
"Molly’s story has reinforced our commitment to create a safe and positive space for our Pinners."
But as lawmakers continue to debate whether tech giants can be held accountable for the teenager's death, Molly Russell's father has pleaded for an end to delays in implementing the Online Safety Bill, so that children can be protected from harmful online content.
Bergman believes that the social media giants are simply not willing to implement these changes that could be life-saving for some children.
"The improvements exist right now. But companies prefer to maximise user engagement over safety. Tinder has age and identity verification. If that’s good enough for people hooking up, why isn't it good enough for our kids?"