Should social media come with a health warning? An expert weighs in

Euronews Next asked an expert if social media should come with a warning about the risks it poses to mental health.
Copyright Canva
By Oceane Duboust

Should social media come with a warning? A top US health official suggested this week that platforms should have warnings similar to those on tobacco packaging.

“The mental health crisis among young people is an emergency - and social media has emerged as an important contributor,” Dr Vivek Murthy, the US surgeon general and the country’s leading spokesperson on matters of public health, wrote in an editorial published in the New York Times.

He also called for curbing features that contribute to excessive use, such as push notifications, autoplay, and infinite scroll.

But could this actually help people with their mental health?

“Action starts with awareness,” Titania Jordan, chief parenting officer at the US-based company Bark Technologies, told Euronews Health in an email.

“We are hopeful that any type of warning will serve as a springboard for everyone to wake up and pay more attention to these insidious dangers,” she added.

She highlighted that Murthy had already called the public’s attention to the issue in a 2023 advisory, which said that nearly 40 per cent of children between 8 and 12 were using social media, even though 13 is the legal minimum age.

“But tackling these issues is a multi-faceted approach. It requires hard work and dedication from parents, caregivers - even kids themselves - coupled with lots of change from Big Tech, and of course, national measures and laws are helpful to hold everyone accountable,” Jordan warned.

Some options are already accessible to young people and caregivers, Jordan says, such as parental controls, screen time management, and more family conversations about hard topics like cyberbullying, predators, and violence.

Murthy, meanwhile, acknowledged that a warning will not be enough to tackle the issue and that policymakers and companies are responsible for minimising and addressing the risks.

Insufficient safeguards on platforms

Facing an increasing amount of criticism, social media platforms have progressively implemented safeguards to preserve their users’ mental health. 

This includes filtering certain hashtags that could promote negative body image or eating disorders.

Some platforms are also developing tools to help users track and limit their social media consumption. In 2021, Instagram introduced a "Take a Break" feature in some countries that prompts users to step away after a certain amount of scrolling. 

Though these tools empower users to be more mindful, their success relies heavily on individual responsibility.

“Platforms like Snapchat and TikTok and companies like Meta have one purpose: to keep their users (most of them minors) on their platforms for as long as possible,” Jordan said. 

“They do this by filling their feeds with content they think will be ‘beneficial’ to them but in truth frequently erodes their psyches”.
