Facebook: can its new features improve mental wellbeing?

By Euronews

Facebook has introduced several new features in a bid to combat the detrimental impact the platform can have on users' mental health. What are they and what will their effect be?



Snooze

The snooze button allows you to hide a person, page or group for 30 days, without any need to unfriend or unfollow them permanently. It's designed to give you more control over what you see in your news feed.

How does it work? You click a button and it's done. You can "unsnooze" at any time, and you will receive a warning just before the 30-day period comes to an end, allowing you to make any longer-term decisions about your online relationship with the person, page or group concerned before they pop up in your feed again.

Why? At a party, if someone is annoying or upsetting you, you can give them a wide berth. On Facebook, they will keep cropping up in your feed, causing irritation or even distress. But you won't always want to unfriend people like this completely: perhaps they're a key member of your social group. And, well, absence makes the heart grow fonder, and all that.

Will it work? This is a simple tool, and there is no reason why it shouldn't work.

Take a break

Take a break is a tool that allows you to customise the way you interact with a person you have just broken up with. You choose how much they hear about you, and you about them. It works for past business partners and tumultuous friendships, too, apparently.

How does it work? You enter the "take a break" tool and answer a series of questions to determine how your online relationship with the person in question is managed. They are not informed.

Why? It can be difficult to get over a relationship when you are still confronted online by evidence of everything that your ex is doing in their daily life. Facebook says that it wants to support you in whatever emotional state you may be in: it describes the new tool as "empathetic but neutral".

Will it work? Facebook says the toolkit is available immediately, though it is quite hard to find. Once you have found it, well, it can't do any harm.

Suicide prevention

Facebook has developed a number of tools to try to help prevent suicides from occurring. It has support options for anyone posting about suicide, including online tips, helpline information, putting users in touch with mental health resources, and reaching out to a friend. Most recently, it has developed artificial intelligence that it claims can detect suicidal posts before they are even reported. Live streaming suicide prevention support is also available.

How does it work? In most cases, these tools rely either on a Facebook friend reporting a concern, or on the suicidal person clicking through to the help links. The new artificial intelligence tool will automatically send those links through to the person at risk, and, in some cases, will alert local authorities.

Why? Vulnerable people sometimes use social media posts as a cry for help. After a spate of live-streamed suicides, Facebook has been called upon to take some responsibility for the content on its site.

Will it work? In the case of live-streaming, time is of the essence. Facebook's artificial intelligence will need to be quick both to detect problems and to contact the authorities if it is to have any hope of intervening.

News feed

The news feed will be organised differently. The site says that posts from favoured friends and "personally informative" posts will be prioritised, while clickbait and false news will be demoted.

How does it work? Basically, Facebook will organise your news feed less neutrally, in a way it believes is better for you.

Why? Recent studies have shown that passive consumption and meaningless online interactions are linked to higher rates of depression. The intervention is designed to counter this trend.

Will it work? Can any algorithm truly reflect what you want and need?


Why?

Facebook is responding to a slew of studies that suggest that it is bad for its users' mental health. In a blog post it admitted that passive consumption of social media could make people feel worse, but said that the solution lay in creating more meaningful social interactions, which could indeed be beneficial in themselves.

