Top mental health apps are 'data sucking machines' that could be trading your sensitive information

The majority of the apps the researchers looked at raised "strong concerns over user data management"
By Tom Bateman

Mental health apps are worse at protecting their users' personal data than almost any other category of app, new research by the Mozilla Foundation has found.

The meditation app Calm, the App Store's top-ranking mental health app, was criticised for its approach to users' privacy by the authors of the *Privacy Not Included report, along with apps catering to people recovering from eating disorders and sexual violence.

Many of the mental health apps assessed for the report routinely shared user data, allowed weak passwords, had unclear or vague privacy policies and targeted their users with personalised advertising based on their data, the report found.

Mental health apps can carry added privacy risk because of the particularly sensitive and personal data they handle, pairing data about a user's mental state with identifying markers like age, gender, race, location and even their contact information, the researchers said.

"These are all details that can be used to identify an individual, whether that be to an employer, an advertiser, a health insurer, or a bad person looking to use this information in harmful ways," *Privacy Not Included lead Jen Caltrider told Euronews Next.

Monetising mental health

The review of 32 apps catering to mental health needs and religious beliefs found that 28 of them raised "strong concerns over user data management," while 25 failed to meet security standards on passwords, security updates and managing vulnerabilities.


"With such great power, these apps must equally take up the responsibility of respecting and securing their users' data," Caltrider said.

"These apps target people at their most vulnerable. People who probably don't expect their mental health struggles can turn into advertising data".

The top-ranked meditation app in the Apple App Store was one of those picked out by the researchers - Courtesy of Calm

The six apps with the worst privacy protections were Better Help, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace, the researchers said, with flaws including vague privacy policies, sharing user data with third parties, and in the case of online therapy app Talkspace, gathering private chat logs from therapy sessions.

'Data sucking machines'

Overall, users should be cautious when signing up to mental health apps, the report warned, calling them a "data harvesting bonanza".

"Nearly all the apps reviewed gobble up users’ personal data — more than Mozilla researchers have even seen from apps and connected devices," it said.


Data is big business, the report warned, with insurance firms able to collect more information on their clients, and data brokers "enriching their databases" with users' personal details.

"Hundreds of millions of dollars are being invested in these apps despite their flaws. In some cases, they operate like data sucking machines with a mental health app veneer. In other words: A wolf in sheep’s clothing," Mozilla researcher Misha Rykov said.

Euronews Next contacted Calm and Talkspace to request a response to the report, but they did not immediately respond.

How can you protect your data?

There are ways that app users can reduce the chance of their sensitive data falling into the wrong hands, Caltrider told Euronews Next.

"Make sure any information you share on these apps is information you'd be okay being made public, as nothing is guaranteed to be 100 per cent private when shared through these apps," she said.

Users of online therapy services should ask their therapist to take notes by hand and not upload them to the app's system, she said, in order to limit the chance of private conversations being shared - either intentionally or by accident.

Caltrider also recommended that users opt out of data sharing agreements if possible, and that they do not link their social media accounts to mental health or prayer apps.

Users can also make sure they set strong passwords, even when an app doesn't require it, and should regularly request that companies delete any of their personal data they have stored, she added.