A new report published by Mozilla’s *Privacy Not Included buyer’s guide revealed some shocking information about dozens of therapy apps on the market: 28 of the 32 apps reviewed are allegedly capitalizing on client data.
1 in 5 U.S. adults reported struggling with mental illness in 2020, according to the National Institute of Mental Health: that’s 52.9 million people. Mental Health America found a similar result in its 2022 State of Mental Health in America report, and further revealed that 27 million U.S. adults experiencing a mental illness do not receive treatment for it. That figure is staggering, and there are a lot of reasons people might not seek out therapy.
“Some individuals may avoid accessing care out of fear that others they know may find out they are seeing a therapist,” Alissa Davis—an Assistant Professor at Columbia University’s School of Social Work—explained to Gizmodo via email. “Other individuals may have internalized stigma or may be afraid to admit to themselves that they are struggling with a mental illness and need additional support.” Davis further cited poor interactions with previous therapists, a shortage of mental health professionals, and high expenses associated with mental health care as additional obstacles for those seeking therapy.
When mental health apps like BetterHelp and Headspace hit the digital market, they were a beacon of hope in an otherwise turbulent landscape: mental health care when you need it, for a relatively cheap price, in the privacy of your own home. But the mental health app bubble may be beginning to burst. Axios reports that the digital therapy boom of the last two years, fueled by the existentialism of the Covid-19 pandemic onset, is starting to concern experts in the field, as apps may fail to appropriately target the needs of their users.
If that wasn’t enough, a new report by Mozilla contains some damning information on the security of these mental health apps. Mozilla’s *Privacy Not Included is a buyer’s guide aiming to provide consumers and companies with concrete information on certain products, services, and apps. In their report on 32 mental health apps, they flagged 28 for not meeting the *Privacy Not Included standards of user safety. These standards are judged on:
- What the company does with user data,
- How a user can control their own data,
- The company’s previous track record with user data protection, and
- How the company meets Mozilla’s Minimum Security Standards.
The list of apps that Mozilla analyzed is ranked, with those at the top deemed “Not creepy!” and those at the bottom labeled “Super creepy!” Four apps—Wysa, PTSD Coach, Headspace, and Glorify—were not issued a warning label. Notable flagged apps included RAINN, a chat app for survivors of sexual assault and their loved ones that may not protect user identities; Calm, a guided meditation app that allegedly uses your data for targeted advertisements on other platforms; and King James Bible, a daily Bible verse app with a shadowy parent company. Some apps that Mozilla deemed dubious, like Breathe, Think, Do with Sesame from the Sesame Workshop, are even targeted toward children.
“Talk therapy apps like Talkspace and BetterHelp concern us because they collect a large amount of personal information to be able to link users to a therapist,” Jen Caltrider, lead for *Privacy Not Included, said to Gizmodo via email. Caltrider further explained that BetterHelp directs users to an intake questionnaire before they are able to see any sort of privacy notice. “Where is [the questionnaire] data going? Is it being shared only with Better Help? With Google? With others?”
BetterHelp, one of the most commonly used therapy apps, also ranks near the bottom of the list, and its track record is a little rocky. Notably, in 2018, the app launched an aggressive marketing campaign in which YouTube personalities like Shane Dawson, Philip DeFranco, and Elle Mills gave the app a shoutout as a paid advertisement. Some users who downloaded the app at the behest of these YouTubers were unhappy with the service and discovered that BetterHelp’s terms of service stated:
We do not control the quality of the Counselor Services and we do not determine whether any Counselor is qualified to provide any specific service as well as whether a Counselor is categorized correctly or matched correctly to you…You agree, confirm and acknowledge that you are aware of the fact that the Counselor Services are not a complete substitute for a face-to-face examination and/or session by a licensed qualified professional.
BetterHelp’s Terms and Conditions have since been updated and do not include the quoted passages.
“While some mental health apps have established secure data protections that comply with HIPAA and other regulatory laws, many mental health apps do not have these systems in place,” says Davis. “Since mental health apps aren’t regulated, it can be difficult for individuals to know what apps have protections to ensure that their personal data remain secure.”
Mental health apps may have been an attempt to bridge the gap between those suffering from mental illness and appropriate health care, but along the way, some have sacrificed the protection of their users and their data. As access to mental health care continues to be a growing issue in the U.S., there may be a world where apps are a solution, but until they are regulated appropriately, they should be treated as nothing more than a crutch.