Mental Health Apps and User Privacy

  • HIPAA, the federal health data law, doesn’t apply to all the information collected by the apps.

  • CR's testers observed apps sharing unique IDs, specific to a particular smartphone, with several companies, including Facebook.

  • Privacy policies don’t always make it clear what kind of data could be shared, and how it could be used.

Type “mental health” or a condition such as anxiety or depression into an app store search bar, and you can end up scrolling through endless screens of options. As a recent Consumer Reports investigation has found, these apps take widely varied approaches to helping people handle psychological challenges—and they are just as varied in how they handle the privacy of their users.

These apps are particularly important tools these days. Four in 10 Americans reported experiencing depression or anxiety because of the pandemic, according to a nationally representative survey of 2,982 U.S. adults conducted by Consumer Reports in December (PDF).

Mental health apps take a number of approaches to providing help. Some connect you with licensed therapists over video. Conversations with therapists are typically covered by the same state and federal health privacy rules that apply to in-person therapy or to any doctor’s appointment.

But the same apps or similar-sounding ones may provide guided meditations, mood-tracking diaries, therapy chatbots, and cognitive behavioral therapy exercises. Along the way, you might be asked to complete a questionnaire on your mental health symptoms.

The data you provide as you use those features might not necessarily be treated as confidential by the app developers, or by the law.

Researchers in Consumer Reports’ Digital Lab evaluated seven of the most popular options, representing a range of approaches, to gain more insight into what happens to your personal information when you start using a mental health app.

The apps we chose were 7 Cups, BetterHelp, MindDoc (formerly known as Moodpath), Sanity & Self, Talkspace, Wysa, and Youper. We left out popular alternatives such as Headspace, which is pitched as a meditation app, although the lines between many apps people turn to for support can be blurry.

Using specially programmed Android phones, we watched which outside companies received data from the apps as we used them, and checked whether privacy settings were on or off by default. We also analyzed how well the apps’ privacy policies matched what we observed. We worked on that technical analysis with AppCensus, a privacy research company that has collaborated with Consumer Reports on other investigations, and we’ve posted a detailed test report (PDF).
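
To make the technique concrete, here is a minimal sketch of the kind of traffic monitoring this involves, written as an addon for the open-source mitmproxy tool. It is not CR’s or AppCensus’s actual tooling, and the watchlist domains are examples we chose for illustration.

```python
# traffic_log.py -- minimal sketch of app traffic monitoring, for illustration
# only (not CR's or AppCensus's actual tooling). A test phone routes its
# traffic through the proxy, which logs every outside host each app contacts.
from mitmproxy import http

# Example watchlist of third-party endpoints to flag (illustrative, not exhaustive).
TRACKER_DOMAINS = {"graph.facebook.com", "app-measurement.com"}

def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    note = "  <-- known third-party tracker" if host in TRACKER_DOMAINS else ""
    print(f"{flow.request.method} {host}{note}")

# Run with: mitmdump -s traffic_log.py
# (the phone must use the proxy and trust its certificate to inspect HTTPS)
```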

In general, these mental health services acted like many other apps you might download. For instance, we spotted apps sharing unique IDs associated with individual smartphones that tech companies often use to track what people do across lots of apps. The information can be combined with other data for targeted advertising. Many apps do that, but should mental health apps act the same way? At a minimum, Consumer Reports’ privacy experts think, users should be given a clearer explanation of what’s going on.
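
Why does a unique ID matter so much? Because any two apps that report the same ID can have their data joined into a single profile. The sketch below is hypothetical code, not any real company’s; the ID shown is Google’s documented example of an Android advertising ID.

```python
# Illustrative sketch of cross-app tracking (hypothetical, not any real
# company's code): events reported by different apps get merged into one
# profile simply by keying on the shared device ID.
from collections import defaultdict

profiles = defaultdict(list)  # advertising ID -> events seen across all apps

def record_event(advertising_id: str, app: str, event: str) -> None:
    profiles[advertising_id].append((app, event))

# The same device ID reported by two unrelated apps:
AAID = "38400000-8cf0-11bd-b23e-10b96e40000d"  # Google's documented example ID
record_event(AAID, "mental-health-app", "opened_mood_diary")
record_event(AAID, "shopping-app", "viewed_product")

print(profiles[AAID])
# [('mental-health-app', 'opened_mood_diary'), ('shopping-app', 'viewed_product')]
```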

“Your mental health is incredibly personal,” says Justin Brookman, director of privacy and technology policy at Consumer Reports. “You should be able to reach out for help without worrying about how that data might be shared or misused.”

Health Laws From Before Smartphones

When you go to a medical office, someone typically hands you a form describing the Health Insurance Portability and Accountability Act of 1996. HIPAA is the key federal law that protects data collected by all kinds of healthcare professionals, from dermatologists to emergency room nurses to licensed psychotherapists.

Using a mental health app might seem like the same sort of situation, but some apps operate in a regulatory gray area where HIPAA often doesn’t apply.

“A consumer should not automatically assume that HIPAA protections apply to health information entered into a health app,” says Roger Severino, a former director of the Department of Health and Human Services’ Office for Civil Rights, the federal agency charged with enforcing health privacy rules.

The law doesn’t protect data just because it’s related to your health. It applies only to information collected and held by “covered entities,” such as insurance companies and healthcare providers, and by the “business associates” that provide them with services such as billing. Those business associates sign agreements that restrict them from doing anything with the data other than helping the providers run their businesses, unless they get explicit permission from a patient.

Tell a psychologist, “I’m depressed,” and HIPAA restricts how that information can be used. But type those same words into an app that has no connection to a covered entity, and HIPAA doesn’t protect you.

“Technical practices have moved past what laws like HIPAA were designed to address, and until regulations evolve, these companies owe it to consumers to do better,” says Bill Fitzgerald, a privacy researcher in CR’s Digital Lab who led the mental health app research. “Companies need to be more transparent about their practices and make clear commitments to consumers about their rights in legally binding documents, so companies can be held accountable and consumers can make an informed choice.”

Consumer Reports found that the privacy policies of a number of these companies left open questions about how user data is actually handled. We contacted the companies behind the apps we tested to get more clarity on how HIPAA protections apply to them.

MindDoc, which is based in Germany, says it’s covered by HIPAA and complies with the law’s privacy and security regulations, along with even stricter European Union rules. Talkspace and BetterHelp focus almost exclusively on setting up in-app sessions with a therapist, and those companies say they conform to HIPAA rules just as though you had sought help at a medical office.

At least that’s true once you create an account. According to Talkspace’s general counsel, John Reilly, “Once a therapist/client relationship is established, no personally identifiable information is disclosed to third-party service providers about that user, unless the third party has signed a business associate agreement.”

Glen Moriarty, the CEO of 7 Cups, says his company applies HIPAA guidelines when it connects people to therapists, and when it harnesses their data for research or analytics. But the online communities in the app “are not intended to be private as sharing and collaboration are the cornerstones of our community.”

Wysa and Youper both told us their operations don’t fall under HIPAA guidelines. Wysa has a feature that can connect users to a therapist, but co-founder Ramakant Vempati says the company’s systems don’t collect any data covered by HIPAA. Wysa says it takes steps to avoid capturing identifiable information from its users and has implemented “physical, technical, and administrative safeguards and controls” to protect the security and confidentiality of user data.

Where HIPAA Protections End

Even apps that say they are covered by HIPAA may use data in ways consumers don’t expect. One piece of information that isn’t always kept confidential is the fact that you’re using a mental health app in the first place.

Consumer Reports saw the BetterHelp, Sanity & Self, Talkspace, and Wysa apps all sending data to Facebook. Separately, Youper told us it shares data with the social media titan, though we didn’t see that occur during our testing.

Facebook’s policies say that sensitive data like your medical symptoms isn’t used for targeted ads. However, the company doesn’t treat the fact that you’re using a mental health app the same way. And according to a Facebook spokesperson, the company hasn’t signed a business associate agreement that would restrict its use of identifying data with any of the app developers in our study.

The companies that responded to our questions say they share only limited information with Facebook. BetterHelp president Alon Matas told CR that Facebook “can use data from other apps on an aggregated level, not on an individual basis. It says so explicitly both in their engagement with advertisers like us, as well as in their public information.” A Talkspace spokesperson says the company collects data using Facebook’s tools to optimize ads for Talkspace that the company might run in the future, though the data isn’t being used that way right now. And, the company says, the data includes only details about users’ interactions before they start therapy.

After our initial tests, Wysa began obscuring the IDs the company shares with Facebook.
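
Wysa didn’t detail its method, but one common way to obscure such an ID is to replace it with a salted one-way hash before it leaves the phone. The sketch below assumes that approach; the salt and ID are hypothetical.

```python
# Sketch of one common ID-obscuring technique: a salted one-way hash.
# We don't know Wysa's actual method; this is an assumption for illustration.
import hashlib

SALT = b"app-specific-secret"  # hypothetical secret kept by the app maker

def obscure_id(advertising_id: str) -> str:
    # The recipient gets a stable pseudonym, but without the salt it can't
    # recompute the hash to match the ID against IDs it sees elsewhere.
    return hashlib.sha256(SALT + advertising_id.encode()).hexdigest()

print(obscure_id("38400000-8cf0-11bd-b23e-10b96e40000d"))
```

(One caveat: hashing an ID without a secret salt is widely considered weak obscuring, because the set of valid IDs can simply be hashed and matched.)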

What this all means is that companies like Facebook might learn that you use a mental health app, and that piece of information could be combined with many other data points—from your gender to your hobbies to your location—to determine which ads Facebook shows you on its platform or on other websites. (Consumer Reports uses Facebook tracking technologies to market its own products and services.)

You may never know whether downloading one of these apps caused a particular ad to pop up on a website you visit; the advertising industry has become far too automated and complex to nail down a detail like that. But it’s easy to imagine, because that’s generally how Facebook’s ad network operates.

Who Else Gets Your Data?

It’s not just about Facebook. Most of the mental health apps we analyzed share data with a number of outside companies, and they don’t always tell you which ones, or why that data sharing is taking place.

To be fair, many other apps on your phone don’t disclose their business partners, either. And if data is shared, that doesn’t necessarily mean it’s going to marketing companies or data brokers. Third parties perform many practical services for apps, such as compiling analytics on which features get used the most and facilitating chat functions.

Nevertheless, the more places your data goes, the greater the chance that it could be used in ways that might bother you. And mental health apps can collect very sensitive information.

Several mental health apps say in their privacy policies that your data may be shared with researchers. You might assume that means your information is combined with data from other users, to help doctors and university researchers learn more about how to treat mental health. Sharing patient data for public health and other academic research is a standard practice in healthcare.

However, some of the privacy policies we looked at blur the lines between medical research and marketing or app design projects. And it’s not always easy to choose not to participate, no matter what kind of research is involved. For instance, Wysa co-founder Ramakant Vempati told CR that users could opt out of research by emailing the company at hello@wysa.ai or wysa@touchkin.com, but this wasn’t easy to figure out from either the privacy policy or the app itself.

In 2020 the New York Times described allegations from former Talkspace employees who said that chat logs from therapy sessions had been used to gather marketing insights and improve machine learning. A Talkspace spokesperson told CR that the company does use “de-identified” text chat logs for research and product development, which they call “an allowed use under HIPAA ‘safe harbor’ standards.” But, they said, the information isn’t used for individually targeted marketing or shared outside the company, and Talkspace never transcribes audio or video from therapy sessions. Talkspace says it’s in the process of updating its statements to consumers on privacy issues and reexamining its privacy practices.
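
For context, HIPAA’s “safe harbor” method calls for stripping 18 categories of identifiers, such as names, phone numbers, and email addresses, before data counts as de-identified. The toy sketch below shows the flavor of that redaction; real pipelines are far more thorough, and this is not Talkspace’s actual code.

```python
# Toy sketch of identifier redaction (illustration only; real HIPAA
# safe-harbor de-identification covers 18 identifier categories and is
# far more thorough than two regexes).
import re

PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def redact(text: str) -> str:
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text

print(redact("Reach me at 555-867-5309 or jane@example.com"))
# Reach me at [PHONE] or [EMAIL]
```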

In response to our investigation, MindDoc told CR it will be updating its privacy policies to clarify how data is shared with third parties and to provide more detail about how data is used for research. Wysa has already made such changes, and both companies also say they will review their apps to make privacy choices and settings clearer. Youper said it will be updating its privacy policy in response to CR’s findings but didn’t provide details.

Sanity & Self did not respond to multiple requests for information. However, all the other companies assured us that they are never paid for sharing user information, and several told us that personal data was tightly controlled in accordance with health privacy laws. “The way data is shared and the limited and well-defined purposes it is shared for are explicitly and clearly detailed in the privacy policy,” says BetterHelp’s Matas.

7 Cups was an interesting exception: its privacy policy says the company shares data with third parties, but we didn’t observe that happening during our tests. This highlights that a lot of data collection and trading takes place between company computer systems, not on your phone. Consumer Reports’ testing reveals what information leaves directly from your smartphone, and where it goes. However, no test can capture what companies do with your data, or who they share it with, after they receive it. That’s why both CR and consumers need to rely on privacy policies and other company documents.

And, of course, if there’s a community feature, other people who use the app and engage in those forums might be able to learn a lot about you. However, there may be some steps you can take to enhance your privacy. For example, Sanity & Self includes community boards where the user’s profile is public by default. But you can go into the settings to switch on “Private Account,” meaning that “your profile details including location, session history, and progress will not be shown to others,” according to the app. “However, when leaving comments your name, profile photo, and comment are still public and visible to others.”

Also, you might not be able to delete all the data these apps collect. In some cases the companies will delete your data or account, but the steps can be complicated, and it’s sometimes unclear exactly what data will be deleted. Other apps, including BetterHelp and Talkspace, say the law requires them to hold on to some of the health data they collect.

When Mental Health Apps Can Help

Whether these apps are useful in the first place is a complicated question, according to mental health researchers.

“There is lots of evidence collected over the past two decades that shows that digitally delivered mental health interventions are effective and can treat a host of different aspects of mental health issues,” says Stephen Schueller, PhD, an associate professor of psychological science at the University of California, Irvine, and executive director of One Mind PsyberGuide, a service that reviews digital mental health services. The problem, he says, is that the vast majority of apps haven’t been tested in a scientifically rigorous way.

That makes it difficult to choose the right app. “Informed decision making is one of the core tenets of medical ethics,” says John Torous, MD, the director of the division of digital psychiatry at Beth Israel Deaconess Medical Center in Boston. “Some people may be comfortable giving away their data in return for a service for free or lower cost,” Torous says, but in general, mental health apps don’t give consumers enough information to make a clear judgment.

The experts we spoke with say that talking with a professional is the best way to get help with mental health issues. Studies have shown that teletherapy can be as effective as in-person care. What’s more, virtual therapy can offer patients more scheduling flexibility, greater convenience, and a bigger pool of potential therapists. If you use a mental health app, make sure it’s clear about who will be administering your care. It’s worth seeking out licensed mental health professionals, and there are plenty of services that will connect you with them.

You don’t necessarily need to use a mental health app to find a therapist. CR has some easy-to-follow suggestions to help you find affordable teletherapy.

The experts we spoke to say mental health apps that don’t connect you with a therapist tend to work best when they’re used with the support of a mental healthcare professional. And there’s some evidence that apps offering cognitive behavioral therapy exercises may be particularly useful, especially when the exercises are undertaken with guidance from a therapist.