Mental health apps may not protect your data

A CBS News poll has found that nearly half of Americans surveyed say the coronavirus pandemic has affected their mental and emotional health. Some people have turned to convenient and affordable mental health apps. Thomas Germain, a technology reporter with Consumer Reports, joins CBSN to discuss a recent investigation that found these apps take varied approaches in how they handle users' privacy.

Video Transcript

[MUSIC PLAYING]

VLADIMIR DUTHIERS: In Privacy Watch, more than a year into the coronavirus crisis, Americans are still feeling the financial and emotional tolls of the pandemic. A CBS News poll found that nearly half of those surveyed say the outbreak has negatively impacted their mental and emotional health. This has some people turning to mental health apps for support. But a recent Consumer Reports investigation found these apps take varied approaches in how they handle the privacy of their users.

Thomas Germain joins us now. He is a technology writer with a focus on privacy, public policy, and design for Consumer Reports. Thomas, thanks for joining us. So how have smartphones made access to mental health assistance easier and more affordable?

THOMAS GERMAIN: In one sense, we're lucky that the pandemic struck when it did because smartphones and other technologies are giving us access to mental health care and other kinds of medical services that never would have been available before. You're able to receive care remotely. And often, it can be more affordable than some more traditional options. But on the other hand, using digital, internet-based services for medical care exposes you to all kinds of privacy risks that you never would have had to face in more traditional settings.

ANNE-MARIE GREEN: So in your article-- and it's entitled, "Mental Health Apps Aren't All as Private as You May Think"-- you talk about how these apps operate in a regulatory gray area, where HIPAA often doesn't apply. HIPAA, of course, was put in place to protect our private health information and keep it just between us and our doctor. Can you tell us about this gray area?

THOMAS GERMAIN: Yeah. So HIPAA was passed in 1996. And if you can think about what the internet and digital technology was like back then, it's completely different than the world that we're facing now. And the law really wasn't written to address the kinds of privacy infringements and data sharing that we're subjected to now.

What a lot of people don't realize is HIPAA doesn't actually protect data just because it's health information. It covers certain kinds of entities, like doctors and health insurance companies. And these mental health apps and a number of other medical services that are delivered over the internet sometimes don't fit into that rubric because they aren't always connecting you with a doctor. And even in some cases when they are connecting you with a doctor, there are some parts of the app that are covered by the law and some parts that aren't. And the problem that we found is that leaves consumers in this sort of unusual space where you don't always know what to expect about how your information is handled when you engage with one of these services.

VLADIMIR DUTHIERS: So this report also found that most of the mental health apps analyzed shared data with a number of outside companies. Where is a user's data going with these mental health apps? And does it vary depending on the app? Normally, you see-- oftentimes, when you sign up for something, it says, hey, XYZ will sometimes share information with XYZ developers. Do you get that message with some of these mental health apps or no? And I usually check no, you can't. I don't even care if the thing crashes every five seconds. I don't want to send a report to anybody.

THOMAS GERMAIN: Right. I think a lot of people--

ANNE-MARIE GREEN: And I'm usually like-- I just want to get to the app. So I'm usually like, what do I got to click? What do I got to click? And often, I don't read those things, which, I'd argue, a lot of people do.

THOMAS GERMAIN: Yeah. I think a lot of people have that same experience. But the bottom line here really is these are apps, right? They behave the way that most apps do. And what that means, like you're both saying, is they work with a lot of third-party companies to process your information.

Sometimes they work with researchers. Sometimes they work with advertising companies, like Facebook and Google. And what we saw is most of these apps shared some information with other companies.

Now, I want to be clear. And this is a really important point. We didn't see that they were sharing the contents of your conversations with therapists. We didn't see that they were doing anything improper with the things that you typed into these apps. Some of that could be going on behind the scenes.

But what we did see, for sure, is a lot of these apps let other companies know, through the services that they're using, that you are engaging with their product. And what that means is some companies, Facebook, Google, might learn that you are seeking care for mental health, depending on which app you're using.

Now, should that bother you? That really depends on your own personal preferences and your feelings about the stigma of mental health issues. But it's important for you to know that that's something that could happen when you're going in.

ANNE-MARIE GREEN: You know what they say. If you're not paying for the product, you are the product. That's an old line. So you talked about how HIPAA was written in a time when this sort of thing wasn't an option. And it's great that you can get mental health care in the palm of your hand. Are there any conversations about perhaps bringing HIPAA up to date?

THOMAS GERMAIN: That's something that's been discussed. At the same time, I wouldn't hold your breath for Congress passing that kind of comprehensive regulation right now. And in fact, the opposite has been happening because of the pandemic. Federal regulators that govern HIPAA and other privacy rules have been relaxing regulations to make it easier for consumers to seek care during this time of crisis, which, in one sense, is a good thing because it makes it easier for these services to perform the functions that give people the help that they need.

But in the meantime, we're left with a situation where the rules don't always apply to the data that's being collected. And it's really hard for consumers to know what to expect. And there are a lot of experts who are calling for updated rules.

VLADIMIR DUTHIERS: So how do these apps work? In other words, are you able to-- is it just-- would I get the same information by using these apps if I read, for example, Freud's "The Interpretation of Dreams"? Or is it that you're actually getting tailored, specific information to help you specifically with some of the things you're going through? Or should I just check out a book by Carl Jung?

THOMAS GERMAIN: That's a great question. And again, it really depends on the app that you're engaging with. There are some services that connect you directly to a therapist. There are some that will provide you with guided meditations or cognitive behavioral therapy exercises. It really runs the gamut.

And that's one of the things that, I think, is useful for consumers to know. When you're searching for an app in the App Store, there are all kinds of different options for you to choose from. And that's actually something that's worth keeping in mind if you're concerned about your privacy. The apps that connect you with a therapist or a counselor or a medical professional are more likely to hold themselves to a higher standard. And they're more likely to fall under the privacy rules of HIPAA and other regulations. So that's something to keep in mind when you're seeking care.

ANNE-MARIE GREEN: So is that the advice you would give someone if they think they could find value in one of these apps? And there are pages and pages of them when you go to the App Store. Would that be your advice to them in order to protect themselves?

THOMAS GERMAIN: There are a couple of pieces of advice I would give. I mean, first and foremost, it's not really a privacy issue. But you should be sure that the app that you're engaging with is connecting you with a licensed medical professional because there are some apps that will just connect you with someone who has some kind of training but isn't actually a health care professional. There are plenty of apps that will send you straight to a real health care provider. And that's something you should look for.

The second thing I would keep in mind is, yes, if you are engaging with a therapist over an app, that is probably a situation where there's a higher level of scrutiny on your information. And it's also worth considering that there are plenty of services that you can use to connect with a therapist over telehealth that aren't based in an app. You can connect over Zoom. You can connect over all kinds of different services. And that's a more traditional setting where you have better expectations about your privacy.

And the last piece of advice is just really quick. If you go into your iPhone or your Android privacy settings, a lot of them have a control you can use that will limit ad tracking. And it will prevent apps and other services you're using from connecting certain kinds of information that can give away your identity. And that's something that's worth checking on before you use any app, let alone one that's dealing with information that's so sensitive.

ANNE-MARIE GREEN: Yeah. That is really good advice. Thomas Germain, thank you.

THOMAS GERMAIN: Thanks for having me on.