Are you sharing your mental health data with tech companies?
Plus: What’s “pharmacology”? + Share your strategies for mental well-being + How we know tampons are safe to use
Welcome to Doing Well. Today:
A Q&A on how mental health apps hoover up our sensitive personal data—and what we can do to protect ourselves
Should social media apps come with a health warning label like the kind you’d see on a pack of cigarettes?
Your turn: What strategies do you use to improve your mental well-being?
Word of the week: “pharmacology”
Wait … how do we know tampons are safe?
Thanks for being here—let’s dive in.
We Asked: What should we know about mental health apps before we download them?
Many of us use apps or other digital platforms to connect with mental health resources like therapists or guided meditations. During the COVID-19 pandemic, digital tools to manage stress and mental health exploded in popularity—by some estimates, there are between 10,000 and 20,000 mental health apps out there, making up a market of around $7 billion.
When we engage with these tools, we often exchange sensitive personal information about our health and identities, and it’s not always clear where that information goes, who can access it, and how it’s protected. That’s why, in 2022, privacy researcher Jen Caltrider and her team released their first privacy review of mental health and prayer apps.
As they did with a series of other products through the Mozilla Foundation’s Privacy Not Included project, the team took a group of popular apps; dug through privacy policies, security reports, and reviews; identified potential issues; and pulled it all together in a user-friendly guide. The result? They were “shocked” at how poorly the vast majority of apps treated sensitive mental health data. (They released updated reviews in 2023.)
I spoke to Jen about what they found, and what we can all do to protect our mental health data online. Our conversation has been edited for length and clarity.
Mia Armstrong-López: When you first reviewed mental health and prayer apps, you gave 28 of 32 apps a “Privacy Not Included” warning label, which means that you had strong concerns over how they were managing user data. What alarmed you most?
Jen Caltrider: There were a number of things we found that were alarming—security vulnerabilities, for one. Some of the apps didn’t require strong passwords, or had suffered data breaches or other security issues. We saw apps that collected a whole lot of data, and some were just blatantly saying that they could sell that data or try to monetize it.
When you get into apps that connect people with therapists, you get into a gray area of which personal information might be covered by HIPAA [the federal law that protects health information]. I think a lot of people assumed, “Oh, I’m talking to a licensed therapist, so my conversations are covered.” In some cases, yes, if you were talking with a licensed therapist, those conversations would be covered by stronger privacy protections. But not all the data around them would be. So the contents of the conversations might be protected by HIPAA, but the metadata—the fact that you were having the conversation, the time and date you were having it, what web browser or phone you were using—things like that might not be protected.
There were other things, like apps that would say, “Oh, you’re going to talk to our life coach,” or insinuate that you were talking to a mental health professional, but keep the language vague. Life coaches aren’t licensed mental health professionals, so HIPAA doesn’t apply, and people might not be aware of that. It was very confusing to dig through the documentation to try and understand which conversations are protected.
We also found some instances of smaller apps that were targeted at specific populations—people with eating disorders or OCD, for example—and the privacy policy was a cut-and-paste boilerplate. When we reached out to the companies, they hadn't necessarily thought through collecting all this data and the responsibility that came with it.
The right to get your data deleted wasn’t something we found guaranteed to everybody. So if you did want to get off the app, you might not be able to get your data deleted. We also saw some troubling tactics: you would go to a website, click a big “get started” button, and get dumped into a questionnaire that asked some pretty sensitive questions about your mental health—how you were feeling, whether you were feeling suicidal, what your sexual orientation was—before anybody was given any opportunity to consent to a privacy policy. All this data was collected right off the bat, before any attempt was made to help people understand how it would be protected.
MAL: Let's say that I'm considering downloading a mental health app. Maybe it's an app that's going to connect me with a therapist or help me do some guided meditations. What are the steps that I should take to look out for my privacy before deciding if I want to download an app?
JC: Research is important. Our reviews are a little old at this point, but they’re a good place to start. If you’re curious, do an internet search: Does this app have any problems? Some of these apps have been investigated by the [Federal Trade Commission] for issues; they might have had data breaches or data leaks.
You can read their privacy policies. And this is something that I often hesitate to tell people to do, because I read privacy policies for a living, and they can be complicated and hard to understand. But if you want to dig in, just search for the word “sell.” Do they say they're selling data?
Privacy policies usually have a California-specific section—the CCPA [California Consumer Privacy Act] section—which tends to give more detailed information than the rest of the policy. So if you only want to read a little bit, start there, even if you don’t live in California, because that section will tell you more.
Make sure that you have the right to get your data deleted. And then look for whether they have a section on HIPAA that talks about complying with the law.
One thing that we've seen more recently is AI chatbots marketing themselves as helping people with mental health, giving people somebody to talk to. And then if you look at their legal stuff, their privacy policies, they'll say things like, “We're not here to help your mental health. We're not a mental health app.” And so that's a red flag. AI chatbots aren't going to be covered by HIPAA generally, because they aren't real therapists.
MAL: Let's say I already have one of these apps on my phone. Are there things that I can do retroactively to protect myself and the data that I already handed over to an app or platform?
JC: It can be hard. You can reach out and ask them to delete your data. They may or may not, depending on where you live and what privacy laws protect you. You can delete the app from your device and try to wipe everything. If you're talking with a therapist, you can ask that the therapist only take handwritten notes and not upload those notes to the app system to keep them more private. Just be aware that anything that's on the internet is never 100% guaranteed to be safe and secure.
MAL: Sometimes access to these apps may be paid by or connected to one's employer or health insurer. Is there anything in particular I should look out for in that case?
JC: You should check with your employer and see if they have a privacy policy covering what data is shared and what data they can access. Getting that policy in writing feels very important.
MAL: You recommend avoiding using sign-in credentials from a third party. So if I'm signing up for a service, maybe I think it's really easy just to use my Gmail login or my Facebook login. Why isn’t that a good idea? What are the potential consequences if something goes wrong?
JC: Generally, whenever you sign up for an app through a third party, that third party will collect some information. There used to be a lot of Facebook sign-ins, and Facebook would be able to collect data about your sign-up. That’s not what you want, especially when it comes to mental health apps. We recommend staying away from that as much as possible, just to protect yourself and limit data sharing wherever you can.
MAL: It can be easy to feel numb to privacy concerns—we all know that we should care about privacy, but we interact with so many systems that snatch up our data in so many ways that it can feel tempting to just throw up one's hands. Why should I care how these apps are treating my data?
JC: I get that it’s really hard to feel private when so much of our information is collected, and that’s scary. The flip side is that if we just throw up our hands, we send personal information out into the world, where it can get into the hands of people who might not have our best interests at heart. And that information can be used to coerce us, to manipulate us, to drive us toward things that aren’t healthy.
Companies collect all this data to build a profile on you, and the inferences in those profiles are—if you stop and think about it—pretty creepy. They include things like your intelligence level, your psychological tendencies, your abilities. They’re trying to determine how smart you are and how easily you might be manipulated. Oftentimes that data can be accessed by people trying to push a dangerous ideology, or by foreign governments that might not have our best interests at heart.
It sounds alarmist, and I know it’s really hard to take care of everything right now. But it’s still worth caring. That means being extra cautious with sensitive personal information—things you wouldn’t want to stand in front of a bus full of people and say.
Well-Informed: Related stories from the ASU Media Enterprise archives
How can we ensure that sensitive health data is treated responsibly? In this article for Issues in Science and Technology, Karl Lauterbach, Jochen Lennerz, and Nick Schneider propose a model that would blend technical safeguards and ethical values.
Plus: Should social media apps come with a health warning label like the kind you’d see on a pack of cigarettes? Swapna Reddy, assistant dean and professor at ASU’s College of Health Solutions, joined Arizona PBS to discuss the public health risks posed by our favorite social media apps—and what can be done about them.
Well-Versed: Learning resources to go deeper
What does it mean to be an ethical health care provider? How does the way health care is delivered in the U.S. compare with other countries? ASU’s “Ethics in Healthcare” course tackles these knotty questions and many more—get started online today.
Well-Read: News we’ve found useful this week
“10 Small Things Neurologists Wish You’d Do for Your Brain,” by Mohana Ravindranath, April 3, 2025, The New York Times
“Misinformation About Fentanyl Exposure Threatens to Undermine Overdose Response,” by Henry Larweh, April 10, 2025, KFF Health News
“Distracted Driving Crashes Add to Rise in Traffic Deaths,” by Tyler Buchanan, April 9, 2025, Axios
“How Measles Attacks an Unvaccinated Child,” by Emily Baumgaertner Nunn and Marco Hernandez, April 11, 2025, The New York Times
“Here’s How to Retrain Your Brain to Crave Movement More Than Screen Time,” by Diana Hill and Katy Bowman, April 13, 2025, NPR
Well-Engaged: Your turn!
What tools or strategies do you use to improve your mental well-being?
The World Health Organization defines mental health as “a state of mental well-being that enables people to cope with the stresses of life, realize their abilities, learn well and work well, and contribute to their community.” More people are recognizing that mental well-being is an essential part of overall health—a recent Gallup poll found that 70 percent of Americans would prefer that their health care providers ask about mental health concerns, even when the reason for their appointment is a physical health issue. Still, the advice we get for improving our mental health can be broad and vague—and different things work for different people.
What specific strategies do you have for managing stress, improving mental health, or just trying to make the day a little easier?
Respond to this email or comment on Substack with your answer. We’ll share our community’s strategies in a future edition of Doing Well, so we can each try something we hadn’t thought of before. We can’t wait to hear from you!
-Mel Moore, health communication assistant
Well-Defined: Word of the week
Pharmacology can be broken down into two parts: Pharmaco-, meaning drug or medicine, and -ology, meaning the study of. Pharmacologists study existing medications and those in development to see how they function within the body, which helps determine how they should be used. Pharmacologists are mostly behind-the-scenes medical professionals—but the next time you read the instructions on a pill bottle, know you have a pharmacologist to thank!
-Kitana Ford, health communication assistant
Well-Aware: Setting the record straight on health myths
Have you ever been told that tampons aren’t safe to use? (I have!) Let’s break down what you should know.
Tampons, a hygiene product used to absorb menstrual flow, are regulated by the Food and Drug Administration as medical devices—which means that before bringing a product to market, tampon manufacturers have to submit data demonstrating its safety and effectiveness. This process is designed to ensure people who use tampons can have confidence in the products they find on the shelf. There are a few important caveats: The FDA says a tampon shouldn’t be used more than once, and the agency hasn’t cleared reusable tampons, which could come with a higher risk of yeast, fungal, and bacterial infections.
You may have read about something called toxic shock syndrome, or TSS, on your tampon box. TSS is a real but rare risk of tampon use that occurs when bacteria produce toxins that enter the bloodstream. Luckily, there are straightforward ways to minimize the risk, including changing your tampon regularly (the FDA recommends every four to eight hours) and opting for the lowest absorbency you need. Rates of TSS have declined in recent years, in part because the FDA reviews whether products encourage bacterial growth before they go to market and because packaging now offers clearer guidance on using tampons safely. Remember, millions of people use tampons safely every day.
If tampons aren’t for you, there are other options for menstrual management, including pads, period underwear, and increasingly popular menstrual cups and discs.
-Mel Moore, health communication assistant and ASU student
Looking for health-related events and ways to engage with your community? Here’s one you won’t want to miss: On Thursday, April 17 at 4:30 p.m., join the Walter Cronkite School of Journalism and Mass Communication and the National Center on Disability and Journalism to honor the winners of the Center’s annual contest, which recognizes journalists who have produced exceptional coverage of topics related to disability. Learn more.
Do you have a question or topic you’d like us to tackle? Would you like to share your experience? Reach out at any time—we’d love to hear from you.