How Private is Your Data in Mental Health Apps?

Photo by Markus Winkler on Pexels.com

The pandemic is challenging our mental health in many ways as life in lockdown continues; while hope is on the horizon as vaccine distribution accelerates, the lingering impacts of COVID-19 will be with us for some time.

Consumer Reports (CR) has been a trusted source of reviews for many years; from cars to appliances, they conduct in-depth product reviews and provide consumers with side-by-side comparisons to aid consumer choice. More recently, CR has also waded into app reviews, since more and more people use apps to manage aspects of daily life. What happens to your data in these apps? Is that a concern for you? What if some of these data are sensitive, such as your current mood? What if you share your mental health history? Who else has access to that information?

The digital fingerprint we leave in our daily wake is rightly garnering more attention. Apps that support mental health are ubiquitous in the app stores, yet consumers have very little to go on when choosing one; interest is often generated by word of mouth alone.

On March 2, CR released a report on privacy in consumer mental health apps. Their findings fall into four categories of recommendations for app companies to consider when developing an app, and they can also act as a guide for consumers using mental health apps.

  1. Clearly explain how consumer data will be de-identified for use in research and, if data remain identifiable, how consumers can opt in or out. Given the sensitivity of mental health data, de-identification matters: a savvy analyst could combine supposedly anonymous records with other identifiable data sources to re-identify individuals and share the results with third parties. These digital traces may come back to haunt a consumer if used for nefarious purposes. (A rough sketch of what de-identification can look like follows this list.)
  2. Clearly explain how consumer data will be used in context, so users understand the consequences of choosing to share. CR recommends that privacy be protected by default, so staying private isn't a heavy lift for the end user. Where users are offered a choice about how their data will be used, present it clearly and concisely.
  3. Follow platform guidelines to protect users' privacy. Whether an app ships on Android or iOS, CR recommends that app companies ensure the data libraries they embed in their solutions are developed, configured, and maintained to the platform's standards.
  4. Be transparent about additional service providers that receive user data as part of the user experience. Currently, there is no legal requirement in the US to list all service providers who may receive data from an app's use. Still, good privacy hygiene means informing consumers about how their data may be shared so they can opt in or out of those third-party relationships.
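
For readers curious what items 1 and 2 can look like in practice, here is a minimal Python sketch of de-identification paired with a privacy-protective default. Everything in it is illustrative: the field names, the salted-hash pseudonym, and the `share_with_researchers` flag are my own assumptions, not anything taken from the CR report or a specific app.

```python
import hashlib
import secrets

# Hypothetical sketch only: field names and the sharing flag are
# illustrative assumptions, not from the CR report or any real app.

SALT = secrets.token_hex(16)  # kept server-side; never shipped with the data

DIRECT_IDENTIFIERS = {"name", "email", "phone", "address"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and replace the user ID with a salted,
    one-way hash so shared records cannot be trivially linked back."""
    pseudo_id = hashlib.sha256((SALT + record["user_id"]).encode()).hexdigest()
    shared = {k: v for k, v in record.items()
              if k not in DIRECT_IDENTIFIERS and k != "user_id"}
    shared["pseudo_id"] = pseudo_id
    return shared

record = {
    "user_id": "u-1029",
    "name": "Jane Doe",
    "email": "jane@example.com",
    "mood_entry": "anxious",
    "share_with_researchers": False,  # privacy-protective default: off
}

# Data is shared only if the user has explicitly opted in (item 2).
if record["share_with_researchers"]:
    print(deidentify(record))
else:
    print("Sharing disabled by default; nothing leaves the device.")
```

Even this kind of pseudonymization is no guarantee: as CR's first recommendation implies, records stripped of direct identifiers can still be re-identified when joined with other data sets, which is exactly why transparency and genuine opt-in matter.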

As digital health grows in the healthcare space, improved data use, privacy, and transparency will be table stakes for the industry’s health. Data breaches have shaken consumer trust, and there is appropriately higher sensitivity about health data. This CR report shows work to be done to improve transparency and protections in mental health apps. As health systems leverage these tools, there is also tremendous opportunity to evolve and enhance these expectations and standards to safeguard consumers. 

Thanks for reading – Trina

(Opinions are my own)
