Balancing Privacy and Data Use in Digital Health

A cherished colleague often says, “necessity is the mother of innovation,” and I happen to agree with him. Few of us in healthcare would look back on 2020 and deny that the pace of innovation necessitated by the pandemic moved our work forward by a decade. One area that is always top of mind in privacy is how we deliver care: balancing new ways to meet needs with transparency about how data is used to improve that care’s delivery. We approach the healing journey with “first do no harm” codified into our processes, by design and by regulation.

A new paper by Deven McGraw of Ciitizen in Palo Alto and Kenneth Mandl, published in npj Digital Medicine, explores considerations in balancing privacy and innovation in the digital age. To appreciate both the complexity and the opportunity of these data, we must first understand the different data categories and their protections.

Category one data is generated by the healthcare system; it is most often stored in the Electronic Medical Record and includes labs, prescriptions, and data generated by pathology and radiographic imaging. Category two data is consumer-generated health data; this includes wearable and sensor-derived data, patient-reported outcomes, and direct-to-consumer data like DNA testing.

The authors describe category three data as the “digital exhaust,” an offshoot of consumer activity such as social media posts, geolocation, and internet browsing history. Category four, the last, includes non-health-related data such as demographics, employment status, zip code, and voter registration. Taken as a whole, this data at an individual or population level can be insight-rich, but how much privacy is an individual afforded when a treasure trove of insights lies at a savvy data cruncher’s fingertips?
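
To make the four categories concrete, here is a minimal sketch in Python. This is my own illustration, not from the paper: the category labels follow the authors’ taxonomy, but the example data elements and the idea of tagging records this way are hypothetical.

```python
from enum import Enum

class DataCategory(Enum):
    """The four health-relevant data categories described by McGraw and Mandl."""
    HEALTHCARE_SYSTEM = 1   # EMR data: labs, prescriptions, pathology, imaging
    CONSUMER_HEALTH = 2     # wearables, patient-reported outcomes, DTC DNA tests
    DIGITAL_EXHAUST = 3     # social media posts, geolocation, browsing history
    NON_HEALTH = 4          # demographics, employment, zip code, voter registration

# Hypothetical examples of tagging individual data elements by category
examples = {
    "hemoglobin_a1c_lab_result": DataCategory.HEALTHCARE_SYSTEM,
    "smartwatch_step_count": DataCategory.CONSUMER_HEALTH,
    "restaurant_checkin_geolocation": DataCategory.DIGITAL_EXHAUST,
    "voter_registration_record": DataCategory.NON_HEALTH,
}

for element, category in examples.items():
    print(f"{element}: category {category.value} ({category.name})")
```

A tagging scheme like this matters because, as discussed below, only the first category enjoys HIPAA’s protections today.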

The Health Insurance Portability and Accountability Act of 1996, also known as HIPAA, only provides protections for category one data, generated by the healthcare system, so entities operating outside that system may not be held to the same standards. However, once health-related data is collected by an entity covered by HIPAA, all of that data is protected. To demonstrate the variation in approaches to data use, the authors analyzed ten apps; two of them, containing information on women’s menstruation cycles, were found to transmit user activities to 70 different third-party advertising and profiling companies without explicit consumer consent. This lack of transparency around the use of consumer-generated data has been shown in several studies spanning many app companies. That reality does not inspire confidence for health systems that want to explore apps as a means of augmenting current care models, and if this gap persists, it will slow the adoption of digital health. Better data agreements and data stewardship are necessary.

While the 21st Century Cures Act is designed to create more open data sharing, health systems and EMR vendors have been critical of the lack of privacy provisions around sharing category one data with consumer-facing apps. Who is best positioned to inform a path forward? While many federal entities may need to collaborate on building a solution, the Federal Trade Commission (FTC) currently regulates companies’ health data use. Still, the authors suggest a more comprehensive set of guidance will be necessary. Several bills addressing privacy concerns are being drafted, but are they relying too heavily on consumers knowing what they are opting in to or out of? We have all clicked through lengthy terms and conditions while onboarding onto apps.

How are we to strike a balance between open data access on the one hand and privacy and transparency on the other? Addressing this duality is a necessary next step, and the blueprint may be informed by existing frameworks like HIPAA and the FTC’s consumer privacy recommendations.

Consumers will require more choice and transparency to opt in to or out of specific data being collected and shared. Limits may also be necessary to avoid the “kitchen sink” consent process, in which a consumer does not get to choose what to share and an entity simply vacuums up all the data. Ethics boards and data trusts may be necessary to ensure the data is being used appropriately, acting as data custodians. Better remedies for data breaches and harmful data use also need to be developed; when malicious use is detected, appropriate deterrents should be formulated and acted upon.
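
As a rough sketch of what granular, opt-in consent could look like in contrast to a “kitchen sink” grant, here is a small Python illustration. The record fields and sharing purposes are hypothetical, not drawn from the paper or from any bill or framework.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Hypothetical per-purpose consent, instead of one all-or-nothing grant."""
    user_id: str
    # Each purpose defaults to False: nothing is shared unless the consumer opts in.
    share_with_care_team: bool = False
    share_for_research: bool = False
    share_with_advertisers: bool = False

def may_share(consent: ConsentRecord, purpose: str) -> bool:
    """Deny by default: unknown or un-consented purposes return False."""
    return getattr(consent, f"share_{purpose}", False)

consent = ConsentRecord(user_id="patient-123", share_for_research=True)
print(may_share(consent, "for_research"))      # True: explicit opt-in
print(may_share(consent, "with_advertisers"))  # False: never opted in
```

The design choice worth noting is the deny-by-default posture: a purpose the consumer never saw, or never agreed to, is automatically refused rather than swept in.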

A lot of work lies ahead to realize this vision of leveraging more comprehensive data use to support health and well-being. If digital health is to bring new models of care and more member-centered approaches, the “data use and privacy Rubicon” will have to be crossed in the next few years. If data privacy and data use are not transparent, the most significant risk is the loss of consumer trust.

Thanks for reading – Trina

(Opinions are my own)

References

McGraw, D., Mandl, K.D. Privacy protections to encourage use of health-relevant digital data in a learning health system. npj Digit. Med. 4, 2 (2021). https://doi.org/10.1038/s41746-020-00362-8
