You have probably noticed an increase in emails about updated privacy policies in your inbox since the European Union's General Data Protection Regulation (GDPR) took effect this past month. At its core, the GDPR requires companies that process the personal data of people living in the European Union to seek consent before using that data for business or marketing purposes, and requests for that consent must be clear and written in plain language. Organizations found in breach of the regulation face a penalty of up to 4% of their global annual revenue or 20 million euros, whichever is greater. The implications are enormous, and the outcome remains to be seen. Globally, there have been so many data breaches that protecting personal data, whether social, financial, or healthcare related, is imperative.
Convenience and Personalization over Privacy?
With more than 7 billion people on Earth and almost 29% of the planet's inhabitants active on Facebook, the data generated daily is a goldmine for marketers – but have we sacrificed privacy in exchange for personalization? Recent global events showcasing how data can be used to shape elections have given many people, including social media leaders, a reason to reconsider the balance between privacy and customization. I receive privacy notices too, and frankly, without a law degree, they don't make much sense to me. It used to take seven pieces of information to identify a person online; at a recent conference I attended, that number was reported to be down to three.
There are many benefits to Big Data. Consider the emergence of Artificial Intelligence, driven by the ability to crunch massive amounts of data and look for patterns that can support clinical care, prevent disease, or more rapidly pinpoint a therapeutic approach. The promise of emerging technology is immense; we do, however, need to bring transparency, consent, and ethics to the forefront: the backbone must be built in the light, not hidden behind the legalese and covert processes that have dominated the last decade. The EU's GDPR is, I think, a step in the right direction. In the United States, the Health Insurance Portability and Accountability Act (HIPAA), passed in 1996, safeguards a person's health information in paper or electronic format, provides a path for a person to access their own data, and limits where and how that health data can be shared. Any technology solution that wants to work with health systems must demonstrate that it is HIPAA compliant – it is table stakes, as health systems, providers, and health plans take HIPAA very seriously.
One emerging concept is "social cooling," a term coined by technology critic and privacy designer Tijmen Schep. He posits that just as oil and its by-product industries have contributed to climate change, our digital footprints leave breadcrumbs that can be scooped up by data engines and analyzed for nefarious purposes. His concern seems Orwellian but, given recent events, is gaining credence. If all your digital data formed a "picture" of who you are as a person, would that picture accurately reflect you? Schep worries that just as a credit score determines whether we can get a loan for a car or a house, an emerging social score could have implications for employment and other essential aspects of our day-to-day lives. It is indeed a sobering thought.
Is Your Online Life a Digital Tattoo?
In a popular TED Talk, futurist Juan Enriquez asks whether our online lives function as digital tattoos. Well, do they? It is a fascinating concept. He suggests that Andy Warhol may have gotten it backwards: instead of 15 minutes of fame, we may only have 15 minutes of anonymity in the modern world. iGen and Millennials are digital natives; they have never known a time without the internet. Watching a toddler use an iPad is fascinating and frightening at the same time: they figure technology out so quickly – thanks to the same intuitive design that makes it easier for those of us who are older, too!
So each of us has a lot to think about: how to redress the transparency imbalance around our digital data – who gets to see it, who gets to track us, how it is used to market to us, and whether it is sold on to other data-mining groups. We are at an inflection point, and I believe it is a good one. Bringing an ethical dimension to how data is used is critical. In the meantime, please tend to your digital garden with awareness, as you are sharing your information with many parties – most of them unknown to you.
Thanks for reading – Trina
(Opinions are my own)