There’s a growing cacophony of voices calling for a clear plan to bring an ethical dimension to technology. Several forces are at play. One is the use of big data in scandals such as Cambridge Analytica’s role in Brexit and the 2016 US election, which points to a growing need for informed consent across the internet and within the world of apps. Another is the realization of how “addictive” our tech use appears to be: estimates suggest we touch our phones 2000-5000 times a day – our pets would run and hide if we were that affectionate with them! The challenge is that tech is designed to engage you; app companies live and die on whether you open their products – their very business relies on it. Industry engagement rates hover around 5%, and companies want to increase that number to grow their business model.
Apps have been around for about eleven years, and this period has seen the rise and robust articulation of persuasive design, defined by the Interaction Design Foundation as “an area of design practice that focuses on influencing human behavior through a product’s or service’s characteristics. Based on psychological and social theories, persuasive design is often used in e-commerce, organizational management, and public health. However, designers also tend to use it in any field requiring a target group’s long-term engagement by encouraging continued custom.” In healthcare, using nudges to encourage people to take their medications, track their mood, log their food, or weigh themselves can build good health habits over time – but what happens if nudges have negative consequences?
Dark Patterns: What Are They?
In user experience design, the concept of dark patterns has emerged as a call-out in product development. More designers are calling for greater ethical considerations in designing apps – do you need to touch your phone 2000-5000 times a day? Are we at a tipping point where we are so distracted by notifications we ignore our work, pets, relationships, and the social glue that makes us human?
Early last year, UX Collective interviewed the curator of darkpatterns.org, Harry Brignull, to discuss ethics in design. He defines dark patterns as “tricks used in websites and apps that make you buy or sign up for things that you didn’t intend to. The purpose of darkpatterns.org is to spread awareness and to shame companies that use them.”
The interview is linked below and is worth your time, as several issues emerge in the conversation. Education in design frequently focuses on the user experience and prototyping products but often excludes or minimizes the importance of ethics, diversity, and inclusion. Designers are usually not involved in the broader strategic discussions of their employer or funder; as Brignull puts it, “design teams are typically seen as tools for implementing business strategy rather than as a source of it.” He would like to see designers pay far more attention to the business side, since a sole focus on profit often obscures the ethical dimensions of product design. He also calls on designers to build the evidence base around the impact of design.
“Yes, the planet got destroyed. But for a beautiful moment in time we created a lot of value for shareholders.”
It seems clear we need a new standard where engagement and nudges can occur with informed consent: the user knows WHAT data is being collected and HOW the information is likely to be used. The user should also be able to opt in to data collection, rather than facing what we have today: user agreements that require a law degree to decipher and endless scrolling, until you give up and accept the terms and conditions. The time is ripe for user experience designers to be fully aware of when they supplant the users’ choice with their own, for profit.
Thanks for reading – Trina
(My opinions are my own)
Deloitte Study on Smartphone use in the UK in 2017
Interaction Design Foundation Definition of Persuasive Design
UX Collective Interview with Harry Brignull
The Dark (Patterns) Side of UX Design