Canadian Journal of Nursing Informatics


This article was written on 20 Jun 2020, and is filed under Volume 15 No. 2, 2020.


Dark Patterns


Software Column

by Allen McLean, RN, MN, MSc, PhD(c)

Allen is currently a PhD student in Health Sciences at the University of Saskatchewan (Saskatoon) in the Computational Epidemiology and Public Health Informatics Lab. His research interests include the development of computer modeling and simulation software for addressing health systems challenges, chronic diseases and health inequities at the population level, as well as mobile technologies applied in long-term care facilities. Allen previously attended the University of Victoria earning an MN and MSc (Health Information Science) in a unique dual degree program for Nursing Informatics professionals. Allen has over 20 years’ experience in healthcare as an ultrasound technologist, clinical educator, team leader and community health RN.

COLUMN

Many ethical issues arise when nursing informatics professionals develop a technology designed to persuade. One of the most salient, specific to the design and use of these technologies, is the use of dark patterns in user-experience (UX) design. Dark patterns are instances where designers use their knowledge of human behaviour to implement deceptive functionalities.

Interest in the study of ethics and the use of persuasion can be traced back to ancient times. Scholars such as Socrates, Plato, and Aristotle used rhetorical oratory to debate and persuade. Many people think of advertising as a modern method of persuasion, but archaeologists have discovered evidence of advertising on bricks made by the Babylonians, some 3,000 years before the time of Christ. Today, as our society becomes more and more technologically advanced, we are developing new methods of persuasion, including persuasive technologies. Persuasive technology is any technology (e.g., computers, mobile phones and apps, tablets, wearables, serious gaming) purposely designed to change attitudes or behaviours (Byrnes, 2015). Alongside the many design choices made during the development of a persuasive technology, it is important that we consider the ethical ramifications associated with these technologies, especially now as we grapple with matters of privacy, confidentiality, anonymity, informed consent, data sharing, and security (Gibney, 2018). There are many ethical issues concerning the use of persuasion, for example the libertarian paternalism vs. behavioural regulation vs. coercive paternalism debate (i.e., the nudge, budge, or shove debate we see playing out in public, social, and health policy arenas) (Oliver, 2015).

There are a number of important ethical issues related to the use of persuasive technologies. Arguably, one of the most interesting concerns a group of specific UX design strategies termed dark patterns. Dark patterns have been defined as instances where designers and developers use their knowledge of human behaviour to implement deceptive functionalities. Others describe dark patterns as interactive design patterns that influence users through deception or trickery, and that represent the unethical application of persuasive technologies. Unfortunately, much of the research into dark patterns is published in conference proceedings, which can make access difficult or impossible. That said, Gray et al. (2018) published an excellent summary of the dark patterns they argue commonly serve as strategic motivators for designers and developers: (1) nagging, (2) obstruction, (3) sneaking, (4) interface interference, and (5) forced action.

The authors define nagging as a “minor redirection of expected functionality that may persist over one or more interactions” (Gray et al., 2018, p. 5). Nagging is often experienced as repeated interruptions during normal interactions with a technology. Typical nagging strategies include pop-ups that obscure the interface, audio alerts that distract the user, or any other actions that redirect a user’s focus. Obstruction is defined as “impeding a task flow, making an interaction more difficult than it inherently needs to be with the intent to dissuade an action” (Gray et al., 2018, p. 5). Obstruction is often experienced as a major barrier to a task. Sneaking is defined as “an attempt to hide, disguise, or delay the divulging of information that has relevance to the user” (Gray et al., 2018, p. 6). Sneaking is often experienced as forcing a user to perform an action they might not normally take if they had knowledge of the outcome (e.g., additional undisclosed costs). Interface interference is defined as “any manipulation of the user interface that privileges specific actions over others, thereby confusing the user or limiting discoverability of important action possibilities” (Gray et al., 2018, p. 7). Interface interference is often experienced as visual or interactive deception, including hidden information (options not made easily or readily accessible), preselection (options selected by default prior to user interaction), and aesthetic manipulation (any manipulation placing form before function). Finally, forced action is defined as “any situation in which users are required to perform a specific action to access (or continue accessing) a specific functionality” (Gray et al., 2018, p. 8). Forced action may be experienced as a required step in the completion of a process, or may appear disguised as an option that a user will misinterpret as beneficial to themselves.
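
To make preselection concrete, consider the following minimal TypeScript sketch of a sign-up consent form. All names here (ConsentOption, renderCheckboxes, and so on) are hypothetical illustrations, not code from Gray et al. (2018); the point is simply that a single default value in the data model is enough to implement, or avoid, an interface-interference pattern.

```typescript
// A consent option as it might appear in a sign-up form's data model.
interface ConsentOption {
  id: string;
  label: string;
  checked: boolean; // the default state shown to the user
}

// Dark pattern: marketing consent is preselected, so user inaction
// silently becomes "agreement" (preselection = interface interference).
const darkDefaults: ConsentOption[] = [
  { id: "terms", label: "I accept the terms of service", checked: false },
  { id: "marketing", label: "Send me promotional email", checked: true },
];

// Ethical alternative: every non-essential option starts unchecked, so
// recorded consent always reflects a deliberate user action.
const ethicalDefaults: ConsentOption[] = darkDefaults.map((opt) => ({
  ...opt,
  checked: false,
}));

// Render the options as plain HTML checkboxes (framework-free for brevity).
function renderCheckboxes(options: ConsentOption[]): string {
  return options
    .map(
      (opt) =>
        `<label><input type="checkbox" id="${opt.id}"` +
        `${opt.checked ? " checked" : ""}> ${opt.label}</label>`
    )
    .join("\n");
}

console.log(renderCheckboxes(darkDefaults));    // marketing box arrives pre-checked
console.log(renderCheckboxes(ethicalDefaults)); // user must opt in explicitly
```

Note that the two forms are visually almost identical; the ethical difference lives entirely in the default state, which is exactly what makes preselection so easy to slip into a design unnoticed.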

Specific examples of dark patterns include: (1) bait and switch (a user is promised one thing, but instead receives something different, possibly undesirable), (2) disguised ads (advertisements disguised as other kinds of content), (3) forced continuity (e.g., difficulty cancelling a membership), (4) friend spam (spamming a user’s contact list), (5) hidden costs (e.g., unexpected charges added at the end of a checkout process), (6) misdirection, (7) price comparison prevention (limiting a user’s ability to make informed decisions), (8) privacy Zuckering (a user is tricked into publicly sharing more information about themselves than intended), (9) roach motel (a design allowing for easy entry, but difficult escape), (10) sneak into basket (e.g., a system adds items to a basket that can only be removed using opt-out radio buttons that are not immediately obvious), and (11) trick questions (often used for misdirection or forced continuity) (Gray et al., 2018).
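
As a further illustration, the sketch below shows how little code separates an honest checkout from a “sneak into basket” or “hidden costs” implementation. Everything here is hypothetical (the item names, prices, and buildCart function are invented for this example); it is not drawn from any real system.

```typescript
// A single line item in a checkout cart.
interface LineItem {
  name: string;
  price: number;        // in cents, to avoid floating-point rounding errors
  addedByUser: boolean; // false = the system sneaked it in
}

// Build the cart from the user's selections; when `sneak` is true, an
// extra paid item the user never asked for is appended.
function buildCart(userSelections: LineItem[], sneak: boolean): LineItem[] {
  const cart = [...userSelections];
  if (sneak) {
    // Dark pattern: a "protection plan" is added automatically and must
    // be noticed and removed by the user before checkout.
    cart.push({ name: "Damage protection plan", price: 499, addedByUser: false });
  }
  return cart;
}

// Sum the cart total in cents.
function total(cart: LineItem[]): number {
  return cart.reduce((sum, item) => sum + item.price, 0);
}

const selections: LineItem[] = [
  { name: "Blood pressure cuff", price: 5999, addedByUser: true },
];

console.log(`Sneaky total: $${(total(buildCart(selections, true)) / 100).toFixed(2)}`);  // $64.98
console.log(`Honest total: $${(total(buildCart(selections, false)) / 100).toFixed(2)}`); // $59.99
```

The deception is not in any one line but in the defaults: the sneaked item is indistinguishable from a user choice unless the interface clearly flags it and makes removal at least as easy as addition.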

Because UX designers play a central role in the development of persuasive technologies (Verbeek, 2006), it is important that they understand the ethical issues involved in the use of dark patterns; otherwise, they could easily become complicit in manipulative or unreasonably persuasive practices. Strategies for mitigating the negative effects of dark patterns might include appropriate codes of professional conduct for UX designers and computer professionals, formal ethics education, regulation (and enforcement), and user participation in the design process. Codes of professional conduct do exist. For example, the Association for Computing Machinery (ACM), a professional organization for computing educators, researchers, and professionals, outlines a number of expectations in its Code of Ethics and Professional Conduct (2018) (e.g., contribute to society and to human well-being, acknowledging that all people are stakeholders in computing; avoid harm; be honest and trustworthy; respect privacy; honour confidentiality; ensure that the public good is the central concern during all professional computing work). The Code of Professional Conduct developed by the User Experience Professionals Association (UXPA, n.d.) describes seven ethical principles that might also provide UX designers with useful guidance: (1) act in the best interest of everyone, (2) be honest with everyone, (3) do no harm, and if possible provide benefits, (4) act with integrity, (5) avoid conflicts of interest, (6) respect privacy, confidentiality, and anonymity, and (7) provide all resultant data.

Aside from these voluntary recommendations, regulation and legal enforcement might be an option for limiting the use of dark patterns, as might formal ethics training added to science and engineering curricula, or even guidance from experts in the application of technology ethics. However, a more promising approach could begin with the two methodological frameworks described by Davis (2009): Value Sensitive Design (emphasizing values such as fairness, autonomy, privacy, and human welfare) and Participatory Design (a family of theories and methods that emphasize the involvement of users as full participants in design processes).

The purpose of a persuasive technology is to influence behaviours, and no technology is value neutral. Persuasion is an inherently controversial activity; some believe that all forms of persuasion are unethical. This is certainly debatable, as many others see tremendous potential for improving many aspects of society through technology. Nurses are well positioned to contribute to this discussion: we practice under a strong core of ethical values, and we are well respected by society for our many years of ethical contributions. We can do our part to end the use of dark patterns in healthcare systems.

References

Association for Computing Machinery (ACM). (2018). ACM code of ethics and professional conduct. https://ethics.acm.org/

Byrnes, N. (2015, March 23). Technology and persuasion. MIT Technology Review. https://www.technologyreview.com/s/535826/technology-and-persuasion/?set=535816

Davis, J. (2009, April 26). Design methods for ethical persuasive computing. In Persuasive ’09: Proceedings of the 4th International Conference on Persuasive Technology (Article 6, pp. 1-8). https://www.cs.grinnell.edu/~davisjan/pubs/davis-persuasive2009.pdf

Gibney, E. (2018, July 26). The ethics of computer science: This researcher has a controversial proposal. Nature Briefing. https://www.nature.com/articles/d41586-018-05791-w

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018, April 21). The dark (patterns) side of UX design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Paper 534, pp. 1-14). https://doi.org/10.1145/3173574.3174108

Oliver, A. (2015). Nudging, shoving, and budging: Behavioural economic-informed policy. Public Administration, 93(3), 700-714.

User Experience Professionals Association (UXPA). (n.d.). UXPA code of professional conduct. https://uxpa.org/uxpa-code-of-professional-conduct/#more-247

Verbeek, P. P. (2006). Persuasive technology and moral responsibility: Toward an ethical framework for persuasive technologies. In W. IJsselsteijn, Y. de Kort, C. Midden, B. Eggen, & E. van den Hoven (Eds.), PERSUASIVE 2006 (pp. 1-5). Springer.
