Persuasive Design: When Does UX Become Evil?

Almost a decade ago, somebody asked me the number one thing I loved about working with Human Factors International. I remember saying, “Our ethics policy. It states that I can refuse to work on a particular project on ethical grounds. If I feel strongly against violence, smoking, or alcohol, then it is well within my rights to refuse to work on that project.”

Ethics and UX should go hand in hand, but sometimes it’s not easy to decide what is right and what is wrong. UX experts use every tool in their arsenal to make the user experience compelling. We speak to business stakeholders to understand KPIs. We conduct thorough performance, persuasion, and ethnography research. We create persuasive, cross-channel strategy. We use every scientific design heuristic known to man. And we push technologies like virtual reality and artificial intelligence to drive experiences that appear real and engaging. We then validate our conclusions with usability and A/B testing to be certain that our designs are working the way we intended.

So, isn’t it imperative that we ask, “Are we designing the ‘right’ thing?” And more importantly, what is the “right” thing? It’s not always easy to decide, but as UX professionals we need to decide soon. After all, who is responsible when UX goes wrong?

Evil UX Is Showing Up

Influencing people’s behavior for profit is not new. Marketers were using persuasive techniques as early as the 1920s, when John B. Watson, an American psychologist, brought psychological research to advertising. He concluded that marketing depended not on appealing to rational thought, but on appealing to emotions and stimulating desire for a product.

Watson became employed by the advertising agency J. Walter Thompson after Johns Hopkins University asked him to leave his faculty position because of the publicity surrounding his affair with a graduate student. What was a loss to academia was a boon to advertising. Watson is said to have told advertisers to stir up consumers’ emotions: “Tell him something that will tie him up with fear, something that will stir up a mild rage, that will call out an affectionate or love response, or strike at a deep psychological or habit need.” Today we see such persuasive techniques not just in advertising but in user experience design.

A few months ago, dating app Tinder unveiled a paid feature that allows users to “undo” left swipes (Figure 1). It was a rather innocuous launch, but it smelled a bit funny.

Figure 1. Tinder: Swipe left if you dislike; swipe right if you like

On Tinder, you choose people based on their appearance. Swipe the profile to the left if you don’t like them. Swipe it to the right if you do. Simple? No.

Two user behaviors make Tinder not so cool. The first, rapid screening, was documented in a 2006 study by Sillence, Briggs, Harris, and Fishwick, which found that people experience trust at two levels: rapid screening and deep evaluation. Rapid screening, also called thin slicing, is the first-impression response that people form in a tiny fraction of a second.

On Tinder, rapid screening puts you mainly in rejection mode while swiping. This response is basically a lack-of-trust gut reaction: you are reflexively rejecting people and making decisions in a fraction of a second.

But when you combine this user behavior with the second behavior, called momentum of yes (or no), it becomes evil. This second behavior means that once you repeatedly say “no” (or “yes”), you are more likely to continue saying “no” even when you want to say “yes.” This happens because the neurons in your brain are repeatedly firing “no.” To say “yes,” you need to consciously stop your brain from saying “no.” This is why, when installing software and repeatedly hitting the “Next” button, you accidentally install or change things you didn’t want.

When you visit Tinder, rapid screening makes you more likely to say “no,” and the momentum of no is likely to keep you saying it. Instead of fixing this UX issue, Tinder lets you undo, but charges you for it. Hence, it’s no wonder that a back button is the most requested feature among Tinder users, as co-founder Sean Rad stated in a 2014 TechCrunch article. See for yourself what Google’s auto suggestions show when you type in “Tinder swipe left” (Figure 2). The last suggestion is comically heartbreaking.

Figure 2. Google’s auto suggestions for “Tinder swipe left” are mildly amusing; they include “Tinder swipe left gone forever.”
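
To put the size of that paid “fix” in perspective, below is a minimal sketch of swipe undo built on a plain stack. This is not Tinder’s actual code or API; every name in it (SwipeDeck, Profile, and so on) is hypothetical and purely illustrative.

```typescript
// Hypothetical sketch only: reversing a swipe is a one-line stack pop.
// None of these names come from Tinder; they are illustrative.

type Profile = { id: string; name: string };
type Swipe = { profile: Profile; liked: boolean };

class SwipeDeck {
  private queue: Profile[];
  private history: Swipe[] = []; // stack of past decisions

  constructor(profiles: Profile[]) {
    this.queue = [...profiles];
  }

  // Record each decision so it can be reversed later.
  swipe(liked: boolean): Swipe | undefined {
    const profile = this.queue.shift();
    if (!profile) return undefined;
    const decision: Swipe = { profile, liked };
    this.history.push(decision);
    return decision;
  }

  // Undo: pop the last decision and put the profile back on top.
  undo(): Swipe | undefined {
    const last = this.history.pop();
    if (last) this.queue.unshift(last.profile);
    return last;
  }
}

const deck = new SwipeDeck([
  { id: "1", name: "Ana" },
  { id: "2", name: "Ben" },
]);
deck.swipe(false);                       // momentum of no strikes
console.log(deck.undo()?.profile.name);  // "Ana" is back in the deck
```

If anything, the sketch suggests that undo itself is trivial to build; charging for it is a business decision, not a technical necessity.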

The Internet is littered with such “evil by design” examples. For a good list, visit Harry Brignull’s Dark Patterns website, darkpatterns.org (Figure 3), which documents manipulative UI design techniques that “do not have the user’s interests in mind.”

Figure 3. The Dark Patterns website documents the various evil designs that can be found on the Internet.

A few dark patterns featured on the site include:

  • Bait and Switch: The action of advertising goods that are an apparent bargain, with the intention of substituting inferior or more expensive goods.
  • Privacy Zuckering: The practice of “creating deliberately confusing jargon and user-interfaces that trick your users into sharing more information about themselves than they really want to,” according to Tim Jones of the Electronic Frontier Foundation. The name “Zuckering,” suggested by Jones’s followers on social media, apparently refers to Facebook and its founder Mark Zuckerberg.
  • Roach Motel: Something that is much easier to get into than to get out of, such as email newsletter subscriptions with a complicated process for unsubscribing.
  • Sneak into Basket: Adding items to a shopping cart that the user did not intend to select. Sometimes there’s an opt-out radio button or checkbox on another page that the user didn’t notice (a sketch of this pattern follows this list).
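
Most of these patterns reduce to a question of whose interests the defaults serve. As a concrete illustration of Sneak into Basket, here is a minimal sketch contrasting a dark default with an honest one; the functions, item names, and prices are hypothetical and not taken from any real site.

```typescript
// Hypothetical sketch of the "Sneak into Basket" pattern.
// The cart logic, item names, and prices are invented for illustration.

type CartItem = { name: string; price: number };

const addOn: CartItem = { name: "Delivery insurance", price: 4.99 };

// Dark default: the add-on ships with the order unless the user spotted
// an opt-out control on some earlier page and unticked it.
function checkoutDark(items: CartItem[], optedOut: boolean): CartItem[] {
  return optedOut ? [...items] : [...items, addOn];
}

// Honest default: the add-on is offered, but only added when the user
// explicitly opts in. Nothing hinges on a missed checkbox.
function checkoutHonest(items: CartItem[], optedIn: boolean): CartItem[] {
  return optedIn ? [...items, addOn] : [...items];
}

const order = [{ name: "Headphones", price: 59.0 }];
console.log(checkoutDark(order, false).length);   // 2: the user never asked
console.log(checkoutHonest(order, false).length); // 1: the cart matches intent
```

The two functions differ by a single condition; the pattern is evil not because it is clever, but because the default quietly works against the user.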

The Right Thing Is Not Easy to Decide

Determining the right thing to do is not always simple. Take deception, for example. On the face of it, you would likely agree that one must not deceive people. However, in his book Evil by Design, Chris Nodder cites two examples of deception having positive outcomes.

In the first example, he writes about children who are afraid of monsters, citing research by Liat Sayfan and Kristin Hansen Lagattuta from the Center for Mind and Brain at the University of California, Davis. They suggested that rather than reassure children that there is no such thing as monsters, it might be more useful to stay within their imaginary world and give the children a way to be powerful against the monsters.

This, of course, is deception, as Nodder points out, even though the children appear to be willing participants in it. Even the 4-year-olds in the study knew that monsters weren’t real, but they coped better when the resolution was framed in terms of the imaginary world. This willing complicity in the deception can lead to product opportunities: Monster Go Away! spray is a diluted mix of lavender oil in water, packaged specifically with monster banishing in mind (Figure 4).

Figure 4. Monster and Ghost Spray Away

Another example Nodder cites involves Alzheimer’s patients, who become distressed when their long-term memories conflict with their current situation. As dementia sets in, they are less able to remember current events. Richard Neureither, the director of the Benrath Senior Center in Düsseldorf, suggested that rather than fight this, a better tack is to “meet them in their own version of reality.”

Franz-Josef Goebel, chairman of the Düsseldorf Old Lions Benefit Society, proposed creating a fake bus stop outside the care facility. Because the residents are free people, they cannot be locked up or restrained with drugs, and some become violent when told they can’t leave. By walking the residents to the bus stop—a symbol many associate with returning home—staff give them a sense of accomplishment. Because their short-term memory is not sharp, the residents soon forget why they were waiting for the bus. After its success at two initial locations, the idea was repeated at care facilities in Munich, Remscheid, Wuppertal, Herten, Dortmund, and Hamburg.

Nodder argues that these scenarios are better alternatives than restraint or drugs because the children and the Alzheimer’s patients responded favorably, and because the individuals were fully complicit in their own deception. So perhaps, Nodder concludes, persuasive techniques that use deception or appeal to subconscious motivations can have positive or even ethical outcomes.

Conclusion

In 1999, Daniel Berdichevsky and Erik Neuenschwander of Stanford’s captology lab proposed eight “Principles of Persuasive Technology Design,” which have since become a mainstay of persuasive design. Their eighth principle, the “Golden Rule of Persuasion,” says that “the creators of a persuasive technology should never seek to persuade a person or persons of something they themselves would not consent to be persuaded to do.” This guidance may help in determining what is right and what is wrong.

We can continue to debate what the right thing to do is, but we can arguably agree on this much: meeting business needs while improving the user’s experience qualifies as doing the right thing. Maybe part of our role as UX professionals is to start these discussions, stand up for the user, and find a way to meet both user and business needs without being deceitful.

Chauhan, V. (2015). Persuasive Design: When Does UX Become Evil? User Experience Magazine, 15(4).
Retrieved from http://uxpamagazine.org/persuasive-design/

6 Responses

  1. Tarun says:

    Sir, I totally agree with your views. We as UX professionals should stand up for our users while keeping business needs in mind.

  2. Aral Balkan says:

    As long as our approach to design is anthropological (us designing for ‘the other’), any talk of ethics is superficial at best; comparable to discussing ethics in factory farming (that is to say, better that we care about the issue than not but ignoring the actual source of the problem).

    If we want to practice design (that which empowers people and creates a more egalitarian and sustainable world) instead of decoration (that which perpetuates the traditional monopolistic power structures of neoliberalism/capitalism), then we must tackle the root of the issue: instead of a mostly homogeneous privileged group designing for ‘the other’, we must create diverse design teams who design for themselves. Not only is this a competitive advantage (you cannot compete with a competent design team designing for themselves when you’re designing for a demographic you are not part of), but it is also fundamentally egalitarian in nature: a diverse team designing for themselves can design for a diverse audience without engaging in anthropological practices.

  3. Even Keal says:

    I’m very happy to hear that Human Factors has a policy that allows UX professionals to refuse projects on ethical grounds. UX professionals can easily find themselves at the forefront of deception.

    There have been scenarios where clients bring UX professionals in to help cover up or gloss over some very unethical practices: basically, to make an “evil” practice feel like a “good” experience.

    I know of a UX designer who was assigned a consulting gig for Planned Parenthood and eventually asked to leave the project. In her case, she felt there was tremendous effort being put into convincing the public that abortion is somehow “healthcare,” that questioning Planned Parenthood’s ethics is somehow a “war on women,” that selling baby parts is “research,” etc. The tools of this deception were “marketing,” “PR,” and “experience design.”

    This is clearly using our UX superpowers for the wrong purpose. With great power comes great responsibility!

  4. Thank you for writing about this issue, Vikram. It needs to be discussed and not swept under the rug, as has been done in the advertising industry for many years.

    To Kalyna: No, we’re not talking about regulation. But neither are we talking about ignoring the ethical issue. As UX professionals, we need to speak up and call out unethical practices, just as Vikram is doing here and Harry Brignull has done with darkpatterns.org. I’ll also add that it’s more like there are bunches of bad apples, not just one.

    To Tema: No, ethics are not *always* clear-cut, but they certainly are more clear-cut than most people are willing to believe. It’s not hard to know when you’re crossing the line by choosing deception over clarity.

    BTW: I also wrote about this same topic in a 2014 post titled “User Experience’s Dark Side Raises Ethical Stakes” — http://beautifulinvisibility.com/?p=1

  5. Kalyna says:

    Always a bad apple in the bunch! What’s the bottom line? More regulation?

  6. Tema Frank says:

    Interesting article. Ethics are not always clear-cut.