
Persuasive Design: When Does UX Become Evil?

Almost a decade ago, somebody asked me what I loved most about working at Human Factors International. I remember saying, “Our ethics policy. It states that I can refuse to work on a particular project on ethical grounds. If I feel strongly against violence, smoking, or alcohol, then it is well within my rights to refuse to work on that project.”

Ethics and UX should go hand in hand, but sometimes it’s not easy to decide what is right and what is wrong. UX experts use every tool in their arsenal to make the user experience compelling. We speak to business stakeholders to understand KPIs. We conduct thorough performance, persuasion, and ethnography research. We create persuasive, cross-channel strategies. We use every scientific design heuristic known to man. And we push technologies like virtual reality and artificial intelligence to drive experiences that appear real and engaging. We then validate our conclusions with usability and A/B testing to be certain that our designs work the way we intended.

So, isn’t it imperative that we ask, “Are we designing the ‘right’ thing?” And more importantly, what is the “right” thing? It’s not always easy to decide, but as UX professionals we should decide soon. After all, who is responsible when UX goes wrong?

Evil UX Is Showing Up

Influencing people’s behavior for profit is not new. Marketers were already using persuasive techniques in the 1920s, when John B. Watson, an American psychologist, applied psychological research to advertising. He concluded that marketing depended not on appealing to rational thought, but on appealing to emotions and stimulating desire for a product.

Watson joined the advertising agency J. Walter Thompson after Johns Hopkins University asked him to leave his faculty position because of the publicity surrounding his affair with a graduate student. What was a loss to academia was a boon to advertising. Watson is said to have told advertisers to stir up consumers’ emotions: “Tell him something that will tie him up with fear, something that will stir up a mild rage, that will call out an affectionate or love response, or strike at a deep psychological or habit need.” Today we see examples of persuasive behavior not just in advertising but in user experience design.

A few months ago, dating app Tinder unveiled a paid feature that allows users to “undo” left swipes (Figure 1). It was a rather innocuous launch, but it smelled a bit funny.

Figure 1. Tinder: Swipe left if you dislike; swipe right if you like

On Tinder, you choose people based on their appearance. Swipe the profile to the left if you don’t like them. Swipe it to the right if you do. Simple? No.

Two user behaviors make Tinder not so cool. The first, rapid screening, was documented in a 2006 study by Sillence, Briggs, Harris, and Fishwick, which found that people experience trust at two levels: rapid screening and deep evaluation. Rapid screening, also called thin slicing, is the first-impression response that people form in a tiny fraction of a second.

On Tinder, rapid screening puts you mainly in rejection mode while you swipe. This response is basically a lack-of-trust gut reaction: you are fervently rejecting people, making decisions in a fraction of a second.

But when you combine this user behavior with a second one, called momentum of yes (or no), it becomes evil. Momentum means that once you have repeatedly given the same answer, in this case “no,” you are likely to keep giving it even when you want to say “yes.” This happens because the neurons in your brain are repeatedly firing “no”; for you to say “yes,” you must consciously stop your brain from saying “no.” This is the reason why, when installing software and repeatedly hitting the “Next” button, you accidentally install or change things you didn’t want.
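
To make that concrete, here is a minimal TypeScript sketch of one way a wizard could break the momentum of yes: consequential steps carry no pre-selected answer, so the flow cannot advance until the user makes a conscious choice. The types and names here are hypothetical, not any real installer’s API.

// Hypothetical install-wizard step model -- not any real installer's API.
interface WizardStep {
  prompt: string;
  consequential: boolean; // installs or changes something beyond the default
  answer?: "yes" | "no";  // undefined until the user explicitly chooses
}

// Routine steps may advance on a default; consequential ones demand an
// explicit answer, interrupting the momentum of yes.
function canAdvance(step: WizardStep): boolean {
  return step.consequential ? step.answer !== undefined : true;
}

const toolbarStep: WizardStep = {
  prompt: "Also install the bundled browser toolbar?",
  consequential: true,
};

console.log(canAdvance(toolbarStep)); // false -- the user must consciously decide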

When you visit Tinder, you are more likely to say “no” due to rapid screening, and the momentum of no is likely to keep you saying “no.” Instead of fixing this UX issue, Tinder lets you undo, but charges you for it. Hence, it’s no wonder that a back button is the most requested feature among Tinder users, as co-founder Sean Rad stated in a 2014 article on TechCrunch. See for yourself what Google’s autocomplete suggests when you type “Tinder swipe left” (Figure 2). The last suggestion is comically heartbreaking.

Figure 2. Google’s autocomplete suggestions for “Tinder swipe left” include the mildly amusing “Tinder swipe left gone forever.”
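
For perspective on what that paid undo is worth technically: not much. A minimal TypeScript sketch (hypothetical types and names, obviously not Tinder’s actual code) models swipe history as a plain stack, which is all an undo needs:

// A swipe-undo sketch: hypothetical types, not Tinder's real implementation.
type Swipe = { profileId: string; direction: "left" | "right" };

class SwipeHistory {
  private stack: Swipe[] = [];

  record(swipe: Swipe): void {
    this.stack.push(swipe);
  }

  // Pop the most recent swipe so that profile can be shown again.
  undo(): Swipe | undefined {
    return this.stack.pop();
  }
}

const history = new SwipeHistory();
history.record({ profileId: "abc123", direction: "left" });
console.log(history.undo()); // { profileId: "abc123", direction: "left" }

Putting this behind a paywall is a business decision layered on top of a trivial data structure, which is what makes the charge feel less like an engineering cost and more like a dark pattern.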

The Internet is littered with such “evil by design” examples. For a good list, visit Harry Brignull’s Dark Patterns website, darkpatterns.org (Figure 3), which documents manipulative UI design techniques that “do not have the user’s interests in mind.”

Figure 3. The Dark Patterns website documents the various evil designs that can be found on the Internet.

A few dark patterns featured on the site include:

  • Bait and Switch: The action of advertising goods that are an apparent bargain, with the intention of substituting inferior or more expensive goods.
  • Privacy Zuckering: The practice of “creating deliberately confusing jargon and user-interfaces that trick your users into sharing more information about themselves than they really want to,” according to Tim Jones of the Electronic Frontier Foundation. The name “Zuckering,” suggested by Jones’s followers on social media, apparently refers to Facebook and its founder Mark Zuckerberg.
  • Roach Motel: Something that is much easier to get into than to get out of, such as email newsletter subscriptions with a complicated process for unsubscribing.
  • Sneak into Basket: Adding items to a shopping cart that the user did not intend to select. Sometimes there’s an opt-out radio button or checkbox on another page that the user didn’t notice. (See the sketch after this list.)
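
Dark patterns like Sneak into Basket often come down to a single default. The following TypeScript sketch (hypothetical names, items, and prices; not any real retailer’s code) contrasts a checkout that slips an extra into the cart behind an opt-out with an honest version that adds it only on explicit opt-in:

// Hypothetical checkout logic illustrating Sneak into Basket.
interface CartItem {
  name: string;
  price: number;
}

const insurance: CartItem = { name: "Travel insurance", price: 14.99 };

// Dark pattern: the extra is added unless the user found and unticked
// a checkbox buried on another page.
function darkCheckout(cart: CartItem[], optedOut: boolean): CartItem[] {
  return optedOut ? cart : [...cart, insurance];
}

// Honest version: the extra is added only when the user explicitly asks.
function honestCheckout(cart: CartItem[], optedIn: boolean): CartItem[] {
  return optedIn ? [...cart, insurance] : cart;
}

const cart: CartItem[] = [{ name: "Flight", price: 199 }];
console.log(darkCheckout(cart, false).length);   // 2 -- insurance sneaked in
console.log(honestCheckout(cart, false).length); // 1 -- nothing added

The two functions differ only in which state requires user action, which is exactly why this pattern is so easy to slip into a design and so hard for users to notice.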

The Right Thing Is Not Easy to Decide

Determining the right thing to do is not always simple. Take deception, for example. On the face of it, you would likely agree that one must not deceive people. However, in his book Evil by Design, Chris Nodder cites two examples of deception having positive outcomes.

In the first example, he writes about children who are afraid of monsters, citing research from Liat Sayfan and Kristin Hansen Lagattuta of the Center for Mind and Brain at the University of California, Davis. They suggest that rather than reassuring children that there is no such thing as monsters, it might be more useful to stay within their imaginary world and give children a way to be powerful against monsters.

This, of course, is deception, as Nodder points out, even though it appears that the children are willing participants in the deception. Even the 4-year-olds in the study knew that monsters weren’t real, but coped better when resolution was framed in terms of the imaginary world. The willing complicity in the deception can lead to product opportunities: Monster Go Away! spray is a diluted mix of lavender oil in water, packaged specifically with monster banishing in mind (Figure 4).

Figure 4. Monster and Ghost Spray Away

Another example Nodder cites in his book involves Alzheimer’s patients who become distressed when their long-term memories conflict with their current situation. As dementia sets in, they are less able to remember current events. Richard Neureither, the director of the Benrath Senior Center in Dusseldorf, suggested that rather than fight this, a better tack is to “meet them in their own version of reality.”

Franz-Josef Goebel, chairman of the Dusseldorf Old Lions Benefit Society, proposed creating a fake bus stop outside of a care facility. Because the residents are free people, they cannot be locked up or restrained with drugs. Some can become violent when told they can’t leave. By walking the residents to the bus stop—a symbol many associate with returning to their home—staff give the residents a sense of accomplishment. Because their short-term memory is not sharp, the residents soon forget why they were waiting for the bus. After its success at two initial locations, the idea was repeated at care facilities in Munich, Remscheid, Wuppertal, Herten, Dortmund, and Hamburg.

Nodder argues that these scenarios are better alternatives than restraint or drugs because the children and Alzheimer’s patients responded favorably, and because the individuals were fully complicit in their own deception. So perhaps, Nodder concludes, persuasive techniques that use deception or appeal to subconscious motivations can have positive, even ethical, outcomes.

Conclusion

In 1999, Daniel Berdichevsky and Erik Neuenschwander of Stanford’s Captology lab proposed eight “Principles of Persuasive Technology Design.” These principles have since become a mainstay of persuasive design. Their eighth principle, the “Golden Rule of Persuasion,” says, “The creators of a persuasive technology should never seek to persuade a person or persons of something they themselves would not consent to be persuaded to do.” This guidance may help in determining what is right and what is wrong.

We can continue to debate what the right thing to do is, but we can arguably agree on this: when business needs are met while the user’s experience genuinely improves, that qualifies as the right thing to do. Perhaps part of our role as UX professionals is to start these discussions, stand up for the user, and find a way to meet both user and business needs without being deceitful.
