[greybox]
A review of
Future Ethics
by Cennydd Bowles
About this book
A good reference for Methods/How-To and Case Studies
Primary audience: Researchers, designers, and technical roles.
Writing style: Matter of fact
Publisher: NowNext Press (September 25, 2018)
Text density: Mostly text
206 pages, 9 chapters
Learn more about our book review guidelines
[/greybox]
Traditionally, ethics has been neither a particularly interesting nor an accessible topic for contemporary practitioners. Somewhere between the dusty discourse of Greek philosophers and the purely intellectual thought experiments in school, we lost our appetite for pondering moral implications. In his book, Cennydd Bowles revives the value of historical ethical arguments and presents them in a light both provocative and terrifying, relevant to all technologists.
The rise of big data has led to an explosion of regulation and moral debate, but Cennydd Bowles points out that ethical objections have accompanied every innovation. Bowles writes, “social media joins a rich canon of scourges: books, newspapers, and gramophones were all, in various centuries, linked to the certain downfall of social order.” However, the ethics of our present distinguishes itself from the past through the sheer volume and magnitude of consequences. Taught to build first and test later, design and development practitioners are painfully unaware of the ethical choices they make on a daily basis. Even worse, in our desire to simplify the user experience, we have created a gap so wide between the conceptual model and the underlying technology that our users are often oblivious to the ambiguous boundaries of data collection. Those who cannot perceive the mechanisms behind technology are powerless to act against it.
Commenting on the current landscape, Bowles describes two ideological camps that surround technology: instrumentalists and determinists. Instrumentalists argue that technology is neutral: just a tool that people can use for either good or harm. The opposing view, determinism, holds that technology is anything but neutral: it is so powerful that it shapes our society and culture. Technologists typically describe their goals with deterministic language but fall back on instrumentalist ideals when things go awry. To sum up the attitudes that define Silicon Valley, Bowles writes, “In other words, technology will change the world, but if the world changes, don’t blame us.”
Bowles offers a third approach—mediation theory—from tech philosopher Peter-Paul Verbeek. This theory combines the competing views of both instrumentalism and determinism. “We don’t fully control tech, nor does it fully control us; instead, humans and technologies co-create the world.” What this means for our field is that technology is neither inert nor separate from human action. We must apply ethics to technology just as we do to the decisions of our everyday lives.
The majority of the book expounds upon some of today’s greatest challenges, including inherent algorithmic bias, the power of persuasive technology, and the dystopias of surveillance, autonomous war, and a post-work future. Such challenges rose to prominence through carelessness or a failure to predict unintended consequences. This should no longer be acceptable to our society. Bowles proposes that “unintended does not mean unforeseeable. We can and must try to anticipate and mitigate the worst potential consequences.” Throughout the rest of the book, Bowles arms the audience with tools to help anticipate the unintended consequences of their choices.
One of these tools—a “provocatype”—is intended to inspire moral imagination and discussion. Provocatypes are not “good” design, but they spark better reactions than a hypothetical discussion would. The following renderings showcase a provocatype in action. This selected future features a public charging station (Figure 1) and ID cards (Figure 2) in an energy-scarce world.
Figure 1. Public charging station (Credit: Creative Commons).
Figure 2. Provocatype rendering of ID cards (Credit: Audrey Bryson).
The most interesting question surrounding this particular provocatype is not the charging mechanism but how an algorithm might prioritize energy when demand exceeds supply. Each user is given a card that reflects their position in society. For instance, emergency vehicles and doctors might be given priority, while a recently released offender might have a cap on their energy use. This allows us to imagine the potential good and harm that might occur if we allow algorithms to encode our social status into a digital ID.
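To make that prioritization idea concrete, here is a minimal sketch, not taken from the book, of what such an allocation policy might look like in code. The tier names, caps, and the allocate function are hypothetical; they simply illustrate the kind of value judgments the provocatype asks us to interrogate.

```python
# Hypothetical sketch of the provocatype's energy-prioritization idea.
# None of these rules come from the book; they only show how social status
# could quietly become an allocation policy once it is encoded in an ID.

from dataclasses import dataclass

@dataclass
class CardHolder:
    name: str
    tier: str            # e.g. "emergency", "medical", "standard", "restricted"
    requested_kwh: float

# Assumed priority order and per-tier caps: exactly the kind of judgments
# a provocatype is meant to surface for debate.
PRIORITY = {"emergency": 0, "medical": 1, "standard": 2, "restricted": 3}
CAPS_KWH = {"emergency": float("inf"), "medical": 50.0, "standard": 30.0, "restricted": 10.0}

def allocate(holders: list[CardHolder], supply_kwh: float) -> dict[str, float]:
    """Grant energy in priority order until the supply runs out."""
    grants: dict[str, float] = {}
    for h in sorted(holders, key=lambda h: PRIORITY[h.tier]):
        grant = min(h.requested_kwh, CAPS_KWH[h.tier], supply_kwh)
        grants[h.name] = grant
        supply_kwh -= grant
    return grants

if __name__ == "__main__":
    demand = [
        CardHolder("ambulance-7", "emergency", 40.0),
        CardHolder("dr_osei", "medical", 35.0),
        CardHolder("resident_a", "standard", 25.0),
        CardHolder("parolee_b", "restricted", 25.0),
    ]
    # The "restricted" card is capped and served last; with 90 kWh available it gets nothing.
    print(allocate(demand, supply_kwh=90.0))
```

Even this toy version makes the ethical stakes visible: a handful of constants decide who is warm and who is not, which is precisely the discussion the provocatype is designed to provoke.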
In the final chapters, Bowles discusses how designers, armed with newfound tools and awareness, can begin to navigate the ethical quagmires that inevitably surround their work. The author acknowledges that facilitating ethical discussions can seem daunting in some office environments. Determining what is morally right and standing up for it can be a difficult and lonely endeavor. Bowles writes the following:
“But choosing to be a more ethical professional will also make you a more moral person. There should be no divide between personal and professional ethics: thinking deeply about the right way to live will, with luck, get you closer to the life you want.”
The book ends with a call to action. Given the unprecedented challenges that the future will bring, we need thoughtful technologists now more than ever. We need people who are genuinely curious, inclusive, and determined to steer this industry towards a better future. Will you be one of them?
[bluebox]
An ethical awakening is long overdue. Technologists are rightly starting to question their influence on a world spiraling off its expected course, and as the industry matures, it is natural to pay attention to deeper questions of impact and justice. As sociologist Richard Sennett points out, “It is at the level of mastery […] that ethical problems of craft appear.” This focus coincides with growing public disquiet and appetite for ethical change. Consumers want to support companies that espouse clear values: 87% of consumers would purchase a product because a company advocated for an issue close to their hearts. Emerging technology raises the stakes further. Over the coming decades, our industry will ask people to trust us with their data, their vehicles, and even their families’ safety. Dystopian science fiction has already taught people to be skeptical of these requests; unless we tackle the ethical issues that are blighting the field, this trust will be hard to earn.
[/bluebox]