Two Looks at The Circle
by Dave Eggers
Alfred A. Knopf, 2013
Privacy, A Casualty of Technology
At the dawn of the current millennium, Sun Microsystems’ ever-glib CEO Scott McNealy was quoted as saying, “You have zero privacy anyway. Get over it.” We frustrated HCI workers were briefly unsettled by that line, but soon dismissed it as yet another challenge for the sales and marketing departments. Dave Eggers’s latest novel takes a more serious look at the emerging war between technology and privacy, provoking, for some of us, thoughts we perhaps should have had sooner.
If It Can Be Done…
The novel The Circle is the story of Mae’s career at a company called the Circle, a high-tech organization that bears an eerie resemblance to both Google and Facebook. She works in customer experience (or “CE”), a hybrid department of user experience and tech support. From a brief stop on the rungs of newbiehood, she ascends to become a major presence in the company, at one point responding to input from six screens lined up on the desk in her cubicle. The idyllic start of Eggers’s adventures with Mae begins to unravel through a series of sinister details, including one Circler’s prediction that “we will become all-seeing, all knowing.”
Privacy, Garrett Keizer’s non-fiction investigation of this topic, can help us understand more about the war between technology and privacy that is central to The Circle. Keizer says, “Technology is our ever-expanding ability to let nothing alone.” He goes on to consider privacy as an issue and quotes an array of other writers on the subject.
Keizer says, “The writer occupies a zone of extraordinary privacy—not only in the conditions necessary to write but also and frequently in the ancient sense of privacy as a form of privation.” He calls writing “the chance to work alone and unmolested while producing work of acknowledged public benefit….”
Those same conditions, however, are the environment of software engineering, where violations of privacy are easily realized.
Mystique or Competitive Advantage?
Years ago, I co-authored a book on how to write resumes, find a job, and so on. One of the chapters, “Your Mystique,” suggested the need to reveal only what’s relevant at the time. There’s no question of personal mystique in Mae’s life; it’s all relevant to the ubiquitous systems at The Circle.
As an intensely competitive individual, Mae achieves success through the performance metrics that rule both the professional and personal lives of Circlers. In fact, the two kinds of life are inseparable in this world, and cameras everywhere record both the achievements and the infractions of the players—capturing them in perpetuity on massive data storage from which nothing is ever erased.
The most mysterious character in The Circle is Mae’s lover Kalden, who is completely unknown to her co-workers but has absolute power over his encounters with Mae. His ironic character is anticipated by Keizer, who quotes Australian professor Bonnie S. McDougall: “privacy asserts power…and power confers privacy.”
The ultimate violation of privacy, a lesson learned by more than one politician, is the measurement of one’s human aptitudes by fellow humans. Circlers live in a world of constant personal analytics, which are rampant, relentless performance reviews.
Keizer again: “Privacy provides a zone of reflection and discussion in which gentler, less forward personalities can have some hope of making a contribution…temporary asylum to those who know themselves to be impressionable….” However, in a world where “friends” and “followers” are reduced to numbers of visits to a website, artificiality replaces sincerity.
Mae takes the measurements as a personal challenge. In time, she becomes jealous even of her mentor Annie’s success. Friendships and family members fail to survive Mae’s rush to the top; her personal scorecard includes points for ditching her family and friends.
So the story in The Circle also becomes an ethical quagmire, in large part because privacy has ceased to exist. When everything about you is known, everything about you is also permitted.
After reading The Circle, what is the eternally frustrated UX practitioner to do? Commit to addressing the issue of privacy continually in research with both users and designers, even when that means advocating against additional “features.” Learn more about the limits of privacy to be observed and how best to implement them.
Users Are More Important Than the Information We Collect About Them
Dave Eggers’s The Circle offers a cautionary tale about a world where information about users is more important than the users themselves. The world of The Circle is awash in information: productivity data to keep employees aware of how they’re performing; health data to inform them of their vital signs and statistics; community rankings to reflect their engagement; text messages, emails, and social media to keep them informed of news, gossip, critical client and personal information, and so on.
While this information is helpful in providing real-time visibility into an employee’s standing at work, well-being, and company involvement, at times it is prioritized above the employees themselves.
For example, Mae misses a few social events to visit her ailing father and to do some personal reflection. Because she is still a relatively new employee and not yet active in the company’s social media, her community engagement score suffers, and Human Resources rushes to investigate. The face-to-face meetings could have given HR insight into the “why” behind the score drop and the stressors Mae faces; instead, the metric is taken as a literal representation of Mae’s interest in being part of the Circle community.
This singular focus on a number distorts the company’s view of who Mae is and how she contributes to the community. Since the score considers only activities that can be tracked, such as participation in social media and attendance at events, Mae must over-participate in these to increase her ranking.
In addition to this explicit information, there is also the implicit information yet to be synthesized by Circle technologies. For example, video data captured via surveillance can be analyzed by anyone curious enough to seek it out. In fact, many of the Circle’s major innovations are technologies that bring together information in novel ways and analytics that measure previously intangible attributes such as children’s development in school.
These innovations are not without consequences. Mae’s mentor suffers a nervous breakdown upon learning the truth about her ancestors through PastPerfect, a tool that synthesizes historical data. This information was always available, but the Circle’s technology allowed her to derive meaning from it. Similarly, Mae’s ex-boyfriend is driven to his death while trying to escape a mob-like search party made possible by the Circle’s video surveillance technology, SeeChange.
While Mae appears to thrive on the information available to her, this is not realistic for many people. Human attention and our ability to process many sources of information simultaneously are limited. And because we are emotional creatures, it is difficult to predict how we will feel when we interact with information, especially personal or sensitive data.
Finally, perhaps the most provocative question The Circle raises is what to make of our reliance on metrics to gauge intangibles, in some cases to the detriment or distortion of the very attribute being assessed. Metrics are excellent tools for diagnosis and forecasting, but when analyzed out of context and without sensitivity to the larger picture, they can be more harmful than helpful.
Although The Circle is fictional, our current information environment is not so different. What lessons can we draw for managing interactions with information and the innovative technologies that produce it?
- Recognize that users will vary in their ability and desire to interact with multiple feeds of information. Technologies that synthesize multiple information sources should allow users to set preferences, and default settings should show the least amount of information necessary.
- It’s difficult to predict how users will feel when they interact with information, especially if it is personal or sensitive. Some form of warning or advisory should be present to let them know there is potential for harm or emotional distress.
- Giving users control over what they share and how companies and other people can use their information is critical. At the end of the day, the information belongs to the user.
- Leverage metrics, but don’t allow them to be the be-all and end-all of decision-making or understanding. To quote a line often attributed to Albert Einstein, “Not everything that can be counted counts, and not everything that counts can be counted.” Interpretation is a critical element of working with metrics, and it is best left to human reasoning. Using metrics alongside intuition and other qualitative inputs leads to richer, more nuanced assessments.
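The first and third recommendations above can be sketched in a few lines of code. Everything here, including the `SharingPreferences` class and its fields, is a hypothetical illustration of privacy-by-default and user-controlled sharing, not an API from the article or the novel:

```python
# Hypothetical sketch: defaults expose the least information necessary,
# and the user (not the company) opts categories in or out.
from dataclasses import dataclass, field


@dataclass
class SharingPreferences:
    # Privacy by default: every data category starts hidden;
    # the user must opt in explicitly.
    shared: set = field(default_factory=set)

    def opt_in(self, category: str) -> None:
        self.shared.add(category)

    def opt_out(self, category: str) -> None:
        self.shared.discard(category)

    def visible(self, record: dict) -> dict:
        # Release only the fields the user has chosen to share.
        return {k: v for k, v in record.items() if k in self.shared}


record = {"health": "resting HR 62", "location": "HQ", "rank": 1368}
prefs = SharingPreferences()
print(prefs.visible(record))  # nothing is shared by default
prefs.opt_in("rank")
print(prefs.visible(record))  # only the opted-in category appears
```

The design choice worth noting is the empty default: the burden of action falls on sharing, not on hiding.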
Interacting with technologies created by companies like the Circle, and with the information they produce, is not inherently dangerous. Keep the focus on users: safeguard their information, let them manage their own data, and recognize that they are more than the metrics and information that exist about them. Doing so protects both their well-being and the value they get from interacting with that information.
Retrieved from https://uxpamagazine.org/privacy-meets-technology/