In the old days of product development, more features equaled more value. In today’s design-driven economy, true value lies in identifying the customer’s objective and delivering on it with the fewest possible features. Take the iPod, a near-ubiquitous example of successful design: it’s hardly the most feature-rich MP3 player on the market, yet it delivers the greatest consumer-relevant value with a minimum of distraction. Indeed, too many features can be the very definition of distraction. I’ve raised eyebrows in the past by stating that “features are the enemy of good design.” But what exactly does that mean?
Whether you’re building iPods or financial software, your end customers’ needs (and, most importantly, the fulfillment of those needs) must drive your process to market. But understanding is merely the first step. Getting the product to market without losing its customer-centric focus is the real challenge. Many forces conspire to knock your product off course, and often there are hard choices to be made along the way. Many features must “fall away” so that more task-relevant features can shine. To succeed, you’ll need a solid commitment at all levels to letting your customers’ vision show up in the end product. This is, in its most basic sense, what user-centered design (UCD) is all about.
The basics of UCD have remained relatively unchanged for some time, but, thanks to advances in technology, the tools we use to accomplish it have changed dramatically. From best-in-class simulation and visualization technology to multiple usability labs connected over a network, we leverage these advances to design and architect applications that drive customer adoption, reduce expenses, and increase revenue. By focusing on understanding what our customers want to do with our products and then designing accordingly, we naturally end up with a product that they will adopt and use over and over again. This seems like a common-sense notion, but it is amazing how rarely it is applied, particularly in the financial services world.
In this article I’ll talk mostly about prototyping and application simulation, but there are plenty of other advances in UCD and user experience (UX) methodology. Online meeting tools like WebEx and Microsoft Live Meeting (or its older sibling NetMeeting) can be used for focus groups and cognitive walkthroughs. Card sorting now can be done on the Web instead of in person (for example, www.websort.net and www.optimalsort.com). And now you can set up usability labs over the network with tools from companies like Ovo Studios and TechSmith (Morae and UserVue).
Hi-fi Brings Clarity
At Washington Mutual (WaMu), we’ve built our success around a process that emphasizes rapid prototyping and testing with real people early and often. Creating high-fidelity simulations (using iRise Studio) allows us to put a working model of the application—one that’s virtually indistinguishable from the real thing—in front of everybody involved in the product development process, before doing any coding.
Of course, there are many ways to simulate an application or website (paper prototyping, HTML, Visio diagrams, Axure, Word docs, and PowerPoint, to name a few), but in WaMu’s case, two main factors led us toward a solution like iRise. First, our simulations had to be as close to the final product as possible; second, we needed to be able to model data flow as well as the user interface (UI) and navigation.
As user experience professionals, we have to be able to share not only the way an application is supposed to look, but also the way it’s supposed to behave as users interact with it. A high-fidelity model that is data-driven accomplishes this in a way that low-fidelity or static models simply cannot. The difference between the two is a bit like the difference between a movie script and a storyboard: if you can show the application functioning rather than describing it in text, people “get it” much faster and with less ambiguity.
Because we can simulate so rapidly and with so much clarity, we spend less time and money on basic definition. We get to focus on the details of quality and innovation—never “surprised” by what our technology teams deliver in the end. And, conversely, our technology teams don’t need to struggle to understand precisely what we want. This shared understanding is a key benefit of prototyping/simulation, and managed properly, it can foster a more collaborative and less antagonistic working environment.
Consensus on the Basics, Exploration of the Edge Cases
Our online projects involve a number of participants, from developers and designers to end users and members of the compliance and legal departments. All of these experts come with different points of view—which is precisely why they are brought into the project in the first place. The flip side of that expertise is that it is no easy task to overcome differences in experience, terminology, and worldview to build a common vision of the end product. It’s no surprise that this can generate a lot of complexity very quickly.
Simulation offers everyone involved in the specification process something that is both common (everybody looks at the same thing), and commonly understood (an actual behavioral depiction of the way the software should work). This quickly brings a degree of consensus that you just can’t get with traditional low-fidelity or text-only descriptions of the system.
So, what do you do with all of that leftover time? It depends upon your industry and business goals of course. At WaMu we’ve found that getting past the basics quickly allows us to focus on the details with an eye toward both quality and innovation. Once you start testing the “what if?” scenarios, you quickly discover the nooks and crannies—which is where the real work lies. Ultimately, every software project must confront this fact—that 80 percent or more of the definition of the experience lies in thornier issues: the edge cases. Without a solid foundation from which to begin, edge case definition can have the effect of eroding the underlying design and driving the project team toward confusion.
To drive out all of the edge cases, we often put prototypes in front of internal audiences that go beyond the core project team. Because iRise supports real-time collaboration, we can hold design meetings with people across the country without traveling to one location. And with iRise interactive documents (iDocs), we can distribute a self-contained copy of the simulation for feedback from remote stakeholders anywhere in the world. This gives us a far greater chance of catching the inevitable pitfalls without slowing things down.
The Benefits—and Drawbacks—of Convenience
Coded prototypes can be expensive in terms of time, money, and resources, and they are often thrown away and replaced by the final product. Application simulation avoids this costly process, but its very convenience can present drawbacks of its own.
For example, a high-fidelity model can quickly take on a life of its own. UX professionals have to become adept at explaining to a senior executive or stakeholder that the working, branded website they are viewing is not ready to launch. The point of prototyping is not to undercut the need for software coding and development, but rather to crystallize and document the desired outcome so that valuable effort from skilled resources is not wasted.
Another potential pitfall lies in the sheer number of records and decision points you can generate in the simulation process. Because a prototype can be changed so quickly in response to usability or other feedback, it can sometimes be too easy to do so. Knowing when it is okay to adjust based on the data still requires the skills of a usability professional, and weighing the implications of those changes requires an analyst and an information architect or interaction designer.
Application simulation doesn’t replace information architecture activities such as card sorting, contextual inquiry, or process mapping. Nor does it replace the user research activities of persona and profile development. In this sense, simulation is a tool (albeit a powerful one), and must be combined with individual and team skills to be effective.
Best Practices
We’ve learned a few lessons through our experience using application simulation at WaMu. First, we’ve learned that you have to be vigilant in keeping a “task” focus in order to prevent runaway requirements and diluted value. Likewise, we’ve also found that limiting the scope of your prototypes is important. Borrowing from the film industry, we often talk about “only building the sets we’re going to shoot on.” Last, we have come to understand the vital importance of communication with development and quality assurance teams about the artifacts and processes that application simulation generates.
Another important lesson is how best to make the inevitable compromises that arise during design and development. An ideal customer solution that never launches is of no use to anyone, and learning how to make these calls without losing your edge is a skill that every successful UX manager must cultivate.
More generally speaking, we’ve learned that application simulation is one of the best methods available for building a common understanding and consensus about the broad outlines of an application. Even in the simplest of applications, details and edge cases abound, and application simulation also helps in that regard. It forces you to walk through and discover those pesky details while minimizing the pain of hard coding (and recoding).
Moving from Features to Value
In the world of financial services, we’re still struggling to move from features to value. Our users aren’t on the WaMu site to enjoy and discover an array of new features; they have specific objectives or tasks that they want to complete as quickly as possible so they can get on with their lives. As designers, we have to make their experience painless and intuitive. Merely putting more features on the website won’t make that happen.
Our goal at WaMu is to make banking easier, but at some point you cross a threshold and start trying to make banking different. For example, people used to be unable to open accounts entirely online: at the end of the online application process, they still had to wait for a signature card and a whole pile of paperwork to arrive in the mail.
Now we have an application that lets you complete the entire process online (www.wamu.com/apply). This was a response, in part, to customers’ expectations that they’d be able to do virtually anything online from beginning to end very quickly—whether opening a bank account, opening an email account, or making a major purchase. The new application process is not only highly usable, but it has also quickly become a top generator of checking accounts online, beating out larger and more heavily funded competitors.
Today, we prototype nearly all of the projects we undertake. We’re constantly putting prototypes in the lab to test with our customers and our competitors’ customers. Sometimes we find that our products need a little tweaking, sometimes a lot. And, once in a while, we actually avoid investing in projects that don’t deliver real customer value—canceling them before a single line of code is written.
It’s a model of constant adaptation, which is exactly what’s required to ensure a competitive advantage. For my team at WaMu, it means adapting to changes and developments not only in our industry, but also in business and technology in general. Users don’t live in a vacuum, comparing your website only to your competitors’. They compare your site to every other site on the Web, and they weigh every minute they spend with you against every minute they could spend doing something else. End users build their expectations based on their whole lives, not just when they’re standing in line at a bank or visiting a website. If you want real business results, you must design accordingly.