
Zooming In and Zooming Out: Real-life Customer Experience Research

When I joined Wells Fargo in 2003, user research and market research sat in different areas of the Internet Services Group (ISG). Occasionally we would present our work at one another’s team meetings, but in general we didn’t coordinate our studies or have a good idea about what the other group was doing. In 2005, ISG reorganized, and the market research and user research teams were brought together under the umbrella of Customer Experience Research & Design (CERD). This shuffling made sense on many levels; now we could benefit from and influence each other’s research, and help our clients in ISG make better decisions based on a more holistic view of the customer.

On the whole, the advantages of integrating market and user research far outweigh the challenges, but we still grapple with them as we strive to deliver integrated insight.

Market and User Research—the Same, but Different

Both market research and user research groups think of themselves as user and customer advocates, and both practices employ a variety of methods to find answers about what people want and how they behave. But there are some general points of divergence between the two approaches, both in how studies are designed and conducted and in how the results are used in business.

Market research typically collects data across large numbers of people and from a distance, as in a survey emailed to participants. The researchers attempt to answer questions—through what people say—about attitudes and preferences across specific groups or populations. The results of market research are judged by their statistical validity and they are used to inform messaging, marketing strategy, and business priorities.

User research typically delves deeper into the behavior of a smaller number of people, gathering data in-person, and often in context. Researchers attempt to reveal what people actually do—as opposed to what they say or aspire to do. The results are used to evaluate the usefulness and usability of a product, service, or experience, and studies are judged by the quality of research, design, participant recruiting, analysis, and impact on the final design.

At the core, market and user research emerge from differing worldviews (Gilmore, D., “Understanding and Overcoming Resistance to Ethnographic Design Research,” Interactions, May/June 2002, pp. 29-35). Market research is concerned “with validating a list of needs and sizing the market associated with each, whereas designers need to understand how the product or service is going to fit into someone’s life.” Both types of research are valuable for assessing the viability of a product or service, but territorial disputes can erupt when the line between business decisions and design decisions blurs, as it does when delivering business value depends on providing the right user experience.

Knee Deep in Ambiguity

When functional areas are inarguably distinct, there is a clear separation between spheres of expertise; there is little need to negotiate who owns what because it is obvious to both parties. But when disciplines are as close as market and user research in what they do and how their work is used, knowledge is harder to broker or translate for one another. We each have a stake in the result, and we each have a slightly different point of view. This overlap, and the tension that results, is territory that must be carefully negotiated to integrate the strengths and insights of both approaches.

Large corporations are mostly made up of people who have more familiarity with—and perhaps more trust in—traditional market research approaches, which speak the language of numbers.

It’s important to remind clients that even though numbers seem objective and definitive, they don’t automatically mitigate ambiguity or risk. For instance, business cases rest on assumptions about the percentage of people who are likely to adopt a service or product, but they say nothing about the probability that adopters will abandon it if use does not meet their initial expectations. And the likelihood of abandonment depends on how well the product or service fits into someone’s real-life goals, habits, and routines.
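
A small worked example makes the gap concrete. The numbers below are invented for illustration, not drawn from any actual business case; the point is only that an adoption estimate alone overstates sustained use when abandonment is left out of the model:

    # Hypothetical illustration: all figures are invented.
    # A business case built on adoption alone can overstate
    # sustained use when abandonment is ignored.

    customers = 1_000_000
    adoption_rate = 0.20      # assumed: 20% of customers say they would adopt
    abandonment_rate = 0.50   # assumed: half of adopters quit when the
                              # product doesn't fit their real-life routines

    adopters = customers * adoption_rate
    sustained_users = adopters * (1 - abandonment_rate)

    print(f"Projected adopters: {adopters:,.0f}")         # 200,000
    print(f"Sustained users:    {sustained_users:,.0f}")  # 100,000

Under these assumed figures, half of the projected business value evaporates, and nothing in the adoption survey would have predicted it.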

Language also plays an important role in shaping how we think, judge, and analyze. Quantitative research terms like confidence intervals, statistical significance, and validity help people feel that they are reducing ambiguity. In contrast, qualitative research terms such as open-ended interviews, unstructured questions, interpretive anecdotes, stories, and themes can instill uncertainty and doubt. These different vocabularies may help explain why some business people distrust qualitative results and prefer quantitative approaches.

What can get lost in market research’s promise to decrease, rather than embrace, ambiguity is the experiential significance of individual stories and context, and the validity of human intuition. Embracing the ambiguity of human decision making and behavior actually helps remove the guesswork from defining new product and experience concepts; it grounds us in people’s real-life reactions and experiences. Qualitative research allows us to explore what we don’t know, and this can be much less risky than designing a full-blown quantitative survey early in the product development cycle, because a survey, once deployed, cannot be tweaked in situ to follow emerging stories, themes, and hunches. Qualitative methods such as participatory design and usability testing, on the other hand, do support such exploration.

Once we’ve defined key user needs, behaviors, tasks, workflows, motivations, and emotions through in-depth qualitative research, we can validate these findings, measure drivers of and barriers to adoption, determine pricing models, and track satisfaction over time with quantitative research. Integrating all of these inputs is informed innovation: product concepts that are directly related to user needs and in line with business goals.

Zooming In and Zooming Out

One key to mitigating ambiguity and leveraging the strengths of each discipline is to learn to zoom in and zoom out on the area of inquiry effectively. For instance, in data collection:

  • Market research zooms out to discover answers about broad groups of people, but has to do this by zooming in on very specific questions to control the variables under investigation.
  • User research zooms in on the intricate, often tacit and mundane behaviors of just a handful of people, but also zooms out to ask broader, more open-ended questions. We have a lot of hypotheses going into the field, but these might be disproved through observing real behavior.

It takes sophistication on the part of a team, management, and an organization to calibrate and integrate insights from both approaches. Doing so requires sustained communication, openness, and coordination between the two research groups, as well as with our clients.

Here is an example: A market research study surveyed customers’ budgeting and tracking needs. The survey asked, “Is this something you want to do online?” The answer was a resounding “Yes.” When user research conducted ethnographic research on the same topics—tracking and budgeting—it became clear that these were distinct activities and, while everyone tracked their balances and transactions constantly, most people did not want to do any real work with budgets. If we had conducted the market research survey after the ethnography instead of in parallel, we would have separated “tracking” from “budgeting” to avoid combining these two different financial activities. In the end, the team realized the experiential validity of the ethnographic research and concentrated on designing tools that made budgeting automatic, such as My Spending Report.

[Image: an online spreadsheet of expenditures]
“My Spending Report,” a product of combined user and market research.

Multi-disciplinary Integration

Early-stage research generates new ideas, possibilities, and innovations. This is where the multi-disciplinary team can use their complementary skills to best advantage. The trick becomes employing the correct discipline and method for the right question and then interpreting the results in this context of combined strengths. For instance, usability testing is typically conducted with six to twelve users. Increasing sample sizes to provide better quantitative data may not be warranted if the purpose of the testing was to validate design direction and iterate. On the other end of the spectrum, behavioral metrics, adoption data, and satisfaction scores are tracked quantitatively to measure success, drive improvements, and evaluate customer experience impact.
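
As a rough illustration of why six to twelve users can be enough for iterative testing, the widely cited problem-discovery model from Nielsen and Landauer estimates the proportion of usability problems found by n users as 1 - (1 - p)^n, where p is the probability that a single user uncovers a given problem. A minimal sketch follows; the value p = 0.31 is the commonly quoted average, an assumption rather than a universal constant:

    # Sketch of the Nielsen/Landauer problem-discovery model.
    # Assumes each test user independently uncovers a given problem
    # with probability p; p = 0.31 is the commonly cited average,
    # an assumption, not a constant of nature.

    def proportion_found(n: int, p: float = 0.31) -> float:
        return 1 - (1 - p) ** n

    for n in (3, 6, 9, 12):
        print(f"{n:2d} users -> {proportion_found(n):.0%} of problems found")

By this model, six users already surface roughly 90 percent of the problems a design will reveal, which is why larger samples rarely pay off when the goal is simply to validate direction and iterate.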

As a bank, we know we have tons of data. What we want is insight, which comes from synthesis and integration. The traditional question is “How do you get from data to knowledge?” Our revised question is, “How do you marry the strengths of the two research approaches to form a more holistic understanding of customer behavior and needs?” Insight is fueled by openness between user and market research practices as well as between product managers and research. It’s helpful when product managers engage both teams at the same time so we can collaborate to design the most efficient, yet holistic, research approach.

Aside from all the hours of discussion—both structured and informal—two tools have proved valuable in moving us toward a combined approach: customer profiles (like Cooper’s “personas”) and task models.

  • Researchers frequently use three retail customer profiles and five small business profiles to guide design decisions.
  • We develop task models comprising all the tasks a person performs to manage their finances. We build the model on a foundation of ethnographic research and design modeling that we later validate quantitatively through market research. We now know how frequently people perform the tasks, how important the tasks are, and whether people want to be able to do them online. The answers determine how much real estate each task gets on the web page and the priority of projects overall; they even help us determine the ROI of design (see the sketch after this list).
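
To make that concrete, here is a hypothetical sketch of how task-model data might translate into priority. The tasks, scores, and weighting formula are all illustrative assumptions, not our actual model:

    # Hypothetical sketch: task-model data driving page real estate
    # and project priority. Tasks, scores, and the weighting formula
    # are illustrative assumptions, not the actual model.

    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        frequency: float     # 0-1: how often customers perform the task
        importance: float    # 0-1: how important customers say it is
        wants_online: float  # 0-1: share who want to do it online

        @property
        def priority(self) -> float:
            # Multiplicative score: a task must rate well on all three
            # dimensions to earn prominent placement.
            return self.frequency * self.importance * self.wants_online

    tasks = [
        Task("Check balances",      0.95, 0.90, 0.95),
        Task("Review transactions", 0.85, 0.85, 0.90),
        Task("Set up a budget",     0.20, 0.50, 0.40),
    ]

    for t in sorted(tasks, key=lambda t: t.priority, reverse=True):
        print(f"{t.name:22} priority = {t.priority:.2f}")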

Evolving our customer-centric tools forms the basis for mutual understanding. Each project challenges us to find the optimal moments for integration. This requires that we stay involved in each other’s practices: learning from, listening to, and respecting one another, and considering how each approach contributes to a richer, more distinct, and more accurate view of our customers.