What do you do when you need to show someone in your organization, perhaps a skeptic in upper management, that the user experience of your website directly impacts the bottom line? While an eight- to ten-person, lab-based usability study or focus group can yield a list of key usability defects or areas for improvement, neither truly demonstrates the business impact of user experience improvements. To persuade the skeptics, you generally need a methodology that uses large sample sizes and reveals clear differences in the bottom line before and after a redesign. The two case studies presented in this article share such a methodology.
La Quinta
Situation Overview
La Quinta is a limited-service hotel chain that, at the time of the project, had more than 370 properties in thirty-three American states. Online bookings at www.LQ.com became increasingly important to La Quinta as the Internet grew, and website managers realized that customer loyalty to LQ.com had become a key component of La Quinta’s profitability.
The website management team needed to understand the behavior of their site visitors and identify opportunities to increase brand loyalty and online bookings. La Quinta hired Usability Sciences to perform a website assessment and established the following goals:
- Determine who is visiting the website
- Establish primary visitor intent
- Establish site awareness
- Measure site visitor success and satisfaction
- Measure brand affinity
- Measure likelihood to return
- Capture visitor-suggested changes to the website
- Improve each site visitor’s experience
Research Solution
Usability Sciences deployed WebIQ™ to capture site visitors’ demographic and attitudinal measures through an online survey, along with their click-stream data. Visitors saw a survey at the beginning of their visit and responded with information about their demographics, their purpose for visiting the site that day, and how they had found the site in the first place. Visitors then used the site, uninterrupted, however they wished, while their click-stream data was captured in the background.
At the conclusion of their visit (when they navigated away from LQ.com or closed the browser), visitors answered a series of questions regarding the success of their visit, their brand affinity, their next intended action, and their suggested changes to the site. The project ran for thirty days. A random sampling algorithm invited approximately one out of every three actual site visitors to participate, and 885 responses were collected.
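As a rough illustration of how such per-visit random sampling works, the following minimal Python sketch invites roughly one in three visitors. The one-in-three rate comes from the study, but the function name and the per-visit coin flip are assumptions for illustration, not details of the actual WebIQ implementation.

```python
import random

def should_invite(sampling_rate: float = 1 / 3) -> bool:
    """Decide, independently for each visit, whether to show the intercept survey."""
    return random.random() < sampling_rate

# Simulate a month of traffic to see roughly how many visitors receive an invitation.
visits = 9_000
invited = sum(should_invite() for _ in range(visits))
print(f"Invited {invited} of {visits} visitors (~{invited / visits:.0%})")
```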
After data collection, we segmented the responses and click-stream data by user type, satisfaction with the site, visit intent, and visit success, which allowed us to recommend a variety of changes to the website that would increase visitor success. For example, first-time visitors who rated their visit a success were much more likely to return to the site than those who rated their visit unsuccessful. The study identified an opportunity to improve conversion rates by addressing the issues these visitors raised, such as:
- Members of La Quinta’s loyalty program, Returns, who had initially joined offline at a hotel had no option for obtaining online account access.
- Each time site visitors wished to view a hotel rate, they had to reselect their date and preferences. (The site did not remember visitors’ previously entered preferences.)
- Visitors frequently requested more pictures of the hotel on each property’s details page.
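The kind of segmentation analysis that surfaced findings like these can be illustrated with a small Python sketch using pandas. The column names and values below are invented for illustration and are not the study’s data; the only assumption is that exit-survey responses can be arranged as one row per participant.

```python
import pandas as pd

# Hypothetical exit-survey responses: one row per participant; column names are assumed.
responses = pd.DataFrame({
    "visitor_type":  ["first-time", "first-time", "repeat", "first-time", "repeat"],
    "visit_success": [True, False, True, True, False],
    "will_return":   [True, False, True, True, True],
})

# Segment first-time visitors by whether they rated the visit a success,
# then compare each segment's stated likelihood to return.
first_time = responses[responses["visitor_type"] == "first-time"]
return_rate_by_success = first_time.groupby("visit_success")["will_return"].mean()
print(return_rate_by_success)
```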
Results
Over the ensuing eight months, La Quinta implemented site enhancements based on the research. Then the same methodology was repeated, with the objective of measuring the impact of the site enhancements. Using a random sampling algorithm, approximately one in eight site visitors was invited to respond to the same question set, and 933 responses were collected.
Analysis of the data collected during the second round demonstrated substantial improvement. When participants’ responses from round two of the research were compared to those from round one, every metric of success and satisfaction on LQ.com rose considerably. The most significant user experience gains for La Quinta were:
- Success improved by 48 percent
- Satisfaction improved by 28 percent
- Likelihood to return improved by 17 percent
- Brand affinity improved by 50 percent
Translating these user experience metrics into bottom-line dollars, however, was what mattered most to La Quinta management. During the same time period, marketing campaigns drove an increase in overall site traffic, which by itself would lead to an increase in revenue. Sufficient sample sizes in each round of the online research allowed us to generalize the observed success rates to the overall population of site visitors.
To determine the revenue growth that could be attributed to visitors being more successful in making reservations on the site, we followed a three-step process:
- We assumed that if there had been no user experience improvements to the site, the success rate of reservation-seekers would have been the same during the second round of the research as it had been during the first round.
- Using the first-round success rate, we calculated the number of reservation-seekers who would likely have booked during the second round if there had been no user experience improvements to the site. Multiplying this number by the average value of an online booking allowed us to estimate the revenue generated by marketing efforts driving more visitors to the site.
- Subtracting the revenue generated by marketing efforts from the total revenue in round two gave us a number that could be compared to the revenue generated for the same period during round one.
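To make the three steps concrete, here is a small Python sketch with entirely hypothetical numbers; none of these figures are La Quinta’s actual data. It also assumes that the revenue attributed to marketing efforts in the third step means the increase over round-one revenue that the larger traffic base would have produced at the old success rate.

```python
# Hypothetical worked example of the three-step attribution above.
# All figures are illustrative placeholders, not La Quinta's actual data.

round1_success_rate = 0.40        # share of round-1 reservation-seekers who booked (assumed)
round1_revenue = 1_000_000.0      # round-1 booking revenue for the period (assumed)

round2_seekers = 30_000           # round-2 visitors who came to make a reservation (assumed)
round2_revenue = 1_500_000.0      # actual round-2 booking revenue (assumed)
avg_booking_value = 100.0         # average value of one online booking (assumed)

# Steps 1-2: at the old (round-1) success rate, estimate the bookings and revenue
# that the larger, marketing-driven round-2 traffic would have produced anyway.
baseline_bookings = round2_seekers * round1_success_rate
baseline_revenue = baseline_bookings * avg_booking_value          # 1,200,000
marketing_driven_growth = baseline_revenue - round1_revenue       # 200,000

# Step 3: remove the marketing-driven portion from actual round-2 revenue,
# then compare what remains to round-1 revenue to isolate the growth
# attributable to the user experience improvements.
ux_adjusted_revenue = round2_revenue - marketing_driven_growth    # 1,300,000
ux_attributed_growth = (ux_adjusted_revenue - round1_revenue) / round1_revenue
print(f"Growth attributable to UX improvements: {ux_attributed_growth:.0%}")  # 30%
```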
We determined that, due to user experience improvements, LQ.com saw a year-over-year revenue growth of 83 percent. Other branded websites within the industry saw a growth of 33 percent for the same time period.
American Heart Association
Situation Overview
The American Heart Association (AHA) is a leading non-profit institution for education and research on heart disease and stroke. AHA’s online donation site, like those of most large non-profits, has become an increasingly vital part of the organization. AHA needed to understand how well visitors could use the online donation process, and AHA management was concerned about the percentage of site visitors who entered the online donation section of the site but did not complete the donation process. So AHA hired us to investigate how the site was being used and to look for ways to improve its design and functionality.
We designed a research project with the following objectives:
- Determine the type of individuals visiting the donation section of the website
- Understand the behavior of donation-section visitors and what contributes to successful or unsuccessful completion of the donation process
- Document problem areas for the donation section
- Validate design and functional changes before the development and launch phases, to gauge whether they would reduce abandonment and failure rates
Research Solution
Using a method similar to the La Quinta study, we deployed our online research solution to capture demographic and attitudinal measures from an online survey, as well as click-stream data. As visitors began their visit to the site, they responded to survey questions about their demographic profile, their purpose for visiting the site that day, and their past donation history. Visitors then used the site uninterrupted, however they wished, with their click-stream data captured in the background.
At the conclusion of their visit, visitors answered a series of questions about the success of their visit, their satisfaction with the site, their reasons for not making a donation (if applicable), and their suggested changes to the site. During the sixty-day implementation, every site visitor was invited to participate in the project, and 738 responses were collected.
Afterward, we examined the data, segmenting responses and click-stream data by user type, satisfaction with the site, visit intent, and visit success. Based on the data, we recommended changes to the website to increase donations. One such recommendation was that participants needed more flexibility in the donation process: for example, an acknowledgement via email, the ability to customize the acknowledgement card, the ability to donate in the name of a company rather than an individual, and the ability to enter a non-US address.
Five specific areas for improvement in the online donation section of the website informed a high-fidelity prototype, which was then tested in a lab-based usability study. Compared with the original design, the prototype’s donation process had half the number of pages, took half the time to complete, and left participants feeling better about donating.
Results
AHA implemented the recommendations from the online research and the usability lab test during a period of “donor fatigue” that followed several natural disasters in a relatively short period of time, including the Indian Ocean tsunami and Hurricanes Katrina and Rita. During this same period, other charitable organizations saw a decrease in donations, and AHA conducted no marketing or promotional campaigns to increase donations or drive more visitors to the site. Even in this climate, AHA saw the following results as soon as the redesigned site went live:
- 60 percent year-over-year increase in online donations
- Increase in the number of monthly donors
- Increased average gift per donor
- Improved visitor satisfaction with the online donation process
- Increased likelihood to donate again
- Increased likelihood to recommend donating to AHA online to others
AHA management also gained a higher appreciation for user research and user-centered design.
Ongoing Analysis
This research methodology has become an integral part of both La Quinta’s and AHA’s ongoing website enhancement process. The WebIQ solution helps both organizations prioritize website improvement efforts and then measure the impact of those improvements. Both also use the results from online research to build and enhance prototypes that they test in lab-based studies. The process of measuring, understanding problems, making improvements, and then measuring the impact of those improvements is the foundation of improving site visitor success and, thus, the organizations’ bottom lines.