The Magazine of the Usability Professionals' Association
By Michelle R. Peterson, Ph.D.
What do you do when you need to show someone in your organization, perhaps a skeptic in upper management, that the user experience of your website directly impacts the bottom line? While an eight-to-ten-person lab-based usability study or focus group can yield a list of key usability defects or areas for improvement, neither truly demonstrates the business impact of user experience improvements. To persuade the skeptics, one generally needs a methodology that uses large sample sizes and lets stakeholders see clear differences in the bottom line before and after a redesign. The two case studies presented in this article share such a methodology.
La Quinta is a limited-service hotel chain that had over 370 properties in thirty-three American states at the time of the project. Online bookings (at www.LQ.com) became increasingly important to La Quinta as the Internet grew, and website managers realized that customer loyalty to LQ.com had become a key component of La Quinta’s profitability.
The website management team needed to understand the behavior of their site visitors and identify opportunities to increase brand loyalty and online bookings. La Quinta hired Usability Sciences to perform a website assessment and established the following goals:
Usability Sciences deployed WebIQ™ to capture site visitors’ demographic and attitudinal measures via an online survey, along with click-stream data. Visitors saw a survey at the beginning of their visit and provided information about their demographics, their purpose for visiting the site that day, and how they had found the site in the first place. Visitors then used the site, uninterrupted, however they wished, while their click-stream data was captured in the background.
At the conclusion of their visit (when they navigated away from LQ.com or closed the browser), visitors answered a series of questions regarding the success of their visit, their brand affinity, and their next intended action. They also suggested changes to the site. The project ran for thirty days. A random sampling algorithm invited approximately one out of every three actual site visitors to participate, and 885 responses were collected.
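The random-invitation mechanism described above can be sketched as follows. This is a minimal illustration, not La Quinta’s actual implementation; the `should_invite` helper and the exact one-in-three rate are assumptions drawn from the text.

```python
import random

INVITE_RATE = 1 / 3  # approximate share of visitors invited (assumption from the text)

def should_invite(rng: random.Random) -> bool:
    """Decide at session start whether this visitor sees the entry survey."""
    return rng.random() < INVITE_RATE

# Simulate a month of traffic to confirm the expected invitation share.
rng = random.Random(42)
visitors = 100_000
invited = sum(should_invite(rng) for _ in range(visitors))
print(round(invited / visitors, 2))  # expected to be close to 0.33
```

Deciding the invitation once, at session start, is what keeps the sample a true random subset of all visitors rather than a self-selected group.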
After data collection, we conducted a comprehensive examination of the data, segmenting the responses and click-stream data by user type, satisfaction with the site, visit intent, and visit success. We were then able to recommend a variety of changes to the website to increase visitor success. For example, first-time visitors who rated their visit a success were much more likely to return to the site than those who rated their visit unsuccessful. The study identified an opportunity to improve conversion rates by addressing the issues raised by these visitors, such as: (bulleted list)
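The segmentation step can be illustrated with a small sketch. The field names and sample records below are hypothetical; they only show the shape of the analysis, grouping responses by user type and visit success and comparing return intent across groups.

```python
from collections import defaultdict

# Hypothetical survey records: (user_type, rated_visit_successful, intends_to_return)
responses = [
    ("first-time", True, True),
    ("first-time", True, True),
    ("first-time", False, False),
    ("repeat", True, True),
    ("repeat", False, False),
]

# Group responses by (user type, visit success).
groups = defaultdict(list)
for user_type, success, returns in responses:
    groups[(user_type, success)].append(returns)

# Compare the rate of return intent across segments.
for (user_type, success), intents in sorted(groups.items()):
    rate = sum(intents) / len(intents)
    print(f"{user_type} / success={success}: {rate:.0%} intend to return")
```

The same grouping logic extends to any of the segmentation dimensions mentioned above, such as visit intent or satisfaction.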
Over the ensuing eight months, La Quinta implemented site enhancements based on the research. The same methodology was then repeated to measure the impact of those enhancements. Using a random sampling algorithm, approximately one in eight site visitors was invited to respond to the same question set, and 933 responses were collected.
Analysis of the data collected during the second round demonstrated substantial improvement. When participants’ responses from round two of the research were compared to those from round one, every metric of success and satisfaction on LQ.com had improved considerably. The most significant user experience changes to the site for La Quinta were:
Translating these user experience metrics into bottom-line dollars, however, was what mattered most to La Quinta management. During the same time period, marketing campaigns drove an increase in overall site traffic, which on its own would lead to an increase in revenue, so the effect of the redesign had to be separated from the effect of the added traffic. Sufficient sample sizes in each round of online research allowed us to generalize the success rates in the research to the overall population of site visitors.
To determine the revenue growth that could be attributed to visitors being more successful in making reservations on the site, we followed a three-step process:
We determined that, due to user experience improvements, LQ.com saw a year-over-year revenue growth of 83 percent. Other branded websites within the industry saw a growth of 33 percent for the same time period.
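The generalization behind this kind of attribution can be illustrated with a hypothetical calculation. Every figure below is invented for illustration only; none comes from the La Quinta study. The idea is to estimate what revenue would have been with the new traffic levels but the old success rate, and treat the remainder as attributable to the improved visitor experience.

```python
# Hypothetical figures, assumed for illustration only.
visitors_round1 = 500_000      # annual site visitors before the redesign (assumed)
visitors_round2 = 650_000      # visitors after marketing-driven traffic growth (assumed)
success_rate_round1 = 0.40     # share of visitors completing a booking, round one (assumed)
success_rate_round2 = 0.52     # improved success rate after the redesign (assumed)
avg_booking_value = 100.0      # average revenue per completed booking (assumed)

revenue_round1 = visitors_round1 * success_rate_round1 * avg_booking_value
revenue_round2 = visitors_round2 * success_rate_round2 * avg_booking_value

# Counterfactual: round-two traffic at the round-one success rate.
revenue_traffic_only = visitors_round2 * success_rate_round1 * avg_booking_value

# Growth beyond the counterfactual is attributed to improved visitor success.
attributed_to_ux = revenue_round2 - revenue_traffic_only
print(f"Growth attributable to improved visitor success: ${attributed_to_ux:,.0f}")
```

Under these invented numbers, the counterfactual isolates the traffic effect so that the remaining growth can be credited to the user experience work.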
The American Heart Association (AHA) is a leading non-profit institution for education and research on heart disease and stroke. AHA’s online donation site, like those of most large non-profits, has become an increasingly vital part of the organization. AHA needed to understand how well visitors could use the online donation process; in particular, management was concerned about the percentage of site visitors who entered the donation section of the site but did not complete a donation. They hired us to investigate how the site was being used and to look for ways to improve its design and functionality.
We designed a research project with the following objectives:
Using a method similar to the La Quinta case, we deployed our online research solution to capture demographic and attitudinal measures from an online survey, as well as click-stream data. As visitors began their visit to the site, they answered survey questions about their demographic profile, their purpose for visiting the site that day, and their past donation history. Visitors then used the site uninterrupted, however they wished, with click-stream data captured in the background.
At the conclusion of their visit, visitors answered a series of questions about the success of their visit, their satisfaction with the site, their reasons for not making a donation (if applicable), and their suggested changes to the site. During the sixty-day implementation, every site visitor was invited to participate in the project, and 738 responses were collected.
Afterward, we examined the data, segmenting responses and click-stream data by user type, satisfaction with the site, visit intent, and visit success. Based on the data, we recommended changes to the website to increase donations. One such recommendation was to give participants more flexibility in the donation process, such as an acknowledgement via email, the ability to customize the acknowledgement card, the ability to donate in the name of a company rather than an individual, and the ability to enter a non-USA address.
Five specific areas for improvement of the online donation section of the website were used to build a high-fidelity prototype, which was then tested in a lab-based usability study. Compared with the original design, the prototype’s donation process had half as many pages, took half as long to complete, and left participants feeling more positive about donating.
AHA implemented the recommendations from the online research and the usability lab test during a period of “donor fatigue” that followed several natural disasters in a relatively short span of time (including the Indian Ocean tsunami and Hurricanes Katrina and Rita). During this same period, other charitable organizations saw a decrease in donations, and AHA conducted no marketing or promotional campaigns to increase donations or drive more visitors to the site. Even in this climate, the following results appeared as soon as the redesigned site went live:
AHA management also gained a higher appreciation for user research and user-centered design.
This research methodology has become an integral part of both La Quinta’s and AHA’s ongoing website enhancement process. The WebIQ solution helps both businesses prioritize web improvement efforts and then measures the impact of those improvements. Both organizations also use the results from online research to build and enhance prototypes that are tested in lab-based studies. The process of measuring, understanding problems, making improvements, and then measuring the impact of those improvements is the foundation of improving site visitor success, and thus the businesses’ bottom lines.
Michelle R. Peterson is an online experience project manager at Usability Sciences. She has more than thirteen years of experience conducting both quantitative and qualitative behavioral research. Prior to 2003, she worked as a usability engineer at Microsoft. Michelle received her Ph.D. in cognitive psychology from the University of Florida.
User Experience Magazine is by and about usability professionals, featuring significant and unique articles dealing with the broad field of usability and the user experience.
This article was originally printed in User Experience Magazine, Volume 6, Issue 2, 2007.
© Usability Professionals' Association
Contact UPA at http://www.usabilityprofessionals.org/about_upa/contact_upa.html