User Experience: The Magazine of the Usability Professionals' Association

Using Quick Surveys for Website Task Analysis

By Catherine Brys and Catherine Gaddy

End-user task analysis data are critical in designing or redesigning websites. Site sponsors need to know what tasks and goals users have in mind when they visit the site. Numerous methods are available to gather user task data, including online surveys, web metrics, in-person interviews, field studies, and usability testing.

We have had experience with clients who needed to take the “economy class” approach, at least at first. In this case, quick online surveys were the most practical way to collect task data from users. In this article, we describe how you can make high-quality online surveys.

We draw on a number of examples: a recent quick poll conducted by the first author for a UK university library, and several quick surveys, some drawn from the second author’s work for the National Institutes of Health (NIH).

We use the term "quick poll" rather than "quick survey" when there are only two or three questions.

When to Use Quick Surveys

Quick online surveys serve as a task analysis tool for websites. They can be useful both during analysis and design and for evaluation and ongoing monitoring, as the sections below describe.

By task analysis we mean gaining insight into the tasks users want to carry out on the website. These tasks can be interactions, such as renewing a library book or ordering a publication, or they can be information-seeking tasks.

Analysis and Design

Quick surveys are a low-cost, low-barrier way of gathering a large amount of data on users' tasks. At Glasgow University, we used a quick poll on the home page with just two questions, and gathered data over several months.
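As a rough illustration of how low the technical barrier can be, here is a minimal sketch of a two-question poll endpoint in Python with Flask; the route, field names, and CSV storage are our own hypothetical illustration, not the setup actually used at Glasgow.

```python
# Minimal two-question quick poll endpoint (hypothetical sketch;
# not the actual Glasgow University implementation).
import csv
from datetime import datetime, timezone

from flask import Flask, request

app = Flask(__name__)

@app.route("/quick-poll", methods=["POST"])
def quick_poll():
    # Q1 (open): "Why are you visiting the library website today?"
    # Q2 (closed, optional): "Are you staff, student, or visitor?"
    task = request.form.get("task", "").strip()
    user_type = request.form.get("user_type", "")  # optional, may be empty

    # Append one anonymous, timestamped row per response.
    with open("poll_responses.csv", "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), task, user_type]
        )
    return "Thank you for your feedback!"

if __name__ == "__main__":
    app.run()
```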

We used an eight-question quick survey as input for the redesign of the Office of Cancer Complementary and Alternative Medicine website. Users were asked the reason for visiting the site that day as well as why they normally used the site.

(Figure: the eight survey questions used on the OCCAM website)

Evaluation and Monitoring

Quick surveys can also be useful to evaluate the success of a redesign, or to monitor quality on an ongoing basis.

A quick survey before a site redesign allows you to gather data for benchmarking the new site against the old site so you can measure improvements. Users' responses are more likely to be reliable when they are asked about their current experiences than when they are asked to rate improvement over a certain time span.
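To make the benchmarking idea concrete, here is a sketch that compares the proportion of respondents answering “yes” to a question such as “Did you find what you were looking for?” before and after a redesign, using a standard two-proportion z-test; the counts are invented for illustration.

```python
# Two-proportion z-test sketch for benchmarking a redesign.
# The response counts below are invented for illustration.
from math import sqrt

def two_proportion_z(yes_old, n_old, yes_new, n_new):
    """z statistic for H0: the 'yes' proportion is unchanged."""
    p_old, p_new = yes_old / n_old, yes_new / n_new
    pooled = (yes_old + yes_new) / (n_old + n_new)
    se = sqrt(pooled * (1 - pooled) * (1 / n_old + 1 / n_new))
    return (p_new - p_old) / se

# Old site: 312 of 480 respondents found what they were looking for;
# new site: 355 of 450.
z = two_proportion_z(312, 480, 355, 450)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a real change at the 5% level
```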

Many of the examples from the National Institutes of Health are user satisfaction surveys, with questions probing how well the current site is performing.

Quick Poll Example

A quick poll was used to obtain task information from a wide range of people using the Glasgow University Library website. Combined with information from other sources, such as interviews and user feedback on the existing site, the quick poll data helped develop a more task-oriented site.

(Figure: screen image showing the beginning of the quick poll)

The quick poll collected information on users' tasks (“Why are you visiting the library website today?”) and on user type (“Are you staff, student, or visitor?”).

Responses were anonymous and the user type question was optional. The quick poll ran for a number of months, and data from key weeks (for example, exam times, vacation, and term-time weeks) were sampled and analyzed.
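The sampling step might look like the sketch below, assuming responses were logged as timestamped CSV rows as in the collection sketch above; the term calendar dates are hypothetical.

```python
# Sample poll responses from key weeks and tally user type per period.
# Assumes timestamped CSV rows; the term calendar is hypothetical.
import csv
from collections import Counter
from datetime import date, datetime

KEY_WEEKS = {
    "exam time": (date(2006, 5, 8), date(2006, 5, 14)),
    "vacation": (date(2006, 7, 3), date(2006, 7, 9)),
    "term time": (date(2006, 10, 2), date(2006, 10, 8)),
}

tallies = {period: Counter() for period in KEY_WEEKS}
with open("poll_responses.csv", newline="") as f:
    for timestamp, task, user_type in csv.reader(f):
        day = datetime.fromisoformat(timestamp).date()
        for period, (start, end) in KEY_WEEKS.items():
            if start <= day <= end:
                tallies[period][user_type or "not stated"] += 1

for period, counts in tallies.items():
    print(period, dict(counts))
```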

The open question gave us insight into the way users think about their tasks, and the terminology they use.
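A simple way to surface that terminology is a word-frequency count over the open answers, as in the sketch below; the stopword list is abbreviated and the example answers are invented.

```python
# Count the words users actually use to describe their tasks.
from collections import Counter

STOPWORDS = {"the", "a", "to", "i", "for", "my", "of", "and", "is", "on"}

def term_frequencies(answers):
    counts = Counter()
    for answer in answers:
        words = answer.lower().replace(",", " ").replace(".", " ").split()
        counts.update(word for word in words if word not in STOPWORDS)
    return counts

# Invented example answers to the open task question:
answers = [
    "renew my books",
    "find a journal article for my essay",
    "check when my books are due back",
]
print(term_frequencies(answers).most_common(5))
```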

The results from the quick poll data convinced management to provide a budget to perform usability tests on the new design. The insights from the task question were the foundation for the scenarios used in the usability tests.

What to Ask About

There are many ways to gain insight into users' tasks and goals. Below we discuss three approaches that were used in the Glasgow University case study and the NIH quick surveys.

Users’ Tasks and Goals

One option is to ask directly what task the user is carrying out, for example: “Why are you visiting the library website today?”

When asking about users' tasks, it is helpful to collect additional information so that the task can be considered in context. This can be, for example, the type of user (in the Glasgow poll: staff, student, or visitor) or the time of the visit.

These types of questions about users' tasks provide the most direct input for task analysis.

Level of Support for Users’ Tasks and Goals

Another option is to assess how well the current website information supports users in their tasks and goals. This can take several forms, from simple rating questions to open questions about whether users found what they needed.

Usability Dimensions Relating to the Task

You can also assess specific usability dimensions of the website. Whitney Quesenbery (www.wqusability.com/articles/more-than-ease-of-use.html) defines the five dimensions of usability as the 5Es: Effective, Efficient, Engaging, Error Tolerant, and Easy to Learn.

A general open satisfaction question can be used to assess how well the site is doing in terms of the 5Es.

The answers can be categorized according to the 5Es and used to measure how well the site performs for the different usability dimensions. It is useful to consider the results in light of the key user tasks.
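The rollup itself is straightforward once each answer has been hand-coded with one of the 5Es, as the sketch below shows; the coded answers are invented.

```python
# Roll hand-coded satisfaction answers up to Quesenbery's 5Es.
from collections import Counter

FIVE_ES = ["Effective", "Efficient", "Engaging",
           "Error Tolerant", "Easy to Learn"]

# Invented examples: (answer, dimension assigned by a human coder).
coded_answers = [
    ("Couldn't find the opening hours", "Effective"),
    ("Too many clicks to renew a book", "Efficient"),
    ("No way to recover from a typo in search", "Error Tolerant"),
    ("Took a while to work out the catalogue", "Easy to Learn"),
    ("Renewing was quick once I found it", "Efficient"),
]

counts = Counter(dimension for _, dimension in coded_answers)
for dimension in FIVE_ES:
    print(f"{dimension}: {counts[dimension]}")
```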

Alternatively, you can focus the questions on one specific usability dimension, for example efficiency or error tolerance.

How to Ask

When designing surveys, we find it useful to follow the steps outlined below.

Step 1. Define the objectives and the audience

What information do you need and how will it help you to make decisions? We recommend prioritizing the objectives into essential, useful, and nice-to-have. Sometimes at this stage it becomes clear that it is far better to concentrate on the first two priorities and leave out the wish list.

You also need to think carefully about the survey's audience. Again, it is useful to identify and document the different audience segments in detail. Do you want to target only a sub-population of your audience? If not, make sure the survey reaches all of your audience segments.

An important decision is whether or not to gather anonymous responses. Surveys which don't ask for respondents' contact details have a lower barrier. For the quick poll at Glasgow University Library, we opted for anonymous responses, as we wanted to encourage users to fill out the poll at every visit. Non-anonymous responses have other advantages, though: in particular, they allow you to follow up with respondents for more detailed information.

Whether or not to opt for anonymous responses depends on the situation. Do you need more responses? Do you want to be able to follow up to solicit detailed information? Does your organization have a solid reputation for respecting users' privacy?

Step 2. Write the questions and the answer choices

Starting from the objectives, write down each question. Decide whether an open question is more appropriate or whether to use a rating scale.

Open questions are a good way to get an insight into how users think about their tasks and what terminology they use. At Glasgow University Library, we discovered that some of the terms in use on the existing site did not correspond to users' terminology.

In contrast, a quick survey for the redesign of the Office of Science Planning and Assessment website used a closed question to probe users' tasks; although there is an “Other” option, users are basically presented with a set of predefined choices.

(Figure: sample survey question)

Analyzing the results from closed questions is much quicker than from open questions. However, predefined choices can miss tasks you did not anticipate, and they do not reveal the terminology users themselves would use.
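The sketch below illustrates both sides: the predefined choices tally instantly, while the “Other” answers still have to be read by hand; the choices and responses are invented, not the actual OSPA options.

```python
# Tally a closed task question; "Other" answers still need hand review.
from collections import Counter

# Invented responses: (choice, free text given with "Other").
responses = [
    ("Find a publication", ""),
    ("Check funding deadlines", ""),
    ("Find a publication", ""),
    ("Other", "looking for conference slides"),
    ("Other", "staff contact details"),
]

choice_counts = Counter(choice for choice, _ in responses)
other_texts = [text for choice, text in responses if choice == "Other"]

print(choice_counts)  # instant quantitative summary
print(other_texts)    # unanticipated tasks, reviewed manually
```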

When all questions have been drafted, go through the survey and imagine you already have the answers: will these answers give you the information you need? Will the objectives set out in Step 1 be met? It is worth spending some time thinking about possible answers to open questions. Personas can be a useful tool for this.

Step 3. Test and launch the survey

Before launching the survey, find a few representative users to test the survey: are all questions clear and unambiguous? Is the wording clear and neutral so as not to bias the responses? Using the test responses, verify that the survey will collect the information you need.

While quick surveys may not yield the best quality or most comprehensive data, they are attractive when you have a limited budget. Surveys are one of a number of guerrilla or discount usability methods (see www.useit.com/papers/guerrilla_hci.html for more information). Keep in mind that any input from end-users is usually better than nothing!

Positioning Quick Surveys

Because quick surveys allow large numbers of responses, they can play an important role in convincing sceptics who may dismiss small-sample techniques such as in-depth interviews or usability testing. Quick survey results can be helpful in getting management buy-in to invest in more expensive or time-consuming usability methods. Finally, they can complement other data sources or inform other usability methods such as in-person interviews, field studies, and usability testing. Quick surveys can be an excellent way to introduce usability and a more user-centered way of thinking into an organization.

Online Resources for Quick Surveys

The National Institutes of Health (NIH) resource offers practitioners in the public and private sectors examples of best practice surveys. See www.nih.gov/icd/od/ocpl/resources/OMBClearance/ClearedSurveys.htm.

In particular, the quick survey for the redesign of the Office of Cancer Complementary and Alternative Medicine website collects information on users' tasks, level of support for tasks, and usability dimensions: www.nih.gov/icd/od/ocpl/resources/OMBClearance/NCIOCCAMwebSurvey.pdf.

Two useful examples of user satisfaction surveys are www.nih.gov/icd/od/ocpl/resources/OMBClearance/NIHaboutSurvey.pdf and www.nih.gov/icd/od/ocpl/resources/OMBClearance/NIDCDwebSurvey.pdf.

 

Catherine Brys, PhD, is web coordinator for the Glasgow University Library’s Web Site Accessibility and Usability Project in Glasgow, Scotland. She has previously worked as a technical author, user interface designer, and user experience analyst for international companies. She serves as a member of the UPA User Experience Editorial Board.

Catherine Gaddy, PhD, CHFP, is a human factors engineering and usability consultant serving current clients such as Oshyn, Inc., a software and technology consulting company with offices in Los Angeles, CA; Quito, Ecuador; and Baltimore, MD. She has practiced human factors engineering and user-centered design for over twenty-five years. In 2006, she co-authored a UX article with Aaron Marcus entitled “Analyze This: A Task Analysis Primer for Web Design” (Volume 5, Issue 1, pp. 20-23).

Usability Professionals' Association

promoting usability concepts and techniques worldwide

User Experience Magazine is by and about usability professionals, featuring significant and unique articles dealing with the broad field of usability and the user experience.
http://www.usabilityprofessionals.org/upa_publications/user_experience/

This article was originally printed in User Experience Magazine, Volume 6, Issue 3, 2007.
http://www.usabilityprofessionals.org/upa_publications/past_issues/2007-3.html

© Usability Professionals' Association
Contact UPA at http://www.usabilityprofessionals.org/about_upa/contact_upa.html