
The Usability of Computerized Card Sorting: A Comparison of Three Applications by Researchers and End Users

Barbara S. Chaparro, Veronica D. Hinkle, and Shannon K. Riley

Journal of Usability Studies, Volume 4, Issue 1, November 2008, pp. 31-48



Discussion

OpenSort was found to be the card sorting application that was the least difficult to use, most satisfying, and most preferred. CardZort was reported to be the most difficult program to use, while WebSort was the least preferred. The following sections discuss usability issues that were discovered for each program.

OpenSort

Figure 11 shows the main sorting screen for OpenSort, the most preferred application. Users found the direct manipulation of the cards and group naming to be very easy and intuitive. Users also liked the realistic look of the online cards and the dynamic manner in which the cards were “dealt” onto the screen. This dynamic feature added an element of “fun” to the exercise.

Figure 11. OpenSort user interface (www.themindcanvas.com).


Despite this, users found several interactions with the application somewhat confusing. Before going to the sorting screen, participants were presented with a preview screen that showed how the interface would look while they were sorting. It was not clear to participants that this was an example and not the actual sorting screen. Several participants tried to drag the cards on this page and were frustrated when they could not move anything. It was unclear to them that they had to proceed to the next screen in order to begin the sorting exercise.

In addition, users found the graphical icons at the bottom of the screen to be unclear (e.g., proceed to next page, help). There were no textual instructions on the preview screen on how to proceed to the sort other than the arrow button in the bottom right of the screen. Likewise, access to Help was available at the bottom of the sorting screen via a computer monitor icon, which participants did not recognize as a Help link.

Users found it easy to combine groups and subdivide larger groups with OpenSort. This was also the only program that offered online Help while sorting. Those who found the Help, however, were disappointed to discover that this information was available only in the form of demos that had to be watched in their entirety. Participants expressed that they would have preferred the option of a text-based Help section instead.

CardZort

Figure 12 shows a sample sorting screen for CardZort.

Figure 12. CardZort user interface (www.cardzort.com).


Users found the cards to be difficult to drag and drop to form groups in CardZort. In particular, when moving a card to a group, users had to be very precise with their mouse cursor placement. If the cursor was on the edge of the card being moved and was outside of the border of the group in which it was to be included, then the new card was placed behind the intended group and was obscured or completely hidden.

Users also found it frustrating that they had to complete all of the sorting before they could name any of their groups. When creating a group for the first time, users thought they should have been able to click on the blank tab above each group and name it as the group was created.

Aside from the basic instructions at the top of the page, CardZort did not provide any form of a Help section where users could find detailed explanations of the program features. This is considered an essential component of any program, especially for first-time users. In addition, CardZort did not provide a means by which users could create subgroups or duplicate cards for instances where they may want to place a card in multiple groups. During the usability study, users commented that both of these features would be helpful.

WebSort

Figure 13 shows a sample sorting screen for WebSort.

Figure 13. WebSort user interface (www.websort.net).


As the least preferred application, WebSort was found to be difficult to use overall. Participants were not able to see all items to be sorted at one time. If the number of items to be sorted exceeded 23 (at 1024 x 768 screen resolution), participants had to scroll down the item list to see the entire list. In addition, participants were able to view the cards within only one group at a time; they had to click on a group name to show its contents.

Like CardZort, WebSort did not offer extended Help information beyond the initial instructions, and users were unable to create subgroups or duplicate items.

Study 2 Summary

Results from Study 2 show that OpenSort was the unanimous favorite of the three card sort applications from the end users' perspective. Participants liked the step-by-step instructions and found the cards easy to manipulate, group, and name. Participants reported that the sorting and naming process in CardZort and WebSort was more cumbersome and the user interface was overall less usable. Ease of dragging and dropping the cards and concurrent group naming were found to be critical factors of success.

It must be noted that the open card sort conducted by the users was limited in its scope. Only 35 single-word items were sorted in each program. This represents a small card sort exercise; card sorting is typically conducted with 100 or more items, with each item described by multiple phrases or sentences. One drawback of electronic card sorting programs compared to physical card sorting, however, is the diminished space (or screen real estate) users have to group and move cards around. Given that users found the interaction with some of the programs cumbersome in this study, it is expected that these problems would only be exacerbated in a sort using 100 or more cards.
