JUS: Journal of Usability Studies
An international peer-reviewed journal

The Usability of Computerized Card Sorting: A Comparison of Three Applications by Researchers and End Users

Barbara S. Chaparro, Veronica D. Hinkle, and Shannon K. Riley

Journal of Usability Studies, Volume 4, Issue 1, November 2008, pp. 31-48

The following sections discuss the WebSort, OpenSort, and CardZort applications.


Researchers reported WebSort as the card sorting application that was the easiest to use, most satisfying, and most preferred overall (despite its inferior dendrogram). Participants found the user interface very intuitive for both setup and analysis. The features they liked best were the ability to copy, paste, and import lists of items into the application to create a card set, and the clear instructions for creating studies and analyzing results. Figure 5 shows the main WebSort screen used by researchers to set up a study. The tabs across the top of the screen clearly outline the phases of a card sort study. Users found it easy to paste an existing list into the Items list and to edit the sorting instructions. They also found it easy to import their study data and view the resulting dendrogram.

Figure 5. WebSort study setup screen.



Participants had the most difficulty creating and setting up the card sort exercise in OpenSort, for two reasons. First, after a new study was created, it appeared within a list of studies (Figure 6). All users clicked on their study title to begin editing it, but instead of the study settings they saw a preview of the actual card sort study (as end users would see it), which they found confusing. Further examination of the study screen revealed a separate Edit Study link for editing; participants reported that their first instinct was simply to click on the study title for this purpose. Second, participants were confused about how to enter the 35 card items. An open card sort is one of many exercise types available in the MindCanvas tool: when users first create a study, they are presented with eleven question types, including multiple choice, display, open text, OpenSort, and multi text. Users had to choose OpenSort before they could add their card items (some participants had to be told to choose OpenSort to continue). They also had to enter the 35 items one at a time, as there was no apparent way to paste or import them from an existing text list to create the card set. (Pasting items into the question type field was in fact possible, but it was not intuitive and only one user discovered how to do it.) As a result, participants found this task cumbersome and time-consuming.

Finding the results to analyze was also reported to be somewhat cumbersome in OpenSort because participants sometimes overlooked the Download data link (underscored number, Figure 6) and instead clicked on another link (i.e., Manage Study or Study Title) to find this information. To see actual results, MindCanvas requires its users to first request a results download, wait for the results to appear on the site, and then download and extract the results from a ZIP file. MindCanvas does not create dendrograms of results immediately but does so upon request of the researcher (there is typically a separate fee for these reports). It does, however, provide the raw data of each participant’s sort in an Excel-ready format, which allows researchers to run their own analyses if they so desire. For the purposes of this study, participants were shown samples of the three result analyses provided by MindCanvas: the dendrogram, the Vocabulary Browser, and the Similarity Browser. Participants particularly liked the professional appearance of the dendrogram compared to those of the other applications.
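Card sort dendrograms of the kind described here are conventionally produced by hierarchical cluster analysis of a card co-occurrence matrix. As a minimal sketch of the kind of analysis a researcher could run on raw per-participant sort data (the card names, sorts, and two-cluster cut below are invented for illustration and are not taken from this study):

```python
# Hypothetical analysis of raw card-sort exports (illustrative data only).
from collections import defaultdict
from itertools import combinations

from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Each participant's sort is a list of groups of card labels.
sorts = [
    [["Home", "About"], ["Pricing", "Plans"]],
    [["Home", "About", "Pricing"], ["Plans"]],
    [["Home", "About"], ["Pricing", "Plans"]],
]

cards = sorted({c for sort in sorts for group in sort for c in group})
index = {c: i for i, c in enumerate(cards)}
n = len(cards)

# Count how often each pair of cards was placed in the same group.
co = [[0.0] * n for _ in range(n)]
for sort in sorts:
    for group in sort:
        for a, b in combinations(group, 2):
            i, j = index[a], index[b]
            co[i][j] += 1
            co[j][i] += 1

# Convert co-occurrence counts to distances (0 = always sorted together).
dist = [[0.0 if i == j else 1.0 - co[i][j] / len(sorts) for j in range(n)]
        for i in range(n)]

# Average-linkage hierarchical clustering: the basis for a dendrogram
# (scipy.cluster.hierarchy.dendrogram can plot Z directly).
Z = linkage(squareform(dist), method="average")

# Cut the tree into two clusters to inspect the grouping.
labels = fcluster(Z, t=2, criterion="maxclust")
clusters = defaultdict(list)
for card, lab in zip(cards, labels):
    clusters[lab].append(card)
print(sorted(sorted(v) for v in clusters.values()))
# → [['About', 'Home'], ['Plans', 'Pricing']]
```

This reflects the standard approach to card sort analysis rather than the specific computation any of the three applications performs internally.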

Figure 6. OpenSort set up and editing screen.



Participants spent some time browsing CardZort to figure out how to create a card set. The File>New menu option prompted them for a card style (text only, or text and description) but gave no instructions on how to start adding cards. Users had to select the Card>New menu option or the corresponding toolbar icon to add a new card. As in OpenSort, participants were unable to find a way to paste or import the card set items into CardZort and had to type the 35 items individually. In addition, the shortcuts for creating new cards used hot keys that users reported as unintuitive: for example, the shortcut to create a new card was the Insert key, and the shortcut to edit a card was the Enter key.

Setting up data for analysis was also reported to be somewhat difficult in CardCluster (the analysis portion of CardZort) because the menu functions were inconsistent with user expectations. To run an analysis of the data, participants first had to import individual data files. The File menu displayed functions for opening or saving projects, but none for creating a new project. To start a new project, users had to select the File>Add Exercise menu option or click the corresponding plus sign (+) icon, whose alt tag read “Adds a sorting exercise (.cz file) for analysis” (Figure 7). Participants found the term “sorting exercise” confusing both when initially trying to set up a new project and when looking for individual participant data.

Figure 7. CardZort analysis setup screen.


Study 1 Summary

Results from Study 1 indicate that WebSort was the most preferred application from the researchers’ perspective because of its overall ease of use for study setup and analysis. The results also reveal deficiencies in the design of the other two electronic card sorting applications; in particular, fast and convenient study setup and clear menu functions for data analysis were lacking. For CardZort, users expected the functionality to be similar to that of other Windows-based applications (including copy-and-paste functionality and new-file setup). Likewise, users expected the link behavior in OpenSort and WebSort to be similar to that encountered on other websites. Several inconsistencies with these expectations were encountered in both CardZort and OpenSort.

It should be reiterated that this study examined researchers’ first-time use of the card sorting applications. Several features of OpenSort, in particular, were not evaluated because they were not available in the other two programs. In general, OpenSort offered the most options for researchers, and it is possible that with continued use this program would rate higher than it did with first-time usage.
