Journal of Usability Studies
An international peer-reviewed journal

Development and Evaluation of Two Prototypes for Providing Weather Map Data to Blind Users Through Sonification

Jonathan Lazar, Suranjan Chakraborty, Dustin Carroll, Robert Weir, Bryan Sizemore, and Haley Henderson

Journal of Usability Studies, Volume 8, Issue 4, August 2013, pp. 93 - 110


Literature Review

When websites follow accessibility design guidelines such as the WCAG, the implication is that they will work for most users with perceptual and/or motor impairments, including blind users. However, information visualizations on web pages continue to be inaccessible. Visualizations are inherently directed at sighted users, and developing an equivalent rendering for blind users is not a trivial problem. In the past, researchers have investigated the potential of sonification, or non-textual sound, to provide an equivalent for visualizations (Pauletto & Hunt, 2009; Walker & Mauney, 2010; Yoshida, Kitani, Koike, Belongie, & Schlei, 2011; Zhao, Shneiderman, Plaisant, & Lazar, 2008; Zhao, Smith, Norman, Plaisant, & Shneiderman, 2005). Wall and Brewster (2006) noted that tactile printed media (such as a raised bar-chart printout) are insufficient options due to the limited amount of data that can be presented in printed form, and that they are rarely used (or even available) after primary schooling. Wall and Brewster also noted that the talking tactile tablet was a useful approach for blind users because it combines a touch tablet with a tactile overlay; tapping on specific regions of the overlay produces speech output describing the selected area (Wall & Brewster, 2006). But again, there is limited availability of tactile overlays, especially for complex data situations.

A flexible, scalable equivalent for visualizations is needed for blind users. In particular, users need the ability to drill down to specific data values (Wall & Brewster, 2006), which is a key part of the common approach to information visualization (Shneiderman & Plaisant, 2010). In visualizations, users first get an overview of many data points, looking for high-level patterns and for exceptions that fall outside a trend (e.g., a stock that went up 50% in a single day). Users can then access details on demand (“drill-down”) about the specific item that seems to be an outlier (such as more detailed information about that one stock). Visualizations are highly effective in dealing with the scalability of data: while one might spot a trend or an outlier in a spreadsheet of 25-50 data items, it is much harder to spot trends by reading numbers once there are 100, 1,000, 10,000, or more data points. When a website has visualizations and is legally required to be accessible, government regulations related to web accessibility can be met by simply providing a link to a downloadable table of data that blind users can manipulate using whatever tools they prefer. While this meets government requirements and allows blind users to access the data, there is a need to develop new interface approaches that offer the same flexibility for analyzing large data sets. So far, these approaches have taken the form of non-textual sound, known as sonification.

Two existing sonification tools are iSonic (http://www.cs.umd.edu/hcil/audiomap/) and Earth+ (http://prime.jsc.nasa.gov/earthplus/). We call these sonification tools because they represent attempts to render sound-based equivalents of visual map-based information. The iSonic tool was originally created as a graduate project at the University of Maryland and allows blind users to hear population trends and patterns on a map of the United States. At a basic level, the application uses the pitch of a sound to give the user an overview of population trends within a geographical region. When using iSonic, blind users can navigate (using a keyboard or a touchscreen) to individual states within the map of the United States and get audio feedback about the state name and population characteristics. Further, iSonic allows blind users to develop a sense of population trends by mapping population levels to varying pitches. A user can rapidly navigate (using a keyboard or a mouse) across the different states and sense increases and decreases in population through the waxing and waning of the pitch; for example, a user would hear high-pitched sounds for the coastal states and lower pitches around the mid-western and mountain states. The iSonic tool also provides alternative map views of the data, from a broader regional view down to the state and county level.
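The core idea behind such data-to-pitch mappings can be sketched as follows. This is not iSonic's actual implementation; the frequency range, the linear mapping, and the population figures are illustrative assumptions only.

```python
# Minimal sketch of a value-to-pitch mapping of the kind iSonic uses:
# larger data values map to higher pitches. Range and figures are assumed.

def value_to_pitch(value, v_min, v_max, f_min=220.0, f_max=880.0):
    """Map a data value linearly onto a frequency range in Hz (here two octaves, A3-A5)."""
    if v_max == v_min:
        return f_min  # degenerate range: all values sound the same
    fraction = (value - v_min) / (v_max - v_min)
    return f_min + fraction * (f_max - f_min)

# Illustrative (not actual) state populations
populations = {"California": 38_000_000, "Wyoming": 580_000, "Ohio": 11_500_000}
lo, hi = min(populations.values()), max(populations.values())
for state, pop in populations.items():
    print(f"{state}: {value_to_pitch(pop, lo, hi):.0f} Hz")
```

Sweeping a cursor across regions while playing each region's frequency would then produce the waxing-and-waning pitch pattern described above.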

Earth+ (developed by NASA in 2005) is another tool that we evaluated. Earth+ was a NASA project with similar aims of developing accessible map representations. However, that tool was not developed beyond a prototype and is therefore somewhat limited in functionality. The application allows a user to explore an image based on the color palette that defines it. The exploration is done by placing the cursor at various points within the image; the software then emits a piano note at a pitch unique to that color, which allows a user to gauge the color composition and distribution within the image. Earth+ came with a number of preloaded map images that used conventional color coding to convey visual information, but it can in principle work with any image. While we appreciated the principle behind the Earth+ implementation, we felt that the tool had some limitations. First, the mapping of sound to color is not very efficient for maps, because a user would (a) need to remember the key to make sense of the color changes and (b) need to perform two levels of translation, from sound to color and then from color to map attributes (e.g., temperature, population, etc.). Second, there is insufficient sonic feedback about navigation. For example, if a user moves off the image, the tool does provide auditory feedback, but that feedback is neither instantaneous nor sufficiently informative. This becomes a concern for map-based images, because the feedback a user receives when crossing boundaries within the map is erratic if the user makes quick movements, and the feedback is also less responsive for states with small areas (e.g., Rhode Island or Delaware).
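The color-to-pitch principle behind Earth+ can be sketched as below. The palette, the note assignments, and the nearest-color matching are hypothetical assumptions for illustration, not Earth+'s actual design; only the standard MIDI note-to-frequency formula is a known convention.

```python
# Sketch of a color-to-pitch mapping in the spirit of Earth+: each palette
# color is assigned a unique piano note. Palette and notes are assumed.

# Hypothetical palette: RGB color -> MIDI note number
PALETTE = {
    (0, 0, 255): 48,    # blue   -> C3 (e.g., coldest band)
    (0, 255, 0): 60,    # green  -> C4
    (255, 255, 0): 67,  # yellow -> G4
    (255, 0, 0): 72,    # red    -> C5 (e.g., hottest band)
}

def nearest_palette_color(rgb):
    """Return the palette color closest to rgb by squared Euclidean distance."""
    return min(PALETTE, key=lambda c: sum((a - b) ** 2 for a, b in zip(c, rgb)))

def midi_to_hz(note):
    """Convert a MIDI note number to frequency in Hz (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

pixel = (250, 10, 5)  # a reddish pixel under the cursor
note = PALETTE[nearest_palette_color(pixel)]
print(f"MIDI note {note}, about {midi_to_hz(note):.1f} Hz")
```

Note that this sketch also makes the paper's first criticism concrete: the listener hears only a pitch and must mentally translate pitch to color, and then color to the underlying map attribute.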

Given the exploratory nature of our study, we felt that iSonic would require the least customization to fit our context and would be the most useful. We received permission from the University of Maryland, owners of the iSonic application, to continue developing it.

