
Development and Evaluation of Two Prototypes for Providing Weather Map Data to Blind Users Through Sonification

Jonathan Lazar, Suranjan Chakraborty, Dustin Carroll, Robert Weir, Bryan Sizemore, and Haley Henderson

Journal of Usability Studies, Volume 8, Issue 4, August 2013, pp. 93-110

Implications and Discussion

While our investigation into developing sonified weather maps is still at an early stage, it has several interesting implications for future research. First, participant interviews and surveys, as well as participants’ reactions during the usability evaluation, indicate that map-based visualizations are important, commonly used data representations that remain largely inaccessible to blind users. This inaccessibility is especially pertinent in information contexts with an inherent spatial component, such as weather data, where values change rapidly and sweeps across the map can support trend analysis. Our participants were enthusiastic during their interactions with the touchscreen and the tactile map, even when the technology was not fully functional, because they could, for the first time, get a sense of the spatial layout of the state of Maryland on a computer. This suggests that accessible maps merit investigation as a pedagogical tool for learning not only weather, but also geography.

A second set of implications relates to strategies and approaches for creating accessible data visualizations. A predominant strategy in prior research has been to focus on sonification. While this approach has a number of merits, our research indicates that in contexts where the information has an underlying spatial element, usability improves substantially with a more multimodal approach. In particular, technologies such as touchscreens and tactile overlays expand the sensory inputs that blind users can employ, confirming the finding of Wall and Brewster (2006) that multimodal approaches are superior. In addition, such approaches allow for a non-sequential, summary-based comprehension of information that can greatly facilitate the absorption of any form of data visualization. Future research should therefore investigate how various forms of multimodal interaction can contribute to accessible information representations.
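To make the parameter-mapping idea behind sonification concrete, the following sketch (a minimal illustration in plain Java, not the implementation used in our prototypes; the temperature values, pitch range, and tone duration are all hypothetical) plays a west-to-east sweep of temperature readings as a sequence of tones whose pitch rises with temperature:

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

/** Plays a west-to-east "sweep" of temperature readings as a sequence of tones. */
public class TemperatureSweep {

    private static final float SAMPLE_RATE = 44100f;

    /** Linearly maps a temperature (degrees F) to a pitch between 220 Hz and 880 Hz. */
    static double toFrequency(double tempF, double minF, double maxF) {
        double t = (tempF - minF) / (maxF - minF);
        return 220.0 + t * (880.0 - 220.0);
    }

    /** Synthesizes and plays one sine tone of the given frequency and duration. */
    static void playTone(SourceDataLine line, double freqHz, int millis) {
        int samples = (int) (SAMPLE_RATE * millis / 1000);
        byte[] buffer = new byte[samples];
        for (int i = 0; i < samples; i++) {
            double angle = 2.0 * Math.PI * freqHz * i / SAMPLE_RATE;
            buffer[i] = (byte) (Math.sin(angle) * 100); // 8-bit signed amplitude
        }
        line.write(buffer, 0, buffer.length);
    }

    public static void main(String[] args) throws LineUnavailableException {
        // Hypothetical west-to-east temperature readings across a state (degrees F).
        double[] temps = {68, 71, 74, 79, 83};
        AudioFormat format = new AudioFormat(SAMPLE_RATE, 8, 1, true, false);
        SourceDataLine line = AudioSystem.getSourceDataLine(format);
        line.open(format);
        line.start();
        for (double t : temps) {
            playTone(line, toFrequency(t, 60, 90), 300); // higher temp -> higher pitch
        }
        line.drain();
        line.close();
    }
}

Playing the values in spatial order is what enables the kind of trend analysis described above; a full application would also give the user interactive control over the sweep direction and the granularity of the regions.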

A third set of implications relates to performing usability testing with participants with disabilities. In our first usability evaluation, we had technical difficulties that threatened our ability to get the participant feedback that we wanted. The participants were already scheduled to come, and their transportation was arranged, which can be difficult to reschedule for participants with disabilities. If we had not quickly adjusted our plan, we likely would not have been able to collect any useful feedback data; getting some feedback from participants is better than getting none. By quickly adjusting our process, we still gained valuable feedback that we could build upon. You cannot build software or hardware applications for people with disabilities without directly involving them in development. If that means that requirements gathering or usability testing methods need to be modified, that is a trade-off that user experience practitioners need to accept.

In previous articles in the Journal of Usability Studies, we have described some of our other experiences conducting usability test sessions with people with disabilities, with the end goal of getting useful design feedback from participants. For instance, when blind users encountered an inaccessible web page that did not allow them to proceed further, we performed, and specifically documented, interventions in which we helped users get to the next page of a job application process so we could observe their challenges on the subsequent page (Lazar, Olalere, & Wentz, 2012). In our usability testing involving people with Down syndrome performing workplace-related tasks on multi-touch tablet computers, we documented how we used visual Likert scales instead of auditory ones, focused on using real examples and real accounts rather than fictional ones (a practice also encouraged by Zazelenchuk, Sortland, Genov, Sazegari, & Keavney, 2008), and were flexible when a user refused to perform a task (adding an event to a calendar) because they thought the event should last longer (four hours instead of two; Kumin, Lazar, Feng, Wentz, & Ekedebe, 2012).

A final implication of this research relates to the use of mobile devices. As our preliminary usability evaluations indicate, technologies such as touchscreens and tactile overlays allow users to augment their comprehension of data with additional sensory input. Because mobile phones and tablets increasingly use multi-touch screens, and weather information is often needed on the go, it is important to investigate how these applications could be used on portable and tablet devices. We are currently evaluating the newest version that we developed, which runs on an Android tablet. Touchscreens can be fully accessible to blind users (through speech output, touchscreen gesturing, and Braille overlays that indicate where the visual keyboard appears), and a logical next step for research is to investigate how visualizations can be most effectively implemented on tablet computers.
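As a sketch of how touch plus speech output might work on such a device (a hypothetical illustration, assuming a full-screen map view divided into a grid of labeled regions; the class name and forecast grid are invented, though TextToSpeech and View.OnTouchListener are standard Android APIs), a touch listener could convert finger position into a map region and speak its forecast:

import android.speech.tts.TextToSpeech;
import android.view.MotionEvent;
import android.view.View;

/** Announces the forecast under the user's finger as they explore a map view. */
public class SpokenMapTouchListener implements View.OnTouchListener {

    private final TextToSpeech tts;
    private final String[][] regionForecasts; // hypothetical grid of region descriptions
    private String lastSpoken = "";

    public SpokenMapTouchListener(TextToSpeech tts, String[][] regionForecasts) {
        this.tts = tts;
        this.regionForecasts = regionForecasts;
    }

    @Override
    public boolean onTouch(View mapView, MotionEvent event) {
        // Convert the touch point into grid coordinates over the map view.
        int rows = regionForecasts.length;
        int cols = regionForecasts[0].length;
        int row = (int) (event.getY() / mapView.getHeight() * rows);
        int col = (int) (event.getX() / mapView.getWidth() * cols);
        row = Math.max(0, Math.min(rows - 1, row));
        col = Math.max(0, Math.min(cols - 1, col));

        // Speak only when the finger crosses into a new region, so the audio
        // does not stutter while the user sweeps across the map.
        String forecast = regionForecasts[row][col];
        if (!forecast.equals(lastSpoken)) {
            lastSpoken = forecast;
            tts.speak(forecast, TextToSpeech.QUEUE_FLUSH, null, "region");
        }
        return true;
    }
}

Speaking only on region changes supports the kind of exploratory finger sweeps that spatial data invites, while keeping the speech output intelligible.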

 
