
Development and Evaluation of Two Prototypes for Providing Weather Map Data to Blind Users Through Sonification

Jonathan Lazar, Suranjan Chakraborty, Dustin Carroll, Robert Weir, Bryan Sizemore, and Haley Henderson

Journal of Usability Studies, Volume 8, Issue 4, August 2013, pp. 93-110

Usability Test

Many aspects of the iSonic application and its use could have been the focus of usability testing, including the application controls, the data view versus the map view, and the keyboard versus the touchscreen. Because there is limited research on touchscreen use by blind users, we felt that comparing the keyboard and the touchscreen would contribute the most to the literature. Furthermore, the previous usability research on iSonic (Zhao, Shneiderman, Plaisant, & Lazar, 2008) evaluated the software using only the keyboard, so we felt that evaluating the touchscreen functionality would provide useful feedback for both the developers and ourselves. We also wanted to learn more about the effectiveness of a touchscreen alone compared with a touchscreen with a tactile overlay.

Ideally, each participant would have been available for a few days of training, gained hands-on experience over a few weeks, and then evaluated the software. However, limited funds and the brief time available from participants required a shorter, more formative qualitative test. There was also a large list of potential tasks from the requirements gathering, so we asked each participant to perform some overlapping, but mostly different, tasks.

We recruited five blind individuals to perform usability testing on the iSonic application at the International Braille and Technology Center in Baltimore, Maryland. These individuals were recruited through the National Federation of the Blind and had expressed interest in weather maps, sonification, or science education. Two of the participants in the usability evaluation were the same people who took part in the interviews discussed earlier (the blind participant with expertise in meteorology, and the blind participant interested in developing accessible weather maps).

Our five participants had a mean age of 45.6 years (range: 28-68) and included four males and one female. As a matter of practice, it is not considered appropriate to ask participants about the specific cause of their blindness or for vision test results (Lazar, Feng, & Hochheiser, 2010). As a proxy, we typically ask whether the participant is able to use screen magnification; if so, some useful vision remains. Like most of our research evaluations, this study focused on screen-reader users who are unable to use screen magnification, so none of the participants were considered “low vision.” None of the participants had any additional documented disabilities (e.g., none had any hearing loss). The participants had used computers for an average of 30.8 years and screen-reading software for an average of 22 years. As is typical for a majority of blind people, none of our participants had a service animal. Also as is typical, the participants could arrange their own transportation, but this had to be planned in advance, and last-minute schedule changes were logistically difficult for them.

While the participants had a high level of computing experience, our applications were not designed for users new to screen-reading software, so this experience level was appropriate. Of the five participants, only one had previously used any type of sonification software, and that participant had used sonification only in gaming software.

About a week before the usability testing, our team had stopped by the test facility at the International Braille and Technology Center to make sure that our software (iSonic) and hardware (the touchscreen) would work properly on their computers and network. On the day of the test, however, technical problems emerged as the software was being set up. Because of security patches installed since our visit a week earlier, the sonification tones in iSonic would not play on any of the computers (the speech output worked fine). When we discovered this issue, the first participant was due to arrive in approximately 20 minutes, and all five participants were already scheduled to arrive at intervals throughout the day. Because their transportation had been arranged in advance, last-minute schedule changes were not practical. Our first participant had other meetings scheduled later that day at the International Braille and Technology Center, so we asked whether they could return at the end of the day. The participant was able to accommodate that request, giving us an additional 45 minutes to work out a solution for the other participants.

We had to adjust quickly if we were going to gather any feedback about our prototype. We tried running iSonic and the touchscreen on the MacBook laptop (booted into Windows) that we had brought with us to take notes. The iSonic application worked fine on the MacBook, but the touchscreen did not. Our modified plan, therefore, was to have the participants evaluate the keyboard version of iSonic, including the sonification tones, on the MacBook, and to use a PC (with speech output but without sonification tones) to evaluate the touchscreen alone and the touchscreen with a tactile overlay. Unfortunately, by the time we had worked out these logistics, there was not enough time to fabricate a new, reasonably accurate tactile map. The situation was not ideal, but we did not want to waste the opportunity to gather useful evaluation data. We could have abandoned the test and hoped to work out the technology later; instead, because we could still use the time with the participants productively, we went ahead with the revised procedure.

The tactile overlay was a map of Maryland with the borders of its 24 counties identified. We explained how the software worked and demonstrated it using both the keyboard and the touchscreen. We gave the participants a few minutes to explore, and then asked them to attempt some tasks.

Participant 1 spent several minutes going through the program, trying to get a feel for it. He initially said that he liked the keyboard application but wanted to be able to feel the edges of a tactile map. When he started using the touchscreen, he perceived it as a little jumpy and stated that he did not like a touchscreen without a tactile overlay. He preferred the touchscreen with the tactile overlay to the keyboard and completed the task list more easily with it. He also noted that he listened more to the speech than to the tones. One interesting challenge was that he assumed the touchscreen supported multi-touch, which it did not.

Participant 2 did not like using the keyboard to navigate around the Maryland state map. Even so, she quickly became comfortable with the application, noting, for example, “Central Maryland is definitely hotter” and “There’s a weather front somewhere here.” She said that she didn’t like the tones because “I’m not musically inclined, so I like numbers, not sounds. It’s my learning style.” She also expressed a strong preference for the tactile overlay on the touchscreen. She suggested making it clearer when you have entered another state: rather than a tone indicating that you are off the map (or have entered another state), it would be better for the software to say, “You have entered Virginia” or something similar. She said that because the counties were obviously not square shaped, navigating with the keyboard was difficult. Finally, she thought the program would work very well for geography lessons; having recently moved to Maryland, she noted that she could use the application to learn more about the state’s geography.
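
iSonic’s internal event handling is not described in this article, so the following Python sketch is purely illustrative of Participant 2’s suggestion: speak the name of the new state when the user crosses a border, and reserve the tone for leaving the map entirely. All names here (Region, on_focus_moved, speak, play_tone) are hypothetical and are not taken from iSonic.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical stand-in for a map region; iSonic's actual data
# structures are not documented in this article.
@dataclass
class Region:
    name: str    # e.g., "Montgomery County"
    state: str   # e.g., "Maryland"

def on_focus_moved(previous: Optional[Region], current: Optional[Region],
                   speak: Callable[[str], None],
                   play_tone: Callable[[str], None]) -> None:
    """Choose audio feedback when navigation focus moves to a new region."""
    if current is None:
        play_tone("off-map")                        # left the map entirely
    elif previous is not None and current.state != previous.state:
        speak(f"You have entered {current.state}")  # the suggested behavior
    else:
        speak(current.name)                         # normal county readout

# Example: moving from a Maryland county into a Virginia county.
on_focus_moved(Region("Montgomery County", "Maryland"),
               Region("Fairfax County", "Virginia"),
               speak=print, play_tone=print)        # prints the announcement
```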

Participant 3 also stated that he listened more to the speech data than to the tones, but that he could tell there were differences among the tones. He wondered whether headsets would be helpful if the sounds could be presented differently from left to right across the two ears. He stated that he preferred the touchscreen because, for instance, he could jump from one county to another without listening to all the counties in between (as happens with the keyboard). He did not seem to have a preference between the touchscreen with or without the tactile overlay.
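
The article does not say how such left-to-right presentation would be implemented. As an illustration of Participant 3’s idea only, the sketch below applies constant-power panning (a standard audio technique, not necessarily what iSonic would use) to a county’s horizontal position on the map; the normalization of position to the range 0-1 is our assumption.

```python
import math

def stereo_gains(x: float) -> tuple[float, float]:
    """Constant-power pan: map a county's horizontal map position
    (0.0 = westernmost, 1.0 = easternmost; our assumed normalization)
    to left/right channel gains, so western counties are heard toward
    the left ear and eastern counties toward the right."""
    angle = max(0.0, min(1.0, x)) * math.pi / 2.0
    return math.cos(angle), math.sin(angle)

# Example: a county two-thirds of the way across the state sounds
# noticeably louder in the right channel.
left, right = stereo_gains(2 / 3)
print(f"left gain = {left:.2f}, right gain = {right:.2f}")  # 0.50, 0.87
```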

Participant 4 had a good sense of the different tones and their correlation with the data, and he understood the trends. Unlike the other participants, he seemed to find the tones very useful. Like Participant 2, he noted that the application would be really useful for learning the geography of a new area (he, too, had recently moved to Maryland). He wondered why, when you cross the Chesapeake Bay, a large body of water, the application made a “chirp” sound (iSonic’s current sound for crossing a body of water) rather than a “splash.” Using the tones, he could immediately determine that the western side of Maryland had the highest chance of precipitation. He also wondered whether we could add elevation data to the application to help users learn more about Maryland’s geography. When he started using the touchscreen, the application crashed, and while we were trying to get the touchscreen working again, he had to leave for a work-related appointment. Consequently, he was the only participant who was unable to evaluate the touchscreen interaction with the application.
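
The article does not specify how iSonic maps data values to tones. Purely to illustrate the value-to-pitch correlation that Participant 4 picked up on, the sketch below uses one common sonification approach: a linear mapping of values onto MIDI note numbers, so that higher values produce higher pitches. The three-octave range and the county figures are our assumptions.

```python
def value_to_midi_pitch(value: float, lo: float, hi: float,
                        pitch_lo: int = 48, pitch_hi: int = 84) -> int:
    """Map a data value (e.g., percent chance of precipitation) onto a
    MIDI note number so that higher values sound as higher pitches.
    The linear mapping and three-octave range (C3-C6) are assumptions;
    the article does not state iSonic's actual mapping."""
    value = max(lo, min(hi, value))            # clamp out-of-range data
    fraction = (value - lo) / (hi - lo)
    return round(pitch_lo + fraction * (pitch_hi - pitch_lo))

# Hypothetical precipitation chances for three Maryland counties,
# west to east; the western county gets the highest pitch.
for county, chance in [("Garrett", 90), ("Howard", 55), ("Worcester", 20)]:
    print(f"{county}: MIDI note {value_to_midi_pitch(chance, 0, 100)}")
```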

Participant 5 liked being able to hear the trends using the sonified tones and immediately picked up on important ones: for instance, that the chance of rain was higher in the northern and western parts of the state. He really enjoyed using the application and wondered how much data you could present to a user before they became overwhelmed. He also thought that over-time comparisons might be the most useful feature (e.g., checking the map at noon and then again at 6 p.m.). He was equally enthusiastic about the tactile map over the touchscreen and was able to complete tasks easily with both approaches. As he put it, “Now it starts to mean something, because now I’m touching it on the map.” He further noted, “Now, I get the information that I don’t normally get. This is a very different sense than I get from [data points] using the Braille note [device]. This is exactly what I have been looking for!” and “I’ve always had to calculate the weather trends in my mind, until today!”
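
The article describes this comparison only as a usage pattern (checking the map twice during the day). The sketch below illustrates how two such snapshots of a hypothetical per-county value could be summarized for speech output; the data and all names are our inventions, not part of iSonic.

```python
def describe_changes(noon: dict[str, int], evening: dict[str, int]) -> list[str]:
    """Return a spoken-style summary of how each county's value changed
    between two snapshots (e.g., chance of rain at noon and at 6 p.m.)."""
    lines = []
    for county, before in noon.items():
        after = evening[county]
        delta = after - before
        if delta == 0:
            lines.append(f"{county}: unchanged at {before} percent")
        else:
            direction = "up" if delta > 0 else "down"
            lines.append(f"{county}: {direction} {abs(delta)} points to {after} percent")
    return lines

# Hypothetical noon and 6 p.m. snapshots for three counties.
noon = {"Garrett": 90, "Howard": 55, "Worcester": 20}
evening = {"Garrett": 70, "Howard": 60, "Worcester": 20}
for line in describe_changes(noon, evening):
    print(line)
```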

In summary, all five participants liked the application and were able to figure out how to complete a few tasks successfully within a few minutes of first using it. Some trends emerged in this small sample. The participants preferred the tactile map over the touchscreen to either the touchscreen alone or the keyboard alone. Some participants found the sonification tones useful; others did not. The two participants who had recently moved to Maryland thought that the application would be very useful for learning state geography, which was not a stated scenario or development goal for the project but could be a potential feature. Suggestions for improvement included a spoken notification when you cross a state border (such as “You are now in Virginia”), a splash sound instead of a bird chirp when crossing a body of water, and headsets to give a better spatial sense of where the sounds are coming from.

