
Investigating the Accessibility and Usability of Job Application Web Sites for Blind Users

Jonathan Lazar, Abiodun Olalere, and Brian Wentz

Journal of Usability Studies, Volume 7, Issue 2, February 2012, pp. 68-87



Methods

This study evaluated the accessibility and usability of online employment application Web sites in eight southeastern US states: Alabama, Florida, Georgia, Kentucky, North Carolina, South Carolina, Mississippi, and Tennessee. These states were chosen because they are the states served by the Southeastern ADA Center (http://adasoutheast.org/), which funded this project. The Southeastern ADA Center also has connections with businesses in these states, so the results of the usability evaluation can be communicated to companies in the southeastern US and could lead to improved accessibility of online employment Web sites. The staff of the Southeastern ADA Center chose two companies with online employment applications from each of the eight states, for a total of 16 Web sites evaluated. For each state, the largest 50 employers were identified, along with the state's top 10 high-growth fields; two companies were then selected from those high-growth fields, making sure that no field was represented twice in the sample. This ensured not only geographic diversity, but also diversity across fields and industries. So as not to embarrass any of the companies, they are not identified by name. Two attempts were made to apply for jobs on each Web site (for a total of 32 attempts at submitting a job application).
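To make the selection procedure concrete, the per-state sampling constraint could be sketched roughly as follows (a minimal illustration in Python, not the study's actual procedure or tooling; the data layout and the reading that "no field represented twice" applies within each state's pair of companies are our assumptions):

    import random

    def select_companies_for_state(largest_employers, growth_fields, n=2):
        # largest_employers: list of (company, field) pairs for the state's 50 largest employers
        # growth_fields: the state's 10 top high-growth fields
        candidates = [(company, field) for (company, field) in largest_employers
                      if field in growth_fields]
        random.shuffle(candidates)
        selected, used_fields = [], set()
        for company, field in candidates:
            if field not in used_fields:  # ensure no field is represented twice
                selected.append((company, field))
                used_fields.add(field)
            if len(selected) == n:
                break
        return selected

Running this once per state would yield 8 states x 2 companies = 16 Web sites; with two application attempts per site, that corresponds to the 32 attempts described above.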

Participants

A total of 16 participants were involved in the usability evaluation. Most participants were recruited through a partnership with the Maryland Division of Rehabilitation Services, Office of Blindness and Vision Services. Participants were required to be blind, to be at least 18 years of age, to have been employed at some point within the last few years, and to be screen-reader users who could not use screen magnification (meaning that the participants did not have enough residual or partial vision to rely on their vision in the usability evaluation). The recruitment email also stated that the testing would require an average of three to four hours per participant. One participant showed up for data collection but was found not to meet the screening qualifications; no data was collected from that participant, and a replacement participant was selected.

All 16 participants were either unemployed or employed part-time and were seeking full-time employment, so they were representative of the typical blind person who would be attempting to apply for jobs online. Of the 16 participants, 11 were female and five were male, and the average age was 36.5 years (range: 21-65). All 16 participants had extensive experience using screen-reader technology (an average of 12.06 years) and the Internet (an average of 10.94 years). Three of the participants had never applied for a job online before; the others had previous experience applying for jobs online. Two participants had high school degrees, three had Associate's degrees, nine had Bachelor's degrees, and two had Master's degrees. None of the participants had any additional documented disabilities aside from their vision loss.

Participants were paid $250 for their participation. Some participants took public transportation, and others were dropped off by friends or family members; however, the friends/family members were not allowed to stay in the computer room or assist with the usability evaluation in any way. There was a 5-to-15 minute break between the two application attempts. Because the university Institutional Review Board (IRB) requires signed paper copies of both the IRB form and the payment form, but printed copies are not practical for blind participants, the participants received electronic copies of the documents in advance that they could read. When they arrived for the data collection, they were asked to sign the paper copies, with Braille stickers reading "sign above" to indicate where to place their signature.

No personal participant information was used; a name, resume, and email account were prepared for each participant for use in the study. All resumes submitted were marked "not a real application—submitted for training purposes only" so as not to confuse or waste the time of employers who received the application. There was no stated time limit on how long participants could take to attempt to submit an employment application.

Data Collection

For the data collection, participants were given the URL of the home page of the company/organization and were told to apply for a job in a specified category (e.g., help desk manager or software engineer). We interacted with all of the job application Web sites beforehand to determine which jobs were available on each site. Specific job categories were selected for the participants in advance, and resumes appropriate to each specific job were created for their use (for instance, with appropriate professional experience, degrees, and certifications). All usability evaluations took place on the same computer in the computer lab at the Maryland Division of Rehabilitation Services, Office of Blindness and Vision Services: a Dell Optiplex 760 with an Intel Core 2 Duo CPU, running Microsoft Windows XP Professional Service Pack 3, JAWS 11 (screen-reader software), and Internet Explorer 8. JAWS was selected because it is the dominant screen reader currently in use (WebAIM, 2010). Participants were allowed to adjust the speech output speed to match how they typically interact with a computer. All data collection took place in August and September 2011. Participants were typically in the computer lab for 3-4 hours, including the introduction, signing of forms, description of procedures, the actual usability evaluation, breaks, and wrap-up.

We used a modified usability methodology to learn as much as possible about the barriers to online job applications. Ideally, people with disabilities should be able to apply for a job online without assistance from anyone. However, many of the sites had core features (such as the "search jobs" function) that were inaccessible; if a traditional usability methodology had been used, the researchers could not have offered help or assistance in any way, and the participants would not have made it past the initial inaccessible screens. That scenario would have provided no useful feedback about the accessibility of the other steps in the hiring process. In the modified usability methodology, when participants could not move forward and specifically asked for help, we offered to assist them, and we took careful notes of when we were asked to perform an intervention and the type of intervention performed. Specific data about the interventions are in the Results section of this paper. Aside from the user-requested interventions, we unobtrusively took notes on the steps the participants were taking and did not comment or assist them in any other way. We encouraged the participants to think aloud and state what they were doing, which also informed our notes.

Applying for a job online is really one large task with a number of subtasks. These subtasks cannot be separated out as discrete tasks, because all of them must be completed successfully to reach the ultimate user goal: submitting an application. The specific subtasks for each Web site's application process vary; there is no consistency among sites in the subtasks needed to reach the goal. In comparison, different email applications all have identical, discrete tasks that can be compared across applications, such as adding an email address to an address book, sending an email, responding to an email, and deleting an email (Wentz & Lazar, 2011). While some subtasks are common across job application sites (such as education, certifications, and previous work experience), they are asked in different ways, with differing levels of detail required (e.g., one site asks you to name the university you attended, while another asks you to find it in a list of thousands of universities). The same question may be asked as a single question on one site and broken down into multiple subtasks on another. Furthermore, different job application Web sites have different subtasks, such as salary requirements, date of availability, availability for job travel, hobbies, languages spoken, and work preferences, which are not asked on many of the other Web sites. Some Web sites allow you to upload a resume, and the software on the Web site then takes the data directly from the resume, populates the form fields, and simply asks for confirmation that they are correct; other sites, even with a resume uploaded, do not populate the form fields with any data. It is therefore impossible to compare performance on each subtask across sites, even when those sites use a similar software package for the hiring process, such as the recruitment software from Kenexa (http://www.kenexa.com/recruitment-technology).
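As a rough illustration of why subtask-level comparison is not possible, the application process on two sites might decompose as follows (a hypothetical sketch; the subtask labels are invented for illustration and are not taken from the sites in the study):

    # Hypothetical subtask sequences for two job application Web sites
    site_a_subtasks = ["search jobs", "create account", "upload resume",
                       "confirm auto-filled fields", "submit application"]
    site_b_subtasks = ["search jobs", "create account", "enter education",
                       "enter certifications", "enter work experience",
                       "salary requirements", "travel availability",
                       "submit application"]

    # Only a few steps line up across the two sites, and even those differ in
    # the level of detail required, so per-subtask performance cannot be
    # aligned across sites the way discrete email tasks (send, reply, delete)
    # can be compared across email applications.
    overlap = set(site_a_subtasks) & set(site_b_subtasks)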

Pilot Study

A pilot study was conducted with two blind participants to test the appropriateness of our data collection methods. The pilot sessions did not take place at the location described above for the 16 participants, but rather in the participants' homes. Based on the pilot studies, minor modifications were made to the data collection methods, such as stronger encouragement for participants to think aloud, clearer pre-study instructions, better methods for documenting the interventions, and an increase in the amount of information available on the resumes prepared for participants' use in the study.

 
