UK accessibility investigation of 1000 web sites - results released April 14th 2004
An investigation of 1000 UK web sites carried out on behalf of the Disability Rights Commission (DRC) reveals unacceptably poor (in fact, woeful) accessibility. At least 81% of sites failed to meet the minimum accessibility standard, and the true figure is likely to be higher still.
The 'formal investigation', launched in March 2003, is the most comprehensive research to date into the discrimination faced by disabled people on the Internet. You can download the full report from the DRC site (unfortunately, at the time of writing, an accessible HTML version is not yet available). The investigation was performed by the Centre for Human-Computer Interaction Design at City University, London.
The investigators performed automated accessibility checking of 1000 home pages across 5 sectors (government, business, e-commerce, entertainment, web services), and then, for 100 sites, more in-depth manual checking and user testing. The user testing involved 50 disabled web users. The user group included blind and partially sighted people, people who were profoundly deaf or hard of hearing, people with specific learning difficulties such as dyslexia, and people with physical impairments. Each user evaluated 10 sites and was asked to complete 2 relatively simple tasks per site (e.g. find the rate of interest on a banking site).
Researchers also canvassed the views of more than 700 businesses which might commission web sites, and nearly 400 web site developers. This survey was backed up by interviews with 25 of the respondent businesses.
Results highlights and comment
Automated checking against WAI WCAG 1.0 - 1000 home pages
Note: automated checking covers only a subset of the 65 Web Accessibility Initiative (WAI) Web Content Accessibility Guidelines version 1.0 (WCAG 1.0) checkpoints - to verify compliance, manual checks must also be performed.
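To illustrate what a machine-testable checkpoint looks like, the sketch below flags images that lack alt text (in the spirit of WCAG 1.0 checkpoint 1.1, "provide a text equivalent for every non-text element"). This is a deliberately minimal example using Python's standard-library parser, not the tooling used in the DRC study; it also shows why automated checking is only a subset - a machine can detect a *missing* alt attribute, but judging whether existing alt text is *meaningful* still requires a manual check.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Records the position of every <img> tag that has no alt attribute."""

    def __init__(self):
        super().__init__()
        self.violations = []  # list of (line, column) positions

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())

def check_alt_text(html_source):
    """Return positions of <img> tags missing an alt attribute."""
    checker = AltTextChecker()
    checker.feed(html_source)
    return checker.violations

page = ('<html><body>'
        '<img src="logo.gif" alt="Company logo">'
        '<img src="photo.gif">'  # violation: no alt attribute
        '</body></html>')
print(check_alt_text(page))  # one violation reported
```

Note that `<img src="photo.gif" alt="">` would pass this automated check even though empty alt text may or may not be appropriate for the image - exactly the kind of judgement the manual checks in the study had to supply.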
Further analysis of the automated checking revealed that:
Unfortunately, the 81% failure rate for web sites is likely to be even higher:
User testing of 100 sites
50 people from the test user panel completed a total of 913 tasks (each user was asked to attempt 2 relatively straightforward tasks on each of 10 sites). Many of the tests were performed by the users themselves at home (or at their place of work); a subset were performed in a usability laboratory, observed by the researchers. The researchers compared the results of the individual and lab tests, concluded that they were equivalent, and therefore pooled all tasks. The exception was that at-home users could not always be certain whether they had succeeded or failed to complete a task; tasks for which this was uncertain were discounted, leaving a total sample of 769 tasks. Individual accessibility and usability problems were noted during the testing sessions and categorized by the researchers.
'Control testing' was performed on 6 sites: 3 rated 'high' and 3 rated 'low' for accessibility were tested both by blind screen-reader users and by matched (in terms of web experience etc.) sighted users with no disabilities (the control users).
The tasks were, on the whole, relatively simple in scope. Testing of more complex (but equally representative) tasks, such as site/service registration or account maintenance, would be likely to increase the failure rate - so the real situation may be even worse.
There is some dispute over the 45% of issues that did not relate to WAI checkpoints, and an initial response by the WAI attempts to clarify the matter. The WAI's analysis of the available data suggests that 95% of the issues are in fact covered by WAI guidelines - 77% by WCAG 1.0 and the remaining 18% by the User Agent Accessibility Guidelines 1.0 (which, together with WCAG 1.0 and the Authoring Tool Accessibility Guidelines 1.0, form a complementary set of guidelines from the WAI).
Essentially, the WAI guidelines taken as a whole do appear to cover the vast majority of the issues uncovered in the DRC investigation; if they are followed, good levels of accessibility are achieved. The disparity in the figures for issues that relate directly to checkpoints in WCAG 1.0 (55% v 77%) reveals one of the known problems with this 1999 version of the guidelines, namely that some checkpoints are open to interpretation and/or difficult to assess objectively. The new version, Web Content Accessibility Guidelines 2.0 (currently a working draft), seeks to address this as well as providing other improvements. Both the WAI and the DRC are keen to point out that developers should follow the guidelines for site design - WCAG 1.0 - but should not follow them in isolation: user testing, they both agree, is very, very important.
Gaining better usability for all users by implementing better accessibility (the 'usability bonus') has long been an anecdotal assertion (one which I would certainly make), and the results from the DRC study support it. However, there are likely to be many factors at play. A trivial example: a site owner who cares about accessibility is probably more likely to care about usability and general site quality too, and may therefore have arrived at a more usable site independently of the accessibility work. It would be good to see a more focused study of this, testing across more sites, with more users and more user categories.
Questionnaires were sent to 712 web site commissioners and 388 web development agencies, and interviews were conducted with 21 commissioners and 25 agencies.
It is my experience (and that of many people involved with accessibility) that, unfortunately, the threat of legal action is the prime motivator for the majority of organizations - a different picture from the result indicated above. Indeed, when the DRC investigation was first announced it created much interest, as it was mistakenly reported that it would be a naming-and-shaming exercise. A report with more bite might have created more impact, but it was the express aim of the DRC to establish the current state of play and to raise awareness of, and support for, better levels of accessibility, rather than to hit people with legal proceedings. Clearly, increased awareness and knowledge is what is required. NB: it was indicated that if organizations did not respond to 'encouragement', the DRC would support more formal action being taken against individual organizations in the future.
The DRC produced a number of recommendations in the report (not detailed here - please read the recommendations for yourself), many centred around raising awareness in organizations, providing support and education for developers and for people using assistive technology, and making certain policy changes. There were also a number of specific recommendations and observations that might be useful to those involved in web accessibility work.
It was recommended that web site developers should involve disabled users from an early stage in the design process - this is certainly something that should be advocated. Unfortunately, as many of us know, at present it is often difficult enough to get any user involvement included at all - we can hope, though, that this may help nudge people in the right direction.
One of the most interesting recommendations was that the UK Government should promote a formal accreditation process and a certification scheme resulting in an accessibility 'kite mark'. This would certainly be valuable, but it would be a highly complex and difficult proposition to set up, maintain and properly police - it will be interesting to see how things develop on this front in the near future.
Formal confirmation of what many of us knew already - that general levels of accessibility suck, and suck badly. Levels of accessibility in other countries are likely to be similar (although Section 508 in the US is likely to have prompted some improvement there). Hopefully the investigation will raise awareness and stimulate action. Let's all get accessible!