JUS - Journal of Usability Studies
An international peer-reviewed journal

User Experience and Accessibility: An Analysis of County Web Portals

Norman E. Youngblood and Susan A. Youngblood

Journal of Usability Studies, Volume 9, Issue 1, November 2013, pp. 25-41


Below, we describe this study’s design, our materials, and our procedures, arranged by line of inquiry.

Study Design

After compiling a list of counties, we identified websites that serve as portals: sites that represent not a single department but the county as a whole, often linking to departments and resources available in a given county. We then conducted a heuristic evaluation of the usability of each homepage. Next came an automated evaluation of best coding practices (the use of valid and accessible coding) and a manual evaluation of the homepage code to see whether the designers had employed Cascading Style Sheets (CSS). Style sheets consolidate the code governing appearance and separate it from the semantic, or meaning-laden, code, making sites more consistent and easier to maintain. We also examined the code for mobile-specific CSS, coding that makes a site more functional on a mobile device. We then analyzed the data to test each of our hypotheses.
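A manual check for mobile-specific CSS can be approximated by pattern-matching a page's markup and style sheets for media queries. The following is a minimal sketch, not the authors' instrument; the function name and regular expression are our own, and real pages would also require fetching and inspecting externally linked style sheets.

```python
import re

# Matches CSS media queries that target viewport or device width,
# e.g. "@media screen and (max-width: 480px)", or the older
# "handheld" media type -- common signals of mobile-aware CSS circa 2011.
MOBILE_CSS_PATTERN = re.compile(
    r"@media[^{]*\((?:min|max)-(?:device-)?width"
    r"|@media[^{]*\bhandheld\b",
    re.IGNORECASE,
)

def has_mobile_css(source: str) -> bool:
    """Return True if the HTML/CSS source contains a mobile-oriented media query."""
    return bool(MOBILE_CSS_PATTERN.search(source))
```

A page that passes this check is not necessarily well adapted to mobile devices; the pattern only flags the presence of width- or device-targeted media queries.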

The tests for each hypothesis examined correlations between the website variables and, separately, each of three county demographic variables: population, per-capita income, and median household income. When the website data were continuous, we tested for Pearson product-moment correlations (Pearson's r); when the website data were dichotomous, we tested for point-biserial correlation (rpb), a variant of Pearson's r. In all cases, we set the level of significance at p < 0.05 to establish critical values and compared the t-value of the correlation coefficient, r or rpb, to the critical value of t to test for significance. When the expected correlation was directional, we used a one-tailed test.
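The point-biserial coefficient is numerically the Pearson r computed between a 0/1 variable and a continuous one, and its significance test uses the standard t transform with n - 2 degrees of freedom. A minimal sketch of the computation follows; the data, variable names, and function names are illustrative, not the study's.

```python
import math

def point_biserial(x, y):
    """Point-biserial correlation between dichotomous x (coded 0/1) and continuous y.

    Numerically identical to Pearson's r computed on the same pairs.
    """
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
    return cov / (sx * sy)

def t_value(r, n):
    """t statistic for testing a correlation r against zero (df = n - 2)."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

# Illustrative data: x flags a dichotomous site feature, y is a demographic value.
x = [0, 0, 0, 1, 1, 1]
y = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
r_pb = point_biserial(x, y)
t = t_value(r_pb, len(x))
# Compare t to the critical value from a t table at the chosen alpha,
# using a one-tailed test where the hypothesis is directional.
```

In practice a statistics package would supply both the coefficient and its p-value directly; the hand computation above simply makes the r-to-t relationship explicit.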


We viewed the pages on an Apple computer running OS X 10.6.8 with Firefox 6.0.2. We set the browser window to 1024 x 768 pixels, using Free Ruler v. 1.7b5 to check page sizes. We tested each county portal homepage's accessibility with the WAVE online accessibility tool (WebAIM, n.d. b) described above, and we tested compliance with World Wide Web Consortium (W3C) coding standards by submitting each county homepage to the W3C's HTML validation tool (W3C, 2010).


The following sections discuss how we categorized our data into four primary areas: the adoption of portals, usability, best coding practices, and the adoption of new communication technology.

Adoption of portals

To examine adoption of portals—main county websites—we compiled a list of county web portal addresses for the 67 Alabama counties. We built the list by doing the following:

  1. We examined the 45 county links on the State of Alabama website (State of Alabama, n.d.).
  2. We searched for portals for 10 links that were incorrect (either abandoned URLs or links to an organization other than the county, such as a city in the county).
  3. We searched for the 22 counties listed as not having portals.

To identify missing sites, we used a Google search for the county name and Alabama, and we examined the first 30 results. In all, we identified 39 county portal websites out of the 67 counties in Alabama. Once we identified the sites, we collected our data during a 24-hour period in September 2011.

One county’s portal was not functional during data collection. That county’s data were collected at a later date and used for the analysis of portal presence (n = 67). The remaining analyses examined only the portals functional during the study (n = 38). We did not code departmental sites, such as county school district and county sheriff sites. Nor did we include county commissioner sites, because some serve only their commissions, functioning as departmental sites.


To assess adherence to basic, broadly accepted usability standards, we used a set of 14 dichotomous web usability standards developed from prior research (Cappel & Huang, 2007; Youngblood & Mackiewicz, 2012; Pew Center on the States, 2008) to code the homepage of each site and, for several measures, a sample of three internal (secondary) pages. We analyzed all portals functional during the study (n = 38). For each standard, we recorded adherence to the practice, which contributes to site usability, as a 1 and failure to adhere as a 0. The standards, listed in detail in the Results section, are grouped by adherence to the following: