Four Years of Accessibility Data in Prominent US Networks
An analysis of web accessibility across UCEDDs, their host institutions, and US states and territories.
In 2016 WebAIM began a longitudinal research program to examine web accessibility in national samples. We analyzed the national network of 67 University Centers for Excellence in Developmental Disabilities (UCEDDs), funded by the Administration on Intellectual and Developmental Disabilities (AIDD), to see if entities that work with, and for, persons with disabilities would perform better on measures of accessibility than those that do not. UCEDDs have a statutory mandate to:
“...provide leadership; advise federal, state, and community policymakers; and promote self-determination, independence, productivity, and full integration of individuals with developmental disabilities”
In this digital age, few of these can occur, and be inclusive, in the absence of web content that can be accessed by all.
Each state and territory has at least one UCEDD which acts as a resource or body of influence to both their host institution and their state government. Given their federal mandate, providing leadership in web accessibility is certainly an appropriate expectation.
Since UCEDDs are housed at major universities or university medical centers, we decided to also examine the accessibility of the host universities (n=67) and of their state or territorial governments (n=54). We believed that doing so would help us account for the contextual variables that surround the UCEDDs. While we will not detail those data here, we would welcome the opportunity to share them with other researchers.
We performed our analyses four times in the last five years: we began in the fall of 2016, repeated collection in 2017 and 2018, and concluded our latest collection in January 2020.
In full disclosure, WebAIM is housed at the Center for Persons with Disabilities at Utah State University, which is Utah’s UCEDD, so there was a natural curiosity about how our UCEDD network was doing with respect to web accessibility. We were also interested in seeing whether improvements had occurred since a prior study: Rowland & Whiting (2010) found that only 8 of the then 64 centers (i.e., 12.5%) demonstrated conformance to Section 508. Since that time, much focus and energy has gone into web accessibility awareness, training, and transformation across our network.
Two questions guided this research:
- What does a high-level automated evaluation of a sample of webpages indicate about the current state of website accessibility for the 67 UCEDDs, the 67 host institutions, and the 54 states and territories in which UCEDDs reside and collaborate?
- Looking at longitudinal data, are systems trending up, or down, with respect to errors detected in the sample?
For each year’s data snapshot we gathered a sample of web pages from the websites of the UCEDDs, their corresponding host institutions, and state governments. Velleman and van der Geest (2013) conducted a pilot study which found that the home page plus 13 randomly selected pages captured 93% of the error types found more broadly on a site. Using this as a model, we collected 14 pages for each entity.
Other than the home page, the process to gather the remaining 13 pages began with a free web crawler (Hammond, n.d.) that identified 250 URLs under each site’s domain. Thirteen HTML pages were then selected using a random number generator, and any invalid pages were manually replaced with another randomly selected URL from the pool. It should be noted that we also ran automated searches of this sample for media files so that a manual inspection for captions could be performed.
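The page-selection step above can be sketched roughly as follows. This is a minimal illustration, not the study's actual tooling: the `is_valid` predicate stands in for the manual validity check, and the function names are hypothetical.

```python
import random

def select_sample(crawled_urls, home_page, is_valid, n_pages=13, seed=None):
    """Randomly pick n_pages valid HTML pages from a crawled pool;
    invalid picks are discarded and replaced by fresh random draws.
    Returns the home page plus the selected pages."""
    rng = random.Random(seed)
    # Draw from the crawled pool, excluding the home page itself.
    pool = [u for u in crawled_urls if u != home_page]
    sample = []
    while len(sample) < n_pages:
        if not pool:
            raise ValueError("crawled pool exhausted before reaching n_pages")
        url = pool.pop(rng.randrange(len(pool)))
        if is_valid(url):  # stand-in for the manual validity check
            sample.append(url)
    return [home_page] + sample
```

In the study the replacement of invalid pages was manual; here it is simply another random draw from the remaining pool.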
For the longitudinal analysis we reanalyzed the URLs that had been analyzed in previous years (2016, 2017, and 2018). If any were no longer available from one year to the next, we eliminated them from the analysis. While the number of pages available to analyze decreased over time, this process ensured direct page-to-page comparisons.
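One way to keep year-over-year comparisons direct, as described above, is to restrict each analysis to URLs present in every collection. A simple sketch, assuming the per-year samples are available as lists of URLs (the data structure here is illustrative, not the study's actual one):

```python
def matched_longitudinal_set(samples_by_year):
    """Given a mapping of year -> list of sampled URLs, return only
    the URLs present in every year's sample, so each year-over-year
    comparison uses exactly the same pages."""
    years = sorted(samples_by_year)
    surviving = set(samples_by_year[years[0]])
    for year in years[1:]:
        # Drop any URL that disappeared by this collection period.
        surviving &= set(samples_by_year[year])
    return surviving
```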
We analyzed the sample pages using the WAVE tool to detect 17 types of programmatically detectable accessibility errors. We also performed manual checks to see if captions were present and correct on pages with media. Our analysis did not determine the extent to which any page conformed to WCAG 2, nor did it determine the extent to which any page was “accessible”. While we did not perform additional human analysis, machine-detectable errors have long been considered low-hanging fruit by those in the web accessibility field.
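The headline metric reported below, the percentage of pages with at least one detectable error, reduces to a simple tally over per-page error counts. A sketch under the assumption that results have already been collected into a mapping of page URL to error count (this is an illustrative structure, not WAVE's actual output format):

```python
def percent_pages_with_errors(error_counts):
    """Given a mapping of page URL -> number of automatically
    detected errors, return the share of pages (as a percentage)
    with at least one error."""
    if not error_counts:
        return 0.0
    flagged = sum(1 for n in error_counts.values() if n > 0)
    return 100.0 * flagged / len(error_counts)
```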
Results of the pooled sample over 4 years
Analyzing the data across a pooled sample of UCEDD, host university, and state government pages provides an overall picture of improvement.
In 2018, 81.3% of the pages in the pooled sample contained at least one of the programmatically detectable errors. By 2020 this had declined to 70.3% of pages containing automatically detectable WAVE errors.
It is important to note that there was a steady decline in the number of pages in our sample over time, from 2,245 pages to 1,687 by the end of the four collection periods. We lost approximately a quarter of the original sample to attrition, which is unfortunate, yet in an era where web pages are updated and removed at regular intervals, this may be typical.
Results for the UCEDDs
For UCEDDs, pages with errors improved from 81.1% of the sample in 2016 to 67.2% in 2020. 2018 had the lowest percentage of pages with errors (66%); the most recent analysis in 2020 saw a slight increase of 1.2 percentage points.
Results for the host institutions
Institutions made the greatest improvement over time. They began in 2016 with 84.5% of their sample pages having errors that WAVE automatically detected. By 2020, this had improved to 65% of pages in the sample–an improvement of nearly 20 percentage points.
Results for the state governments
State government pages sampled across four years are virtually the same in 2020 (78.6% pages with WAVE errors) as they were when we began in 2016 (78.5% pages with WAVE errors). There was a notable increase in errors during the second year of data collection, but pages have since returned to earlier error percentages.
The top 6 errors during the 2020 collection were the following:
- Empty link (n=4,348)
- Missing form label (n=1,347)
- Missing image alternative text (n=1,061)
- Linked image missing alternative text (n=890)
- Broken ARIA reference (n=859)
- Empty button (n=783)
If developers and designers focused on just these few error types, a great deal of accessibility would be gained.
Both UCEDD and host institution pages decreased their WAVE-detectable errors, by 13.9 and 19.5 percentage points respectively, over the past 4 years. It is important to reflect on the practical significance of this: it may not be viewed as a tremendous improvement over 4 years, but it is trending in the right direction. Improvements among UCEDDs and host institutions are aligned, which may be the result of their ability to share resources, expertise, and practices, facilitated by their physical proximity.
The results from state governments were not aligned with the others and were relatively flat from the beginning to the ending collection points. Given that UCEDDs have a statutory obligation to provide leadership and advise state policymakers, the work of web accessibility may be a perfect topic to address with state governments. Certainly, full inclusion of citizens with disabilities cannot happen without an accessible web.
Our finding that most errors are concentrated in relatively few categories may be helpful. Developers and designers can, and should, consider focusing on these error types; doing so could make a huge impact.