Karl Groves recently published automated web accessibility test data for many of the Alexa Top 100 web sites. The results paint a rather stark picture of web accessibility. We agree with Karl’s suggestion that while automated testing is not a direct indicator of true accessibility issues, “poor performance in automated testing is strongly correlated with poor performance in manual testing.” Jennison commented that not all errors are created equal, and this is very true, yet the preponderance of automated errors is clearly indicative of serious issues.
Care should be taken in interpreting these results. These data should not be used to cast a sweeping judgment on any site. Home pages are often dissimilar to content pages, though these data generally correlate with Karl’s more extensive analysis. Regardless, the fact that home pages averaged 25 errors each, and that only 4 of the 100 home pages had zero errors, is rather telling. There is much that still needs to be done to improve web accessibility.
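To make the idea of an "automated error" concrete, here is a minimal sketch of the kind of check such tools perform. This is purely illustrative and is not Karl's tool or methodology; it flags one easily machine-detectable error class, images missing alternative text, using only Python's standard library.

```python
from html.parser import HTMLParser

# Toy checker for one class of automatically detectable errors:
# <img> elements with no alt attribute at all.
class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.errors = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag
        if tag == "img" and "alt" not in dict(attrs):
            self.errors += 1

sample = '<p><img src="logo.png"><img src="photo.jpg" alt="A photo"></p>'
checker = MissingAltChecker()
checker.feed(sample)
print(checker.errors)  # 1: the first image lacks alt text
```

Real automated tools test dozens of such rules at once, which is why counts like these accumulate quickly on complex home pages even though each individual check is simple.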