E-mail List Archives

Re: What we found when we tested tools on the world's least-accessible webpage


From: Jared Smith
Date: Feb 24, 2017 2:22PM

Thanks for sharing this. Their premise that automated tools are
limited is spot on. Their methodology for reviewing the tools,
however, is rather questionable. Many of the "barriers" that they
identified have no or negligible end user impact. They seem to suggest
that if a tool does not indicate an "error" for every possible
interpretation of an accessibility issue or guideline, then that tool
is somehow flawed.

Our approach with WAVE is to facilitate human evaluation and focus the
evaluator on things that actually have an impact - not sending them on
a wild goose chase fixing "errors" that don't have any impact on
actual end user accessibility. This study's approach implies that the
tool that flags the most "errors" is somehow best.

They also made significant errors in their analysis of WAVE. I found
at least 8 items that WAVE readily flags but that they overlooked or
recorded incorrectly. I've notified them of these errors -
https://github.com/alphagov/accessibility-tool-audit/issues/3 - and
hope they update their results accordingly.

Jared Smith