E-mail List Archives

Re: What we found when we tested tools on the world's least-accessible webpage


From: Mehmet Duran
Date: Feb 27, 2017 12:02PM

Hello all,

Thanks a lot for your comments. We were really hoping to hear back from
tool developers, because we feel automated testing tools are essential for
any project. We tried to be careful to point out that we don't think a
single tool should come first and be used by everyone - rather, we wanted
to give our service teams some information on picking the right tool for
their needs.
This work started with our frustration at trying to find out what tools
do and don't test. Some of our teams were relying on the tools too much and
skipping manual testing entirely. On the other hand, we also wanted to show
how many of the common problems could be caught automatically and convince
them to use an automated tool.

We initially wanted to start with all the barriers we could find, but that
task quickly became too much for a single developer to handle. We instead
concentrated on the problems we keep seeing around the digital service
teams in UK Government departments. You might feel not all of these
barriers are realistic, but most of them are based on what we've seen.

We're open to ideas and we'd like to make sure our audit is helpful to both
tool developers and users. That's why this is an open source project that
everyone can contribute to [1]. Apart from Jared, Fabrice from the Asqatasun
team has reported the discrepancies he's spotted [2], and we'll take a look
at those soon as well. We're really pleased with the feedback!

Jared - thanks a lot for your comments. It's great to see WAVE has improved
since our last test, and we've updated our results accordingly. We've also
added some comments explaining how we classified our findings. We'd
love to hear what you think. Here's a link to the PR:


[1] https://www.gov.uk/design-principles#tenth
[2] https://github.com/alphagov/accessibility-tool-audit/issues/5

On 25 February 2017 at 03:35, Birkir R. Gunnarsson <
<EMAIL REMOVED> > wrote:

> Well said Jared and co.
> I really do not like tools that flag every possible error (often with
> little to no user impact) as an error, and flag everything as
> critical.
> As the saying goes, "if everything is critical, then nothing is critical."
> For those who may secretly think that I sold out after transferring
> to a mainstream company: no, on the contrary. I get more uptight about
> usability issues that are not necessarily WCAG violations, but I
> concentrate on things that I believe make a real difference to real
> users, and tools that report a bunch of false or borderline technical
> accessibility problems with questionable user impact are not helping
> the journey towards a truly inclusive digital experience.
> In my evaluation of a variety of accessibility tools, I was shocked to
> find this was often the case (and, no, WebAIM was not among them; I
> can't wait for the WebAIM keyboard accessible Firefox plug-in).
> On 2/24/17, Moore,Michael (Accessibility) (HHSC)
> < <EMAIL REMOVED> > wrote:
> > I agree with that assessment. Some things, like empty data cells in a
> > table, were suggested as errors; I would have a hard time teaching our
> > developers to interpret the results if things like that were flagged. You
> > will also be happy to know that our current process for our developers
> > starts with the WAVE tool in Chrome and has resulted in a dramatic drop
> > in accessibility bugs found at QA time.
> >
> > Mike Moore
> > EIR (Electronic Information Resources) Accessibility Coordinator
> > Texas Health and Human Services Commission
> > Civil Rights Office
> > (512) 438-3431 (Office)
> >
> >
> >
> > Making electronic information and services accessible to people with
> > disabilities is everyone's job. I am here to help.
> >
> > -----Original Message-----
> > From: WebAIM-Forum [mailto: <EMAIL REMOVED> ] On Behalf
> > Of Jared Smith
> > Sent: Friday, February 24, 2017 3:23 PM
> > To: WebAIM Discussion List < <EMAIL REMOVED> >
> > Subject: Re: [WebAIM] What we found when we tested tools on the world’s
> > least-accessible webpage
> >
> > Thanks for sharing this. Their premise that automated tools are limited
> > is spot on. Their methodology for reviewing the tools, however, is rather
> > questionable. Many of the "barriers" that they identified have no or
> > negligible end user impact. They seem to suggest that if a tool does not
> > indicate an "error" for every possible interpretation of some
> > accessibility issue or guideline, then that tool is somehow flawed.
> >
> > Our approach with WAVE is to facilitate human evaluation and focus the
> > evaluator on things that actually have an impact - not sending them on a
> > wild goose chase fixing "errors" that don't have any impact on actual end
> > user accessibility. This study would suggest that the tool that flags the
> > most "errors" is somehow best.
> >
> > They also made significant errors in their analysis of WAVE. I found at
> > least 8 items that WAVE readily flags that they somehow overlooked or
> > recorded incorrectly. I've notified them of these errors -
> > https://github.com/alphagov/accessibility-tool-audit/issues/3 - and hope
> > they update their results accordingly.
> >
> > Jared Smith
> > WebAIM.org
> > archives at http://webaim.org/discussion/archives
> --
> Work hard. Have fun. Make history.

Mehmet Duran
+44 7503 388 345
@cfq <https://twitter.com/cfq>