WebAIM - Web Accessibility In Mind

E-mail List Archives

Thread: What we found when we tested tools on the world’s least-accessible webpage


Number of posts in this thread: 10 (In chronological order)

From: Jennifer Sutton
Date: Fri, Feb 24 2017 1:10PM
Subject: What we found when we tested tools on the world's least-accessible webpage
No previous message | Next message →

Greetings, WebAIM and others who're bcc-ed:


I thought some of you might find this research, conducted in the UK, of
some interest.

I'll also be posting to the WAI-IG list.


Best,
Jennifer


What we found when we tested tools on the world's least-accessible webpage
https://accessibility.blog.gov.uk/2017/02/24/what-we-found-when-we-tested-tools-on-the-worlds-least-accessible-webpage/

From: Lucy Greco
Date: Fri, Feb 24 2017 1:12PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

This was a great find. I liked it, but I would love to have access to the page
they created.

Lucia Greco
Web Accessibility Evangelist
IST - Architecture, Platforms, and Integration
University of California, Berkeley
(510) 289-6008 skype: lucia1-greco
http://webaccess.berkeley.edu
Follow me on twitter @accessaces



From: Jennifer Sutton
Date: Fri, Feb 24 2017 1:20PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

Lucy et al:

This page was linked from within the article, and I believe it's what
you're looking for.

If not, it'd probably help to comment on the post, where they will see
your request, rather than posting to the list.


Best,

Jennifer


https://alphagov.github.io/accessibility-tool-audit/test-cases.html




From: Jared Smith
Date: Fri, Feb 24 2017 2:22PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

Thanks for sharing this. Their premise that automated tools are
limited is spot on. Their methodology for reviewing the tools,
however, is rather questionable. Many of the "barriers" that they
identified have no or negligible end user impact. They seem to suggest
that if a tool does not indicate an "error" for every possible
interpretation of some accessibility issue or guideline, that somehow
that tool is flawed.

Our approach with WAVE is to facilitate human evaluation and focus the
evaluator on things that actually have an impact - not sending them on
a wild goose chase fixing "errors" that don't have any impact on
actual end user accessibility. This study would suggest that the tool
that flags the most "errors" is somehow best.

They also made significant errors in their analysis of WAVE. I found
at least 8 items that WAVE readily flags that they somehow overlooked
or recorded incorrectly. I've notified them of these errors -
https://github.com/alphagov/accessibility-tool-audit/issues/3 - and
hope they update their results accordingly.

Jared Smith
WebAIM.org

From: Lucy Greco
Date: Fri, Feb 24 2017 2:30PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

Yes, that is what I was looking for. I must have read through too fast to find
it myself. *grin*

Lucia Greco
Web Accessibility Evangelist
IST - Architecture, Platforms, and Integration
University of California, Berkeley
(510) 289-6008 skype: lucia1-greco
http://webaccess.berkeley.edu
Follow me on twitter @accessaces



From: Moore,Michael (Accessibility) (HHSC)
Date: Fri, Feb 24 2017 3:03PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

I agree with that assessment. For some of the things they suggested were errors, like empty data cells in a table, I would have a hard time teaching our developers to interpret the results if they were flagged. You will also be happy to know that our current process for our developers starts with the WAVE tool in Chrome and has resulted in a dramatic drop in accessibility bugs found at QA time.

Mike Moore
EIR (Electronic Information Resources) Accessibility Coordinator
Texas Health and Human Services Commission
Civil Rights Office
(512) 438-3431 (Office)



Making electronic information and services accessible to people with disabilities is everyone's job. I am here to help.


From: Birkir R. Gunnarsson
Date: Fri, Feb 24 2017 8:35PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

Well said, Jared and co.
I really do not like tools that flag every possible error (often with
little to no user impact) as an error, and flag everything as
critical.
As the saying goes, "if everything is critical, then nothing is critical."
For those who may secretly think that I sold out after transferring
to a mainstream company: no, on the contrary, I get more uptight about
usability issues that are not necessarily WCAG violations. But I
concentrate on things that I believe make a real difference to real
users, and tools that report a bunch of false or borderline technical
accessibility problems with questionable user impact are not helping
the journey towards a truly inclusive digital experience.
In my evaluation of a variety of accessibility tools, I was shocked to
find this was often the case (and no, WebAIM was not among them; I
can't wait for the WebAIM keyboard-accessible Firefox plug-in).





--
Work hard. Have fun. Make history.

From: Mehmet Duran
Date: Mon, Feb 27 2017 12:02PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

Hello all,

Thanks a lot for your comments, we were really hoping to hear back from
tool developers because we feel automated testing tools are essential for
any project. We tried to be really careful to point out we don't think a
single tool should come first and be used by everyone - rather, we wanted
to give our service teams some information on picking the right tool for
themselves.

This work started with our frustration with trying to find out what tools
do and don't test. Some of our teams were relying on the tools too much and
skipping manual testing entirely. On the other hand, we also wanted to show
how many of the common problems could be caught automatically, and to convince
them to use an automated tool.

We initially wanted to start with all the barriers we could find, but that
task quickly became too much for a single developer to handle. We instead
concentrated on the problems we keep seeing around the digital services
teams in UK Government departments. You might feel that not all of these
barriers are realistic, but most of them are based on what we've seen.

We're open to ideas and we'd like to make sure our audit is helpful to both
tool developers and users. That's why this is an open source project for
everyone to contribute [1]. Apart from Jared, Fabrice from the Asqatasun
team has reported the discrepancies he's spotted [2] and we'll give it a
look soon, as well. We're really pleased with the feedback!

Jared - thanks a lot for your comments. It's great to see WAVE has improved
since our last test, and we've updated our results accordingly. We've also
added some comments that explain how we classified our findings. We'd
love to hear what you think. Here's a link to the PR:
https://github.com/alphagov/accessibility-tool-audit/pull/6

Mehmet

[1] https://www.gov.uk/design-principles#tenth
[2] https://github.com/alphagov/accessibility-tool-audit/issues/5




--
Mehmet Duran
+44 7503 388 345
@cfq <https://twitter.com/cfq>

From: Jared Smith
Date: Mon, Feb 27 2017 12:12PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | Next message →

Mehmet -

Thank you for the follow-up, and for the updates. This really is a
very useful chart. I still worry a bit about the perception that
identification of more issues is better for accessibility
implementers, but it is informative to know what various tools test.
And this has helped us identify a few items for implementation or
improvement in WAVE.

I would ask that you also update the "How did each tool do?" chart at
https://alphagov.github.io/accessibility-tool-audit/index.html based
on the updated results data.

Thanks,

Jared

From: Birkir R. Gunnarsson
Date: Mon, Feb 27 2017 12:45PM
Subject: Re: What we found when we tested tools on the world's least-accessible webpage
← Previous message | No next message

Mehmet

I really like what you are trying to do here, it is much needed and I
applaud the effort.
I will go in as soon as I have time and do more analysis for more
useful and specific feedback.

But there are several big-picture factors I would think about that are
important for accessibility tool selection:

1. Is the user problem you are noticing actually a WCAG violation? WCAG is
far from perfect, and it does not cover all problems users run into.
Feedback on real issues encountered by real users with disabilities
that are not covered by WCAG (2.0 A or AA) would be great for the
WCAG working group. But accessibility testing tools are mostly
limited to actual WCAG violations to remain consistent (tools could
have optional usability warnings, but those should be clearly flagged
as such). Make sure that every violation you create is definitely a
WCAG violation, and map it to the relevant success criterion (this will
help further analysis, such as where individual tools fail, or where
tools fail in general).

2. Come up with some type of measurement of the number of "false
positives" or non-WCAG violations reported by a tool, even something
as simple as the difference between the total issues a tool reported and
the issues you created that the tool caught. This is important, because a
tool could report every potential or possible thing as an accessibility
violation. Sure, it could uncover the most issues, but it would also
create a huge amount of unnecessary work for developers and would not
be a productive use of people's time (and, just for the record, I am not
accusing Tenon of this; it is a great tool written by one of the
leading experts in the accessibility sector). You need to deduct
points from tools that report false positives. There is a tool,
not on your list, where you have to contact the vendor specifically to
request that it not flag all AAA issues as errors. That shows a
fundamental misunderstanding of WCAG.
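
A simple way to turn this into numbers is a precision/recall-style score per
tool. This is only an illustrative sketch; the audit does not publish such a
metric, and the field names (`reported`, `confirmed`, `totalBarriers`) are
made up here:

```javascript
// Score a tool given: how many errors it reported, how many of those
// mapped to a real verified barrier, and how many barriers existed in total.
function scoreTool({ reported, confirmed, totalBarriers }) {
  return {
    // precision: share of the tool's reports that were real barriers
    // (low precision = lots of false positives / wasted developer time)
    precision: reported === 0 ? 1 : confirmed / reported,
    // recall: share of the known barriers the tool actually caught
    recall: totalBarriers === 0 ? 1 : confirmed / totalBarriers,
  };
}

// A tool that reports 40 errors, 30 of them real, against 60 known barriers:
const score = scoreTool({ reported: 40, confirmed: 30, totalBarriers: 60 });
```

A tool that flags everything would maximize recall while its precision
collapsed, which is exactly the failure mode described above.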

3. Ensure all your tools test the DOM. I think this is almost universal
among the tools currently in use, but it is important to verify. You could fix a
couple of your accessibility defects using JavaScript that runs when
the page loads (e.g. adding alt text to a couple of images, or creating
an association between a label and a form field) and check that the tool
sees the repaired DOM rather than the broken source.
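
The kind of load-time fix described here can be sketched as below. This is an
illustrative example only; the helper names (`addFallbackAlt`,
`associateLabel`) are hypothetical, not from any tool or from the audit:

```javascript
// Repair two common defects after the page loads. A DOM-testing tool run
// afterwards should see the fixed attributes; a source-only checker will not.

function addFallbackAlt(img, fallback) {
  // Only add alt when the attribute is absent entirely; alt="" is a
  // deliberate "decorative image" signal and must be left alone.
  if (img.getAttribute('alt') === null) {
    img.setAttribute('alt', fallback);
  }
}

function associateLabel(label, field, id) {
  // Give the field an id and point the label's "for" attribute at it,
  // creating an explicit label/field association.
  field.setAttribute('id', id);
  label.setAttribute('for', id);
}

// In a browser you would run these once the DOM is ready, e.g.:
// document.addEventListener('DOMContentLoaded', () => {
//   document.querySelectorAll('img').forEach(img => addFallbackAlt(img, 'Chart'));
// });
```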

4. Does the tool evaluate ARIA? ARIA has made accessibility testing
much more complex, because it can be used to add functionality and fix
problems. When properly used (along with the proper functionality), ARIA is a
valid solution and cannot be shrugged off by reporting its use as an
error.
The tool should check whether valid roles and attributes are used, and
whether they are appropriate for the widget or element.
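
A minimal sketch of that check: verify that a role is a real ARIA role and
that the states it requires are present. The tables below cover only a few
roles for illustration; a real checker would build them from the full
WAI-ARIA specification:

```javascript
// Tiny illustrative subset of ARIA roles and their required states.
const KNOWN_ROLES = new Set(['button', 'checkbox', 'tab', 'tabpanel', 'dialog']);

const REQUIRED_ATTRS = {
  checkbox: ['aria-checked'], // a checkbox role must expose its checked state
  tab: ['aria-selected'],     // a tab role must expose its selected state
};

// Given a role string and a map of aria-* attributes, return a list of
// human-readable problems (empty list = nothing to flag).
function checkAria(role, attrs) {
  const problems = [];
  if (!KNOWN_ROLES.has(role)) {
    problems.push(`unknown role: ${role}`);
  }
  for (const required of REQUIRED_ATTRS[role] || []) {
    if (!(required in attrs)) {
      problems.push(`role "${role}" is missing ${required}`);
    }
  }
  return problems;
}
```

The point is that correct ARIA usage produces no report at all, while a bogus
role or a missing required state is flagged specifically.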

I know this is a huge undertaking, but these could be good
considerations for a future evolution of what you are trying to do
here.
You are off to a very cool start.

Cheers
-B




--
Work hard. Have fun. Make history.