WebAIM - Web Accessibility In Mind

E-mail List Archives

Re: Testing for accessibility compliance

From: Monir ElRayes
Date: Apr 29, 2010 3:00PM


I would add to this the very important - and often overlooked - issues of:

1) Who is making content accessible; and
2) What type of content we are talking about.

These issues have a great impact on how much automation is required. Allow
me to explain.

Currently, accessibility is still in the realm of "specialists". The average
author of content (i.e. almost everybody) knows very little about what is
required to make content accessible. This may not be such a big issue if all
we're talking about is websites, since the rate at which new content is
added to websites is relatively low. However, if we start to look at the
massive amount of content being created every day in the form of documents
(Word, Excel, PowerPoint, PDF, etc.) by non-specialists, it becomes clear
that we need a tool-based approach that caters to the non-specialist author
of content.

Such author-level tools, by definition, are expert systems that guide the
non-specialist author and allow the creation of accessible content. Only by
equipping the authors with such tools can we deal with the incredible volume
of content being created every day.

Best Regards,

Monir ElRayes
President
NetCentric Technologies
613-270-9582 ext 203
613-797-8563
<EMAIL REMOVED>
www.net-centric.com
 

-----Original Message-----
From: <EMAIL REMOVED>
[mailto: <EMAIL REMOVED> ] On Behalf Of Hoffman, Allen
Sent: Thursday, April 29, 2010 2:33 PM
To: WebAIM Discussion List
Subject: Re: [WebAIM] Testing for accessibility compliance

I don't think it's a question of automated vs. manual testing; accessibility
cannot be determined automatically with the technologies currently available.

However:

Ensuring accessibility of large collections of content and information
assets requires some of the following:

- Baseline automated scanning to gain visibility of easy-to-see
accessibility issues, e.g. missing alt text, unlabeled form fields,
server-side image maps, etc.
- A content review process for new content, using either a very
extensive manual process, or a manual process plus an automated scanner.
This prevents degradation from the baseline by newly authored content.
- Remediation planning for baseline items, e.g. fix the stuff you
know is wrong over a determined time frame.
- Periodic measuring/reporting of remediation progress and of new
content documentation.

By looking at what you have, checkpointing the new stuff, and setting
realistic goals for remediating the old stuff over time, you can get
there and stay accessible.


-----Original Message-----
From: Langum, Michael J [mailto: <EMAIL REMOVED> ]
Sent: Thursday, April 29, 2010 8:11 AM
To: 'WebAIM Discussion List'
Subject: Re: [WebAIM] Testing for accessibility compliance

This is a great discussion. But in addition to automated vs. manual
testing, there is the practical matter that rigorous testing requires
more time and resources than many have available.

If there is anyone who manually tests all content (HTML, PDF, et al.) in
a moderate-to-large production environment and still keeps management
happy, I'd like to know your secrets.

-- Mike


-----Original Message-----
From: <EMAIL REMOVED>
[mailto: <EMAIL REMOVED> ] On Behalf Of Kevin Miller
Sent: Tuesday, April 27, 2010 3:43 PM
To: WebAIM Discussion List
Subject: Re: [WebAIM] Testing for accessibility compliance


All arguments aside about automated vs. human assessment (where I agree
that humans are needed to do it right), automated tests are helpful to
catch the easy, repetitive stuff as well as help point to where the
biggest pain points on a site are. Also, a good tool should be able to
record tests that should be reviewed manually and when that review was
completed.

Trouble is, automated tools fail miserably on several fronts:


1. They assume end users know HTML, which very frequently they should
not need to know.
2. They don't integrate with a CMS, so they can't tell where an author's
content ends and a template begins.
3. Their responses are highly technical and not helpful for end users.
4. They are "reactive": they poll a site periodically for changes and
generate reports that someone has to wade through; in the meantime, the
accessibility problems are still out there.
5. Their reports look ugly and are unintuitive.

While WAVE is nicer, it's again going to suffer from the first four
problems above.

<Begin Shameless plug>

At CSU Monterey Bay we started an open-source project called QUAIL
(QUAIL Accessibility Information Library) - available at
http://quail-lib.org - to start writing a generalized PHP accessibility
library that could integrate with a CMS. There's a web service around
QUAIL so it can run as a service for other non-PHP CMSes.

Out of that came the Drupal module Accessible Content (
http://drupal.org/project/accessible_content). Now we can customize
error messages, check content on the fly, and even prevent pages with
severe
(read: easily automated) errors from being published. It also allows
permissioned users to override tests after manual review and can do
reporting. There's a video at the QUAIL site.

</End shameless plug>
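The publish-gate workflow described above (block pages with severe, machine-detectable errors; let permissioned users override a test after manual review) can be sketched as follows. This is a hypothetical illustration in Python, not the actual API of the Accessible Content module; all names and severity levels are invented for the example.

```python
# Hypothetical publish gate: a page may be published only when every
# severe (machine-detectable) error is fixed or has been waived by a
# permissioned reviewer. Not the real module's API.
from dataclasses import dataclass, field

@dataclass
class CheckResult:
    test: str
    severity: str  # "severe" = reliably automatable; "suggestion" = needs a human

@dataclass
class Page:
    results: list
    overrides: set = field(default_factory=set)  # tests waived after manual review

def can_publish(page: Page) -> bool:
    """Suggestions never block; severe errors block unless overridden."""
    return all(
        r.severity != "severe" or r.test in page.overrides
        for r in page.results
    )

page = Page(results=[CheckResult("imgHasAlt", "severe"),
                     CheckResult("headingsAreDescriptive", "suggestion")])
print(can_publish(page))   # False: an unresolved severe error blocks publishing
page.overrides.add("imgHasAlt")
print(can_publish(page))   # True: waived by a permissioned reviewer
```

The key design point is the split between severities: only errors an automated check can detect reliably are allowed to block publishing, while everything else becomes a suggestion routed to human review.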

On Tue, Apr 27, 2010 at 10:50 AM, Mark Guisinger
< <EMAIL REMOVED> >wrote:

> I'm working for a large company that is beginning to go down the road
> to making their website accessible. I'm starting to wonder about
> testing, as I'm not a tester. Do larger companies testing their
> websites for accessibility have a group of testers reviewing their
> pages with screen readers (one or more)? Or do they just validate
> that the code created is to spec? What tools are other companies
> using to test their websites for accessibility? Any guidance will be
> greatly appreciated.
>
> Thanks,
> Mark
>