WebAIM - Web Accessibility In Mind

E-mail List Archives

Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"


From: Jared Smith
Date: Dec 13, 2010 10:09PM


On Mon, Dec 13, 2010 at 8:48 PM, Gunderson, Jon R wrote:

> I should also note that passing these rules doesn't mean you are accessible, it just means you have the markup for accessibility.

But does not passing the rules mean you are inaccessible? Therein,
I think, lies the fundamental issue.

There are a few automated rules that can clearly indicate
accessibility issues. And there are many, many rules that might
indicate an accessibility issue in certain situations and based on
certain assumptions and opinions on what is best practice for most
pages. Only a human can determine whether this second category of
rule violations has an actual impact on the human user. The concern is
that the report lumps these two categories together and counts them up
to assign a 'grade'. It assumes that all violations of rules are
equal. This has great potential to give a very inaccurate indication
of true accessibility.
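To make the two categories concrete, here is an illustrative sketch (my own hypothetical markup, not examples from the report itself):

```html
<!-- Category 1: an unambiguous failure a tool can reliably catch.
     A content image with no alt attribute at all. -->
<img src="campus-map.png">

<!-- Category 2: a "violation" only a human can judge.
     Empty alt text is exactly right for a decorative image,
     yet some rulesets flag it for review all the same. -->
<img src="decorative-border.gif" alt="">
```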

Another fundamental flaw in the methodology is that a very simple page
with the potential for fewer rule violations will almost always rank
better than a longer, more complex page with more elements to be
analyzed. For example, a very inaccessible home page with an <h1> of
"University X" and then a giant image with alt text of "home page"
would score 100%. Yet a highly accessible web page would score much,
much lower if it contained a few spacer images (which, to the end
user, are no different from CSS background images) or a data table
that doesn't match the prescribed (and terribly flawed) requirements
of having a summary (which is generally ignored anyway and often not
needed), headers/id attributes (which do nothing for accessibility in
nearly all cases), or row headers (really?). The report does not
account for human experience and impact.
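A hypothetical sketch of the two pages described above (my own markup, assuming a ruleset that flags spacer images and tables lacking summary/headers attributes):

```html
<!-- Page A: passes every automated rule, yet tells a screen
     reader user almost nothing. Would score 100%. -->
<h1>University X</h1>
<img src="everything-as-one-image.png" alt="home page">

<!-- Page B: far more usable in practice, yet each spacer image
     and the table's missing summary/headers attributes would be
     counted as a violation, dragging the score down. -->
<img src="spacer.gif" alt="">
<table>
  <tr><th>Course</th><th>Credits</th></tr>
  <tr><td>ENG 101</td><td>3</td></tr>
</table>
```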

> There are many manual tests that must be made, but I don't need to tell this list that.

Yet the report purports to declare a level of accessibility while
ignoring this fact.

While this has certainly helped raise awareness, I think this is the
widespread concern in the accessibility field about these
reports. The issue is not the rules (though I believe they are
fundamentally flawed in several areas) or the FAE tool, but in how the
results continue to be reported. It is one thing to indicate that
pages have more or fewer rule violations based on a subjective
ruleset. It's another thing entirely to declare that those with fewer
rule violations are somehow more accessible or "best", as the article
declares, for end users.

Jared Smith
WebAIM