E-mail List Archives

Thread: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"


Number of posts in this thread: 28 (In chronological order)

From: Gunderson, Jon R
Date: Mon, Dec 13 2010 10:42AM
Subject: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
No previous message | Next message

Chronicle of Higher Education article "Colleges Lock Out Blind Students Online":

http://chronicle.com/article/Blind-Students-Demand-Access/125695/



And a sidebar about Cal State's success:

http://chronicle.com/article/Cal-States-Strong-Push-for/125683/



Chart ranking the best and worst college web sites:

http://chronicle.com/article/BestWorst-College-Web/125642/




Jon Gunderson

From: Cliff Tyllick
Date: Mon, Dec 13 2010 11:27AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

Jon, can you tell us more about the methods used to determine the
rankings in the chart?

The footnotes clue us in to what was measured, but was this done with
an automated tool? With scores of volunteers?

And what are common examples of failure on each measure? For example,
would a site pass or fail on "site name readable on all pages" if the
one identifier common to all pages is the school logo (with alt text) at
upper left? Or were you looking for the site name to be included in each
page's title tag?

Thanks!
Cliff

Cliff Tyllick
Usability assessment coordinator
Agency Communications Division
Texas Commission on Environmental Quality
512-239-4516
= EMAIL ADDRESS REMOVED =



From: Cliff Tyllick
Date: Mon, Dec 13 2010 11:54AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

From the sidebar about Cal State's success:

http://chronicle.com/article/Cal-States-Strong-Push-for/125683/

"How, for example, can officials explain accessible ways to format a
Word document to every person (professor, student, and administrator) who
can upload materials to a course Web site?"

Cliff's answer:

1. Provide custom templates for each type of document that might be
prepared for a course Web site and online tutorials on how to use them.

2. Be sure the templates include a custom tab (Word 2007 and 2010) or
toolbar (Word 2003) that contains no buttons that apply only
formatting.

3. Be sure the custom tab or toolbar is loaded with features that
support the creation of accessible documents:
- a button to display the Document Map
- a button to switch templates
- if possible, a Quick Styles Gallery; if that's not possible, a button
to expose a Styles task pane
- Styles displayed in the Quick Styles Gallery and Styles task pane
grouped by category (headings; body text; lists; character styles)

4. Have professors and graduate teaching assistants reject papers not
created in the standard template unless they can be converted to that
template simply by switching templates.

Not simple. The full effects won't be realized immediately. But it's
high time that, just as students in the 1930s through 1970s had to
either learn to type properly or hire a service to do it for them,
students in this age start to learn how to use word processing software
as the powerful tool that it is, not as the electronic equivalent of
crayons, construction paper, safety scissors, and paste.

Because students who do learn these skills will find that they can
communicate with many more people, much faster, and much more
effectively. And that's a characteristic that one should be able to
expect of a college graduate.

Cliff

Cliff Tyllick
Usability assessment coordinator
Agency Communications Division
Texas Commission on Environmental Quality
512-239-4516
= EMAIL ADDRESS REMOVED =

From: Gunderson, Jon R
Date: Mon, Dec 13 2010 1:45PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

I used the Illinois Functional Accessibility Evaluator to evaluate the websites:

http://fae.cita.illinois.edu

It is a free tool.

If you register, it will allow you to test multiple pages:

http://fae.cita.illinois.edu/accounts/register/

Each rule in the report will provide details about what it is actually testing:

http://fae.cita.illinois.edu/about/rules/

Jon



From: Hale, Mark P Jr
Date: Mon, Dec 13 2010 1:57PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

You may find after looking at the data published by Jon that the Chronicle article title and column labels are more than a little misleading: for example, "Web Design" in the Chronicle was "Table Layout" in the original data, which you can see at http://webaccessibility.cita.illinois.edu/data/

Mark

Mark Hale
R&D Project Leader / IT Accessibility Coordinator
CIO Office, The University of Iowa
319-335-5825




From: Sandra Andrews
Date: Mon, Dec 13 2010 2:03PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

In fairness to ASU, mentioned in the first article (and where I work): at
least the website is not too shabby in terms of accessibility. We are #35,
in contrast, say, to Harvard.

Accessibility of the website has been a priority for ASU since the days when
the main web server was a Mac 6100 "pizza box." Could be better, but
accessibility is still considered important.

With the exception of the Facebook student group, to be sure!


From: Richard R. Hill
Date: Mon, Dec 13 2010 2:27PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

Note that the FAE tool tests for certain rules that are specific to that tool (not generally required or labeled as a best practice by other Web standards). For instance, the FAE tool marks folks down for having more than one H1 on a page. This is an Illinois rule, not a W3C or Section 508 rule. So, those who adhere to more of these rules will have slightly higher rankings.

I'm still unclear as to the scope and depth of the pages/sites tested.
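The "more than one H1" check described above is easy to approximate. Here is a minimal sketch using Python's standard html.parser (an illustration of the kind of rule at issue, not FAE's actual implementation):

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> elements in an HTML document."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def count_h1(html):
    parser = H1Counter()
    parser.feed(html)
    return parser.h1_count

# A page with two H1s would fail a "single H1 reserved for the title" rule.
page = "<html><body><h1>Site name</h1><h2>Section</h2><h1>Another h1</h1></body></html>"
print(count_h1(page))  # 2
```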
–––––––––––––––––––––––––––––––––––––––
Rick Hill, Web CMS Administrator
University Communications, UC Davis



From: Gunderson, Jon R
Date: Mon, Dec 13 2010 3:00PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" and titling a web page
Previous message | Next message

Many developers consider reserving the H1 heading for titling a web page a best practice, not just people in Illinois.

While it may not be an explicit technique in WCAG or Section 508, I think many web developers and most people with disabilities would consider it a best practice for titling, since it provides a consistent means for speech users to find the title of the page.

The rules in FAE are based on what provides the best experience for people with disabilities, but also on what makes sense for developers to implement.

We also mostly work with developers at the design stage, where the titling rules are easy to implement and to get into the templates for the website. If rules like this one are not applied at design time, most developers will find them hard to retrofit (i.e., they will resist fixing the problem), since doing so would require touching a lot of pages.

If you approach accessibility as a repair process you will not like tools like FAE, since the rules in FAE can only be efficiently implemented in the design stages.

New ARIA landmark technologies will provide alternatives to using H1 for titling, but the best practices for using ARIA landmarks and headings are still evolving. ARIA landmarks will give people doing accessibility repair more options for fixing their pages, but they will still need to touch most pages.

Data for the web sites tested can be found here:
http://webaccessibility.cita.illinois.edu/data/

Over 23,000 pages were tested at 183 universities.

The rules that were tested on each page can be found here:
https://fae.cita.illinois.edu/about/rules/

I am interested in what other people consider best practices for titling a web page.
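One way to express the titling convention described above (a single H1 whose text restates the page title) is as a small check. This is a sketch of the idea using Python's standard html.parser, not a rule taken from FAE:

```python
from html.parser import HTMLParser

class TitleH1Checker(HTMLParser):
    """Collects the <title> text and all <h1> texts from a page."""
    def __init__(self):
        super().__init__()
        self._in = None
        self.title = ""
        self.h1s = []

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._in = tag
            if tag == "h1":
                self.h1s.append("")

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1s[-1] += data

def h1_titles_page(html):
    """True if the page has exactly one H1 and its text appears in the title."""
    p = TitleH1Checker()
    p.feed(html)
    return len(p.h1s) == 1 and p.h1s[0].strip() in p.title

page = ('<html><head><title>Admissions | Example University</title></head>'
        '<body><h1>Admissions</h1></body></html>')
print(h1_titles_page(page))  # True
```

The "Example University" page name is of course hypothetical; the point is only that the H1 and the title tag reinforce each other, so speech users get a consistent answer to "what page is this?" from either source.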

Jon




From: Richard R. Hill
Date: Mon, Dec 13 2010 3:09PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" and titling a web page
Previous message | Next message

Nevertheless, the FAE tool actually tests against the Illinois Information Technology Accessibility Act Implementation Guidelines for Web-Based Information and Applications 1.0. This means there are some unique requirements that this tool tests for that tools testing against international or national standards won't evaluate. I'm not saying your state best practices are wrong, just that folks designing to other standards or practices may not rate as highly in some areas using just your tool.

–––––––––––––––––––––––––––––––––––––––
Rick Hill


From: Gunderson, Jon R
Date: Mon, Dec 13 2010 3:24PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" and titling a web page
Previous message | Next message

The tool really tests best practices for web accessibility that we have developed working with web developers on campus, at other universities, and in state government.

The tool was designed to help web developers understand and validate accessible design.

All of the rules can be mapped back to WCAG 2.0.

I should remind people that both the WCAG 1.0 and WCAG 2.0 techniques documents are not normative in the W3C sense; they are just suggestions by the working group on how to implement the success criteria of WCAG 2.0.

If you don't consider accessibility at design time, then developers will be looking for any way to satisfy a requirement without having to change their web pages very much, even if it does not significantly improve accessibility.

The report shows that most pages do not even have the markup for accessibility; form controls have the most notable deficiencies.

http://webaccessibility.cita.illinois.edu/data/

All guidelines require labeling form controls, including WCAG 1.0, WCAG 2.0, and even Section 508.
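The labeling requirement mentioned above can be illustrated with a minimal check: each visible form control's id should be referenced by a label's "for" attribute. This is a simplified sketch using Python's standard html.parser (real checkers handle many more cases, such as wrapping labels, title attributes, and ARIA labels):

```python
from html.parser import HTMLParser

class LabelCheck(HTMLParser):
    """Finds form controls whose id is never referenced by a <label for=...>."""
    def __init__(self):
        super().__init__()
        self.control_ids = []
        self.label_fors = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("input", "select", "textarea") and a.get("type") != "hidden":
            self.control_ids.append(a.get("id"))
        elif tag == "label" and "for" in a:
            self.label_fors.add(a["for"])

def unlabeled_controls(html):
    """Return the ids of controls with no matching label (None = control has no id)."""
    p = LabelCheck()
    p.feed(html)
    return [i for i in p.control_ids if i not in p.label_fors]

form = ('<form><label for="q">Search</label><input id="q" type="text">'
        '<input id="color" type="text"></form>')
print(unlabeled_controls(form))  # ['color']
```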

We need accessible design if we really want accessibility to make a difference for most people with disabilities.

Jon

-----Original Message-----
From: = EMAIL ADDRESS REMOVED = [mailto: = EMAIL ADDRESS REMOVED = ] On Behalf Of Richard R. Hill
Sent: Monday, December 13, 2010 4:06 PM
To: WebAIM Discussion List
Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" and titling a web page

Nevertheless, the FAE tool actually tests against the Illinois Information Technology Accessibility Act Implementation Guidelines for Web-Based Information and Applications 1.0. This means there are some unique requirements that this tool tests for that tools testing against international or national standards won't evaluate. Not saying you're state best practices are wrong, just that folks designing to other standards or practices may not rate as highly in some areas using just your tool.

---------------------------------------
Rick Hill

From: "Gunderson, Jon R" < = EMAIL ADDRESS REMOVED = <mailto: = EMAIL ADDRESS REMOVED = >>
Reply-To: WebAIM Discussion List < = EMAIL ADDRESS REMOVED = <mailto: = EMAIL ADDRESS REMOVED = >>
Date: Mon, 13 Dec 2010 13:54:59 -0800
To: WebAIM Discussion List < = EMAIL ADDRESS REMOVED = <mailto: = EMAIL ADDRESS REMOVED = >>
Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" and titling a web page

Many developers consider reserving the H1 header as a best practice for titling a web page, not just people in Illinois.

While it may not be explicit technique in WCAG or Section 508 I think many web developers and most people with disabilities would consider it a best practice for titling, since it provides a consistent means for speech users to find the title of the page.


The rules in FAE are based on what provides the best experience for people with disabilities, but also makes sense for developers to implement.

We also mostly work with developers at the design stages where the titling rules are easy to implement and get into the templates for the website, if rules like this one at not done at design time, most developers would find it hard (i.e. resist fixing the problem) since it would require touching a lot of pages.

If you approach accessibility as a repair process you will not like tools like FAE, since the rules in FAE can only be efficiently implemented in the design stages.

New ARIA landmark technologies will provide alternatives to using H1 for titling, but the best practices for using ARIA landmarks and headings are still evolving. The ARIA landmarks will provide people doing accessible repair more options for fixing their pages, but they will still need to touch most pages.

Data for the web sites tested can be found here:
http://webaccessibility.cita.illinois.edu/data/

Over 23,000 pages were tested at 183 universities.

The rules that were tested on each page can be found here:
https://fae.cita.illinois.edu/about/rules/

I am interested in what other people consider their best practices for titling a web page?

Jon



-----Original Message-----
From: = EMAIL ADDRESS REMOVED = <mailto: = EMAIL ADDRESS REMOVED = > [mailto: = EMAIL ADDRESS REMOVED = ] On Behalf Of Richard R. Hill
Sent: Monday, December 13, 2010 3:24 PM
To: WebAIM Discussion List
Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"

Note that the FAE tool test for certain rules that are specific to that tool (not generally required or labelled as a best practice by other Web standards). For instance, the FAE tool marks folks down for have more than one H1 on a page. This is a Illinois rule, not a W3C or 508 rule. SO, those who adhere to more of these will have slightly higher rankings.

Still unclear as to the scope and depth of the pages/sites tested.
---------------------------------------
Rick Hill, Web CMS Administrator
University Communications, UC Davis


From: "Gunderson, Jon R" < = EMAIL ADDRESS REMOVED = <mailto: = EMAIL ADDRESS REMOVED = ><mailto: = EMAIL ADDRESS REMOVED = >>
Reply-To: WebAIM Discussion List < = EMAIL ADDRESS REMOVED = <mailto: = EMAIL ADDRESS REMOVED = ><mailto: = EMAIL ADDRESS REMOVED = >>
Date: Mon, 13 Dec 2010 12:44:19 -0800
To: WebAIM Discussion List < = EMAIL ADDRESS REMOVED = <mailto: = EMAIL ADDRESS REMOVED = ><mailto: = EMAIL ADDRESS REMOVED = >>
Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"

I used the Illinois Functional Accessibility Evaluator to evaluate the websites:

http://fae.cita.illinois.edu

It is free tool.

If your register will allow you test multiple pages:

http://fae.cita.illinois.edu/accounts/register/

Each rule in the report will provide details about what it is actually testing:

http://fae.cita.illinois.edu/about/rules/

Jon


-----Original Message-----
From: = EMAIL ADDRESS REMOVED = [mailto: = EMAIL ADDRESS REMOVED = ] On Behalf Of Cliff Tyllick
Sent: Monday, December 13, 2010 12:27 PM
To: = EMAIL ADDRESS REMOVED =
Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"

Jon, can you tell us more about the methods used to determine the rankings in the chart?
The footnotes clue us in to what was measured, but was this done with an automated tool? With scores of volunteers?
And what are common examples of failure on each measure? For example, would a site pass or fail on "site name readable on all pages" if the one identifier common to all pages is the school logo (with alt text) at upper left? Or were you looking for the site name to be included in each page's title tag?
Thanks!
Cliff
Cliff Tyllick
Usability assessment coordinator
Agency Communications Division
Texas Commission on Environmental Quality
512-239-4516
= EMAIL ADDRESS REMOVED =


On 12/13/2010 at 11:32 AM, in message
< = EMAIL ADDRESS REMOVED = >,
"Gunderson, Jon R" < = EMAIL ADDRESS REMOVED = > wrote:
Chronicle of Higher Education article "Colleges Lock Out Blind Students
Online":

http://chronicle.com/article/Blind-Students-Demand-Access/125695/



And a sidebar about Cal State's success:

http://chronicle.com/article/Cal-States-Strong-Push-for/125683/



Chart ranking the best and worst college web sites:

http://chronicle.com/article/BestWorst-College-Web/125642/




Jon Gunderson

From: John Foliot
Date: Mon, Dec 13 2010 7:21PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

Richard R. Hill wrote:
>
> Note that the FAE tool test for certain rules that are specific to that
> tool (not generally required or labelled as a best practice by other
> Web standards). For instance, the FAE tool marks folks down for have
> more than one H1 on a page. This is a Illinois rule, not a W3C or 508
> rule. SO, those who adhere to more of these will have slightly higher
> rankings.
>
> Still unclear as to the scope and depth of the pages/sites tested.
> –––––––––––––––––––––––––––––––––––––––
> Rick Hill, Web CMS Administrator
> University Communications, UC Davis

(Rick: http://webaccessibility.cita.illinois.edu/data/school/235/)


<Personal Opinion>
I have to vehemently agree, and further express my disappointment and
frustration at Jon Gunderson and his team for presuming to set
accessibility standards for all of Higher Education. Evaluating other
institutions against "rules" set by The University of Illinois is both
misleading, and nothing more than Grandstanding: should we now go about
judging their sites against other made up criteria as well?

In particular, I point to the following "Rules" that are not part of
either Section 508 nor the W3C WCAG 1 or 2 Guidelines (the 2 recognized
benchmarks used by all other accessibility evaluation software):

HEADING STRUCTURE:
"The page must contain at least one h1 element."
According to whom? While it is certainly good practice to ensure
each page has appropriate heading structure, nowhere (outside of the FAE
tool) is it *MANDATED* as such - a page that lacks an <h1> is not
intrinsically inaccessible. False data - False results!

"The page should contain no more than two h1 elements."
Please point to one national or international guideline or
recommendation that mandates this. Another false positive from a
mechanical tool, fueled by internal University of Illinois politics and
policies.

"The text content of each h1 element should match all or part of the title
content."
"Each h1 element should have text content exclusive of the alt text of any
img elements it contains."
Bull Feathers! Made up standards by a small team with an agenda to
promote their internal tool - and it should be noted that failing to do
either of these things in no way makes a page "less accessible" - it just
doesn't meet their FAE Guidelines.


DATA TABLES:
"For each data table, the first cell in each column must be a th element,
and each row must contain at least one th element."
Patently FALSE! In fact, the table of school rankings at the
Chronicle of Higher Ed that Jon points to in his earlier email
(http://chronicle.com/article/BestWorst-College-Web/125642/) does not meet
this "pass" criteria, yet is not "inaccessible" because of it - in fact
the size of the table (183 rows in length with little-to-no internal
navigation) is more of an access issue than the failure for each row to
start with a <th>.

The following table is perfectly acceptable and valid, and meets (as far
as I know) all required accessibility guidelines as established by both
the Section 508 Standard and W3C Guidelines (yet fails the FAE tool):

<table>
<tr>
<td></td>
<th scope="col">Sunday</th>
<th scope="col">Monday</th>
<th scope="col">Tuesday</th>
<th scope="col">Wednesday</th>
<th scope="col">Thursday</th>
<th scope="col">Friday</th>
<th scope="col">Saturday</th>
</tr>
<tr>
<th scope="row">Week 1</th>
<td></td>
<td></td>
<td>1</td>
<td>2</td>
<td>3</td>
<td>4</td>
<td>5</td>
</tr>

...etc.
</table>

"Each th element in a complex data table must have an id attribute whose
value is unique relative to all ids on the page."
Please explain how failing to add an ID attribute to a table
header makes it less accessible.

"Each td element in a complex data table must have a headers attribute
that references the id attributes of associated th elements."
Please explain how failing to add a HEADER attribute to a table
cell makes it less accessible.

What defines "complex"? How does a mechanical tool make this
assessment? The table code example shown above is perfectly valid, is
extremely accessible, and would fail 3 of the 5 data-table 'rules' this
testing imposes on *your* sites. This is simply unacceptable.
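For reference, the headers/id resolution that the "complex data table" rules call for is itself mechanically checkable. A rough sketch, again with Python's standard-library parser and illustrative markup, not any tool's real implementation:

```python
from html.parser import HTMLParser

class TableAudit(HTMLParser):
    """Record th ids and the headers attributes of td cells."""
    def __init__(self):
        super().__init__()
        self.th_ids = set()
        self.td_refs = []  # one list of referenced ids per td

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "th" and "id" in a:
            self.th_ids.add(a["id"])
        elif tag == "td":
            self.td_refs.append(a.get("headers", "").split())

def headers_resolve(html_text):
    """True if every headers token on a td points at an existing th id."""
    audit = TableAudit()
    audit.feed(html_text)
    return all(ref in audit.th_ids
               for refs in audit.td_refs for ref in refs)

table = ('<table><tr><th id="day">Day</th></tr>'
         '<tr><td headers="day">Monday</td></tr></table>')
print(headers_resolve(table))  # True
```

Note that a check like this only verifies that the references resolve; it cannot decide whether a table is "complex" enough to need headers/id at all, which is exactly the judgment call at issue here.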


IMAGES/ALT TEXT
"Each img element with width or height less than 8 pixels should be
removed; CSS techniques should be used instead."
Really? How exactly was this determined? If I have an image that
is 9 pixels by 2 pixels, then it should have alt text and not be moved to
CSS? That's what the tool and this testing tell us. Furthermore, if your
site has an image like this, it has now been deemed less accessible, thus
ranks lower in the scores - leaving the impression that your pages are
inaccessible.
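The 8-pixel rule in question amounts to a simple attribute test. A hedged sketch of how a mechanical tool might apply it (illustrative only; a real tool might also fetch the image to measure its true dimensions):

```python
from html.parser import HTMLParser

class TinyImageFinder(HTMLParser):
    """List img elements whose declared width or height is under a threshold."""
    def __init__(self, threshold=8):
        super().__init__()
        self.threshold = threshold
        self.tiny = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        try:
            width = int(a.get("width", self.threshold))
            height = int(a.get("height", self.threshold))
        except ValueError:
            return  # percentage or malformed sizes: skip, a human must look
        if width < self.threshold or height < self.threshold:
            self.tiny.append(a.get("src", ""))

finder = TinyImageFinder()
finder.feed('<img src="spacer.gif" width="1" height="1" alt="">'
            '<img src="logo.png" width="200" height="60" alt="Example U">')
print(finder.tiny)  # ['spacer.gif']
```

The sketch makes the arbitrariness visible: the threshold is just a number, and a 9-by-2-pixel image sails past it.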

Clearly this is a tool that has some value, but to stake a single page's
accessibility, or lack thereof, never mind publishing public data stating
that a University's web content is inaccessible, on arbitrary Rules made up
by one University and verified by mechanical means alone against 3 or 4
pages is foolhardy, damaging to the general state of web accessibility (as
it suggests that meeting a mechanical tester's results = job done), and
unconscionable. It may also leave the University of Illinois open to libel
suits and other legal remedies.

I know Jon Gunderson personally, I like Jon Gunderson, and I respect the
work that Jon has done to advance web accessibility over the years, but
here, today, I must point the finger of shame at him and cry "Foul" - this
is no more an assessment of true web accessibility than it is a rolling of
chicken bones and voodoo chest-beating, and the damage caused here falls
squarely at his feet.
</Personal Opinion>

NOTE: These are my personal opinions, and in no way reflect the opinion of
Stanford University (with whom I am under contract), T-Base Communications
(my employer), my associates or other professional affiliates with whom I
do business.

JF
============================
John  Foliot

Co-chair - W3C HTML5 Accessibility Task Force (Media)
http://www.w3.org/WAI/PF/HTML/wiki/Main_Page

============================

From: Gunderson, Jon R
Date: Mon, Dec 13 2010 8:48PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

John,

Rules Development Clarification

The rules were not developed only by people at the University of Illinois, but were developed in an open forum of the Web Best Practices Working Group:
http://collaborate.athenpro.org/group/web/
There are members from all over the United States.
Anyone can join the group, and if people have better design rules, the group would love to hear them and consider them for inclusion.

The study analyzed over 20,000 web pages; please view the data details:
http://webaccessibility.cita.illinois.edu/data/

Grand Standing Charge Response

As to the charge that I am personally grandstanding: maybe so, I'll let individuals make their own judgement.
But without data on the inaccessibility of higher education websites being publicly available, the inaccessibility will continue to grow and get worse.
I talk to too many CIOs, IT professionals and vendors who tell me their web sites are accessible because they have a policy or a law like Section 508 that says it must be so.
Accessibility is more than policy; it requires setting design standards (rules) and auditing the use of those design standards.

I hope people see this as an opportunity to raise awareness on their campuses of accessibility.
If you don't like the rules used in the data collection, I hope that you will define your own campus design rules that support functional accessibility by people with disabilities and also meet the design needs of developers.
I also hope you will make the design rules publicly available so people with disabilities know what to expect when they get to your campus's web sites.
Campuses need to treat accessibility like other IT issues, such as security.
They need to have people assigned web accessibility responsibilities, and they need to measure the implementation of their policies.

I should also note that passing these rules doesn't mean you are accessible, it just means you have the markup for accessibility.
There are many manual tests that must be made, but I don't need to tell this list that.

Future rules will be developed through the OpenAjax Alliance Accessibility (OAA) task force:

http://www.oaa-accessibility.org
and
http://www.openajax.org/member/wiki/Accessibility

Jon

NOTE: Talk to the Chronicle about their web accessibility issues; I have already given them a report of the issues I found.


On Dec 13, 2010, at 8:20 PM, John Foliot wrote:

> Richard R. Hill wrote:
>>
>> Note that the FAE tool test for certain rules that are specific to that
>> tool (not generally required or labelled as a best practice by other
>> Web standards). For instance, the FAE tool marks folks down for have
>> more than one H1 on a page. This is a Illinois rule, not a W3C or 508
>> rule. SO, those who adhere to more of these will have slightly higher
>> rankings.
>>
>> Still unclear as to the scope and depth of the pages/sites tested.
>> –––––––––––––––––––––––––––––––––––––––
>> Rick Hill, Web CMS Administrator
>> University Communications, UC Davis
>
> (Rick: http://webaccessibility.cita.illinois.edu/data/school/235/)
>
>
> <Personal Opinion>
> I have to vehemently agree, and further express my disappointment and
> frustration at Jon Gunderson and his team for presuming to set
> accessibility standards for all of Higher Education. Evaluating other
> institutions against "rules" set by The University of Illinois is both
> misleading, and nothing more than Grandstanding: should we now go about
> judging their sites against other made up criteria as well?
>
> In particular, I point to the following "Rules" that are not part of
> either Section 508 nor the W3C WCAG 1 or 2 Guidelines (the 2 recognized
> benchmarks used by all other accessibility evaluation software):
>
> HEADING STRUCTURE:
> "The page must contain at least one h1 element."
> According to whom? While it is certainly good practice to ensure
> each page has appropriate heading structure, nowhere (outside of the FAE
> tool) is it *MANDATED* as such - a page that lacks an <h1> is not
> intrinsically inaccessible. False data - False results!
>
> "The page should contain no more than two h1 elements."
> Please point to one national or international guideline or
> recommendation that mandates this. Another false positive from a
> mechanical tool, fueled by internal University of Illinois politics and
> policies.
>
> "The text content of each h1 element should match all or part of the title
> content."
> "Each h1 element should have text content exclusive of the alt text of any
> img elements it contains."
> Bull Feathers! Made up standards by a small team with an agenda to
> promote their internal tool - and it should be noted that failing to do
> either of these things in no way makes a page "less accessible" - it just
> doesn't meet their FAE Guidelines.
>
>
> DATA TABLES:
> "For each data table, the first cell in each column must be a th element,
> and each row must contain at least one th element."
> Patently FALSE! In fact, the table of school rankings at the
> Chronicle of Higher Ed that Jon points to in his earlier email
> (http://chronicle.com/article/BestWorst-College-Web/125642/) does not meet
> this "pass" criteria, yet is not "inaccessible" because of it - in fact
> the size of the table (183 rows in length with little-to-no internal
> navigation) is more of an access issue than the failure for each row to
> start with a <th>.
>
> The following table is perfectly acceptable and valid, and meets (as far
> as I know) all required accessibility guidelines as established by both
> the Section 508 Standard and W3C Guidelines (yet fails the FAE tool):
>
> <table>
> <tr>
> <td></td>
> <th scope="col">Sunday</th>
> <th scope="col">Monday</th>
> <th scope="col">Tuesday</th>
> <th scope="col">Wednesday</th>
> <th scope="col">Thursday</th>
> <th scope="col">Friday</th>
> <th scope="col">Saturday</th>
> </tr>
> <tr>
> <th scope="row">Week 1</th>
> <td></td>
> <td></td>
> <td>1</td>
> <td>2</td>
> <td>3</td>
> <td>4</td>
> <td>5</td>
> </tr>
>
> ...etc.
> </table>
>
> "Each th element in a complex data table must have an id attribute whose
> value is unique relative to all ids on the page."
> Please explain how failing to add an ID attribute to a table
> header makes it less accessible.
>
> "Each td element in a complex data table must have a headers attribute
> that references the id attributes of associated th elements."
> Please explain how failing to add a HEADER attribute to a table
> cell makes it less accessible.
>
> What defines "complex"? How does a mechanical tool makes this
> assessment? The table code example shown above is perfectly valid, is
> extremely accessible, and would fail 3 of the 5 data-table 'rules' this
> testing imposes on *your* sites. This is simply unacceptable.
>
>
> IMAGES/ALT TEXT
> "Each img element with width or height less than 8 pixels should be
> removed; CSS techniques should be used instead."
> Really? How exactly was this determined? If I have an image that
> is 9 pixels X 2 pixels than it should have alt text and not be moved to
> CSS? That's what the tool and this testing tells. Furthermore, if your
> site has an image like this, it has now been deemed less accessible, thus
> ranks lower in the scores - leaving the impression that your pages are
> inaccessible.
>
> Clearly this is a tool that has some value, but to stake a single page's
> accessibility or lack of, never mind publishing public data that states
> that a University's web content is inaccessible on arbitrary Rules made up
> by one University and verified by mechanical means alone against 3 or 4
> pages is foolhardy, damaging to the general state of web accessibility (as
> it suggests that meeting a mechanical tester's results = job done), and
> unconscionable. It may also leave the University of Illinois open to libel
> suits and other legal remedies.
>
> I know Jon Gunderson personally, I like Jon Gunderson, and I respect the
> work that Jon has done to advance web accessibility over the years, but
> here, today, I must point the finger of shame at him and cry "Foul" - this
> is no more an assessment of true web accessibility than it is a rolling of
> chicken bones and voodoo chest-beating, and the damage caused here falls
> squarely at his feet.
> </Personal Opinion>
>
> NOTE: These are my personal opinions, and in no way reflect the opinion of
> Stanford University (with whom I am under contract), T-Base Communications
> (my employer), my associates or other professional affiliates with whom I
> do business with.
>
> JF
> ============================
> John Foliot
>
> Co-chair - W3C HTML5 Accessibility Task Force (Media)
> http://www.w3.org/WAI/PF/HTML/wiki/Main_Page
>
> ============================
>
>
>
>
>
>

From: Jared Smith
Date: Mon, Dec 13 2010 10:09PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

On Mon, Dec 13, 2010 at 8:48 PM, Gunderson, Jon R wrote:

> I should also note that passing these rules doesn't mean you are accessible, it just means you have the markup for accessibility.

But not passing the rules means you are inaccessible? I think that
therein lies the fundamental issue.

There are a few automated rules that can clearly indicate
accessibility issues. And there are many, many rules that might
indicate an accessibility issue in certain situations and based on
certain assumptions and opinions on what is best practice for most
pages. Only a human can determine whether violations in this second
category have an actual impact on the human user. The concern is
that the report lumps these two categories together and counts them up
to assign a 'grade'. It assumes that all violations of rules are
equal. This has great potential to give a very inaccurate indication
of true accessibility.

Another fundamental flaw in the methodology is that a very simple page
that has the potential for fewer rules violations will almost always
rank better than a longer, more complex page that has more elements to
be analyzed. For example, a very inaccessible home page with an <h1>
of "University X" and then a giant image with alt text of "home page"
would score 100%, yet a highly accessible web page with a few spacer
images (which to the end user experience are no different than CSS
background images) and a data table that doesn't match the prescribed
(and terribly flawed) requirements of having a summary (which is
generally ignored anyway and often not needed), headers/id (which do
nothing for accessibility in nearly all cases), or row headers
(really?) would score much, much lower. The report does not account
for human experience and impact.

> There are many manual tests that must be made, but I don't need to tell this list that.

Yet the report purports to declare a level of accessibility while
ignoring this fact.

While this has certainly helped raise awareness, I think this is the
general and widespread concern in the accessibility field about these
reports. The issue is not the rules (though I believe they are
fundamentally flawed in several areas) or the FAE tool, but in how the
results continue to be reported. It is one thing to indicate that
pages have more or fewer rule violations based on a subjective
ruleset. It's another thing entirely to declare that those with fewer
rule violations are somehow more accessible or "best", as the article
declares, for end users.

Jared Smith
WebAIM

From: John Foliot
Date: Mon, Dec 13 2010 11:21PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online"
Previous message | Next message

Gunderson, Jon R wrote:
>
> Rules Development Clarification
>
> The rules were not developed only by people at the University of
> Illinois, but were developed in an open forum of the Web Best Practices
> Working Group:
> http://collaborate.athenpro.org/group/web/
> There are members from all over the united States.
> Anyone can join the group and if people have better design rules the
> group would love to hear and consider them for inclusion.

However, what has happened is that these rules are now being imposed on a
number of Higher Education institutions that have neither participated in
that rule making nor agreed that these rules are what is required to
ensure accessibility. With no offense to you or the other participants in
the Best Practices group it strikes me that representatives of the
majority of the institutions evaluated are notably absent from the Working
Group; as such, your current rule-set is hardly universally accepted or
agreed to. It would seem that only those members who have developed the
FAE rule set should be judged by those rules. As well, there is a
difference between not meeting Best Practices and having web content that
is inaccessible - a nuanced point notably absent from your report and the
recent Chronicle of Higher Ed article.

I have pointed out the rules that I personally take issue with, yet the
FAE tool and rule set were used to judge pages at the institution where I
work. This now places either my professional experience and judgment into
question, or your group's judgment, as clearly we are in disagreement. I
have already pointed out the evaluation criteria I disagree with, and
await your response and justification - for example can you prove that a
page that lacks an H1 is inaccessible? I know I certainly can't, and
further can offer examples where a page without an H1 would still remain
totally accessible, and in fact could actually be an accessibility
enhancement - the long text explanatory page associated with @longdesc.

Asserting that not meeting all of your Best Practice rules = poor
accessibility is simply false.


>
> The study included over 20,000 web pages were analyzed, please view the
> data details:
> http://webaccessibility.cita.illinois.edu/data/

...and not surprisingly the issues I take most offence with are also the
ones that have the lowest mean average across the pages evaluated. This
should come as little surprise to those of us who are most actively
involved in this subject matter, as they are also the most subjective and
contentious Rules in the rule-set.

However, for CIOs, Senior Management in other positions at Universities,
and the general population reading that Chronicle article, this subtle
point is easily lost: they see a bottom line score with little
understanding on how that score was reached. In today's climate of the
recent Penn State action, this will lead to senior executives making snap
judgments based on flawed data, rather than asking the right kinds of
questions or striving to ensure real on-line accessibility. Web
accessibility professionals have long known and stated that true
accessibility is not a series of tick boxes on a shopping list, yet the
recent results released by iCITA are just that. The results cause as much
harm as they do good.

>
> Grand Standing Charge Response
>
> To the charge me personally with grandstanding, maybe so, I'll let
> individuals make their own judgement.

I point not at you, but at the report you and your team at iCITA have
publicly released. While you are free to do what you think is best at your
institution, it places many of us in a position not of advancing the
larger issue, but defending and countering your evaluations - in part
because they suggest "Best Practices" that we were not party to creating
as *requirements* for real web accessibility. If you want to evaluate
against Section 508 or WCAG Guidelines that's one thing, but using nothing
but a programmatic evaluator and a rule that states that all TH's must
have an ID (or somehow it is now magically inaccessible) is one I cannot
endorse.

I totally understand the shock and awe effect of having a report that
'names and shames' higher ed institutions (after all, I too am well known
for going 'rogue' when fighting for web accessibility), but if you are
going to do that then the rules-set must be one that the larger community
already agrees to, and we don't have that here.


> But without data on the inaccessibility of higher education websites
> being publicly available the inaccessibility will still continue to
> grow and get worse.
> I talk to to many CIOs, IT professionals and vendors that tell me their
> web sites are accessible because they have a policy or a law like
> Section 508 that says it must be so.
> Accessibility is more than policy, it requires setting design standards
> (rules) and auditing the use of the design standards.

Fair enough, but imposing *your (ATHEN Collaboration) rules* and design
standards is not what they have agreed to, have been mandated to (by law
or internal policy), or use in internal auditing - and herein is the rub.
I personally advocate and strive for WCAG2-AA, where understanding the
goals (POUR) is significantly more important than tick-box reporting. This
report now sets many of us back in that regard, as 'passing' your tool's
subjective rule-set is now being seen as more important in some circles
than achieving real accessibility. Good for your tool, not so good for the
larger goals.


>
> I hope people see this as an opportunity to raise awareness on their
> campuses of accessibility.

However exactly the opposite is the result. Rather than talking about the
larger issues and advancing successes, many of us are left explaining why
our institutions did not fare well in your report, and explaining why some
of your criteria really have little to do with true accessibility. You've
put many of us who would normally be speaking in positive tones on the
defensive - hardly a position to win support.


> If you don't like the rules used in the data collection, I hope that
> you will define your own campus design rules that support functional
> accessibility by people with disabilities and also meet the design
> needs of developers.

As you were conducting your review did you bother to ask the institutions
you were judging if they had such internal rules or policies? Or did you
simply start from the premise that your rules should be the rules we all
must follow? I posit that the latter is likely the case: again, judge your
Best Practices members' sites against your/their rules, but do not presume
to impose them on those who have not agreed to them.


> I also hope you will make the design rules publicly available so people
> with disabilities know what to expect when they get to your campuses
> web sites.
> Campuses need to treat accessibility like other IT issues, like
> security.
> They need to have people assigned web accessibility responsibilities
> and they need to measure the implementation of their policies.

You are hardly telling me or others reading this something that we don't
already know. I am unclear how this report helps to achieve any of that -
rather than helping foster the right kind of ecosystems at higher ed it
sends everyone scrambling to eliminate images that are less than 8 pixels
wide or high; effort, time and resources that should be better used going
after the larger issues. (And if you think that some executive somewhere
insisting on an audit of web pages in search of such images is a fanciful
exaggeration, then you and I are not working in the same universe - I pity
the poor soul who draws that task.)

>
> I should also note that passing these rules doesn't mean you are
> accessible, it just means you have the markup for accessibility.
> There are many manual tests that must be made, but I don't need to tell
> this list that.

No, you need to tell the Chronicle of Higher Ed that. You now need to tell
all of the Provosts, Chancellors, Presidents, CIOs, University Lawyers and
other senior executives who are looking at this article and drawing their
own (flawed) conclusions that. You need to ensure that this point is
clearly underscored on all of the evaluation pages you have publicly
posted as "Accessibility Report Cards", because more than anything else
this is what is blatantly missing in all of the reporting and publishing
of this exercise: that the reports are based not on legal or even
universally agreed criteria, but rather are a mechanical evaluation
of Best Practices developed by a select group of participants, and are but
one indication of success or failure in the *opinion* of those
participants.

**********

NOTE: These are my personal opinions, and in no way reflect the opinion of
Stanford University (with whom I am under contract), T-Base Communications
(my employer), my associates or other professional affiliates with whom I
do business.

JF
============================
John Foliot

Co-chair - W3C HTML5 Accessibility Task Force (Media)
http://www.w3.org/WAI/PF/HTML/wiki/Main_Page

============================

From: Gunderson, Jon R
Date: Tue, Dec 14 2010 11:21AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

I find it interesting that people on this list were not outraged that fewer than 30 percent of the 19,722 tested pages containing form controls had accessible labels.

http://webaccessibility.cita.illinois.edu/data/

I can't think of a more basic accessibility feature than using a label element or title attribute to label a form control.

The lack of form control labeling was my biggest conclusion from the pages tested and my biggest worry is how to address this issue.

I think everyone agrees that form control labeling is a part of WCAG 1.0, WCAG 2.0, Section 508 requirements and almost any other web accessibility standard developed.

If higher education can't even label simple form controls correctly, how are they ever going to make Dynamic HTML widgets accessible?
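The form-labeling test Jon describes, a label element associated via for/id or a title attribute on the control, can be approximated mechanically. A minimal sketch with Python's standard-library parser; the handling of control types is my assumption for illustration, not FAE's actual rule logic:

```python
from html.parser import HTMLParser

class FormLabelAudit(HTMLParser):
    """Collect label/for targets and the form controls that need labels."""
    def __init__(self):
        super().__init__()
        self.label_for = set()
        self.controls = []  # (id-or-None, has-title-attribute)

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "label" and "for" in a:
            self.label_for.add(a["for"])
        elif tag in ("input", "select", "textarea"):
            if a.get("type") in ("hidden", "submit", "reset", "button"):
                return  # these controls get their accessible name elsewhere
            self.controls.append((a.get("id"), "title" in a))

def unlabeled_controls(html_text):
    """Ids (or None) of controls with no associated label and no title."""
    audit = FormLabelAudit()
    audit.feed(html_text)
    return [cid for cid, has_title in audit.controls
            if not has_title and cid not in audit.label_for]

form = ('<form><label for="q">Search</label><input id="q" type="text">'
        '<input id="color" type="text"></form>')
print(unlabeled_controls(form))  # ['color']
```

Unlike the heading and table rules debated earlier in the thread, this check maps directly onto requirements in WCAG and Section 508, which is why the low pass rate is a stronger signal.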

Jon


On Dec 14, 2010, at 12:13 AM, John Foliot wrote:

Gunderson, Jon R wrote:

Rules Development Clarification

The rules were not developed only by people at the University of
Illinois, but were developed in an open forum of the Web Best Practices
Working Group:
http://collaborate.athenpro.org/group/web/
There are members from all over the united States.
Anyone can join the group and if people have better design rules the
group would love to hear and consider them for inclusion.

However, what has happened is that these rules are now being imposed on a
number of Higher Education institutions that have neither participated in
that rule making, nor have they agreed that they are what are required to
ensure accessibility. With no offense to you or the other participants in
the Best Practices group it strikes me that representatives of the
majority of the institutions evaluated are notably absent from the Working
Group; as such, your current rule-set is hardly universally accepted or
agreed to. It would seem that only those members who have developed the
FAE rule set should be judged by those rules. As well, there is a
difference between not meeting Best Practices and having web content that
is inaccessible - a nuanced point notably absent from your report and the
recent Chronicle in Higher Ed article.

I have pointed out the rules that I personally have issue with, yet the
FAE tool and rule set were used to judge pages at the institution where I
work. This now places either my professional experience and judgment into
question, or your groups judgment, as clearly we are in disagreement. I
have already pointed out the evaluation criteria I disagree with, and
await your response and justification - for example can you prove that a
page that lacks an H1 is inaccessible? I know I certainly can't, and
further can offer examples where a page without an H1 would still remain
totally accessible, and in fact could actually be an accessibility
enhancement - the long text explanatory page associated to @longdesc.

Asserting that not meeting all of your Best Practice rules = poor
accessibility is simply false.



The study analyzed over 20,000 web pages; please view the
data details:
http://webaccessibility.cita.illinois.edu/data/

...and not surprisingly, the issues I take most offence at are also the
ones that have the lowest mean average across the pages evaluated. This
should come as little surprise to those of us who are most actively
involved in this subject matter, as they are also the most subjective and
contentious Rules in the rule-set.

However, for CIOs, Senior Management in other positions at Universities,
and the general population reading that Chronicle article, this subtle
point is easily lost: they see a bottom line score with little
understanding on how that score was reached. In today's climate of the
recent Penn State action, this will lead to senior executives making snap
judgments based on flawed data, rather than asking the right kinds of
questions or striving to ensure real on-line accessibility. Web
accessibility professionals have long known and stated that true
accessibility is not a series of tick boxes on a shopping list, yet the
recent results released by iCITA are just that. The results cause as much
harm as they do good.


Grand Standing Charge Response

As to the charge that I am personally grandstanding, maybe so; I'll let
individuals make their own judgement.

I point not at you, but at the report you and your team at iCITA have
publicly released. While you are free to do what you think is best at your
institution, it places many of us in a position not of advancing the
larger issue, but defending and countering your evaluations - in part
because they suggest "Best Practices" that we were not party to creating
as *requirements* for real web accessibility. If you want to evaluate
against Section 508 or WCAG Guidelines that's one thing, but using nothing
but a programmatic evaluator and a rule that states that all TH's must
have an ID (or somehow it is now magically inaccessible) is one I cannot
endorse.

I totally understand the shock and awe effect of having a report that
'names and shames' higher ed institutions (after all, I too am well known
for going 'rogue' when fighting for web accessibility), but if you are
going to do that then the rules-set must be one that the larger community
already agrees to, and we don't have that here.


But without data on the inaccessibility of higher education websites
being publicly available, the inaccessibility will continue to
grow and get worse.
I talk to too many CIOs, IT professionals and vendors who tell me their
web sites are accessible because they have a policy or a law like
Section 508 that says it must be so.
Accessibility is more than policy, it requires setting design standards
(rules) and auditing the use of the design standards.

Fair enough, but imposing *your (ATHEN Collaboration) rules* and design
standards is not what they have agreed to, have been mandated to (by law
or internal policy), or use in internal auditing - and herein is the rub.
I personally advocate and strive for WCAG2-AA, where understanding the
goals (POUR) is significantly more important than tick-box reporting. This
report now sets many of us back in that regard, as 'passing' your tool's
subjective rule-set is now being seen as more important in some circles
than achieving real accessibility. Good for your tool, not so good for the
larger goals.



I hope people see this as an opportunity to raise awareness on their
campuses of accessibility.

However exactly the opposite is the result. Rather than talking about the
larger issues and advancing successes, many of us are left explaining why
our institutions did not fare well in your report, and explaining why some
of your criteria really have little to do with true accessibility. You've
put many of us who would normally be speaking in positive tones on the
defense - hardly a position to win support.


If you don't like the rules used in the data collection, I hope that
you will define your own campus design rules that support functional
accessibility by people with disabilities and also meet the design
needs of developers.

As you were conducting your review did you bother to ask the institutions
you were judging if they had such internal rules or policies? Or did you
simply start from the premise that your rules should be the rules we all
must follow? I posit that the latter is likely the case: again, judge your
Best Practices members' sites against your/their rules, but do not presume
to impose them on those who have not agreed to them.


I also hope you will make the design rules publicly available so people
with disabilities know what to expect when they get to your campuses
web sites.
Campuses need to treat accessibility like other IT issues, like
security.
They need to have people assigned web accessibility responsibilities
and they need to measure the implementation of their policies.

You are hardly telling me or others reading this something that we don't
already know. I am unclear how this report helps to achieve any of that -
rather than helping foster the right kind of ecosystems at higher ed it
sends everyone scrambling to eliminate images that are less than 8 pixels
wide or high; effort, time and resources that should be better used going
after the larger issues. (And if you think it a fanciful exaggeration that
some executive somewhere is going to insist on an audit of web pages in
search of such images, then you and I are not working in the same
universe - I pity the poor soul who draws that task.)

From: Julie Romanowski
Date: Tue, Dec 14 2010 11:48AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

Not outraged, but disappointed. Creating accessible form controls is not rocket science, and we should expect better from our universities.
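For readers following along, the basic labeling at issue looks something like this (a minimal sketch; the field names are hypothetical):

```html
<!-- Explicit association: the label element's "for" matches the control's id -->
<label for="search">Search the catalog</label>
<input type="text" id="search" name="q">

<!-- Where no visible label fits (e.g. a lone search box), a title
     attribute can supply an accessible name instead -->
<input type="text" name="q" title="Search the catalog">
```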

-----Original Message-----
From: = EMAIL ADDRESS REMOVED = [mailto: = EMAIL ADDRESS REMOVED = ] On Behalf Of Gunderson, Jon R
Sent: Tuesday, December 14, 2010 12:19 PM
To: WebAIM Discussion List
Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling

I find it interesting that people on this list were not outraged that less than 30 percent of the 19,722 pages tested with form controls had accessible labels.

http://webaccessibility.cita.illinois.edu/data/

I can't think of a more basic accessibility feature than using a label element or title attribute to label a form control.

The lack of form control labeling was my biggest conclusion from the pages tested and my biggest worry is how to address this issue.

I think everyone agrees that form control labeling is a part of WCAG 1.0, WCAG 2.0, Section 508 requirements and almost any other web accessibility standard developed.

If higher education can't even label simple form controls correctly, how are they ever going to make Dynamic HTML widgets accessible?

Jon

From: Birkir Rúnar Gunnarsson
Date: Tue, Dec 14 2010 12:03PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

Jon

I hardly dare ask *grin*, but was use of MathML or accessible math
considered at all in your survey? (I know, I will read the details of
the rules you created later; I was just wondering if you could give me
the quick and dirty answer to this.)
One would hardly expect accessible math on pages that do not have
their form fields in order, but I am very worried about the future of
STEM education as we move from the traditional campus environment with
readers or visual interpreters to online studies.
Thanks
-B


From: John Foliot
Date: Tue, Dec 14 2010 12:45PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

Gunderson, Jon R wrote:
>
> I find it interesting that people on this list were not outraged that
> less than 30 percent of the 19,722 pages tested with form controls had
> accessible labels.
>
> http://webaccessibility.cita.illinois.edu/data/

The problem here of course is that this important data-point is lost
amidst some of the other "Rules" that are suspect at best, and
outright false at worst. You take a mish-mash of important requirements
and mix them in with a slew of imposed Best Practices that many disagree
with - with a net result that you taint the good with the bad. Couple that
with the hysterical reporting we see at the Chronicle of Higher Ed (fueled
by your 'report') that focuses on *RANKING* rather than looking at where
we have real problems across the board and the net result is that any good
derived by your testing is off-set by that tabloid journalist approach.

I will repeat the controversial Rules here and once again ask you to
justify how failing to meet any of them results in inaccessible web
content:

*************

HEADING STRUCTURE:
"The page must contain at least one h1 element."
According to whom? While it is certainly good practice to ensure
each page has appropriate heading structure, nowhere (outside of the FAE
tool) is it *MANDATED* as such - a page that lacks an <h1> is not
intrinsically inaccessible. False data - False results!

"The page should contain no more than two h1 elements."
Please point to one national or international guideline or
recommendation that mandates this. Another false positive from a
mechanical tool, fueled by internal University of Illinois politics and
policies.

"The text content of each h1 element should match all or part of the title
content."
"Each h1 element should have text content exclusive of the alt text of any
img elements it contains."
Bull Feathers! Made up standards by a small team with an agenda to
promote their internal tool - and it should be noted that failing to do
either of these things in no way makes a page "less accessible" - it just
doesn't meet their FAE Guidelines.
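For reference, a page that would satisfy these particular FAE heading rules looks roughly like this (a hypothetical sketch illustrating the rules, not an endorsement of them):

```html
<head>
  <title>Admissions | Example University</title>
</head>
<body>
  <!-- A single h1 whose text matches part of the title content -->
  <h1>Admissions</h1>
  ...
</body>
```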


DATA TABLES:
"For each data table, the first cell in each column must be a th element,
and each row must contain at least one th element."
Patently FALSE! In fact, the table of school rankings at the
Chronicle of Higher Ed that Jon points to in his earlier email
(http://chronicle.com/article/BestWorst-College-Web/125642/) does not meet
this "pass" criteria, yet is not "inaccessible" because of it - in fact
the size of the table (183 rows in length with little-to-no internal
navigation) is more of an access issue than the failure for each row to
start with a <th>.

The following table is perfectly acceptable and valid, and meets (as far
as I know) all required accessibility guidelines as established by both
the Section 508 Standard and W3C Guidelines (yet fails the FAE tool):

<table>
<tr>
<td></td>
<th scope="col">Sunday</th>
<th scope="col">Monday</th>
<th scope="col">Tuesday</th>
<th scope="col">Wednesday</th>
<th scope="col">Thursday</th>
<th scope="col">Friday</th>
<th scope="col">Saturday</th>
</tr>
<tr>
<th scope="row">Week 1</th>
<td></td>
<td></td>
<td>1</td>
<td>2</td>
<td>3</td>
<td>4</td>
<td>5</td>
</tr>

...etc.
</table>

"Each th element in a complex data table must have an id attribute whose
value is unique relative to all ids on the page."
Please explain how failing to add an ID attribute to a table
header makes it less accessible.

"Each td element in a complex data table must have a headers attribute
that references the id attributes of associated th elements."
Please explain how failing to add a HEADER attribute to a table
cell makes it less accessible.
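For comparison, the id/headers technique these two rules mandate looks like this (a minimal sketch with hypothetical content); note that for a simple two-axis table like the calendar above, scope conveys the same header associations:

```html
<table>
  <tr>
    <td></td>
    <th id="mon">Monday</th>
    <th id="tue">Tuesday</th>
  </tr>
  <tr>
    <th id="wk1">Week 1</th>
    <td headers="wk1 mon">1</td>
    <td headers="wk1 tue">2</td>
  </tr>
</table>
```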

What defines "complex"? How does a mechanical tool make this
assessment? The table code example shown above is perfectly valid, is
extremely accessible, and would fail 3 of the 5 data-table 'rules' this
testing imposes on *your* sites. This is simply unacceptable.


IMAGES/ALT TEXT
"Each img element with width or height less than 8 pixels should be
removed; CSS techniques should be used instead."
Really? How exactly was this determined? If I have an image that
is 9 pixels by 2 pixels, then it should have alt text and not be moved to
CSS?
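Presumably the rule targets decorative spacer images; the substitution it envisions is something like the following (a hypothetical sketch):

```html
<!-- Before: a tiny spacer image used purely for layout -->
<td><img src="spacer.gif" width="1" height="1" alt=""></td>

<!-- After: the same spacing expressed in CSS, with no image at all -->
<td style="padding-left: 1px;"></td>
```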

*************

Until such time as clarification and proof exists to back up these claims,
the entire exercise is mired in inaccuracies and confusion.

Real Problems not evaluated in your testing:

* Link text that is meaningful when taken out of context
* Alt text that is meaningful
* Ensuring that all information conveyed with color is also available
without color
* Appropriate foreground and background contrast
* The ability to interact with a page without the need to use a mouse
(tabbing)
* The appropriate use of lists and list mark-up
* Multi-media issues: no auto-start, caption/transcripts for video
content, etc.
* No blinking, no auto-redirect, no timing-out with prior notice, etc.

I have also challenged you to clarify the fact that mechanical testing is
but one aspect of accessibility evaluation on the 183 Report Cards you
have issued at http://webaccessibility.cita.illinois.edu/data/schools/ so
that we have a more accurate and truthful report to properly discuss at
our institutions.


>
> I can't think of a more basic accessibility feature than using a label
> element or title attribute to label a form control.
>
> The lack of form control labeling was my biggest conclusion from the
> pages tested and my biggest worry is how to address this issue.

Yet nowhere in your evaluations or reporting is this fact highlighted. Why
is this?


>
> I think everyone agrees that form control labeling is a part of WCAG
> 1.0, WCAG 2.0, Section 508 requirements and almost any other web
> accessibility standard developed.
>
> If higher education can't even label simple form controls correctly,
> how are they ever going to make Dynamic HTML widgets accessible?

How about by focusing on real accessibility issues and not forcing
everyone to 'pass' the FAE ranking exercise by insisting that every table
cell have either a header or ID (when @scope is often more than enough),
or that "The text content of each h1 element should match all or part of
the title content"? Engage in real dialog rather than perpetuating
boogie-man scare tactics and tick-box evaluations that leave higher
education institutions worrying about their 'ranking' rather than truly
meeting the needs of disabled students within their student body.


JF
============================
John Foliot

NOTE: These are my personal opinions, and in no way reflect the opinion of
Stanford University (with whom I am under contract), T-Base Communications
(my employer), my associates or other professional affiliates with whom I
do business.

Co-chair - W3C HTML5 Accessibility Task Force (Media)
http://www.w3.org/WAI/PF/HTML/wiki/Main_Page

============================

From: Michael.Moore@dars.state.tx.us
Date: Tue, Dec 14 2010 1:51PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

The results for forms are not evenly distributed. There is a large cluster of schools that did very well (90% or more passing) and another cluster that did abysmally (less than 20% passing); thus the very poor sites are pulling down the average. It would appear from a quick review of the form field data that the folks behind the sites either understand form labels well or not at all, and that the number of sites that understand the issue well exceeds the number that do not. In my view the average is rather meaningless due to the way the data is distributed. Perhaps the median would be a better representation of how the schools are doing with respect to form fields.

Another interesting set of data would be whether the sites are maintained internally by the schools or if they are outsourced. I wonder if there is any correlation between the use of form labels and the contractor.

Mike Moore



From: Cliff Tyllick
Date: Tue, Dec 14 2010 4:45PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

Actually, Mike, UCLA and Harvard share the median score of 6.3. (One school had a score of N/A, which I take to mean that it had no online forms.)

From 100 down to the median, scores are distributed quite uniformly:
16 have scores of 90.1 to 100
12 have scores of 80.1 to 90
10 have scores of 70.1 to 80
5 have scores of 60.1 to 70
3 have scores of 50.1 to 60 (the first quartile ends with this group)
11 have scores of 40.1 to 50
11 have scores of 30.1 to 40
8 have scores of 20.1 to 30
8 have scores of 10.1 to 20
5 have scores between 6.3 (the median) and 10

Below the median, 45 schools (the third quartile) scored something -- at least 0.5.

Another 45 schools (the fourth quartile) had scores of 0.

But, as you point out, the data mean little without more information:
- Is this score based on one application form per institution, all of them, or some random subset?
- If a subset, how do we know the forms that were graded are the most important application forms for that institution's students?
- Do any institutions outsource site development? If so, how do they do?
- Do all institutions have someone in charge of compliance? If not, how do the sites of those that do compare to those that don't? And does it make a difference whether "in charge of compliance" means "has the backing of upper management"?
- As with most large enterprises, colleges and universities tend to have fairly autonomous subunits. Do the more balkanized schools suffer in this score? Or is that where having an Accessibility Czar particularly matters?
- Are any of these schools preparing to roll out significant redesigns to address usability and accessibility?

Oh, and one more important point: Not all schools are of the opinion that they are subject to Section 508. They each have their own lawyers, and each lawyer is entitled to interpret the law and its impact on that institution. Are the schools that scored low convinced that they are exempt, if not from the law, at least from any meaningful penalties under the law?

We simply can't tell.

Cliff

Cliff Tyllick
Usability assessment coordinator
Agency Communications Division
Texas Commission on Environmental Quality
512-239-4516
= EMAIL ADDRESS REMOVED =

>>> On 12/14/2010 at 2:51 PM, in message < = EMAIL ADDRESS REMOVED = >, < = EMAIL ADDRESS REMOVED = > wrote:
The results for forms are not evenly distributed. There is a large cluster of schools that did very well (90% or more passing) and another cluster that did abysmally (less than 20% passing) thus the very poor sites are pulling down the average. It would appear from a quick review of the data for form fields that the folks behind the sites either understand form labels well or not at all and that the number of sites that understand the issue well exceeds the number who do not. In my view the average is rather meaningless due to the way that the data is distributed. Perhaps the median would be better representative of how the schools are doing with respect to form fields.

Another interesting set of data would be whether the sites are maintained internally by the schools or if they are outsourced. I wonder if there is any correlation between the use of form labels and the contractor.

Mike Moore


-----Original Message-----
From: = EMAIL ADDRESS REMOVED = [mailto: = EMAIL ADDRESS REMOVED = ] On Behalf Of Gunderson, Jon R
Sent: Tuesday, December 14, 2010 12:19 PM
To: WebAIM Discussion List
Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling

I find it interesting that people on this list were not outraged that less than 30 percent of the 19,722 pages tested with form controls had accessible labels.

http://webaccessibility.cita.illinois.edu/data/

I can't think of a more basic accessibility feature than using a label element or title attribute to label a form control.

The lack of form control labeling was my biggest conclusion from the pages tested and my biggest worry is how to address this issue.

I think everyone agrees that form control labeling is a part of WCAG 1.0, WCAG 2.0, Section 508 requirements and almost any other web accessibility standard developed.

If higher education can't even label simple form controls correctly, how are they ever going to make Dynamic HTML widgets accessible?

Jon


On Dec 14, 2010, at 12:13 AM, John Foliot wrote:

Gunderson, Jon R wrote:

Rules Development Clarification

The rules were not developed only by people at the University of
Illinois, but were developed in an open forum of the Web Best Practices
Working Group:
http://collaborate.athenpro.org/group/web/
There are members from all over the united States.
Anyone can join the group and if people have better design rules the
group would love to hear and consider them for inclusion.

However, what has happened is that these rules are now being imposed on a
number of Higher Education institutions that have neither participated in
that rule making, nor have they agreed that they are what are required to
ensure accessibility. With no offense to you or the other participants in
the Best Practices group it strikes me that representatives of the
majority of the institutions evaluated are notably absent from the Working
Group; as such, your current rule-set is hardly universally accepted or
agreed to. It would seem that only those members who have developed the
FAE rule set should be judged by those rules. As well, there is a
difference between not meeting Best Practices and having web content that
is inaccessible - a nuanced point notably absent from your report and the
recent Chronicle in Higher Ed article.

I have pointed out the rules that I personally have issue with, yet the
FAE tool and rule set were used to judge pages at the institution where I
work. This now calls into question either my professional experience and
judgment or your group's judgment, as clearly we are in disagreement. I
have already pointed out the evaluation criteria I disagree with, and
await your response and justification - for example can you prove that a
page that lacks an H1 is inaccessible? I know I certainly can't, and
further can offer examples where a page without an H1 would still remain
totally accessible, and in fact could actually be an accessibility
enhancement - the long text explanatory page associated to @longdesc.
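
To illustrate (the filenames here are purely hypothetical):

<img src="enrollment.png" alt="Enrollment trends, 2000-2010"
     longdesc="enrollment-desc.html">

Here enrollment-desc.html is a plain prose page describing the chart in
detail; such a page can be fully accessible without any h1 at all.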

Asserting that not meeting all of your Best Practice rules = poor
accessibility is simply false.



The study analyzed over 20,000 web pages; please view the data
details:
http://webaccessibility.cita.illinois.edu/data/

...and not surprisingly the issues I take most offence at are also the
ones with the lowest mean scores across the pages evaluated. This
should come as little surprise to those of us who are most actively
involved in this subject matter, as they are also the most subjective and
contentious Rules in the rule-set.

However, for CIOs, Senior Management in other positions at Universities,
and the general population reading that Chronicle article, this subtle
point is easily lost: they see a bottom line score with little
understanding on how that score was reached. In today's climate of the
recent Penn State action, this will lead to senior executives making snap
judgments based on flawed data, rather than asking the right kinds of
questions or striving to ensure real on-line accessibility. Web
accessibility professionals have long known and stated that true
accessibility is not a series of tick boxes on a shopping list, yet the
recent results released by iCITA are just that. The results cause as much
harm as they do good.


Grandstanding Charge Response

As to the charge that I personally am grandstanding: maybe so. I'll let
individuals make their own judgment.

I point not at you, but at the report you and your team at iCITA have
publicly released. While you are free to do what you think is best at your
institution, it places many of us in a position not of advancing the
larger issue, but defending and countering your evaluations - in part
because they suggest "Best Practices" that we were not party to creating
as *requirements* for real web accessibility. If you want to evaluate
against Section 508 or WCAG Guidelines that's one thing, but using nothing
but a programmatic evaluator and a rule that states that all TH's must
have an ID (or somehow it is now magically inaccessible) is one I cannot
endorse.

I totally understand the shock and awe effect of having a report that
'names and shames' higher ed institutions (after all, I too am well known
for going 'rogue' when fighting for web accessibility), but if you are
going to do that then the rules-set must be one that the larger community
already agrees to, and we don't have that here.


But without data on the inaccessibility of higher education websites
being publicly available the inaccessibility will still continue to
grow and get worse.
I talk to many CIOs, IT professionals and vendors who tell me their
web sites are accessible because they have a policy or a law like
Section 508 that says it must be so.
Accessibility is more than policy, it requires setting design standards
(rules) and auditing the use of the design standards.

Fair enough, but imposing *your (ATHEN Collaboration) rules* and design
standards is not what they have agreed to, have been mandated to (by law
or internal policy), or use in internal auditing - and herein is the rub.
I personally advocate and strive for WCAG2-AA, where understanding the
goals (POUR) is significantly more important than tick-box reporting. This
report now sets many of us back in that regard, as 'passing' your tool's
subjective rule-set is now being seen as more important in some circles
than achieving real accessibility. Good for your tool, not so good for the
larger goals.



I hope people see this as an opportunity to raise awareness on their
campuses of accessibility.

However exactly the opposite is the result. Rather than talking about the
larger issues and advancing successes, many of us are left explaining why
our institutions did not fare well in your report, and explaining why some
of your criteria really have little to do with true accessibility. You've
put many of us who would normally be speaking in positive tones on the
defensive - hardly a position to win support.


If you don't like the rules used in the data collection, I hope that
you will define your own campus design rules that support functional
accessibility by people with disabilities and also meet the design
needs of developers.

As you were conducting your review did you bother to ask the institutions
you were judging if they had such internal rules or policies? Or did you
simply start from the premise that your rules should be the rules we all
must follow? I posit that the latter is likely the case: again, judge your
Best Practices members' sites against your/their rules, but do not presume
to impose them on those who have not agreed to them.


I also hope you will make the design rules publicly available so people
with disabilities know what to expect when they get to your campuses
web sites.
Campuses need to treat accessibility like other IT issues, like
security.
They need to have people assigned web accessibility responsibilities
and they need to measure the implementation of their policies.

You are hardly telling me or others reading this something that we don't
already know. I am unclear how this report helps to achieve any of that -
rather than helping foster the right kind of ecosystems at higher ed it
sends everyone scrambling to eliminate images that are less than 8 pixels
wide or high; effort, time and resources that should be better used going
after the larger issues. (And if you think that the idea of some executive
somewhere insisting on an audit of web pages in search of such images is a
fanciful exaggeration, then you and I are not working in the same
universe - I pity the poor soul who draws that task.)

From: Gunderson, Jon R
Date: Wed, Dec 15 2010 6:03AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

I am not forcing anyone to do anything.

People will review the data and make their own conclusions.

Hopefully good ones that improve accessibility and help open discussion at universities on how to manage and provide administrative controls for web accessibility, which seem to be sorely lacking.

Web accessibility is more than setting a policy and hoping developers do the right thing; it needs to be managed and audited just like other IT issues, such as security.

You either believe that you need to provide developers with coding practices for accessibility, or you don't.

In my work with web developers I would give presentations on Section 508 or WCAG, and many of them would come up to me afterward and tell me they understood some of the requirements but didn't understand most of the others. They asked if I could just tell them what to do; they would trust that I was providing them with coding practices that met the requirements.

The best practices are about what works for web developers and people with disabilities based on current technologies and AT implementations (AT support is a big factor in form control labeling).

Best Practices:
http://html.cita.illinois.edu

For the most part I have found developers have embraced the best practices since they make more sense to them than trying to understand the requirements of Section 508 or WCAG 2.0.

They also seem to make sense to a lot of other people outside Illinois because they have been independently implemented, even at less well known schools, like Missouri State University:
http://webaccessibility.cita.illinois.edu/data/school/204/

The best practices are based on web standards, so developers gain the benefit of standards-based design. Many developers trying to follow standards-based design have already found them on their own.

Doug Bergett of the University of Illinois, an early adopter of the best practices who helped in their development, has a great quote:

"I learned best practices because of accessibility, but I use them because they are better web design"

This is the type of win-win situation we need to promote web accessibility. If accessibility is thought of as a burden and not an integral part of the design process, we will not get very far, especially in the Web 2.0 world around us.

If there are better coding practices for meeting Section 508 or WCAG 2.0 requirements I would love to hear about them, and I am sure other people on this list would also like to hear them too.

If you don't like these best practices I hope that it will spur discussion on your campus on what are best practices for your campus and I hope you will share the results with the rest of us.

Jon


On Dec 14, 2010, at 1:44 PM, John Foliot wrote:

> Gunderson, Jon R wrote:
>>
>> I find it interesting that people on this list were not outraged that
>> less than 30 percent of the 19,722 pages tested with form controls had
>> accessible labels.
>>
>> http://webaccessibility.cita.illinois.edu/data/
>
> The problem here of course is that this important data-point is lost
> amidst some of the other "Rules" that are circumspect at best, and
> outright false at worst. You take a mish-mash of important requirements
> and mix them in with a slew of imposed Best Practices that many disagree
> with - with a net result that you taint the good with the bad. Couple that
> with the hysterical reporting we see at the Chronicle of Higher Ed (fueled
> by your 'report') that focuses on *RANKING* rather than looking at where
> we have real problems across the board and the net result is that any good
> derived by your testing is off-set by that tabloid journalist approach.
>
> I will repeat the controversial Rules here and once again ask you to
> justify how failing to meet any of them results in inaccessible web
> content:
>
> *************
>
> HEADING STRUCTURE:
> "The page must contain at least one h1 element."
> According to whom? While it is certainly good practice to ensure
> each page has appropriate heading structure, nowhere (outside of the FAE
> tool) is it *MANDATED* as such - a page that lacks an <h1> is not
> intrinsically inaccessible. False data - False results!
>
> "The page should contain no more than two h1 elements."
> Please point to one national or international guideline or
> recommendation that mandates this. Another false positive from a
> mechanical tool, fueled by internal University of Illinois politics and
> policies.
>
> "The text content of each h1 element should match all or part of the title
> content."
> "Each h1 element should have text content exclusive of the alt text of any
> img elements it contains."
> Bull Feathers! Made up standards by a small team with an agenda to
> promote their internal tool - and it should be noted that failing to do
> either of these things in no way makes a page "less accessible" - it just
> doesn't meet their FAE Guidelines.
>
>
> DATA TABLES:
> "For each data table, the first cell in each column must be a th element,
> and each row must contain at least one th element."
> Patently FALSE! In fact, the table of school rankings at the
> Chronicle of Higher Ed that Jon points to in his earlier email
> (http://chronicle.com/article/BestWorst-College-Web/125642/) does not meet
> this "pass" criteria, yet is not "inaccessible" because of it - in fact
> the size of the table (183 rows in length with little-to-no internal
> navigation) is more of an access issue than the failure for each row to
> start with a <th>.
>
> The following table is perfectly acceptable and valid, and meets (as far
> as I know) all required accessibility guidelines as established by both
> the Section 508 Standard and W3C Guidelines (yet fails the FAE tool):
>
> <table>
> <tr>
> <td></td>
> <th scope="col">Sunday</th>
> <th scope="col">Monday</th>
> <th scope="col">Tuesday</th>
> <th scope="col">Wednesday</th>
> <th scope="col">Thursday</th>
> <th scope="col">Friday</th>
> <th scope="col">Saturday</th>
> </tr>
> <tr>
> <th scope="row">Week 1</th>
> <td></td>
> <td></td>
> <td>1</td>
> <td>2</td>
> <td>3</td>
> <td>4</td>
> <td>5</td>
> </tr>
>
> ...etc.
> </table>
>
> "Each th element in a complex data table must have an id attribute whose
> value is unique relative to all ids on the page."
> Please explain how failing to add an ID attribute to a table
> header makes it less accessible.
>
> "Each td element in a complex data table must have a headers attribute
> that references the id attributes of associated th elements."
> Please explain how failing to add a HEADER attribute to a table
> cell makes it less accessible.
>
> What defines "complex"? How does a mechanical tool make this
> assessment? The table code example shown above is perfectly valid, is
> extremely accessible, and would fail 3 of the 5 data-table 'rules' this
> testing imposes on *your* sites. This is simply unacceptable.
>
>
> IMAGES/ALT TEXT
> "Each img element with width or height less than 8 pixels should be
> removed; CSS techniques should be used instead."
> Really? How exactly was this determined? If I have an image that
> is 9 pixels X 2 pixels then it should have alt text and not be moved to
> CSS?
>
> *************
>
> Until such time as clarification and proof exists to back up these claims,
> the entire exercise is mired in inaccuracies and confusion.
>
> Real Problems not evaluated in your testing:
>
> * Link text that is meaningful when taken out of context
> * Alt text that is meaningful
> * Ensuring that all information conveyed with color is also available
> without color
> * Appropriate foreground and background contrast
> * The ability to interact with a page without the need to use a mouse
> (tabbing)
> * The appropriate use of lists and list mark-up
> * Multi-media issues: no auto-start, caption/transcripts for video
> content, etc.
> * No blinking, no auto-redirect, no timing-out without prior notice, etc.
>
> I have also challenged you to clarify the fact that mechanical testing is
> but one aspect of accessibility evaluation on the 183 Report Cards you
> have issued at http://webaccessibility.cita.illinois.edu/data/schools/ so
> that we have a more accurate and truthful report to properly discuss at
> our institutions.
>
>
>>
>> I can't think of a more basic accessibility feature than using a label
>> element or title attribute to label a form control.
>>
>> The lack of form control labeling was my biggest conclusion from the
>> pages tested and my biggest worry is how to address this issue.
>
> Yet nowhere in your evaluations or reporting is this fact highlighted. Why
> is this?
>
>
>>
>> I think everyone agrees that form control labeling is a part of WCAG
>> 1.0, WCAG 2.0, Section 508 requirements and almost any other web
>> accessibility standard developed.
>>
>> If higher education can't even label simple form controls correctly,
>> how are they ever going to make Dynamic HTML widgets accessible?
>
> How about by focusing on real accessibility issues and not forcing
> everyone to 'pass' the FAE ranking exercise by insisting that every table
> cell have either a header or ID (when @scope is often more than enough),
> or that "The text content of each h1 element should match all or part of
> the title content"? Engaging in real dialog rather than perpetuating
> boogie-man scare tactics and tick-box evaluations that leave higher
> education institutions worrying about their 'ranking' rather than truly
> meeting the needs of disabled students within their student body.
>
>
> JF
> ============================
> John Foliot
>
> NOTE: These are my personal opinions, and in no way reflect the opinion of
> Stanford University (with whom I am under contract), T-Base Communications
> (my employer), my associates or other professional affiliates with whom I
> do business.
>
> Co-chair - W3C HTML5 Accessibility Task Force (Media)
> http://www.w3.org/WAI/PF/HTML/wiki/Main_Page
>
> ============================
>

From: Gunderson, Jon R
Date: Wed, Dec 15 2010 6:09AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

No I do not look for the use of MathML in the review.

Typically MathML would not be found on the administrative websites that I reviewed.

I would expect MathML to be found more on instructional websites, which are usually behind password protection.

Jon

On Dec 14, 2010, at 1:00 PM, Birkir Rúnar Gunnarsson wrote:

> Jon
>
> I hardly dare ask *grin* but was use of MathML or accessible math
> considered at all in your survey? (I know, I will read the details of
> the rules you created later; I was just wondering if you could give me
> the quick and dirty answer to this.)
> One would hardly expect accessible math on pages that do not have
> their form fields in order, but I am very worried about the future of
> STEM education as we move from the traditional campus environment with
> readers or visual interpreters to online studies.
> Thanks
> -B
>
> On 12/14/10, Julie Romanowski < = EMAIL ADDRESS REMOVED = > wrote:
>> Not outraged, but disappointed. Creating accessible form controls is not
>> rocket science, and we should expect better from our universities.
>>
>> -----Original Message-----
>> From: = EMAIL ADDRESS REMOVED =
>> [mailto: = EMAIL ADDRESS REMOVED = ] On Behalf Of Gunderson, Jon R
>> Sent: Tuesday, December 14, 2010 12:19 PM
>> To: WebAIM Discussion List
>> Subject: Re: [WebAIM] Chronicle of Higher Education article "Colleges Lock
>> Out Blind Students Online" Chronicle Article and form control labeling
>>
>> I find it interesting that people on this list were not outraged that less
>> than 30 percent of the 19,722 pages tested with form controls had accessible
>> labels.
>>
>> http://webaccessibility.cita.illinois.edu/data/
>>
>> I can't think of a more basic accessibility feature than using a label
>> element or title attribute to label a form control.
>>
>> The lack of form control labeling was my biggest conclusion from the pages
>> tested and my biggest worry is how to address this issue.
>>
>> I think everyone agrees that form control labeling is a part of WCAG 1.0,
>> WCAG 2.0, Section 508 requirements and almost any other web accessibility
>> standard developed.
>>
>> If higher education can't even label simple form controls correctly, how are
>> they ever going to make Dynamic HTML widgets accessible?
>>
>> Jon
>>

From: Cliff Tyllick
Date: Wed, Dec 15 2010 10:06AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

Jon, I certainly understand your approach of telling Web developers simple steps to follow to produce content that should be accessible. In fact, I agree that it would be better for us to realize that many people who create content online cannot or will not take the time to learn the fundamentals behind P-O-U-R. For example, nearly all of the content creators at my agency are hired and evaluated to do other tasks, not to put content online. Yet they create the forms, publications, and Web pages that make up a great deal of our website.

But it's one thing to give those folks rules of thumb that they can follow to increase the likelihood that their content is accessible, and quite another to use those rules of thumb to evaluate work produced by people who have their own ways of producing accessible content. Others have pointed out the shortcomings of several of the FAE guidelines as absolute measures of accessibility, so let's focus on the one feature on which we seem to have universal agreement: Did online application forms have field labels?

Sort the list on that parameter. (In the Chron article, just click the arrows in that column's header cell.) When you do, you should notice the same thing I observed:

At one end of the scale, three schools that scored 0 on application forms were in the top 10 percent overall:
U of Evansville, ranked 9th overall
U of Tulsa, ranked 12th overall
Loyola-Chicago, ranked 16th overall
So a school can score in the 90th percentile of this overall measure without having a single field in its online application forms labeled properly. If the application form is the way to get into the school, how can we consider such a school to have mastered accessibility?

At the other end of the scale, these schools ranked in the top 10 percent in their score on application forms, but did not do nearly so well overall:
U of Louisville, 95th in the overall score (95.1 on application forms)
CSU-Sacramento, 94th (95.1)
CSU-Long Beach, 56th (95)
U of Memphis, 143rd (92.9)
San Jose State, 47th (92.7)
U of Arizona, 60th (92.4)
St. Joseph's, 55th (91.9)
CSU-Monterey Bay, 97th (89.7)

It's hard to know what these disparities mean. In some instances, they might reflect prioritization -- "We can't solve everything at once; let's fix the forms first." In others, they might reflect a lack of cooperation -- "We have no control over site design, and the folks who do have ignored our advice."

But in every case, it means that someone who has done something very important the right way is affiliated with a school that, some people will infer, isn't doing enough to make their sites accessible. Because of where this list was published, "some people" is a random subset of the higher-education community in general, most of whom know very little about Web accessibility.

And sometimes people who make that inference will blame the person who is leading the effort, not the people who are holding it back.

Cliff

Cliff Tyllick
Usability assessment coordinator
Agency Communications Division
Texas Commission on Environmental Quality
512-239-4516
= EMAIL ADDRESS REMOVED =

>>> On 12/15/2010 at 7:01 AM, in message < = EMAIL ADDRESS REMOVED = >, "Gunderson, Jon R" < = EMAIL ADDRESS REMOVED = > wrote:
I am not forcing anyone to do anything.

People will review the data and make their own conclusions.

Hopefully good ones that improve accessibility and help open discussion at universities on how to manage and provide administrative controls for web accessibility, which seem to be sorely lacking.

Web accessibility is more than setting a policy and hoping developers do the right thing; it needs to be managed and audited just like other IT issues, such as security.

You either believe that you need to provide developers with coding practices for accessibility, or you don't.

In my work with web developers I would give presentations on Section 508 or WCAG, and many of them would come up to me afterward and tell me they understood some of the requirements but didn't understand most of the others. They asked if I could just tell them what to do; they would trust that I was providing them with coding practices that met the requirements.

The best practices are about what works for web developers and people with disabilities based on current technologies and AT implementations (AT support is a big factor in form control labeling).

Best Practices:
http://html.cita.illinois.edu

For the most part I have found developers have embraced the best practices since they make more sense to them than trying to understand the requirements of Section 508 or WCAG 2.0.

They also seem to make sense to a lot of other people outside Illinois because they have been independently implemented, even at less well known schools, like Missouri State University:
http://webaccessibility.cita.illinois.edu/data/school/204/

The best practices are based on web standards, so developers gain the benefit of standards-based design. Many developers trying to follow standards-based design have already found them on their own.

Doug Bergett of the University of Illinois, an early adopter of the best practices who helped in their development, has a great quote:

"I learned best practices because of accessibility, but I use them because they are better web design"

This is the type of win-win situation we need to promote web accessibility. If accessibility is thought of as a burden and not an integral part of the design process, we will not get very far, especially in the Web 2.0 world around us.

If there are better coding practices for meeting Section 508 or WCAG 2.0 requirements I would love to hear about them, and I am sure other people on this list would also like to hear them too.

If you don't like these best practices I hope that it will spur discussion on your campus on what are best practices for your campus and I hope you will share the results with the rest of us.

Jon


On Dec 14, 2010, at 1:44 PM, John Foliot wrote:

> Gunderson, Jon R wrote:
>>
>> I find it interesting that people on this list were not outraged that
>> less than 30 percent of the 19,722 pages tested with form controls had
>> accessible labels.
>>
>> http://webaccessibility.cita.illinois.edu/data/
>
> The problem here of course is that this important data-point is lost
> amidst some of the other "Rules" that are suspect at best, and
> outright false at worst. You take a mish-mash of important requirements
> and mix them in with a slew of imposed Best Practices that many disagree
> with - with a net result that you taint the good with the bad. Couple that
> with the hysterical reporting we see at the Chronicle of Higher Ed (fueled
> by your 'report') that focuses on *RANKING* rather than looking at where
> we have real problems across the board and the net result is that any good
> derived by your testing is off-set by that tabloid journalist approach.
>
> I will repeat the controversial Rules here and once again ask you to
> justify how failing to meet any of them results in inaccessible web
> content:
>
> *************
>
> HEADING STRUCTURE:
> "The page must contain at least one h1 element."
> According to whom? While it is certainly good practice to ensure
> each page has appropriate heading structure, nowhere (outside of the FAE
> tool) is it *MANDATED* as such - a page that lacks an <h1> is not
> intrinsically inaccessible. False data - False results!
>
> "The page should contain no more than two h1 elements."
> Please point to one national or international guideline or
> recommendation that mandates this. Another false positive from a
> mechanical tool, fueled by internal University of Illinois politics and
> policies.
>
> "The text content of each h1 element should match all or part of the title
> content."
> "Each h1 element should have text content exclusive of the alt text of any
> img elements it contains."
> Bull Feathers! Made up standards by a small team with an agenda to
> promote their internal tool - and it should be noted that failing to do
> either of these things in no way makes a page "less accessible" - it just
> doesn't meet their FAE Guidelines.
>
>
> DATA TABLES:
> "For each data table, the first cell in each column must be a th element,
> and each row must contain at least one th element."
> Patently FALSE! In fact, the table of school rankings at the
> Chronicle of Higher Ed that Jon points to in his earlier email
> (http://chronicle.com/article/BestWorst-College-Web/125642/) does not meet
> this "pass" criteria, yet is not "inaccessible" because of it - in fact
> the size of the table (183 rows in length with little-to-no internal
> navigation) is more of an access issue than the failure for each row to
> start with a <th>.
>
> The following table is perfectly acceptable and valid, and meets (as far
> as I know) all required accessibility guidelines as established by both
> the Section 508 Standard and W3C Guidelines (yet fails the FAE tool):
>
> <table>
> <tr>
> <td></td>
> <th scope="col">Sunday</th>
> <th scope="col">Monday</th>
> <th scope="col">Tuesday</th>
> <th scope="col">Wednesday</th>
> <th scope="col">Thursday</th>
> <th scope="col">Friday</th>
> <th scope="col">Saturday</th>
> </tr>
> <tr>
> <th scope="row">Week 1</th>
> <td></td>
> <td></td>
> <td>1</td>
> <td>2</td>
> <td>3</td>
> <td>4</td>
> <td>5</td>
> </tr>
>
> ...etc.
> </table>
>
> "Each th element in a complex data table must have an id attribute whose
> value is unique relative to all ids on the page."
> Please explain how failing to add an ID attribute to a table
> header makes it less accessible.
>
> "Each td element in a complex data table must have a headers attribute
> that references the id attributes of associated th elements."
> Please explain how failing to add a HEADER attribute to a table
> cell makes it less accessible.
>
> What defines "complex"? How does a mechanical tool make this
> assessment? The table code example shown above is perfectly valid, is
> extremely accessible, and would fail 3 of the 5 data-table 'rules' this
> testing imposes on *your* sites. This is simply unacceptable.
>
>
> IMAGES/ALT TEXT
> "Each img element with width or height less than 8 pixels should be
> removed; CSS techniques should be used instead."
> Really? How exactly was this determined? If I have an image that
> is 9 pixels X 2 pixels then it should have alt text and not be moved to
> CSS?
>
> *************
>
> Until such time as clarification and proof exists to back up these claims,
> the entire exercise is mired in inaccuracies and confusion.
>
> Real Problems not evaluated in your testing:
>
> * Link text that is meaningful when taken out of context
> * Alt text that is meaningful
> * Ensuring that all information conveyed with color is also available
> without color
> * Appropriate foreground and background contrast
> * The ability to interact with a page without the need to use a mouse
> (tabbing)
> * The appropriate use of lists and list mark-up
> * Multi-media issues: no auto-start, caption/transcripts for video
> content, etc.
> * No blinking, no auto-redirect, no timing-out without prior notice, etc.
>
> I have also challenged you to clarify the fact that mechanical testing is
> but one aspect of accessibility evaluation on the 183 Report Cards you
> have issued at http://webaccessibility.cita.illinois.edu/data/schools/ so
> that we have a more accurate and truthful report to properly discuss at
> our institutions.
>
>
>>
>> I can't think of a more basic accessibility feature than using a label
>> element or title attribute to label a form control.
>>
>> The lack of form control labeling was my biggest conclusion from the
>> pages tested and my biggest worry is how to address this issue.
>
> Yet nowhere in your evaluations or reporting is this fact highlighted. Why
> is this?
>
>
>>
>> I think everyone agrees that form control labeling is a part of WCAG
>> 1.0, WCAG 2.0, Section 508 requirements and almost any other web
>> accessibility standard developed.
>>
>> If higher education can't even label simple form controls correctly,
>> how are they ever going to make Dynamic HTML widgets accessible?
>
> How about by focusing on real accessibility issues and not forcing
> everyone to 'pass' the FAE ranking exercise by insisting that every table
> cell have either a header or ID (when @scope is often more than enough),
> or that "The text content of each h1 element should match all or part of
the title content"? How about engaging in real dialog rather than
perpetuating boogie-man scare tactics and tick-box evaluations that leave
higher education institutions worrying about their 'ranking' rather than
truly meeting the needs of disabled students within their student body?
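[For context, the two data-table techniques at issue look roughly like this; a simplified sketch with invented data:]

```html
<!-- scope: generally sufficient for simple tables with one level of headers -->
<table>
  <tr><th scope="col">School</th><th scope="col">Rank</th></tr>
  <tr><td>Example U.</td><td>42</td></tr>
</table>

<!-- headers/id: heavier markup, typically needed only for complex tables -->
<table>
  <tr><th id="school">School</th><th id="rank">Rank</th></tr>
  <tr><td headers="school">Example U.</td><td headers="rank">42</td></tr>
</table>
```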
>
>
> JF
> ============================
> John Foliot
>
> NOTE: These are my personal opinions, and in no way reflect the opinion of
> Stanford University (with whom I am under contract), T-Base Communications
> (my employer), my associates, or other professional affiliates with whom I
> do business.
>
> Co-chair - W3C HTML5 Accessibility Task Force (Media)
> http://www.w3.org/WAI/PF/HTML/wiki/Main_Page
>
> ============================
>

From: John Foliot
Date: Wed, Dec 15 2010 11:24AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

Gunderson, Jon R wrote:
>
> I am not forcing anyone to do anything.

By publishing a ranking, and by having the Chronicle of Higher Education
re-publish that data as "The Best and Worst" as some form of definitive
listing, you are forcing non-participants in your Best Practices exercise
to meet *your* criteria. I completely and vehemently disagree with a
number of those Best Practices (as do many others, including WebAIM's
Jared Smith), have said so publicly here on this thread twice now, and
have further asked you to justify how they ensure accessibility.

You have steadfastly avoided answering those questions and providing
those justifications, yet you presume to judge other institutions'
'success or failure' by these flawed Rules. You are attempting to force
institutions to abide by your rule-set, or those institutions will
continue to rank lower in your Best and Worst list. It amounts to
blackmail.

Will you be answering my questions? Yes or No?


>
> People will review the data and make their own conclusions.
>
> Hopefully good ones for improving accessibility and help open
> discussion at universities on how to manage and provide administrative
> controls for web accessibility, which seem to be sorely lacking.

I think my conclusions are pretty clear to everyone, but I now get to
spend the next couple of weeks running around putting out false fires at
my institution: explaining to managers why it's not crucial to meet some of
*your* best practices to ensure accessibility, how there is widespread
disagreement about those best practices, and that "...passing these rules
doesn't mean you are accessible, it just means you have the markup for
accessibility. There are many manual tests that must be made..." etc.,
etc. And I am telling you, publicly, that this does ABSOLUTELY NOTHING TO
ADVANCE THE GOALS AT MY INSTITUTION. You have put me, and others at
similar higher education institutions, on the defensive, running around
undoing the damage you have caused here. (8 pixel images indeed!)


>
> Web accessibility is more than setting a policy and hoping developers
> do the right thing, it needs to be managed and audited just like other
> IT issues like security.

But you did not do an 'audit'; you did a mechanical check with no
thought process involved, simply a series of tick-box checks using Rules
that neither ensure accessibility when met nor, when missing, impede
accessibility, and then you had the audacity to publish the results as a
ranking of the Best and Worst. It's a sham!

Web accessibility is more than mechanically testing a page and expecting
that if all the tick-boxes are checked off, the page is accessible; it
needs to be an educational process, just like any high-level cognitive
success process, be it *real* security or *real* accessibility. The
intelligent use of tools in that pursuit is important, but you've shifted
the entire burden of "success or failure" and Best or Worst ranking not
onto progress, education, or real accessibility, but onto the fact that
you believe pages should meet *your* Best Practices, as tested by a tool
*you* built, without question, or they fail and get added to the "Worst"
list.


>
> You either believe that you need to provide developers with coding
> practices for accessibility or not.

Jon, this is me you are talking to. I have spent the last 11 years of my
life committed to this cause, and you don't need to tell me what must be
done. We all know that educating developers and teaching them how to do
things right, and how to use tools to ensure they are meeting success *IS*
the job, and to presume that anyone reading this list doesn't get that
already is preposterous.

Many other dedicated people in Higher Ed also understand that, and have
spent years working towards that goal, yet you presume to judge their
institutions and rank their 'success'. People such as Terrill Thompson
(University of Washington - #113 in your list), Jared Smith (University of
Utah - #38), Lisa Fiedor (North Carolina State - #72) Harold Kramer and
Cath Stager Kilcommons (U. of Colorado at Boulder - #137), Gregg
Vanderheiden and Ben Caldwell (U. of Wisconsin at Madison - #96), me
(Stanford - #109)... it's preposterous, it's offensive, and it's
insulting.


>
> In my work with web developers I would do presentations on Section 508
> or WCAG and many of them would come up to me after a presentation and
> tell me they understood some of the requirements and didn't understand
> most of the other requirements. They asked if I could just tell them
> what to do, and they would trust that I was providing them with coding
> practices that met the requirements.

So rather than truly help them understand, you give them a cookie cutter
check list, pat them on the head and tell them that if they just do these
things all will be great, and they will have reached magical accessibility
nirvana? Seriously?


>
> The best practices are about what works for web developers and people
> with disabilities based on the current technologies and AT
> implementations (AT implementation is a big factor in form control labeling).
>
> Best Practices:
> http://html.cita.illinois.edu
>
> For the most part I have found developers have embraced the best
> practices since they make more sense to them than trying to understand
> the requirements of Section 508 or WCAG 2.0.

Sure, because once again you've turned it into a tick-box list that
requires no THINKING, just do what Jon says and all will be groovy -
you'll rank higher on the Best and Worst List. YOU ARE PERPETUATING A LIE!


>
> They also seem to make sense to a lot of other people outside Illinois
> because they have been independently implemented, even at less well
> known schools, like Missouri State University:
> http://webaccessibility.cita.illinois.edu/data/school/204/

Yes, with no offense to those schools who participate, for the most part
they don't seem important enough to make it to your ranking list. Instead
you take on larger institutions, with their own accessibility programs and
strategies, and go after them - they aren't following Illinois' Best
Practices and using FAE? Down to the bottom of the list you go.

Jon, if you truly believe that this strategy will ensure success at the
University of Illinois, or at the other institutions who are part of your
Best Practices group, then I wish you and them all the success in the
world. However, when you presume to judge the efforts and strategies of
other organizations who do not share that opinion, you have crossed the line.
And to then have it published in the Chronicle of Higher Education as some
form of definitive Best and Worst list? It's outrageous!


>
> This is the type of win-win situation we need to promote web
> accessibility,

A ranking of Best and Worst is hardly win-win: you're either a Best
(winner) or Worst (loser). It does *absolutely nothing* to promote web
accessibility - NOTHING!


>
> If there are better coding practices for meeting Section 508 or WCAG
> 2.0 requirements I would love to hear about them, and I am sure other
> people on this list would also like to hear them too.

http://www.w3.org/TR/WCAG20-TECHS/
http://webaim.org/articles/

But to be clear, these aren't shopping lists, they make the effort to
actually *teach*, because THE best practice is to think about what you are
doing, rather than dogmatically following somebody else's tick-box list.


>
> If you don't like these best practices I hope that it will spur
> discussion on your campus on what are best practices for your campus
> and I hope you will share the results with the rest of us.

So... you've passed judgment on these institutions, yet you don't even
know what their internal discussions and best practices strategies already
are? Unbelievable...

JF

From: Tim Harshbarger
Date: Wed, Dec 15 2010 1:57PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

It is obvious that the publication of this article has raised concerns
and caused some discussions. I think a serious discussion about testing
methodologies and accessibility approaches is extremely valuable to the
WebAIM community in general.

However, as an accessibility colleague, I would like to ask that people
refrain from including comments that seem to assume that others have
malign intent or self-interest. Such comments tend to make it more difficult
for me to discern what the core discussion is. Also, my personal
opinion is that such comments make other list members less likely to ask
questions or add their thoughts to the discussion.

Thanks!
Tim

From: Dave Katten
Date: Wed, Dec 15 2010 2:54PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

Hi list,
Long-time lurker, but having worked on accessibility at 2 of the
institutions (both ranked lower than I think they deserve), I thought I'd
share some of my concerns.

1) I considered using the FAE as a primary tool in the past, but opted
instead for WAVE, and I think Jon and the UCITA gang might benefit from some
feedback as to why. Critically, the Functional Accessibility Evaluator is
not any of the 3 things it claims in its name. It is not Functional (nor
does it assist in the determination of functional accessibility), because it
is, as its authors state, a set of coding style rules that _perhaps_ will
lead to a more streamlined site. It checks many rules that, I think pretty
clearly, have nothing to do with accessibility. While some (like Jon's
defense of form labeling) do qualify as being accessibility related, many do
not (see John F.'s various responses). Lastly, it really is not an
evaluator, nor an evaluation tool. When I use WAVE, the red icons are clear
indicators of _barriers_; with FAE's failures, I then have to check whether
there is actually a barrier. That is, if something fails FAE (not just a
warning, but a failure), I still have to go in and manually check. This is of
no use to me, personally. Please don't take this as a "your tool sux0rz lol"
rant; I just thought you might like some user feedback.

2) I do think there's a place for coding style guidelines like the ones
developed by UCITA, although I, like John F. find several of them perplexing
(to put it mildly). And I'm willing to concede that perhaps a university
policy should have rules like "no layout tables". But there needs to be a
process/remedy for sites that don't employ the best practices, but are still
accessible. Such a discussion is painfully absent from the Chronicle
article, and from what I can tell, the UCITA guidelines (though feel free to
point me to one).

3) Doing this sort of analysis takes a great amount of time and effort on
the part of Jon and/or his staff. It is really concerning to me that, given
the thought that must have gone into this ranking, there doesn't appear
to be any recognition of the negative impact that a ranking in the
Chronicle might have (I'm not saying it didn't cross their mind, but I
haven't seen much evidence in the comments here or the substance of the
article). For instance, how many people at the low-ranking schools who were
hired specifically to do accessibility, and who have created _accessible
sites_, are now going to be hauled in front of a CIO and asked to defend
their jobs? That's a worst
case scenario, but certainly Mr. Gunderson had to consider that possibility,
no? The audience of the Chronicle is not IT specialists, or access
advocates, or people who are generally equipped to digest and critique this
information; they're presidents and administrators who care about bottom
lines and institutional reputation.

4) Articles in the Chronicle of Higher Ed that focus on access and the
challenges of institutionalizing accessible practices are, in general, a
good thing. I expect that many campuses that were looking for ways to obtain
buy-in on accessibility will be able to point to this article and say "see?
We need more support". But it's a shame that that support will be based on
inaccurate, artificial, and in some ways orthogonal rankings. I'm a big
proponent of raising awareness, but, like others, I question the use of UCITA
guidelines. I would much prefer a detailed look at, say, one school per
conference, that, well, evaluated the functional accessibility of a sample
of the site. While it loses the impact of rankings, it makes up for it with
an actionable survey of the state of university web development practices.

5) I know neither Jon nor John F. personally, nor, I suspect, does anyone
on this list. But I have no doubt that we're all on the same team. I like to think
we all agree on that point.

Best,
Dave Katten


From: Gunderson, Jon R
Date: Thu, Dec 16 2010 11:03AM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | Next message

I would just like to note that I did not write the Chronicle Article, Marc Perry did.

He was writing an article about a student with blindness and found my data in the Educause archive and interviewed me.

I hope people read the whole Chronicle article; it is not just about the data I collected.

The Chronicle, not me, decided to use the data I presented as a part of my poster presentation at the 2010 Educause Conference in Anaheim, California.
http://www.educause.edu/E2010/Program/PS043

There is a place to comment on the presentation at the Educause website for those who wish to do so.

I think if people look more closely at the best practices they will hopefully have a better understanding of the rules and why they are useful.
http://html.cita.illinois.edu

The more I listen to people about their evaluation practices, the clearer it is to me that there is no common concept of "accessible design". There seem to be a lot of different opinions on which parts of WCAG 2.0 or Section 508 are more important to implement, on what needs to be done to meet each requirement, and, even more important, on what is needed for functional accessibility and usability by people with disabilities. Some requirements, like WCAG 2.0 1.2.3 "Audio Description or Media Alternative (Prerecorded, Level A)" and the corresponding Section 508 requirement 1194.24(d) ("All training and informational video and multimedia productions which support the agency's mission, regardless of format, that contain visual information necessary for the comprehension of the content, shall be audio described"), are ignored as far as I can tell.

I think it would help the web accessibility community if there were a more uniform understanding of what it means to design accessible web resources, a more common set of practices to give web developers for accessible design, and resources for performing accessibility quality assurance checks on their web resources.

I hope this discussion will eventually lead to a more fruitful discussion of accessible design and evaluation/QA best practices.

Jon



From: Hoffman, Allen
Date: Thu, Dec 16 2010 12:51PM
Subject: Re: Chronicle of Higher Education article "Colleges Lock Out Blind Students Online" Chronicle Article and form control labeling
Previous message | No next message

In my opinion, inconsistency in the IT accessibility community as a whole
is a major contributor to the slow and uneven progress we see. Identifying
a common set of consistently applied requirements is critical for success.
When there is no right answer, everything is OK; from experience, most of
us would probably agree this is not the best way to achieve accessibility
across the board.

WCAG 2.0 contains a great many valuable requirements that can be applied
in a consistent fashion. If something was left out of WCAG 2.0, it's not
for lack of asking on the working group's part. I think there is significant room
for improvement in consistently providing "how to" information for
various platforms. The lack of centrally located, consistently
developed, and complete how-to guidance for meeting the WCAG 2.0 standards
for the wide range of platforms in use today leads to inconsistent
efforts by developers. For example, the VA recently released a very
comprehensive course on creating accessible Flash content, something that
has been sorely needed in that development community for some time now.
Kudos to the VA folks.

Finally, I personally believe that periodic accessibility reporting with
consistently applied metrics would be of enormous value to all
stakeholders, but it needs to be designed so that those who make
decisions about such Web content can understand their results and use
the metrics to direct change.





