WebAIM - Web Accessibility In Mind

E-mail List Archives

Thread: Running ChromeVox as a library in a web page?

Number of posts in this thread: 10 (In chronological order)

From: Robert Fentress
Date: Wed, Aug 30 2017 3:36PM
Subject: Running ChromeVox as a library in a web page?
No previous message | Next message →

I was wondering if anyone knows enough to say how hard it would be to port
the ChromeVox browser extension into a JavaScript library that someone
could load in a browser on a per-page basis, without having to install the
extension. I suspect that, under the hood, Google is using standard web
technologies, like JavaScript and the Web Speech API, but I really have no
idea.
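
(For what it's worth, the speech side seems approachable: a crude
self-voicing sketch using the Web Speech API might look something like the
snippet below. This is just me illustrating the building blocks, not how
ChromeVox actually works internally, and the way it derives a name and role
is obviously simplistic.)

// Crude self-voicing sketch: speak a rough name and role for whatever
// receives focus. Illustrative only; not how ChromeVox is implemented.
function speak(text) {
  if (!('speechSynthesis' in window)) return;   // no speech support
  window.speechSynthesis.cancel();              // interrupt previous utterance
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

document.addEventListener('focusin', (event) => {
  const el = event.target;
  const name = el.getAttribute('aria-label') ||
               (el.textContent || '').trim() ||
               el.getAttribute('title') || '';
  const role = el.getAttribute('role') || el.tagName.toLowerCase();
  speak(name + ', ' + role);
});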

I've mentioned this before and folks seemed to be baffled by why one would
want to do such a thing, but I didn't totally understand the criticism, so
I'd appreciate anyone who wished to (kindly) enlighten me. Basically, my
thinking is that, if this were an option, developers could code their page
or web application to standards, as best they could interpret them, and
then test with ChromeVox. If it worked with that, and the developer could,
essentially, include that screen reader as an option on the page itself,
then it would help ensure at least a floor for screen reader
accessibility. It would also provide another option for users, in general,
to interact with their site.

I think many developers want to do right, but don't have the time to learn
all the ins and outs of how different screen readers interpret things, or to
test in a half dozen or more different screen reader/browser/platform
combinations, guessing, without any really reliable data, at what those
might be. I know ChromeVox is not a great or complete screen reader, but,
if people started using this as a backup, and it started to gain traction
as a strategy, it might prompt Google to improve it. That might then begin
to serve as a sort of reference standard for other screen readers in terms
of how to interpret and present things. I can hear the groans already, and
I'm not saying ChromeVox is the best thing to serve as that reference;
I'm suggesting it here solely because I suspect it would be easier to port
into an in-page, JavaScript-based screen reader.

OK. Have at me, but be kind.

--
Rob Fentress
Senior Accessibility Solutions Designer
Assistive Technologies at Virginia Tech
Electronic Business Card (vCard)
<http://search.vt.edu/search/person.vcf?person54847>
LinkedIn Profile
<https://www.linkedin.com/in/rob-fentress-aa0b609?trk=profile-badge>

From: Patrick H. Lauke
Date: Wed, Aug 30 2017 4:11PM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

On 30/08/2017 22:36, Robert Fentress wrote:
[...]
> I've mentioned this before and folks seemed to be baffled by why one would
> want to do such a thing, but I didn't totally understand the criticism, so
> I'd appreciate anyone who wished to (kindly) enlighten me. Basically, my
> thinking is that, if this were an option, developers could code their page
> or web application to standards, as best they could interpret them, and
> then test with ChromeVox. If it worked with that, and the developer could,
> essentially, include that screen reader as an option on the page itself,
> then it would help ensure at least a floor for screen reader
> accessibility. It would also provide another option for users, in general,
> to interact with their site.

This *may* help a subset of users that would require a screen reader/AT
- mainly those with mild vision impairment, or users with cognitive
disabilities who would benefit from self-voicing pages. Clearly, any
other users that do rely on screen readers would already need to have a
screen reader installed anyway (in order to log in, open the browser,
navigate to the site, etc), but of course that's also true for ChromeVox
itself. But having it installed as an extension at browser level means
that these users still benefit from it on all websites, not just on the
ones that decided to install some form of site-specific ChromeVox library.
Also, having it at browser level means that users can set their
preferences globally, while a site-specific version would need its own
settings - and then the user goes to another site that implements this
sort of thing, and the settings need to be changed again for THAT site.
This then goes into the same territory as the discussion around the
benefits of site-specific "text resize widgets" versus users actually
using text sizing options in their browser...

> I think many developers want to do right, but don't have the time to learn
> all the ins and outs of how different screen readers interpret things or to
> test in a half dozen or more different screen reader/browser/platform
> combinations, guessing, without any really reliable data, on what those
> might be.

In general, screen readers interpret well-formed and correctly
implemented HTML/ARIA fairly uniformly - at least compared to
years ago. Ideally, developers need to learn the "correct" way to mark
things up (particularly by referring to the official ARIA patterns), and
then their sites should work quite well in recent AT. Sure, every AT has
bugs (the same way every browser has bugs), but the answer to that is not
to just decide to bless one particular implementation (ChromeVox) as the
de facto standard...
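
(As a trivial illustration of what I mean by "official patterns": the
WAI-ARIA Authoring Practices' disclosure widget boils down to a real
button whose aria-expanded state is kept in sync with the region it shows
or hides. The element IDs below are just placeholders; this is only a
sketch.)

const toggle = document.querySelector('#details-toggle');  // placeholder IDs
const region = document.querySelector('#details-region');

// Start collapsed, and tell AT which element the button controls.
toggle.setAttribute('aria-expanded', 'false');
toggle.setAttribute('aria-controls', 'details-region');
region.hidden = true;

toggle.addEventListener('click', () => {
  const expanded = toggle.getAttribute('aria-expanded') === 'true';
  toggle.setAttribute('aria-expanded', String(!expanded));
  region.hidden = expanded;  // hide when collapsing, show when expanding
});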

P
--
Patrick H. Lauke

www.splintered.co.uk | https://github.com/patrickhlauke
http://flickr.com/photos/redux/ | http://redux.deviantart.com
twitter: @patrick_h_lauke | skype: patrick_h_lauke

From: Robert Fentress
Date: Wed, Aug 30 2017 5:53PM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

Thanks for your thoughtful response.

I guess I'm thinking of complex composite widgets where it is not entirely
clear what pattern fits, but you want to make sure it's not going to be
totally fubar. An example: I've seen a complex autocomplete-like widget,
where you are in a field and start typing characters, and a list of users
appears in a listbox structure. When you arrow down to select a user and
press Enter, a sort of badge appears in the field (or at least appears to
be in the field, anyway), indicating you've added that user to a list of
users, to be used for whatever process you are trying to accomplish. Then
you can start typing again, bringing up another listbox where you can
select another user to be added as a sort of badge in that field, and so
on. The badges in the field have little x's in them, allowing you to
remove them. It can all be accessed using only the keyboard somehow, but
exactly how you structure that in terms of ARIA patterns, and what keyboard
interaction model to use, is not 100% clear, at least in my mind.
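
Just to make the ambiguity concrete, here is one *guess* at how I might
express the skeleton of that widget. None of this is a blessed pattern,
and the names, IDs, and container are made up:

// One guess at a skeleton for the "badge" autocomplete, built from a
// template string. Not an official ARIA pattern, just a sketch.
const skeleton = `
  <ul aria-label="Selected users">
    <li>Alice Smith <button aria-label="Remove Alice Smith">x</button></li>
  </ul>
  <input type="text" role="combobox" aria-label="Add users"
         aria-autocomplete="list" aria-expanded="true"
         aria-controls="user-listbox" aria-activedescendant="user-opt-2">
  <ul role="listbox" id="user-listbox">
    <li role="option" id="user-opt-1">Bob Jones</li>
    <li role="option" id="user-opt-2" aria-selected="true">Carol Lee</li>
  </ul>`;
document.querySelector('#user-picker').innerHTML = skeleton;  // made-up container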

Therefore, what I would want to know, as a conscientious developer, is if
this thing--whatever it is--is going to be presented in *some sort of
sensible way* to a screen reader user. In cases like this, JAWS may not
understand the semantics the developer is trying to express exactly right
and may present a possibly confusing mishmash of cues and affordances, but
VoiceOver may guess well what you mean to be conveying, and so on. It would
be helpful in complex cases like this to be able to say something like,
"Look, I know this is a weird widget I've made here, but it does provide
useful affordances to many users, and I don't want to be stuck with just
this limited palette of widgets that the ARIA Authoring Practices have
blessed. I've tried to use proper semantics, though, and it doesn't trigger
any parsing errors, and I have at least tested this out in ChromeVox and
know it works somewhat sensibly there, so if it doesn't work exactly how
you'd like it to work with your particular screen reader, you can at least
use this in-page screen reader I've provided, and it'll work there."

I know that sucks, and your point about how weird it would be to switch
screen readers mid-stream is well taken. It is awkward and probably
unrealistic--maybe even a pro forma cop-out. That being said, I guess I'm
still confused about how, absent something like this, you keep moving
forward in terms of UI patterns. This, at least, provides one path that is,
to twist the meaning a little, "accessibility supported." Hope that made sense.

On Wed, Aug 30, 2017 at 6:11 PM, Patrick H. Lauke < = EMAIL ADDRESS REMOVED = >
wrote:

> [...]



--
Rob Fentress
Senior Accessibility Solutions Designer
Assistive Technologies at Virginia Tech
Electronic Business Card (vCard)
<http://search.vt.edu/search/person.vcf?person54847>
LinkedIn Profile
<https://www.linkedin.com/in/rob-fentress-aa0b609?trk=profile-badge>

From: Patrick H. Lauke
Date: Wed, Aug 30 2017 6:52PM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

On 31/08/2017 00:53, Robert Fentress wrote:
[...]
> I know that sucks, and your point about how weird it would be to switch
> screen readers mid-stream is well taken. It is awkward and probably
> unrealistic--maybe even a pro forma cop-out. That being said, I guess I'm
> still confused about how, absent something like this, you keep moving
> forward in terms of UI patterns.

The most future-proof way would be to propose new patterns (or point out
omissions/edge cases in existing patterns) at the ARIA spec level. Most
UAs/ATs (barring bugs, as ever) stick quite sensibly to the spec,
provided it's comprehensive enough (in terms of detailing how things
should be processed and what the expected navigation and behavior should be).

And in the meantime (as with other things, like browser support for CSS,
new JS APIs, etc.) you'll have to compromise. Worst case, provide a
separate alternative version for very complex widgets that currently
don't map to any known type of widget / where a best practice has not
been established yet.

P
--
Patrick H. Lauke

www.splintered.co.uk | https://github.com/patrickhlauke
http://flickr.com/photos/redux/ | http://redux.deviantart.com
twitter: @patrick_h_lauke | skype: patrick_h_lauke

From: Jonathan Avila
Date: Wed, Aug 30 2017 7:26PM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

> absent something like this, how you keep moving forward in terms of UI patterns.

Ultimately we need a way to communicate the patterns a control supports, the actions associated with a control, and a method to activate those actions. If, instead of relying solely on all sorts of keystrokes, there were a programmatic way to communicate and perform those actions - like those provided through the actions rotor in VoiceOver - it would provide consistency and flexibility. Accessibility APIs like UIA support this to some depth, but the ARIA semantics still need to catch up: they are focused on roles rather than patterns, and there are no great ways of communicating the different actions in the DOM and performing them. In time, web accessibility will move toward letting us access the accessibility API through JavaScript, and these areas will be better supported. The challenge up to this point has been getting assistive technologies and user agents updated to support the current ARIA consistently -- which has taken years to get where we are today.
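
To illustrate today's gap with a rough sketch (the selector and behavior
here are made up): about the best an author can do right now is hang a
keystroke handler off the widget and hint at it with aria-keyshortcuts.
There is no standard way to expose a named "remove" action in the DOM that
AT could enumerate and invoke directly, the way platform APIs such as UIA can.

// Status quo: the "remove" action exists only as a keydown handler plus a
// human-readable hint. Nothing in the DOM lets AT list or invoke it.
const badge = document.querySelector('.user-badge');  // made-up widget
badge.tabIndex = 0;                                    // assume the badge is focusable
badge.setAttribute('aria-keyshortcuts', 'Delete');     // ARIA 1.1: advisory hint only
badge.addEventListener('keydown', (event) => {
  if (event.key === 'Delete') {
    badge.remove();                                    // the actual "action"
  }
});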

Jonathan

Jonathan Avila
Chief Accessibility Officer
Level Access, inc. (formerly SSB BART Group, inc.)
(703) 637-8957
= EMAIL ADDRESS REMOVED =
Visit us online: Website | Twitter | Facebook | LinkedIn | Blog
Looking to boost your accessibility knowledge? Check out our free webinars!


From: Lovely, Brian (CONT)
Date: Thu, Aug 31 2017 7:48AM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

Oh, the weird widget. That pernicious little stinker whose purpose is just what, exactly? I assume these hybrid monsters are created to combine two or three user steps into one. But at what cost? At what point does the confusion caused by a singleton widget that no sighted user has ever encountered, that can only be communicated to screen reader users with great difficulty, and that takes the same excessive effort to make keyboard accessible, outweigh the questionable value of a cool-looking widget that combines three steps into one?


From: Robert Fentress
Date: Thu, Aug 31 2017 9:19AM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

Yes. I suppose there is always a way to decompose things to better fit
existing prescribed patterns. However, we all exist in a practical
context. It is not always possible to persuade the people we are dealing
with that they need to completely redesign their UI, especially if it was
not the result of the whim of some overly creative developer but, rather,
involved quite costly usability testing that just didn't happen to include
screen reader users in its sample of participants. Working in a
competitive environment where companies are always trying to distinguish
themselves by providing a more fluid experience--for some users at
least--means that sometimes one needs to make what is already there work
as well as it can, while encouraging, moving forward, a culture that
considers a broader set of users when making design decisions.

On Thu, Aug 31, 2017 at 9:48 AM, Lovely, Brian (CONT) via WebAIM-Forum <
= EMAIL ADDRESS REMOVED = > wrote:

> [...]



--
Rob Fentress
Senior Accessibility Solutions Designer
Assistive Technologies at Virginia Tech
Electronic Business Card (vCard)
<http://search.vt.edu/search/person.vcf?person54847>
LinkedIn Profile
<https://www.linkedin.com/in/rob-fentress-aa0b609?trk=profile-badge>

From: Robert Fentress
Date: Thu, Aug 31 2017 9:25AM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

I've *really* gotta make the time to explore UIA. I've heard you mention
it before and it seemed intriguing, but I just haven't gotten around to it
yet.

On Wed, Aug 30, 2017 at 9:26 PM, Jonathan Avila < = EMAIL ADDRESS REMOVED = >
wrote:

> [...]



--
Rob Fentress
Senior Accessibility Solutions Designer
Assistive Technologies at Virginia Tech
Electronic Business Card (vCard)
<http://search.vt.edu/search/person.vcf?person54847>
LinkedIn Profile
<https://www.linkedin.com/in/rob-fentress-aa0b609?trk=profile-badge>

From: Robert Fentress
Date: Thu, Aug 31 2017 9:32AM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | Next message →

Sorry, I misread that. I thought you were referring to IndieUI, not UIA.

On Thu, Aug 31, 2017 at 11:25 AM, Robert Fentress < = EMAIL ADDRESS REMOVED = > wrote:

> I've *really* gotta make the time to explore UIA. I've heard you mention
> it before and it seemed intriguing, but I just haven't gotten around to it
> yet.
>
> [...]



--
Rob Fentress
Senior Accessibility Solutions Designer
Assistive Technologies at Virginia Tech
Electronic Business Card (vCard)
<http://search.vt.edu/search/person.vcf?person54847>
LinkedIn Profile
<https://www.linkedin.com/in/rob-fentress-aa0b609?trk=profile-badge>

From: Jonathan Avila
Date: Fri, Sep 01 2017 8:19AM
Subject: Re: Running ChromeVox as a library in a web page?
← Previous message | No next message

It's my understanding that the currently available ChromeVox (Classic) uses the DOM, and that the new version that uses the accessibility API is only available on Chromebook machines at this time. So building support for something that only supports DOM-based access is not headed in the same direction as future assistive technology.

Jonathan

Jonathan Avila
Chief Accessibility Officer
Level Access, inc. (formerly SSB BART Group, inc.)
(703) 637-8957
= EMAIL ADDRESS REMOVED =
Visit us online: Website | Twitter | Facebook | LinkedIn | Blog
Looking to boost your accessibility knowledge? Check out our free webinars!
