E-mail List Archives

Re: screen reader versions for testing


From: Birkir R. Gunnarsson
Date: Oct 30, 2016 2:40PM


The reason I have not encouraged testing with ChromeVox is that it is
very rarely used.
According to the latest WebAiM screen reader user survey:
http://webaim.org/projects/screenreadersurvey6/
it is the primary screen reader for 0.3% of respondents, while NVDA
is in the 15% range (and much higher if you take secondary screen
readers into account).
Sadly it is not enough to make sure the webpage code conforms to
standards; it needs to be tested with at least one assistive
technology, usually a screen reader, and that usually means learning
and implementing some workarounds to address the quirks of that
particular screen reader.
If I am putting effort into that, I want to make sure to use a
popular screen reader, so those workarounds are noticed.
Of course screen reader usage patterns change, and we all should keep
a close eye on the WebAIM survey (and other usage statistics if they
become available).
A thumbs up to WebAIM for taking the initiative to carry out this
survey. It is incredibly valuable when recommending and formulating a
corporate accessibility testing strategy; management wants
justification and numbers behind all recommendations.
The Android/TalkBack development is exciting, and I am keeping a close
prosthetic eye on it in case it surpasses VoiceOver use on responsive
web in the near future. It could maybe do that, seeing as Google is
doing well while the latest Apple upgrades are a bit underwhelming
(well, in my personal opinion, that is).
-B
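
[The aria-describedby and aria-labelledby attributes discussed later in
this thread are wired up roughly like the minimal, hypothetical form
fields below. The field names, IDs, and hint text are purely
illustrative, not taken from any page mentioned in the thread.]

```html
<!-- Hypothetical password field. The visible label is associated
     programmatically via label/for, and the hint text is exposed to
     screen readers through aria-describedby. -->
<label for="pw">Password</label>
<input type="password" id="pw" aria-describedby="pw-hint">
<span id="pw-hint">Must be at least 12 characters.</span>

<!-- When no <label> element can be used, aria-labelledby can point an
     input at any element's id instead: -->
<span id="search-label">Search the archives</span>
<input type="search" aria-labelledby="search-label">
```

How a given screen reader announces the hint (immediately, after a
pause, or only on request) is exactly the kind of per-product quirk
that makes testing with a popular screen reader worthwhile.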


On 10/30/16, Kevin Chao < <EMAIL REMOVED> > wrote:
> I've been doing lots of a11y testing using ChromeVox Next
> <http://www.chromevox.com/next.html> and TalkBack
> <https://play.google.com/store/apps/details?id=com.google.android.marvin.talkback&hl=en>
> with
> Chrome. I've found them to be comparable to, or better than, Mac/iOS VoiceOver. In
> the past half year, there have been lots of excellent improvements to
> Google's screen readers and browsers, so I strongly recommend that these
> be factored into the AT test matrix.
>
> On Sun, Oct 30, 2016 at 4:20 AM Birkir R. Gunnarsson <
> <EMAIL REMOVED> > wrote:
>
>> We generally test with NVDA (current - 1) with Firefox (current - 2).
>> NVDA is free and open source (so available to the general user at no
>> cost), has good visual tools to help developers, and does not hide
>> accessibility issues like JAWS does (I appreciate JAWS trying to fill
>> in the gap for end users, but that makes it a bad tool for testing).
>> Since I am heavily involved in development and testing of content, I
>> sanity check it with JAWS and IE, and we try to file bugs and work
>> around the most critical problems we see occurring in that
>> combination.
>> For responsive web, we use the latest iOS (because upgrading is easy)
>> on an iPhone 6 in portrait mode (testing in portrait and landscape on
>> phone and tablet adds a lot of overhead very quickly).
>> Generally, banks recommend that users upgrade to latest versions of
>> browsers for security reasons.
>> We are looking into testing at least key pages with screen
>> magnification and speech recognition as well.
>>
>> Of course we focus primarily on making sure our code validates and
>> that our ARIA, when we use it, is correct.
>> Cheers
>>
>>
>>
>>
>> On 10/29/16, JP Jamous < <EMAIL REMOVED> > wrote:
>> > Here we test with the latest versions of JAWS/Internet Explorer,
>> > NVDA/Firefox and Voiceover/Safari.
>> >
>> > It makes it a bit hard to find the happy medium, as all 3 screen readers
>> > render HTML markup differently. To achieve the happy medium, we try to
>> > focus on proper semantics whenever we can. Sometimes that is not
>> > possible, and we notice that NVDA and VoiceOver tend to behave
>> > similarly, but JAWS is different since it drills deeper into the markup.
>> >
>> > We do test every now and then with older versions of the 3 screen
>> > readers in case we run into an issue. As a good example,
>> > aria-describedby and aria-labelledby were not supported with VoiceOver
>> > on iOS 10. We tested our code against iOS 9.4 and found that it worked
>> > fine. That was when we realized that it was a bug on Apple's end.
>> >
>> > -----Original Message-----
>> > From: WebAIM-Forum [mailto: <EMAIL REMOVED> ] On
>> > Behalf Of Beranek, Nicholas
>> > Sent: Friday, October 28, 2016 9:46 PM
>> > To: WebAIM Discussion List < <EMAIL REMOVED> >
>> > Subject: Re: [WebAIM] screen reader versions for testing
>> >
>> > We test primarily with the latest versions of NVDA and Firefox. We've
>> > found that JAWS will compensate for bad coding practices (e.g., a
>> > missing programmatic label where adjacent text was present), which
>> > meant it could miss certain issues. If there is ever any question
>> > about the results from NVDA, then we'll try another browser. If the
>> > issue persists, we try another screen reader such as JAWS. Sometimes
>> > we'll find that it's simply a user agent issue, and we'll do our best
>> > to file a bug.
>> >
>> > For responsive, we'll test the latest version of iOS with VoiceOver
>> > and Safari. By utilizing subject matter expertise (knowledge of the
>> > guidelines, the nuances between screen readers, browsers, and
>> > operating systems, front-end development experience, other tools such
>> > as aXe and the MSAA Object Inspect tool, the community, ARIA design
>> > patterns, and more), we're able to qualify that we've done our
>> > absolute best.
>> >
>> > I hope this helps,
>> >
>> > Nick Beranek
>> > Capital One
>> >
>> >> On Oct 28, 2016, at 4:54 PM, Mallory < <EMAIL REMOVED> > wrote:
>> >>
>> >> Kinda the same here: we test with the current version (sometimes the
>> >> tester doesn't have the absolute most-current either), and for things
>> >> that don't work, we look them up to see if it's some known bug,
>> >> whether it was fixed, and which versions were affected. In general we
>> >> try to stay up to date on the big bugs, so that if something works in
>> >> a current version, we remain aware it may fail in an older one.
>> >>
>> >> One thing I try to keep in mind is that well-off web developers tend
>> >> to have the latest and greatest. But the same usually can't be said
>> >> for our customers, so it's dangerous for us to assume "passes in
>> >> latest" => "works for everyone".
>> >>
>> >> cheers,
>> >> _mallory
>> >>
>> >> On Fri, Oct 28, 2016, at 09:42 PM, Moore,Michael (Accessibility)
>> >> (HHSC)
>> >> wrote:
>> >>> Generally we test with the current release of JAWS. This is after we
>> >>> have thoroughly analyzed the code. If we run into unexpected problems,
>> >>> then we will test with the current release of NVDA, older versions of
>> >>> JAWS, more browsers, etc. What we are doing at that point is
>> >>> attempting to determine who to file the defect with, what possible
>> >>> workarounds exist, and whether we can justify changing code that is
>> >>> technically compliant.
>> >>>
>> >>> Mike Moore
>> >>> Accessibility Coordinator
>> >>> Texas Health and Human Services Commission Civil Rights Office
>> >>> (512) 438-3431 (Office)
>> >>>
>> >>> -----Original Message-----
>> >>> From: WebAIM-Forum [mailto: <EMAIL REMOVED> ] On
>> >>> Behalf Of Delisi, Jennie (MNIT)
>> >>> Sent: Friday, October 28, 2016 1:22 PM
>> >>> To: <EMAIL REMOVED>
>> >>> Subject: [WebAIM] screen reader versions for testing
>> >>>
>> >>> Hello,
>> >>>
>> >>> Interested in feedback. For those that test websites and documents
>> >>> for accessibility against the standards, but also use screen readers
>> >>> as a part of the testing protocols:
>> >>>
>> >>> -how many versions do you test with? For example, if there is a
>> >>> version
>> >>> 17 and a version 18 of the same screen reader, are you testing with
>> >>> the current version and 1 version back?
>> >>>
>> >>> -for those testing websites and documents that will be reviewed by
>> >>> the public, do you have a different number of versions you test with,
>> >>> as opposed to documents that will only be used internally? For
>> >>> example, there may be an expectation of employees having access to
>> >>> the latest version of a particular screen reader (with maybe one
>> >>> version back for a period of time), but the public may have varying
>> >>> amounts of resources to put towards upgrades.
>> >>>
>> >>> Thanks in advance for any information you can share. I will be cross
>> >>> posting this on the IAAP list and LinkedIn.
>> >>>
>> >>> Jennie
>> >>>
>> >>> Jennie Delisi
>> >>> Accessibility Analyst | Office of Accessibility Minnesota IT Services
>> >>> | Partners in Performance
>> >>> 658 Cedar Street
>> >>> St. Paul, MN, 55155
>> >>> O: 651-201-1135
>> >>> Information Technology for Minnesota Government | mn.gov/mnit
>> >>>
>> >>>
>> >>>
>> >>>
>>
>>
>> --
>> Work hard. Have fun. Make history.


--
Work hard. Have fun. Make history.
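
[Nick Beranek mentions aXe above; a minimal way to run its browser
library, axe-core, against a page looks something like the sketch
below. The CDN URL and version pin are assumptions for illustration;
check the axe-core project for the current release.]

```html
<!-- Sketch: load axe-core and log each accessibility violation it finds.
     The script URL is an assumed CDN path, not from this thread. -->
<script src="https://cdn.jsdelivr.net/npm/axe-core@4/axe.min.js"></script>
<script>
  // axe.run resolves to a results object with violations, passes, etc.
  axe.run(document).then(function (results) {
    results.violations.forEach(function (v) {
      console.log(v.id + ': ' + v.description);
    });
  });
</script>
```

Automated checks like this complement, rather than replace, the manual
screen reader passes described throughout the thread.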