WebAIM - Web Accessibility In Mind

E-mail List Archives

Re: Screen Readers as a Development Tool for Web Developers

From: Bryan Garaventa
Date: Jul 17, 2015 11:52PM


> I agree. And while I believe that watching a screen reader user and trying to use one helps with empathy and understanding, it is unrealistic to expect
> developers to be up-to-date screen reader users.

That is true; a person who does not use an AT like a screen reader on a regular basis will never be an expert user.

When I refer to education for developers, however, I'm not talking about empathy for and understanding of the AT user; I'm referring to the mechanics behind the technology. These are two very different things.

For example, while teaching a web developer JavaScript and HTML, it is possible to teach them the following at the same time:

* Accessibility on the operating system ties into the platform Accessibility API for that system, resulting in the creation of the Accessibility Tree.

* Browsers also tie into the Accessibility API role, state, and property mappings by building an Accessibility Tree based on the markup of web technologies, including any ARIA markup used for this purpose (a small markup sketch follows this list).

* Assistive Technologies like screen readers then interface with the Accessibility Tree to convey the correct roles, states, and properties of web technologies, and process related events when they are fired by dynamic changes in the DOM.

* Desktop screen readers like JAWS and NVDA use a virtual offscreen model, where content that is rendered offscreen but not hidden can still be interacted with. VoiceOver on iOS, by contrast, uses a visual rendering model, where only the top layer of the visible UI can be reliably interacted with via touch. When using one finger to explore the visible UI of an iOS device, only the visibly rendered model can be interacted with, and this does not follow the DOM order. However, when swiping from left to right or right to left with one finger to navigate forward or backward from one object to the next, VoiceOver actually follows the DOM order and not the visibly rendered model, which is why offscreen content cannot be reliably interacted with on iOS.
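To make the second and third points concrete, here is a minimal markup sketch (the element and ID names are only illustrative) showing how the browser derives roles, states, and properties from the DOM:

<!-- Native element: the browser exposes role, name, and state automatically -->
<button disabled>Save</button>
<!-- Appears in the Accessibility Tree roughly as:
     role=button, name="Save", state=disabled -->

<!-- Custom element: ARIA supplies the same information explicitly -->
<div role="checkbox"
     aria-checked="false"
     aria-labelledby="subscribeLbl"
     tabindex="0"></div>
<span id="subscribeLbl">Subscribe to the newsletter</span>
<!-- Appears in the Accessibility Tree roughly as:
     role=checkbox, name="Subscribe to the newsletter", state=unchecked -->

The screen reader never parses this markup itself; it only sees the role, name, and state that the browser maps into the platform Accessibility API.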

None of the above information has anything to do with empathy or being understanding of the user; instead it conveys critical information that is of direct use to engineers. Moreover, the above information covers the most widely used and standards-compliant screen readers, and therefore the majority of the global user base, by providing the underlying behavioral information needed to understand why certain implementations behave differently on desktops versus touch screen devices.
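To make that last difference concrete, here is a common visually hidden pattern (the class name is only an example) of the kind this behavior affects:

<style>
  /* Positions the link offscreen but keeps it in the Accessibility Tree */
  .offscreenLink {
    position: absolute;
    left: -10000px;
    width: 1px;
    height: 1px;
    overflow: hidden;
  }
</style>

<a href="#mainContent" class="offscreenLink">Skip to main content</a>

<main id="mainContent">
  <h1>Page title</h1>
</main>

JAWS and NVDA can reach and activate this link through the virtual model, and VoiceOver will encounter it when swiping because swiping follows the DOM order, but exploring the screen by touch never will, because nothing is visibly rendered at that position. That is the behavioral difference described above.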

If every engineering course started by simply explaining the above points to developers as part of the learning process, and if the same points appeared in online learning materials for current developers wishing to become better versed in these concepts, it would have a significant impact on the future of accessible technologies.

For example, all of the above points illustrate exactly how ARIA works: ARIA attributes are rendered in the DOM and change the Accessibility Tree in the browser; the tree is mapped to the relevant control types in the platform Accessibility API; ATs like screen readers interface with that API; and the result is conveyed to AT users.
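A short sketch of that chain, using a disclosure button as an arbitrary example (the IDs and wording are only illustrative):

<button type="button" id="detailsBtn" aria-expanded="false" aria-controls="detailsPanel">
  Show details
</button>
<div id="detailsPanel" hidden>
  Additional details go here.
</div>

<script>
  var btn = document.getElementById('detailsBtn');
  var panel = document.getElementById('detailsPanel');

  btn.addEventListener('click', function () {
    var expanded = btn.getAttribute('aria-expanded') === 'true';

    // 1. The DOM changes...
    btn.setAttribute('aria-expanded', String(!expanded));
    panel.hidden = expanded;

    // 2. ...the browser updates the Accessibility Tree and exposes a state
    //    change through the platform Accessibility API...
    // 3. ...and the screen reader processes that event and announces the
    //    button's new expanded or collapsed state to the user.
  });
</script>

None of this requires the developer to be an expert screen reader user; it only requires knowing that the attribute change is what propagates through the Accessibility Tree to the AT.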

It wouldn't be difficult to add this simple explanation to a programming training course for developers, and none of it requires Assistive Technology expertise on the part of developers to understand or reproduce.