WebAIM - Web Accessibility In Mind

E-mail List Archives

Thread: mobile AT, voiceover, talkback, etc

Number of posts in this thread: 2 (In chronological order)

From: Tomlins Diane
Date: Thu, Feb 15 2018 8:16AM
Subject: mobile AT, voiceover, talkback, etc

Our developers have been challenged (for a code-a-thon) to create a solution where patients can take a cell phone photo of an image with text/numbers and store not only the image but also the text/numbers as discrete data.

The thought is around Rx labels, insurance cards etc. So, for a11y, how does that work for those using AT on their mobile devices?
For instance, I can deposit a check with my bank by taking a photo of it; the app frames the check area with visible markers, and when it's focused and detects what it needs, it takes the photo. How does that work with AT, or does it? Are there voice prompts or guides? Most Rx bottles these days have bar codes, but I can see where it could be a challenge for someone using mobile AT to scan a label like that.

I'll be a floater across the teams participating in the code-a-thon to make sure that whatever they come up with has accessibility built in, so I need to understand how AT would/could work in these scenarios.

Thanks!

Diane R Tomlins
HCA IT&S | Digital Media
Accessibility SME

From: Jonathan Avila
Date: Thu, Feb 15 2018 8:29AM
Subject: Re: mobile AT, voiceover, talkback, etc

> For instance, I can deposit a check with my bank by taking a photo of it; the app frames the check area with visible markers, and when it's focused and detects what it needs, it takes the photo. How does that work with AT, or does it? Are there voice prompts or guides? Most Rx bottles these days have bar codes, but I can see where it could be a challenge for someone using mobile AT to scan a label like that.

There could be voice prompts and guides. For example, the camera app on iOS gives some feedback about faces in the view when VoiceOver is running, and other apps give feedback about lighting, etc. Microsoft's Seeing AI gives some cues, as do the KNFB Reader app and money-reader apps. It might be helpful to see how those apps assist the user in adjusting the camera to take the best picture. The biggest challenge you might have is where the processing takes place -- on the device or on the server -- and how long it takes to give that feedback. There could be other suggestions to help users, such as a standard position or distance at which they should hold the camera from the object.
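To make that concrete, here is a rough Swift sketch of the kind of spoken guidance an iOS capture screen could give while the user lines up a label. The FramingStatus type and the hint strings are only placeholders for whatever detection your teams actually build; the UIAccessibility calls (isVoiceOverRunning and posting an .announcement) are the standard API pieces.

    import UIKit

    // Hypothetical status reported by whatever label/edge detection the app uses.
    enum FramingStatus {
        case notFound, partiallyVisible, centered
    }

    final class CaptureGuide {
        // Speak a short hint when the detection status changes,
        // but only if VoiceOver is actually running.
        func announce(_ status: FramingStatus) {
            guard UIAccessibility.isVoiceOverRunning else { return }

            let hint: String
            switch status {
            case .notFound:
                hint = "No label detected. Move the phone slowly over the bottle."
            case .partiallyVisible:
                hint = "Label partly in view. Shift the phone slightly left or right."
            case .centered:
                hint = "Label centered. Hold still; capturing automatically."
            }

            // .announcement speaks the string without moving VoiceOver focus.
            UIAccessibility.post(notification: .announcement, argument: hint)
        }
    }

On Android the analogous approach would be announcing hints to TalkBack users, for example via View.announceForAccessibility.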

Jonathan

Jonathan Avila
Chief Accessibility Officer
Level Access
= EMAIL ADDRESS REMOVED =
703.637.8957 office

Visit us online:
Website | Twitter | Facebook | LinkedIn | Blog

See you at CSUN in March!