Thread Subject: Re: touchscreens
This archival content is maintained by WebAIM and NCDAE on behalf of TEITAC and the U.S. Access Board. Additional details on the updates to Section 508 and Section 255 can be found at the Access Board web site.
From: Jim Tobias
Date: Thu, Jul 19 2007 8:25 AM
Thanks for your comments, Gregg. I'm going to respond separately to the
non-touchscreen gesture interfaces.
I want to begin by saying that my goal here is to see whether a single
touchscreen input device can meet the needs of all users with disabilities
at least as well as mechanical controls. To repeat, a static touchscreen
may be accessible to people with limited dexterity if the targets are large
enough and far enough apart. A dynamic, gesture-based touchscreen may meet
the needs of users with vision loss if the gestures can be received anywhere
on the active surface (which must itself be tactilely discernible -- no fair
having touchscreens without borders). It *might* be possible to have 2
gesture reception areas simultaneously (top/bottom or left/right) if the
gestures are well designed.
So I'm talking about a touchscreen device that can run either static or
dynamic input software at the user's choice.
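To make the idea concrete, here is a rough sketch of such a dual-mode device. All names, sizes, and thresholds below are my own assumptions for illustration, not anything specified in this thread or in the 508/255 language: a "static" mode that only accepts touches on targets that are large enough and far enough apart, and a "dynamic" mode that accepts a gesture anywhere on the active surface.

```python
# Hypothetical sketch of a touchscreen that runs either static or dynamic
# input software at the user's choice. The minimum target size and spacing
# values are illustrative assumptions, not figures from the thread.

MIN_TARGET_MM = 20   # assumed minimum target dimension for limited dexterity
MIN_GAP_MM = 10      # assumed minimum spacing between targets

class Touchscreen:
    def __init__(self, width_mm, height_mm):
        self.width_mm = width_mm
        self.height_mm = height_mm
        self.mode = "static"      # user-selectable: "static" or "dynamic"
        self.targets = []         # (x, y, w, h, label), all in mm

    def add_target(self, x, y, w, h, label):
        # Static mode relies on targets being large enough and far
        # enough apart; reject layouts that violate either guideline.
        if w < MIN_TARGET_MM or h < MIN_TARGET_MM:
            raise ValueError(f"target {label!r} smaller than {MIN_TARGET_MM} mm")
        for (tx, ty, tw, th, tl) in self.targets:
            gap_x = max(tx - (x + w), x - (tx + tw))
            gap_y = max(ty - (y + h), y - (ty + th))
            if max(gap_x, gap_y) < MIN_GAP_MM:
                raise ValueError(f"targets {label!r} and {tl!r} are too close")
        self.targets.append((x, y, w, h, label))

    def touch(self, x, y):
        """Static mode: return the label of the target hit, or None."""
        for (tx, ty, tw, th, label) in self.targets:
            if tx <= x <= tx + tw and ty <= y <= ty + th:
                return label
        return None

    def gesture(self, x, y, name):
        """Dynamic mode: a gesture is accepted anywhere on the surface,
        so a blind user need not locate a specific on-screen region."""
        if 0 <= x <= self.width_mm and 0 <= y <= self.height_mm:
            return name
        return None
```

For example, a device built this way could present two well-spaced static targets for a user with limited dexterity, then be switched to dynamic mode so a user with vision loss can start a gesture from any point on the (tactilely discernible) surface.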
> First lets look at what we have - and then some new ideas
> your post brings up
> Currently the language says - "if touchscreen is used then
> all functionality can be done through tactilely discernable controls."
> This would mean that people who can't use touch screens
> (static or dynamic) could achieve the same functions another way.
I think that the only people who can't use static or dynamic touchscreens
also could not use mechanical controls, and would need speech recognition or
an AT solution such as scanning or sip-and-puff switches.
> 1 - are the gestures like shortcuts? You can use them to do
> things quickly
> but there are other ways as well? -- this would be
> non-gesture access
> for all gesture input.
> 2 - are gestures the ONLY way to do some things? If so then
> some gestures
> require fine motor and some require simultaneous actions.
I'm assuming that the touchscreen is the *only* input device. But the
issue of "simultaneous action" is important -- what does it mean in this
context? It's not as obvious as "CTRL-ALT-DEL".
I think that the bottom line is, can a blind user use a well-designed
gesture interface? If we say "yes", we should reconsider the touchscreen