Thread Subject: Re: Proposal remote access requirements
This archival content is maintained by WebAIM and NCDAE on behalf of TEITAC and the U.S. Access Board. Additional details on the updates to Section 508 and Section 255 can be found at the Access Board web site.
From: Peter Korn
Date: Tue, Jun 26 2007 3:15 PM
- Next message in thread: None
- Previous message in thread: Peter Korn: "Re: Proposal remote access requirements"
> Proposed standard:
> When software provides either server or client functionality for remote
> access to a specific graphical user interface, such software MUST pass
> interface element accessibility information (from proposed 21D) from the
> server to the client, to then be passed to or inspected by assistive
> technology.
> Peter, when you are back from vacation, a comment on how Gnome works
> during remote access would help.
There are two ways we can make a remote desktop session accessible
through AT to a local workstation:
1. Run the AT remotely as well
2. Run the AT locally, and have it communicate accessibility
information with the remote app
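Route #2 amounts to a protocol question: the server must serialize per-element accessibility information and ship it to the client, where local AT can inspect it. A minimal sketch of that idea, in Python; the field names and JSON wire format here are illustrative assumptions, not part of any actual proposal or existing remote-access protocol:

```python
import json
from dataclasses import dataclass, asdict, field

# Hypothetical sketch of route #2: the remote-access server encodes the
# accessibility properties of each interface element and forwards them to
# the client, where a locally running assistive technology consumes them.
# Field names and the JSON encoding are illustrative assumptions only.

@dataclass
class ElementInfo:
    role: str                       # e.g. "push button", "text field"
    name: str                       # accessible name exposed to the AT
    states: list = field(default_factory=list)  # e.g. ["focused", "enabled"]

def serialize_element(elem: ElementInfo) -> str:
    """Server side: encode one element's accessibility info for the wire."""
    return json.dumps(asdict(elem))

def deserialize_element(payload: str) -> ElementInfo:
    """Client side: decode a message so local AT can inspect the element."""
    data = json.loads(payload)
    return ElementInfo(role=data["role"], name=data["name"],
                       states=data["states"])
```

The point of the sketch is only that route #2 requires both endpoints to agree on such a format, which is exactly what the proposed standard language would be mandating.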
Your proposal above addresses route #2, and presumes that route #2 is
the only way. I don't think we should make that presumption - though I
recognize that it may result in the preferred user experience (e.g.
users may prefer one screen reader rather than two; though it might be
rather weird to ask JAWS to make thoughtful sense of a remote UNIX/GNOME
desktop, when JAWS doesn't know about some of the GNOME desktop user
interface conventions).
We have not (yet) solved the remote desktop problem in GNOME. The
challenge for us is that in X Windows, the standard way of doing a
remote desktop is via technology called VNC, which redirects the entire
desktop X video session into an X window (which could be a full-screen
window) on a system across the network. Since X doesn't itself specify
audio (that is done with different technology), VNC connections don't
carry audio.
Two computer science students at NC State have done some very
interesting work supporting route #1 in GNOME. In fact, you can try out
their solution via http://www.remoteaccessbridge.com/ (install a Java
Swing application on your Windows box, then connect to their Fedora
Linux GNOME desktop server at NC State and have a remote desktop session
working, and talking, on your Windows box - with the speech coming from
the Windows text-to-speech engine you have on your system). They layer
remote audio and local text-to-speech (taking the Orca screen reader
text on the remote desktop and getting it spoken locally) together with
the pixels coming over from VNC to provide blind users with an
accessible GNOME desktop over the Internet. These students are looking
at extending this to remote Braille, so that Orca's Braille output would
likewise get routed over the Internet to be shown on your local
workstation. Now that BRLTTY has been ported to Windows, many of the
components they need for this work are in place. They are also
interested in adding support for the other GNOME assistive technologies,
GOK and Dasher, both of which use (among other things) USB mouse input,
which should be fairly easy to route over a network connection.
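The route #1 layering described above boils down to forwarding the text a remote screen reader would speak and handing it to whatever text-to-speech engine the local workstation has. A minimal sketch, assuming a simple line-oriented stream of speech text (the wire format and function names are my assumptions for illustration, not how the NC State students' bridge actually works):

```python
# Sketch of route #1: the remote screen reader (e.g. Orca) sends the text
# it would speak across the network alongside the VNC pixel stream; the
# client passes each line to a local text-to-speech callback. The
# line-oriented format and the callback interface are assumptions made
# for illustration only.

def relay_speech(lines, speak):
    """Hand each non-empty line of remote screen-reader output to a
    local TTS callback."""
    for line in lines:
        text = line.strip()
        if text:                 # ignore blank keep-alive lines
            speak(text)

# Usage: collect what would be sent to the local TTS engine.
spoken = []
relay_speech(["File menu\n", "\n", "Open... Ctrl+O\n"], spoken.append)
# `spoken` now holds the text destined for the local speech engine.
```

Braille output could be routed the same way, with the local endpoint handing the text to BRLTTY instead of a speech engine.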
So to summarize, I think we need standard language that recognizes at
least these two ways of solving this problem, and allows an
implementation of either one to be acceptable for meeting the standard.
Sun Microsystems, Inc.