WebAIM - Web Accessibility In Mind

E-mail List Archives

Thread: Looking for techniques for accessible maps


Number of posts in this thread: 7 (In chronological order)

From: Whitney Quesenbery
Date: Sat, Nov 16 2013 10:26AM
Subject: Looking for techniques for accessible maps
No previous message | Next message →

Inquiring for a colleague who is working with a government agricultural
project.

One of their functions is a detailed map that describes where land is and
how it is being used.

At the top level, they have a list of fields (the kind cows stand in, not a
form field) with a related visual map presentation. The list contains the
metadata (who owns it, what it is used for) in an accessible format.

But there's some more detailed information, such as how much margin is
being left between hedges, crop layouts, and so on. Today, this is
being done with crayons on a large paper map.

Her project will be putting all this information online - already a
challenge - and needs to be done in a way that makes the detail accessible.

She's looking for research or resources in accessible visual data or
mapping. Any ideas or pointers?

--
Whitney Quesenbery
www.wqusability.com | @whitneyq

A Web for Everyone
http://rosenfeldmedia.com/books/a-web-for-everyone/

Storytelling for User Experience
www.rosenfeldmedia.com/books/storytelling

Global UX: Design and research in a connected world
@globalUX | www.amazon.com/gp/product/012378591X/

From: Dianne V Pawluk
Date: Sat, Nov 16 2013 1:13PM
Subject: Re: Looking for techniques for accessible maps
← Previous message | Next message →

Hi Robert,

I realize now that I forgot to respond to you about our project - I wanted
to find the time to give you a complete response and then got burdened with
work.

I will be happy to talk to both Whitney and yourself about her problem, and
I hope to gather information for you on my tactile graphics projects soon.

Sincerely,
Dianne Pawluk
Associate Professor
Biomedical Engineering
Virginia Commonwealth University



On Sat, Nov 16, 2013 at 1:56 PM, Robert Jaquiss < = EMAIL ADDRESS REMOVED = >wrote:

> Hello:
>
> How about downloadable files that can be produced with image capable
> embossers or capsule paper?
>
> Regards,
>
> Robert
>


--
Dianne Pawluk
Associate Professor
Biomedical Engineering
Virginia Commonwealth University

From: Whitney Quesenbery
Date: Sat, Nov 16 2013 1:52PM
Subject: Re: Looking for techniques for accessible maps
← Previous message | Next message →

Thanks Dianne. Is there a web site for your project that I can point my
friend to?




--
Whitney Quesenbery
www.wqusability.com | @whitneyq

Storytelling for User Experience
www.rosenfeldmedia.com/books/storytelling

Global UX: Design and research in a connected world
@globalUX | www.amazon.com/gp/product/012378591X/

From: Dianne V Pawluk
Date: Sun, Nov 17 2013 2:46PM
Subject: Re: Looking for techniques for accessible maps
← Previous message | Next message →

Not one that says very much. I will make a compilation of things and send it.

Sincerely,
Dianne






--
Dianne Pawluk
Associate Professor
Biomedical Engineering
Virginia Commonwealth University

From: Dianne V Pawluk
Date: Fri, Dec 06 2013 8:28AM
Subject: Re: Looking for techniques for accessible maps
← Previous message | Next message →

Hi Whitney and Robert,


I apologize for taking so long to get back to you, especially Robert, but
this term has really been busy for me.


As Robert knows, I’m a researcher who works on developing assistive
technology for tactile graphics. I have had two past grants, and have a
current grant, from the National Science Foundation for work on tactile
graphics.


My current work is on automating the process of converting visual diagrams
into tactile diagrams. My previous work was developing effective
representations for tactile diagrams and affordable devices that could act
like computer peripherals to virtually interact with diagrams in
real-time. The overall idea is to automate and provide effective and easy
access to visual diagrams for individuals who are blind or visually
impaired without the need for intervention by a sighted person.


First, let me talk about representations. The issue of how to represent
information is very important. Currently the most common method of
presenting information is raised line diagrams. Unfortunately,
studies have shown that even when these diagrams are used to identify
common objects, individuals' ability to use them is very poor:
success rates are only around 30%. Therefore my laboratory group, as well as
another group (Thompson, Chronicle and Collins), have separately looked at
more effective methods of presenting information.


Thompson, Chronicle and Collins (2006). Enhancing 2-D Tactile Picture
Design from Knowledge of 3-D Haptic Object Recognition. European
Psychologist, 11(2), 110-118.


Burch and Pawluk (2011). Using Multiple Contacts with Texture-enhanced
Graphics. World Haptics Conference, 287-292.


We both had the idea to use texture to encode information that is difficult
to interpret through touch. Processing geometric information is done
serially (i.e., with one finger following along a line) rather than in
parallel as in vision. This makes it difficult to interpret what the lines
mean (i.e., they may not be an outline of an object part but instead be a
line indicating some detail or it may indicate perspective). In vision, as
multiple lines are seen together, the constraints can be solved
simultaneously, but touch, being serial, relies on it being solved through
memory which is much more difficult.


Both groups use texture to encode information about part (i.e., a different
object part would get a different texture) and about 3-D orientation (i.e.,
different 3-D orientations would be indicated by different 2-D orientations
of textures). We both found significant improvement with this new method.


However, our group believed that there was another significance to using
texture. Psychologists (particularly Lederman and Klatzky) have shown that
touch is more effective in interpreting material properties than geometric
information. In particular, when searching for objects placed across
fingers, they found that parallel processing (i.e., information to all
fingers could be processed simultaneously) occurred for material properties
(such as texture) but not geometric properties. We wondered if this would
hold for interpreting tactile graphics. David’s work (above) showed that
our texture-encoded information not only improved performance over
raised line drawings; it improved even more when multiple fingers were
used (this was untrue for raised line drawings, where it did not matter
whether one or more fingers were used)!


This is why I would strongly recommend that fields, etc. be given textures
unique to that item (i.e., different crops may have a particular texture,
border spacing another texture, etc.). This will allow for quicker
processing of information than just using raised line drawings. We have
actually recently implemented this idea on capsule paper for a botanical
garden in Richmond, and it seems to be very effective. The textures we
chose come from a compilation by Lucia Hasty (www.tactilegraphics.org) of
experimentally assessed textures, so that the textures used are actually
known to be distinguishable from each other. We’ve had to make some
adjustments, as some textures did not feel as distinct as we thought they
would. This could be used in combination with something like a talking pen
or a talking tablet, to provide text information at certain points.
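As a concrete (and entirely hypothetical) sketch of that recommendation: a digital map could carry a legend that assigns each feature category its own texture and checks that no two categories share one, so they stay distinguishable by touch. The feature names and texture IDs below are invented for illustration.

```python
# Hypothetical texture legend for a tactile farm map. All names illustrative.
TEXTURE_LEGEND = {
    "wheat_field": "dense_dots",
    "pasture": "horizontal_lines",
    "hedge_margin": "diagonal_hatch",
    "waterway": "wave_pattern",
}

def texture_for(feature):
    """Look up the tactile texture assigned to a map feature category."""
    if feature not in TEXTURE_LEGEND:
        raise KeyError(f"no texture assigned for feature {feature!r}")
    return TEXTURE_LEGEND[feature]

def legend_is_distinguishable(legend):
    """True if no two feature categories share a texture."""
    return len(set(legend.values())) == len(legend)
```

The uniqueness check is the important part: a legend that reuses a texture for two crops would defeat the point of texture encoding.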


You may also be interested to know that we have developed affordable
devices that allow exploration of a diagram, such as a map, to happen
interactively with a computer. David’s devices are shown in his
paper. These are small devices that wrap around the finger. They pick up
the color from a video display that presents a color version of the
textured graphic. The finger devices know by the color which texture to
present, which they do through a vibrator. One can use as many devices
on as many fingers as one wants, but two to three devices seem to work best
(we have not submitted these results for publication yet).
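A rough sketch of how such a finger device might map the color sampled under the fingertip to a vibration pattern; the colors, pattern names, and nearest-color rule here are my assumptions, not the published design.

```python
# Illustrative only: map an RGB color sampled under a fingertip device to the
# nearest legend color, then to that color's vibration pattern.
LEGEND = {
    (220, 40, 40): "pulse_fast",       # e.g., crop A (illustrative)
    (40, 180, 60): "pulse_slow",       # e.g., crop B
    (50, 80, 220): "buzz_continuous",  # e.g., water
}

def nearest_pattern(rgb, legend=LEGEND):
    """Choose the pattern whose legend color is closest in squared RGB distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(legend, key=lambda color: dist2(color, rgb))
    return legend[best]
```

Using nearest-color matching rather than exact equality would make such a device tolerant of display calibration and camera noise.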


Another affordable device that we made in the laboratory is a mouse-like
tactile display. This consists of a Braille cell mounted on a hollow mouse
case and used on top of a graphics tablet (although a touchscreen would
work just as well). Robert, this may seem similar to the VT Player by
virTouch, but there were some fundamental problems with their design that
we corrected. First, they used regular mouse technology, which does not
produce accurate position information; in fact, it can be horribly
inaccurate. To see this, rotate a regular mouse to the left about 45
degrees and move it straight vertically: you will find the pointer moving
to the corner of the screen rather than straight upwards. Also, the point
of rotation was at a very different place than the pins: if the mouse was
rotated, the VT Player did not detect this and so did not compensate for
the different position of the pins, as the rotation point was still the same.


One of our papers on this is the following:

Headley, Hribar and Pawluk (2011). Displaying braille and graphics on a
mouse like tactile display. ASSETS 2011.


As you can see, we found an effective way to present both braille and
graphics with the same display, and have them easily distinguishable.


For this mouse-like device, we developed a method of controllably
generating distinctly different textures, but we have not evaluated its
performance with picture identification or in comparison to our other
device.


Headley and Pawluk (2011). Roughness perception of textures on a haptic
matrix display. World Haptics, 221-226.


We have also, in recent work, combined tactile feedback through a
touchscreen with text feedback for what we describe as points of
interest, for an electronic version of the garden map.


Another alternative, which we are planning to evaluate in comparison to
tactile feedback, is audio feedback called sonification (the use
of nonspeech sounds to relay information, just as we would with texture).
We believe that the disadvantage of this is that one would not be able to
use multiple fingers, which we did show improves performance significantly.


However, one study I did with one of my students, Ravi Rastogi, which we
are preparing for publication, looked at the use of tactile and
sonification feedback for maps that have more than one set of features.
The maps we used showed crops (mangos, etc.) and rainfall.
With sight, it is relatively easy to overlay this information without being
overwhelmed by it. We looked at using: two very distinct sets of tactile
textures, two very distinct sets of audio feedback, and one set of tactile
feedback with one set of audio feedback. The reason for this comparison is
that cognitive load theory proposes that each sensory channel has a finite
capacity for working memory. We therefore reasoned that, by spreading the
load across two modalities, we would improve performance. In fact, that is
what we found.


The main part of Ravi’s work was looking at methods of interactive zooming
and simplification.


Rastogi, Pawluk and Ketchum (2013). Intuitive Tactile Zooming for Graphics
Accessed by Individuals Who are Blind and Visually Impaired. IEEE
Transactions on Neural Systems and Rehabilitation Engineering.


Rastogi and Pawluk (2013). Dynamic Tactile Diagram Simplification on
Refreshable Displays. Assistive Technology, 25.


Both of these are potentially applicable to agricultural map information.
With zooming, we were addressing the issue that visual zooming methods
did not seem appropriate. With many visual zooming methods,
the zoom that occurs with a button click is often not helpful: either
there is not much of a change from the previous view, or the zoom
inconveniently cuts through information that should go together. This is
easy to correct visually, as a person can quickly glance at the result and
zoom back out to the level desired. It is not so easy to do tactually, as
exploring a map tactually is a much slower process. We developed an
algorithm that uses the conceptual organization of information to scale
between levels appropriately, thus saving time.
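A toy sketch of that principle, namely that zoom steps should land on conceptual levels of the data rather than fixed scale factors; the level names are invented, and the published algorithm is considerably more involved than this.

```python
# Illustrative: each zoom step moves one conceptual level, not an arbitrary
# scale factor, so no step leaves the user between meaningful views.
LEVELS = ["region", "farm", "field", "crop_row"]  # invented hierarchy

def zoom(level, direction):
    """Move one conceptual level in or out, clamped at the ends of the hierarchy."""
    i = LEVELS.index(level)
    if direction == "in":
        i = min(len(LEVELS) - 1, i + 1)
    else:
        i = max(0, i - 1)
    return LEVELS[i]
```

Because every step lands on a meaningful grouping, a tactile reader never has to explore a view, decide it was a bad zoom, and slowly back out.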


With simplification, we looked at the issue of what happens if there are a
lot of different features that would be desirable to present on a map.
Unfortunately, too much information on a diagram makes it very difficult
to interpret tactually: and remember that the amount of information
presented in a normal visual diagram is already too much, as the visual and
tactile systems differ in their processing abilities. Also, unfortunately,
it is currently the sighted maker of the tactile graphic who decides
whether information should be included or not. That decision is based on
the user’s intent, but what if that intent changes? The user would have to
go back to the maker and ask for another map. We looked at a method where
different sets of features would be on different “layers”,
indistinguishable in the final map, but which the user would be able to
remove, add and combine at will. Thus, all the information is available to
the user, but they can select what they want at that particular moment, to
avoid cluttering the diagram. We found that this helped a lot. We also
looked at presenting borders, as straight-line borders are easier to track
than meandering ones; again we allowed users a choice. The idea was that
they could use the simplified version to get an overview and then the more
detailed version if needed.
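The layering scheme might be sketched like this: a minimal toy with invented layer names, where all feature sets are stored but only user-activated layers are rendered (the real system drives a tactile display rather than returning lists).

```python
# Minimal sketch: the user toggles layers at will, controlling clutter
# without going back to a sighted map maker for a new map.
class LayeredMap:
    def __init__(self, layers):
        self.layers = layers   # layer name -> list of features
        self.active = set()    # layers currently rendered

    def toggle(self, name):
        """Add the layer to the rendered map, or remove it if already shown."""
        if name not in self.layers:
            raise KeyError(name)
        self.active.symmetric_difference_update({name})

    def visible_features(self):
        """Only features on active layers reach the tactile rendering."""
        return [f for name in sorted(self.active) for f in self.layers[name]]
```

The point of the design is that nothing is ever discarded at authoring time; the reader, not the maker, decides what is on the map at any moment.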


I hope I have not talked too long. Whitney and Robert, if you have any
questions, or Whitney, if your colleague has any questions, I would be
happy to answer them or discuss things over the phone.



Sincerely,

Dianne Pawluk

Associate Professor

Biomedical Engineering

Virginia Commonwealth University





--
Dianne Pawluk
Associate Professor
Biomedical Engineering
Virginia Commonwealth University

From: Bourne, Sarah (ITD)
Date: Fri, Dec 06 2013 8:58AM
Subject: Re: Looking for techniques for accessible maps
← Previous message | Next message →

Dianne,

I am so impressed! Perhaps I'm jumping way ahead here, but I started thinking about how this might work with a GIS. A GIS user can pick the data layers they want included (usually sitting on top of some generic layers, such as geopolitical boundaries, roads, etc., and their associated labels). It doesn't seem like that much of a stretch to allow a tactile user to also select the texture or sounds to apply to the selected layers. Thus the user would create a new map, rather than having to have someone else do it for them.
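To make that concrete, here is a hypothetical sketch (not any real GIS API) of pairing the layers a user selects with the tactile texture or sound they choose for each, producing a rendering plan for a self-made map.

```python
# Hypothetical sketch: user-selected GIS layers each get a tactile texture or
# a sound, yielding a rendering plan for a new personal map.
def build_tactile_map(selected_layers, encodings):
    """Pair each selected layer with its chosen encoding.

    selected_layers: layer names the user picked, in drawing order
    encodings: layer name -> ("texture" | "sound", pattern id)
    """
    plan = []
    for layer in selected_layers:
        # Layers without an explicit choice fall back to a default texture.
        mode, pattern = encodings.get(layer, ("texture", "default_hatch"))
        plan.append({"layer": layer, "mode": mode, "pattern": pattern})
    return plan
```

Splitting layers between texture and sound would also line up with the dual-modality result Dianne describes above.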

If you get tired of working with crop maps, perhaps you could look into the map that is most often used as an example of something really hard to provide text equivalents for: the dreaded campus map.

sb
Sarah E. Bourne
Director of Assistive Technology &
Mass.Gov Chief Technology Strategist
Information Technology Division
Commonwealth of Massachusetts
1 Ashburton Pl. rm 1601 Boston MA 02108
617-626-4502
= EMAIL ADDRESS REMOVED =
http://www.mass.gov/itd


-----Original Message-----
From: = EMAIL ADDRESS REMOVED = [mailto: = EMAIL ADDRESS REMOVED = ] On Behalf Of Dianne V Pawluk
Sent: Friday, December 06, 2013 10:28 AM
To: WebAIM Discussion List
Subject: Re: [WebAIM] Looking for techniques for accessible maps

Hi Whitney and Robert,


I apologize for taking so long to get back to you, especially Robert, but this term has really been busy for me.


As Robert knows, I'm a researcher who does work in developing assistive technology for tactile graphics. I have had 2 grants in the past and a current grant from the National Science Foundation working on tactile graphics.


My current work is on automating the process of converting visual diagrams into tactile diagrams. My previous work was developing effective representations for tactile diagrams and affordable devices that could act like computer peripherals to virtually interact with diagrams in real-time. The overall idea is to automate and provide effective and easy access to visual diagrams for individuals who are blind or visually impaired without the need for intervention by a sighted person.


First, let me talk about representations. The issue of how to represent information is very important. Currently the most common method of presenting information is using raised line diagrams. Unfortunately, studies have shown that even when used to identify common objects that the ability of individuals to use these representations is very poor - approximately around 30%. Therefore my laboratory group, as well as another group (Thompson, Chronicle and Collins), have separately looked at more effective methods of presenting information.


Thompson, Chronical and Collins (2006). Enhancing 2-D Tactile Picture Design from Knowledge of 3-D Haptic Object Recognition. European Psychologist. 11 (2), 110-118.


Burch and Pawluk (2011). Using Multiple Contacts with Texture-enhanced Graphics. World Haptics Conference, 287-292.


We both had the idea to use texture to encode information that is difficult to interpret through touch. Processing geometric information is done serially (i.e., with one finger following along a line) rather than in parallel as in vision. This makes it difficult to interpret what the lines mean (i.e., they may not be an outline of an object part but instead be a line indicating some detail or it may indicate perspective). In vision, as multiple lines are seen together, the constraints can be solved simultaneously, but touch, being serial, relies on it being solved through memory which is much more difficult.


Both groups use texture to encode information about part (i.e., a different object part would get a different texture) and about 3-D orientation (i.e., different 3-D orientations would be indicated by different 2-D orientations of textures. We both found significant improvement with this new method.


However, our group believed that there was another significance to using texture. Psychologists (particularly Lederman and Klatzky) have shown that touch is more effective in interpreting material properties than geometric information. In particular, when searching for objects placed across fingers, they found that parallel processing (i.e., information to all fingers could be processed simultaneously) occurred for material properties (such as texture) but not geometric properties. We wondered if this would hold for interpreting tactile graphics. David's work (above) showed that not only did our texture encoded information improve performance over raised line drawings. It improved even more when multiple fingers were used (this was untrue for raised line drawings, where it did not matter if one or more fingers were used)!


This is why I would strongly recommend that fields, etc. be given textures unique to that item (i.e., different crops may have a particular texture, border spacing another texture, etc.). This will allow for quicker processing of information than just using raised line drawings. We have actually recently implemented this idea for capsule paper for a Botanical gardens in Richmond, and it seems to be very effective. The textures we chose are from a compilation by Lucia Hasty (www.tactilegraphics.org), from experimentally assessed textures, to pick textures that are actually known to be distinguishable from each other. We've had to make some adjustments, as some textures did not feel as unique as we thought they would. This could be used in combination with something like a talking pen or the talking tablet, to provide text information at certain points.


You may also be interested in the fact that we have also developed affordable devices that can allow exploration of a diagram, such as a map, to happen interactively with a computer. David's devices are shown in his paper. These are small devices that wrap around the finger. They pick up the color from a video display that presents a color version of the textured graphic. The finger devices know by the color which texture to present, which they do so through a vibrator. One can use as many devices on as many fingers as one wants, but two to three devices seem to work best (we have not submitted these results for publication yet).


Another affordable device that we made in the laboratory is a mouse-like tactile display. This consists of a Braille cell mounted on a hollow mouse case and used on top of a graphics tablet (although a touchscreen would work just as well). Robert, this may seem similar to the VT Player by virTouch, but there were some fundamental problems with their design that we corrected. First, they used regular mouse technology which does not produce accurate position information, in fact it can be horribly inaccurate. To see this, rotate a regular mouse to the left about 45 degrees and move it straight vertically: you will find it moving to the corner of the screen rather than straight upwards. Also, the point of rotation was at a very different place than the pins: if the mouse was rotated, the VT Player did not detect this and so did not compensate for the different position of the pins as the rotation point was still the same.


One of our papers on this is the following:

Headley, Hribar and Pawluk (2011). Displaying braille and graphics on a mouse like tactile display. ASSETS 2011.


As you can see, we found an effective way to present both braille and graphics with the same display, and have them easily distinguishable.


For this mouse-like device, we developed a method of controllably generating distinctly different textures, but we have not evaluated its performance for picture identification or in comparison to our other device.


Headley and Pawluk (2011). Roughness perception of textures on a haptic matrix display. World Haptics, 221-226.


We have also, in recent work, combined tactile feedback through a touchscreen with text feedback at what we describe as points of interest, for an electronic version of the garden map.


Another alternative, which we are planning to evaluate in comparison to tactile feedback, is audio feedback called sonification (the use of nonspeech sounds to relay information, just as we would with texture). We believe the disadvantage here is that one would not be able to use multiple fingers, which we showed improves performance significantly.


However, in one study I did with one of my students, Ravi Rastogi, which we are preparing for publication, we looked at the use of tactile and sonification feedback for maps that have more than one set of features. The maps we used showed crops (mangoes, etc.) and rainfall. With sight, it is relatively easy to overlay this information without being overwhelmed by it. We compared: two very distinct sets of tactile textures, two very distinct sets of audio feedback, and one set of tactile feedback with one set of audio feedback. The reason for this comparison is that cognitive load theory proposes that each sensory channel has a finite capacity for working memory. We therefore reasoned that by spreading the load across two modalities we would improve performance, and in fact that is what we found.


The main part of Ravi's work was looking at methods of interactive zooming and simplification.


Rastogi, Pawluk and Ketchum (2013). Intuitive Tactile Zooming for Graphics Accessed by Individuals Who are Blind and Visually Impaired. IEEE Transactions on Neural Systems and Rehabilitation Engineering.


Rastogi and Pawluk (2013). Dynamic Tactile Diagram Simplification on Refreshable Displays. Assistive Technology, 25.


Both of these are potentially applicable to agriculture map information.
With zooming, we were addressing the problem that visual zooming methods are not appropriate tactually. With many visual zooming methods, the zoom that occurs with a button click is often unhelpful: either there is not much change from the previous view, or the zoom inconveniently cuts through information that should stay together. Visually this is easy to correct, as a person can quickly glance at the result and zoom back out to the level desired. It is not so easy tactually, because exploring a map by touch is a much slower process. We developed an algorithm that uses the conceptual organization of the information to scale between levels appropriately, thus saving time.
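As a rough illustration of the idea (not the algorithm from the paper), zoom steps can follow a conceptual hierarchy of the map rather than fixed scale factors, so each zoom lands on a meaningful grouping. The region names here are invented:

```python
# Conceptual hierarchy: farm -> fields -> crop rows.
# Zooming steps along this tree instead of multiplying a scale factor.
HIERARCHY = {
    "farm": {"children": ["north_field", "south_field"]},
    "north_field": {"children": ["rows_1_10", "rows_11_20"], "parent": "farm"},
    "south_field": {"children": [], "parent": "farm"},
    "rows_1_10": {"children": [], "parent": "north_field"},
    "rows_11_20": {"children": [], "parent": "north_field"},
}

def zoom_in(node, child_index=0):
    """Zoom to a conceptually meaningful child region, if one exists."""
    children = HIERARCHY[node]["children"]
    return children[child_index] if children else node

def zoom_out(node):
    """Zoom back to the enclosing conceptual region."""
    return HIERARCHY[node].get("parent", node)
```

The point is that every zoom level corresponds to a unit the user already thinks in, so a slow tactile exploration is never spent on a view that cuts a field in half.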


With simplification, we looked at the question of what to do when there are many different features that would be desirable to present on a map.
Unfortunately, too much information on a diagram makes it very difficult to interpret tactually; remember that the amount of information in a normal visual diagram is already too much, as the visual and tactile systems differ in their processing abilities. Also, currently it is the sighted maker of the tactile graphic who decides whether information is included or not. That decision is based on the user's intent, but what if that intent changes? The user would have to go back to the maker and ask for another map. We looked at a method where different sets of features sit on different "layers", indistinguishable in the final map, which the user can remove, add, and combine at will. Thus all the information is available, but users can select what they want at a particular moment to avoid cluttering the diagram. We found that this helped a lot. We also looked at how borders are presented, as straight-line borders are easier to track than meandering ones; again we gave users a choice. The idea was that they could use the simplified version to get an overview and then the more detailed version if needed.
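A minimal sketch of the layering idea (layer names and features are invented, and this is not our actual system):

```python
class LayeredMap:
    """Each feature set lives on its own layer; the user toggles layers
    on and off, so all data stays available but the rendered tactile
    map shows only what is currently wanted."""

    def __init__(self, layers):
        self._layers = dict(layers)   # layer name -> list of features
        self._active = set()          # layers currently shown

    def toggle(self, name):
        """Add the layer if hidden, remove it if shown."""
        self._active.symmetric_difference_update({name})

    def visible_features(self):
        """Features from active layers only, for rendering."""
        return [f for name in sorted(self._active)
                for f in self._layers[name]]
```

For example, a user could show only the crop layer for an overview, then toggle the rainfall layer on when that comparison is needed, without ever requesting a new map.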


I hope I have not talked too long. Whitney and Robert, if you, or Whitney's colleague, have any questions, I would be happy to answer them or discuss things over the phone.



Sincerely,

Dianne Pawluk

Associate Professor

Biomedical Engineering

Virginia Commonwealth University





--
Dianne Pawluk
Associate Professor
Biomedical Engineering
Virginia Commonwealth University

From: Don Mauck
Date: Fri, Dec 06 2013 2:32PM
Subject: Re: Looking for techniques for accessible maps
← Previous message | No next message

Dianne --
I've been thinking about something for a while, and this may just have sparked a solution. I've been wondering how I could take a screenshot of a page on a phone, such as an iPhone or an Android for that matter. Then I'd like to find a way to present it tactually, so that I or any other blind user could actually see what that particular screen looked like. This would be great where you had areas of text or boxes and the like that weren't being spoken, and you wanted to be able to describe it to a developer from a blind person's perspective.
I'd been thinking about sending the picture to a Braille thermoform machine to see how, or if, I could get the tactual information presented correctly. I'm sure this sounds a bit confusing, so please write or contact me offline if your interest is piqued at all by what I'm proposing; I hope this can generate some conversations.