WebAIM - Web Accessibility In Mind

E-mail List Archives

WebAIM-Forum Digest, Vol 105, Issue 6


From: Johnson, Melissa
Date: Dec 6, 2013 12:15PM


Thanks to everyone who has responded regarding my link inquiry. Great
comments and feedback!

Melissa Johnson

*"Accessibility does not begin with technical details. It begins with the
philosophy that people deserve equal access to information, regardless of
ability."*

Senior Instructional Designer | Professional Services | Pearson eCollege
tel: 303.658.1647 | email: <EMAIL REMOVED>

*Pearson*
Always Learning


On Fri, Dec 6, 2013 at 12:00 PM, < <EMAIL REMOVED> > wrote:

> Send WebAIM-Forum mailing list submissions to
> <EMAIL REMOVED>
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://list.webaim.org/mailman/listinfo/webaim-forum
> or, via email, send a message with subject or body 'help' to
> <EMAIL REMOVED>
>
> You can reach the person managing the list at
> <EMAIL REMOVED>
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of WebAIM-Forum digest..."
>
> Today's Topics:
>
> 1. Re: Question about links opening in new tabs/windows
> (Jens O. Meiert)
> 2. Re: Question about links opening in new tabs/windows
> (Olaf Drümmer)
> 3. FW: Survey for Improving Text-Equivalent Descriptions for the
> Blind - Distributed with Permission from the Texas Tech
> Institutional Review Board (Reinhard Stebner)
> 4. State of Maryland accessibility office (Chagnon | PubCom)
> 5. YouTube (Hewitt,Susan (DSHS))
> 6. Re: Looking for techniques for accessible maps (Dianne V Pawluk)
> 7. Re: Looking for techniques for accessible maps
> (Bourne, Sarah (ITD))
>
>
> ---------- Forwarded message ----------
> From: "Jens O. Meiert" < <EMAIL REMOVED> >
> To: WebAIM Discussion List < <EMAIL REMOVED> >
> Cc:
> Date: Thu, 5 Dec 2013 16:43:39 -0500
> Subject: Re: [WebAIM] Question about links opening in new tabs/windows
> Interesting thread.
>
> To share some of my experience, I'm a friend of the "if a non-disabled
> user has the same issue, it's not an accessibility problem" school of
> thought (in the sense of locking users out), and the notification
> question is then not one of accessibility.
>
> I also subscribe to the approach of "[if] a browser or adaptive
> technology can or should handle an accessibility issue, I won't,"
> propagated by Joe Clark in 2007 [1], and the notification question then
> becomes one for the tools.
>
> Then, when it comes to links the Web seems to have matured enough that
> we can say that
>
> 1) links should open in the same window (unless they invoke a
> different application [2]), because it's _the user_ who can and should
> control how links are handled (like opening a link in a new window if
> so desired), and
>
> 2) links do not need any particular highlighting (unless they too invoke
> different applications, as for email or document links, which can
> be done through appropriate link wording as in e.g.
> "<a> <EMAIL REMOVED> </a>").
>
> And to add a principle then, for good measure, it typically pays off
> to keep it simple. That means a plain, simple a@href that is styled so as
> to be recognizable as a link and indicates when it has been visited
> should be enough in 99% of all cases.
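As a rough illustration of that advice (a minimal sketch, not from the original message; the URLs and link text are invented), such a link needs nothing beyond standard markup and a little CSS:

```html
<!-- A plain link: opens in the same window, recognizable as a link,
     and indicates when it has been visited. -->
<style>
  a:link    { color: #0645ad; text-decoration: underline; }
  a:visited { color: #663399; }
</style>

<a href="https://example.com/report">Read the full report</a>

<!-- Links that invoke a different application can say so through wording: -->
<a href="mailto:info@example.com">info@example.com</a>
<a href="https://example.com/report.pdf">Annual report (PDF)</a>
```

No `target` attribute, no icon, no JavaScript: the user (or their browser/AT) decides whether it opens in a new tab.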
>
> This is all to say, use simple links, skip the extras, and don't worry
> about them.
>
>
> [1] http://joeclark.org/appearances/atmedia2007/
> [2] http://www.nngroup.com/articles/open-new-windows-for-pdfs/
>
> --
> Jens O. Meiert
> http://meiert.com/en/
>
> ✍ New book! http://meiert.com/everyday-adventurer
>
>
>
> ---------- Forwarded message ----------
> From: "Olaf Drümmer" < <EMAIL REMOVED> >
> To: WebAIM Discussion List < <EMAIL REMOVED> >
> Cc:
> Date: Thu, 5 Dec 2013 23:45:34 +0100
> Subject: Re: [WebAIM] Question about links opening in new tabs/windows
> +1
>
> On 5 Dec 2013 at 22:43, Jens O. Meiert < <EMAIL REMOVED> > wrote:
>
> > This is all to say, use simple links, skip the extras, and don't worry
> > about them.
>
>
>
>
> ---------- Forwarded message ----------
> From: "Reinhard Stebner" < <EMAIL REMOVED> >
> To: "'WebAIM Discussion List'" < <EMAIL REMOVED> >
> Cc:
> Date: Thu, 5 Dec 2013 20:14:41 -0500
> Subject: [WebAIM] FW: Survey for Improving Text-Equivalent Descriptions
> for the Blind - Distributed with Permission from the Texas Tech
> Institutional Review Board
> From: Broyles, Cristopher (OS/ASPA) (CTR)
> Sent: Thursday, December 05, 2013 11:11 AM
>
> Hello,
>
> If you use screen‑reader software, are blind, are at least 18 years of
> age, and if you have an interest in helping to improve the quality and
> consistency in text-based descriptions for the blind (such as alt-text/long
> descriptions), I would very much like to include your opinions in my
> dissertation study. My study focuses on improving text-based descriptions
> for the blind. The survey consists of 12 questions and is designed to be
> completed in less than 15 minutes. It has also been tested to ensure
> screen‑reader compatibility and keyboard-based navigation.
>
> The survey can be completed anonymously, and you will not receive any
> follow-up contact from me—or anyone—about the survey or your responses.
> However, for those who indicate that it is okay to do so via their survey
> responses, please note that I am planning to conduct follow-up one-on-one
> interviews (30 to 60 minutes) and a group teleconference about improving
> text-based descriptions (60 to 90 minutes). If you have an interest in
> participating in that conference call, please be sure to provide contact
> information where requested in the survey. Compensation in the amount of
> $15.00 will be provided in the form of a gift card from Amazon (or a
> similar retailer) to those who participate in the individual interview. Due to
> technical limitations, not everyone can be invited to participate in the
> group teleconference. For those who do participate in the teleconference,
> additional compensation of $25.00 will be offered as a gift card.
> If you have any questions about the survey, its aim and potential use, or
> my dissertation, please contact me. Your participation is completely
> voluntary and all information gathered will be used exclusively for my
> research. This survey has been approved by the TTU Institutional Review
> Board.
> Please feel free to forward this survey link onward to other individuals
> you think might have an interest in participating.
>
> Here is the link to the survey: http://www.surveymonkey.com/s/SZXHYH3 .
>
> Respectfully,
> Cristopher Broyles, Doctoral Candidate
> Technical Communication and Rhetoric
> Texas Tech University
> and
> Sean Zdenek, PhD
> Principal Investigator
>
>
>
>
>
> ---------- Forwarded message ----------
> From: "Chagnon | PubCom" < <EMAIL REMOVED> >
> To: "WebAIM Discussion List" < <EMAIL REMOVED> >
> Cc:
> Date: Fri, 6 Dec 2013 02:05:52 -0500
> Subject: [WebAIM] State of Maryland accessibility office
> If I recall correctly, one or more list members are with the Maryland
> accessibility office.
>
> If so, will you contact me directly offline? I have a question re:
> accessibility on a state website.
>
> Thanks,
>
> - Bevi Chagnon
>
> <EMAIL REMOVED>
>
> PubCom.com - Trainers, Consultants, Designers, and Developers.
>
> Print, Web, Acrobat, XML, eBooks, and U.S. Federal Section 508
> Accessibility.
>
> Sec. 508 Workshop - http://www.Workshop.PubCom.com
>
>
>
>
>
>
> ---------- Forwarded message ----------
> From: "Hewitt,Susan (DSHS)" < <EMAIL REMOVED> >
> To: "' <EMAIL REMOVED> '" < <EMAIL REMOVED> >
> Cc:
> Date: Fri, 6 Dec 2013 14:48:15 +0000
> Subject: [WebAIM] YouTube
> How many people are using YouTube as their default video player in their
> organization? How do those of you who are AT users feel about it? We're
> considering making it our standard delivery method for video and I'm
> finding varied results in both usability and strict compliance across
> devices. I'd appreciate any real world input.
>
> Thanks,
> Susan Hewitt
> EIR Accessibility Coordinator
> Texas Department of State Health Services
> <EMAIL REMOVED> <mailto: <EMAIL REMOVED> > |
> 512-776-2913
>
>
>
>
> ---------- Forwarded message ----------
> From: Dianne V Pawluk < <EMAIL REMOVED> >
> To: WebAIM Discussion List < <EMAIL REMOVED> >
> Cc:
> Date: Fri, 6 Dec 2013 10:28:26 -0500
> Subject: Re: [WebAIM] Looking for techniques for accessible maps
> Hi Whitney and Robert,
>
>
> I apologize for taking so long to get back to you, especially Robert, but
> this term has really been busy for me.
>
>
> As Robert knows, I'm a researcher who does work in developing assistive
> technology for tactile graphics. I have had 2 grants in the past and a
> current grant from the National Science Foundation working on tactile
> graphics.
>
>
> My current work is on automating the process of converting visual diagrams
> into tactile diagrams. My previous work was developing effective
> representations for tactile diagrams and affordable devices that could act
> like computer peripherals to virtually interact with diagrams in
> real-time. The overall idea is to automate and provide effective and easy
> access to visual diagrams for individuals who are blind or visually
> impaired without the need for intervention by a sighted person.
>
>
> First, let me talk about representations. The issue of how to represent
> information is very important. Currently the most common method of
> presenting information is using raised line diagrams. Unfortunately,
> studies have shown that, even when used to identify common objects, the
> ability of individuals to use these representations is very poor –
> approximately 30%. Therefore my laboratory group, as well as
> another group (Thompson, Chronicle and Collins), have separately looked at
> more effective methods of presenting information.
>
>
> Thompson, Chronicle and Collins (2006). Enhancing 2-D Tactile Picture
> Design from Knowledge of 3-D Haptic Object Recognition. European
> Psychologist. 11 (2), 110-118.
>
>
> Burch and Pawluk (2011). Using Multiple Contacts with Texture-enhanced
> Graphics. World Haptics Conference, 287-292.
>
>
> We both had the idea to use texture to encode information that is difficult
> to interpret through touch. Processing geometric information is done
> serially (i.e., with one finger following along a line) rather than in
> parallel as in vision. This makes it difficult to interpret what the lines
> mean (i.e., they may not be an outline of an object part but instead be a
> line indicating some detail or it may indicate perspective). In vision, as
> multiple lines are seen together, the constraints can be solved
> simultaneously, but touch, being serial, relies on it being solved through
> memory which is much more difficult.
>
>
> Both groups use texture to encode information about part (i.e., a different
> object part would get a different texture) and about 3-D orientation (i.e.,
> different 3-D orientations would be indicated by different 2-D orientations
> of textures). We both found significant improvement with this new method.
>
>
> However, our group believed that there was another significant benefit to
> using texture. Psychologists (particularly Lederman and Klatzky) have shown that
> touch is more effective in interpreting material properties than geometric
> information. In particular, when searching for objects placed across
> fingers, they found that parallel processing (i.e., information to all
> fingers could be processed simultaneously) occurred for material properties
> (such as texture) but not geometric properties. We wondered if this would
> hold for interpreting tactile graphics. David's work (above) showed that
> our texture-encoded information not only improved performance over raised
> line drawings; it improved even more when multiple fingers were used (this
> was not true for raised line drawings, where it did not matter whether one
> or more fingers were used)!
>
>
> This is why I would strongly recommend that fields, etc. be given textures
> unique to that item (i.e., different crops may have a particular texture,
> border spacing another texture, etc.). This will allow for quicker
> processing of information than just using raised line drawings. We have
> actually implemented this idea recently on capsule paper for a botanical
> garden in Richmond, and it seems to be very effective. The textures we
> chose come from a compilation by Lucia Hasty (www.tactilegraphics.org) and
> from experimentally assessed textures, so that the chosen textures are
> actually known to be distinguishable from each other. We've had to make
> some adjustments, as some textures did not feel as unique as we thought
> they would. This
> could be used in combination with something like a talking pen or the
> talking tablet, to provide text information at certain points.
>
>
> You may also be interested to know that we have developed affordable
> devices that allow a diagram, such as a map, to be explored interactively
> with a computer. David's devices are shown in his
> paper. These are small devices that wrap around the finger. They pick up
> the color from a video display that presents a color version of the
> textured graphic. The finger devices know by the color which texture to
> present, which they do through a vibrator. One can use as many devices
> on as many fingers as one wants, but two to three devices seem to work best
> (we have not submitted these results for publication yet).
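The color-to-texture lookup those finger devices are described as performing might be sketched like this (a hypothetical illustration; the color codes and vibration parameters are invented, not taken from the actual devices):

```javascript
// Hypothetical mapping from a color sampled under the fingertip to the
// vibration texture the finger-worn device should render. Frequencies and
// amplitudes here are invented for illustration only.
const TEXTURE_MAP = {
  "#ff0000": { name: "dense-dots", freqHz: 250, amplitude: 0.8 },
  "#00ff00": { name: "coarse-grid", freqHz: 120, amplitude: 0.6 },
  "#0000ff": { name: "smooth", freqHz: 40, amplitude: 0.3 },
};

// Given the sampled color, return the texture to present, or null
// (no vibration) for unmapped background colors.
function textureForColor(hexColor) {
  return TEXTURE_MAP[hexColor.toLowerCase()] ?? null;
}
```

Because each device only samples the color under its own finger, the same lookup can run independently for two or three fingers at once, which matches the parallel-processing advantage described above.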
>
>
> Another affordable device that we made in the laboratory is a mouse-like
> tactile display. This consists of a Braille cell mounted on a hollow mouse
> case and used on top of a graphics tablet (although a touchscreen would
> work just as well). Robert, this may seem similar to the VT Player by
> virTouch, but there were some fundamental problems with their design that
> we corrected. First, they used regular mouse technology which does not
> produce accurate position information; in fact, it can be horribly
> inaccurate. To see this, rotate a regular mouse to the left about 45
> degrees and move it straight vertically: you will find it moving to the
> corner of the screen rather than straight upwards. Also, the point of
> rotation was at a very different place than the pins: if the mouse was
> rotated, the VT Player did not detect this and so did not compensate for
> the different position of the pins as the rotation point was still the
> same.
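The inaccuracy described above follows from the mouse sensing displacement in its own frame of reference: physically rotating the device by some angle rotates every reported (dx, dy) by that same angle. A minimal sketch of the geometry (not code from any actual device):

```javascript
// A mouse rotated by `thetaDeg` reports a hand motion (dx, dy) rotated by
// the same angle, because it senses displacement in its own body frame.
function reportedMotion(dx, dy, thetaDeg) {
  const t = (thetaDeg * Math.PI) / 180;
  return {
    dx: dx * Math.cos(t) - dy * Math.sin(t),
    dy: dx * Math.sin(t) + dy * Math.cos(t),
  };
}

// Moving the hand straight "up" by 10 units with the mouse rotated 45°:
// the cursor drifts diagonally (about 7.07 sideways, 7.07 up) instead of
// going straight up – the effect described in the message.
```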
>
>
> One of our papers on this is the following:
>
> Headley, Hribar and Pawluk (2011). Displaying braille and graphics on a
> mouse-like tactile display. ASSETS 2011.
>
>
> As you can see, we found an effective way to present both braille and
> graphics with the same display, and have them easily distinguishable.
>
>
> For this mouse-like device, we developed a method of controllably
> generating distinctly different textures, but we have not evaluated its
> performance with picture identification or in comparison to our other
> device.
>
>
> Headley and Pawluk (2011). Roughness perception of textures on a haptic
> matrix display. World Haptics, 221-226.
>
>
> We have also, in recent work, combined the use of tactile feedback through
> a touchscreen with text feedback for what we describe as points of
> interest for an electronic version of the garden map.
>
>
> Another alternative, which we are planning to evaluate in comparison to
> tactile feedback, is the use of audio feedback called sonification (the use
> of nonspeech sounds to relay information just like we would with texture).
> We believe that the disadvantage with this is that one would not be able to
> use multiple fingers, which we did show improved performance significantly.
>
>
> However, one study I did with one of my students, Ravi Rastogi, which we
> are preparing for publication, looks at the use of tactile and
> sonification feedback for maps that have more than one set of features.
> The maps we used showed crops (mangos, etc.) and rainfall.
> With sight, it is relatively easy to overlay this information without being
> overwhelmed by it. We looked at using two very distinct sets of tactile
> textures, two very distinct sets of audio feedback, and one set of tactile
> feedback with one set of audio feedback. The reason for this comparison is
> that cognitive load theory proposes that each sensory channel has a finite
> capacity for working memory. We therefore reasoned that, by spreading the
> load across two modalities, we would improve performance. In fact, that is
> what we did find.
>
>
> The main part of Ravi's work was looking at methods of interactive zooming
> and simplification.
>
>
> Rastogi, Pawluk and Ketchum (2013). Intuitive Tactile Zooming for Graphics
> Accessed by Individuals Who are Blind and Visually Impaired. IEEE
> Transactions on Neural Systems and Rehabilitation Engineering.
>
>
> Rastogi and Pawluk (2013). Dynamic Tactile Diagram Simplification on
> Refreshable Displays. Assistive Technology, 25.
>
>
> Both of these are potentially applicable to agriculture map information.
> With zooming, we addressed the issue that visual zooming methods did not
> seem appropriate. With many visual zooming methods, the zoom that occurs
> with a button click is often unhelpful: either there is not much change
> from the previous level, or the zoom inconveniently cuts through
> information that should go together. This is easy to correct visually, as
> a person can quickly glance at the result and zoom back out to the level
> desired. It is not so easy to do tactually, as exploring a map by touch is
> a much slower process. We developed an algorithm that uses the conceptual
> organization of the information to scale between levels appropriately,
> thus saving time.
>
>
> With simplification, we looked at the issue of what if there are a lot of
> different features that would be desirable to present on a map.
> Unfortunately, too much information on a diagram makes it very difficult
> to interpret tactually – and remember that the amount of information
> presented in a normal visual diagram is too much, as the visual and tactile
> systems are different in their processing abilities. Also, unfortunately,
> currently, it is the sighted maker of the tactile graphic that decides
> whether information should be included or not. It is based on the user's
> intent, but what if that intent changes? A user would have to go back to
> the maker and ask them for another map. We looked at a method where
> different sets of features would be on different "layers",
> indistinguishable in the final map but which the user would be able to
> remove, add and combine at will. Thus, all the information is available to
> the user, but they can select what they want at that particular moment, to
> avoid cluttering the diagram. We found that this did help a lot. We also
> looked at simplifying borders, as straight-line borders are easier to track
> than meandering ones – again, we allowed users a choice. The idea was that
> they could use the simplified version to get an overview and then the more
> detailed version if needed.
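The layered-map idea above might be modeled roughly as follows (a hypothetical sketch; the class and field names are invented, not taken from the papers cited):

```javascript
// Sketch of the "layers" idea: every feature set lives on a named layer;
// the user adds, removes, or combines layers at will, and only the active
// layers are rendered tactually, keeping the diagram uncluttered.
class LayeredMap {
  constructor() {
    this.layers = new Map(); // layerName -> { features, active }
  }
  addLayer(name, features) {
    this.layers.set(name, { features, active: false });
  }
  toggle(name) {
    const layer = this.layers.get(name);
    if (layer) layer.active = !layer.active;
  }
  // Features the tactile display should currently present.
  visibleFeatures() {
    let out = [];
    for (const { features, active } of this.layers.values()) {
      if (active) out = out.concat(features);
    }
    return out;
  }
}
```

A user wanting only crop outlines could activate a "crops" layer alone; activating a "rainfall" layer later combines both, with no sighted intermediary regenerating the map.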
>
>
> I hope I have not talked too long. Whitney and Robert, if you or Whitney's
> colleague have any questions, I would be happy to answer them or discuss
> things over the phone.
>
>
>
> Sincerely,
>
> Dianne Pawluk
>
> Associate Professor
>
> Biomedical Engineering
>
> Virginia Commonwealth University
>
>
> On Sat, Nov 16, 2013 at 3:52 PM, Whitney Quesenbery < <EMAIL REMOVED>
> >wrote:
>
> > Thanks Dianne. Is there a web site for your project that I can point my
> > friend to?
> >
> >
> > On Sat, Nov 16, 2013 at 3:13 PM, Dianne V Pawluk < <EMAIL REMOVED> >
> wrote:
> >
> > > Hi Robert,
> > >
> > > I realize now that I forgot to respond to you about our project - I
> > wanted
> > > to find the time to give you a complete response and then got burdened
> > with
> > > work.
> > >
> > > I will be happy to talk to both Whitney and yourself about her problem,
> > and
> > > I hope to gather information for you on my tactile graphics projects
> > soon.
> > >
> > > Sincerely,
> > > Dianne Pawluk
> > > Associate Professor
> > > Biomedical Engineering
> > > Virginia Commonwealth University
> > >
> > >
> > >
> > > On Sat, Nov 16, 2013 at 1:56 PM, Robert Jaquiss <
> <EMAIL REMOVED>
> > > >wrote:
> > >
> > > > Hello:
> > > >
> > > > How about downloadable files that can be produced with image
> > capable
> > > > embossers or capsule paper?
> > > >
> > > > Regards,
> > > >
> > > > Robert
> > > >
> > > >
> > > > -----Original Message-----
> > > > From: <EMAIL REMOVED>
> > > > [mailto: <EMAIL REMOVED> ] On Behalf Of Whitney
> > > > Quesenbery
> > > > Sent: Saturday, November 16, 2013 9:26 AM
> > > > To: WebAIM Discussion List
> > > > Subject: [WebAIM] Looking for techniques for accessible maps
> > > >
> > > > Inquiring for a colleague who is working with a government
> > agricultural
> > > > project.
> > > >
> > > > One of their functions is a detailed map that describes where land is
> > and
> > > > how it is being used.
> > > >
> > > > At the top level, they have a list of fields (the kind cows stand in,
> > > not a
> > > > form field) with a related visual map presentation. The list contains
> > the
> > > > metadata (who owns it, what is it used for) in an accessible format.
> > > >
> > > > But, there's some more detailed information, such as things like how
> > much
> > > > margin is being left between hedges, crop layouts and so on. Today,
> > this
> > > is
> > > > being done with crayons on a large paper map.
> > > >
> > > > Her project will be putting all this information online - already a
> > > > challenge - and needs to be done in a way that makes the detail
> > > accessible.
> > > >
> > > > She's looking for research or resources in accessible visual data or
> > > > mapping. Any ideas or pointers?
> > > >
> > > > --
> > > > Whitney Quesenbery
> > > > www.wqusability.com | @whitneyq
> > > >
> > > > A Web for Everyone
> > > > http://rosenfeldmedia.com/books/a-web-for-everyone/
> > > >
> > > > Storytelling for User Experience
> > > > www.rosenfeldmedia.com/books/storytelling
> > > >
> > > > Global UX: Design and research in a connected world @globalUX |
> > > > www.amazon.com/gp/product/012378591X/
> > >
> > >
> > >
> > > --
> > > Dianne Pawluk
> > > Associate Professor
> > > Biomedical Engineering
> > > Virginia Commonwealth University
> >
> >
> >
> > --
> > Whitney Quesenbery
> > www.wqusability.com | @whitneyq
> >
> > Storytelling for User Experience
> > www.rosenfeldmedia.com/books/storytelling
> >
> > Global UX: Design and research in a connected world
> > @globalUX | www.amazon.com/gp/product/012378591X/
>
>
>
> --
> Dianne Pawluk
> Associate Professor
> Biomedical Engineering
> Virginia Commonwealth University
>
>
>
> ---------- Forwarded message ----------
> From: "Bourne, Sarah (ITD)" < <EMAIL REMOVED> >
> To: WebAIM Discussion List < <EMAIL REMOVED> >
> Cc:
> Date: Fri, 6 Dec 2013 10:58:24 -0500
> Subject: Re: [WebAIM] Looking for techniques for accessible maps
> Dianne,
>
> I am so impressed! Perhaps I'm jumping way ahead here, but I started
> thinking about how this might work with a GIS system. A GIS user can pick
> the data layers they want included (usually sitting on top of some generic
> layers, such as geopolitical boundaries, roads, etc., and their associated
> labels.) It doesn't seem like that much of a stretch to allow a tactile
> user to also select the texture or sounds to apply to the selected layers.
> Thus the user would create a new map, rather than having to have someone
> else do it for them.
>
> If you get tired of working with crop maps, perhaps you could look into
> the map that is most often used as an example of something really hard to
> provide text equivalents for: the dreaded campus map.
>
> sb
> Sarah E. Bourne
> Director of Assistive Technology &
> Mass.Gov Chief Technology Strategist
> Information Technology Division
> Commonwealth of Massachusetts
> 1 Ashburton Pl. rm 1601 Boston MA 02108
> 617-626-4502
> <EMAIL REMOVED>
> http://www.mass.gov/itd
>
>
> -----Original Message-----
> From: <EMAIL REMOVED> [mailto:
> <EMAIL REMOVED> ] On Behalf Of Dianne V Pawluk
> Sent: Friday, December 06, 2013 10:28 AM
> To: WebAIM Discussion List
> Subject: Re: [WebAIM] Looking for techniques for accessible maps
>
> Hi Whitney and Robert,
>
>
> I apologize for taking so long to get back to you, especially Robert, but
> this term has really been busy for me.
>
>
> As Robert knows, I'm a researcher who does work in developing assistive
> technology for tactile graphics. I have had 2 grants in the past and a
> current grant from the National Science Foundation working on tactile
> graphics.
>
>
> My current work is on automating the process of converting visual diagrams
> into tactile diagrams. My previous work was developing effective
> representations for tactile diagrams and affordable devices that could act
> like computer peripherals to virtually interact with diagrams in real-time.
> The overall idea is to automate and provide effective and easy access to
> visual diagrams for individuals who are blind or visually impaired without
> the need for intervention by a sighted person.
>
>
> First, let me talk about representations. The issue of how to represent
> information is very important. Currently the most common method of
> presenting information is using raised line diagrams. Unfortunately,
> studies have shown that even when used to identify common objects that the
> ability of individuals to use these representations is very poor -
> approximately around 30%. Therefore my laboratory group, as well as
> another group (Thompson, Chronicle and Collins), have separately looked at
> more effective methods of presenting information.
>
>
> Thompson, Chronical and Collins (2006). Enhancing 2-D Tactile Picture
> Design from Knowledge of 3-D Haptic Object Recognition. European
> Psychologist. 11 (2), 110-118.
>
>
> Burch and Pawluk (2011). Using Multiple Contacts with Texture-enhanced
> Graphics. World Haptics Conference, 287-292.
>
>
> We both had the idea to use texture to encode information that is
> difficult to interpret through touch. Processing geometric information is
> done serially (i.e., with one finger following along a line) rather than in
> parallel as in vision. This makes it difficult to interpret what the lines
> mean (i.e., they may not be an outline of an object part but instead be a
> line indicating some detail or it may indicate perspective). In vision, as
> multiple lines are seen together, the constraints can be solved
> simultaneously, but touch, being serial, relies on it being solved through
> memory which is much more difficult.
>
>
> Both groups use texture to encode information about part (i.e., a
> different object part would get a different texture) and about 3-D
> orientation (i.e., different 3-D orientations would be indicated by
> different 2-D orientations of textures. We both found significant
> improvement with this new method.
>
>
> However, our group believed that there was another significance to using
> texture. Psychologists (particularly Lederman and Klatzky) have shown that
> touch is more effective in interpreting material properties than geometric
> information. In particular, when searching for objects placed across
> fingers, they found that parallel processing (i.e., information to all
> fingers could be processed simultaneously) occurred for material properties
> (such as texture) but not geometric properties. We wondered if this would
> hold for interpreting tactile graphics. David's work (above) showed that
> not only did our texture encoded information improve performance over
> raised line drawings. It improved even more when multiple fingers were
> used (this was untrue for raised line drawings, where it did not matter if
> one or more fingers were used)!
>
>
> This is why I would strongly recommend that fields, etc. be given textures
> unique to that item (i.e., different crops may have a particular texture,
> border spacing another texture, etc.). This will allow for quicker
> processing of information than just using raised line drawings. We have
> actually recently implemented this idea for capsule paper for a Botanical
> gardens in Richmond, and it seems to be very effective. The textures we
> chose are from a compilation by Lucia Hasty (www.tactilegraphics.org),
> from experimentally assessed textures, to pick textures that are actually
> known to be distinguishable from each other. We've had to make some
> adjustments, as some textures did not feel as unique as we thought they
> would. This could be used in combination with something like a talking pen
> or the talking tablet, to provide text information at certain points.
>
>
> You may also be interested in the fact that we have also developed
> affordable devices that can allow exploration of a diagram, such as a map,
> to happen interactively with a computer. David's devices are shown in his
> paper. These are small devices that wrap around the finger. They pick up
> the color from a video display that presents a color version of the
> textured graphic. The finger devices know by the color which texture to
> present, which they do so through a vibrator. One can use as many devices
> on as many fingers as one wants, but two to three devices seem to work best
> (we have not submitted these results for publication yet).
>
>
> Another affordable device that we made in the laboratory is a mouse-like
> tactile display. This consists of a Braille cell mounted on a hollow mouse
> case and used on top of a graphics tablet (although a touchscreen would
> work just as well). Robert, this may seem similar to the VT Player by
> virTouch, but there were some fundamental problems with their design that
> we corrected. First, they used regular mouse technology which does not
> produce accurate position information, in fact it can be horribly
> inaccurate. To see this, rotate a regular mouse to the left about 45
> degrees and move it straight vertically: you will find it moving to the
> corner of the screen rather than straight upwards. Also, the point of
> rotation was at a very different place than the pins: if the mouse was
> rotated, the VT Player did not detect this and so did not compensate for
> the different position of the pins as the rotation point was still the same.
>
>
> One of our papers on this is the following:
>
> Headley, Hribar and Pawluk (2011). Displaying braille and graphics on a
> mouse like tactile display. ASSETS 2011.
>
>
> As you can see, we found an effective way to present both braille and
> graphics with the same display, and have them easily distinguishable.
>
>
> For this mouse like device, we developed a method of controllably
> generating distinctly different textures , but we have not evaluated its
> performance with picture identification or in comparison to our other
> device.
>
>
> Headley and Pawluk (2011). Roughness perception of textures on a haptic
> matrix display. Wourld Haptics, 221-226.
>
>
> We have also, in recent, work, combined the use of tactile feedback
> through a touchscreen with text feedback for what we describe are points of
> interest for an electronic version of the garden map.
>
>
> Another alternative, which we are planning to evaluate in comparison to
> tactile feedback is the use of audio feedback called sonification (the use
> of nonspeech sounds to relay information just like we would with texture).
> We believe that the disadvantage with this is that one would not be able
> to use multiple fingers, which we did show improved performance
> significantly.
>
>
> However, in one study I did with one of my students, Ravi Rastogi, which
> we are preparing for publication, we looked at the use of tactile and
> sonification feedback for maps that have more than one set of features.
> The maps we used showed crops (mangos, etc.) and rainfall. With sight,
> it is relatively easy to overlay this information without being
> overwhelmed by it. We compared: two very distinct sets of tactile
> textures, two very distinct sets of audio feedback, and one set of
> tactile feedback with one set of audio feedback. The reason for this
> comparison is that cognitive load theory proposes that each sensory
> channel has a finite capacity for working memory. We therefore reasoned
> that by spreading the load across two modalities, we would improve
> performance. In fact, that is what we found.
>
>
> The main part of Ravi's work was looking at methods of interactive zooming
> and simplification.
>
>
> Rastogi, Pawluk and Ketchum (2013). Intuitive Tactile Zooming for
> Graphics Accessed by Individuals Who are Blind and Visually Impaired. IEEE
> Transactions on Neural Systems and Rehabilitation Engineering.
>
>
> Rastogi and Pawluk (2013). Dynamic Tactile Diagram Simplification on
> Refreshable Displays. Assistive Technology, 25.
>
>
> Both of these are potentially applicable to agricultural map information.
> With zooming, we were addressing the issue that we did not think visual
> zooming methods were appropriate. With many visual zooming methods, the
> zoom that occurs with a button click is often unsuitable: either there is
> not much of a change from before, or the zoom inconveniently cuts through
> information that should stay together. This is easy to correct visually,
> as a person can quickly glance at the result and zoom back out to the
> desired level. It is not so easy tactually, as exploring a map tactually
> is a much slower process. We developed an algorithm that uses the
> conceptual organization of the information to scale between levels
> appropriately, thus saving time.
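The published algorithm is more sophisticated, but the core idea of zooming by conceptual organization rather than by a fixed visual factor can be pictured roughly like this (hypothetical feature names and structure, purely for illustration):

```python
# Features grouped by conceptual level: each zoom step exposes one more
# level of the hierarchy, so a zoom never cuts a conceptual group in half
# and never produces a step with "not much of a change."
FEATURE_HIERARCHY = [
    ["farm boundary"],                       # level 0: overview
    ["field A", "field B"],                  # level 1: major regions
    ["crop rows in A", "hedges around B"],   # level 2: fine detail
]

def features_at_zoom(level):
    """Return every feature visible at the given conceptual zoom level."""
    level = min(level, len(FEATURE_HIERARCHY) - 1)  # clamp to deepest level
    visible = []
    for tier in FEATURE_HIERARCHY[: level + 1]:
        visible.extend(tier)
    return visible

print(features_at_zoom(0))  # ['farm boundary']
print(features_at_zoom(1))  # ['farm boundary', 'field A', 'field B']
```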
>
>
> With simplification, we looked at the issue of what happens if there are
> many different features that would be desirable to present on a map.
> Unfortunately, too much information on a diagram makes it very difficult
> to interpret tactually: remember that the amount of information presented
> in a normal visual diagram is already too much, as the visual and tactile
> systems differ in their processing abilities. Also, unfortunately, it is
> currently the sighted maker of the tactile graphic who decides whether
> information should be included or not. That decision is based on the
> user's intent, but what if that intent changes? The user would have to go
> back to the maker and ask for another map. We looked at a method where
> different sets of features are placed on different "layers",
> indistinguishable in the final map, which the user can remove, add, and
> combine at will. Thus all the information is available to the user, but
> they can select what they want at that particular moment, avoiding a
> cluttered diagram. We found that this helped a lot. We also looked at
> presenting borders, as straight-line borders are easier to track than
> meandering ones; again, we allowed users a choice. The idea was that they
> could use the simplified version to get an overview and then the more
> detailed version if needed.
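A minimal sketch of the layer idea (hypothetical names and API; the actual system renders the selected layers to a tactile display rather than returning a list):

```python
class LayeredMap:
    """All feature sets are stored; the user selects which to render."""

    def __init__(self):
        self.layers = {}    # layer name -> list of features
        self.active = set() # layers currently shown

    def add_layer(self, name, features):
        self.layers[name] = features

    def toggle(self, name):
        # Add or remove a layer from the rendered map at will.
        if name in self.active:
            self.active.discard(name)
        else:
            self.active.add(name)

    def rendered_features(self):
        # Only active layers appear, keeping the diagram uncluttered,
        # yet every layer remains available for later.
        feats = []
        for name in sorted(self.active):
            feats.extend(self.layers[name])
        return feats

m = LayeredMap()
m.add_layer("crops", ["mango grove", "rice paddy"])
m.add_layer("rainfall", ["high-rain zone"])
m.toggle("crops")
print(m.rendered_features())  # ['mango grove', 'rice paddy']
m.toggle("rainfall")
m.toggle("crops")
print(m.rendered_features())  # ['high-rain zone']
```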
>
>
> I hope I have not talked too long. Whitney and Robert, if you have any
> questions, or, Whitney, if your colleague has any questions, I would be
> happy to answer them or discuss things over the phone.
>
>
>
> Sincerely,
>
> Dianne Pawluk
>
> Associate Professor
>
> Biomedical Engineering
>
> Virginia Commonwealth University
>
>
> On Sat, Nov 16, 2013 at 3:52 PM, Whitney Quesenbery < <EMAIL REMOVED>
> >wrote:
>
> > Thanks Dianne. Is there a web site for your project that I can point
> > my friend to?
> >
> >
> > On Sat, Nov 16, 2013 at 3:13 PM, Dianne V Pawluk < <EMAIL REMOVED> >
> wrote:
> >
> > > Hi Robert,
> > >
> > > I realize now that I forgot to respond to you about our project - I
> > > wanted to find the time to give you a complete response and then got
> > > burdened with work.
> > >
> > > I will be happy to talk to both Whitney and yourself about her
> > > problem, and I hope to gather information for you on my tactile
> > > graphics projects soon.
> > >
> > > Sincerely,
> > > Dianne Pawluk
> > > Associate Professor
> > > Biomedical Engineering
> > > Virginia Commonwealth University
> > >
> > >
> > >
> > > On Sat, Nov 16, 2013 at 1:56 PM, Robert Jaquiss
> > > < <EMAIL REMOVED>
> > > >wrote:
> > >
> > > > Hello:
> > > >
> > > > How about downloadable files that can be produced with
> > > > image-capable embossers or capsule paper?
> > > >
> > > > Regards,
> > > >
> > > > Robert
> > > >
> > > >
> > > > -----Original Message-----
> > > > From: <EMAIL REMOVED>
> > > > [mailto: <EMAIL REMOVED> ] On Behalf Of Whitney
> > > > Quesenbery
> > > > Sent: Saturday, November 16, 2013 9:26 AM
> > > > To: WebAIM Discussion List
> > > > Subject: [WebAIM] Looking for techniques for accessible maps
> > > >
> > > > Inquiring for a colleague who is working with a government
> > > > agricultural project.
> > > >
> > > > One of their functions is a detailed map that describes where land
> > > > is and how it is being used.
> > > >
> > > > At the top level, they have a list of fields (the kind cows stand
> > > > in, not a form field) with a related visual map presentation. The
> > > > list contains the metadata (who owns it, what it is used for) in an
> > > > accessible format.
> > > >
> > > > But, there's some more detailed information, such as how much
> > > > margin is being left between hedges, crop layouts and so on. Today,
> > > > this is being done with crayons on a large paper map.
> > > >
> > > > Her project will be putting all this information online - already a
> > > > challenge - and it needs to be done in a way that makes the detail
> > > > accessible.
> > > >
> > > > She's looking for research or resources in accessible visual data
> > > > or mapping. Any ideas or pointers?
> > > >
> > > > --
> > > > Whitney Quesenbery
> > > > www.wqusability.com | @whitneyq
> > > >
> > > > A Web for Everyone
> > > > http://rosenfeldmedia.com/books/a-web-for-everyone/
> > > >
> > > > Storytelling for User Experience
> > > > www.rosenfeldmedia.com/books/storytelling
> > > >
> > > > Global UX: Design and research in a connected world @globalUX |
> > > > www.amazon.com/gp/product/012378591X/
> > >
> > >
> > >
> > > --
> > > Dianne Pawluk
> > > Associate Professor
> > > Biomedical Engineering
> > > Virginia Commonwealth University
> > >
> >
> >
> >
> > --
> > Whitney Quesenbery
> > www.wqusability.com | @whitneyq
> >
> > Storytelling for User Experience
> > www.rosenfeldmedia.com/books/storytelling
> >
> > Global UX: Design and research in a connected world @globalUX |
> > www.amazon.com/gp/product/012378591X/
> >
>
>
>
> --
> Dianne Pawluk
> Associate Professor
> Biomedical Engineering
> Virginia Commonwealth University
>
>