WebAIM - Web Accessibility In Mind

E-mail List Archives

Re: DHS Trusted Tester_Testing support/input

From: Jano Llorca Lis
Date: Aug 29, 2024 9:18AM


Hi Claire,

Thank you for sharing your experience and concerns. I understand the doubts
that have arisen, and I’d like to offer some perspectives that might help
you address these points.

1. Testing Captivate Courses as Web vs. Software:

The confusion about whether to test these courses as "web products" or
"software" is quite common and depends on the environment in which
end-users access the content. Adobe Captivate courses, even when they are
published as SCORM modules and delivered through an LMS, are typically
accessed through a web browser. In terms of user experience and
accessibility, that means they should be evaluated against the web
accessibility guidelines (WCAG).

Some people might suggest testing them as "software" because their
interactive and multimedia elements can behave differently from typical
page content, but the logical stance is that content accessed through a
browser is web content and should be tested as such. Document this
approach clearly, along with how the web accessibility standards apply in
your case.
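
To make the point concrete (a rough sketch; the file and course names are
invented, not taken from your LMS), a SCORM package is essentially a zip of
HTML, CSS, and JavaScript files that the LMS opens in the learner's
browser, often inside a player page something like this:

    <!-- Hypothetical LMS player page: the course itself is ordinary web content -->
    <iframe src="/scorm/course-123/launch.html"
            title="Course: Accessibility Awareness">
    </iframe>

Because everything the learner sees and operates is rendered by the
browser, the web provisions of the test process (and WCAG) are the ones
that apply.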

2. Headings Not Detected by ANDI:

When ANDI reports "No headings, lists, landmarks, or live regions were
detected," it means that nothing on the page is coded as a heading (an
h1-h6 element or an element with role="heading"), even if some text is
visually styled to look like one. Your interpretation is correct: Section
9.2 of the DHS test process requires that visually apparent headings be
"programmatically identified." If they are not, there is a compliance
failure.

ANDI does not judge whether headings are "visually apparent"; its message
only reports what is, or is not, in the code. The "does not apply"
condition in Section 9.2 is meant for pages that have no visible headings
at all. When visible headings exist but ANDI detects nothing, that is
exactly the failure condition: the headings are not programmatically
identified, so screen readers and other assistive technologies cannot
announce them or navigate by them.
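
For example (a simplified sketch; the class name and text are invented for
illustration), text that is only styled to look like a heading is invisible
to ANDI and to screen reader heading navigation:

    <!-- Looks like a heading, but is not programmatically identified -->
    <div class="title-large">Module 1: Getting Started</div>

    <!-- Programmatically identified heading that ANDI and screen readers will report -->
    <h2>Module 1: Getting Started</h2>

If the published Captivate output renders its heading text as generic <div>
or <span> elements (which is common in authoring-tool HTML output), that
would explain why ANDI finds nothing to report.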

3. Using ANDI for eLearning Content:

While it is true that ANDI is the recommended tool in the DHS Trusted
Tester course, it is best practice to use a set of tools for a more
comprehensive assessment, especially for interactive content like
eLearning. Tools like JAWS and NVDA (screen readers), axe DevTools by Deque
(a browser extension for automated accessibility checks), and WAVE by
WebAIM are excellent complements to ANDI for a more thorough analysis.

I recommend using a combination of these tools and manual testing methods,
such as Tab key navigation, to ensure a comprehensive evaluation of
eLearning content.

I hope these ideas are helpful and assist you in clarifying and supporting
your testing practices.

Best regards,
*Jano Llorca*
SEO - SEM - Social Ads Consultant
UX - UI - Web Design and Accessibility

Tel: 673 346 726
<EMAIL REMOVED>

<https://ilumina-agencia-consultora-seo.business.site/>


On Thu, Aug 29, 2024 at 16:47, Claire Forbes (< <EMAIL REMOVED> >)
wrote:

> Hello everyone, I'm hoping for some input ...
>
> As a DHS Trusted Tester (completed in 2020), I test online courses from
> the end-users' experience - I'm provided a sandbox link for testing
> content, which is where a course lives until it's reviewed and deployed as
> live. When a course is live, participants are provided a weblink to a
> course, or they are enrolled in a course within a learning management
> system that lives on a website. I'm being criticized for testing these
> items as Web testing; being told I should be testing them as software since
> they are developed in Captivate, "Captivate Courses are not published as
> web products and therefore should be tested as 'Software.'"
> ... how is this a logical stance, or am I completely missing something
> here? It's quite possible, as I'm not a developer and have no experience
> with Captivate.
>
> I'm also being called out, and the DHS Section 508 Compliance Test Process
> document referenced, for failing headings not being recognized by ANDI -
> there are 100% "visually apparent" headings on the page that ANDI does not
> detect, and ANDI states "No headings, lists, landmarks, or live regions were
> detected." To me, this means the headings aren't "programmatically
> identified," as they should be based on section 9.2 Web: Section Headings,
> item A.
> So I'm being told "The ANDI tool is noting that there are no visually
> apparent headings on the page; It is a message stating a condition only.
> Note this message is not identifying a non compliance issue; this message
> is not red, yellow or orange which it would be if it were a compliance
> issue. For Section 9.2, the table indicates that this Test ID DOES NOT
> APPLY (DNA) "... if there are no visually apparent headings on the page..."
> However, this is not at all how I've interpreted the DHS test step and
> failure conditions for "visually apparent" headings. Any thoughts?
>
> Lastly, I'm being told that ANDI isn't the tool to be used, "It is not
> always reliable for eLearning content. We recommend you consider using
> additional tools for a comprehensive assessment."
> But this is the tool the DHS Trusted Tester course recommends to be used -
> what tools is everyone using then for web testing?
>
> Thank you in advance,
> Claire