E-mail List Archives

Re: Automated WCAG testing tool that is also a good crawler


From: Steve Green
Date: Oct 31, 2017 11:57AM


We also use SortSite. It piggybacks onto Internet Explorer, so it can extract some types of JavaScript links from the DOM. However, it is extremely slow because it only uses 2 or 4 concurrent threads (I'm not sure which, but it is determined by Internet Explorer). We often find that it can only scan a small part of a website in the time we are given to test it.

It also only lists a subset of the pages - my recollection is that it regards different URLs as being the same if the only difference is the arguments in the URLs, which is usually an invalid assumption. Furthermore, it does not recognise all types of JavaScript links, particularly AJAX calls. Depending on how your URLs are constructed and what you use AJAX for, this could be anywhere from a showstopper to irrelevant.

In our experience, Screaming Frog is the best crawler by a long way. It is much faster than SortSite and finds pages that no other crawlers can find (the key is to select the latest JavaScript setting but there are a lot of other settings that need to be changed from their default). It can be throttled if necessary.

Unfortunately, Screaming Frog does not test for accessibility so we export the list of URLs and create a single HTML page that links to them all. We then run SortSite against that page. It's a bit of a kludge but it's sometimes the only way to run an automated test against every page on a website.
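For anyone wanting to replicate the kludge, a minimal sketch of the intermediate step might look like this. It assumes (not stated in the original post) that the Screaming Frog export is a plain text file with one URL per line, here called urls.txt; the output filename index.html is arbitrary. The resulting page is what you would then point SortSite at.

```python
import html

def build_link_page(urls, title="Crawl index"):
    """Return a single HTML page that links to every URL in `urls`."""
    # Escape each URL so characters like & and " don't break the markup.
    links = "\n".join(
        f'<li><a href="{html.escape(u, quote=True)}">{html.escape(u)}</a></li>'
        for u in urls
    )
    return (
        "<!DOCTYPE html>\n"
        f'<html lang="en"><head><meta charset="utf-8">'
        f"<title>{html.escape(title)}</title></head>\n"
        f"<body><h1>{html.escape(title)}</h1>\n<ul>\n{links}\n</ul>\n"
        "</body></html>\n"
    )

# Usage sketch: read the exported URL list, write the seed page,
# then run the accessibility checker against index.html.
# urls = [line.strip() for line in open("urls.txt") if line.strip()]
# open("index.html", "w", encoding="utf-8").write(build_link_page(urls))
```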

Regards,
Steve Green
Managing Director
Test Partners Ltd


-----Original Message-----
From: WebAIM-Forum [mailto: <EMAIL REMOVED> ] On Behalf Of Swift, Daniel P.
Sent: 31 October 2017 13:40
To: WebAIM Discussion List < <EMAIL REMOVED> >
Subject: Re: [WebAIM] Automated WCAG testing tool that is also a good crawler

Roel:

We use SortSite (about 80% of the way down the list). It sounds like it does everything that you are describing -- tops out at 15,000 pages I believe during its automated crawl. For the automated crawl, you pick a starting URL and it crawls secondary, tertiary, etc. pages from there. This setting is defined by the user so you can choose how 'deep' it crawls.

I feel like the UI is a little dated, but it is working out well for us.

Dan Swift
Senior Web Specialist
Enterprise Services
West Chester University
610.738.0589

-----Original Message-----
From: WebAIM-Forum [mailto: <EMAIL REMOVED> ] On Behalf Of Roel Van Gils
Sent: Tuesday, October 31, 2017 8:47 AM
To: WebAIM Discussion List
Subject: [WebAIM] Automated WCAG testing tool that is also a good crawler

I'm looking for an automated testing tool that is *also* a good and reliable web crawler, meaning: it should be smart enough to traverse a site structure at least three levels deep (and also know when to stop, how to identify itself, throttle requests, etc.).

I've tested a dozen of the tools on the Web Accessibility Evaluation Tools List on the W3C site (https://www.w3.org/WAI/ER/tools/). When enabling the filter 'Groups of web pages or web sites', I get a few tools, but it seems they only allow you to add URLs manually. That's not what I'm looking for.

Can anyone point me in the right direction?

(BTW: I'm perfectly aware that only about 20% of WCAG can be tested reliably, and human judgement is necessary in most cases.)

Thanks,
Roel

--
Roel Van Gils
Inclusive Design & Accessibility Expert at 11Ways
+32 473 88 18 06 / http://11ways.be / 'roelvangils' on Twitter, Skype and elsewhere