WebAIM has analyzed the home pages for the top 1 million web sites and collected an immense volume of accessibility data. The results provide great insight into the current state of accessibility. Unfortunately, significant and pervasive issues are present across much of the web. While there is much work to be done to improve accessibility, these research findings can help us identify patterns so accessibility efforts can be better focused.
Check the analysis results for any home page in our sample
NOTE: The summary below describes the 2019 WebAIM Million results. The pages above now reflect the 2020 WebAIM Million data.
Here are just a few notable items:
- Home pages averaged 59.6 detectable errors each.
- 7.6% of all home page elements (1 in 13) had a detectable accessibility error.
- The WCAG failure rate for home pages was at least 97.8%.
- Low contrast text was the most common detectable issue, with an average of 36 instances of low contrast text on each home page.
- One-third of all images (12.3 images per page on average) were missing alternative text.
- 59% of form inputs were not properly labeled.
- Home pages with ARIA present averaged 11.2 more detectable errors than pages without ARIA.
- The report outlines numerous common web technologies and details how their presence corresponds to increased or decreased accessibility errors.
There is much, much more in the full report.
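As a rough illustration of what "detectable" means for two of the items above (missing alternative text and unlabeled form inputs), here is a minimal sketch assuming Python with BeautifulSoup installed. It is not WAVE's implementation and skips many nuances a real checker handles; it simply flags images with no alt attribute and inputs with no programmatic label.

```python
# Illustrative only -- NOT WAVE's implementation. The sample HTML is made up.
from bs4 import BeautifulSoup

html = """
<form>
  <img src="logo.png">
  <img src="chart.png" alt="Sales by quarter, 2019">
  <input type="text" name="email">
  <label for="phone">Phone</label>
  <input type="tel" id="phone" name="phone">
</form>
"""

soup = BeautifulSoup(html, "html.parser")

# Images with no alt attribute at all. alt="" is a deliberate signal that an
# image is decorative, so only a missing attribute is flagged here.
missing_alt = [img for img in soup.find_all("img") if img.get("alt") is None]

def is_labeled(inp):
    """True if an input has some programmatic label."""
    if inp.get("type") in ("hidden", "submit", "button", "image", "reset"):
        return True
    if inp.get("aria-label") or inp.get("aria-labelledby") or inp.get("title"):
        return True
    if inp.find_parent("label"):                      # implicit label wrapping
        return True
    input_id = inp.get("id")                          # explicit <label for="...">
    return bool(input_id and soup.find("label", attrs={"for": input_id}))

unlabeled = [inp for inp in soup.find_all("input") if not is_labeled(inp)]

print(f"Images missing alt text: {len(missing_alt)}")  # 1 (logo.png)
print(f"Unlabeled form inputs:   {len(unlabeled)}")    # 1 (the email field)
```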
This is just what I needed today as I felt I was late to the Accessibility Party. This shows I am right on time!
I have a passion to help the world become more accessible for disabled people; at some point in our lives, each of us may become disabled too.
I want to help businesses increase their revenue by ensuring that the millions of people who are disabled in one way or another can use their websites to accomplish the things they want to.
These results are quite interesting, and it’s great to be able to see large trends. Was any analysis done to determine how often WAVE scanned the actual homepage content, as opposed to the version of the page seen by bots like Google? While the content is probably often quite similar, I wonder if there are certain types of errors that appear often when the “bot” version is examined. The report for our site notes empty headings and links that are most likely not empty on the actual home page. In addition, our skip links should be working. It’s possible that their target is not loaded on the version shown to bots. Finally, was any testing done on content like ad frames that may appear on a large percentage of the home pages but come from just a few sources?
I’d love to know how many errors come from ads or similar third party content, especially if fixing said issues had the potential to raise the scores of thousands of sites at once.
Steve –
The WAVE analysis emulates a standard Chrome browser. There are various mechanisms out there to determine "bots" vs. actual users – so it's quite possible that our scanner was served a different version of the home page content than actual users would experience.
WAVE does not do testing within iframe content in a page. While our data indicates the presence of certain technologies (such as ads) that might be served within iframes, any actual accessibility issues within those iframes were not included in our data.
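For anyone curious what this looks like in practice, here is a minimal sketch of the two behaviors described, assuming Python with requests and BeautifulSoup: send a browser-like User-Agent so the scanner is more likely to receive the same content a Chrome user would, and record which iframes are present without evaluating anything inside them. It is a simplification (the real analysis emulates a full Chrome browser rather than parsing raw HTML), and the User-Agent string and URL are only examples.

```python
# Simplified sketch, not WAVE's code. UA string and URL are examples only.
import requests
from bs4 import BeautifulSoup

CHROME_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36"
)

def scan(url):
    # Fetch the page as a browser-like client; bot-detection logic may still
    # serve different content than a real user sees.
    resp = requests.get(url, headers={"User-Agent": CHROME_UA}, timeout=30)
    soup = BeautifulSoup(resp.text, "html.parser")

    # Note which third-party frames (ads, embeds, etc.) are present, but do
    # not fetch or evaluate their contents -- mirroring the limitation that
    # issues inside iframes are not reflected in the page's error counts.
    iframe_sources = [f.get("src") for f in soup.find_all("iframe") if f.get("src")]

    # Only elements in the page's own DOM are evaluated.
    own_images = soup.find_all("img")
    return {"iframes_noted": iframe_sources, "images_checked": len(own_images)}

# Example (hypothetical URL):
# print(scan("https://example.com/"))
```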
I just have to thank you guys at WebAIM for the excellent work you do. You are an invaluable resource for me as a developer with a strong focus on accessibility. Any coding problem I have, I can easily google and find answers. With accessibility, not so much. So thanks!