Automatic accessibility testing tools need proper understanding




Different automatic tools can produce different results. Accessibility conformance testing rules help a bit, but there are still potential differences. And, again, automatic tools cover only a portion of the WCAG success criteria, so please do test manually and with users to really make your products accessible.

Tech people, especially developers and testers, but also tech leads, project managers and other staff, can be very used to programmatic reports and automated testing coverage. And that is great. Whatever saves time and can be automated should be automated, if the investment pays off.

But not everything can be automated. Some things are subject to different interpretations, and sometimes context is so important that it cannot be neglected. I am still waiting for fully automated graphical user interface (GUI) testing, for example. I have tested some promising tools, but we are still far from it. The same goes for accessibility testing.

There are a lot of tools that try to automate testing, and I love the Accessibility Conformance Testing (ACT) rules community's efforts to define the rules, which give a very solid base for all of the browser extensions and other tools out there. But we all know that testing for accessibility requires humans. User testing done correctly, on representative webpages and with representative users, cannot be replaced by any tool. Not even with the rise of machine learning and artificial intelligence. They can help, but they are not enough.

Tools can report that no issues were found and the site can still be totally inaccessible

As mentioned before on this blog – automatic tools cover up to 30, some say up to 50, percent of the success criteria in the Web Content Accessibility Guidelines, so the remaining 50 to 70 percent cannot be detected automatically. This means that your tool can report that your site passes and give you 100 points out of 100 possible, while a manual test could still show that the page has a lot of barriers, or is maybe even totally inaccessible.
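For illustration, here is a hypothetical markup fragment (written for this post, not taken from any real audit) that typical automated checkers would pass, because the attributes they look for are present, while a human would spot the problems immediately:

```typescript
// Hypothetical markup, used only for this illustration.
// Most automated rules check that an alt attribute or an accessible name exists,
// not whether the text is actually meaningful to a person.
const markupThatUsuallyPassesAutomatedChecks = `
  <!-- alt text is present, so the rule passes, but it describes nothing -->
  <img src="q3-revenue-chart.png" alt="image">

  <!-- the link has a name, so the rule passes, but "click here" is useless
       to someone scanning a list of links with a screen reader -->
  <a href="/annual-report.pdf">click here</a>
`;

console.log(markupThatUsuallyPassesAutomatedChecks);
```

Both snippets would typically get a clean bill of health from an automated scan, which is exactly the false sense of security described above.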

Tools also have bugs, cannot cover all possible variations in the code, and sometimes new standards even cause tools to break. We must also remember that tools sometimes report false positives (errors that are actually not errors) or false negatives (tools passing code that will actually cause problems).

Multiple tools can give different results

I’ve started an open source project called aXeSiA that can run multiple different tools at the same time, and after testing it against this blog I discovered some strange results. Sometimes one tool reported an error with serious impact while another tool missed it totally. This means that if I were only using a single tool I would have a very different perspective on the results. If I were only using the tool that did not catch the error, I might miss the problem entirely.
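To make the idea concrete, here is a minimal sketch of running two engines against the same URL and printing what each one reports. This is not aXeSiA's actual implementation, just an illustration; it assumes the publicly documented @axe-core/puppeteer and pa11y Node.js APIs, and the URL is a placeholder.

```typescript
import puppeteer from 'puppeteer';
import { AxePuppeteer } from '@axe-core/puppeteer';
import pa11y from 'pa11y';

async function compareTools(url: string): Promise<void> {
  // Tool 1: axe-core, driven through Puppeteer.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);
  const axeResults = await new AxePuppeteer(page).analyze();
  await browser.close();
  const axeRuleIds = axeResults.violations.map((v) => v.id);

  // Tool 2: pa11y, which runs its own headless browser internally.
  const pa11yResults = await pa11y(url);
  const pa11yCodes = pa11yResults.issues.map((i) => i.code);

  // The two tools use different rule identifiers, so a real comparison needs a
  // mapping layer (for example via ACT rule IDs); here we only print the raw
  // lists so any difference in what was detected is visible.
  console.log('axe-core violations:', axeRuleIds);
  console.log('pa11y issues:', pa11yCodes);
}

compareTools('https://example.com').catch(console.error);
```

Even on the same page, the two lists will often not match one to one, which is exactly the kind of difference described here.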

That’s why it is crucial to understand the limitations of the different tools, and that is also why it makes sense to use multiple different tools when testing. I was actually surprised to get such differences, as both tools were using ACT rules as a basis, but as mentioned – sometimes software has bugs, sometimes implementation details vary, sometimes one tool has not yet implemented the latest ACT rules, and so on.

Automatic tools are useful but need manual supervision and understanding

I am not saying we should drop automatic tools, I am just saying that their limitations must be understood and that accessibility specialists must supervise the results. And that is only possible with proper training and a good understanding of the different users, the different assistive technologies, and the fact that WCAG is only a set of guidelines and accessibility is much more than that.

Author: Bogdan Cerovac

I am an IAAP certified Web Accessibility Specialist (since 2020) and was a Google certified Mobile Web Specialist.

I work as a digital agency co-owner, web developer and accessibility lead.

Sole entrepreneur behind IDEA-lab Cerovac (Inclusion, Diversity, Equity and Accessibility lab) after work. Check out my Accessibility Services if you want me to help you with digital accessibility.

Also head of the expert council at the Institute for Digital Accessibility A11Y.si (in Slovenian).

Living and working in Norway (🇳🇴), originally from Slovenia (🇸🇮), I love exploring the globe (🌐).

Nurturing the web since 1999, this blog since 2019.
