Some findings from my WAI-tools monitoring pilot analysis



Extremely valuable documentation that reveals some interesting points about the future of Web Accessibility Directive monitoring methods and tools, and also about some less clarified reporting matters. Automatic analysis of accessibility statements will most certainly also have a central role, and it is worth following the Accessibility Conformance Testing rules that are the engine of all automatic tools out there.

The European Web Accessibility Directive ((EU) 2016/2102) requires monitoring and reporting on the accessibility of public sector websites and mobile applications. The basis for the evaluations is the harmonised European standard EN 301 549, which implements the Web Content Accessibility Guidelines (WCAG) 2.1 at levels A and AA.

This means the pilot is actually interesting for everybody trying to monitor and report on WCAG compliance, so I decided to go through it and present some findings that I think will interest accessibility specialists worldwide.

At the same time, this is also interesting from the accessibility statement perspective, as the statement must describe all deficits and problems found in our evaluations and needs to be kept up to date as well.

Article 6 of Directive (EU) 2016/2102 defines periodic monitoring in two forms:

  • simplified monitoring, used to detect non-compliance, which covers a limited set of success criteria (in practice, what can be monitored automatically),
  • in-depth monitoring, which must cover all success criteria (and therefore requires manual evaluation as well)

The Norwegian Digitalisation Agency performed the pilot as part of the Web Accessibility Initiative (WAI) Tools project and published its findings on 26 October 2020.

You can read the report for the WAI-Tools pilot monitoring here.

Interesting findings from the report

The report provides some very useful information about testing, methodologies, tools, reports, and even the vendors and software that were used. I will summarize some key points under the following subheadings.

Central role of ACT Rules for automatic and semi-automatic accessibility evaluations

Everybody is trying to automate as much as possible, and that is a good thing – but it is important to agree on the rules first. Accessibility Conformance Testing (ACT) rules are key to automating the majority, if not all, of the tools we have today. From open source to closed source, they play the central role. Next time you run your favorite browser extension or Software as a Service (SaaS) tool, in the cloud or locally, chances are you will be using these rules under the hood.
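To make this concrete, here is a minimal sketch (my own illustration, not code from the pilot or any vendor tool) of the published ACT rule "HTML page has lang attribute", implemented with Python's standard library. A real ACT implementation also handles applicability and more edge cases; this only captures the core expectation:

```python
from html.parser import HTMLParser


class LangAttributeChecker(HTMLParser):
    """Sketch of the ACT rule "HTML page has lang attribute":
    the root <html> element must carry a non-empty lang attribute."""

    def __init__(self):
        super().__init__()
        # ACT outcomes are passed / failed / inapplicable; we start pessimistic.
        self.outcome = "failed"

    def handle_starttag(self, tag, attrs):
        if tag == "html" and self.outcome == "failed":
            lang = dict(attrs).get("lang") or ""
            if lang.strip():
                self.outcome = "passed"


def check_page_lang(html_source: str) -> str:
    checker = LangAttributeChecker()
    checker.feed(html_source)
    return checker.outcome


print(check_page_lang('<html lang="en"><head><title>Hi</title></head></html>'))  # passed
print(check_page_lang("<html><head></head><body></body></html>"))                # failed
```

The real rule additionally checks that the value is a valid language tag, but the pass/fail skeleton above is what every automatic tool implements under the hood.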

The WAI-Tools project has formally published 5 ACT rules so far, but the ambition is to develop 70 test rules. An exciting evolution, and I will certainly follow up on them.

Some details on monitoring process

It can be treated as a sample survey recipe, and I see a lot of common ground with the Website Accessibility Conformance Evaluation Methodology (WCAG-EM), which is not surprising at all. The key points are:

  1. Planning and mapping requirements – crucial for adding value and guaranteeing impact – 13 WCAG 2.1 success criteria were covered by 19 ACT rules, concentrating on high-risk issues,
  2. Selecting representative samples for the tests is not an easy task; the total map of websites is not known, even in the public sector alone – they will probably systematize some registries for this in the future,
  3. Collecting data and running the tests themselves – as we all know, manual checks are very time-consuming and more effort should be put into automating the testing – but they will try to analyze the accessibility statements automatically, which is a very interesting idea,
  4. Analyzing and then reporting – this is the most interesting part, because it is the result in a way, but it is worth noticing that we all need more clarity on how to calculate the level of compliance: overall, per page, per element, or some combination.
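As a rough illustration of that aggregation question, one possible reading is: a page complies with a criterion if no rule reported a failure on it, and the whole website complies only if every tested page does. The data and roll-up logic below are my own sketch, not taken from the report:

```python
# Hypothetical rule results as (page, success_criterion, outcome) triples,
# one per ACT rule run. Pages, criteria, and outcomes are made up.
results = [
    ("https://example.org/", "3.1.1", "passed"),
    ("https://example.org/", "2.4.2", "failed"),
    ("https://example.org/contact", "3.1.1", "passed"),
    ("https://example.org/contact", "2.4.2", "passed"),
]


def page_complies(page, criterion, results):
    """A page complies with a criterion if no rule reported a failure for it."""
    return not any(p == page and c == criterion and o == "failed"
                   for p, c, o in results)


def site_complies(criterion, results):
    """The whole website complies only if every tested page complies."""
    pages = {p for p, _, _ in results}
    return all(page_complies(p, criterion, results) for p in pages)


print(site_complies("3.1.1", results))  # True
print(site_complies("2.4.2", results))  # False (home page failed a 2.4.2 rule)
```

Whether the official method will count this way, or also weigh element-level occurrences, is exactly the clarification the report calls for.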

Some additional details:

  • simplified monitoring – use automatic testing as much as possible to detect non-compliance within a subset of requirements, and aim to cover a wide spectrum of users (no vision, low vision, no color perception, no or limited hearing, no vocal capacity, and so on),
  • simplified monitoring should detect non-compliance, while in-depth monitoring will verify compliance,
  • each WCAG success criterion is checked for compliance at the page level, but for the whole website to comply, all tested pages must comply as well,
  • organizations representing persons with disabilities should be consulted when targeting the sampling,
  • they have also defined what kinds of pages should be tested – home, login, sitemap, contact, help and legal information pages, at least one service page, the accessibility statement itself, the pages offering the accessibility feedback mechanism, pages with a distinct design, and at least one downloadable document – and at least 10% of the sample must be selected randomly,
  • reporting details were discussed – the form and key performance indicators.
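The sampling rules above can be sketched in a few lines. The page inventory, function name, and top-up logic are my own illustration; the report only mandates the page types and the minimum 10% random share:

```python
import math
import random

# Hypothetical inventory of a public sector website's pages.
mandatory = [
    "https://example.org/",                         # home page
    "https://example.org/login",
    "https://example.org/sitemap",
    "https://example.org/contact",
    "https://example.org/help",
    "https://example.org/legal",
    "https://example.org/accessibility-statement",
]
all_pages = mandatory + [f"https://example.org/page-{i}" for i in range(200)]


def build_sample(mandatory, all_pages, random_share=0.10, seed=42):
    """Start from the mandatory page types, then add enough randomly
    chosen pages that at least `random_share` of the sample is random."""
    rng = random.Random(seed)
    sample = list(mandatory)
    # Smallest n with n >= random_share * (len(sample) + n).
    n_random = math.ceil(len(sample) * random_share / (1 - random_share))
    candidates = [p for p in all_pages if p not in sample]
    sample += rng.sample(candidates, n_random)
    return sample


sample = build_sample(mandatory, all_pages)
print(len(sample))  # 8 (7 mandatory pages + 1 random page, i.e. 12.5% random)
```

A real sample would also include service pages, distinct designs, and downloadable documents per the list above; the point is that the random share is topped up after the targeted pages are fixed.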

Tools, rules and success criteria that were used in the pilot

Some ACT rules were developed in the pilot but had not yet been checked against the ACT objectives; they had vendor support. I will not name the vendors here, but two of them are very famous and the third is a university.

The tools used 19 ACT rules, and the goal was to develop 70 by October 2020. I've checked the current situation, and there are now 80 ACT rules on the official website.

13 WCAG success criteria were selected for the pilot:

  • 1.1.1 Non-text Content
  • 1.2.2 Captions (Prerecorded)
  • 1.2.3 Audio Description or Media Alternative (Prerecorded)
  • 1.3.1 Info and Relationships
  • 1.3.4 Orientation
  • 1.3.5 Identify Input Purpose
  • 2.2.1 Timing Adjustable
  • 2.4.2 Page Titled
  • 2.4.4 Link Purpose (In Context)
  • 3.1.1 Language of Page
  • 3.1.2 Language of Parts
  • 4.1.1 Parsing
  • 4.1.2 Name, Role, Value

They suggested that 4 additional success criteria should be added:

  • 1.4.3 Contrast (Minimum)
  • 2.2.2 Pause, Stop, Hide
  • 3.3.1 Error Identification
  • 3.3.2 Labels or Instructions

As the report puts it: "We plan to use the accessibility statements to collect structured data about the public sector bodies, web solutions, area of services, and individual services per entity."

One important fact from the pilot

The tools should be reliable, and their results should be reproducible, based on a documented interpretation of the requirements and, as far as possible, on ACT rules. They should also integrate a crawler, and results should be presented at both the element and the page level. Results should be exportable in a format that is in line with the directive.
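The report does not pin the export format down here, but W3C's Evaluation and Report Language (EARL) is a natural candidate, and a result record carrying both page- and element-level pointers might look roughly like this. All field names and values below are illustrative, loosely modelled on EARL, and not taken from the pilot:

```python
import json

# A minimal, hypothetical result record: one ACT rule outcome for one
# element on one page. Tool name and rule identifier are made up.
result = {
    "assertedBy": "example-checker",        # the evaluating tool (hypothetical)
    "test": "act-rule:html-page-lang",      # illustrative rule identifier
    "subject": {
        "page": "https://example.org/contact",
        "element": "/html[1]",              # element-level pointer (XPath)
    },
    "result": {"outcome": "earl:failed"},
}

print(json.dumps(result, indent=2))
```

With records like this, page-level and element-level views are just two aggregations over the same data, which is exactly what the report asks the tools to support.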

It was not possible to calculate the number of unique pages that failed a specific success criterion, because the reports provided results per test rule.

A single crawler was suggested, I presume so that the different tools get the same source code.

Conclusion

There are some important indications in the report, and I have tried to summarize them in this post. It will be interesting to follow up on the ACT rules, automatic testing, and sampling in the near future.

You can read the report for the WAI-Tools pilot monitoring here.

Author: Bogdan Cerovac

I am an IAAP-certified Web Accessibility Specialist (since 2020) and was a Google-certified Mobile Web Specialist.

I work as a digital agency co-owner, web developer, and accessibility lead.

Sole entrepreneur behind IDEA-lab Cerovac (Inclusion, Diversity, Equity and Accessibility lab) after work. Check out my Accessibility Services if you want me to help you with digital accessibility.

Also head of the expert council at Institute for Digital Accessibility A11Y.si (in Slovenian).

Living and working in Norway (🇳🇴), originally from Slovenia (🇸🇮), and I love exploring the globe (🌐).

Nurturing the web since 1999, and this blog since 2019.

More about me and how to contact me: