When we compare things, we need to know exactly what we are comparing. I have seen how accessibility benchmarks based on automatic tools quickly become misinterpreted and misunderstood. This comes down to a lack of knowledge and understanding, and I want to change that.
The European Commission (EC) just published the 2024 update of the eGovernment Benchmark (opens in new window), and I decided to extract the information about digital accessibility and comment on some important findings. I also need to express some concerns about the wrong interpretations of the state of accessibility that I stumbled upon while reading the reports.
You can read about the first accessibility comparison in my blog post from 2023.
Web accessibility measured in 2024
As last year, web accessibility reporting is still a pilot (the results are not included in the overall eGovernment Benchmark scores used by the Commission).
Updated methodology to WCAG 2.2
The benchmark used an automatic accessibility testing tool – as in 2023, Deque's axe extension. As axe was updated for WCAG 2.2, they dropped success criterion 4.1.1 (which is obsolete and removed in WCAG 2.2, so it is no longer relevant) and used Discernible Button Text (part of success criterion 4.1.2) instead.
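To give a feel for what a check like Discernible Button Text actually tests, here is a deliberately simplified sketch in Python. It is not axe's implementation (axe runs far more sophisticated accessible-name computation in the browser); the class name and the exact failure message are my own illustration. The idea: a button with no visible text and no aria-label or aria-labelledby has no discernible name for assistive technology.

```python
# Simplified illustration of a "discernible button text" check,
# loosely in the spirit of WCAG 4.1.2 (Name, Role, Value).
# NOT axe's actual implementation - a minimal sketch only.
from html.parser import HTMLParser

class ButtonNameChecker(HTMLParser):
    """Flags <button> elements that expose no accessible name:
    no text content and no aria-label / aria-labelledby."""
    def __init__(self):
        super().__init__()
        self.failures = []
        self._in_button = False
        self._has_label = False
        self._text = ""

    def handle_starttag(self, tag, attrs):
        if tag == "button":
            self._in_button = True
            self._text = ""
            attr_map = dict(attrs)
            self._has_label = bool(attr_map.get("aria-label")
                                   or attr_map.get("aria-labelledby"))

    def handle_data(self, data):
        if self._in_button:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "button":
            if not self._has_label and not self._text.strip():
                self.failures.append("button without discernible text")
            self._in_button = False

checker = ButtonNameChecker()
# First button: icon only, empty alt, no label -> fails.
# Second button: aria-label provides the name -> passes.
checker.feed('<button><img src="save.png" alt=""></button>'
             '<button aria-label="Save">💾</button>')
print(checker.failures)  # → ['button without discernible text']
```

A real tool also has to handle alt text on images inside the button, title attributes, visually hidden text and more – which is exactly why automated checks cover only a narrow, code-detectable slice of accessibility.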
Results for 2024 – overall 5 percentage point improvement
The 2023 average was 30% of websites in scope not failing the selected automatic tests; the 2024 report shows 35%. In theory, this is quite an improvement. I will express my concerns about this trend later in the post.
# | Country | Score |
---|---|---|
1 | Norway | 79 |
2 | Finland | 77 |
3 | The Netherlands | 74 |
4 | Sweden | 70 |
5 | Denmark | 68 |
6 | Luxembourg | 52 |
7 | Austria | 49 |
8 | Malta | 48 |
9 | Ireland | 44 |
10 | Spain | 42 |
11 | Poland | 41 |
12 | Hungary | 40 |
13 | France | 37 |
14 | Germany | 35 |
15 | Italy | 32 |
16 | Lithuania | 31 |
17 | Estonia | 27 |
18 | Cyprus | 27 |
19 | Belgium | 26 |
20 | Latvia | 24 |
21 | Czech Republic | 17 |
22 | Slovakia | 17 |
23 | Switzerland | 19 |
24 | Portugal | 15 |
25 | Iceland | 15 |
26 | Bulgaria | 14 |
27 | Slovenia | 12 |
28 | Croatia | 10 |
29 | Greece | 8 |
30 | Romania | 6 |
31 | Montenegro | 4 |
32 | Albania | 2 |
33 | Türkiye | 1 |
34 | North Macedonia | 1 |
35 | Moldova | 1 |
36 | Serbia | 0 |
37 | Ukraine | 0 |
In theory we could compare these scores with the previous ones, but I intentionally did not, due to the slight change in methodology and the fact that we don't have all the data that would make a comparison possible at the success-criterion level.
Please read the full background report (opens in new window) for reference.
Reflections
Is the progress real, or is it just a result of the changed methodology?
In my experience, failing success criterion 4.1.1 was one of the most common accessibility failures, especially when the methodology was limited to automatic accessibility testing tools. 4.1.1 was actually the best-suited WCAG success criterion for automatic testing, as it is entirely about the code – and automatic tools use code to check code.
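The reason 4.1.1 (Parsing) was so tool-friendly is that its classic failures – duplicate id attributes, malformed nesting – are detectable purely from the markup, with no human judgment needed. A hedged sketch of that kind of check (again my own minimal illustration, not any real tool's rule):

```python
# Sketch of a purely code-level check of the kind that made
# WCAG 4.1.1 (Parsing) ideal for automatic tools: flagging
# duplicate id attributes. Not any real tool's implementation.
from collections import Counter
from html.parser import HTMLParser

class DuplicateIdFinder(HTMLParser):
    """Counts every id attribute value seen in the document."""
    def __init__(self):
        super().__init__()
        self.ids = Counter()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id" and value:
                self.ids[value] += 1

finder = DuplicateIdFinder()
finder.feed('<nav id="menu"></nav>'
            '<div id="menu"></div>'      # duplicate id -> 4.1.1 failure
            '<main id="content"></main>')
duplicates = [i for i, n in finder.ids.items() if n > 1]
print(duplicates)  # → ['menu']
```

Checks like this need zero context about the page's meaning, which is exactly why 4.1.1 dominated automated findings – and why removing it from the test set mechanically lifts the scores.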
Dropping 4.1.1 from testing with such tools automatically means we get better results.
So stating that the accessibility situation has improved is not necessarily true. We cannot really know, given that the results are the product of automatic accessibility testing tools that are now a bit less capable.
I would also add metrics based on the Web Accessibility Directive (WAD), which would be a better indicator – the WAD requires countries to manually test a sample of public sector websites and mobile applications – and adding those manual results would paint a much better picture of accessibility. Unfortunately, the results are not available for every year or for the same sites, but they would be extremely valuable complementary information.
Some countries misinterpret the results!
Results of automatic accessibility testing tools need a back story. Before we interpret them, we need to understand the limitations of automatic accessibility testing.
Once again – passing automatic tests only means our code passed a number of specific test cases; our site most probably still has accessibility issues, because full WCAG conformance can only be verified manually (not to mention the EN 301 549 standard, which includes WCAG plus additional requirements).
I invite you to read the report of your country and check what they are reporting.
I must say that I was very disappointed when I saw the following claim in the Slovenian report:
Slovenia’s 75% compliance with the Web Content Accessibility Guidelines (EU average: 77% of EU average) may hinder citizens with reduced accessibility.
Digital Decade Country report 2024 for Slovenia (opens in new window) – a total misinterpretation of accessibility results.
Please understand that I would be very happy if I knew this were the reality. I am also certain that it is not intentional, but a mistake made due to a lack of awareness and knowledge.
There is no reference, and I don't know where the data came from, but I suspect it is a misinterpretation of the automatic testing results.
It is a clear indication that we need more awareness and knowledge at all levels, not just among designers, developers and content creators!
Such misinterpretations may mean that stakeholders will invest less in accessibility. Accessibility may be downgraded to a "we are doing quite well" key performance indicator, instead of the much-needed integral part of digitalization.
I will do my best to inform the authors of this report about the reality, and I hope that with awareness and knowledge our common accessibility efforts will become more widespread and better understood.