In this blog post I go into the details behind automatic accessibility testing and explain why I don't really trust the accessibility scores such tools provide.
It all boils down to the inability of automatic tools to pass WCAG success criteria, and their limited ability to fail some. Manual testing is the only way to really know the state of accessibility.
Category: Accessibility testing
Latest posts:
I attended a web performance conference (performance.now()) and came back with some thoughts about its similarities with accessibility. I also made a simple proof of concept for a time-to-interactive metric for screen-readers and other assistive technologies.
Before you order your accessibility audit, you should read this article of mine. I try to be objective and constructive, presenting the good and the bad – the strengths and the weaknesses – of accessibility audits.
The journey from content creator to end user is quite long, at least in terms of the different pieces of software that need to deliver the content. And as we all know, software has bugs – and sometimes even so-called features that could just as well be called bugs. So please test, and if you find a problem, report it, so that we improve accessibility one step at a time.
Some reflections after two years. Progress is slow but steady. Personally, I would invest in automatic testing solutions to monitor some basics, but that's probably not possible before stakeholders, politicians and organizations really understand the impact.
Website owners are responsible for the third-party widgets, plugins and other components they use. Before using them, they should check that they conform to WCAG; otherwise their own site will not conform either. Checking for the third party's accessibility statement may also reveal huge problems with the product's future.
Finding errors and failures is quite simple. Finding their solutions, not so much. In my opinion, audits should provide specific solutions that are not vague and are fully actionable. Otherwise we need to call in other experts to translate them.
We all reach for third-party solutions, and we like it when they claim to be accessible. But please don't just believe them – check that they really conform. And when they update, check again.
Sometimes it's simple to build a feature with JavaScript, but not so simple to make it consistent across all those screen-reader and browser combinations. In this post I describe how I tried to update some live regions and the order in the DOM was not respected. The solution was simple, but it's easy to forget about it when everything works visually.
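The general pattern behind this kind of issue can be sketched as follows. The markup below is hypothetical and the fix in the post may differ in its details; the sketch only illustrates the common pitfall that screen-readers announce live-region updates in the order the updates happen, not in the order the regions appear in the DOM:

```html
<!-- Hypothetical sketch: both live regions exist in the DOM from page
     load (freshly inserted aria-live elements are often not announced
     at all). Because announcements follow update order rather than DOM
     order, scripts should update the regions in the order they should
     be read, e.g. #summary first, then #details. -->
<div id="summary" aria-live="polite"></div>
<div id="details" aria-live="polite"></div>
```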
It was not clear to me whether WCAG 4.1.3 can be applied to native mobile applications, on both iOS and Android. So I did some research and came to the conclusion that we can, should – or even must – use status messages in native mobile apps as well.
What are the most critical requirements for testing native mobile accessibility? What do we need to have for testing? Should we only test with physical phones? What about different operating system versions? This post will give you some basic hints.
Building on the first Slovenian Accessibility Awareness Day and on the first official Web Accessibility Directive reports from the Slovenian public sector, I put together the Second Slovenian Accessibility Awareness Day. Still and always a work in progress, but please read the post – and then, if you wish, the reports – to get some clues about the state of accessibility in Slovenia.
Some improvements can be detected, and I have also added some thoughts of mine about the parts that are not very obvious. Interestingly, e-commerce is almost the worst – a real surprise when we think about how much they invest in ads and SEO just to get new users.
How do you test for something that is only possible under certain conditions? Well, the best way to do this kind of testing is to ask the developers and others who were involved in the feature specifications.
Some accessibility issues originate in code. When a design is recreated in code it may seem to work, but when we think about accessibility we may notice that it only works for some users and not for others. I've decided to describe some common accessibility failures that fall on developers.
I was asked if I could issue a "WCAG certificate" for a website, so I decided to investigate what that would actually mean. We all know that sites and mobile apps are constantly evolving and changing, and even if they conform to WCAG today, they may not the following day. What, then, would it mean to issue a WCAG certificate and still be ethical and do the right thing?
What would I want from my Accessibility-as-a-Service provider? What would be the ideal here, when we know that automatic testing is absolutely not enough? We must also get people as part of the service – accessibility specialists and people with disabilities. And when used from start to end, such a service is far more efficient than one brought in only at the end.
An external agency performed an accessibility audit and provided a lot of possible solutions. In this post I try to make it easier to act on such an audit: breaking the results into responsibilities, then prioritizing the issues, and finally estimating and fixing them can be one way of doing so.
European authorities published accessibility reports from multiple EU countries, and I decided to read all of them and write a short summary with my personal comments. A lot can be learned from this first round of auditing, and there is a lot that can and needs to be improved throughout Europe.
Claiming that our product is accessible takes more than a WCAG audit that did not discover any failures. Real users – people with disabilities – are the only ones who can really reflect on the accessibility of our products. That's why we should include them in all reasonable parts of our production processes. Otherwise we may think we deliver accessibility when the truth is the opposite.
Automatic testing of software is brilliant. It saves a lot of time and effort, catches problems early and makes our products better. But when trying to automatically test accessibility, we need to know about the challenges and problems beforehand. Some tools may even produce wrong results, and some may report that everything is perfect when they can only test up to a third of the criteria.
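To make automatic results easier to reason about, they can be triaged by severity. The sketch below assumes a results object shaped like axe-core's output (a `violations` array whose entries carry an `impact` level); the sample data is made up purely for the demo:

```javascript
// Group a hypothetical axe-core-style results object by impact level.
// Remember: an empty violations list does NOT mean the page is accessible,
// since automatic tools cover only a portion of the WCAG criteria.
function summarizeViolations(results) {
  const counts = {};
  for (const v of results.violations) {
    counts[v.impact] = (counts[v.impact] || 0) + 1;
  }
  return counts;
}

// Fabricated sample input, for illustration only.
const sampleResults = {
  violations: [
    { id: 'image-alt', impact: 'critical' },
    { id: 'color-contrast', impact: 'serious' },
    { id: 'region', impact: 'moderate' },
    { id: 'label', impact: 'critical' },
  ],
};

console.log(summarizeViolations(sampleResults));
// { critical: 2, serious: 1, moderate: 1 }
```

A summary like this helps decide what to fix first, but manual testing still has to cover everything the tool cannot see.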
HTML's semantics and assistive technology support are far better than PDF's. If you are a Mac user who needs a screen-reader, you may be forced to experience the missing semantics of even the most accessible PDFs. Maybe it is time to move more PDF documents to HTML?
Where to start as a developer or designer wanting to test with a screen-reader? With the basics, of course – and maybe mobile first. But do not underestimate real users – they might surprise you.
Every (front-end) developer should add a screen-reader to their toolbox. Screen-reader experience can really help us make products more accessible and make us better at coding. Combinations of screen-readers and browsers can get overly complicated, so it is important to consider whether the code we write is supported for the majority.
I organized an accessibility workshop for our front-end and full-stack developers, user interface and user experience designers, and others involved in digital production. This post concentrates on how screen-reader (SR) users navigate, because it may surprise non-screen-reader users quite a lot.
The 20th of May 2021 is Global Accessibility Awareness Day, and I wanted to contribute by analyzing the state of web accessibility on Slovenian web pages. We can therefore call it the first ever contribution to a Slovenian Accessibility Awareness Day.
Some reflections on the WebAIM Million annual accessibility analysis. There is some improvement, but we all need more empathy and knowledge.
The axe-hackathon video is now live, and I decided to transcribe the questions and answers from my presentation of aXeSiA and publish the feedback I got from the committee as well.
Automatic tests can help a bit. The WCAG Evaluation Methodology provides a good starting point for test focus. And if we add page popularity scoring and simple page complexity scoring, we can really focus our manual testing efforts on the potentially difficult pages.
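The combination described above could be sketched roughly like this. The field names, weights and the simple multiplication are my own assumptions for illustration, not the post's actual formula:

```javascript
// Rank pages for manual testing by combining popularity (e.g. page views)
// with a simple complexity score (e.g. number of interactive elements).
// Multiplying the two is an illustrative assumption; any monotonic
// combination of the signals would serve the same purpose.
function rankPagesForManualTesting(pages) {
  return pages
    .map((p) => ({ ...p, priority: p.popularity * p.complexity }))
    .sort((a, b) => b.priority - a.priority);
}

// Hypothetical pages with made-up scores.
const pages = [
  { url: '/home', popularity: 1000, complexity: 5 },
  { url: '/checkout', popularity: 300, complexity: 40 },
  { url: '/about', popularity: 200, complexity: 2 },
];

console.log(rankPagesForManualTesting(pages).map((p) => p.url));
// [ '/checkout', '/home', '/about' ]
```

A complex but heavily used page like a checkout flow rises to the top even when a simpler page gets more raw traffic, which is exactly where scarce manual testing time pays off most.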
Different automatic tools can produce different results. Accessibility conformance testing (ACT) rules help a bit, but there are still potential differences. And, again, automatic tools only cover a portion of the WCAG success criteria, so please do test manually and with users to really make your products accessible.