Can you automate accessibility testing?

If you work for a company that takes accessibility seriously and have gone through a few rounds of manual testing for accessibility issues, you’ve probably asked yourself whether you can automate accessibility testing and save yourself some time. It’s not that straightforward.

I’ll kick off this post by answering the question – No. No, you cannot automate accessibility testing.

Hopefully, you’ll stick around as I go into a little bit more detail and dig a little deeper into accessibility testing and what tools, and non-tools, you have access to.

Creating a distinction

So when I refer to accessibility testing in this article, I'm talking specifically about testing a website, or a single page of a website, against the WCAG 2.1 success criteria. To pass "accessibility" here is to meet the criteria outlined by WCAG 2.1.

Obviously, that isn’t really “accessibility” because you could pass the guidelines set out by WCAG but still have users with specific needs struggle to use your website.

True accessibility is just usability that includes everyone, and the only way to have a website that is truly usable by everyone is to do proper user testing. User test as you would with anything else. Don't treat accessibility as a stand-alone issue; just be more inclusive with your builds.

With that said though – this article focuses on accessibility as defined by WCAG and whether testing against it can be automated. Again, it can't, but more words follow.

What can you automate?

Some elements of accessibility testing can be automated to an extent. This automated testing may not flag all issues with a page, though, because automation is still fairly limited.

You can automate some testing against:

  • Elements that are not semantically correct
  • Elements that should be, but aren’t in a specific order (similar to the first point)
  • Text that doesn’t have the correct colour contrast (see the contrast sketch just after this list)
  • Images without specific attributes
  • Form elements without correct accompanying elements (fields without labels)

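Colour contrast is a good example of a check that genuinely can be automated, because WCAG 2.1 defines it with a formula: the contrast ratio is (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the relative luminances of the lighter and darker colours, and level AA requires at least 4.5:1 for normal-size text. Here is a minimal sketch of that calculation in TypeScript; the colour values at the end are just an illustration:

```ts
// Sketch of the WCAG 2.1 contrast-ratio maths; values below are illustrative.
type RGB = [number, number, number]; // 0-255 per channel

// Relative luminance per the WCAG 2.1 definition (sRGB).
function relativeLuminance([r, g, b]: RGB): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05).
function contrastRatio(a: RGB, b: RGB): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// #767676 on white sits right around the 4.5:1 AA threshold for normal text.
const ratio = contrastRatio([0x76, 0x76, 0x76], [0xff, 0xff, 0xff]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? 'passes AA' : 'fails AA');
```
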
Automated tools like Lighthouse or axe can test for the above and give recommendations when something isn’t quite right. These same tools, though, can very easily pass elements that don’t actually meet the guidelines.
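
To give a rough idea of what running one of these tools programmatically looks like, here’s a minimal sketch using the @axe-core/playwright package; the URL and the tag filter are assumptions you’d adjust to your own setup:

```ts
// Sketch: scanning a page with axe-core driven by Playwright.
import { chromium } from 'playwright';
import { AxeBuilder } from '@axe-core/playwright';

async function scan(url: string) {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21aa']) // limit to WCAG 2.x A/AA rules
    .analyze();

  for (const v of results.violations) {
    console.log(`${v.id} (${v.impact ?? 'n/a'}): ${v.help} | ${v.nodes.length} node(s)`);
  }
  await browser.close();
}

scan('https://example.com'); // placeholder URL
```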

Think about images and alternative text. Automated tools can flag an <img> tag without an alt attribute as a fail because the check is binary: the attribute is either there or it isn’t. All images should have the alt attribute, even if the attribute is empty.

However, if an image is needed for context or for understanding the content, it should have a description. Automated tools aren’t yet able to understand the surrounding content and the image well enough to make that judgement.

Automated tools cannot determine whether an image is content or not. Whether the content still makes sense without the image. Whether an image is decorative or not. They just can’t. But as long as that image has an alt attribute it will pass – whether that alt attribute should be empty or carry a proper description.
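
To illustrate just how binary that check is, here’s a sketch of the sort of logic a tool runs in the browser; the console labels are illustrative:

```ts
// Sketch of the binary alt check an automated tool performs: it can see
// whether alt is present, but not whether its value suits the image.
document.querySelectorAll('img').forEach((img) => {
  if (!img.hasAttribute('alt')) {
    console.warn('FAIL: missing alt attribute', img.src);
  } else if (img.getAttribute('alt') === '') {
    // Passes automated checks as "decorative", but only a human can
    // confirm the image really carries no content.
    console.info('PASS (decorative?)', img.src);
  } else {
    // Passes, but the tool cannot judge whether the text describes the image.
    console.info('PASS (alt present)', img.src);
  }
});
```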

Manual testing for the win – maybe not

So with automated tools having their shortcomings, it may sound like manual testing is the only way to succeed.

Although that’s mostly true, it’s not that straightforward either. Manual testing works well if you know what you’re doing, what you’re looking for and how to read the results.

In this context, manual testing refers to going through each WCAG 2.1 criterion and manually checking a page against it for a pass or fail. I’m of the opinion that manual testing can only work if the person doing the testing has enough experience to understand what may cause a pass or a fail.

To manually test each criterion I use my own accessibility spreadsheet to help me. I have one document per website and each document covers multiple pages. I then manually go through each success criterion and ensure the page adheres to it. I have experience in this, so I know what to look out for, but not everyone does, and you need a fairly broad skill set.
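
As a rough illustration of that structure – one verdict per success criterion per page, with one document per website – the data could be sketched like this (the shape and names are illustrative, not my exact spreadsheet):

```ts
// Illustrative shape only; the real document is a spreadsheet, not code.
type Verdict = 'pass' | 'fail' | 'n/a' | 'needs-review';

interface CriterionCheck {
  criterion: string;              // e.g. "1.1.1 Non-text Content"
  level: 'A' | 'AA' | 'AAA';
  verdict: Verdict;
  notes?: string;                 // how it was checked, what was found
}

interface PageAudit {
  url: string;
  checks: CriterionCheck[];
}

// One document per website, each covering multiple pages.
interface SiteAudit {
  site: string;
  pages: PageAudit[];
}
```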

For some of the criteria, that could mean you need experience in HTML to find and understand a failure. You may need a good grasp of the content to judge whether it’s understandable enough. You may need to understand how specialist software works and how those tools interact with pages.

Manual testing isn’t the perfect tool either.

So how do you properly test the accessibility of a site then?

The best way to ensure your website is accessible is to combine automated testing, manual testing and user testing. That combination is the most effective way of catching accessibility issues, and putting all your tools to work is the only way to get reliable usability for every user.

I use multiple tools as part of my testing process including:

  • Lighthouse audits (automated – see the scripting sketch after this list)
  • Axe audits (automated)
  • My own free lighthouse-a11y-audit tool (automated)
  • Manual testing
  • User testing including interviews, workshops and surveys
  • DebugBear (automated)
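
For the Lighthouse entry above, a minimal Node sketch using the official lighthouse and chrome-launcher packages could look like this; the URL is a placeholder:

```ts
// Sketch: running only Lighthouse's accessibility category from Node.
import lighthouse from 'lighthouse';
import { launch } from 'chrome-launcher';

async function accessibilityScore(url: string) {
  const chrome = await launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ['accessibility'],
    output: 'json',
  });
  // Score is 0-1; Lighthouse's UI shows it as a percentage.
  console.log('Accessibility score:', result?.lhr.categories.accessibility.score);
  await chrome.kill();
}

accessibilityScore('https://example.com');
```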

Training and user feedback will help

Creating experiences for people with specific needs requires a lot of different considerations and tools. So along with your arsenal of tools, you should ensure that every single person who can influence the usability of a site has some training in creating accessible websites.

Another thing to consider is adding contact details within your accessibility statement to make it easy for users to report accessibility issues. This can be a useful safety net for the issues you never catch during testing, for whatever reason.

So you cannot automate accessibility testing. I also don’t believe you can rely on manual testing alone. Tackle accessibility with the same process you would anything else and, most importantly, speak to your audience.
