
What we learned while improving accessibility

Front-end developer, Jeremy Anderson

Here in the Department for Work and Pensions (DWP) we're building services that should be fully accessible to everyone. As a front-end developer for Universal Credit, I'm part of the team helping to make this goal a reality.

Right from the time I was interviewed for the job, it was clear that providing a high level of accessibility would be a key part of the role.

One of my first tasks was to talk to a group of charities about the accessibility needs of the different people they each represent. These included:

  • Dyslexia Action
  • Mencap
  • Royal National Institute of Blind People (RNIB)
  • Scope

We were then able to put in place an appropriate plan to test the digital service and improve accessibility.

Accessibility audits

We worked with RNIB to conduct an accessibility audit of our digital service. They provided two (sighted) participants to do the testing.

RNIB were a great learning resource, and one of the most important things to come from this was the dialogue between them and the Universal Credit development team.

They initially came in for a week and drew up a list of changes on a huge spreadsheet, with the intention of returning to check the fixes once we had implemented them. We also discussed with them how to address certain issues, or how to approach new interactions within the service, and fed this back to our design team. While most accessibility changes had to be made to our HTML (usually adding missing ARIA attributes), some problems were a direct result of the way certain interactions had been designed.
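
Most of these markup fixes follow a handful of recurring patterns. As a hypothetical sketch (the post doesn't show the actual code), one of the most common is an icon-only control with no accessible name, which a screen reader can only announce as "button":

```javascript
// Hypothetical before/after markup for a common audit finding: an icon-only
// button has no text content, so screen readers can't announce its purpose.
const before = '<button><span class="icon icon-search"></span></button>';

// Fix: give the control an accessible name with aria-label, and hide the
// purely decorative icon from assistive technology with aria-hidden.
const after =
  '<button aria-label="Search">' +
    '<span class="icon icon-search" aria-hidden="true"></span>' +
  '</button>';

console.log(after);
```

A screen reader announces the fixed version as "Search, button", while the decorative icon is skipped entirely.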

We learned that the earlier you involve designers and researchers with accessibility testing, the more time you will save later when delivering an accessible service.

Usability testing

We're aiming for Level AA conformance with the Web Content Accessibility Guidelines (WCAG), published by the W3C, which address the biggest and most common barriers for disabled users. We aim to meet all the Level AA criteria, with as much AAA conformance as possible. But accessibility isn't just a tick-box exercise. Achieving an AA rating is a simple matter of pass or fail, and that is what an accessibility audit is for.

However, there is much more to learn about accessibility by testing a service with real people.


The user research and design team were already conducting usability testing, so to extend this work we started to test the service with DWP employees who had special access requirements. Initially three people tested the service for us.

Two of them were visually impaired and used the JAWS screen reader along with the ZoomText and SuperNova magnification software. The third had repetitive strain injury (RSI) and preferred to navigate using Dragon voice recognition software.

We have learned a lot from usability testing. This less formal, but arguably more practical, testing is a great way to check that your site not only meets the AA standard but actually helps people navigate and use the service, and it has led us to improve individual user interactions.

The exclamation icon on this section of the Universal Credit digital form has semantic importance, and the supporting code contains extra information for screen readers.
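
The post doesn't show the supporting code itself, but a common way to give an icon like this semantic weight is to hide the visual glyph from assistive technology and supply the extra information as visually hidden text (the `visually-hidden` class name here is an assumption; GOV.UK-style stylesheets typically provide one):

```javascript
// Hypothetical sketch: the exclamation glyph is decorative, so it is hidden
// from assistive technology, and the meaning it conveys is duplicated in
// text that is visually hidden but still announced by screen readers.
const warningSection =
  '<span class="icon icon-exclamation" aria-hidden="true"></span>' +
  '<span class="visually-hidden">Important: </span>' +
  'Tell us straight away if your circumstances change.';

console.log(warningSection);
```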

We learned that not everyone uses their own software in the same way, and usability testing certainly proves this. We now know it’s important that we continue to test the service in this way, as well as using automated testing.

The technical bit - automated testing

Although we built reusable components, which should in theory make good accessibility practice easier to maintain, the knowledge involved is so specialised that we realised we needed a form of automated testing.

We looked at various ways of testing the website, including standalone applications and scripts that could be added to our existing testing suites.

In the end, we decided on a JavaScript-based approach, using a publicly available script on GitHub. The reasons for this were:

1) This would allow us to change the parameters of the testing to suit our requirements.

2) We could add the script to the DWP git account and give everyone access to it, which would standardise our approach.

3) It could be added to our existing testing schedule, making use of our existing framework.

Because of our testing workload, we added it to our overnight test runs, although we'll review this as we go forward.
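
The post doesn't name the script we settled on, but as a rough illustration, an automated accessibility checker is essentially a set of rules run against the markup. This toy example (not the actual DWP script) implements one such rule, flagging images with no `alt` attribute:

```javascript
// Toy illustration of one rule an automated accessibility checker applies:
// every <img> needs an alt attribute (WCAG success criterion 1.1.1).
// A real checker would typically inspect the live DOM in a browser rather
// than raw HTML strings.
function findImagesMissingAlt(html) {
  const images = html.match(/<img\b[^>]*>/gi) || [];
  return images.filter((tag) => !/\balt\s*=/i.test(tag));
}

const page =
  '<img src="logo.png" alt="Universal Credit">' +
  '<img src="divider.png">';

console.log(findImagesMissingAlt(page)); // → [ '<img src="divider.png">' ]
```

Running rules like this against every overnight build catches regressions without a specialist having to re-audit each change by hand.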

Building a better service for everyone

Improving accessibility is an important part of how we work in DWP. It's all about building a better service that works well for everyone.
