I’m sitting in a near pitch-black room with only the glare of two monitors and a window into the adjacent room. In it sit a Government Digital Service (GDS) researcher and a member of the public. As I watch the monitors and occasionally look through the window, my immediate thought is: “what we’ve delivered is good, but how can it be better?”
It’s a user research day in Salford for the Carer’s Allowance Digital Service (CADS), alongside fellow members of the team who have helped deliver the current beta version of CADS. We’re here to observe users completing a prototype version of CADS that the GDS research team has put together to test potential improvements.
The users know they are going to help test a service, but nothing more specific. They have all placed themselves low on the digital inclusion scale: a 1 to 9 rating of internet and computer skills. Each user is shown into a room with a computer and recording equipment to capture their on-screen clicks, their reactions to the prototype’s design choices and their answers to the researcher’s questions.
The improvements we are testing, observing and analysing today are the result of previous research days the team has run. In the observation room, each team member busily scribbles down what we are witnessing, while the GDS researcher offers occasional help in completing CADS and asks pre-agreed questions to help the team gain insight.
Each observation is taken back to the Preston Office Centre, where the CADS team is based. The observations are affinity mapped (Post-it notes placed on a work board) and grouped into themes, which uncover insights into what we’ve seen on the day and the actions we can take to improve the service. Some observations confirm an insight from a previous testing day; these are turned into user stories to be delivered into the CADS beta by the Digital Service Team and our development partner, Valtech.
So far, roughly 60 hours of user testing have gone into the current CADS system: various prototypes tested with many users of different circumstances and IT abilities, to bring together a service that is accessible to all, yet tailored to the individual. The testing has helped reduce completion times for users from 45 minutes to 28 minutes, and seen digital claims rise from 25% to 50% in only a few months. The CADS team are currently working towards live accreditation, which requires the service to meet the 26 points of the GDS service standard, the first of which is “Understand user needs”, something the CADS team are very passionate about.
These days provide a fascinating insight into user needs. Observing users struggle to complete certain sections because of their circumstances, or a lack of confidence in their IT skills, really drives home the message: “what we’ve delivered is good, but how can it be better?”
Comment by Carole Wingfield posted on
Great post. I'm up in Manchester for user testing on 15/7/14 and, as I haven't seen any for a couple of months, I'm really looking forward to it.
Comment by Dean Hogarth posted on
Excellent piece of writing! Brilliantly captures the process and gives an insight into the service users' lives.
Comment by Simon H posted on
I think that's one of the things we should be most proud of, and something that is evident every day: the commitment of the team to keep making things better and better.
Great blog post, well done 🙂