We’ve learnt a lot about assessing services over the last few months. James Kemp, acting Service Manager for passport appointment booking at HM Passport Office, attended one of the early assessments and we asked him to write about the experience.
As ours was one of the very first services to go through the assessment process, it wasn't entirely clear what I needed to provide in the way of evidence, or how strictly we would be held to the published criteria.
So, in the run-up to the panel, I read through the Government Service Design Manual (which I strongly recommend) and the 26 service standards, making notes against each one on how well I thought we met it and what would demonstrate that. I thought we met the vast majority, or had action in hand to meet them within a week or two. There were about five that I thought we might struggle with; I hoped that would be sufficient for the service assessment.
The Service Assessment
The service assessment was relatively informal, with three GDS people and myself. I was initially asked to give a brief overview of the online appointment booking service and to confirm the details I’d given in the updated GDS Proposition.
Once I had done that we went through each of the standards. There was a particular emphasis on user needs and showing that we’d built the service to meet them.
While we’d decided to provide online appointment booking in response to our customer insight material, we’d tested the service on a wide range of people within HM Passport Office and our suppliers. We hadn’t actually tested it on ‘external’ customers.
This, it turned out, wasn’t quite what was expected of us. While we’re all passport customers outside of work, our own staff aren’t entirely typical of members of the public. They tended to understand what was needed to complete the appointment booking.
I was then asked to demonstrate our service for the panel. I hadn't been expecting this and so had nothing prepared; however, I was able to access the version our SME had set up for testing. This walkthrough picked up some usability issues that we hadn't previously encountered, but it was done in a friendly and helpful manner.
After the panel finished I was introduced to a designer, Ed Horsford, who spent a couple of hours showing me some examples of recent work, going through our service, and giving me some very constructive comments on how we could improve it. I got some very useful pointers on how to do user testing, on delivering continuous improvement post-launch, and on using the performance platform to get information about how our service was being used.
The experience was valuable both for me and for GDS.
Since the meeting they’ve created email templates giving clearer information about what is required and how the assessment process will be run. They now also provide assessment materials such as prompts and blank checklists in advance of the panel, so services can self-assess before meeting the GDS team. My colleagues working on other digital services have found these materials and updated guidance to be very useful.
For my part, I took a few things away from the assessment:
- don’t underestimate the focus on users
- engage early and often with GDS; they are friendly and want to help
- GDS service assessments come when you are ready to move to the next step and cannot be done in parallel with other approvals
- design good services that work digitally (not just a web front end)