In my last blog post I talked about our technology strategy and how we arrived at a set of principles. I also talked about how we saw our technology developing over the next four years.
We’ve worked hard on raising awareness of the technology strategy and how it links to the digital strategy. I developed and delivered a training course to over 140 people, we gave all staff briefings and we updated our design documents to refer to the principles.
But a strategy with some principles won't make change happen on its own, so we've also decided to look at our technical governance processes.
A necessary evil (or not)
We had a technical governance process that pre-dated the new technology strategy. When I looked more closely at the process I found that it revolved around documents, which had to be "signed off" by the board before a project could carry on.
This approach to governance led to a number of problems:
- as we adopted agile methods, we had services being developed that didn't produce documentation fitting the old model. These services were either falling out of the governance process or being forced to create documentation simply to satisfy it
- the need to produce certain documents in a particular format and to get them signed off was leading to delays
- the model of a design being signed off by a board meant that teams developing a service didn't feel ownership of the design, as it became the board's design
- technical governance was seen as something done by the infrastructure teams but not by our colleagues developing bespoke applications
Everyone I spoke to recognised the importance of having a governance process to make sure that everything fits together.
A better way
We looked at other models of governance to try to find one that would address these problems. The model we decided to follow is based on the Government Digital Service's (GDS) service review process. We particularly liked the governance principles set out by GDS and were determined to follow them ourselves:
1. Don’t slow down delivery
2. Decisions when they are needed, at the right level
3. Do it with the right people
4. Go see for yourself
5. Only do it if it adds value
6. Trust and verify
We had a chat with colleagues at GDS to understand the pros and cons of their approach, and to see whether we should use their process ourselves. It became clear that we could learn lots from GDS. But their process is based around the Service Manual and operates at a higher level than our technology strategy, so we couldn't simply adopt the GDS process wholesale. The model, however, was something we could adopt.
Giving it a go
We’re determined to iterate our process and recognise that we’ll change things as we learn more. To get going, we’ve developed a process based around the discovery/alpha/beta/live service design phases. We’ll carry out a service design review as services reach the alpha-to-beta and beta-to-live transition points.
Our service reviews will be 'light touch', using assessors from within the Digital Service and looking at the ten design principles from the technology strategy. Importantly, the assessors won't be judging the technical merits of the design. We'll trust the team to get that right and create opportunities for technical peer review. The assessors will instead look at how the design addresses the design principles.
To do this work we’ve created a group of service assessors who come from both our infrastructure teams and our development teams. We’ve been working on the process and the types of detailed questions that we might ask during an assessment. It’s been great to bring together these two teams as we don’t get opportunities to work together as often as we’d like.
We're going to start using the process in September and I’ll keep blogging about what we learn.
Read more about the technology work at PDS.