https://pds.blog.parliament.uk/2018/09/05/analytics-at-pds-alpha-ambiguity/

Analytics at PDS: alpha ambiguity

The performance analysts are a growing team in PDS. Our work involves measuring the scale of the problems online users face and then looking at how effectively we solve them.

One of our regular tasks is tracking online behaviour and then giving product teams and staff in Parliament recommendations based on our insight. It's important that we interpret data in a meaningful way and make sure the findings are understood and questioned further.

One of our aims is to create a more effective way of working, particularly within our product teams. What better way to do this than to challenge some perceptions?

The task

I decided to give the product managers in PDS a task. The purpose was to get them to create a visual representation of how different disciplines, like user research, work through each of the four product phases (discovery, alpha, beta, live).

So I asked them to draw the work (effort or demand) of each member of their product team as a line graph across those phases. I then repeated the activity at a cross-government analytics conference, where 66 other performance analysts took part.

The results

[3 images: line graphs drawn by product managers]

The task showed that all participants (product managers and analysts) created very similar representations of team activity. What I didn't tell them was the purpose of the task.

Although I asked all participants to portray the work of their whole team, my real question was: how do they perceive the work of a performance analyst in each phase?

[3 images: line graphs showing the contribution of analysts during the different development phases]

The results above show that analysts have a high level of activity during discovery and beta, but a large dip in work during alpha.

What a performance analyst should do during alpha

An optimistic reading of this exercise would be that every member of the team puts in consistently high effort throughout each product phase. If that is to hold for analysts, the purpose of analytics needs to be defined in every phase, including alpha.

For analysts, thinking in a quantifiable way should not just mean building dashboards; it should also mean challenging ideas and coming up with solutions such as:

  • integrating offline data with online analytics
  • teaching staff and everyone involved in a project how to read and interpret data
  • identifying users' priority tasks and measuring how well they can complete them (see the sketch below)
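To make that last point concrete, here is a minimal sketch in Python of how an analyst might report completion rates for priority tasks, blending online and offline records. The task names, fields and figures are all hypothetical; they stand in for whatever a team's analytics export and offline research would actually contain.

# Minimal sketch: measuring how well users complete priority tasks.
# All tasks, fields and figures here are hypothetical placeholders.
from collections import defaultdict

# One record per attempt at a priority task, whether captured by
# web analytics ("online") or by phone/survey research ("offline").
attempts = [
    {"task": "find my MP",    "source": "online",  "completed": True},
    {"task": "find my MP",    "source": "online",  "completed": False},
    {"task": "find my MP",    "source": "offline", "completed": True},
    {"task": "follow a bill", "source": "online",  "completed": True},
    {"task": "follow a bill", "source": "offline", "completed": False},
]

def completion_rates(records):
    """Completion rate per task, combining online and offline records."""
    totals, completed = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["task"]] += 1
        completed[r["task"]] += r["completed"]  # True counts as 1
    return {task: completed[task] / totals[task] for task in totals}

for task, rate in sorted(completion_rates(attempts).items()):
    print(f"{task}: {rate:.0%} of attempts completed")

The point is the output: a small set of numbers a product team can question and act on, not a dashboard.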

A performance analyst’s output is insight

I’ve come across many misconceptions about what analysts do and I’ve been asked many times to "do some analytics".

As analysts, we need to establish that our output is not dashboards but insight; analytics is only the process we use to gain it.

As our team continues to grow and adapt, we analysts are working to firmly establish our purpose. To do so, we must explain our insight in a way that's easily understood by our product teams. With that comes more informed decisions and, therefore, smarter design.

Read more about the work we're doing on parliament.uk
