This is part three of a blog series setting out a different measurement approach for public services centred on learning, which will culminate in a discussion paper in the autumn. This approach has been shaped by the practical insights and experiences of a working group of ten pioneering local authorities, part of the Upstream Collaborative, an active learning network of local government innovators convened by Nesta. The working group members are from the following local authorities: Barking & Dagenham, Derbyshire, Gateshead, Huntingdonshire, Kirklees, Leeds, Greater Manchester, Oxford, Redcar & Cleveland, and York. John Burgoyne, from the Centre for Public Impact (CPI), serves as a listener and facilitator for the working group, gathering collective input to shape this blog series and the forthcoming discussion paper.
In this final post of our blog series, we describe some of the methods and tools that can enable measurement for learning. As a working group, we have explored measurement methods that broadly fall into two categories: those that help a council learn internally, from its own staff and teams, and those that help it learn externally, from the residents and communities it serves.
Starting with the first category, we would like to share two measurement methods that enable learning and improvement within local authorities. An important theme across these methods is that they are most effective when a team has a strong culture anchored in a set of values and principles that support openness, collaboration, and learning. If, for example, frontline staff do not feel trusted to share their authentic perspective, including what they feel is not working, it will be difficult for the broader team to learn from their experiences.
When senior leaders in the field embrace these types of measurement methods, they are signalling that they aim to cultivate cultures that support learning and improvement. When staff feel that what they share will not be used for accountability (i.e., they will not be punished or rewarded for it), they are more likely to be open and constructive about their experiences. A baseline level of psychological safety enables staff to speak candidly about both the successes and the failures they observe, and that candour is what leads to learning how to do better.
Learning pods are an internal measurement method for which this psychological safety is critical. The method, inspired by Chris Bolton’s viral blog post on deploying learning and innovation teams in response to COVID-19, pairs up staff who interact directly with residents so they can reflect together on what they have experienced and learned over the past week.
Instead of predetermining what staff will report on, learning pods use a set of open-ended questions, so participants can surface whatever has emerged in a dynamic, adaptive way.
Knowing they will not be punished for what they say, staff feel safe opening up about what they think could have gone better, and those insights are used to inform decision making about how to adapt and improve moving forward. After reflecting on the questions with a learning partner, staff come together for a group discussion to understand perspectives across different pods. In addition to enabling learning, the pods build empathy as staff are exposed to a wide range of perspectives.
Becky Willis, Project Manager at Oxford City Council, who is trialling this approach, has reflected:
"Meaningful measurement could be a big turning point for our council."
Another way to enable learning and improvement is to explicitly check in on the values and principles underpinning your approach to measurement. Our working group has recommended a set of values - trust, authenticity, and curiosity - and principles that flow from them, but we encourage you to develop your own, which you can embody and hold yourselves accountable to.
This method draws inspiration from Confirmation Practices (see a video summarising what they are here), simple routines for systematic reflective practice developed by Andy Brogan at Easier Inc. Our adapted version, which you can find here, simply involves bringing team members together on a regular basis (e.g., every month) to rate and discuss how well the team is living out each principle it has defined.
Similar to the learning pods, the purpose is not to punish the team if ratings are low, but instead to genuinely understand how everyone feels the team is doing so that you can adapt and improve. This approach enables you to reflect on how you are (or aren’t) living out your values as a learning organisation. As Hannah Elliott, Transformation Lead at Kirklees and working group member, has described:
“It is important to articulate how you are a learning organisation and what this looks like at the frontline and strategic leadership level.”
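To make the mechanics of this concrete, here is a minimal sketch in Python of how a team might record and summarise a round of monthly check-in ratings. The principle names, the 1-5 scale, and every identifier are illustrative assumptions for this sketch, not part of the working group’s method.

```python
from dataclasses import dataclass, field
from datetime import date
from statistics import mean

# Hypothetical principles; a team would substitute its own.
PRINCIPLES = ["trust", "authenticity", "curiosity"]

@dataclass
class CheckIn:
    """One team member's monthly reflection: a 1-5 rating per principle."""
    member: str
    when: date
    ratings: dict[str, int] = field(default_factory=dict)

def summarise(check_ins: list[CheckIn]) -> dict[str, float]:
    """Average each principle's rating across the team for one round.

    Low averages are prompts for discussion, never grounds for blame.
    """
    return {
        p: mean(c.ratings[p] for c in check_ins if p in c.ratings)
        for p in PRINCIPLES
    }

# Example round for a two-person team.
round_one = [
    CheckIn("A", date(2020, 8, 1), {"trust": 4, "authenticity": 3, "curiosity": 5}),
    CheckIn("B", date(2020, 8, 1), {"trust": 2, "authenticity": 4, "curiosity": 4}),
]
print(summarise(round_one))  # {'trust': 3.0, 'authenticity': 3.5, 'curiosity': 4.5}
```

Tracked over successive rounds, the same structure would show whether the team feels it is moving closer to, or further from, its stated principles.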
Learning externally is just as important as learning internally, and it requires a different set of methods that draw on the same values and principles. The trust, authenticity, and curiosity you demonstrate among team members should be extended to the people who live in the communities you serve.
Whatever measurement tool or method you use to learn about residents and the complex problems they face, it is important to understand historical context and create the conditions for people to feel safe and comfortable opening up and sharing. For example, many in local government are currently trying to better understand the experiences and challenges Black communities face so they can improve services to better meet those communities’ needs. To seek out and hear these truths, it is critical that councils understand how racism has shaped, and continues to shape, Black communities’ relationship with government, and what this means for how we enable greater autonomy and ownership for the people sharing stories and insights.
One method that gives participants ownership is the storytelling evaluation methodology developed by Arts at the Old Fire Station (AOFS) in Oxford. Instead of predefining outcomes to measure against, the storytelling evaluation methodology lets people identify which outcomes matter to them and decide how they want to talk about them. Compared with traditional attempts to evaluate impact, this method does a far better job of meaningfully capturing the complexity of individuals’ lives, and it is a fun, enjoyable process for all involved.
Sarah Cassidy, the Inclusion Manager at AOFS, says that previous attempts at measuring impact meant asking people questions that actually got in the way of the work and damaged the relationship between those evaluating and those being evaluated. The creative storytelling method, by contrast, gives people ownership of their own stories, captures what matters to them in their own words, and strengthens rather than strains that relationship.
Sarah and the team at AOFS are working with Oxford Hub to collect and analyse the stories of people involved in Oxford’s community response to Covid-19, which will be shared later this year. Sara Fernandez, who leads Oxford Hub and is a working group member, reflects on the value of this method: “You cannot really measure the quality of relationships in the community unless you are capturing stories.”
To bring stories into your measurement work, remember an important point shared by Alexis Pala, who practises meaningful measurement and innovation at Y Lab, drawing inspiration from Dr Emma Blomkamp:
“Every story has a number and every number has a story.”
Next time you are evaluating quantitative data, it is worth considering how to capture the story behind the numbers.
At Huntingdonshire District Council, Oliver Morley, Corporate Director (People) and one of our working group members, has been working with Claudia Deeth, Community Protection and Enforcement Officer, and a range of partners from across the system to create Life Journey Maps.
To create the maps, the council and partners start by providing “meaningful conversations” training (based on the Making Every Contact Count training commonly used in health) to staff, designed to help them have conversations with residents that get at the root cause of an issue, listen for keywords, and deeply understand residents’ perspectives and where they could use help. Trained staff then proactively reach out to residents (e.g., those who have missed a council tax payment) to have a meaningful conversation and identify key moments in residents’ lives, their perception of those events, and the impact the events had on them.
From these conversations, staff create a visual Life Journey Map that plots life events and quantifies the level of risk associated with each one, using tools such as the Holmes-Rahe stress scale, which assigns a weighted stress score to each type of life event. Adding together the scores for each event gives an individual’s total “risk” score: the more traumatic events a person has experienced, the more susceptible they are likely to be to a crisis (e.g., homelessness, mental illness, arrest). Each event is also associated with a cost to various systems within local government (e.g., police, social care, education).
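To illustrate the arithmetic behind the map, here is a minimal sketch in Python, assuming a simple event structure. The event labels, cost figures, and every function name are hypothetical; only the idea of summing per-event weights and attributing costs to systems comes from the method described above, and the example weights loosely follow the published Holmes-Rahe scale (e.g., divorce = 73, losing a job = 47).

```python
from dataclasses import dataclass

@dataclass
class LifeEvent:
    """One event on a Life Journey Map (illustrative structure)."""
    label: str
    year: int
    stress_score: int             # Holmes-Rahe-style weight for the event
    system_costs: dict[str, int]  # assumed cost to each local service, in GBP

def total_risk(events: list[LifeEvent]) -> int:
    """Sum the per-event stress weights into a single 'risk' score."""
    return sum(e.stress_score for e in events)

def cost_by_system(events: list[LifeEvent]) -> dict[str, int]:
    """Aggregate the assumed costs across events, per local government system."""
    totals: dict[str, int] = {}
    for e in events:
        for system, cost in e.system_costs.items():
            totals[system] = totals.get(system, 0) + cost
    return totals

# A hypothetical journey; the events and costs are invented for illustration.
journey = [
    LifeEvent("divorce", 2016, 73, {"social care": 1200}),
    LifeEvent("job loss", 2018, 47, {"housing": 800}),
    LifeEvent("eviction notice", 2019, 25, {"housing": 1500, "police": 300}),
]

print(total_risk(journey))      # 145: higher totals suggest greater susceptibility to crisis
print(cost_by_system(journey))  # {'social care': 1200, 'housing': 2300, 'police': 300}
```

In the real maps, of course, these numbers sit alongside the qualitative insight from the meaningful conversations; the total is a prompt for earlier support, not a verdict on a person.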
See a detailed version of the map here.
The Life Journey Map combines quantitative data points with qualitative insights to provide a holistic view of key points in a person’s life, helping staff to understand an individual’s needs, spot rising risk before it becomes a crisis, and make the case for earlier intervention across the system. As the team behind the maps puts it:
“Bad things do happen, but most of the time the signs are there. Quantifying, preventing, and addressing risks reduces poor outcomes.”
We are interested in building this tool out and testing it with other local authorities across the UK. If you would like to be involved in this effort, get in touch.
These measurement methods and tools are part of a bigger movement we are seeing in local government across the UK. In response to COVID-19, many of the traditional forms of measuring impact - people in positions of power setting and managing targets for others to hit - suddenly disappeared overnight: charitable foundations suspended performance targets, the Care Quality Commission and Ofsted suspended their inspection regimes, and NHS England turned off all of its financial targets for the year.
With this top-down approach to measurement receding, our working group is proposing an alternative approach focused on learning. Measurement for learning does not mean removing the metrics and accountability that regulatory bodies provide, but rather complementing quantitative data with qualitative insights and giving those closest to the problems the power to define what they want government to be accountable for. We believe this approach can fundamentally change how authorities approach their work, enabling them to cultivate better relationships with residents and ultimately improve their services.
To make the case for why we need this approach and how it is already coming to life, we have published this blog series and will publish a discussion paper in the autumn that expands on these ideas and shares additional examples and methods. Follow along on Twitter at #MeasurementforLearning. We are excited to see how our thinking can contribute to the growing conversations and changes taking place at other levels of the system.
The Measurement for Learning report will be part of a series of learning products exploring the new operating models emerging in local government – how they work, what they look like, and the key features needed to replicate success elsewhere. It draws on the experience of the twenty pioneering local authorities participating in the Upstream Collaborative, which has been led by Nesta in partnership with Collaborate CIC. The full package of reports will launch in September 2020.