What is it they say about ‘doing the same things but expecting different results’? 

I was reminded of this when I was reading the conclusions and recommendations our evaluation team were making for a recent national evaluation of a one-year funding scheme targeted at a clinical speciality in the NHS. There were, of course, nuances - but I had definitely read similar things before. 

This is not to criticise the originality of our report writing or suggest we lack creativity in our vocabulary; I’m not planning on burning my career down with this post! And if we are presented with the same or similar findings from one evaluation to the next, it is our responsibility to represent these accurately and objectively.

But, at some point, you have to ask why. Why are the same themes emerging? And what can we do to change this?

It’s not that our clients aren’t listening, don’t agree with our evaluation findings, or aren’t committed to acting on them. As healthcare evaluators - and part of the NHS - we are fortunate to work on some of the most interesting subjects in the sector, often at the forefront of innovation. The varied clients with whom we work are dedicated and passionate about making a difference. We always test our conclusions and recommendations with various stakeholders before finalising them, ensuring they are relevant and workable.

So, what’s going on here? Taking a step back, I think the issue is systemic. Lessons from evaluation are not being synthesised and aggregated. They are being treated as entirely unique to the context in which they were written. This is rarely the case; typically, our findings have wider applicability.

What’s to be done? There is currently no apparatus in healthcare for collating evaluation findings, pulling out the common threads and communicating them to key decision-makers across the NHS. This feels like an essential task: synthesising what we already know before taking on more primary research.

So we’ve decided to make a start. We have looked across our evaluation reports from many years and consolidated the lessons into a series of short, action-focused reports on specific themes. Each is just a few pages long.

The first report looks at short-term funding in the NHS. We have evaluated many national funding schemes that provide one-off payments to services within a clinical speciality in the hope of bringing about improvement in one form or another. Expectations for these schemes start high. But reality often falls short - and often for predictable reasons: delays getting the funding out; lack of support (or knowledge of what is happening) from the local decision-makers needed to sustain any gains made; and poor-quality data to demonstrate impact. Here we suggest ways of designing and delivering short-term funding schemes that extract better value for money.

As part of this series, our team will also provide posts, and supporting workshops, focused on:

  • How to design programmes to support evaluation of their impact
  • Designing and delivering process evaluations that answer the right questions

As part of the NHS, we feel a responsibility to share what we have learnt. We believe in the improving power of evidence and knowledge. We want to help programmes prepare for the challenges they are likely to face, to know how to avoid them – and, crucially, how to understand and respond to what has happened.

Crude research (me asking colleagues) shows that our evaluation team has more than a century of collective experience of leading major healthcare evaluations. If our team were a person, they would have been around since before the Labour Party got its first sniff of government (a short-lived minority affair in 1924). We have seen a lot between us, and we are committed to sharing it.

There would be many benefits to aggregating lessons over time: smoother programmes, money saved, care and outcomes improved. And one (admittedly selfish) hope of mine is that it also leads to some fresh conclusions and recommendations for me to read. Variety is, after all, the spice of life.