How do you use the data from routine evaluation to reflect on the experience of your clients and develop the way you deliver a service? This two-part blog series describes one service’s journey over some seven years, and concludes with this second blog and the opening of Pandora’s box
In the first blog in this two-part series I outlined the early part of our journey, during which we implemented the CORE System, struggled hugely with the quality of our data, and eventually coaxed some encouraging improvement data from the mountains of paper we accumulated along the way.
Here, I describe the adoption of the PC-based software that transformed our relationship with our data and, in some ways, opened Pandora’s box.
2001: the arrival of CORE PC
If the truth of the early years of our use of the CORE System was that we never truly understood its potential, the years from 2001 were quite the opposite. One simple piece of software changed everything. We quickly began to realise the power in our data, and at times that realisation was very scary indeed.
When CORE PC is launched we are one of the early implementers. It’s a no-brainer. We are paying £6 for every set of client forms that passes through PTRC’s monster batch scanner. An annual licence for CORE PC for 100 clients costs £200. We have to take care of the data input ourselves, but as soon as it is input, it is available for analysis. Not only can we analyse data for clients whose cases are closed, but also for clients while they are in progress.
We take out a licence for CORE PC and begin to travel hopefully. At this point no-one seems to have a road map.

2002 – 2005: In which CORE PC turns into Pandora’s Box
In Greek mythology, after the god Zeus has created Pandora, he gives her a jar, telling her never, ever to open it. Pandora, ever curious, cannot resist knowing its contents and opens the jar. As she does, out flies every ill that has befallen humanity since. Pandora manages to trap just one remaining spirit in the jar – that of hope.
And so it is that we take charge of our own data for the first time. I slowly learn how to use CORE PC’s filters and reports to extract ever more value from our data. I learn that improvement rates, important though they are, are only part of a wider picture of service quality.
I learn that there is a relationship between the proportion of clients that we accept into therapy, and those that will eventually complete. I learn that clients who start therapy in the ‘healthy’ range on the CORE Outcome Measure are four times more likely to show deterioration than improvement. I learn to think of therapy as a journey with fixed points along the way, at any of which the client may disengage.
I start to understand that by simply focusing on improvement rates I am in danger of overlooking all those clients who don’t complete therapy, and whose experiences we also need to understand. Put simply, which is better? That only 50% of my clients reach a planned ending but 80% of those show improvement, or that 80% reach a planned ending and 50% show improvement?
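As a rough illustration (my own sketch, not something taken from the original service reports), multiplying the two rates together shows why the headline improvement figure on its own can mislead:

```python
# Hypothetical sketch: the share of ALL clients who start therapy and go on to
# improve is the planned-ending rate multiplied by the improvement rate among
# those who reach a planned ending.

def overall_improvement(planned_ending_rate: float, improvement_rate: float) -> float:
    """Fraction of all starting clients who both complete therapy and improve."""
    return planned_ending_rate * improvement_rate

scenario_a = overall_improvement(0.50, 0.80)  # 50% complete; 80% of completers improve
scenario_b = overall_improvement(0.80, 0.50)  # 80% complete; 50% of completers improve

print(f"Scenario A: {scenario_a:.0%} of all clients improve")  # -> 40%
print(f"Scenario B: {scenario_b:.0%} of all clients improve")  # -> 40%
```

Both scenarios deliver improvement to the same 40% of everyone who walked through the door, yet the first can quote an improvement rate of 80% while saying nothing about the half of clients who never reached a planned ending.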
It is during this period that we opt in to CORE PC’s optional appraisal function – better known as the ‘scary button’. One element of this function shows the service’s performance on a set of key benchmarks against that of a range of national services. It allows us to see that while our improvement rates are higher than the average, so too are our rates of unplanned endings, and by some distance.
While we know that therapies appear to be broadly equivalent in their outcome, we are beginning to understand that services are anything but.

We believe we can improve, so we start challenging ourselves to do better in the next period than we did in the last. Over a period of three or so years we begin to build improvements in key performance indicators such as unplanned endings, session attendance and improvement rates into our annual business planning cycle. We identify what we believe we can improve, and what we believe it will require. We reflect, we set objectives, we implement, we measure, and a year later we do it all again. Over the three years between 2002 and 2004, we halve our rate of unplanned endings and push up our rate of clinical and/or reliable improvement from an already respectable 79% to 85%.
This is not the end of the story, however. The second key element of the appraisal function profiles the relative performance of each therapist, myself included, against the service average. Truly, this feels like opening Pandora’s box. Once open, what’s inside cannot be put back. Once I know what our respective contributions are to our overall service data, I can never un-know it. If it highlights issues that I should be concerned about, then I will have a responsibility to act.
I feel I am crossing into unknown territory, into an area formerly boundaried by the external supervisory relationship. But it is part of my nature to take things apart to see how they work. It is part way through 2004, and I am reporting on the service’s 2003 performance. I am gratified to see that our rate of unplanned endings has reduced from 2002’s level of 31% to 25%.
A less welcome discovery comes when I run the ‘scary button’, however. Against the 25% for the service, my own rate of unplanned endings is 43%, the highest in the service. Against a backdrop of conflict with two team members and uncertainty over the service’s future, I have taken my eye off this ball. Pandora’s Box has delivered me a harsh but necessary lesson.
Over the years 2002 – 2004 the service’s rate of unplanned endings halved from 31% to 16%.

What did we learn?
At the point at which I left the RCN in October 2005, 84% of the service’s clients who progressed into therapy reached a planned ending. Of those, 85% achieved a clinical and/or reliable improvement in their levels of distress. This placed us in the top 10% of services using the CORE System at the time (the red target symbol in the diagram below represents where the RCN service sits relative to others). Looking back, I’m still incredibly proud of what my team and I achieved.

As I said in the first part of this blog series, this didn’t happen by chance. Rather, it was the result of a systematic and ongoing process of measuring key performance indicators, internal and external benchmarking, identifying opportunities for improvement, acting on those opportunities and, once again, measuring the results. I was also fortunate in having a hugely skilled and committed team that bought into the process.

Perhaps more than anything, however, the learning that I have taken from this time and all my experience since is this – that the story of how clients experience a service is written in its data, if you take the time to learn how to read it. The story of my clients’ experience of me in 2003 was written into my own data. It just took me a while before I was able to read it. Now methodology exists that brings feedback much more immediately into our therapy rooms, if we are willing to engage with it.
You can read the first part of this two-part series here.