How do you use the data from routine evaluation to reflect on the experience of your clients and develop the way you deliver a service? This two-part blog describes one service’s journey over some seven years. This is how it all started.
Back in the day, I managed one of the top performing therapy services in the UK. I feel I’ve earned the right to make that claim. When I left the Royal College of Nursing (RCN) Counselling Service, now more than ten years ago, our CORE System evaluation data showed that 84% of the service’s clients who progressed into therapy reached a planned ending. Of those, 85% achieved a clinical and/or reliable improvement in their levels of distress.
That level of performance didn’t happen by chance, and it certainly wasn’t where we started from. It placed my service in the top 10% of services using the CORE System at the time. The story of how we achieved this is told below (with apologies for the use of the historic present tense and some inevitable historical inaccuracies).
June 1994: The story begins
Fresh from the week-long intensive immersion that is the RCN Annual Congress, I am beginning to grapple with some of the challenges that face me as the newly appointed Head of Counselling. I am required to provide a short-term counselling service to more than 300,000 members of the RCN in the UK. As a start, I have a resource equivalent to three full-time counsellors, including myself. At times, the numbers keep me awake at night.
As if that weren’t enough, it is made clear to me that an additional part of the service’s remit will be to promote the provision of staff counselling by employers in the health service. Ideally, we should be working to make ourselves redundant because health service employers are taking appropriate responsibility for the psychological wellbeing of their staff.
It’s just as well I enjoy a challenge.
June 1994 through 1995
Having made a proposal for a service model based on a mix of face-to-face and telephone counselling, and a successful bid for additional resources to support it, my new team is up and running. We’re geographically dispersed in RCN offices throughout the UK, with a small core team in London.
Each counsellor’s job description has written into it their regional responsibility to promote employer-based counselling, and we’re working out how best to argue the case. It doesn’t take long before it becomes apparent that we really need an evidence base for workplace-based counselling to support our argument. It doesn’t take much longer to realise that there isn’t one.
We are going to have to do this the hard way. We are going to have to try and grow our own.
1996 – in which I bluff my way onto the BACP Research Committee
I look to my professional body for help in the search for an evidence base for workplace counselling. Somehow, I find myself invited to a meeting of the BACP Research Committee in London. Before I know it, I am co-opted onto the committee. As a practitioner and service manager among a group of largely academic researchers, I am, apparently, able to help them as much as I hope they can help me. Early discussions take place which will lead, much later, to the publication of BACP’s first review of sector-based counselling, John McLeod’s systematic review of the research evidence for counselling in the workplace. [i]
Among the existing committee are Professor Michael Barkham and John Mellor-Clark, both of the Psychological Therapies Research Centre (PTRC) at the University of Leeds. They are involved with the development of something called the CORE System, designed to provide a standardised system for evaluating quality and outcomes across a range of therapy settings.
I may have found what I’ve been looking for.
1998 – CORE and early learning
After rigorous scoping of the options available to us, and extensive discussions within the RCN counselling team, we decide to implement the CORE System on a trial basis in 1999. We have agreed with PTRC that they will provide periodic analysis of our data for us. All we have to do is to send the client completed CORE Outcome Measures (CORE-OM) and the practitioner completed Therapy Assessment and End of Therapy Forms to them, and they will do the rest.
And so the guinea pigs start their journey.
After ‘go live’ we spend three or four months collecting paper forms. Client completed first measure, Therapy Assessment form, client completed last measure, End of Therapy form. Wash, rinse, repeat. The regional counsellors batch up forms for closed cases and send them to RCN HQ in London. Before long we are awash with forms and cease to count them, measuring them instead by their height in inches. Nine inches of CORE forms are accumulated in about four to five months, then are batched up and posted to PTRC. Then we wait.
1998 – 2000: in which we grow a bigger paper mountain
Over the coming months, we learn to produce quality data. We draw up protocols, and we write summary guides for the team on which fields in the system must be completed. We laminate the guidance and make up packs for the team’s therapists. Eventually, every form that is returned to HQ is screened for the appropriate level of completion. Incomplete forms are sent back to the offender. We learn fast.
At some point we receive a report on our first complete year of data. In among all the other data is what I am most interested in – our improvement rate. It shows that the proportion of clients achieving clinical and/or reliable change stands at 72%. I sense that’s something we can be reasonably pleased with, though at this stage I’m not entirely certain.
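For readers unfamiliar with the terminology, "clinical and/or reliable change" follows the Jacobson and Truax approach to classifying pre/post outcome scores. The sketch below illustrates the logic in Python; the cutoff and reliable-change thresholds are illustrative placeholders, not the published CORE-OM norms, and the function names are my own.

```python
# Illustrative sketch of Jacobson & Truax-style change classification,
# as applied to pre/post therapy outcome scores (lower = less distress).
# THRESHOLD VALUES BELOW ARE ASSUMED FOR ILLUSTRATION ONLY - consult the
# published CORE-OM scoring guidance for the actual norms.

CLINICAL_CUTOFF = 10.0   # assumed boundary between clinical and non-clinical scores
RELIABLE_CHANGE = 5.0    # assumed minimum change exceeding measurement error

def classify_change(pre: float, post: float) -> str:
    """Classify one client's pre/post scores."""
    reliable_improvement = (pre - post) >= RELIABLE_CHANGE
    clinical_improvement = pre >= CLINICAL_CUTOFF and post < CLINICAL_CUTOFF
    if reliable_improvement and clinical_improvement:
        return "clinical and reliable improvement"
    if reliable_improvement:
        return "reliable improvement"
    if clinical_improvement:
        return "clinical improvement"
    if (post - pre) >= RELIABLE_CHANGE:
        return "reliable deterioration"
    return "no reliable change"

def improvement_rate(cases):
    """Proportion of (pre, post) pairs showing clinical and/or reliable improvement."""
    improved = [c for c in cases if "improvement" in classify_change(*c)]
    return len(improved) / len(cases)
```

A service-level improvement rate like the 72% above is then just `improvement_rate` over all closed cases with both a first and last measure.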
At this point, however, I have no idea that the next step we are about to take is to revolutionise our relationship with our data.
You can read part 2 here