Estimated reading time: 8 minutes

In their new book Outcome Measures and Evaluation in Counselling and Psychotherapy, authors Chris Evans and Jo-anne Carlyle have pulled off a rare feat. In making some important statistical concepts accessible, presenting ‘for’ and ‘against’ arguments for measurement in a balanced way, and leading the reader through a range of implementation scenarios, they have written the guide I wish I’d read twenty-five years ago. In fact, they’ve written the book I wish I’d written myself.

If you have no other guide to outcome measurement and evaluation in psychological therapy on your bookshelf, make sure you have this one (use the links below and grab a 25% discount). Here, Chris reflects on the journey that led to its publication.

Same events – how come we saw them so differently?

Back in the day, I used to assess young psychiatrists aspiring to become consultants. I’m of an age when this process involved the candidate conducting assessments with paid service users rather than actors. As part of this they would meet the user for 45 minutes, then have 15 minutes to collect their thoughts. They then had 10 minutes to convey what they had learned to me and a co-examiner.

We would ask them to invite the user back in to go through two tasks in “history taking” or “mental state examination”. The user was thanked (by us as well as the candidate; some candidates forgot that: one way to lose a lot of points!). They then had a final 10 minutes in which we discussed what we had each seen.

Some candidates were more anxious than the service users, many of whom were very old hands at this game. Commonly, the service user who walked into the room wasn’t the person we had expected from the candidate’s description. Examples included not having been told about missing limbs or eyes, or about incredibly unusual garb. These might have been the first things a child would have told us about the person, but in their anxiety to “do the correct professional thing” candidates omitted them.

Another classic occurrence was the service user entering and behaving in an extremely socially unusual (and often very interesting) way, for example, hanging an umbrella from the light above us. When asked (after the user had left) what they had noticed, not a single candidate ever mentioned such things.

So what did I learn?

I learned some things from these experiences:

  1. It’s hard to summarise a lot of data at the best of times and very hard when anxious.
  2. We pick up gigabytes of information from a session with any client.
  3. To summarise that, it’s vital to have templates, systems, agreed ways we expect to do it and ways we expect someone with whom we share the summary to understand it.
  4. However, sometimes we miss what matters when we force data into a system or template without thinking about it.
  5. So we need real skills, time and calm to summarise data well.

We need to love the numbers too…

What has this to do with numbers? Well, we live in a world in which employers, managers, politicians – all of us, actually – do want numbers. You want to know when your session should finish. If you’re in private practice you want to remember whether you have been paid on time by this client, or to go over your fees with a new one. You probably want to keep at least a rough track of how many sessions you have had. You may, particularly if you read Therapy Meets Numbers (TMN), also be interested in clients’ self-report measure scores, or you may have to try to digest them because your service insists on them.

Now, I think that no counsellor or therapist fails to convey a picture of their client (carefully respecting confidentiality) when discussing the client in consultation/supervision with another practitioner.  That’s a skill we have, partly from childhood, but also partly honed by life and training experience. These are the skills to summarise gigabytes of data per client, compress it into a word picture, and hand it over. They are qualitative, narrative skills. 


Of course, any counsellor can read the time, keep to time, add up income and put it, roughly, against outgoings: we all do have numeracy skills. However, these are very rarely confidence zones for practitioners. Sometimes, if we’re asked to read a quantitative research paper (or even one of the more numerate blog posts here), anxiety gets in the way. Sadly, it’s still rare – rarer the earlier you trained – to have been taught any number-digestion skills during training.

Unfortunately, failing to embrace these skills has cost our profession and clients dearly: it has deprived us of a quantitative evidence base. Lack of quantitative data can leave services vulnerable to cuts, and many clients now want to see that we collect and understand such data.

Bridging the gap between research, evaluation and practice

That’s one reason why Jo-anne (Carlyle, better half) and I wrote Outcome Measures and Evaluation in Counselling and Psychotherapy. However, the other huge reason we wrote it, and wrote it for practitioners (and, we hope, a few thoughtful managers and perhaps even politicians), is that we believe much digestion of therapy self-report change data is neither very thoughtful nor very helpful. Like the candidates earlier, trying to force data into too neat, too controlled a structure may set us up to learn little from it. Worse, it may leave us with very misleading impressions of what the data says.

There’s a real overlap here with Therapy Meets Numbers, which sets out to help you build bridges between research, evaluation and practice. In the book, and in TMN, the message is that we can learn to look at numerical data less like those anxiety-paralysed candidates desperately trying to fit things into some template. We can learn to sidestep many of the myths about therapy change data and build a new and wiser use of it.

What’s inside the book?

How can you judge the quality of an outcome measure? How can you choose an outcome measure that’s congruent with your practice, and how do you introduce it into your work? How can the data from a client-completed measure enhance your work, rather than become its sole focus? What might you learn about your practice or service from outcome data for a large number of clients?


These are some of the questions that we’ve set out to answer in the book, alongside setting out a framework for understanding the core features of outcome measures and their scope. We also take you through a process of implementing an outcome evaluation framework using worked examples.

One example imagines how two very different single-handed practitioners might use questionnaire change data in ways congruent with their ways of working. Another explores how an imaginary managed stepped care service might use its data beyond simply generating statutory IAPT reports. The last chapter provides a ‘snapshot’ review of the preceding nine chapters and tips for “constructive critique as a core practitioner skill”.

Developing additional free resources

In addition to the book, I am building free resources on accompanying web pages. There’s a growing glossary, a recording of the launch webinar, all the plots from the chapters (and the wonderful xkcd cartoons) and, by the end of 2022, there should be a growing collection of interactive online tools to help you digest data rather than just use the unhelpful summaries you have been told to expect. I won’t attempt to summarise 184 pages and 10 chapters here: that would be poor numeracy and impossible qualitative, narrative work! However, I am tempting you to read it. Give it a look!

Chris Evans – about the author of this blog

Chris holds honorary chairs at UDLA (Universidad de Las Américas) in Ecuador and the University of Roehampton in the UK. He trained in medicine and psychiatry, specialising in psychotherapy, with trainings in individual analytic therapy, group analysis and family/systemic therapy. He has always combined research with clinical work, and his driving interest has been “how is it that we think we know what it is that we think we know?” … which he admits may have been a bit generic for a really focused research career! However, it drove wide exploration of research and assessment methods, and led him to co-create the CORE system and to co-lead over 35 translations of CORE instruments. He co-authored the first paper naming “practice based evidence”: the theme of this book. He is now a full-time freelance researcher; his personal website has information about his research outside of CORE.

TMN reader offer

Get 25% off the paper version with the discount code UK22AUTHOR.

For 25% off the ebook, click through to checkout and then add the discount code “OUTCOME25”.

Buy it for yourself, and why not also buy one for your manager, your local MP and members of your local Clinical Commissioning Group for Christmas!

Just like you, we thrive on feedback.

Please leave your thoughts on what you’ve read in the comments section below.

Posted by: Barry McInnes
