Why do clients drop out of therapy? How much of a problem is it, in therapy generally, and for you particularly? How can we measure it, how can we understand it, and what can we do about it?
This, the first of a two-part blog, explores the scale of the problem, invites you to reflect on your own practice or service, and encourages us all to think critically about a phenomenon that is often seen as an inevitable part of the therapy process.
In this blog, I’m aiming to highlight how big a problem dropout is in therapy generally, how you can know how much of a problem it may be for you or your service, and what simple steps you can take to reduce it. We all know that dropout occurs in therapy.
Those of us who are therapists will have experienced the mixed emotions we feel when the client we were expecting to be sitting in front of us has failed to show up without notice. We may try to make contact, but we may not succeed. If we don't, we're left with a question mark sitting where our client should have been.
Chances are we will spend some time trying to figure out the reason for this no-show. Then, in the absence of any feedback from the client, we will make what sense of it we can, and move on. But what story will we tell ourselves about what happened, and will it be an accurate reflection of events?


Why do clients drop out of therapy?
By drop out, I’m referring to situations where the client terminates therapy unilaterally. Sometimes, they will notify us of their intention to discontinue, but most often they will simply not show for their next scheduled appointment. At the most basic level, I believe there are two main reasons why clients drop out of therapy. These are:
They have got what they wanted from therapy and neglected to tell us, or
They haven't got what they wanted and have not been able to tell us.
Often, when I discuss dropout with practitioners, the explanation I hear offered is some variation of the first reason. The core belief goes something like this:

"The client has got what they needed from the therapy process and reached a good enough level of benefit not to need more. In not telling us they are exercising their right as autonomous individuals and we must respect that."

Clearly, we must respect clients' right to disengage, as well as their right not to tell us of their intention to do so. But I think we need to question any assumption that clients withdraw unilaterally from therapy because they have got what they came for, and have simply neglected to tell us. It may be a more comforting narrative than the alternative, but that doesn't make it true. In my experience, clients that are properly engaged in the process of therapy don't suddenly disappear, so when they do we should use it as a prompt to ask some serious questions.

In general, attrition or dropout in therapy is a serious problem. To illustrate just how serious, consider the number of sessions that therapy clients most commonly attend. That number is one. That's right. More clients attend just one session than attend any other number.
But I don't have a problem with dropout…
I didn't either until I started to look at my own experience, so I invite you to read through the rest of this section with an open mind. After that, if you still don't think you have a problem with dropout, then you can give yourself a big pat on the back and feel smug as you continue to read the advice for those that may.
I'm no exception to this general rule. I've just looked at all my referrals for the past year, and guess what my most commonly attended number of sessions is? With a slight, but possibly significant, caveat which I'll come to later, it's ONE. Annoying, but true. Curious about whether you are an exception to this pattern? If you are, I simply invite you to look at all your referrals for the past year and see for yourself. How many did you have just a single contact with? Of those, how many did you agree to have further contact with, only for that not to take place?
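If you'd rather not count by hand, here is a minimal sketch of one way to run that audit, assuming you can export your appointment records to a simple CSV. The file name and the columns 'client_id' and 'attended' are hypothetical placeholders of my own, so adapt them to whatever your own records actually contain.

```python
# A minimal sketch of the personal audit described above, assuming a CSV
# export with one row per scheduled session and (hypothetical) columns
# 'client_id' and 'attended'.
import csv
from collections import Counter

def sessions_per_client(path):
    """Count attended sessions for each client in a simple appointments CSV."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["attended"].strip().lower() in ("yes", "y", "1", "true"):
                counts[row["client_id"]] += 1
    return counts

counts = sessions_per_client("appointments_last_year.csv")   # hypothetical file
distribution = Counter(counts.values())        # e.g. {1: 14, 2: 9, 3: 6, ...}
single_contact = sum(1 for n in counts.values() if n == 1)

print("Most commonly attended number of sessions:", distribution.most_common(1))
print(f"Clients seen only once: {single_contact} of {len(counts)}")
```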
Here's a service-based perspective on dropout.
The two graphics below are based on data from the Improving Access to Psychological Therapies (IAPT) programme in England. The graphic on the left is based on referrals that are recorded as having ended in the 2015–16 year. As you can see, a total of 1,299,525 referrals ended. Of those, 405,974 were never seen by the service. In other words, just over 31%, or nearly one in three, of the people that were referred were never seen.


The graphic on the right is based on referrals that had one or more treatment appointments. In total, 858,896 people had one or more appointments. Of those, 321,765 had only one treatment appointment. So, just over 37%, or nearly four in ten, of those that were recorded as treated had just one appointment.
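For transparency, here is the arithmetic behind those two percentages, reproduced as a quick sketch from the published figures quoted above; the variable names are mine, the numbers are IAPT's.

```python
# Percentages derived from the 2015-16 IAPT figures quoted in the text.
referrals_ended = 1_299_525      # referrals recorded as ended in 2015-16
never_seen = 405_974             # of those, never seen by the service
treated = 858_896                # referrals with one or more treatment appointments
one_appointment_only = 321_765   # of those, had only one appointment

print(f"Never seen: {never_seen / referrals_ended:.1%}")               # ~31.2%
print(f"One appointment only: {one_appointment_only / treated:.1%}")   # ~37.5%
```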
IAPT's published performance data comprises the pooled data from services commissioned by more than 200 Clinical Commissioning Groups (CCGs). Beneath the national averages for indicators like dropout and improvement rates lies a wide range of performance across the services. Sadly, the range of figures for client dropout across CCGs is not easy to extract from the raw data. However, there are other data that we can draw upon to illustrate the point.
The table below draws on data from the CORE National Research Database for Primary Care (NRDfPC), collected between 1999 and 2008. [i] The database comprises data from 35 primary care counselling services. It shows the average 'declared' [ii] rate of unplanned endings for those services to be 22.5%. Beneath that average lies a range across services of between 1.2% and 43.5%.
If that lower figure looks too good to be true, it could be because it is. To get a true picture of service performance, data for all clients that are seen should be included, but I have occasionally seen service data that includes only the cases of clients that complete therapy. Almost by definition these will be planned endings. The most meaningful story, however, lies in what are known as the quartile ranges.

Source: CORE IMS: Benchmarks for Primary Care Counselling Services: Planned/Unplanned endings
What are quartiles? Quite simply, if you take the range of unplanned ending rates across the services and split them into four equal parts, you have quartiles. These are represented as colour-coded blocks in the figure below. The green quartile represents the 25% of services with the lowest rate of declared unplanned endings (1.2% – 15.1%). Red represents the 25% of services with the highest rate of declared unplanned endings (31.4% – 43.5%).

Source: CORE IMS: Benchmarks for Primary Care Counselling Services: Planned/Unplanned endings
Put another way, services whose unplanned ending rates are between 22.7% and 31.4% can be described as having an above-average rate, and those above 31.4% as having a high rate of unplanned endings relative to other services.
Remember that these figures are based on declared endings, where the services specified that each client's ending was either planned or unplanned. Where the ending type was missing, it was assumed that those clients had also reached an unplanned ending, and an estimated rate of unplanned endings was calculated on that basis. Taking this missing data into account, the estimated rate of unplanned endings ranged from 26.1% to 90.5% across services, with an average of 52.4%. In reality, the true figure for each service will lie somewhere between its declared and estimated rates.
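To make those two calculations concrete, here is a minimal sketch of how quartile boundaries and the declared versus estimated rates can be derived. The service rates and case counts below are illustrative placeholders of my own, not the CORE NRDfPC data itself.

```python
# A minimal sketch of the two calculations described above, using
# illustrative numbers rather than the actual CORE benchmark data.
import statistics

# Declared unplanned-ending rate (%) for a set of hypothetical services
declared_rates = [1.2, 8.4, 15.1, 18.0, 22.7, 24.9, 28.3, 31.4, 36.0, 43.5]

# Quartile boundaries: split the ordered rates into four equal parts
q1, q2, q3 = statistics.quantiles(declared_rates, n=4)
print(f"Lowest quartile: up to {q1:.1f}%   Highest quartile: above {q3:.1f}%")

# Declared vs estimated rate for a single hypothetical service:
# 'declared' counts only clients whose ending type was recorded;
# 'estimated' assumes every client with a missing ending also ended unplanned.
ended_unplanned = 45        # endings recorded as unplanned
ended_planned = 155         # endings recorded as planned
missing = 60                # clients with no ending type recorded

declared = ended_unplanned / (ended_unplanned + ended_planned)
estimated = (ended_unplanned + missing) / (ended_unplanned + ended_planned + missing)
print(f"Declared rate: {declared:.1%}   Estimated rate: {estimated:.1%}")
```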
So, what’s going on here?
If we really care about the quality of service that we individually, and therapy collectively, are providing for clients, then these data should prompt us to ask questions. Maybe what you’ve read has prompted some questions of your own, and if so I’d love to hear your thoughts. In the meantime, here are some of mine:

What might account for the fact that for IAPT clients nationally, nearly one in three were never seen by a service? Did they not actually need a service, or were they poor referrals? Did they improve without an intervention, or did they lose hope while waiting in the queue for a service?

Why did nearly four in ten IAPT clients that received some form of treatment receive only one session? Perhaps, contrary to what's known about how clients typically improve over the course of therapy, one session was enough? Or did those clients simply not experience a sense of 'fit' between their needs and the therapy or therapist on offer?

How can we understand the fact (from the CORE data from primary care) that services with the highest rates of unplanned ending had a rate that was at the very least double that of services with the lowest rates? Were the characteristics of their clients such that some were harder to hold in therapy, or were therapist and service factors implicated?

How aware are we, and how aware are services, of these wide variations in unplanned ending rates, and how prepared are we to ask ourselves some tough questions?

If we are, and we do, what’s to be done?
The question of what’s to be done about these variations depends on what conclusions we draw from our consideration of the data. If our conclusions are to be accurate, we need a more than passing acquaintance with the context which gives rise to the data.
Data in context, and what is to be done?
Just as every behaviour, however ‘irrational’ it may seem, makes sense when you understand the context in which it first arose, so it is with data. When we see wide variations across services’ and therapists’ performance, we should resist jumping to conclusions or making judgements before we have attempted to understand the context in which they arise.
The less we know about the context, the more cautious we should be. We might expect the services which make up the CORE data above to be broadly similar in their client profile and service delivery, and we may be correct. We may, however, be very mistaken. A range of factors, from primary referral sources to how transient or demographically diverse the local population is, is likely to affect levels of engagement in therapy.
Equally, however, we should not seek to explain away variations across services as due to differing characteristics of their clients. The fact is that some therapists are more effective than others in holding clients in therapy, and services with higher proportions of such therapists will perform better overall.

It's a fact that some clients disengage from therapy, or perhaps never properly engage in the first place. What is also evident is that the proportion of clients who disengage at various points varies widely across services and practitioners. Not just services in different sectors, but services in the same sector. Not just practitioners across different services, but practitioners within the same service. We are not all the same.

When clients drop out, there is often no way that we can capture their experience. But that shouldn't stop us from trying to understand what might be going on. In addressing what's to be done, I'd like to offer a simple approach to dropout that we might adopt:

Take it personally. As I've written previously [iii], we should stop seeing dropout as in any way inevitable. And in the true spirit of enquiry, we should see it as a chance to reflect on the conditions that may have affected our clients' engagement, and on what we might have done differently.

Know what's normal. Sure, some dropout is inevitable. But it's important to know, first, what a normal and acceptable level of dropout means in your context. Second, you need to know your actual level of dropout (read on for advice on how, and for a personal and embarrassing story of my own that relates).

Develop a framework for preventing and managing dropout. There are simple things that you can do that will help minimise the likelihood of the client disengaging unilaterally, and I offer some evidence-based suggestions that may help in the blog that accompanies this one.

Monitor your level of dropout over time (it changes!). Don't assume your dropout rate will remain fixed. Like the value of shares, it may rise or fall over time, and there will be a story behind those trends. A simple sketch of one way to track this follows this list.
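By way of illustration, here is a minimal sketch of that kind of ongoing monitoring, assuming you keep (or can export) a simple CSV of closed referrals; the file name and the columns 'end_date' and 'ending' are hypothetical placeholders. It isn't the framework from the next blog, just one way of watching the trend.

```python
# A minimal sketch of tracking unplanned-ending rates by quarter, assuming a
# (hypothetical) CSV of closed referrals with columns 'end_date' (YYYY-MM-DD)
# and 'ending' ('planned' or 'unplanned').
import csv
from collections import defaultdict

def unplanned_rate_by_quarter(path):
    """Return {'2016-Q1': rate, ...} from a simple closed-referrals CSV."""
    totals = defaultdict(lambda: [0, 0])          # quarter -> [unplanned, all]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            year, month, _ = row["end_date"].split("-")
            quarter = f"{year}-Q{(int(month) - 1) // 3 + 1}"
            totals[quarter][1] += 1
            if row["ending"].strip().lower() == "unplanned":
                totals[quarter][0] += 1
    return {q: unplanned / all_ for q, (unplanned, all_) in sorted(totals.items())}

for quarter, rate in unplanned_rate_by_quarter("closed_referrals.csv").items():
    print(f"{quarter}: {rate:.0%} unplanned endings")
```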
What’s next?
In the next blog, I will be outlining a simple framework that you can use to begin monitoring your own levels of unplanned ending, and offering some thoughts about how you might go about bringing them down if it appears you may have a problem.
By way of a teaser, one study published in 2011 found that clients who received information about the number of sessions normally required to achieve improvement were more than three and a half times more likely to complete. That’s a relatively simple conversation that you can have with clients at the outset – but only if you know the data! More on that in the next blog.

Caveats and confession time
I mentioned at the start that there is a caveat to my general finding that my most commonly attended number of sessions is one, and it's this: that figure does not remotely apply to the referrals that come from Employee Assistance Programmes (EAPs). Almost without exception, EAP referrals tend to continue to a planned ending, and to use the full number of sessions allocated. I'm not yet fully sure why that is, but if you think you have the answer, do let me know!
Last, but not least, I mentioned above that I have a story of my own about dropout, and one that I'm not awfully proud to admit to. In the latter days of my time at the Royal College of Nursing Counselling Service, my annual audit of service data revealed that of the team's eight practitioners, I had the highest level of client dropout, and by some distance. For a variety of reasons, I wasn't having the best of times at that point, but it never crossed my mind that it might be impacting so directly on my client work until I saw it writ large. It was quite a wake-up call, and it was both necessary and sufficient!
You can read the second blog in this series here.
References
[i] CORE IMS: Benchmarks for Primary Care Counselling Services: Planned/Unplanned endings. http://www.coreims.co.uk/site_downloads/PC%20Benchmarks%20-%207%20-%20Unplanned%20-%20final.pdf
[ii] 'Declared' means that the client's ending type was specified in their data as either 'planned' or 'unplanned', as opposed to 'estimated', where the client's ending data was missing but was attributed as an unplanned ending.
[iii] Drop-outs aren’t just a statistic. Therapy Today, May 2014. Online access to BACP members: http://www.bacp.co.uk/docs/pdf/15222_may_14_tt.pdf