We don’t make the finish line with all clients. What goes wrong, what’s the cost, and what can we do to try to prevent dropout? I’ve tallied my unplanned endings over two years, as well as their impact on my income. This is where I put my dropout figures on the line and mark my homework.
I’ve blogged about dropout before. My first blog on the topic looked at dropout in general terms, including some of the available literature and benchmark data on the subject. In the second, I outlined a simple framework to monitor and reflect on our own levels of dropout, whether at an individual or service level.
This time I’m going one further, using that framework to work out my own levels of dropout in the past two years and using that as the basis for reflection on my findings. It’s left me in no doubt that there is more that I could do, and need to do, to try to impact on a couple of problematic areas.
Dropout within services, and across practitioners within services, has been extensively written about and researched. But nowhere have I come across any studies that look at rates of dropout among therapists in private practice. I therefore have no idea if my experience is representative of the private practice community or not, and I’d really welcome your thoughts and experiences in the comments section below.
My figures in context
Compared to many practitioners, I suspect my overall client caseload is relatively small, standing in the mid-tens each year. This has been the case for the past two to three years, reducing from rather higher figures in previous years. Given that therapy is not my only source of income, these numbers suit me well.
Before I go further, let me clarify what I mean by the term dropout. In terms of a definition, the most helpful that I’ve seen is that of Hatchett & Park: [i]
“..the client has left therapy before obtaining a requisite level of improvement or completing therapy goals.”
In common with every therapist, a proportion of my clients don’t reach a planned ending to the work. For one reason or another, they disengage prematurely. As I’ve said previously, my own preferred way of recording dropout is taken from the CORE System, [ii] which breaks planned and unplanned endings down into four distinct categories each:
Planned endings:
Planned from outset
Agreed during therapy
Agreed at the end of therapy
Other planned ending

Unplanned endings:
Due to loss of contact
Due to crisis
Client did not wish to continue
Other unplanned ending
While I accept that dropout is part of the territory, I do struggle with the rather passive acceptance of it that I sometimes hear, as if dropout were the ‘collateral damage’ of therapy. I don’t see it that way, and that view upsets me as much as when my own clients unilaterally disengage. I take it personally, because I believe that sometimes, often even, there’s probably more I could have done to prevent it.
My activity and dropout figures
My figures are shown in the table below. They are split across the calendar years 2016 and 2017. Each year has a column for the total activity for the year, which I’ve further split into Employee Assistance Program (EAP) and non-EAP clients. The rows start with clients referred, which is shorthand for potential clients that either made initial contact or were referred via one of the EAPs I work for.
Some of those potential clients went on to have an assessment/first session with me (the Assessed row). Some went no further, while with others we agreed that we would work together (Therapy agreed row). Some of those clients I continue to work with (In progress row).
The Therapy agreed/closed row shows clients with whom I agreed to work, and contact is now finished. The final two rows show whether those clients reached either a planned or an unplanned ending to the work, using the CORE definitions shown above.
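Turning the CORE split into a dropout rate is simple arithmetic, and it’s exactly what the spreadsheet does. A minimal Python sketch of the same calculation, using made-up counts rather than my real figures:

```python
# Hypothetical closed-case counts for one year, split by referral route
# (all numbers here are illustrative, not my actual figures).
closed = {
    "EAP":     {"planned": 20, "unplanned": 3},
    "non-EAP": {"planned": 5,  "unplanned": 7},
}

def dropout_rate(planned, unplanned):
    """Unplanned endings as a share of all closed cases."""
    total = planned + unplanned
    return unplanned / total if total else 0.0

# Per-route rates
for source, counts in closed.items():
    print(f"{source}: {dropout_rate(**counts):.0%} dropout")

# Overall rate across both routes
overall_planned = sum(c["planned"] for c in closed.values())
overall_unplanned = sum(c["unplanned"] for c in closed.values())
print(f"Overall: {dropout_rate(overall_planned, overall_unplanned):.0%} dropout")
```

With these invented counts the overall rate sits at 29% while the two routes sit far apart (13% vs 58%), which is precisely why splitting EAP from non-EAP clients matters: a healthy-looking headline figure can hide a problem route.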
What the numbers show
It doesn’t take much to distil the key points from the figures above.
My dropout rate has improved over the two years (from 30% in 2016 to 21% in 2017), but I shouldn’t be too relaxed about this, for reasons I explain below.
The number of clients referred by EAPs has risen as a proportion of my overall caseload, from 37% in 2016 to 62% in 2017. This has had a direct bearing on the reduction in my overall dropout rate for 2017.
Clients that are referred to me by EAPs are significantly more likely to reach a planned ending than those from elsewhere.
While the numbers are small, clients that came through non-EAP routes had more unplanned than planned endings.
When I last blogged about the subject, I was developing the spreadsheet I use for recording client contact details to enable me to do the kind of analysis presented here. At that point I had some early figures which suggested a marked difference between EAP and non-EAP dropout rates. That’s now been confirmed.
When clients drop out I routinely take time to reflect on the circumstances that may have contributed towards this, and what I might have done differently. Reviewing dropout on a case by case basis, however, is very different from having data for two full years, where I can see trends. Viewed positively, I could be very pleased about the low dropout among EAP clients. In reality, I’m more concerned that more than half my non-EAP clients have dropped out in both years. I would dearly love to know if this is typical of the experience of others.
What’s to be done?
I’ve reviewed again each of the 17 clients that dropped out over the two years and concluded that for five there was probably nothing I could have done differently. Life happened, and there were crises and competing priorities.
For the remaining 12, however, I’m not so sure. These are the key themes I think I can identify, even this far after the event:
It’s OK if you want to stop, but could you please let me know?
I’m sure a decent handful of these 12 were feeling some benefit and may not have felt the need for further contact. For some, terminating without notice may not have been an issue. They got what they came for and forgot to let me know. Alternatively, perhaps having that conversation with me felt too difficult. Either way, I wasn’t able to find out.
Are you in or are you out?
A smaller proportion were perhaps not fully bought into the process, not clear about whether they would benefit, or were not seeing any immediate benefit. These themes seem to touch directly on key issues of motivation to engage, hope of a benefit from engagement, and expectations of the speed and shape of change.
Therapists have lives too
For a period towards the latter end of 2017, family circumstances meant that I had to cancel a couple of weeks of appointments and wasn’t immediately able to offer alternative dates. I communicated that by text in the first instance, and now wonder if a call might have felt like a more human connection for some. This may have been a mistake that lost me the two or three clients who chose not to re-engage. It can be easy to forget the importance that we assume in the lives of some clients.
The cost of those dropouts
No-one wins when clients drop out. Clients are deprived of the opportunity to benefit from an effective piece of work. Therapists are left with a mild to moderate sense of discomfort and possibly rejection. When clients are fee paying, we also suffer a loss of potential income.
If half of the 12 clients I mentioned above could have been kept engaged, and each had attended an additional six sessions, I’d be looking at the best part of £1.5K of additional income. I wouldn’t turn that down and I’m guessing you wouldn’t either. The question is, what might work to maximise that engagement?
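As back-of-the-envelope arithmetic (the £40 session fee below is a notional figure of my own for illustration, not a quoted rate):

```python
# Rough estimate of income lost to potentially preventable dropouts.
# The session fee is an illustrative assumption, not an actual rate.
retained_clients = 6   # half of the 12 possibly preventable dropouts
extra_sessions = 6     # additional sessions each might have attended
session_fee = 40       # £ per session (notional)

lost_income = retained_clients * extra_sessions * session_fee
print(f"£{lost_income}")  # 36 extra sessions: the best part of £1.5K
```

Even at a modest fee, 36 unattended sessions add up quickly, which is why dropout is worth treating as a business problem as well as a clinical one.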
Back to school
Clients are marking our homework, as I said in my last blog. For me, in some cases, the verdict may be… could do better. Based on the reflections above, I have some ideas about what might make a difference. These are my resolutions:
More carefully assess, and if necessary work to strengthen, clients’ levels of motivation for change at the start.
Ensure that the client and I have sufficient clarity over the goals of therapy, and how we will work towards them, as well as keeping progress under regular review.
Make more explicit expectations over session attendance and the process of termination in early contracting.
Clarify and if necessary adjust clients’ expectations over how soon they should expect to experience a sense of progress (one study I’ve profiled previously demonstrated that addressing unrealistic expectations about treatment length significantly reduced dropout). [iii]
This time next year I’m hoping to see a different pattern of planned and unplanned endings among my non-EAP clients. Who knows, maybe I’ll be able to report back some good news, be a little wiser, and a little better off?
If you’d like a blank copy of the spreadsheet I’ve used so you can make your own, you can download it here. If you’d like to know more about it, details are in a previous blog on dropout.
[i] Hatchett, G. & Park, H. (2003). Comparison of four operational definitions of premature termination. Psychotherapy: Theory, Research, Practice, Training, 40(3), 226–231.
[iii] Swift, J. & Callahan, J. (2011). Decreasing treatment dropout by addressing expectations for treatment length. Psychotherapy Research, 21(2), 193–200.
We thrive on feedback, so please let us know what you think about what you’ve read in the comment section below. Only the name you use to identify yourself will be shown publicly. Thank you!
4 replies on “Dropout: what’s normal and what’s it costing us?”
I thoroughly enjoyed this blog and the associated articles – it certainly got me thinking about my practice in light of your experiences and analysis of your data. I work in an organisation that uses client testimonies for final evaluations of therapy, which I find limiting. Recently I started using the PHQ-9, as I am familiar with this from my GP practice experience, and I have started up in private practice recently. From reading this piece I aim to fully assess, to clarify goals/expectations and how we work together to achieve them, and to discuss endings and number of sessions from the outset. I already do some of these three things, and now I feel supported by the research findings highlighting exactly what works best in practice. Directly asking questions about goals/expectations/outcomes, and frequently checking out “how are we doing here?”, is crucial. Also, informing my clients about what the therapy research shows and applying this to my current practice can only be empowering for all. Thank you
Marie – many thanks for your comments.
I so agree with you that testimonies are limiting. When was the last time you saw a negative testimony on a therapist website? In a way perhaps some clients that drop out are offering a testimony of sorts, but seldom one that we can really use for learning?
Keep up the good work, keep reading and we’ll keep posting. Thanks again!
Following your calculations and spreadsheet, I did some work to produce figures based on clients who finished in the period April 2016 – 2017. Last year, I began using CORE with my private clients and this drew my attention to who was and was not attending their final session.
To get figures I could compare with yours I had to go through my closed cases and decide on planned and unplanned endings, as I had previously only really been looking at who attended a final session and who cancelled or did not arrive.
As you have done, I used the CORE criteria for deciding upon planned or unplanned endings, as that suggested by Hatchett & Park is a lot harder to gauge for those who have an unplanned ending. In my experience some clients will not attend their final session because they have achieved their therapy goals.
                  Total        EAP          Non-EAP
Contacts          229          130          99
Contact Only      105          65           40
Referred          124          65           59
Assessed          116          64           52
Assessment Only   10           0            10
Therapy Agreed    106          64           42
In Progress       15           10           5
Therapy Closed    91           54           37
Planned           55 (60%)     45 (83%)     10 (27%)
Unplanned         36 (40%)     9 (17%)      27 (73%)
The result shows that I have work to do even though I already follow some of your ‘back to school’ points.
From my analysis, I can see that clients who drop out attend three sessions on average, while those who successfully complete average eight sessions. These figures are based upon the 48 clients who completed a CORE form during the assessment (it excludes couples, but includes most private and some EAP clients). Additionally, those who completed a CORE form in their final session showed a decrease in distress of 9 points. This suggests to me that those who do not buy into the process recognise it and stop quickly.
I think that the table shows a similar picture to the one you provided. However, even though I have seen more clients than you, I am not sure how reliable the results are. The low numbers mean that what appear to be trends may not be reliable. For example, since the start of April I have finished working with another 13 clients, only two of whom had an unplanned ending. All of these started in the past year and would add 3% to the total of planned endings, which leaves me wondering what a decent sample size would be.
Hi David and very many thanks for your post. I must say I’m very impressed with your analysis.
Reading from your top line of Total, EAP and non-EAP clients, it’s clearly possible to see each of those reflected in the rows that follow. Your figures seem to reflect my own: that EAP clients seem very much more likely to ‘stick’. I guess the only way you’ll know about trends is to track your endings over time. Personally, I’m aiming to update every three months on a rolling basis. I really don’t want to get to the end of another year and find the same result, if that makes sense?
I’m no statistician, but your sample sizes seem decent enough. Certainly enough that I would be very confident that the differences between EAP and non-EAP drop out rates would be statistically significant.
I’m guessing also that, like me, you’re thinking about what those unplanned endings have cost you in the year, and what you’ll be looking to do going forward, as they say.
I really appreciate your transparency – thank you again!