Is CBT The Gold Standard For Psychological Therapy? It’s Time To Stop Pretending And Get Real

Recently, I have heard many complaints from consumers of mental health services, as well as from highly trained and experienced colleagues, about the draconian imposts created by the insistence that psychologists registered with Medicare under the Better Access To Mental Health program deliver only so-called “evidence-based” therapies.  In practice, “evidence-based” means Cognitive Behaviour Therapy (CBT), because it is the most widely researched kind of psychotherapy and the basis upon which the Better Access program is funded.

Consumers complain that when they are depressed or anxious, the last thing they feel like doing is rating their mood on a scale of one to ten every half hour and predicting how they will feel in the next half hour.  This kind of written homework is so overwhelming and nonsensical to them that they become even more depressed, curling up in bed and hiding for days. Some will actively look for therapists who don’t use CBT because this approach doesn’t work for them.

Some practitioners, on the other hand, seek to get around this impost by defining everything they do in CBT terms, thus ensuring they are on the right side of the authorities, even if it’s at the expense of a client’s wellbeing.

It’s not my intention here to argue about the relative merits of CBT or question the practices of other therapists – we are all on a learning journey in this together and we all improve with practice and experience.  However, I do question the status quo of our current approach to funding and urge a more realistic approach to mental health care in the interests of all who seek relief from their emotional pain. I whole-heartedly support the position of my colleague, Dr Aaron Frost, whose excellent article you can read here.

Up until 1999, the scientist-practitioner model was upheld as the ideal to which professionals should aspire (Henshaw, 2000).  It espouses that psychologists should discerningly consume new research findings, put them into practice in an applied setting, evaluate their own interventions using empirical methods, and then disseminate their results to the wider scientific community in the form of published articles (Barlow, Hayes & Nelson, 1984).

However, in practice this hardly ever happens.  Few clinicians use research as their primary source of information when treating clients.  They complain that studies are too concerned with methodological issues such as statistical sampling and reliability, which severely limits their applicability in treating individual clients (Barlow et al., 1984; Cohen, Sargent & Sechrest, 1986; Polkinghorne, 1991).

The obvious failure of the scientist-practitioner model has inspired some to question why psychologists should be encouraged to maintain a dishonest adherence to this ideal (e.g., John, 1998).  However, until fairly recently, no practical alternative had been suggested.  Flawed though it is, the scientist-practitioner model has offered more scope than no model at all for providing treatment services in an environment of economic rationalism, evidence-based interventions and competition from other health professionals and paraprofessionals.  Adherence to it persuades policy-making bodies such as governments and insurers to continue their support of psychology (Cotton, 1998).

Although objective decision-making is greatly emphasised in postgraduate education, Cohen et al. (1986) found that in clinical practice it played only a minor part in the selection of treatments.  Even when psychologists attempted to practise therapy according to the scientist-practitioner ideal, there was still much room for subjectivity.  It has been estimated that as much as 90% of the variance observed in natural settings is not accounted for by the professional literature.  Accordingly, practitioners who share the same theoretical framework can make quite different treatment decisions, whereas those from different therapeutic traditions often make similar treatment decisions, even though their post-hoc rationalisations of these decisions vary greatly (John, 1998).

An insight into the way psychologists use the psychotherapy literature in clinical decision-making processes was provided by Cohen et al. (1986).  They argued that an important aspect of the problem not addressed by Barlow et al. (1984) was the definition of “research use.”  Therefore, two types of usage were proposed: instrumental and conceptual.  

Instrumental usage occurs on several different levels.  For example, a clinician may be aware of empirical studies, but not use the data in making decisions.  On another level, decisions might be considered on the basis of published data, but rejected because of impeding factors such as cost, inconvenience or perceived irrelevance for a particular client.  Finally, research might be implemented if it is deemed useful to do so.  

Instrumental implementation has been viewed as the most appropriate definition of research use, but it rarely occurs in clinical practice (Barlow et al., 1984).  Alternatively, conceptual usage, which refers to the gradual and implicit influence literature has on awareness of clinical issues, problem-conceptualisation and decision-making, is quite widespread (Cohen et al., 1986).

Since the 1999 publication of Hubble, Duncan and Miller’s groundbreaking book, “The Heart and Soul of Change,” and its updated second edition in 2010, we have had an alternative explanation of how therapy works, and it has very little to do with the type of model used (yes, even CBT).  Drawing on the results of hundreds of studies published over several decades and convincing data from a number of meta-analytic studies, the authors concluded that therapy works and proposed a four-factor model to account for the observed change.  The four factors are:

    1) If you have a good relationship with your therapist, this accounts for 30% of the observed change

    2) If you have strengths and resources that you and your therapist together will uncover, use, enhance and add to, this accounts for 40% of the observed change 

    3) If you expect therapy to work because you are in the care of a highly qualified and experienced professional (also known as the expectancy or placebo effect), this accounts for 15% of the observed change 

    4) The specific model used, whether CBT, ACT, IPT, DBT or another, accounts for only 15% of the observed change. 

The authors call the equivalence of effectiveness among psychotherapies the “Dodo verdict” after Lewis Carroll’s “Alice In Wonderland,” where the Dodo says “Everybody has won and all must have prizes.”  They argue that if the first three factors are strongly met then even something such as religious healing can be as effective as any psychological therapy model. 

They propose that, rather than the outdated scientist-practitioner model and its brand of evidence-based practice, clinicians should use an outcomes-based model instead.  An outcomes-based approach tracks the four factors very carefully.  Its advantages over the earlier model include the early identification of clients who are likely to drop out and the detection of clients who are not benefiting from treatment or who are deteriorating (Frost, 2015).

It measures the effectiveness of each individual session using the Session Rating Scale, which gauges how effective the client-therapist relationship is and whether the client feels heard, understood and respected.  It also assesses whether the session addressed goals and topics that are important to the client, whether the model the therapist uses is a good fit, and the overall effectiveness of the session.

Between sessions, the Outcome Rating Scale assesses how the client has progressed in terms of personal wellbeing, relationships, social functioning and overall functioning.  These measures are tracked over the course of therapy so that the therapist can adjust their approach to suit the client and improve therapeutic effectiveness overall.
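For readers who like to see what this kind of session-by-session tracking might look like in practice, here is a minimal illustrative sketch in Python. It is not the actual ORS or SRS instruments or their published clinical cut-offs; the thresholds, names and flagging rules below are assumptions made purely to illustrate the idea of noticing, early, when a client is not improving or when the therapeutic alliance dips.

    # Minimal, illustrative sketch of outcomes-based tracking.
    # The ORS and SRS are brief measures scored 0-40; the cut-offs used here
    # are assumed values for demonstration, not the published clinical norms.
    from dataclasses import dataclass, field

    MIN_MEANINGFUL_CHANGE = 5   # assumed "meaningful improvement" threshold
    SRS_ALLIANCE_CONCERN = 36   # assumed alliance-concern cut-off

    @dataclass
    class ClientRecord:
        name: str
        ors_scores: list = field(default_factory=list)  # between-session wellbeing (0-40)
        srs_scores: list = field(default_factory=list)  # end-of-session alliance (0-40)

        def add_session(self, ors, srs):
            self.ors_scores.append(ors)
            self.srs_scores.append(srs)

        def flags(self):
            """Simple alerts a therapist might review before the next session."""
            alerts = []
            if len(self.ors_scores) >= 3:
                change = self.ors_scores[-1] - self.ors_scores[0]
                if change <= 0:
                    alerts.append("no improvement, or deterioration, on the ORS")
                elif change < MIN_MEANINGFUL_CHANGE:
                    alerts.append("gain smaller than the assumed meaningful-change threshold")
            if self.srs_scores and self.srs_scores[-1] < SRS_ALLIANCE_CONCERN:
                alerts.append("latest SRS suggests the client may not feel heard or understood")
            return alerts

    # Example: three sessions with little ORS movement and a dip in the alliance.
    client = ClientRecord("example client")
    for ors, srs in [(18, 38), (19, 37), (17, 33)]:
        client.add_session(ors, srs)
    print(client.flags())

The point is not the code itself but the design choice it reflects: instead of assuming a favoured model is working, the therapist collects feedback every session and changes course when the numbers say the client is not benefiting.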

I, for one, long for the day when consumers of mental health services are heard and respected and have their needs met.  The costs of treating mental health problems are far outweighed by the benefits to the economy in terms of increased productivity and a happier, healthier workforce.

I also long for a more common-sense approach to mental health care, one free of the attitude from the authorities that psychologists are a bunch of cowboys who need to be reined in and made to justify everything they do according to an outdated but favoured evidence-based, scientist-practitioner model.

Let’s put the needs of our clients first and foremost, focus on working with them to enhance their emotional wellbeing, and leave behind the divisive politicking and funding arrangements based on rigid adherence to a convenient but unrealistic approach.

 

References

Barlow, D. H., Hayes, S. C., & Nelson, R. O. (1984). The Scientist Practitioner.  New York: Pergamon.

Cohen, L. H., Sargent, M. M., & Sechrest, L. B. (1986). Use of psychotherapy research by professional psychologists.  American Psychologist, 41, 198 – 206.

Cotton, P. (1998). The framing of knowledge and practice in psychology: A reply to John. Australian Psychologist, 33, 31 – 37.

Duncan, B. L., Miller, S. D., Wampold, B. E., & Hubble, M. A. (2010). The Heart and Soul of Change: Delivering What Works in Therapy (2nd ed.). Washington, DC: American Psychological Association.

Frost, A. (2015). Better Access To Psychologists: Is It Value For Money?
http://www.aaronfrost.com.au/better-access-to-psychologists-is-it-value-for-money/

Henshaw, S. (2000). Living With Unemployment: A Grounded Theory Study. Doctoral thesis, Murdoch University.

John, I. (1998). The scientist-practitioner model: A critical examination. Australian Psychologist, 33, 24 – 30.

Polkinghorne, D. E. (1991). Two conflicting calls for methodological reform. The Counseling Psychologist, 19, 103 – 114.

 
