Joint letter to BACP, UKCP and BPC on the SCoPEd consultation results

In response to the SCoPEd initial consultation results, a joint letter to BACP, UKCP and BPC has been signed by the Alliance for Counselling and Psychotherapy, the National Counselling Society, Psychotherapists and Counsellors for Social Responsibility, the Psychotherapy and Counselling Union and the College of Psychoanalysts.


Dear Chairs and Chief Executives of BACP, UKCP and BPC,

The Alliance for Counselling and Psychotherapy, the National Counselling Society, Psychotherapists and Counsellors for Social Responsibility, the Psychotherapy and Counselling Union and the College of Psychoanalysts have noted your claims hailing the results of the recent consultation.

We have analysed the available statistics and, on behalf of our combined memberships of well over 2,000 practitioners, nearly all of whom are registered with your organisations, we respectfully beg to differ.

The results are hardly a ringing endorsement of the SCoPEd project (dramatically so, as far as BACP is concerned).

The return rates are assuredly below acceptable minima for the adoption of such wholesale change in any profession. We calculate an overall survey return rate of around 13 per cent (7,087 respondents out of 53,500 members), or about one in eight.

BACP’s return rate appears to be 13 per cent (5,878 respondents out of 44,000 members). (If the smaller register were used, the return rate would have been higher.)

BPC’s return rate appears to be 15 per cent (230 respondents out of 1,500 members).

And UKCP’s return rate appears to be 12 per cent (979 respondents out of 8,000 members).
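For readers who want to check the arithmetic, the return rates above follow directly from the respondent and membership figures quoted in this letter; the short Python sketch below simply recomputes them (rounded to one decimal place rather than to the whole percentages used in the text).

```python
# Return-rate check using only the figures quoted in this letter
# (respondents / members, expressed as a percentage).

figures = {
    "Overall": (7_087, 53_500),
    "BACP":    (5_878, 44_000),
    "BPC":     (230,    1_500),
    "UKCP":    (979,    8_000),
}

for body, (respondents, members) in figures.items():
    rate = respondents / members * 100
    print(f"{body}: {respondents:,} of {members:,} = {rate:.1f}%")

# Prints roughly: Overall 13.2%, BACP 13.4%, BPC 15.3%, UKCP 12.2%
```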

Our organisations consider that it would be foolhardy to attempt to make such fundamental changes to the structure of our professions on the basis of the level of response garnered up to now. Don’t forget, it is you yourselves, not only your critics, who have asserted that the changes will be fundamental. We will continue proactively to oppose any such developments.

Nor do the more detailed statistics offer you anything like the succour that you have claimed. Drilling down, we find that:

60 per cent of respondents did not believe SCoPEd would improve things for clients.

46 per cent did not believe it would help recruitment.

39 per cent did not believe it would make things clearer for trainees.

46 per cent did not believe it would help professional organisations to promote therapy.

Given that the leaderships of the three organisations so strongly supported the direction of travel of the project, these figures should make for depressing reading for you.

And among BACP members, the positive responses were even lower. Only 36 per cent of BACP respondents to the survey believe SCoPEd will make things easier for clients trying to find the right help (Question 1a). This is just 2,131 members, which is about 5 per cent of BACP’s total membership.

For comparison, and to put these returns into some kind of proportion, this is 1,000 fewer than the number of people, mainly but not all of them BACP members, who signed the petition to scrap the project.

It also contrasts fairly dramatically with the 57 per cent of BPC and 56 per cent of UKCP respondents who believe the framework would be positive for clients – an intriguing difference that is reflected throughout all the results, as laid out here.

On the question of how useful SCoPEd will be for employers (Q1b), 50 per cent of BACP respondents answered that it will be easier to establish who to employ, whereas 78 per cent of BPC and 71 per cent of UKCP respondents agreed.

On the effect on clarity for students choosing training pathways (Q1c), 57 per cent of BACP respondents were positive, compared with 84 per cent of BPC and 78 per cent of UKCP. Similarly, 50 per cent of BACP members answering the survey believed SCoPEd would make promotion of members’ skills by professional organisations easier (Q1d), whereas 75 per cent (BPC) and 73 per cent (UKCP) felt the same.

What are we to make of this? Is it surprising that organisations representing those identifying more often as ‘psychotherapists’ (and in BPC’s case, exclusively psychoanalytic psychotherapists), rather than ‘counsellors’, would favour a framework that places psychoanalytic psychotherapy at the top of a hierarchy of practice? We also note with as little cynicism as we can manage the close ties these organisations have with training programmes that would profit from such an assertion or reassertion of superiority.

Despite the deeply problematic nature of the consultation methodology, as shown in this article, and the lack of any real endorsement of the project in the results – not to mention the widespread dissatisfaction with the framework (particularly amongst ‘counsellors’ and especially the under-represented person-centred/experiential/existential/humanistic communities), as well as the substantive critiques of the political agendas and claimed ‘evidence base’ of the project – despite all this, BACP, BPC and UKCP assert nonetheless that, ‘we have an early indication that we should progress this work’.

Surely, if anything, a dispassionate viewpoint would be that there is an ‘early indication’ that the entire project is deeply flawed, and is pursuing a path that a substantial portion of the field finds at best misguided, and at worst a complete betrayal of their practices. In what sense, then, can this work be said to be happening ‘alongside our memberships’?

To progress the SCoPEd framework anywhere near ethically, it would mean reappraising every single aspect of it: its motivations and intentions, its assumptions, its methodology, its form, the composition of its ‘expert reference group’, the ‘independent’ chair, the disputed ‘evidence base’, the nature of further consultations, and so on.

Is there any will at all to do this within BACP, BPC and UKCP? The leaderships of your organisations may ‘acknowledge’ the ‘strength of feeling’ in the debates around SCoPEd, but how can they possibly continue with the project in this form, knowing the numerous substantive critiques of the project and its current functioning?

Perhaps the 3,000 consultation comments, as yet not analysed by the ‘independent research company’, hold some of the answers. Is it possible that all of these comments, and all other relevant data beyond what you have released thus far, could be published on one of your websites? We are serious about this and consider it to be normal good practice for a consultation. Failing to do so, or refusing to release the comments, will leave you open to allegations of cooking the books.

In the meantime, the organisations sending this letter would welcome open dialogue, above all in a public format, with BACP, BPC and UKCP about the future of the therapy field.

Collegial greetings from,

The Alliance for Counselling and Psychotherapy

Psychotherapists and Counsellors for Social Responsibility

The Psychotherapy and Counselling Union

The College of Psychoanalysts

The National Counselling Society

 

19 March 2019: post amended to add the National Counselling Society to the letter signatories.

 

*


 

SCoPEd Consultation: Methodologically Challenged


Richard House Ph.D., former Senior Lecturer in Psychotherapy, Counselling and Psychology at the University of Roehampton, subjects the SCoPEd framework consultation exercise to critical analysis.


Introduction and Context

In this commentary I wish to deconstruct, and subject to critical analysis, the apparent methodology used by the psy sponsor organisations in their recent SCoPEd consultation process. In precipitating a process that could end up having major implications for the practice of many thousands of psy practitioners, the sponsoring organisations surely have a grave responsibility to ensure that, as far as possible, any research that is carried out is methodologically robust, and transparently fair and unbiased. In the case of the SCoPEd consultation, I will show below that this is, alas, very far from being the case – which, in turn, casts substantial doubt on the reliability of the initial findings as recently announced.

A Fair Consultation?

The first observation to make about the consultation process is that the most important question of all wasn’t even posed – i.e. “Do you think that it is necessary and appropriate for the psy organisations to develop an explicit written framework for competent practice in the therapy field?”.

Rather, the need for a competency framework is merely asserted and assumed by “organisational fiat”, as a background given; and only then do respondents answer the questions posed, having already tacitly and implicitly agreed to the need for such a framework by the way the consultation has positioned them, and by the very act of them completing the consultation.

It should by no means be tacitly or casually assumed that everyone who completed the consultation necessarily agrees that such a framework is necessary; yet there is no mechanism within the consultation as implemented to discover this vital information. One has to ask whether this was an oversight, or a quite deliberate “positioning” by those conducting the process.

So one has to ask, further: why were members not asked, first and foremost, to give their view on whether a generic framework is necessary and appropriate? Although of course this has to be speculative, it might conceivably be because asking that question first would have made it far more difficult to position members into accepting the principle of a framework per se without any debate. And as mentioned above, the very “democratic” act of completing the consultation can easily be read as giving tacit legitimacy to that which, at the outset, should have been open to discussion and possible refutation, rather than merely assumed as an uncontested datum.

Thus, a fair and proportionate consultation that was genuinely aiming to find out members’ views – as opposed to one merely seeking rubber-stamping legitimation for a pre-decided view – would have sent all potential respondents both the proposed framework and a document of equal length critiquing the need for a proposed framework. This would then have left members free to make up their own minds, “un-nudged”, with an accompanying, genuinely open-ended set of consultation questions.

As it is, a methodological “coach-and-horses” can be driven through this whole process, as anyone with any expertise in research methodology will know. (I can just imagine what a group of sharp, methodology-savvy Roehampton PsychD research students would have made of this! – and it wouldn’t have been pretty…)

It’s therefore extremely disappointing to this commentator, at least, that this consultation wasn’t far better informed methodologically. Moreover, this in turn is, at the very least, consistent with the suspicions of organisations like the Alliance for Counselling and Psychotherapy that this is yet another choreographed, top-down power move by our field’s psy organisations, still intent on importing the dead hand of state regulation into our work.

An “Independent” Research Company?

We read in the rubric from the organisational sponsors of “[t]he consultation exercise, which was run by an independent research company on behalf of BACP, the British Psychoanalytic Council (BPC) and the UK Council for Psychotherapy (UKCP), … [and that] … More than 3,000 members and stakeholders submitted a comment as part of the consultation process. These are currently being analysed by the independent research company for the key themes, which will be published in the summer.” (my emphasis)

The phrase “was run by” needs to be carefully interrogated and unpacked. It is indeed potentially reassuring to be told that the research company “running” the consultation process was “independent”; but such cosy reassurance is of no substance unless respondents are told in detail what the term “running” actually means in practice. For example, to have any methodological confidence in the consultation’s reported findings, the public needs to know what written remit the “independent research company” was given by the sponsoring organisations prior to the consultation exercise. This is critical, because it needs to be totally transparent to what extent the research company is, indeed, genuinely “independent” – e.g. merely in the sense that they carried out the data-collecting exercise, or in the sense that they themselves decided on the questions to be asked in it, and how those questions were framed.

Moreover, regarding the analysis of the comments received, we also need to know what, if any, guidance was given to the “independent” company by the sponsoring organisations, in terms of how the company analyses and presents the qualitative findings. If this information is not completely transparent, respondents will have no way of knowing whether the presentation of the results is a fair and representative depiction of the actual feedback which respondents gave in their thousands.

The Questions Themselves

Regarding the actual questions posed in the consultation procedure: first, respondents were asked, “Q1a – How will the framework affect clients or patients being able to find the right kind of help to meet their needs?” (my italics).

Note that the tell-tale word “will” is used here, rather than “would”. If this were a genuinely open-minded consultation that hadn’t already pre-decided the desired outcome, the word “will” would most certainly not have been used in this question. Rather, the hypothetical “would” should and would have been used.

This is by no means a minor, semantics-oriented issue – for the way these questions are worded will have a major impact in creating the background “mood-music” to ease the driving through of any required institutional agenda. Those composing the wording of these survey questions will have been well aware of this (and if the sponsoring organisations weren’t, for any reason, then any reputable “research company” worth its salt certainly would have been).

In my view, and strictly speaking, respondents who were expecting a fair and open consultation, one that was not already positioning them by the way the questions were posed, should have refused to answer this question. A much fairer and more objective wording for this question would have been something like the following:

“Q1a – Would a framework like the one suggested have any impact, negative or positive, on clients/patients being able to find the right kind of help to meet their needs; and if so, how?”

The key point here is that such a question might well have yielded significantly different results from the question that was actually posed (on which, see below).

Indeed, all four consultation questions commit this elementary methodological error in using the weasel word “will”. So, in relation to question 1b, a fair, more objective wording would have been as follows:

“Q1b – Would a framework like the one suggested have any effect, negative or positive, on employers being able to establish which counsellors and psychotherapists to employ in their service; and if so, how?”

And for question 1c:

“Q1c – Would a framework like the one suggested have any effect, negative or positive, on trainees in their understanding of the pathways open to them for core training with adults; and if so, what and how?”

And finally for 1d:

“Q1d – Would a framework like the one suggested have any effect on professional bodies being able to promote the skills and services of their members; and if so, how?”

If the employed research company wished to test the reliability of the first reported consultation results, they could quite easily carry out a much smaller survey of practitioners who did not complete the first survey, using these alternatively worded questions. The results of such a survey would then provide clear evidence on the extent to which the original survey results are reliable and representative, or otherwise. Without doing such a follow-up, the reliability of the original survey results must remain in question.
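Purely by way of illustration, a comparison of that kind could be as simple as a two-proportion test on any single question. The Python sketch below is a minimal example and is not part of the consultation itself: the “original” figure is the 36 per cent of 5,878 BACP respondents reported in the letter above, while the follow-up figures are hypothetical placeholders invented solely to show the calculation.

```python
import math

def two_proportion_z(pos1: int, n1: int, pos2: int, n2: int) -> float:
    """Standard two-proportion z statistic comparing two positive-response rates."""
    p1, p2 = pos1 / n1, pos2 / n2
    pooled = (pos1 + pos2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Original survey figure for Q1a among BACP respondents, as reported: 36% of 5,878.
original_positive, original_n = round(0.36 * 5_878), 5_878

# Hypothetical follow-up using the re-worded question -- placeholder numbers only.
followup_positive, followup_n = 120, 400

z = two_proportion_z(original_positive, original_n, followup_positive, followup_n)
print(f"z = {z:.2f}  (|z| > 1.96 would suggest the wording made a real difference)")
```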

Finally, regarding the raw presented statistical results, it’s clear that even when we ignore the multiple biasing effects of the way in which the whole consultation process was conducted (referred to in detail above), around 25 per cent of respondents – a considerable minority – were not happy with the proposed framework. If I were one of the psy organisations wishing to see this framework implemented, I certainly wouldn’t be feeling at all triumphant about these initial results.

In Conclusion

I have raised a number of core methodological issues in this commentary, and I ask the sponsoring organisations to respond to these concerns in adequate detail.

If there is no full response, the silence will be deafening, and the many thousands of concerned practitioners will no doubt reach their own conclusions.

 

Dr Richard House, C.Psychol., AFBPsS, Cert.Couns.

Former Senior Lecturer in Psychotherapy, Counselling and Psychology, University of Roehampton; former PsychD research supervisor; former counsellor and psychotherapist in General Practice (1990–2007); author of Therapy Beyond Modernity (2003) and co-editor (with Del Loewenthal) of Against and For CBT (2008).

richardahouse@hotmail.com

 


 


 

 
