
ETHICAL ISSUES IN THE USE OF PSYCHOTHERAPEUTIC SOFTWARE

As with the implementation of any healthcare service, the application of psychotherapeutic software might raise a number of ethical concerns. First, we must consider whether such applications are safe and able to manage crisis situations effectively. Second, we must consider whether these programs represent a second best for clients and practitioners. Finally, we must consider issues of client trust and confidentiality: are these systems misleading, and are they robust when it comes to data storage?

First do no harm: is psychotherapeutic software safe?

One of the most common concerns about computer-mediated psychotherapy is the ability (or inability) to attend to crisis situations. People suffering from mental health problems who can benefit from psychotherapy are also a group who, on average, may be at increased risk of suicide and self-harm. Consider, for example, the population of depressed clients who might use Beating the Blues or MindStreet. Diagnosable depressive disorders are implicated in between 40 and 60 per cent of instances of suicide (Fowler et al., 1986; Clark and Fawcett, 1992; Henriksson et al., 1993), and it has been estimated that between 10 and 15 per cent of people diagnosed with major depressive disorder eventually kill themselves (Clark and Fawcett, 1992; Maris et al., 1992), a risk rate 20 times higher than population-expected values (Harris and Barraclough, 1997). If one of the psychotherapist's duties is to identify and manage patient risk, is safety compromised by the use of psychotherapeutic software? In a traditional face-to-face psychotherapy setting, the practitioner has an ethical (and often legal) responsibility to ensure the client's safety, or the safety of other individuals against whom an aggressive client has threatened harm. The question for the present discussion is whether those offering computerized therapy applications (and/or their developers) carry a similar ethical responsibility.

Where psychotherapeutic software is used adjunctively with face-to-face therapist contact, risk assessment and monitoring functions are not typically performed by the software. For example, in the delivery of FearFighter and other computer-delivered therapies (COPE, BALANCE, BT STEPS) at the London Stress Self-Help Clinic, patients are prescreened and those at risk are not offered a computerized intervention. In the case of MindStreet, where clients see a practitioner as well as the computer during each session, risk-monitoring remains in the hands of the human, not the program. Where psychotherapeutic software is largely a stand-alone intervention, risk assessment, monitoring and responsibility are considerably more complex. Therapists are appropriately sensitive to risk issues and may therefore be wary of computerized therapy on the grounds that clients need to be monitored for risks such as suicide. The individual or organization that offers computerized therapy may reasonably have a duty to prevent suicide, or any serious harm to self or others. The delivery of computerized therapy must therefore include a reliable system for the detection and appropriate communication of risk. Risk management is implicated in assessments of both client suitability and client progress.

Risk assessment: the suitability of the program

In the case of stand-alone psychotherapeutic software, risk assessment prior to the program is the practitioner's responsibility. Referral is managed by a health professional (usually the patient's family doctor or mental health worker), who must assess the patient's suitability for the program. Referring staff are advised that the program is not suitable for those contemplating suicide. For example, support staff working with Beating the Blues participate in training sessions and receive printed manuals in which this exclusion is reinforced.

Risk-monitoring: session by session

Risk status can change rapidly, and the initial assessment or screening may not be sufficient to detect dynamic safety issues. Where software is delivered independently, session-by-session monitoring is recommended. For example, Beating the Blues monitors patients' thoughts and plans of suicide at the beginning of each weekly session. An example interaction from the program can be seen below.

Risk-monitoring in Beating the Blues

Session 1: risk-monitoring (repeated at the beginning of each session)

Computer (voice-over, accompanied by screen graphic): Your doctor will also get a brief progress report each week. It will contain your anxiety and depression ratings, how distressing your problems have been and whether you have had any upsets or thoughts of suicide.

Computer (voice-over, accompanied by screen graphic and text): I need to ask you each week whether you have had any thoughts of suicide. Have you had any thoughts of suicide in the last week?

1. Yes

2. No

Patient: No

Screenshot 9.1 Thoughts of suicide

Where no suicidal ideation is indicated, the program moves on to a discussion of recent upsets and disappointments. Where the patient reports thoughts of suicide (response: yes), the program continues thus:

Computer (voice-over, screen text and graphic with multiple choice responses): How often have you thought about ending your life in the last week?

1. Once

2. Twice

3. Three times

4. More than three times

Patient: any response

Computer (voice-over, screen text and graphic with multiple choice responses): How seriously have you planned to carry it out? (0 = not very seriously ... 8 = very seriously)

Patient: response 0-4

Computer (voice-over and screen graphic): I'm sorry you've been thinking about suicide. Things have obviously been very bad for you. If you find your plans are getting any more serious, please stop using Beating the Blues and see your doctor or someone else who can help you.

Patient: response 5-8


Computer (voice-over and screen graphic): I'm concerned you've been thinking about suicide. Things have obviously been very bad for you. Please see your doctor or someone else who can help you straight away.
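The branching logic of this weekly interview amounts to a small decision tree. The sketch below is purely illustrative: the questions and the 0-8 seriousness scale follow the dialogue above, while the input/output handling and report fields are hypothetical choices of our own, not features of Beating the Blues itself.

# Illustrative sketch of the weekly risk-interview branching shown above.
# The questions and the 0-8 seriousness scale come from the dialogue; the
# I/O functions and report fields are hypothetical.

def ask(question, options):
    """Present a multiple-choice question and return the chosen option."""
    print(question)
    for number, option in enumerate(options, start=1):
        print(f"  {number}. {option}")
    return options[int(input("> ")) - 1]

def weekly_risk_interview():
    """Session-opening risk interview; returns data for the progress report."""
    report = {"suicidal_thoughts": False, "frequency": None, "seriousness": None}

    if ask("Have you had any thoughts of suicide in the last week?",
           ["Yes", "No"]) == "No":
        return report  # the program moves on to recent upsets and disappointments

    report["suicidal_thoughts"] = True
    report["frequency"] = ask(
        "How often have you thought about ending your life in the last week?",
        ["Once", "Twice", "Three times", "More than three times"])

    print("How seriously have you planned to carry it out?")
    print("  (0 = not very seriously ... 8 = very seriously)")
    report["seriousness"] = int(input("> "))

    if report["seriousness"] <= 4:
        print("I'm sorry you've been thinking about suicide. If you find your "
              "plans are getting any more serious, please stop using the "
              "program and see your doctor or someone else who can help you.")
    else:
        print("I'm concerned you've been thinking about suicide. Please see "
              "your doctor or someone else who can help you straight away.")
    return report  # active ideation is highlighted on the progress report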

The client's responses to the risk interview are indicated in full on his or her progress report. Where active suicidal ideation is recorded, this is highlighted on the progress report for the attention of his or her clinician. The limitations of this monitoring process lie in establishing a viable and effective safety net, such that risk information is appropriately communicated to a clinician or other person in accordance with local practice standards. For example, when delivered in primary care environments, the client's progress report can be delivered to and checked by a clinician before he or she leaves the building. Possible barriers to the effective implementation of such a patient-monitoring system include training and the resource constraints of delivering psychotherapy in primary care. It is imperative that the clinical helper (typically a nurse or administrative member of staff who supports the day-to-day running of the program) and the clinically responsible party (for example, the family doctor or therapist) understand the progress report information and make time to read and sign off the progress report each week. Furthermore, clinical helpers need training and support, or they may feel uncomfortable or ill equipped to deal with the risk information provided to them. Future programs might overcome these barriers by linking the therapeutic software to the local case management database, such that progress reports can be delivered instantaneously to the clinician's desk or a case management call centre, marked for urgency as appropriate to the client's level of risk. Again, local safety protocols must be applied where risk is identified.
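Such a link between the therapeutic software and a case management system might, for instance, tag each weekly report with an urgency level derived from the risk interview. The sketch below is speculative: the ProgressReport fields, urgency thresholds and queue interface are hypothetical illustrations, not features of any existing program.

# Speculative sketch of urgency-tagged progress-report routing, as described
# above. Field names, thresholds and the queue interface are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ProgressReport:
    patient_id: str
    anxiety_rating: int
    depression_rating: int
    suicidal_thoughts: bool
    seriousness: Optional[int]  # 0-8 scale; None if no ideation reported

    @property
    def urgency(self):
        if self.suicidal_thoughts and (self.seriousness or 0) >= 5:
            return "urgent"    # serious active ideation: immediate attention
        if self.suicidal_thoughts:
            return "elevated"  # ideation present: highlighted for the clinician
        return "routine"       # still read and signed off each week

def deliver(report, case_management_queue):
    """Send the report to the clinician's desk or a case management call centre.

    Local safety protocols must still be applied where risk is identified.
    """
    case_management_queue.append((report.urgency, report))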

Risk-monitoring: covert risk cues

It is possible that a client may not report suicidal intent, but their presenting problems may contain a risk cue, such as an intention to harm themselves or other people. At present, the majority of psychotherapeutic software in use does not employ natural language parsing: client problems are simply repeated back by the computer verbatim. Future programs could be designed to actively detect words and phrases such as '...kill myself', 'life isn't worth living' and 'I swear I'm going to make him pay...', to name but a few examples. In the case of such phrase detection, risk alerts, similar to those activated in overt risk-monitoring, could be presented to the clinician. However, such monitoring may contravene established confidentiality agreements on program uptake. In the case of Beating the Blues, users are informed that although their clinician will have access to risk information and mood and problem-monitoring, all other inputs to the program will remain confidential. The implications of a caveat permitting access to phrases that activate a risk indicator are unknown.
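At its simplest, such phrase detection could amount to surface matching of the client's free-text input against a clinically derived list. The sketch below is illustrative only: the phrase list and alert handling are hypothetical, and any deployed system would require clinical validation and, as noted, an explicit confidentiality caveat.

# Illustrative sketch of covert risk-cue detection by phrase matching, as
# described above. The phrase list and alert handling are hypothetical.

import re

RISK_PHRASES = [
    r"kill myself",
    r"life isn'?t worth living",
    r"going to make (him|her|them) pay",
]

def scan_for_risk_cues(client_text):
    """Return any risk phrases found in the client's free-text input."""
    lowered = client_text.lower()
    return [phrase for phrase in RISK_PHRASES if re.search(phrase, lowered)]

# A match would trigger a risk alert similar to those activated in overt
# risk-monitoring, presented to the clinician on the progress report.
if scan_for_risk_cues("Some days I feel life isn't worth living."):
    print("Risk alert: flag progress report for clinician review")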

Risk-monitoring: homicide risk

To our knowledge, no current computerized psychotherapy program is designed to assess, detect or direct cases of harm to others, homicidal ideation or intent, although such situations could presumably be handled in a similar manner to situations involving self-harm.

Safe, safer, safest

Despite vocal professional concern, there is little evidence to suggest that computers do not present a safe delivery vehicle for psychotherapeutic intervention. Indeed, psychotherapeutic software may offer better risk detection than a human psychotherapist. The computer interface may have a disinhibiting effect (Joinson, 1998), allowing clients to be more honest about their feelings and intentions. Thus, clients are quite willing to undertake computerized assessment, and are more likely to report sensitive information, including thoughts of suicide, to a computer than to a human practitioner (Erdman et al., 1985). Moreover, computerized risk assessments have been demonstrated to be more accurate in predicting suicide attempts than human therapists given the same information (Greist et al., 1973).

Is psychotherapeutic software a second best?

Effectiveness

The outcome studies available, although limited in number, indicate that psychotherapeutic software can benefit people in need (see Chapter 8). In the case of programs designed for the management of anxiety and depression using cognitive-behavioural techniques, treatment effect sizes are comparable to those found in studies where the same techniques are delivered in the traditional face-to-face manner (Ghosh and Marks, 1987; Selmi et al., 1990; Kenwright et al., 2001; Wright et al., 2002; Proudfoot et al., 2003a and b). No psychotherapy is a panacea but, in terms of clinical outcomes, there is no reason to believe that well-designed programs, offered to appropriate clients, are a second best to traditional psychotherapy.
