One of the most common concerns about computer-mediated psychotherapy is the ability (or
inability) to attend to crisis situations. People suffering from mental health problems who can
benefit from psychotherapy are also a group who, on average, may be at increased risk of suicide
and self-harm. Consider, for example the population of depressed clients who might use Beating
the Bines or MmdStreet. Diagnosable depressive disorders are implicated in between 40 and 60
per "cent of instances of suicide (Fowler et al., 1986; Clark and Fawcett, 1992; Henriksson et al.,
1993) and it has been estimated that between 10 and 15 per cent of people diagnosed with major
depres-sive disorder eventually kill themselves (Clark and Fawcett, 1992; Maris et al., 1992), a
risk rate 20 times higher than population-expected values (Harris and Barraclough, 1997). I! one
of the psychotherapists duties is to identify and manage patient risk, is safety compromised by
the use of psychotherapeutic software? In a traditional face-to-face psychotherapy setting, the
practitioner has an ethical (and often legal) responsibility to ensure the clients safety or the
safety of other individuals against whom an aggressive client has threatened harm. The question
for the present discussion is whether or not offering computerized therapy applications (and/or
their developers) carries a similar ethical responsibility. Where psychotherapeutic software is
used adjunctively with face-to-face therapist contact, risk assessment and monitoring functions
are not typically performed by the software. For example, in the delivery of FearFighter, and
other computer-delivered therapies (COPE, BALANCE, BT STEPS) at the London Stress Self-
Help Clinic, patients are prescreened and those at risk are not offered a computerized
intervention. In the case of MindStreet, where clients see a practitioner as well as the computer
during each session, risk-monitoring remains in the hands of the human, not the program. Where
psychotherapeutic software is largely a stand-alone intervention, risk assessment, monitoring
and responsibility may be more complex. Therapists are appropriately sensitive
to risk issues and may therefore be wary of computerized therapy on the grounds that clients
need to be monitored for risks such as suicide. The individual or organization that offers
computerized therapy may reasonably have a duty to prevent suicide, or any serious harm to self
or others. The delivery of computerized therapy must therefore include a reliable system for the
detection and appropriate communication of risk. Risk management is implicated in assessments
of both client suitability and client progress.
In the case of stand-alone psychotherapeutic software, risk assessment prior to the program is the
practitioner's responsibility. Referral is managed by a health professional (usually the patient's
family doctor or mental health worker), who must assess the patient's suitability for the program.
Referring staff are advised that the program is not suitable for those contemplating suicide. For
example, support staff working with Beating the Blues participate in training sessions and
receive printed manuals in which this exclusion is reinforced.
Risk status can change rapidly and the initial assessment or screening may not be sufficient to
detect dynamic safety issues. Where software is delivered independently, session-by-session
monitoring is recommended. For example, Beating the Blues monitors patients' thoughts and
plans of suicide at the beginning of each weekly session. Example interactions from the program
are shown below.
Risk-monitoring in Beating the Blues
Computer (voice-over, accompanied by screen graphic): Your doctor will also get a brief
progress report each week. It will contain your anxiety and depression ratings, how distressing
your problems have been and whether you have had any upsets or thoughts of suicide.
Computer (voice-over, accompanied by screen graphic and text): I need to ask you each week
whether you have had any thoughts of suicide. Have you had any thoughts of suicide in the last
week?
1. Yes
2. No
Patient: No
Where no suicidal ideation is indicated, the program moves on to a discussion of recent upsets
and disappointments. Where the patient reports thoughts of suicide (response: yes), the program
continues thus:

Computer (voice-over, screen text and graphic with multiple choice responses): How often have
you thought about ending your life in the last week?
1. Once
2. Twice
3. Three times
Computer (voice-over, screen text and graphic with multiple choice responses): How seriously
have you planned to carry it out? (response scale from 'not very seriously' to 'very seriously')
Computer (voice-over and screen graphic): I'm sorry you've been thinking about suicide. Things
have obviously been very bad for you. If you find your plans are getting any more serious please
stop using Beating the Blues and see your doctor or someone else who can help you.
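The branching logic of the weekly risk check illustrated above can be sketched in code. This is a minimal illustrative sketch only: the function, field and response names are assumptions made here, and do not reflect the actual internals of Beating the Blues.

```python
# Illustrative sketch of a weekly risk-screening branch, modelled on the
# Beating the Blues dialogue above. All names are hypothetical; the real
# program's implementation is not published in this text.

def weekly_risk_check(answers):
    """Return a risk record for the clinician's weekly progress report.

    `answers` maps question identifiers to the patient's responses.
    """
    record = {"suicidal_ideation": False, "frequency": None, "seriousness": None}

    if answers.get("thoughts_of_suicide") == "yes":
        record["suicidal_ideation"] = True
        # Follow-up questions are asked only when ideation is reported.
        record["frequency"] = answers.get("frequency")      # e.g. "once", "twice"
        record["seriousness"] = answers.get("seriousness")  # scale response
    # Otherwise the program moves on to recent upsets and disappointments.
    return record

report = weekly_risk_check({"thoughts_of_suicide": "yes",
                            "frequency": "twice",
                            "seriousness": "not very seriously"})
# A positive record like this would be highlighted on the progress report.
```

The design point the dialogue illustrates is that the follow-up questions are conditional: a patient who answers "no" is routed straight on, and only a "yes" response generates the detailed entries flagged for the clinician.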
The patient's responses are included in his or her progress report. Where active suicidal ideation
is recorded, this is highlighted on the progress report for the attention of his or her clinician. The
limitation of this monitoring process lies in the establishment of a viable and effective safety net
system, such that risk information is communicated to, and acted upon by, the responsible
clinician in a timely manner.
It is possible that a client may not report suicidal intent, but their presenting problems may
contain a risk cue, such as an intention to harm themselves or other people. At present the
majority of psychotherapeutic software in use does not employ natural language parsing. Client
problems are simply repeated back by the computer verbatim. Future programs could be
designed to actively detect words and phrases such as '... kill myself', 'life isn't worth living',
and 'I swear I'm going to make him pay ...', to name but a few examples. In the case of such
phrase detection, risk alerts, similar to those activated in overt risk-monitoring, could be
presented to the clinician. However, such monitoring may contravene established
confidentiality agreements on program uptake. In the case of Beating the Blues, users are
informed that although their clinician will have access to risk information, mood and problem-
monitoring, all other inputs to the program will remain confidential. The implications of a caveat
permitting access to phrases that activate a risk indicator are unknown.
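As a toy illustration of the phrase-detection idea, simple substring matching could flag such inputs. The phrase list and matching rule below are assumptions made for illustration, not any real program's method:

```python
# Minimal sketch of the phrase-detection idea described above: flag client
# free-text input containing risk-related phrases for the clinician's
# attention. The phrase list and matching rule are illustrative only; a
# real system would need careful linguistic and clinical validation.

RISK_PHRASES = ["kill myself", "life isn't worth living", "make him pay"]

def contains_risk_cue(client_text: str) -> bool:
    """Return True if any listed risk phrase appears in the input text."""
    text = client_text.lower()
    return any(phrase in text for phrase in RISK_PHRASES)

contains_risk_cue("Some days I feel life isn't worth living")  # True
contains_risk_cue("Work has been stressful lately")            # False
```

Note that naive substring matching would also trigger on negations such as 'I would never kill myself'. This is one reason why, as the text observes, detection beyond verbatim repetition would require natural language parsing, and why the confidentiality implications of any such monitoring remain unresolved.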
Despite vocal professional concern, there is little evidence to suggest that computers do not
present a safe delivery vehicle for psychotherapeutic intervention. Indeed, psychotherapeutic
software may offer better risk detection than a human psychotherapist. The computer interface
may have a disinhibiting effect (Joinson, 1998), allowing clients to be more honest about their
feelings and intentions. Thus, clients are quite willing to undertake computerized assessment,
and are more likely to report sensitive information, including information relating to suicide, to a
computer than to a human practitioner (Erdman et al., 1985). Moreover, computerized risk
assessments have been demonstrated to be more accurate in predicting suicide attempts than
human therapists given the same information (Greist et al., 1973).
Effectiveness
The outcome studies available, although limited in number, indicate that psychotherapeutic
software can benefit people in need (see Chapter 8). In the case of programs designed for the
management of anxiety and depression using cognitive-behavioural techniques, treatment effect
sizes are comparable to those found in studies where the same techniques are delivered in the
traditional face-to-face manner (Ghosh and Marks, 1987; Selmi et al., 1990; Kenwright et al.,
2001; Wright et al., 2002; Proudfoot et al., 2003a and b). No psychotherapy is a panacea but, in
terms of clinical outcomes, there is no reason to believe that well-designed programs, offered to
appropriate clients, are a second best to traditional psychotherapy.