
Approaches to Social Research
FOURTH EDITION
Royce A. Singleton, Jr.
College of the Holy Cross
Bruce C. Straits
University of California, Santa Barbara
New York Oxford
Oxford University Press
2005
To Nancy and Cathy

Oxford New York
Auckland Bangkok Buenos Aires Cape Town Chennai
Dar es Salaam Delhi Hong Kong Istanbul Karachi Kolkata
Kuala Lumpur Madrid Melbourne Mexico City Mumbai Nairobi
São Paulo Shanghai Taipei Tokyo Toronto

Copyright 1988, 1993, 1999, 2005 by Oxford University Press, Inc.
Published by Oxford University Press, Inc.
198 Madison Avenue, New York, New York 10016
www.oup.com
Oxford is a registered trademark of Oxford University Press
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of Oxford University Press.
Library of Congress Cataloging-in-Publication Data
Singleton, Royce.
Approaches to social research / Royce A. Singleton, Jr., Bruce C. Straits.-4th ed.
p. cm.
Includes bibliographical references and index.
ISBN 0-19-514794-4
1. Social sciences-Research. 2. Social sciences-Methodology. I. Straits, Bruce C. II. Title.
H62.S4776 2004
300'.72-dc22
2004043396

Printing number: 9 8 7 6 5 4 3 2
Printed in the United States of America
on acid-free paper
...ables increases by about eight with the inclusion of all intervening variables, and by about seven when all forms of extracurricular activities are included in the regression analysis.
18. Although Broh does not report any of the estimated coefficients for the two athlete dummy controls, there is indirect evidence that one or both have negative effects in the baseline modeling of math grades (model 1, Table 15.4), since the other two coefficient estimates are positive (.230 for athletes, .005 for intercept) and the weighted total effects across all students should sum to zero (the mean of the standardized math grades).
16
Research Ethics
Thus far we have dealt with the technical side of social research-with issues of research design, data collection, and analysis. Besides these technical aspects, there is another dimension to social science that must be considered-the moral dimension. When we think about how to conduct research, we must think not only of using the right techniques but also of rightly using the techniques we have learned. We must think about research ethics.

Ethics is a branch of philosophy and theology. Both theological ethicists, who define the field in terms of religious tradition and sacred texts, and philosophical ethicists, who define it strictly on the basis of reasoning independent of religious faith, are concerned with the same fundamental question: What ought to be done? Ethics is the study of "right behavior." For the social scientist, ethics poses questions concerning how to proceed in moral and responsible ways.
Ethical considerations underlie many decisions about research methods. Just as practical considerations can prevent researchers from implementing the ideal research design or obtaining as large or diverse a sample as desired, so too can ethical concerns constrain scientific inquiry. Ethics may prohibit researchers from using experimental treatments that could harm research participants, from asking questions that would prove extremely embarrassing or threatening, from making observations that would deceive or place subjects under duress, and from reporting information that would constitute an invasion of privacy.
There are three broad areas of ethical concern in scientific research: the ethics of data collection and analysis, the ethics of treatment of participants, and the ethics of responsibility to society (Reese and Fremouw, 1984). First, researchers are expected to be careful and forthright in observing, analyzing, and reporting findings. Being ethical in this sense is synonymous with being a good research scientist. Second, scientists have ethical obligations regarding the treatment of human subjects. Basic ethical principles accepted in our cultural and legal tradition demand that research participants be treated with respect and protected from harm. Finally, ethical concerns arise from the relationship between science and society, especially regarding the uses of scientific knowledge. Many social scientists believe that researchers have a responsibility to assess the possible uses of scientific findings, to promote their beneficial application, and to speak out against their destructive application.
In this chapter, we consider each of these areas of ethical concern; however, we give the greatest attention to the treatment of research participants. Historically, the most controversial studies, many of which are described below, have involved clashes between scientific practice and the rights and welfare of research participants. Concern about potential harm to subjects led to the codification and adoption of federal regulations regarding research practices (Singer and Levine, 2003). It also is a major focus of the ethical codes developed by professional societies such as the American Sociological Association and the American Psychological Association.
Data Collection and Analysis
Conducting ethical social research involves, first, researchers' obligations to one another and to their discipline to make sure that their data are sound and trustworthy. Because scientific progress rests upon the trustworthiness of findings from the work of many investigators, dishonesty and inaccuracy in reporting and conducting research undermine science itself. Scientific norms therefore demand intellectual integrity. Scientists are expected to be "unremittingly honest" in their observations and analyses, to be tolerant, questioning, and willing to admit error, and to place the pursuit of knowledge and understanding above personal gain or the promotion of a particular philosophy or ideology (Cournand, 1977). Violations of this ethical code range from manipulating data in order to obtain a desired result to the complete fabrication of data.
Unethical data manipulation can occur in many ways. A researcher may exclude certain cases from the analysis to achieve a significant difference between experimental conditions; fail to report results that contradict a favored hypothesis; and search for statistical tests, however inappropriate, that improve the appearance of the data by yielding significant results or larger effect sizes [for an example, see Berry (1991)]. The extent to which data are obscured in these ways is unknown; however, more extreme ethical violations involving the complete fabrication of data are thought to be exceedingly rare in science (Marshall, 2000). When, in 2001, social psychologist Karen Ruggiero retracted articles in two journals because she had fabricated the data, the editor of one journal remarked that he had never seen a retraction of this sort in the social psychology literature (Biernat and Crandall, 2001).
The Ruggiero case reveals the gravity of scientific fraud. Ruggiero's work, published in some leading psychology journals between 1995 and 2000, drew a great deal of attention. One 1995 paper was cited in over 50 studies. Her provocative central thesis-that racial and sexual discrimination are more widespread than most people think-also had important policy implications. Before the revelation of fraud, Ruggiero was a rising star in the discipline. After completing her PhD at McGill University in 1996, she became an assistant professor at Harvard, then moved to the University of Texas in 2000, where she quickly gained a reputation as an excellent teacher and colleague (Biernat and Crandall, 2001). Apparently, suspicions about the validity of her research arose when others could not replicate her findings. When a former research assistant asked Harvard to investigate, she admitted to using "invalid data" in her research studies at Harvard (Holden, 2001).
The impact of scientific fraud is far-reaching. For the scientist who is discovered to have published false data, the consequences are severe. Ruggiero was subjected to professional disgrace and to public humiliation from media coverage of the incident. She resigned her faculty position and almost certainly ended her career as a research scientist. "The ripple effect on the field," as Chris Crandall (2001:20) points out, "can be even worse." Because scientists build their research on the work of others, false leads can result in the loss of time, money, and energy for those who pursue them. Similarly, the publication of fraudulent data harms everyone associated with it: Graduate students working in the researcher's lab lose time and the trust of colleagues; the institution where the fraud took place must investigate the case and, if the research was supported by external funding, possibly return the funds; the image of the discipline may be tarnished; and the validity of the work of everyone who cites the invalid articles may be questioned.
Science is a public activity, carried out by a community of scholars who constantly evaluate each other's work. In the end, it was this scrutiny, specifically the inability of others to replicate her findings, that called into question Ruggiero's research. So, in one sense, the case shows that science's system of checks is working. Still, incidents of data manipulation and fabrication may be difficult to uncover. Even though fraud would seem most likely to surface when the research gains prominence, as did Ruggiero's, fraudulent data may remain undetected for years. It was well over a decade before it was discovered that Cyril Burt's data on twins (see Box 2.2), which were highly influential in research on the heritability of intelligence, were fabricated. Furthermore, replication may not be the most effective means of preventing or detecting fraud. Exact replications are not highly valued as scholarly work; they are expensive; and varying conditions may confound the interpretation of failures to replicate. Therefore, social researchers must resort to other methods to prevent scientific fraud.
Part of the fallout from the Ruggiero case was a discussion of the measures that should be taken by social scientists to prevent scientific misconduct. Among the recommendations were the following (Murray, 2002):
• Research institutions should educate students about scientific misconduct. Ethics training is mandatory for professional psychology programs accredited by the American Psychological Association, and is required for students working on NIH (National Institutes of Health) research grants. The topic of research ethics also should be an integral part of all courses in social research methods.
• Institutions, funding agencies, and individual researchers should periodically check data. The NIH conducts regular research audits. But because universities do not have the resources for intensive audits that the NIH has, we recommend at least random quality checks by principal investigators. Survey researchers, for example, should validate a sample of interviews for every interviewer.
• Investigators should prescribe specific criteria for the inclusion or exclusion of certain data, such as outliers, before data are collected and analyzed.
• In papers submitted for publication, researchers should provide detailed information on how they collected, processed, and analyzed their data, and journal editors should insist on such detailed accounting.
We cannot overemphasize the importance of carefully conducting research and honestly reporting findings. For the scientist, this is the most fundamental ethical dictum.
Treatment of Human Subjects
Four problem areas have been identified most often regarding the ethical treatment of human subjects: potential harm, lack of informed consent, deception, and privacy invasion (Diener and Crandall, 1978). Each of these problems arises when research practices violate basic human rights. It is considered a violation of basic rights to harm others, to force people to perform actions against their will, to lie to or mislead them, and to invade their privacy. While most social research poses no threat to these individual rights, there have been some ethically questionable studies in the social sciences. A review of these four issues will sensitize the reader to research situations that are potentially unethical as well as to strategies and guidelines that help to ensure subjects' rights.
Harm
All new physicians take an oath of ethical, professional behavior attributed to the Greek physician Hippocrates (460-377 BC). One of the first provisions of the Hippocratic oath is that the doctor "abstain from whatever is deleterious. . . ." These words of Hippocrates, advising that the physician do no harm, offer sound ethical advice for research scientists as well. The first right of any participant in a research project is the right to personal safety. Ethical researchers recognize this right and are careful to respect it. Research that would endanger the life or physical health of a human subject is simply not acceptable in the social science community. Even research that harms animals, although it might directly benefit humans, has been the focus of contemporary ethical concern [see, for example, Plous (1996) and Rowan (1997)].
The issue of harm is not quite so simple and straightforward as it may appear, however. For one thing, harm is sometimes difficult to define and predict. Given the nature of social science research projects, physical harm to subjects is highly unlikely. Yet not all harm is of a physical nature. People can be harmed personally (by being humiliated or embarrassed), psychologically (by losing their self-esteem), and socially (by losing their trust in others) through their participation in research that might never threaten their physical well-being (Diener and Crandall, 1978). Moreover, it is often difficult to predict whether, or the extent to which, one's investigative procedures will be harmful to research participants. The prison simulation study of Philip Zimbardo and colleagues (1973) is a good example. These investigators created a mock prison in the basement of a Stanford University building in which subjects role-played prisoners and guards. The study was scheduled to run two weeks but had to be terminated after only six days because of its unanticipated adverse effects on subjects. Guards physically and psychologically abused prisoners, and prisoners broke down, rebelled, or became servile and apathetic. The subjects got so caught up in the situation, became so absorbed in their roles, that they began to confuse role-playing and self-identity. While Zimbardo and colleagues intended to study how the roles of "guard" and "prisoner" influenced subjects' reactions, they never anticipated such extreme effects.
Besides the difficulty of predicting harm, most scientists would not adhere to the dictum that no harm whatsoever should ever come to research participants. Some researchers take the position that potential harm should be weighed against the benefits that might be derived from the research. This stance is implied by one criterion for approval of research in the Code of Federal Regulations (1995:116): "Risks to subjects are reasonable in relation to anticipated benefits. . . ." If there is little or no scientific value from a study that knowingly exposes subjects to harm, the study should not be done, no matter how small the harm. But if a study has considerable scientific merit, some degree of potential harm may be justified. For example, although some research on hypothermia requires subjecting informed volunteers to physical harm, such as by immersing them in cold water, this is justified by the potential scientific benefit of such an investigation.
A major difficulty with this approach lies in being able to assess the full extent of costs and benefits. Costs and benefits may be impossible to predict or to measure; and a cost-benefit analysis ignores individual rights, or at least makes them subservient to societal benefits and to pragmatic considerations. Research seems most justifiable when the person exposed to the risks will also receive the benefits of potentially harmful procedures. However, the benefits of much scientific research accrue not to the individual research participant but to the investigator, to science, or to the general public, and it is more questionable to justify costs to an individual solely on these grounds (Diener and Crandall, 1978).
In spite of these problems, a cost-benefit analysis can be a helpful first step in examining the ethics of a proposed study. One should also be sensitive to areas of study and to research procedures that pose the greatest risk of harm. The potential for doing harm to subjects may be highest in social research that investigates negative aspects of human behavior (e.g., aggression, obedience to malevolent authority, cheating). The principal arena for such research is in laboratory and field experiments. Through experimental manipulation, subjects may suffer a temporary loss of self-esteem or experience a high degree of stress, and as a result they may be embarrassed or may become angry about their involvement in research.
Consider, for example, the work of Stanley Milgram (1974) on obedience to authority. Under the guise of a teacher-learner experiment, Milgram asked subjects playing the role of teacher to deliver to learners what the subjects thought were dangerously high levels of electric shock. It goes without saying that the learner, a confederate of the experimenter, was not actually being shocked. But to the subjects this was a highly stressful conflict situation: Should they obey the experimenter in administering the shocks or should they refuse to continue in the experiment? The subjects showed many obvious signs of stress; indeed, one subject had a convulsive seizure that made it necessary to terminate his participation. Milgram, in turn, was severely criticized for not protecting his subjects from potential harm. For example, he made no effort to determine before their participation whether subjects should be excluded from the experiment for physical or psychological reasons. Some researchers also questioned the long-term effects that the experiment might have had on subjects' self-concepts. What would subjects think of themselves knowing that they were capable of inflicting pain on another person?
Field experiments present even greater problems. In these settings the researcher may find it virtually impossible to intervene when subjects are about to experience harm. The laboratory setting guarantees a certain amount of control over subjects' behavior, allowing for intervention if necessary, but that control may be altogether absent in a field setting. Bibb Latane and John Darley (1970), for example, staged a crime (looting of a liquor store) to explore the conditions under which bystanders would intervene to help. One bystander in their field experiment telephoned the police, who showed up with guns drawn to arrest the researchers. This was a situation in which considerable harm could have come to both subjects and researchers.

The ethical issue of harm is much less a problem for survey researchers and participant observers than it is for experimentalists, but even they must be alert to the potential for doing harm. Survey researchers can harm people by asking threatening questions. Participant observers can harm people by their own active involvement as participants. William Foote Whyte (1981:313), for example, reports that during the course of his study of Cornerville, he voted four different times in the fall 1937 congressional election. Whatever damage might have been done to the opposition candidate by Whyte's illegal actions, while probably insignificant, is not irrelevant.
Aside from assessing the risk of harm to research participants and designing one's research to minimize such risks, the researcher needs to be aware of several widely adopted ethical principles that are designed to protect participants from harm (see Diener and Crandall, 1978).

1. Researchers should inform subjects of any reasonable or foreseeable risks or discomforts before the study begins and should give subjects sufficient opportunity to consider whether to participate. Indeed, federal regulations mandate such "informed consent," discussed below, for all federally funded research. A major criticism of Milgram's experiment, which predates the latter regulation, is that he did not obtain prior permission from subjects to allow him to place them in a highly stressful conflict situation.
2. Where appropriate, researchers should screen out research participants who might be harmed by the research procedures. In their prison simulation study, Zimbardo and associates (Zimbardo, 1973) gave several personality tests to volunteers in order to select subjects with "normal" personality profiles and thereby minimize the possibility of destructive effects. Another criticism of Milgram's experiment is that he failed to administer examinations before the experiment to determine whether subjects suffered from psychological or physical problems that might have excluded their participation.
3. If stress or potential harm is possible, measures should be taken to assess harm after the study, and research participants should be informed of procedures for contacting the investigator. The debriefing session in experiments can help to assess as well as ameliorate negative reactions. But if long-lasting effects are possible, the researcher has a special obligation to conduct follow-up interviews and possibly to provide counseling. Zimbardo (1973) held an encounter session after his study to allow subjects to express their feelings. He also conducted follow-up interviews to assess the impact of the experience and found no evidence of long-lasting negative effects. Indeed, most subjects regarded the experiment as a valuable learning experience. Milgram (1974) also carefully questioned his subjects, interviewing all of them immediately after the experiment and sending them reports of the study and follow-up questionnaires asking for their reactions to their participation in his research. His ultimate ethical justification for this research was that it was judged acceptable by those who took part in it.
Informed Consent
The second ethical issue arises from the value placed on freedom of choice in Western societies. For moral and legal reasons, subjects should not be coerced into participating in social research. Not only must subjects understand that their participation is voluntary, they must also be given enough information about the research to make an informed decision about whether to participate. In other words, researchers should obtain the explicit or implicit informed consent of their subjects to take part in an investigation.
Just how much information about the research must be conveyed to subjects for them to exercise their informed consent is not always clear and depends largely on the nature of the research. Full disclosure of the research purpose and procedures is usually not necessary, although subjects generally should be given some explanation of the general purpose of the research and who is sponsoring it. Minimally, they should be told that their participation is voluntary and that they are free to withdraw from the study at any time; moreover, they must be given a clear description of the risk of harm involved and of personal rights that might be jeopardized by their participation. Milgram's subjects, for example, should have been told that they would feel stress and that this stress conceivably could have harmful effects.
Ethical regulations for federally funded research dictate that a written consent form, signed by the subject or the subject's legal guardian, must be used when more than "minimal risk" of harm is anticipated. Minimal risks refer to risks that are no greater than those ordinarily encountered in daily life (Code of Federal Regulations, 1995). In such cases, informed consent protects both subjects and researchers. Subjects are protected from harm by being able to make up their own minds about the risks of participation; researchers are legally protected by subjects' explicit voluntary agreement. However, although written informed consent is accepted practice in biomedical research, it has several limitations as an ethical safeguard and is not always desirable in social research.
As Edward Diener and Rick Crandall (1978) point out, it is often difficult, even in biomedical research, truly to inform subjects about all the risks of research, since these are not always known. Moreover, the subject's consent to participate does not remove the researcher's responsibility to minimize danger to the subject, and it should never be used to justify other unethical practices. Finally, the use of informed-consent procedures presents methodological problems for several kinds of studies. Research by Eleanor Singer (1978) has shown that requiring a signature on a consent form reduces the response rate and elicits more socially desirable responses in surveys. And in laboratory experiments, the provision of full information about the study can completely undermine its validity. As the concept of demand characteristics implies, subjects who are told the true purpose of the study may not behave naturally. It is not surprising, then, that studies that convey hypothesis-related information in their informed-consent procedures have failed to replicate findings of studies not containing such information (Adair, Dushenko, and Lindsay, 1985).
While it is clear that documentation of consent and full disclosure of research purposes and procedures can present methodological problems, there are ways of circumventing these problems while following the doctrine of informed consent. In survey research, obtaining a signature to document consent "seems unnecessarily burdensome," as Singer (1978:159) has noted, given that the "same protection is afforded respondents by the right to refuse the interview, or to refuse to answer particular questions within the interview." In fact, the Code of Federal Regulations (1995:110) does not require written consent for surveys unless (1) the information collected is recorded so that respondents can be identified and (2) disclosure of the information could place respondents at risk of criminal or civil liability or damage respondents' reputation (e.g., if respondents are asked about sensitive topics such as sexual behavior, drug abuse, or illegal conduct). Federal regulations also provide for a waiver of documentation or an alteration of some of the elements of informed consent when these would adversely affect the study. However, the waiver of documentation can only be made when the research involves minimal risk to subjects. Finally, it is common in medical and experimental research today to forewarn subjects that a full disclosure of the purposes of the research is not possible until after their participation. They might be told, in addition, that they may be in one of several treatment conditions, but that the study results would be invalid if they knew their assigned condition prior to the conclusion of the research.
Field experiments and disguised or covert participant observation present the greatest ethical risk from the standpoint of informed consent. In both types of studies, the researcher's desire to observe subjects' spontaneous and natural behavior is incompatible with the acquisition of consent: To obtain informed consent destroys subjects' naivete and defeats the purpose of the study. Whether such research is regarded as unethical depends, for some people, on other ethical considerations, such as invasion of privacy, risk of harm, and the costs incurred in terms of time and money. If the research does not invade the subjects' privacy, is harmless, and is not costly to the subjects, informed consent may be ethically unnecessary. In this sense, testing the effects of different appeals when soliciting donations for a charitable organization, as Robert Cialdini and David Schroeder (1976) did in a field experiment, would not be considered ethically questionable, because subjects were not at risk and their rights were not violated. However, Latane and Darley's 1970 field experiment involving the staging of a crime would be ethically questionable because subjects were exposed to considerable stress and risk of harm.
One of the most controversial studies involving covert participant observation was Laud Humphreys's study (1975), mentioned in chapter 10, of sexual encounters in public restrooms. Humphreys posed as a voyeur and "watchqueen," whose job was to warn homosexuals of intruders as they engaged in fellatio. He also recorded the license numbers of these men, traced their identities through the Department of Motor Vehicles by misrepresenting himself as a market researcher, and later interviewed them in their homes after changing his appearance so that he would not be recognized. Despite the fact that Humphreys carefully guarded the confidentiality of his subjects, this study now is considered ethically indefensible by many social scientists. Among several other problems, Humphreys failed to obtain his subjects' informed consent and risked doing serious damage to their psyches and reputations.
Disguised participant observation studies such as the one by Humphreys are relatively rare and do not always pose such dangers. No matter what the apparent risk of harm, however, this kind of research is invariably controversial. In contrast to the relativist ethical judgments about field experiments, some social scientists take the absolutist position that research simply should not be done where investigators deliberately misrepresent their identity in order to enter an otherwise inaccessible social situation. Sociologist Kai Erikson (1967:368), for example, argues that this kind of research "can injure people in ways we can neither anticipate in advance nor compensate for afterward"; that it "may be painful to the people who are . . . misled; and even if that were not the case, there are countless ways in which a stranger who pretends to be something else can disturb others by failing to understand the conditions of intimacy that prevail in the groups he has tried to invade." In regard to this kind of research, Erikson (1967:368) also reiterates one of the most basic assumptions of informed consent:

If we happen to harm people who have agreed to act as subjects, we can at least argue that they knew something of the risks involved and were willing to contribute to that vague program called the "advance of knowledge." But when we do so with people who have expressed no readiness to participate in our researches (indeed, people who presumably would have refused if asked directly), we are in very much the same ethical position as a physician who carries out medical experiments on human subjects without their consent.
Deception
Deception, the third area of ethical concern, in some ways is the most controversial. On the one hand, deception is a widely used and accepted practice in social research, especially in experiments; one study found that, in 1983, 58 percent of the empirical studies reported in three major social psychology journals used some form of deception (Adair, Dushenko, and Lindsay, 1985). The most common deception involves misleading subjects or respondents about the purpose of the study. A cover letter for a survey, for example, might indicate that the study's objective is to examine general beliefs about health when, in fact, the investigators are interested specifically in their respondents' knowledge of and beliefs about the relationship between smoking and lung cancer.
In their epileptic seizure experiment, described in chapter 6, Darley and Latane (1968) told subjects that they were interested in the kinds of personal problems faced by college students when in reality they were testing subjects' willingness to intervene in an emergency. They also deceived subjects about the reasons for the experimental setup, explaining that it was necessary to separate subjects to avoid the embarrassment of face-to-face interaction and that the experimenter would not be present lest they feel inhibited by his presence. The actual reasons for these conditions were to allow the experimenters to simulate the discussion of other subjects and to remove the experimenter from the scene of the emergency. Other frequent forms of deception in experiments are using confederates to mislead subjects about research purposes and tasks, as well as providing false feedback about subjects' own behavior as a way of manipulating their feelings and thoughts.
The basic rationale for deception is that it is necessary in order to place research participants in a mental state where they will behave naturally. If subjects know the true purpose of a study, the results are meaningless. As we have seen, subjects typically will act so as to present the most favorable impression of themselves or to help out the researcher by confirming the hypothesis. Deceiving subjects about the true purpose of a study diverts their attention from the hypothesis and enhances experimental realism by giving subjects a believable and engrossing explanation for what they are doing. Defenders of deception also maintain that without it one simply could not effectively study behavior that people normally find objectionable, such as aggression, conformity, cheating, or failing to aid others in an emergency.
On the other hand, there are strong and vocal opponents of deception. Perhaps the most vocal is psychologist Diana Baumrind (1985:165), who argues that "intentional deception in the research setting is unethical, imprudent, and unwarranted scientifically." Deception is unethical, according to Baumrind, because it violates a subject's right to informed consent (i.e., consent obtained by deceit, by definition, cannot be informed) and violates the trust implicit in the investigator-subject relationship. It is imprudent because it ultimately damages the credibility of behavioral scientists as well as trust in other expert authorities. And it is unwarranted scientifically because deceptive practices do not accomplish the scientific objectives that justify their use. Baumrind claims that the almost routine use of deception in experiments is common knowledge among some groups of subjects (presumably college students), which makes them suspicious and unlikely to accept the experimenter's cover story. Because of this, deception may not produce the naive and spontaneous behavior that it is designed to elicit, thereby making experimental results inherently ambiguous.
Despite these objections, the prevailing sentiment among social scientists is not to rule out deception entirely. The codes of ethics of both the American Psychological Association (APA) and American Sociological Association (ASA) allow for deception. Since describing the whole purpose of the study beforehand invalidates most social research, omitting such information is considered a mild and acceptable form of deception as long as none of the omitted information concerns serious risks. However, because of the legitimate concerns expressed by Baumrind and others, deceptions of greater magnitude, such as telling direct lies to subjects, using confederates, or deliberately misrepresenting oneself, warrant special attention. The ASA code (1997) also states that

(a) Sociologists do not use deceptive techniques (1) unless they have determined that [its] use will not be harmful to research participants; [and] is justified by the study's prospective scientific, educational, or applied value.
(b) Sociologists should never deceive research participants about significant aspects of the research that would affect their willingness to participate, such as physical risks, discomfort, or unpleasant emotional experiences.
(c) When deception is an integral feature of the design and conduct of research, sociologists attempt to correct any misconception that research participants may have no later than at the conclusion of the research.
John Adair and colleagues (1985) point out that the negative consequences in deception research are usually minimal and that there is a lack of viable alternative methodologies. Therefore, the deception dilemma may be rectified best by the ASA's point (c)-adequate debriefing.
Debriefing. Debriefing serves methodological and educational as well as ethical purposes; ideally, it should occur in all studies with human participants, not just those studies involving deception. By interviewing subjects after their participation, researchers may gain valuable information about subjects' interpretations of research procedures; furthermore, by understanding the nature of the study, subjects can gain a greater appreciation for their research experience. If subjects are deceived, however, then the debriefing session becomes critically important. Not only must the researcher explain the true purpose of the study and the reasons for the deception, he or she must do so with great care and sensitivity.
Researchers must be alert to the fact that, when exposed to the truth, subjects may feel embarrassed or angered about having been "fooled" and may harbor resentment toward the investigator and toward social research in general. To obviate such feelings, investigators have developed elaborate debriefing techniques (see Carlsmith, Ellsworth, and Aronson, 1976; Mills, 1976). While we will not describe these techniques in detail, certain common aspects deserve mention. First, it is best to debrief subjects as soon after their participation as possible, especially if the deception or its revelation is likely to cause discomfort. Second, the debriefing should be carried out slowly and deliberately, first eliciting subjects' reactions and then gradually explaining the nature of the experiment until subjects fully understand every point. Third, since negative feelings about being deceived are worsened when the deceiver is smug about it, researchers can relieve some of their subjects' discomfort by expressing their own discomfort about the necessity of using deception in order to arrive at the "truth." Fourth, researchers should point out to subjects that if the experiment works well-if the cover story is convincing-then virtually everyone gets fooled. Finally, above all, researchers should follow Herbert Kelman's (1968:222) guideline "that a subject ought not to leave the laboratory with greater anxiety or lower self-esteem than he [or she] came in with."
Research on the effects of deception and debriefing indicates that, in general, carefully administered debriefing is effective. Stevens Smith and Deborah Richardson (1983) found that subjects who were deceived and subsequently debriefed reported more positive experiences-for example, greater enjoyment and greater educational benefit-from their research participation than did subjects who were not deceived and, as a consequence, received less adequate debriefing. Indeed, the final word on deception may be the finding of another study of subjects' reactions: "[I]t appears that subjects are willing to accept or tolerate certain discomfitures or unpleasantries if they are viewed as necessary elements of a scientific enterprise. Thus, learning that they had been deceived . . . enhanced the subject's assessment of the experiment's scientific value; elaborate deceptions are apparently viewed as good social science methodology!" (Straits, Wuebben, and Majka, 1972:515)
Privacy
The idea of the right to privacy goes back to antiquity. For example, Hippocrates' oath promises: "Whatever . . . I see or hear, in the life of men, which ought not to be spoken of abroad, I will not divulge, as reckoning that all such should be kept secret." Despite its ancient origins, however, the moral claim to privacy was not widely respected as a fundamental right until the last few centuries. The Industrial Revolution made physical privacy possible, and political democracies granted and increasingly protected the privacy of individual belief and opinion (Ruebhausen and Brim, 1966). Today, invasion of privacy remains a public concern as a result of widely publicized accounts of government wiretapping, police entrapment, and corporate drug testing.
The right to privacy is the individual's right to decide when, where, to whom, and to what extent his or her attitudes, beliefs, and behavior will be revealed. Social research presents many possibilities for invading the privacy of research participants, and it is essential that researchers be sensitive to the ways in which their actions can violate this basic right.
The dramatic case of the Wichita Jury Study in 1954 shows how social research can come into direct conflict with the value of privacy (Vaughan, 1967). In an effort to understand and perhaps even improve the operations of juries, researchers secured the permission of judges to record six actual jury deliberations in Wichita, Kansas, without the knowledge of the jurors. When news of the study became known, it was roundly criticized by columnists and commentators across the country, was investigated by a Senate subcommittee, and led ultimately to the passage of a law prohibiting the recording of jury deliberations. The argument against this study was that jury deliberations must be sacrosanct to protect the inalienable right to trial by impartial jury. Surveillance "threatens impartiality to the extent that it introduces any question of possible embarrassment, coercion, or other such considerations into the minds of actual jurors" (Vaughan, 1967:72).
As this study shows, one way in which subjects' privacy can be invaded is through the use of concealed devices such as one-way mirrors, microphones, and cameras. If such devices are used with subjects' knowledge and consent, they pose no problem. If they are used without subjects' knowledge to record behavior in public places (e.g., restaurants and waiting rooms), they also are acceptable to many researchers so long as subjects remain anonymous and are not at risk. But when hidden recording devices are used to observe behavior in private settings to which the research participant would not ordinarily allow the researcher access, an invasion of privacy occurs. Besides juries, other settings that are considered private are homes, personal offices, closed meetings, and physicians' examining rooms (Diener and Crandall, 1978).
Closely related to the use of concealed recording devices is the use of a false cover to gain information that subjects would not reveal if their informed consent were obtained. This became a major problem in the second phase of Laud Humphreys's study, mentioned above, when he got the names of men he had observed performing homosexual acts and interviewed them in their homes. When Humphreys observed these men in public restrooms, he did not know their names or other details of their private lives. But the identifying information he subsequently obtained intruded on his subjects' privacy and, in the worst of circumstances, could have led to legal difficulties or even blackmail. (Unlike physicians, lawyers, and the clergy, social scientists are subject to subpoena and cannot promise their respondents legal immunity.)
Whether we define access to information as an invasion of privacy will depend on how private that information is. Humphreys's research drew attention not just because he used questionable means to procure information, but also because he was investigating a sensitive area-sexual behavior. Clearly, some information is considered more private or sensitive than others. Among the most sensitive and threatening areas are sexual behavior and illegal activities. Researchers investigating these areas have a special obligation to protect the privacy of their informants.
Anonymity and confidentiality. No matter how sensitive the information, however, ethical investigators protect the right to privacy by guaranteeing anonymity or confidentiality. Obviously, information given anonymously secures the privacy of individuals, but this safeguard is usually possible only in surveys using self-administered questionnaires without names attached or in some available data studies. Most often the investigator can identify each individual's responses; therefore, the principal means of protecting research participants' privacy is to ensure confidentiality. The researcher can do this in a variety of ways: by removing names and other identifying information from the data as soon as possible, by not disclosing individuals' identities in any reports of the study, and by not divulging the information to persons or organizations requesting it without the research participant's permission.
Laud Humphreys defended his research partly in terms of the steps he took to ensure confidentiality, such as destroying all data containing personally identifying information after the completion of his study. Likewise, the researchers in the Wichita Jury Study acted to protect privacy by destroying the original recording of each jury deliberation after transcribing the recording and editing the transcript so as to avoid the identification of any of the persons involved. The Census Bureau protects confidentiality in a variety of ways, for example, by not releasing individual responses to the census of population and housing for seventy-two years-a person's average lifetime-and, when releasing the Public Use Microdata Sample, suppressing identifying information.
Field research usually requires more ingenuity to safeguard anonymity and confidentiality. The traditional approach is to use fictitious names for individuals, groups, and locations, although this alone may not be sufficient to prevent people from recognizing themselves and others. For example, in a study of the community of "Springdale," a small town in upstate New York, the researchers promised their informants that no individuals would be identified in printed reports. However, when Arthur Vidich and Joseph Bensman (1958) published their research in a book, the people of the town could clearly identify each character in spite of the authors' use of pseudonyms. The townspeople were so outraged by the transparency of their characterizations and the consequent invasion of their privacy that they featured a float in the annual Fourth of July parade with a large-scale copy of the jacket of the book, Small Town in Mass Society. This was followed first by residents "riding masked in cars labeled with the fictitious names given them in the book" and then by a manure spreader, with an effigy of the author Vidich bending over the manure (Whyte, 1958).
Because Vidich and Bensman reported private material without protecting the anonymity or obtaining the consent of their informants, they were severely criticized by other social scientists. To remove the possibility of recognition, the authors might have altered some of the information about people, such as their family background, occupation, or other intimate details of their lives, or they might have developed composite characters based on more than one informant. Perhaps the best solution, however, is to ask the subjects themselves if the material considered for presentation or publication is acceptable to them. This is the strategy adopted by Bettylou Valentine in her study of a community called "Blackston."
Valentine (1978:166) believed that some intimate details of people's lives involving family size, family structure, and interrelationships were relevant to important points she wanted to make. She "did not see how it would be possible to disguise the people enough to make them unrecognizable even to themselves and at the same time accurately illustrative of the Blackston community." Therefore, after she had completed a draft of her manuscript, she sent copies to all the major characters. She explained that her story might be published in the future, and she asked each person (1) whether her account was accurate and fair; (2) whether any material would be embarrassing to anyone; and (3) whether they had any comments, corrections, or other reactions. Finally, she subsequently returned to Blackston to talk to several of the persons involved. As a result of these contacts, Valentine not only worked out additional disguises that protected the privacy of her informants but also gained valuable insights and suggestions for her book.
Making Ethical Decisions
Ethical issues arise in social research when conflicts occur between societal values such as freedom and privacy and scientific methods aimed at obtaining the highest quality data. In the preceding sections we have identified some areas of potential conflict-harm to participants, involuntary participation, intentional deception, and an invasion of privacy. We also have examined some current resolutions of these ethical issues. It should be clear from our discussion that there are no easy answers; indeed, frequently there is considerable disagreement among reviewers about the ethicality of research proposals that raise ethical issues (Ceci, Peters, and Plotkin, 1985). With this in mind, how does the researcher decide what to do?
Some social scientists, such as Diana Baumrind (1971:890), take the position that "scientific ends, however laudable they may be," should never justify the use of means, such as lying to subjects, that violate fundamental moral principles or sacrifice the welfare of research participants. In philosophy, this ethical position is known as deontology: Basic moral principles should allow no exceptions, no matter what the consequences. By contrast, the operating ethical philosophy of most social scientists today-the philosophy behind professional ethical codes and federal ethical guidelines for research-is basically teleological: The morality of acts should be judged in relation to the ends they serve. The overall guiding principle is that "the potential benefits of the research (e.g., advancement of scientific knowledge, beneficial technological applications, advantages to subjects) must be weighed against the potential costs (e.g., harm to subjects, detrimental technological applications)" (Schlenker and Forsyth, 1977:371-72). As we saw in our discussion of harm, cost-benefit analyses do not always help to resolve ethical dilemmas. Nonetheless, it is from this guiding cost-benefit principle that other rules for the conduct of social research have been derived:

These include obtaining informed consent; remaining open and honest with the participants; respecting the participants' freedom to decline participation; insuring the confidentiality of the participants' data; protecting the participants from physical and mental discomfort, harm, and danger; completely debriefing the participants; and removing any undesirable effects of the research. (Schlenker and Forsyth, 1977:371)

Although exceptions sometimes are made to these rules, these exceptions must be based on a careful analysis of the possible benefits and costs of the study.
Initially, the individual researcher is responsible for examining the ethics of a study and its prospective benefits and costs. Of course, when attempting to make difficult ethical decisions, it is always a good idea to solicit others' opinions. Celia Fisher and Denise Fryberg (1994) argue that this should include the opinions of participants, especially in deception research. To assess the potential impact of deception as well as enhance the protection of subjects, they propose that researchers survey prospective participants on ethical issues at the initial stages of planning a study. Subjects could be asked, for example, about the scientific value of the study, the relative advantage of deception and alternative procedures, their psychological reactions to experimental manipulations, and the efficacy of debriefing procedures in alleviating psychological discomfort. In this way, research participants become partners in the process of ethical decision making.
Increasingly, however, the ultimate decision about whether a given study will be conducted rests not with the researcher but with a committee responsible for reviewing research proposals involving the use of human (and animal) subjects. The Department of Health and Human Services (DHHS), as well as most other federal agencies, institutes, and foundations, requires the approval of all research proposals by a human subjects committee (called an institutional review board, or IRB) as a precondition for the release of its funds. Virtually every college and university in the United States and most tax-exempt private research foundations have IRBs. And, in recent years, these institutions increasingly have mandated the IRB approval of all research involving human participants, not just research funded by the federal government (Singer and Levine, 2003).
According to federal regulations (Code of Federal Regulations, 1995), each IRB has at least five members, with varying backgrounds that ensure the adequate review of research proposals. To provide a diversity of expertise, the members must include at least one nonscientist (such as a lawyer, ethicist, or member of the clergy) and at least one member not affiliated with the research institution, as well as persons competent to review specific research activities (such as a social scientist in the case of social research). Investigators submit written documents to the IRB that describe the proposed research and specifically outline how research participants' rights are to be protected, such as provisions for informed consent and measures to ensure confidentiality. IRBs then approve, modify, or disapprove the research according to their interpretation of federal regulations outlined by DHHS.

Besides federal regulations, social scientists are guided by ethical codes for the treatment of research participants developed by professional societies. Box 16.1 provides excerpts from the ethical codes of three such societies: the American Anthropological Association (1998), the American Psychological Association (2002), and the American Sociological Association (1997). These codes cover ethical responsibilities not only to those studied, but also to the profession, the public, and students. In the next section, we examine ethical responsibilities to society.
BOX 16.1
Codes ofProfessional Ethics
The fo llowing swtements are excerpt s from the profession.al codes ofethi cs of three
nat ionalorganiza tions: the Ameri can Anthropological Association (AAA), the Amer-
ican Psyc ho logica l Associati on (MA ), 3nd the American Sociological Association
(ASA). You can read the complete codes ofethi cs by visiting the ir Web sites.
Professional Practice in the Conduct of Research
Anthropological researchers bear responsibility for the integrity and reputation of their discipline, of scholarship, and of science. Thus anthropological researchers are subject to the general moral rules of scientific and scholarly conduct: They should not deceive or knowingly misrepresent (i.e., fabricate evidence, falsify, plagiarize), or attempt to prevent reporting of misconduct, or obstruct the scientific/scholarly research of others. (AAA)

Psychologists seek to promote accuracy, honesty, and truthfulness in the science . . . of psychology. In these activities psychologists do not steal, cheat, or engage in fraud, subterfuge, or intentional misrepresentation of fact.

Psychologists do not fabricate data.

If psychologists discover significant errors in their published data, they take reasonable steps to correct such errors in a correction, retraction, erratum, or other appropriate publication means.

Psychologists do not present portions of another's work or data as their own. . . . (APA)
Sociologists adhere to the highest scientific and professional standards and accept responsibility for their work.

In planning and implementing research, sociologists minimize the possibility that results will be misleading.

Sociologists do not fabricate data or falsify results in their publications or presentations.

In presenting their work, sociologists report their findings fully and do not omit relevant data. They report results whether they support or contradict the expected outcomes.

Sociologists take particular care to state all relevant qualifications on the findings and interpretation of their research. Sociologists disclose underlying assumptions, theories, methods, measures, and research designs that might bear upon findings and interpretations of their work. (ASA)
Treatment of Research Participants
Anthropological researchers must do everything in their power to ensure that their research does not harm the safety, dignity, or privacy of the people with whom they work, conduct research, or perform other professional activities.

Anthropological researchers must determine in advance whether their hosts/providers of information wish to remain anonymous or receive recognition, and make every effort to comply with those wishes. Researchers must present to their research participants the possible impacts of the choices, and make clear that despite their best efforts, anonymity may be compromised or recognition fail to materialize.

Anthropological researchers should obtain in advance the informed consent of persons being studied. . . . (AAA)
When psychologists conduct research . . . they obtain the informed consent of the individual or individuals using language that is reasonably understandable to that person or persons except when conducting such activities without consent is mandated by law or governmental regulation or as otherwise provided in this Ethical Code.

Psychologists discuss with persons . . . and organizations with whom they establish a scientific . . . relationship the relevant limits of confidentiality and the foreseeable uses of the information generated through their psychological activities.

Psychologists do not conduct a study involving deception unless they have determined that the use of deceptive techniques is justified by the study's significant prospective scientific, educational, or applied value and that effective nondeceptive alternative procedures are not feasible.

Psychologists do not deceive prospective participants about research that is reasonably expected to cause physical pain or severe emotional distress.

Psychologists explain any deception that is an integral feature of the design and conduct of an experiment to participants as early as is feasible, preferably at the conclusion of their participation, but no later than at the conclusion of the data collection, and permit participants to withdraw their data. (APA)
Sociologists have an obligation to ensure that confidential information is protected.

Informed consent is a basic ethical tenet of scientific research on human populations.

Sociologists take steps to implement protections for the rights and welfare of research participants and other persons affected by the research.
In their research, sociologists do not encourage activities or themselves behave in ways that are health- or life-threatening to research participants or others. (ASA)
Responsibility to the Public
Anthropological researchers should make the results of their research appropriately available to sponsors, students, decision makers, and other nonanthropologists. In doing so, they must be truthful; they are not only responsible for the factual content of their statements but also must consider carefully the social and political implications of the information they disseminate. They must do everything in their power to insure that such information is well understood, properly contextualized, and responsibly utilized. They should make clear the empirical bases upon which their reports stand, be candid about their qualifications and philosophical or political biases, and recognize and make clear the limits of anthropological expertise. At the same time, they must be alert to possible harm their information may cause people with whom they work or colleagues. (AAA)
Psychologists are committed to increasing scientific and professional knowledge of behavior and people's understanding of themselves and others and to the use of such knowledge to improve the condition of individuals, organizations, and society. . . . They strive to help the public in developing informed judgments and choices concerning human behavior.

If psychologists learn of misuse or misrepresentation of their work, they take reasonable steps to correct or minimize the misuse or misrepresentation. (APA)
Sociologists are aware of their professional and scientific responsibility to the communities and societies in which they live and work. They apply and make public their knowledge in order to contribute to the public good. When undertaking research, they strive to advance the science of sociology and to serve the public good. (ASA)
Source: American Anthropological Association (1998), American Psychological Association (2002), copyright 2002 by the American Psychological Association, and American Sociological Association (1997). Reproduced by permission. Not for further reproduction.
The Uses of Research: Science and Society
The Issue of Value-Neutrality
Social scientists have become increasingly sensitive not only to the ethical implications of their work for research participants but also to its moral and ideological implications for the larger society. In the interest of promoting the scientific side of social research, some people once held that social science should be "value-free." According to this position (Lundberg, 1961), we can and should make a sharp distinction between the roles of scientist and citizen. Science is nonmoral. The methods of science are designed to eliminate personal preferences and values; and "there is nothing in scientific work, as such, which dictates to what ends the products of science shall be used" (Lundberg, 1961:32). Scientists' only imperative is "to say what they know"-to present relevant findings and theoretical interpretations.
In their capacity as citizens, scientists may take moral positions, campaigning, for example, against nuclear weapons, acid rain, or racial oppression. But if social scientists are to be taken seriously as scientists, they should not confuse this role with that of citizen and should not let their personal values affect their research.
This value-free ideology is no longer tenable for two main reasons. First, it is now clear that values have a substantial influence on the research process. Personal values and political beliefs inevitably affect how scientists select and conceptualize problems and how they interpret their findings. As we noted in chapter 3, many nonscientific factors affect problem selection: personal interests and ideologies, the availability of funding, the climate of opinion in society, research fads and fashions. Similar factors affect the perspective that researchers take and the kinds of questions they ask, which in turn determine the kinds of answers they will find. When IQ tests were first administered on a large scale during World War I, the prevailing belief in both scientific and nonscientific circles was that ethnic groups migrating from southern and eastern Europe were inferior to earlier immigrant groups. Consequently, when the former groups scored consistently lower than Americans of northern and western European ancestry, this was seen not only as proof of existing beliefs about ethnic differences but also as a validation of the tests as measures of innate intelligence. Of course, neither of these interpretations is acceptable today because social scientists, whose perspective led them to focus on the social environment, have demonstrated conclusively the effects of language, culture, and socioeconomic factors on test scores.
The value-free ideology alleges that as scientists, social researchers can remain neutral in accumulating facts about social life that are of equal utility to Democrats and Republicans, liberals and conservatives. The second problem with this position is that while claiming to be value-neutral seemingly protects the scientist's self-interest and autonomy, in effect it places researchers in the service of others' values, such as those of research sponsors or anyone else who chooses to use their findings. Those who advocated complete value neutrality took physical scientists as their model, claiming that these "real" scientists could serve equally well under fascistic or democratic political regimes (Lundberg, 1961). But the moral bankruptcy of this position comes into sharpest focus when we consider such "real" scientists under the Nazi regime. German physicians, apparently operating out of a value-free model of science, "systematically froze human beings in tubs of ice and, in the conduct of sterilization experiments, sent electrical charges through female ovaries" (Gray, 1968). This example is extreme; no one today would argue that value neutrality justifies harming others. More to the point, scientists have come to realize that they bear some responsibility for applications of their research. Physicists who worked on the atomic bomb did not do so out of a value-free ideology, but out of patriotism and a belief that Japan and Germany had to be stopped. However, many of them had second thoughts about helping to build the bomb, especially after they saw its destructive effects.
Both of these problems of maintaining value neutrality are exacerbated for social scientists, who typically study problems that have immediate relevance to people's lives. Indeed, more often than the astronomer or chemist or physicist, the social researcher is drawn to the study of particular phenomena for their social as well
as their scientific significance. The nature of the problems selected and the motivation to study them are inherently value laden in social research. Social scientists, therefore, must be aware not only of the influence of personal values and political preferences on their own work, but also of the implications of their findings for constructive or destructive use by others.
Among social scientists, anthropologists probably have been most keenly aware of the impact of values on the research process. They developed the sensitizing concept of "cultural relativity" to guard against the tendency to judge other cultures in relation to one's own cultural world view. Cultural relativity is the idea that cultural values-standards of truth, morality, beauty, correct behavior, and so forth-vary widely and must be judged in relation to a given society. In addition, anthropologists also have pointed out the importance of language-that Western scientific language may not be appropriate for "translating" the behavior of another culture and that it is therefore necessary to understand how subjects perceive the world in their own terms.
In a similar vein, sociologist Howard Becker (1967) pointed out that research is always contaminated by personal and political sympathies, but that the way to deal with this is not to forsake the standards of good scientific work and take sides, but rather to consider carefully "whose side we are on." Becker had in mind field researchers, who often study the "underdog"-the deviant, oppressed, or subordinate. In trying to understand reality from the subjects' perspective, field researchers may become sympathetic with that point of view, which usually is contrary to the accepted view of the conventional, economically well-off, or superordinate. However, this does not mean that one should always present all sides or should avoid taking sides. These options are seldom, if ever, possible. What we should do, according to Becker, is admit to whose side we are on, use our theories and techniques impartially-taking precautionary measures designed to guard against bias-and make "clear the limits of what we have studied, marking the boundaries beyond which our findings cannot be safely applied" (Becker, 1967:247). Part of this "sociological disclaimer," Becker believes, should be a statement
in which we say, for instance, that we have studied the prison through the eyes of the inmates and not through the eyes of the guards or other involved parties. We warn people, thus, that our study tells us only how things look from that vantage point-what kinds of objects guards are in the prisoners' world-and does not attempt to explain why guards do what they do or to absolve the guards of what may seem, from the prisoners' side, morally unacceptable behavior. (p. 247)
Becker does not argue that social scientists should stand pat with their "one-sided" views of reality. In fact, he sees the long-term solution to an enlarged understanding of institutions as the accumulation of many one-sided but different views of reality. This position is analogous to the methodological principle of triangulation introduced in chapter 12. However, whereas before we suggested various triangulation techniques as ways of eliminating methodological biases and errors, here we suggest that these techniques also might be used to shed light on personal values that may be embedded in a particular methodological approach or view of reality.
The Application of Research Findings
For many social scientists, guarding against the intrusion of values in research and carefully noting the limitations of conclusions are not the extent of one's ethical responsibility to society. We also must be aware of and, some believe, provide direction to how others use social science findings. There is little question that the products of social science will be used by others. They already have had and will continue to have a major impact on social policy. To cite one prominent example, the Supreme Court decision of 1954 (Brown v. Board of Education of Topeka), which declared that separate educational facilities for blacks and whites were inherently unequal, was based in large part on social science findings. The unanimous opinion cited several studies showing that segregation had a detrimental psychological effect on black children.
Since this decision, social scientists have continued to be among the staunchest and most vocal supporters of integration and civil rights, with many testifying in cases involving school desegregation, busing, and affirmative action. These scholars have taken the initiative in offering their expert advice in areas of social policy. For the most part, they have attempted to show how social science findings supported positions that most of their colleagues favored, and their political involvement is noncontroversial. The hard ethical debate concerns how much responsibility researchers bear for applications that are destructive or contrary to prevailing scientific and public sentiment. That is the question that physicists debated after the bomb. Should one try to foresee possible misuses and abuses of scientific findings? If one can foresee misuse or abuse, should the research be conducted at all? And if such research is conducted, how active a role should the researcher play in the dissemination of the findings? Is the researcher responsible for the way information is presented and for assessing the public's reaction?
The most controversial studies in the annals of social science raise just these issues. For example, Project Camelot, a multimillion-dollar research study funded by the U.S. Army, was designed to measure and forecast the causes of revolution and insurgency in underdeveloped areas of the world (Horowitz, 1967). Because of its huge scope, the project drew a large team of respected social scientists. Some, seeing the project as an unprecedented opportunity to do fundamental research on a grand scale, may not have inquired too deeply into the ultimate purpose of the project. Others believed for various reasons that they were, in no sense, "selling out" to the military. They believed that they would have great freedom in handling the project, that there was a possibility of improving conditions in underdeveloped nations, and that they could have an enlightening influence on the military (Horowitz, 1967). What they failed to envision was the "uses to which the United States Army or Central Intelligence Agency could have put the information, such as fostering revolutions against regimes hostile to the United States. They failed to recognize the grave concern those in other countries would have over such potential uses" (Diener and Crandall, 1978:108). Indeed, in July of 1965, seven months after the project began, after its revelation made it a cause celebre in Chile, Project Camelot was canceled by the Defense Department.
It is precisely this potential (and actual) abuse of findings that led many social scientists to condemn the research of educational psychologist Arthur Jensen.
In 1969, Jensen (1969) published an article in the Harvard Educational Review entitled "How Much Can We Boost IQ and Scholastic Achievement?" In arguing that IQ was determined largely by heredity, Jensen concluded that genetic differences accounted for the higher scores of whites than blacks on IQ tests, and that no amount of compensatory education could undo this difference. Such a view had not been propounded in respectable academic circles for many years prior to Jensen's article; as a result, the article created a furor. Many scholars severely criticized Jensen's conclusions on methodological grounds. But the point here is that Jensen apparently failed to consider the uses to which his article might be put. Although he himself was opposed to segregation and argued that his research suggested the need for educational programs tailored to individual differences (Edson, 1970), others seized upon Jensen's research to oppose integration. Less than a week after a report of his article made headlines in Virginia newspapers, defense attorneys quoted heavily from Jensen's article in a suit in federal district court to integrate schools in two Virginia counties (Brazziel, 1969). Their main argument was that differences in intelligence between whites and blacks were innate; that white teachers could not understand blacks; and that black children should be admitted to white schools strictly on the basis of standardized tests.
Still another, more recent study demonstrates the ethical tightrope that researchers traverse when their research concerns an important policy issue. Lawrence Sherman and Richard Berk (1984) were duly circumspect about the policy implications of their domestic violence experiment, reported in chapters 12 and 13, which showed that arrest reduced the likelihood of a repeat offense. The clearest implication was that police should no longer be reluctant to make arrests in domestic assault cases. But Sherman and Berk also cautioned against routinely requiring arrests in such cases. They noted the unique features of the jurisdiction of the study and concluded that, even if their findings were replicated in other jurisdictions, arrest may work better for certain types of offenders and in certain types of situations. Still, many activists, law professors, and others who favored mandatory arrest laws attacked Sherman and Berk's position on the policy implications of their study (Sherman, 1993). At the same time, others were critical of Sherman in particular because they perceived his dissemination of the study's results to the media as an effort to influence policy toward mandatory arrest (Binder and Meeker, 1993). The latter critics became especially vocal when replication experiments failed to show consistent beneficial effects of arrest. [See Sherman (1993) and Berk (1993) for rejoinders to this criticism.]
What are the ethical implications of such controversies? First, social scientists have an obligation to consider how their findings will be used. Research that is clearly intended to be exploitative, such as management-sponsored research intended to quiet labor unions, should not be done (Diener and Crandall, 1978). Second, given that eventual applications usually are unknown, scientists should disseminate knowledge to the widest possible audience, to increase public knowledge and encourage debate, so that no one group can exploit the knowledge for its own welfare (Diener and Crandall, 1978). Third, when research has obvious and immediate applications, as in applied and evaluation research, scientists have a special obligation to promote actively appropriate uses and to prevent misuses of their findings.
From this standpoint, we believe that Sherman and Berk were ethically responsible. Finally, scientists can assume responsibility collectively for the application of research through organizations that communicate on their behalf and provide a forum for the discussion of policy-related issues (Diener and Crandall, 1978). One such organization is the Society for the Psychological Study of Social Issues (SPSSI); another is the Society for the Study of Social Problems (SSSP).
Summary
Ethics is not something one simply accepts or ignores. Research ethics is a set of moral principles against which the actions of scientists are judged. They are not simple lists of do's and don'ts; rather, they pose dilemmas for researchers, pressing them to weigh the costs and benefits of actions and decisions. Each stage of the research process presents its own problems for the investigator who would do the "right" thing. This interaction of ethics and science, repeated throughout the research process, should be a part of every social scientist's consciousness.

The three major areas of ethical concern are the ethics of data collection and analysis, the ethics of the treatment of human subjects, and the ethics of responsibility to society. The first set of ethics prescribes that scientists carry out their research and report their findings honestly and accurately; violations of these principles undermine science as a body of knowledge. The second area of ethics consists of a set of rules that are designed to protect the rights of research participants. The third, which deals with the relationship between societal values and the uses of scientific findings, generally advises scientists to promote the general welfare.

Ethical considerations regarding the effects of research on participants are a necessary part of any research design. Presently, it is common practice for an institutional review board to pass on the ethics of proposed research. After institutional approval, researchers are guided by federal regulations and professional codes of ethics. While the various codes in use differ in language and specificity, and some social scientists take issue with current practice, certain rules of conduct are fairly standard.

1. Foremost, the researcher should not expose participants to substantial physical or psychological harm, unless the benefits of participation exceed the risks and subjects knowingly choose to participate.
2. Participants should be informed that their participation is voluntary and should be told about any aspects of the research that might influence their willingness to participate.
3. If deception is deemed necessary, the researcher must fully inform subjects of the deception as soon after their participation as possible.
4. Researchers should use all possible means to protect the confidentiality of information provided by research participants.

The overall guiding principle is that the potential benefits of research must be weighed against the potential costs.
Researchers also must consider the ethical implications of their research for the larger society. It is now widely recognized that values-personal and societal-are implicated throughout the research process. With this in mind, researchers should be conscious of the ways in which their decisions constitute ethical judgments. They should be aware of the potential uses and abuses of the knowledge they seek, guard against the intrusion of personal values in the conduct of research, and carefully point out the limitations of their research. Finally, where appropriate, they should promote beneficial applications and fight against harmful applications of research findings.
Key Terms
research ethics
ethics
cost-benefit analysis
informed consent
debriefing
anonymity
confidentiality
institutional review board (IRB)
cultural relativity
Review Questions and Problems
1. Why is it so important for scientists to be completely honest and accurate in conducting and reporting their research?
2. In what ways can research participants in social research be harmed?
3. Is it ever considered ethical to use procedures that might expose research participants to physical or mental discomfort, harm, or danger? Explain.
4. What are the limitations of a cost-benefit analysis of proposed research?
5. What safeguards do social scientists use to protect research participants from harm?
6. What are the basic ingredients of informed consent? How did Stanley Milgram violate this principle in his research on obedience to authority?
7. Which research approaches present the most serious problems from the standpoint of informed consent?
8. Why do researchers use deception? What are the arguments against its use in social research?
9. What is the most basic safeguard against the potentially harmful effects of deception? Is it effective? Explain.
10. When is social research likely to invade people's privacy?
11. How is research participants' right to privacy typically secured in (a) surveys and (b) field research?
12. What are institutional review boards (IRBs)? What part do they play in evaluating the ethics of research?
13. What is meant by value-free sociology? Identify the major challenges to this position.
14. Explain Howard Becker's position that social scientists should declare "whose side they are on." What purposes does this declaration serve?

15. What obligations do social scientists have regarding the use of the knowledge they generate?
16. Discuss the ethical problems raised by the following research examples:
a. (Hypothetical) A criminologist meets a professional fence through an ex-convict he knows. As part of a study, the researcher convinces the fence to talk about his work-why he sticks with this kind of work, what kind of people he deals with, how he meets them, and so forth. To gain the fence's cooperation, the researcher promises not to disclose any personal details that would get the fence in trouble. However, when subpoenaed, he agrees to reveal his informant rather than go to jail. Has the researcher violated an ethical principle in agreeing to talk?
b. (Hypothetical) A researcher gains access to a clinic serving AIDS patients by responding to a call for volunteers. While working at the clinic, she makes a record of patients' names and later approaches them, identifies herself as a social scientist, fully explains the nature of her research, and asks for their cooperation in her in-depth survey of AIDS victims. Most patients agree, although some react negatively to the request. What aspects of the researcher's strategy are ethically problematic?
c. Stephen West, Steven Gunn, and Paul Chernicky (1975) tested a proposition from attribution theory in social psychology regarding the way people perceive reprehensible acts. To do this they tempted subjects to participate in a burglary and then tested whether those agreeing to participate differed from those refusing and from subjects not approached with regard to their perceptions (attributions) about this illegal act. One of the experimenters, posing as a local private detective, contacted students and presented an elaborate plan for burglarizing a local advertising firm. In two of the conditions, subjects were told that the burglary was to be committed for a government agency; in another condition, subjects were promised $2000 for their participation. The subject's agreement or refusal to take part in the burglary and his or her reasons for the decision were the major dependent variables. The researchers did not, of course, carry out the crime. What ethical problems does this study pose? Describe how you would debrief subjects in this study.
