
Public Opinion Quarterly, Vol. 77, Special Issue, 2013, pp. 220–231

USING MP3 PLAYERS IN SURVEYS: THE IMPACT OF A LOW-TECH SELF-ADMINISTRATION MODE ON REPORTING OF SENSITIVE ATTITUDES
SIMON CHAUCHARD*

Abstract This article introduces an inexpensive, low-tech Audio Self-Administered Questionnaire that uses a basic MP3 player (MP3/ASAQ) and compares its performance in collecting data about sensitive attitudes with a number of alternatives, including a face-to-face survey. The paper compares five administration procedures in an experiment conducted in a survey on sensitive caste-related attitudes in rural India. Respondents in the MP3/ASAQ group listened to a prerecorded instrument that presented them with a number of first-person statements made by respondents like [them], entered their responses on an answer sheet using simple shapes and logos, and finally placed their form in a bolted ballot box. Like previous studies evaluating self-administration techniques, our study indicates that the MP3/ASAQ significantly increased reports of socially undesirable answers, as compared with an equivalent face-to-face interview. Comparisons with additional administration procedures suggest that when self-administration is combined with the use of earphones, the threat of bystander disapproval (as opposed to interviewer disapproval) is reduced by effectively isolating respondents from their social environment.


Introduction
Self-interviewing, particularly in its modern form (computer-assisted self-interviewing, or CASI), has emerged as an efficient way to limit underreporting of sensitive attitudes and behaviors. Numerous studies have demonstrated the effectiveness of video- and audio-CASI in increasing reports of risky behaviors (O'Reilly et al. 1994; Tourangeau and Smith 1996; Turner et al. 1998; for a more recent example, see Brener et al. 2006).
Simon Chauchard is an assistant professor of government at Dartmouth College, Hanover, NH, USA. He thanks Shankare Gowda, Martin Gilens, Don Green, Eric Dickson, Kanchan Chandra, Lynn Vavreck, Deborah Brooks, and three anonymous reviewers for feedback on the design and the presentation of this project. *Address correspondence to Simon Chauchard, Department of Government, Dartmouth College, 211 Silsby Hall, HB 6108, Hanover, NH 03755, USA; e-mail: simon.chauchard@dartmouth.edu.
doi:10.1093/poq/nfs060 © The Author 2013. Published by Oxford University Press on behalf of the American Association for Public Opinion Research. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com


Over the past decade, these findings have largely been extended to telephone ACASI systems (Turner et al. 1996; Gribble et al. 2000; Moskowitz 2004; Villarroel et al. 2006). In addition, at least one study (Harmon et al. 2009) has shown that the benefits of self-administration extend to sensitive attitudes.

Despite this, many surveys that include sensitive questions do not employ self-interviewing methods, whether because of cost or for other reasons. Self-interviewing may require a degree of literacy that some respondents do not possess. While ACASI methodologies might help with literacy problems, technology-reliant survey modes come with their own set of issues. In some settings, technology attracts unwanted attention and hence reduces the privacy of survey participants. In addition, many research teams throughout the world still do not have the technological skills necessary to produce a CASI survey.

This article introduces the MP3/ASAQ methodology, a self-administered questionnaire that incorporates audio using an MP3 player.1 This significantly cheaper, simpler, and more discreet administration mode (compared to CASI) does not require that respondents be literate and hence minimizes the concerns listed above. After introducing the MP3/ASAQ methodology, this article shows that it retains the most important advantage of other self-administered methodologies: a beneficial impact on reporting. The rest of the article presents results from an experiment showing how the MP3/ASAQ methodology increases reporting of socially undesirable attitudes, and isolates the causal mechanisms driving this increase. Finally, the article discusses the implications of these findings.


The MP3/ASAQ Methodology


With the MP3/ASAQ methodology, respondents react privately to statements made by villagers like [them] in earlier conversations with the research team. In order to record their reactions to statements they hear through earphones, respondents mark an answer sheet, using simple shapes.2 The answer sheet provides a line for each question in the audio survey, and each line presents respondents with various response choices. The prerecorded voice explains how and where to answer. Respondents tick one of the shapes on each line; if they do not know what to answer or refuse to answer, they do not tick anything and move on to the next line. In order for illiterate respondents to identify the line associated with each question, a logo is affixed to every statement. After each statement, the prerecorded voice asks: "How much do you agree with what this villager said? Please answer in front of the [scale or clock] symbol." An excerpt from the answer sheet is depicted in figure 1.
1. This technology is very similar to the Walkman technology developed by Camburn, Cynamon, and Harel (1991).
2. The instrument is a succession of recordings alternating with five-second-long silences, a length that was determined through a pilot test to maximize respondents' ability to respond while avoiding fatigue.



[Figure 1. The Answer Sheet.]

Because instructions from the interviewer, and from the prerecorded voice itself, detail what each thumb means ("clearly disagree," "somewhat disagree," "somewhat agree," "clearly agree"), most respondents have no problem responding. In this methodology, interviewers play a minimal role: after training respondents, they push the play button and wait away from the respondent until the recorded instrument has finished. At the end of the recording, the voice says, "Your interview is now over. Please ask your interviewer to return. If you have missed any questions, please ask him to replay them for you." If respondents ask to have a question repeated, interviewers use the skip-forward function to do so.3 Respondents deposit their completed answer sheet in a bolted ballot box to enhance privacy (Lowndes et al. 2012).
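Footnote 2 describes the audio instrument as a succession of prerecorded statements alternating with five-second silences. The following is a minimal sketch of how such a file could be assembled, assuming per-question recordings and the pydub library; the file names and the use of pydub are illustrative assumptions, not details reported in the study.

# Sketch: assemble an MP3/ASAQ-style audio instrument from individual
# question recordings, inserting a fixed five-second response pause after
# each one (see footnote 2). File names and the use of pydub are
# illustrative assumptions.
from pydub import AudioSegment

PAUSE_MS = 5000  # five-second silence in which the respondent marks the sheet

def build_instrument(question_files, intro_file, outro_file, out_file):
    instrument = AudioSegment.from_mp3(intro_file)          # spoken instructions
    pause = AudioSegment.silent(duration=PAUSE_MS)
    for path in question_files:
        instrument += AudioSegment.from_mp3(path) + pause   # statement, then pause
    instrument += AudioSegment.from_mp3(outro_file)         # closing announcement
    instrument.export(out_file, format="mp3")

# Hypothetical usage with forty statement recordings:
# build_instrument([f"q{i:02d}.mp3" for i in range(1, 41)],
#                  "intro.mp3", "outro.mp3", "instrument.mp3")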

The Impact of the MP3/ASAQ Methodology on Reporting of Sensitive Attitudes: An Experiment


In order to assess whether and how this methodology affects reporting of socially undesirable attitudes, the MP3/ASAQ was compared to other administration procedures.
3. This methodology does not allow respondents to replay questions on their own, which may be seen as a significant limitation in comparison with ACASI. However, in this study, respondents asked to have a question replayed in fewer than 5 percent of cases.


CONTEXT

Comparisons of administration procedures took place in a survey on untouchability-related attitudes in India. Untouchability refers to the system of domination through which certain castes (formerly called the untouchables, now legally referred to as the Scheduled Castes) are discriminated against in most social activities. Despite the continuing prevalence of untouchability, researchers interviewing villagers often face denial about its existence. For the purpose of this study, therefore, reports of openly negative or discriminatory attitudes toward members of these castes were considered socially undesirable.
STUDY DESIGN

To test the impact of the MP3/ASAQ on reporting, this study first compared the propensity with which respondents reported these socially undesirable attitudes using the MP3/ASAQ procedure with the propensity with which they reported them in a comparable face-to-face interview. The face-to-face instrument, which presented questions in the same order as the audio instrument, differed in several important ways. While the statements used were identical, respondents were asked, "Many people say: [statement X]; how much do you agree with that?" instead of being asked to react to statements made by villagers like [them] in conversations with the research team.4 In addition, no ballot boxes were used, and the interviewer read the questions aloud and recorded responses on paper.5

Three additional interview procedures, each of which deliberately omitted one of the features of the MP3/ASAQ methodology, were compared with the MP3/ASAQ procedure in order to isolate the mechanisms through which that procedure might increase reporting of sensitive attitudes (see table 1). First, to assess whether reporting was driven by concerns about interviewer disapproval, a group of respondents used the MP3/ASAQ but handed their answer sheets to the interviewer instead of placing them in a ballot box. Second, to assess whether endorsement by fictitious fellow villagers encouraged the disclosure of undesirable attitudes, a variation of the MP3/ASAQ procedure was run in which respondents heard questions of the type "Many people say that [statement X]. How much do you agree with that?" instead of being presented with statements made by villagers like [them]. Third, to assess whether reporting was driven by concerns about bystander disapproval, surveys were administered with the earphones replaced by mini-speakers, so that potential bystanders could listen to the instrument. Altogether, five different data-collection procedures were examined.
4. Note that slightly different statements ("many people" vs. "villagers like you") could affect responses.
5. This administration mode was thus pen-and-paper interviewing (PAPI) rather than computer-assisted personal interviewing (CAPI).


Table 1. Summary of the Characteristics of the Five Administration Procedures

                                     Self-            Privacy/interviewer:  Confidentiality/        Formulation:
                                     administration?  locked ballot box     third parties: were     first-person
                                                      used?                 earphones used?         statements?
Face-to-face (FTF)                   No               No                    No                      No
MP3                                  Yes              Yes                   Yes                     Yes
MP3 without ballot box               Yes              No                    Yes                     Yes
MP3 without forgiving formulations   Yes              Yes                   Yes                     No
MP3 without earphones                Yes              Yes                   No                      Yes

Surveys were administered to villagers from all middle and upper castes in eleven villages of Rajsamand district and five villages of Jaipur district, in the state of Rajasthan, in June 2009. A systematic sample of households was drawn. Upon being assigned to a specific area, interviewers attempted to interview a member of every nth house (the n depended on village, caste, and area size). Within each house, interviewers interviewed the first available male.

Assignment to experimental treatments was done as follows: all data-collection procedures were used in all villages. Each day of fieldwork, interviewers were assigned two different administration procedures and alternated between them. Respondents were contacted at their homes and informed of the administration procedure after the usual assurances of confidentiality had been issued and they had agreed to participate. A total of 455 interviews were completed, of which 439 were analyzed in this study.6 The cooperation rate was comparable among respondents in the self-administered and face-to-face modes: 75.6 and 73.7 percent,7 respectively.
6. Supervisors re-contacted all respondents to ensure that interviewers had fully followed the experimental procedure. Fourteen interviews were discarded because this had not been the case. In the course of this re-contact, ninety-one respondents were tested on the meaning of each thumb; two additional interviews were discarded because the respondents could not identify the meaning of the thumbs.
7. While supervisors did not monitor the number of households in which no one answered or no male was present, they did record refusals to participate. These percentages were therefore calculated using the count of refusals (which includes target respondents who completed only a subset of the survey) and the count of completed interviews.
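As a worked illustration of the calculation described in footnote 7 (a minimal sketch; the underlying refusal counts are not reported, so the example numbers below are purely hypothetical):

def cooperation_rate(completed: int, refusals: int) -> float:
    # Completed interviews divided by completed interviews plus refusals
    # (partial completes count as refusals; non-contacts were not tracked).
    return 100.0 * completed / (completed + refusals)

# Purely hypothetical counts, for illustration only; the article reports the
# resulting rates (75.6 and 73.7 percent) but not the raw tallies.
print(round(cooperation_rate(completed=300, refusals=100), 1))  # 75.0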


ANALYSES AND RESULTS

To study the effect of administration mode on reports of socially undesirable attitudes, the statistic used in the study is the total number of hostile or discriminatory answers provided by each respondent in response to nineteen questions (see appendix A). A response is considered socially undesirable when it lies at the extreme end of the four-point scale (that is, either a "clearly agree" response to a negative statement about members of the Scheduled Castes or a "clearly disagree" response to a positive statement). This statistic indicates the effect of the administration mode over the entire length of the questionnaire.

Table 2 reports the mean number of undesirable responses (out of a maximum of nineteen) for each of the five administration modes. This analysis suggests that the MP3/ASAQ mode strongly increased reporting of socially undesirable attitudes: a t-test establishes a highly statistically significant difference between the FTF and MP3 groups. Results from the three partial MP3/ASAQ conditions suggest that this increase in reporting stems from effective isolation from bystanders, as well as from more forgiving formulations, but not from increased isolation from the interviewer himself.

As shown in the lower rows of table 2, the characteristics of respondents across experimental groups differed only slightly. However, to ensure that demographic differences did not cause these results, multivariate analyses were run and are presented in table 3. Because of the nature of the dependent variable, negative binomial regressions, predicting the count of socially undesirable responses as a function of administration procedure and of key background characteristics (described in appendix A), were run. In all models, the statistics reported are incidence-rate ratios (IRRs), which indicate the multiplicative change in the incidence rate associated with a one-unit change in any given variable X. The coefficient corresponding to each procedure thus shows the estimated difference between that procedure and the baseline MP3/ASAQ procedure, which is the omitted category.

Model 1 confirms that the MP3/ASAQ procedure strongly increased reporting of sensitive attitudes. The number of undesirable responses given by villagers who were interviewed face-to-face was only 79/100 of the number given by respondents who used the MP3/ASAQ with the forgiving statements. Model 1 also confirms the results of table 2 in suggesting why the MP3/ASAQ procedure increases reporting of socially undesirable answers. The absence of a ballot box does not have any significant impact on counts of undesirable responses. Conversely, a departure from statement-based questions significantly decreased the number of undesirable attitudes respondents admitted holding, as did the replacement of earphones with mini-speakers. These results imply that both question formulation and isolation from bystanders drive the difference in means observed between the FTF group and the MP3 group in table 2.
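As a concrete illustration of this construction and of the table 2 comparison, the sketch below builds the per-respondent count and runs the t-test. It assumes responses stored in a pandas data frame with one column per item coded 1 ("clearly disagree") through 4 ("clearly agree"); the column names, the coding, and the item-direction mapping are illustrative assumptions rather than details taken from the study's materials.

# Sketch: build the count of socially undesirable responses and compare the
# FTF and MP3 group means with a t-test, as in table 2. Column names and the
# 1-4 response coding are illustrative assumptions.
import pandas as pd
from scipy import stats

def undesirable_count(row: pd.Series, items, is_negative) -> int:
    """Extreme responses in the undesirable direction: 'clearly agree' (4)
    with a negative statement or 'clearly disagree' (1) with a positive one."""
    count = 0
    for item in items:
        extreme = 4 if is_negative[item] else 1
        count += int(row[item] == extreme)
    return count

def compare_ftf_mp3(df: pd.DataFrame, items, is_negative):
    df = df.copy()
    df["undesirable"] = df.apply(undesirable_count, axis=1,
                                 args=(items, is_negative))
    ftf = df.loc[df["procedure"] == "FTF", "undesirable"]
    mp3 = df.loc[df["procedure"] == "MP3", "undesirable"]
    return df, stats.ttest_ind(mp3, ftf)  # two-sample t-test on group means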



Table 2. Mean Number of Undesirable Responses Reported and Characteristics of Procedures, by Administration Mode

                                        FTF       MP3       MP3 without   MP3 without    MP3 without
                                                            ballot box    forgiving      earphones
                                                                          formulations
Mean number of undesirable responses    7.12      8.98a     8.92          8.00           7.73
Std. error                             (0.31)    (0.34)    (0.38)        (0.36)         (0.34)
Sample size                             101       99        80            81             78
Age group (in %)
  18-30                                 23.76     28.28     20.00         24.69          30.76
  31-40                                 42.57     30.30     45.00         43.20          34.61
  41-50                                 20.79     25.25     20.00         20.98          20.51
  50+                                   12.87     16.16     15.00         11.11          14.10
Caste (in %)
  Middle castes                         65.34     63.63     68.75         67.9           65.38
  Upper castes                          34.65     36.36     31.25         32.1           34.62
Education (in %)
  No school                             11.88     10.10     12.50         11.11          08.97
  1-4 years                             15.84     21.21     21.25         17.28          20.25
  5-8 years                             33.66     32.32     31.25         40.74          33.33
  9-12 years                            35.64     34.34     32.50         28.39          33.33
  12 years+                             02.97     02.02     00.00         02.46          03.84

Note. The variables used in this table are described more extensively in appendix A.
a. A t-test establishes that the difference between the means of the FTF and MP3 groups is > 0 at the 0.000 level.

In addition, a comparison of the findings for the MP3 without ballot box group with those for the MP3 without earphones group suggests that the fear of bystander disapproval is a more important cause of underreporting of sensitive attitudes than the fear of interviewer disapproval. This intuition is confirmed when analyses include a control for the presence of bystanders. In model 2, bystanders (a dummy variable accounting for the presence of bystanders, further described in appendix A) yields a significant coefficient below one, which implies that the presence of bystanders decreased reporting of sensitive attitudes. Since this impact should be amplified among the experimental groups in which respondents were not able to isolate themselves from others, model 3 includes interactions between this variable and administration procedures. In this specification, the coefficient for bystanders is not significant, which implies that the impact of third-party presence is entirely conditional on respondents being interviewed without earphones (in other words, that the use of earphones is effective in reducing the influence of third parties). The fact that the IRRs on both interaction terms are significant and less than one is evidence of this explanation, in conjunction with the fact that the main effect of MP3 without earphones (accounting in this regression for those MP3/ASAQ interviews that took place without earphones and without bystanders) is not significant.
Table 3. Negative Binomial Regressions of the Count of Socially Undesirable Responses on Administration Procedures and Background Characteristics (coefficients are incidence-rate ratios with standard errors in parentheses)

Variables                                Model 1            Model 2            Model 3
Survey procedure
  MP3 (omitted reference category)
  FTF                                    0.79*** (0.04)     0.79*** (0.04)     0.84*** (0.05)
  MP3 without ballot box                 0.99 (0.05)        0.99 (0.05)        0.99 (0.05)
  MP3 without forgiving formulation      0.89** (0.05)      0.89** (0.05)      0.89** (0.05)
  MP3 without earphones                  0.87** (0.05)      0.87** (0.05)      0.92 (0.06)
Interactions and control variables
  Age                                    1.01*** (0.0015)   1.01*** (0.0015)   1.01*** (0.0015)
  Education                              0.98*** (0.005)    0.98*** (0.005)    0.98*** (0.005)
  Caste                                  0.97 (0.04)        0.97 (0.04)        0.98 (0.04)
  Bystanders                                                0.93** (0.03)      0.97 (0.04)
  FTF × bystanders                                                             0.79** (0.07)
  MP3 without earphones × bystanders                                           0.84* (0.08)
Log-likelihood                           -1092.732          -1090.763          -1086.630
Pseudo R2                                0.0367             0.0385             0.0421
N                                        439                439                439

Note. All results were checked using Poisson models and included robust standard errors.
*p < .05, **p < .01, ***p < .001
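For readers who wish to reproduce this kind of specification, the sketch below shows how models of this form could be estimated with statsmodels' negative binomial regression, exponentiating coefficients to obtain IRRs. The variable names and data layout are illustrative assumptions; this is not the authors' replication code.

# Sketch: negative binomial regressions of the undesirable-response count,
# reported as incidence-rate ratios (IRRs), with the MP3/ASAQ group as the
# omitted reference category, in the spirit of table 3. Variable names and
# the data frame layout are illustrative assumptions.
import numpy as np
import statsmodels.formula.api as smf

def fit_irr(formula, df):
    # Exponentiated coefficients are IRRs; ignore the dispersion term "alpha".
    fit = smf.negativebinomial(formula, data=df).fit(disp=False)
    return np.exp(fit.params), fit

def table3_models(df):
    base = ("undesirable ~ C(procedure, Treatment(reference='MP3')) "
            "+ age + education + caste")
    m1 = fit_irr(base, df)                    # model 1: procedures + controls
    m2 = fit_irr(base + " + bystanders", df)  # model 2: adds bystander dummy
    # Model 3: bystander interactions limited to the two procedures in which
    # bystanders could overhear the questions (FTF, MP3 without earphones).
    df = df.assign(
        ftf_x_byst=(df["procedure"] == "FTF").astype(int) * df["bystanders"],
        noear_x_byst=(df["procedure"] == "MP3 without earphones").astype(int)
        * df["bystanders"],
    )
    m3 = fit_irr(base + " + bystanders + ftf_x_byst + noear_x_byst", df)
    return m1, m2, m3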


Although replacing earphones with mini-speakers yields more socially desirable answers when others are present, this effect disappears when respondents are alone. By delivering questions over earphones without displaying them on a screen, the MP3/ASAQ methodology increases reporting of socially undesirable attitudes by erasing the potential influence of bystanders.

Discussion
This research admittedly took place in conditions that are, for many researchers, unusual. What, then, are the implications for external validity? The setting of this study rarely allowed interviews to take place in isolation from others (the houses visited were small, and fewer than 50 percent of them had windows or doors that could be shut). Add to this the large size of households, the fact that bystanders often did not belong to the respondent's own household (they often were neighbors), and the difficulty of conveying the need for privacy to the respondent, and these results may be more easily explainable.8 Because survey researchers are often unable to guarantee the isolation of respondents, making low-tech and inexpensive self-administered methods available to researchers working in such settings constitutes an important step toward the production of better survey data.

These results also carry broader implications. First, they provide additional evidence of the benefits of self-administered interviewing: when sensitive questions are asked, isolation from others (whether interviewers or bystanders) tends to increase reporting of sensitive attitudes. This conclusion extends to the simpler, significantly cheaper,9 more discreet, and less cognitively demanding methodology introduced in this study. Second, these results extend findings on bystanders' influence on the reporting of sensitive behaviors (Gfroerer 1985; Aquilino 1993, 1997; Aquilino, Wright, and Supple 2000) to the study of sensitive attitudes. Finally, because these results show that reporting may be affected by a threat of bystander disapproval (as opposed to interviewer disapproval), they may encourage researchers to investigate the impact of bystanders on reporting in a larger variety of settings and to use administration procedures that fully isolate respondents from their social environment.


8. Moreover, the absence of women in the sample, due to many husbands' reticence to allow their spouses to be interviewed, implies that the study underestimates the effect of bystander disapproval. In gender-conservative rural Rajasthan, it is indeed likely that the threat of bystander disapproval would be even larger among women.
9. The small-screen, unbranded MP3 players used in this study cost around $30 apiece. Such prices may constitute a substantial reduction in the cost of self-administered surveys, even compared with the cheapest computers currently available on the market.


Appendix A. Questions and Measures


Of the forty statement-based questions in the MP3/ASAQ, nineteen were about untouchability or about members of the Scheduled Castes (one of the terms used in contemporary India to refer to the former untouchables). Twenty-one asked about other topics, including marriage, family life, the difference between life in villages and in cities, technological change in the village, and development. The nineteen statements were adapted from a series of open-ended interviews in which raw statements were modified to reflect positive (as in "members of the scheduled castes should have the same rights") or negative views (as in "members of the scheduled castes should NOT have the same rights") about caste members. Because respondents are hypothesized to be more likely to agree than to disagree with a given statement, we presented respondents with both negative and positive views about members of untouchable castes. The following nineteen statements were used in the study:

1. If a member of the Scheduled Castes becomes sarpanch, he will only take care of other SCs.
2. Members of the Scheduled Castes are usually unable to do a good job as sarpanch. They do not have the skills for that.
3. Members of the Scheduled Castes are able to serve as politicians such as MLAs or MPs.
4. Members of the Scheduled Castes do not have ideas on how the village should be run.
5. Members of the Scheduled Castes and other SC castes are not like other social groups in the village; they are completely different and should not get the same respect.
6. Members of the Scheduled Castes stand much lower than others in the hierarchy of groups.
7. Members of the Scheduled Castes cannot think for themselves; they usually prefer being dominated by members of higher castes.
8. Members of the Scheduled Castes usually have low confidence.
9. Members of the Scheduled Castes are not as intelligent as other villagers.
10. Members of the Scheduled Castes are not as hardworking as other villagers.
11. It is really unfair that members of the Scheduled Castes receive so much help from the government.
12. The authorities and the media should stop constantly portraying SCs as victims.
13. SCs are right to be unhappy about their condition; I understand their demands.
14. It is their fault if members of the Scheduled Castes are backward today.



15. SCs think they can do whatever they want nowadays.
16. I think SCs should feel free to parade in the village when there is a marriage in their community.
17. SCs should be allowed to drink from the same glasses/plates as other folks.
18. SCs should not be allowed to come to the upper-caste hamlet and interact with upper castes.
19. SCs should feel free to enter village temples.

Background variables used in the analyses reported in table 2 can be described as follows:

Age: self-reported count, in years (the variable "Age group" in table 2 simply summarizes these data).
Level of education: self-reported count of completed years of school.
Caste: self-reported dichotomous variable coded 1 (versus 0) when respondents report belonging to upper castes rather than to middle or other backward castes.

These variables were included as controls insofar as they are often described as predictors of caste-related attitudes, with older, less educated, and upper-caste respondents usually seen as holding more antagonistic attitudes. In addition, the dichotomous variable bystanders was included in some analyses to control for the social environment of the interview. Bystanders is coded 1 when a bystander (of either gender and above twelve years of age) was present in the same room as the respondent for more than thirty continuous seconds during the interview, and 0 otherwise. This variable was coded based on a report by the interviewer.
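A minimal sketch of how these background variables could be coded for analysis, assuming a pandas data frame in which the self-reports are stored under illustrative (hypothetical) column names:

# Sketch: code the appendix A background variables for the models in table 3.
# The raw column names ("age_years", "school_years", "caste_group",
# "bystander_reported") are illustrative assumptions about data storage.
import pandas as pd

def code_background(raw: pd.DataFrame) -> pd.DataFrame:
    out = raw.copy()
    out["age"] = out["age_years"]                # self-reported age in years
    out["education"] = out["school_years"]       # completed years of school
    out["caste"] = (out["caste_group"] == "upper").astype(int)  # 1 = upper caste
    # 1 if the interviewer reported a bystander (age 12+) in the room for more
    # than thirty continuous seconds during the interview, 0 otherwise.
    out["bystanders"] = out["bystander_reported"].astype(int)
    return out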


References
Aquilino, William S. 1993. "Effects of Spouse Presence during the Interview on Survey Responses Concerning Marriage." Public Opinion Quarterly 57:358–376.

Aquilino, William S. 1997. "Privacy Effects on Self-Reported Drug Use: Interactions with Survey Mode and Respondent Characteristics." In The Validity of Self-Reported Drug Use: Improving the Accuracy of Survey Estimates, edited by L. Harrison and A. Hughes, 383–415. National Institute on Drug Abuse Research Monograph 167. Washington, DC: National Institutes of Health, Department of Health and Human Services.

Aquilino, William S., Debra L. Wright, and Andrew J. Supple. 2000. "Response Effects Due to Bystander Presence in CASI and Paper-and-Pencil Surveys of Drug Use and Alcohol Use." Substance Use and Misuse 35:845–867.

Brener, Nancy D., Danice K. Eaton, Laura Kann, Jo Anne Grunbaum, Lori A. Gross, Tonja M. Kyle, and James G. Ross. 2006. "The Association of Survey Setting and Mode with Self-Reported Health Risk Behaviors among High School Students." Public Opinion Quarterly 70:354–374.

Camburn, D., M. Cynamon, and Y. Harel. 1991. "The Use of Audio Tapes and Written Questionnaires to Ask Sensitive Questions during Household Interviews." Paper presented at the National Field Directors/Field Technologies Conference, San Diego.

Gfroerer, Joseph. 1985. "Influence of Privacy on Self-Reported Drug Use by Youths." In Self-Report Methods of Estimating Drug Use: Meeting Current Challenges to Validity, edited by B. A. Rouse, N. J. Kozel, and L. G. Richards, 22–30. DHHS Publication No. (ADM) 85-1402. Rockville, MD: National Institute on Drug Abuse.
Gribble, James N., Heather G. Miller, Philip C. Cooley, Joseph A. Catania, Lance Pollack, and Charles F. Turner. 2000. "The Impact of T-ACASI Interviewing on Reported Drug Use among Men Who Have Sex with Men." Substance Use and Misuse 35:869–890.

Harmon, Thomas, Charles F. Turner, Susan M. Rogers, Elizabeth Eggleston, Anthony M. Roman, Maria A. Villarroel, James R. Chromy, Laxminarayana Ganapathi, and Sheping Li. 2009. "Impact of T-ACASI on Survey Measurements of Subjective Phenomena." Public Opinion Quarterly 73:255–280.

Lowndes, Catherine M., A. A. Jayachandran, Pradeep Banandur, Banadakoppa M. Ramesh, Reynold Washington, B. M. Sangameshwar, Stephen Moses, James Blanchard, and Michel Alary. 2012. "Polling Booth Surveys: A Novel Approach for Reducing Social Desirability Bias in HIV-Related Behavioral Surveys in Resource-Poor Settings." AIDS and Behavior 16:1054–1062.

Moskowitz, Joel M. 2004. "Assessment of Cigarette Smoking and Smoking Susceptibility among Youth: Telephone Computer-Assisted Self-Interviews versus Computer-Assisted Telephone Interviews." Public Opinion Quarterly 68:565–587.

O'Reilly, James M., Michael L. Hubbard, Judith T. Lessler, Paul P. Biemer, and Charles F. Turner. 1994. "Audio and Video Computer Assisted Self-Interviewing: Preliminary Tests of New Technologies for Data Collection." Journal of Official Statistics 10:197–214.

Tourangeau, Roger, and Tom W. Smith. 1996. "Asking Sensitive Questions: The Impact of Data-Collection Mode, Question Format, and Question Context." Public Opinion Quarterly 60:275–304.

Turner, Charles F., Barbara H. Forsyth, James O'Reilly, Phillip C. Cooley, Timothy K. Smith, Susan M. Rogers, and Heather G. Miller. 1998. "Automated Self-Interviewing and the Survey Measurement of Sensitive Behaviors." In Computer-Assisted Survey Information Collection, edited by M. P. Couper, R. P. Baker, J. Bethlehem, C. Z. F. Clark, J. Martin, W. L. Nicholls, and J. M. O'Reilly, 455–473. New York: Wiley.

Turner, Charles F., Heather G. Miller, Timothy K. Smith, Philip C. Cooley, and Susan M. Rogers. 1996. "Telephone Audio Computer-Assisted Self-Interviewing (T-ACASI) and Survey Measurements of Sensitive Behaviors: Preliminary Results." In Survey and Statistical Computing, 1996: Proceedings of the Second ASC International Conference, edited by R. Banks, J. Fairgrieve, L. Gerrard, T. Orchard, C. Payne, and A. Westlake, 121–30. Chesham, UK: Association for Survey Computing.

Villarroel, Maria A., Charles F. Turner, Elizabeth E. Eggleston, Alia Al-Tayyib, Susan M. Rogers, Anthony M. Roman, Philip C. Cooley, and Harper Gordek. 2006. "Same-Gender Sex in the United States: Impact of T-ACASI on Prevalence Estimates." Public Opinion Quarterly 70:166–196.

