
Sociological Methodology
http://smx.sagepub.com/
Optimism and Caution Regarding New Tools for Analyzing Qualitative Data
Andrew Junker
Sociological Methodology 2012 42: 85
DOI: 10.1177/0081175012460851
The online version of this article can be found at:
http://smx.sagepub.com/content/42/1/85

Published by SAGE Publications (http://www.sagepublications.com) on behalf of the American Sociological Association


Version of Record: November 21, 2012



Symposium: Commentary

Optimism and Caution Regarding New Tools for Analyzing Qualitative Data

Andrew Junker1

Sociological Methodology
Volume 42(1), 85–87
© American Sociological Association 2012
DOI: 10.1177/0081175012460851
http://sm.sagepub.com

To a user of new qualitative data analysis software (QDAS), each program can
feel like a world unto its own. Becoming competent, let alone fluent, in the operations
of any one program, not to mention multiple programs, or mastering their more
advanced toolboxes, requires much time and perseverance. One can quickly feel that
the tools of research are overtaking the substantive goals of inquiry. Is computer-assisted
qualitative analysis worth the effort? The papers by Franzosi, De Fazio, and Vicari (this
volume, 2012:1–42) and by White, Judd, and Poliandri (this volume, 2012:43–76) say
yes. Although both papers make caveats and offer cautions, each sees great promise in
new tools of computer-assisted analysis. Quantitative Narrative Analysis (QNA) is said
to break new ground in empirically measuring the elusive, celebrated concept of
agency, whereas QDAS programs for thematic analysis, it is provocatively suggested, might one day play a role for qualitative research akin to that played by computational software in fomenting the quantitative revolution (White et al., 2012:44).
Having recently installed the latest version of ATLAS.ti, which came loaded with
fabulous bells and whistles, I am acutely aware of what White and colleagues call the
mismatch between the technical capabilities of the software and how researchers
like me actually use it, which is mainly as a computerized version of what earlier
generations did with index cards and highlighters. But I am not entirely convinced by
the authors that deploying the new tools is as promising as they suggest.
The authors provide a useful example of how secondary analysis of codes can shed
light on research questions. But their example also points to how much more complex
analysis becomes once one moves beyond "code and retrieve." This complexity has at
least three levels. First, there is the initial complexity of assigning codes to text. This
process raises the standard concerns of relating texts to contexts, coding and
reliability, and so forth, all the ordinary issues with which we are familiar. But
when one moves to meta-analysis of the codes, one must also understand how the various
codes relate to one another, which involves a different level of complexity. The authors
1University of Chicago

Corresponding Author:
Andrew Junker, University of Chicago
Email: andrew.junker@yale.edu



tackle this second level of complexity by discussing the denominator, null response, and
overlap. Of these, perhaps the denominator is the most vexing problem, because in
textual analysis the unit of analysis, and how units are related, is hard to pin down. Is
the unit the linguistic passage that has been coded, the text to which it belongs, or the
subject who was interviewed? Units of textual analysis are not very much like the
units of survey research, because they vary not just in their attributes but in their linguistic
context (see Krippendorff 2004). For instance, should a code assigned to a 10-sentence
quotation in ATLAS.ti be treated quantitatively the same way as a code assigned to a
three-word quotation? Who could answer such a question?
The third level of complexity introduced by the higher-order QDAS tools is that
of communicating one's findings. Not only does the researcher need to explain the
relationship of codes to text and of text to context, but now the author also needs to
explain all the meta-relationships of codes to codes and their process of quantization.
More complexity means less transparency, more room for error, and more potentially
dubious conclusions. Increased complexity is not, of course, grounds for rejecting
new tools and methods, but it does call for caution and some healthy skepticism.
Whereas I have responded to White, Judd, and Poliandri by asking whether the authors
are too optimistic, my response to Franzosi, De Fazio, and Vicari runs in the opposite
direction: the authors might go even further in affirming the value of their tool for
measuring agency. Franzosi and colleagues say that QNA captures agency without
meaning. They reason that newspaper data cannot capture the subjective motivations of
individual actors; such meanings, they note, would be better studied through close
readings of diaries, letters, and the like. But, contrary to the conclusion of the authors, I view
their network graphs (Figures 1 and 2) as demonstrating a structure of social relations
that is meaningful through and through: the graphs contain meaningfully constructed
categories (for example, "negros," "white mobs," "white women," "law enforcement") and meaningful actions (for example, "violence," "rape," "assembling," "coercion").
The QNA data offer a picture of agency from the perspective not of individual
subjectivity but of white-owned newspapers and, more broadly, of institutionalized
power in Georgia between 1875 and 1930. This picture represents a meaningful field
of actors and action (and thus agency), shaped by racism, violence, and gender.
Newspaper articles about lynching codified and disseminated those representations,
which further propagated those collective identities and scripts for interaction.
The QNA data give us a quantified schematic of these representations of agency
aggregated over time. The network graphs are, in a sense, a picture of meaningful
agency construed from the perspective of relational sociology, the precise field to
which the authors want the method to be of service. In other words, should we not
see QNA as a tool for depicting meaningful agency from a methodologically relational,
rather than individualist, point of view?
References
Franzosi, Roberto, Gianluca De Fazio, and Stefania Vicari. 2012. "Ways of Measuring
Agency: An Application of Quantitative Narrative Analysis to Lynchings in Georgia
(1875–1930)." Sociological Methodology 42:1–42.


Krippendorff, Klaus. 2004. Content Analysis: An Introduction to Its Methodology. Thousand
Oaks, CA: Sage.
White, M., M. D. Judd, and S. Poliandri. 2012. "Illumination with a Dim Bulb? What Do
Social Scientists Learn by Employing Qualitative Data Analysis Software in the Service of
Multimethod Designs?" Sociological Methodology 42:43–76.

Bio
Andrew Junker is a Harper Fellow in the Society of Fellows and Collegiate Assistant
Professor in the Social Sciences at the University of Chicago. His research includes a project
using quantitative narrative analysis, for which he received a Dissertation Improvement Grant
from the Methodology, Measurement, and Statistics Program of the National Science
Foundation. He earned his PhD in sociology at Yale University and holds an MA in religious
studies from Indiana University. Current research concerns the diffusion of social movement
protest practices within the global diaspora of China since 1978.

