
University of Minnesota, Twin Cities, USA

First Brazilian International Conference on Qualitative Research


March 2004

Guidelines for the Evaluation of Dissertation Research

A Workshop

Jane F. Gilgun
University of Minnesota, Twin Cities USA

revised March 2005

Jane F. Gilgun, Ph.D., LICSW, is professor, School of Social Work, University of Minnesota, Twin Cities, 1404 Gortner Avenue, St. Paul, MN 55108, USA. E-mail: jgilgun@umn.edu. See Professor Gilgun’s
books, articles, children’s stories, and videos on Amazon Kindle,
scribd.com/professorjane, stores.lulu.com/jgilgun, and
youtube.com/jgilgun.

Note: The first part of this document is an outline. The second is a discussion, and the third is an outline of the expectations of the Smith College School of Social Work.

University of Minnesota, Twin Cities


School of Social Work
Jane F. Gilgun, Ph.D., LICSW
jgilgun@umn.edu
April 1, 2007

Some Guidelines for the Design of Qualitative Research


with Emphasis on Dissertation Research
(Draft—Subject to Revision)

The following are guidelines for the design of qualitative research. Be sure to study this
outline carefully and read the dissertations that others have written. A strategy for writing
dissertations is to check to make sure that the key concepts of the proposed research are
represented in every section of the dissertation. If you address the key concepts in every section,
then you will have a unified and coherent piece of work. You can use this outline to write
dissertation proposals. In proposals, you explain what you will do. In dissertations, you explain
what you have done.

Recommended Preliminary Work


Writing out assumptions before collecting data
your own assumptions, values, and experiences that are relevant to your topic
thoughts about how participants may experience being part of your sample
Deciding on whether to use hypotheses and sensitizing concepts
Development of preliminary codes
Informational Interviews
Preliminary research

Content of Dissertations
Always provide an introductory overview of each of the major sections
The introduction to the entire dissertation includes brief statements about
topic to be studied
conceptual framework/theory to be applied to the topic to be studied
sample
method
significance/implications
The introduction to each section typically begins with a topic sentence and mentions the
topics to be covered.

The introduction to the findings section typically summarizes the main findings.

These general statements make sense only after you have read well-done dissertations
and well-done research reports.

Conceptual Framework
Introductory overview of the proposed research containing the topics mentioned
earlier
Literature review
An analysis of existing knowledge that relates to the topic—includes
research and theory always and, when relevant, policy, program, and practice principles
Reflexivity statement (negotiable with committee and depends upon
methodology)
can build on preliminary work of writing out assumptions
a statement of personal and professional values and experiences relevant
to the project
Précis: a summing up, a summary of what went before and a looking forward to
research questions/hypotheses/purposes of research. A précis is
composed of
• summary of cited research, theory, etc.; this means a critical synthesis
of what is known and not known about the topic
• summary of professional and personal experience
• statement of significance of proposed project
• statement about the approach (methods and methodologies) the proposal will use

Research questions/hypotheses
statement of the questions/hypotheses
definitions of key concepts that compose the questions/hypotheses in your
own words but whose bases are well documented in the literature
review
diagrams of the relationships among concepts if this is a project whose
purpose is to modify an initial conceptual framework

Overview of Methods Section

Methods

Methodology
statement and discussion of the general methodological principles and ideas that you will be following, such as feminist, emancipatory, phenomenological, theory-building, critical race theory, descriptive, narrative, life history, or portraiture approaches

The Design
o Introductory statement
      Sometimes a discussion of the relevance of the methodology is here
o Sampling and recruitment
o Interview plan
      number of interviews
      length of interviews
      who will be interviewed
      who will conduct the interviews and qualifications
      reflections on how respondents may receive the project
      procedures to follow to ensure that respondents
            will not be harmed by the interview
            will have free choice about answering or not answering questions
o Interview schedule
o Apparatus for conducting interviews and observations
o Data analysis and interpretation
• Transcriptions
• Fieldnotes
• Solo or group analysis of data?
• Testing out your interpretations
• Treating each interview as a “pilot”
• Thematic analysis? Which one?
• Coding scheme? Which one?
o Open
o Axial
o Selective
o other
• Sources of codes
pre-planned codes based on sensitizing concepts that are
taken from the conceptual framework of the
project
“mid-stream” codes taken from literature read in the course of data collection and analysis, or that arise from researchers’ general store of knowledge, including their general knowledge of research and theory, professional and personal experience, and values
Names of codes
words from research and theory
words from informants
words from researchers’ general stores of knowledge
• Timeline
• Human subjects committee approval and any additional ethical considerations not already covered. Much of the material on human subjects committee issues, such as consent forms, is in the Appendix

Organization of Findings
Introductory statement
Diagrams of any revised conceptual frameworks. Where this goes depends upon the logic of your findings section, but usually right after the introductory statement is a good fit.
Present findings so the patterns and exceptions to patterns are clear
Present findings in terms of concepts, categories or typologies, hypotheses,
and statements describing patterns
How you present findings depends upon the methodological principles of your
study
The following are typical ways of presenting findings
Researcher statements and interpretations
Linked to concepts, hypotheses and/or patterns
Discussion of each
Excerpts to provide examples and to support discussion/interpretation
Links to related research and theory
how your findings add to, modify, or refute what is known
In general, in presenting findings, all statements are supported by data and by existing research and theory

Credibility
Introductory statement
Immersion in the field
Use language audiences understand
Situate findings within social science traditions
Present findings so audience has vicarious experience of being there
Grab and “heart”
Modifiability
Transparency of reasoning and design
Language and methods used are consistent with the type of qualitative research
that researchers are using
Some researchers do inter-rater reliabilities
Findings presented so that they are consistent with philosophy of science underlying the
research

Other indicators of quality


Coherent organization
Focus of research is clear
Every part of the proposal from introduction to final statements has links to the focus and furthers either the understanding of the focus (conceptual framework) or how the researchers will investigate the focus of the study
Ideas supported by data
Researchers convey something meaningful

Applications to policy, practice, and programs


Introductory statement
Analytic generalizability
How does it fit in particular settings with particular people at particular times
Sensitizing, illuminating

Dissemination Plan

Journal articles, conference papers, books, workshops, in-service training, policy briefs, website
postings, blogs, videos, DVDs are just some of the ways to disseminate research

University of Minnesota, Twin Cities


School of Social Work
Jane F. Gilgun, Ph.D., LICSW
jgilgun@umn.edu
March 2005

Some Guidelines for the Design of Qualitative Research


with Emphasis on Dissertation Research

The following are guidelines for the conduct of qualitative research, with special
emphasis on dissertation research. My intended audiences are Ph.D. students who may be
conducting their first research projects and their advisors and committee members who may not
be as familiar with qualitative approaches as they are with surveys and experimental and quasi-
experimental designs. The guidelines can be summarized into a kind of checklist that students,
advisors, and committee members can use to ensure that students cover important topics in their
research and in their dissertation.

Recommended Preliminary Work

Much of the work that researchers do does not appear in dissertations or other research
reports. Some of the material in this section is that kind of work, but some of it can be integrated
into proposals and reports.

Writing out Assumptions Before Collecting Data

Researchers, like other human beings, operate on the basis of prior assumptions, some of
which are not in awareness when we use them. These assumptions, however, influence what we
notice and how we interpret what we notice. Therefore, to be reflective, aware researchers, it’s
helpful to write out assumptions, perspectives, beliefs, values, emotions, and hunches about the
topic you will be researching. Some topics are emotionally loaded. So, I recommend that you
"free write"—meaning write down whatever comes to mind without worrying about logic,
grammar, and whether you are making sense—to get your thoughts, fears, and perspectives out
on paper. You could have on-going conversations with other researchers and other interested
persons about your perspectives.

This is important to do for a few reasons. First, it moves you away from naive
empiricism that is characterized by researchers' inattention to their personal perspectives and their often unexamined assumptions that their personal perspectives are unrelated to the research they are conducting, when actually the two are inseparable. As Glaser and Strauss (1967)
stated in a footnote, we are not tabula rasa but bring our personal theories, values, and
experiences, as well as our more formal training and knowledge, into the analysis. It is better to
state clearly what our assumptions, values, and hypotheses are so that we can test them against
what we are learning from doing the research. Unchecked, naïve empiricism can result in
researchers shaping findings in ways they do not recognize. Researchers may also be blind to
potentially important analyses.

Second, doing this may make you better able to hear informants' points of view more
clearly. If we have an awareness of our own assumptions, we can put some restraints on them
and be positioned to challenge them. By doing this, we are more likely to welcome points of
view other than our own.

Third, your personal and professional assumptions, experiences, and values may make
important contributions to the focus of your study, including any hypotheses that you formulate
and seek to disconfirm through your research. See Gilgun (2005) on evidence-based practice for
elaboration of these ideas.

Fourth, by raising your awareness about your assumptions, experiences, and values, you
may arrive at some terrific hypotheses to test and some important sensitizing concepts that will
guide your research.

Fifth, you can use some of this writing in the reflexivity statement of your proposal and
dissertation.

Besides writing out issues that relate to you personally, I recommend that you write out how participating in research can affect your informants. The following are examples of
questions to ask. You undoubtedly can come up with others. What’s in it for them to be part of
your research? Are you taking advantage of potential respondents’ desires to please? Are you
aware of the power that respondents may attribute to you? How will you ensure that you are not
using these factors to get your sample? What kinds of vulnerabilities do your potential
informants bring to research situations? How can you build on their strengths, values, and
capacities? Who can help you understand your potential respondents in a clearer light? How might human subjects committee guidelines help you in this? Such "free writing" about respondent issues can greatly enhance the ethics of your research.

Thinking Through Hypotheses, Sensitizing Concepts, and the Development of Preliminary Codes

Before you begin your research, think about whether you want to test hypotheses.
Hypotheses in qualitative research typically are simple statements of relationships between two
or more variables. An example is: “Persons who are emotionally expressive and who have
experienced adversities usually are more resilient than persons who are not emotionally
expressive” (Gilgun, 1992, 1996, 1999, in press b). Your decision will influence how you
structure and conduct your research and how you write it up.

Some researchers do not formally test hypotheses. Instead, they write out their informed
expectations and compare them to what they learned from doing the research. Informed
expectations, like hypotheses, are based on prior research and theory and professional
experiences and values. Often researchers who want to do descriptive studies of various types
and are not interested in theory development do not label the ideas they begin with as
hypotheses.

Hypothesis testing is an important way to do qualitative research. Researchers trained in logico-deductive approaches prefer to begin their studies with hypotheses and conceptual
models. Many researchers assume that to do qualitative studies, they have to forsake well-
formulated conceptual models and hypothesis testing and begin their research in an open-ended
way (See Kidd, 2002, as one example.).
This widespread impression stems from the procedures of grounded theory (Strauss &
Corbin, 1998) that are designed for researchers who want to identify research problems through
preliminary studies. Yet, researchers interested in particular theoretical models cannot, nor should they be expected to, start anew or act as if they do not already know something about
their areas of interest. There is no reason why they cannot test their models qualitatively.
Furthermore, dissertation committees are unlikely to approve studies that do not build on what is
already known nor are funders inclined to commit money to such studies. Finally, students do
not have the time that could be needed to do research that has no hypotheses to test. Hypothesis-
testing qualitative research is focused and leads to a more efficient use of time.

Hypothesis-testing often helps make the research process more transparent—meaning that
you are sharing with other researchers the processes you used to arrive at findings. Transparency
of research procedures contributes to credibility of findings.

Testing hypotheses and testing presumed patterns in qualitative research is usually done one case at a time: researchers write out the hypotheses or the patterns they expect and test them on the first case. Then, they change their hypotheses/patterns to fit case 1. They test the revised framework on the next case and change the framework to fit case 2. And so on. This is part of deductive qualitative analysis (Gilgun, in press a).
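To make the case-by-case logic concrete, the following is a minimal sketch in Python of the iterative test-and-revise loop described above. All of the names and data structures (fits_case, revise_to_fit, the dictionary layout) are hypothetical illustrations of the workflow, not a published DQA procedure; in actual research the judgment of whether a framework "fits" a case is interpretive rather than mechanical.

```python
# A minimal sketch of case-by-case hypothesis revision in deductive qualitative analysis.
# Every name below (fits_case, revise_to_fit, the dict layout) is a hypothetical
# illustration of the workflow described above, not a standard procedure or API.

from copy import deepcopy


def fits_case(framework: dict, case: dict) -> bool:
    """Does the current working framework account for this case?
    In real research this is an interpretive judgment, not a mechanical check."""
    return all(case.get(concept) in accepted
               for concept, accepted in framework["expectations"].items())


def revise_to_fit(framework: dict, case: dict) -> dict:
    """Broaden the framework so the contradicting case is also covered,
    and keep a record that a revision was made."""
    revised = deepcopy(framework)
    for concept, observed in case.items():
        revised["expectations"].setdefault(concept, set()).add(observed)
    revised["revisions"] += 1
    return revised


def analyze_cases(initial_framework: dict, cases: list[dict]) -> dict:
    """Test the framework on case 1, revise it, test the revision on case 2, and so on."""
    framework = deepcopy(initial_framework)
    for case in cases:
        if not fits_case(framework, case):
            framework = revise_to_fit(framework, case)
    return framework


if __name__ == "__main__":
    # Hypothetical starting hypothesis: emotionally expressive persons are resilient.
    start = {"expectations": {"emotional_expressiveness": {"high"},
                              "resilience": {"high"}},
             "revisions": 0}
    cases = [
        {"emotional_expressiveness": "high", "resilience": "high"},
        {"emotional_expressiveness": "low", "resilience": "high"},  # contradicts; triggers a revision
    ]
    print(analyze_cases(start, cases))
```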

Sensitizing concepts (Blumer, 1986) give direction to researchers. They alert researchers to some important aspects of research situations, though they also may direct
attention away from other important aspects. Our theories and assumptions may help us to
identify patterns and processes but they may blind us to others. Research usually begins with
such concepts, whether researchers state this or not and whether they are aware of them or not.
Furthermore, researchers planning on doing descriptive research as well as those interested in
theory development would probably have more transparent processes and also find more focus to
their research if they identify and define sensitizing concepts.

Concepts that form hypotheses and sensitizing concepts form a set of pre-established
codes that you identify and define before you collect data and analyze it. Researchers may have
reason to change the codes over the course of the research, as they would any other part of the
design. Other codes are identified in the course of doing research. The sources of them can be
research and theory that researchers read as they collect and analyze data as well as knowledge
that researchers have developed from personal and professional experiences and values.

Researchers may be wondering if unexpected findings can emerge if they begin fieldwork with prior frameworks. Yes, they can. A purpose of qualitative research is the
construction of new understandings. In many, but maybe not all forms of qualitative approaches,
researchers consciously try to refute their hypotheses, frameworks, and concepts that they think
are central. That is, they purposefully seek information that will contradict what they have
assumed and what they have found so far through their research.

They are doing the research in order to arrive at deeper, fuller understandings, and because this is their goal, they will end up with hypotheses, other frameworks, descriptions, and concepts that could be quite different from what they expected, or they could end up with material that at least has been changed to better reflect what informants state to be their experiences.

Informational Interviews and Preliminary Research

Sometimes students believe that they are researching an area in which little is known, or
they want to develop a research question that comes from the field and/or from personal
experience and not solely from published research. In these cases, students can do informational
interviews with persons who have direct experience in their areas of interest in order to develop a
focus and research questions. This would probably not require a human subjects committee
review, but it’s best to contact them and find out. Another approach is preliminary research for
the purpose of finding a focus and research questions. This may involve focus groups and
interviews. Such preliminary research would likely require human subjects committee review.
If students have personal or professional experience in the area they are researching, this
experiential knowledge can help them develop the focus, the questions, and the conceptual
framework in general.
Strauss (1987) was explicit about the necessity of doing preliminary work when
writing proposals for funding. He wrote, “No proposal should be written without
preliminary data collection and analysis” (p. 286). He also recommended that researchers
give examples of the codes they have constructed from such work. He gave other advice, such as providing clear definitions of terms and avoiding jargon, that is consistent with the NIH document.

Content of Dissertations

The following is a discussion of topics that typically are included in dissertations and
other research reports. Your department and/or dissertation committee may provide you with the
outline they want you to follow. This outline typically is somewhat negotiable but certain topics
are essential to any proposal, as you will learn as you gain in experience.

Abstract

An abstract describes the issue to be researched, preferably in one sentence, as well as characteristics of the sample such as number, age, gender, and other demographics relevant to the topic; methods and methodology; and sometimes significance. The APA manual provides
guidance about the contents of abstracts.

Introduction

Every piece of scholarly writing has an introduction that provides an overview of the
scholarly work. The introduction of a proposal provides an overview of the entire work.

Conceptual Framework

Most faculty members agree that students must begin their dissertations with a conceptual
framework, which includes a review of the literature, a précis of the concepts and ideas they are
working with, and research questions or hypotheses (Drisko, 1997). I have listed the
components of a conceptual framework in the order they should appear in a dissertation. Being a
scholar, and Ph.D. students are scholars, requires familiarity with the scholarly traditions of their
research areas. One of the challenges in doing qualitative dissertation research is the likelihood
that reading the relevant literature will be on-going throughout the course of doing the research
and writing it up. Deductive qualitative analysis (Gilgun, 2005) recognizes these issues and
provides guidelines for doing qualitative research that begins with an initial conceptual
framework.

Once students have a focus that they developed through their preliminary work discussed
earlier, then they can do a literature review organized around their focus. Their goal would be to
understand what is known and to identify gaps in what is known that they can respond to in their
research. Their goal is to add to what is known, correct what is known, or even contradict what is known, or all three. If they use personal and professional experience and values in the development of the focus, in guiding the topics covered in the literature review, and in formulating
and defining hypotheses, they should clearly state when they are doing so.

From the literature review, experiential knowledge, and preliminary research if any,
students can write a précis, which summarizes the literature review, states the focus of the
research, shows why this focus is significant, states the research method that will best suit the research focus, and indicates the methodological perspectives they are taking. They also write
out their hypotheses, or research questions, or assumptions about their topic. They define any
concepts that are part of these statements and questions.

Methodology

Methodology is a term that designates the set of principles researchers use that guide their
research (Gilgun, 1999b). A section on methodology usually is in dissertations. Typically, in
qualitative research, methodologies are broadly phenomenological, where the focus is on meanings and experiences that informants share, or they may be more epistemologically focused, where the emphasis is less on meanings and experience and more on whether what the informants report is reliable and valid. We all have plenty of training on reliabilities and validities but less on phenomenology. Good sources for understanding phenomenological perspectives include van Maanen (1988), Patton (2002), and Benner (1994) on interpretive phenomenology. Qualitative Health Research, a journal that Jan Morse edits, is full of good information, as is the Denzin and Lincoln (2000) Handbook of Qualitative Research.

A great deal of qualitative research is emancipatory; that is, seeking to document the
various ways that social structures, cultural themes, and practices affect persons' access to power,
opportunities, and prestige. These issues are related to status variables such as gender, age,
sexual orientation, social class, physical abilities and looks, ethnicity, and country of origin.
When such topics are part of the research, then a statement on the emancipatory nature of the research is typical—that is, whether the research takes a feminist, critical, or participatory action research perspective. Denzin and Lincoln (2000) give a good preliminary discussion of these issues.

There are countless variations in methodologies. Methodologies have implications for the important questions about methods, such as whether researchers are interested in
• testing, developing, and/or refining theory as I was in my study of the moral discourse
of incest perpetrators (Gilgun, 1995)
• typologies that show the variations of a particular phenomenon, such as how
perpetrators of child sexual abuse see their victims (Gilgun, 1994);
• an a priori set of codes that serve as sensitizing concepts that function as a means of
alerting researchers as to what to look for in texts (cf, Gilgun, 1999a); or
• a set of guiding principles that organize findings, such as those that are frequent in
various types of narrative analysis (See Crepeau, 2000, for an example.);
• or loosely structured accounts such as those characteristic of oral histories.

Methods involve the concrete steps that researchers take in order to do the research. Methods stem from methodological perspectives. Thinking through and writing up methodological perspectives helps students decide which type of qualitative research they want to do, which approaches to data collection and analysis to use, and how to write up findings.

The Logic of the Design

The design of a research project, including dissertation research, is a plan for how
researchers intend to do the research. All of the elements of a proposal follow from the
conceptual framework and the methodology (Drisko, 1997). Thus, sampling plan, number of
interviews, interview schedules, plans for observations, data collection and analysis,
interpretations, and writing up of findings all must be shown to be connected to the conceptual
framework and methodology. Furthermore, the voices in which writers of dissertations speak,
the degree to which writers of dissertations share their own perspectives, and the space and care
devoted to the voices of informants all must be consistent with the conceptual framework
(Gilgun, 2005).

Human Subjects Committee Approval and Ethical Considerations in General

All dissertation research in the United States must be approved by an institutional review
board typically called the IRB or the Human Subjects committee. Guidelines for the approval
process are developed at the federal level and implemented at colleges, universities, and many
other public institutions that receive federal funding. The purpose of these committees is to
provide ethical oversight of research. This means they are charged to protect human subjects, to
ensure that consent to participate in research is informed, that subjects understand risks and
benefits, and that they know they can withdraw from the research at any time without prejudice.

Researchers cannot approach potential subjects directly, but must recruit through a third
person who informs potential subjects about the possibility of participating in the research. Only
with the permission of potential subjects may the third parties give their names to researchers.
Alternatively, potential subjects are free to contact researchers to find out more about their
research. Only after researchers engage potential subjects in a multiple-part consent procedures
can researchers enroll subjects in the project.

Beyond the requirements of human subjects committees, researchers are ethically bound to ensure that participants freely choose to be part of the research, are free to leave at any time, and are treated with the utmost respect for their own sense of safety and well-being. Researchers often don't realize the power that they have over research participants, who may see researchers as powerful persons whose judgments about them matter. Procedures for ensuring the safety and freedom of choice of participants are an important part of dissertation proposals.

Sampling and Recruitment

Who to interview, the number of persons to interview, and the length of interviews depend upon the conceptual framework, including the research questions to be answered. However plans are made for sampling, they must fit the conceptual framework. A general guideline for dissertation research is 40 to 60 hours of interviewing or observations. The number of persons to be interviewed or the number of settings to be observed depends on whether researchers are more interested in depth or breadth. If breadth is the goal, then at least 20 persons should be interviewed. If depth is a goal, as few as one person can be interviewed. In studies of families, in-depth work would suggest at least two families with two or more family members being interviewed. As stated, the total interview time should be between 40 and 60 hours. The same is true of settings. One setting is fine as long as the setting has enough going on to justify 40 to 60 hours of observations and interviewing.
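As a rough arithmetic check on these guidelines, the sketch below computes the total interview hours implied by a sampling plan and flags plans outside the suggested range. The function names and example numbers are hypothetical; only the 40-to-60-hour range comes from the guideline above.

```python
# A rough planning check against the 40-to-60-hour guideline mentioned above.
# The function names and example numbers are hypothetical; only the hour range
# comes from the text.

def interview_hours(n_participants: int, interviews_each: int, hours_per_interview: float) -> float:
    """Total hours of interviewing implied by a sampling and interview plan."""
    return n_participants * interviews_each * hours_per_interview


def within_guideline(total_hours: float, low: float = 40.0, high: float = 60.0) -> bool:
    """Does the plan fall within the suggested 40-60 hour range?"""
    return low <= total_hours <= high


if __name__ == "__main__":
    # Breadth-oriented plan: 20 participants, one 2-hour interview each = 40 hours.
    breadth = interview_hours(n_participants=20, interviews_each=1, hours_per_interview=2.0)
    # Depth-oriented plan: 6 participants, three 3-hour interviews each = 54 hours.
    depth = interview_hours(n_participants=6, interviews_each=3, hours_per_interview=3.0)
    for label, hours in [("breadth", breadth), ("depth", depth)]:
        print(f"{label}: {hours} hours, within guideline: {within_guideline(hours)}")
```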

There are many ways to choose a sample, but again the sampling plan must be consistent
with the conceptual framework. If students want to develop and test theory, then they can learn
how to do theoretical sampling, which involves having a theoretical justification for choosing the
next person or setting to engage (Glaser, 1978; Strauss & Corbin, 1998), and/or negative case
analysis, which involves deliberately seeking informants or settings that are different enough from
the sample already worked with to suggest that new understandings might arise (Gilgun, in press
a).

In recruiting participants, the IRB requirements discussed earlier have to be followed. Furthermore, researchers require the help of persons who can identify and aid in the recruitment
of study participants. Sometimes developing relationships with these persons is an essential part
of recruitment. This can take a great deal of time that student researchers should take into
account in order to plan their research. Researchers have to ensure that these people understand
the research well and understand the significance of non-coercion. Recruitment may take a long
time. Researchers often are frustrated by delays in the referral of potential participants. Human
subjects committees no longer allow snowball sampling if this means that researchers contact
potential participants themselves. Persons who have participated in the research can inform
others of the possibility of being in the research, but researchers cannot ask them to do this.
Here, too, we have to be careful of not taking advantage of some participants' desires to please us.

Interview Plan

After deciding on the depth or breadth issue, researchers are then positioned to decide on
how many interviews and to set up the interview guide. I recommend three interviews,
especially if the topic is sensitive. If the interview involves what persons think of a current event
or a product to be bought at stores, then one interview is likely to be enough. If the topic is at all
personal, three interviews can work well. In some cases, one or two interviews would work, but
this must be carefully justified.

With a three-interview format, the first one typically is a way for participants and
interviewers to get to know each other, for the researcher to gather some demographic data, and
some general information that is relevant to the research questions. The second interview can go
into more depth and gather more detail. The third interview can be used to answer questions
stirred by the first two interviews and to take the time to do a wrap-up, which can involve a
review of what researchers understood informants to have said and statements of appreciation.
Wengraf (2002) also recommends three interviews for his historical-biographic method, and he
provides a structure for two different types: a loosely structured and a more tightly structured set
of interviews. In some of my research, I have done several interviews. I was looking for detailed
life histories on sensitive subjects, but such in-depth interviewing is typically not required in
dissertation research, unless the research questions call for it.

In general, students should be prepared to do at least two interviews unless their conceptual framework justifies one.

Interview Schedule

The topics that students want to cover in their interviews should be laid out before they
begin the interviews. How loose or structured the interview schedule is, the questions that are on
the schedule, and plans to modify the questions as researchers gain experience with participants
and their concerns stem from the conceptual framework. The schedules are subject to Human
Subjects Committee review.

Apparatus for Conducting Interviews and Observations

Researchers typically audiotape interviews. Videotaping interviews is another approach. Sometimes researchers do both simultaneously or sequentially. Observations often are
videotaped. When researchers do observations, they sometimes speak into a tape recorder at
intervals to record their observations. Typically, they do this out of sight of the settings they
observe.

I recommend taking two tape recorders to interviews in case one breaks down, along with extra batteries and more tapes than you think you will need. Be sure to label the tapes before you put
them in the tape recorder. Record at the beginning of each tape the date, the name(s) of the
person(s) you are interviewing, and the place.

Overview of Data Analysis

Learning to do the analysis of qualitative data is on-going. Those who are just
beginning and those who have been doing it for decades are still learning. We learn how to do the analysis by doing it, by consulting what others have done, and by discussing our
work with others.

Transcriptions

When interviewing, I recommend verbatim transcription that involves including the "ums" and "ahs" and counting the number of seconds the pauses last. These details can be important cues about informants' states of mind—and your own. For example, after a particularly startling statement by one of my informants, I was silent for about 13 seconds. That
is a long time in an interview. Sometimes informants are silent for a long time, and this can
be significant, too.

With a limited budget, some researchers only transcribe parts of the tapes they
think are relevant to their research. With this said, I recommend that scholars listen to the
tapes at least one more time before they conclude that their analysis is complete. In that
way, they can check to see if they missed anything that did not seem important at the time
the partial transcription was done.

On some rare occasions, interviewees are reluctant to be tape recorded. This must be respected. However, check to see whether erasing any references to names and erasing any
information that could be identifying would ease the anxiety. If not, then immediately after the
interview, tape record your recollections of what the interviewee said in the interview and any
other relevant details. You can then transcribe this tape recorded account. During the interview,
take detailed notes. Write them up immediately after the interview using the format that follows
below. If the interviewee does not want you to take notes during the interview, then respect
that, but, again, write up your notes right after the interview using the format below. Be sure to
inform the participant that this is your plan.

Often students transcribe the tapes themselves. If they can hire someone else to do the
transcription, it is important that researchers instruct them in how they want the tapes
transcribed. Verbatim transcriptions are most desirable.

If videotaping interviews and observations, it is not necessary to transcribe the verbal portions in their entirety, but some researchers may.

Fieldnotes

If you are doing observations only, fieldnotes are your data. When doing interviews,
fieldnotes are invaluable. They are a wonderful source of ideas and can provide a good
accounting of the phenomena in which researchers are interested. A useful way to organize
fieldnotes is to have four sections (a simple template is sketched after this list):

1) Preliminaries: a careful description of the setting, who was present, description of interviewee and/or persons observed, and diagrams of the settings and of any movements that
may have occurred among the participants in the setting, when this is relevant to the research
questions.

2) Descriptive text: a section on descriptions of what happened, who said what, the non-
verbals, the tones of voice, and any other relevant detail.

3) Observer comments (O.C.): a place for researchers to record their emotional reactions,
their doubts, fears, and concerns. These can be placed in the midst of the text of the descriptions
and labeled “Comments.” Some reflections and quick thoughts about applications, comparisons
to what you’ve found in other interviews/settings/observations, and linkages to related research
and theory can be made here.

4) Memos: a section at the end of the fieldnotes where researchers record their reflections
in a more leisurely way. They can reflect on what they observed in their interviews/observations,
think about what else they might want to know, think about research and theory that is relevant
to what they are learning, the scholarship that the findings might contribute to, and comparisons
across cases, among many other topics.
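For researchers who keep fieldnotes electronically, the four sections above can be captured in a simple template. The sketch below shows one hypothetical way to do so in Python; the class and field names merely mirror the four sections and are not a standard fieldnote format.

```python
# A hypothetical template mirroring the four fieldnote sections described above.
# The class and field names are illustrative, not a standard fieldnote format.

from dataclasses import dataclass, field
from datetime import date


@dataclass
class FieldnoteEntry:
    session_date: date
    preliminaries: str          # setting, who was present, descriptions, diagrams (as text)
    descriptive_text: str       # what happened, who said what, non-verbals, tone of voice
    observer_comments: list[str] = field(default_factory=list)  # emotional reactions, doubts, quick links
    memos: list[str] = field(default_factory=list)              # more leisurely reflections, cross-case comparisons


if __name__ == "__main__":
    note = FieldnoteEntry(
        session_date=date(2024, 3, 1),
        preliminaries="Interview at participant's kitchen table; participant and researcher present.",
        descriptive_text="Participant described the week of the job loss; long pause (~10 s) before answering.",
    )
    note.observer_comments.append("O.C.: I felt uneasy when the topic shifted; check my own assumptions.")
    note.memos.append("Compare with interview 2: similar silence around the same topic.")
    print(note)
```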

Bogdan and Biklen (1998) have an excellent discussion of fieldnotes. I have found fieldnotes to be very important in my analysis of data. More than one of my articles was based primarily on fieldnotes ("Gendering Violence" and "Fingernails Painted Red").

On-Going Analysis

While conducting interviews and/or doing observations, you will naturally be analyzing
data. More or less automatically, you will be classifying what you are learning in informal ways,
based on your store of general knowledge. Ideally you also will have in mind loosely formulated
hypotheses that you are testing, in the sense that you want to see how they hold up as you do the
analysis. You also will be changing these hypotheses in light of your on-going interpretations.
Furthermore, you will be identifying dimensions of the concepts that compose the hypotheses.

You can think of hypotheses in various ways. I think of them as preliminary statements
about the relationships among variables. If you are doing deductive qualitative analysis (DQA)
(Gilgun, in press a), then you are likely to have explicit hypotheses that you begin the research with. Still, because DQA involves searching for ways to add to, undermine, or transform your
original ideas, you will change the hypotheses as well.

Solo or Group Analysis of Data?

The ideal is to have at least one other person to talk to about emerging findings, including
having that person read your fieldnotes and discuss them with you. You can edit out any
personal information that you prefer not to share. Sometimes it is helpful to have at least one
other person read and code your transcripts with you. The synergy that results can greatly
enhance your understandings and your analysis. Some people call these sessions “peer
debriefings,” but this term has such strong military connotations for me that I don’t like to use it.

Besides, group analysis of data is a term that goes way back, at least to Booth’s (1903)
studies of the London poor at the end of the nineteenth century. My preference is to keep
qualitative approaches rooted in their scholarly traditions.

Testing out Your Interpretations

As Roland Barthes (1974) observed, texts are "wily"; that is, open to multiple interpretations. The interpretations you make logically stem from your conceptual framework and often from personal and professional experience. It is ideal to be aware of the sources of
your interpretations. Many qualitative researchers seek to represent informants’ points of view,
often through extensive excerpts from interviews. They often want to make their own
interpretations of what informants said. Thus, in general, there are sections that are meant to
represent informants’ views and then sections devoted to researcher interpretations.

In the conduct of my life history research, I often share with informants toward the end of
the interview period what I have heard them say. I ask them if I have understood what they have
said. Usually, they are unused to having their words interpreted because so much of the material
they share with me they rarely tell others.

It is important to check in with informants to make sure you do represent them well. You
are free not to share your interpretations, but I have many times, and given that I have developed
relationships with informants, they are interested in my thoughts on their lives.

Another way for me to test my interpretations is through teaching and through lectures
and seminars in community and academic settings. It usually is quite easy to tell if you are
making any sense. People tell you directly or through body language what they think.

Treating Each Interview as a “Pilot”

If you discuss with partners in your research each interview after it occurs, you may find
reasons to tinker with your approach to the next informant, such as asking more follow-up
questions or doing more active listening. You can evaluate your interview protocol and your
interview style as you go along and thus modify your approach. Writing fieldnotes after each
interview also provides you with opportunities to reflect not only on your emerging
interpretations but also on the processes of how you are conducting interviews and how
informants are responding.

Coding

Strauss and Corbin (1998) talk about three types of coding. One is open coding, where
researchers “sweep” through their data marking up the text. You can do this either within
the computer files or you can do it on paper. If you print the transcripts, it’s a good idea
to give them narrow columns and leave about half of the page blank so you can write
your codes next to the segments you are coding.

The second type they call axial coding, which means you are relating codes to each other.
They recommend connecting codes through their own coding “paradigm” which is
arranging the codes according to “conditions, context, action/interactional strategies and
consequences” (p. 96). You don’t have to use this approach but it does provide one kind
of structure. Axial coding simply happens. I did it for years automatically without
thinking that I had to give it a name. I don’t especially like the name—it is natural that
one code will be linked to another. That’s just how we think—our informants as well as
ourselves as researchers.

The third type is selective coding. This means you probably have exhausted the
data in terms of developing any new codes but you go back to the data once again to find
instances that might add further dimensions to your codes. You thus are
dimensionalizing your codes.

By the time you get to axial coding, you can probably call your codes “core
categories,” again language developed by Strauss and colleagues (Glaser, 1992, 1978; Glaser &
Strauss, 1967; Strauss, 1987; Strauss & Corbin, 1998). Core concepts can order a lot of the
information that you are developing. An example is the idea of justice in a paper I did on the
moral discourse of incest perpetrators (Gilgun, 1995). I defined the term justice, which was a
sensitizing concept in my research, and then did deductive qualitative analysis. The concept
helped me to see in my material what I hadn’t seen previously, and the procedures of DQA
guided me to seek exceptions and reasons to reformulate both the dimensions of the concepts and
the relationship. Thus, I modified the core concepts and the hypotheses that I had begun my
research with.

Sources of codes. What do you think the sources of your codes will be? This often is an
unstated issue in qualitative research. Many researchers think they should enter the field with no
pre-conceptions and the codes will somehow emerge from the analysis of the data. Many
people think this is what induction is, but I think it’s naïve empiricism. We bring with us
our own frames of reference and these frames influence what we notice and what we
don’t and how we interpret what we notice. These frames of reference can be outside of
our awareness.

So, in terms of codes, I think in many instances researchers would be more open about their process if they developed a set of codes and defined them before doing any data collection. They will identify new codes and aspects (e.g., dimensions) of codes they hadn't anticipated when they collect data and do the analysis.
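One straightforward way to be transparent about pre-defined codes is to write them out, with definitions and sources, before any data collection. The sketch below records such a preliminary codebook in Python; the code names, definitions, and sources are illustrative examples only, not a recommended scheme.

```python
# A hypothetical codebook written out before data collection, as suggested above.
# Code names, definitions, and sources are illustrative examples, not a standard scheme.

from dataclasses import dataclass


@dataclass(frozen=True)
class Code:
    name: str
    definition: str
    source: str   # e.g., "sensitizing concept", "prior research", "professional experience"


PRELIMINARY_CODEBOOK = [
    Code(name="emotional expressiveness",
         definition="Talk in which informants name and describe their own feelings.",
         source="sensitizing concept from the conceptual framework"),
    Code(name="adversity",
         definition="Accounts of hardship, loss, or maltreatment that informants describe.",
         source="prior research and theory"),
    Code(name="help-seeking",
         definition="Reports of turning to others for support after a difficult event.",
         source="professional experience"),
]

if __name__ == "__main__":
    for code in PRELIMINARY_CODEBOOK:
        print(f"{code.name}: {code.definition} (source: {code.source})")
```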

There is a strong philosophical background to my point of view. As Popper (1969) pointed out eloquently, there is no such thing as induction. We bring our
assumptions, perspectives, beliefs, and values with us. Because we do, we operate as
deductive human beings, not inductive. There is no such thing as an immaculate concept
(Van Maanen, 1988), perhaps echoing Glaser (1978), who said, “Immaculate conceptions are not
necessary” (p. 8). In other words, all of our observations have an overlay of our own
thinking, perceptions, etc. So, we can’t really enter the field without codes that are
already formed. Codes can be defined broadly as categories that we expect will help
us organize the questions that we ask and that will influence how informants respond to
our questions.

Organization of Findings

While thinking about data analysis, it is important to think about how you will
present your findings in research reports. Usually the revised/new concepts, hypotheses,
and patterns of relationships provide the organizing framework. Diagramming your
findings can be helpful, too. The findings section begins with an overview of what the
researchers have concluded about their analysis. The following are typical ways to organize
findings.

Using concepts, hypotheses, and patterns to organize. List concepts if your findings are a typology, or organize findings under headings, which typically you develop from your codes. If your findings are hypotheses, list them, or if patterns of behaviors, relationships, etc., list them. Then create sections for each concept, hypothesis, or pattern. Then describe each of them in your own words and how they might link to other findings. Present excerpts from your data to support your descriptions, which often are also interpretations. The excerpts provide some evidence for interpretations as well as illustrate the "conceptual" findings. Also in this
section, researchers link their findings to related research and theory.

This linking to related research and theory is important to do, especially in qualitative
research because our findings may be quite different from what has come before and anything
we can do to show how what we find fits and/or changes other research and theory will help
readers understand our findings. Besides, doing this simply is good scholarship.

Diagrams. Schematics can be terrific ways of showing your findings, especially if there are multiple pathways and/or multiple outcomes and processes. Even with samples that
have small n’s, there can be more than one process to diagram.

No matter how you organize findings, it is important to describe any general patterns you
find and the exceptions to the patterns. This kind of reporting is one aspect of qualitative
research that makes it so important to do.

Summary of Issues related to Data Analysis

Data analysis is a multi-faceted endeavor in qualitative research. It requires a great deal of planning, an appreciation of the provisional nature of human knowledge, capacities for being open to views that are different from our own, strong conceptual skills, and excellent scholarship. Analysis can take a long time to learn. Much of the learning takes place as researchers do the research.

Applications of the Findings of Qualitative Research

How to talk about and apply findings from qualitative research can pose pitfalls for new researchers. One common error is for researchers to discuss their findings as if they had a random sample. Thus, researchers may have a sample of 10 or 20, with half being women and half men. They will state something like the following: "75% of the women and only 20% of the men talked to someone else about their job loss." It is much more accurate to say, "There were patterns in the findings in terms of gender, job loss, and talking things over. The women in this sample were more likely to talk about their job loss than the men." Researchers can even make a table with numbers showing these findings, but they have no way of knowing whether the numbers will maintain their proportions in a larger sample.
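A small tabulation sketch can make this point concrete. The sample data and function below are hypothetical; the idea is simply to report raw counts and a pattern statement rather than percentages that imply generalizable proportions.

```python
# Hypothetical tabulation of a small qualitative sample, reported as counts and a
# pattern statement rather than percentages, for the reasons discussed above.

from collections import Counter

# Hypothetical coded data: (gender, talked_to_someone_about_job_loss)
cases = [
    ("woman", True), ("woman", True), ("woman", True), ("woman", False),
    ("man", True), ("man", False), ("man", False), ("man", False),
]


def counts_by_group(data):
    """Count how many people in each group did and did not talk about the job loss."""
    return Counter((gender, talked) for gender, talked in data)


if __name__ == "__main__":
    tally = counts_by_group(cases)
    for gender in ("woman", "man"):
        yes, no = tally[(gender, True)], tally[(gender, False)]
        print(f"{gender}: {yes} talked to someone, {no} did not (n={yes + no})")
    # A pattern statement, not a percentage claim:
    print("In this sample, the women were more likely than the men to talk about their job loss.")
```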

How researchers talk about the use and applications of findings is often inaccurate as
well. To say findings are not generalizable has no meaning. Generalizable to what? An average
woman? Man? How everyone feels about job loss?

Analytic and Probabilistic Generalizability

Researchers accustomed to thinking in terms of probability theory find it difficult to understand how the results of qualitative research can be generalized. Qualitative approaches rarely have random samples or even randomized assignment to groups. The sample can be small. I've published studies based on an n of 1 (Gilgun, 1999a) and an n of 2 (Gilgun & McLeod, 1999). What can these minute samples tell us? The question can be reformulated to what can case studies tell us? This discussion goes far beyond the scope of this article, but I have a published paper (Gilgun, 1994) and one on the web (ssw.che.umn.edu\faculty\jgilgun.htm) that discuss them in detail. Yin (2002) and Stake
(1995) have informative books.

Generalizability has several different meanings, and only one of them is probabilistic and
dependent on random samples. Some well-established philosophies of science lead to detailed
studies of cases, such as work done in chaos and complexity theory, genetics, astrophysics, and
the work of early behaviorists. In applied social research, the usefulness of findings depends
upon their analytic generalizability, meaning whether they shed new light on phenomena of
interest. In the social sciences, the concern is whether they can make contributions to policies,
programs, and interventions. As Cronbach (1975) observed in his presidential address to the
American Psychological Association, all findings no matter how derived are working hypotheses
when applied to new settings.

Subjectivity

Four basic ideas can organize discussions about subjectivity in qualitative research. First,
qualitative research typically involves the immersion of researchers in the field, which means
that researchers seek to understand persons and their cultures in great detail and from many
points of view (Gilgun, 1992). Geertz (1973) calls this “thick description.” Second, to develop
thick descriptions requires that researchers make personal connections to those they research.
Otherwise, informants might not trust them enough to reveal details important to understanding.
Third, researchers have the power to represent their informants to many different audiences.
Fourth, researchers have an obligation to engage in conjectures and refutations in on-going ways if
we are to represent informants and their cultures in ways that reflect the perspectives of
informants more than they reflect the biases of researchers. These are enduring issues in social
research, not only qualitative research.

With few exceptions, the conduct of qualitative family research involves highly personal
and sometimes painful topics that can evoke powerful emotions in researchers and informants.
In such evocative situations, researchers have opportunities to explore deep meanings of the
phenomena of interest and thus develop new theories and understandings that have rich and
nuanced dimensions. At the same time, researchers have a significant task in managing their own
emotions. Qualitative research on families can be research that “breaks your heart” (Behar,
1996).

Researcher self-awareness is important in emotion-laden situations. Academic integrity requires fair presentations of informants' experiences. Knowing the difference between one's own responses and the views of informants is central to the interpretation of research material. Informants, too, have emotional reactions to being part of a research project. Especially when
working in emotionally sensitive areas, researchers are bound ethically to ensure that no harm
comes to participants.

While informants have the power to define us as well as to choose what to withhold and
to share, researchers not only have these powers but we also have the power of representing
informants in our research reports and public presentations. Attention to issues such as how we
represent informants, how and whether we influence informants, and how to account for any
influences we might have had on informants all improves the quality of the research.

Who we think we are in relationship to informants is a topic worthy of serious consideration. For example, in my research, I am a woman who talks to men about their violence toward women. Do I have a point of view on men who commit violence? Do I have a point of view of myself as a member of a class of persons oppressed by male violence? Certainly.
How I manage these issues greatly affects what the informants say to me. Can I shape and
manage my own reflexivity so that my representation of these men is balanced? There are no
easy answers. Researchers who delve into the intimate areas of family life are confronted with
similar questions.

General Ideas on Judging the Quality of Qualitative Research

Credibility of Qualitative Research

For the past 40 years and up to today, the epistemological concerns of reliability and validity have been the standards by which the quality of research is judged. Though much of qualitative
research is concerned with ontological issues, such as what it means to be human in particular
situations, qualitative researchers could and sometimes did justify their research in the
epistemological terms of their day (e.g., Jick, 1979; LeCompte & Goetz, 1982; Kidder, 1981;
Rosenblatt, 1983). Most qualitative approaches, however, represented more than what could be
accounted for in these other epistemologies.

Glaser and Strauss (1967) were concerned that qualitative research be taken seriously.
Building on Chicago traditions, they developed guidelines for evaluating qualitative research that
were quite different from the prevailing ideas of the time. As they pointed out, the immersion of
researchers in the field became a fundamental argument for the strength of qualitative research.
By the time researchers are ready to publish, they are so intimate with their material that they
have great confidence in its credibility.

Researchers' confidence and the demonstration of credibility, however, are not the same
thing, as Glaser and Strauss demonstrated. Researchers have the responsibility to convey the
bases on which others may conclude that the findings are credible. They can do so in several
ways. Credibility rests on conveying findings in understandable terms. The first strategy that
Glaser and Strauss (1967) suggested is for researchers to present their theoretical frameworks using conventional "abstract social science terminology" (p. 228). The presentation should be extensive, and, since the terms are familiar, the framework should be readily understood.

The second strategy that Glaser and Strauss (1967) discussed is to present findings in such
ways that readers are "sufficiently caught up in the description so that he [sic] feels vicariously
that he [sic] was also in the field" (p. 230). Glaser (1978) later called this quality "grab." This
connects to the methodological stances of the Chicago School of Sociology, especially as
represented by Robert Park and others. Denzin, who worked for several years with Lindesmith
and Strauss (Lindesmith, Strauss, & Denzin, 1975) on a social psychology textbook, reworked
this idea for contemporary times and saw its relevance to policy research. He wrote, "The
perspectives and experiences of those persons who are served by applied programs must be
grasped, interpreted, and understood, if solid, effective applied programs are to be created"
[emphasis in original text] (p. 12). A third strategy is to convey how researchers analyzed the
data so that readers can understand how researchers arrived at their conclusions. Constantly
comparing emerging findings across and within cases and searching for "negative cases" and
"alternative hypotheses" (p. 230) all are important to delineate. Above all, integrating the
theoretical statements with evidence help in conveying credibility.

Finally, Glaser and Strauss (1967) recognized the mutual responsibilities of researchers and
their audiences. Researchers have the responsibility to convey findings as clearly as they can,
including specifying how they arrived at their theoretical statements. Readers have the
responsibility not only of demanding such evidence but also of making "the necessary
corrections, adjustments, invalidations and inapplications when thinking about or using the
theory" (p. 232). These researchers, therefore, made modest claims for their theories, seeing
them as provisional and subject to interpretations and applications by others.

In Theoretical Sensitivity, Glaser (1978) again discussed the evaluation of grounded
theory, which he said is to be judged on fit, relevance, modifiability, and whether it works. Fit
means whether or not abstract statements and concepts are congruent with the evidence.
Refitting the theory to ever-emerging understandings as the research continues is part of the
assessment. Thus, concepts and theory are not borrowed; they "earn" their way into the
emerging theory (p. 4). Findings become relevant when researchers allow emergence to happen
and do not impose pre-conceived ideas onto them or shape findings to fit pre-formulations.
Like other qualitative methodologists, such as Thomas and Znaniecki (1918/1920), discussed
earlier, Glaser viewed all findings as modifiable as new understandings emerge. Modifiability,
in fact, is a standard by which Glaser believes theory can fruitfully be evaluated. Theory that has
fit, relevance, and modifiability will also "work"; that is, it "should be able to explain what
happened, predict what will happen and interpret what is happening" (p. 4).

Guidelines for evaluating the theory generated by qualitative methods, then, were
important to Glaser and to Strauss. Their views are based on grounded theory, which has much
in common with other methods and methodologies, but there also are differences between
approaches. These variations are to be taken into account if the many kinds of qualitative
research are to be evaluated fairly.

By now I hope it is clear that there is no one way to do qualitative research. The very
richness of these possibilities, however, can render evaluation of qualitative research
problematic. Many persons have written about judging quality (cf., among others, Altheide &
Johnson, 1994; Ambert et al., 1995; Gilgun, 1993; Lincoln, 1995; Reason & Rowan, 1981; Smith,
1993; Schwandt, 1996; van Maanen, 1988). From the point of view of common sense, a research
report perhaps could best be evaluated in terms of what it purports to be (Gilgun, 1993). After
all, we judge apples by their appleness and oranges by their orangeness. We do not fault an
apple because it is not an orange. For example, a report that considers itself phenomenological
should present results that represent informants' lived experience. If such a report does not, then
it should be revised until the purpose and the intellectual context of the research line up with
findings. If authors state their purpose is theory testing or development, then the findings must
contain a theory. In the case of deductive qualitative analysis, an initial hypothesis also must be
present. If a report states that it is experimental and is pushing the boundaries, say, of
ethnography to create a personal ethnography (an autoethnography), then the report can be
evaluated as to whether cultural themes and practices are part of it as well as whether it is a
narrative about the personal meanings of the author.

To facilitate evaluation of qualitative pieces, authors routinely, though not always, state the
type of research they are doing and provide citations. Reviewers not familiar with the style of
qualitative research can use the references to educate themselves or send the article back to the
journal editor, who can find a more knowledgeable reviewer. Some articles using qualitative
methods do not state the type of research they are doing. As a reviewer, I would fault the authors
and ask them to situate themselves in a research tradition. In some experimental work, the
tradition may be identified only in the article's abstract, but it is stated somewhere, if for no other
reason than to orient readers not familiar with that style of research.

Besides these general guidelines, the views of Glaser and Strauss (1967) and Glaser
(1978) continue to be relevant: "grab", modifiability to fit particular situations, an account of
procedures that led to the results, and the situating of findings within social science traditions.
The idea of on-going emergence implies the idea that there always is more to learn about the
phenomena; thus, Glaser and Strauss--and the Chicago School of Sociology tradition on which
they built--have modest goals for the universality of findings. Perhaps because they were
concerned that their methods and findings could be misunderstood, they also recommended that
researchers present their theoretical frameworks in conventional social science language. Using
language familiar to audiences of interest to describe methods and findings helps to bridge
research traditions.

Depending upon the type of research, many of the desirable qualities Lincoln (1995)
identified may be relevant: researchers being clear about their own stances regarding their topics;
findings that are relevant not only to other researchers and policy makers but also to
communities that have a stake in findings; attention to voice so that multiple points of views--or
voices--are represented in research reports, not simply those who have traditionally spoken for
others; and accounts of reflexivity that show a deep understanding of the positions of others
while staying in touch with one's own experiences, including any transformation that researchers
may experience as a result of the research. This long and incomplete list is suggestive; some but
not all of these desired qualities are relevant to any particular research report. I think most
qualitative researchers would agree on the overall importance of presenting evidence of
researchers' immersion in the field, conveyed through rich and deep descriptions that evoke new
understandings.

Though researchers from within an interpretive tradition would accept the above qualities
as desirable, there are some who consider themselves qualitative researchers who would not. As
one example, Clavarino, Najman, & Silverman (1995), concerned with the ambiguity of analysis,
argued forcefully that inter-rater reliabilities must be established in the analysis of qualitative
data. They urged other researchers to resist "methodological anarchy" (p. 224). They don't
specify what constitutes this "anarchy," but it could be such tradition-bound procedures as group
analysis of data that does not involve mathematical inter-rater reliability. Miles and Huberman
(1994) also recommend doing inter-rater reliabilities, using the term check coding. They
note as an issue in this regard that persons from one social science tradition may not see the same
things in data as persons from others. Their discussion suggests that the coders should ultimately
agree on the names of the codes and the meanings of the data. Such a stance could be consistent
with some philosophies of science, but a viable alternative is to code the same chunk of data in
more than one way and to heed the idea that multiple interpretations are not only possible but
desirable.
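
Where a study's paradigm does support the kind of quantified check coding that Miles and Huberman (1994) describe, agreement between coders is often summarized with a statistic such as Cohen's kappa. The sketch below is only an illustration of that general idea, assuming two coders who each assign one code to every data segment; it is not a procedure prescribed by any of the authors cited here, and the code labels and segment data are hypothetical.

```python
# Minimal sketch: Cohen's kappa for two coders who each assigned one code
# per data segment. All code names and data here are hypothetical examples.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Compute Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(coder_a) == len(coder_b) and len(coder_a) > 0
    n = len(coder_a)

    # Observed agreement: proportion of segments where the two coders agree.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Chance agreement, based on each coder's marginal code frequencies.
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)

    if expected == 1.0:  # both coders used a single identical code throughout
        return 1.0
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned to ten interview segments by two coders.
coder_1 = ["shame", "denial", "minimizing", "shame", "denial",
           "shame", "minimizing", "denial", "shame", "denial"]
coder_2 = ["shame", "denial", "denial", "shame", "denial",
           "shame", "minimizing", "minimizing", "shame", "denial"]

print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")  # about 0.69
```

A kappa of 1.0 indicates perfect agreement and 0 indicates agreement no better than chance; whether computing such a figure is appropriate at all depends, as argued here, on the paradigm guiding the study.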

Not all qualitative data must be subjected to inter-rater reliability studies. By the time
well-done qualitative research is presented in a dissertation or research report, researchers and
their co-researchers with whom they have analyzed data are so immersed in the data that not only
is the analogue of inter-rater reliability already present, but there is a depth and breadth made
possible through prolonged discussion and reflection that goes far beyond what highly focused
inter-rater reliability studies can offer. Similarly, no single one of the desirable qualities
described above must be present in any one study. The paradigm underlying the research guides
decisions about procedures and interpretive processes. To insist that all qualitative studies must
do a quantified assessment of inter-rater reliability or must use other procedures that are
inconsistent with the research paradigm would be akin to insisting that the only good apples are
those whose juice tastes like the liquid that comes from oranges.

Qualitative interview data are sometimes subjected to inter-rater reliability studies, as
Mendenhall, Grotevant, and McRoy (1996) did in their study of couples' movement and non-
movement to openness in adoption. They were interested in relatively focused phenomena, and
the use of inter-rater reliabilities in the context of their research goals made sense. Given the
methodological sophistication that is emerging in these postmodern times, the perspective that
one set of criteria fits all research is no longer tenable. Each type of qualitative research can
productively be evaluated in terms of what it purports to be doing.

Some qualities are desirable regardless of the paradigms in which they are embedded.
Coherent organization of ideas, ideas supported by data, and clarity in the presentation of ideas
are some of them. Sussman (1993) identified "heart" as an important quality in any kind of
research. By heart, he meant a sense that the authors have conveyed something meaningful in
which they believe.

Discussion

This brief overview alerts PhD students, committee members, and advisors to some
important elements in the doing and the reporting of qualitative research. There is likely much
more to be said, but the material I’ve presented provides some direction and guidance. The
general principle is that how students do research must be consistent with their conceptual
framework.
References

Altheide, D. L., & Johnson, J. M. (1994). Criteria for assessing interpretive validity in
qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research
(485-499). Thousand Oaks, CA: Sage.
Ambert, A. M., Adler, P. A., Adler, P., & Detzner, D. F. (1995). Understanding and
evaluating qualitative research. Journal of Marriage and the Family, 57, 879-893.
Barthes, Roland (1974). S/Z: An essay (Richard Miller, Trans.). New York: Hill and
Wang. Original edition published in 1970 by Editions du Seuil, Paris.
Behar, R. (1996). The vulnerable observer: Anthropology that breaks your heart.
Boston: Beacon.
Benner, Patricia (Ed.) (1994). Interpretive interactionism. Thousand Oaks, CA:
Sage.
Bogdan, Robert C., & Sari Knopp Biklen. (1998). Qualitative research for
education (3rd ed.) Boston: Allyn & Bacon.
Booth, C. (1903). Life and labour of the people in London. Final volume. London and New
York: Macmillan.
Blumer, Herbert (1986). What is wrong with social theory? In Herbert Blumer,
Symbolic interactionism: Perspective and method (pp. 140-152). Berkeley: University of
California Press.
Clavarino, A. M., Najman, J. M., & Silverman, D. (1995) The quality of qualitative data:
Two strategies for analyzing medical interviews. Qualitative Inquiry, 1, 223-242.
Crepeau, Elizabeth Blesedell (2000). Reconstructing Gloria: A narrative analysis of team
meetings. Qualitative Health Research, 10(6), 766-787.
Denzin, Norman K., & Yvonna S. Lincoln (Eds.), Handbook of qualitative research (2nd
ed.). (509-535). Thousand Oaks, CA: Sage.
Drisko, James W. (1997). Strengthening qualitative studies and reports: Standards to
promote academic integrity. Journal of Social Work Education, 33(1), 185-197.
Geertz, C. (1973). The interpretation of cultures. New York: Basic.
Gilgun, Jane F. (2005a). “Grab” and good science: Writing up the results of qualitative
research. Qualitative Health Research, 15(2), 256-262.
Gilgun, Jane F. (2005b). The four cornerstones of evidence-based practice in social work.
Research on Social Work Practice, 15(1), 52-61.
Gilgun, Jane F. (in press a). Deductive qualitative analysis and family psychology. Journal of
Family Psychology.
Gilgun, Jane F. (in press b). Evidence-based practice, descriptive research, and the
resilience-schema-gender-brain (RSGB) assessment. British Journal of Social Work.
Gilgun, Jane F. (2002). Conjectures and refutations: Governmental funding and
qualitative research. Qualitative Social Work, 1(3), 359-375.
Gilgun, Jane F. (2001, November). Case study research, analytic induction, and theory
development: The future and the past. Paper presented at the Preconference Workshop on
Theory Development and Research Methodology, National Conference on Family Relations,
Rochester, NY.
Gilgun, Jane F. (1995). We shared something special: The moral discourse of incest
perpetrators. Journal of Marriage and the Family, 57, 265-281.
Gilgun, Jane F. (1999a). Fingernails painted red: A feminist, semiotic analysis of
"hot" text, Qualitative Inquiry, 5, 181-207.
Gilgun, Jane F. (1999b). Methodological pluralism and qualitative family research. In
Suzanne K. Steinmetz, Marvin B. Sussman, and Gary W. Peterson (Eds.), Handbook of Marriage
and the Family (2nd ed.) (pp. 219-261). New York: Plenum.
Gilgun, Jane F. (1994). A case for case studies in social work research. Social Work,
39, 371-380.
Gilgun, J. F. (1993). Publishing research reports based on qualitative methods. Marriage
& Family Review, 18, 177-180
Gilgun, J. F. (1992). Hypothesis generation in social work research. Journal of Social
Service Research, 15, 113-135.
Gilgun, Jane F., & Laura McLeod (1999). Gendering violence. Studies in Symbolic
Interactionism, 22, 167-193.
Glaser, B. (1978). Theoretical sensitivity. Mill Valley, CA: Sociology Press.
Glaser, B. (1992). Basics of grounded theory analysis. Mill Valley, CA: Sociology
Press.
Glaser, B., & Strauss, A. (l967). The discovery of grounded theory. Chicago: Aldine.
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in
action. Administrative Science Quarterly, 24, 602-611.
Kidder, L. H. (1981). Qualitative research and quasi-experimental frameworks.
In M. B. Brewer & B. E. Collins (eds.) Scientific inquiry and the social sciences. (pp., 226-256).
San Francisco: Jossey-Bass Publishers.
LeCompte, M.D. & Goetz, J. P. (1982). Problems of reliability and validity in
ethnographic research. Review of Educational Research, 52, 31-60.
Lincoln, Y. (1995). Emerging criteria for quality in qualitative and interpretive
research. Qualitative Inquiry, 1, 275-289.
Lindesmith, A. R., Strauss, A. L., & Denzin, N. K. (l975). Social psychology (5th ed.).
New York: Holt, Rinehart and Winston.
Mendenhall, T. J., Grotevant, H. D., & McRoy, R. G. (1996). Adoptive couples:
Communication and changes made in openness levels. Family Relations, 45, 223-229.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand
Oaks, CA: Sage.
Patton, Michael Quinn (2002). Qualitative research and evaluation methods (3rd
ed.). Thousand Oaks, CA: Sage.
Popper, Karl R. (1969). Conjectures and refutations: The growth of scientific
knowledge. London: Routledge and Kegan Paul.
Reason, P., & Rowan, J. (Eds.). (1981). Human inquiry: A sourcebook of new paradigm
inquiry. New York: Wiley.
Rosenblatt, P. C. (1983). Bitter, bitter tears: Nineteenth century diarists and twentieth
century grief theories. Minneapolis: University of Minnesota Press.
Schwandt, T. A. (1996). Farewell to criteriology. Qualitative Inquiry, 2, 58-72.
Smith, S. (1993). Who's talking/Who's talking back? Signs, 18, 392-407.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Strauss, Anselm (l987). Qualitative analysis for social scientists. New York:
Cambridge University Press.
Strauss, Anselm & Juliet Corbin (1998). Basics of qualitative research:
Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks,
CA: Sage.
Sussman, Marvin B. (1993). Commentary on publishing. Marriage & Family Review,
18, 109-117.
Thomas, W. I., & Znaniecki, F. (1918-1920/1927). The Polish peasant in Europe and
America, Vol. 1-2. New York: Knopf. First published in 1918-1920.
Van Maanen, J. (1988). Tales of the field: On writing ethnography. Chicago: University
of Chicago Press.
Van Manen, Max (1990). Researching lived experience: Human science for an
action sensitive pedagogy. Albany: State University of New York.
Wengraf, Tom (2002). Qualitative research interviewing: Biographic narrative and semi-
structured methods. London: Sage.
Yin, R. K. (2002). Case study research: Design and methods (3rd ed.). Thousand Oaks,
CA: Sage.

Parts of this paper adapted from


Gilgun, Jane F. (2005). Qualitative research and family psychology. Journal of Family
Psychology,19(1), 40-50.
Gilgun, Jane F. (1999b). Methodological pluralism and qualitative family research. In
Suzanne K. Steinmetz, Marvin B. Sussman, and Gary W. Peterson (Eds.), Handbook of Marriage
and the Family (2nd ed.) (pp. 219-261). New York: Plenum.

Codes usually are sensitizing concepts that over the course of the analysis may become core concepts.


Guidelines for Review of Doctoral Dissertations
Adapted from the Smith College School for Social Work

The following are guidelines for writing dissertations at the School of Social Work,
University of Minnesota, Twin Cities. They are based on the work of the doctoral committee at
the Smith College School for Social Work. These guidelines are applicable to qualitative,
quantitative and mixed methods dissertations. The School of Social Work at the University of
Minnesota, Twin Cities, also has guidelines that are specific to qualitative dissertations and a
generic outline for quantitative dissertations that provides an overview of the structure of
dissertations.

As the Smith College doctoral committee pointed out, different dimensions of this
assessment have relevance at different stages of dissertation work. Therefore, these guidelines
fall into three parts. Section I contains questions to ask yourself and that your committee will
ask when deciding upon a dissertation topic. Sections II through V are appropriate to both
proposals and research reports when the issue is describing and defending the methods used to
study the issue selected. Section VI relates only to completed dissertation reports.

The ratings are outstanding (O), satisfactory (S), needs improvement (NI), and not
applicable (NA). Use of these ratings is designed to offer feedback on the areas of relative
strength and weakness in the dissertation.

Guidelines for Review of Doctoral Dissertations: Qualitative, Quantitative, and Mixed Methods

II. SCHOLARSHIP/LITERATURE REVIEWS O S NI NA

1. Students have cited research, theory, policy, and practice
literature that is relevant to the topic. (If there is no
specific literature on the topic, studies closest to the issue
have been used.) O O O O
2. The literature review demonstrates sound knowledge
of, synthesis of, and critical thinking about the
literature. O O O O
3. The study hypotheses or questions flow clearly
from the literature review. O O O O
4. Diversity issues are discussed as appropriate to
the study topic. O O O O
5. An epistemological position is articulated and defended
if necessary. O O O O
6. A theoretical framework or perspective is articulated,
its strengths and weaknesses identified, and the choice
of theory defended. O O O O

III. RESEARCH DESIGN ISSUES O S NI NA

A. Research question

1. The research question(s) are clearly stated and
subquestions articulated. O O O O

2. The research questions flow from the literature review. O O O O


3. The rationale and assumptions that underlie the
study questions are made explicit. O O O O
B. Research Design

1. The design of the study is clearly identified. O O O O


2. The design of the study is appropriate to the
questions asked. O O O O
3. The strengths and limitations of the
design are described. O O O O

4. The scope of the study is manageable. O O O O

5. Definitions are given for all important terms
and concepts. O O O O
C. Sample

1. The nature of the study sample is clearly described
and is appropriate to the research design. O O O O
2. The method of sampling, the rationale for its use,
and its strengths and limitations are clearly described. O O O O

3. The rationale for the sample size is clearly given. O O O O


4. Appropriate measures to maintain and/or evaluate the
integrity of the sample are employed (e.g., response,
selection, and attrition issues addressed). O O O O

5. Diversity issues in the sample are addressed. O O O O




D. Data Collection

1. The method(s) of data collection used are identified, a
rationale for the choice given, and the strengths and
weaknesses of the data collection method are identified. O O O O
2. Explanation is given about how data bearing on each
concept and variable important to the study will be
gathered, including discussion of reliability and validity. O O O O

3. Data collection procedures and tools are
clearly described. O O O O

4. Methods of data collection are appropriate to the
study sample, including relevance to gender, race/
ethnicity, and other diversities. O O O O

5. Pilot testing has been done and its results described. O O O O


E. Data Analysis

1. The nature of the data analysis is clearly described. O O O O


2. Data analyses are consistent with the study questions,
design, sample, and the nature of the data collected. O O O O
3. The data analysis demonstrates sound knowledge
of the techniques used and their alternatives. O O O O
IV. ETHICS O S NI NA

1. The research goals are consistent with social work
principles of working toward improving the situations
of individuals and/or groups in society. O O O O
2. If the study involves human participants, the risks and
benefits of participation are clearly explained. O O O O
3. If the study involves human participants, threats to
free and informed consent are adequately addressed. O O O O
4. Procedures to protect privacy and confidentiality
are explicated, including storage of the data. O O O O


5. The study has received or will receive ethical
clearance from an appropriate committee prior to
any contact with study participants. O O O O
V. PRESENTATION O S NI NA

A. Style and Organization

1. The document has a logical, easily understandable flow from
initial statement of the problem through the appendixes. O O O O
2. Major topics are separated under appropriate
headings and subheadings. O O O O


3. The writing style is clear and of professional quality. O O O O


B. Completeness

1. Copies of all relevant materials (instruments,
recruitment materials, pilot data, etc.) are appended. O O O O
2. The study is described well enough that another
researcher could carry it out in the same way as the author. O O O O
3. References and citations to the relevant work of
others are clearly given and complete. O O O O
D. Technical Adequacy

4. Required and current APA style is used throughout. O O O O


5. The strengths and limitations of the study are
clearly and cogently described. O O O O
6. Implications and limitations of the study for
diverse groups are considered. O O O O

7. Directions for future research are discussed. O O O O



VII. ADDITIONAL COMMENTS

This section can be used to record additional comments and/or to note questions and issues for the
defense.

These guidelines are revisions of those developed by Dr. Wes Shera of the University of Toronto
School of Social Work. We at Smith gratefully acknowledge his contribution. Jeane W. Anastas,
PhD, and James W. Drisko, PhD, expanded and revised them to create this version. Contact:
jdrisko@email.smith.edu
