
Quantitative, Qualitative, and Mixed Research Methods in Engineering Education

MAURA BORREGO
Engineering Education
Virginia Tech

ELLIOT P. DOUGLAS
Materials Science and Engineering
University of Florida

CATHERINE T. AMELINK
Division of Student Affairs
Virginia Tech

ABSTRACT

The purpose of this research review is to open dialog about quantitative, qualitative, and mixed research methods in engineering education research. Our position is that no particular method is privileged over any other. Rather, the choice must be driven by the research questions. For each approach we offer a definition, aims, appropriate research questions, evaluation criteria, and examples from the Journal of Engineering Education. Then, we present empirical results from a prestigious international conference on engineering education research. Participants expressed disappointment in the low representation of qualitative studies; nonetheless, there appeared to be a strong preference for quantitative methods, particularly classroom-based experiments. Given the wide variety of issues still to be explored within engineering education, we expect that quantitative, qualitative, and mixed approaches will be essential in the future. We encourage readers to further investigate alternate research methods by accessing some of our sources and collaborating across education/social science and engineering disciplinary boundaries.

Keywords: quantitative, qualitative, and mixed-methods research

I. INTRODUCTION

Engineering education¹ as a developing field shares many characteristics of a discipline undergoing a scientific revolution as described by Thomas Kuhn (1962, 1970). As Kuhn describes the transition associated with change, he asserts that negotiation over the path and endpoint are inevitable. While engineering education is not undergoing a purely scientific revolution in Kuhnian terms, which occurs when established disciplines experience a paradigm change, development of the field bears many similarities to his scientific revolution argument (Borrego, 2007b; Borrego et al., 2008). For example, Kuhn maintains that new paradigms require prior understandings to be reconstructed. He argues that this time of "tradition shattering" is difficult, time consuming, and met with much resistance. In this case, the established disciplines which must undergo tradition shattering are the constituent engineering, education, and other disciplines in which most engineering education researchers were traditionally trained. This may be particularly frustrating to engineers and others who are more accustomed to working within long-standing paradigms (Borrego, 2007a).

Kuhn argues that formation of a paradigm is necessary and assists researchers in building a discipline. In order to establish itself as a research field, engineering education is negotiating input from both qualitative and quantitative methods advocates, both with strong investment in the field. Though there have been efforts to identify important research areas (Steering Committee of the National Engineering Education Research Colloquies, 2006) and results (Heywood, 2005), appropriate methods, convincing evidence, and standards for evaluating the quality of research studies are just as important to scientific field development.

The purpose of this paper is to open a dialog about methods in engineering education research. Our position is that no particular method (quantitative, qualitative, or mixed) is privileged over any other. Rather, the choice of method must be driven by the research questions (Creswell, 2002). For quantitative, qualitative, and mixed methods, we offer a basic definition, aims, and appropriate research questions or hypotheses. A separate section compares and contrasts evaluation criteria for the three approaches. Our references are a combination of methods texts from the social sciences and examples from the Journal of Engineering Education (JEE) since its repositioning as a scholarly journal in 1993 (Ernst, 1993). (The journal was also repositioned in 2003 to focus exclusively on research (Journal of Engineering Education, 2005).) In our JEE search, we emphasized articles framed as investigating research questions or testing hypotheses over descriptions of interventions or assessment methods. One notable exception is articles directly addressing methods, which, while directly relevant to research methods, were often framed as assessment topics. Our intention is not to provide a comprehensive review of studies, but rather to use selected examples to illustrate the ways in which educational research methods have been and could be used in engineering education.

Finally, we offer some empirical data to substantiate our claims that methods need to be more openly discussed in engineering education. Observations at an international engineering education research conference uncovered a strong preference for quantitative methods and their associated evaluation criteria, likely due to most participants' technical training. While participants lamented a lack of reviewers' acceptance or understanding of qualitative work, the same participants enacted a quantitative, classroom-experimental model in critiquing each other's work.

¹By Engineering Education, we refer to the field of engineering education research, not the practice of educating engineers.



Though classroom experiments might provide particularly compelling evidence, the approach appears to add unnecessary frustration for those hoping to publish their work. Specifically, faculty felt that they must repeat an intervention semester after semester until a statistically significant difference is established without question. We argue that expanding the repertoire of engineering education researchers through more open dialog, to the extent that alternative methods appropriately address the research questions, would greatly benefit the field.

II. QUANTITATIVE METHODS

Much of engineering research seeks to identify how outcomes (i.e., mechanical failure) are determined by reducing plausible causes to a discrete set of indicators or variables. Quantitative methods are a good fit for deductive approaches, in which a theory or hypothesis justifies the variables, the purpose statement, and the direction of the narrowly defined research questions. The hypothesis being tested and the phrasing of the research questions govern how data will be collected (i.e., a locally developed survey, commercial instrument, or final course grades) as well as the method of statistical analysis used to examine the data (Creswell, 2002).

The purpose of quantitative studies is for the researcher to project his or her findings onto the larger population through an objective process. Data collected, often through surveys administered to a sample or subset of the entire population, allow the researcher to generalize or make inferences. Results are interpreted to determine the probability that the conclusions found among the sample can be replicated within the larger population. Conclusions are derived from data collected and measures of statistical analysis (Creswell, 2002; Thorne and Giesen, 2002).

Various topics in engineering education have been examined through a quantitative approach. In their review article from the JEE 2005 special issue, Olds, Moskal and Miller (2005) point to a number of quantitative assessment methods pertinent to engineering education, including surveys, meta-analysis and experimental designs. While engineers often use quantitative methods in their research or applied settings, this article is designed to address multiple audiences that might employ a quantitative approach within an educational context. In the subsections that follow, quantitative methods as they have been used to explore topics in engineering education are discussed. First, descriptive statistics such as percentages, means and standard deviations are reviewed, as they have been used to illustrate various points and describe a situation, particularly one that has not been studied previously (Dorato and Abdallah, 1993; Hodge and Steele, 2002; Todd, Magleby, Sorensen, Swan, and Anthony, 1995). Quantitative studies within engineering education rely heavily on descriptive statistics derived from surveys or commercial instruments (Dorato and Abdallah, 1993; Downing, 2001; Hodge and Steele, 2002; Lang, Cruse, McVey, and McMasters, 1999; Todd et al., 1995; B. K. Walker et al., 1998). Second, quantitative research designs using statistical analyses to examine whether there are significant differences between groups on various indicators (Carpenter, Harding, Finelli, Montgomery, and Passow, 2006; Davis, 1996; Kirschman and Greenstein, 2002; Rutz et al., 2003; Webster and Haberstroh, 2002) are discussed. Third, we discuss studies that more explicitly utilize theory and advanced statistical methods to test hypotheses that concern relationships between and among various indicators (Davis, 1996; Kirschman and Greenstein, 2002; Rutz et al., 2003; Shiavi and Brodersen, 2005; Webster and Haberstroh, 2002).

A. Descriptive Statistics

Quantifiable results as they pertain to opinions, attitudes, or trends are one of the goals of conducting a survey (Creswell, 2002). Articles in JEE have relied on the reporting of frequencies, or descriptive statistics, to examine the status of engineering education related to degree requirements (Dorato and Abdallah, 1993), current educational practices (Hodge and Steele, 2002), the educational experiences of students enrolled in engineering programs (Todd et al., 1995), and to document trends in student enrollments, retention, and starting salaries (Heckel, 1994, 1995, 1996). For ABET accreditation purposes, engineering education has also examined the employer's perspective through surveys collecting perceptions on elements that should be included or emphasized within curricular design (Downing, 2001; Lang et al., 1999; B. K. Walker et al., 1998). Involvement of underrepresented groups in engineering such as women has also been examined through the use of surveys measuring their professional preparation (Robinson and Reilly, 1993). Fewer descriptive studies have employed a longitudinal design (i.e., tracking students over extended periods of time) to examine students' growth in cognitive abilities (Lumsdaine and Lumsdaine, 1995) and retention (Felder, Felder, and Dietz, 1998; Felder et al., 1993; Felder et al., 1994). In each of these studies, the authors reported only the number and/or percentage of students, employers, or faculty in each category. These are referred to as descriptive studies because they describe the situation without addressing any relationships between variables or groups. As previously stated, this approach can be useful in the case of topics about which little is known.

B. Examining Relationships Between and Among Various Indicators

In other cases, researchers want to investigate cause and effect or differences between various groups or treatments. Pre-existing theory is typically used to guide the formation of hypotheses about relationships that might exist concerning a particular group, topic, or situation. The hypothesis is typically formulated as a research question, and then data are collected and analyzed to answer the research question. Following data collection, the hypothesis will be either accepted or rejected based upon the results of the statistical analysis. The indicators, or variables, that are being used to measure a particular theory will determine what type of analysis is used (refer to Table 1).

Studies that have examined how different variables are related to students' experiences within the classroom illustrate how theory is used to develop a hypothesis and how an appropriate method is selected for analysis based upon the indicators involved. This is seen in various studies that have examined whether there is a difference in subject mastery between distributed or distance learning engineering students and students enrolled in more traditional delivery methods (Davis, 1996; Kirschman and Greenstein, 2002; Rutz et al., 2003; Webster and Haberstroh, 2002). Similarly designed studies have examined whether technology-enhanced learning environments make a difference in student academic success (Hsi, Linn, and Bell, 1997; Merino and Abel, 2003; Ogot, Elliott, and Glumac, 2003).
Variables                                | Analyses                                       | Predictive Analyses
Categorical                              | Contingency tables and chi-squared             | Logistic regression; discriminant analysis
Continuous (i.e., scale, ordinal, ratio) | ANOVA, MANOVA, t-tests, Pearson's correlation  | Linear regression; multiple linear regression

Table 1. Statistical analyses used to examine relationships between variables in engineering education. Adapted from Thorne and Giesen (2002). This table only includes statistical analyses used to examine relationships between variables in engineering education as mentioned in this article.
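To make the mapping in Table 1 concrete, the short Python sketch below (our illustration, not an analysis from any of the studies cited in this section; all counts and scores are invented) applies a chi-squared test to a categorical contingency table and an independent-samples t-test to continuous scores using scipy:

    from scipy import stats

    # Categorical indicators (e.g., pass/fail by course section): a
    # contingency table of hypothetical counts, analyzed with chi-squared.
    contingency = [[30, 10],   # section A: pass, fail
                   [22, 18]]   # section B: pass, fail
    chi2, p_cat, dof, expected = stats.chi2_contingency(contingency)

    # Continuous indicator (e.g., exam scores for two delivery methods):
    # hypothetical scores, analyzed with an independent-samples t-test.
    distance = [78, 85, 92, 74, 88, 81, 79, 90]
    traditional = [75, 83, 86, 72, 80, 77, 84, 79]
    t_stat, p_cont = stats.ttest_ind(distance, traditional)

    print(f"chi-squared p = {p_cat:.3f}; t-test p = {p_cont:.3f}")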

Trussell and Dietz (2003) examined whether enrollment in classes using graded homework assignments was related to higher test scores than enrollment in classes with no graded assignments. Since the dependent variables examined in these studies are continuous (i.e., GPA, exam score), Pearson's correlation, t-tests, ANOVAs, or MANOVAs have been used to analyze the results to determine whether there is a significant relationship between indicators or whether the mean score of one group differs significantly from another. If variables or indicators being examined are categorical (e.g., male/female, course section, pass/fail), statistics are analyzed through chi-square to examine the differences between groups in reported frequency of responses. This is seen in Shiavi and Brodersen's (2005) study that examined whether students favored learning in a laboratory setting or via lecture.

C. Explicit Use of Theory

Other studies have used similar statistical analyses to examine how different groups within engineering education perform in relation to a given theory. The same methodological approach is taken with regard to developing a hypothesis or research questions and collecting and analyzing relevant data, but these studies are more explicit in employing theory to guide their investigations. Bell et al. (2003) investigated how Stereotype Threat impacts performance by male and female students on tests in an engineering classroom setting. Terenzini et al. (2001) used learning theory related to student engagement to examine whether active and collaborative learning methods differ from traditional lecture courses in their ability to promote the development of students' engineering design, problem-solving, communication, and group participation skills. Taraban et al. (2007) used Constructivist learning theory to extend previous research by examining the degree to which engineering students' conceptual and problem solving knowledge differed based upon use of instructional software. Theory that considers the role of gender has been employed in studies that utilized categorical data to examine differences in students' experiences in the classroom (Brainard and Carlin, 1998). Theory describing gender differences has also guided studies examining mean score differences between males and females in reported self-confidence, future expectations (Hawks and Spade, 1998), and spatial reasoning (Peters, Chisholm, and Laeng, 1994).

In some cases, researchers seek to examine relationships between different indicators so that they can make predictions. When more complex statistical analyses such as these are conducted, theory is utilized to justify the survey instrument selected (Felder, Felder, and Dietz, 2002; Hunkeler and Sharp, 1997) as well as which items will serve as predictor variables for the dependent variable. In the case of regression analyses, theory guides how independent variables are entered in the regression equation and how the influence of certain variables is controlled for during the analysis. Regression analyses reported in the JEE have used a particular theory to guide the selection of predictor variables such as high school grade point average, SAT score, gender, and race and examine their relation to student retention (French, Immekus, and Oakes, 2005; Moeller-Wong and Eide, 1997; Ohland and Zhang, 2002) and academic success (Bjorklund, Parente, and Sathianathan, 2004; Haines, Wallace, and Cannon, 2001; Lackey et al., 2003). Social Adjustment Theory provides the basis for regression analyses that have been employed to examine the effect of cooperative education experience on student outcomes such as grades and starting salaries (Blair, Millea, and Hammer, 2004; Parsons, Caylor, and Simmons, 2005). In Burtner's (2005) study utilizing Tinto's (1993) Interactionalist Theory, discriminant analysis was used to identify attitudes and perceptions that influence students' decisions to remain in an engineering program.
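As a hedged illustration of how such predictive models are specified (a minimal sketch of a generic logistic regression, not the actual models from the retention studies cited above; the data and variable names are fabricated for illustration):

    import numpy as np
    import statsmodels.api as sm

    # Fabricated predictors (high school GPA, SAT score) and a binary
    # outcome (1 = retained in engineering, 0 = left the program).
    rng = np.random.default_rng(0)
    hs_gpa = rng.uniform(2.0, 4.0, 200)
    sat = rng.uniform(900, 1600, 200)
    retained = (0.8 * hs_gpa + 0.002 * sat + rng.normal(0, 1, 200) > 4.9).astype(int)

    # Logistic regression: theory determines which predictors are entered.
    X = sm.add_constant(np.column_stack([hs_gpa, sat]))
    result = sm.Logit(retained, X).fit(disp=0)
    print(result.summary(xname=["const", "hs_gpa", "sat"]))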
The use of statistical analyses such as the examples given here is often familiar to engineering researchers and, as such, provides well-established methods for conducting quantitative research. Given that consensus is well established in quantitative methods, there are many textbooks that provide insight into the utility of statistical analyses and appropriate research design when employing these methods (Hinkle, Wiersma, and Jurs, 2002; Thorne and Giesen, 2002). However, researchers should also consider the issues that are encountered when utilizing quantitative methods within engineering education (Larpkiataworn et al., 2003; Tebbs and Bower, 2003). Quantitative methods are one technique that can be employed when examining a particular issue, but other research designs, such as qualitative methods, may offer additional insights.

III. QUALITATIVE METHODS



Theoretical perspective | Post-positivist | Interpretivist (constructivism, social constructionism, hermeneutics, phenomenology) | Critical/emancipatory | Postmodern/poststructural
View on reality | Single falsifiable reality | Multiple subjective realities | Multiple subjective and political realities | Multiple fragmented realities
Purpose | To find relationships among variables, to define cause-and-effect | To describe a situation, experience, or phenomenon | To produce a socio-political critique | To deconstruct existing 'grand narratives'
Methods | Methods and variables defined in advance, hypothesis driven | Methods and approaches emerge and are to be adjusted during study | Methods and approaches designed to capture inequities | Methods and approaches generated during the study
The role of researcher | Researcher is detached | Researcher and participants are partners | Researcher and participants are activists | Researchers and participants have various changing roles
Outcome or research product | Context-free generalizations | Situated descriptions | Critical essays, policy changes | Reconceptualized descriptions of the phenomenon

Table 2. Comparison between theoretical perspectives from Koro-Ljungberg and Douglas (2008).

Qualitative research is characterized by the collection and analysis of textual data (surveys, interviews, focus groups, conversational analysis, observation, ethnographies (Olds et al., 2005)), and by its emphasis on the context within which the study occurs. The research questions that can be answered by qualitative studies are questions such as: What is occurring? Why does something occur? How does one phenomenon affect another? While numbers can be used to summarize qualitative data, answering these questions generally requires rich, contextual descriptions of the data, what is often called "thick" description. Several texts provide descriptions and examples of qualitative research in the social sciences (Creswell, 2007; Denzin and Lincoln, 2005; Merriam, 2002; Patton, 2002), and two recent publications describe the conduct of qualitative research within the context of engineering (Chism, Douglas, and Hilson Jr., 2008; Koro-Ljungberg and Douglas, 2008).

Several authors have pointed out the danger in assuming that qualitative research is easier and less rigorous than quantitative research (Hoaglin et al., 1982; Koro-Ljungberg and Douglas, 2008). As these authors point out, qualitative research is rigorous and involves its own set of data collection and analysis methods that ensure the trustworthiness of the findings. Tonso (1996) specifically contrasts qualitative research with anecdotal information; anecdotal information is collected haphazardly as it becomes available, while qualitative research involves the careful planning of a research design that encompasses all aspects of the study, from research questions to sampling to data collection and analysis. However, engineering educators who have been trained primarily within the quantitative tradition may not be familiar with some of the norms of qualitative research. Important differences exist between the two, with some of the most significant being the assumed nature of truth, the role of theory, sampling, and generalizability. Each of these issues is discussed further.

A. Assumed Nature of Truth

The theoretical perspective of a study (not to be confused with a theory describing a phenomenon) describes the approach used to explain reality, and is related to a particular epistemology, or way of understanding reality (Crotty, 2003). Quantitative research is invariably conducted from a post-positivist perspective, which posits the existence of an absolute truth that can never be confirmed, only disconfirmed (the concept of falsifiability). Although not the usual approach, qualitative research in engineering has also been conducted from a post-positivist perspective, often in combination with the collection of quantitative data.

These mixed methods studies are described further in Section IV. More commonly, however, qualitative research uses alternate perspectives, which are summarized in Table 2, taken from Koro-Ljungberg and Douglas (2008). For a full discussion of these perspectives see Crotty (2003), and for descriptions in the context of engineering see Koro-Ljungberg and Douglas (2008) and Chism et al. (2008). Examples of how these perspectives are used exist within the engineering education literature. Klukken, Parsons, and Columbus (1997) used a phenomenological interpretive perspective to describe the experience of being a creative engineer; Donath et al. (2005) used the interpretive perspective of social constructionism to examine how knowledge is created within an active learning group; and Tonso (1996) utilized a critical feminist perspective to examine how the discourse in a design class defines gender roles. As with all aspects of the research design, the theoretical perspective one chooses, whether positivist, interpretivist, or critical, is ultimately driven by, and must be consistent with, the research questions of the study.

B. The Role of Theory

The role of theory (a description or explanation of a phenomenon) is very different in qualitative and quantitative research. As stated earlier, in quantitative studies, theory is utilized early in the research design to identify hypotheses and to select appropriate measurement instruments. In contrast, the use of theory in qualitative research comes much later, if at all, as a lens through which the findings can be interpreted. In general, qualitative research takes an inductive approach to data analysis. The data are examined without preconceptions as to existing theory or pre-determined categories, allowing themes or categories to emerge from the data. In order to accomplish this, potential biases of the researcher are examined and made known (in phenomenology, the process of identifying biases and then setting them aside is called "bracketing"). This process allows new insight that would not be possible if an existing theory or concept were imposed on the data. Klukken et al. (1997) used a phenomenological approach to identify the characteristics of creative engineers, the first step towards building a theory of creativity in engineering. Even when theory development is not the goal of a study, one of the strengths of qualitative research is that its open, emerging approach to the conduct of the research allows new phenomena to be identified which would not have been expected (and thus not identified if a priori hypotheses or specific measurement instruments drove the research). As one example, McLoughlin (2005) identified a new type of gender bias in her study of women in engineering. Because this type of bias had not been previously identified, a quantitative study of gender bias would not have revealed it. Only by examining the words of the participants and interpreting those words with qualitative analysis techniques was McLoughlin able to identify a new phenomenon. Her results now provide opportunities for additional qualitative and quantitative studies to examine this phenomenon further.

C. Sampling

In comparison to quantitative studies, with their emphasis on large, representative samples, qualitative research focuses on smaller groups in order to examine a particular context in great detail. The goal is not to provide a broad, generalizable description that is representative of most situations, but rather to describe a particular situation in enough depth that the full meaning of what occurs is made apparent. This approach is particularly useful when considering unusual or non-traditional cases. For example, as described by Foor et al. (2007), statistical analysis can bury the voices of underrepresented groups: "Using surveys with large sample sizes, null hypotheses, and the expectation of statistical significance cannot meaningfully describe marginalized individual's experiences." In most cases, several participants are studied to provide sufficient description of a particular situation. Studies in the engineering education literature, for example, have involved interviewing 55 practicing engineers (Trevelyan, 2007), and observation of seven student design teams (Tonso, 2006). However, even a single case can be illustrative (Foor et al., 2007). By reading the rich contextual descriptions afforded by focusing on only a few cases, engineering educators can recognize practices that occur within their own schools.

D. Generalizability and Transferability

The concept of generalizability in quantitative studies is replaced by the term transferability in qualitative studies. Quantitative research is focused on generalizing to the larger population independent of context, and thus there is a heavy emphasis in the research design on random sampling and statistical significance. In contrast, qualitative research seeks to generalize through thick description of a specific context, allowing the reader to make connections between the study and his or her own situation. In short, quantitative research places the burden of demonstrating generalizability on the researcher, while qualitative research places the burden of identifying appropriate contexts for transferability on the reader. Just as rigorous statistical analysis is essential in quantitative research to ensure reliability and generalizability of the results, so too is rich description of the context and experiences of the participants essential in qualitative research to ensure trustworthiness (see Section V) of the findings and transfer to other contexts. Foor et al.'s (2007) study illustrates the difference between generalizability and transferability. From a quantitative perspective, the use of a single case does not provide any means for generalizing, as there is no way of knowing if that single case is typical. From a qualitative perspective, however, the thick description of that single case allows readers to identify elements that can be transferred to their own situations.

IV. MIXED METHODS

Mixed methods has been described as the "third methodological movement" (following quantitatively and qualitatively oriented approaches) (Teddlie and Tashakkori, 2003). Many descriptions of mixed methods place it in the context of more established traditions, criticizing some for being too divisive by artificially emphasizing differences, specifically the "incompatibility thesis" (Howe, 1988) that quantitative and qualitative paradigms "cannot and should not be mixed" (Johnson and Onwuegbuzie, 2004, p. 14). Instead, they are proponents of pragmatism, in which "[w]hat is most fundamental is the research question-research methods should follow research questions in a way that offers the best chance to obtain useful answers" (Johnson and Onwuegbuzie, 2004, pp. 16-17).

Creswell et al. define a mixed methods study as follows:

    A mixed methods study involves the collection or analysis of both quantitative and/or qualitative data in a single study in which the data are collected concurrently or sequentially, are given a priority, and involve the integration of the data at one or more stages in the process of research (Creswell et al., 2003, p. 212).



Design Type | Timing of quan and qual phases | Relative weighting of quan and qual components | Mixing: when quan and qual phases are integrated | Notation
Triangulation | Concurrent | Equal | During interpretation or analysis | QUAN + QUAL
Embedded | Concurrent or sequential | Unequal | One is embedded within the other | QUAN(qual) or QUAL(quan)
Explanatory | Sequential, quan then qual | Usually quan is given priority | Phase 1 informs phase 2 | QUAN -> qual
Exploratory | Sequential, qual then quan | Usually qual is given priority | Phase 1 informs phase 2 | QUAL -> quan

Table 3. Adapted from Creswell and Plano Clark (2007). Uses abbreviations and notation from Morse (2003).

This is distinguished from multimethod approaches (D. Campbell and Fiske, 1959), which may include multiple quantitative or qualitative studies but not necessarily both. Based on decisions regarding the sequence of data collection, relative priority, and stage at which integration of quantitative and qualitative components takes place, Creswell et al. identify four basic mixed methods designs (Creswell and Plano Clark, 2007). These are summarized in Table 3, and this section is organized around these basic designs. Only one of our JEE examples self-identifies with one of these categories; the others we use to illustrate examples and aspects of each design.

A. Triangulation Designs

The term triangulation in research was first used by Denzin (1978) to describe bringing together complementary methods or data sources to offset weaknesses in each. Data are collected concurrently in one phase, and interpretation involves comparing the results of each to best understand the research question (Creswell and Plano Clark, 2007; Morse, 1991).

One engineering education example is a survey with approximately equal numbers of quantitative and qualitative questions. Sageev and Romanowski (2001) collected quantitative data about time spent on communication tasks and qualitative data on the impact of communication coursework on alumni participants' careers. The results section is organized by survey question, but the authors allow responses to "spill over" from one section to the next to illustrate important connections. Additionally, they included an extensive (over one page) conclusions section addressing the responses and implications for technical communication faculty through integration of the findings.

Another recent example actually used mixed methods terminology to describe the overall project design:

    The [Academic Pathways Study] uses a concurrent triangulation mixed-methods design, in which both qualitative and quantitative methods are employed to collect and analyze data. The integration of results occurs during the interpretation phase (Creswell et al., 2003), enabling researchers to address a broad range of research questions toward discerning complex phenomena like student learning and development (Johnson and Onwuegbuzie, 2004) (Kilgore et al., 2007, p. 323).

The article itself describes analysis of an open-ended design task, with a thick description of the categories and their relationships, as well as quantifying frequencies of various behaviors and strategies employed by their participants (Kilgore et al., 2007).

B. Embedded Designs

Embedded designs are not distinguished by the concurrent or sequential nature of data collection (either is allowed). Rather, one type of data takes a supplemental role to the other. Creswell and Plano Clark offer the criterion that a study is embedded if the secondary data are not useful or meaningful without the primary study (2007).

Following this criterion, we can identify several engineering education articles with very limited secondary components as embedded designs. As might be expected, engineering education studies exhibited a strong preference for assigning priority to quantitative data and analysis. For example, it was quite common to develop an extensive control group or pre-test/post-test quantitative design to assess learning and augment it with one or a few open-ended questions to students about their attitude toward the intervention (Campbell et al., 2002; Lee, Castella, and Middleton, 1997; Morell et al., 2001; Raju and Sankar, 1999; Weisner and Lan, 2004). The extent to which authors presented the qualitative results ranges from referencing another publication (Brown, Morning, and Watkins, 2005) to nearly a full page (Morell et al., 2001; Raju and Sankar, 1999). Most included one or two paragraphs discussing student attitudes and intervention logistics (Campbell et al., 2002; Lee et al., 1997; Weisner and Lan, 2004).



C. Explanatory Designs

Explanatory mixed methods designs are characterized by an initial and extensive quantitative phase built upon by a subsequent qualitative phase. Usually, the qualitative results serve to explain the quantitative results. Integration occurs between phases, as the quantitative results often inform the questions or sampling in the second phase (Creswell and Plano Clark, 2007; Creswell et al., 2003).

A few JEE articles provide hints that authors pursued a mixed methods approach (likely explanatory) to provide complementary data:

    As a complement to the statistical FCQ analysis, two mid-semester classroom interviews were conducted (Gall et al., 2003, p. 340).

    Finally, open-ended comments were analyzed as a qualitative component to shed light on numerical results (Hackett and Martin, 1998, p. 87).

    Since we found no statistically significant ACT/SAT or GPA data that could explain the dramatic increases in the graduation rates of Connections students, we administered a follow-up survey in the fall of 2000 asking participants in the program to reflect about their experiences (Olds and Miller, 2004, p. 30).

In a few cases, authors were explicit in how the first phase informed the second:

    This study employed both quantitative and qualitative analysis methods. Quantitative data came from statistical analyses of hundreds of megabytes of access log files stored on the web server.... Qualitative data came from user interviews. ... Empirical findings from the quantitative analysis helped to form some of the initial questions used during the interviews (Liang, Bell, and Leifer, 2001, p. 520).

Here, quantitative results provided ideas for qualitative interview questions. In another example, the quantitative phase identified interview participants (who had the highest number of entries in the database), as well as providing findings for participants to react to in interviews:

    Readers will see that the publication analysis alone raises many questions with respect to the motivation of the publication authors. To provide additional context for understanding the impact of the coalitions, selected coalition personnel reflected on the quantitative analysis results and described their perceptions of the work at the time and in light of recent calls for rigor. For the interview component of the study, the preliminary [quantitative] results of the publication analysis were shared with coalition leaders and first authors of the highest number of publications (Borrego, 2007b, p. 9).

In both cases, we can also argue based on the publication space dedicated to each type of results that the authors assigned priority to the quantitative phase.

D. Exploratory Designs

Exploratory designs begin with a primary qualitative phase, then the findings are validated or otherwise informed by quantitative results. This approach is usually employed to develop a standardized (quantitative) instrument in a relatively unstudied area. The qualitative phase identifies important factors, while the quantitative phase applies them to a larger and/or more diverse sample (Creswell and Plano Clark, 2007).

Baker et al. describe how they used an initial interview pilot study to develop survey questions (which were also mixed):

    Guided by a modified Glaser and Strauss approach (1967) [grounded theory, a qualitative tradition], the [interview] tapes were analysed [sic] in order to isolate either the relevant themes of data units which were then sorted into conceptual categories. These categories formed the basis of the survey questionnaire that we used in the second phase of the study (Baker, Tancred, and Whitesides, 2002, p. 46).

E. "Quantitizing" Qualitative Data

Another very important trend in engineering education research involves transforming qualitative behaviors or work products into quantitative data for statistical analysis. Teddlie and Tashakkori (2003) credit Miles and Huberman (1994) for the term and concept of "quantitizing" to convert qualitative data into numerical codes. Sandelowski (2003) describes this process in greater detail, as well as its less common analog of "qualitizing" quantitative data.

We found many examples of this in the engineering education literature. All involve applying a framework of categorization or a scoring rubric to qualitative data such as publications, open-ended responses, or concept maps. Most simply reported frequencies and percentages of themes or categories of responses (Compton, 1995; Dabbagh and Menasce, 2006; Gall et al., 2003; Napp, 2004; Olds and Miller, 2004; Whitin and Sheppard, 2004). One also reported descriptive statistics (i.e., means) (Vanderburg and Khan, 1994), while another reported validity statistics for the rubric (Walker and King, 2003). Only one publication went as far as calculating statistically significant differences between rubric scores of various treatment groups (Rayne et al., 2006). Several of these included examples of quotes, responses, and other qualitative data, but generally did not provide enough qualitative data to be organized around categories. An exception is Pavelich and Moore, who assigned a Perry development level to students based on interviews, and reported descriptive statistics, but then also included several quotes to describe how students were characterized as well as evidence of transitions from one level to another (Pavelich and Moore, 1996).
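At its simplest, quantitizing reduces coded qualitative data to counts and percentages. The sketch below is our own illustration, with invented category codes rather than data from any of the studies cited above:

    from collections import Counter

    # Invented category codes assigned to open-ended survey responses.
    codes = ["teamwork", "communication", "teamwork", "time management",
             "teamwork", "communication", "design skills", "teamwork"]
    counts = Counter(codes)
    total = sum(counts.values())
    for category, n in counts.most_common():
        print(f"{category}: {n} ({100 * n / total:.0f}%)")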
It is interesting to consider whether these studies should be evaluated as quantitative, qualitative, or mixed methods. In the absence of labeling by the authors themselves, we can speculate that these hold up to quantitative criteria and most mixed methods criteria (although whether they actually include a qualitative component to qualify them as mixed is questionable), but that they are generally not consistent with qualitative criteria. Evaluation criteria are described in detail in the following section.



Quantitative Research Criteria | Qualitative Research Criteria
Validity: project and instruments measure what is intended to be measured | Credibility: establishing that the results are credible or believable
Generalizability: results are applicable to other settings, achieved through representative sampling | Transferability: applicability of research findings to other settings, achieved through thick description
Reliability: findings are replicable or repeatable | Dependability: researchers account for the ever-changing context within which the research occurs
Objectivity: researcher limits bias and interaction with participants | Reflexivity: researchers examine their own biases and make them known

Table 4. Quantitative and qualitative research criteria, from Lincoln and Guba, 1985; Tashakkori and Teddlie, 1998; and Chism et al., 2008.

V. EVALUATION CRITERIA FOR QUANTITATIVE, QUALITATIVE, AND MIXED METHODS RESEARCH

Evaluation of quantitative, qualitative, and mixed methods research has similar aims, although the details differ. In all cases, the goal is to establish that the results provide convincing evidence sufficient to answer the research questions. A number of factors go into establishing the quality of quantitative and qualitative research, some of which are listed in Table 4.

In quantitative research, these criteria are met through objectivity, reliability, and validity of the instrument (e.g., questionnaire), and generalizability of the findings. Conclusions of studies are furthered by reporting the validity, or whether the instrument used actually measures the variables or theory it claims. Reliability of scores associated with the instruments employed, or whether the same result is derived on repeated trials, is also an important consideration. Moskal et al. (2002) illustrate how these concepts can be used to improve assessment efforts in engineering education, and Blumner et al. (1997) discuss how reliability and validity have been established on a locally developed instrument to measure study habits among engineering students. Fewer articles have discussed Cronbach's alpha scores (Cronbach, 1951) relative to the construction of classroom tests (Allen et al., 2008).
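Cronbach's alpha itself is simple to compute from an item-response matrix. The following sketch (ours, with invented scores, not data from the studies just cited) implements the standard formula, alpha = k/(k-1) x (1 - sum of item variances / variance of total scores):

    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array; rows are respondents, columns are test items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    # Invented responses of five students to four survey items.
    scores = [[4, 5, 4, 5], [3, 3, 2, 3], [5, 5, 4, 4], [2, 3, 3, 2], [4, 4, 5, 4]]
    print(f"alpha = {cronbach_alpha(scores):.2f}")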
Comparable criteria exist for qualitative research, although the terms used and the focus of how these criteria are met are different (see Table 4). The term trustworthiness is often used to describe the extent to which a study meets these criteria. Some of the important ways in which trustworthiness can be established are a clear statement of the theoretical perspective of the study, member checking (asking participants to review the research findings), triangulation (use of multiple data sources), thick description, peer debriefing (asking other researchers not directly involved in the research to review the findings), creating an audit trail (a clear link between the raw data and the findings), and a statement of researcher subjectivity.

Regarding mixed methods criteria, Creswell and Plano Clark (2007) first briefly review quantitative and qualitative criteria, then list four criteria for evaluating mixed methods studies:

1. Whether the study is indeed mixed methods (collecting, analyzing and mixing quantitative and qualitative approaches). The most inclusive definitions allow for representation of quantitative and qualitative perspectives in at least one of: data collection, data analysis, or theoretical perspective.

2. Detail and consistency in describing the design, theoretical perspective, need for both quantitative and qualitative approaches, and how the two components are mixed. Detailed quantitative and qualitative procedures should be described, as well as sequential or concurrent data collection and analysis. Interpretations should be defended.

3. Inclusion of advanced mixed methods features, including (a) specified type of design, (b) a visual diagram of the procedures, (c) mixed methods purpose statement, research question and data analysis, and (d) citation of mixed methods studies and methodological articles.

4. Sensitivity to the challenges of using the mixed methods design. Authors should acknowledge the challenges and how they are addressed. Specific challenges include threats to validity such as sampling, sample sizes, and integration phases.

In light of these criteria, it is clear that although there are several engineering education studies utilizing quantitative and qualitative data, we are not as open as we should be about describing the methods and related challenges. Similarly, we might argue that authors in the emerging research field of engineering education should also explicitly address quantitative or qualitative research evaluation criteria (as appropriate) as a cue to readers and reviewers. In the following section, we provide a summary of empirical data from conference discussions to further support our arguments that methods are not being discussed as openly as they should be in engineering education.



VI. EMPIRICAL RESULTS

In this section we present empirical results from observations at a prestigious international engineering education research conference. The conference and analysis methods are described elsewhere (Borrego, Froyd, and Knight, 2007; Jesiek, Newswander, and Borrego, 2009). The format of this conference was unique because each paper was read by other participants assigned to the group and discussed for 45 minutes to an hour rather than being presented. As expected, methods entered the conversation frequently and in a variety of ways. Many participants appeared more comfortable with quantitative approaches, either by virtue of their training, or based on expectations of reviewers' preferences. Therefore, we focus the first portion of our findings on ways that qualitative methods entered the conversation.

A. Discussions of Qualitative Methods at the Conference

When qualitative methods were presented in the paper sessions, it was often in the later and deeper discussions of specific papers. Five-minute summaries opening each paper discussion generally described quantitative results (if any). Later, when discussion shifted to conjecture as to why certain phenomena were observed, qualitative results were described. As other participants mused about underlying explanations, authors described emerging results not included in their conference papers to flesh out descriptions of learning or motivation mechanisms. It appears these authors were more comfortable in face-to-face discussions of qualitative methods (rather than blind review), perhaps because the initial conversation was going well. It should be noted that many other authors were able to describe the purposes of their qualitative work comfortably.

One of the very few extended discussions about qualitative methods focused on ethnography. Using a question and answer format, participants learned about ethnography perspectives, including its relationship to traditional views of validity and reliability. It is interesting to note that the facilitator did not necessarily view this as a useful discussion and directed the author toward presenting results, requesting, "please talk about what you found." In various sessions on the second day, several presenters refocused on methods discussions in their papers for feedback, one stating, "So [I] wanted to focus this session more on method than results," or were in general more open about research design weaknesses for which they wanted advice. In other words, an initial implicit assumption of participants was that the paper discussions should focus on results, but upon reflection, authors presenting on the second day decided that they would benefit more from discussions of methods. This attests to the importance of and interest in open discussion of methods in engineering education.

B. Applying a Quantitative Model in Critiquing Research

Perhaps the most interesting finding was that although participants lamented the low representation of qualitative methods, many of these same participants enacted a quantitative model in critiquing others' research during the paper sessions. Though the conference papers represented a range of quantitative approaches, the discussion at the conference seemed more restricted to a single model: classroom-based experiments comparing a control group to a treatment group. More often than not, the first questions asked of a presenting author focused on control groups, representative sampling, and triangulation of results with course grades.

1. Control Groups: Participants wanted to know whether student gender and major were represented in the samples studied, or whether any differences were detected between these groups (e.g., "What was the gender ratio of the students in your study?"). If an author went into too much detail about the intervention itself and neglected to describe the experimental design, participants asked about control groups, for example, "Are you going to do a comparison with 'traditional' problems [given to students]?" In one exchange, there was even evidence that participants perceived a setting to be unworthy of study if a control group is not available. Specifically, once a course becomes required for all students, there is no control group. "It is compulsory for all, so there is no comparison available," was spoken by one participant with a dejected tone and bowed head.

There was mention of a few alternatives to an experimental design with a control group. In at least one case, the author employed historical controls by administering a few instruments before the intervention was implemented. A third type of experimental or assessment design discussed was pre-test/treatment/post-test. However, in at least one case, participants perceived this to be inferior to a control group design, as is also reflected in the literature (Leedy, 1997; U.S. Department of Education, 2007). Even Scientific Research in Education (Shavelson and Towne, 2002), cited as criteria for "rigorous research in engineering education" (Borrego, 2007a), has been criticized for valuing quantitative approaches over qualitative methodologies (St. Pierre, 2002). One participant at this meeting stated, "So you didn't have a control group. That makes it questionable about whether you can say there was a difference from pre and post as a result of the intervention." Another participant present during this conversation attempted to describe the merit of this alternative design:

    I have been doing this for a while, and I learned when I was working on my Ph.D. in [research area] that there is a huge difference between proving that your intervention is adequate and wanting to prove it is better. If you want to prove that it's adequate, which is what I think [this author] wants to do, [then pre-test/post-test is fine]. I suspect he wants to show that it's better, which would require a control group, but he is still able to show that it works [without a control].

2. Statistical Significance: As suggested in the quotation above, the ultimate goal of a treatment-control design is to show that the treatment is superior through a statistically significant difference. It should be noted that while this approach is common in engineering research, it may establish an impossible burden of proof in engineering education, as students cannot be controlled as carefully as "I-beams or fruit flies" (Wankat et al., 2002). In fact, many conference discussions focused on alternative interpretations and confounds. Participants asked, "Can you rightly compare these groups?" and authors apologized with responses like, "I admit that the sample size ought to be expanded to 200-300 to be more representative," and "Because the sample size was small, the numbers are not statistically significant."



(It should be noted that authors were not observed to discuss calculating statistical power (Cohen, 1988), which would indicate whether a larger sample size is likely to produce the desired statistically significant difference.)
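For readers unfamiliar with power analysis, the computation is routine; the sketch below (our example, with assumed rather than observed parameters) uses statsmodels to estimate how many students per group a two-group comparison would need to detect a medium effect:

    from statsmodels.stats.power import TTestIndPower

    # Assumed scenario: sample size per group needed to detect a medium
    # effect (Cohen's d = 0.5) at alpha = 0.05 with 80 percent power.
    n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
    print(f"approximately {n_per_group:.0f} students per group")  # ~64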
focus on quantitative studies, there were also a few instances in directly translated to high n, multi-year, multi-institution studies.
which authors presented numerical data and tables but did not Classroom research which had not yet expanded was described by
run statistical analyses to show whether highlighted differences participants several times as action research. Action research is a
were significant or not. (Other conference participants suggest- type of inquiry in which instructors make documented, systematic
ed running these analyses.) It should also be noted that other improvements in their classrooms as a means of applying new
participants, particularly those in a measurement group, were notably more comfortable discussing various statistical tests. Even though authors were striving for large sample sizes, some were reluctant to emphasize a statistically significant difference even when they found one. In one particular case, it was evident that most participants in the discussion felt the intervention had established its effectiveness by virtue of a doubled pass rate, but the author was hesitant to claim victory before a carefully controlled experiment could be executed: "I know. I'm being very cautious. Next round will be more in-depth. Next year we'll hopefully have two groups of students, randomly assigned."

While it is important to skeptically evaluate and replicate results in educational studies (Shavelson and Towne, 2002), controlled classroom experiments are not the only means of replication. For example, many variations of pre- and post-testing designs, combined with control group designs, can establish a finding across diverse settings with various constraints on experimental design. The few times these were mentioned by participants, they were described as less desirable than controlled experiments and often dismissed.
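As an illustration of the kind of analysis such a design supports, the following sketch compares pre-to-post gains between an intervention section and a comparison section using an independent-samples t-test and an effect size (Cohen, 1988). This is our own minimal example rather than a procedure from any conference paper; the scores are hypothetical, and in practice the instrument itself would need established reliability and validity.

```python
# Minimal sketch of a pre/post design with a nonequivalent comparison
# group. All scores are hypothetical placeholders.
import numpy as np
from scipy import stats

# Pre- and post-test scores, one entry per student in each section.
intervention_pre = np.array([52, 48, 61, 55, 47, 58, 50, 63])
intervention_post = np.array([68, 60, 75, 70, 59, 71, 66, 77])
comparison_pre = np.array([50, 53, 59, 49, 56, 60, 46, 54])
comparison_post = np.array([58, 59, 66, 54, 62, 67, 51, 60])

# Gain scores capture change within each student, partially
# compensating for unequal starting points between the sections.
gain_intervention = intervention_post - intervention_pre
gain_comparison = comparison_post - comparison_pre

# Welch's t-test avoids assuming equal variances across sections.
t_stat, p_value = stats.ttest_ind(gain_intervention, gain_comparison,
                                  equal_var=False)

# Cohen's d reports practical magnitude, since statistical significance
# alone says little about how large the improvement actually is.
pooled_sd = np.sqrt((gain_intervention.var(ddof=1) +
                     gain_comparison.var(ddof=1)) / 2)
cohens_d = (gain_intervention.mean() - gain_comparison.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

Repeating this kind of comparison across semesters, instructors, or institutions, rather than within a single randomized experiment, is one way to accumulate the converging evidence participants were seeking.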

3. Validity, Reliability, and Triangulation: In most of the groups, validity, reliability, and triangulation were discussed very little, at least explicitly. One exception was triangulating research results with course grades, e.g., "Did you compare performance based on something like grades? You saw higher motivation. Is that correlated with performance like grades?" This reliance on grades to triangulate engineering education research findings is particularly problematic when faculty researchers don't find the significant difference they were looking for. One group of participants bemoaned, "We're all kind of stuck because the correlations [between grades and other data] aren't there."

David Labaree (2003) writes about this phenomenon, in which new educational researchers with extensive classroom experience approach educational research as proving that an intervention works, rather than being skeptical or objective as their experimental design requires. However, it is also possible that these interventions do indeed work, and that other constructs or subjectivities reflected in course grades are overshadowing actual increases in learning. For example, the reliability and validity of the tests and homework assignments are usually not established (Allen et al., 2008). In this case, it is important for engineering education classroom researchers to consider other sources for triangulation, such as scores on specific assignments or test questions, concept inventories or focus groups that complement the principal measurement procedure, and retention or passing rates in subsequent courses.
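Even basic evidence of internal consistency is inexpensive to produce. The sketch below is our own illustration with hypothetical item scores; it computes Cronbach's coefficient alpha (Cronbach, 1951), the reliability statistic whose interpretation for engineering tests is discussed by Allen et al. (2008).

```python
# Minimal sketch: Cronbach's coefficient alpha for a course test.
# Rows are students, columns are items; the scores are hypothetical.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal-consistency reliability (Cronbach, 1951).

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # per-item variance
    total_var = item_scores.sum(axis=1).var(ddof=1)  # total-score variance
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical scores for six students on a four-item quiz.
scores = np.array([
    [3, 4, 3, 5],
    [2, 2, 3, 3],
    [5, 5, 4, 5],
    [1, 2, 2, 2],
    [4, 3, 4, 4],
    [3, 3, 2, 4],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Reporting alpha alongside classroom results gives readers and reviewers some assurance that the tests and homework being used for triangulation measure something consistently; a common rule of thumb treats values above roughly 0.7 as acceptable.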
C. Perceived Implications for Publishing in Research Journals

1. "Enough to Publish": While the classroom-experimental approach to engineering education research complete with control groups seemed a well-suited one to many participants, this focus contrasts with action research, in which the practitioner applies knowledge as it is generated (Feldman and Minstrell, 2000). While this approach allows findings to be quickly applied, a common criticism is that it lacks the rigor traditionally applied in experiments to more systematically or objectively establish the effectiveness of the intervention (Gentile, 1994). One participant asserted:

Action research is not enough to get into JEE now. That's where we have the dilemma. We need people within the content area, but the JEE is on a level that's cross-institutional, cross-national studies. Just one case is not enough.

Particularly for traditional engineering faculty who are often not directly rewarded for their engineering education research, repeating an intervention semester after semester is simply too much of a time investment for one article. Statements like, "We don't have the luxury of people who are full-time researchers," were made a few times, indicative of perceptions that such high publication standards exclude practitioners from engineering education research.

2. Perceptions of Review Criteria: Participants frequently expressed the perception that peer review in engineering education is unintentionally biased toward quantitative studies. One example was:

If we devise a questionnaire, if we make a statistical analysis and submit to JEE, probably they will accept it. Same action research, just analyzed differently... I could have used questionnaires or surveys, statistical analysis, this would qualify as quantitative analysis, but this could miss things.

The explanation offered for this perceived bias was the limited, traditional training and exposure of reviewers. Since peer reviewers are members of the community, conference presentations and discussions are an important means of exposure to different methodological approaches. We observed that groups which were most balanced between quantitative and qualitative papers were most open in exploring methods and questioning customary approaches. The two examples we can cite of transformative participant experiences were improved quantitative designs openly discussed by methodologically diverse groups. In one case, a paper initially framed as naturalistic (qualitative) was reframed as a well-controlled assessment of improved learning resulting from an intervention. In the other case, a participant limited by the availability of validated instruments was encouraged to "jump off the deep end and start validating your own measure that addresses this issue." These important examples illustrate that open dialog across methods orientations does lead to better research. As a comparison case, one group consisting entirely of quantitative papers had a much more focused discussion which resulted in valuable but specific feedback unlikely to result in methodological breakthroughs. Since there were no entirely qualitative groups to study, we cannot speculate on the dynamics or potential outcomes; however, our analysis points to diversity of viewpoints as the critical feature of inclusive, generative, constructive discussions at this particular conference.

VII. DISCUSSION AND CONCLUSION

There are several plausible explanations for why engineering education researchers appear to strongly prefer quantitative methods. Creswell (2002) lists three criteria for selecting from among quantitative, qualitative, and mixed methods approaches: (1) the research problem, (2) the personal experiences of the researcher, and (3) the audience. The majority of engineering education researchers are engineering faculty members who were trained within the post-positivist perspective. As a result, the personal experiences of the researchers would tend to favor quantitative experimental approaches. Similarly, the audience for much of engineering education research is also engineering faculty, who have more experience interpreting quantitative results. However, we take the view that the approach used by a given study should be driven by the research questions of that study. Given the wide variety of issues still to be explored within engineering education (Steering Committee of the National Engineering Education Research Colloquies, 2006), we would expect that all of these approaches would be represented. While examples of all three approaches do exist within the pages of JEE, the empirical results for an engineering education conference described here show that they are not being used equally. There appears to be a trend towards the use of quantitative methods, and even within the quantitative area only certain approaches are deemed to be worthwhile. These results provide a further example of engineering education as a low-consensus field, which may be caused by the lack of training of researchers in alternate methods, or perceived requirements for publication. We encourage readers to further investigate alternate research methods by accessing some of our sources and collaborating across education/social science and engineering disciplinary boundaries with those who have spent more time applying these research methods in educational settings. In addition to seeking expertise in the various methods, educators should consider the criteria outlined in this article that would frame a study employing a quantitative, qualitative, or mixed methods approach.

ACKNOWLEDGMENTS

The authors would like to thank the U.S. National Science Foundation for supporting this project through EEC-0645736, Accelerating Development of Engineering Education as a Discipline via Organized Dialogue. We are also grateful to Lynette Osborne for providing field notes and analyzing conference data, as well as the other conference ethnographers Brent Jesiek, Lynita Newswander, and Devon Hensel for providing field notes and reviewing drafts of conference results. In framing the results as part of a research review, we are indebted to Elizabeth Creamer, Jack Lohmann, and the anonymous JEE associate editor and reviewers for constructive criticism and helpful discussions. Finally, we thank members of the conference team for planning and evaluating the event, and last but not least, the other conference attendees who agreed to serve as the participants in this research.

REFERENCES

Allen, K., T. Reed-Rhoads, R.A. Terry, T.J. Murphy, and A.D. Stone. 2008. Coefficient alpha: An engineer's interpretation of test reliability. Journal of Engineering Education 97 (1): 87-94.
Baker, S., P. Tancred, and S. Whitesides. 2002. Gender and graduate school: Engineering students confront life after the B.Eng. Journal of Engineering Education 91 (1): 41-47.
Bell, A.E., S.J. Spencer, E. Iserman, and C.E.R. Logel. 2003. Stereotype threat and women's performance in engineering. Journal of Engineering Education 92 (4): 307-12.
Bjorklund, S.A., J.M. Parente, and D. Sathianathan. 2004. Effects of faculty interaction and feedback on gains in student skills. Journal of Engineering Education 93 (2): 153-60.
Blair, B.F., M. Millea, and J. Hammer. 2004. The impact of cooperative education on academic performance and compensation of engineering majors. Journal of Engineering Education 93 (4): 333-38.
Blumner, H.N., and H.C. Richards. 1997. Study habits and academic achievement of engineering students. Journal of Engineering Education 86 (2): 125-32.
Borrego, M. 2007a. Conceptual hurdles experienced by engineering faculty becoming engineering education researchers. Journal of Engineering Education 96 (2): 91-102.
Borrego, M. 2007b. Development of engineering education as a rigorous discipline: A study of the publication patterns of four coalitions. Journal of Engineering Education 96 (1): 5-18.
Borrego, M., J. Froyd, and D. Knight. 2007. Accelerating emergence of engineering education via the International Conference on Research in Engineering Education (ICREE). Journal of Engineering Education 96 (4): 281-82.
Borrego, M., R.A. Streveler, R.L. Miller, and K.A. Smith. 2008. A new paradigm for a new field: Communicating representations of engineering education research. Journal of Engineering Education 97 (2): 147-62.
Brainard, S.G., and L. Carlin. 1998. A six-year longitudinal study of undergraduate women in engineering and science. Journal of Engineering Education 87 (4): 369-75.
Brown, A.R., C. Morning, and C. Watkins. 2005. Influence of African American engineering student perceptions of campus climate on graduation rates. Journal of Engineering Education 94 (2): 263-71.
Burtner, J. 2005. The use of discriminant analysis to investigate the influence of non-cognitive factors on engineering school persistence. Journal of Engineering Education 94 (3): 335-38.
Campbell, D., and D.W. Fiske. 1959. Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin 54: 297-312.
Campbell, J.O., J.R. Bourne, P.J. Mosterman, and A.J. Brodersen. 2002. The effectiveness of learning simulations for electronic laboratories. Journal of Engineering Education 91 (1): 81-87.
Carpenter, D.D., T.S. Harding, C.J. Finelli, S.M. Montgomery, and H.J. Passow. 2006. Engineering students' perceptions of and attitudes towards cheating. Journal of Engineering Education 95 (3): 181-94.
Chism, N. van N., E. Douglas, and W.J. Hilson Jr. 2008. Qualitative research basics: A guide for engineering educators (available from Rigorous Research in Engineering Education, c/o Ruth Streveler, 701 W. Stadium Avenue, West Lafayette, IN 47907).
Cohen, J. 1988. Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.
Compton, W.D. 1995. Encouraging graduate study in engineering. Journal of Engineering Education 84 (3): 249-55.
Creswell, J.W. 2002. Research design: Qualitative, quantitative, and mixed methods approaches. New York: Sage Publications.
Creswell, J.W. 2007. Qualitative inquiry and research design: Choosing among five approaches, 2nd edition. Thousand Oaks, CA: Sage Publications.
Creswell, J.W., and V.L. Plano Clark. 2007. Designing and conducting mixed methods research. Thousand Oaks, CA: Sage Publications.
Creswell, J.W., V.L. Plano Clark, M.L. Gutmann, and W.E. Hanson. 2003. Advanced mixed methods research designs. In Handbook of mixed methods in social and behavioral research, eds. A. Tashakkori and C. Teddlie, 209-240. Thousand Oaks, CA: Sage Publications.
Cronbach, L.J. 1951. Coefficient alpha and the internal structure of tests. Psychometrika 16 (3): 297-334.
Crotty, M. 2003. The foundations of social research. Thousand Oaks, CA: Sage Publications.
Dabbagh, N., and D.A. Menasce. 2006. Student perceptions of engineering entrepreneurship: An exploratory study. Journal of Engineering Education 95 (2): 153-64.
Davis, J.L. 1996. Computer-assisted distance learning, part II: Examination performance of students on and off campus. Journal of Engineering Education 85 (1): 77-82.
Denzin, N.K. 1978. The logic of naturalistic inquiry. In Sociological methods: A sourcebook, ed. N.K. Denzin. New York: McGraw-Hill.
Denzin, N., and Y. Lincoln. 2005. Introduction: The discipline and practice of qualitative research. In The Sage handbook of qualitative research, 3rd edition, eds. N. Denzin and Y. Lincoln, 1-32. Thousand Oaks, CA: Sage Publications.
Donath, L., R. Spray, N.S. Thompson, E.M. Alford, N. Craig, and M.A. Matthews. 2005. Characterizing discourse among undergraduate researchers in an inquiry-based community of practice. Journal of Engineering Education 94 (4): 403-17.
Dorato, P., and C. Abdallah. 1993. A survey of engineering education outside the United States: Implications for the ideal engineering program. Journal of Engineering Education 82 (4): 212-15.
Downing, C.G. 2001. Essential non-technical skills for teaming. Journal of Engineering Education 90 (1): 113-17.
Ernst, E.W. 1993. The editor's page. Journal of Engineering Education 82 (1): 1.
Felder, R.M., G.N. Felder, and E.J. Dietz. 1998. A longitudinal study of engineering student performance and retention. V. Comparisons with traditionally-taught students. Journal of Engineering Education 87 (4): 469-80.
Felder, R.M., G.N. Felder, and E.J. Dietz. 2002. The effects of personality type on engineering student performance and attitudes. Journal of Engineering Education 91 (1): 3-17.
Felder, R.M., K.D. Forrest, L. Baker-Ward, E.J. Dietz, and P.H. Mohr. 1993. A longitudinal study of engineering student performance and retention: I. Success and failure in the introductory course. Journal of Engineering Education 82 (1): 15-21.
Felder, R.M., P.H. Mohr, E.J. Dietz, and L. Baker-Ward. 1994. A longitudinal study of engineering student performance and retention. II. Rural/urban student differences. Journal of Engineering Education 83 (3): 209-17.
Feldman, A., and J. Minstrell. 2000. Action research as a research methodology for the study of the teaching and learning of science. In Handbook of research design in mathematics and science education, eds. A.E. Kelly and R.A. Lesh, 429-456. Hillsdale, NJ: Lawrence Erlbaum.
Foor, C.E., S.E. Walden, and D.A. Trytten. 2007. "I wish that I belonged more in this whole engineering group:" Achieving individual diversity. Journal of Engineering Education 96 (2): 103-15.
French, B.F., J.C. Immekus, and W.C. Oakes. 2005. An examination of indicators of engineering students' success and persistence. Journal of Engineering Education 94 (4): 419-25.
Gall, K., D.W. Knight, L.E. Carlson, and J.F. Sullivan. 2003. Making the grade with students: The case for accessibility. Journal of Engineering Education 92 (4): 337-43.
Gentile, J.R. 1994. Inaction research: A superior and cheaper alternative for educational researchers. Educational Researcher 23 (2): 30-32.
Glaser, B.G., and A. Strauss. 1967. The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine.
Hackett, R.K., and G.R. Martin. 1998. Faculty support for minority engineering programs. Journal of Engineering Education 87 (1): 87-95.
Haines, V.A., J.E. Wallace, and M.E. Cannon. 2001. Exploring the gender gap in engineering: A re-specification and test of the hypothesis of cumulative advantages and disadvantages. Journal of Engineering Education 90 (4): 677-84.
Hawks, B.K., and J.Z. Spade. 1998. Women and men engineering students: Anticipation of family and work roles. Journal of Engineering Education 87 (3): 249-56.
Heckel, R.W. 1994. Current and emerging statistical trends in engineering education. Journal of Engineering Education 83 (4): 1-7.
Heckel, R.W. 1995. Disciplinary patterns in degrees, faculty and research funding. Journal of Engineering Education 84 (1): 1-10.
Heckel, R.W. 1996. Engineering freshman enrollments: Critical and non-critical factors. Journal of Engineering Education 85 (1): 15-21.
Heywood, J. 2005. Engineering education: Research and development in curriculum and instruction. Hoboken, NJ: IEEE Press.
Hinkle, D.E., W. Wiersma, and S.G. Jurs. 2002. Applied statistics for the behavioral sciences. New York: Houghton Mifflin Company.
Hoaglin, D.C., R.J. Light, B. McPeek, F. Mosteller, and M.A. Stoto. 1982. Data for decisions: Information strategies for policymakers. Cambridge, MA: Abt Associates, Inc.
Hodge, B.K., and W.G. Steele. 2002. A survey of computational paradigms in undergraduate mechanical engineering education. Journal of Engineering Education 91 (4): 415-17.
Howe, K.R. 1988. Against the quantitative-qualitative incompatibility thesis or dogmas die hard. Educational Researcher 17 (8): 10-16.
Hsi, S., M.C. Linn, and J.E. Bell. 1997. The role of spatial reasoning in engineering and the design of spatial instruction. Journal of Engineering Education 86 (2): 151-58.
Hunkeler, D., and J.E. Sharp. 1997. Assigning functional groups: The influence of group size, academic record, practical experience, and learning style. Journal of Engineering Education 86 (4): 321-32.
Jesiek, B.K., L. Newswander, and M. Borrego. 2009. Engineering education research: Field, community, or discipline? Journal of Engineering Education 98 (1): 39-52.
Johnson, R.B., and A.J. Onwuegbuzie. 2004. Mixed methods research: A paradigm whose time has come. Educational Researcher 33 (7): 14-26.
Journal of Engineering Education. 2005. JEE strategic plan, 2005-2010: Advancing engineering education research worldwide. Journal of Engineering Education 94 (3): 283-84.
Kilgore, D., C.J. Atman, K. Yasuhara, T.J. Barker, and A. Morozov. 2007. Considering context: A study of first-year engineering students. Journal of Engineering Education 96 (4): 321-34.
Kirschman, J.S., and J.S. Greenstein. 2002. The use of groupware for collaboration in distributed student engineering design teams. Journal of Engineering Education 91 (4): 403-07.
Koro-Ljungberg, M., and E.P. Douglas. 2008. State of qualitative research in engineering education: Meta-analysis of JEE articles, 2005-2006. Journal of Engineering Education 97 (2): 163-76.
Klukken, P.G., J.R. Parsons, and P.J. Columbus. 1997. The creative experience in engineering practice: Implications for engineering education. Journal of Engineering Education 86 (2): 133-38.
Kuhn, T. 1962. The structure of scientific revolutions. Chicago, IL: University of Chicago Press.
Kuhn, T. 1970. The structure of scientific revolutions. Chicago, IL: University of Chicago Press.
Labaree, D.F. 2003. The peculiar problems of preparing educational researchers. Educational Researcher 32 (4): 13-22.
Lackey, L.W., W.J. Lackey, H.M. Grady, and M.T. Davis. 2003. Efficacy of using a single, non-technical variable to predict the academic success of freshmen engineering students. Journal of Engineering Education 92 (2): 41-48.
Lang, J.D., S. Cruse, F.D. McVey, and J. McMasters. 1999. Industry expectations of new engineers: A survey to assist curriculum designers. Journal of Engineering Education 88 (1): 43-51.
Larpkiataworn, S., O. Muogboh, M. Besterfield-Sacre, L.J. Shuman, and H. Wolfe. 2003. Special considerations when using statistical analysis in engineering education assessment and evaluation. Journal of Engineering Education 92 (3): 207-15.
Lee, J.A., D.M. Castella, and S.G. Middleton. 1997. Faculty perceptions of academe's evaluation system. Journal of Engineering Education 86 (3): 263-67.
Leedy, P.D. 1997. Practical research: Planning and design (6th ed.). Upper Saddle River, NJ: Merrill.
Liang, T., D.G. Bell, and L.J. Leifer. 2001. Re-use or re-invent? Understanding and supporting learning from experience of peers in a product development community. Journal of Engineering Education 90 (4): 519-26.
Lincoln, Y.S., and E.G. Guba. 1985. Naturalistic inquiry. Beverly Hills, CA: Sage Publications.
Lumsdaine, M., and E. Lumsdaine. 1995. Thinking preferences of engineering students: Implications for curriculum restructuring. Journal of Engineering Education 84 (2): 193-204.
McLoughlin, L.A. 2005. Spotlighting: Emergent gender bias in undergraduate engineering education. Journal of Engineering Education 94 (4): 373-81.
Merino, D.N., and K.D. Abel. 2003. Evaluating the effectiveness of computer tutorials versus traditional lecturing in accounting topics. Journal of Engineering Education 92 (2): 189-94.
Merriam, S.B. 2002. Qualitative research in practice: Examples for discussion and analysis. San Francisco, CA: Jossey-Bass.
Miles, M.B., and M. Huberman. 1994. Qualitative data analysis: An expanded sourcebook. Thousand Oaks, CA: Sage Publications.
Moeller-Wong, C., and A. Eide. 1997. An engineering student retention study. Journal of Engineering Education 86 (1): 7-15.
Morell, L., R. Buxeda, M. Orengo, and A. Sanchez. 2001. After so much effort: Is faculty using cooperative learning in the classroom? Journal of Engineering Education 90 (3): 357-62.
Morse, J.M. 1991. Approaches to qualitative-quantitative methodological triangulation. Nursing Research 40: 120-23.
Morse, J.M. 2003. Principles of mixed methods and multimethod research design. In Handbook of mixed methods in social and behavioral research, eds. A. Tashakkori and C. Teddlie, 189-208. Thousand Oaks, CA: Sage Publications.
Moskal, B.M., J.A. Leydens, and M.J. Pavelich. 2002. Validity, reliability and the assessment of engineering education. Journal of Engineering Education 91 (3): 351-54.
Napp, J.B. 2004. Survey of library services at Engineering News Record's top 500 design firms: Implications for engineering education. Journal of Engineering Education 93 (3): 247-52.
Ogot, M., G. Elliott, and N. Glumac. 2003. An assessment of in-person and remotely operated laboratories. Journal of Engineering Education 92 (1): 57-64.
Ohland, M.W., and G. Zhang. 2002. A study of the impact of minority engineering programs at the FAMU-FSU college of engineering. Journal of Engineering Education 91 (4): 435-40.
Olds, B.M., and R.L. Miller. 2004. The effect of a first-year integrated engineering curriculum on graduation rates and student satisfaction: A longitudinal study. Journal of Engineering Education 93 (1): 23-35.
Olds, B.M., B.M. Moskal, and R.L. Miller. 2005. Assessment in engineering education: Evolution, approaches and future collaborations. Journal of Engineering Education 94 (1): 13-25.
Parsons, C.K., E. Caylor, and H.S. Simmons. 2005. Cooperative education work assignments: The role of organizational and individual factors in enhancing ABET competencies and co-op workplace well-being. Journal of Engineering Education 94 (3): 309-18.
Patton, M.Q. 2002. Qualitative research & evaluation methods. Thousand Oaks, CA: Sage Publications.
Pavelich, M.J., and W.S. Moore. 1996. Measuring the effect of experiential education using the Perry Model. Journal of Engineering Education 85 (4): 287-92.
Peters, M., P. Chisholm, and B. Laeng. 1994. Spatial ability, student gender, and academic performance. Journal of Engineering Education 83 (1): 1-5.
Raju, P.K., and C.S. Sankar. 1999. Teaching real-world issues through case studies. Journal of Engineering Education 88 (4): 501-08.
Rayne, K., T. Martin, S. Brophy, N.J. Kemp, J.D. Hart, and K.R. Diller. 2006. The development of adaptive expertise in biomedical engineering ethics. Journal of Engineering Education 95 (2): 165-73.
Robinson, D.A.G., and B.A. Reilly. 1993. Women engineers: A study of educational preparation and professional success. Journal of Engineering Education 82 (2): 78-82.
Rutz, E., R. Eckhart, J.E. Wade, C. Maltbie, C. Rafter, and V. Elkins. 2003. Student performance and acceptance of instructional technology: Comparing technology-enhanced and traditional instruction for a course in statics. Journal of Engineering Education 92 (2): 133-40.
Sageev, P., and C.J. Romanowski. 2001. A message from recent engineering graduates in the workplace: Results of a survey on technical communication skills. Journal of Engineering Education 90 (4): 685-93.
Sandelowski, M. 2003. Tables or tableaux? The challenges of writing and reading mixed methods studies. In Handbook of mixed methods in social and behavioral research, eds. A. Tashakkori and C. Teddlie, 321-350. Thousand Oaks, CA: Sage Publications.
Shavelson, R., and L. Towne. 2002. Scientific research in education. Washington, DC: National Academies Press.
Shiavi, R., and A. Brodersen. 2005. Study of instructional modes for introductory computing. Journal of Engineering Education 94 (4): 355-62.
St. Pierre, E.A. 2002. "Science" rejects postmodernism. Educational Researcher 31 (8): 25-27.
Steering Committee of the National Engineering Education Research Colloquies. 2006. Special report: The research agenda for the new discipline of engineering education. Journal of Engineering Education 95 (4): 259-61.
Taraban, R., A. DeFinis, A.G. Brown, E.E. Anderson, and M.P. Sharma. 2007. A paradigm for assessing conceptual and procedural knowledge in engineering students. Journal of Engineering Education 96 (4): 335-45.
Tashakkori, A., and C. Teddlie. 1998. Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage Publications.
Tebbs, J.M., and K.M. Bower. 2003. Some comments on the robustness of Student t procedures. Journal of Engineering Education 92 (1): 91-94.
Teddlie, C., and A. Tashakkori. 2003. Major issues and controversies in the use of mixed methods in the social and behavioral sciences. In Handbook of mixed methods in social and behavioral research, eds. A. Tashakkori and C. Teddlie, 3-50. Thousand Oaks, CA: Sage Publications.
Terenzini, P.T., A.F. Cabrera, C.L. Colbeck, J.M. Parente, and S.A. Bjorklund. 2001. Collaborative learning vs. lecture/discussion: Students' reported learning gains. Journal of Engineering Education 90 (1): 123-30.
Thorne, M., and M. Giesen. 2002. Statistics for the behavioral sciences. New York: McGraw-Hill.
Tinto, V. 1993. Leaving college: Rethinking the causes and cures of student attrition. Chicago, IL: University of Chicago Press.
Todd, R.H., S.P. Magleby, C.D. Sorensen, B.R. Swan, and D.K. Anthony. 1995. A survey of capstone engineering courses in North America. Journal of Engineering Education 84 (2): 165-74.
Tonso, K. 1996. The impact of cultural norms on women. Journal of Engineering Education 85 (3): 217-25.
Tonso, K.L. 2006. Teams that work: Campus culture, engineer identity, and social interactions. Journal of Engineering Education 95 (1): 25-37.
Trevelyan, J. 2007. Technical coordination in engineering practice. Journal of Engineering Education 96 (3): 191-204.
Trussell, H.J., and E.J. Dietz. 2003. A study of the effect of graded homework in a preparatory math course for electrical engineers. Journal of Engineering Education 92 (2): 141-46.
U.S. Department of Education. 2007. Report of the Academic Competitiveness Council. Washington, DC.
Vanderburg, W.H., and N. Khan. 1994. How well is engineering education incorporating societal issues? Journal of Engineering Education 83 (4): 1-5.
Walker, B.K., S. Jeng, P.D. Orkwis, G.L. Slater, P.K. Khosla, and G.J. Simitses. 1998. Redesigning an aerospace engineering curriculum for the twenty-first century: Results of a survey. Journal of Engineering Education 87 (4): 481-87.
Walker, J.M., and P.H. King. 2003. Concept mapping as a form of student assessment and instruction in the domain of bioengineering. Journal of Engineering Education 92 (2): 167-79.
Wankat, P.C., R.M. Felder, K.A. Smith, and F.S. Oreovicz. 2002. The scholarship of teaching and learning in engineering. In Disciplinary styles in the scholarship of teaching and learning: Exploring common ground, eds. M.T. Huber and S.P. Morreale, 217-237. Sterling, VA: Stylus Publishing.
Webster, T.J., and K.M. Haberstroh. 2002. An interactive, video-teleconferenced, graduate course in biomedical engineering. Journal of Engineering Education 91 (2): 159-66.
Weisner, T.F., and W. Lan. 2004. Comparison of student learning in physical and simulated unit operations experiments. Journal of Engineering Education 93 (3): 195-204.
Whitin, K., and S. Sheppard. 2004. Taking stock: An analysis of the publishing record as represented by the Journal of Engineering Education. Journal of Engineering Education 93 (1): 5-12.

AUTHORS' BIOGRAPHIES

Maura Borrego is an assistant professor of Engineering Education at Virginia Tech. Dr. Borrego holds an M.S. and Ph.D. in Materials Science and Engineering from Stanford University. Her current research interests center around interdisciplinary collaboration in engineering and engineering education. Funded by her CAREER grant, she also studies interdisciplinarity in engineering graduate programs nationwide.

Address: Engineering Education (0218), Blacksburg, Virginia 24061; telephone: (+1) 540.231.9536; e-mail: mborrego@vt.edu.

Elliot P. Douglas is an associate professor in the Department of Materials Science and Engineering. He received his doctorate in Polymer Science and Engineering from the University of Massachusetts-Amherst. His research interests in engineering education are in the areas of active learning, critical thinking, and the use of qualitative methods.

Address: 323 Materials Engineering Building, Box 116400, University of Florida, Gainesville, FL 32611; telephone: (+1) 352.846.2836; fax: (+1) 352.392.3771; e-mail: edoug@mse.ufl.edu.

Catherine T. Amelink is currently serving as the Assessment Coordinator for the Division of Student Affairs, Virginia Tech. She received her Ph.D. from Virginia Tech in Educational Leadership and Policy Studies. She currently works with departments on program review activities, data analysis, and assessment of learning outcomes. Her research interests include work-life spillover among faculty and issues confronting underrepresented groups in the STEM fields.

Address: 112 Burruss Hall (0250), Blacksburg, VA 24061; e-mail: amelink@vt.edu.