INTERNATIONAL JOURNAL OF SCIENCE EDUCATION
https://doi.org/10.1080/09500693.2018.1511072

Exploring college students’ cognitive patterns during reasoning*

Shiyu Liu^a and Frances Lawrenz^b

^a Department of Education, Ocean University of China, Qingdao, People’s Republic of China; ^b Department of Educational Psychology, University of Minnesota, Minneapolis, MN, USA

ABSTRACT
This research aimed to investigate the nature of cognitive processes when college students reason about evidence on global climate change (GCC). Twenty-six undergraduate students participated in this qualitative study, where they were interviewed to evaluate competing arguments on key issues related to GCC and discuss their own perspectives. Constant comparative analysis of data from think-aloud protocols and semi-structured interviews revealed three patterns of reasoning: minimum reasoning, constrained reasoning, and deliberative reasoning. Minimum reasoning demonstrated that participants predominantly favoured arguments which supported their own beliefs, with limited reasoning about the relative correctness of opposing arguments. Constrained reasoning showed participants’ emphasis on surface features of evidence on GCC rather than its scientific underpinnings. In contrast, deliberative reasoning involved more sophisticated cognitive efforts in coordinating evidence and claims, and a key characteristic of this pattern was in-depth statistical and causal reasoning. The current findings added to our understanding of college students’ reasoning processes when they are faced with controversial issues like GCC. This work contributed to current efforts in using cognitive research to inform science and environmental education, and laid a foundation for future endeavours in promoting scientific reasoning and argumentation in climate change education.

ARTICLE HISTORY
Received 3 November 2016; Accepted 8 August 2018

KEYWORDS
Reasoning; argumentation; global climate change; college students

CONTACT Shiyu Liu shiyuliu@ouc.edu.cn Department of Education, Ocean University of China, 238 Songling Road, Qingdao 266100, People’s Republic of China
*The original research presented in this manuscript draws upon the first author’s doctoral dissertation.
© 2018 Informa UK Limited, trading as Taylor & Francis Group

Introduction
The skills to reason actively and scientifically are crucial in science learning. While tra-
ditional science education mainly focuses on factual recall and confirmatory experiments
(Driver, Newton, & Osborne, 2000; Weiss, Pasely, Smith, Banilower, & Heck, 2003), there
have been increasing emphases on promoting students’ scientific reasoning skills (e.g.
American Association for the Advancement of Science, 1989, 2007; National Research
Council, 1996, 2001, 2012). For instance, in the U.S., the Next Generation Science Stan-
dards (NGSS Lead States, 2013) stressed that students should be able to reason scientifi-
cally to link evidence to explanations, as well as to defend and critique claims and
explanations. Through scientific reasoning, students can evaluate the merits of arguments
critically, in not only science classrooms but also everyday life. Similarly, in Europe, great
efforts have been made to enhance students’ reasoning skills such as encouraging engage-
ment in argumentation (European Union, 2006).
To promote scientific reasoning, a majority of previous research has looked into the
development of cognitive skills in aspects such as hypothesis testing, experimental
design, and evidence evaluation (Zimmerman, 2005). This line of work aimed to
answer the question of ‘how people reason’ by centring on individuals’ performance in
cognitive operations which include, but are not limited to, conducting experimental pro-
cedures (e.g. Dunbar, 2001; Schauble, 1996), coordinating theory and evidence (e.g.
Lawson et al., 2000; Watters & English, 1995; Zeineddin & Abd-El-Khalick, 2010), and
evaluating the quality of evidence (e.g. Lombardi & Sinatra, 2013). Among the abundant
studies in this regard, researchers have extensively explored individual differences during
reasoning, and attempted to investigate factors that may contribute to the distinctive
approaches people take to reason, even when they are presented with the same infor-
mation (Koslowski, 1996; van der Graaf, Segers, & Verhoeven, 2016).
Nevertheless, only a few studies have explored the cognitive underpinnings of the
variety of reasoning performance among individuals (Wu & Tsai, 2011). Indeed, most
studies on reasoning have emphasised the product or context of reasoning,
whereas not enough attention has been given to
the fundamental features of cognitive processes during reasoning (Bond, Philo, &
Shipton, 2011). Hence, no consensus has been reached on what constitutes the essence
of scientific reasoning, and there is a call for more research to look into the nature of
how people reason (Kind, 2013). The present study aimed to examine the cognitive pro-
cesses that college students engage in during reasoning. Our research question was: What
cognitive patterns can be identified when college students reason about competing argu-
ments? We argue that by exploring the cognitive patterns which students demonstrate,
this work may help educators to gauge the quality of reasoning, which can ultimately
offer insights into approaches to enhancing college students’ competence in scientific
reasoning.

Theoretical backgrounds
Reasoning and argumentation
Reasoning is a process of drawing conclusions from principles and evidence so as to infer
new conclusions based on what is already known (Holyoak & Morrison, 2005; Wason &
Johnson-Laird, 1972). As an important type of reasoning, scientific reasoning has drawn
attention from scholars in a variety of fields. Since researchers differ in their views about
science and reasoning, definitions for scientific reasoning vary in the literature (Zimmer-
man, 2005). Some researchers define scientific reasoning as formal reasoning and mainly
investigate how people use logic to reason (e.g. Fischer et al., 2014; Giere, Bickle, &
Mauldin, 2006). In this line of work, tasks that are well-structured with clear-cut solutions
are often adopted (e.g. Chen & Klahr, 1999; Inhelder & Piaget, 1958; Lawson, 2005). As
these well-defined tasks tend to be lean on contexts, sophisticated domain knowledge is
not required to engage in reasoning. Instead, cognitive skills required in general
problem solving and spatial thinking are more essential (e.g. Chinn & Malhotra, 2002;
Mayer, Sodian, Koerber, & Schwippert, 2014). While research in this respect is critical
in revealing key features of reasoning, further investigation is needed to provide an auth-
entic perspective of how people really reason when faced with a wide variety of tasks in real
life (Kind, 2013).
To unveil the authentic nature of scientific reasoning, over the past two decades,
an increasing number of studies have explored how people reason in argumentative con-
texts (e.g. Acar, Turkmen, & Roychoudhury, 2010; Kuhn, 1993; Sadler & Donnelly, 2006).
Argumentation is a social and verbal means of trying to resolve a conflict or difference
between two or more parties, and scientific argumentation requires individuals to evaluate
data and rationalise its use as evidence for a claim (Sampson & Clark, 2008; Walker &
Sampson, 2013). Kuhn (1993) suggested that the forms of reasoning ‘can be rigorously
defined within the framework provided by the structure of argument’ (p. 333). In this
respect, argumentation is a ‘verbal activity’ to probe into the internal reasoning processes
(Bricker & Bell, 2008; Driver et al., 2000; Duschl & Osborne, 2002; Yang & Tsai, 2010). As
argumentative contexts provide valuable experience to foster thinking, exploring the
nature of reasoning within argumentation not only enriches our understanding of reason-
ing, but also contributes to educational efforts in facilitating the development of students’
reasoning skills (Osborne, 2010).
To further elaborate on the essential characteristics of reasoning and its relationship
with argumentation, Mercier and colleagues proposed the argumentative theory of reason-
ing (Mercier & Sperber, 2011; Mercier, Boudry, Paglieri, & Trouche, 2017). According to
their theoretical perspective, argumentation is the purpose of reasoning: people are motiv-
ated to reason mostly to convince others through the production of arguments. To
achieve this goal, they tend to find reasons that confirm their own stances, a tendency
called myside bias (Mercier & Sperber, 2011), or to settle lazily for superficial evi-
dence that can be easily obtained (Mercier et al., 2017). While the impacts of myside
bias and laziness may not be apparent with well-structured problems, such features of
reasoning become far more pronounced in controversial matters like socioscientific
issues. Hence, this theoretical perspective laid a solid foundation for
the present study in the investigation of reasoning processes in argumentative contexts.

Reasoning about socioscientific issues


Current educational reforms are placing growing emphasis on the effective teaching of socio-
scientific issues (SSIs) in science classrooms (European Union, 2006; NGSS Lead States,
2013). SSIs are controversial social issues with scientific underpinnings, such as global
climate change (GCC) and genetic engineering (Sadler, 2004). Nowadays, SSIs have
been widely debated in society, and there are currently no straightforward, clear-cut
solutions to them (di Sessa, 1993; Mason & Scirica, 2006; Yang, Chang, & Hsu, 2008).
As a result, the public are constantly faced with arguments from various sources that
support competing perspectives. To sufficiently evaluate competing arguments on SSIs,
it is important to move beyond mere formal, logical thinking and incorporate reasoning
that will allow the appreciation of the inherent complexity of science (Kuhn, 1993;
Sadler, 2004; Sadler, Barab, & Scott, 2007). After all, evidence on SSIs is usually implicit
and the criteria for evidence evaluation are not well defined. Thus, individuals should
understand science as a ‘way of knowing’ rather than absolute truth to conceptualise the
inherent complexity of SSIs (Zeidler, Sadler, Simmons, & Howes, 2005).
Research abounds on how individuals reason about SSIs, but much is left to be explored
regarding the cognitive processes during reasoning (Wu & Tsai, 2011). At present, the
primary theoretical framework that accounts for the reasoning processes from a cognitive
perspective is the dual-process theory (Evans, 2002, 2003). According to this theory, there
are two distinct cognitive systems: implicit and explicit. While the former (System 1) is
unconscious, pragmatic, and contextualised, the latter (System 2) is conscious and involves
logical and abstract thinking. In general, people first make decisions and then think after-
ward to justify choices that were determined unconsciously (Evans, 1996). Accord-
ingly, reasoning about SSIs can be considered as coordinating between the implicit and
explicit cognitive systems. In particular, Wu and Tsai (2007) proposed that there are
two stages of reasoning in contexts of SSIs: the preliminary stage and the deliberation
stage. Individuals usually rely on their past experiences, such as prior knowledge and per-
sonal beliefs, to immediately make an initial decision. While some individuals only experi-
ence this stage, others proceed to reason in the deliberation stage, where they elaborate on
their thinking and make a final decision.
Based on the dual-process theory, researchers have proposed several cognitive schemes
to capture how one reasons about SSIs. For instance, Driver, Leach, Millar, and Scott (1996)
divided reasoning into three categories, including phenomenon-based reasoning, relation-
based reasoning, and model-based reasoning. Similarly, Yang and Anderson (2003)
classified student reasoning into three modes: scientifically oriented reasoning, socially
oriented reasoning, and equally disposed reasoning. Sadler and Zeidler (2005) explored
college students’ negotiation of SSIs and identified rationalistic reasoning, emotive reason-
ing, and intuitive reasoning as three main patterns of reasoning. Recently, with a more
specific focus on how middle school students reasoned about climate change issues, Lom-
bardi, Bickel, Brandt, and Burg (2017) categorised four patterns of evidence evaluation
based on Driver et al. (1996) and Dole and Sinatra’s (1998) frameworks, including: erro-
neous evaluation, descriptive evaluation, relational evaluation, and critical evaluation.
While the aforementioned characterisations provided detailed insights into individuals’
reasoning processes, further empirical work is needed to move toward a more integrative
scheme that can capture the common features of reasoning in contexts of SSIs.

The present study


The overarching goal of this work was to further our understanding of the cognitive pro-
cesses that individuals engage in when reasoning about SSIs. In particular, the topic of
interest in this study was global climate change (GCC). GCC is a pressing environmental
concern facing humanity. Confronted by overwhelming information on climate issues,
individuals tend to rely on their deeply held beliefs as an important mental filter when
reasoning about competing arguments (Maibach, Roser-Renouf, & Leiserowitz, 2008).
As a result, a considerable percentage of the public still dismisses the seriousness of
GCC (Gallup, 2010; Leiserowitz, Maibach, Roser-Renouf, & Smith, 2011). To promote
the public’s understandings of GCC, it is critical to enhance their competence in reason-
ing, so that they can more effectively coordinate evidence and arguments from various
sources (Sinatra, Kienhues, & Hofer, 2014).

While great efforts have been made to promote students’ reasoning skills about GCC
issues, difficulties abound when they try to process the multitude of perspectives about
GCC (e.g. Braasch, Bråten, Strømsø, Anmarkrud, & Ferguson, 2013; Gil, Bråten, Vidal-
Abarca, & Strømsø, 2010; Yang & Tsai, 2010). In recent years, many studies have specifi-
cally explored the challenges and obstacles college students encounter when reasoning
about arguments related to GCC. For example, Lombardi and colleagues found that stu-
dents had difficulties in evaluating the plausibility of evidence in arguments on GCC
(Lombardi, Brandt, Bickel, & Burg, 2016; Lombardi, Danielson, & Young, 2016), as well
as reasoning about particular aspects of GCC knowledge such as geological time (Lom-
bardi & Sinatra, 2012). To help students improve their reasoning about complex GCC
issues, it is pivotal to obtain an in-depth understanding of the cognitive approaches
they take. Therefore, building on the dual-process theory and the argumentative theory
of reasoning, the present study aimed to investigate the patterns of cognitive engagement
among college students when they reason about competing arguments on GCC.

Methods
Participants
A total of 26 undergraduate students (20 females and 6 males; M_age = 19.65 years, SD_age =
1.06) participated in this qualitative study. They were all from a major university in the
Midwestern U.S. and selected through theoretical sampling (Glaser, 1978). Given the
aim of this work, theoretical sampling was employed to maximise the diversity of the
sample, which allowed a more rounded exploration of participants’ reasoning processes.
In particular, we recruited participants from different academic backgrounds and ethnic
groups. The majority of participants were Caucasian, with four African Americans, two
Asians and one Native American. They came from a wide range of majors, including
communications, child psychology, kinesiology, pre-nursing, family social science,
elementary education, art, nutrition, speech and hearing, political science, linguistics,
business, and anthropology.

Materials
An interview protocol, consisting of reading materials and interview questions, was
developed for data collection. In particular, a reading document including three sets
of passages was developed, and each set had two short articles that presented opposing
arguments (see Appendix 1). The reading materials were constructed based on previous
IPCC reports (2007, 2013). However, to help elicit reasoning as participants evaluated
the arguments, drawing on tools from previous research (Liu & Roehrig, 2017), evidence
presented in the reading was purposefully designed with flaws, such as using evidence for
correlation to claim causation. The opposing arguments were all centred on whether
human activities or natural changes are the main cause of GCC, and addressed three
commonly discussed topics on GCC: Earth’s temperature change, rising sea level, and
extreme weather events.
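As an aside for readers unfamiliar with this flaw type, the sketch below (our illustration, not part of the study materials; all series are fabricated) shows how two variables can correlate strongly without either causing the other, because both track a shared driver such as time.

```python
# A minimal sketch, with fabricated data, of the correlation-vs-causation flaw
# deliberately embedded in the reading materials: two series that both follow
# a shared trend correlate strongly even though neither causes the other.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1970, 2011)
trend = (years - years.min()) / 10.0                # shared driver (e.g. time)

emissions = trend + rng.normal(0, 0.2, years.size)  # stand-in for CO2 emissions
storms = trend + rng.normal(0, 0.2, years.size)     # stand-in for extreme events

r = np.corrcoef(emissions, storms)[0, 1]
print(f"raw correlation: r = {r:.2f}")              # strong, close to 1

# Removing the shared driver makes the association vanish, showing that the
# correlation alone never established a causal link between the two series.
r_detrended = np.corrcoef(emissions - trend, storms - trend)[0, 1]
print(f"detrended correlation: r = {r_detrended:.2f}")  # near 0
```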
According to the dual-process theory, System 1 and 2 thinking involve three main
aspects: prior knowledge, personal beliefs, and effortful reasoning (Evans, 2008). Thus, we
developed interview questions to evaluate participants’ understandings and beliefs about
GCC in these regards (see Appendix 2). Moreover, as evaluation of arguments is a critical
predictor for competence in reasoning (Wu & Tsai, 2011), to probe into their reasoning pro-
cesses, we investigated how participants critiqued the quality of evidence and its justifica-
tion for competing arguments. Altogether, three interview questions assessed prior
knowledge about the basic science underlying GCC, such as greenhouse effect; seven ques-
tions specifically focused on evaluation of arguments; and the three remaining questions
tapped into personal beliefs about GCC. The whole interview protocol was piloted with
college students and revised for clarity and conciseness before being used in this study.

Data collection
To explore the characteristics of participants’ reasoning processes, we conducted semi-
structured interviews and employed the think-aloud technique (Ericsson & Simon,
1993; Magliano, Trabasso, & Graesser, 1999), which allowed direct insights into partici-
pants’ moment-by-moment cognitive processes. Before the interview, each participant
received a consent form and brief instruction about the tasks they would complete. The
reading document was then provided with each set of arguments presented on a separate
page, and the participants were asked to read aloud and think aloud after each sentence.
Upon finishing each set of the arguments, participants were asked to critique them. At the
end of the interview, they answered questions regarding their perspectives toward GCC.
The interviews were conducted individually, and each lasted approximately
45 minutes.

Data analysis
The interviews were audio-recorded and transcribed verbatim, and the transcripts were
entered into NVivo 10 for further analysis. The data analysis process was conducted
based on the grounded theory methodology (Glaser & Strauss, 1967; Strauss & Corbin,
1990), which features an exploratory development of theory grounding in data from the
field. In particular, the constant comparative method (Strauss & Corbin, 1998) was
used for coding, entailing three stages: open coding, axial coding, and selective coding.
While open coding requires line-by-line categorisation to identify the properties and
dimensions of participants’ understandings, axial coding involves constant comparison
of these categories, and selective coding is the integration process where categories are
refined. Aligned with the theoretical sampling process, data coding underwent a recursive
process to saturate theoretical categories and themes (Charmaz, 2006; Fassinger, 2005;
Jones, Torres, & Arminio, 2006). As new interviews were conducted and analysed, pre-
vious transcripts were constantly revisited, so that categories could be compared with
one another in terms of whether they interacted with and/or subsumed each other.
Specifically, during the first round of analysis, we carefully read through the interview
transcripts and generated open codes through sentence-by-sentence coding. Next, we
sorted through all open codes for repetitions, reorganised the codes and categorised
them into groups to let features of reasoning processes gradually emerge. Two researchers
were involved in constructing the coding schemes, and techniques such as memoing and
diagramming were employed to establish the trustworthiness of data analysis (Lincoln &
Guba, 1985). Categories that were similar in their definitions were further analysed to
decide their final categorisation, until the inter-rater reliability (Cronbach’s α) exceeded
0.8 for the final coding. The decision that saturation had been reached was made after
26 participants had been interviewed; 3 additional participants were then interviewed to
cross-check for any additional codes that might emerge. As no new theoretical categories
emerged from these interviews, the results reported in this paper include data from only
the 26 participants.
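The paper reports only the agreement threshold, not how Cronbach’s α was computed; as a rough sketch under that assumption, the following shows one conventional way to calculate α with the two coders treated as ‘items’ over a set of segments. All codes below are fabricated for illustration.

```python
# A minimal sketch, assuming two coders assigned numeric codes to the same
# transcript segments, of computing Cronbach's alpha (the agreement statistic
# the study reports exceeding 0.8). All codes here are fabricated.
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: array of shape (n_segments, n_coders)."""
    n_coders = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1).sum()   # sum of per-coder variances
    total_var = ratings.sum(axis=1).var(ddof=1)     # variance of segment totals
    return (n_coders / (n_coders - 1)) * (1 - item_vars / total_var)

# Hypothetical codes over ten segments (1 = minimum, 2 = constrained,
# 3 = deliberative reasoning).
coder_a = np.array([1, 2, 2, 3, 1, 3, 2, 2, 3, 1])
coder_b = np.array([1, 2, 2, 3, 1, 3, 2, 1, 3, 1])

alpha = cronbach_alpha(np.column_stack([coder_a, coder_b]))
print(f"Cronbach's alpha = {alpha:.2f}")  # values above 0.8 taken as acceptable
```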

Results
Prior knowledge
Qualitative analysis of participants’ interview responses revealed that none of them were
proficient in explaining the basic science of GCC. Most participants were only able to gen-
erally explain terms such as greenhouse effect with very limited, and even inaccurate,
understanding of the science behind it. Specifically, 18 of the 26 participants either did
not understand the metaphorical use of ‘greenhouse’ in the terms greenhouse gases and
greenhouse effect or were not able to give detailed definitions for them.
For instance, when asked to explain greenhouse gases, P22 answered ‘Greenhouse
gases? I don’t even remember. Do those come from greenhouses?’ Some participants
also demonstrated a perceived causal relationship between CO2 emissions and
ozone depletion. For example, when defining greenhouse gases, P16 mentioned:
I know that they [greenhouse gases] are causing a big hole in the ozone over the North Pole.
I’m pretty sure they’re mostly from giant cow farms and giant urban cities that have 50,000
cars on the road. And greenhouse gases, I think, just are that we cut all the forests and now we
don’t have as much oxygen and the CO2 isn’t really going anywhere.

In comparison, however, eight participants were able to provide more detailed scientific
explanations about greenhouse gases and greenhouse effect, and were also very much
aware of humans’ impacts on the Earth’s temperature. P11, for instance, illustrated his
understanding of greenhouse gases and greenhouse effect with a focus on human activities:
We can take them [greenhouse gases] and they [the atmosphere and oceans] can hold them
for a certain amount of time, and then release them to something that is good for everyone.
But we’re putting so much out there that they can’t contain them [greenhouse gases] at all.
We’re also cutting down the trees or hurting the oceans, so there’re issues like being unable to
contain them, so we’re having all this excessive heat that we’re generating, producing right
into the world and because this is no way to convert them and keep the cycle going,
they’re staying in the atmosphere.

Patterns of reasoning
When evaluating the arguments, participants demonstrated three patterns of reasoning. It
should be noted that while these patterns were distinct from one another, participants
did not reason in just one pattern. Instead, there was a combination of these patterns
throughout their think-aloud protocols and interviews. In this section, we explain in
detail the main features of each pattern. Here, quotes from participants are specified
with their assigned numbers (such as P1, P2, and so on).

Pattern 1: minimum reasoning


When reading the arguments, participants often responded in very simple ways, revealing
little or no reasoning as they were thinking aloud and evaluating the argu-
ments. In particular, we identified two main features of the cognitive engagement in
this pattern of reasoning: simple reactions to arguments and confirming one’s own beliefs.
Simple reactions to arguments. Participants tended to repeat the sentences word by
word as they were reading the arguments and thinking aloud. Some participants only
reacted to the arguments by simply saying ‘I agree/disagree’, whereas others also expressed
that they liked or disliked the information presented when reporting their thinking at the
moment. Yet, participants generally shied away from providing detailed critiques, for
reasons such as fear that they might ‘sound dumb’. As participants were evaluating
evidence and making judgments about a given argument, they often simply accepted or refuted
it without providing sufficient reasons or specifying their criteria for evaluation. For
example, P23 explained that her reason for agreeing with the argument which supported
‘Temperature change is due to human-induced CO2 emissions’ was that ‘It seems strong
… there’s no unnecessary information. It just seems credible.’
Sometimes, participants requested clarification and confirmation upon reading
information that they were not familiar with or uncertain about. For example, after
reading the title ‘Temperature Change and CO2 Emissions’, P15 asked ‘CO2, that’s
carbon dioxide, isn’t it?’ Moreover, some participants briefly noted whether information
in the reading was consistent or in conflict with what they already knew.
After reading about the composition of the atmosphere and the percentage of CO2 in
it, P14 responded ‘OK, I guess I didn’t know that CO2 only constitutes less than 1% of
the trace gases.’ However, more often than not, neither request for clarification nor
identification of information conflicts was followed by further reasoning.
Confirmation of one’s own beliefs. Personal beliefs about GCC were revealed to have
an important bearing on the paths participants took to evaluate the given arguments.
With or without being aware of it, participants were prone to strongly agreeing with argu-
ments aligned with their own perspectives, even when they lacked sufficient knowledge to
justify the scientific credibility of the evidence. P10, for example, believed that ‘The Earth is
going to go on for a long, long time, why do I need to really care about that?’ When she was
asked to evaluate evidence used in both sides of the arguments, she considered all the evi-
dence supporting her belief ‘just seems more plausible’ but could not provide any specific
scientific explanations.

Pattern 2: constrained reasoning


This pattern of reasoning involved more cognitive efforts and included more details about
how participants processed given information. Compared to Pattern 1 reasoning, Pattern
2 revealed further evaluation of the characteristics of evidence and arguments, but such
analysis was more confined to their surface features, including features of writing, avail-
ability of numbers, and so on. More importantly, in Pattern 2 reasoning, participants
were more reflective about their thinking, even though they demonstrated difficulties in
aspects such as reasoning about deep time.
Fixation on surface features of writing. Participants placed great emphasis on the
surface features of evidence, as well as the arguments being supported. One of the main
concerns they raised was how the arguments were worded when supporting a given
claim. Many participants paid particular attention to the tone of writing. For instance,
after reading the sentence ‘Changes in the frequency and intensity of extreme weather
events are due to human-caused Earth’s temperature increase’, P22 commented that
‘even though I believe in this, this sentence came off a little biased. Even though I do
believe that it is human caused, it came off a little strong. So, I don’t know.’
When making further, in-depth evaluation of arguments, participants brought up con-
cerns about the composition of key elements in the arguments (the claim, evidence, and
justification), in terms of their coherence, straightforwardness, and choice of wording.
P4, for example, commented on the argument which claimed that extreme weather
events were not human-caused by pointing out that ‘words like “appears to have
decreased” are not concrete’. Similarly, P18 made the following critique of the argument
that supported ‘Rising sea level is not human caused’:
I would definitely first change the wording of these numbers and eliminate the parentheses,
so these are actually part of the sentence, and then I might write at the end a summary sen-
tence saying ‘this suggests that the rising sea level is not human caused.’ And then that would
kind of sum it up in a nice way.

Preference for availability of numerical values. A common critique which participants pro-
vided was that more factual information should be used to strengthen the arguments.
When evaluating the argument for ‘Increase in extreme weather events is not human-
caused’, P19 considered that the major weakness was ‘they didn’t give as many dates
[as the opposing argument] or say specifically like there was a decrease [of extreme
weather events] during this time and increase during this time’. As for its opposing argu-
ment, P19 pointed out that ‘I wish they have put more numbers in there.’ However, such
preference for more factual information was not clearly justified. In other words, when
participants evaluated evidence, they were prone to rely on the availability of
numbers rather than on the statistical inferences those numbers support. For example, when pro-
posing an alternative way to structure the evidence, P3 made the following comments:
The argument [which supported that ‘the increase in extreme weather events is human-
caused changes’] could have added some numbers, maybe to say, like, during these years,
there are how many hurricanes and tropical storms that have happened and how it’s
doubled since 1970 and 1974.

Difficulty relating to deep time. When evaluating the quality of evidence, participants
emphasised whether it came from recent years or from the distant past.
They tended to relate better to evidence that discussed GCC in recent years and perceived
evidence as stronger when it specifically included years very close to the present. In con-
trast, evidence that presented events from the ancient years or focused on a larger time-
scale was usually considered as insufficient or irrelevant, even when such events
supported the claim scientifically. For example, when comparing the competing argu-
ments in the reading, P26 considered arguments with evidence from recent years as
appearing stronger, but did not follow up with any further explanation for how evidence
from different points of time may strengthen its link with the claim:
I feel like it [the side that climate change is human induced] uses more recent data to support
its claims, because it cites 2012, which is 2 years ago, and it really does it for all three pages. It
goes back to 2000s, so it’s really recent. And in claim 2 [the side that climate change is due to
natural changes], I feel like it’s not strong because the first claim, extreme events, it talks
about recent years, but in the second claim, rising sea levels, it just goes back thousands of
years, and in the third claim, it doesn’t even really address a recent time change.

Active reflection. Rather than only agreeing or disagreeing with the text when thinking aloud,
participants constantly and actively reflected on their knowledge, beliefs, and thinking pro-
cesses. For instance, after reading the sentence ‘The increase in extreme weather events is
human-caused’, P23 reflected on her confusion and pointed out that it might have been due
to a lack of knowledge, saying that ‘I don’t see how a temperature increase would really
result in a more intense tropical cyclone. Ok, I just don’t know that much about weather.’
When asked to evaluate evidence in detail, participants expressed concerns about
whether the evidence was described sufficiently. For instance, in the argument for ‘Temp-
erature change is due to human-induced CO2 emissions’, the evidence of ‘Compared to
pre-industrial values, there has been a 50% increase of CO2 concentration in the air as
of 2011’ was used. After reading this evidence, P25 made the following comment:
I think it’s [evidence in both arguments about temperature change and CO2 emissions] very
vague. Because the first one is saying that “there has been a 50% increase of CO2 concen-
tration in the air as of 2011”, but they’re not really showing that as compared to what.
What they’re comparing that increase to?

Pattern 3: deliberative reasoning


This pattern of reasoning moved beyond the mere concern of surface features and revealed
more in-depth analysis of the structure and content of arguments. Specifically, participants
were more actively evaluative of evidence to distinguish between correlation and causa-
tion, clarify the statistical meaning of numerical values, and justify the scientific connec-
tions between claim and evidence.
Distinguishing correlation and causation. During the interviews, participants who were
clearly aware of the distinction between correlation and causation adopted this distinction
as their main criterion for evidence evaluation. For example, regarding the evidence used
to support the claim of ‘Temperature change is due to human-induced CO2 emissions’,
P17 briefly commented that ‘I found that they’re showing a correlation between high
CO2 emissions and climate change, but they’re not showing causation.’ Participants also
expressed that there was a need for more examples and scientific explanations to prove
any causal relationship. For example, P6 identified that coexistence of events does not
suffice for causation if no further information was presented:
Just because these things [human activities and extreme weather events] happened at the
same time doesn’t mean they’re connected. And if they’re saying that human activities
have resulted in temperature increase and there has been an increase in cyclone activities,
they are trying to connect those things just because they are putting them in the same sen-
tence, but there is no proof that says they correlate at all.

To distinguish between correlation and causation, participants requested further infor-
mation regarding the potential impacts of other related factors. For example, when eval-
uating the argument which supported that ‘The increase of extreme weather events is
human-caused’, P14 commented as follows:
I think we learned that you cannot always assume things, I mean even if they have a corre-
lation, that doesn’t mean causation. There might be another factor that could be affecting this
instead. So, for example, if they did have the improvement of technology, it could also mean
there has been an increase in these activities because they have been able to track them. So, I
think that this argument doesn’t account for other variables.

Additionally, participants revealed their awareness of the complexity involved in
proving causation. Below, P11 illustrated his openness to the uncertainty that might be
involved in exploring causal relationships:
It [causation] is obviously something hard to prove. I feel like it’s a theory where it’s hard for
you to prove that it’s true, but I feel like the more you get toward it, the harder it is to buy that
argument of these aren’t related. I mean we’re never going to know if X causes Y, but the
more proof we have of X most likely is the cause of Y, then the more we should start
doing something about this.

Consideration of statistical inferences. Pattern 3 also featured participants’ reasoning about
the statistical inferences rather than only fixating on their own intuitions about numerical
values. When presented with numbers such as the percentage of CO2 in the Earth’s atmos-
phere, participants raised concerns that the mere presence of magnitude of numerical
values may not be sufficient evidence. For example, P25 requested for clarification regard-
ing the statistical significance of these numbers:
Even if the increase of CO2 went from like 0.5 to 1%, I don’t know how significant that would
be. They’re also showing that the increase in the Earth average surface temperature but it
looks like a very small amount. I don’t know how big of an impact that would have.

However, while participants often mentioned the need for more ‘statistics’ to
strengthen an argument, it was sometimes unclear what statistical information they
referred to. Only one participant, P26, explicitly suggested using statistics such as p-
values as evidence to examine whether the rates of sea-level rise were significantly
different over the years.
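As a rough illustration of the check P26 proposed (the study provides no such data, so the annual rise figures below are fabricated), a two-sample t-test can ask whether mean sea-level rise rates differ between two periods:

```python
# A minimal sketch of P26's suggestion: test whether the rate of sea-level
# rise differs significantly between two periods. Annual rise values (mm/year)
# are fabricated purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rise_early = rng.normal(1.5, 0.4, 50)   # hypothetical annual rise, 1900-1950
rise_late = rng.normal(3.0, 0.4, 50)    # hypothetical annual rise, 1960-2010

t_stat, p_value = stats.ttest_ind(rise_early, rise_late)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
# A small p-value indicates the two periods' mean rates differ by more than
# chance variation alone would plausibly produce.
```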
Considering alternative explanations. Rather than simply accepting or refuting the
arguments, some participants proposed viable solutions to improve the quality of evidence
and justifications. For instance, P23 expressed her concerns about how much the findings
from one region may be generalised to a larger geographical scale. Regarding the third set
of articles, she critiqued as follows:
They only mentioned one place: they’re just saying the Australian region. It would be prob-
ably more reliable if they said, or the argument would probably be strengthened, if they
included more areas, not just the Australian region.

In Pattern 3 reasoning, one of the key features was critical evaluation of both sides of the
arguments. Participants were able to provide critiques for arguments regardless of their
own preference and prior beliefs. After reading the arguments on the topic of ‘temperature
change and extreme weather events’, P14 critiqued the evidence as insufficient on
both sides of the argument:
I think if they show that the number was really unreliable, instead of just saying that, that
would help back up their argument. And it would help with the second argument if they
could show maybe there was just an improvement in technology during this time period:
we only had this kind of techniques to detect them and now in this time period we have
this kind of techniques.

Discussion
This study took a qualitative approach to explore how college students reasoned when
faced with competing arguments on GCC. Three patterns were identified to capture the
key characteristics of students’ reasoning processes. First, it was common that students
were only engaged in minimum reasoning when they could simply draw on their prior
beliefs to make judgments on a given argument. Constrained reasoning was demonstrated
as students attempted to elaborate on their evaluation of evidence, which featured an over-
whelming emphasis on surface characteristics of arguments such as the writing style and
availability of numerical values. Nonetheless, we also identified that college students were
able to reason at a more sophisticated and comprehensive level. During such deliberative
reasoning, they made more cognitive efforts to distinguish between causation and corre-
lation, evaluate the statistical meaning of numerical evidence, and so on. In all, these pat-
terns of reasoning, as well as their specific characteristics, added to our understanding of
the different approaches students take during reasoning.
Consistent with previous research, this work further revealed the intuitive and rational
aspects of individuals’ reasoning in the context of SSIs (e.g. Sadler & Zeidler, 2005). On
the one hand, the current findings were consistent with the propositions of the argumentative
theory of reasoning (Mercier et al., 2017; Mercier & Sperber, 2011), which emphasised that
individuals tend to demonstrate myside bias and laziness in their reasoning. When trying
to achieve the goal of persuading others through argumentation, they tend to only seek
reasons to support their own beliefs and overlook the importance of coordinating alterna-
tive perspectives. Moreover, even for those who are competent in reasoning scientifically,
it is common that they may only focus on superficial features of evidence when generating
or evaluating arguments.
On the other hand, the cognitive patterns of reasoning that emerged in this work aligned
with the two systems of thinking proposed in the dual-process theory. The pattern of
minimum reasoning revealed critical characteristics of System 1 thinking, featuring instan-
taneous and intuitive comments about GCC that lacked careful consideration of the evi-
dence at hand. In contrast, deliberative reasoning appeared to be the product of
System 2 thinking: cognitive processes, such as distinguishing correlation and causation,
demonstrated how participants consciously conducted logical and abstract thinking to
evaluate competing arguments. In addition, constrained reasoning fell in the coordination
between System 1 and System 2, where participants engaged in effortful evaluation of
arguments, but were heavily dependent on their personal beliefs rather than scientific cri-
teria. This work provided valuable empirical support for the dual-process theory in the
context of GCC, and more importantly, promoted further discussion on how System 1
and 2 thinking operate when people reason about SSIs.
Many studies have demonstrated that students at all levels experience difficulties in
reasoning, especially during evidence evaluation (e.g. Dawson & Venville, 2009;
Koslowski, Marasia, Chelenza, & Dublin, 2008; Sandoval & Millwood, 2005). For
example, Pluta, Buckland, Chinn, Duncan, and Duschl (2008) documented that students
typically found it difficult to make judgments about the relative strength of evidence, and
tended to treat all evidence as equally strong. They mostly just mentioned the evidence
that was potentially supportive of a claim and had limited understanding of the need to
provide more elaborated justifications for the connection between evidence and claim.
Sanchez, Wiley, and Goldman (2006) identified four key challenges students mostly
experience, which included evaluating the source of the information, interpreting the evi-
dence that was presented, thinking about how the evidence fits into an explanation of the
phenomena, and integrating the information with prior knowledge. By engaging partici-
pants in reasoning about evidence in competing arguments on GCC, this study revealed
that such challenges are still common, especially when it comes to SSIs, such
as reasoning about geological time (Lombardi & Sinatra, 2012).
An important reason for the difficulties students experience in evidence
evaluation may be that they do not hold sufficient understanding of the criteria for judging
the reliability and validity of evidence (Wu & Hsieh, 2006). Driver et al. (2000) noted that
insufficient time is typically given in class to evaluative tasks beyond simple interpretation
of data. In particular, questions such as ‘What trust can we place in data?’ or ‘Are there
different possible interpretations of this data?’ are not frequently addressed. Thus, students
tend to treat data as non-biased and do not raise concerns over the credibility of the evi-
dence (Nicolaidou, Kyza, Terzian, Hadjichambis, & Kafouris, 2011). Such an approach to
interpreting data and evaluating evidence can be especially detrimental to enhancing stu-
dents’ climate literacy given the ill-defined nature of climate science. Thus, more instruc-
tional efforts should be devoted to enhancing students’ reasoning in evidence evaluation.
Many studies have specifically investigated the effect of instructional intervention on
improving students’ skills in evaluating evidence (Engelmann, Neuhaus, & Fischer,
2016). McNeill, Lizotte, Krajcik, and Marx (2006) found that faded scaffolding of
written argumentation may facilitate students’ understanding and skills in evidence evalu-
ation. Providing students with more opportunities to reason in applied contexts can be
another helpful approach to fostering students’ reasoning about evidence (Bond et al.,
2011). By engaging in activities such as constructing scientific explanations and participat-
ing in scientific debates (Braaten & Windschitl, 2011; Tang, 2016), students could not only
improve their skills in coordinating evidence and claims, but also develop the scientific under-
standing that the ‘right answer’ is not simply waiting for people to reveal it
(Mercier et al., 2017). Moreover, recent work by Lombardi and colleagues found
that model-evidence-link diagram activities may also help students to enhance their
reasoning performance during evidence evaluation (Lombardi, Danielson et al., 2016).
By exploring the different patterns of college students’ reasoning about arguments on
GCC, this work provided important insights for future instructional efforts to be more
grounded in students’ cognitive competence when it comes to SSIs.
It should be noted that there are a few limitations in this work that need to be addressed
in future research. First, the topic used in this investigation of scientific reasoning was
GCC, and whether scientific reasoning processes may differ across various topics
remains to be further explored (Topcu, Sadler, & Yilmaz-Tuzun, 2010). Follow-up
studies should continue to explore whether reasoning about other SSIs, such as genetic
engineering and water pollution, may also lead to the three patterns that emerged in this
work. Second, the task in this study was to critique arguments that were already generated,
which may not necessarily elicit reasoning processes that would otherwise be demonstrated in
constructing arguments. Future work should consider engaging participants in a variety of
argumentative activities to obtain a more authentic view of how people reason. Further-
more, given the qualitative nature of this work, it is beyond our scope to quantitatively
gauge the reasoning patterns. With a main goal of characterising the cognitive processes,
we also did not specifically investigate the dynamic connections between reasoning pat-
terns and relevant factors such as prior knowledge. However, in our next steps, we will
continue to probe into influential factors of the reasoning patterns drawing on mixed
methods approaches. Such efforts will add to our current findings and enrich our discus-
sion of the cognitive processes during reasoning when it comes to SSIs.

Conclusion
The present study aimed to explore the patterns of reasoning that college students
demonstrate when faced with competing arguments on GCC. With a primary focus
on students’ cognitive processes as they were evaluating evidence and arguments,
the current findings revealed three main patterns of reasoning and their key features.
This work not only added to existing literature on exploring the various approaches
individuals take to reasoning, but also enriched our understanding of the auth-
entic nature of individuals’ cognitive processes during reasoning in the context of
GCC. More importantly, with a cognitive perspective, this research may further con-
tribute to current reflection on potential ways to bridge cognitive research with
science education. Through continuous efforts in exploring the authentic character-
istics of reasoning processes when it comes to issues such as GCC, future work in
this line of research will lead to closer collaborations between scientists and educators
to enhance student learning of SSIs.

Disclosure statement
No potential conflict of interest was reported by the authors.

Funding
This work was supported by the MOE (Ministry of Education of the People’s Republic of China)
Project of Humanities and Social Sciences [grant number 18YJC880055], Fundamental Research
Funds for the Central Universities [grant number 201713005], Shandong Province Higher Edu-
cation Institutions Research Project of Humanities and Social Sciences [grant number
J17RB189], and Qingdao Municipal Planning Project of Social Sciences [grant number
QDSKLZ17002].

ORCID
Shiyu Liu http://orcid.org/0000-0001-9331-548X

References
Acar, O., Turkmen, L., & Roychoudhury, A. (2010). Student difficulties in socio-scientific argumen-
tation and decision-making research findings: Crossing the borders of two research lines.
International Journal of Science Education, 32(9), 1191–1206.
American Association for the Advancement of Science. (1989). Science for all Americans. New York,
NY: Oxford University Press.
American Association for the Advancement of Science. (2007). Atlas of scientific literacy (Vol. 2).
Washington, DC: Author.
Bond, C., Philo, C., & Shipton, Z. (2011). When there isn’t a right answer: Interpretation and
reasoning, key skills for twenty-first century geosciences. International Journal of Science
Education, 33(5), 629–652.
Braasch, J. L., Bråten, I., Strømsø, H. I., Anmarkrud, Ø, & Ferguson, L. E. (2013). Promoting sec-
ondary school students’ evaluation of source features of multiple documents. Contemporary
Educational Psychology, 38(3), 180–195.
Braaten, M., & Windschitl, M. (2011). Working toward a stronger conceptualization of scientific
explanation for science education. Science Education, 95(4), 639–669.
Bricker, L. A., & Bell, P. (2008). Conceptualizations of argumentation from science studies and the
learning sciences and their implications for the practices of science education. Science Education,
92(3), 473–498.
Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis.
London: Sage.
Chen, Z., & Klahr, D. (1999). All other things being equal: Children’s acquisition of the control of
variables strategy. Child Development, 70, 1098–1120.
Chinn, C., & Malhotra, B. (2002). Epistemologically authentic inquiry in schools: A theoretical fra-
mework for evaluating inquiry tasks. Science Education, 86(2), 175–218.
Dawson, V., & Venville, G. (2009). High-school students’ informal reasoning and argumentation
about biotechnology: An indicator of scientific literacy? International Journal of Science
Education, 31(11), 1421–1445.
di Sessa, A. (1993). Toward an epistemology of physics. Cognition and Instruction, 10(2/3), 105–225.
Dole, J. A., & Sinatra, G. M. (1998). Reconceptualizing change in the cognitive construction of
knowledge. Educational Psychologist, 33(2–3), 109–128.
Driver, R., Leach, J., Millar, R., & Scott, P. (1996). Young people’s images of science. Buckingham:
Open University Press.
Driver, R., Newton, P., & Osborne, J. (2000). Establishing the norms of scientific argumentation in
classrooms. Science Education, 84(3), 287–312.
Dunbar, K. (2001). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so
difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, & B. Kokinov (Eds.),
Analogy: Perspectives from cognitive science (pp. 313–314). Cambridge, MA: MIT Press.
Duschl, R., & Osborne, J. (2002). Supporting and promoting argumentation discourse in science
education. Studies in Science Education, 38, 39–72.
Engelmann, K., Neuhaus, B., & Fischer, F. (2016). Fostering scientific reasoning in education –
meta-analysis evidence from intervention studies. Educational Research and Evaluation, 22(5–
6), 333–349.
Ericsson, K., & Simon, H. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT
Press.
European Union (2006). Recommendation of the European parliament and of the council of 18
December 2006 on key competences for lifelong learning. Official Journal of the European
Union, L 394/10–L 394/18. Retrieved from http://eur-lex.europa.eu/legal-content/EN/TXT/?
uri=celex:32006H0962
Evans, J. (1996). Deciding before you think: Relevance and reasoning in the selection task. British
Journal of Psychology, 87, 223–240.
Evans, J. S. B. T. (2002). Logic and human reasoning: An assessment of the deduction paradigm.
Psychological Bulletin, 128, 978–996.
Evans, J. S. B. T. (2003). In two minds: Dual-process accounts of reasoning. Trends in Cognitive
Sciences, 7, 454–459.
Evans, J. S. B. T. (2008). Dual-processing accounts of reasoning, judgment, and social cognition.
Annual Review of Psychology, 59(1), 255–278.
Fassinger, R. E. (2005). Paradigms, praxis, problems, and promise: Grounded theory in counseling
psychology research. Journal of Counseling Psychology, 52, 156–166.
Fischer, F., Kollar, I., Ufer, S., Sodian, B., Hussmann, H., Pekrun, R. … Eberle, J. (2014). Scientific
reasoning and argumentation: Advancing an interdisciplinary research agenda in education.
Frontline Learning Research, 4, 28–45.
Gallup. (2010). Americans’ global warming concerns continue to drop. Retrieved from www.gallup.
com/poll/126560/americans-global-warming-concerns-continue-drop.aspx
Giere, R., Bickle, J., & Mauldin, R. (2006). Understanding scientific reasoning (5th ed.). Belmont,
CA: Thomson Wadsworth.
Gil, L., Bråten, I., Vidal-Abarca, E., & Strømsø, H. I. (2010). Summary versus argument tasks when
working with multiple documents: Which is better for whom? Contemporary Educational
Psychology, 35(3), 157–173.
Glaser, B. G. (1978). Theoretical sensitivity. Mill Valley, CA: The Sociology Press.
Glaser, B. G., & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative
research. Chicago, IL: Aldine.
Holyoak, K., & Morrison, R. (2005). Thinking and reasoning: A reader’s guide. In K. J. Holyoak & R.
G. Morrison (Eds.), The Cambridge handbook of thinking and reasoning (pp. 1–11). Cambridge:
Cambridge University Press.
Inhelder, B., & Piaget, J. (1958). The growth of logical thinking from childhood to adolescence.
New York: Basic Books.
Intergovernmental Panel on Climate Change. (2007). Climate change 2007: The physical science basis.
Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel
on Climate Change. S. Solomon, D. Qin, M. Manning, Z. Chen, M. Marquis, K. B. Averyt,
M. Tignor, & H. L. Miller (Eds.). Cambridge: Cambridge University Press.
Intergovernmental Panel on Climate Change. (2013). Climate change 2013: The physical science
basis. Contribution of Working Group I to the Fifth Assessment Report of the
Intergovernmental Panel on Climate Change. T. F. Stocker, D. Qin, G.-K. Plattner, M. Tignor,
S. K. Allen, J. Boschung, A. Nauels, Y. Xia, V. Bex & P. M. Midgley (Eds.). Cambridge:
Cambridge University Press.
Jones, S. R., Torres, V., & Arminio, J. (2006). Negotiating the complexities of qualitative research in
higher education: Fundamental elements and issues. New York, NY: Routledge.
Kind, P. (2013). Establishing assessment scales using a novel disciplinary rationale for scientific
reasoning. Journal of Research in Science Teaching, 50(5), 530–560.
Koslowski, B. (1996). Theory and evidence: The development of scientific reasoning. Cambridge, MA:
MIT Press.
Koslowski, B., Marasia, J., Chelenza, M., & Dublin, R. (2008). Information becomes evidence when
an explanation can incorporate it into a causal framework. Cognitive Development, 23, 472–487.
Kuhn, D. (1993). Science as argument: Implications for teaching and learning scientific thinking.
Science Education, 77(3), 319–337.
Lawson, A. (2005). What is the role of induction and deduction in reasoning and scientific inquiry.
Journal of Research in Science Teaching, 42(6), 716–740.
Lawson, A. E., Clark, B., Cramer-Meldrum, E., Falconer, K. A., Kwon, Y. J., & Sequist, J. M. (2000).
The development of reasoning skills in college biology: Do two levels of general hypothesis-
testing skills exist? Journal of Research in Science Teaching, 30(10), 1327–1348.
Leiserowitz, A., Maibach, E., Roser-Renouf, C., & Smith, N. (2011). Global warming’s six Americas,
May 2011. New Haven, CT: Yale Project on Climate Change Communication/Yale University
and George Mason University.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: SAGE.
Liu, S., & Roehrig, G. (2017). Exploring in-service teachers’ argumentation and personal epistemology about climate change. Research in Science Education. Advance online publication. doi:10.1007/s11165-017-9617-3
Lombardi, D., Bickel, E., Brandt, C., & Burg, C. (2017). Categorising students’ evaluations of evi-
dence and explanations about climate change. International Journal of Global Warming, 12(3/
4), 313–330.
Lombardi, D., Brandt, C., Bickel, E., & Burg, C. (2016). Students’ evaluations about climate change.
International Journal of Science Education, 38(8), 1392–1414.
Lombardi, D., Danielson, R., & Young, N. (2016). A plausible connection: Models examining the
relations between evaluation, plausibility, and the refutation text effect. Learning and
Instruction, 44, 74–86.
Lombardi, D., & Sinatra, G. (2012). College students’ perceptions about the plausibility of human-
induced climate change. Research in Science Education, 42, 201–217.
Lombardi, D., & Sinatra, G. (2013). Emotions about teaching about human-induced climate
change. International Journal of Science Education, 35(1), 167–191.
Magliano, J., Trabasso, T., & Graesser, A. C. (1999). Strategic processing during comprehension. Journal of Educational Psychology, 91(4), 615–629.
Maibach, E., Roser-Renouf, C., & Leiserowitz, A. (2008). Communication and marketing as climate
change intervention assets: A public health perspective. American Journal of Preventive Medicine,
35(5), 488–500.
Mason, L., & Scirica, F. (2006). Prediction of students’ argumentation skills about controversial
topics by epistemological understanding. Learning and Instruction, 16, 492–509.
Mayer, D., Sodian, B., Koerber, S., & Schwippert, K. (2014). Scientific reasoning in elementary
school children: Assessment and relations with cognitive abilities. Learning and Instruction,
29, 43–55.
McNeill, K., Lizotte, D., Krajcik, J., & Marx, R. (2006). Supporting students’ construction of scien-
tific explanations by fading scaffolds in instructional materials. Journal of the Learning Sciences,
15(2), 153–191.
Mercier, H., Boudry, M., Paglieri, F., & Trouche, E. (2017). Natural-born arguers: Teaching how to
make the best of our reasoning abilities. Educational Psychologist, 52(1), 1–16.
Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative
theory. Behavioral and Brain Sciences, 34, 57–74.
National Research Council. (1996). National science education standards. Washington, DC:
National Academy Press.
National Research Council. (2001). Educating teachers of science, mathematics, and technology.
Washington, DC: National Academies Press.
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting
concepts, and core ideas. Committee on a conceptual framework for new K-12 science education
standards. Board on Science Education, Division of Behavioral and Social Sciences and
Education. Washington, DC: The National Academies Press.
NGSS Lead States. (2013). Next generation science standards: For states, by states. Washington, DC:
The National Academies Press.
Nicolaidou, I., Kyza, E., Terzian, D., Hadjichambis, A., & Kafouris, D. (2011). A framework for
scaffolding students’ assessment of the credibility of evidence. Journal of Research in Science
Teaching, 48(7), 711–744.
Osborne, J. (2010). Arguing to learn in science: The role of collaborative, critical discourse. Science,
328(5977), 463–466.
Pluta, W. J., Buckland, L. A., Chinn, C. A., Duncan, R. G., & Duschl, R. A. (2008). Learning to evaluate scientific models. In G. Kanselaar, J. van Merriënboer, P. Kirschner, & T. de Jong (Eds.), International perspectives in the learning sciences: Creating a learning world. Proceedings of the eighth international conference for the learning sciences (pp. 411–412). Utrecht.
Sadler, T. (2004). Informal reasoning regarding socioscientific issues: A critical review of research.
Journal of Research in Science Teaching, 41(5), 513–536.
Sadler, T., Barab, S., & Scott, B. (2007). What do students gain by engaging in socioscientific inquiry? Research in Science Education, 37, 371–391.
Sadler, T., & Donnelly, L. (2006). Socioscientific argumentation: The effects of content knowledge
and morality. International Journal of Science Education, 28(12), 1463–1488.
Sadler, T., & Zeidler, D. (2005). Patterns of informal reasoning in the context of socioscientific
decision making. Journal of Research in Science Teaching, 42, 112–138.
Sampson, V., & Clark, D. (2008). Assessment of the ways students generate arguments in science
education: Current perspectives and recommendations for future directions. Science Education,
92(3), 447–472.
Sanchez, C. A., Wiley, J., & Goldman, S. R. (2006). Teaching students to evaluate source reliability
during internet research tasks. In S. A. Barab, K. E. Hay, & D. T. Hickey (Eds.), Proceedings of the
seventh international conference on the learning sciences (pp. 662–666). Mahwah, NJ: Erlbaum.
Sandoval, W., & Millwood, K. (2005). The quality of students’ use of evidence in written scientific
explanations. Cognition and Instruction, 23(1), 23–55.
Schauble, L. (1996). The development of scientific reasoning in knowledge-rich contexts.
Developmental Psychology, 32(1), 102–119.
Sinatra, G., Kienhues, D., & Hofer, B. K. (2014). Addressing challenges to public understanding of
science: Epistemic cognition, motivated reasoning, and conceptual change. Educational
Psychologist, 49(2), 123–138.
Strauss, A. L., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and
techniques. Newbury Park, CA: Sage.
Strauss, A. L., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.
Tang, K. (2016). Constructing scientific explanations through premise-reasoning-outcome (PRO): An exploratory study to scaffold students in structuring written explanations. International Journal of Science Education, 38(9), 1415–1440.
Topcu, M., Sadler, T., & Yilmaz-Tuzun, O. (2010). Preservice science teachers’ informal reasoning
about socioscientific issues: The influence of issue context. International Journal of Science
Education, 32(18), 2475–2495.
van der Graaf, J., Segers, E., & Verhoeven, L. (2016). Scientific reasoning in kindergarten: Cognitive
factors in experimentation and evidence evaluation. Learning and Individual Differences, 49,
190–200.
Walker, J., & Sampson, V. (2013). Learning to argue and argue to learn: Argument-driven inquiry
as a way to help undergraduate chemistry students learn how to construct arguments and engage
in argumentation during a laboratory course. Journal of Research in Science Teaching, 50(5),
561–596.
Wason, P., & Johnson-Laird, P. (1972). Psychology of reasoning: Structure and content. Cambridge,
MA: Harvard University Press.
Watters, J., & English, L. (1995). Children’s application of simultaneous and successive processing
in inductive and deductive reasoning problems: Implications for developing scientific reasoning
skills. Journal of Research in Science Teaching, 32(7), 699–714.
Weiss, I. R., Pasley, J. D., Smith, P. S., Banilower, E. R., & Heck, D. J. (2003). Looking inside the classroom: A study of K-12 mathematics and science education in the United States. Chapel Hill, NC: Horizon Research.
Wu, H. K., & Hsieh, C. E. (2006). Developing sixth graders’ inquiry skills to construct explanations in inquiry-based learning environments. International Journal of Science Education, 28(11), 1289–1313.
Wu, Y., & Tsai, C.-C. (2007). High school students’ informal reasoning on a socio-scientific issue:
Qualitative and quantitative analyses. International Journal of Science Education, 29(9), 1163–1187.
Wu, Y., & Tsai, C.-C. (2011). High school students’ informal reasoning regarding a socio-scientific
issue, with relation to scientific epistemological beliefs and cognitive structures. International
Journal of Science Education, 33(3), 371–400.
Yang, F. Y., & Anderson, O. R. (2003). Senior high school students’ preference and reasoning modes
about nuclear energy use. International Journal of Science Education, 25, 689–725.
Yang, F. Y., Chang, C. Y., & Hsu, Y. S. (2008). Teacher views about the constructivist instruction
and personal epistemology – a national study in Taiwan. Educational Studies, 34, 527–542.
Yang, F. Y., & Tsai, C.-C. (2010). Reasoning about science-related uncertain issues and epistemo-
logical perspectives among children. Instructional Science, 38, 325–354.
Zeidler, D., Sadler, T., Simmons, M., & Howes, E. (2005). Beyond STS: A research-based framework
for socioscientific issues education. Science Education, 89, 357–377.
Zeineddin, A., & Abd-El-Khalick, F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, 47(9), 1064–1093.
Zimmerman, C. (2005). The development of scientific reasoning skills: What psychologists contribute
to an understanding of elementary science learning. Final draft of a report to the National
Research Council Committee on Student Learning Kindergarten through Eighth Grade.
Washington, DC: National Research Council.
Appendices
Appendix 1. Excerpt from the reading document
Global climate change: Human induced or natural changes?
There have been heated debates about what causes global climate change. Many people argue that climate change is mainly human induced, as the change has become especially significant since the Industrial Revolution. They consider the rising temperature to be due to human activities, as are other aspects such as the rising sea level and extreme weather events.
However, others consider the current climate change to be mainly natural, since climate has changed in similar patterns throughout Earth’s history. They hold that the Earth’s temperature change is a natural fluctuation, as are other aspects such as the changes in sea level and the number of extreme weather events.
The following paragraphs present the evidence-based arguments from both sides.

Temperature change and CO2 emissions


Claim 1: Temperature Change is Due to Human-Induced CO2 Emissions
The Earth’s temperature change is mainly due to the increasing human-caused CO2 emissions since the Industrial Revolution. Compared to pre-industrial values, there has been a 50% increase in CO2 concentration in the air as of 2011. At the same time, the Earth’s average surface temperature increased by 0.85°C (1.53°F) between 1880 and 2012.
Claim 2: Temperature Change is NOT Due to Human-Induced CO2 Emissions
Although there has been a human-caused increase in atmospheric CO2, it is not the main cause of the Earth’s temperature change. Earth’s atmosphere is composed of 78% nitrogen, 21% oxygen, and 1% other trace gases, and CO2 constitutes less than 1% of the trace gases. Thus, human-caused CO2 emissions only influence a tiny fraction of Earth’s atmosphere.
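As a quick check on the temperature figures quoted in Claim 1, the Fahrenheit value follows from the standard conversion for temperature differences (a worked example added here for clarity, not part of the original reading document; only the 9/5 scale factor applies to a change in temperature, since the +32 offset cancels):

\[
\Delta T_{\mathrm{F}} \;=\; \frac{9}{5}\,\Delta T_{\mathrm{C}} \;=\; 1.8 \times 0.85\,^{\circ}\mathrm{C} \;\approx\; 1.53\,^{\circ}\mathrm{F}
\]

This is consistent with the 1.53°F change reported alongside 0.85°C.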

Appendix 2. Interview questions

(1) What is your definition for greenhouse gases? What are some greenhouse gases that you
know?
(2) How would you explain the greenhouse effect?
(3) How much would you say that you know about global climate change, compared to the
average person?
(4) Which piece of evidence seems more plausible? Why?
(5) What are the strengths of each piece of evidence?
(6) What are the weaknesses of each piece of evidence?
(7) How well do you think the evidence supports the claim?
(8) Do you think the evidence is explained sufficiently? If not, how would you explain it
differently?
(9) Is there any other information you may use as evidence? If so, what is it? How does it support
one or both of the arguments?
(10) In general, which argument about the cause of global climate change do you think is stronger? Why?
(11) Do you think only one side of the argument is correct or is it possible that both are correct?
Why?
(12) How are these arguments possible if people have access to the same set of data and use them to
derive their conclusions?
(13) Do you think people who advocate either argument may change their views on global climate
change? Please explain your answer.
