
A Psychometric Approach to the Development of a 5E Lesson Plan Scoring Instrument for Inquiry-Based Teaching
M. Jenice Goldston · John Dantzler · Jeanelle Day · Brenda Webb

Published online: 25 December 2012
© The Association for Science Teacher Education, USA 2012

J Sci Teacher Educ (2013) 24:527–551. DOI 10.1007/s10972-012-9327-7
Abstract  This research centers on the psychometric examination of the structure of an instrument, known as the 5E Lesson Plan (5E ILPv2) rubric, for inquiry-based teaching. The instrument is intended to measure an individual's skill in developing written 5E lesson plans for inquiry teaching. In stage one of the instrument's development, an exploratory factor analysis of a fifteen-item 5E ILP instrument revealed only three factor loadings instead of the expected five factors, which led to its subsequent revision. Modifications of the original instrument led to a revised 5E ILPv2 instrument comprised of twenty-one items. This instrument, like its precursor, has a scoring scale that ranges from zero to four points per item. Content validity of the 5E ILPv2 was determined through the expertise of a panel of science educators. Over the course of five semesters, three elementary science methods instructors at three different universities collected post-course lesson plan data from 224 pre-service teachers enrolled in their courses. Each instructor scored their students' post 5E inquiry lesson plans using the 5E ILPv2 instrument, recording a score for each item on the instrument.
A factor analysis with maximum likelihood extraction and promax oblique rotation provided evidence of construct validity for five factors and explained 85.5 % of the variability in the total instrument. All items loaded with their theoretical factors, exhibiting high ordinal alpha reliability estimates of .94, .99, .96, .97, and .95 for the engage, explore, explain, elaborate, and evaluate subscales respectively. The total instrument reliability estimate was 0.98, indicating strong evidence of total scale reliability.
Keywords  Assessment · Inquiry-based teaching · 5E lesson planning

M. J. Goldston (corresponding author)
The University of Alabama, 204 Graves Hall, Tuscaloosa, AL 35405, USA
e-mail: dgoldsto@bamaed.ua.edu

J. Dantzler
The University of Alabama, Carmichael Hall, Tuscaloosa, AL 35405, USA
e-mail: Jdantzler@bamaed.ua.edu

J. Day
Eastern Connecticut State University, 83 Windham Str., Rm 144 Webb Hall, Willimantic, CT 06226, USA
e-mail: dayj@easternct.edu

B. Webb
University of North Alabama, Florence, AL, USA
e-mail: bwebb@una.edu
Background
Today, evaluation is a predominant feature woven within the fabric of science and mathematics education in the United States. In fact, the importance placed on evaluating student achievement in science and mathematics reaches a global scale with the testing of U.S. students in the fourth and eighth grades as part of the Trends in International Mathematics and Science Study (TIMSS). With the TIMSS, students are tested across the globe in science and mathematics, and participating nations are ranked based on their students' test scores. On a national level, every four to five years, U.S. students are tested in the disciplines, and their scores are reported in the Nation's Report Card for the fourth-, eighth-, and twelfth-grade levels (NAEP 2010a, b). Furthermore, every spring across the United States, evaluation is ubiquitous with state-mandated, standardized testing for all students. For K-12 teachers, the impact of testing has become more pronounced with the reauthorization of the Elementary and Secondary Education Act of 1965, known today as No Child Left Behind (NCLB) (2002). As a result of NCLB, standardized test scores have come to be viewed by many as equivalent to a student's success and as the single measure for determining successful schools and the teachers working therein. Shifting from these broad perspectives on testing and evaluation to peer into a K-12 science teacher's classroom in a local setting, one will find evaluation again revealing itself in many forms. Teachers may use many forms of evaluation as a mechanism for meeting local standards and classroom objectives that measure students' learning of science content and skill. No matter its purpose, or whether it is conducted locally or globally, evaluation as part of accountability is deeply embedded within the fabric of the United States educational system, where student outcomes are made public and the eyes of society are constantly viewing and critiquing the results.
Teacher preparation programs and associated faculty, much like their K-12 public school counterparts, are also held accountable for student performance. For instance, in some states, Colleges of Education and the professoriate who teach pre-service methods courses are accountable for the performance of their graduates for up to 2 years after graduation and certification from their teacher preparation programs. In other words, if a graduate of a teacher preparation program is unsuccessful as a teacher hired by a school district in the first 2 years of his or her career, the professors of the College of Education program can be called, free of charge, to remediate the recent graduate if requested to do so by a public school administrator.
Today, as never before, accountability and an emphasis on high-quality science teaching are paramount at all levels of teacher preparation. According to the Nation's Report Card on Hands-On and Interactive Computer Tasks Assessment from the 2009 Science Assessment (NAEP 2010a, b), the majority of students were able to make observations of data, but were unable to make decisions about the appropriate data to collect in investigations, and even fewer students could select correct conclusions and explain results. Inquiry-based teaching approaches, if implemented properly, can afford teachers opportunities to lead students through exploratory activities that address content and practices across STEM fields. Science methods courses are designed to prepare pre-service teachers in using inquiry-based teaching approaches that foster K-12 student learning of science concepts, as well as practices of the STEM fields, as advocated in documents such as the National Science Education Standards (NRC 1996), Benchmarks for Science Literacy (AAAS 1993), and Blueprints for Reform (AAAS 1998). With the publication of A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (NRC 2011), the forerunner to the Next Generation Science Standards (NGSS) (Achieve 2012), there is a continued and clear need for classroom inquiry pedagogies that foster student learning of both content and science and engineering practices. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas identifies eight practices in science and engineering that are essential for classroom curriculum. These include the following: (a) asking questions (science) and defining problems (engineering), (b) developing and using models, (c) planning and carrying out investigations, (d) analyzing and interpreting data, (e) using mathematics and computational thinking, (f) constructing explanations (science) and designing solutions (engineering), (g) engaging in argument from evidence, and (h) obtaining, evaluating, and communicating information (2012, p. 49).
Though some of these practices differ between science and engineering, addressing both provides students with a way of understanding how scientists and engineers work. Despite a shift away from the use of the term inquiry within A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (NRC 2011) and the Next Generation Science Standards (Achieve 2012), many of the scientific practices advocated are not new and can be seen in the following NSES description of student inquiry as a

   multifaceted activity that involves making observations; posing questions; examining books and other sources of information to see what is already known; planning investigations; reviewing what is already known in light of experimental evidence; using tools to gather, analyze, and interpret data; proposing answers, explanations, and predictions; and communicating the results. Inquiry requires identification of assumptions, use of critical and logical thinking, and consideration of alternative explanations. (NRC 1996, p. 23)

Along these same lines, Settlage et al. sum it up by stating that inquiry is "the process students go through to encounter the evidence that serves as the source of scientific ideas" (2008, p. 179).
Given the emphasis of the NGSS that students acquire knowledge and skills of scientific and engineering practices, it is even more important that preservice teachers are competent in using inquiry teaching practices. It is through the use of a range of classroom inquiry pedagogies that students acquire knowledge of and practice such skills. Inquiry and the National Science Education Standards (NRC 2000) describes scientific practices as a part of student inquiry and as a focal point for building classroom inquiry strategies, as seen in the Essential Features of Classroom Inquiry and Their Variations. These essential features include the following: (a) the learner's engagement in scientifically oriented questions, (b) priority of evidence in response to questions, (c) formulation of explanations from evidence, (d) explanations connected to scientific knowledge, and (e) communication and justification of explanations (NRC 2000, p. 29). Though these features are but a framework for inquiry teaching, they offer varying degrees of engagement for students to gain knowledge and skill with scientific practices. The Essential Features of Classroom Inquiry clearly represent important scientific, as well as engineering, practices that all students should acquire as part of the K-12 school experience.

For elementary and secondary science methods courses, teaching science using inquiry-based pedagogies, with their many permutations, is a central premise around which the other components of the methods course connect. According to Marek et al. (2003), it is classroom inquiry-based pedagogy that links all the components of science methods courses. Thus, classroom inquiry as the centerpiece of science methods courses leads to the focus of this study: the development of an assessment instrument that provides science instructors a tool for assessing and evaluating pre-service teachers' skills in developing inquiry-based lesson plans using a 5E instructional model.
Inquiry in Science Teaching
Despite decades of science reform with focused endeavors advocating the use of inquiry as a pedagogical practice in the science classroom, it is still not a common teaching approach in elementary or secondary science classrooms today (Weiss 2006; Weiss et al. 2003). Research findings suggest several rationales that K-12 teachers give for not using inquiry teaching approaches. In general, the reasons include the following: (a) managing inquiry is difficult, (b) inquiry takes too much time, (c) inquiry is for advanced students, (d) inquiry does not provide the information students need for the next grade level, (e) a lack of confidence in responding to student questions due to a lack of content knowledge, and (f) pressure to teach other subjects (Hodson 1988; Welch et al. 1981; Pomperoy 1993; Slotta 2004; Sunal and Wright 2006; Appleton 2008). Further confounding the reasons teachers give for not utilizing inquiry teaching approaches in their science classes is the term inquiry itself. The term inquiry, used without care, can be confusing because it often refers to both (1) teaching approaches and (2) what students do (Colburn 2008). In both elementary and secondary science teacher preparation, recognizing the distinction is important. As noted in A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, with knowledge of the progression of classroom inquiry practices, preservice teachers will be able to guide their students "through careful and systematic investigations" (NRC 2011, p. 61) appropriate for each grade level.
As such, in science methods courses, inquiry-based teaching approaches are viewed on a continuum that shifts from predominantly teacher-centered, through various forms of guided inquiry, to open inquiry that is primarily student-centered (Olson and Loucks-Horsley 2000; NRC 2000). Eick et al. advocate using the essential features of classroom inquiry as a scaffold for designing inquiry-based teaching that engages students in scientific phenomena "through direct observation, data gathering and analysis of evidence" (2005, p. 49). Furthermore, Eick et al. (2005) suggest teachers should use the essential features as a guide for scaffolding the learning of science based upon students' needs and skills. For instance, in the early grades, students may need a great deal of direction and structure when learning scientific concepts and practices associated with the essential features; thus, teacher-directed inquiry-based teaching is generally appropriate. As students develop knowledge and skill with scientific practices for conducting investigations and experimentation, the choice of inquiry-based pedagogies may shift to a guided approach whereby students have more decision-making opportunities, such as choosing the question to explore or giving priority to evidence by deciding what data are important and what data will be collected. Using the essential features of classroom inquiry, teachers can design inquiry-based lessons that not only support students' knowledge of concepts, but also the development of scientific practices, until students can conduct investigations independently.
No matter where on the continuum inquiry-based instruction falls, an instructional model that has been viewed as successful for inquiry teaching since its inception is the learning cycle (Atkins and Karplus 1962; Marek and Cavallo 1997; Blank 2000). In their early paper, Atkins and Karplus (1962) did not identify a phase as exploration or use the term learning cycle; however, it is evident that the phase was present in the invention and discovery stages of their lesson structure. The term learning cycle actually first appeared in the Science Curriculum Improvement Study teachers' guides in the 1970s; however, the phases of the learning cycle had been discussed in previous publications (Karplus and Thier 1967; Jacobson and Kondo 1968; Barman and Shedd 1992). Though the learning cycle phases have undergone an evolution of names, today the model is commonly recognized by three phases known as Explore, Introduction of Concepts, and Application of Concepts. The learning cycle begins with the exploration phase, which provides students with an activity that gives them experiences for constructing science concepts and skills. Next, using students' data or ideas gleaned from their activities, the teacher involves students in an interactive discussion introducing them to appropriate concepts and vocabulary, connecting the exploration to the second phase, introduction of concepts. Last, during the application of concepts phase, students are challenged to apply the newly acquired concepts in a new situation, connecting it to the previous phase. The three-stage learning cycle approach draws upon Dewey's reflective thinking, Piaget's theory of cognitive development, and social constructivism. Thus, the learning cycle, underpinned by a constructivist stance, fosters climates where students question and are actively immersed in learning while they construct meaning from experience and social interaction involving questioning and rich discussion; this aligns well with inquiry-based teaching as discussed with the essential features. The learning cycle approach stands in stark contrast to the traditional image of students as passive receivers of facts and concepts derived from a teacher's lecture.
Findings associated with the learning cycle uncover a multitude of studies that address various aspects of its effectiveness from different research perspectives. Some studies have been conducted to ascertain the level of the learning cycle's success in science teaching (Karplus 1979; Karplus and Thier 1967; Lawson 1995; Settlage 2000; Odom and Kelly 2001). Other areas of research examine the learning cycle and student learning outcomes (Jinkins 2002; Cavallo and Laubach 2001; Odom and Kelly 2001; Dwyer and Lopez 2001; Munsheno and Lawson 1999; Lovoie 1999; Barman 1993). Furthermore, many studies describe teacher activities and actions associated with using the learning cycle (Jinkins 2002; Settlage 2000; Barman 1992; Glasson and Lilik 1993; Odom and Settlage 1996; Marek and Methven 1992; Barman and Shedd 1992; Lawson et al. 1989; Marek et al. 1990). Within this last category, some research findings emphasize that understanding the learning cycle and developing lessons with it are difficult for teachers (Settlage 2000; Odom and Settlage 1996), while other studies suggest that teachers demonstrate a wide array of understandings of the learning cycle (Atkins and Karplus 1962; Karplus et al. 1975; Marek et al. 2008). Despite the contrasting findings, the learning cycle continues to be supported and utilized as an effective inquiry-based approach in science methods teaching.
For this research, the Five E (5E) instructional model, a modification of the learning cycle, has been used for inquiry-based science teaching (Trowbridge and Bybee 1996; Bybee 1997; Bybee et al. 2006). The 5E model consists of five phases, each beginning with the letter e, rather than the three phases used in the learning cycle. The five phases of the 5E model are engage, explore, explain, elaborate, and evaluate. Examining the 5E and Learning Cycle models reveals that the 5E phases align with the Learning Cycle as follows: Exploration (5E-Engage and Explore), Concept Introduction (5E-Explain), and Application of Concepts (5E-Elaborate and Evaluate).
5E Instructional Model
This section describes each of the phases of the 5E instructional model used in inquiry-based teaching. The 5E model's first phase, engage, is one whereby a teacher utilizes strategies that ascertain students' prior understandings of the science concepts to be taught, encourages students' questions, and generates students' interest in the activities that follow. During the second phase, explore, the teacher facilitates students actively working together with other students in a hands-on, minds-on activity. Also during the explore phase, a teacher gives directions, responds to students, and encourages students to find answers on their own. The explain phase begins when a teacher starts questioning students and encouraging them to explain their ideas about the concepts based upon the evidence from their activity. It is during this phase that concepts are given labels and terms are defined and discussed with the learners. The elaborate phase is one in which students are given new opportunities to use or apply their newly acquired skills or concepts in everyday situations. In the evaluate phase, a summative evaluation is created to match the stated objectives of the inquiry lesson and includes a rubric with appropriate criteria as needed.
Essential Features of Classroom Inquiry and the 5E Inquiry Model
The Essential Features of Classroom Inquiry (NRC 2000) is a useful guide for inquiry-based lesson planning. Using the essential features, 5E lessons can address content as well as scientific and engineering practices (NRC 2011), whether the approach used is directed, guided, or open inquiry. The following brief example describes how the 5E instructional approach may integrate the essential features that foster development of scientific and engineering practices. For instance, though the engage stage of the 5E approach is designed to evoke students' prior knowledge and/or questions about a concept or topic, the engage stage can also be used to have students generate questions for science investigations or problems about an engineering design, depending on the lesson's objectives. Therefore, the essential feature "the learner engages in scientifically oriented questions" (or an engineering problem) may occur in the 5E engage stage. However, depending on the teacher's intent, students' questions or design problems could instead occur at the beginning of the explore stage of the 5E approach, prior to student investigation. It is during the explore stage of the 5E approach that one may find the essential features "the learner formulates explanations from evidence" and "the learner gives priority to evidence in response to a question" addressed as part of the learner's investigations. The next stage, explain, often integrates the essential features "the learner connects explanations to scientific knowledge" and "the learner communicates and justifies explanations," with teachers facilitating learner discourse. During the 5E explain stage, the instructor may (a) explain the data findings, as in directed approaches; (b) facilitate learner explanations gleaned during investigations and readings through questioning, as in guided approaches; or (c) hold students responsible for providing explanations and evidence, as with full or open inquiry. Depending on the activity used in the elaborate stage, the essential features might be "the learner gives priority to evidence in response to a question" and/or "the learner formulates explanations from evidence," allowing students to apply what they have learned. Last, based on the lesson's objectives, the evaluate stage might include an assessment whereby "the learner communicates and justifies explanations." So, depending on the teacher's objectives, the 5E instructional model used for inquiry teaching has the flexibility to incorporate content as well as scientific or engineering practices into a range of lessons that span directed, guided, or full inquiry for student investigation or design.
This study utilizes the 5E inquiry model that the three researchers have used for over 10 years while teaching elementary science methods courses. The researchers use the 5E instructional model instead of the Learning Cycle, finding the additional engage and evaluate phases useful in scaffolding the development of pre-service science teachers' skills in writing inquiry-based 5E lesson plans. Furthermore, the engage stage of the 5E instructional model supports active mental processing by evoking students' prior knowledge, which can be a powerful influence on learning subject content, and the evaluate stage supports pre-service teachers' development of skills in gathering and documenting student achievement and growth. Crafting effective evaluations and understanding their varied uses is a critical skill for science teaching professionals, given the state and federal policy demands for school and teacher accountability.
Purpose
Thus, the purpose of this research is to describe the redesign and psychometric examination of the 5E Lesson Plan rubric (5E ILPv2) for inquiry teaching. The 5E ILPv2 instrument was developed for use in assessing a pre-service teacher's ability to create inquiry-based 5E lesson plans (see Appendix). An extensive literature search for instrumentation relevant to planning inquiry lessons revealed little. The search did reveal an inquiry-based science teaching rubric, STIR, for observing inquiry-based science teaching (Bodzin and Beerer 2003; Beerer and Bodzin 2004); assessments that determine teachers' knowledge of inquiry process skills; instruments for determining understandings about the nature of science (Lederman et al. 1998; Ackerson et al. 2000); and instruments for examining teachers' understandings of the learning cycle (Odom and Settlage 1996; Marek 2008). However, we found no inquiry-based instrument designed for assessing a teacher's ability to write an inquiry-based 5E lesson plan. As such, the initial development of a 5E lesson plan rubric (Goldston et al. 2009) and its revised form, the 5E ILPv2, for inquiry-based teaching is the focus of this paper. The instrument was developed by the researchers with a threefold purpose: instructors needed (a) to assess students' 5E lesson plans in equitable ways with a validated instrument, (b) to examine a student's inquiry-based 5E lesson plan and provide detailed feedback aligned to specific criteria associated with each of the phases of the 5E model, and (c) to guide our teaching of the 5E instructional model to support pre-service teachers' skills in designing inquiry-based 5E lessons.
Methods
Psychometric Development of the 5E ILP: Stage One
In the pilot study, an exploratory factor analysis was conducted on the 5E Lesson Plan (5E ILP) instrument designed to assess pre-service teachers' abilities to develop inquiry-based 5E lesson plans. The initial 5E ILP instrument incorporated a Likert-type scale of 0–4 points per item, with a total of sixty points. The entire instrument included 15 items, with the 12 items associated with the phases of the 5E model used in the analysis. The instrument included one item for the engage phase, three items for the explore phase, three items for the explain phase, two items for the elaborate phase, and three items for the evaluate phase (Goldston et al. 2009). Using 66 pre-service teachers' post-course lesson plan data, a factor analysis using maximum likelihood extraction and varimax orthogonal rotation was conducted on the items of the 5E ILP instrument to establish evidence of construct validity. Despite showing strong evidence of validity and reliability, the findings revealed only three of the five distinct factors corresponding with the five E stages. The three factors identified were explore, engage/explain/elaborate, and evaluate, which together accounted for 75.98 % of the rubric's total variability. These findings led to re-examining and expanding the number of items for all the theoretical factors to strengthen the instrument, resulting in the 5E ILPv2.
Psychometric Development of the 5E ILPv2: Stage Two
In stage two of the instrument's development, the research methodologist and science researchers met and identified nine items requiring revisions for incorporation into the 5E ILPv2. These nine additions resulted in each of the five phases being comprised of three to six items, for a total of 21 items. The 5E ILPv2 is a Likert-type instrument with a range of 0–4 points per item and a total of 84 points. Analysis of the original 5E ILP instrument revealed that individual items contained multiple elements that should be separated into individual items within a 5E phase. As a result, the additional items incorporated into the 5E ILPv2 were not newly constructed items, but were separated out of individual items with multiple elements found in the original 5E ILP rubric. Following the revisions, the 5E ILPv2 instrument's engage subscale has four items that address students' prior knowledge, motivation, student discussion, and the transition into the explore phase. The four items of the explore subscale target teacher instruction, hands-on/minds-on activity, student-centered activity, and evidence of student learning. The explain subscale is comprised of six items that focus on fostering student discussion by means of questions associated with the explore activity, the use of divergent/convergent questions, an explanation of the concept and appropriate terminology, and the use of a variety of approaches to develop concepts. The elaborate subscale includes three items aimed at providing students opportunities to apply their knowledge in new situations with real-life connections. Lastly, four evaluate subscale items are directed toward the objectives and their alignment to the evaluation questions or task, the appropriateness of the task for the concepts or skills, and the quality of rubric features and criteria. Four additional items commonly found in lesson plans (objectives, standards, materials, safety) were included in the instrument; however, only the items directly related to the 5E inquiry phases were used in the instrument analysis.
Content validity of the 5E ILP instrument was assessed by a committee of five science educators who have used the 5E inquiry instructional model for over 10 years. The committee's task was to examine the instrument, determine whether it aligned with the 5E instructional model, and determine whether the scoring criteria were clear and commonly understood by educators. Because the revisions for the 5E ILPv2 involved pulling out single elements from those listed within the items of the original instrument, there were no substantive content changes in the 5E ILPv2 instrument, so content validity was not re-confirmed.
Inter-rater Reliability
To establish inter-rater reliability, two of the science education researchers met three times to develop consistency in scoring on the same set of ten lesson plans. When scoring differences on items occurred, the researchers discussed the items against the key criteria used to delineate between the various scoring levels. For instance, for the first explore item, a score of zero was given when teacher instructions were not presented in the lesson plan. A score of two was given if teacher instructions were present, developmentally appropriate, clear, and understandable but were missing some important details. A score of three was given if teacher instructions were present, developmentally appropriate, clear, and understandable with minor detail omissions. The high score of four was given if teacher instructions were detailed, clear, and developmentally appropriate with nothing missing.

A third science education researcher joined the team and met with the others to score the same ten lessons to develop consistency using the rubric. After all three science education researchers scored the practice lesson plans, they discussed the rubric's criteria. Following this, each researcher, who also taught an elementary science methods course, independently scored the same set of twenty lesson plans using the 5E ILPv2 rubric. An intraclass correlation coefficient was computed to determine inter-rater reliability among the three researchers using their scores for the set of twenty lesson plans. The intraclass correlation for all raters was .84, with a range of .79 to .88 for pairs of raters, indicating high inter-rater reliability.
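For readers who want to run this kind of check on their own rubric data, the intraclass correlation can be computed with standard statistical software. Below is a minimal sketch in Python using the pingouin package; the data frame layout, column names, and example scores are hypothetical, and the paper does not state which ICC form was used (ICC2, treating raters as a random sample, would be a common choice for this design).

```python
# Minimal sketch: inter-rater reliability via intraclass correlation for
# rubric scores in long format. Column names and example scores are
# hypothetical, not the study's data.
import pandas as pd
import pingouin as pg

scores = pd.DataFrame({
    "lesson": [1, 1, 1, 2, 2, 2, 3, 3, 3],           # the study used 20 plans
    "rater":  ["A", "B", "C"] * 3,                   # three methods instructors
    "total":  [62, 60, 64, 71, 70, 69, 55, 58, 54],  # 5E ILPv2 totals (0-84)
})

icc = pg.intraclass_corr(data=scores, targets="lesson",
                         raters="rater", ratings="total")
print(icc[["Type", "ICC", "CI95%"]])
```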
Sample Population
Data for analyzing the revised 5E ILPv2 were collected from undergraduate pre-service teachers enrolled in elementary science methods courses at three different universities. The participants came from one large university with approximately 30,000 students and two Master's-granting state universities with enrollments of about 5,000 students each. One university was located in the northeast and two were located in the southeastern United States. The preservice teachers in the sample were undergraduates in education programs and in their last semester prior to their internship. The pre-service teachers enrolled in the science methods courses were completing coursework to acquire K-6 teaching certification. Data from 224 pre-service teachers were collected from post-course lesson plans assigned as part of the elementary methods courses over five consecutive semesters. In nearly all cases, the science methods course was the students' first introduction to the 5E instructional model. During the science methods courses, the pre-service teachers participated in inquiry-based 5E lessons and activities modeled by the instructors, discussed the 5E model and its phases, and learned key features of the 5E inquiry lesson plan throughout the course. Three researchers, also science educators, taught the elementary science methods courses and were responsible for scoring the lessons of their respective students. As part of their science methods courses, pre-service teachers were asked to develop and write three inquiry-based 5E science lesson plans for teaching elementary students. A fourth researcher, an educational research methodologist, guided the development and the analysis of the instrument.
Results
Analysis of the 5E ILPv2 Instrument
Utilizing the 5E ILPv2 instrument, 224 pre-service teachers' 5E inquiry-based post-course lesson plans were scored item by item, and the scores underwent psychometric analysis. Three science education researchers combined efforts to collect and score the post-course lesson plans for this study from multiple classes of elementary pre-service teachers. One science educator provided the majority of the post-course lesson plan data, with 67 % of the total sample. The other two science educators provided approximately equal amounts of data, with 16.0 and 17.0 % respectively.

Analysis of the post-course lesson plan data using the 5E ILPv2 instrument reveals that mean scores for the items ranged from 2.68 to 3.30. More specifically, the engage items range from 3.06 to 3.30; the explore items from 2.81 to 3.00; the explain items from 2.75 to 3.09; the elaborate items from 2.79 to 2.82; and the evaluate items from 2.68 to 3.02. Table 1 details the descriptive statistics for each item in the 5E ILPv2.
Table 1  5E ILPv2 rubric item descriptive statistics (n = 224)

Item         Range   M      SE     S      Skew(a)   Kurtosis(b)
Engage 1     1-4     3.30   .057   0.85   -0.89     -0.31
Engage 2     1-4     3.11   .057   0.85   -0.53     -0.66
Engage 3     0-4     3.27   .060   0.90   -1.12      0.70
Engage 4     0-4     3.06   .076   1.13   -1.09      0.45
Explore 1    0-4     3.00   .091   1.36   -1.23      0.24
Explore 2    0-4     2.92   .089   1.33   -1.18      0.25
Explore 3    0-4     2.90   .089   1.33   -1.08      0.03
Explore 4    0-4     2.81   .094   1.40   -0.88     -0.50
Explain 1    1-4     3.09   .064   0.96   -0.55     -0.99
Explain 2    0-4     2.78   .083   1.25   -0.91     -0.01
Explain 3    0-4     2.75   .080   1.19   -0.94      0.14
Explain 4    0-4     2.86   .082   1.23   -0.84     -0.28
Explain 5    1-4     3.02   .057   0.86   -0.34     -0.91
Explain 6    0-4     2.97   .067   1.00   -0.70     -0.22
Elaborate 1  0-4     2.82   .088   1.32   -0.87     -0.39
Elaborate 2  0-4     2.79   .087   1.30   -0.96     -0.14
Elaborate 3  0-4     2.80   .077   1.16   -0.73     -0.23
Evaluate 1   0-4     3.02   .074   1.11   -0.93      0.01
Evaluate 2   0-4     2.87   .078   1.17   -0.96      0.23
Evaluate 3   0-4     2.68   .084   1.25   -0.78     -0.35
Evaluate 4   0-4     2.73   .084   1.25   -0.84     -0.23

(a) SE Skew = 0.163
(b) SE Kurtosis = 0.324
[Fig. 1  Scree plot of the 5E ILPv2]
Table 2  Results of parallel analysis

Root   Raw data(a)   Random data
                     Means    95th percentile
1      13.42*        0.69     0.79
2       1.44*        0.59     0.67
3       0.87*        0.50     0.57
4       0.77*        0.43     0.50
5       0.58*        0.37     0.42
6       0.26         0.31     0.37
7       0.08         0.25     0.31
8       0.07         0.20     0.24
9       0.05         0.15     0.18
10      0.02         0.10     0.14
11      0.01         0.05     0.09
12     -0.00         0.01     0.04
13     -0.01        -0.03     0.00
14     -0.03        -0.07    -0.03
15     -0.04        -0.11    -0.08
16     -0.05        -0.15    -0.12
17     -0.06        -0.19    -0.15
18     -0.06        -0.23    -0.20
19     -0.08        -0.27    -0.24
20     -0.09        -0.32    -0.28
21     -0.11        -0.36    -0.33

(a) Eigenvalues based on adjusted correlation matrices with squared multiple correlations (SMC) on the diagonal
* Denotes factor eigenvalues above the upper point of the 95th percentile range of eigenvalues from randomly drawn datasets
In addition to the descriptive statistics, 17 of the 21 instrument items display the full range of possible scores from zero to four. The remaining four items, Engage items one and two and Explain items one and five, display scores from one to four.
Reliability and Validity
A factor analysis using maximum likelihood extraction and promax oblique rotation was conducted using the 224 itemized post-course lesson plan rubric scores to establish evidence of construct validity for the 5E ILPv2 instrument. The sample of 224 meets Nunnally's (1978) recommendation of a ten-to-one participant-to-item ratio. In addition, a Kaiser–Meyer–Olkin measure of sampling adequacy of .95 was obtained. A value close to 1.0 indicates that patterns of correlations are compact and that the sample size is large enough to produce a satisfactory factor structure (Fields 2005; Hutcheson and Sofroniou 1999).
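As a rough illustration of this analysis pipeline, the same extraction and rotation can be run with the Python factor_analyzer package. This is a sketch under the assumption that the 224 × 21 item scores are available in a data file; the file and column names are hypothetical, and factor_analyzer works from Pearson correlations of the raw scores rather than the SPSS routines the authors used.

```python
# Minimal sketch: ML extraction with promax rotation plus a KMO check,
# mirroring the reported analysis. The CSV file and its 21 item columns
# ("engage1" ... "evaluate4") are hypothetical names.
import pandas as pd
from factor_analyzer import FactorAnalyzer, calculate_kmo

items = pd.read_csv("ilpv2_item_scores.csv")    # 224 rows x 21 item columns

kmo_per_item, kmo_overall = calculate_kmo(items)
print(f"KMO overall: {kmo_overall:.2f}")        # the paper reports .95

fa = FactorAnalyzer(n_factors=5, method="ml", rotation="promax")
fa.fit(items)

pattern = pd.DataFrame(fa.loadings_, index=items.columns)  # pattern matrix
_, _, cumulative = fa.get_factor_variance()
print(pattern.round(3))
print(f"Cumulative variance explained: {cumulative[-1]:.3f}")
```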
The scree plot method, seen in Fig. 1, was initially used to determine that five distinct factors were evident at the point of inflexion (Tabachnick and Fidell 2006). The eigenvalues for the first five factors were 13.60, 1.59, 1.04, 0.93, and 0.80 respectively. Given that the reliability of scree plot interpretations of the number of factors is low (Streiner 1998), two additional procedures for determining the optimal number of factors were implemented: a parallel analysis and Velicer's MAP test.
Table 3  Results of MAP analysis

Root   Squared    Power 4
0      0.4047     0.1859
1      0.0584*    0.0167*
2      0.0552*    0.0111*
3      0.0546*    0.0090*
4      0.0486*    0.0081*
5      0.0362*    0.0060*
6      0.0341*    0.0069
7      0.0427     0.0084
8      0.0532     0.0116
9      0.0570     0.0128
10     0.0665     0.0225
11     0.0829     0.0339
12     0.1015     0.0395
13     0.1228     0.0561
14     0.1739     0.0881
15     0.2297     0.1363
16     0.2499     0.1449
17     0.2487     0.1304
18     0.3744     0.2465
19     0.5295     0.4018
20     1.0000     1.0000
Parallel analysis, first proposed by Horn (1965) and endorsed by psychometric researchers (Zwick and Velicer 1986; O'Connor 2000; Hayton et al. 2004; Worthington and Whittaker 2006), is a statistical procedure based on generating random eigenvalues and comparing them with the eigenvalues computed from the psychometric data. Theoretically, any computed eigenvalue that is greater than the average of a large number of randomly generated eigenvalues should be considered non-trivial and, thus, representative of an actual dimension in the data. Using an SPSS procedure developed by O'Connor (2000), eigenvalues were generated for 100 randomly drawn datasets extracted through a principal axis factoring method. Principal axis factoring (PAF) was chosen over principal component factoring because PAF analyzes only the shared variance among variables. The upper point of the 95th percentile of the eigenvalues over the randomly generated datasets was lower than the computed eigenvalues based on the adjusted correlation matrix for the 5E ILPv2 data for the first five factors (Table 2), indicating the non-trivial nature of five components.
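A compact sketch of the parallel analysis logic is shown below. For brevity it compares eigenvalues of full correlation matrices, whereas O'Connor's SPSS procedure, as used in this study, works on reduced matrices with squared multiple correlations on the diagonal, so the individual numbers would differ; the function and variable names are illustrative only.

```python
# Minimal sketch of Horn's parallel analysis: retain factors whose observed
# eigenvalue exceeds the 95th percentile of eigenvalues from random data of
# the same shape. This uses full correlation matrices, not the reduced
# (SMC-adjusted) matrices of O'Connor's procedure.
import numpy as np

def parallel_analysis(data, n_draws=100, percentile=95, seed=0):
    n, p = data.shape
    observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rng = np.random.default_rng(seed)
    random_eigs = np.empty((n_draws, p))
    for i in range(n_draws):
        fake = rng.standard_normal((n, p))            # same shape as the data
        random_eigs[i] = np.sort(
            np.linalg.eigvalsh(np.corrcoef(fake, rowvar=False)))[::-1]
    threshold = np.percentile(random_eigs, percentile, axis=0)
    retained = 0
    for obs, thr in zip(observed, threshold):         # count leading factors
        if obs <= thr:
            break
        retained += 1
    return retained

# e.g., parallel_analysis(items.to_numpy()) should suggest five factors here
```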
Velicer's MAP test (Velicer 1976; Velicer et al. 2000) is a method for determining the optimal number of factors in an instrument through the examination of partial correlation matrices. A series of squared coefficients in the off-diagonals of the partial correlation matrices for successive components is computed, and the number of components with the lowest average squared partial correlation is retained as the best solution.
Table 4  Item communalities

Item         Initial   Extraction
Engage 1     .675      .626
Engage 2     .710      .729
Engage 3     .763      .819
Engage 4     .819      .790
Explore 1    .927      .931
Explore 2    .951      .971
Explore 3    .935      .942
Explore 4    .859      .843
Explain 1    .749      .715
Explain 2    .843      .876
Explain 3    .844      .888
Explain 4    .651      .633
Explain 5    .689      .656
Explain 6    .807      .771
Elaborate 1  .866      .883
Elaborate 2  .887      .948
Elaborate 3  .798      .806
Evaluate 1   .758      .720
Evaluate 2   .654      .621
Evaluate 3   .942      .963
Evaluate 4   .943      .973
Velicer et al. (2000) indicated that coefficients raised to the fourth power may yield more accurate results than squared partial correlations. A MAP SPSS procedure (O'Connor 2000) indicated that the optimal number of components based on Velicer's original MAP test was six; however, the revised MAP test, with partial correlations raised to the fourth power, indicated a five-factor solution (Table 3). The results of the parallel analysis and the MAP analysis, in conjunction with the scree plot, confirmed a five-factor solution explaining 85.5 % of the total variability within the instrument. This is a gain of 9.52 % in explained variability over the original 5E ILP (Goldston et al. 2009).
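The sketch below implements the MAP logic from the published description (Velicer 1976; Velicer et al. 2000) rather than porting O'Connor's SPSS code: principal components are partialled out one at a time, and the solution minimizing the average squared (or fourth-power) off-diagonal partial correlation is retained.

```python
# Minimal sketch of Velicer's MAP test. power=2 gives the original test,
# power=4 the revised test that indicated five factors in this study.
import numpy as np

def map_test(data, power=4):
    r = np.corrcoef(data, rowvar=False)
    p = r.shape[0]
    eigvals, eigvecs = np.linalg.eigh(r)
    order = np.argsort(eigvals)[::-1]                  # largest components first
    loadings = eigvecs[:, order] * np.sqrt(eigvals[order])
    avg = []
    for m in range(p - 1):
        # residual correlations after partialling out the first m components
        partial = r - loadings[:, :m] @ loadings[:, :m].T
        d = np.sqrt(np.diag(partial))
        partial = partial / np.outer(d, d)
        off_diagonal = partial[~np.eye(p, dtype=bool)]
        avg.append(np.mean(off_diagonal ** power))
    return int(np.argmin(avg))                         # optimal component count

# e.g., map_test(items.to_numpy(), power=2) -> 6 and power=4 -> 5,
# matching the pattern of results reported in the paper
```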
All items of the 5E ILPv2 instrument have moderate to high communality estimates (h²), indicating that they are strong measures of the underlying theoretical construct (see Table 4). The lowest communality estimate was .621, for the Evaluate 2 item, and the highest was .973, for the Evaluate 4 item. Factor loadings for the items indicated moderate to high overlap between items and their extracted factors. The highest loadings for each item were on their theoretical factors.
Pattern and structure matrices are shown in Tables 5 and 6.
Table 5  Pattern matrix

             Factor 1    Factor 2     Factor 3   Factor 4      Factor 5
             (Explore)   (Evaluate)   (Engage)   (Elaborate)   (Explain)
Explore 2     1.034       -.012        -.012       -.019        -.025
Explore 3      .986       -.038        -.023       -.041         .075
Explore 1      .885       -.006         .074        .064        -.033
Explore 4      .799        .012         .017       -.021         .147
Evaluate 4    -.027       1.079        -.045       -.052        -.032
Evaluate 3    -.003       1.061        -.042       -.038        -.047
Evaluate 2    -.054        .763         .034       -.006         .059
Evaluate 1     .220        .498        -.026        .285        -.016
Engage 3       .009       -.009        1.007       -.034        -.109
Engage 1       .071       -.097         .813       -.032         .002
Engage 2      -.023        .021         .796       -.015         .094
Engage 4       .428        .024         .512        .134        -.163
Elaborate 2   -.012       -.039        -.014       1.012         .005
Elaborate 3    .041       -.017        -.117        .959         .001
Elaborate 1   -.055       -.012         .132        .865         .032
Explain 3     -.007       -.068        -.130        .049        1.053
Explain 2      .065       -.036        -.048       -.028         .967
Explain 4      .021        .171         .159       -.050         .560
Explain 6      .092        .093         .322        .014         .449
Explain 5     -.003        .091         .334        .104         .373
Explain 1      .137        .119         .224        .158         .322

Loadings of items within each factor are bolded
Possible double-loading issues exist for the Explain five item, which also loads on the Engage factor, and the Engage four item, which also loads on the Explore factor. The Engage four item loads primarily on its theoretical construct at .512, but also loads on the Explore factor at .428 (.841 and .828, respectively, on the structure matrix). While the Explain five item loads primarily on the expected Explain factor at .373, it also loads on the Engage factor at .334 (.759 and .749, respectively, on the structure matrix). The factors are all strongly correlated with each other (Table 7), thus providing support for the oblique rotation method.
Table 6  Structure matrix

             Factor 1    Factor 2     Factor 3   Factor 4      Factor 5
             (Explore)   (Evaluate)   (Engage)   (Elaborate)   (Explain)
Explore 2     .985        .590         .769        .695         .699
Explore 3     .969        .580         .763        .684         .722
Explore 1     .963        .605         .795        .731         .710
Explore 4     .912        .601         .756        .679         .737
Evaluate 4    .543        .981         .553        .534         .599
Evaluate 3    .547        .978         .561        .545         .600
Evaluate 2    .483        .787         .513        .487         .550
Evaluate 1    .701        .781         .656        .719         .655
Engage 3      .699        .538         .901        .617         .634
Engage 2      .679        .562         .851        .620         .686
Engage 4      .828        .585         .841        .715         .649
Engage 1      .635        .441         .787        .547         .586
Elaborate 2   .688        .567         .685        .973         .665
Elaborate 1   .691        .588         .729        .936         .687
Elaborate 3   .632        .522         .597        .895         .599
Explain 3     .651        .578         .658        .642         .937
Explain 2     .690        .602         .697        .636         .934
Explain 6     .742        .660         .805        .681         .832
Explain 1     .737        .656         .766        .715         .781
Explain 4     .625        .627         .674        .574         .776
Explain 5     .666        .613         .749        .659         .759

Loadings of items within each factor are bolded
Table 7  Factor correlation matrix

           Factor 1    Factor 2     Factor 3   Factor 4      Factor 5
           (Explore)   (Evaluate)   (Engage)   (Elaborate)   (Explain)
Factor 1   1.000        .617         .794        .723         .729
Factor 2    .617       1.000         .630        .612         .667
Factor 3    .794        .630        1.000        .721         .761
Factor 4    .723        .612         .721       1.000         .697
Factor 5    .729        .667         .761        .697        1.000
Internal consistency was assessed using the ordinal alpha reliability coefficient (Zumbo et al. 2007). All five subscales derived from the factor analysis displayed strong evidence of internal consistency, with very high reliability coefficients. Ordinal alpha estimates were .94, .99, .96, .97, and .93 for the engage, explore, explain, elaborate, and evaluate subscales respectively.
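Ordinal alpha is coefficient alpha computed from a polychoric rather than Pearson correlation matrix, which is more appropriate for ordered rubric scores. As a minimal sketch, assuming the polychoric matrix for a subscale has already been estimated elsewhere (e.g., with R's psych package, since Python lacks a single standard polychoric routine), alpha follows directly; the example matrix below is hypothetical.

```python
# Minimal sketch of ordinal alpha (Zumbo et al. 2007): the alpha formula
# applied to a precomputed polychoric correlation matrix for one subscale.
import numpy as np

def ordinal_alpha(polychoric_r):
    """Alpha from a k x k polychoric correlation matrix."""
    k = polychoric_r.shape[0]
    mean_r = (polychoric_r.sum() - k) / (k * (k - 1))  # mean off-diagonal r
    return (k * mean_r) / (1 + (k - 1) * mean_r)

# hypothetical 3-item subscale with strong inter-item correlations
r = np.array([[1.00, 0.92, 0.91],
              [0.92, 1.00, 0.93],
              [0.91, 0.93, 1.00]])
print(round(ordinal_alpha(r), 2))  # 0.97, of the same order as the estimates
```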
Discussion
Psychometric analysis of the 5E ILPv2 rubric revealed an overall solid instrument for its designed purpose of assessing written 5E inquiry-based lesson plans. By design, the 5E ILPv2 rubric as a technical instrument identified key items of the 5E instructional approach, and the analysis posed some interesting findings. For one, when examining the twenty-one items for scoring ranges (0–4), there were four specific items, the engage items (1 and 2) and explain items (1 and 5), that lacked zero scores in the post-course lesson plan data. A possible interpretation is that by the end of the semester, all the students had at minimum learned to address these four items (see Appendix). The two engage items focused on ascertaining what learners know about a concept and motivating students by setting the stage for exploration. Explain item 1 was a transition item, while explain item 5 focused on the use of multiple strategies in building lesson concepts. Each of these items is, upon examination, straightforward and perhaps less difficult than other items, and it does appear from the findings that all the preservice teachers attempted to address these four items in writing in their final 5E lesson plans.
Another surprising finding stems from the factor analysis. Unexpectedly, the instrument's engage four and explain five items, while loading primarily on their expected factors, also loaded on another factor in the 5E ILPv2 instrument analysis. The engage four item also loaded on the explore factor, which could be explained by the item's focus on creating a logical connection and transition between the end of the engage phase and the beginning of the explore phase. The double loading of explain item five on the engage factor proves a bit more difficult to interpret. The explain five item focuses on a teacher using more than a single pedagogical approach when facilitating a discussion of the concepts examined by students during the explore phase. Scoring of this item is based on whether the explain phase of the inquiry lesson plan displays multiple approaches. So, if the explain phase included a student discussion, a PowerPoint presentation, and a demonstration, it would score higher than a lesson including only a discussion. We recognize that this is not necessarily a key element of the 5E instructional model, but it is an effective teaching strategy. In future versions of the instrument, this item may need to be changed or eliminated.
The 5E ILPv2 was developed to assist instructors in assessing inquiry-based 5E lesson plans more equitably, to identify problem areas, and to give feedback to preservice teachers on those areas. The scoring of any lesson plan is no easy task, and the scoring of an inquiry-based 5E lesson plan is no different. The 5E ILPv2 rubric was developed to support preservice teachers' skill in developing 5E lesson plans by serving as a guide for them, as well as to allow the instructor to provide feedback on specific aspects of the inquiry lesson plan phases that may need more development. For instance, if the questions of a 5E lesson plan's explain stage are not well written, are not sequenced properly, or are not complete enough to develop the concept targeted in the lesson, then the explain item of the rubric would reflect a lower score. Additional feedback on the item is therefore specific, so the lesson can be revised and improved prior to teaching. There is no perfect instrument, and all are in some way subjective; while we have attempted to make the items reflective of the key aspects of the 5E model, supported through psychometric analysis, there are always limitations.
The items of the 5E ILPv2 instrument help to identify and determine the quality of the distinct phases of the instructional model; however, one limitation is that no single item captures the fluid, holistic nature of an inquiry-based 5E lesson plan. Indeed, the 5E ILPv2 instrument's items appear as discrete, isolated elements, while the 5E instructional approach is holistic and represents continuity, a flow within and between the five phases that builds both content and skill. The authors attempted to capture continuity between phases with transition items that connect the phases. Recall that one of these items, Engage 4, loaded on both the engage and explore factors, where one might expect a link. In addition, to represent cohesiveness within the phases, the items are descriptive so as to link the key elements of each phase. For example, examining the 5E ILPv2 instrument for how well, or how fluidly, the explain stage addresses and develops the target concepts requires the scorer's attention to focus on question quality and the sequence of questions or strategies used. Furthermore, from a pragmatic stance in scoring lesson plans, a single item related to concept development is less useful to students than items addressing the strategies and questions during the explain phase that are critical to the development of the concept(s). While recognizing the limitations of the instrument, some aspects of the lesson plan process may be best viewed in the actual orchestration or teaching of the lesson rather than in the lesson plan itself. At some point, writing about every aspect of any lesson, much less an inquiry lesson, makes for a long, unwieldy lesson plan. Specific items related to the continuity between and within the phases, or single items that capture the holistic nature of the lesson, are currently not included in the 5E ILPv2, but they will be considered and examined, along with examining the instrument's items for levels of difficulty using Rasch analysis.
As educators, we use assessments during our courses to improve and guide instruction. Thus, using the 5E ILPv2 to generate descriptive statistics from individual class data can be useful to instructors in identifying areas of inquiry-based 5E lesson planning that preservice teachers are struggling to grasp, as well as those items they have already learned and can apply. The descriptive statistics of the 224 preservice teachers' post-course lesson plan data reveal that by the end of the semester, the preservice teachers have higher mean scores on the engage items and lower mean scores on the elaborate items and some items of the evaluate phase. This suggests to us that additional work with these two phases is warranted in our courses. Examining specific items can help instructors revise their strategies for teaching and modeling 5E lessons to assist students in further developing their knowledge of and skills in designing inquiry-based 5E lesson plans.
Conclusion
The purpose of this study was to conduct a psychometric analysis of the 5E ILPv2 rubric for inquiry-based lesson plans in order to assess evidence of validity and reliability. The conclusion of this analysis suggests that the 5E ILPv2 displays strong evidence of both, and that it is an appropriate instrument for use in evaluating preservice teachers' inquiry lesson plan development using the 5E instructional model. The 5E instructional approach is built upon the three phases of the learning cycle (Atkins and Karplus 1962). Modifications of the learning cycle evolved into the 5E instructional model (Bybee et al. 2006), which includes an engage phase and the addition of an evaluate phase. These phases have emerged over time as a result of research on effective learning and the demands for accountability. Thus, the 5E ILPv2 rubric was examined to discern whether the items associated with each of the five different phases hold together as five distinct subscales, as opposed to the three found in the initial 5E ILP instrument (Goldston et al. 2009). Using 5E ILPv2 itemized scores from 224 pre-service elementary science teachers' post-course lesson plans, a factor analysis revealed five distinct theoretical constructs, with the items loading on the expected factors. With 85.5 % of the instrument's variability explained and the items loading on their associated theoretical constructs, the 5E ILPv2 is a strong instrument for assessing an individual's ability to write inquiry-based 5E lesson plans.

Given the lack of rubrics available to science educators for scoring 5E lesson plans, an instrument such as this offers a tool that provides equity and consistency in scoring. From a practical stance, the instrument provides pre-service teachers a guide to use in writing inquiry lessons, with item-by-item descriptions of the inquiry model's five phases. As an assessment tool, it provides feedback on the lesson plan items that pre-service teachers developed well and those areas that still need improvement. Last, the 5E ILPv2 also offers instructors opportunities to research 5E lesson planning within their own courses by examining students' progress on specific phases or items.
Appendix
5E Inquiry Lesson Plan Version 2 Rubric (5E ILPv2)
Name(s) ________________________  Lesson Title ________________________  Grade level __________
Approval of Field/Clinical Placement Supervisor/Faculty ____________________
Approval of Methods Faculty __________________________________________

Science Learning Cycle Lesson Plan Rubric v1

0 1 2 3 4  Concepts and/or skills selected for the lesson align with the National Science Education Standards and relevant state/local standards

0 1 2 3 4  The lesson plan contains objectives that are clear, appropriate, measurable, and aligned with the assessment/evaluation

0 1 2 3 4  Materials list is present and complete
Exploration: Phase 1 (Engage and Explore)

Engage item 1
0 1 2 3 4  The engage elicits students' prior knowledge (based upon the objectives)

Engage item 2
0 1 2 3 4  The engage raises student interest/motivation to learn

Engage item 3
0 1 2 3 4  The engage provides opportunities for student discussion/questions (or invites student questions)

Engage item 4
0 1 2 3 4  The engage leads into the exploration

Explore item 1
0 1 2 3 4  During the explore phase, the teacher presents instructions

Explore item 2
0 1 2 3 4  Learning activities in the exploration phase involve hands-on/minds-on activities

Explore item 3
0 1 2 3 4  Learning activities in the exploration phase are student-centered (When appropriate, teacher questions evoke the learners' ideas and/or generate new questions from students. Student inquiry may involve student questioning, manipulating objects, developing inquiry skills (as appropriate), and developing abstract ideas.) *See the list of typical inquiry skills below

Explore item 4
0 1 2 3 4  The inquiry activities of the explore phase show evidence of student learning (formative/authentic assessment). *See the list of formative assessment methods below

Invention: Phase 2 (Explain)

Explain item 1
0 1 2 3 4  There is a logical transition from the explore phase to the explain phase

Explain item 2
0 1 2 3 4  The explain includes teacher questions that lead to the development of concepts and skills (draws upon the explore activities and/or data collected during the explore activities)

Explain item 3
0 1 2 3 4  The explain includes mixed divergent and convergent questions for interactive discussion facilitated by teacher and/or students to develop concepts or skills

Explain item 4
0 1 2 3 4  The explain includes a complete explanation of the concept(s) and/or skill(s) taught

Explain item 5
0 1 2 3 4  The explain phase provides a variety of approaches to explain and illustrate the concept or skill. (For example, approaches might include but are not limited to the use of technology, virtual field trips, demonstrations, cooperative group discussions, panel discussions, interviews of guest speakers, video/print/audio/computer program materials, or teacher explanations.)

Explain item 6
0 1 2 3 4  The discussion or activity during the explain phase allows the teacher to assess students' present understanding of the concept(s) or skill(s)
ExpansionPhase 3 (Elaborate and Evaluate)
Points
Additional Lesson Plan components:
Scoring Criteria
4 Excellent All elements of the item are present, complete, appropriate, and accurate, with
rich details. Another teacher can use the plan(or phase) as written
3 Good Most of the elements of the item are present, complete, appropriate, and
accurate, with rich details. Another teacher could use the plan (or phase) with
a few modications
Elaborate item 1
0 1 2 3 4 There is a logical transition from the explain phase to the elaborate phase
Elaborate item 2
0 1 2 3 4 The elaborate activities provide students with the opportunity to apply the newly
acquired concepts and skills into new areas
Elaborate item 3
0 1 2 3 4 The elaborate activities encourage students to nd real-life (every day)
connections with the newly acquired concepts or skills
Evaluation item 1
0 1 2 3 4 The lesson includes summative evaluation, which can include a variety of forms/
approaches. * See back for list of some methods of evaluation
Evaluation item 2
0 1 2 3 4 The evaluation matches the objectives
Evaluation item 3
0 1 2 3 4 The evaluation criteria are clear and appropriate
Evaluation item 4
0 1 2 3 4 The evaluation criteria are measurable (i.e., rubrics)
Additional Lesson Plan components:
0 1 2 3 4 Relevant safety issues are addressed. Appropriate safety equipment is delineated.
Selection of materials is age appropriate
0 1 2 3 4 The time specied in each of the lesson plan phases (exploration, invention,
expansion) is appropriate
0 1 2 3 4 Accommodations for students with special needs are addressed. A variety of
cognitive levels is addressed throughout the lesson. The lesson is appropriate for
all students
0 1 2 3 4 The lesson plan includes a bibliography. Cited works include web sites, textbooks,
children's literature, and relevant articles. Using only children's literature is not
acceptable. Multiple sources must be used for content verification
Scoring Criteria
4 Excellent: All elements of the item are present, complete, appropriate, and accurate, with rich details. Another teacher can use the plan (or phase) as written
3 Good: Most of the elements of the item are present, complete, appropriate, and accurate, with rich details. Another teacher could use the plan (or phase) with a few modifications
2 Average: Approximately half of the elements of the item are present, complete, appropriate, and accurate, with some details. Another teacher could use the plan (or phase) with modifications
1 Poor: Few of the elements of the item are present, complete, appropriate, and accurate, with few details. Another teacher would have to re-write the lesson (or phase) in order to implement the lesson
0 Unacceptable: Key elements of the item are not present. Descriptions are inappropriate. Plan lacks coherence and is unusable as written
*Typical inquiry skills: predicting, hypothesizing, observing, measuring, testing,
recording, graphing, creating tables, drawing conclusions.
*Typical formative assessment methods: science journals, science notebooks,
photo narratives, KWL charts, concept maps, writing assignments, artwork,
drawings/charts, graphs, quizzes, tests, PowerPoint presentations, iMovie projects,
movies, cartoons. Note that evaluation comes from the culmination of the formative
assessments used during the lesson.
*Examples of appropriate experiences include the following: the use of
technology, Internet field trips, field trips, hands-on/minds-on learning activities,
cooperative group discussions, panel discussions, interview of a guest speaker, video/
print/audio/computer program materials, teacher explanations, WebQuest, TrackStar,
iMovie, PowerPoint.
References
AAAS. (1993). Benchmarks for scientific literacy. New York: Oxford University Press.
AAAS. (1998). Blueprints for reform: Science, mathematics, and technology education. New York: Oxford University Press.
Achieve. (2012). Next generation science standards: For states, by states. Retrieved at http://www.nextgenscience.org/next-generation-science-standards/.
Akerson, V., Abd-El-Khalick, F., & Lederman, N. (2000). Influence of a reflective explicit activity-based approach on elementary teachers' conceptions of nature of science. Journal of Research in Science Teaching, 37(4), 295–317.
Appleton, K. (2008). Developing science pedagogical content knowledge through mentoring elementary teachers. Journal of Science Teacher Education, 19(6), 523–545.
Atkin, J. M., & Karplus, R. (1962). Discovery or invention? Science Teacher, 29(5), 45.
Barman, C. R. (1992). An evaluation of the use of a technique designed to assist prospective elementary teachers' use of the learning cycle with science textbooks. School Science and Mathematics, 92(2), 59–63.
Barman, C. R. (1993). The learning cycle: A basic tool for teachers, too. Perspectives in Education and Deafness, 11(4), 7–11.
Barman, C., & Shedd, J. (1992). Program designed to introduce K-6 teachers to the learning cycle teaching approach. Journal of Science Teacher Education, 3(2), 58–64.
Beerer, K., & Bodzin, A. (2004). Promoting inquiry-based science instruction with the science teacher inquiry rubric (STIR). Paper presented at the 2004 Association for the Education of Teachers in Science (AETS) Annual International Conference, Nashville, TN.
Blank, L. (2000). A metacognitive learning cycle: A better warranty for student understanding. Science Education, 84(4), 486–506.
Bodzin, A., & Beerer, K. (2003). Promoting inquiry-based science instruction: The validation of the science teacher inquiry rubric (STIR). Journal of Elementary Science Education, 15(2), 39–49. Retrieved online at http://www.thefreelibrary.com/Promoting+inquiry-based+science+instruction%3a+the+validation+of+the-a0108967578.
Bybee, R. (1997). Achieving scientific literacy: From purpose to practice. Portsmouth, NH: Heinemann Press.
Bybee, R., Taylor, J., Gardner, A., Van Scotter, P., Carlson, J., Westbrook, A., & Landes, N. (2006). The BSCS 5E instructional model: Origins and effectiveness. A report for the Office of Science Education, National Institutes of Health. Retrieved online at http://science.education.nih.gov/houseofreps.nsf/b82d55fa138783c2852572c9004f5566/$FILE/Appendix?D.pdf.
Cavallo, A., & Laubach, T. (2001). Students' science perceptions and enrollment decisions in differing learning cycle classrooms. Journal of Research in Science Teaching, 38(9), 1029–1062.
Colburn, A. (2008). An inquiry primer. In E. Brunsell (Ed.), Readings in science methods K-8 (pp. 33–36). Arlington, VA: NSTA Press.
Dwyer, W., & Lopez, V. (2001). Simulations in the learning cycle: A case study involving Exploring the Nardoo. In J. Price, et al. (Eds.), Proceedings of the Society for Information Technology and Teacher Education International Conference (pp. 2556–2557). Chesapeake, VA: AACE.
Eick, C., Meadows, L., & Balkcom, R. (2005). Breaking into inquiry: Scaffolding supports beginning efforts to implement inquiry in the classroom. The Science Teacher, 72(7), 49–53.
Field, A. (2005). Discovering statistics using SPSS. London: Sage Publications.
Glasson, G., & Lalik, R. (1993). Reinterpreting the learning cycle from a social constructivist perspective: A qualitative study of teachers' beliefs and practices. Journal of Research in Science Teaching, 30(2), 187–207.
Goldston, M. J., Day, J., Sundberg, C., & Dantzler, J. (2010). Psychometric analysis of a 5E learning cycle lesson plan assessment instrument. International Journal of Science and Mathematics Education, 8(4), 633–645.
Hayton, J. C., Allen, D. G., & Scarpello, V. (2004). Factor retention decisions in exploratory factor analysis: A tutorial on parallel analysis. Organizational Research Methods, 7, 191–205.
Hodson, D. (1988). Toward a philosophically more valid science curriculum. Science Education, 72(1), 19–40.
Horn, J. L. (1965). A rationale and test for the number of factors in factor analysis. Psychometrika, 30, 179–185.
Hutcheson, G., & Sofroniou, N. (1999). The multivariate social scientist. London: Sage Publications.
Jacobson, W., & Kondo, A. (1968). SCIS elementary science sourcebook. Berkeley, CA: Science Curriculum Improvement Study.
Jinkins, D. (2002). Impact of the implementation of the teaching/learning cycle on teacher decision-making and emergent readers. Reading Psychology, 22(4), 267–288.
Karplus, R. (1979). Teaching for the development of reasoning. In A. Lawson (Ed.), 1980 AETS yearbook: The psychology of teaching for thinking and creativity. Columbus, OH: ERIC/SMEAC.
Karplus, R., Collea, F., Fuller, R., Paldy, L., & Renner, J. (1975). Workshop in physics teaching and the development of reasoning. Presented for the American Association of Physics Teachers.
Karplus, R., & Thier, H. D. (1967). A new look at elementary school science. Chicago, IL: Rand McNally.
Lavoie, D. (1999). Effects of emphasizing hypothetico-predictive reasoning within the science learning cycle on high school students' process skills and conceptual understanding of biology. Journal of Research in Science Teaching, 36(10), 1127–1147.
Lawson, A. E. (1995). Science teaching and the development of thinking. Belmont, CA: Wadsworth.
Lawson, A., Abraham, M., & Renner, J. (1989). A theory of instruction: Using the learning cycle to teach science concepts and thinking skills. NARST Monograph, Number One. National Association for Research in Science Teaching.
Lederman, N., Wade, P., & Bell, R. (1998). Assessing understanding of the nature of science: A historical perspective. In W. McComas (Ed.), The nature of science and science education: Rationales and strategies (pp. 331–350). Dordrecht, the Netherlands: Kluwer Academic.
Marek, E. (2008). Why the learning cycle? Journal of Elementary Science Education, 20(3), 63–69.
Marek, E., & Cavallo, A. (1997). The learning cycle: Elementary school science and beyond. Portsmouth, NH: Heinemann.
Marek, E., Eubanks, C., & Gallaher, T. (1990). Teachers' understanding and the use of the learning cycle. Journal of Research in Science Teaching, 27(9), 821–834.
Marek, E., Laubach, T. A., & Pederson, J. (2003). Preservice elementary school teachers' understandings of theory-based science education. Journal of Science Teacher Education, 14(3), 147–159.
Marek, E., Maier, S., & McCann, F. (2008). Assessing understanding of the learning cycle: The ULC. Journal of Science Teacher Education, 19(4), 375–389.
Marek, E., & Methven, S. (1992). Effects of the learning cycle upon student and classroom teacher performance. Journal of Research in Science Teaching, 28(1), 41–53.
Musheno, B., & Lawson, A. (1999). Effects of learning cycle and traditional text on comprehension of science concepts by students at differing reasoning levels. Journal of Research in Science Teaching, 36(1), 23–37.
National Assessment of Educational Progress. (2010a). The Nation's Report Card: Science 2009. National Center for Education Statistics (NCES Publication 2011-451). Retrieved from http://nces.ed.gov/nationsreportcard/pubs/main2009/2011451.asp.
National Assessment of Educational Progress. (2010b). Hands-on and interactive computer assessment from 2009 Science Assessment. Retrieved from http://nationsreportcard.gov/science_2009/ict_summary.asp.
National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.
National Research Council (NRC). (2000). Inquiry and the national science education standards. Washington, DC: National Academy Press.
National Research Council. (2011). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
No Child Left Behind. (2002). No Child Left Behind Act of 2001. U.S. Pub. L. No. 107-110, 115 Stat. 1425.
Nunnally, J. C. (1978). Psychometric theory. New York, NY: McGraw-Hill.
O'Connor, B. P. (2000). SPSS and SAS programs for determining the number of components using parallel analysis and Velicer's MAP test. Behavior Research Methods, Instruments, & Computers, 32(3), 396–402.
Odom, A., & Kelly, P. (2001). Integrating concept mapping and the learning cycle to teach diffusion and osmosis concepts to high school biology students. Science Education, 85(6), 615–635.
Odom, A., & Settlage, J. J. (1996). Teachers' understandings of the learning cycle as assessed with a two-tier test. Journal of Science Teacher Education, 7(4), 123–142.
Olson, S., & Loucks-Horsley, S. (2000). Inquiry and the national science education standards: A guide for teaching and learning. Washington, DC: National Academy of Sciences.
Pomeroy, D. (1993). Implications of teachers' beliefs about the nature of science: Comparison of the beliefs of scientists, secondary science teachers, and elementary teachers. Science Education, 77(3), 261–278.
Settlage, J. J. (2000). Understanding the learning cycle: Influences on abilities to embrace the approach by preservice elementary school teachers. Science Education, 84, 43–50.
Settlage, J., Meadows, L., Olson, M., & Blanchard, M. (2008). Teacher knowledge about inquiry: Incorporating conceptual change theory. In E. Abrams, S. Southerland, & P. Silva (Eds.), Inquiry in the classroom: Realities and opportunities (pp. 172–191). Greenwich, CT: Information Age Publishing.
Slotta, J. D. (2004). The web-based inquiry science environment (WISE): Scaffolding knowledge integration in the science classroom. In M. C. Linn, P. Bell, & E. Davis (Eds.), Internet environments for science education (pp. 203–232). Mahwah, NJ: Lawrence Erlbaum Associates.
Streiner, D. L. (1998). Factors affecting reliability of interpretations of scree plots. Psychological Reports, 83, 687–694.
Sunal, D., & Wright, E. (2006). Teacher perceptions of science standards in K-12 classrooms: An Alabama case study. In D. Sunal & E. Wright (Eds.), The impact of state and national standards on K-12 science teaching (pp. 7–49). Greenwich, CT: Information Age Publishing.
Tabachnick, B., & Fidell, L. (2006). Using multivariate statistics (5th ed.). Boston, MA: Allyn & Bacon.
Trowbridge, L., & Bybee, R. (1996). Teaching secondary school science: Strategies for developing scientific literacy (6th ed.). Englewood Cliffs, NJ: Merrill.
Velicer, W. F. (1976). Determining the number of components from the matrix of partial correlations. Psychometrika, 41, 321–327.
Velicer, W. F., Eaton, C. A., & Fava, J. L. (2000). Construct explication through factor or component analysis: A review and evaluation of alternative procedures for determining the number of factors or components. In R. D. Goffin & E. Helmes (Eds.), Problems and solutions in human assessment (pp. 41–71). Boston: Kluwer.
Weiss, I. (2006). A framework for investigating the influence of the national science standards. In D. Sunal & E. Wright (Eds.), The impact of state and national standards on K-12 science teaching (pp. 123–152). Greenwich, CT: Information Age Publishing.
Weiss, I., Pasley, J. D., Smith, P. S., Banilower, E. R., & Heck, D. J. (2003). Looking inside the classroom: A study of K-12 mathematics and science education in the United States. Chapel Hill, NC: Horizon Research.
Welch, W., Klopfer, L. E., Aikenhead, G., & Robinson, J. (1981). The role of inquiry in science education: Analysis and recommendations. Science Education, 65(1), 33–50.
Worthington, R. L., & Whittaker, T. A. (2006). Scale development research: A content analysis and recommendations for best practices. The Counseling Psychologist, 34, 806–838.
Zumbo, B. D., Gadermann, A. M., & Zeisser, C. (2007). Ordinal versions of coefficients alpha and theta for Likert rating scales. Journal of Modern Applied Statistical Methods, 6, 21–29.
Zwick, W. R., & Velicer, W. F. (1986). Comparison of five rules for determining the number of components to retain. Psychological Bulletin, 99, 432–442.