Focus group feedback was provided by the following organizations: American Association of
Publishers, Council of Chief State School Officers, Council of the Great City Schools, Council of
State Science Supervisors, Hands on Science Partnership, K-12 Alliance, National Science
Education Leadership Association, and National Science Teachers Association.
The timing for revisions coincided with the revision of the EQuIP Rubric for Science 3.0, a
process that was coordinated by Achieve with ongoing input from many of the organizations
mentioned above. In addition, the PEEC Prescreen process was piloted with a group of
educators during a course at the National NSTA Conference in Los Angeles. A special thanks to
this group for their feedback during the course, and especially to Jaqueline Rojas, who provided
detailed feedback on the whole document following the pilot.
Legal
This document is offered under the Creative Commons Attribution 4.0 International (CC BY 4.0)
license.
What is PEEC?
PEEC is an acronym for the Primary Evaluation of Essential Criteria for Next Generation Science
Standards (NGSS) Instructional Materials Design. Per the Guide to Implementing the Next
Generation Science Standards, high-quality instructional materials designed for the NGSS are a
critical component of NGSS implementation. PEEC is designed to:
• Bring clarity to the complicated and parallel processes of selecting and developing those
instructional materials;
• Help educators and developers to focus on the critical innovations within the NGSS via a
process to dig deeply into instructional materials programs to evaluate their presence;
and
• Answer the question “How thoroughly are these science instructional materials
programs designed for the NGSS?”
PEEC is intended to evaluate the NGSS design of instructional materials programs built for year-
long courses (e.g. high school biology), or programs that span several grade levels (e.g. a K–5
elementary science series, or a middle school sequence for grades 5–8). These instructional
materials programs may be commercially available, developed by states or districts, and/or
provided as open educational resources. The instructional materials to be evaluated can be
organized in any of a variety of digital and print formats (e.g. kits, modules, workbooks,
textbooks, textbook series).
PEEC is not intended for the evaluation of individual lessons or instructional units. For these
smaller grain sizes of instructional materials, it is more appropriate to use the NGSS Lesson
Screener or the EQuIP Rubric for Science, which are explicitly designed for this purpose. PEEC is
also not intended to be used with supplemental materials or instructional materials compiled
from several different sources (e.g., a combination of various textbooks, kits, modules, and
digital supplements assembled by the user) unless there is clear guidance for how the different
components will be used in the classroom to address the criteria highlighted in this evaluation.
To determine the degree to which an instructional materials program is designed for the NGSS,
PEEC focuses on what makes the NGSS new and different from past science standards. These
differences were first articulated as “conceptual shifts” in Appendix A of the standards released
in 2013, but four years of subsequent implementation has refined our collective understanding.
Each of these innovations, along with its implications for instructional materials, is described in
detail in this document. The NGSS Innovations are the lens that PEEC uses to help educators
evaluate instructional materials, and should be the focus of those developing instructional
materials for the NGSS.
It should be noted that there are certainly additional criteria for evaluating the quality of
instructional materials that are not the primary focus of this document, such as cost or ease of use
of any technological components. Their omission is not because they are not important, but
merely because they are not unique to materials designed for the NGSS. An initial discussion of
these issues is found in the Beyond PEEC section on page 47.
PEEC is a process.
PEEC is a process for schools, districts, or other teams of teachers to use to evaluate aspects of
instructional materials as described above. The PEEC evaluation process involves three
successive phases that are each explained in detail in this document.
1. PEEC Prescreen: The prescreen focuses on a small number of criteria that should be
readily apparent in instructional materials designed for the NGSS. This allows those
selecting materials to take a relatively quick look at a wide range of materials and
narrow the number of programs worthy of a closer look.
2. Unit Evaluation: If the prescreen of the materials indicates that there is at least the
potential that they are designed for the NGSS, the PEEC process uses the EQuIP Rubric
for Science to evaluate a sample of units from the program in depth.
To effectively use PEEC, instructional materials evaluators and developers should already be
fluent in the language of the Framework, be comfortable navigating the NGSS (including the
Appendices), and have experience with applying the EQuIP Rubric for Science to evaluate units.
Users who are not familiar with these documents can find them and resources to support their
use at www.nextgenscience.org. PEEC also draws heavily from the discussions and evaluative
criteria in Guidelines for the Evaluation of Instructional Materials in Science—a document that
describes the research base for evaluative criteria that should be considered in building tools
for evaluating instructional materials designed for the NGSS. The criteria for all three phases of
PEEC have a close connection to those presented in the Guidelines.
PEEC represents the collective input, guidance, and efforts of many science educators around
the country. As their work continues, subsequent versions of PEEC will build on and incorporate
their experience.
We invite you to share your reactions to and suggestions for subsequent versions of PEEC by
emailing peec@achieve.org.
Why PEEC?
PEEC takes the compelling vision for science education as described in A Framework for K–12
Science Education and embodied in the NGSS and operationalizes it for two purposes:
1. To help educators determine how well instructional materials under consideration have
been designed for the Framework and NGSS; and
2. To help curriculum developers construct and write science instructional materials that
are designed for the Framework and NGSS.
The NGSS do not shy away from the complexity of effectively teaching and learning science.
They challenge us all to shift instructional materials to better support teachers as they create
learning environments that support all students in making sense of the world around them. As
the Framework states:
By the end of the 12th grade, students should have gained sufficient knowledge
of the practices, crosscutting concepts, and core ideas of science and
engineering to engage in public discussions on science-related issues, to be
critical consumers of scientific information related to their everyday lives, and
to continue to learn about science throughout their lives. They should come to
appreciate that science and the current scientific understanding of the world
are the result of many hundreds of years of creative human endeavor. It is
especially important to note that the above goals are for all students, not just
those who pursue careers in science, engineering, or technology or those who
continue on to higher education.
This vision is not only aspirational; it is based on scientific advances and educational research
about how students best learn science. This research and resulting vision for science education
have implications for instructional materials that reach far beyond minor adjustments to
lessons, adding callout boxes to margins, crafting a few new activities, or adding supplements
to curriculum units. The advances in the NGSS will be more successfully supported if entire
science instructional materials programs are designed with the innovations described by this
evaluation tool and if states, districts, and schools use this tool to ensure that the materials
they choose really measure up.
The word “designed” is intentionally and deliberately used here—and throughout the PEEC
materials—instead of “aligned.” This choice was made because alignment has come to
represent a practice that is insufficient to address the innovations in these standards.
When new standards are released, educators traditionally create a checklist or map in order to
determine how well their instructional materials match up with the standards. If enough of the
pieces of the standards match up with the pieces in the lessons or units or chapters, the
instructional materials are said to be “aligned.” In this sense, “alignment” is primarily
correlational and, if the correlation is not high enough, the only shift that is needed is to add
additional materials or remove particular pieces. This traditional approach to alignment
assumes that (1) matching content between the language of the standards and the instructional
materials is sufficient for ensuring that students meet the standards, and (2) that all approaches
to the way instructional experiences are designed in materials are created equally as long as the
content described by the standards appears.
However, the innovations of the Framework and NGSS cannot be supported by instructional
materials that simply have the same pieces and words as the standards. In the NGSS, academic
goals for students are stated as performance expectations that combine disciplinary core ideas,
crosscutting concepts, and science and engineering practices. The nature of this
multidimensional combination is as important as the presence of the constituent components.
The focus on these innovations speaks to the second purpose of PEEC: to support authors and
curriculum developers as they work to produce instructional materials for the NGSS. This
support began with NGSS Appendix A (The Conceptual Shifts in the Next Generation Science
Standards), and was soon followed by the first version of the Educators Evaluating the Quality
of Instructional Products (EQuIP) Rubric for Science that described what these shifts looked like
in instructional materials at the lesson and unit level. The EQuIP Rubric for Science has been
successively revised based on extensive use and feedback, and is now in its third version. The
lessons from the EQuIP process have been further articulated and codified to form the NGSS
Innovations section of PEEC. While different in scope, format, and structure from the
“Publishers’ Criteria” developed for the Common Core State Standards, the core
intent of the innovations is similar: to help curriculum developers and curriculum users think
about how the standards should manifest themselves in instructional materials by focusing on
the aspects that are most central to meeting the demands of the NGSS and most different from
traditional approaches to standards, instruction, and materials. The goal is to help developers
more easily create and refine instructional materials, and to do so knowing that their efforts are
focused on the same innovations that schools, districts, and states will be using to select
instructional materials for use.
Although PEEC was explicitly and specifically designed to evaluate materials designed for the
NGSS and there are regular references to the NGSS throughout, the innovations that are part of
these standards are fundamentally rooted in the Framework. This means that states and
districts that did not adopt the NGSS, but that adopted standards based on the three
dimensions of the Framework should also be able to use it to evaluate instructional materials
that are developed for these key innovations.
As the key ways that the NGSS are new and different, these innovations also provide the
intellectual framework PEEC uses to evaluate science instructional materials.
This section describes each of the five NGSS Innovations and provides insight on how these
innovations should be expected to appear in instructional materials. Each innovation is described
with the following components.
The learning experiences provided for students should engage them with
fundamental questions about the world and how scientists have investigated
and found answers to those questions.
Though “making sense of phenomena and designing solutions to problems” is not one of the
three dimensions of the standards and “phenomenon” or “problem” are not words often found
within the performance expectations, a close look will reveal that the ability of students to make
sense of phenomena and design solutions to problems is indeed a core feature of these
standards. The easiest place to see this explicitly is to look at the foundation boxes connected to
each performance expectation.
Explaining phenomena and solving engineering design problems are not entirely new to science teaching
and learning—laboratory experiments have been a hallmark of science instruction for decades,
phenomena have frequently been used to “hook” students into learning, and engineering
activities have often been used for engagement or enrichment—but the expectation that they
are an organizing force for instruction is an innovation. Organizing instruction around
phenomena gives students a reason to learn (beyond acquiring information they are told they
will later need) and shifts their focus from learning about a topic to figuring out why or how
something happens. Additionally, the focus on relevant, engaging phenomena and
design problems that students can access addresses diversity and equity considerations by
providing opportunities for students to make connections with the content based on their own
experiences and questions. This leads to deeper and more transferable knowledge and moves
everyone closer to the vision of the Framework.
As with science instruction, phenomena and problems are not new to science instructional
materials, but the shift to an expectation that student sense-making and problem-solving are
driving instruction means that materials will need to shift as well. In instructional materials
programs designed for the NGSS, this shift should be obvious in the organization and flow of
learning in student materials and a clear focus of the teacher supports for instruction and
monitoring student learning (see Table 1).
For more resources on how making sense of phenomena and designing solutions to problems
are important for teaching and learning designed for the NGSS, visit
https://www.nextgenscience.org/resources/phenomena.
The following table provides examples of what instructional materials programs designed for this
NGSS Innovation include “less” of and “more” of. This is not an exhaustive list, but is intended to
call out key evidence that should be looked for in evaluating instructional materials. It should also
be noted that “less” does not mean “never” and “more” does not mean “always.”
Less: Focus on delivering disciplinary core ideas to students, neatly organized by related
content topics; making sense of phenomena and designing solutions to problems are used
occasionally as engagement strategies, but are not a central part of student learning.
More: Engaging all students with phenomena and problems that are meaningful and relevant;
that have intentional access points and supports for all students; and that can be explained or
solved through the application of targeted grade-appropriate SEPs, CCCs, and DCIs as the
central component of learning.

Less: Making sense of phenomena and designing solutions to problems separated from
learning (e.g., used only as an engagement tool to introduce the learning, only loosely
connected to a disciplinary core idea, or used as an end-of-unit or enrichment activity).
More: Students using appropriate SEPs and CCCs (such as systems thinking and modeling) to
make sense of phenomena and/or to design solutions to give a context and need for the ideas
to be learned.

Less: Leading students to just getting the “right” answer when making sense of phenomena.
More: Using student sense-making and solution-designing as a context for student learning
and a window into student understanding of all three dimensions of the standards.
That there are three dimensions in the NGSS—the science and engineering practices (SEPs), the
disciplinary core ideas (DCIs), and crosscutting concepts (CCCs)—is their most recognizable
feature. The innovation of these three dimensions, however, lies not just in their existence in
the standards, but in how they exist in the standards. The NGSS are designed to make the two
important parts of this innovation clear: 1) that all three dimensions are equally important
learning outcomes; and 2) that the integration of the three dimensions is key for student
learning.
It might seem like the existence of the three dimensions is the innovation, but each has a
predecessor in prior state standards and all three existed in many of those standards
documents in one way or another. Prior to the NGSS, the primary focus of most state standards
was on “science content” expected for students to know or understand. This “science content”
was the precursor of disciplinary core ideas. Many state standards also included at least one
standard that highlighted what students needed to know about how scientists do their work—
the precursor to the science and engineering practices. Often called “inquiry,” this was an
important component of many state standards documents. The precursors to the crosscutting
concepts were also included in state standards documents, but were often not in the standards
themselves. They were derived from the “Unifying Concepts and Processes” of the National
Science Education Standards (NRC 1996), the “Common Themes” of the Benchmarks for Science
Literacy (AAAS 2009), “themes” in Science for All Americans (AAAS 1989), and “crosscutting
ideas” in NSTA’s Science Anchors Project (2010).
How this information was organized in prior standards, however, conveyed a difference in the
relative importance of these three areas of student learning and these differences had a
significant impact on instruction, instructional materials, and assessments in science
classrooms. The “science content” portions took up the majority of the standards and because
of the sheer breadth of detailed information, most instruction that targeted the standards
focused on ways to disseminate this information to students. Though “inquiry” was highlighted,
it was rarely given comparable weight in instruction or assessment.
The NGSS, on the other hand, include all three dimensions in performance expectations,
intentionally signaling that all three dimensions are equally important for student learning.
Students cannot fully demonstrate understanding of disciplinary core ideas without using the
crosscutting concepts while engaging in the science and engineering practices. At the same
time, they cannot learn or show competence in practices except in the context of specific
content.
Building student proficiency in all three dimensions is a significant innovation all by itself, but
the implication of this innovation goes beyond three separate strands of learning that are
equally valued. The power of the three dimensions comes in their integration. The fact that
these standards are written as three-dimensional performance expectations is significant and
intentional, and should be reflected in student learning experiences. The Framework makes it
clear that, “In order to achieve the vision embodied in the framework and to best support
students’ learning, all three dimensions need to be integrated into the system of standards,
curriculum, instruction, and assessment” (2012). Students develop and apply the skills and
abilities described in the practices, as well as use the CCCs to make sense of phenomena and
make connections between different DCIs in order to help gain a better understanding of the
natural and designed world. The SEPs and CCCs provide multiple access points for students to
approach learning goals, enabling different students in different contexts to access the same
ideas. Simply parsing these dimensions back out into separate entities to be learned and
assessed in isolation misses the vision of the NGSS and the Framework.
It is also important to clarify that the NGSS were designed to be endpoints for a grade level (K–
5) or grade band (6–8; 9–12), and that they collectively describe what students should know
and be able to do at that endpoint. The exact pairings of the dimensions in the PEs should not
limit how the dimensions are integrated during classroom instruction and assessment. Because
the very architecture of the NGSS models three-dimensionality, a PE might seem like a
classroom lesson or unit, but it is not the intent of the NGSS to have students simply “do the
PEs.” Since the PEs are written as grade-level endpoints, they often contain elements of the
dimensions that may need to be taught at different times of the year. For example, a PE may
include a DCI that fits early in a year of instruction, but also a more advanced level of a CCC or
SEP that students might not be prepared for until the end of that same year. Furthermore,
simply “doing the PEs” one at a time would be impractical and inefficient, as many PEs overlap
with and connect to each other. Instead, three-dimensional learning experiences that integrate
multiple SEPs, CCCs,
and DCIs will be needed to help all students build the needed competencies toward the
targeted performance expectations.
Instructional materials built for past science standards were organized just like the standards:
inquiry or science process was frequently addressed in an opening chapter, a majority of the
text was devoted to imparting “science content” to students, and the crosscutting concepts
precursors were generally only implicitly included in materials with little to no emphasis in
student learning goals. Instructional materials designed for the NGSS, on the other hand, must
communicate the equal value of the three dimensions. This has implications for how student
materials are organized and how the dimensions are presented in teacher support materials.
This importance can and should be conveyed explicitly, but it is also conveyed by how the
dimensions are presented. If one dimension is relegated to only appearing in the margins,
appears with much less frequency, is not supported in teacher materials, or significant learning
time is not devoted to ensuring student learning related to that dimension, then the materials
fall short of what is expected by these standards.
Instructional materials designed for the NGSS will not only value all three dimensions of the
standards, but will also integrate the three dimensions in instruction and assessment. For
instruction, this means that student learning experiences must be anchored with three-
dimensional student performances. It may not be possible for every student learning
experience to be three-dimensional, but these 3D performances should be common and central
to student learning. As mentioned above, the three dimensions of the standards should be
integrated in ways that help students to make sense of the world around them and/or design
solutions to problems—driving toward, but not limited by how the dimensions are integrated in
the performance expectations. Instructional materials designed for this NGSS Innovation should
make it clear which elements of the three dimensions are targeted by a given lesson or unit.
Instructional materials designed for the NGSS will integrate the three dimensions when
monitoring student progress with embedded formative and summative assessments. As with
instruction, this doesn’t mean every assessment task or item, all the time, but it also means
more than just an occasional three-dimensional assessment task here or there. The focus of
measuring student learning should utilize items and tasks that are measuring the dimensions
together—in pre-assessments, formative assessments, and summative assessments. Three-
dimensional assessment tasks should be embedded throughout instructional experiences,
taking advantage of the rich opportunities that are part of instruction during which students
make their thinking visible to themselves, their peers, and educators.
For an introduction regarding assessments and the NGSS, see Seeing Students Learn Science:
Integrating Assessment and Instruction in the Classroom (2017), the STEM Teaching Tool
practice briefs on assessment, and Developing Assessments for the Next Generation Science
Standards.
For some more concrete examples of what Innovation 2: Three-Dimensional Learning looks like
in instructional materials programs, see Table 2. As was mentioned with Table 1, this is not an
exhaustive list, but is intended to call out key evidence that should be sought in evaluating
instructional materials. As a reminder, “less” does not mean “never” and “more” does not
mean “always.”
Table 2: NGSS Innovation 2—Three-Dimensional Learning
Less: Using science practices and crosscutting concepts only to serve the purpose of students
acquiring more DCI information.
More: Careful design to build student proficiency in all three dimensions of the standards.

Less: Teachers only posing questions that have one correct answer.
More: Teachers posing questions that elicit the range of student understanding.

Less: Students learning the three dimensions in isolation from each other, i.e.:
• A separate lesson or unit on science process/methods followed by later lessons or units
focused on delivering science knowledge.
• Including crosscutting concepts only implicitly, or in sidebars with no attempt to build
student proficiency in utilizing them.
• Rote memorization of facts and terminology; providing discrete facts and concepts in
science disciplines, with limited application of practice or the interconnected nature of
the disciplines.
• Prioritizing science vocabulary and definitions that are introduced before (or instead of)
students developing a conceptual understanding.
More: Integrating the SEPs, CCCs, and DCIs in ways that instructionally make sense, as well as
inform teachers about student progress toward the performance expectations, including:
• Students actively engaged in scientific practices to develop an understanding of each of
the three dimensions.
• CCCs included explicitly, with students learning to use them as tools to make sense of
phenomena and make connections across disciplines.
• Facts and terminology learned as needed while developing explanations and designing
solutions supported by evidence-based arguments and reasoning.
There are two components to this innovation. The first is what was described in the quote from
the Framework above: coherently building all three dimensions from kindergarten through
twelfth grade. The second is how both engineering and the nature of science are embedded
across all grade levels.
While the three dimensions have appeared in past standards, the NGSS are the first standards
to build all three dimensions over time. Past standards may have included limited progressions
for both science and engineering practices (SEPs) and disciplinary core ideas (DCIs), but the
NGSS progressions are more robust in several ways. The precursors to the crosscutting
concepts (CCCs), on the other hand, were generally incorporated into the front matter of
standards without any indication of how they might be treated over time. Not only are the
three dimensions intentionally integrated into the performance expectations, but these
progressions are supported with three appendices—Appendix E: Disciplinary Core Ideas,
Appendix F: Science and Engineering Practices, and Appendix G: Crosscutting Concepts—that
add additional clarity to how these dimensions build over time. The appendices break the
grade-banded expectations for each DCI, SEP, and CCC into smaller elements to help
educators focus on what is unique about that dimension at that grade.
The SEP progressions in the NGSS are different because more is expected of student
engagement in the practices over time. The SEPs specify what is often meant by “inquiry” and
address the range of cognitive, social, and physical practices that science and engineering
require in ways that were not included in past standards. This means there are more specific
expectations at each grade level. Furthermore, past science standards generally just increased
the volume of expectations over time rather than the sophistication of student engagement.
The DCIs are more focused than the “science content” of past standards, so the progressions
here look different as well. To be included in the Framework (and the NGSS), an idea had to:
have broad importance across one or more science disciplines; be important for understanding
more complex ideas and solving problems; relate to the interests and life experiences of
students and the world they live in; and be teachable and learnable over multiple grades with
increasing sophistication. The DCIs are driven less by information that we think students
should know by a particular grade and more by the fundamental understanding
that will prepare them for their lives beyond high school. As a result, the DCIs have fewer
disconnected bits of information and are more focused on building these core ideas.
As was mentioned above, the predecessors to the CCCs were usually included in the front
matter of standards rather than in the standards themselves. Their addition to each of the
three-dimensional performance expectations of the NGSS means that this dimension of the
standards has an expected progression for the first time. The learning expectations of the CCCs
are scaffolded across the K-12 standards to help students connect knowledge from the various
disciplines into a coherent and scientifically-based view of the world.
Advancing the way that the DCIs and SEPs are built over time while establishing the first
progression for the CCCs is a significant innovation of the NGSS.
Instructional materials designed for the NGSS provide sustained learning opportunities from
kindergarten through high school for all students to engage in and develop a progressively
deeper understanding of each of the three dimensions. Students require coherent, explicit
learning progressions both within a grade level and across grade levels so they can continually
build on and revise their knowledge and expand their understanding of each of the three
dimensions. High-quality NGSS-designed instructional materials must clearly show how they
include coherent progressions of learning experiences that support students in reaching
proficiency on all parts (e.g., all elements of the SEPs, DCIs, and CCCs) of the NGSS by the end of
each grade level and across grades. Guidance should also be provided for teachers to adjust
instruction of all three dimensions to meet the needs of their students. In programs that extend
across multiple grade levels, this coherence should carry across grade levels as well.
This means, for example, that the way materials expect students to use each science and
engineering practice at the beginning of the school year should be significantly different from
how they are expected to use each practice by the end of the year. Students should have
experiences across the year designed to develop specific, grade-appropriate elements of each
practice and opportunities to apply these previously developed elements in new situations.
There are a variety of ways this might happen—initially providing supports for a practice and
then strategically removing them over time; focusing on deliberately developing a small
number of elements of a practice in a coordinated fashion throughout the year; practicing
already-developed elements of a practice when a different practice is foregrounded—but it
should be apparent in student materials how the practice is being used differently and the plan
for how the variety of student experiences builds to the full practice should be clearly explained
in teacher materials.
In a similar way, the CCCs and DCIs should be coordinated over time so learning of all three
dimensions is coherent from a student’s perspective and guidance should be provided to
teachers that explains how the organization of student learning experiences builds each
dimension for students.
See NGSS Appendix E, Appendix F, and Appendix G for more information about the learning
progressions for each dimension and how they build over time. For some more concrete
examples of what Innovation 3: Building K-12 Progressions looks like in instructional materials
programs, see Table 3. As was mentioned with earlier innovations, this is not an exhaustive list,
but is intended to call out key evidence that should be looked for in evaluating instructional
materials. As a reminder, “less” does not mean “never” and “more” does not mean “always.”
Table 3: NGSS Innovation 3—Building K–12 Progressions: Building the Three Dimensions

Less: Building on students' prior learning only for the DCIs.
More: Building on students' prior learning in all three dimensions.

Less: Little to no support for teachers to reveal students' prior learning.
More: Explicit support for teachers in identifying students' prior learning and accommodating
different entry points, and a description of how the learning sequence will build on that prior
learning.

Less: Assuming that students are starting from scratch in their understanding.
More: Explicit connections between students' foundational knowledge and practice from prior
grade levels.

Less: Students engaging in the SEPs only in service of learning the DCIs.
More: Students engaging in the SEPs in ways that not only integrate the other two dimensions,
but also explicitly build student understanding and proficiency in the SEPs over time.

Less: CCCs marginalized to callout boxes or comments in the margins, or left implicit and
conflated with the other dimensions, and therefore not progressing over time.
More: Students learn the CCCs in ways that not only integrate the other two dimensions, but
also explicitly build student understanding and proficiency in the CCCs over time.

Less: Teacher support that focuses only on the large grain size of each dimension rather than
digging down to the element level (e.g., the SEP "Analyzing and Interpreting Data" rather than
the grade 3–5 element of the same practice, "Analyze data to refine a problem statement or the
design of a proposed object, tool, or process").
More: Teacher support that clearly explains how the elements of the practices are coherently
mapped out over the course of the instructional materials program.
The NGSS include engineering design and the nature of science as significant concepts,
embedding them throughout the performance expectations. In many ways they are addressed
within the progressions of the three dimensions just described, but each also receives explicit
attention in the standards.
The NGSS represent a commitment to integrating engineering design into the structure of
science education by raising engineering design to the same level as scientific inquiry when
teaching science disciplines at all levels, from kindergarten to grade 12. To ensure that this
happens coherently across students’ K–12 learning experience, (1) all the SEPs have elements
that are explicitly focused on engineering; (2) there are specific engineering design DCIs
throughout the standards; and (3) the ideas from the Engineering, Technology, Science, and
Society disciplinary core idea in the Framework are integrated into the crosscutting concepts in
each grade band. (See Chapter 3 in the Framework for a detailed description of how the
practices are used for both science and engineering. Box 3-2 briefly contrasts the role of each
practice’s manifestation in science with its counterpart in engineering.) These engineering
concepts and practices are embedded throughout the NGSS in the performance expectations
(PEs) that are marked with an asterisk. There are also grade-banded engineering design-specific
standards in the NGSS to ensure that student learning about engineering design concepts is
coherent and builds over time. More details about how engineering was embedded in the NGSS
can be found in Appendix I: Engineering Design in the NGSS and Appendix J: Science,
Technology, Society, and the Environment.
A deeper awareness and understanding of the connections between science and engineering
helps all students to be prepared for their lives beyond high school. In particular, the increased
emphasis on engineering in the NGSS has the potential to be inclusive of students who have
traditionally been marginalized in the science classroom and do not see science as being
relevant to their lives or future. By solving problems through engineering in local contexts (e.g.,
gardening, improving air quality, or cleaning water pollution in the community), students gain
knowledge of science content, view science as relevant to their lives and future, and engage in
science in socially relevant ways.
Like engineering, some aspects of the nature of science are integrated directly into the three
dimensions of the standards—the integration of scientific and engineering practices,
disciplinary core ideas, and crosscutting concepts provide practical experiences for students
that set the stage for teaching and learning about the nature of science—but this part of the
Building K-12 Progressions innovation also goes beyond just the integration of the three
dimensions. In addition to learning experiences that model how science knowledge is acquired,
the NGSS incorporate eight major themes about the nature of science into the performance
expectations. Four of these themes extend the scientific and engineering practices and four
themes extend the crosscutting concepts. Though the nature of science was often addressed
somewhere within past standards documents, it has not been embedded in the standards over
time the way that it is in the NGSS. These eight themes, and exactly how they are built into the
standards, are described in NGSS Appendix H.
Though engineering has stand-alone standards for each grade band, it is important for
instructional materials not to isolate or separate engineering from science learning. Engineering
was intentionally embedded in the standards to ensure that it was not separated out and
taught as a separate unit or chapter. All three dimensions of the standards include learning that
is relevant to engineering and instructional materials should embed this learning throughout
the program and provide clear support for teachers to see how engineering is embedded
throughout the program. Instructional materials designed for the NGSS should make sure that
engineering is not an enrichment activity or engagement tool, but is incorporated meaningfully
with science throughout student learning, and included as explicit and integrated learning
targets.
Instructional materials designed for the NGSS should ensure that the eight nature of science
themes identified in Appendix H are likewise explicitly embedded throughout student learning
experiences and teacher supports, building learning progressions across grade bands.
For more examples of what Embedding Engineering Design and the Nature of Science looks like
in instructional materials programs, see Table 4. As was mentioned with earlier innovations,
this is not an exhaustive list, but is intended to call out key evidence that should be looked for
in evaluating instructional materials. As a reminder, “less” does not mean “never” and “more”
does not mean “always.”
Table 4: NGSS Innovation 3—Building K–12 Progressions: Embedding Engineering Design and
the Nature of Science
Less: Presenting engineering design and the nature of science disconnected from other science
learning (e.g., design projects that do not require science knowledge to complete successfully,
or an intro unit on the nature of science).
More: Engaging all students in learning experiences that connect engineering design and the
nature of science with the three dimensions of the NGSS, not separated from the science DCIs.

Less: Presenting engineering design and/or the nature of science in a hit-or-miss fashion, i.e.,
they are made apparent to students, but there is no coherent effort to coordinate or improve
student understanding or proficiency over time.
More: Both engineering design and the nature of science are thoughtfully woven into the
three-dimensional learning progressions so that students receive support to develop their
understanding and proficiency.

Less: Teacher support that only explains the importance of the nature of science and
engineering design without a plan for scaffolding student understanding and application.
More: Teacher support that explains how engineering design and the nature of science are
coherently mapped out over the course of the instructional materials program.
Such convergence across content areas strengthens science learning for all students, especially
for students whose time for learning science may have been diminished by policies driven by an
accountability system dominated by reading and mathematics. Across the three subject areas,
students are expected to engage in argumentation from evidence; construct explanations;
obtain, synthesize, evaluate, and communicate information; and build a knowledge base
through content-rich texts. Additionally, students learn the crosscutting concept of Patterns not
only across science disciplines but also across other subject areas of language arts,
mathematics, social studies, etc. Furthermore, the convergence of core ideas, practices, and
crosscutting concepts across subject areas offers multiple entry points to build and deepen
understanding for these students.
Instructional materials designed for the NGSS will highlight and support teachers in making
connections between science, mathematics, and English language arts. Grade-appropriate and
substantive overlapping of skills and knowledge helps provide all students equitable access to
the learning standards for science, mathematics, and English language arts (e.g., see NGSS
Appendix D Case Study 4: English Language Learners).
For examples of NGSS Innovation 4: Alignment with English language arts and Mathematics, see
Table 5. As was mentioned with earlier innovations, this is not an exhaustive list, but is
intended to call out key evidence that should be looked for in evaluating instructional materials.
As a reminder, “less” does not mean “never” and “more” does not mean “always.”
Less: Science learning is isolated from related learning in mathematics and English language
arts.
More: Engaging all students in science learning experiences that explicitly and intentionally
connect to mathematics and English language arts learning in meaningful, real-world,
grade-appropriate, and substantive ways and that build broad and deep conceptual
understanding in all three subject areas.
Communities expect many things from their K-12 schools, among them the
development of students’ disciplinary knowledge, upward social mobility,
socialization into the local community and broader culture, and preparation for
informed citizenship. Because schools face many constraints and persistent
challenges in delivering this broad mandate for all students, one crucial role of
a framework and its subject matter standards is to help ensure and evaluate
educational equity.
The NGSS describe science expectations built on progressions of the disciplinary core ideas
(DCIs), the science and engineering practices (SEPs), and crosscutting concepts (CCCs) used
together in meaningful ways that both establish high expectations while providing the structure
to support students from diverse backgrounds in meeting them. This manifests directly in other
innovations of the standards; however, the implications for supporting all students go deeper
than those opportunities previously mentioned. As such, this innovation emphasizes those
features of implementing the NGSS that directly support all students, and particularly those
from traditionally underserved groups, in establishing and maintaining both achievement and
agency in science. Whereas innovations 1-4 describe what is different in the NGSS, innovation 5
describes how the features of the NGSS can be used to support all learners with a focus on
implications for instructional materials.
For further information and examples of how to support a range of students, please see NGSS
Appendix D and the accompanying case studies.
Instructional materials designed for the NGSS provide opportunities for all learners, and
guidance to teachers for supporting diverse student groups, including students from
economically disadvantaged backgrounds, students with special needs (e.g., visually impaired
students, hearing impaired students), English learners, students from diverse racial and ethnic
backgrounds, students with alternative education needs, and talented and gifted students.
They do so using a variety of approaches, but also ensure the features of NGSS design are
intentionally leveraged to support diverse learners as they develop proficiency, agency, and
identity in science.
Specifically, instructional materials that are designed for the NGSS should:
1. Provide substantial opportunities for students to express and negotiate their ideas and
prior knowledge, and capitalize on funds of knowledge (see NGSS Appendix D) as they
are making sense of phenomena and designing solutions to problems.
2. Include diverse examples of scientists and engineers, including women and members of
other underserved populations, with whom a range of student groups can identify.
3. Offer meaningful opportunities for science learning experiences to value, respect, and
connect to students’ home, culture, and community.
4. Regularly provide opportunities for students to have ownership over their learning, as
they explore and come to more deeply understand the core scientific ideas described by
the standards.
5. Provide multiple access points, representations, and multimodal experiences for
students to engage with the science at hand.
6. Provide multiple ways in which to make student thinking visible.
7. Provide teachers with ample tools and supports to help a wide range of students learn
the designated content and skills, including through differentiation, engaging multiple
scientific competencies, supporting scientific identities, and cultivating scientific agency.
For more examples of NGSS Innovation 5: All Standards, All Students, see Table 6. As was
mentioned with earlier innovations, this is not an exhaustive list, but is intended to call out key
evidence that should be looked for in evaluating instructional materials. As a reminder, "less"
does not mean "never" and "more" does not mean "always."
High-quality instructional materials programs designed for the NGSS include the following:
Less: Materials providing limited ways of meeting learning goals, such as reading about topics,
listening to lectures and taking notes, and following written or oral labs.
More: Materials engaging the SEPs, CCCs, and DCIs as access points and diverse ways for
students to learn (e.g., students using the practice of argumentation and evidence-based
discourse to develop scientific understanding; students developing and using models to make
sense of phenomena and problems as well as to make thinking visible in ways that are less
dependent on English language proficiency).

Less: Materials that focus only on helping students learn and remember "the right answer."
More: Materials that help students learn the requisite information while also growing students'
ability to see themselves as scientists and engineers by providing students multiple
opportunities to make their thinking visible, revisit ideas, and engage in scientific discourse
with peers.

Less: Teacher materials that focus on delivering information to students without providing
support to help teachers value and build on the experiences and knowledge that students bring
to the classroom.
More: Teacher materials that include suggestions for how to connect instruction to the
students' home, neighborhood, community, and/or culture as appropriate, and that provide
opportunities for students to connect their explanation of a phenomenon and/or their design
solution to a problem to questions from their own experience and meaningful components of
their own contexts. Teacher materials provide suggestions for how to support students through
multiple approaches to problems and phenomena.

Less: Teacher materials that only offer minimal or non-context-specific support for
differentiation.
More: Teaching materials that include:
• Appropriate reading, writing, listening, and/or speaking alternatives (e.g., translations,
picture support, graphic organizers) for students who are English learners, have special
needs, or read well below grade level.
• Extra support (e.g., phenomena, representations, tasks) for students who are struggling
to meet the targeted expectations.
• Extensions for students with high interest or who have already met the performance
expectations to develop deeper understanding of the practices, disciplinary core ideas,
and crosscutting concepts.
• Support for how to engage students in ownership of their learning.
The PEEC process involves three phases for each instructional materials program under
consideration.
PEEC was designed to determine the degree to which instructional materials programs are
designed with the innovations of the NGSS. As such, it is useful both for curriculum developers
and instructional materials authors and for schools, states, and districts seeking to purchase or
obtain instructional materials.
Some ideas about how PEEC can be used by various audiences are described below:
• Describe the process for reviewing and selecting entire school science programs—school
science textbooks, textbook series, kit-based and other instructional materials and
support materials for teachers—that are designed for the NGSS; or
• Evaluate current science instructional materials to identify adaptations and
modifications to support NGSS implementation.
Summary: PEEC Phase 1: The PEEC Prescreen is a quick look at NGSS design for instructional
materials programs.
Process:
1. Prepare for the review by identifying the people involved, the components of the
instructional materials in question to review, and the evidence to be sought.
2. Apply the PEEC Prescreen. Use Tool 1A: PEEC Prescreen Response Form (Phenomena),
Tool 1B: PEEC Prescreen Response Form (Three Dimensions), and Tool 1C: PEEC
Prescreen Response Form (Three Dimensions for Instruction and Assessment).
3. Analyze the results. Use Tool 2: PEEC Prescreen: Recommendation for Review?
The prescreen focuses on three criteria related to the first two NGSS Innovations: Innovation 1:
Making Sense of Phenomena and Designing Solutions to Problems and Innovation 2: Three-
Dimensional Learning as shown in Table 7.
The instructional materials program is designed to engage all students in making sense of
phenomena and/or designing solutions to problems through student performances that
integrate the three dimensions of the NGSS.
In the beginning of the review process, a decision needs to be made about who will be applying
the prescreen and conducting subsequent parts of the PEEC process. Will it be the whole group
that is reviewing materials, or will it be a small leadership group? Applying the prescreen with
the full group doing the review can be a way to build a common understanding of the first two
innovations before digging in deeper with the Unit Evaluation. However, depending on the
number of instructional materials programs being reviewed and the resources available to
support the review, it may make sense for only a leadership group to apply the Prescreen to the
full scope of materials being considered. Then, once a smaller set of programs have been
identified, a larger group of educators can be involved in the remaining two phases of PEEC.
Be sure to refer to state, district, and local laws, rules, and guidance documents to ensure that
all requirements are met. Suggestions for potential membership on the instructional materials
committee include state, district, and school-level science instruction, assessment, and equity
supervisors, district administrators, school principals, elementary, middle, and high school
science teachers, higher education and STEM partners, parents, students, and community
members.
All committee members need a thorough understanding of the National Research Council’s A
Framework for K–12 Science Education, the Next Generation Science Standards (NGSS), and the
NGSS Innovations. They need to be comfortable applying the EQuIP Rubric for Science 3.0. If
participants have not received formal professional learning to support using the EQuIP Rubric
for Science, that will need to be included in the process.
While it is possible for the prescreen and subsequent phases of the PEEC review to be applied
by an individual, the quality review process works best with a team of reviewers as a
collaborative process. As more people get involved, the likelihood for better evidence and
understanding increases as the additional perspectives can deepen the review process.
However, adding more review team members will increase the complexity and costs of a review
effort. Working as a group will not only result in a better-informed decision, but the
conversations can also bring the group to a common, deeper understanding of what
instructional materials designed for the NGSS look like.
The NGSS Innovations evaluated by the prescreen should be explicit and obvious, and they
should be present in the materials that are in the hands of all students and teachers—not just
in optional or ancillary materials. The components of the instructional materials program
chosen to review need to be selected in advance and consistent across programs. It is
important to review only what will be available to all teachers and to all students. Though this
is intended to be a quick read-through of materials, it is important—for all the materials
reviewed and for each of the criteria—to evaluate both the overall organization of the materials
and their content.
For each of the instructional material programs under consideration, teams should identify
which components will be included and which ones will not be included in the PEEC review
process.
Before applying the prescreen, it’s important that the review group has a common
understanding of what qualifies as evidence for the criteria. To establish this understanding,
start by reading the “less like, more like” tables in Tool 1A: PEEC Prescreen Response Form
(Phenomena), Tool 1B: PEEC Prescreen Response Form (Three Dimensions), and Tool 1C: PEEC
Prescreen Response Form (Three Dimensions for Instruction and Assessment). These are
shortened versions of the tables embedded in the NGSS Innovations discussion. If necessary,
review the descriptions of NGSS Innovations 1 and 2, and answer the following questions for
each criterion in the prescreen:
1. What would it look like for a student or teacher resource to be organized in a way that
demonstrates this innovation?
2. How would the content of a student or teacher resource look different if it were
demonstrating this innovation?
There are three forms to use, one for each criterion, to collect and articulate this evidence: Tool
1A: PEEC Prescreen Response Form (Phenomena), Tool 1B: PEEC Prescreen Response Form
(Three Dimensions), and Tool 1C: PEEC Prescreen Response Form (Three Dimensions for
Instruction and Assessment). See Table 8 below as an example. For each criterion, the recorded
evidence should answer the question in the table: "What was in the materials, where was it,
and why is this evidence?"
During this stage of the work, it is important to remember that this is a prescreen and not the
full evaluation. It is not necessary to find every piece of evidence in the program; instead, make
a relatively quick pass through the materials. In materials that at least show promise for being
designed for the NGSS, it should not be difficult to see evidence of at least an attempt to
address these innovations. The degree to which these innovations are truly designed into the
materials will be evaluated in more detail later in this process.
Evidence this criterion is NOT designed into this instructional materials program. What was in
the materials, where was it, and why is this evidence?
• Page iii: the table of contents is organized by "typical" science topics; the unit and
chapter titles give no indication that students are making sense of phenomena or
designing solutions to problems.
• Page 115 (Unit 4 teacher text): the teacher support for using the phenomena of this unit
only talks about using the phenomena as hooks or engagement; it positions the teacher
to explain the phenomena rather than the students.

Evidence this criterion IS designed into this instructional materials program. What was in the
materials, where was it, and why is this evidence?
• Pages 15–47 (Unit 1 student text): though the title of this unit is "cells," it engages
students with making sense of a series of phenomena; student explanations of several
smaller phenomena support students to explain a larger phenomenon.
• Pages 124–177 (Unit 5 student text): this unit explicitly incorporates the engineering
design process; it is not just for enrichment or a culminating activity; it is not just a
directions-following activity.

☐ Shows promise?
Is there enough evidence to check the “shows promise?” box for each criterion?
Tool 1A: PEEC Prescreen Response Form (Phenomena), Tool 1B: PEEC Prescreen Response Form
(Three Dimensions), and Tool 1C: PEEC Prescreen Response Form (Three Dimensions for
Instruction and Assessment) all include a “shows promise?” checkbox that should be considered
once the evidence has been recorded on the tool.
Is there enough evidence across the three criteria to warrant further review?
All three criteria should have their “Shows promise” box checked to indicate that there is
sufficient initial evidence that the instructional materials program is designed to address these
first two key innovations of the NGSS. If instructional materials programs that do not meet this
expectation are carried over to the next step in this process, it should be done with the
awareness that this will require more time, effort, and energy in the review process.
Wrapping Up a Prescreen
After applying the PEEC Prescreen across the instructional materials programs that are being
considered, those that don’t meet the fundamental criteria of the prescreen should be set
aside. They can always be analyzed later if none of the initial materials measures up, but the
remaining analyses are more time- and resource-intensive, so focus on the programs that have
the clearest prescreen evidence of NGSS design.
Each member of the review group should complete Tool 2: PEEC Prescreen: Recommendation
for Review? to document their final analysis.
Summary: PEEC Phase 2: Unit Evaluation uses the EQuIP Rubric for Science to dig deep into a
given unit of an instructional materials program.
Process:
4. Select a single unit from the instructional materials program in question to analyze. Use
Tool 3: Unit Selection Table.
5. Apply the EQuIP Rubric for Science to the unit you have selected.
6. Connect the EQuIP Rubric for Science to the NGSS Innovations using Tool 4: EQuIP
Rubric Data Summary.
Once instructional materials programs have been established by the PEEC Phase 1: Prescreen to
at least have the appearance of being designed for the NGSS, the next step is to look at a full
unit to evaluate evidence for the rest of the NGSS Innovations. Fortunately, a tool already exists
for this type of evaluation: the Educators Evaluating the Quality of Instructional Products (EQuIP)
Rubric for Science provides criteria by which to measure the alignment and overall quality of
lessons and units with respect to the NGSS. The EQuIP Rubric for Science guides reviewers to
look for evidence of three categories of NGSS Design, as shown in Table 9.
3. Monitoring NGSS Student Progress: The unit supports monitoring student progress in all
three dimensions of the NGSS as students make sense of phenomena and/or design solutions
to problems.
Selecting a Unit
There are a variety of factors to consider in selecting a single unit to represent an instructional
materials program in the unit evaluation process. These include: the length of the unit;
similarity of units across programs; evaluator expertise; and available resources for review.
These features are described in this section. Tool 3: Unit Selection Table should be used by
groups to make the unit selection.
Different instructional materials programs may define a “unit” in different ways, so it will be
important to look across the programs that have cleared the prescreen and select a portion of
the program that has a comparable length of instruction. Generally, a unit is a collection of
lessons in an intentional sequence tied to a learning goal. Units usually take longer than two
weeks of classroom time to complete, whereas lessons take a few days.
To be able to effectively apply the EQuIP Rubric for Science, a selected unit should include
sufficient length for students to:
• Explain at least one phenomenon and/or design a solution to at least one problem;
• Engage in at least one three-dimensional student performance; and
• Have their learning measured across the three dimensions of the standards.
As instructional materials programs are being designed for the NGSS and focusing more on
students using the three dimensions to make sense of phenomena and design solutions to
problems, it is quite possible that the units may not be as easily comparable in topic and
organization as they once were. For example, most current high school biology texts have a
single unit focused on photosynthesis. However, as instructional materials programs
designed with the NGSS Innovations in mind are developed, the DCI information related to
photosynthesis may be spread out through both Chemistry and Biology courses, and the
concepts might be developed through several different instructional units. Since developers will
likely not all make curriculum design decisions in the same way, finding the right unit to
compare may become increasingly difficult. A plan should be made to ensure that a comparable
unit is selected across programs.
In considering which unit to review in each program, it is also important to consider the
expertise of the review team. Review team members’ understanding of the three dimensions of
the standards addressed in the unit being reviewed will affect the quality of their reviews. As an
obvious example, a physics teacher may not have the deep understanding of cellular
respiration needed to evaluate a photosynthesis unit. Similarly, a review team without a deep
understanding of the grade-level expectations of a CCC might not catch it when the CCC that is
addressed in a unit is at a much lower grade level. But expertise can cut both ways: reviewers
with deep knowledge of the DCIs in the unit may be able to better recognize deficiencies in how
the DCIs are addressed, but they also might read between the lines to see connections that are
not explicit in the program—they might see connections that teachers implementing the
materials may not. It is important to know the review team's expertise, to deliberately take this
into account in the selection of the unit for review, and to support the team to be aware of their
own strengths and weaknesses as they are reviewing materials.
As always, these factors will need to be balanced with the resources—people, time, and
money—that are available. A longer selection will give a better look at what the program offers,
but it will also take more resources to evaluate. Having multiple groups look at each resource
and compare their evaluations will provide a more balanced evaluation, and the ensuing
conversations, if properly facilitated, can help prepare teachers to implement the materials
once they are selected. However, this requires a greater time commitment from those
participating in the review.
For each program being reviewed, identify which unit will be reviewed and explain why that
unit was selected in Tool 3: Unit Selection Table.
The EQuIP Rubric for Science provides a close look at a single unit, but in programs designed for
the NGSS, the NGSS Innovations need to build across the program. For each of the Innovations,
this means looking for evidence beyond just the unit that was evaluated in PEEC Phase 2.
For example, the unit may have provided multiple and varied opportunities for students to ask
scientific questions based on their experiences—clearly engaging students in the SEP “Asking
Questions and Defining Problems”—but the scope of the unit may have been limited to
developing a particular element of the SEP (e.g., only asking scientific questions without
opportunities to define criteria and constraints associated with the solution to a problem) or to
developing student facility with a particular element to a certain degree (e.g., appropriately
removing scaffolds for development within the unit but not for the full expression of the SEP;
only beginning to connect this SEP to other relevant SEPs).
To do this across the entire instructional materials program, PEEC uses a different lens for
evaluation. In this phase of evaluation, the student and teacher materials are evaluated to look
for evidence of claims that would be expected to be present in materials designed for the NGSS.
This will build on the evidence base of the PEEC Prescreen and Unit Evaluation to move
reviewers to a final decision about which program to select.
1. Sample three learning sequences consisting of four to five lessons per sequence.
Building on the unit analysis in Phase 2, this sample should allow reviewers to look for
the development and use of the three dimensions together over time in service of
students progressively making sense of phenomena.
2. Intentionally select one learning sequence from the beginning third of the program,
one in the middle third, and one in the final third to ensure that instruction from across
the full span of the program is represented.
Once the evidence has been recorded, evaluate the degree to which there is evidence of each
criterion. Use the following as guidance for evaluating the categories/samples:
• No Evidence: There is no evidence to support the claim in the sampled materials.
• Inadequate Evidence: There are a few instances of evidence to support the claim, but
they are intermittent or do not constitute adequate time or opportunity for students to
learn the content or develop the ability.
• Adequate Evidence: Evidence for this claim is common, and there is adequate time,
opportunity, and support for all students to learn the content and develop the abilities.
• Extensive Evidence: Evidence for this claim is pervasive throughout the program and
there is adequate time, opportunity, and support for all students to learn the content
and develop the abilities.
These ratings of the quality of evidence supporting each claim should be done first individually
and then discussed as a group to reach consensus.
Finally, based on the evidence collected and the pattern of checks, complete the bottom
portion of the Tool that asks reviewers to decide the degree to which the innovation shows up
across the program. For materials that only partially incorporate the innovation, provide
suggestions for what will be needed: professional learning; additional lessons, units, or
modules; developing a district-wide approach to using the crosscutting concepts (because they
are not well represented in the materials); etc.
Repeat this process for the remaining four NGSS Innovations by completing Tool 5B: Program-
Level Evaluation Innovation 2: Three-Dimensional Learning, Tool 5C: Program-Level Evaluation
Innovation 3: Building Progressions, Tool 5D: Program-Level Evaluation Innovation 4: Alignment
with English Language Arts and Mathematics, and Tool 5E: Program-Level Evaluation
Innovation 5: All Standards, All Students.
These additional criteria should be present in all high-quality science instructional materials, but
they are not specific to the NGSS.
Crosscutting Concepts (CCC). These are concepts that hold true across the natural and
engineered world. Students can use them to make connections across seemingly disparate
disciplines or situations, connect new learning to prior experiences, and more deeply engage
with material across the other dimensions. The NGSS requires that students explicitly use their
understanding of the CCCs to make sense of phenomena or solve problems.
Disciplinary Core Ideas (DCI). The fundamental ideas that are necessary for understanding a
given science discipline. The core ideas all have broad importance within or across science or
engineering disciplines, provide a key tool for understanding or investigating complex ideas and
solving problems, relate to societal or personal concerns, and can be taught over multiple grade
levels at progressive levels of depth and complexity.
EQuIP Rubric for Science. Educators Evaluating Quality in Instructional Products (EQuIP) for
science is a tool and accompanying process for evaluating how well an individual lesson or
single unit (series of related lessons) is designed to support students developing the knowledge
and practice described by the Framework and the NGSS.
The Framework. A shortened title for the 2012 foundational report A Framework for K-12
Science Education: Practices, Crosscutting Concepts, and Core Ideas, published by the National
Research Council (NRC), which describes the scientific consensus on the science knowledge and skills
students should acquire during their K-12 experience. A team of states, coordinated by Achieve,
took the Framework and used it to develop the Next Generation Science Standards. The
Framework is available online in a variety of formats from the National Academies Press.
Instructional Materials. Tools used by teachers to plan and deliver lessons for students.
Generally, instructional materials include activities for daily instruction (“lessons”) that are
organized into sequences (“units” or “chapters”).
Instructional Materials Program. A set of instructional materials that spans a substantial amount of
time or instruction, generally a full course (e.g. a biology textbook) or a middle-grades science
sequence. Distinguished from less comprehensive instructional materials, such as those that
focus on only a few days or weeks of instruction or on a single content area.
Learning Sequence. Several connected and sequential lessons that build student understanding
toward a set of learning goals progressively, over the course of weeks (as opposed to days).
Lesson. A set of instructional activities and assessments that may extend over several class
periods or days; it is more than a single activity.
NGSS Innovations. This document describes five NGSS Innovations that describe and explain
what is new and different about the NGSS, particularly regarding instructional materials design
and selection. The NGSS Innovations build on the conceptual shifts described in Appendix A of
the NGSS.
PEEC. Primary Evaluation of Essential Criteria (PEEC) takes the compelling vision for science
education as described in A Framework for K–12 Science Education and embodied in the Next
Generation Science Standards (NGSS) and operationalizes it for two purposes:
1. to help educators determine how well instructional materials under consideration have
been designed for the Framework and NGSS, and
2. to help curriculum developers construct and write science instructional materials that
are designed for the Framework and NGSS.
Performance Expectations (PEs). The NGSS are organized into a set of expectations for what
students should be able to do by the end of a period of instruction, generally measured by
years of schooling. The performance expectations describe the learning goals or outcomes for
students. Each performance expectation describes what students who demonstrate
understanding can do, often with a clarification statement that provides examples or additional
emphasis for the individual performance expectation. An assessment boundary guides the
developers of large-scale assessments. Each performance expectation is derived from a set of
disciplinary core ideas, crosscutting concepts, and science and engineering practices that are
defined in the Framework. Note that like all sets of standards, the NGSS do not prescribe the
methods or curriculum needed to reach these outcomes.
Phenomena. Observable events that students can use the three dimensions to explain or make
sense of. Lessons designed for the NGSS focus on explaining phenomena or designing solutions
to problems. Some additional resources about phenomena are available on the NGSS website.
Science and Engineering Practices (SEP). The practices are what students do to make sense of
phenomena. They encompass both skills and knowledge to be internalized. The SEPs
reflect the major practices that scientists and engineers use to investigate the world and design
and build systems.
Three-Dimensional Learning. Learning that integrates all three dimensions of the NGSS, allowing
students to actively engage with the practices and apply the crosscutting concepts to
deepen their understanding of core ideas across science disciplines.
Three Dimensions. As described in the Framework, these are the three strands of knowledge
and skills that students should explicitly be able to use to explain phenomena and design
solutions to problems. The three dimensions are the Disciplinary Core Ideas (DCIs), Crosscutting
Concepts (CCCs), and Science and Engineering Practices (“the Practices” or SEPs).
PEEC supports educators, developers, and publishers. For educators, the evaluation tool
clarifies what to look for when identifying or selecting instructional materials programs and
assessments for the NGSS. For developers and publishers, PEEC provides guidance on what to
focus on and integrate when designing instructional materials programs for the NGSS. This tool
(1) prepares educators to accurately identify, select, or evaluate resources and (2) enables
developers and publishers to effectively design resources that meet the criteria for the NGSS.
Question 2: How do the five innovations described in PEEC differ from the “conceptual
shifts” in Appendix A of the NGSS and the implications of the vision of the
Framework and the NGSS from the Guide to Implementing the NGSS?
PEEC focuses on what makes the NGSS new and different from past science standards. These
differences were first articulated as conceptual shifts in Appendix A of the standards. These
conceptual shifts still hold true today, but four years of standards implementation has refined
the understanding of what is unique about the NGSS and has revealed that these shifts
represent innovations in science teaching and learning.
The EQuIP Rubric for Science is designed to evaluate learning sequences and units for the
degree to which they are designed for the NGSS. It is embedded within PEEC as the tool for
evaluating a sample unit from the program as Phase 2 in the PEEC process. The evaluation from
this phase is combined with the PEEC Phase 1: Prescreen and PEEC Phase 3: Program Evaluation
to give an overall picture of how well the instructional materials program is designed for the
NGSS.
Question 4: Is this a science version of the Publishers’ Criteria that were developed for the
Common Core State Standards for mathematics?
Both PEEC and the Publishers’ Criteria documents are intended to inform both the developers
of instructional materials and those selecting which materials to use. The NGSS
Innovations in PEEC highlight the key differences between the NGSS and previous sets of standards and
clarify how these innovations should be represented in instructional materials.
Question 5: I'm interested in working with Achieve to train my teachers on how to use PEEC
to evaluate instructional materials. What should I do?
If you are interested in hiring Achieve to facilitate professional learning to support your district
team in using PEEC to select instructional materials, please contact peec@achieve.org. Training
for effective use takes a minimum of two days if the entire group has already received
professional learning for, and is comfortable using, EQuIP, and a minimum of four days if the
group is not proficient in using EQuIP.
PEEC is designed to support building and district-level selection of year-long (or longer)
instructional materials programs designed for the NGSS. Sometimes this task falls to teachers to
coordinate. PEEC provides guidelines for a process that teams can use to evaluate instructional
materials programs.
If you are not part of your school or district’s instructional materials program selection process,
but you want to make sure that the process is focusing on the appropriate criteria, share and
discuss this tool with those responsible for making these decisions.
If you are looking for support in transitioning your classroom lessons and units, you may want
to review the NGSS Lesson Screener or the EQuIP Rubric for Science.
While principals are not the primary audience for PEEC, there are several ways that it might be
relevant to your work. Some principals help with the selection of instructional materials for
their school or district, and PEEC includes both criteria and a process that can be used for that
purpose. If selecting instructional materials programs is not a part of your duties, then share and
discuss this tool with the science teachers and administrators who are responsible for making
these decisions.
Question 8: I’m a district science leader or curriculum coordinator. How should I use PEEC?
If you’re in charge of coordinating the selection of science instructional materials, PEEC is built
to help your team make good decisions about what materials to purchase (or even to wait to
purchase materials until you find something that better matches your expectations): the NGSS
Innovations described in PEEC will help your selection team to develop a common
understanding of what to look for in materials designed for the NGSS; PEEC Appendix A will
help you to think about building your team and fitting materials selection into your broader
implementation plan for science; and the three phases of the PEEC process will help you to
design the process that you use for materials selection. If your team is already well-versed in A
Framework for K-12 Science Education and the NGSS, anticipate about three full days of
professional learning to prepare your team for this effort and then several days to dig in and
evaluate the materials (depending on how many materials are evaluated).
Question 9: I’m a developer or publisher of science instructional materials. How should I use
the PEEC tool?
The NGSS Innovations section of PEEC describes the most significant changes from past science
standards to the NGSS and their implications for instructional materials. These innovations
should focus the efforts to design materials for the NGSS and should be clearly apparent to
those making instructional materials selection decisions. A developer might also use the PEEC
processes and tools internally to self-evaluate a program under development.
If you are interested in professional learning for your development staff to better understand
the evaluations and apply the rubric, or are interested in a confidential review of your
materials, please contact peec@achieve.org to discuss your needs in greater depth.
Question 10: Some instructional materials are more expensive than others. Why doesn’t
PEEC include cost estimates?
PEEC does not attempt to measure all things that might be considered in selecting instructional
materials. It is focused on evaluating how well an instructional materials program is designed
for the NGSS.
Question 11: How is this document different from the Guidelines for the Evaluation of
Instructional Materials in Science?
The Guidelines for the Evaluation of Instructional Materials in Science is not a tool or process for
evaluating instructional materials; rather, it describes the research base for evaluative criteria
that should be considered in building tools and processes for evaluating instructional materials
designed for the NGSS. Its development was informed by early versions of EQuIP Rubric for
Science and PEEC, and it informed the most recent version of PEEC. The criteria for all three
phases of PEEC have a close connection to those presented in the Guidelines.
Question 12: This document is listed as “Version 1.1”. What’s different from version 1.0?
One of the pieces highlighted for revision in version 1.0 was, “Iterating the Innovations. How
can the arguments and discussion about the five NGSS Innovations be more clear and
straightforward?” We received feedback from users in the field and from field testing that
helped us to revise the language of the innovations to better convey their original intent. In
particular, version 1.1:
• highlights the importance of equity and access for all students as foundational to all five
innovations;
• separates the NGSS Innovations from their implications for instructional materials in the
NGSS Innovations section;
• revises the wording of the NGSS Innovations for clarity.
As was the case with the EQuIP Rubric for Science, we expect that as more and more teachers,
schools, districts, authors, developers, and publishers use PEEC, the feedback loops in that
process will lead to ongoing improvements in PEEC. Please send comments and suggestions to
peec@achieve.org.
Guidelines Alignment. Version 1.2 of PEEC will include a full description of alignment to the
Guidelines for the Evaluation of Instructional Materials in Science.
Sampling. More specific guidance will be provided about how to sample instructional materials
programs to best balance a rigorous review with the time commitment of the reviewers.
Evidence. More examples and specifics will be provided about what users should classify as
evidence, along with support for determining whether the quantity and quality of evidence
collected is sufficient to justify a particular claim.
Utility. The forms and tools will be made more useful for users, including templates and fillable
forms.
PEEC Professional Learning Facilitator’s Guide Coordination. As with the EQuIP Rubric for
Science, a guide is currently under development to support leaders looking to facilitate
professional learning for a selection team. Future versions of PEEC will build a tighter
connection to this guide. The guidance will include:
• Streamlined processes for time-constrained users. Guidance will be provided for how
to adapt the PEEC tools and processes for situations that do not allow for the full
process due to resource limitations.
• Streamlined presentation of the document and related resources. PEEC’s design will be
enhanced to better support users who want to adapt it to meet local needs.
• Teaming and Decision Making. More detailed support about how to put together a
materials selection team, how to manage and facilitate the decision-making processes
within that team, and how to connect instructional materials review to a broader
implementation plan.
PEEC is a work in progress. Please send comments and suggestions for improvement to
peec@achieve.org.
American Association for the Advancement of Science. (1993). Benchmarks for Science Literacy.
New York, NY: Oxford University Press.
BSCS. (2017). Guidelines for the Evaluation of Instructional Materials in Science. Retrieved from
http://guidelinesummit.bscs.org.
Krajcik, J., Codere, S., Dahsah, C., Bayer, R., and Mun, K. (2014). Planning Instruction to Meet
the Intent of the Next Generation Science Standards. Journal of Science Teacher Education,
157–75.
Lee, O., Quinn, H., and Valdés, G. (2013). Science and Language for English Language Learners in
Relation to Next Generation Science Standards and with Implications for Common Core State
Standards for English Language Arts and Mathematics. Educational Researcher, 223–33.
National Academies of Sciences, Engineering, and Medicine. (2017). Seeing Students Learn
Science: Integrating Assessment and Instruction in the Classroom. Washington, DC: The National
Academies Press. doi: 10.17226/23548.
National Research Council. (1996). National Science Education Standards. Washington, DC: The
National Academies Press.
National Research Council. (2007). Taking Science to School: Learning and Teaching Science in
Grades K-8. Washington, DC: The National Academies Press.
National Research Council. (2012). A Framework for K–12 Science Education. Washington, DC:
The National Academies Press.
National Research Council. (2014). Developing Assessments for the Next Generation Science
Standards. Washington, DC: The National Academies Press.
National Research Council. (2015). Guide to Implementing the Next Generation Science
Standards. Washington, DC: The National Academies Press.
Quinn, H., Lee, O., & Valdés, G. (2012). Language demands and opportunities in relation to Next
Generation Science Standards for English language learners: What teachers need to know.
Stanford, CA: Stanford University, Understanding Language Initiative (ell.stanford.edu).
NGSS Lead States. (2013). Next Generation Science Standards: For States, By States. Washington, DC: The National Academies Press.
Warren, B., Ballenger, C., Ogonowski, M., Rosebery, A. S., and Hudicourt-Barnes, J. (2001).
Rethinking diversity in learning science: The logic of everyday sense making. Journal of Research
in Science Teaching, 529–52.
Making Sense of Phenomena and Designing Solutions to Problems: The instructional materials program focuses on supporting
students to make sense of a phenomenon or design solutions to a problem.
NGSS-designed programs will look less like this:
• Making sense of phenomena and designing solutions to problems are not a part of student learning or are presented separately from “learning time” (i.e. used only as a “hook” or engagement tool; used only for enrichment or reward after learning; only loosely connected to a DCI).
• The focus is only on getting the “right” answer to explain a phenomenon or replicating a known solution to a problem.
• A different, new, or unrelated phenomenon is used to start every lesson.
• Phenomena are brought into learning after students develop the science ideas so students can apply what they learned.

NGSS-designed programs will look more like this:
• The purpose and focus of a learning sequence is to support students in making sense of phenomena and/or designing solutions to problems. The entire sequence drives toward this goal.
• Student sense-making of phenomena or designing of solutions is used as a window into student understanding of all three dimensions of the NGSS.
• Lessons work together in a coherent storyline to help students make sense of phenomena.
• The development of science ideas is anchored in making sense of phenomena or designing solutions to problems.
Using the chart below, record evidence that would indicate that the instructional materials program is designed for each criterion, as well as evidence that the program is not designed for each criterion.

Evidence this criterion IS NOT designed into this instructional materials program (What was in the materials, where was it, and why is this evidence?) | Evidence this criterion IS designed into this instructional materials program (What was in the materials, where was it, and why is this evidence?) | Shows Promise?
Three Dimensions: Students develop and use grade-appropriate elements of the science and engineering practices (SEPs),
disciplinary core ideas (DCIs), and crosscutting concepts (CCCs), which are deliberately selected to aid student sense-making of
phenomena or designing of solutions across the learning sequences and units of the program.
NGSS-designed programs will look less like this:
• The SEPs and CCCs can be inferred by the teacher (not necessarily the students) from the materials.

NGSS-designed programs will look more like this:
• Students explicitly use the SEP and CCC elements to make sense of the phenomenon or to solve a problem.
Using the chart below, record evidence that would indicate that the instructional materials program is designed for each criterion, as well as evidence that the program is not designed for each criterion.

Evidence this criterion IS NOT designed into this instructional materials program (What was in the materials, where was it, and why is this evidence?) | Evidence this criterion IS designed into this instructional materials program (What was in the materials, where was it, and why is this evidence?) | Shows Promise?
Integrating the Three Dimensions for Instruction and Assessment: The instructional materials program requires student
performances that integrate elements of the SEPs, CCCs, and DCIs to make sense of phenomena or design solutions to problems,
and the learning sequence elicits student artifacts that show direct, observable evidence of three-dimensional learning.
NGSS-designed programs will look less like this:
• Students learn the three dimensions in isolation from each other (e.g., a separate lesson or activity on science methods followed by a later lesson on science knowledge).
• Teachers assume that correct answers indicate student proficiency without the student providing evidence or reasoning.
• Teachers measure only one dimension at a time (e.g., separate items for measuring SEPs, DCIs, and CCCs).

NGSS-designed programs will look more like this:
• The learning sequence is designed to build student proficiency in at least one grade-appropriate element from each of the three dimensions.
• Teachers deliberately seek out student artifacts that show direct, observable evidence of learning, building toward all three dimensions of the NGSS at a grade-appropriate level.
• Teachers use tasks that ask students to explain phenomena or design solutions to problems, and that reveal the level of student proficiency in all three dimensions.
Using the chart below, record evidence that would indicate that the instructional materials program is designed for each criterion, as well as evidence that the program is not designed for each criterion.

Evidence this criterion IS NOT designed into this instructional materials program (What was in the materials, where was it, and why is this evidence?) | Evidence this criterion IS designed into this instructional materials program (What was in the materials, where was it, and why is this evidence?) | Shows Promise?
Reminder
The purpose of the PEEC Prescreen is to give a quick look at an instructional materials program. Significant aspects of what would be
expected in a fully vetted program designed for the NGSS are not addressed in this tool; it should not be used to
fully vet resources or to claim that a program is designed for the NGSS.
Recommendation
Unit Description: Instructional Materials Program Name | Unit (title and page numbers) | Why this unit?
III. A. Monitoring 3D Student Performances: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
Depending on how many programs made it to this phase of the analysis, the EQuIP Rubric for Science evaluations may be used to
continue to narrow the field of instructional materials programs being evaluated. After consensus reports have been generated for
each unit, the review team should evaluate whether all programs warrant further review. Unless the separation in
quality is very small, it is recommended that only the top two or three programs continue to the final phase of the PEEC process.
Directions
Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.
Claim | Evidence | Sufficient evidence to support the claim?
1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the
program?
3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be
needed for teachers to use the materials in a way that incorporates the innovation in their instruction.
Directions
Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.
Claim | Evidence | Sufficient evidence to support the claim?
Claim: Student materials include accessible and unbiased formative and summative assessments that provide clear evidence of students’ three-dimensional learning.
What to look for as evidence in the student materials:
• Materials regularly elicit direct, observable evidence of three-dimensional learning (SEP, DCI, CCC);
• Materials include authentic and relevant tasks that require students to use appropriate elements of the three dimensions; and
• Materials provide a range of item formats, including constructed-response and performance tasks, which are essential for the assessment of three-dimensional learning consonant with the Framework and the NGSS.
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
Claim: Over the course of the program, a system of assessments coordinates the variety of ways student learning is monitored to provide information to students and teachers regarding student progress for all three dimensions of the standards.
What to look for as evidence in the assessment system:
• Consistent use of pre-, formative, summative, self-, and peer-assessment measures that assess three-dimensional learning;
• Consistent support for teachers to adjust instruction based on suggested formative classroom tasks; and
• Support for teachers and other leaders to make program-level decisions based on unit, interim, and/or year-long summative assessment data.
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the
program?
3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be
needed for teachers to use the materials in a way that incorporates the innovation in their instruction.
Directions
Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.
Claim | Evidence | Sufficient evidence to support the claim?
1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the
program?
2. Reviewer Notes/Comments
3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be
needed for teachers to use the materials in a way that incorporates the innovation in their instruction.
Directions
Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.
Claim | Evidence | Sufficient evidence to support the claim?
1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the
program?
3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be
needed for teachers to use the materials in a way that incorporates the innovation in their instruction.
Directions
Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.
Sufficient evidence
to support the
Claim Evidence claim?
1. Based on the evidence collected, to what degree to the materials incorporate this innovation over the course of the
program?
3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be
needed for teachers to use the materials in a way that incorporated the innovation in their instruction.
Directions
Complete the table below by transferring the data from each of the three phases of PEEC.

Making Sense of Phenomena & Designing Solutions to Problems | Shows Promise? ☐ | ☐ Materials partially incorporate the innovation. | ☐ Materials partially incorporate the innovation.
Directions
Reflect on the summary table and the other evidence collected to make a final claim about whether the instructional materials program is designed to provide adequate and appropriate opportunities for students to meet the performance expectations of the NGSS. Once this claim is established, explain how the data in Tool 6: Program-Level Evaluation Evidence Summary support the conclusion, highlighting the most compelling evidence from each phase of PEEC. After establishing the evidence for the claim, summarize any recommendations for what would need to happen during implementation of the materials to address weaknesses identified in the analysis.
Claim
Title of instructional materials under review: ______________________________________ (does/does not) provide adequate and
appropriate opportunities for students to meet the performance expectations of the NGSS.
Evidence-Based Response
Recommendations