
Primary Evaluation of Essential Criteria (PEEC) for Next Generation Science Standards Instructional Materials Design

Version 1.1—December 2017

PEEC version 1.1 page 1 of 93


Acknowledgements
PEEC was developed in a collaborative and iterative process managed by Achieve, including a
public draft review in summer 2015 and small group focused review sessions thereafter. The
following individuals were major contributors:

Rodger Bybee, Executive Director, Biological Sciences Curriculum Study (retired)

Cheryl Kleckner, State Science Education Specialist for Oregon (retired)

Phil Lafontaine, State Science Supervisor for California (retired)

Focus group feedback was provided by the following organizations: Association of American
Publishers, Council of Chief State School Officers, Council of the Great City Schools, Council of
State Science Supervisors, Hands on Science Partnership, K-12 Alliance, National Science
Education Leadership Association, and National Science Teachers Association.

The timing for revisions coincided with the revision of the EQuIP Rubric for Science 3.0, a
process that was coordinated by Achieve with ongoing input from many of the organizations
mentioned above. In addition, the PEEC Prescreen process was piloted with a group of
educators during a course at the National NSTA Conference in Los Angeles. A special thanks to
this group for their feedback during the course, and especially to Jaqueline Rojas, who provided
detailed feedback on the whole document following the pilot.

Legal
This document is offered under the Creative Commons Attribution 4.0 International (CC BY 4.0)
license.

Table of Contents
Acknowledgements 2
Legal 2
Table of Contents 3
Executive Summary 5
What is PEEC? 5
Why PEEC? 7
The NGSS Innovations and Instructional Materials 10
Innovation 1: Making Sense of Phenomena and Designing Solutions to Problems 10
Innovation 2: Three-Dimensional Learning 13
Innovation 3: Building K–12 Progressions 17
Innovation 4: Alignment with English Language Arts and Mathematics 24
Innovation 5: All Standards, All Students 25
Using PEEC to Evaluate Instructional Materials Programs 30
States and PEEC 31
School Districts and PEEC 31
Developers, Writers, and PEEC 31
PEEC Phase 1: Prescreen 32
Preparing to use PEEC 34
Applying the PEEC Prescreen 36
Analyzing Results from A Prescreen 37
Wrapping Up a Prescreen 38
PEEC Phase 2: Unit Evaluation 39
Selecting a Unit 40
Applying the EQuIP Rubric for Science 42
Connecting the EQuIP Rubric for Science to the NGSS Innovations: 42
PEEC Phase 3: Program-Level Evaluation 43
Creating A Sampling Plan 44
Reviewing Claims and Evidence from The Sample 45
Summing Up 46
Beyond PEEC 47
Student Instructional Materials 47
Teacher Instructional Materials and Support 47
Equitable Opportunity to Learn in Instructional Materials 47
Assessment in Instructional Materials 48
Glossary 49
Frequently Asked Questions 52
References 57

Tool 1A: PEEC Prescreen Response Form (Phenomena) 59
Tool 1B: PEEC Prescreen Response Form (Three Dimensions) 61
Tool 1C: PEEC Prescreen Response Form (Three Dimensions for Instruction and Assessment) 63
Tool 2: PEEC Prescreen: Recommendation for Review? 65
Tool 3: Unit Selection Table 66
Tool 4: EQuIP Rubric Data Summary 67
Tool 5A: Program-Level Evaluation Innovation 1: Making Sense of Phenomena and Designing
Solutions to Problems 71
Tool 5B: Program-Level Evaluation Innovation 2: Three-Dimensional Learning 74
Tool 5C: Program-Level Evaluation Innovation 3: Building Progressions 78
Tool 5D: Program-Level Evaluation Innovation 4: Alignment with English Language Arts and
Mathematics 82
Tool 5E: Program-Level Evaluation Innovation 5: All Standards, All Students 85
Tool 6: PEEC Evidence Summary 88
Tool 7: Final Evaluation 91

Executive Summary

What is PEEC?
PEEC is an acronym for the Primary Evaluation of Essential Criteria for Next Generation Science
Standards (NGSS) Instructional Materials Design. Per the Guide to Implementing the Next
Generation Science Standards, high-quality instructional materials designed for the NGSS are a
critical component of NGSS implementation. PEEC is designed to:

• Bring clarity to the complicated and parallel processes of selecting and developing those
instructional materials;
• Help educators and developers to focus on the critical innovations within the NGSS via a
process to dig deeply into instructional materials programs to evaluate their presence;
and
• Answer the question “How thoroughly are these science instructional materials
programs designed for the NGSS?”

PEEC evaluates instructional material programs.

PEEC is intended to evaluate the NGSS design of instructional materials programs built for year-
long courses (e.g. high school biology), or programs that span several grade levels (e.g. a K–5
elementary science series, or a middle school sequence for grades 5–8). These instructional
materials programs may be commercially available, developed by states or districts, and/or
provided as open educational resources. The instructional materials to be evaluated can be
organized in any of a variety of digital and print formats (e.g. kits, modules, workbooks,
textbooks, textbook series).

PEEC is not intended for the evaluation of individual lessons or instructional units. For these
smaller grain sizes of instructional materials, it is more appropriate to use the NGSS Lesson
Screener or the EQuIP Rubric for Science, which are explicitly designed for this purpose. PEEC is
also not intended to be used with supplemental materials or instructional materials compiled
from several different sources (e.g., a combination of various textbooks, kits, modules, and
digital supplements assembled by the user) unless there is clear guidance for how the different
components will be used in the classroom to address the criteria highlighted in this evaluation.

PEEC describes the NGSS Innovations.

To determine the degree to which an instructional materials program is designed for the NGSS,
PEEC focuses on what makes the NGSS new and different from past science standards. These
differences were first articulated as “conceptual shifts” in Appendix A of the standards released
in 2013, but four years of subsequent implementation has refined our collective understanding
of what is unique about the NGSS and has revealed that these are not just shifts. These
differences represent innovations in science teaching and learning.

The five “NGSS Innovations” are:

1. Making Sense of Phenomena and Designing Solutions to Problems. Making sense of
phenomena or designing solutions to problems drives student learning.
2. Three-Dimensional Learning. Student engagement in making sense of phenomena and
designing solutions to problems requires student performances that integrate grade-
appropriate elements of the Science and Engineering Practices (SEPs), Crosscutting
Concepts (CCCs), and Disciplinary Core Ideas (DCIs) in instruction and assessment.
3. Building K–12 Progressions. Students’ three-dimensional learning experiences are
designed and coordinated over time to ensure students build understanding of all three
dimensions of the standards, nature of science concepts, and engineering as expected
by the standards.
4. Alignment with English Language Arts and Mathematics. Students engage in learning
experiences with explicit connections to and alignment with English language arts (ELA)
and mathematics standards.
5. All Standards, All Students. Science instructional materials support equitable access to
science education for all students.

Each of these innovations and their implications for instructional materials are described in
detail in this document. The NGSS Innovations are the lens that PEEC uses to help educators
evaluate instructional materials, and should be the focus of those developing instructional
materials for the NGSS.

It should be noted that there are certainly additional criteria for evaluating the quality of
instructional materials that are not the primary focus of this document, such as cost or ease of
use of any technological components. Their omission is not because they are unimportant, but
merely because they are not unique to materials designed for the NGSS. An initial discussion of
these issues is found in the Beyond PEEC section on page 47.

PEEC is a process.

PEEC is a process for schools, districts, or other teams of teachers to use to evaluate aspects of
instructional materials as described above. The PEEC evaluation process involves three
successive phases that are each explained in detail in this document.

1. PEEC Prescreen: The prescreen focuses on a small number of criteria that should be
readily apparent in instructional materials designed for the NGSS. This allows those
selecting materials to take a relatively quick look at a wide range of materials and
narrow the number of programs worthy of a closer look.
2. Unit Evaluation: If the prescreen of the materials indicates that there is at least the
potential that they are designed for the NGSS, the PEEC process uses the EQuIP Rubric
for Science as a sampling tool to evaluate a single unit of instruction for evidence it is
designed for the NGSS.
3. Program-Level Evaluation: For materials that show sufficient evidence of being
designed for the NGSS when they are evaluated with the EQUIP Rubric for Science, the
final phase of the PEEC process evaluates the evidence that the NGSS Innovations are
embedded across the entire instructional materials program.

PEEC builds on other tools.

To effectively use PEEC, instructional materials evaluators and developers should already be
fluent in the language of the Framework, be comfortable navigating the NGSS (including the
Appendices), and have experience applying the EQuIP Rubric for Science to evaluate units.
Users who are not familiar with these documents can find them and resources to support their
use at www.nextgenscience.org. PEEC also draws heavily from the discussions and evaluative
criteria in Guidelines for the Evaluation of Instructional Materials in Science—a document that
describes the research base for evaluative criteria that should be considered in building tools
for evaluating instructional materials designed for the NGSS. The criteria for all three phases of
PEEC have a close connection to those presented in the Guidelines.

PEEC continues to evolve.

PEEC represents the collective input, guidance, and efforts of many science educators around
the country. As their work continues, subsequent versions of PEEC will build on and incorporate
their experience.

We invite you to share your reactions to and suggestions for subsequent versions of PEEC by
emailing peec@achieve.org.

Why PEEC?
PEEC takes the compelling vision for science education as described in A Framework for K–12
Science Education and embodied in the NGSS and operationalizes it for two purposes:

1. To help educators determine how well instructional materials under consideration have
been designed for the Framework and NGSS; and
2. To help curriculum developers construct and write science instructional materials that
are designed for the Framework and NGSS.

The NGSS do not shy away from the complexity of effectively teaching and learning science.
They challenge us all to shift instructional materials to better support teachers as they create
learning environments that support all students to make sense of the world around them and
design solutions to problems. This vision is summarized in the following paragraph from the
Framework:

By the end of the 12th grade, students should have gained sufficient knowledge
of the practices, crosscutting concepts, and core ideas of science and
engineering to engage in public discussions on science-related issues, to be
critical consumers of scientific information related to their everyday lives, and
to continue to learn about science throughout their lives. They should come to
appreciate that science and the current scientific understanding of the world
are the result of many hundreds of years of creative human endeavor. It is
especially important to note that the above goals are for all students, not just
those who pursue careers in science, engineering, or technology or those who
continue on to higher education.

This vision is not only aspirational; it is based on scientific advances and educational research
about how students best learn science. This research and resulting vision for science education
have implications for instructional materials that reach far beyond minor adjustments to
lessons, adding callout boxes to margins, crafting a few new activities, or adding supplements
to curriculum units. The advances in the NGSS will be more successfully supported if entire
science instructional materials programs are designed with the innovations described by this
evaluation tool and if states, districts, and schools use this tool to ensure that the materials
they choose really measure up.

The word “designed” is intentionally and deliberately used here—and throughout the PEEC
materials—instead of “aligned.” This choice was made because alignment has come to
represent a practice that is insufficient to address the innovations in these standards.

When new standards are released, educators traditionally create a checklist or map in order to
determine how well their instructional materials match up with the standards. If enough of the
pieces of the standards match up with the pieces in the lessons or units or chapters, the
instructional materials are said to be “aligned.” In this sense, “alignment” is primarily
correlational and, if the correlation is not high enough, the only shift that is needed is to add
additional materials or remove particular pieces. This traditional approach to alignment
assumes that (1) matching content between the language of the standards and the instructional
materials is sufficient for ensuring that students meet the standards, and (2) that all approaches
to the way instructional experiences are designed in materials are created equally as long as the
content described by the standards appears.

However, the innovations of the Framework and NGSS cannot be supported by instructional
materials that simply have the same pieces and words as the standards. In the NGSS, academic
goals for students are stated as performance expectations that combine disciplinary core ideas,
crosscutting concepts, and science and engineering practices. The nature of this
multidimensional combination is as important as the presence of the constituent components,
and has implications for how students build the knowledge and skills needed to be able to meet
multidimensional standards. Thus, the word “designed” was chosen because it reflects the
degree to which the innovations represented by the standards are a foundational aspect of
both the design and content of the instructional materials.

This focus on these innovations speaks to the second purpose of PEEC: to support authors and
curriculum developers as they work to produce instructional materials for the NGSS. This
support began with NGSS Appendix A (The Conceptual Shifts in the Next Generation Science
Standards), and was soon followed by the first version of the Educators Evaluating the Quality
of Instructional Products (EQuIP) Rubric for Science that described what these shifts looked like
in instructional materials at the lesson and unit level. The EQuIP Rubric for Science has been
successively revised based on extensive use and feedback, and is now in its third version. The
lessons from the EQuIP process have been further articulated and codified to form the NGSS
Innovations section of PEEC. While different in scope, format, and structure from the
"Publishers' Criteria" developed for the Common Core State Standards, the core
intent of the innovations is similar: to help curriculum developers and curriculum users think
about how the standards should manifest themselves in instructional materials by focusing on
the aspects that are most central to meeting the demands of the NGSS and most different from
traditional approaches to standards, instruction, and materials. The goal is to help developers
more easily create and refine instructional materials, and to do so knowing that their efforts are
focused on the same innovations that schools, districts, and states will be using to select
instructional materials for use.

PEEC and Other Framework-based Standards

Although PEEC was explicitly and specifically designed to evaluate materials designed for the
NGSS and there are regular references to the NGSS throughout, the innovations that are part of
these standards are fundamentally rooted in the Framework. This means that states and
districts that did not adopt the NGSS, but that adopted standards based on the three
dimensions of the Framework should also be able to use it to evaluate instructional materials
that are developed for these key innovations.

The NGSS Innovations and Instructional Materials
The NGSS Innovations are the five most significant ways the NGSS advance science teaching and
learning, when compared to previous standards and typical instructional and curricular practice
in American schools. They build on the conceptual shifts described in Appendix A of the NGSS,
using lessons learned by educators and researchers since implementation efforts began, to
bring clarity and focus to what is truly innovative in the NGSS.

As the key ways that the NGSS are new and different, these innovations also provide the
intellectual framework PEEC uses to evaluate science instructional materials.

This section describes each of the five NGSS Innovations and provides insight into how these
innovations should appear in instructional materials. Each innovation is described with the
following components.

• A summary statement that distills the key idea of the innovation.
• A quote connecting each innovation to the research of the Framework.
• A detailed explanation of the innovation, often with links to portions of the NGSS.
• A description of what this innovation looks like in instructional materials.
• A table providing concrete examples of the changes this innovation describes in
instructional materials.

Innovation 1: Making Sense of Phenomena and Designing Solutions to Problems

Summary: Making sense of phenomena or designing solutions to problems drives student
learning.

From the Framework:

The learning experiences provided for students should engage them with
fundamental questions about the world and how scientists have investigated
and found answers to those questions.

Though “making sense of phenomena and designing solutions to problems” is not one of the
three dimensions of the standards and “phenomenon” or “problem” are not words often found
within the performance expectations, a close look will reveal that the ability of students to make
sense of phenomena and design solutions to problems is indeed a core feature of these
standards. The easiest place to see this explicitly is to look at the foundation boxes connected to
each performance expectation, or in Appendix F: Science and Engineering Practices and Appendix
G: Crosscutting Concepts. These appendices provide additional detail about learning
expectations in these two dimensions of the standards across grade levels and frequently
reference making sense of phenomena and/or designing solutions to problems.

Explaining phenomena and engineering design problems are not entirely new to science teaching
and learning—laboratory experiments have been a hallmark of science instruction for decades,
phenomena have frequently been used to “hook” students into learning, and engineering
activities have often been used for engagement or enrichment—but the expectation that they
are an organizing force for instruction is an innovation. By organizing instruction around
phenomena, students are provided with a reason to learn (beyond acquiring information they
are told they will later need), and the focus shifts from learning about a topic to figuring out
why or how something happens. Additionally, the focus on relevant, engaging phenomena and
design problems that students can access addresses diversity and equity considerations by
providing opportunities for students to make connections with the content based on their own
experiences and questions. This leads to deeper and more transferable knowledge and moves
everyone closer to the vision of the Framework.

Implications for Science Instructional Materials

As with science instruction, phenomena and problems are not new to science instructional
materials, but the shift to an expectation that student sense-making and problem-solving are
driving instruction means that materials will need to shift as well. In instructional materials
programs designed for the NGSS, this shift should be obvious in the organization and flow of
learning in student materials and a clear focus of the teacher supports for instruction and
monitoring student learning (see Table 1 for additional ways that making sense of phenomena
and designing solutions to problems
are different in the NGSS). This focus should be clear in even a quick scan through instructional
materials designed for the NGSS and, after a closer look, it should be clearly central to student
learning within lessons and units and coordinated over the whole program in a way that is
coherent for both students and teachers.

For more resources on how making sense of phenomena and designing solutions to problems
are important for teaching and learning designed for the NGSS, visit
https://www.nextgenscience.org/resources/phenomena.

The following table provides examples of what instructional materials programs designed for this
NGSS Innovation include “less” of and “more” of. This is not an exhaustive list, but is intended to
call out key evidence that should be looked for in evaluating instructional materials. It should also
be noted that “less” does not mean “never” and “more” does not mean “always.”

Table 1: Innovation 1—Making Sense of Phenomena and Designing Solutions to Problems

Instructional materials programs designed for the NGSS include:

Less: Focus on delivering disciplinary core ideas to students, neatly organized by related
content topics; making sense of phenomena and designing solutions to problems are used
occasionally as engagement strategies, but are not a central part of student learning.
More: Engaging all students with phenomena and problems that are meaningful and relevant;
that have intentional access points and supports for all students; and that can be explained or
solved through the application of targeted grade-appropriate SEPs, CCCs, and DCIs as the
central component of learning.

Less: Making sense of phenomena and designing solutions to problems separated from
learning (e.g., used only as an engagement tool to introduce the learning, only loosely
connected to a disciplinary core idea, or used as an end-of-unit or enrichment activity).
More: Students using appropriate SEPs and CCCs (such as systems thinking and modeling) to
make sense of phenomena and/or to design solutions to give a context and need for the ideas
to be learned.

Less: Instructions for students to "design solutions" as a step-by-step directions-following
exercise.
More: Students learning aspects of how to design solutions while engaged in the design
process.

Less: Only talking or reading about phenomena or how other scientists and engineers engaged
with phenomena and problems.
More: Students experiencing phenomena directly or through rich multimedia.

Less: Leading students to just getting the "right" answer when making sense of phenomena.
More: Using student sense-making and solution-designing as a context for student learning
and a window into student understanding of all three dimensions of the standards.

Innovation 2: Three-Dimensional Learning

Summary: Student engagement in making sense of phenomena and designing solutions to
problems requires student performances that integrate grade-appropriate elements of the
SEPs, CCCs, and DCIs in instruction and assessment.

From the Framework:

Instructional materials must provide a research-based, carefully designed
sequence of learning experiences that develop students' understanding of the
three dimensions and also deepen their insights in the ways people work to seek
explanations about the world and improve the built world.

That there are three dimensions in the NGSS—the science and engineering practices (SEPs), the
disciplinary core ideas (DCIs), and crosscutting concepts (CCCs)—is their most recognizable
feature. The innovation of these three dimensions, however, lies not just in their existence in
the standards, but in how they exist in the standards. The NGSS are designed to make the two
important parts of this innovation clear: 1) that all three dimensions are equally important
learning outcomes; and 2) that the integration of the three dimensions is key for student
learning.

It might seem like the existence of the three dimensions is the innovation, but each has a
predecessor in prior state standards and all three existed in many of those standards
documents in one way or another. Prior to the NGSS, the primary focus of most state standards
was on “science content” expected for students to know or understand. This “science content”
was the precursor of disciplinary core ideas. Many state standards also included at least one
standard that highlighted what students needed to know about how scientists do their work—
the precursor to the science and engineering practices. Often called “inquiry,” this was an
important component of many state standards documents. The precursors to the crosscutting
concepts were also included in state standards documents, but were often not in the standards
themselves. They were derived from the “Unifying Concepts and Processes” of the National
Science Education Standards (NRC 1996), the “Common Themes” of the Benchmarks for Science
Literacy (AAAS 2009), “themes” in Science for All Americans (AAAS 1989), and “crosscutting
ideas” in NSTA’s Science Anchors Project (2010).

How this information was organized in prior standards, however, conveyed a difference in the
relative importance of these three areas of student learning and these differences had a
significant impact on instruction, instructional materials, and assessments in science
classrooms. The “science content” portions took up the majority of the standards and because
of the sheer breadth of detailed information, most instruction that targeted the standards
focused on ways to disseminate this information to students. Though "inquiry" was highlighted
in prior standards documents, it was typically a single standard while many more were devoted
to science content. The crosscutting concepts predecessors were frequently addressed either in
the front matter of the standards documents and/or were buried in standards that were
viewed as supplemental to core learning.

The NGSS, on the other hand, include all three dimensions in performance expectations,
intentionally signaling that all three dimensions are equally important for student learning.
Students cannot fully demonstrate understanding of disciplinary core ideas without using the
crosscutting concepts while engaging in the science and engineering practices. At the same
time, they cannot learn or show competence in practices except in the context of specific
content.

Building student proficiency in all three dimensions is a significant innovation all by itself, but
the implication of this innovation goes beyond three separate strands of learning that are
equally valued. The power of the three dimensions comes in their integration. The fact that
these standards are written as three-dimensional performance expectations is significant and
intentional, and should be reflected in student learning experiences. The Framework makes it
clear that, “In order to achieve the vision embodied in the framework and to best support
students’ learning, all three dimensions need to be integrated into the system of standards,
curriculum, instruction, and assessment” (2012). Students develop and apply the skills and
abilities described in the practices, as well as use the CCCs to make sense of phenomena and
make connections between different DCIs in order to help gain a better understanding of the
natural and designed world. The SEPs and CCCs provide multiple access points for students to
approach learning goals, enabling different students in different contexts to access the same
ideas. Simply parsing these dimensions back out into separate entities to be learned and
assessed in isolation misses the vision of the NGSS and the Framework.

It is also important to clarify that the NGSS were designed to be endpoints for a grade level (K–
5) or grade band (6–8; 9–12), and that they collectively describe what students should know
and be able to do at that endpoint. The exact pairings of the dimensions in the PEs should not
limit how the dimensions are integrated during classroom instruction and assessment. Because
the very architecture of the NGSS models three-dimensionality, a PE might seem like a
ready-made classroom lesson or unit, but it is not the intent of the NGSS to have students simply "do the
PEs.” Since the PEs are written as grade-level endpoints, they often contain elements of the
dimensions that may need to be taught at different times of the year. For example, a PE may
include a DCI that fits early in a year of instruction, but also a more advanced level of a CCC or
SEP that students might not be prepared for until the end of that same year. Furthermore,
simply "doing the PEs" would be impractical and inefficient, as many PEs overlap with and connect to
each other. Instead, three-dimensional learning experiences that integrate multiple SEPs, CCCs,
and DCIs will be needed to help all students build the needed competencies toward the
targeted performance expectations.

Implications for Instructional Materials

Instructional materials built for past science standards were organized just like the standards:
inquiry or science process was frequently addressed in an opening chapter, a majority of the
text was devoted to imparting “science content” to students, and the crosscutting concepts
precursors were generally only implicitly included in materials with little to no emphasis in
student learning goals. Instructional materials designed for the NGSS, on the other hand, must
communicate the equal value of the three dimensions. This has implications for how student
materials are organized and how the dimensions are presented in teacher support materials.
This importance can and should be conveyed explicitly, but it is also conveyed by how the
dimensions are presented. If one dimension is relegated to only appearing in the margins,
appears with much less frequency, is not supported in teacher materials, or significant learning
time is not devoted to ensuring student learning related to that dimension, then the materials
fall short of what is expected by these standards.

Instructional materials designed for the NGSS will not only value all three dimensions of the
standards, but will also integrate the three dimensions in instruction and assessment. For
instruction, this means that student learning experiences must be anchored with three-
dimensional student performances. It may not be possible for every student learning
experience to be three-dimensional, but these 3D performances should be common and central
to student learning. As mentioned above, the three dimensions of the standards should be
integrated in ways that help students to make sense of the world around them and/or design
solutions to problems—driving toward, but not limited by how the dimensions are integrated in
the performance expectations. Instructional materials designed for this NGSS Innovation should
make it clear which elements of the three dimensions are targeted by a given lesson or unit.

Instructional materials designed for the NGSS will integrate the three dimensions when
monitoring student progress with embedded formative and summative assessments. As with
instruction, this doesn’t mean every assessment task or item must be three-dimensional all the
time, but it does mean more than just an occasional three-dimensional task here or there.
Measuring student learning should rely on items and tasks that assess the dimensions
together—in pre-assessments, formative assessments, and summative assessments. Three-
dimensional assessment tasks should be embedded throughout instructional experiences,
taking advantage of the rich opportunities that are part of instruction during which students
make their thinking visible to themselves, their peers, and educators.

Effective assessment of three-dimensional science learning requires more than a one-to-one
mapping between the NGSS performance expectations and assessment tasks. It is important to
note that more than one assessment task may be required to adequately assess students’
mastery of some three-dimensional targets, and any given assessment task may assess aspects
of more than one performance expectation. In addition, to assess both understanding of core
knowledge and facility with a practice, assessments may need to probe students’ use of a given
practice in more than one disciplinary context. To adequately cover the three dimensions,
assessment tasks will generally need to contain multiple components (e.g., a set of interrelated
questions). Developers might focus on individual SEPs, DCIs, or CCCs in some components of an
assessment task, but together, the components need to support inferences about students’
three-dimensional science learning as described in a given set of three-dimensional learning
targets.

For an introduction regarding assessments and the NGSS, see Seeing Students Learn Science:
Integrating Assessment and Instruction in the Classroom (2017), the STEM Teaching Tool
practice briefs on assessment, and Developing Assessments for the Next Generation Science
Standards.

For some more concrete examples of what Innovation 2: Three-Dimensional Learning looks like
in instructional materials programs, see Table 2. As was mentioned with Table 1, this is not an
exhaustive list, but is intended to call out key evidence that should be sought in evaluating
instructional materials. As a reminder, “less” does not mean “never” and “more” does not
mean “always.”

Table 2: NGSS Innovation 2—Three-Dimensional Learning

High-quality instructional materials programs designed for the NGSS include:

Less: Using science practices and crosscutting concepts only to serve the purpose of students
acquiring more DCI information.
More: Careful design to build student proficiency in all three dimensions of the standards.

Less: Teachers only posing questions that have one correct answer.
More: Teachers posing questions that elicit the range of student understanding; students
discussing open-ended questions that focus on the strength of evidence used to generate
claims.

Less: Administering additional assessments during instruction (e.g., vocabulary checks) that
lack a clear feedback process to monitor and/or move student experiences to meet targeted
learning goals.
More: Formative assessment processes embedded into instruction to capture changes in
student thinking over time and adjust instruction.

Less: Assessments that focus on one dimension at a time and are mostly concerned with
measuring students’ ability to remember information.
More: Assessments within the instructional materials reflect each of the three distinct
dimensions of science and their interconnectedness.

Less: Students learning the three dimensions in isolation from each other, i.e.:
• A separate lesson or unit on science process/methods followed by later lessons or units
focused on delivering science knowledge.
• Including crosscutting concepts only implicitly, or in sidebars with no attempt to build
student proficiency in utilizing them.
• Rote memorization of facts and terminology; providing discrete facts and concepts in
science disciplines, with limited application of practice or the interconnected nature of the
disciplines.
• Prioritizing science vocabulary and definitions that are introduced before (or instead of)
students develop a conceptual understanding.
More: Integrating the SEPs, CCCs, and DCIs in ways that instructionally make sense, as well as
inform teachers about student progress toward the performance expectations, including:
• Students actively engaged in scientific practices to develop an understanding of each of
the three dimensions.
• CCCs included explicitly, with students learning to use them as tools to make sense of
phenomena and make connections across disciplines.
• Facts and terminology learned as needed while developing explanations and designing
solutions supported by evidence-based arguments and reasoning.

Innovation 3: Building K–12 Progressions

Summary: Students’ three-dimensional learning experiences are designed and coordinated
over time to ensure students build understanding of all three dimensions of the standards,
nature of science concepts, and engineering as expected by the standards.

From the Framework:

[Instructional materials] based on the framework and resulting standards should integrate the
three dimensions—scientific and engineering practices, crosscutting concepts, and disciplinary
core ideas—and follow the progressions articulated in this report… In addition, curriculum
materials need to be developed as a multiyear sequence that helps students develop
increasingly sophisticated ideas across grades K–12.

There are two components to this innovation. The first is what was described in the quote from
the Framework above: coherently building all three dimensions from kindergarten through
twelfth grade. The second is how both engineering and the nature of science are embedded
across all grade levels.

Building the Three Dimensions

While the three dimensions have appeared in past standards, the NGSS are the first standards
to build all three dimensions over time. Past standards may have included limited progressions
for both science and engineering practices (SEPs) and disciplinary core ideas (DCIs), but the
NGSS progressions are more robust in several ways. The precursors to the crosscutting
concepts (CCCs), on the other hand, were generally incorporated into the front matter of
standards without any indication of how they might be treated over time. Not only are the
three dimensions intentionally integrated into the performance expectations, but these
progressions are supported with three appendices—Appendix E: Disciplinary Core Ideas,
Appendix F: Science and Engineering Practices, and Appendix G: Crosscutting Concepts—that
add clarity to how these dimensions build over time. The appendices break the grade-banded
expectations for each DCI, SEP, and CCC into smaller elements to help educators focus on what
is unique about each dimension at each grade.

The SEP progressions in the NGSS are different because more is expected for student
engagement in the practices over time. The SEPs specify what is often meant by “inquiry” and
address the range of cognitive, social, and physical practices that science and engineering
require in ways that were not included in past standards. This means there are more specific
expectations at each grade level. Furthermore, past science standards generally increased the
complexity of inquiry expectations only within what is now one of the SEPs in the NGSS:
planning and carrying out investigations. The features that were added over time sometimes
represent entire SEPs, which, in the NGSS, are built in developmentally
appropriate ways starting in kindergarten. For example, in state standards prior to the NGSS,
defending the results and conclusions of an investigation might not be mentioned in the
standards documents until the high school level. In the NGSS, students are expected to start
building the practice of engaging in argument from evidence in elementary school, and that
practice is scaffolded across the grades so that students have many opportunities to engage in
it before even reaching high school. In a similar fashion,
all eight practices are developed from kindergarten through high school. The added specificity
of the practices provides guidance for how each one builds over time.

The DCIs are more focused than the “science content” of past standards, so the progressions
here look different as well. To be included in the Framework (and the NGSS), an idea had to:
have broad importance across one or more science disciplines; be important for understanding
more complex ideas and solving problems; relate to the interests and life experiences of
students and the world they live in; and be teachable and learnable over multiple grades with
increasing sophistication. The DCIs are driven less by information that we think students
should know by a particular grade and more by the fundamental understanding
that will prepare them for their lives beyond high school. As a result, the DCIs have fewer
disconnected bits of information and are more focused on building these core ideas.

As was mentioned above, the predecessors to the CCCs were usually included in the front
matter of standards rather than in the standards themselves. Their addition to each of the
three-dimensional performance expectations of the NGSS means that this dimension of the
standards has an expected progression for the first time. The learning expectations of the CCCs
are scaffolded across the K-12 standards to help students connect knowledge from the various
disciplines into a coherent and scientifically-based view of the world.

Advancing the way that the DCIs and SEPs are built over time while establishing the first
progression for the CCCs is a significant innovation of the NGSS.

Implications for Instructional Materials

Instructional materials designed for the NGSS provide sustained learning opportunities from
kindergarten through high school for all students to engage in and develop a progressively
deeper understanding of each of the three dimensions. Students require coherent, explicit
learning progressions both within a grade level and across grade levels so they can continually
build on and revise their knowledge and expand their understanding of each of the three
dimensions. High-quality NGSS-designed instructional materials must clearly show how they
include coherent progressions of learning experiences that support students in reaching
proficiency on all parts (e.g., all elements of the SEPs, DCIs, and CCCs) of the NGSS by the end of
each grade level and across grades. Guidance should also be provided for teachers to adjust
instruction of all three dimensions to meet the needs of their students. In programs that extend
beyond a single year, these progressions should be coordinated over the full breadth of the
instructional materials program.

This means, for example, that the way materials expect students to use each science and
engineering practice at the beginning of the school year should be significantly different from
how they are expected to use each practice by the end of the year. Students should have
experiences across the year designed to develop specific, grade-appropriate elements of each
practice and opportunities to apply these previously developed elements in new situations.
There are a variety of ways this might happen—initially providing supports for a practice and
then strategically removing them over time; focusing on deliberately developing a small
number of elements of a practice in a coordinated fashion throughout the year; practicing
already-developed elements of a practice when a different practice is foregrounded—but it
should be apparent in student materials how the practice is being used differently and the plan
for how the variety of student experiences builds to the full practice should be clearly explained
in teacher materials.

In a similar way, the CCCs and DCIs should be coordinated over time so learning of all three
dimensions is coherent from a student’s perspective and guidance should be provided to
teachers that explains how the organization of student learning experiences builds each
dimension for students.

See NGSS Appendix E, Appendix F, and Appendix G for more information about the learning
progressions for each dimension and how they build over time. For some more concrete
examples of what Innovation 3: Building K-12 Progressions looks like in instructional materials
programs, see Table 3. As was mentioned with earlier innovations, this is not an exhaustive list,
but is intended to call out key evidence that should be looked for in evaluating instructional
materials. As a reminder, “less” does not mean “never” and “more” does not mean “always.”

Table 3: NGSS Innovation 3—Building K-12 Progressions: Building the Three Dimensions

High-quality instructional materials programs designed for the NGSS include:

Less: Building on students’ prior learning only for the DCIs.
More: Building on students’ prior learning in all three dimensions.

Less: Little to no support for teachers to reveal students’ prior learning.
More: Explicit support for teachers in identifying students’ prior learning and accommodating
different entry points, and a description of how the learning sequence will build on that prior
learning.

Less: Assuming that students are starting from scratch in their understanding.
More: Explicit connections between students’ foundational knowledge and practice from prior
grade levels.

Less: Students engaging in the SEPs only in service of learning the DCIs.
More: Students engaging in the SEPs in ways that not only integrate the other two dimensions,
but also explicitly build student understanding and proficiency in the SEPs over time.

Less: CCCs marginalized to callout boxes or comments in the margins, or included implicitly
and conflated with the other dimensions, and therefore not progressing over time.
More: Students learn the CCCs in ways that not only integrate the other two dimensions, but
also explicitly build student understanding and proficiency in the CCCs over time.

Less: Teacher support that focuses only on the large grain size of each dimension rather than
digging down to the element level (e.g., the SEP “Analyzing and Interpreting Data” rather than
the grades 3–5 element of the same practice, “Analyze data to refine a problem statement or
the design of a proposed object, tool, or process”).
More: Teacher support that clearly explains how the elements of the practices are coherently
mapped out over the course of the instructional materials program.

Embedding Engineering Design and the Nature of Science

The NGSS include engineering design and the nature of science as significant concepts,
embedding them throughout the performance expectations. In many ways they are addressed
within the progressions of the three dimensions just described, but
there are also specific aspects of each that are highlighted within the NGSS beyond what was
included in the three dimensions. Similar to the three dimensions of the standards, engineering
design and the nature of science have been included in past science standards, but the degree
to which and the way they are incorporated into the NGSS is a distinct part of this innovation of
the NGSS.

The NGSS represent a commitment to integrating engineering design into the structure of
science education by raising engineering design to the same level as scientific inquiry when
teaching science disciplines at all levels, from kindergarten to grade 12. To ensure that this
happens coherently across students’ K–12 learning experience, (1) all the SEPs have elements
that are explicitly focused on engineering; (2) there are specific engineering design DCIs
throughout the standards; and (3) the ideas from the Engineering, Technology, Science, and
Society disciplinary core idea in the Framework are integrated into the crosscutting concepts in
each grade band. (See Chapter 3 in the Framework for a detailed description of how the
practices are used for both science and engineering. Box 3-2 briefly contrasts the role of each
practice’s manifestation in science with its counterpart in engineering.) These engineering
concepts and practices are embedded throughout the NGSS in the performance expectations
(PEs) that are marked with an asterisk. There are also grade-banded engineering design-specific
standards in the NGSS to ensure that student learning about engineering design concepts is
coherent and builds over time. More details about how engineering was embedded in the NGSS
can be found in Appendix I: Engineering Design in the NGSS and Appendix J: Science,
Technology, Society, and the Environment.

A deeper awareness and understanding of the connections between science and engineering
helps all students to be prepared for their lives beyond high school. In particular, the increased
emphasis on engineering in the NGSS has the potential to be inclusive of students who have
traditionally been marginalized in the science classroom and do not see science as being
relevant to their lives or future. By solving problems through engineering in local contexts (e.g.,
gardening, improving air quality, or cleaning water pollution in the community), students gain
knowledge of science content, view science as relevant to their lives and future, and engage in
science in socially relevant ways.

Like engineering, some aspects of the nature of science are integrated directly into the three
dimensions of the standards—the integration of scientific and engineering practices,
disciplinary core ideas, and crosscutting concepts provide practical experiences for students
that set the stage for teaching and learning about the nature of science—but this part of the
Building K-12 Progressions innovation also goes beyond just the integration of the three
dimensions. In addition to learning experiences that model how science knowledge is acquired,
the NGSS incorporate eight major themes about the nature of science into the performance
expectations. Four of these themes extend the scientific and engineering practices and four
themes extend the crosscutting concepts. Though the nature of science was often addressed
somewhere within past standards documents, it has not been embedded in the standards over
time the way that it is in the NGSS. These eight themes and exactly how they are built into the
standards are explained in more detail in NGSS Appendix H: Understanding the Scientific
Enterprise: The Nature of Science in the Next Generation Science Standards.

Implications for Instructional Materials

Though engineering has stand-alone standards for each grade band, it is important for
instructional materials not to isolate or separate engineering from science learning. Engineering
was intentionally embedded in the standards to ensure that it was not separated out and
taught as a separate unit or chapter. All three dimensions of the standards include learning that
is relevant to engineering, and instructional materials should embed this learning throughout
the program, with clear support that helps teachers see how engineering is integrated.
Instructional materials designed for the NGSS should make sure that
engineering is not an enrichment activity or engagement tool, but is incorporated meaningfully
with science throughout student learning, and included as explicit and integrated learning
targets.

Instructional materials designed for the NGSS should ensure that the eight nature of science
themes identified in Appendix H are likewise explicitly embedded throughout student learning
experiences and teacher supports, building learning progressions across grade bands.

For more examples of what Embedding Engineering Design and the Nature of Science looks like
in instructional materials programs, see Table 4. As was mentioned with earlier innovations,
this is not an exhaustive list, but is intended to call out key evidence that should be looked for
in evaluating instructional materials. As a reminder, “less” does not mean “never” and “more”
does not mean “always.”

Table 4: NGSS Innovation 3—Building K–12 Progressions: Embedding Engineering Design and
the Nature of Science

High-quality instructional materials programs designed for the NGSS include:

Less: Presenting engineering design and the nature of science disconnected from other
science learning (e.g., design projects that do not require science knowledge to complete
successfully, or an introductory unit on the nature of science).
More: Engaging all students in learning experiences that connect engineering design and the
nature of science with the three dimensions of the NGSS, not separated from the science DCIs.

Less: Presenting engineering design and/or the nature of science in a hit-or-miss fashion, i.e.,
they are made apparent to students, but there is no coherent effort to coordinate or improve
student understanding or proficiency over time.
More: Both engineering design and the nature of science are thoughtfully woven into the
three-dimensional learning progressions so that students receive support to develop their
understanding and proficiency.

Less: Introducing students to ideas about engineering design or the nature of science, but not
expecting students to retain or apply this information.
More: Measuring student learning in relation to engineering design and the nature of science
across a system of assessments.

Less: Teacher support that only explains the importance of the nature of science and
engineering design without a plan for scaffolding student understanding and application.
More: Teacher support that explains how engineering design and the nature of science are
coherently mapped out over the course of the instructional materials program.

Innovation 4: Alignment with English Language Arts and Mathematics

Summary: Students engage in learning experiences with explicit connections to and alignment
with English language arts (ELA) and mathematics.

From the Framework:

…achieving coherence within the system is critical for ensuring an effective science education
for all students. An important aspect of coherence is continuity across different subjects within
a grade or grade band. By this we mean “sensible connections and coordination [among] the
topics that students study in each subject within a grade and as they advance through the
grades” [3, p. 298]. The underlying argument is that coherence across subject areas
contributes to increased student learning because it provides opportunities for reinforcement
and additional uses of practices in each area.

The NGSS not only build coherence in science teaching and learning but also provide
connections with mathematics and ELA that are made explicit on each standards page. This
degree of connection across content areas is a significant innovation of the NGSS and, as is
highlighted in Appendix L and Appendix M, the NGSS went to great lengths to ensure that the
English language arts and mathematics expectations of students were grade-appropriate (as
determined by the Common Core State Standards for English Language Arts in Science and
Technical Subjects and Mathematics).

Such convergence across content areas strengthens science learning for all students, especially
for students whose time for learning science may have been diminished by policies driven by an
accountability system dominated by reading and mathematics. Across the three subject areas,
students are expected to engage in argumentation from evidence; construct explanations;
obtain, synthesize, evaluate, and communicate information; and build a knowledge base
through content-rich texts. Additionally, students learn the crosscutting concept of Patterns
not only across science disciplines but also in other subject areas such as language arts,
mathematics, and social studies. Furthermore, the convergence of core ideas, practices, and
crosscutting concepts across subject areas offers multiple entry points to build and deepen
understanding for these students.

Implications for Instructional Materials

Instructional materials designed for the NGSS will highlight and support teachers in making
connections between science, mathematics, and English language arts. Grade-appropriate and
substantive overlapping of skills and knowledge helps provide all students equitable access to
the learning standards for science, mathematics, and English language arts (e.g., see NGSS
Appendix D Case Study 4: English Language Learners).

For examples of NGSS Innovation 4: Alignment with English language arts and Mathematics, see
Table 5. As was mentioned with earlier innovations, this is not an exhaustive list, but is
intended to call out key evidence that should be looked for in evaluating instructional materials.
As a reminder, “less” does not mean “never” and “more” does not mean “always.”

Table 5: NGSS Innovation 4: Alignment with ELA and Mathematics

High-quality instructional materials programs designed for the NGSS include:

Less: Science learning is isolated from related learning in mathematics and English language
arts.
More: Engaging all students in science learning experiences that explicitly and intentionally
connect to mathematics and English language arts learning in meaningful, real-world, grade-
appropriate, and substantive ways and that build broad and deep conceptual understanding in
all three subject areas.

Innovation 5: All Standards, All Students

Summary: Science instructional materials support equitable access to science education for all
students.

From the Framework:

Communities expect many things from their K-12 schools, among them the
development of students’ disciplinary knowledge, upward social mobility,
socialization into the local community and broader culture, and preparation for
informed citizenship. Because schools face many constraints and persistent
challenges in delivering this broad mandate for all students, one crucial role of
a framework and its subject matter standards is to help ensure and evaluate
educational equity.

The NGSS describe science expectations built on progressions of the disciplinary core ideas
(DCIs), the science and engineering practices (SEPs), and crosscutting concepts (CCCs) used
together in meaningful ways that establish high expectations while providing the structure
to support students from diverse backgrounds in meeting them. This manifests directly in other
innovations of the standards; however, the implications for supporting all students go deeper
than those opportunities previously mentioned. As such, this innovation emphasizes those
features of implementing the NGSS that directly support all students, and particularly those
from traditionally underserved groups, in establishing and maintaining both achievement and
agency in science. Whereas innovations 1-4 describe what is different in the NGSS, innovation 5
describes how the features of the NGSS can be used to support all learners with a focus on
implications for instructional materials.

The NGSS pose a vision for science education that goes beyond asking students to know
scientific information, or even to apply scientific information via practices. To truly meet the
vision of the NGSS, all students need to be given the opportunity to become scientists and
engineers—scientific explainers and problem solvers—within the walls of their classrooms. An
important part of helping all students reach achievement in science is ensuring that they both
identify as scientists and engineers, and develop scientific agency—that is, they engage with
science directly as doers and drivers of scientific endeavors, value the ideas they bring with
them, have ownership over science and their learning, and participate in serious, engaging
learning experiences that are meaningful to them culturally and socially.

For further information and examples of how to support a range of students, please see NGSS
Appendix D and the accompanying case studies.

Implications for Instructional Materials

Instructional materials designed for the NGSS provide opportunities for all learners, and
guidance to teachers for supporting diverse student groups, including students from
economically disadvantaged backgrounds, students with special needs (e.g., visually impaired
students, hearing impaired students), English learners, students from diverse racial and ethnic
backgrounds, students with alternative education needs, and talented and gifted students.
They do so using a variety of approaches, but also ensure the features of NGSS design are
intentionally leveraged to support diverse learners as they develop proficiency, agency, and
identity in science.

Specifically, instructional materials that are designed for the NGSS should:

1. Provide substantial opportunities for students to express and negotiate their ideas and
prior knowledge, and capitalize on funds of knowledge (see NGSS Appendix D) as they
are making sense of phenomena and designing solutions to problems.
2. Include diverse examples of scientists and engineers, including women and members
of other underserved populations, with whom a range of student groups can identify.
3. Offer meaningful opportunities for science learning experiences to value, respect, and
connect to students’ home, culture, and community.
4. Regularly provide opportunities for students to have ownership over their learning, as
they explore and come to more deeply understand the core scientific ideas described by
the standards.
5. Provide multiple access points, representations, and multimodal experiences for
students to engage with the science at hand.
6. Provide multiple ways in which to make student thinking visible.
7. Provide teachers with ample tools and supports to help a wide range of students learn
the designated content and skills, including through differentiation, engaging multiple
scientific competencies, supporting scientific identities, and cultivating scientific agency.

For more examples of NGSS Innovation 5: All Standards, All Students, see Table 6. As was
mentioned with earlier innovations, this is not an exhaustive list, but is intended to call out key
evidence that should be looked for in evaluating instructional materials. As a reminder, “less”
does not mean “never” and “more” does not mean “always.”

Table 6: NGSS Innovation 5: All Standards, All Students

High-quality instructional materials programs designed for the NGSS include the following:

Less: Materials including separate lessons or activities for students with different language or
abilities as the only support for these learners.

More: Instructional materials create learning experiences that students with diverse needs and
abilities can connect to and use to make progress toward common learning goals through a
variety of student approaches within the same learning sequence.

Less: Use of flashy phenomena as an interesting hook with the assumption that all students will
find that compelling.

More: Inclusion of phenomena and problems that are relevant and authentic to a range of
student backgrounds and interests, with supports for modifying the context to meet local needs
and opportunities for students to make meaningful connections to the context based on their
current understanding, personal experiences, and cultural background.

Less: Materials providing limited ways of meeting learning goals, such as reading about topics,
listening to lectures and note-taking, and following written or oral labs.

More: Materials engaging the SEPs, CCCs, and DCIs as access points and diverse ways for
students to learn (e.g., students using the practice of argumentation and evidence-based
discourse to develop scientific understanding; students developing and using modeling to make
sense of phenomena and problems as well as to make thinking visible in ways that are less
dependent on English language proficiency). Materials leverage the active components of the
dimensions to provide students with ways to drive their own learning experiences, and identify
and capitalize on opportunities for active learning.

Less: Materials focus only on helping students learn and remember “the right answer.”

More: Materials help students learn the requisite information while also growing students’
ability to see themselves as scientists and engineers by providing students multiple
opportunities to make their thinking visible, revisit ideas, and engage in scientific discourse
with peers.

Less: Teacher materials that focus on delivering information to students without providing
support to help teachers value and build on the experiences and knowledge that students bring
to the classroom.

More: Teacher materials that include suggestions for how to connect instruction to the
students’ home, neighborhood, community, and/or culture as appropriate, and provide
opportunities for students to connect their explanation of a phenomenon and/or their design
solution to a problem to questions from their own experience and meaningful components of
their own contexts. Teacher materials provide suggestions for how to support students through
multiple approaches to problems and phenomena.

Less: Teacher materials that only offer minimal or non-context-specific support for
differentiation.

More: Teacher materials that include:

• Appropriate reading, writing, listening, and/or speaking alternatives (e.g., translations,
picture support, graphic organizers) for students who are English learners, have special
needs, or read well below grade level.
• Extra support (e.g., phenomena, representations, tasks) for students who are struggling
to meet the targeted expectations.
• Extensions for students with high interest or who have already met the performance
expectations to develop deeper understanding of the practices, disciplinary core ideas,
and crosscutting concepts.
• Support for how to engage students in ownership of their learning.


Using PEEC to Evaluate Instructional Materials
Programs
The NGSS Innovations just described form the foundation of the PEEC instructional materials
evaluation process. The criteria in PEEC explicitly focus on these innovations and how
thoroughly they are represented in instructional materials programs.

The PEEC process involves three phases for each instructional materials program under
consideration.

1. PEEC Prescreen - A quick look at instructional materials programs to narrow the scope
of materials to be reviewed
2. Unit Evaluation - A close look to verify the thoroughness with which the materials are
designed for the NGSS
3. Program-Level Evaluation - A broad look to evaluate the degree to which the NGSS
Innovations permeate the entire program

PEEC was designed to determine the degree to which instructional materials programs are
designed with the innovations of the NGSS. As such, it is useful both for curriculum developers
and instructional materials authors and for schools, states, and districts seeking to purchase or
obtain instructional materials.

Some ideas about how PEEC can be used by various audiences are described on the following
pages:



States and PEEC
PEEC can be used by states to:

• Develop criteria for reviewing and selecting state-adopted or recommended entire
school science instructional materials programs—school science textbooks, textbook
series, kit-based and other instructional materials and support materials for teachers—
that are designed for both year-long and K–12 education and that represent
comprehensive programs;
• Describe a process for reviewing and selecting state-adopted or recommended entire
school science instructional materials programs—school science textbooks, textbook
series, kit-based and other instructional materials and support materials for teachers—
that are designed for both year-long and K–12 education and that represent
comprehensive programs; or
• Provide guidance to districts to make strong instructional materials selections.

School Districts and PEEC


PEEC can be used by district and school educators to:

• Describe the process for reviewing and selecting entire school science programs—school
science textbooks, textbook series, kit-based and other instructional materials and
support materials for teachers—that are designed for the NGSS; or
• Evaluate current science instructional materials to identify adaptations and
modifications to support NGSS implementation.

Developers, Writers, and PEEC


PEEC can be used by instructional materials developers, authors, writers, and designers to:

• Enhance initial design and planning of an entire school science program—school
science textbooks, textbook series, kit-based and other instructional materials and
support materials for teachers—so that subsequent development, writing, and field
testing best incorporate the NGSS.
• Analyze a program currently in development or in the market to understand if and how
the innovations within the NGSS manifest themselves, to make better decisions about
revisions or updates.
• Collect, document, and share evidence and claims so other educators can understand
how a given set of instructional materials is designed for the NGSS.
• Enhance the capacity of development and sales or marketing teams, so that the people
who work with schools, districts, and states on behalf of a vendor understand the NGSS
and the innovations the NGSS calls for.



PEEC Phase 1: Prescreen

1. PEEC Prescreen - A quick look at instructional materials programs to narrow the scope
of materials to be reviewed
2. Unit Evaluation - A close look to verify the thoroughness with which the materials are
designed for the NGSS
3. Program-Level Evaluation - A broad look to evaluate the degree to which the NGSS
Innovations permeate the entire program

Summary: PEEC Phase 1: The PEEC Prescreen is a quick look at NGSS design for instructional
materials programs.

Process:
1. Prepare for the review by identifying the people involved, the components of the
instructional materials in question to review, and the evidence to be sought.
2. Apply the PEEC Prescreen. Use Tool 1A: PEEC Prescreen Response Form
(Phenomena), Tool 1B: PEEC Prescreen Response Form (Three Dimensions), and
Tool 1C: PEEC Prescreen Response Form (Three Dimensions for Instruction and
Assessment).
3. Analyze the results. Use Tool 2: PEEC Prescreen: Recommendation for Review?

The purpose of the prescreen is to do a relatively quick survey of an instructional materials


program to see if it warrants further review. Phase 1 offers users a tool and a process to
determine if a given set of instructional materials has the potential to be designed for the NGSS.
If the evidence for these three criteria is not clear and compelling, the materials are likely not
worth the time and capacity necessary to fully evaluate the degree to which the programs are
designed for the NGSS.



Applying the prescreen is not a thorough vetting of a resource and is not sufficient to support
claims of being designed for the NGSS. However, if these innovations are not clearly visible, it is
difficult to imagine that the resource is designed for the NGSS in a way that will support
advancing science instruction in the classroom.

The prescreen focuses on three criteria related to the first two NGSS Innovations: Innovation 1:
Making Sense of Phenomena and Designing Solutions to Problems and Innovation 2: Three-
Dimensional Learning as shown in Table 7.

Table 7: PEEC Prescreen Summary Table

The instructional materials program is designed to engage all students in making sense of
phenomena and/or designing solutions to problems through student performances that
integrate the three dimensions of the NGSS.

Innovation 1: Making Sense of Phenomena and Designing Solutions to Problems

The instructional materials program focuses on supporting students to make sense of a
phenomenon or design solutions to a problem.

Innovation 2: Three Dimensions

The instructional materials program is designed so that students develop and use multiple
grade-appropriate elements of the science and engineering practices (SEPs), disciplinary core
ideas (DCIs), and crosscutting concepts (CCCs), which are deliberately selected to aid student
sense-making of phenomena or designing of solutions.

Integrating the Three Dimensions for Instruction and Assessment

The instructional materials program requires student performances that integrate elements of
the SEPs, CCCs, and DCIs to make sense of phenomena or design solutions to problems, and
elicits student artifacts that show direct, observable evidence of three-dimensional learning.



Preparing to use PEEC
Before beginning a PEEC review process, several questions need to be answered.

Preparation Question 1: Who will be conducting the review?

In the beginning of the review process, a decision needs to be made about who will be applying
the prescreen and conducting subsequent parts of the PEEC process. Will it be the whole group
that is reviewing materials, or will it be a small leadership group? Applying the prescreen with
the full group doing the review can be a way to build a common understanding of the first two
innovations before digging in deeper with the Unit Evaluation. However, depending on the
number of instructional materials programs being reviewed and the resources available to
support the review, it may make sense for only a leadership group to apply the Prescreen to the
full scope of materials being considered. Then, once a smaller set of programs has been
identified, a larger group of educators can be involved in the remaining two phases of PEEC.

Be sure to refer to state, district, and local laws, rules, and guidance documents to ensure that
all requirements are met. Suggestions for potential membership on the instructional materials
committee include state-, district-, and school-level science instruction, assessment, and equity
supervisors; district administrators; school principals; elementary, middle, and high school
science teachers; higher education and STEM partners; parents; students; and community
members.

All committee members need a thorough understanding of the National Research Council’s A
Framework for K–12 Science Education, the Next Generation Science Standards (NGSS), and the
NGSS Innovations. They need to be comfortable applying the EQuIP Rubric for Science 3.0. If
participants have not received formal professional learning to support using the EQuIP Rubric
for Science, that will need to be included in the process.

While it is possible for the prescreen and subsequent phases of the PEEC review to be applied
by an individual, the quality review process works best as a collaborative effort by a team of
reviewers. As more people get involved, the additional perspectives can deepen the review
process and improve the quality of the evidence and shared understanding. However, adding
more review team members will increase the complexity and cost of a review effort. Working as
a group will not only result in a better-informed decision, but the conversations can also bring
the group to a common, deeper understanding of what instructional materials designed for the
NGSS look like.



Regardless of the number of people involved, the same process works to collect input from
individuals to make a collective decision. Just as when using the full EQuIP Rubric for Science,
users should follow the sequence of steps below for each instructional materials program under
consideration:

1. Individually record criterion-based evidence.
2. Individually use this evidence to make a recommendation about whether to continue
review.
3. With team members, discuss evidence, recommendations, and reasoning.
4. Reach a consensus decision about conducting deeper analysis for this instructional
materials program in subsequent PEEC phases.

Preparation Question 2: Which components of the instructional materials program will you
review?

The NGSS Innovations evaluated by the prescreen should be explicit and obvious, and they
should be present in the materials that are in the hands of all students and teachers—not just
in optional or ancillary materials. The components of the instructional materials program
chosen to review need to be selected in advance and consistent across programs. It is
important to review only what will be available to all teachers and to all students. Though this
is intended to be a quick read-through of materials, it is important—for all the materials
reviewed and for each of the criteria—to evaluate both the overall organization of the materials
and their content.

For each of the instructional material programs under consideration, teams should identify
which components will be included and which ones will not be included in the PEEC review
process.

Preparation Question 3: What evidence should be sought?

Before applying the prescreen, it’s important that the review group has a common
understanding of what qualifies as evidence for the criteria. To establish this understanding,
start by reading the “less like, more like” tables in Tool 1A: PEEC Prescreen Response Form
(Phenomena), Tool 1B: PEEC Prescreen Response Form (Three Dimensions), and Tool 1C: PEEC
Prescreen Response Form (Three Dimensions for Instruction and Assessment). These are
shortened versions of the tables embedded in the NGSS Innovations discussion. If necessary,
review the descriptions of NGSS Innovations 1 and 2, and answer the following questions for
each criterion in the prescreen:

1. What would it look like for a student or teacher resource to be organized in a way that
demonstrates this innovation?
2. How would the content of a student or teacher resource look different if it were
demonstrating this innovation?



Applying the PEEC Prescreen
Once the reviewers have a common understanding of the evidence they are looking for, it is
time to examine the instructional materials programs under consideration. For each
instructional materials program that is to be reviewed, page through the selected program
materials and examine the chapter/unit/overall organization as well as the individual lessons
and units. For both the organization of the materials and the content, look for evidence that
would indicate that the instructional materials program is designed for each criterion as well as
for evidence that the program is not designed for each criterion.

There are three forms to use, one for each criterion, to collect and articulate this evidence: Tool
1A: PEEC Prescreen Response Form (Phenomena), Tool 1B: PEEC Prescreen Response Form
(Three Dimensions), and Tool 1C: PEEC Prescreen Response Form (Three Dimensions for
Instruction and Assessment). See Table 8 below as an example. The recorded evidence should
answer the question in the table, “What was in the materials, where was it, and why is this
evidence?”, as it relates to each criterion.

During this stage of the work, it is important to remember that this is a prescreen and not the
full evaluation. It is not necessary to find every piece of evidence in the program; instead, make
a relatively quick pass through the materials. In materials that at least show promise for being
designed for the NGSS, it should not be difficult to see evidence of at least an attempt to
address these innovations. The degree to which these innovations are truly designed into the
materials will be evaluated in more detail later in this process.

Table 8: Example Tool 1A: PEEC Prescreen Response Form (Phenomena)

Less Like This: Evidence this criterion is not designed into this instructional materials program.
(What was in the materials, where was it, and why is this evidence?)

• Page iii: table of contents is organized by “typical” science topics; the unit and chapter
titles give no indication that students are making sense of phenomena or designing
solutions to problems.
• Page 115 (Unit 4 teacher text) — the teacher support for using the phenomena of this
unit only talks about using the phenomena as hooks or engagement; it positions the
teacher to explain the phenomena rather than the students.

More Like This: Evidence this criterion is designed into this instructional materials program.
(What was in the materials, where was it, and why is this evidence?)

• Pages 15–47 (Unit 1 student text) — though the title of this unit is “cells,” it engages
students with making sense of a series of phenomena; student explanations of several
smaller phenomena support students to explain a larger phenomenon.
• Pages 124–177 (Unit 5 student text) — this unit explicitly incorporates the engineering
design process; it is not just for enrichment or a culminating activity; it is not just a
directions-following activity.
• Pages 144–147 (Unit 5 teacher text) — there is ample support here for teachers to
organize instruction to support student discourse, and suitable information for teachers
in our district who may not have experience with teaching engineering.

Shows promise?

Analyzing Results from a Prescreen


Once the evidence has been recorded on Tool 1A: PEEC Prescreen Response Form
(Phenomena), Tool 1B: PEEC Prescreen Response Form (Three Dimensions), and Tool 1C: PEEC
Prescreen Response Form (Three Dimensions for Instruction and Assessment), it is time to
decide if the evidence indicates that the instructional materials program shows promise.
There are two levels at which this question needs to be answered:

Is there enough evidence to check the “shows promise?” box for each criterion?

Tool 1A: PEEC Prescreen Response Form (Phenomena), Tool 1B: PEEC Prescreen Response Form
(Three Dimensions), and Tool 1C: PEEC Prescreen Response Form (Three Dimensions for
Instruction and Assessment) all include a “shows promise?” checkbox that should be considered
once the evidence has been recorded on the tool.



To answer this question, weigh the “More Like This” evidence against the “Less Like This”
evidence. This first phase of PEEC is meant to be a quick pass that sorts out instructional
materials programs that are not designed for the NGSS—if a program is close, it warrants
further review. Checking the box here does not mean that the criterion is thoroughly and
appropriately designed into the instructional materials program, but it does mean the program
shows promise and it is worth the time to dig deeper. Leaders should trust in the expertise of
the educators doing the review—their knowledge of the innovations of the NGSS and their
awareness of the needs of students in their classrooms are key to making this decision.

Is there enough evidence across the three criteria to warrant further review?

All three criteria should have their “Shows promise” box checked to indicate that there is
sufficient initial evidence that the instructional materials program is designed to address these
first two key innovations of the NGSS. If instructional materials programs that do not meet this
expectation are carried over to the next step in this process, it should be done with the
awareness that this will require more time, effort, and energy in the review process.

Wrapping Up a Prescreen
After applying the PEEC Prescreen across the instructional materials programs that are being
considered, those that don’t meet the fundamental criteria of the prescreen should be set
aside. They can always be analyzed later if none of the initial materials measures up, but the
remaining analyses are more time- and resource-intensive, so focus on the programs that have
the clearest prescreen evidence of NGSS design.

Each member of the review group should complete Tool 2: PEEC Prescreen: Recommendation
for Review? to document their final analysis.



PEEC Phase 2: Unit Evaluation

1. PEEC Prescreen - A quick look at instructional materials programs to narrow the scope
of materials to be reviewed
2. Unit Evaluation - A close look to verify the thoroughness with which the materials are
designed for the NGSS
3. Program-Level Evaluation - A broad look to evaluate the degree to which the NGSS
Innovations permeate the entire program

Summary: PEEC Phase 2: Unit Evaluation uses the EQuIP Rubric for Science to dig deep into a
given unit of an instructional materials program.

Process:
4. Select a single unit from the instructional materials program in question to analyze.
Use Tool 3: Unit Selection Table.
5. Apply the EQuIP Rubric for Science to the unit you have selected.
6. Connect the EQuIP Rubric for Science to the NGSS Innovations using Tool 4: EQuIP
Rubric Data Summary.

Once instructional materials programs have been established by the PEEC Phase 1: Prescreen to
at least have the appearance of being designed for the NGSS, the next step is to look at a full
unit to evaluate evidence for the rest of the NGSS Innovations. Fortunately, a tool already exists
for this type of evaluation: the Educators Evaluating the Quality of Instructional Products (EQuIP)
Rubric for Science provides criteria by which to measure the alignment and overall quality of
lessons and units with respect to the NGSS. The EQuIP Rubric for Science guides reviewers to
look for evidence of three categories of NGSS Design, as shown in Table 9.



Table 9: Categories of Evidence in the EQuIP Rubric for Science

Category 1 (NGSS Three-Dimensional Design): The unit is designed so students make sense of
phenomena and/or design solutions to problems by engaging in student performances that
integrate the three dimensions of the NGSS.

Category 2 (NGSS Instructional Supports): The unit supports three-dimensional teaching and
learning for ALL students by placing lessons in a sequence of learning for all three dimensions
and providing support for teachers to engage all students.

Category 3 (Monitoring NGSS Student Progress): The unit supports monitoring student progress
in all three dimensions of the NGSS as students make sense of phenomena and/or design
solutions to problems.

Selecting a Unit
There are a variety of factors to consider in selecting a single unit to represent an instructional
materials program in the unit evaluation process: the length of the unit, the similarity of units
across programs, evaluator expertise, and the resources available for review. These factors are
described in this section. Groups should use Tool 3: Unit Selection Table to make the unit
selection.

Different instructional materials programs may define a “unit” in different ways, so it will be
important to look across the programs that have cleared the prescreen and select a portion of
each program that represents a comparable length of instruction. Generally, a unit is a
collection of lessons in an intentional sequence tied to a learning goal. Units usually take longer
than two weeks of classroom time to complete, whereas lessons take a few days.

To be able to effectively apply the EQuIP Rubric for Science, a selected unit should be of
sufficient length for students to:

• Explain at least one phenomenon and/or design a solution to at least one problem;
• Engage in at least one three-dimensional student performance; and
• Have their learning measured across the three dimensions of the standards.



The unit evaluation should also include the teacher support materials that correspond with the
unit of instruction. The only caveat to this would be if these materials will not be available to
the teachers who will be implementing the program. In this case, only student materials should
be evaluated. The unit for evaluation may correspond with a chapter or unit in a book, or the
materials accompanying an online module, but reviewers should strive to select a comparable
section for review across programs.

As instructional materials programs are being designed for the NGSS and focusing more on
students using the three dimensions to make sense of phenomena and design solutions to
problems, it is quite possible that the units may not be as easily comparable in topic and
organization as they once were. For example, most current high school biology texts have a
single unit focused on photosynthesis. However, as instructional materials programs
designed with the NGSS Innovations in mind are developed, the DCI information related to
photosynthesis may be spread out through both Chemistry and Biology courses, and the
concepts might be developed through several different instructional units. Since developers will
likely not all make curriculum design decisions in the same way, finding the right unit to
compare may become increasingly difficult. A plan should be made to ensure that a comparable
unit is selected across programs.

In considering which unit to review in each program, it is also important to consider the
expertise of the review team. Review team members’ understanding of the three dimensions of
the standards addressed in the unit being reviewed will affect the quality of their reviews. As an
obvious example, a physics teacher may not have the deep understanding of cellular
respiration needed to evaluate a photosynthesis unit. Similarly, a review team without a deep
understanding of the grade-level expectations of a CCC might not catch it when the CCC that is
addressed in a unit is at a much lower grade level. But expertise can cut both ways: reviewers
with deep knowledge of the DCIs in the unit may be able to better recognize deficiencies in how
the DCIs are addressed, but they also might read between the lines to see connections that are
not explicit in the program—they might see connections that teachers implementing the
materials may not. It is important to know the review team’s expertise, to deliberately take this
into account in the selection of the unit for review, and to support the team to be aware of their
own strengths and weaknesses as they are reviewing materials.

As always, these factors will need to be balanced with the resources—people, time, and
money—that are available. A longer selection will give a better look at what the program offers,
but it will also take more resources to evaluate. Having multiple groups look at each resource
and compare their evaluations will provide a more balanced evaluation, and the ensuing
conversations, if properly facilitated, can help prepare teachers to implement the materials
once they are selected. However, this requires a greater time commitment from those
participating in the review.

For each program being reviewed, identify which unit will be reviewed and explain why that
unit was selected in the Tool 3: Unit Selection Table.



Applying the EQuIP Rubric for Science
Once the unit that will be evaluated within each instructional materials program has been
identified, it is time to use the EQuIP Rubric for Science to evaluate each unit. Full support for
using the EQuIP Rubric for Science is not included within the PEEC document, but the process
for using it is described within the rubric itself and in the EQuIP Professional Learning
Facilitator’s Guide and associated resources found on the EQuIP Rubric for Science webpage.
Reviewers should not be expected to reliably apply this rubric to units without professional
learning support. It is not necessary to use the scoring guide portion of the rubric, because of
how the information from EQuIP is incorporated into PEEC, but it is important to gather specific
evidence of each criterion within the unit.

Connecting the EQuIP Rubric for Science to the NGSS Innovations

Once the EQuIP Rubric for Science has been completed for the unit, transfer the information
captured in the “Evidence of Quality?” checkboxes to Tool 4: EQuIP Rubric Data Summary and
then, based on the pattern of checks and the evidence recorded in the rubric, decide the
degree to which the unit appears to have integrated the NGSS Innovations.



PEEC Phase 3: Program-Level Evaluation

1. PEEC Prescreen - A quick look at instructional materials programs to narrow the scope
of materials to be reviewed
2. Unit Evaluation - A close look to verify the thoroughness with which the materials are
designed for the NGSS
3. Program-Level Evaluation - A broad look to evaluate the degree to which the NGSS
Innovations permeate the entire program

Summary: In PEEC Phase 3: Program-Level Evaluation, the NGSS Innovations are evaluated
across an entire program.

Process:
7. Determine a sampling plan for the instructional materials program in question.
8. Review evidence and associated claims from the sample.
9. Sum up the claims and make a final recommendation.

The EQuIP Rubric for Science provides a close look at a single unit, but in programs designed for
the NGSS, the NGSS Innovations need to build across the program. For each of the Innovations,
this means looking for evidence beyond just the unit that was evaluated in PEEC Phase 2.

For example, the unit may have provided multiple and varied opportunities for students to ask
scientific questions based on their experiences—clearly engaging students in the SEP “Asking
Questions and Defining Problems”—but the scope of the unit may have been limited to
developing a particular element of the SEP (e.g., only asking scientific questions without
opportunities to define criteria and constraints associated with the solution to a problem) or to
developing student facility with a particular element to a certain degree (e.g., appropriately
removing scaffolds for development within the unit but not for the full expression of the SEP;
only beginning to connect this SEP to other relevant SEPs). It is also important that elements
of that practice are effectively incorporated throughout the instructional year. As is
described in Innovation 3: Building Progressions, an instructional materials program designed
for the NGSS will not only engage students in the practices, but will also build their
understanding and use of each practice over time. If the unit evaluated in PEEC Phase 2 is either
the only time that students engage in this practice, or if students engage in the practice the
same way every time, then this innovation is not embedded in the program. PEEC Phase 3:
Program-Level Evaluation will support reviewers in examining the instructional materials
program to determine whether the unit was representative of how well the NGSS Innovations
are embedded throughout the instructional materials program.

To do this across the entire instructional materials program, PEEC uses a different lens. In
this phase, the student and teacher materials are examined for evidence supporting claims that
would be expected to hold in materials designed for the NGSS.
This will build on the evidence base of the PEEC Prescreen and Unit Evaluation to move
reviewers to a final decision about which program to select.

Creating A Sampling Plan


Reviewing every lesson, unit, and component of an instructional materials program is not
feasible in most circumstances; the time and effort for such a task would outweigh the benefit
for most users. Instead, PEEC users should develop a sampling plan that articulates which
portion of the instructional materials program is subject to review. This is particularly important
when comparing instructional materials programs.

A sampling plan is a document that articulates which portions or sections of a set of
instructional materials programs will be reviewed during PEEC Phase 3: Program-Level
Evaluation. Sampling plans generally focus on learning sequences, each of which features four
or five classroom lessons. A sampling plan should:

• Focus on learning sequences that span at least 4–5 lessons
• Choose at least three learning sequences
• Ensure the learning sequences come from the beginning, middle, and end of the instructional materials program

An example sampling plan might thus look like the following:

As we use PEEC to review Amazing Science ©2017, we will

1. Sample three learning sequences consisting of four to five lessons per sequence.
Based on our unit analysis in Phase 2, this sample should allow us to look for the
development and use of the three dimensions together over time in service of
students progressively making sense of phenomena.
2. Intentionally select one learning sequence from the beginning third of the program,
one in the middle third, and one in the final third to ensure that instructional sequences
logically build student proficiency from the beginning to the end of the year (one of these
samples could be the unit evaluated in Phase 2).
3. Select sequences that allow for some connectivity across the year, such as a
particular SEP or CCC being foregrounded in all three sequences or sequences that
build on related DCIs.
4. Select sequences that cover a range of the three-dimensions so that we can evaluate
some measure of coverage.

Reviewing Claims and Evidence from The Sample


Once the sampling plan has been established, read through the claims in Tool 5A:
Program-Level Evaluation Innovation 1: Making Sense of Phenomena and Designing Solutions to
Problems and then read through the sample identified in the preceding step to determine
whether there is evidence in the materials that would support each claim. Record the evidence
you find on the tool.

Once the evidence has been recorded, evaluate the degree to which there is evidence for each
criterion. Use the following as guidance for evaluating the categories/samples:

• No Evidence: There is no evidence to support the claim in the sampled materials.
• Inadequate Evidence: There are a few instances of evidence to support the claim, but
they are intermittent or do not constitute adequate time or opportunity for students to
learn the content or develop the ability.
• Adequate Evidence: Evidence for this claim is common and there is adequate time and
opportunity, and support for all students to learn the content and develop the abilities.
• Extensive Evidence: Evidence for this claim is pervasive throughout the program and
there is adequate time, opportunity, and support for all students to learn the content
and develop the abilities.

These ratings of the quality of evidence supporting each claim should be done first individually
and then discussed as a group to reach consensus.

Finally, based on the evidence collected and the pattern of checks, complete the bottom
portion of the tool, which asks reviewers to decide the degree to which the innovation shows
up across the program. For materials that only partially incorporate the innovation, provide
suggestions for what will be needed: professional learning; additional lessons, units, or
modules; a district-wide approach to using the crosscutting concepts (if they are not well
represented in the materials); etc.

Repeat this process for the remaining four NGSS Innovations by completing Tool 5B: Program-
Level Evaluation Innovation 2: Three-Dimensional Learning, Tool 5C: Program-Level Evaluation
Innovation 3: Building Progressions, Tool 5D: Program-Level Evaluation Innovation 4: Alignment
with English Language Arts and Mathematics, and Tool 5E: Program-Level Evaluation
Innovation 5: All Standards, All Students.

Summing Up
To finish the PEEC process, complete Tool 6: PEEC Evidence Summary, adding information from
each phase of the PEEC process for the instructional materials program in question. Finally,
complete Tool 7: Final Evaluation to articulate your final recommendation.

Beyond PEEC
It is important to reiterate that there are additional criteria for evaluating the quality of
instructional materials that are not discussed in this document. These criteria are omitted
not because they are unimportant, but because they are not unique to materials designed for
the NGSS. Examples of such criteria can be found below. The additional criteria required by
each district or state can be applied during or after Phase 3 of the PEEC evaluation process.

These additional criteria should be present in all high-quality science instructional materials, but
are not specific to NGSS.

Does the instructional materials program in question:

Student Instructional Materials


• Adhere to safety rules and regulations?
• Provide high-quality (e.g., durable, dependable, functioning as intended) materials,
equipment in kits, technological components, or online resources, where applicable?

Teacher Instructional Materials and Support


• Include precise and usable technology specifications?
• Describe strategies including alternative approaches and delivery that will assist in
differentiating instruction to meet the needs of all students (e.g., English learners,
special needs students, advanced learners, struggling students)?
• Include a detailed list of needed materials, both consumable (e.g., cotton balls, pinto
beans) and permanent (e.g., laboratory equipment), that are to be used throughout the
program?
• Provide sufficient description about how to use materials and laboratory equipment,
including safety practices and possible room arrangements?

Equitable Opportunity to Learn in Instructional Materials


• Provide the appropriate reading, writing, listening, and/or speaking modifications (e.g.,
translations, front-loaded vocabulary word lists, picture support, graphic organizers) for
students who are English learners, have special needs, or read below the grade level?
• Provide extra support for students who are struggling to meet performance
expectations?

Assessment in Instructional Materials
• Include assessments with explicitly stated purposes that are consistent with the
decisions they are designed to inform?
• Include assessments with clear systems to help educators use the resulting data for
feedback and monitoring purposes?
• Include assessments that are embedded throughout the instructional materials as tools
for monitoring students’ learning and teachers’ instruction?
• Include assessments that use varied methods, languages, representations, and examples
to provide teachers with a range of data to inform instruction?
• Include assessments that are unbiased and accessible to all students?

Glossary
The following terms are used throughout PEEC. For additional help with language and terms
used here, please see the List of Common Acronyms used by Next Generation Science
Standards.

Bundles/Bundling. Grouping elements or concepts from multiple performance expectations into
lessons, units, and/or assessments that students can develop and use together to build toward
proficiency on a set of performance expectations in a coherent manner. The article available
here provides more description and some video examples of bundles and bundling.

Crosscutting Concepts (CCC). These are concepts that hold true across the natural and
engineered world. Students can use them to make connections across seemingly disparate
disciplines or situations, connect new learning to prior experiences, and more deeply engage
with material across the other dimensions. The NGSS require that students explicitly use their
understanding of the CCCs to make sense of phenomena or solve problems.

Disciplinary Core Ideas (DCI). The fundamental ideas that are necessary for understanding a
given science discipline. The core ideas all have broad importance within or across science or
engineering disciplines, provide a key tool for understanding or investigating complex ideas and
solving problems, relate to societal or personal concerns, and can be taught over multiple grade
levels at progressive levels of depth and complexity.

EQuIP Rubric for Science. Educators Evaluating Quality in Instructional Products (EQuIP) for
science is a tool and accompanying process for evaluating how well an individual lesson or
single unit (series of related lessons) is designed to support students developing the knowledge
and practice described by the Framework and the NGSS.

The Framework. A shortened title for the 2012 foundational report, A Framework for K-12
Science Education: Practices, Crosscutting Concepts, and Core Ideas, published by the National
Research Council (NRC), which describes the scientific consensus on the science knowledge and skills
students should acquire during their K-12 experience. A team of states, coordinated by Achieve,
took the Framework and used it to develop the Next Generation Science Standards. The
Framework is available online in a variety of formats from the National Academies Press.

Instructional Materials. Tools used by teachers to plan and deliver lessons for students.
Generally, instructional materials include activities for daily instruction (“lessons”) that
are organized into sequences (“units” or “chapters”).

Instructional Materials Program. A set of instructional materials that spans a large chunk of
time or instruction, generally a full course (e.g., a biology textbook) or a middle-grades
science sequence. Distinguished from instructional materials that are not nearly as
comprehensive, such as those that focus on only a few days or weeks of instruction or on a
given content area.

Learning Sequence. Several connected and sequential lessons that build student understanding
toward a set of learning goals progressively, over the course of weeks (as opposed to days).
Learning sequences target complete three-dimensional learning goals through a variety of
classroom experiences.

Lesson. A set of instructional activities and assessments that may extend over several class
periods or days; it is more than a single activity.

NGSS Innovations. This document identifies five NGSS Innovations that describe and explain
what is new and different about the NGSS, particularly regarding instructional materials design
and selection. The NGSS Innovations build on the conceptual shifts described in Appendix A of
the NGSS.

PEEC. Primary Evaluation of Essential Criteria (PEEC) takes the compelling vision for science
education as described in A Framework for K–12 Science Education and embodied in the Next
Generation Science Standards (NGSS) and operationalizes it for two purposes:

1. to help educators determine how well instructional materials under consideration have
been designed for the Framework and NGSS, and
2. to help curriculum developers construct and write science instructional materials that
are designed for the Framework and NGSS.

Performance Expectations (PEs). The NGSS are organized into a set of expectations for what
students should be able to do by the end of a period of instruction, generally measured by
years of schooling. The performance expectations describe the learning goals or outcomes for
students. Each performance expectation describes what students who demonstrate
understanding can do, often with a clarification statement that provides examples or
additional emphasis for an individual performance expectation. An assessment boundary guides
the developers of large-scale assessments. Each performance expectation is derived from a set
of disciplinary core ideas, crosscutting concepts, and science and engineering practices that are
defined in the Framework. Note that like all sets of standards, the NGSS do not prescribe the
methods or curriculum needed to reach these outcomes.

Phenomena. Observable events that students can explain or make sense of using the three
dimensions. Lessons designed for the NGSS focus on explaining phenomena or designing solutions
to problems. Some additional resources about phenomena are available on the NGSS website.

Science and Engineering Practices (SEP). The practices are what students do to make sense of
phenomena. They are both a set of skills and a body of knowledge to be internalized. The SEPs
reflect the major practices that scientists and engineers use to investigate the world and design
and build systems.

Three-Dimensional Learning. Learning that integrates all three dimensions of the NGSS,
allowing students to actively engage with the practices and apply the crosscutting concepts to
deepen their understanding of core ideas across science disciplines. Click here to read more.

Three Dimensions. As described in the Framework, these are the three strands of knowledge
and skills that students should explicitly be able to use to explain phenomena and design
solutions to problems. The three dimensions are the Disciplinary Core Ideas (DCIs), Crosscutting
Concepts (CCCs), and Science and Engineering Practices (“the Practices” or SEPs). More
information about the three dimensions is available here.

Unit. A set of lessons that extends over a longer time period than a single lesson.

Frequently Asked Questions
The following questions may help clarify some of the specifics about PEEC.

Question 1: Who is the primary audience for PEEC?

PEEC supports educators, developers, and publishers. For educators, the evaluation tool
clarifies what to look for when identifying or selecting instructional materials programs and
assessments for the NGSS. For developers and publishers, PEEC provides guidance on what to
focus on and integrate when designing instructional materials programs for the NGSS. This tool
(1) prepares educators to accurately identify, select, or evaluate resources and (2) helps
developers and publishers effectively design resources that meet criteria for the NGSS.

Question 2: How do the five innovations described in PEEC differ from the “conceptual
shifts” in Appendix A of the NGSS and the implications of the vision of the
Framework and the NGSS from the Guide to Implementing the NGSS?

PEEC focuses on what makes the NGSS new and different from past science standards. These
differences were first articulated as conceptual shifts in Appendix A of the standards. These
conceptual shifts still hold true today, but four years of standards implementation has refined
the understanding of what is unique about the NGSS and has revealed that these shifts
represent innovations in science teaching and learning.

The five “NGSS Innovations” described in PEEC are:

1. Making Sense of Phenomena and Designing Solutions to Problems. Making sense of


phenomena or designing solutions to problems drives student learning.
2. Three-Dimensional Learning. Student engagement in making sense of phenomena and
designing solutions to problems requires student performances that integrate grade-
appropriate elements of the Science and Engineering Practices (SEPs), Crosscutting
Concepts (CCCs), and Disciplinary Core Ideas (DCIs) in instruction and assessment.
3. Building K–12 Progressions. Students’ three-dimensional learning experiences are
designed and coordinated over time to ensure students build understanding of all three
dimensions of the standards, nature of science concepts, and engineering as expected
by the standards.
4. Alignment with English Language Arts and Mathematics. Students engage in learning
experiences with explicit connections to and alignment with English language arts (ELA)
and mathematics standards.
5. All Standards, All Students. Science instructional materials support equitable access to
science education for all students.

Question 3: How does PEEC relate to the EQuIP Rubric for Science?

The EQuIP Rubric for Science is designed to evaluate learning sequences and units for the
degree to which they are designed for the NGSS. It is embedded within PEEC as the tool for
evaluating a sample unit from the program as Phase 2 in the PEEC process. The evaluation from
this phase is combined with the PEEC Phase 1: Prescreen and PEEC Phase 3: Program Evaluation
to give an overall picture of how well the instructional materials program is designed for the
NGSS.

Question 4: Is this a science version of the Publisher’s Criteria that was developed for the
Common Core State Standards for mathematics?

Both PEEC and the Publisher’s Criteria documents are intended to inform both the developers
of instructional materials and those selecting which materials to use. The NGSS Innovations in
PEEC highlight the key differences between the NGSS and previous sets of standards and clarify
how these innovations should be represented in instructional materials.

Question 5: I'm interested in working with Achieve to train my teachers on how to use PEEC
to evaluate instructional materials. What should I do?

If you are interested in hiring Achieve to facilitate professional learning to support your district
team in using PEEC to select instructional materials, please contact peec@achieve.org. Training
for effective use takes a minimum of two days if the entire group has already received
professional learning for, and is comfortable using, EQuIP, and a minimum of four days if they
are not proficient in using EQuIP.

Question 6: I’m a science teacher. How should I use PEEC?

PEEC is designed to support building and district-level selection of year-long (or longer)
instructional materials programs designed for the NGSS. Sometimes this task falls to teachers to
coordinate. PEEC provides guidelines for a process that teams can use to evaluate instructional
materials programs.

If you are not part of your school or district’s instructional materials program selection process,
but you want to make sure that the process is focusing on the appropriate criteria, share and
discuss this tool with those responsible for making these decisions.

If you are looking for support in transitioning your classroom lessons and units, you may want
to review the NGSS Lesson Screener or the EQUIP Rubric for Science.

Question 7: I’m a school principal. How should I use PEEC?

While principals are not the primary audience for PEEC, there are several ways that it might be
relevant to your work. Some principals help with the selection of instructional materials for
their school or district, and PEEC includes both criteria and a process that can be used for that
purpose. If selecting instructional materials programs is not a part of your duties, then share and
discuss this tool with those science teachers and administrators who are responsible for making
these decisions.

Question 8: I’m a district science leader or curriculum coordinator. How should I use PEEC?

If you’re in charge of coordinating the selection of science instructional materials, PEEC is built
to help your team make good decisions about what materials to purchase (or even to wait to
purchase materials until you find something that better matches your expectations): the NGSS
Innovations described in PEEC will help your selection team to develop a common
understanding of what to look for in materials designed for the NGSS; PEEC Appendix A will
help you to think about building your team and fitting materials selection into your broader
implementation plan for science; and the three phases of the PEEC process will help you to
design the process that you use for materials selection. If your team is already well-versed in A
Framework for K-12 Science Education and the NGSS, anticipate about three full days of
professional learning to prepare your team for this effort and then several days to dig in and
evaluate the materials (depending on how many materials are evaluated).

Question 9: I’m a developer or publisher of science instructional materials. How should I use
the PEEC tool?

The NGSS Innovations section of PEEC describes the most significant changes from past science
standards to the NGSS and their implications for instructional materials. These innovations
should focus the efforts to design materials for the NGSS and should be clearly apparent to
those making instructional materials selection decisions. A developer might also use the PEEC
processes and tools internally to self-evaluate a program under development.

If you are interested in professional learning for your development staff to better understand
the evaluations and apply the rubric, or are interested in a confidential review of your
materials, please contact peec@achieve.org to discuss your needs in greater depth.

Question 10: Some instructional materials are more expensive than others. Why doesn’t
PEEC include cost estimates?

PEEC does not attempt to measure all things that might be considered in selecting instructional
materials. It is focused on evaluating how well an instructional materials program is designed
for the NGSS and asks reviewers to reflect on what the professional learning lift would be to
address any aspects of the innovations that are not well-supported in the materials. There are
some additional criteria in PEEC Appendix D that you may want to consider. Of course,
purchasers must determine how to weigh quality versus cost considerations in choosing
instructional materials.

Question 11: How is this document different from the Guidelines for the Evaluation of
Instructional Materials in Science?

The Guidelines for the Evaluation of Instructional Materials in Science is not a tool or process for
evaluating instructional materials; rather, it describes the research base for evaluative criteria
that should be considered in building tools and processes for evaluating instructional materials
designed for the NGSS. Its development was informed by early versions of EQuIP Rubric for
Science and PEEC, and it informed the most recent version of PEEC. The criteria for all three
phases of PEEC have a close connection to those presented in the Guidelines.

A full description of alignment to the Guidelines will be available in PEEC 1.2.

Question 12: This document is listed as “Version 1.1”. What’s different from version 1.0?

One of the pieces highlighted for revision in version 1.0 was, “Iterating the Innovations. How
can the arguments and discussion about the five NGSS Innovations be more clear and
straightforward?” We received feedback from users in the field and from field testing that
helped us to revise the language of the innovations to better convey their original intent. In
particular, version 1.1:

• highlights the importance of equity and access for all students as foundational to all five
innovations;
• separates the NGSS Innovations from their implications for instructional materials in the
NGSS Innovations section;
• revises the wording of the NGSS Innovations for clarity.

As was the case with the EQuIP Rubric for Science, we expect that as more and more teachers,
schools, districts, authors, developers, and publishers use PEEC, the feedback loops in that
process will lead to ongoing improvements in PEEC. Please send comments and suggestions to
peec@achieve.org.

Question 13: What’s coming in subsequent versions of PEEC?

Subsequent versions of PEEC will include the following enhancements:

Guidelines Alignment. Version 1.2 of PEEC will include a full description of alignment to the
Guidelines for the Evaluation of Instructional Materials in Science.

Sampling. More specific guidance will be provided about how to sample instructional materials
programs to best balance a rigorous review against the time commitment of the reviewers.

Evidence. More examples and specifics will be provided about what users should classify as
evidence, along with support for determining whether the quantity and quality of evidence
collected is sufficient to justify a particular claim.

Utility. The forms and tools will be made more useful for users, including templates and fillable
forms.

PEEC Professional Learning Facilitator’s Guide Coordination. Just like the EQuIP Rubric for
Science, a guide is currently under development to support leaders looking to facilitate
professional learning for a selection team. Future versions of PEEC will build a tighter
connection to the guide under development. This guidance will include:

• Streamlined processes for time-constrained users. Guidance will be provided for how
to adapt the PEEC tools and processes for situations that do not allow for the full
process due to resource limitations.
• Streamlined presentation of the document and related resources. PEEC’s design will be
enhanced to better support users who want to adapt their use to meet local needs.
• Teaming and Decision Making. More detailed support will be provided about how to put
together a materials selection team, how to manage and facilitate the decision-making
processes within that team, and how to connect instructional materials review to a broader
implementation plan.

PEEC is a work in progress. Please send comments and suggestions for improvement to
peec@achieve.org.

References
American Association for the Advancement of Science. (1989). Science for All Americans: A
Project 2061 Report. Washington, DC: AAAS.

American Association for the Advancement of Science. (1993). Benchmarks for Science Literacy.
New York, NY: Oxford University Press.

BSCS. (2017). Guidelines for the Evaluation of Instructional Materials in Science. Retrieved from
http://guidelinesummit.bscs.org.

Darling-Hammond, L. (2000). Teacher Quality and Student Achievement: A Review of State
Policy Evidence. Education Policy Analysis Archives.

Krajcik, J., Codere, S., Dahsah, C., Bayer, R., and Mun, K. (2014). Planning Instruction to Meet
the Intent of the Next Generation Science Standards. Journal of Science Teacher Education,
157–75.

Lee, O., Quinn, H., and Valdés, G. (2013). Science and Language for English Language Learners in
Relation to Next Generation Science Standards and with Implications for Common Core State
Standards for English Language Arts and Mathematics. Educational Researcher, 223–33.

National Academies of Sciences, Engineering, and Medicine. (2017). Seeing Students Learn
Science: Integrating Assessment and Instruction in the Classroom. Washington, DC: The National
Academies Press. doi: 10.17226/23548.

National Research Council. (1996). National Science Education Standards. Washington, DC: The
National Academies Press.

National Research Council. (2007). Taking Science to School: Learning and Teaching Science in
Grades K-8. Washington, DC: National Academies Press.

National Research Council. (2012). A Framework for K–12 Science Education. Washington, DC:
National Academies Press.

National Research Council. (2014). Developing Assessments for the Next Generation Science
Standards. Washington, DC: National Academies Press.

National Research Council. (2015). Guide to Implementing the Next Generation Science
Standards. Washington, DC: The National Academies Press.

NSTA. (2010). Science Anchors Project. http://www.nsta.org/involved/cse/scienceanchors.aspx

Quinn, H., Lee, O., & Valdés, G. (2012). Language demands and opportunities in relation to Next
Generation Science Standards for English language learners: What teachers need to know.
Stanford, CA: Stanford University, Understanding Language Initiative (ell.stanford.edu).

Rosebery, A. S., Ogonowski, M., DiSchino, M., and Warren, B. (2010). “The Coat Traps All Body
Heat”: Heterogeneity as Fundamental to Learning. The Journal of the Learning Sciences, 322–57.

NGSS Lead States. (2013). Next Generation Science Standards: For States, By States. Washington, DC: The National Academies Press.

Warren, B., Ballenger, C., Ogonowski, M., Rosebery, A. S., and Hudicourt-Barnes, J. (2001).
Rethinking diversity in learning science: The logic of everyday sense-making. Journal of
Research in Science Teaching, 529–52.

Tool 1A: PEEC Prescreen Response Form (Phenomena)
This tool is used during Phase 1: PEEC Prescreen to collect and organize data that describes how a single instructional materials
program supports students in making sense of phenomena and designing solutions to problems.

Making Sense of Phenomena and Designing Solutions to Problems: The instructional materials program focuses on supporting
students to make sense of a phenomenon or design solutions to a problem.

NGSS-designed programs will look less like the first statement in each pair below and more like the second:

Less like this: Making sense of phenomena and designing solutions to problems are not a part
of student learning or are presented separately from “learning time” (i.e., used only as a
“hook” or engagement tool; used only for enrichment or reward after learning; only loosely
connected to a DCI).
More like this: The purpose and focus of a learning sequence is to support students in making
sense of phenomena and/or designing solutions to problems. The entire sequence drives toward
this goal.

Less like this: The focus is only on getting the “right” answer to explain a phenomenon or
replicating a known solution to a problem.
More like this: Student sense-making of phenomena or designing of solutions is used as a
window into student understanding of all three dimensions of the NGSS.

Less like this: A different, new, or unrelated phenomenon is used to start every lesson.
More like this: Lessons work together in a coherent storyline to help students make sense of
phenomena.

Less like this: Teachers tell students about an interesting phenomenon or problem in the world.
More like this: Students get direct (preferably firsthand, or through media representations)
experience with a phenomenon or problem that is relevant to them and is developmentally
appropriate.

Less like this: Phenomena are brought into learning after students develop the science ideas
so students can apply what they learned.
More like this: The development of science ideas is anchored in making sense of phenomena or
designing solutions to problems.

Using the chart below, record evidence that the instructional materials program is designed
for each criterion, as well as evidence that the program is not designed for each criterion.

Less Like This: Evidence this criterion IS NOT designed into this instructional materials
program. (What was in the materials, where was it, and why is this evidence?)

More Like This: Evidence this criterion IS designed into this instructional materials
program. (What was in the materials, where was it, and why is this evidence? Shows Promise?)

Tool 1B: PEEC Prescreen Response Form (Three Dimensions)
This tool is used during Phase 1: PEEC Prescreen to collect and organize data that describes how a single instructional materials
program supports students in three-dimensional learning.

Three Dimensions: Students develop and use grade-appropriate elements of the science and engineering practices (SEPs),
disciplinary core ideas (DCIs), and crosscutting concepts (CCCs), which are deliberately selected to aid student sense-making of
phenomena or designing of solutions across the learning sequences and units of the program.

NGSS-designed programs will look less like this versus more like this:

Less like this: A single practice element shows up in a learning sequence.
More like this: The learning sequence helps students use multiple (e.g., 2–4) practice elements as appropriate in their learning.

Less like this: The learning sequence focuses on colloquial definitions of the practice or crosscutting concept names (e.g., “asking questions”, “cause and effect”) rather than on grade-appropriate learning goals (e.g., elements in NGSS Appendices F & G).
More like this: Specific grade-appropriate elements of SEPs and CCCs (from NGSS Appendices F & G) are acquired, improved, or used by students to help explain phenomena or solve problems during the learning sequence.

Less like this: The SEPs and CCCs can be inferred by the teacher (not necessarily the students) from the materials.
More like this: Students explicitly use the SEP and CCC elements to make sense of the phenomenon or to solve a problem.



Less like this: Engineering lessons focus on trial-and-error activities that don’t require science or engineering knowledge.
More like this: Engineering embedded in the learning sequence requires students to acquire and use elements of DCIs from physical, life, or Earth and space sciences together with elements of DCIs from engineering to solve design problems.

Using the chart below, record evidence indicating that the instructional materials program is designed for each criterion, as well as evidence that the program is not designed for each criterion.

Less Like This: Evidence this criterion IS NOT designed into this instructional materials program. (What was in the materials, where was it, and why is this evidence?)

More Like This: Evidence this criterion IS designed into this instructional materials program. (What was in the materials, where was it, and why is this evidence?)

Shows Promise? ☐



Tool 1C: PEEC Prescreen Response Form (Three Dimensions for
Instruction and Assessment)
This tool is used during Phase 1: PEEC Prescreen to collect and organize data that describes how a single instructional materials
program integrates the three dimensions for instruction and assessment.

Integrating the Three Dimensions for Instruction and Assessment: The instructional materials program requires student
performances that integrate elements of the SEPs, CCCs, and DCIs to make sense of phenomena or design solutions to problems,
and the learning sequence elicits student artifacts that show direct, observable evidence of three-dimensional learning.

NGSS-designed programs will look less like this versus more like this:

Less like this: Students learn the three dimensions in isolation from each other (e.g., a separate lesson or activity on science methods followed by a later lesson on science knowledge).
More like this: The learning sequence is designed to build student proficiency in at least one grade-appropriate element from each of the three dimensions. The three dimensions intentionally work together to help students explain a phenomenon or design solutions to a problem. All three dimensions are necessary for sense-making and problem-solving.

Less like this: Teachers assume that correct answers indicate student proficiency without the student providing evidence or reasoning.
More like this: Teachers deliberately seek out student artifacts that show direct, observable evidence of learning, building toward all three dimensions of the NGSS at a grade-appropriate level.



Less like this: Teachers measure only one dimension at a time (e.g., separate items for measuring SEPs, DCIs, and CCCs).
More like this: Teachers use tasks that ask students to explain phenomena or design solutions to problems, and that reveal the level of student proficiency in all three dimensions.

Using the chart below, record evidence indicating that the instructional materials program is designed for each criterion, as well as evidence that the program is not designed for each criterion.

Less Like This: Evidence this criterion IS NOT designed into this instructional materials program. (What was in the materials, where was it, and why is this evidence?)

More Like This: Evidence this criterion IS designed into this instructional materials program. (What was in the materials, where was it, and why is this evidence?)

Shows Promise? ☐



Tool 2: PEEC Prescreen: Recommendation for Review?
This tool is used by a reviewer upon completion of PEEC Phase 1: Prescreen to document their final recommendation for an
instructional materials program.

Reviewer Name or ID: ___________________________ Grade: _________ Lesson/Unit Title: _____________________________

Reminder

The purpose of the PEEC Prescreen is to give a quick look at an instructional materials program. Significant aspects of what would be expected in a fully vetted program designed for the NGSS are not addressed in this tool, so it should not be used to fully vet resources or to claim that a program is designed for the NGSS.

Overall Screening Summary

Recommendation

I recommend that this resource be evaluated using the full PEEC process: ________



Tool 3: Unit Selection Table
This tool is used by a group of reviewers to select matching or similar units to review from multiple instructional materials programs.

Unit Target — What commonality makes the units comparable?
[e.g., they address similar DCI-related topics (clarify which ones); they are designed to have students make sense of a similar phenomenon (clarify what makes the phenomenon similar); the unit is the best example of engineering integration in the program; etc.]

Unit Description — For each program, record: Instructional Materials Program Name | Unit (title and page numbers) | Why this unit?



Tool 4: EQuIP Rubric Data Summary
This tool is used to summarize the results of the EQuIP Review for Science analysis of a given unit in one instructional materials
program as part of PEEC Phase 2: Unit Evaluation.

Innovation | EQuIP Criterion | Evidence of Quality? | Unit Evaluation (summary)

Making Sense of Phenomena and Designing Solutions to Problems
I. A. Explaining Phenomena/Designing Solutions: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
Unit Evaluation: ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.

Three-Dimensional Learning
I. B. Three Dimensions: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
I. C. Integrating the Three Dimensions: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
III. A. Monitoring 3D Student Performances: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
III. B. Formative: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
III. C. Scoring Guidance: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
III. E. Coherent Assessment System: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
Unit Evaluation: ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.

Building K–12 Progressions
I. D. Unit Coherence: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
II. C. Building Progressions: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
II. F. Teacher Support for Unit Coherence: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
Unit Evaluation: ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.

Alignment with English Language Arts and Mathematics
I. F. Math and ELA: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
Unit Evaluation: ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.

All Standards, All Students
II. A. Relevance and Authenticity: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
II. B. Student Ideas: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
II. E. Differentiated Instruction: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
II. G. Scaffolded Differentiation over Time: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
III. D. Unbiased Tasks/Items: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
III. F. Opportunity to Learn: ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive
Unit Evaluation: ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.

Narrowing the Field?

Depending on how many programs made it to this phase of the analysis, the EQuIP Rubric for Science evaluations may be used to
continue to narrow the field of instructional materials programs being evaluated. After consensus reports have been generated for
each unit, the review team should evaluate whether or not all programs are worthy of further review. Unless the separation in
quality is very small, it is recommended that only the top two or three programs continue to the final phase of the PEEC process.



Tool 5A: Program-Level Evaluation Innovation 1: Making Sense of
Phenomena and Designing Solutions to Problems
This tool is to be used to collect evidence and make claims about how an instructional materials program addresses NGSS Innovation
1: Making Sense of Phenomena and Designing Solutions to Problems.

Directions

Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.

Claim: From the student’s perspective, most learning experiences are focused on making sense of phenomena and designing solutions to problems.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Claim: Guidance is provided to teachers to support students in making sense of phenomena and designing solutions to problems.
What to look for as evidence:
• One phenomenon/problem, or a series of related phenomena/problems, drives instruction and helps maintain a focus for all the lessons in a sequence.
• Guidance is provided to the teacher for how each of the lessons supports students in explaining the phenomena or solving the problem.
• Teaching strategies are provided to use student sense-making and solution-designing as a mechanism for making their three-dimensional learning visible.
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Summary and Recommendations

1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the program?

☐ Materials incorporate the innovation.


☐ Materials partially incorporate the innovation.
☐ Materials do not incorporate the innovation.



2. Reviewer Notes/Comments:

3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be needed for teachers to use the materials in a way that incorporates the innovation in their instruction.



Tool 5B: Program-Level Evaluation Innovation 2: Three-Dimensional Learning
This tool is to be used to collect evidence and make claims about how an instructional materials program addresses NGSS Innovation
2: Three-Dimensional Learning.

Directions

Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.

Claim: Student sense-making of phenomena and/or designing of solutions requires student performances that integrate grade-appropriate elements of the SEPs, CCCs, and DCIs.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Teacher materials communicate the deliberate and intentional design underpinning the selection of three-dimensional learning goals across the program.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Claim: Student materials include accessible and unbiased formative and summative assessments that provide clear evidence of students’ three-dimensional learning.
What to look for as evidence in the student materials:
• Materials regularly elicit direct, observable evidence of three-dimensional learning (SEP, DCI, CCC);
• Materials include authentic and relevant tasks that require students to use appropriate elements of the three dimensions; and
• Materials provide a range of item formats, including constructed-response and performance tasks, which are essential for the assessment of three-dimensional learning consonant with the Framework and the NGSS.
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Over the course of the program, a system of assessments coordinates the variety of ways student learning is monitored to provide information to students and teachers regarding student progress for all three dimensions of the standards.
What to look for as evidence in the assessment system:
• Consistent use of pre-, formative, summative, self-, and peer-assessment measures that assess three-dimensional learning;
• Consistent support for teachers to adjust instruction based on suggested formative classroom tasks; and
• Support for teachers and other leaders to make program-level decisions based on unit, interim, and/or year-long summative assessment data.
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Claim: When appropriate, links are made across the science domains of life science, physical science, and Earth and space science.
What to look for as evidence:
• Disciplinary core ideas from different disciplines are used together to explain phenomena.
• The usefulness of crosscutting concepts to make sense of phenomena or design solutions to problems across science domains is highlighted.
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Summary and Recommendations

1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the program?

☐ Materials incorporate the innovation.


☐ Materials partially incorporate the innovation.
☐ Materials do not incorporate the innovation.



2. Reviewer Notes/Comments

3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be needed for teachers to use the materials in a way that incorporates the innovation in their instruction.



Tool 5C: Program-Level Evaluation Innovation 3: Building Progressions
This tool is to be used to collect evidence and make claims about how an instructional materials program addresses NGSS Innovation
3: Building Progressions.

Directions

Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.

Claim: Students engage in the science and engineering practices with increasing, grade-level-appropriate complexity over the course of the program.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Students utilize the crosscutting concepts with increasing, grade-level-appropriate complexity over the course of the program.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Claim: The disciplinary core ideas are presented in a way that is scientifically accurate and grade-level appropriate.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Teacher materials make it clear how each of the three dimensions builds progressively over the course of the program in a way that gives students multiple opportunities to demonstrate proficiency in the breadth of the performance expectations addressed in the program.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Each unit builds on prior units by addressing questions raised in those units, cultivating new questions that build on what students figured out, or cultivating new questions from related phenomena, problems, and prior student experiences.
What to look for as evidence: For each of the units, look at the transitions into and out of the units. Are the units linked together from a student’s perspective?
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Claim: Teacher materials clearly explain the design principles behind the sequencing of the storyline.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Student materials engage students with the nature of science and engineering, technology, and applications of science over the course of the program.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Teacher materials make connections to the nature of science and to engineering, technology, and applications of science over the course of the program.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Summary and Recommendations

1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the program?

☐ Materials incorporate the innovation.


☐ Materials partially incorporate the innovation.
☐ Materials do not incorporate the innovation.

2. Reviewer Notes/Comments

3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be needed for teachers to use the materials in a way that incorporates the innovation in their instruction.



Tool 5D: Program-Level Evaluation Innovation 4: Alignment with English
Language Arts and Mathematics
This tool is to be used to collect evidence and make claims about how an instructional materials program addresses NGSS Innovation
4: Alignment with English Language Arts and Mathematics.

Directions

Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.

Claim: Materials engage students with English language arts in developmentally appropriate ways (supporting state English language arts standards).
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Materials engage students with mathematics in developmentally appropriate ways (supporting state mathematics standards).
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Claim: Teacher materials make connections to state mathematics and English language arts standards and incorporate teaching strategies that support this student learning where appropriate.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Summary and Recommendations

1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the program?

☐ Materials incorporate the innovation.


☐ Materials partially incorporate the innovation.
☐ Materials do not incorporate the innovation.



2. Reviewer Notes/Comments

3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be needed for teachers to use the materials in a way that incorporates the innovation in their instruction.



Tool 5E: Program-Level Evaluation Innovation 5: All Standards, All Students
This tool is to be used to collect evidence and make claims about how an instructional materials program addresses NGSS Innovation
5: All Standards, All Students.

Directions

Using the sampling evaluation plan, record evidence of where the innovation has been clearly incorporated into the materials as well
as instances where it does not appear to have been incorporated. Your evidence should include page numbers, a brief description of
the evidence, and an explanation of how it either supports or contradicts the claim.

Claim: Students have substantial opportunities to express and negotiate their ideas, prior knowledge, and experiences as they are using the three dimensions of the NGSS to make sense of phenomena and design solutions to problems.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Claim: Teacher materials anticipate common student ideas and include guidance to surface and challenge student thinking.
Evidence:
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive



Claim: Materials provide suggestions for how to attend to students’ diverse skills, needs, and interests in varied classroom settings.
What to look for as evidence:
• Appropriate reading, writing, listening, and/or speaking alternatives (e.g., translations, picture support, graphic organizers) for students who are English learners, have special needs, or read well below grade level
• Extra support (e.g., phenomena, representations, tasks) for students who are struggling to meet the targeted expectations
• Extensions for students with high interest, or who have already met the performance expectations, to develop deeper understanding of the practices, disciplinary core ideas, and crosscutting concepts
Sufficient evidence to support the claim? ☐ None ☐ Inadequate ☐ Adequate ☐ Extensive

Summary and Recommendations

1. Based on the evidence collected, to what degree do the materials incorporate this innovation over the course of the program?

☐ Materials incorporate the innovation.


☐ Materials partially incorporate the innovation.
☐ Materials do not incorporate the innovation.



2. Reviewer Notes/Comments

3. If this innovation is only partially incorporated, suggest additional professional learning or other support that would be needed for teachers to use the materials in a way that incorporates the innovation in their instruction.



Tool 6: PEEC Evidence Summary
This tool is to be used to summarize evidence collected in all three phases of PEEC.

Directions

Complete the table below by transferring the data from each of the three phases of PEEC.

Making Sense of Phenomena & Designing Solutions to Problems
Phase 1: Prescreen — Shows Promise? ☐
Phase 2: Unit Evaluation (EQuIP summary) — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.
Phase 3: Program-Level Evaluation — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.

Three-Dimensional Learning
Phase 1: Prescreen — Shows Promise? ☐
Phase 2: Unit Evaluation (EQuIP summary) — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.
Phase 3: Program-Level Evaluation — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.



Building K–12 Progressions
Phase 1: Prescreen — n/a
Phase 2: Unit Evaluation (EQuIP summary) — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.
Phase 3: Program-Level Evaluation — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.

Alignment with English Language Arts and Mathematics
Phase 1: Prescreen — n/a
Phase 2: Unit Evaluation (EQuIP summary) — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.
Phase 3: Program-Level Evaluation — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.



All Standards, All Students
Phase 1: Prescreen — n/a
Phase 2: Unit Evaluation (EQuIP summary) — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.
Phase 3: Program-Level Evaluation — ☐ Materials incorporate the innovation. ☐ Materials partially incorporate the innovation. ☐ Materials do not incorporate the innovation.



Tool 7: Final Evaluation
This tool is used at the end of the PEEC process to make a final recommendation about an instructional materials program.

Directions

Reflect on the summary table and the other evidence collected to make a final claim about whether the instructional materials
program is designed to provide adequate and appropriate opportunities for students to meet the performance expectations of the
NGSS. Once this claim is established, explain how the data in Tool 6: PEEC Evidence Summary support this
conclusion and highlight the most compelling evidence from each of the phases of PEEC to support the claim. After establishing the
evidence for the claim, summarize any recommendations for what would need to happen during implementation of the materials to
address any weaknesses that were identified in the analysis.

Claim

Title of instructional materials under review: ______________________________________ (does/does not) provide adequate and
appropriate opportunities for students to meet the performance expectations of the NGSS.

Evidence-Based Response

Recommendations

