
Top 10 Questions to Ask When Comparing Interim Assessments


Not all interim assessments are created equal. They vary widely in their designs, purposes, and validity. Some deliver data powerful enough for educators to make informed decisions at the student, class, school, and district level, and some don't. Along with measuring student growth and achievement, quality interim data can provide stability during times of transition in standards and curriculum.

Below, you'll find 10 key questions to ask as you investigate interim assessments to meet your needs.
1. Was the assessment designed to provide achievement status AND growth data?

Why it's important: Growth data help engage students in a growth mindset; they learn their ability isn't fixed, but can increase with effort. Data that fail to reflect student growth deny students and their communities the opportunity to be proud of what they accomplished. An assessment is only as good as the scale that forms its foundation.

What to look for: Look for a stable vertical scale that is regularly monitored for scale drift.

2. Is the assessment adaptive?


Why it's important: Adaptive assessments choose items based on a student's response pattern, so the student's true ability level can be measured with precision in the fewest number of test items.

What to look for: Look for whether the assessment adapts to each student with each item. Many so-called adaptive assessments adapt after a bank of questions has been answered, resulting in less precision in the results.
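For readers who want a concrete picture of what "adapts with each item" means, the following is a minimal, hypothetical sketch in Python, not NWEA's algorithm: under a simple Rasch-style model, every response nudges the ability estimate and the next item is chosen to match it.

```python
# A minimal sketch, not NWEA's algorithm: it illustrates item-by-item
# adaptation under a simple Rasch-style model. After every response the
# ability estimate moves, and the next item is chosen to match it.
import math, random

def p_correct(ability, difficulty):
    # Rasch model: probability of a correct answer.
    return 1.0 / (1.0 + math.exp(difficulty - ability))

def adaptive_test(difficulties, respond, length=20, step=0.8):
    ability, used = 0.0, set()
    for _ in range(min(length, len(difficulties))):
        # Choose the unused item whose difficulty best matches the estimate.
        item = min((i for i in range(len(difficulties)) if i not in used),
                   key=lambda i: abs(difficulties[i] - ability))
        used.add(item)
        correct = respond(difficulties[item])
        ability += step if correct else -step   # adapt after every item
        step *= 0.85                            # settle as evidence accumulates
    return ability

# Simulated student with true ability 1.2 answering items from a uniform pool.
pool = [d / 10 for d in range(-30, 31)]
estimate = adaptive_test(pool, lambda d: random.random() < p_correct(1.2, d))
print(round(estimate, 2))
```

A test that only adapts after whole blocks of items would, by contrast, hold the difficulty fixed for many responses before adjusting, which is the source of the lower precision noted above.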




3. Does it provide items at the appropriate difficulty level for each student?

Why it's important: Instructional readiness does not always relate directly to grade level. Students may be at, above, or below grade level in terms of what they are ready to learn.

What to look for: Look for an assessment that adapts at-grade and out-of-grade, so that it's able to measure the student's true starting point. Also review the span of content the assessment covers. An assessment that informs educators about each student's instructional readiness draws on content that spans across grades.

4. Does the assessment link to relevant resources to support instruction?

Why it's important: Data linked to views, tools, and instructional resources can help educators answer the critical question, "How do we make these data actionable?"

What to look for: Look for links to instructional resources that support students in learning what they are ready to learn. These resources may include open-education resources (OER), full-blown curricula, or resources linked to blended learning models.

5. Do the data inform decision-making at the classroom level?

Why it's important: At the classroom level, ability grouping is one of the key uses for data; this includes differentiating instruction as well as identifying students for programs and resources that will best support their needs.

What to look for: Look at whether the report views and tools empower classroom-level instructional decisions and simplify differentiation and grouping.

6. Can the data inform decision-making at the building and district levels?

Why it's important: Report views that aggregate for a school or multiple sites serve building- and district-level administrators' data needs.

What to look for: Look at the features and capabilities of the reports and tools to maximize the value of the assessment data. The more assessment data are leveraged, the less time needs to be spent gathering them, lending efficiency to your assessment process.

7. Is the item pool sufficient to support the test design and purpose?

Why it's important: An item pool's sufficiency can be determined by a number of factors, including its development process, maturity, size, depth, and breadth. High-quality items are a critical component of any assessment.

What to look for: Look at the item development, alignment, and review processes. This information should be available in the assessment provider's Technical Manual.

8. Do the assessment data have validity?

Why it's important: Every assessment is designed with a purpose, or purposes, that its data can support. Validity, meaning whether an assessment measures what it intends to measure, ensures that the inferences made from the data are sound.

What to look for: Look for the test design and its intended purpose(s). Understand what content is covered, how the test should be administered and scored, and its standard error of measurement.

9. Does the assessment provider develop norms from their data? How often are the norms updated?

Why it's important: Norms can provide a relevant data point that contextualizes a student's assessment results and helps students and teachers with goal setting. It is also important to provide context around student growth and answer questions such as "How much growth is sufficient?" and "Is the student gaining or losing ground relative to their peers?"

What to look for: Look for the assessment provider's practices around norming, including how often normative studies are conducted, whether the population is nationally representative, and whether status and growth norms are developed.
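As a rough illustration of how status norms contextualize a score, the hypothetical sketch below assumes a provider publishes a mean and standard deviation per grade and term and converts a scale score to a percentile; the numbers are invented for illustration, not actual norms.

```python
# Illustrative sketch only. Assuming status norms are published as a mean and
# standard deviation per grade and term, a scale score can be placed at a
# percentile for goal-setting conversations. Values below are invented.
from statistics import NormalDist

# Hypothetical norms table: (grade, term) -> (mean scale score, standard deviation)
NORMS = {(5, "fall"): (211.4, 15.2), (5, "spring"): (221.0, 15.9)}

def percentile(score, grade, term):
    mean, sd = NORMS[(grade, term)]
    return round(NormalDist(mean, sd).cdf(score) * 100)

print(percentile(225, 5, "fall"))  # a fall score of 225 lands at roughly the 81st percentile here
```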

10. Can the assessment make predictions of student performance on high-stakes summative year-end tests and college benchmarks?

Why it's important: Knowing if students are on track to achieve proficiency on state assessments helps teachers make adjustments in instructional pacing, plan interventions, and provide additional resources.

What to look for: Look for predictive studies that link student scores to proficiency levels for their state assessments or college entrance examinations.
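To show how such a study might be used in practice, here is a small hypothetical sketch: assuming a linking study publishes, for each grade, the interim score associated with a strong likelihood of later proficiency, a teacher can quickly flag who is on track. The cut scores and names below are invented for illustration.

```python
# Illustrative sketch only. Hypothetical cut scores from a linking study that
# associates an interim scale score with likely year-end proficiency.
ON_TRACK_CUTS = {3: 201, 4: 209, 5: 215}

def on_track(scale_score, grade):
    return scale_score >= ON_TRACK_CUTS[grade]

roster = [("Ana", 5, 222), ("Ben", 5, 210)]
for name, grade, score in roster:
    status = "on track" if on_track(score, grade) else "may need intervention"
    print(f"{name}: {status}")
```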

Learn more about NWEA by visiting NWEA.org.


Northwest Evaluation Association (NWEA) has nearly 40 years of experience helping educators move student learning
forward through computer-based assessment suites, professional development offerings, and research services.

Partnering to Help All Kids Learn | NWEA.org | 503.624.1951



© Northwest Evaluation Association 2014. All rights reserved. Measures of Academic Progress, MAP, and Partnering to Help All Kids Learn are registered trademarks, and Northwest Evaluation Association and NWEA are trademarks of Northwest Evaluation Association.
