
Chapter 4
Preparing Instructional
Design Objectives and
Assessment Strategies

The great aim of education is not knowledge but action.


—Herbert Spencer

Chapter 4 Topics
▪ The role of objectives in instruction and instructional design
▪ Essential characteristics and components of instructional design objectives
▪ How to decide on appropriate assessment formats for various types of objectives
▪ Procedures for writing effective instructional design objectives
▪ Common errors and problems in writing instructional design objectives

Chapter 4 Learning Outcomes


1. Identify the roles that instructional design objectives play in a systematic design process.
2. Analyze instructional design objectives for missing characteristics and components.
3. Apply criteria for selecting appropriate assessment formats for given situations.
4. Identify and sequence the steps required to write effective instructional design objectives.
5. Apply criteria for effective instructional design objectives by correcting objectives that are not
stated appropriately.


SCENARIO
The essential role of objectives and assessments
Aubrey Fair was an instructional designer for a large training consultant firm. A manufacturing company had hired his firm to update training in antitrust laws that it required all its managers to take. It was imperative that the company's managers knew these laws well and didn't inadvertently break any antitrust rules, because the company would be held responsible for any infractions. For the last few years, a workshop had been offered by the company's lead attorney, who was an expert in antitrust laws. Aubrey was not told what needed to be updated; he was simply instructed to begin by meeting with the attorney.
Aubrey greeted the attorney cordially. "So how long have you been offering these antitrust workshops?" he asked. "For about two-and-a-half years," replied the attorney stiffly, "and, frankly, I don't see why we need to change them at all. They've been working just fine up to now." Aubrey immediately sensed the need to tread carefully. He was obviously invading the attorney's domain, but he needed to know how the training had been conducted in the past.
“Yes, I’ve heard you’re the company’s legal expert, and I’m very interested in
the approach you’ve been using in your workshops,” said Aubrey amiably. “Can you
share some of your materials with me? I’m especially interested in the objectives of
the workshop.”
“They’re quite straightforward, as you can see,” said the attorney who handed
him a notebook. “This is my instructor manual and handouts.” Aubrey read a few of
the statements on the list labeled “Workshop Objectives.” They included:
• Review the definition of “antitrust” as reflected in the basic laws.
• Review the purposes and main points of each of the laws.
• Give students an appreciation for the purposes of antitrust laws in business.
• etc.

Aubrey said, “Hmmm, I see. How do you tell if the workshop participants learn
what you have in mind? Do you have tests or assessments to measure what they’ve
learned?”
The attorney said, "Oh, yes, I can always tell they really get it. There are always a lot of good questions, and everyone is very enthusiastic about the content. I always have one or two come up afterward to shake my hand and tell me they're glad they attended. These are high-level guys, though, and I feel they would find tests demeaning. We do a debriefing at the end and go through a checklist together to make sure they know everything they should. It really works well." Aubrey thanked the attorney and asked to take the notebook with him to look over.

BACKGROUND ON OBJECTIVES AND ASSESSMENT STRATEGIES IN INSTRUCTIONAL DESIGN
A foundational concept of instructional design is that effective instruction not only results in learning, which is an internal, unseen change in learners, it also makes possible a change in action or performance, which is an external, observable change. To paraphrase the opening quote by Spencer, the desired result of instruction is action as well as knowledge. In the exchange between the attorney and the instructional designer in the scenario at the beginning of this chapter, the designer was expecting to see the workshop objectives as statements of actions that learners have to do to demonstrate that they have achieved the intended results from the instruction. He also was looking for clear, observable indicators that the attorney used to determine that workshop participants had met the objectives. Instead, the attorney gave the designer a list of teaching activities that he, the attorney, did during the workshop. Learning the workshop content on antitrust laws was important to the participants; it should make them familiar enough with situations they might encounter where antitrust rules applied that they would not inadvertently break any of these rules. The attorney-instructor felt that the workshop participants really "got it." But how could he tell?
Statements of objectives are most helpful when they communicate clearly and
unambiguously the actions students are to do to show they have learned. Although
this may sound like a commonsense approach, it is no easy matter to write such clear,
specific objectives, even for those who are experts in the content. This section begins
by reviewing the key role that instructional design objectives play in a systematic
design approach, reviewing essential characteristics and components of these objec-
tives, and giving examples of them in a variety of different content areas.

Listen to Learn How Writing Objectives and Planning Assessment Strategies Improves Instruction

A Review of Instructional Roles for Objectives


Before describing essential characteristics of good objectives and how to go about writing them, this section provides background on the purposes that objectives serve in instruction and why they have been called by various names in the past. It also differentiates objectives from another term used to describe student performances that instruction enables: standards.

PURPOSES OF OBJECTIVES IN INSTRUCTION.  As Waugh and Gronlund (2013) observe, objectives play a key role in both instruction and assessment. "By describing the performance that we are willing to accept as evidence of achievement, we provide a focus for instruction, student learning, and assessment. Objectives help keep all three in close harmony" (p. 35). Objectives like the ones at the beginning of textbook chapters serve as "learning guides" for students. Readers use them not only to focus on information they are to derive from the chapter, but also to see how they will be required to demonstrate that they understand it.
Objectives that are written in a more detailed and precise way than other kinds of objectives serve an essential role for instructional designers. After designers write objectives, they focus all subsequent design activities on making students able to demonstrate the behaviors described in the objective statements. Thus, objectives serve as a framework for creating the instructional materials. Objectives also provide criteria by which designers and others judge the quality of instruction. If students can do the activities described in the objectives, the instruction is deemed to be successful. If they cannot do the activities, the instruction is considered to be in need of revision.
Robert Mager's 1962 foundational book Preparing Instructional Objectives not only described how to write clear, unambiguous instructional objectives, or postinstruction actions students must be able to demonstrate to show they have achieved the intended results from the instruction, it also made the practice very popular in education and training. The 1970s found school districts and other institutions engaged in writing instructional objectives for every topic they taught. However, many of these organizations never created actual assessments linked to these objectives or made sure that instruction was in place to help bring about the outcomes they specified. Therefore, the most important role of instructional objectives was never served. If objectives are to be most useful in improving instruction, they are not ends in themselves, but rather the first in a series of carefully linked design activities.

STANDARDS VS. OBJECTIVES.  In the last decade, content standards have become an increasingly well-known and important kind of performance "target" in education and training. For example, in the United States, every state has a set of standards for what students are to learn in each content area. In addition, Common Core Standards have been created by the National Governors Association Center for Best Practices and the Council of Chief State School Officers (http://www.corestandards.org). At this time, 45 states and the District of Columbia, four territories, and the Department of Defense Education Activity have adopted the Common Core Standards. While these are definitely statements of what students should be able to do after instruction, they are more global in nature than those required for instructional design purposes. For example, look at the following comparison between one of the Common Core Standards and three different objectives that might be designed to measure achievement of that standard. In Figure 4.1, see how a single standard can be assessed in many different ways with different actions and criteria for meeting it.

TERMS FOR OBJECTIVES.  Various terms have been used to describe the behaviors students should be able to do as the result of instruction. These include: behavioral objectives, instructional objectives, objectives, outcomes, outcome-oriented objectives, and performance objectives. However, all of these terms are used in contexts other than systematic instructional design, and the meaning becomes clear only if the reader knows the context and purpose for which they are being used. The term instructional design objective is used in this design model to clarify that it is the product of this instructional design step: a statement of behaviors and assessment criteria that instructional designers write to specify what learners should be able to achieve as a result of the instruction. This term also helps differentiate statements of objectives that are useful for design purposes from those given to students or stated in textbooks, because the latter may not be as detailed or stated in the same way as those needed to drive instructional design.

Common Core Standard for Grade 4 Language Arts: L.4.5 Explain the meaning of simple similes and metaphors (e.g., as pretty as a picture) in context.

Measurable Performance Objectives for the Standard:
1. In at least 8 of 10 sentences that each contain an underlined simile or metaphor, write below the sentence a synonym for the figure of speech.
2. Given a 6- to 8-sentence paragraph containing a total of 2 similes and 2 metaphors and a list of meanings for them below the paragraph, circle all 4 figures of speech and write each beside its correct meaning.
3. In 10 short poems, 5 of which contain a simile and 5 of which contain a metaphor, identify the figure of speech correctly in at least four of each set by circling it and writing its meaning below the poem.

FIGURE 4.1  Example standard and performance objectives matched to it.



Check Your Understanding 4.1

Objective 1 Exercise—Roles for Instructional Objectives. Place a check by each of the following that is a role instructional design objectives should serve:
______ 1. Serve as learning guides for textbook readers
______ 2. Serve as a framework for creating the instructional materials
______ 3. Provide criteria by which designers and others judge the quality of instruction
______ 4. List all the required steps in presenting an instructional sequence
______ 5. Serve the same role as standards such as the Common Core Standards
______ 6. Communicate clearly actions students are to do to show they have learned
______ 7. Focus design activities on making students able to demonstrate stated behaviors


Essential Characteristics and Components of Instructional Design Objectives
While all designers agree that objectives provide an important foundation for instructional design, several formats for objectives have emerged over the years, all with the purpose of making the outcomes of instruction clear and unambiguous for design purposes. The most popular formats are the ABCD, Gagné, and Mager formats, and they differ only in the number of components required to make this unambiguous quality possible. A comparison of these formats is shown in Table 4.1.
These formats were developed by instructional design experts in 1962 (Mager, three components), 1968 (ABCD, four components), and 1974 (Gagné & Briggs, five components). However, as the examples in Table 4.1 show, there was still considerable overlap among them. Each one included components its developers felt would clarify the outcomes and make the statements most useful to designers. However, many designers report difficulty in making all outcomes fit a given format. There seems to be no one-size-fits-all method of specifying what should be included. And, although some instructional design models call for writing objectives before identifying assessment strategies, experience has shown that designers cannot really prepare objectives without considering assessment methods. They develop the actual assessment materials later, but deciding how assessment will be done is inextricably connected with how students will demonstrate what they have learned.
Therefore, objectives and assessments should be considered together, and the format of the objective depends in large part on the type of learning outcome. In order to be clear statements of what students are to do, some objectives may require only three components and others four or five. The purpose that designers must keep central in their minds is to make objectives communicate clearly. To do this, all objectives should have the essential characteristics and components discussed in the following sections. How to write statements that meet these criteria is described later in this chapter under the section Preparing Instructional Design Objectives.

ESSENTIAL CHARACTERISTICS.  No matter how they are stated, instructional design objectives should reflect certain qualities. First, there should always be an observable action of some kind (e.g., write, create) rather than just an internal ability (e.g., understand, know, learn) or a statement of content to be covered (e.g., review three chapters). Second, the focus should always be on the actions of students after instruction, rather than those of the teacher or student during instruction. Finally, statements should be so unambiguous that anyone reading them knows exactly what students are to do to show they have learned. It is not necessary to state an objective in only one sentence. Clarity and specificity are the most important qualities for instructional design objectives, and achieving these qualities may require several sentences or a series of phrases.

TABLE 4.1  Three Popular Formats for Instructional Design Objectives

ABCD Format
Developed by: Instructional Development Institutes, 1968 (Seels & Glasgow, 1998)
Components: Four parts: Audience (type of students), Behavior (action verb), Conditions, and Degree (criteria for judging performance)
Example components: A = participants in a preconference web design workshop; B = make text and graphic items into links; C = given a sheet of paper that designates five text and graphic items and what they are to link to, and an on-screen web page in an editor software that contains all the items; D = correctly working for five designated items
Resulting objective statement: When given (1) a sheet of paper with a list of five text and graphic items and what they are to link to, and (2) a web page in an editor software that contains all the items, participants in a preconference web design workshop will make five designated text or graphic items from the page into correctly working links.

Gagné Format
Developed by: Robert Gagné and Leslie Briggs (1974)
Components: Five components: characteristics of the stimulus situation, learned capability verb, object of verb, action verb, and special conditions
Example components: Stimulus = given a sheet of paper that designates five text and graphic items and what they are to link to, and an on-screen web page in an editor software that contains all the items; Capability verb = demonstrate; Object of verb = correct procedure for making links; Action verb = creating links; Conditions = correctly working in all five items
Resulting objective statement: When given (1) a sheet of paper with a list of five text and graphic items and what they are to link to, and (2) a web page in an editor software that contains all the items, the student will demonstrate the correct procedure for making links by creating correctly working links for all five items.

Mager Format
Developed by: Robert Mager (1962)
Components: Three components: Behavior (action verb), Conditions, and Criteria for judging performance
Example components: Behavior = create correctly working links; Conditions = given a sheet of paper that designates five text and graphic items and an on-screen web page in an editor software that contains all the items; Criteria = for all five items
Resulting objective statement: When given a sheet of paper with a list of five text and graphic items and an on-screen web page in an editor software that contains all the items, the learner must create correctly working links for all five items.


ESSENTIAL COMPONENTS.  Objective statements are most helpful for design purposes when they have certain components. At a minimum, each statement should contain three items to specify how students will demonstrate what they have learned: action, assessment, and performance level.
• Action.  The action the student is required to do is derived from the behavior identified in the learning map, which designers create in the step before writing objectives. (See Chapter 3.) For example, one of the outcomes in the 3-D Drawing Sample Project learning map in Chapter 3 is "Complete 3-D drawing model." The obvious action that would demonstrate knowledge of drawing principles is: "Draw a model." Actions should always be expressed as observable activities, for example: design, write, solve, draw, make, choose. Avoid action verbs that describe internal conditions that cannot be directly seen and measured. Examples of these "verbs to avoid" are: understand, know, appreciate, and feel.
• Assessment.  The designer must identify the circumstances under which the student will complete the action. This may include methods, student/instructor materials, and/or special circumstances that will apply as students show what they have learned. Many objectives do not require that all four of the following components be specified in order to make an objective clear enough for design purposes; it depends on the type of learned behavior and what the designer considers necessary for a valid assessment.
– Methods.  The objective should identify the means of assessing the action. Completing a test or survey, doing a verbal description, performing an activity, or developing a product all are possible assessment methods. (For details on assessment method options, see the following section, Essential Criteria for Selecting Appropriate Assessment Methods.)
– Student materials.  Assessment may require that students have additional materials such as data charts and tables, calculators, dictionaries, or textbooks available to them. If so, the objective should state them.
– Instructor materials.  Materials such as a rubric or a performance checklist may be required so that instructors can rate or track performance. A rubric is a scoring guide, and a performance checklist is a list of component tasks or activities in a performance. (Both will be discussed in more detail in Chapter 5.) For example, if students do web page layouts, the products might be judged by a rubric or criterion checklist.
– Special circumstances.  Sometimes the objective must include a description of certain conditions under which the assessment will be done. For example, students must do an activity within a certain time limit or without any supporting materials.
• Performance level.  Perhaps the most difficult part of writing an objective is specifying how well students must do an activity, or how much of it they must do, to show they have the necessary level of expertise. Designers must decide what will constitute acceptable performance and specify it. Depending on the assessment method, there are several ways to express acceptable performance levels.
– Number correct.  Students may need to do a certain number of items or activities correctly to demonstrate they have learned. If the assessment method is a written test, the percentage or number of items required for passing the test should be stated. If the action is a motor skill such as operating a piece of equipment, students may need to do it correctly a certain number of times.

– Level of accuracy.  If the designer knows there will be variation in the action, the tolerance for this variation should be specified. For example, if an architectural student is required to calculate the weight a structure will bear, a tolerance range in pounds or ounces must be stated.
– Rating.  If the quality of performances or products is measured by a rubric or checklist, the acceptable rating must be given. For example, in the web page example, if a rubric is used to assess the quality of the student's web page design, the designer would have to specify what would constitute an acceptable rubric score. If students are to complete a series of activities, the rating may be how many of the total number they must complete.
See Table 4.2 for examples of objectives that reflect all these components.

TABLE 4.2  Sample Instructional Design Objectives with Essential Components

Example 1
Target behavior from learning map: The student identifies examples of text, images, links, and tables on a web page.
Action: The student labels elements of a web page.
Assessment: The student labels a sample page printout randomly selected by the instructor from 10 printouts. On each page, 15 elements are indicated with an arrow and a numbered line. The student must label all parts within 10 minutes. (Spelling does not count.)
Performance level: The student must correctly label 14 of 15 elements.

Example 2
Target behavior from learning map: The student classifies sentences as simple, compound, complex, or compound–complex.
Action: The student identifies sentences in a paragraph as to type.
Assessment: The paragraph on a computer screen has 15 sentences with at least 2 of each type represented. The instructor assigns a color code for each type. The student codes all 15 sentences within 10 minutes.
Performance level: At least 14 of the 15 must be correctly coded.

Example 3
Target behavior from learning map: The student demonstrates the procedure for using AutoCAD to create a 3-D plane in space.
Action: The student creates a CAD drawing of a structure with 3-D planes.
Assessment: On an AutoCAD screen, the student draws a roof with the correct size and shape within 10 minutes and with no reference materials. The instructor grades with a checklist.
Performance level: The roof drawing must meet at least 9 out of 10 accuracy and quality criteria on the instructor checklist.

Example 4
Target behavior from learning map: The student states names for all bones of the shoulder, wrist, and hand.
Action: The student labels bones on a computer-generated image of the skeleton.
Assessment: On a computer-screen image of the upper extremity, students enter the name of the bone on the line opposite it, all within 30 minutes, using no reference materials; spelling counts.
Performance level: Passing score is at least 61 of 64.

Example 5
Target behavior from learning map: The student executes a typing exercise at 60 WPM.
Action: The student types a paragraph at 60 WPM.
Assessment: The student uses Microsoft Word software to type an assigned paragraph. If needed, the instructor will assist with setting up a new Word document. Students will be given the paragraph on paper and a verbal signal to begin and end the test.
Performance level: The paragraph must contain no more than three typographical errors.
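For readers who find it helpful to think of these requirements in code, an instructional design objective can be modeled as a simple record whose fields mirror the three essential components, with a routine that flags the common errors described later in this chapter. This is a minimal illustrative sketch, not part of the design model itself; the class name, field names, and the short verb list are all assumptions made for the example.

    from dataclasses import dataclass

    # Internal-state verbs the chapter says to avoid (an illustrative subset):
    VERBS_TO_AVOID = {"understand", "know", "appreciate", "feel", "learn"}

    @dataclass
    class DesignObjective:
        action: str             # observable student behavior after instruction
        assessment: str         # method, materials, and special circumstances
        performance_level: str  # e.g., "14 of 15 correct within 10 minutes"

        def problems(self) -> list[str]:
            """Flag missing components and unobservable action verbs."""
            issues = []
            first_verb = self.action.split()[0].lower() if self.action.strip() else ""
            if first_verb in VERBS_TO_AVOID:
                issues.append(f"'{first_verb}' names an internal state, not an observable action")
            if not self.assessment.strip():
                issues.append("no assessment strategy specified")
            if not self.performance_level.strip():
                issues.append("no performance level specified")
            return issues

    draft = DesignObjective("Know how to write a C++ subroutine", "", "")
    print(draft.problems())  # flags all three kinds of problems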

Check Your Understanding 4.2

Objective 2 Exercise—Characteristics and Components of Instructional Design Objectives.


In each of the following statements, identify which required characteristics and components it lacks. Place the letter(s) of the missing characteristic or component (listed below) on the line next to the statement. (NOTE: Some statements will be missing more than one characteristic or component.)

Incorrect Objective Statements
______ 1. Create a 5-minute video using Adobe Premiere software; the video product will be evaluated by a rubric.
______ 2. Identify the appropriate IRS form for given tax-reporting needs by selecting the form name/number from a list of possible forms in an online testing program.
______ 3. Teach the correct protocol for setting a broken arm by demonstrating the operation on a patient mannequin.
______ 4. Know how to write a subroutine in the C++ programming language.
______ 5. Correctly dissect a frog using a computer simulation. The operation will be graded by a performance checklist.

Required Characteristics/Components
A. The target behavior is stated.
B. The behavior is observable.
C. It is stated as a student (not teacher) action.
D. It is a postinstruction behavior.
E. It communicates clearly.
F. It contains an assessment strategy.
G. It contains a performance level.


Essential Criteria for Selecting Appropriate Assessment Methods


Because assessment methods are an important component of instructional design objectives, writing objectives requires designers to select from the array of available assessment techniques. This section reviews the characteristics and uses of each type of assessment method and the factors to consider when selecting one. Chapter 5 describes in detail how to prepare effective assessment materials of each kind.

MENTAL SKILLS AND INFORMATION TESTS.  In recent years, test formats called mental skills and information tests (or simply tests) long used in education and training (e.g., multiple-choice tests) have come under various kinds of criticism. These are instruments consisting of individual items that are intended as indirect measures of student abilities. Some educators feel the instruments are overused and are valid measures of learning primarily for lower-level skills. However, tests remain the most commonly used assessments in education and training, and many educators feel that, when properly developed and applied, they can effectively assess learning at many different levels. Although most of these methods require a relatively simple external response from the student, they can require a complex internal process. For example, the multiple-choice example in Table 4.3 requires only that the student read the item and circle a choice. However, in order to get the item correct, the student must first solve a complex problem. Another criticism of true/false, multiple-choice, and matching formats is that students can get some items correct by guessing. However, several techniques are used to address this potential problem. For example, in a multiple-choice test, designers may require a number of correct items and can provide carefully crafted wrong answers, or distractors, based on answers that can result from incorrect processes.
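To see why a cutoff such as "14 of 15 correct" neutralizes the guessing criticism, it helps to work out how unlikely blind guessing is to reach that cutoff. The following sketch is illustrative only; it assumes four options per item and independent guesses, and computes the binomial probability:

    from math import comb

    def p_pass_by_guessing(n_items: int, n_needed: int, n_choices: int) -> float:
        """Probability that blind guessing alone reaches the passing cutoff."""
        p = 1.0 / n_choices  # chance of guessing any one item correctly
        return sum(
            comb(n_items, k) * p**k * (1 - p) ** (n_items - k)
            for k in range(n_needed, n_items + 1)
        )

    # A 15-item, four-option test with a 14-correct cutoff:
    print(f"{p_pass_by_guessing(15, 14, 4):.1e}")  # about 4e-08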

PERFORMANCE MEASURES.  Checklists and rubrics have gained popularity in recent years as they became associated with constructivist teaching methods and "authentic assessment," or requiring a behavior that simulates a real-world application of a learned ability. For example, students may demonstrate a combination of mathematical and problem-solving skills by developing a solution to a scenario, or they may work in a small group to create a multimedia presentation to show the results of research as well as skills in working cooperatively with others. Another popular approach is student portfolios: collections of people's work products over time, arranged so that they and others can see how the person's skills have developed and progressed. Instructors use checklists, rubrics, or a combination of these to rate complex work or products. Several organizations have developed and tested checklists and rubrics to support their own activities and have offered them for use by others with similar needs. Examples of some of these materials may be found in the Instructor Manual and, when applicable, may be used by students in their instructional design projects.

ATTITUDE SURVEYS.  When the objective of the instruction is to change students' perceptions or behavior, Likert scales or semantic differentials ask them how they feel about a topic or what they would do in a given situation. Likert scales are assessments that ask the degree to which one agrees with statements, and semantic differentials are assessments that ask where one's views of something fall between a set of bipolar adjectives. (Both will be discussed in more depth in Chapter 5.) Of course, we can never be certain that what students say they will do on attitude measures is what they actually will do. For example, a survey found a disconnect between what students said they wanted to eat and what university food-service managers observed them choosing to eat. The students said they wanted to eat healthy food like salads and fruit; however, the most popular foods were pizza and hamburgers (Farrell, 2002). Because most actions cannot be observed directly, attitude measures remain the most useful ways to infer students' likely performance and, thus, to indicate that the instruction has had the desired impact.
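When a Likert instrument is scored, responses are typically coded to numbers and averaged. Here is a small sketch of that convention; the five-point coding and the response strings are assumptions for illustration, since real instruments vary and may reverse-score negatively worded items:

    # Assumed conventional coding: SA=5, A=4, U=3, D=2, SD=1
    LIKERT_CODES = {"SA": 5, "A": 4, "U": 3, "D": 2, "SD": 1}

    def mean_attitude_score(responses: list[str]) -> float:
        """Average coded agreement across one respondent's items."""
        return sum(LIKERT_CODES[r] for r in responses) / len(responses)

    print(mean_attitude_score(["SA", "A", "U", "A", "SD"]))  # 3.4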

FACTORS TO CONSIDER WHEN SELECTING ASSESSMENT METHODS.  Which assessment method fits a given objective? As with many instructional design activities, designers must use guidelines rather than formulas to answer this question, and there is usually more than one correct strategy. Consider the following four guidelines when selecting a type of instrument to use:

• Guideline #1: Directness of measurement.  What is the most direct way to determine if the student can do the desired performance? Very often, assessments must use indirect strategies because it is not practical to do direct observations of student performances. For example, after instruction, you may want to give someone a sales report and ask them to give an analysis of its important information. But because assessment must be faster and easier to accomplish, you have to choose a less direct method: asking them specific questions about the report, each of which has one correct answer. You may want to see if a person's attitude toward a topic has changed. The most direct method is watching his or her behavior over time to see what choices they make. Again, because this is not feasible, you must choose a less direct method: asking them questions about what they will do in the future. In circumstances where there are many learners to assess and time is an important factor, most assessments must be indirect measures. However, the idea is to choose the most direct way that is also logistically feasible to carry out in the setting in which instruction will take place. When confronted with more than one way to assess individuals indirectly (e.g., a matching versus a multiple-choice test), choose the one that is the most direct measure of the performance learners would do in "real-world" environments.
• Guideline #2: Resources required to establish reliability and validity.  Designers must also make decisions based on their estimates of the time and personnel resources it will take to make sure instruments are valid and reliable. Validity means an assessment method measures what it is supposed to measure (Gay, Mills, & Airasian, 2009; Oosterhof, 2009). Reliability means an assessment yields consistent results over time, over items within the test, or over two or more scorers. (Also see Chapter 5 for a more in-depth discussion of validity and reliability when developing each type of instrument.)
– Validity.  For designers, validity means that an assessment should be closely matched to the action stated in the objective. To increase validity, designers try to select an assessment format that requires as little inference as possible about whether students can actually do the action whenever they are required to do it. For example, if the objective calls for students to solve given algebra problems, a mental skills test that requires them to solve sample problems and indicate answers would be an appropriate way to infer students' skills in solving any and all such problems. However, if the objective requires students to demonstrate they can analyze real-world situations and develop complex solutions that require algebra skills, scenario-based problem solving evaluated by a performance measure such as a rubric or checklist would be more appropriate. One method designers frequently use to help decide on assessments is analyzing the action for whether students should select a response or construct a response (Popham, 2011). Sometimes, the choice is obvious. For example, if the objective were for students to write a well-developed paragraph, it would not be valid to ask them questions about good paragraphs. However, many abilities can be sampled in a valid way by having students select from possible responses. Deciding on the number of items or performances needed to establish competence must also be done at this point.
– Reliability.  An assessment is reliable if it yields consistent results over time, over items within the test, and over test scorers. When designers are developing tests, they are concerned with consistent measurement with an instrument and over time. However, when selecting an assessment method, designers are primarily concerned with a kind of reliability known as inter-rater reliability, or the degree to which two or more persons scoring the same test are likely to arrive at the same score (Gay et al., 2009). Whenever answers can be scored objectively (e.g., multiple choice, true/false, matching), scoring reliability is high. Whenever the scorer has some latitude in determining correct answers (e.g., short answer, essay, performance measures), scoring reliability is lower. Designers must select a method that has as much potential as possible for inter-rater reliability while still being a valid measure. (A simple agreement index is sketched after this list.)
• Guideline #3: Instrument preparation logistics.  Valid, reliable assessments take time to develop. Activities include: analyzing actions to determine instrument requirements, writing items, creating scoring criteria, and collecting data to indicate levels of validity and reliability. Depending on the available time and resources, designers may wish to use existing assessments whose usefulness has already been ascertained. For example, an organization may already have a multiple-choice test or validated rubric. Although the designer might prefer a different kind of test or different items or scoring procedures, there may be no time to develop and validate them. The solution may be to adopt the existing method while recommending that another measure be developed and implemented at a later time.
• Guideline #4: Administration and scoring logistics.  The relevance of considering the time and effort needed to administer and score assessments was well illustrated in one statewide student testing program. The state had adopted two measures to assess students' mathematics and language skills: a multiple-choice test and a performance test scored by rubrics. Although teachers were hard pressed to find time to administer the performance tests, they eventually collected data from both measures. However, when the time arrived to score the assessments in order to make student placement decisions for the next year, state education officials had to admit they lacked a sufficient number of trained personnel to score the performance tests in time. They decided to ignore all the carefully collected performance data and use only the less valid but more easily scored multiple-choice tests. To avoid this kind of situation, designers must analyze the time required for administration and scoring and weigh this information against reliability and validity before selecting an assessment format.
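As promised in the Reliability guideline above, here is a minimal sketch of the simplest inter-rater agreement index, percent agreement: the share of items two scorers rate identically. It is an illustration only, with made-up ratings; chance-corrected indices such as Cohen's kappa are usually preferred when ratings take only a few values.

    def percent_agreement(rater_a: list[int], rater_b: list[int]) -> float:
        """Share of items on which two scorers assigned the same rating."""
        if len(rater_a) != len(rater_b):
            raise ValueError("both raters must score the same set of items")
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    # Two instructors scoring the same ten essays on a 1-3 rubric (made-up data):
    print(percent_agreement([3, 2, 2, 1, 3, 3, 2, 1, 2, 3],
                            [3, 2, 1, 1, 3, 2, 2, 1, 2, 3]))  # 0.8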
Table 4.3 summarizes the options available under each of the three major categories of assessment methods: mental skill and information tests, performance measures, and attitude measures. It also gives an example item of each type. Table 4.4 summarizes important issues to consider when selecting assessment formats.

TABLE 4.3  Summary of Types of Assessment Methods and Instruments

Mental Skill and Information Tests

Multiple choice
Description: Questions or "stems" with three to five alternative answers provided for each. Students select the most correct answer by circling or writing the number or letter of their choice.
Sample action: Identify correct answers to geometry problems.
Sample item: 1. Which of the points listed below is on a circle with the equation (x − 7)² + (y + 3)² = 25?
A. (10, 1)  B. (17, 12)  C. (−8, −23)  D. (5, −6)

True/false or yes/no
Description: Statements that the student must decide are accurate or not and write or circle true or false or a similar indicator (e.g., yes/no, correct/incorrect, right/wrong, plus/minus).
Sample action: Identify whether or not something is a prime number.
Sample item: Tell whether or not each of the following numbers is a prime number by circling T if it is and F if not:
T F 1. 92
T F 2. 650

Fill in the blank (completion)
Description: Statements that each have a word or phrase omitted that the student must insert.
Sample action: Analyze a sales report to determine important items of information.
Sample item: The report reflects that the company's best customer in the first half of the year was _____.

Short answer
Description: A set of questions, each of which the student answers with a word or brief phrase.
Sample action: Identify the German verb form that is appropriate for each sentence.
Sample item: Wie _____ es Ihnen? (gehen)

Matching
Description: Two sets of related items; the student connects them by writing one beside the other or writing one's letter beside the other's number.
Sample action: Identify the area of the library where a given library item may be found.
Sample item: List of materials and list of library areas.

Performance Measures

Essay (usually assessed by a rubric; see description below)
Description: A statement or question that requires a structured but open-ended response; students write several paragraphs or pages.
Sample action: Describe an instance when a constructivist teaching technique would be an appropriate choice and describe the strategy that would be appropriate for that situation.
Sample item: Give an example of an instructional objective for which a constructivist teaching technique would be appropriate, describe the technique, and give three reasons it would be appropriate for the objective. (Graded by an attached rubric.)

Procedures checklist
Description: A list of steps or activities students must complete successfully.
Sample action: Demonstrate the procedure for using a digital camera to take a photo.
Sample item:
______ 1. Turn on the camera.
______ 2. Adjust the settings, etc.

Performance or product rating scale
Description: A list of criteria that students' products or performances must meet. Each criterion may be judged by a "yes/no" standard or by a level of quality (e.g., 1, 2, or 3; low, medium, high).
Sample action: Develop a multimedia presentation that meets all criteria for content, instructional design, organization/navigation, appearance, and graphics/sound.
Sample item (for a multimedia product):
Scale: 3 = High, 2 = Acceptable, 1 = Unacceptable
_____ All content information is current.
_____ All information is factually accurate, etc.

Performance or product rubric
Description: A set of elements that describe a performance or product, together with a scale (e.g., one to five points) based on levels of quality for each element.
Sample action: Develop a PowerPoint presentation to present research findings.
Sample item: See examples at Kathy Schrock's Guide to Everything website: http://www.schrockguide.net/assessment-and-rubrics.html

Attitude Measures

Likert scale
Description: A set of statements; students must indicate a level of agreement with each one.
Sample action: Demonstrate a willingness to use the company's Information Hotline to ascertain company policy and procedure on important personnel issues.
Sample item: I am likely to use the Hotline when I am faced with a possible case of employee theft. Circle your choice: SA A U D SD

Semantic differential
Description: Sets of bipolar adjectives, each of which may describe an item, person, or activity; each pair is separated by a set of lines or numbers; students mark one to indicate a level of feeling on the continuum from one to the other.
Sample action: Demonstrate a positive attitude toward working with people of many cultures.
Sample item: When I think about working with people from a culture other than my own, I feel:
Good _ _ _ _ _ Bad
Happy _ _ _ _ _ Sad
etc.
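A quick check of the keyed answer in the multiple-choice geometry item above: a point lies on the circle exactly when its coordinates satisfy (x − 7)² + (y + 3)² = 25. The sketch below simply tests each option; the option coordinates are taken from the sample item itself:

    # Test each option against (x - 7)^2 + (y + 3)^2 == 25
    options = {"A": (10, 1), "B": (17, 12), "C": (-8, -23), "D": (5, -6)}
    for label, (x, y) in options.items():
        lhs = (x - 7) ** 2 + (y + 3) ** 2
        print(label, lhs, lhs == 25)  # only A gives 9 + 16 = 25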

TABLE 4.4  Summary Guidelines to Consider When Selecting Assessment Formats

Directness
• Is the method the most direct measure of learners' performances?
• Does it satisfy you that they can do the desired performance in real-world settings?

Validity
• Is the method closely matched to the ability stated in the objective?
• How easy is it to infer students' true ability from the assessment?
• Are there enough items or performances required to establish true competence in the skill?

Reliability
• Will many different people be scoring the assessment?
• Can scoring procedures be simplified to reduce the training required?

Instrument Development
• Is an existing test available to measure the behavior?
• Is there sufficient time to develop a better measure?
• Is there time to collect data to confirm reliability and validity?

Administration and Scoring Logistics
• Is there time to train personnel who will score the tests?
• Will there be enough time to score assessments?

Check Your Understanding 4.3

Objective 3 Exercise—Selecting Assessment Formats. Apply selection criteria to choose an assessment format appropriate for each of the situations listed below. Place the letter of the most appropriate format on the line next to each description.

Situations
______ 1. A virtual school has an online course for teachers in how to use software tools. The instructional goals call for teachers to create products (a word-processed document, spreadsheet, PowerPoint, etc.) that apply specified features of each tool. Each product will be graded according to how well it meets preset quality criteria.
______ 2. A medical school wants to assess nurse practitioners on their ability to calculate dosages of medicines for patients with various characteristics.
______ 3. An instructional design consulting firm creates a unit on collaboration skills. One of its goals is to make participants feel more positively about working on projects in small groups.
______ 4. A community college has created a vocational unit on how to do various electrical repairs, once problems are already diagnosed. Students will be graded on how well they follow the steps in the correct order required to carry out repairs.
______ 5. The training unit of a pharmaceuticals company has an online course for sales representatives designed to update them on the products they sell for various needs. They want to assess how well representatives can name a company product for each of several needs a doctor might state, promptly and without notes.

Assessment Formats
A. Multiple choice
B. True/false
C. Fill-in or short answer
D. Matching
E. Essay (with rubric)
F. Procedures checklist
G. Performance or product rating scale
H. Performance or product rubric
I. Likert scale
J. Semantic differential


PREPARING INSTRUCTIONAL DESIGN OBJECTIVES

Sample Student Projects


Before writing instructional design objectives for your own instructional
product, see Sample Student Projects for four examples of how novice
designers accomplished these procedures for their own projects.

Procedures for Writing Instructional Design Objectives


Experienced designers tend to consider all components of an instructional design
objective at once and do a series of rapid writes and rewrites before settling on a final
statement that will be the basis for review by others and subsequent design work.
Many designers choose to have content experts and potential users of the instruction
review and give feedback on the objectives before proceeding with further design
work. For your work, your instructor serves this role. Novice designers should take the following step-by-step approach, breaking down each objective into distinct components and writing each one before going back to refine each statement into a final objective. This forces them to consider each component carefully, focusing on the essential attributes of each one. However, if you are more comfortable working outside a table, you may do that.
• Review the learning map.  In the Instructional Analysis step, you prepared a learning map, analyzed learner needs, grouped the behaviors on the map into learning segments, each with a behavior to be measured, and decided on a sequence for teaching the segments. Now you should review the skills or steps that lead up to learning and/or doing the behaviors. Some or all of these behaviors will become instructional design objectives.
• List the target behaviors.  The first step in writing objectives for each segment is either to create a table similar to the one in Table 4.2 and enter the target behaviors into the first column, or to simply make a list of the target behaviors.
• Decide on an action, assessment method, and performance level to demonstrate the first behavior.  After deciding on the most direct way to assess that the learner can do the behavior and carefully considering validity, reliability, instrument preparation time, and administration and scoring logistics, decide on the assessment and performance level components for the first objective. Enter them into the table or write them next to the behavior.
• Create the objective statement.  After completing the components of each objective, go back and review each objective and make any corrections necessary to turn it into a final statement. Finally, write completed statements of the objectives.
• Repeat the process for the other objectives.  As you write the statements, you may realize that some behaviors can be combined into one objective. If necessary, rewrite the statements to reflect the combined behaviors.

Check Your Understanding 4.4

Objective 4 Exercise—Steps in Preparing Instructional Design Objectives. From the following list, select only those that are steps needed to create objectives and put the steps in correct sequence by placing the appropriate number to the left of the activity. (Reminder: Some of the steps listed are not needed at all.)
______  Prepare more detailed learning maps for each objective.
______  Decide on an action, assessment method, and performance level for the first behavior.
______  Create the final objective statements for the first behavior.
______  Create new target behaviors, if needed.
______  Repeat the same process for all the objectives, combining behaviors, if necessary.
______  Revise each objective for face validity.
______  Review the learning map for target behaviors.
______  Make a list of the behaviors from the learning map.


Common Errors and Problems in Writing Objectives


Inexperienced designers tend to make certain common errors when writing instructional design objectives. The following problems to avoid are organized by component. Each has an example that reflects the problem and a way to correct it.

• The action is too vague to be measured.  Although "develop" is an action verb, the statement contains no activity the student will do to demonstrate a greater "awareness."
– Incorrect action.  Develop an awareness of the national debt.
– Correct action.  Describe causes of the national debt.
• The action focuses on the instructor rather than the student.  "Familiarize students" places the focus on the instructor's instruction, rather than the students' actions after instruction.
– Incorrect action.  Familiarize students with people's actions that harm the environment.
– Correct action.  Identify examples of people's actions that harm the environment.
• The action focuses on the students' learning activities rather than postinstruction activities.  "Use a CBL tutorial" places the focus on students' learning activities, rather than the application students will make of what they have learned after instruction.
– Incorrect action.  Use a CBL tutorial to learn how to apply actuarial procedures.
– Correct action.  Do an actuarial analysis for a given situation.
• The action and/or assessment information are incomplete.  In this statement, the focus is on the assessment, rather than the action being assessed. Also, "World War II" does not provide specific enough information to clarify the area of knowledge to be assessed.
– Incorrect objective.  Complete a 25-item fill-in-the-blank test on the events of World War II. Students must get 24 of 25 items correct.
– Correct objective.  Identify important battles in the Pacific campaign of World War II. Students complete a 25-item fill-in-the-blank test in which each statement to be completed requires them to name a battle that fits the event described or the role it played in the war. Students must get 24 of 25 items correct.
• The assessment does not match the required action.  The action calls for students to demonstrate "proper procedures," but the assessment calls for them to match symptoms with diagnoses.
– Incorrect assessment for action.  Use proper procedures for determining the appropriate diagnosis of elevated temperature in juveniles. Students complete a matching test: the left-hand column has temperature symptoms of juveniles, and the right-hand column lists possible diagnoses.
– Correct assessment for action.  Use proper procedures for determining the appropriate diagnosis of elevated temperature in juveniles. Students are given five scenarios describing elevated temperature in juveniles and must write a brief description of the procedures they would use to determine the correct diagnosis. Descriptions are graded by a criterion checklist of required correct procedures.
• The assessment does not specify how the action will be measured.  "Correctly" is not specific enough about what constitutes an adequate performance with the spectrometer.
– Incorrect assessment.  Use correct procedures to operate a spectrometer for element identification. Students are given an element and must use the spectrometer to obtain its spectrogram. All procedures must be done correctly.
– Correct assessment.  Apply correct procedures to operate a spectrometer for element identification. Students are given an element and must use the spectrometer to obtain its spectrogram. The instructor uses a checklist of steps. Students must complete each step on the checklist in correct order.
• The assessment does not require enough to confirm ability.  Students are asked to label only two sentences. The objective should require them to do enough different examples to confirm they can identify any and all sentences as fact or opinion.
– Incorrect assessment.  Select examples of fact and opinion. Students are given a newspaper story written at their grade level with all sentences numbered. Under the paragraph, they must write the number of one sentence that is fact and one that is opinion.
– Correct assessment.  Select examples of fact and opinion. Students are given a newspaper story written at their grade level with all sentences numbered. Under the paragraph, they must write the numbers of five sentences that are fact and five that are opinion. All 10 must be correctly labeled.
• The performance level criterion is not appropriate for the type of action and/or the assessment.  The "accuracy" criterion relates to amounts and numbers (e.g., all items on a test are correct), but the action does not have items; it must be assessed by requiring certain steps.
– Incorrect performance level.  Develop a plan for taking care of a given plant in a way that will ensure it survives. The plan must be done with 100 percent accuracy.
– Correct performance level.  Develop a plan for taking care of a given plant in a way that will ensure it survives. The plan must reflect appropriate ways to address each of the five care criteria.
• The performance level criterion is not realistic; it leaves no room for error.  Because readings from a temperature probe are likely to fluctuate, demanding exact readings is not realistic.
– Incorrect performance level.  Use a graphing calculator and temperature probe to take readings of liquids. Readings of graph output must be exact.
– Correct performance level.  Use a graphing calculator and temperature probe to take readings of liquids. Readings of the graph output must be correct within a range of ±0.01.

Check Your Understanding 4.5

Objective 5 Exercise—Errors in Writing Instructional Design Objectives. Read each of the following instructional design objectives, identify what is wrong with it, and rewrite it correctly.
______ 1. Carry out an experiment on heat absorption with materials of various colors. Do
all steps in the experiment correctly. 
______ 2. Complete a worksheet of 25 items on the characteristics of planets in the solar
system. Twenty-four of the 25 items must be correct. 
______ 3. Do a t test with a given set of data. Students will be given 10 sets of data, and
must specify whether or not a t test can be calculated with the given data, then
perform the test, when possible. 
______ 4. Describe the appropriate sequence of procedures to use when taking a credit
card order over the phone. The list of steps will be given and students must put
them in correct order with 90 percent accuracy. 
______ 5. Learn how to download a plug-in from the Internet. Attend a demonstration on
how to download and use plug-ins; then download the plug-in required for a
given purpose. 




Chapter 4 Summary

• Objectives can serve several kinds of useful instructional roles (e.g., guides for reading, targets for students), but objectives for instructional design purposes are written to make sure required postinstruction performances align with assessments and instruction. Objectives also differ from content area standards; more than one objective may be needed to measure a standard.
• Clarity and specificity are essential qualities for instructional design objectives. All such objectives must be in terms of what students will be able to do and must specify the desired action the student will do postinstruction, as well as the assessment conditions and circumstances under which they must do it and the performance criterion they must meet (e.g., number of items correct or level of accuracy).
• Types of assessment methods include mental skills and information tests (e.g., multiple choice, true/false, fill-in-the-blank, matching, short answer), performance measures (e.g., essays, products, and performances graded by checklists and rubrics), and attitude surveys. Guidelines for selecting the most appropriate format include: directness of measure as a reflection of real-world performance; resources required to establish validity and reliability; and logistics required for instrument development, administration, and scoring.
• Procedures for writing instructional design objectives include: reviewing behaviors in the learning map; listing the target behaviors; deciding on an action, assessment method, and performance level to demonstrate the first behavior; creating the objective statement; and repeating the process for each of the other behaviors.
• Common errors and problems in writing objectives include: the action is too vague to be measured; the action focuses on the instructor rather than the student; the action focuses on the students' learning activities rather than postinstruction activities; the action and/or assessment information are incomplete; the assessment does not match the required action; the assessment does not specify how the action will be measured; the assessment does not require enough to confirm ability; the performance level criterion is not appropriate for the type of action and/or the assessment; and the performance level criterion is not realistic because it leaves no room for error.

References

Farrell, E. (2002, July 12). Students won't give up their French fries. The Chronicle of Higher Education. Retrieved from http://chronicle.com/weekly/v48/i44/44a03501.htm
Gagné, R., & Briggs, L. J. (1974). Principles of instructional design. New York, NY: Holt, Rinehart, & Winston.
Gay, L. R., Mills, G. E., & Airasian, P. (2009). Educational research: Competencies for analysis and application (9th ed.). Upper Saddle River, NJ: Pearson Education, Merrill/Prentice Hall.
Mager, R. (1962). Preparing instructional objectives. Belmont, CA: Fearon.
Oosterhof, A. (2009). Developing and using classroom assessments (4th ed.). Upper Saddle River, NJ: Pearson Education, Merrill.
Popham, J. (2011). Classroom assessment: What teachers need to know (6th ed.). Boston, MA: Allyn & Bacon.
Seels, B., & Glasgow, Z. (1998). Making instructional design decisions (2nd ed.). Upper Saddle River, NJ: Merrill, Prentice Hall.
Waugh, C., & Gronlund, N. (2013). Assessment of student achievement (10th ed.). Upper Saddle River, NJ: Merrill, Prentice Hall.
Willis, J. (1995). A recursive, reflexive instructional design model based on constructivist-interpretivist theory. Educational Technology, 35(6), 5–23.

Chapter 4 Exercises

Exercise 4.1: New Terms and Concepts

Exercise 4.2: Questions for Thought and Discussion—These questions may be used for small-group or class discussion or may be subjects for individual or group activities. Take part in these discussions in your in-person class meeting, or use your instructor-provided online discussion area or blog.
a. Willis (1995) says that "In the R2D2 (design) model, specific objectives evolve naturally from the process of design and development . . . it is not important to write specific objectives at the beginning of a (design) project." Why does the approach that Willis recommends not work for systematic design models? Can you think of any design situations where the R2D2 model would be appropriate?
b. Popham (2011) notes that the standards currently being offered by various content areas (e.g., science, mathematics, history) and by various state departments can be very helpful to those selecting objectives to assess in schools. Give an example from your chosen content area of how standards relate to instructional design objectives.

Exercise 4.3: Design Project Activities and Assessment Criteria—As you prepare instructional design objectives for your product for this course, use the following criterion checklist to assess your work:
_____ 1. Instructional design objectives have been prepared to cover all skills from the learning map that will be included in the instruction.
_____ 2. For each objective, all three required components are specified.
_____ 3. For each objective, the action is in terms of student performance.
_____ 4. For each objective, the assessment method will be a valid, reliable, and practical way to confirm that students have learned the action.
_____ 5. For each objective, the performance level is a reasonable requirement to demonstrate that students have achieved the ability specified in the objective.
