
This article was downloaded by: [University of Sao Paulo]

On: 23 December 2009

Publisher: Informa Healthcare

Medical Teacher

Twelve tips for blueprinting

Sylvain Coderre, Wayne Woloschuk and Kevin McLaughlin
Office of Undergraduate Medical Education, University of Calgary, Canada

To cite this article: Coderre, Sylvain, Woloschuk, Wayne and McLaughlin, Kevin (2009) 'Twelve tips for blueprinting',
Medical Teacher, 31:4, 322–324
To link to this article: DOI: 10.1080/01421590802225770




Twelve tips for blueprinting

Office of Undergraduate Medical Education, University of Calgary, Canada

Background: Content validity is a requirement of every evaluation and is achieved when the evaluation content is congruent with
the learning objectives and the learning experiences. Congruence between these three pillars of education can be facilitated by
an evaluation blueprint.
Aims: Here we describe an efficient process for creating a blueprint and explain how to use this tool to guide all aspects of course
creation and evaluation.
Conclusions: A well constructed blueprint is a valuable tool for medical educators. In addition to validating evaluation content, a
blueprint can also be used to guide selection of curricular content and learning experiences.

Introduction

Validity is a requirement of every evaluation and implies that candidates achieving the minimum performance level have acquired the level of competence set out in the learning objectives. Typically, the type of validity that relates to measurements of academic achievement is content validity (Hopkins 1998). Evaluation content is valid when it is congruent with the objectives and learning experiences, and congruence between these pillars of education can be facilitated by using an evaluation blueprint (Bordage et al. 1995; Bridge et al. 2003).

In this paper we describe an efficient and straightforward process for creating a blueprint, using examples from the University of Calgary medical school curriculum. Although its primary function is to validate evaluation content, a well constructed blueprint can also serve other functions, such as guiding the selection of learning experiences. 'Course blueprint' may therefore be a more appropriate descriptor of this tool.

Tip 1. Tabulate curricular content

The first step in blueprinting is to define and tabulate the curricular content. A blueprint template consists of a series of rows and columns. At the University of Calgary, teaching of the undergraduate curriculum is organized according to clinical presentations, so the rows in our blueprints contain the clinical presentations relevant to the course being blueprinted (Mandin et al. 1995). Column 1 in Table 1 shows the eighteen clinical presentations for the Renal Course at the University of Calgary. Curricular content can be organized in many other ways, including course themes or units.

Tip 2. Provide relative weighting of curricular content

Evaluations have a finite number of items, so some measure of relative weighting of content areas must be decided upon so that priority can be given to more 'important' areas when creating items. Content importance, however, is difficult to define. Attributes such as the potential harm to the patient from misdiagnosing a presentation (a measure of presentation 'impact'), the potential for significant disease prevention (also a measure of presentation 'impact'), and how frequently a presentation is encountered in clinical practice should be considered.

At the University of Calgary we rate the impact and frequency of clinical presentations based on the criteria shown in Table 2. The impact and frequency of each clinical presentation are tabulated (columns 2 and 3 of Table 1) and then multiplied. This produces an I × F product for all eighteen clinical presentations, which ranges from 1 to 9. Next, the I × F product for each clinical presentation (column 4 of Table 1) is divided by the total for the I × F column (80 in our example) to provide a relative weighting for each presentation, which corresponds to the proportion of evaluation items for this presentation (column 5 of Table 1). For example, hyperkalemia – a life threatening emergency that is encountered frequently by physicians caring for patients with kidney diseases – has the highest relative weighting (0.1125). But how do we know that this weighting is reliable?

Tip 3. Sample opinion on weighting from all relevant groups

Reliability is improved by increasing sample size and breadth (Hopkins 1998). In addition to involving course chairs and evaluation coordinators, we solicit input from teachers and, if relevant, previous learners (McLaughlin et al. 2005a). That is, weighting of a content area is established through consensus. Giving potential users the opportunity to have input into the blueprint creation may also improve the likelihood of the blueprint being used to guide all aspects of course design and evaluation (see Tip 10).
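As an illustration, the arithmetic of Tip 2 can be sketched in Python. This sketch is ours, not part of the original course materials; the dictionary below simply transcribes columns 2 and 3 of Table 1, and the variable names are our own.

```python
# Impact (I) and frequency (F) ratings, each on the 1-3 scale defined in
# Table 2, transcribed from columns 2 and 3 of Table 1.
ratings = {
    "Hypernatremia": (2, 1), "Hyponatremia": (3, 2), "Hyperkalemia": (3, 3),
    "Hypokalemia": (2, 2), "Acidosis": (3, 2), "Alkalosis": (2, 2),
    "ARF": (3, 3), "CRF": (2, 3), "Hematuria": (2, 2),
    "Proteinuria": (2, 3), "Edema": (1, 3), "Scrotal mass": (2, 2),
    "Urinary retention": (1, 3), "Hypertension": (2, 3), "Polyuria": (1, 1),
    "Renal colic": (1, 3), "Dysuria": (1, 2), "Incontinence": (1, 2),
}

# I x F product for each presentation (column 4 of Table 1);
# each product ranges from 1 to 9.
products = {name: i * f for name, (i, f) in ratings.items()}

# Relative weighting (column 5): each product divided by the column total,
# so the weights sum to 1 and give each presentation's share of the items.
total = sum(products.values())  # 80 for the Renal Course
weights = {name: p / total for name, p in products.items()}

print(total)                    # 80
print(weights["Hyperkalemia"])  # 0.1125, the highest weighting
```

Note that the consensus process of Tip 3 operates on the ratings themselves; once agreed, the weights follow mechanically.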

Correspondence: Dr Kevin McLaughlin, University of Calgary, Undergraduate Medical Education, Calgary, Alberta, Canada.

ISSN 0142–159X print/ISSN 1466–187X online/09/040322–3 © 2009 Informa Healthcare Ltd.
DOI: 10.1080/01421590802225770

Table 1. Blueprint for the undergraduate renal course at the University of Calgary.
(Columns are numbered 1–10 from left to right, as referenced in the text.)

Presentation       Impact  Frequency  I × F  Weight  No. of items  Diagnosis  Investigation  Treatment  Basic science
Hypernatremia      2       1          2      0.025   1.50          1          0              0          1
Hyponatremia       3       2          6      0.075   4.50          2          0              1          1
Hyperkalemia       3       3          9      0.1125  6.75          3          1              2          1
Hypokalemia        2       2          4      0.05    3.00          2          0              0          1
Acidosis           3       2          6      0.075   4.50          2          0              1          1
Alkalosis          2       2          4      0.05    3.00          2          0              0          1
ARF                3       3          9      0.1125  6.75          5          1              1          0
CRF                2       3          6      0.075   4.50          3          1              1          0
Hematuria          2       2          4      0.05    3.00          2          1              0          0
Proteinuria        2       3          6      0.075   4.50          2          0              0          2
Edema              1       3          3      0.0375  2.25          1          0              1          0
Scrotal mass       2       2          4      0.05    3.00          2          1              0          0
Urinary retention  1       3          3      0.0375  2.25          1          0              1          0
Hypertension       2       3          6      0.075   4.50          2          1              1          0
Polyuria           1       1          1      0.0125  0.75          1          0              0          0
Renal colic        1       3          3      0.0375  2.25          1          0              1          0
Dysuria            1       2          2      0.025   1.50          1          0              1          0
Incontinence       1       2          2      0.025   1.50          1          0              1          0
TOTAL              —       —          80     1       60            34         6              12         8
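The item counts in column 6 follow directly from the weights in column 5: multiply each weight by the evaluation length and round to the nearest whole number. A minimal sketch of our own, assuming the 60-item evaluation used in the text (three presentations shown for brevity):

```python
TOTAL_ITEMS = 60  # length of the Renal Course evaluation discussed in the text

# Relative weightings from column 5 of Table 1.
weights = {"Hyperkalemia": 0.1125, "Edema": 0.0375, "Polyuria": 0.0125}

for name, w in weights.items():
    raw = w * TOTAL_ITEMS  # column 6: the unrounded item count
    items = round(raw)     # number of items actually written
    print(f"{name}: {raw:.2f} -> {items} item(s)")
    # Hyperkalemia: 6.75 -> 7 item(s)
    # Edema: 2.25 -> 2 item(s)
    # Polyuria: 0.75 -> 1 item(s)

# Rounding each row independently need not sum exactly to TOTAL_ITEMS
# across all eighteen presentations; in Table 1, borderline values such
# as 4.50 are rounded up or down so that the full column totals 60.
```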

Table 2. Weighting for impact and frequency of the clinical presentations.

Impact                                                                  Weight    Frequency          Weight
Non-urgent, little prevention potential                                 1         Rarely seen        1
Serious, but not immediately life threatening                           2         Relatively common  2
Life threatening emergency and/or high potential for prevention impact  3         Very common        3

Tip 4. Decide on the number of items for each content area

The first step in this process is deciding on the total number of evaluation items. Reliability of an evaluation is affected by both the number and discrimination of items. As a rough guide, if the average discrimination index of the items is 0.3, then approximately 50–60 items are needed to achieve reliability of 0.8. This number increases to 100 if the average item discrimination is 0.2. Reliability appears to plateau beyond 100 items (Hopkins 1998).

The next step is to allocate items to content areas. This can be done by multiplying the total number of items on the evaluation by the relative weighting for each clinical presentation, and then rounding up or down to the nearest whole number. For example, a 60 item evaluation on the Renal Course should have seven items (60 × 0.1125) on hyperkalemia and one (60 × 0.0125) on polyuria (column 6 of Table 1).

Tip 5. Decide on the tasks for each content area

There are a variety of tasks that can be evaluated within any clinical presentation, such as diagnosing the underlying cause (including specific points of history and physical examination), interpreting or selecting investigations, deciding on management and/or prevention, demonstrating basic science knowledge, etc. These tasks should be consistent with the learning objectives of the relevant course. The Medical Council of Canada identifies three key objectives for the clinical presentation of hyperkalemia – two are related to diagnosis and interpretation of lab tests, and one is related to management (Mandin 2004). The tasks for the seven items on hyperkalemia reflect this balance (columns 7–10 of Table 1).

The blueprint for content validity is now complete; the next challenge is to create the valid content.

Tip 6. Create evaluations based on the blueprint

All evaluations used in the course – formative, summative and retake – should conform to the blueprint. The blueprint specifies the number of items needed for each clinical presentation and, within each presentation, which tasks should be evaluated. The evaluation coordinator can now create valid evaluations by following these specifications. Providing this degree of detail is also very helpful to those recruited to the task of item creation.

Tip 7. Use (or create) an item bank

Starting from scratch and creating new items for one or more evaluations can appear onerous. Using an item bank to match existing items to the blueprint reduces the burden of creating evaluations. If an item bank does not exist, the short-term investment of time and effort to create this pays off in the long run as items can then be shared between courses and even between medical schools.

Tip 8. Revise learning objectives

As discussed above, a blueprint provides weighting for all aspects of a course. This weighting provides an opportunity for
the course chair to reflect on the learning objectives. While it may appear counterintuitive to revise learning objectives based upon a blueprint weighting, to achieve content validity the number of objectives, hours of instruction, and number of evaluation items for each clinical presentation should be proportional. Given the finite number of hours available for instruction, upon reflection it may become apparent that some learning objectives are not achievable and need to be revised.

Tip 9. Revise learning experiences

The weighting provided by the blueprint also offers an opportunity for reflection on learning experiences – more teaching time should be devoted to content areas with higher weighting. But this does not imply a perfect linear relationship between weighting and hours of instruction; some concepts take longer to teach than others, and the length of teaching sessions needs to be adjusted to fit into available time slots.

So now, in theory, we have congruence of learning objectives, learning experiences, and evaluation. However, in order to achieve content validity, the teachers need to deliver the intended curriculum (Hafferty 1998).

Tip 10. Distribute the blueprint to teachers

A well constructed blueprint is a transparent outline of the intended curriculum of a course. The detail contained within a blueprint not only helps the course chair to select appropriate content areas, but also helps teachers plan the learning experiences so that the content delivered is congruent with both the objectives and the evaluations.

Tip 11. Monitor content validity

When course chairs, evaluators, teachers, and learners use the same blueprint the effects of hidden curricula should be minimized (Hafferty 1998). It cannot be assumed, however, that publishing a blueprint inevitably leads to its adoption – content validity still needs to be evaluated and monitored. At the University of Calgary we monitor content validity by asking students the question, 'Did the final examination reflect the material seen and taught?' after each summative evaluation. This allows us to evaluate and adjust the learning experiences if the students' perception of content validity is low.

Tip 12. Distribute the blueprint to learners

Given the adverse consequences of academic failure in medical school, it is inevitable that evaluation drives learning. Ideally, creating and providing a blueprint to learners ensures that course leaders are 'grabbing hold of the steering wheel' and driving learning towards what is felt to be core course material. When a blueprint provides content validity, the effect of evaluation on learning can be embraced – rather than feared – as this tool, shown to be important in student examination preparation, reinforces the learning objectives and delivery of the intended curriculum (McLaughlin et al. 2005b). Fears that blueprint publication would improve learner performance by driving strategic learning are unsupported. In a previous study, we found that blueprint publication did not improve student performance, but significantly increased the perception of fairness of the evaluation process (McLaughlin et al. 2005c).

Conclusions

Blueprinting need not be onerous and we believe that the initial investment of time and effort required to create a blueprint will produce dividends over the long term. A well constructed and reliable blueprint is a valuable educational tool that can improve all aspects of course design and evaluation – benefiting both teachers and learners. After creating a reliable blueprint, content validity is achieved only when the blueprint is used to guide course design and evaluation, and is maintained through systematic monitoring of content.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.

Notes on contributors

DR CODERRE, MD, MSc, is an Assistant Dean of Undergraduate Medical Education at the University of Calgary.

DR WOLOSCHUK, PhD, is a program evaluator in the Office of Undergraduate Medical Education at the University of Calgary.

DR MCLAUGHLIN, MB ChB, PhD, is an Assistant Dean of Undergraduate Medical Education at the University of Calgary.

References

Bordage G, Brailovsky C, Carretier H, Page G. 1995. Content validation of key features on a national examination of clinical decision-making skills. Acad Med 70:276–281.
Bridge PD, Musial J, Frank R, Roe T, Sawilowsky S. 2003. Measurement practices: Methods for developing content-valid student examinations. Med Teach 25:414–421.
Hafferty FW. 1998. Beyond curricular reform: Confronting medicine's hidden curriculum. Acad Med 73:403–407.
Hopkins K. 1998. Educational and psychological measurement and evaluation. Needham Heights, MA: Allyn and Bacon.
Mandin H. 2004. Objectives for the qualifying examination. Ottawa, ON: Medical Council of Canada.
Mandin H, Harasym P, Eagle C, Watanabe M. 1995. Developing a 'clinical presentation' curriculum at the University of Calgary. Acad Med.
McLaughlin K, Lemaire J, Coderre S. 2005a. Creating a reliable and valid blueprint for the internal medicine clerkship evaluation. Med Teach 27:544–547.
McLaughlin K, Coderre S, Woloschuk W, Mandin H. 2005b. Does blueprint publication affect students' perception of validity of the evaluation process? Adv Health Sci Educ 10:15–22.
McLaughlin K, Woloschuk W, Lim TH, Coderre S, Mandin H. 2005c. The influence of objectives, learning experiences and examination blueprint on medical students' examination preparation. BMC Med Educ 5:39.