
Performance Reports for Failing Candidates
Carol O'Byrne
Pharmacy Examining Board of Canada

Presented at the 2005 CLEAR Annual Conference


September 15-17, Phoenix, Arizona
What failing candidates want to know

How close was I to passing?


What did I do wrong? What did I miss?
How many such errors and omissions lead to a
failing result?
In which area(s) do I need to improve?
What does PEBC expect in these areas?
Why am I expected to perform at a higher level
than what I see some pharmacists doing?



PEBC rationale for providing
feedback to candidates
Supports PEBC's mandate: to certify candidates who
demonstrate that they have the knowledge, skills,
abilities and attitudes required for practice

Increases candidates' awareness of practice requirements

Supports the cooperative but arm's-length relationship between credentialing bodies and training bodies



Rationale

Benefits all parties:


Assists candidates to recognize and address their weaknesses
Improves the efficiency of PEBC processes and lessens the potential threat to exam security by reducing the number of retakes
Benefits the profession and the public by supporting further development of the qualifications of those preparing to enter practice
Addresses manpower needs - guides remediation and bridging efforts, facilitating earlier entry to the profession for those who may not yet have received adequate training



Why only to failing candidates?

No demand from passing candidates


Resource issues
Issuance of reports
Failing candidates often retake the exam without appropriate preparation and clog the system



PEBC Qualifying Examination

Based on national competencies and standards
Offered in English and French
Must be PEBC certified to license in 9/10 provinces
Mobility enabled by mutual recognition (if PEBC certified)

Part I (MCQ): 200 scored items
Part II (OSCE): 15 scored stations
  12 SP/HP interactions + 3 non-client stations
  7 minutes/station
  1 assessor/station
  2 sets of scores/station: analytical checklist and holistic scales



Competencies assessed

Competency                                                           Weight (%)
1. Practise pharmaceutical care                                          29
2. Assume ethical, legal and professional responsibilities                9
3. Access, retrieve, evaluate and disseminate relevant information        5
4. Communicate and educate effectively                                   43
5. Manage drug distribution                                               9
6. Apply practice management knowledge and skills                         5



Test format
Interactive client stations
Standardized patients
Standardized health professionals
Non-client stations
Technical, e.g.:
Screening prescriptions for appropriateness
Checking dispensed prescriptions
Written short answer, e.g.:
Responding to drug information requests - evaluating and
interpreting drug information from several / conflicting sources
Medication management - reviewing patient data and
recommending therapeutic options, along with a rationale



Assessor scoring sheet - ratings
Three 4-point scales
1. Communications - generic scale
   Rapport
   Organization and flexibility (adaptive to the client/situation)
   Verbal and nonverbal skills (including language proficiency)
2. Outcome (problem solving) - station-specific scale
   Based on critical checklist items
3. Overall Performance - inclusive, global scale
   Communications and outcome
   Process quality and thoroughness (critical and noncritical items)
   Accuracy (vs misinformation)
   Risk (occurrence, degree)



Assessor scoring sheet - checklist
Critical items (*)
essential to solve the problem & meet station objective(s)
each linked to a competency assessed in the station
Noncritical items
represent good practice & contribute to effective outcome(s)
each linked to a competency
Risk and misinformation
Unique response (UR) - for scoring & QA purposes
Comment boxes - to record evidence to support scores
(used for QA purposes)
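
To make the structure of the assessor scoring sheet concrete, here is a minimal sketch in Python of how the elements described above (checklist items linked to competencies, the three 4-point scales, and the risk, misinformation and unique-response fields) might be represented. This is illustrative only, with hypothetical class and field names; it is not PEBC's actual system.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ChecklistItem:
    """One analytical checklist item on the assessor scoring sheet."""
    text: str
    competency: int          # competency (1-6) the item is linked to
    critical: bool           # essential to solve the problem / meet the station objective
    responded: bool = False  # marked by the assessor if the candidate addressed it

@dataclass
class StationSheet:
    """What an assessor records for one candidate in one station."""
    station_id: str
    checklist: list[ChecklistItem] = field(default_factory=list)
    # Holistic 4-point scales (Communications applies to client stations only)
    communications: Optional[int] = None   # 1-4, generic scale
    outcome: Optional[int] = None          # 1-4, station-specific scale
    performance: Optional[int] = None      # 1-4, inclusive global scale
    # Critical incidents and QA fields
    misinformation: bool = False
    risk: bool = False
    unique_response: Optional[str] = None  # UR noted for scoring & QA purposes
    comments: str = ""                     # evidence recorded to support scores
```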



Scoring the examination

Analytical scores
  Each checklist item relates to one competency
  Competency sub-scores = percent of items related to each competency to which the candidate responds
  Frequency of risk and misinformation tabulated

Holistic scores
  Each scale scored 1 to 4 points
  12 points per client station (Comm, Outc, Perf) x 12 stations
  8 points per non-client station (Outc, Perf) x 3 stations
  Raw score = sum of all stations' holistic scale scores
  Holistic cut score set for each scale in each station
  Cut score = sum of all stations' holistic cut scores
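
The arithmetic on this slide can be illustrated with a short sketch (a simplified illustration under assumed data structures, not PEBC's scoring code): competency sub-scores are the percentage of related checklist items the candidate responded to, while the holistic raw score and cut score are sums across all stations.

```python
from collections import defaultdict

def competency_subscores(stations):
    """Percent of checklist items linked to each competency that the
    candidate responded to, aggregated across all stations."""
    responded = defaultdict(int)
    total = defaultdict(int)
    for station in stations:
        for comp, hit in station["checklist"]:  # (competency, responded?)
            total[comp] += 1
            responded[comp] += int(hit)
    return {comp: 100.0 * responded[comp] / total[comp] for comp in total}

def holistic_scores(stations):
    """Raw score = sum of all holistic scale scores across stations;
    cut score = sum of the per-scale cut scores set for each station."""
    raw = sum(sum(st["ratings"].values()) for st in stations)
    cut = sum(sum(st["cuts"].values()) for st in stations)
    return raw, cut

# Tiny example: one client station (3 scales) and one non-client station (2 scales)
stations = [
    {"checklist": [(1, True), (1, False), (4, True)],
     "ratings": {"comm": 3, "outc": 2, "perf": 3},
     "cuts":    {"comm": 2.5, "outc": 2.5, "perf": 2.5}},
    {"checklist": [(5, True), (5, True)],
     "ratings": {"outc": 4, "perf": 3},
     "cuts":    {"outc": 2.5, "perf": 2.5}},
]

print(competency_subscores(stations))   # {1: 50.0, 4: 100.0, 5: 100.0}
print(holistic_scores(stations))        # (15, 12.5) -> raw score vs. cut score
```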



Mean scores & alphas

Holistic Scales                                   Coefficient
1. Communications (12 stns - competency 4)            .83
2. Outcome (15 stns - all competencies)               .66
3. Performance (15 stns - all competencies)           .73

[Chart: Holistic scale means (scale values 1-4) for scales 1-3]

Analytical Scores            n items    Coefficient
1. Pharm care                  107          .80
2. Ethics                        7          .43
3. Drug information              8          .24
4. Communications               14          .33
5. Drug distribution             8          .57
6. Management                    8          .55

[Chart: Competency mean percent scores for competencies 1-6]
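
The coefficients in this table are internal-consistency estimates; assuming they are Cronbach's alpha (as the slide title suggests), the sketch below shows how such a coefficient is computed from a candidates-by-stations score matrix. The data are invented purely for illustration.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a candidates x items (or stations) score matrix.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(scores[0])                      # number of items/stations
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: 4 candidates rated on 3 stations (1-4 holistic scale)
ratings = [
    [3, 2, 3],
    [4, 4, 3],
    [2, 2, 1],
    [3, 3, 4],
]
print(round(cronbach_alpha(ratings), 2))    # -> 0.82
```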



Factors affecting competency sub-score reliabilities

Candidate variability (or lack thereof)
Number and context of stations in which the competency was assessed
Number of non-critical items vs critical items (importance of their performance to the task at hand)



Reports to candidates
Results: pass-fail status (all candidates)
Feedback (for failing candidates, on request):
  Individual score breakdown
    by major skill - mean Communication, Outcome and Performance ratings aggregated across all stations
    by competency - mean percent scores aggregated across all stations in which the competency was assessed
    by critical incident - frequency of risk and misinformation
  Comparative data
    Reference group mean scores and frequencies, for comparison with a stable population, to show where performance needs to improve
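
As an illustration of how per-station ratings could be rolled up into the kind of feedback shown in the sample report later in this deck, here is a minimal sketch (the function and example data are hypothetical; this is not PEBC's reporting code):

```python
from collections import Counter

def rating_feedback(candidate_ratings, group_mean):
    """Summarize one holistic scale for the feedback report:
    the candidate's mean rating, the reference-group mean,
    and the count of stations at each rating level (4..1)."""
    mean = sum(candidate_ratings) / len(candidate_ratings)
    distribution = Counter(candidate_ratings)
    return {
        "your_average": round(mean, 2),
        "group_average": group_mean,   # drawn from a stable reference population
        "stations_by_rating": {r: distribution.get(r, 0) for r in (4, 3, 2, 1)},
    }

# Example: hypothetical Communication ratings across the 12 client stations
comm = [4, 4, 4, 4, 3, 3, 3, 3, 3, 3, 2, 1]
print(rating_feedback(comm, group_mean=3.67))
```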



QUALIFYING EXAMINATION - Part II (OSCE) - revised 15Mar04
Training Station - Cancer Pain Control (Video Performance)

Sample assessor scoring sheet - station checklist

[Form header: fields for station, location, track, case number, shift, candidate, assessor and client (SP) identifiers, with a box for the candidate ID barcode label. Assessors are instructed to verify that the assessor and client ID barcodes are correct and, if not, to mark an X through the barcode box and enter the correct number in the spaces provided above the box.]

Practise Pharmaceutical Care:
Communicate and Educate Effectively:

1. Asks about nature of the pain (e.g. severity, frequency).
2. Asks patient about side effects (e.g. constipation).
Advises / informs:
3. It is safe to take both tablets and liquid.
* 4. Liquid morphine is for immediate relief of occasional pain; tablets are for maintaining pain control.
5. Should stop taking Tylenol.
6. Liquid morphine will act quickly to relieve your pain.
7. Continue taking one morphine tablet twice daily.
* 8. Take 5 mL every 4 hours if needed / if pain recurs before you are to take your next morphine tablet.
* 9. You may experience more drowsiness / sleep (if you take both liquid and tablets).
* 10. For constipation, increase fluid and fibre intake, exercise regularly and/or could use a stimulant laxative.
11. Contact your doctor if you get breakthrough pain often OR need the liquid often.
12. If you need liquid morphine regularly it may indicate that your tablets are not strong enough / that you need a dose adjustment.
13. Do NOT take both tablets and liquid / unsafe. (incorrect response)

Unique Response (if any): If another response is given which you are unsure is appropriate - or which influences your grading below - please shade in the bubble and note details.

COMMENT (if rating less than Acceptable/Marginal or Solved/Marginal): use back of sheet if needed

Fill in one bubble for each rating scale below - to score the candidate's performance.

Communications              Outcome                Performance
Acceptable                  Problem Solved         Acceptable                Misinfo
Acceptable/Marginal         Solved/Marginal        Acceptable/Marginal
Unacceptable/Marginal       Uncertain              Unacceptable/Marginal
Unacceptable                Unsolved               Unacceptable              Risk

Sample OSCE feedback report

PEBC QUALIFYING EXAMINATION - PART II (OSCE)
EXAMINATION FEEDBACK REPORT

Candidate ID: ________                        Status: Fail

Table 1 - RATINGS

Communication                   Rating                               # of Stations (of 12)
Your average:  3.10             4 = Acceptable                                4
Group average: 3.67             3 = Marginally acceptable                     6
                                2 = Marginally unacceptable                   1
                                1 = Unacceptable                              1

Outcomes                        Rating                               # of Stations (of 15)
Your average:  2.33             4 = Problem solved                            2
Group average: 2.90             3 = Solved marginally                         5
                                2 = Uncertain/marginally unsolved             4
                                1 = Problem unsolved                          4

Overall Performance             Rating                               # of Stations (of 15)
Your average:  2.51             4 = Acceptable                                3
Group average: 3.01             3 = Marginally acceptable                     5
                                2 = Marginally unacceptable                   4
                                1 = Unacceptable                              3

Table 2 - MISINFORMATION AND RISK/INEFFECTIVE THERAPY

                                # of Stations/Instances (of 15)      Group Average
Misinformation                                3                          1.31
Risk or ineffective therapy                   2                          0.63

Table 3 - COMPETENCY SCORES

Competency                                                         Your Score (%)   Group Average (%)
1. Practise pharmaceutical care                                          47                59
2. Assume ethical, legal and professional responsibilities               39                56
3. Access, retrieve, evaluate, disseminate relevant information          44                47
4. Communicate and educate effectively                                   35                53
5. Manage drug distribution                                              80                74
6. Apply practice management knowledge and skills                        50                63



Candidate findings
Most candidates understand the information provided but want more guidance (content information about where they went wrong)
Some do not accept the exam results and feedback information and may request hand-scoring
Failing candidates generally score low in Communications (rating scale and competency 4) and/or Pharmaceutical Care (competency 1 - clinical role)
Many failing candidates lack clinical training in Canada (or the US), though many have some technical training/experience (as a pharmacy technician)



Are we really helping candidates?

Anecdotally, yes: some do not know where to start or what to focus on
Skills scores and competency sub-scores are
consistent enough to be meaningful in areas
that are weighted more heavily
All candidates who fail show weaknesses in
one or more of these areas (low scores
relative to the reference group)



What questions do (can) we answer?
What area(s) do I need to improve?
What does PEBC expect in these areas?
What did I do wrong? What did I miss?
Why am I expected to perform at a higher level
than what I see some pharmacists doing?
How many errors and omissions lead to a failing
result?
How close was I to passing?



What other strategies are (may be) helpful?
Provide information about training and/or remedial
resources, e.g.:
Clear expressions, including visual exemplars, of good
practice in each competency area
Recognized training programs and resources
Practice exams (e.g. mock OSCEs) for format
familiarization
Provide general tips, e.g.:
Typical performance errors/deficits in each competency
Competency-related descriptions of candidates who are
clearly qualified, borderline qualified and unqualified



Contact information

Carol O'Byrne
Pharmacy Examining Board of Canada
415 Yonge Street, Suite 601
Toronto, ON M5B
T: 416-979-2431, ext 226
Email: obyrnec@pebc.ca
Website: www.pebc.ca

