2
Evidence-driven eating
You need to buy lunch. Before you decide what
to buy you consider a number of factors:
• how much money do you have?
• what do you feel like eating?
• what will you be having for dinner?
• how far do you need to go to buy food?
• how much time do you have?
• where are you going to eat it?
3
Evidence-driven teaching
I had a hunch that Ana wasn’t doing as well as she could in her
research assignments, a major part of the history course. What
made me think this?
Ana’s general work (especially her writing) was fine. She made
perceptive comments in class, contributed well in groups and had
good results overall last year, especially in English.
How did I decide what to do about it?
I looked more closely at her other work. I watched her working in the
library one day to see if it was her reading, her use of resources, her
note taking, her planning, or what. At morning tea I asked one of
Ana’s other teachers about Ana’s approach to similar tasks. I asked
Ana if she knew why her research results weren’t as good as her
other results, and what her plans were for the next assignment.
I thought about all of this and planned a course of action. I gave her
help with using indexes, searching, note taking and planning and
linking the various stages of her research.
4
Consider the Evidence
A resource to assist schools
to review their use of data and other evidence
5
Evidence
Any facts, circumstances or perceptions that can
be used as an input for an analysis or decision
7
Which factors are data?
Evidence to consider before buying lunch
We should
• draw on all our knowledge about the learning
environment to improve student achievement
• explore what lies behind patterns of
achievement
• decide what changes will make a difference
9
What evidence does a school have?
• Demographics
• Student achievement
• Perceptions
• School processes
• Other practice
10
Demographics
What data do we have now to provide a profile
of our school?
What other data could we create?
• School
• Students
• Staff
• Parents/caregivers and community
11
Perceptions
Sources of evidence about what students, staff, parents and the community think about the school
• Self appraisal
• Formal and informal observations made by teachers
• Structured interactions
• Externally generated reports
• Student voice
• Other informal sources
15
Perceptions
Evidence about what students, staff, parents and the
community think about the school
• Self appraisal - student perceptions of their own abilities, potential,
achievements, attitudes …
• Formal and informal observations made by teachers - peer
interactions, behaviour, attitudes, engagement, student-teacher
relationships, learning styles, classroom dynamics …
• Structured interactions - records from student interviews, parent
interviews, staff conferences on students …
• Externally generated reports - from ERO and NZQA (these contain
data but also perceptions) …
• Student voice - student surveys, student council submissions …
• Other informal sources – views about the school environment, staff
and student morale, board perceptions, conversations among
teachers …
16
School processes
• Timetable
• Classes
• Resources
• Finance
• Staffing
17
School processes
Evidence and data about how our school is organised
and operates, including:
• Timetable – structure, period length, placement of breaks, subjects
offered, student choices, tertiary and workforce factors, etc
• Classes - how they are compiled, their characteristics, effect of
timetable choices, etc
• Resources - access to libraries, text books, ICT, special equipment,
etc
• Finance - how the school budget is allocated, how funds are used
within departments, expenditure on professional development
• Staffing - policies and procedures for employing staff, allocating
responsibility, special roles, workload, subjects and classes
18
Other practice
19
Other practice
How can we find out what has worked in
other schools?
20
What can we do with evidence?
Shane’s story
A history HOD wants to see whether history students are performing to
their potential.
She prints the latest internally assessed NCEA records for history
students across all of their subjects. As a group, history students
seem to be doing as well in history as they are in other subjects.
Then she notices that Shane is doing very well in English and only
reasonably well in history. She wonders why, especially as both are
language-rich subjects with many similarities.
The HOD speaks with the history teacher, who says Shane is attentive,
catches on quickly and usually does all work required. He mentions
that Shane is regularly late for class, especially on Monday and
Thursday. So he often misses important information or takes time to
settle in. He has heard there are ‘problems at home’ so has
overlooked it, especially as the student is doing reasonably well in
history. contd ...
21
Shane’s story … contd
The HOD looks at the timetable and discovers that history is Period 1
on Monday and Thursday. She speaks to Shane’s form teacher who
says that she suspects Shane is actually late to school virtually
every day. They look at centralised records. Shane has excellent
attendance but frequent lateness to period 1 classes.
The HOD speaks to the dean who explains that Shane has to take his
younger sister to school each morning. He had raised the issue with
Shane but he said this was helping the household get over a difficult
period and claimed he could handle it.
The staff involved agree that Shane’s regular lateness is having a
demonstrable impact on his achievement, probably beyond history
even if less obviously so.
The dean undertakes to speak to the student, history teacher, and
possibly the parents to find a remedy for the situation.
22
Thinking about Shane’s story
What were the key factors in the scenario about
Shane?
What types of data and other evidence were
used?
What questions did the HOD ask?
What happened in this case that wouldn’t
necessarily happen in some schools?
23
Shane’s story - keys to success
25
Evidence-driven strategic planning
INDICATORS FROM DATA | STRATEGIC GOAL | ANNUAL PLAN | YEAR TARGET | EVALUATION DATA
• Strategic goal (from the school charter): to raise the levels of writing across the school
• Annual plan: develop and implement a plan to raise levels of writing at year 9
• Year target: raise asTTle writing results for year 9 boys from 3B to 3A
• Evaluation data: appraisal, etc.
26
The evidence-driven decision making cycle
Trigger
Explore
Question
Assemble
Analyse
Interpret
Intervene
Evaluate
Reflect
27
The evidence-driven decision making cycle
28
The evidence-driven decision making cycle
• Trigger – data indicate a possible issue that could impact on
student achievement
• Speculate – a teacher has a hunch about a problem or a possible
action
• Explore – check data and evidence to explore the issue
• Question – clarify the issue and ask a question
• Act – carry out the intervention
• Reflect – on what has been learned and how practice will change
29
The evidence-driven decision making cycle
TRIGGER – SPECULATE – EXPLORE – QUESTION –
ASSEMBLE – ACT – EVALUATE – REFLECT
30
The evidence-driven decision making cycle
• Trigger – significant numbers not achieving well in writing
• Speculate – a teacher has a hunch: poor writers might spend little
time on homework
• Explore data – a survey of students shows that this is only
partially true
• Question – what are the characteristics of students who are poor
at writing?
• Assemble – more data, information and other evidence: asTTle
reading, homework, extracurricular activities, attendance, etc.
• Analyse – NQF/NCEA results by standard; analyse non NQF/NCEA
data and evidence
• Interpret – poor writers likely to play sport, speak well, read less,
do little homework
• Intervene – create multiple opportunities for writing; include topics
that can use sport as context; connect speaking and writing;
PD for staff
• Evaluate – has writing improved?
• Reflect – how will we teach writing in the future?
31
Evaluate and reflect
• Summative evaluation – assess how successful
the intervention was; decide how our practice
will change; report to board
33
Inter-subject analysis
34
Intra-subject analysis
35
Longitudinal analysis
36
The evidence-driven decision making cycle
37
Asking questions
38
Trigger questions
• How good/poor is …?
• What aspects of … are good/poor?
• Is … actually changing?
• How is … changing?
• Is … better than last year?
• How can … be improved?
• Why is … good/poor?
• What targets are reasonable for …?
• What factors influence the situation for …?
• What would happen if we …?
Formative or summative?
39
Summative questions
A target in the school’s annual plan is for all year
10 boys to improve their writing level by at least
one level using asTTle (e.g. from 4B to 4A).
40
Questions about policy
41
Formative questions from data
42
Formative questions from data
43
Questions from hunches
• I suspect this poor performance is being caused
by …
Is this true?
• We reckon results will improve if we put more
effort into ...
Is this likely?
• I think we’d get better results from this module if
we added …
Is there any evidence to support this idea?
44
Hunches from raw data
Student 2.1 2.2 2.3 2.4* 2.5* 2.6* ABS DET
1 Pamela N A N N N N 20 6
2 Lee A A A N N A 12 0
3 Manu E E E E N E 18 4
4 Keisha N A N N N N 7 8
5 Bron E M M N N A 3 0
6 Deane M M E M N A 2 1
7 Slane N A N N N N 22 8
8 Sam A A N A A A 12 8
9 Sione M M N N N N 2 2
10 Oran A A A A A A 7 0
11 Shirin E E E E A E 6 0
12 Hanna E E M M A M 0 1
13 Val E E E E N E 0 0
14 Liam N A M M N M 10 2
15 Morgan M M M M N M 15 0
16 Hone N A N N N N 17 4
17 Mahi A A N A A A 10 0
45
Hunches from raw data
• Is the class as a whole doing better in internally
assessed standards than in externally assessed
standards? If so, why?
• Are the better students (with many Excellence
results) not doing as well in external
assessments as in internal? If so, why?
• Is there any relationship between absences and
achievement levels? It seems not, but it’s worth
analysing the data to be sure.
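The last hunch above can be checked directly. The sketch below is one illustrative way to do it, assuming the grade letters are mapped to a 0–3 numeric scale (N = Not achieved, A = Achieved, M = Merit, E = Excellence) so each student's results can be averaged and correlated with absences; this scoring is an assumption for illustration, not an official NCEA metric.

```python
# Hedged sketch: does absence relate to achievement in the raw data table?
from statistics import mean

GRADE = {"N": 0, "A": 1, "M": 2, "E": 3}  # illustrative 0-3 scoring

# (name, grades for standards 2.1-2.6, absences) from the table above
results = [
    ("Pamela", "NANNNN", 20), ("Lee", "AAANNA", 12), ("Manu", "EEEENE", 18),
    ("Keisha", "NANNNN", 7),  ("Bron", "EMMNNA", 3),  ("Deane", "MMEMNA", 2),
    ("Slane", "NANNNN", 22),  ("Sam", "AANAAA", 12),  ("Sione", "MMNNNN", 2),
    ("Oran", "AAAAAA", 7),    ("Shirin", "EEEEAE", 6), ("Hanna", "EEMMAM", 0),
    ("Val", "EEEENE", 0),     ("Liam", "NAMMNM", 10), ("Morgan", "MMMMNM", 15),
    ("Hone", "NANNNN", 17),   ("Mahi", "AANAAA", 10),
]

scores = [mean(GRADE[g] for g in grades) for _, grades, _ in results]
absences = [absent for _, _, absent in results]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(absences, scores)
print(f"correlation between absences and mean grade: {r:.2f}")
```

A correlation near zero would support the impression that absence and achievement are unrelated in this class; a clearly negative value would suggest the hunch deserves a closer look.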
46
The evidence-driven decision making cycle
47
Question – Explore – Question
It looks like our students are doing well in A but
not in B. What can we do about it?
48
Question – Explore – Question
We have been running 60-minute periods for a
year now. Did the change achieve the desired
effects?
50
A very good question
51
Questions with purpose
What do we know about reported bullying
incidents for year 10 students?
MAY BE BETTER AS
Who has been bullying whom? Where?
What are students telling us?
What does pastoral care data tell us? Were
some interventions more effective with some
groups of students than others?
52
Write more purposeful questions
• What are the attendance rates for year 11
students?
• What has been the effect of the new 6-day x
50-min period structure?
• How well are boys performing in formal writing
in year 9?
• What has been the effect of shifting the lunch
break to after period 4?
53
More purposeful questions
1. How do year 11 attendance rates compare with other
year levels? Do any identifiable groups of year 11
students attend less regularly than average?
2. Is the new 6-day x 50-min period structure having any
positive effect on student engagement levels? Is it
influencing attendance patterns? What do students say?
3. Should we be concerned about boys’ writing? If so, what
action should we be taking to improve the writing of boys
in terms of the literacy requirements for NCEA Level 1?
4. The new timing of the lunch break was intended to
improve student engagement levels after lunch. Did it
achieve this? If so, did improvements in student
engagement improve student achievement? Do the
benefits outweigh any disadvantages?
54
The evidence-driven decision making cycle
55
Assembling the evidence
56
Are our data any good?
A school found that a set of asTTle scores
indicated that almost all students were achieving
at lower levels than earlier in the year.
57
Think critically about data
• Was the assessment that created this data
assessing exactly what we are looking for?
• Was the assessment set at an appropriate level
for this group of students?
• Was the assessment properly administered?
• Are we comparing data for matched groups?
58
Cautionary tale 1
You want to look at changes in a cohort’s asTTle
writing levels over 12 months.
59
Cautionary tale 2
You have data that show two classes have
comparable mathematics ability. But end-of-year
assessments show one class achieved far better
than the other.
What could have caused this?
Was the original data flawed? How did teaching
methods differ? Was the timetable a factor? Did
you survey student views? Are the classes
comparable in terms of attendance, etc?
60
The evidence-driven decision making cycle
61
Analysing data and other evidence
• Schools need some staff members who are
responsible for leading data analysis
• Schools have access to electronic tools to
process data into graphs and tables
• All teachers do data analysis
• Data is not an end in itself – it’s one of the many
stages along the way to evidence-driven
decision making
62
Basic analysis
Student 2.1 2.2 2.3 2.4* 2.5* 2.6* ABS DET
1 Pamela N A N N N N 20 6
2 Lee A A A N N A 12 0
3 Manu E E E E N E 18 4
4 Keisha N A N N N N 7 8
5 Bron E M M N N A 3 0
6 Deane M M E M N A 2 1
7 Slane N A N N N N 22 8
8 Sam A A N A A A 12 8
9 Sione M M N N N N 2 2
10 Oran A A A A A A 7 0
11 Shirin E E E E A E 6 0
12 Hanna E E M M A M 0 1
13 Val E E E E N E 0 0
14 Liam N A M M N M 10 2
15 Morgan M M M M N M 15 0
16 Hone N A N N N N 17 4
17 Mahi A A N A A A 10 0
63
Basic analysis
• Divide the class into three groups on the basis of
overall achievement
• Identify students who are doing so well at level 2
that they could be working at a higher level
• Find trends for males and females, for students who
are absent often, or for those with many detentions
• Compare this group’s external assessment
success rate with the national cohort.
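The first bullet above can be sketched in a few lines. This is an illustrative approach, not the resource's prescribed method: grades are mapped to a 0–3 scale (an assumed scoring) so students can be ranked by mean grade and the ranked class split into thirds.

```python
# Hedged sketch: rank students by mean grade and split into three groups.
GRADE = {"N": 0, "A": 1, "M": 2, "E": 3}  # illustrative 0-3 scoring

results = {  # name -> grades for standards 2.1-2.6, from the table above
    "Pamela": "NANNNN", "Lee": "AAANNA", "Manu": "EEEENE", "Keisha": "NANNNN",
    "Bron": "EMMNNA", "Deane": "MMEMNA", "Slane": "NANNNN", "Sam": "AANAAA",
    "Sione": "MMNNNN", "Oran": "AAAAAA", "Shirin": "EEEEAE", "Hanna": "EEMMAM",
    "Val": "EEEENE", "Liam": "NAMMNM", "Morgan": "MMMMNM", "Hone": "NANNNN",
    "Mahi": "AANAAA",
}

def mean_grade(grades: str) -> float:
    return sum(GRADE[g] for g in grades) / len(grades)

ranked = sorted(results, key=lambda name: mean_grade(results[name]), reverse=True)
third = len(ranked) // 3  # 17 students -> groups of 5, 5 and 7
top, middle, lower = ranked[:third], ranked[third:2 * third], ranked[2 * third:]
print("top group:", top)
```

The same ranking makes the other bullets easy to follow up: the top group is where to look for students ready to work at a higher level, and the lower group is where absence and detention patterns are worth cross-checking.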
64
Reading levels – terms 1 and 4
65
Making sense of the results
Think about significance and confidence
66
Making sense of the results
This table shows that
reading levels overall
were higher in term 4
than in term 1.
Scores improved for most students.
20% of students moved into level 5.
But the median score is still 4A.
67
Information
Knowledge gained from analysing data and
making meaning from evidence.
68
Information
Summative information is useful for reporting
against targets and as general feedback to
teachers.
69
Questions to elicit information
• Did the more able students make significant progress,
but not the lower quartile?
• How have the scores of individual students changed?
• How many remain on the same level?
• How much have our teaching approaches contributed to
this result?
• How much of this shift in scores is due to students’
predictable progress? Is there any data that will enable
us to compare our students with a national cohort?
• How does this shift compare with previous Year 9
cohorts?
70
Reading levels – terms 1 and 4
71
Words, words, words …
Information can … establish, indicate, confirm,
reinforce, back up, stress, highlight, state, imply,
suggest, hint at, cast doubt on, refute …
72
The evidence-driven decision making cycle
73
Making sense of information
Data becomes information when it is categorised,
analysed, summarised and placed in context.
Information therefore is data endowed with relevance
and purpose.
Information is developed into knowledge when it is used
to make comparisons, assess consequences, establish
connections and engage in dialogue.
Knowledge … can be seen as information that comes
laden with experience, judgment, intuition and values.
Empson (1999) cited in Mason (2003)
74
Interrogate the information
• Is this the sort of result we envisaged? If not,
why?
• How does this information compare with the
results of other research or the experiences of
other schools?
• Are there other variables that could account for
this result?
• Should we set this information alongside other
data or evidence to give us richer information?
• What new questions arise from this information?
75
Interrogate the information
• Does this relate to student achievement - or
does it actually tell us something about our
teaching practices?
• Does this information suggest that the school’s
strategic goals and targets are realistic and
achievable? If not, how should they change, or
should we change?
• Does the information suggest we need to modify
programmes or design different programmes?
• Does the information suggest changes need to
be made to school systems?
76
Interrogate the information
What effect is the new 6-day x 50-min period
structure having on student engagement levels?
77
Interrogate the information
What effect is the new 6-day x 50-min period
structure having on student engagement levels?
79
Professionals making decisions
How do we decide what action to take as result
of the information we get from the analysis?
80
Professional decision making
We have evidence-based information that we
see as reliable and valid
81
Professionals making decisions
Have my students not achieved a particular
history standard because they have poor formal
writing skills, rather than poor history
knowledge?
82
Professionals making decisions
Do any particular groups of year 11 students
attend less regularly than average for the whole
cohort?
83
Professionals making decisions
You asked what factors are related to poor student
performance in formal writing.
The analysis suggested that poor homework habits
have a significant impact on student writing.
We need to consider
• what control we have over the action
• the likely impact of the action
• the resources needed
85
Planning for action
• Is this a major change to policy or processes?
• What other changes are being proposed?
• How soon can you make this change?
• How will you achieve wide buy-in?
• What time and resources will you need?
• Who will co-ordinate and monitor
implementation?
86
Planning for action
• Is this an incremental change? Or are you just
tweaking how you do things?
• How will you fit the change into your regular
work?
• When can you start the intervention?
• Will you need extra resources?
• How will this change affect other things you do?
• How will you monitor implementation?
87
Timing is all
• How long should we run the intervention before
we evaluate it?
• When is the best time of the year to start (and
finish) in terms of measuring changes in student
achievement?
• How much preparation time will we need to get
maximum benefit?
88
Planning for evaluation
89
Planning for evaluation
• What evidence do we need to collect before we
start?
• Do we need to collect evidence along the way,
or just at the end?
• How can we be sure that any assessment at the
end of the process will be comparable with
assessment at the outset?
• How will we monitor any unintended effects?
Don’t forget evidence such as timetables, student
opinions, teacher observations …
90
The evidence-driven decision making cycle
91
Evaluate the impact of our action
92
Evaluate the impact of our action
93
Evaluate the impact of our action
A school created a new year 13 art programme. In
the past students had been offered standard
design and painting programmes, internally and
externally assessed against the full range of
achievement standards. Some students had to
produce two folios for assessment and were
unsure of where to take their art after leaving
school.
The new programme blended drawing, design and
painting concepts and focused on electronic
media. Assessment was against internally
assessed standards only.
94
Evaluate the impact of our action
• Did students complete more assessments?
• Did students gain more national assessment
credits?
• How did student perceptions of workload and
satisfaction compare with teacher perceptions
from the previous year?
• Did students leave school with clearer intentions
about where to go next with their art than the
previous cohort?
• How did teachers and parents feel about the
change?
95
Evaluate the intervention
How well did we design and carry out the
intervention? Would we do anything differently if
we did it again?
Were our results affected by anything that
happened during the intervention period - within
or beyond our control?
Did we ask the right question in the first place?
How useful was our question?
How adequate were our evaluation data?
96
Think about the process
• Did we ask the right question in the first place?
How useful was our question?
• Did we select the right data? Could we have
used other evidence?
• Did the intervention work well? Could we have
done anything differently?
• Did we interpret the data-based information
correctly?
• How adequate were our evaluation data?
• Did the outcome justify the effort we put into it?
97
The evidence-driven decision making cycle
98
Future practice
100
Consider the Evidence
Terminology
101
Terminology
Terminology used in the
evidence-driven decision making cycle
102
Trigger
Data, ideas, hunches, etc. that set a process in
motion.
The trigger is whatever it is that makes you think
there could be an opportunity to improve student
achievement. You can routinely scan available
data looking for inconsistencies, etc. It can be
useful to speculate about possible causes or
effects - and then explore data and other
evidence to see if there are any grounds for the
speculation.
103
Explore
Initial data, ideas or hunches usually need some
preliminary exploration to pinpoint the issue and
suggest good questions to ask.
104
Question
This is the key point: what question or questions do
you want answered? Questions can raise an issue
and/or propose a possible solution.
105
Assemble
Get together all the data and evidence you might
need – some will already exist and some will
have to be generated for the occasion.
106
Analyse
Process sets of data and relate them to other
evidence.
You are looking for trends and results that will
answer your questions (but watch out for
unexpected results that might suggest a new
question).
107
Interpret
Think about the results of the analysis and clarify
the knowledge and insights you think you have
gained.
Interrogate the information. It’s important to look
at the information critically. Was the data valid
and reliable enough to lead you to firm
conclusions? Do the results really mean what
they seem to mean? How sure are you about
the outcome? What aspects of the information
lead to possible action?
108
Intervene
Design and implement a plan of action designed
to change the situation you started with.
109
Evaluate
Using measures you decided in advance,
assess how successful the intervention has
been.
110
Reflect
Think about what has been learned and
discovered – and what practices you will change
as a consequence.
111
Terminology
112
Terminology
Analysis
113
Terminology
Aggregation
114
Terminology
Data
115
Terminology
Demographics
You will have the usual data relating to your students (gender,
ethnicity, etc) and your staff (gender, ethnicity, years of experience,
etc). Some schools collect other data, such as the residential
distribution of students and parental occupations.
116
Terminology
Disaggregation
See aggregation
When you disaggregate data, you take aggregated data apart to see
what you can discover from the component parts. For example, a
student may do moderately well across a whole subject, but you
need to disaggregate the year’s result to see where her weaknesses
lie.
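The example in the definition can be made concrete. The strand names and marks below are invented purely for illustration:

```python
# Illustrative sketch of disaggregation: an overall result looks fine,
# but breaking the year down by assessment strand shows the weakness.
marks = {"essays": 72, "source analysis": 70, "research assignment": 41}

# aggregated view: one number for the whole subject
aggregate = sum(marks.values()) / len(marks)
print(f"aggregate mark: {aggregate:.0f}")

# disaggregated view: inspect the components behind the aggregate
weakest = min(marks, key=marks.get)
print(f"weakest strand: {weakest} ({marks[weakest]})")
```

The aggregate of 61 suggests a moderately capable student; only disaggregating reveals that the research assignment is pulling the result down.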
117
Terminology
Evaluation
118
Terminology
Evidence
119
Terminology
Information
120
Terminology
Inter-subject analysis
121
Terminology
Intervention
122
Terminology
Intra-subject analysis
123
Terminology
Longitudinal analysis
124