
Consider the Evidence

Evidence-driven decision making


for secondary schools
A resource to assist schools
to review their use of data and other evidence
1
Evidence-driven decision making
Today we aim to
• think about how we use data and other
evidence to improve teaching, learning and
student achievement
• improve our understanding, confidence and
capability in using data to improve practice
• discuss how we make decisions
• think about our needs and start to plan our own
evidence-based projects

2
Evidence-driven eating
You need to buy lunch. Before you decide what
to buy you consider a number of factors:
• how much money do you have?
• what do you feel like eating?
• what will you be having for dinner?
• how far do you need to go to buy food?
• how much time do you have?
• where are you going to eat it?

3
Evidence-driven teaching
I had a hunch that Ana wasn’t doing as well as she could in her
research assignments, a major part of the history course. What
made me think this?
Ana’s general work (especially her writing) was fine. She made
perceptive comments in class, contributed well in groups and had
good results overall last year, especially in English.
How did I decide what to do about it?
I looked more closely at her other work. I watched her working in the
library one day to see if it was her reading, her use of resources, her
note taking, her planning, or what. At morning tea I asked one of
Ana’s other teachers about Ana’s approach to similar tasks. I asked
Ana if she knew why her research results weren’t as good as her
other results, and what her plans were for the next assignment.
I thought about all of this and planned a course of action. I gave her
help with using indexes, searching, note taking and planning and
linking the various stages of her research.

4
Consider the Evidence
A resource to assist schools
to review their use of data and other evidence

What is meant by ‘data and other evidence’?

5
Evidence
Any facts, circumstances or perceptions that can
be used as an input for an analysis or decision

• how classes are compiled, how classes are allocated to teachers, test results, teachers’ observations, attendance data, portfolios of work, student opinions …

Data are one form of evidence


6
Data
Known facts or measurements, probably
expressed in some systematic or symbolic way
(e.g. as numbers)

• assessment results, gender, attendance, ethnicity …

Data are one form of evidence

7
Which factors are data?
Evidence to consider before buying lunch

• how much money you have


• what you feel like eating
• what you’ll be having for dinner
• how far you need to go to buy food
• how much time you have
• where you’re going to eat
• what your diet allows
8
Evidence-driven decision making
We have more evidence about what students
know and can do than ever before – their
achievements, their behaviours, and the
environmental factors that influence learning

We should
• draw on all our knowledge about the learning
environment to improve student achievement
• explore what lies behind patterns of
achievement
• decide what changes will make a difference
9
What evidence does a school have?

• Demographics
• Student achievement
• Perceptions
• School processes
• Other practice

10
Demographics
What data do we have now to provide a profile
of our school?
What other data could we create?

• School
• Students
• Staff
• Parents/caregivers and community

11
Demographics
Data that provides a profile of our school

• School – decile, roll size, urban/rural, single sex or co-educational, teaching spaces …
• Students – ethnicity, gender, age, year level, attendance,
lateness, suspension and other disciplinary data,
previous school, part-time employment …
• Staff – gender, age, years of experience, qualifications,
teaching areas, involvement in national curriculum and
assessment, turnover rate …
• Parents/caregivers and community – socio-economic
factors, breadth of school catchment, occupations …
12
Student achievement
What evidence do we have now about student
achievement?
What other evidence could we collect?

• National assessment results


• Standardised assessment results administered
internally
• Other in-school assessments
• Student work
13
Student achievement
Evidence about student achievement

• National assessment results - NCEA, NZ Scholarship - details like credits above and below year levels, breadth of subjects entered …
• Standardised assessment results administered internally - PAT, asTTle …
• Other in-school assessments - most non-standardised but some, especially within departments, will be consistent across classes - includes data from previous schools, primary/intermediate
• Student work - work completion rates, internal assessment completion patterns, exercise books, notes, drafts of material - these can provide useful supplementary evidence
14
Perceptions

What evidence do we have now about what students, staff and others think about the school?
Are there other potential sources?

• Self appraisal
• Formal and informal observations made by teachers
• Structured interactions
• Externally generated reports
• Student voice
• Other informal sources
15
Perceptions
Evidence about what students, staff, parents and the
community think about the school
• Self appraisal - student perceptions of their own abilities, potential,
achievements, attitudes …
• Formal and informal observations made by teachers - peer
interactions, behaviour, attitudes, engagement, student-teacher
relationships, learning styles, classroom dynamics …
• Structured interactions - records from student interviews, parent
interviews, staff conferences on students …
• Externally generated reports - from ERO and NZQA (these contain
data but also perceptions) …
• Student voice - student surveys, student council submissions …
• Other informal sources – views about the school environment, staff
and student morale, board perceptions, conversations among
teachers …
16
School processes

What evidence do we have about how our school is organised and operates?

• Timetable
• Classes
• Resources
• Finance
• Staffing

17
School processes
Evidence about how our school is organised and
operates
• Timetable – structure, period length, placement of breaks, subjects offered, student choices, tertiary and workforce factors, etc
• Classes – how they are compiled, their characteristics, effect of timetable choices, etc
• Resources – access to libraries, textbooks, ICT, special equipment, etc
• Finance – how the school budget is allocated, how funds are used within departments, expenditure on professional development
• Staffing – policies and procedures for employing staff, allocating responsibility, special roles, workload, subjects and classes

18
Other practice

How can we find out about what has worked (or not) in other schools?

19
Other practice
How can we find out about what has worked (or not) in other schools?

• Documented research – university and other publications, Ministry of Education’s Best Evidence Syntheses, NZCER, NZARE, overseas equivalents …
• Experiences of other schools – informal contacts, local clusters, advisory services, TKI LeadSpace …

20
What can we do with evidence?
Shane’s story
A history HOD wants to see whether history students are performing to
their potential.
She prints the latest internally assessed NCEA records for history
students across all of their subjects. As a group, history students
seem to be doing as well in history as they are in other subjects.
Then she notices that Shane is doing very well in English and only
reasonably well in history. She wonders why, especially as both are
language-rich subjects with many similarities.
The HOD speaks with the history teacher, who says Shane is attentive,
catches on quickly and usually does all work required. He mentions
that Shane is regularly late for class, especially on Monday and
Thursday, so he often misses important information or takes time to
settle in. He has heard there are ‘problems at home’ so has
overlooked the lateness, especially as Shane is doing reasonably well
in history. contd ...

21
Shane’s story … contd

The HOD looks at the timetable and discovers that history is Period 1
on Monday and Thursday. She speaks to Shane’s form teacher who
says that she suspects Shane is actually late to school virtually
every day. They look at centralised records. Shane has excellent
attendance but frequent lateness to period 1 classes.
The HOD speaks to the dean who explains that Shane has to take his
younger sister to school each morning. He had raised the issue with
Shane but he said this was helping the household get over a difficult
period and claimed he could handle it.
The staff involved agree that Shane’s regular lateness is having a
demonstrable impact on his achievement, probably beyond history
too, though less obviously.
The dean undertakes to speak to the student, history teacher, and
possibly the parents to find a remedy for the situation.

22
Thinking about Shane’s story
What were the key factors in the scenario about
Shane?
What types of data and other evidence were
used?
What questions did the HOD ask?
What happened in this case that wouldn’t
necessarily happen in some schools?

23
Shane’s story - keys to success

The history HOD looked at achievement data in English and history.
She looked for something significant across the two data sets, not just low achievement.
Then she asked a simple question: Why is there such a disparity between these two subjects for that student?
She sought information and comments (perceptions evidence and data) from all relevant staff.
The school had centralised attendance and punctuality records (demographic data) that the form teacher could access easily.
The action was based on all available evidence and
designed to achieve a clear aim.
24
Evidence-driven strategic planning
If we use evidence-driven decision making to
improve student achievement and enhance
teaching practice …

… it follows that strategic planning across the school should also be evidence-driven.

25
Evidence-driven strategic planning
INDICATORS FROM DATA
• asTTle scores show a high proportion of year 9 achieving below curriculum level
• NCEA results show high non-achievement in transactional writing
• Poor results in other language NCEA standards
• etc.
• School charter

STRATEGIC GOAL
• To raise the levels of writing across the school
• Strategic action: develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs
• etc.

ANNUAL PLAN
• Develop and implement a plan to raise levels of writing at year 9
• Development plan to be based on an analysis of all available data and to include a range of shared strategies
• etc.

YEAR TARGET
• Raise year 9 boys’ asTTle writing results from 3B to 3A
• asTTle writing results improve by …
• etc.

EVALUATION DATA
• Appraisal
• Perception data from Yr 9 staff PD indicates …
• Evaluation of effectiveness of range of shared strategies, barriers and enablers …
• Self review
• NCEA
• etc.
26
The evidence-driven decision making cycle

Trigger
Explore
Question
Assemble
Analyse
Interpret
Intervene
Evaluate
Reflect

27
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

28
The evidence-driven decision making cycle

Speculate – a teacher has a hunch about a problem or a possible action
Trigger – data indicate a possible issue that could impact on student achievement
Explore – check data and evidence to explore the issue
Question – clarify the issue and ask a question
Assemble – decide what data and evidence might be useful
Analyse – analyse data and evidence
Interpret – insights that answer your question
Intervene – plan an action aimed at improving student achievement
Act – carry out the intervention
Evaluate – the impact of the intervention
Reflect – on what has been learned, how practice will change
29
The evidence-driven decision making cycle
[Diagram: the cycle as a loop – Speculate/Trigger → Explore → Question → Assemble → Analyse → Interpret → Intervene → Act → Evaluate → Reflect]

30
The evidence-driven decision making cycle
Trigger – significant numbers not achieving well in writing
Speculate – a teacher has a hunch: poor writers might spend little time on homework
Explore data – a survey of students shows that this is only partially true
Question – what are the characteristics of students who are poor at writing?
Assemble – more data and other evidence: asTTle reading, homework, extracurricular activities, attendance, etc.
Analyse – NQF/NCEA results by standard; analyse non-NQF/NCEA data and evidence
Interpret information – poor writers likely to play sport, speak well, read less, do little homework
Intervene – create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing; PD for staff
Evaluate – has writing improved?
Reflect – how will we teach writing in the future?
31
Evaluate and reflect
• Summative evaluation – assess how successful
the intervention was; decide how our practice
will change; report to board

• Formative evaluation – at every stage in the cycle we reflect and evaluate
Are we on the right track?
Do we need to fine-tune?
Do we actually need to complete this?
32
Types of analysis
We can compare achievement data by subject
or across subjects for
• an individual student
• groups of students
• whole cohorts

The type of analysis we use depends on the question we want to answer

33
Inter-subject analysis

• Have my students not achieved a particular history standard because they have poor formal writing skills, rather than poor history knowledge?
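
One quick way to investigate a question like this is to set each student’s grades in the two subjects side by side. The sketch below is illustrative only: the names and grades are invented, and ranking NCEA grades N/A/M/E as 0–3 is just one workable convention.

```python
# Illustrative sketch: flag students whose history grade sits well below
# their English grade. Names and grades here are hypothetical.
RANK = {"N": 0, "A": 1, "M": 2, "E": 3}  # Not achieved .. Excellence

grades = {  # student: (history grade, English grade)
    "Ana":   ("A", "E"),
    "Shane": ("A", "M"),
    "Ruby":  ("M", "M"),
}

for student, (history, english) in grades.items():
    if RANK[english] - RANK[history] >= 1:
        print(f"{student}: history {history}, English {english} - worth a closer look")
```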

34
Intra-subject analysis

• What are the areas of strength and weakness in my own teaching of this class?

35
Longitudinal analysis

• Are we producing better results over time in year 11 biology?

36
The evidence-driven decision making cycle

> Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

37
Asking questions

Evidence-driven decision making starts with asking good questions

You can tell whether a man is clever by his answers. You can tell whether he is wise by his questions.
Nobel Prize winner, Naguib Mahfouz

38
Trigger questions
• How good/poor is …?
• What aspects of … are good/poor?
• Is … actually changing?
• How is … changing?
• Is … better than last year?
• How can … be improved?
• Why is … good/poor?
• What targets are reasonable for …?
• What factors influence the situation for …?
• What would happen if we …?

Formative or summative?
39
Summative questions
A target in the school’s annual plan is for all year
10 boys to improve their writing level by at least
one level using asTTle (e.g. from 4B to 4A).

Have all year 10 boys improved by at least one asTTle level in writing?
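
If the school’s systems can export each boy’s term 1 and term 4 asTTle writing levels, a check like this can be scripted. The sketch below uses invented records and assumes sublevels are ordered Basic < Proficient < Advanced within each curriculum level (2B, 2P, 2A, 3B, …).

```python
# Illustrative sketch: check whether every year 10 boy moved up at least
# one asTTle sublevel in writing. The records are hypothetical.
SUBLEVELS = ["2B", "2P", "2A", "3B", "3P", "3A", "4B", "4P", "4A", "5B", "5P", "5A"]
RANK = {level: i for i, level in enumerate(SUBLEVELS)}

# (student, term 1 level, term 4 level)
year10_boys = [("Tama", "3A", "4B"), ("Josh", "4B", "4B"), ("Rawiri", "4B", "4A")]

not_improved = [(n, t1, t4) for n, t1, t4 in year10_boys if RANK[t4] <= RANK[t1]]
print(f"{len(year10_boys) - len(not_improved)} of {len(year10_boys)} improved")
for name, t1, t4 in not_improved:
    print(f"  target not met: {name} ({t1} -> {t4})")
```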

40
Questions about policy

We have been running 60-minute periods for 5 years now.

What effect has the change had?

41
Formative questions from data

The data suggest our students are achieving well in A, but less well in B.

What can we do about that?

42
Formative questions from data

A significant proportion of our school leavers enrol in vocational programmes at polytechnic or on-job.

How well do our school programmes prepare those students?

43
Questions from hunches
• I suspect this poor performance is being caused
by …
Is this true?
• We reckon results will improve if we put more
effort into ...
Is this likely?
• I think we’d get better results from this module if
we added …
Is there any evidence to support this idea?
44
Hunches from raw data
            2.1  2.2  2.3  2.4*  2.5*  2.6*  ABS  DET    (* = externally assessed)
 1 Pamela    N    A    N    N     N     N     20    6
2 Lee A A A N N A 12 0
3 Manu E E E E N E 18 4
4 Keisha N A N N N N 7 8
5 Bron E M M N N A 3 0
6 Deane M M E M N A 2 1
7 Slane N A N N N N 22 8
8 Sam A A N A A A 12 8
9 Sione M M N N N N 2 2
10 Oran A A A A A A 7 0
11 Shirin E E E E A E 6 0
12 Hanna E E M M A M 0 1
13 Val E E E E N E 0 0
14 Liam N A M M N M 10 2
15 Morgan M M M M N M 15 0
16 Hone N A N N N N 17 4
17 Mahi A A N A A A 10 0

45
Hunches from raw data
• Is the class as a whole doing better in internally
assessed standards than in externally assessed
standards? If so, why?
• Are the better students (with many Excellence
results) not doing as well in external
assessments as in internal? If so, why?
• Is there any relationship between absences and
achievement levels? It seems not, but it’s worth
analysing the data to be sure.
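
These hunches can be tested directly by typing the table into a short script. The sketch below assumes the starred standards (2.4–2.6) are the externally assessed ones and scores grades as N=0, A=1, M=2, E=3; treat both choices as illustrative, not a recipe.

```python
# Illustrative sketch: test the hunches against the class results above.
# Assumption: the starred standards (columns 4-6) are externally assessed.
GRADE_POINTS = {"N": 0, "A": 1, "M": 2, "E": 3}

# (name, grades for standards 2.1-2.6, absences, detentions)
results = [
    ("Pamela", "NANNNN", 20, 6), ("Lee", "AAANNA", 12, 0),
    ("Manu", "EEEENE", 18, 4), ("Keisha", "NANNNN", 7, 8),
    ("Bron", "EMMNNA", 3, 0), ("Deane", "MMEMNA", 2, 1),
    ("Slane", "NANNNN", 22, 8), ("Sam", "AANAAA", 12, 8),
    ("Sione", "MMNNNN", 2, 2), ("Oran", "AAAAAA", 7, 0),
    ("Shirin", "EEEEAE", 6, 0), ("Hanna", "EEMMAM", 0, 1),
    ("Val", "EEEENE", 0, 0), ("Liam", "NAMMNM", 10, 2),
    ("Morgan", "MMMMNM", 15, 0), ("Hone", "NANNNN", 17, 4),
    ("Mahi", "AANAAA", 10, 0),
]

def achieved_rate(grades):
    """Proportion of results at Achieved or better."""
    return sum(g != "N" for g in grades) / len(grades)

internal = "".join(g[:3] for _, g, _, _ in results)
external = "".join(g[3:] for _, g, _, _ in results)
print(f"internal achieved-or-better: {achieved_rate(internal):.0%}")
print(f"external achieved-or-better: {achieved_rate(external):.0%}")

# Absences vs achievement: compare mean absences for the top and bottom
# halves of the class by total points (a crude but quick first check).
by_points = sorted(results, key=lambda r: sum(GRADE_POINTS[g] for g in r[1]))
half = len(by_points) // 2
for label, group in [("lower half", by_points[:half]), ("upper half", by_points[-half:])]:
    mean_abs = sum(absent for _, _, absent, _ in group) / len(group)
    print(f"mean absences, {label}: {mean_abs:.1f}")
```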

46
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
> Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

47
Question – Explore – Question
It looks like our students are doing well in A but
not in B. What can we do about it?

EXPLORE … what else should we be asking?

Is this actually the case?


Is there anything in the data to suggest what we
could do about it?

48
Question – Explore – Question
We have been running 60-minute periods for a
year now. Did the change achieve the desired
effects?

EXPLORE … what else should we be asking?

How has the change impacted on student achievement?
Has the change had other effects?
Is there more truancy?
Is more time being spent in class on assignments, rather than as homework?
49
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
> Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

50
A very good question

• Specific and with a clear purpose


• Able to be investigated through looking at data
and other evidence
• Likely to lead to information on which we can act

51
Questions with purpose
What do we know about reported bullying
incidents for year 10 students?
MAY BE BETTER AS
Who has been bullying whom? Where?
What are students telling us?
What does pastoral care data tell us? Were
some interventions more effective with some
groups of students than others?

52
Write more purposeful questions
• What are the attendance rates for year 11
students?
• What has been the effect of the new 6-day x
50-min period structure?
• How well are boys performing in formal writing
in year 9?
• What has been the effect of shifting the lunch
break to after period 4?

53
More purposeful questions
1. How do year 11 attendance rates compare with other year levels? Do any identifiable groups of year 11 students attend less regularly than average?
2. Is the new 6-day x 50-min period structure having any positive effect on student engagement levels? Is it influencing attendance patterns? What do students say?
3. Should we be concerned about boys’ writing? If so, what action should we be taking to improve the writing of boys in terms of the literacy requirements for NCEA Level 1?
4. The new timing of the lunch break was intended to improve student engagement levels after lunch. Did it achieve this? If so, did improvements in student engagement improve student achievement? Do the benefits outweigh any disadvantages?
54
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
> Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

55
Assembling the evidence

• We want to know if our senior students are doing better in one area of NCEA biology than another.
So … we need NCEA results for our cohort.
• It could be that all biology students do better in
this area than others.
So … we also need data about national
differences across the two areas.

56
Are our data any good?
A school found that a set of asTTle scores
indicated that almost all students were achieving
at lower levels than earlier in the year.

Then they discovered that the first test had been conducted in the morning, but the later test was in the afternoon and soon after the students had sat a two-hour exam.

57
Think critically about data
• Was the assessment that created this data
assessing exactly what we are looking for?
• Was the assessment set at an appropriate level
for this group of students?
• Was the assessment properly administered?
• Are we comparing data for matched groups?

58
Cautionary tale 1
You want to look at changes in a cohort’s asTTle
writing levels over 12 months.

Was the assessment conducted at the same time both years?
Was it administered under the same conditions?
Has there been high turnover in the cohort?
If so, will it be valid to compare results?

59
Cautionary tale 2
You have data that show two classes have
comparable mathematics ability. But end-of-year
assessments show one class achieved far better
than the other.
What could have caused this?
Were the original data flawed? How did teaching
methods differ? Was the timetable a factor? Did
you survey student views? Are the classes
comparable in terms of attendance, etc?

60
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
> Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

61
Analysing data and other evidence
• Schools need some staff members who are
responsible for leading data analysis
• Schools have access to electronic tools to
process data into graphs and tables
• All teachers do data analysis
• Data are not an end in themselves – they are just one stage along the way to evidence-driven decision making

62
Basic analysis
   2.1 2.2 2.3 2.4* 2.5* 2.6* ABS DET
1 Pamela N A N N N N 20 6
2 Lee A A A N N A 12 0
3 Manu E E E E N E 18 4
4 Keisha N A N N N N 7 8
5 Bron E M M N N A 3 0
6 Deane M M E M N A 2 1
7 Slane N A N N N N 22 8
8 Sam A A N A A A 12 8
9 Sione M M N N N N 2 2
10 Oran A A A A A A 7 0
11 Shirin E E E E A E 6 0
12 Hanna E E M M A M 0 1
13 Val E E E E N E 0 0
14 Liam N A M M N M 10 2
15 Morgan M M M M N M 15 0
16 Hone N A N N N N 17 4
17 Mahi A A N A A A 10 0
63
Basic analysis
• Divide the class into three groups on the basis of
overall achievement
• Identify students who are doing so well at level 2
that they could be working at a higher level
• Find trends for males and females, those who
are absent often, or have many detentions
• Compare this group’s external assessment
success rate with the national cohort.
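
A sketch of the first bullet: total each student’s points across the six standards (N=0, A=1, M=2, E=3, computed from the table shown earlier) and split the ranking into thirds. The three-way split is one arbitrary but common choice.

```python
# Illustrative sketch: divide the class into three broad groups by total
# points over the six standards (N=0, A=1, M=2, E=3).
points = {"Pamela": 1, "Lee": 4, "Manu": 15, "Keisha": 1, "Bron": 8,
          "Deane": 10, "Slane": 1, "Sam": 5, "Sione": 4, "Oran": 6,
          "Shirin": 16, "Hanna": 13, "Val": 15, "Liam": 7, "Morgan": 10,
          "Hone": 1, "Mahi": 5}

ranked = sorted(points, key=points.get, reverse=True)
third = len(ranked) // 3
print("top:   ", ranked[:third])
print("middle:", ranked[third:len(ranked) - third])
print("bottom:", ranked[len(ranked) - third:])
```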

64
Reading levels – terms 1 and 4
[Chart: distribution of reading levels in term 1 and term 4]

65
Making sense of the results
Think about significance and confidence

How significant are any apparent trends?


How much confidence can we have in the
information?

66
Making sense of the results
This table shows that
reading levels overall
were higher in term 4
than in term 1.
Scores improved for most students.
20% of students moved into level 5.
But the median score is still 4A.

Is this information? Can we act on it?

67
Information
Knowledge gained from analysing data and
making meaning from evidence.

Information is knowledge (or understanding) that can inform your decisions.
How certain you will be about this knowledge
depends on a number of factors: where your
data came from, how reliable it was, how
rigorous your analysis was.
So the information you get from analysing data
could be a conclusion, a trend, a possibility.

68
Information
Summative information is useful for reporting
against targets and as general feedback to
teachers.

Formative information is information we can act on – it informs decision-making that can improve learning.

69
Questions to elicit information
• Did the more able students make significant progress,
but not the lower quartile?
• How have the scores of individual students changed?
• How many remain on the same level?
• How much have our teaching approaches contributed to
this result?
• How much of this shift in scores is due to students’ predictable progress? Are there any data that will enable us to compare our students with a national cohort?
• How does this shift compare with previous Year 9
cohorts?

70
Reading levels – terms 1 and 4
[Chart: distribution of reading levels in term 1 and term 4]

71
Words, words, words …
Information can … establish, indicate, confirm,
reinforce, back up, stress, highlight, state, imply,
suggest, hint at, cast doubt on, refute …

• Does this confirm that …?


• What does this suggest?
• What are the implications of …?
• How confident are we about this conclusion?

72
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
> Interpret – What information do we have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

73
Making sense of information
Data becomes information when it is categorised,
analysed, summarised and placed in context.
Information therefore is data endowed with relevance
and purpose.
Information is developed into knowledge when it is used
to make comparisons, assess consequences, establish
connections and engage in dialogue.
Knowledge … can be seen as information that comes
laden with experience, judgment, intuition and values.
Empson (1999) cited in Mason (2003)

74
Interrogate the information
• Is this the sort of result we envisaged? If not,
why?
• How does this information compare with the
results of other research or the experiences of
other schools?
• Are there other variables that could account for
this result?
• Should we set this information alongside other
data or evidence to give us richer information?
• What new questions arise from this information?

75
Interrogate the information
• Does this relate to student achievement - or
does it actually tell us something about our
teaching practices?
• Does this information suggest that the school’s
strategic goals and targets are realistic and
achievable? If not, how should they change, or
should we change?
• Does the information suggest we need to modify
programmes or design different programmes?
• Does the information suggest changes need to
be made to school systems?

76
Interrogate the information
What effect is the new 6-day x 50-min period
structure having on student engagement levels?

77
Interrogate the information
What effect is the new 6-day x 50-min period
structure having on student engagement levels?

Do student views align with staff views?


Do positive effects outweigh negative effects?
Is there justification for reviewing the policy?
Does the information imply changes need to be
made to teaching practices or techniques?
Does the information offer any hint about what
sort of changes might work?
78
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
> Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

79
Professionals making decisions
How do we decide what action to take as a result of the information we get from the analysis?

We use our professional judgment.

80
Professional decision making
We have evidence-based information that we
see as reliable and valid

What do we do about it?

If the information indicates a need for action, we use our collective experience to make a professional decision

81
Professionals making decisions
Have my students not achieved a particular
history standard because they have poor formal
writing skills, rather than poor history
knowledge?

The answer was ‘Yes’ ... so I need to think about how to improve their writing skills. How will I do that?

82
Professionals making decisions
Do any particular groups of year 11 students
attend less regularly than average for the whole
cohort?

The analysis identified two groups – so I need to think about how to deal with irregular attendance for each group.
How will I do that?

83
Professionals making decisions
You asked what factors are related to poor student
performance in formal writing.
The analysis suggested that poor homework habits
have a significant impact on student writing.

You make some professional judgements and decide
• Students who do little homework don’t write enough
• You could take action to improve homework habits - but you’ve tried that before and the success rate is low
• You have more control over other factors – like how much time you give students to write in class

So you conclude – the real need is to get students to write more often
84
Deciding on an action
Information will often suggest a number of
options for action. How do we decide which
action to choose?

We need to consider
• what control we have over the action
• the likely impact of the action
• the resources needed

85
Planning for action
• Is this a major change to policy or processes?
• What other changes are being proposed?
• How soon can you make this change?
• How will you achieve wide buy-in?
• What time and resources will you need?
• Who will co-ordinate and monitor
implementation?

86
Planning for action
• Is this an incremental change? Or are you just
tweaking how you do things?
• How will you fit the change into your regular
work?
• When can you start the intervention?
• Will you need extra resources?
• How will this change affect other things you do?
• How will you monitor implementation?

87
Timing is all
• How long should we run the intervention before
we evaluate it?
• When is the best time of the year to start (and
finish) in terms of measuring changes in student
achievement?
• How much preparation time will we need to get
maximum benefit?

88
Planning for evaluation

We are carrying out this action to see what impact it has on student achievement

We need to decide exactly how we’ll know how successful the intervention has been

To do this we will need good baseline data

89
Planning for evaluation
• What evidence do we need to collect before we
start?
• Do we need to collect evidence along the way,
or just at the end?
• How can we be sure that any assessment at the
end of the process will be comparable with
assessment at the outset?
• How will we monitor any unintended effects?
Don’t forget evidence such as timetables, student
opinions, teacher observations …
90
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
> Evaluate – What was the impact?
Reflect – What will we change?

91
Evaluate the impact of our action

Did the intervention improve the situation that triggered the process?

If the aim was to improve student achievement, did that happen?

92
Evaluate the impact of our action

Was any change in student achievement significant?
What else happened that we didn’t expect?
How do our results compare with other similar studies we can find?

Does the result give us the confidence to make the change permanent?

93
Evaluate the impact of our action
A school created a new year 13 art programme. In
the past students had been offered standard
design and painting programmes, internally and
externally assessed against the full range of
achievement standards. Some students had to
produce two folios for assessment and were
unsure of where to take their art after leaving
school.
The new programme blended drawing, design and
painting concepts and focused on electronic
media. Assessment was against internally
assessed standards only.
94
Evaluate the impact of our action
• Did students complete more assessments?
• Did students gain more national assessment
credits?
• How did student perceptions of workload and
satisfaction compare with teacher perceptions
from the previous year?
• Did students leave school with clearer intentions
about where to go next with their art than the
previous cohort?
• How did teachers and parents feel about the
change?
95
Evaluate the intervention
How well did we design and carry out the
intervention? Would we do anything differently if
we did it again?
Were our results affected by anything that
happened during the intervention period - within
or beyond our control?
Did we ask the right question in the first place?
How useful was our question?
How adequate were our evaluation data?

96
Think about the process
• Did we ask the right question in the first place?
How useful was our question?
• Did we select the right data? Could we have
used other evidence?
• Did the intervention work well? Could we have
done anything differently?
• Did we interpret the data-based information
correctly?
• How adequate were our evaluation data?
• Did the outcome justify the effort we put into it?
97
The evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
> Reflect – What will we change?

98
Future practice

• What aspects of the intervention will we embed in future practice?
• What aspects of the intervention will have the
greatest impact?
• What aspects of the intervention can we
maintain over time?
• What changes can we build into the way we do
things in our school?
• Would there be any side-effects?
99
Future directions
• What professional learning is needed? Who
would most benefit from it?
• Do we have the expertise we need in-house or
do we need external help?
• What other resources do we need?
• What disadvantages could there be?
• When will we evaluate this change again?

100
Consider the Evidence

Terminology

101
Terminology
Terminology used in the
evidence-driven decision making cycle

Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?

102
Trigger
Data, ideas, hunches, etc that set a process in
action.
The trigger is whatever it is that makes you think
there could be an opportunity to improve student
achievement. You can routinely scan available
data looking for inconsistencies, etc. It can be
useful to speculate about possible causes or
effects - and then explore data and other
evidence to see if there are any grounds for the
speculation.
103
Explore
Initial data, ideas or hunches usually need some
preliminary exploration to pinpoint the issue and
suggest good questions to ask.

104
Question
This is the key point: what question/s do you want answered? Questions can raise an issue and/or propose a possible solution.

105
Assemble
Get together all the data and evidence you might
need – some will already exist and some will
have to be generated for the occasion.

106
Analyse
Process sets of data and relate them to other
evidence.
You are looking for trends and results that will
answer your questions (but watch out for
unexpected results that might suggest a new
question).

107
Interpret
Think about the results of the analysis and clarify
the knowledge and insights you think you have
gained.
Interrogate the information. It’s important to look
at the information critically. Was the data valid
and reliable enough to lead you to firm
conclusions? Do the results really mean what
they seem to mean? How sure are you about
the outcome? What aspects of the information
lead to possible action?
108
Intervene
Design and implement a plan of action designed
to change the situation you started with.

Be sure that your actions are manageable and look at the resourcing needed. Consider how you’ll know what has been achieved.

109
Evaluate
Using measures you decided in advance,
assess how successful the intervention has
been.

Has the situation that triggered the process been improved? What else happened that you maybe didn’t expect?

110
Reflect
Think about what has been learned and
discovered – and what practices you will change
as a consequence.

What did we do that worked? Did this process suggest anything that we need to investigate further? What aspects of the intervention can be maintained? What support will we need?

111
Terminology

Other terms used in Consider the Evidence

112
Terminology
Analysis

A detailed examination of data and evidence intended to answer a question or reveal something.

This simplistic definition is intended to point out that data analysis is not just about crunching numbers - it’s about looking at data and other evidence in a purposeful way, applying logic, creativity and critical thinking to see if you can find answers to your questions or reveal a need. For example, you can carry out a statistical analysis of national assessment results in the various strands of English across all classes at the same level. You could compare those results with attendance patterns. But you might also think about those results in relation to more subjective evidence - such as how each teacher rates his/her strengths in teaching the various strands.

113
Terminology
Aggregation

A number of measures made into one.

This is a common and important concept in dealing with data. A single score for a test that contains more than one question is an aggregation - two or more results have been added to get a single result. Aggregation is useful when you have too few data to create a robust measure or you want to gain an overview of a situation. But aggregation can blur distinctions that could be informative. So you will often want to disaggregate some data – to take data apart to see what you can discover from the component parts. For example, a student may do moderately well across a whole subject, but you need to disaggregate the year’s result to see where her weaknesses lie.
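
A toy numeric illustration (the strand marks are invented): the aggregated subject average looks respectable, but disaggregating by strand shows where the problem is.

```python
# Illustrative sketch: an aggregate hides what disaggregation reveals.
strand_marks = {"reading": 78, "writing": 42, "speaking": 74, "listening": 70}

average = sum(strand_marks.values()) / len(strand_marks)
print(f"subject average: {average:.0f}")   # 66 - looks fine in isolation

for strand, mark in sorted(strand_marks.items(), key=lambda kv: kv[1]):
    print(f"{strand:9s} {mark}")           # writing stands out at 42
```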

114
Terminology
Data

Known facts or measurements, probably expressed in some systematic or symbolic way (eg as numbers).

Data are codified evidence. (The word is used as a plural noun in this kit.) The concepts of validity and reliability apply to data. It helps to know where particular data came from; how data were collected and maybe processed before you received them. Some data (eg attendance figures) will come from a known source that you have control of and feel you understand and can rely on. Other data (eg standardised test results) come from a source you might not really understand; they may be subject to manipulation and predetermined criteria or processes (like standards or scaling). Some data (eg personality profiles) may be presented as if they are sourced in an objective way but their reliability might be variable.

115
Terminology
Demographics

Data relating to characteristics of groups within the school’s population. Data that provides a profile of people at your school.

You will have the usual data relating to your students (gender,
ethnicity, etc) and your staff (gender, ethnicity, years of experience,
etc). Some schools collect other data, such as the residential
distribution of students and parental occupations.

116
Terminology
Disaggregation

See aggregation

When you disaggregate data, you take aggregated data apart to see
what you can discover from the component parts. For example, a
student may do moderately well across a whole subject, but you
need to disaggregate the year’s result to see where her weaknesses
lie.

117
Terminology
Evaluation

Any process of reviewing or making a judgement about a process or situation.

In this resource, evaluation is used in two different but related ways. After you have analysed data and taken action to change a situation, you will carry out an evaluation to see how successful you have been - this is summative evaluation. But you are also encouraged to evaluate at every step of the way - when you select data, when you decide on questions, when you consider the results of data analysis, when you decide what actions to take on the basis of the data - this is called formative evaluation.

118
Terminology
Evidence

Any facts, circumstances or perceptions that can be used as an input for an analysis or decision.

For example, the way classes are compiled, how a timetable is structured, how classes are allocated to teachers, student portfolios of work, student opinions. These are not data, because they are not coded as numbers, but they can be factors in shaping teaching and learning and should be taken into account whenever you analyse data and when you decide on action that could improve student achievement.

119
Terminology
Information

Knowledge gained from analysing data and making meaning from evidence.

Information is knowledge (or understanding) that can inform your decisions. How certain you will be about this knowledge depends on a number of factors: where your data came from, how reliable it was, how rigorous your analysis was. So the information you get from analysing data could be a conclusion, a trend, a possibility.

120
Terminology
Inter-subject analysis

A detailed examination of data and evidence gathered from more than one learning area.

Inter-subject analysis can answer questions or reveal trends about students or teaching practices that are common to more than one learning area. For example, analysing the results of students taking mathematics and physics subjects can indicate the extent to which achievements in physics are aided or impeded by the students’ mathematical skills.

121
Terminology
Intervention

Any action that you take to change a situation, generally following an analysis of data and evidence.

This term is useful as it emphasises that to change students’ achievement, you will have to change something about the situation that lies behind achievement or non-achievement. You will take action to interrupt the status quo.

122
Terminology
Intra-subject analysis

A detailed examination of data and other evidence gathered from within a specific learning area.

Intra-subject analysis can answer questions or reveal trends about student achievement or teaching within a subject or learning area. For example, an analysis of assessment results for all students studying a particular subject in a school can reveal areas of strength and weakness in student achievement and/or in teaching practices, etc. Comparison of a school’s results in a subject with results in that subject in other schools is also intra-subject analysis.

123
Terminology
Longitudinal analysis

A detailed examination of data and evidence to reveal trends over time.

Longitudinal analysis in education is generally used to reveal patterns in student achievement, behaviour, etc over a number of years. Results can reveal the relative impact of different learning environments, for example. In this resource, it is suggested that longitudinal analysis can be applied to teaching practice and school processes. For example, the impact of modified teaching practices in a subject over a number of years can be evaluated by analysing the achievements of successive cohorts of students.
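
As a toy illustration (the pass rates are invented), a longitudinal comparison is just the same measure tracked across successive cohorts:

```python
# Illustrative sketch: year-on-year change in year 11 biology pass rates.
pass_rates = {2003: 0.61, 2004: 0.64, 2005: 0.63, 2006: 0.71}

years = sorted(pass_rates)
for prev, curr in zip(years, years[1:]):
    print(f"{prev} -> {curr}: {pass_rates[curr] - pass_rates[prev]:+.0%}")
```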

124
