
STANDARDS AND BENCHMARKS:

THE IMPLEMENTATION OF THE STANDARDS IN


THE SACSA FRAMEWORK AND THE
BENCHMARKS IN THE LITERACY AND
NUMERACY TEST IN PRIMARY SCHOOLS IN
ADELAIDE

by

ROSMAWATI

A thesis submitted in fulfilment of the requirements for the degree of

Master of Education (18 units)

Flinders University

Adelaide, Australia

August, 2006
TABLE OF CONTENTS

LIST OF TABLES......................................................................................................7
LIST OF FIGURES....................................................................................................8
ABSTRACT.................................................................................................................9
DECLARATION.......................................................................................................11
ACKNOWLEDGMENTS........................................................................................12
Chapter 1...................................................................................................................13
Introduction...............................................................................................................13
1.1 Problem ............................................................................................................13
1.2 Background.......................................................................................................13
1.3 Curriculum Development in Australia..............................................................14
1.3.1 Factors that influenced curriculum change................................................14
1.3.2 School-Based Curriculum Development in Australia................................18
1.3.3 Teacher autonomy and curriculum development.......................................20
1.3.4 A changing perspective in South Australia................................................22
1.3.5 Curriculum Statements and Profiles..........................................................22
1.3.6 Basic Skills Tests.......................................................................................25
1.3.7 The Literacy and Numeracy Test...............................................................26
1.4 Purpose of the study..........................................................................................26
1.5 Research questions............................................................................................27
1.6 Objectives of the study.....................................................................................27
1.7 Significance of the study..................................................................................27
Chapter 2...................................................................................................................28
Curriculum................................................................................................................28
2.1 Introduction ......................................................................................................28
2.2 What is curriculum?..........................................................................................28
2.3 The Standards and the Art of Teaching.............................................................30
2.4 The model of outcomes and table of specifications..........................................31
2.5 Objectives by national curriculum groups........................................................32
2.6 The taxonomy of educational objectives..........................................................32
2.7 Teaching methods and evaluation.....................................................................33
2.7.1 Classroom observation...............................................................................33
2.7.2 Grading or marking....................................................................................33
2.7.3 Spelling and writing assessment................................................................34
2.7.4 Teacher-made tests or School testing programs and Standardized tests ...35
2.8 Conclusion .......................................................................................................36
Chapter 3...................................................................................................................37
Standards, Benchmarks and SACSA Framework.................................................37
3.1 Introduction ......................................................................................................37
3.2 Definitions and Findings of Benchmarks.........................................................37
3.3 SACSA Framework..........................................................................................38
3.3.1 What is the SACSA Framework?..............................................................38
3.3.2 What theory of learning underpinned the curriculum framework?...........38
3.3.3 What are the key elements of the curriculum framework?........................39
3.3.4 Why did South Australian schools need the SACSA Framework?............42
3.3.5 What is different about the SACSA Framework?......................................44
3.4 The Outcomes Statements................................................................................45
3.5 Conclusion .......................................................................................................46
Chapter 4...................................................................................................................47
Core Principles in Learning.....................................................................................47
4.1 Introduction .....................................................................................................47
4.2 Piaget’s theory about development and learning..............................................47
4.3 Vygotsky’s concepts of proximal development................................................49
4.4 The implication of Piaget’s and Vygotsky’s theories.......................................50
4.4.1 The contributions of Piaget’s and Vygotsky’s concepts for teaching ........50
4.4.2 Assumption ...............................................................................................51
4.4.3 Shayer and Adey’s (2002) use of theories of both Piaget and Vygotsky in
designing their cognitive acceleration program ......................................51
4.4.4 Training for teachers..................................................................................52
4.4.5 Research findings associated with AC interventions.................................52
4.5 John Dewey’s View of knowledge....................................................................54
4.6 Conclusion .......................................................................................................54
Chapter 5...................................................................................................................56
Methods of Inquiry...................................................................................................56
5.1 Introduction ......................................................................................................56
5.2 Research methods.............................................................................................56
5.2.1 Data collection...........................................................................................58
5.2.2 Data analysis..............................................................................................59
5.2.3 The sample and the rationale for the selection...........................................60
5.2.4 Procedure ..................................................................................................60
5.3 Conclusion .......................................................................................................61
Chapter 6...................................................................................................................63
Findings and Discussions.........................................................................................63
6.1 Introduction ......................................................................................................63
6.2 How do the primary school teachers in Adelaide react to the specification of
standards in the SACSA Framework and the Benchmarks in the Literacy and
Numeracy Testing Program?...........................................................................63
6.2.1 Question 1: What does your school understand by the terms of Standards
in SACSA and the Benchmarks in the Literacy and Numeracy Test?......63
6.2.2 Question 2: How are the Standards in the SACSA framework and the
Benchmarks in the Literacy and Numeracy Test currently used in your
school by the principal and the teachers? Is the use of Standards and the
Benchmarks influenced by the context in which your school is placed?. 70
6.2.3 Question 3: What is your personal opinion about the Standards in SACSA
and Benchmarks in the Literacy and Numeracy Test?.............................77
6.2.4 Question 5: Do you think the Standards and the Benchmarks that are
associated with outcome statements are appropriate for Year 3, or Year 5,
or Year 7 students?....................................................................................84
6.2.5 Question 6: Are there any conflicting ideas and practices on trying to
achieve both the Standards and the Benchmarks?....................................92
6.2.6 Question 7: Are there any ways in which you think the standards and the
Benchmarks should be changed or modified?..........................................96
6.3 What teaching methods do the teachers use in applying the standards in the
SACSA Framework and the Benchmarks in Literacy and Numeracy Testing
Program?.......................................................................................................102
6.3.1 Question 4: What teaching methods do you use for your students to
achieve the Standards in the SACSA Framework and the Benchmarks in
the Literacy and Numeracy Test?...........................................................102
6.3.2 Question 8: How do you report to parents information about their children
achieving the Standards and the Benchmarks? Do you draw attention to
the context in which your school is set in discussion with parents?.......110
6.3.3 Question 9: What way do you provide feedback to the children about their
achieving the Standards and the Benchmarks?.......................................115
6.3.4 Question 10: How do you report to the Year 4/ Year 6/ Year 8 teachers and
secondary school teachers about the students’ performance on the
Standards and Benchmarks at Year 3/ Year 5/ Year 7, and do they use the
information you provide?.......................................................................120
6.4 Do the teachers from different schools have different teaching methods in
implementing the standards in SACSA and the Benchmarks in the Literacy
and Numeracy Testing Program?..................................................................125
6.4.1 Teaching methods.....................................................................125
6.4.2 The ways teachers reported to parents.....................................................126
6.4.3 The ways teachers provided feedback to students...................................127
6.4.4 The ways of providing information to the teachers in Year 4/6 and to the
secondary school teachers......................................................................128
Chapter 7.................................................................................................................130
Conclusion and Implication...................................................................................130
7.1 Introduction ....................................................................................................130
7.2 Conclusion......................................................................................................130
7.2.1 The primary school teachers’ reaction to the specification of Standards in
SACSA and the Benchmarks in the Literacy and Numeracy Testing
Program .................................................................................................130
7.2.2 The teaching methods used in applying the Standards in the SACSA
Framework and the Benchmarks in the Literacy and Numeracy Testing
Program..................................................................................................134
7.2.3 Whether the teachers from different schools have different teaching
methods in implementing the standards in SACSA and the Benchmarks in
the Literacy and Numeracy Testing Program.........................................136
7.3 Implications for theory...................................................................................137
7.4 Implications for practice and policy...............................................................138
7.5 Implications for further research.....................................................................138
7.6 Concluding comment......................................................................................138
REFERENCES.......................................................................................................141
APPENDIX 1...........................................................................................................147
APPENDIX 2...........................................................................................................148
APPENDIX 3...........................................................................................................149
APPENDIX 4...........................................................................................................150
APPENDIX 5...........................................................................................................163
APPENDIX 6...........................................................................................................164

LIST OF TABLES
Table 6.1 Definitions of the Standards in the SACSA Framework......................66
Table 6.2 Definitions of the Benchmarks in the LAN Test....................................69
Table 6.3 Implementation of the Standards in the SACSA Framework..............72
Table 6.4 Implementation of Benchmarks in the LAN Test..................................75
Table 6.5 The implementation of both the Standards and the Benchmarks in
terms of school context.............................................................................................77
Table 6.6 Personal opinion about the Standards in the SACSA Framework......80
Table 6.7 Personal opinion about Benchmarks......................................................84
Table 6.8 The appropriateness of the Standards in Year 3....................................85
Table 6.9 The appropriateness of the Standards in Year 5....................................86
Table 6.10 The appropriateness of the Standards in Year 7..................................88
Table 6.11 The appropriateness of the Benchmarks in Year 3..............................89
Table 6.12 The appropriateness of the Benchmarks in Year 5..............................90
Table 6.13 The appropriateness of the Benchmarks in Year 7..............................91
Table 6.14 Some of the Standards and the Benchmarks were appropriate.........92
Table 6.15 Conflicting ideas in Standards..............................................................93
Table 6.16 Conflicting ideas in Benchmarks..........................................................96
Table 6.17 Suggestions for changing or modifying the Standards in the SACSA
Framework................................................................................................................97
Table 6.18 Suggestions for changing or modifying the Benchmarks in the LAN
Test...........................................................................................................................100
Table 6.19 Teaching methods used in achieving the standards in the SACSA
Framework..............................................................................................................104
Table 6.20 Teaching methods used in achieving the Benchmarks......................107
Table 6.21 The assessment used in the Standards in the SACSA Framework..109
Table 6.22 The ways of reporting to parents about children’s achieving on the
Standards in the SACSA Framework...................................................................112
Table 6.23 The ways of reporting to parents about children’s achieving on the
Benchmarks in LAN Test.......................................................................................114
Table 6.24 Whether the teachers drew attention to the school context..............115
Table 6.25 The ways to provide feedback to children regarding the Standards in
SACSA.....................................................................................................................118
Table 6.26 The ways to provide feedback to children regarding the Benchmarks
..................................................................................................................................120
Table 6.27 How to provide information for the next teachers or next school
teachers....................................................................................................................122
Table 6.28 The usefulness of the information provided for the next teachers or
next school...............................................................................................124
Table 6.29 Comparison between the two schools associated with assessment
methods used in achieving the Standards or the Benchmarks...........................125
Table 6.30 Comparison between the two schools associated with teaching
methods used in achieving the Standards.............................................................125
Table 6.31 Comparison between the two schools associated with teaching
methods used in achieving the Benchmarks.........................................................126
Table 6.32 Comparison between the two schools associated with the ways
teachers reported to parents in terms of the Standards in SACSA...................126
Table 6.33 Comparison between the two schools associated with how the
teachers reported to parents in terms of the Benchmarks in the LAN Test......127
Table 6.34 Comparison between the two schools associated with how the
teachers reported to parents in terms of the school context...............................127
Table 6.35 Comparison between the two schools associated with how the
teachers provided feedback to students in terms of Standards in the SACSA
Framework..............................................................................................................127
Table 6.36 Comparison between the schools associated with the ways teachers
provided feedback to students in terms of the Benchmarks in the LAN Test...128
Table 6.37 How to provide the information..........................................................128
Table 6.38 Comparison between the two schools associated with the ways
teachers provided feedback to students in terms of the usefulness of the
information provided..............................................................................................129

LIST OF FIGURES
Figure 1.1 A map of Curriculum Changes in Australia in the 1960s and 1870s,
modified from Sturman, 1989..................................................................................18

ABSTRACT
This thesis explores how the primary school teachers in Adelaide implement the
Standards in the SACSA Framework and the Benchmarks in the LAN Test. This
exploratory study seeks to answer the research questions: how do the primary school
teachers in Adelaide react to the specification of standards in the SACSA Framework
and the Benchmarks in the Literacy and Numeracy Testing Program?; what teaching
methods do the teachers use in applying the standards in the SACSA Framework and
the Benchmarks in the Literacy and Numeracy Testing Program?; do the teachers
from different schools have different teaching methods in implementing the
standards in the SACSA Framework and the Benchmarks in the Literacy and
Numeracy Testing Program?
The research investigation took the form of a qualitative exploratory study. The
primary method of data collection was structured interviews. The interviews
involved 11 teachers including a school principal and deputy principals, two Year 7
teachers, two Year 5 teachers, and four Year 3 teachers.
It was found that the teachers from School A and School B understood the terms ‘the
Standards in the SACSA Framework and the Benchmarks in the LAN Test’ very
well. In terms of reporting systems associated with the Standards, there was no
uniformity among the teachers. Furthermore, there were different reactions among
the teachers towards the Benchmarks. Whereas some teachers supported the
Benchmarks, some others refused them. Associated with the school context, there
were two different views among the teachers. Some teachers believed that the school
context was important to consider in implementing both the Standards and the
Benchmarks.
Furthermore, it was found that the Standards in the SACSA Framework were useful.
However, with respect to the Benchmarks, some teachers agreed while others disagreed about their usefulness. In terms of the appropriateness of the Standards in the SACSA Framework, there were different views of the Standards in Years 3 and 5: some teachers considered that the Standards were appropriate, while others stated that they were inappropriate. In Year 7, however, all teachers shared the view that the Standards were appropriate. Moreover, most of the teachers believed that there were conflicting ideas and practices in trying to achieve both the Standards and the Benchmarks, while some others claimed that there were none.

There were no particular teaching methods used in trying to achieve the Benchmarks, since they were merely tests. Furthermore, the teachers frequently conducted testing activities, yet the tests were not about the Standards in the SACSA Framework. In reporting to parents about students' attainment of the Standards in the SACSA Framework, the teachers used both informal and formal, or verbal and written, approaches. In addition, some teachers provided feedback about students' attainment of the Standards, but others did not unless it was specifically requested. Finally, the teachers provided information to the Year 4 and Year 6 teachers and to the secondary school teachers about students' performance on the Standards and the Benchmarks, either formally or informally. Yet, in regard to the usefulness of this information, they provided very different responses.

DECLARATION

I certify that this thesis does not incorporate without acknowledgement any material previously submitted for a degree or diploma in any university; and that to the best of my knowledge and belief it does not contain any material previously published or written by another person except where due reference is made in the text.

___________________________________

Rosmawati

I believe that this thesis is properly presented, conforms to the specifications of thesis presentation and is of sufficient standard to be, prima facie, worthy of examination.

___________________________________

Prof. John P. Keeves

ACKNOWLEDGMENTS

This thesis is dedicated to Prof. John P. Keeves, my supervisor. Without him, I would never have completed this thesis. At one time I nearly gave up, but his support and encouragement kept me going. His words 'Don't give up, keep going' will always stay in my heart. These words mattered most when I started collecting data: I found collecting my data hard and was close to giving up in frustration, but when he simply said, 'Don't give up, keep going', I suddenly felt stronger and more optimistic. My special thanks go to him.

My thanks also go to Pawel Skuza, who helped me edit my thesis. I also thank my friends Siraj, Nordin and Dianti, who always gave me support and encouragement while this thesis was in progress, and all my other friends who have helped me during my study at Flinders University.

My thanks go to AusAID for providing me with a scholarship to undertake postgraduate study at the School of Education, Flinders University of South Australia. Studying in Australia is one of the most wonderful experiences that I have had in my life.

My sincere thanks and appreciation also go to the teachers who made time in their busy schedules to be interviewed for this study.

My huge thanks and appreciation go to my husband, Zainal Abidin Harun who has
supported and encouraged me at all times to complete this thesis. My thanks also go
to my daughter, Muti’a Mustatira and my son, Naufal Ashley for their patience while
I was doing my thesis.

Chapter 1
Introduction

1.1 Problem

Many countries appear to design their curricula with the main objective of equipping students with competencies or skills, rather than with the acquisition of knowledge alone. They also believe that placing students at the centre of learning is very important. For example, South Australia is implementing the South Australian Curriculum, Standards and Accountability (SACSA) Framework, and Indonesia is implementing a Competency-Based Curriculum (CBC). The SACSA Framework began to be implemented in 2001. Similarly, the CBC started to be applied in 2001, but only in a few schools. Since 2004, this curriculum has been implemented in all the schools of Indonesia, from vocational to general schools, and from elementary to high schools. However, many teachers still lack confidence and are confused about using this new curriculum.
Since the Indonesian curriculum is very similar to the Australian curriculum in some
respects, it is of interest to conduct a study into the implementation of Standards in
the SACSA Framework and the Benchmarks in the Literacy and Numeracy Test in
the Primary Schools of Adelaide. The results of this study can provide meaningful
information for Indonesian teachers who are implementing a new curriculum that is
very similar to the SACSA Framework.

1.2 Background

Many Indonesian teachers still lack confidence and are confused in applying the new curriculum (CBC). The teachers' confusion has been caused by the limited books and resources available, insufficient training or in-service education for teachers, and a lack of facilities. This situation may be related to Keeves' comments on curriculum change. Keeves (1982, p. 27) argued that the task of developing a coherent curriculum across an increasing range of curricular fields, which was placed on the teachers within each school without greatly increased resources and without special training in the tasks involved, proved to be a very considerable burden.

By exploring how the primary school teachers in South Australia are implementing the Standards in SACSA and the Benchmarks in the Literacy and Numeracy Test, it can be expected that the findings will provide useful information for Indonesian teachers, especially primary school teachers, so that they can obtain a clearer picture or gain new ideas about the implementation of their new curriculum by making comparisons between the two programs.
Furthermore, in Australia itself, when the National Statements and Profiles for Australian Schools (DECS, 1994) were developed, there was marked controversy (Keeves, 1999, p. 114). The national statements specify the curriculum and the profiles identify the achieved curriculum, while the implementation of the curriculum is now left to the teachers and is not prescribed in the statements and profiles (Keeves, 1999, p. 115).
Similarly, the Indonesian curriculum (CBC) does not prescribe the implementation of
the curriculum. Consequently, the teachers are left to design activities for the
classroom. This has led many teachers to feel a lack of confidence and become very
confused in implementing this curriculum.

1.3 Curriculum Development in Australia

1.3.1 Factors that influenced curriculum change

What factors have influenced curriculum change? Changes to a country's curriculum may result from several factors, such as the political situation, the demands of society, the growth of knowledge, and the development of technology (cf. Marsh & Willis, 2003; Sturman, 1989). In general, Marsh (1992, p. 211) pointed to seven different sources of curriculum reform, including (a) teachers, (b) teacher unions, (c) politicians, (d) the media, and (e) pressure groups. In addition to these sources of curriculum change, Keeves (1999, p. 113) has highlighted five sources of curriculum change in Australia. First, with the expansion of upper secondary education there has been the demand by some university academics for the maintenance of standards as well as their fight to maintain territory. Second, there have been the consequences of the politicization of education over this period. Third, teachers have made claims for greater professional freedom. Fourth, with the
increased mobility between the states and territories of substantial numbers of
students, the need for a common curriculum across Australia has been clearly seen.
But more significant than these four sources, according to Keeves, has been the growth of knowledge and the findings of research, which have required successive revisions of the school curriculum. Similarly, Morris and Howson (1972, p. 14)
claimed that the outcomes of educational research were the extrinsic factors that
would influence further development of the curriculum.
Moreover, Keeves (1999, p. 113) stated that controversy and tensions, together with the perspectives and findings of educational research conducted both in Australia and overseas, have increased in the area of the school curriculum as well as over the direction of development in education (cf. Kemmis and McTaggart, 1993). In addition, Harman (1999, p. 35) argued that the politics of education in the 1970s and the early 1980s had triggered the development of an Australian curriculum, citing, for example, the Commonwealth's role in education, the controversy over state aid for non-government schools, the legal and constitutional aspects of educational management, and the work of government education agencies such as the Australian Universities Commission, the Schools Commission, and the Commonwealth Curriculum Corporation.
Furthermore, Keeves (1999, p. 114) pointed out several key characteristics of the statements and profiles that were associated with the claims made for both innovation and development: (a) the demands to teach new content and new skills; (b) the emphasis on assessment and student achievement; (c) the requirement that curriculum content and learning outcomes be in a matching developmental sequence; (d) benchmarks or standards of performance that students should attain as their learning progressed, and the scales of development associated with the eight domains of learning and the different strands of learning within those domains; and (e) the provision of a framework that systems, schools, and textbook authors could use to construct more detailed curricula.
Moreover, as has been mentioned previously, significant changes occurred in Australian education in the 1960s and 1970s (Education Department of South Australia, 1977; Morgan, 1978; Sturman, 1989). Furthermore, Sturman (1989, p. 61) argued that certain macro-factors had impacted on all systems of education in Australia. These factors, according to him, were related both to the purposes of education and to its administration. Additionally, Keeves (1999, p. 113) drew attention to several important points relating to curriculum change in Australia in the 1960s and 1970s.

Prior to the 1960s
Keeves (1999, p. 115) explained that in this period, each state had built a system of
free and compulsory secular schooling with ordered progression by students through
an age and grade sequence. During the Second World War the Commonwealth
Government had taken over the power of taxation, but provision for school systems
remained the responsibility of each of the states. The curricula of the schools were
laid down by the central administration in each of the state school systems.
Moreover, the state Education Departments employed teams of inspectors to monitor
the work of the schools, to ensure that prescribed curricula were being followed
(Harman, 1999, p. 36).
Another important change that occurred in the 1960s was catering for individual differences in the classroom (Keeves, 1999, p. 117). Additionally, Bessant (1986) and Fitzgerald (1970) (cited in Sturman, 1989, p. 61) commented that by the mid-1960s it was widely believed that many students entering secondary schools were considered unsuited to the existing curriculum, which was geared towards participation in higher education and passing through a public examination system.

The 1970s
Two important moves occurred in this period. The first was the memorandum 'Freedom and Authority in the Schools', issued in South Australia. Keeves (1999, p. 117) wrote that in 1970 the then Director-General of Education in South Australia issued this memorandum to schools in that state, and that it had a significant impact on the curricula and teaching methods of schools, not only in South Australia but also across Australia through the programs of the Australian Schools Commission. Related to this, the Education Department of South Australia (1977, p. 3) commented that teachers at that time had a large measure of freedom in choosing the outcomes of education that they wished to teach. This meant that teachers were free to choose the subject matter they included in their science courses, the teaching methods they adopted, and the outcomes that they sought to achieve.
The second move was towards School-Based Curriculum Development, and the Curriculum Development Centre was proposed in 1973 as a new Commonwealth Government agency (Brady, 1992; Keeves, 1999). Through this approach the individual schools and their teachers, students, administrators and parents undertook the development of a curriculum that was determined by the staff and greatly influenced by the resources of the school (Keeves, 1999, p. 118).
Furthermore, Harman (1999, pp. 31-48) examined the politics of education in
Australia in the 1960s and in the 1970s. Harman (1999, p. 36) claimed that the
politics of education in Australia had developed considerably as a sub-field within
education. He further argued that a significant amount of scholarly writing had been
generated and generally there was a much greater understanding of the influence of
political factors on education systems, institutions and policy than there was in the
late 1960s. Long gone was the day when many educators believed that education was
outside of politics (Harman, 1999, p. 36). Some researchers (e.g. Harman 1999,
p. 38) claimed that it was essential to conduct studies concerning the effects of
political decisions and policy changes on what went on in schools and post-school
institutions. Overall, few Australian scholars had attempted to contribute new ideas
for conceptualizing the link between education and politics in societies, and for
mapping the various components of the fields of the politics of education (Harman,
1999, p. 39).
In addition, Sturman (1989, p. 61) argued that in Australia, there was an acceptance
of the need to create alternative curricula for less academically inclined students, and
a corresponding assertion that school-based curriculum decision making might lead
to the creation of more relevant programs.
Moreover, the emergence of the Federal Government as an influence on education
had had an important effect on policies related to the administration of education
(Sturman, 1989). Sturman (1989, pp. 61-68) discussed the curriculum changes that occurred in Australia from the 1960s to the 1970s. Those changes are summarized in the map presented in Figure 1.1.

[Figure 1.1 is a diagram headed 'The control of curriculum decision making: pressures for change'. It maps the 1960s (foundations for change: structural pressures, attitudes and values), the 1970s (process of change: the Commonwealth Schools Commission, the Curriculum Development Centre (CDC) and other national influences) and the 1980s (a time for accountability).]

Figure 1.1 A map of Curriculum Changes in Australia in the 1960s and 1970s,
modified from Sturman, 1989
1.3.2 School-Based Curriculum Development in Australia

School-Based Curriculum Development (SBCD) was a popular approach to the development of curricula in many Western countries in the 1970s (Print, 1987; Marsh, 1992). Print (1987, p. 13) defined SBCD as the development of a curriculum, or an aspect of it, by one or more teachers in a school to meet the perceived needs of a school population; this involved an on-site resolution, in curricular terms, of problems experienced with the existing curricula.
During the 1970s and 1980s, teachers were given more responsibility for curriculum
development, and were even allowed to develop their own school curriculum within
centrally determined guidelines. This trend became known as ‘School-Based
Curriculum Development’ (Print, 1987; Marsh, 1992; Keeves, 1999).
Statements setting the context for SBCD were made as early as 1968 in South
Australia and Tasmania. The report, The School in Society (Tasmanian Department
of Education, 1968) recommended that: (a) teachers be involved in curriculum
development; (b) programs be interpreted and organized in the schools; and (c)
opportunities be given within the centralized framework for “the exercise of
autonomy and individual initiative by members of the service” (Brady, 1992, p. 5).
Moreover, in 1990, Educating for the 21st Century was released as the charter for public schooling in South Australia. Schools developed curricula to ensure that the essential skills and understandings and the seven areas of study listed in the charter were experienced by all students from Reception to Year 10 (Brady, 1992, p. 11).

What are the concepts of SBCD?

Campbell (1985, p. 33) summarized Eggleston’s (1980) definition of SBCD into four
features.
(1) It is particularistic. This means the curriculum development activity is focused
upon the diagnosed and perceived needs of the specific school or part of it. (2) It is process-oriented. Whatever strategies or curricula are intended, the process by which these are developed is important in itself. (3) It is participatory. This means
that the appropriate style for developing the curriculum is co-operative, with staff
working together to produce plans for change. (4) It is preliminary. This means the
curriculum developed is to be seen as experimental, in the sense that it is open to
evaluation and appraisal after its implementation.
In addition, Skilbeck (1976, cited in Campbell, 1985, p. 33) identified three models
of school-based curriculum development: (a) the rational-deductive, (b) the rational-
interactive and (c) the intuitive.
Based on these themes and models, Campbell (1985, p. 34) argued that school-based
curriculum development was predicated upon the concept, admittedly idealized, of
teachers who creatively reconstructed the curriculum within a recognized framework
of local and national expectations. It was not predicated upon passive acceptance of
external definitions of the curriculum, or the myth of ‘autonomous schools’, existing
independently of their political and economic context. Thus, school-based curriculum
development was both framework adaptive and role extensive. Role extensiveness assumed acceptance by teachers of a wider role than one restricted to classroom performance, and it consequently assumed change, or flexibility, in their existing roles (Campbell, 1985, p. 34).

SBCD in Australia and its characteristics

Print (1987) and Brady (1992) have described the implementation of SBCD in
Australia. They highlighted that a more inclusive interpretation of SBCD in Australia
was that provided by the CDC (1977). The characteristics of SBCD were as follows.
(1) It involved teacher participation in decision making relating to curriculum
development and implementation. (2) It related to only part of a school rather than involving the whole school. (3) It should be selective or adaptive rather than
creative. (4) It involved a shift towards the school in the responsibility for
curriculum decision making rather than a complete severance of the school’s link
with the centre. (5) It was a continuing and dynamic process which ideally involved
teachers, students and community. (6) It involved the need for various support
structures. (7) It required a change in the traditional role of the teacher. Lastly, Brady (1992, p. 25) concluded that SBCD in Australia accommodated different degrees of curriculum participation, depending on the school itself.
The characteristics of SBCD described by Brady (1992) were essentially similar to
those described by Skilbeck (cited in Marsh, 1992, p. 128), such as: (a) the decision
making involved teachers; (b) it involved various support structures; and (c) it was
internal and organic to the institution, since the decisions depended on the school itself (cf. Brady, 1992, p. 24).
On the other hand, Whitton, Sinclair, Barker, Nanlohy, and Nosworthy (2004, p. 74)
stated that SBCD operated at the institutional level. It was the process of developing
locally relevant and practical plans for teachers and students. School based curricula
could be more sensitive to the cultural and educational needs of particular groups of
children.

1.3.3 Teacher autonomy and curriculum development

In Britain, an awareness among teachers of the importance of autonomy in schools began in 1944 (Campbell, 1985, p. 13). It was said that one reason for this was that the central and local authorities exercised relatively weak control over what was taught in primary schools, and teachers began to realize that the potential for developing the curriculum in their schools was restricted only by resources and by the talents, commitment and energies of individual teachers. Related to this, Kogan (1980, cited in Campbell, 1985, p. 13) noted that after 1945 the convention that schools created their own curriculum became part of the so-called 'established wisdom' of British education. It was announced to be the right way of doing things.
In South Australia, since 1977 the Department of Education has given its strong support to teachers who wanted to establish their own curricula (Brady, 1992, p. 5). Similar situations developed in other states in Australia. For example, the Radford Committee (Queensland Department of Education, 1970, cited in Brady, 1992, p. 6), appointed to review the public examination system in Queensland, recommended far greater involvement of teachers in curriculum development and decision making. Further, this committee claimed that teachers would have responsibilities for the assessment of achievement and for curriculum development that were broader and deeper than they had been expected to shoulder in the past (Brady, 1992, p. 6).
Clearly, it was necessary for teachers or curriculum designers to consider these factors when they set up a curriculum. The reasons were, first, that a philosophy of education was needed as a foundation for setting up new curricula and, second, that to make teaching meaningful for students it was important to consider their needs; in other words, the courses should be associated with the students' needs and context.
The final report of the Committee of Inquiry into Education in South Australia (1982,
p. 40) stated that curriculum development and planning could take place at three
distinct levels within an education system. These were: (a) the central or systematic
level, (b) the individual school level and (c) the level of the individual teacher within
the school. This Committee pointed out that in South Australian Government
Schools, curriculum development had, in the past, taken place at the central level and
there was bureaucratic control to ensure the adoption of the curricula in the schools
and classrooms. Furthermore, the Committee reported that by the early 1980s in
some parts of South Australia it had become a relatively widespread practice for
teachers to teach courses that they had developed themselves independently of other
teachers in the school.
There were also four broad forms in which the products of curriculum development
might be prepared for use in a school. First, there were statements of the broad
general aims of education. The second form in which the products of curriculum
development could be distributed was that of a detailed statement of curriculum
objectives and syllabus guidelines. The third form was a detailed statement of the
organization and content of the course to be taught, sometimes with
recommendations on the methods to be employed in teaching certain aspects of the
course. Finally, there were student texts and workbooks with accompanying teacher’s
guides that covered in detail the content and material to be taught (the Committee of
Inquiry into Education in South Australia, 1982, p. 41).
Accordingly, the Committee recommended that: (a) the curricula of each school
should be reviewed on a regular basis; (b) during a review of the curriculum, records
of the school should be examined and, where appropriate, approval recommended for the continued use of the curriculum program of the school to the Regional Director
who would authorize approval on behalf of the Director-General of Education; (c)
where the school chose to replace a recommended curriculum with a course
developed within the school, such a course would be granted interim approval by the
School Council and Principal of the school and copies of all documents would be
lodged with the Regional Director; and (d) at the time of a review of the school the
Review Panel would examine and would recommend approval of acceptable
curricula to the Regional Director who would authorize approval on behalf of the
Director-General of Education.

1.3.4 A changing perspective in South Australia

In the early 1980s the State Development Council of South Australia released a
discussion paper concerned with the planning of a strategy for the future of the State.
The report envisaged that only those countries that actively pursued and adopted new
technology would have a competitive advantage on international markets (the
Committee of Inquiry into Education in South Australia, 1982, p. 62).
The Committee of Inquiry into Technological Change in Australia (1982, p. 63) had
identified three general effects of technological change. First, there would be new
employment opportunities created by the introduction of technological change that
would require new skills and more flexible approaches to management in industry
and commerce. Second, there would be indirect economic and employment effects
that would result from improved efficiency in the production of goods and services,
together with the potential for higher profits and increased wages for employees.
Third, there would be important social effects as some employees were displaced and
faced unemployment and retraining, as well as, for many, a significant reduction in
hours of work.

1.3.5 Curriculum Statements and Profiles

What were the Statements and Profiles?

South Australia was the first school system to introduce the idea of ‘attainment
levels’. Other states and territories became interested and work began in 1990 on
national statements and profiles. The national profiles in each of the eight areas of
study had a high degree of similarity to the South Australian attainment levels, such
as: (1) both the national profiles and the attainment levels were standards-referenced; (2) they valued teacher judgment; (3) they provided a tool for teachers to describe student achievement; (4) they provided a common reporting framework; and (5) they assisted teachers to make judgments about student achievement (Education Department of SA b, 1993, p. 1).
Statements and profiles were one of the resources used in South Australia to support,
guide, and monitor the achievement of students in the compulsory years of schooling
(DECS, 1994, p. 4). These statements and profiles gave teachers, parents and
students: (a) an agreed progression of curriculum content and outcomes that
facilitated programming and planning; (b) explicit statements about the valued
curriculum and curriculum outcomes; (c) common ways of describing student
achievement; and (d) a common agreed framework to support assessment and
reporting of achievement (DECS, 1994, p. 4). Additionally, a significant part of this
initiative involved teachers and schools in reviewing and reforming their assessment,
recording and reporting practices (DECS, 1995, p. 3). Moreover, responsibility for
the effective implementation of statements and profiles fell mainly on teachers and
schools (Education Department of SA c, 1993, p. 1).

Curriculum statements

Each statement in the Statements and Profiles expanded the common and agreed national goals of schooling, described the nature and scope of each area of study, and outlined the content of the curriculum across four bands of schooling from lower primary to upper secondary (DECS, 1994, p. 13). Statements would be used by curriculum developers at the system and classroom level to provide a common framework within which all curricula could be developed at the school and system level. They were not syllabuses, nor were they courses of study (DECS, 1994; Education Department of SA a, 1993).

Profiles

The Profiles provided a map of typical student achievement from Years 1 to 10 in the
eight prescribed areas of study. Each provided a common language for reporting
achievement in relation to a set of outcomes of student achievement. Levels of
achievement were independent of levels of schooling. It was expected that students
would achieve across a range of achievement levels at any year level (DECS, 1994,
p. 14).

When, and how were the Statements and Profiles developed?

DECS (1994, p. 16) provided useful information about when and how the Statements and Profiles were developed. In 1986, the States and Territories Directors-General of Education were concerned about inefficiencies, quality and the increasing costs of curriculum activity and consequently sought to investigate ways of working cooperatively. In 1988, Mr Dawkins, the Commonwealth Minister of Education, released a statement, 'Strengthening Australian Schools', which called on the States and Territories to work together cooperatively. In 1989, the Australian Education
Council (AEC) endorsed common and agreed national goals for schooling. In April
1990, the AEC advocated the development of student profiles. In December 1990,
the AEC approved the Australian Curriculum Assessment Project (ACAP) proposal
to develop profiles in mathematics and English. In 1991, the AEC launched projects
in all eight curriculum areas (The Arts, English, Health and Physical Education,
languages other than English, Mathematics, Science, Studies of Society and
Environment, Technology). In July 1993 the South Australian Minister of Education,
Employment and Training issued a statement supporting the implementation of
Statements and Profiles in South Australia (SA). In October 1993, the SA Director-
General of Education issued a circular endorsing the Management Plan for the Monitoring of Student Achievement and the Implementation of the Statements and Profiles, commencing in February 1994 (DECS, 1994, p. 16).

Why were the Statements and Profiles developed?

The fundamental reason for developing the Statements and Profiles was to improve students' learning outcomes (DECS, 1994, p. 1). This framework supported those
working to address issues of educational disadvantage by: (a) providing appropriate
teaching and learning methodologies for students by groups, and as individuals; (b)
providing student outcomes that supported understanding about the unacceptability
of sexual and racist harassment, about decision making processes, and about the
contributions of diverse cultures; (c) encouraging teachers to devise new ways for
students to demonstrate learning specifically to address the needs of different groups
of students; (d) providing for students in the same class to progress at different rates
and in different sequences from each other (DECS, 1994, p. 4).

The introduction of Monitoring of Student Achievement in South Australia.

By 1996, schools were expected to be reporting on student achievement using the profiles. Planning had begun three years earlier, and trialling of the South Australian attainment levels began in primary schools in 1992. Some schools trialled the reporting of student achievement in 1993 (DECS, 1994, p. 21).
To sum up, the Attainment Levels and the Curriculum Profiles were parallel
documents; and the debate about the comparative use of the two sets of documents
had been a challenging and revitalizing aspect of curriculum change and
implementation in South Australia (Education Department of SA d, 1993, p. 4).

1.3.6 Basic Skills Tests

The Basic Skills Tests were first introduced in New South Wales in 1989; under this program all Year 3 and Year 6 students were to be tested annually in five aspects of literacy and numeracy (Education Department of SA a, 1993, p. 3). In addition, at the end of 1993 the South Australian Government changed. The Liberal (conservative) Government took office with a very public commitment to the introduction of basic skills testing at two year levels in primary school (Education Department of SA d, 1993, p. 3).
The Basic Skills Tests involved the assessment of literacy across all Key Learning
Areas (KLAs). All syllabus and curriculum documents, either explicitly or implicitly,
required competence in a range of literacy skills in order for students to understand
and express the type of knowledge relevant to each KLA (NSW Department of
School Education, 1997). To sum up, literacy in the Basic Skills Test was linked to (a) English; (b) Science and Technology; (c) Mathematics; (d) Human Society and Its Environment; (e) Visual Arts; and (f) Personal Development, Health and Physical Education (NSW Department of School Education, 1997, p. 2).
The Basic Skills Test assessment data could be incorporated into established and
ongoing classroom assessment to provide useful indicators of student achievement in
relation to the syllabus and outcomes at successive stages of learning. The Basic
Skills Test results provided descriptors of each item that was tested (NSW
Department of School Education, 1997, p. 3).

1.3.7 The Literacy and Numeracy Test

The Literacy and Numeracy Test (LAN Test) is a test conducted in South Australia to assess attainment of the national benchmarks issued by the Australian Government. Each State uses a different name for its national benchmark test. For example, Western Australia implements the Western Australian Literacy and Numeracy Assessment (WALNA). The WALNA is a curriculum-based, criterion-referenced assessment that tests students' knowledge and skills in numeracy, reading, spelling and writing, and is administered annually to students in Years 3, 5 and 7 (Department of Education and Training, Western Australia, 2006, p. 1).

The Literacy and Numeracy Benchmarks

The Australian Council of State School Organisation (2006, p. 1) defined the benchmarks as nationally agreed minimum standards concerned with the essential
elements of literacy and numeracy. In literacy, the main elements are reading, writing
and spelling. Preliminary work has been conducted on speaking, listening and
viewing. In numeracy, the main elements are number, space, measurement and data
(Australian Council of State School Organisation Inc, 2006, p. 1).
The Australian Council of State School Organisation (2006, p. 2) stated that the main
purpose of introducing the benchmarks was to assess and report the performance of
school systems to the Australian community using an agreed benchmark. This would
help systems to examine whether their strategies to improve literacy and numeracy
were working (Australian Council of State School Organisation Inc, 2006, p. 2,
online).

1.4 Purpose of the study

The purpose of this study is to explore how primary school teachers have
implemented the Standards in SACSA and the Benchmarks in the Literacy and
Numeracy Test. This study has been designed as a pilot study for a
larger investigation. Specifically, the researcher plans to conduct a
similar study in Indonesia, which is implementing a new curriculum called the
‘Competency-Based Curriculum (CBC)’.

1.5 Research questions

This study of the implementation of the Standards in SACSA and the Benchmarks in
the Literacy and Numeracy Test is driven by the following questions.
(1) How do the primary school teachers in Adelaide react to the specification of
standards in SACSA and the Benchmarks in the Literacy and Numeracy Testing
Program?
(2) What teaching methods do the teachers use in applying the standards in the SACSA
Framework and the Benchmarks in the Literacy and Numeracy Testing Program?
(3) Do the teachers from different schools have different teaching methods in
implementing the standards in SACSA and the Benchmarks in the Literacy and
Numeracy Testing Program?

1.6 Objectives of the study

The objectives of this study are: (a) to explore how primary school teachers in
Adelaide react to the standards in SACSA Framework and the Benchmarks in the
Literacy and Numeracy Testing Program, (b) to explore what teaching methods the
teachers use in applying the standards in SACSA Framework and the Benchmarks in
the Literacy and Numeracy Testing Program, (c) and to explore whether the teachers
from different schools have different teaching methods in applying the standards in
SACSA Framework and the Benchmarks in the Literacy and Numeracy Testing
Program.

1.7 Significance of the study

The significance of this study lies in the information it provides for educators and
teachers in general. The information may benefit teachers in Indonesia who are
implementing a new curriculum (the Competency-Based Curriculum) that is very similar
to the Standards in the SACSA Framework and the Benchmarks in the Literacy and
Numeracy Test in many respects. For the researcher, this study will serve as a pilot
for a larger study in Indonesia. Finally, this study provides valuable
information for the conduct of a major investigation into primary schools in South
Australia in order to improve the implementation of the South Australian curriculum,
specifically in primary schools.

Chapter 2
Curriculum

2.1 Introduction

This chapter discusses the school curriculum and its association with research. It
also describes the definition of curriculum and its nature and purpose, since it is
necessary to discuss the definition of curriculum briefly before further
discussing the implementation of the South Australian Curriculum.

2.2 What is curriculum?

The curriculum used in schools has been a consideration of writers on education for
many centuries. For example, Plato (Greek philosopher, fourth century BC),
Comenius (seventeenth century), and Froebel (nineteenth century) all gave their
attention to the curriculum and its problems (Zais, 1981, p. 32). But Zais (1981)
noted that the specialized and systematic study of curriculum and curricular
phenomena and the identification of certain individuals as curriculum specialists did
not occur until the twentieth century. Regarding the importance of studying the
curriculum, Keeves (1999, p. 13) claimed that no area of Australian education is
more important than the curriculum of the schools. Thus, it is extremely valuable to
conduct a study involving curriculum. One of the significant outcomes is to improve
the curriculum itself.
Furthermore, in the broadest sense the term ‘curriculum’ is ordinarily used by
specialists in the field in two ways: (a) to indicate, roughly, a plan for the education
of learners, and (b) to identify a field of study (e.g. Zais, 1981; and Campbell, 1985).
In addition, Zais (1981) provided an explanation of these two ways of using the term.
‘Curriculum as a plan’ for the education of learners sets out the goals and
objectives of that education. ‘Curriculum as a field of study’ is defined by (a)
the range of subject matter with which it is concerned; and (b) the procedures of
inquiry and practice that are followed (Zais, 1981, p. 32).

Additionally, Finch and Crunkilton (1984, p. 9) defined curriculum as the sum of the
learning activities and experiences that a student had under the direction of the
school. They further argued that there were two supporting concepts. First,
the central focus of the curriculum was the student. A second supporting concept had
to do with the breadth of learning experiences and activities associated with a
curriculum.
Levy (1996, cited in Tomlinson, 2001, p. 39) pointed out that curriculum was more
than the content of the subjects taught; beyond that, the subjects were means to
teach students how to observe, how to question, how to communicate, and how to
think.
Turning to the nature of the curriculum, theorists (following Bloom, Hastings and
Madaus, 1971) divided the curriculum into four types: (a) curriculum
foundations, (b) the designed curriculum, (c) the implemented curriculum, and (d)
the achieved curriculum.

The curriculum foundation

It was argued that the curriculum foundations were those basic forces that influenced
and shaped the content and organization of the curriculum. Curriculum foundations
were often referred to in the literature as the sources of the curriculum (Zais, 1981;
Print, 1987). Those curriculum foundations, according to Zais (1981, p. 42), were:
(a) philosophy and the nature of knowledge; (b) society and culture; (c) the
individual; and (d) learning theory.

The designed curriculum

Curriculum design referred to the arrangement of the components or elements of a
curriculum: (a) aims, goals, and objectives; (b) the subject matter or content; and (c)
proposed learning activities (Zais, 1981; Finch and Crunkilton, 1984; Marsh &
Willis, 2003).

The implemented curriculum

According to Zais (1981, p. 45), curriculum implementation meant simply putting
the curriculum that was produced by the construction and development processes
into effect. He further stated that curriculum implementation also involved using
evaluative feedback in the construction and development processes during which
information from feedback was utilized for curriculum revision and improvement.

Additionally, Marsh and Willis (2003) and Brady (1992) discussed some factors that
influenced implementation, such as the principal, teacher-teacher relations, teacher
orientation, the quality and practicality of the program, the setting of
curriculum standards, teaching programs for student assessment, and the materials
and resources available.

2.3 The Standards and the Art of Teaching

Bloom, Hastings and Madaus (1971, p. 13) defined the ‘art of teaching’ as the
analysis of a complex final product into the components that must be attained
separately and in some sequence. Furthermore, Bloom, Hastings and Madaus (1971,
p. 13) argued that the analysis and organization of the learning process is difficult to
describe when one is dealing with complex ideas. Thus, they suggested that the
structure of a learning process must be broken into several major parts: a model of
outcomes, the diagnosis of the learner at the beginning of the learning unit, and the
instructional process.
In addition to the Standards and the art of teaching, Tomlinson (2001) discussed
similar issues. Tomlinson (2001, p. 38) argued that when educators accepted
responsibility for effective practice in their profession, they sought to ensure that
standards supported rather than undermined the excellence of a curriculum. Further,
Tomlinson (2001, p. 38) highlighted four advantages of academic standards in
schools: (a) they could help establish a common direction, (b) they could ensure
some equity in learning goals, (c) they could provide a ready means of
communication among educators as well as with parents and community, and (d)
they could serve as common benchmarks that allowed teachers to record a student’s
learning journey.
On the other hand, Tomlinson (2001, p. 39) claimed that: (a) lists of standards were
often presented to teachers with little or no effective guidance in how to use them in
curricular planning, (b) standards documents provided virtually no modelling of the
connectivity of knowledge, and (c) the topic of instruction was absent from most
standards documents. Thus, Tomlinson (2001, p. 39) suggested that for teachers and
leaders of teachers, the job was to ensure that they used what they knew in
conjunction with standards, rather than allowing standards to cause them to jettison
their professional knowledge. This implied that teachers should be selective in choosing
the standards that were appropriate and applicable for their students. Related to this,
Tomlinson (2001, p. 39) provided nine guidelines for ensuring that standards-based
practice and instructional best practice came together in schools and classrooms: (a)
reflect on the purpose of curriculum, (b) plan the curriculum to address all facets of
learning, (c) plan the curriculum to help students make sense of things, (d) organize
the curriculum so that its contents are manageable for teachers and students, (e)
design instruction so that learning is invitational to students, (f) design instruction for
focused action, (g) design instruction to attend to learner variance, (h) work for
learning environments typified by safety, respect and trust, and (i) teach for success.
In terms of designing instruction for focused action, Tomlinson (2001, p. 44) added
that instruction should be student-focused in ways that asked each student to create,
argue with, analyse, reconstruct, look at multiple perspectives on, use, reflect on, and
communicate about the precise ideas and skills on which teachers intended them to
focus. Likewise, Ellis (2003) raised the issue of task-based language learning and
teaching. Ellis (2003, p. 7) agreed that tasks involved cognitive processes such as
selecting, reasoning, classifying, sequencing information, and transforming
information from one form of representation to another. Clearly, both Tomlinson’s
and Ellis’s ideas promoted learning activities involving cognitive processes.

2.4 The model of outcomes and table of specifications

Bloom, Hastings and Madaus (1971) introduced a table of specifications for
outcomes. The main idea of the model was to emphasize the joint use of
content and behaviours or processes. Related to this, Bloom, et al. (1971, p.
15) provided four suggestions: (a) specifying the behaviours that the students were
expected to possess or exhibit if they had attained the objectives; (b) representing the
outcomes in the form of the problems, questions, tasks, and the like that students had
to be able to do or the kinds of reactions they had to give to specific questions or
situations; (c) having in mind the kinds of students (students’ characteristics) who
were likely to be able to attain these outcomes in reasonable periods of time under
the learning conditions planned; and (d) making decisions about what materials to
use, and what teaching methods and activities were appropriate. Teachers or
curriculum makers can use these suggestions as guidance or a model for specifying both
outcomes and objectives. Subsequently, many curriculum developers have used this
model, especially those who were interested in educational planning (e.g. Kim, 1975;
and Ebel & Frisbie, 1986).

2.5 Objectives by national curriculum groups

The question arose of how to link together the teachers’ objectives and the nationally
developed objectives. Bloom et al. (1971, p. 36) provided three answers to this
question: (a) the teachers could compare their objectives, what they thought
important, with those of the national group; (b) the teachers had to study carefully the
objectives of the given curriculum in relation to their particular situations; and (c) the
teachers had to study carefully the content. This implied that teachers did not need to
worry if national objectives were issued to their schools while they also wanted to
develop their own objectives, since they could select and combine them. They could
follow the suggestions provided by Bloom, et al. (1971).

2.6 The taxonomy of educational objectives

Various strategies can be used in developing educational objectives, and theorists
and curriculum specialists have proposed a range of ways to develop them.
For example, Bloom, et al. (1971) suggested three different strategies.
One of these was the taxonomy of educational objectives. Bloom, et al. (1971, p. 39)
stated that the taxonomy was the result of the work of a group of college examiners who
developed the classification system for educational objectives to facilitate
communication among themselves about objectives, test items, and test procedures.
In the definition of educational objectives, both behaviour and the content must be
specified. The main ideas of the taxonomy of educational objectives could be
summarized into three main themes: (a) it placed the behavioural aspect of the
objective within a hierarchical framework; (b) each category was assumed to include
behaviour more complex, abstract, or internalized than the previous category; and (c)
these categories were arranged along a continuum from simple to complex in the
cognitive domain (Bloom, et al., 1971, p. 39). To sum up, the taxonomy of
educational objectives emphasized the importance of two different domains, namely,
the cognitive domain and the affective domain.
In addition, concerning the statement of an objective, Bloom, et al. (1971, p. 20) argued
that an educator had to choose words that conveyed the same meaning to all intended
readers, since a statement of objectives that could be interpreted differently by different
readers would give no direction in selecting materials, organizing content, and
describing the outcomes obtained, and could provide little common basis for either
instruction or evaluation.

2.7 Teaching methods and evaluation

In terms of methods of teaching, Bloom et al. (1971, p. 17) argued that it was
necessary to use a greater variety of instructional methods, and that the choice of
methods had to be dependent on the objectives of the instruction. Furthermore, they
stated that the instructional material and process had to be organized into smaller
units than an entire course, grade or program, since the breakdown of the learning
task provided the specifications for formative evaluation tests or procedures (Bloom
et al., 1971, pp. 16-17).
Regarding evaluation in education, Terwilliger (1971, p. 4) highlighted three distinct
aspects. First, there was a need for value judgments concerning the merit of methods
and materials used in education. A second aspect of evaluation in education was the
judgment of merit of personnel responsible for the educational enterprise. A third
aspect concerned value judgments about the individual student.

2.7.1 Classroom observation

Many people have acknowledged the advantages of classroom observation as a way
of evaluating students’ learning. By using classroom observation, teachers could look
at how their students engaged in learning, and what problems or difficulties they
faced. In addition to these advantages of classroom observation, Bloom et al. (1971, p.
38) argued that classroom observation not only helped identify objectives but also
resulted in more comprehensive and reliable evaluation. Furthermore, they stated that
evaluation involved a description of course outcomes. In this strategy, according to
them, the observer looked not only for expected terminal behaviours but also for
unanticipated outcomes; and the procedure was based on the assumption that while
the text or course materials had content validity, it was difficult to envision all the
resulting behaviours beforehand. This implied that evaluation using classroom
observation provided rich and authentic information associated with students’
learning processes.

2.7.2 Grading or marking

Ebel and Frisbie (1986, p. 243) defined a grading system as a method of
communicating measurement of achievement. Bloom et al. (1971) and Terwilliger
(1971) provided descriptions of ‘grading or marking’. Bloom, et al. (1971, p. 7)
pointed out that the system of categorizing students was generally designed to
approximate a normal distribution of marks, such as A, B, C, D, and F at each grade
or level. In addition, they argued that the result of this method of categorizing
individuals was to convince some that they were able, good, and desirable from the
viewpoint of the system, and others that they were deficient, bad, and undesirable.
Likewise, Terwilliger (1971, p. 7) stated that the purpose of grading systems was to
provide a systematic and formal procedure for transmitting value judgments made
by teachers to the students and to others most directly concerned with their
development and welfare.
Moreover, Ebel and Frisbie (1986, p. 243) argued that grades or marks were used as
self-evaluative measures and also to report students’ educational status to parents,
future teachers, and prospective employers. In addition, grades or marks functioned as
feedback for planning the curriculum. As Ebel and Frisbie (1986, p. 243) stated, grades
provided a basis for important decisions about educational plans and career options;
and they also provided an important means for stimulating, directing, and rewarding
the educational efforts of students.
It is interesting to compare Bloom et al.’s ideas with Terwilliger’s ideas concerning the
disadvantages of grading or marking in educational evaluation. Whereas Bloom et al.
(1971) looked at the disadvantage from the viewpoint of a student’s self-concept,
Terwilliger (1971) looked at the disadvantage from the viewpoint of a student’s
motivation. Bloom, et al. (1971, p. 7) claimed that it was not likely that this continual
labelling had beneficial consequences for the individual’s educational development,
and it was likely that it had an unfavourable influence on a student’s self-concept.
Furthermore, Terwilliger (1971, p. 9) argued that the statement ‘students were more
interested in their grades than what they learn’ had become a cliché with some
teachers. From Terwilliger’s point of view, this implied that the evaluation of
students was unrelated to the skills, knowledge, appreciations, and understandings
that were acquired.

2.7.3 Spelling and writing assessment

Many people believe that spelling skill influences writing skill. Tindal and Marston
(1990, p. 181) stated that the assessment of spelling was necessary to ensure that
students learned to communicate in writing consistently and conventionally.

Additionally, Wallace and Larsen (1978, cited in Tindal and Marston, 1990, p. 181)
defined spelling as “the ability to arrange properly letters into words that are necessary
for effective written communication”.
Some theorists argued that it was important to highlight what spelling was, since it
would influence not only the type of student response required to determine spelling
proficiency, but also the methods to assess spelling (e.g. Tindal and Marston, 1990, p.
181).
Writing is another skill that is commonly assessed. Wallace (1978, cited in Tindal
and Marston, 1990, p. 205) described written expression as a complex skill that
required fluency in many areas including speaking, reading, spelling, handwriting,
capitalization, word usage, and grammar. Furthermore, Tindal and Marston (1990, p.
205) classified written expression into three diverse areas: communication, the
conventional manipulation of graphic symbols, and social skills. They also
grouped written expression into three major behavioural activities: (a) planning, which
involves idea development, (b) composing, which follows the conventions of language
usage, and (c) editing, which deals with refining the finished product (Tindal and
Marston, 1990, p. 205).

2.7.4 Teacher-made tests or school testing programs, and standardized tests

Generally, a good test should cover these criteria: validity, reliability, objectivity,
comprehensiveness, economical use of time, and simplicity in use (Lindvall and
Nitko, 1975; and Tindal and Marston, 1990). The importance of administering
teacher-made tests has been highlighted by many researchers (e.g. Lindvall and
Nitko, 1975; Tindal and Marston, 1990). For example, Tindal and Marston (1990, p.
205) stated that the information that teachers used and needed most for teaching did
not come from standardized tests but from tests they made themselves and from
structured performance samples. In addition, Stiggins (1985, cited in Tindal and
Marston, 1990, p. 285) argued that locally-made tests had the advantage of increased
relevance and utility. However, instruments that did not undergo an acceptable test
development phase could lead to the misidentification and misplacement of children
(Meisels, 1985 cited in Tindal and Marston, 1990, p. 285).
Another type of test is the standardized test, which stands in contrast to
teacher-made tests. Ebel and Frisbie (1986, p. 267) stated that the term ‘standardized
test’ referred to a test that: (a) had been methodologically and expertly constructed,
usually with tryout, analysis and revision, and (b) provided tables of norms for score-
interpretation purposes, derived from administering the test in uniform fashion to a
defined sample of students. On the other hand, Shepard (1984, cited in Tindal and
Marston, 1990, p. 354) claimed that standards could be used to separate the good
from the bad; however, as with other psychological concepts, they could not be
embodied exactly in concrete test scores. Thus, performance standards posed special
problems for measurement experts.
Similar to Ebel and Frisbie (1986), Payne (1974, p. 305) pointed out that
standardized achievement tests were usually: (a) based on extensive analyses of
common educational outcomes, (b) carefully developed and tested on large
and representative student populations, (c) administered and scored under uniform
and controlled conditions, and (d) sometimes accompanied by
normative data.

2.8 Conclusion

To sum up, the school curriculum may have different meanings for all those
concerned with it. For example, students and parents may think of it as the weekly
timetable of subjects; teachers may think of it as the overall teaching program or
planning that is set up term by term; and local or central authorities may think of it as
the broad aims or areas of knowledge that could be covered through teaching.
Standards and the art of teaching are two ideas that cannot be separated. ‘Standards’
refer to documents while the ‘art of teaching’ refers to the practice or the procedures
used to develop the attainment of standards.
The traditional concept of a marking system involves a single set of symbols which
are adopted for recording and reporting a global value judgment of the achievement
of each student in each school subject. For example, the letters A, B, C, D, and F are
widely employed to designate various levels of achievement.
A taxonomy of educational objectives contributes to teaching and learning in many
ways, such as: (a) helping teachers to specify operational goals in broad terms, (b)
serving as a model for constructing similar items suited to their content-area
needs, (c) suggesting new classes of objectives, and (d) helping teachers to analyze a
standardized test (Bloom, et al., 1971, p. 39).

Chapter 3
Standards, Benchmarks and SACSA Framework

3.1 Introduction

This chapter describes the Standards, Benchmarks, and the SACSA Framework. It is
organized into several sections. First, it discusses the definitions of the Standards and
the Benchmarks. Second, it discusses the SACSA Framework, including its structure
and its elements.

3.2 Definitions and Findings of Benchmarks

In 1996, the concept of the ‘benchmarks’ was introduced and agreed to by the
Ministers for Education in the States, the Territories and the Commonwealth. It was
agreed that ‘every child leaving primary school should be numerate, and be able to
read, write and spell at an appropriate level’. Ministers agreed to develop national
benchmarks for use in reporting minimum acceptable standards of literacy and
numeracy achievement in support of the national goals (DECS, 2000, p. iii).
A survey entitled ‘Curriculum Trends across Australia’ was conducted by the
Australian Curriculum, Assessment and Certification Authorities (ACACA) in
2000. This survey aimed to identify the understandings of teachers across Australia
concerning the standards and the benchmarks, including how these two terms were
implemented in the different Australian states. Clear results emerged from the survey.
The Queensland School Curriculum Council, for example, used the terms standards
and benchmarking in particular contexts. In the case of the term standards, it was
used in the context of student performance standards and also in the context of
Criteria and Standards-Based Assessment. Student performance standards involved a
reporting framework developed by Education Queensland in 1994 and based on the
earlier development of the Curriculum Statements and Profiles by the Curriculum
Corporation. However, in 2000, the Student Performance Standards were no longer
used in Queensland. The Queensland School Curriculum Council was in 2000
developing detailed syllabuses for each of the nationally agreed key learning areas.
These syllabuses specified learning outcomes (core and discretionary) for a number
of strands and levels (Australian Curriculum, Assessment and Certification
Authorities or ACACA, Queensland, 2000, online).
Moreover, in South Australia the survey found that the term ‘standard’
related to the recently drafted curriculum standards that, in conjunction with the
curriculum scope and curriculum accountability components, formed the basis of the
draft South Australian Curriculum Standards and Accountability (SACSA)
Framework, across eight specified areas of learning. The definition of a ‘curriculum
standard’ was that it, "provides a common reference point for teachers and other
educators to use in monitoring, judging and reporting on learner achievement (in
clearly defined skills, knowledge and dispositions) over time". The curriculum
standards were defined by sets of learning outcomes, evidence of learner
performance and work samples (ACACA, SA, 2000, online).
Generally, the respondents from each state across Australia had the same definition
for the benchmarks (National Literacy and Numeracy Benchmarks). They were
defined as descriptions of nationally agreed minimum acceptable standards of
literacy and numeracy performance at Years 3, 5 and 7.

3.3 SACSA Framework

3.3.1 What is the SACSA Framework?

The South Australian Curriculum, Standards and Accountability (SACSA)
Framework described key ideas and outcomes upon which learners from birth to Year
12 could expect their education to be built (DECS, 2000, p. 1).

3.3.2 What theory of learning underpinned the curriculum framework?

It was stated that the curriculum framework was based on constructivist theories of
learning that viewed the learner as active in the process of taking in information
and building knowledge and understanding; in other words, of constructing their
own learning (DECS, 2000; DETE, 2000). The Framework’s key ideas and outcomes
provided the basis for constructivist approaches to teaching and learning which built
on learners’ prior knowledge and experience and engaged them in purposeful,
contextualized, challenging and inherently interesting learning activities (DECS,
2000; DETE, 2000). However, constructivist theories had strong ideological
overtones that were rarely made explicit (cf. Gibbons, 2004).

Associated with constructivism as the theoretical underpinning of the SACSA
Framework, Gibbons (2004, p. 155) argued that there were many varieties of
constructivism, and the Framework did not make it clear which varieties of
constructivism it adopted or discarded. Gibbons (2004, p. 155) further stated that
there were significant differences between the different varieties of constructivism.
From his point of view, constructivism was an ontological and epistemological
approach that was problematic and exhibited internal incoherence and contradiction.
In addition, Gibbons (2004, pp. 45-73) discussed thoroughly the nature of
constructivism including the range of constructivist theories, psychological
constructivism, knowledge and viability, sociological constructivism, and Longino’s
constructivism.

3.3.3 What are the key elements of the curriculum framework?

DECS (2000) and DETE (2000) identified three elements of the curriculum
framework: (a) curriculum bands, (b) curriculum scopes and (c) curriculum
accountability.

Curriculum bands

The single framework was designed with four curriculum bands from birth to Year
12: early years band (birth to Year 2); primary years band (Years 3, 4, and 5); middle
years band (Years 6, 7, 8, and 9); senior years band (Years 10, 11, and 12) (DECS,
2000; DETE, 2000).

Curriculum scopes

The curriculum scope was organized around four areas: (a) Learning Areas, (b)
Essential Learnings (futures, identity, interdependence, thinking, and
communication), (c) Equity and Cross-Curriculum Perspectives, (such as gender
perspectives and multicultural perspectives), and (d) Enterprise and Vocational
Education. This focus reflected both national and state education policies as well as
training, work and lifelong learning priorities that had been made explicit in the
‘Adelaide Declaration on National Goals for Schooling in the Twenty-first Century’.
This declaration stated that students leaving school should have: “employment
related skills, and understanding of the work environment, career options and
pathways as a foundation for, and positive attitudes towards, vocational education
and training, further education, employment and lifelong learning…” (Ministerial
Council on Education, Employment, Training and Youth Affairs or MCEETYA, 1999
cited in DECS, 2000, p. 3-4).

Curriculum accountability

Curriculum accountability was defined as the professional responsibility of
educators, site leaders and state office personnel to provide a comprehensive account
of the developmental learning outcomes and curriculum standards achieved by
learners. It explicitly accounted for the steps taken to improve learning outcomes
(DECS, 2000; DETE, 2000).

Structure

Additionally, seven key competencies were specified. They were the skills that
underpinned the transition from school to work, training and lifelong learning. The
competencies were explicitly coded (DECS, 2000, p. 5):
KC1: collecting, analyzing and organizing information,
KC2: communicating ideas and information,
KC3: planning and organizing activities,
KC4: working with others in teams,
KC5: using mathematical ideas and techniques,
KC6: solving problems, and
KC7: using technology
Moreover, it was stated that the key ideas within each learning area contained the
fundamental concepts of the Framework, and they increased in complexity across the
curriculum bands from the early years to senior years. Furthermore, it was stated that
the key ideas were an essential and required part of the Framework (DECS, 2000;
DETE, 2000).
The learning areas changed through the early years to capture the growth and
broadening horizons of young children. The learning areas for students from
Reception to Year 12 were drawn from the Adelaide Declaration on National Goals
for Schooling in the Twenty-first Century. In the SACSA Framework, these learning
areas were structured and organized through strands (DECS, 2000; DETE, 2000).
The essential learnings described the values, dispositions, skills, and understandings
that were considered crucial in the education and development of all learners. The
development of these learnings was an on-going, lifelong process and occurred in
every context of a learner’s life. Within the scope of the essential learning of
communication, literacy and numeracy skills and information and communication
technology skills were made explicit and developed throughout the Framework in all
learning areas (DECS, 2000; DETE, 2000).

Standards and curriculum accountability

The SACSA Framework was a standards referenced framework, which described the
curriculum expected for all learners (DECS, 2000; DETE, 2000). It provided a
framework from which educators from early childhood to the end of secondary
schooling constructed learning programs, assessed progress and reported on the
standards achieved (DECS, 2000; DETE, 2000). The SACSA Framework provided
the context in which educators, site leaders, and state office personnel planned,
monitored, allocated resources, took appropriate action and accounted for the quality
of the learning programs (DECS, 2000; DETE, 2000). Consequently, the standards in
the Framework represented the expectations that were held for learners. They
provided a common reference point for educators to use in monitoring, judging and
reporting on learner achievement over time (ACACA, 2000; DECS, 2000; DETE,
2000). Standards were also organized around Learning Areas in which Essential
Learnings, Equity and Cross-curriculum Perspectives and Enterprise and Vocational
Education were interwoven (DECS, 2000; DETE, 2000).
In addition, the SACSA Framework contained specific standards of performance that
were constructed in three different, yet complementary ways, namely: (a)
developmental learning outcomes, (b) curriculum standards 1-5 and (c) Year 12
standards (DECS, 2000; DETE, 2000). The essence of curriculum accountability was
the construction of learning programs and reporting of learner achievement on the
basis of these standards (DECS, 2000; DETE, 2000).

Developmental learning outcomes

Educators within the birth to five years range of the early years band used the
curriculum scope and developmental learning outcomes as a central part of their
initial engagement with children. The outcomes also formed the basis of
planning for teaching and learning and the subsequent reporting of achievement
throughout the years of schooling (DECS, 2000; DETE, 2000).

Curriculum Standards 1-5

The SACSA Framework contained five curriculum standards, which were placed at
two-year intervals and aligned with years of schooling (i.e. Years 2, 4, 6, 8, 10)
(DECS, 2000; DETE, 2000). These curriculum standards were performance
milestones that depicted what would reasonably be expected of learners along a
continuum of ever-improving performance (DECS, 2000; DETE, 2000). They
represented fixed and common points of reference for describing the progress of
learners. Teachers assessed attainment of a curriculum standard for a learning area
when a student demonstrated achievement of all outcomes comprising the standards
(DECS, 2000; DETE, 2000).

Year 12 Standards

In the senior years band, educators used Curriculum Standard 5 (Year 10) in the same
way as those in the primary and middle years. For Years 11 and 12, Year 12 standards
comprised the essential learning capabilities demonstrated along with standards from
externally developed curricula. Teachers assessed and reported according to the
attainment of the specified requirements of these external curricula (DECS, 2000;
DETE, 2000).

3.3.4 Why did South Australian schools need the SACSA Framework?

The continuing nature of change in society and the generation of new knowledge
affected the schools and the communities they served. Children and students needed
to be educated to meet the requirements of the dynamic society in
which they lived and worked. Thus the curriculum needed to be dynamic (DETE,
2000, p. 6).
The SACSA Framework was the first South Australian Framework to emphasize the
dynamic nature of curriculum. The SACSA Framework placed learners at the centre
of the learning process. The experiences, expertise, interests and needs of all learners
formed the basis for constructing the curriculum. The characteristics of learners
within each band, as well as the characteristics of groups of learners, were
acknowledged by the use of appropriate teaching and learning methods. A cohesive
curriculum framework, rather than isolated segments of content, enabled learners to
develop values, skills, dispositions and understandings to: (a) respond to change and
plan for the future; (b) develop a positive sense of self and group; (c) work well with
a variety of others; (d) be independent critical thinkers; and (e) communicate
powerfully (DECS, 2000; DETE, 2000).

Learner achievement data

Curriculum accountability was demonstrated through the construction of a
curriculum that (a) was responsive to a diversity of learners; (b) involved regular
assessment of learners’ performance; (c) included the implementation of intervention
and support strategies; and (d) reported achievement (DECS, 2000; DETE, 2000).
The link between these four dimensions was learner achievement data. The
developmental learning outcomes and curriculum standards detailed in the SACSA
Framework provided a common basis for describing learners’ achievements (DECS,
2000; DETE, 2000).

Introduction to essential learnings and the SACSA Framework

Essential learnings were considered to be understandings, dispositions and
capabilities that were developed through the Learning Areas and formed an integral
part of children’s and students’ learning from birth to Year 12 and beyond (DECS,
2000; DETE, 2000). They were resources that were drawn upon throughout life and
enabled people to engage productively with changing times as thoughtful, active,
responsive and committed local, national and global citizens (DECS, 2000; DETE,
2000). Engaging with these concepts was crucial to enhance the learning culture
within and beyond schools and sites (DETE, online).
Within the SACSA Framework, five essential learnings were identified. They were:
(a) futures; (b) identity; (c) interdependence; (d) thinking, and (e) communication
(DECS, 2000; DETE, 2000). Specifically, these essential learnings fostered the
capabilities to: (a) develop the flexibility to respond to change, recognize
connections with the past and conceive solutions for preferred futures (futures); (b) develop a
positive sense of self and group, accept individual and group responsibilities and
respect individual and group differences (identity); (c) work in harmony with others
and for common purposes, within and across cultures (interdependence); (d) be
independent and critical thinkers, with the ability to appraise information, make
decisions, be innovative and devise creative solutions (thinking); and (e)
communicate powerfully with others (communication) (DECS, 2000; DETE, 2000).

Teaching and learning process

The understandings, capabilities and dispositions encompassed in the essential
learnings should be achieved by learners through: (a) using certain constructivist
approaches to learning; (b) practising the relevant skills within supportive and
enabling learning environments; (c) learning through active involvement; and (d)
applying their learning to new and different contexts (DECS, 2000; DETE, 2000).
In addition, there was concern for literacy and numeracy. Literacy was the ability to
understand, analyze, respond critically to and produce appropriate spoken, written,
visual and multimedia communication in different contexts. Numeracy was the
ability to understand, analyze, respond critically to and use mathematics in different
contexts (DECS, 2000; DETE, 2000).

3.3.5 What is different about the SACSA Framework?

DETE (2000, pp. 9-10) described the differences between the SACSA Framework and
the previous frameworks in the following ways.
(1) The Framework comprised two required components for all educators’
curriculum planning, assessment and reporting: key ideas and outcomes.
(2) The language of the Framework spoke to educators as professionals while
maintaining a clear, consistent and direct style.
(3) The quantity of material had been reduced in comparison to previous
frameworks. There was consistency in the use of key ideas and illustrative material,
outcomes and examples of evidence. This made the Framework more manageable for
local curriculum planning. It enabled educators to make local decisions about the
curriculum detail that would meet local priorities and the needs of the learners.

Comparisons with the Curriculum Statements and Profiles

The SACSA Framework differed from the Curriculum Statements and Profiles in the
following ways.
(1) There were fewer R-12 learning area strands (28) than in the Curriculum
Statements and Profiles (37). There were no strand organizers.
(2) There were five curriculum standards in the SACSA Framework compared with
eight levels in the Curriculum Statements and Profiles.
(3) There were fewer outcomes than in previous frameworks.

In the early years there were eight developmental learning outcomes compared with
65 in the foundation areas of learning.
In the SACSA Framework there were 72 outcomes per standard across all R-12
learning areas compared with 112 per level in the Curriculum Statements and Profiles.
(4) The general introduction had been reduced to three accessible parts: (a) the
vision for the SACSA Framework; (b) a rationale for the new aspects of the
Framework; and (c) a practical guide to using the Framework.
(5) The band structure of the SACSA Framework would assist educators to consider
in their curriculum planning the distinctive characteristics of learners and learning at
particular stages of their education and care.
(6) The learning areas had been transformed through the interweaving of the
essential learnings, equity cross-curriculum perspectives and enterprise and
vocational education. This was the most innovative and forward-looking feature of
the Framework. Its effect was to describe learning actively, inclusively and
practically. It offered educators a new basis for developing programs and learning
activities that would actively engage children and students in their learning.

3.4 The Outcomes Statements

Australian Council for Educational Research or ACER (2000) conducted a study
regarding the outcome statements. The results from the study were reported with the
following conclusions.
Except for the Design and Technology and English key learning areas, teachers who
participated in this study did not seem in general to understand all the outcome
statements. However, this level of ‘not understanding’ was likely to be an
overestimate. Despite this, these findings pointed to a need for greater clarity in the
wording of many of the outcome statements which were not well understood.
For the teachers participating in this study, the essential learnings did not generally
accord with the description provided by the learning area outcome statements. The
Languages and the Science key learning areas had the poorest results here.
For the teachers participating in this study, the evidence points were not always
consistent with outcome statements. This was one significant problem in the Arts key
learning area, but otherwise there were few concerns. There was some evidence that
those evidence points lower in the list were more often seen as inconsistent with the
outcome statement than those higher on the list (ACER, 2000, p.
21).
The reviewers suggested that careful consideration needed to be given to the
language used to describe the difficulty of the outcomes, since it seemed that there
were particular problems with the outcome statements at Level 6 and at Levels 2
and 3.

3.5 Conclusion

The SACSA Framework was a framework that teachers could use as a guide in
planning their lessons or teaching activities. The teachers should also use this
framework to monitor, judge, and report the learners’ achievement. On the other
hand, Benchmarks were associated with the Literacy and Numeracy Test (LAN Test)
and issued by DECS. The Benchmarks were the minimum performance levels that
children should attain according to their year level.

Chapter 4
Core Principles in Learning

4.1 Introduction

This chapter discusses the core principles of learning, and the chapter is organized
into several sections. First, it describes Piaget’s theory of development and learning.
Second, it considers Vygotsky’s concept of proximal development. Third, it discusses
the implications of the theories of Piaget and Vygotsky. Fourth, it examines Dewey’s
theory of knowledge.

4.2 Piaget’s theory about development and learning

Piaget (1964, cited in Ripple and Rockcastle, 1964; Demetriou, Gustafsson, Efklides,
and Platsidou, 1992) differentiated between development and learning. According to
Piaget the development of knowledge was a spontaneous process, tied to the whole
process of embryogenesis. Thus for Piaget, embryogenesis was concerned with the
development of the body, but it was also concerned with the development of mental
functions. In other words, development was a process that was concerned with the
totality of the structures of the knowledge held by individuals, rather than groups.
On the other hand, in general, learning was provoked by situations; by a
psychological experimenter; or by a teacher with respect to some didactic point; or
by an external situation. In addition, it was a limited process, and limited to a single
problem, or to a single structure (Piaget, 1964, cited in Ripple and Rockcastle, 1964;
and Demetriou et al., 1992). However, like development it was concerned with
individuals rather than groups.
Furthermore, Piaget (1964, cited in Demetriou et al., 1992, and Larkin, 2002)
suggested the idea of an ‘operation’ for use in understanding the development of
knowledge in individuals. He argued that to know an object, to know an event, was
not simply to look at it and make a mental copy, or image, but to act on it. Thus, an
operation was the essence of knowledge, and was never isolated.

Moreover, Piaget (1964, cited in Ripple and Rockcastle, 1964, p. 10) identified four
main factors to explain the development from one set of structures to another: (a)
maturation, since this development was a continuity of the embryogenesis; (b) the
role of experience of the effects of the physical environment on the structures of
intelligence; (c) social transmission (linguistic transmission, education etc.); and (d)
equilibrium or self-regulation.
Many other theorists have reviewed Piaget’s theory of cognitive development (e.g.
Fraser, 1978; Maier, 1978; Shayer & Adey, 1981; Demetriou, et al., 1992; and Bidell
and Fischer, 1992). They argued that development supplied general structures of
knowledge, which had the educational role of readiness in preparing children’s minds
for the experience of learning. On the other hand, learning had the role of filling up
these preformed structures with educational content (Bidell and Fischer, 1992, p. 12).
Thus, they called this the readiness dilemma.
The Education Department of South Australia (1977, p. 53) argued that Piaget’s
theory about stages of intellectual development was closely linked to students’
readiness for learning. They argued that the materials and ideas used for learning
should be associated with these stages.
Additionally, Cronbach (1964, pp. 73-77) cited Piaget’s theory about the importance
of the child’s developing, through hundreds of trials, the ability to anticipate what
might happen under this or that configuration of events. Here, Cronbach (1964, p.
76) contended that the risk of teaching children some concept verbally before they
had stored up appropriate images of the relevant objects and operations was a lesson
every educator felt he or she had now learned.
The work of Piaget was broadly influential throughout Australia (e.g. Education
Department of South Australia, 1977, p. 53). Piaget’s ideas guided the direction taken
by many Australian researchers using many different methods involving cognitive
probes (Rennie and Treagust, 1999; and Fraser, 1978).
Briefly, there were five phases in Piaget’s developmental theory: (a) sensorimotor,
(b) pre-conceptual, (c) intuitive thought, (d) concrete operations, and (e) formal
operations. Based on these, Maier (1978) suggested that to comprehend Piaget’s
theory of cognitive development, there were several terms that should be clearly
understood: adaptation, assimilation, accommodation, equilibrium, operation,
schema, memory, and perception.

4.3 Vygotsky’s concepts of proximal development

Whereas Piaget has been widely accepted for his theory of cognitive development,
Vygotsky’s ideas (1978) have also been accepted, and in particular his concepts of
proximal development. Vygotsky referred to these ideas as a ‘Zone of Proximal
Development’ (ZPD). Vygotsky’s concepts referred to a range of levels of ability that
an individual could achieve under different conditions of social support (Bidell and
Fischer, 1992). According to Vygotsky, development comprised the formation of new
cognitive abilities with the support of the social environment (Bidell and
Fischer, 1992, p. 12).
Moreover, Vygotsky (1978, cited in Shayer and Adey, 2002, p. 5) highlighted that the
zone of proximal development was the difference between what a child could
achieve unaided and what he or she could achieve with the assistance of a more able
peer or adult. Both of these notions led to the supposition that cognition could be
stimulated by the presentation of intellectual challenges of moderate difficulty
(Shayer and Adey, 2002, p. 5).

Social construction

Vygotsky (1978, cited in Shayer and Adey, 2002, p. 5) argued that the construction of
knowledge and understanding was pre-eminently a social process. He claimed that
understanding appeared first in the social space that learners shared, and then became
internalized by individuals. On the other hand, Piaget, cited in Shayer and Adey
(2002, p. 6) contended that the environment which created cognitive conflict and
stimulated cognitive growth was importantly a physical environment as much as a
social one. The activities of talking around new ideas, exploring them through group
discussion, and asking for explanations and justifications, were all part of the process
of building individual knowledge, but they were also based on experience of the real
world.
Moreover, Vygotsky, cited in Robertson (2002, p. 61), believed that pupils needed to
use language so as to integrate learning, and to make it explicit. Robertson would
seem to support this theory. Related to this, Robertson (2002, p. 61) questioned how
the process of learning became integrated for each pupil if there was no opportunity
to discuss in a safe environment how learning took place. This question was raised in
relation to her study concerning pupils’ understanding of what helped them learn.

The results of her study showed that a large number of pupils expressed the view that
‘not talking’ was important; it helped them learn.
Additionally, Venville (2002, p. 49) found that the role of social activity was
prominent in the CASE@KSI lessons and evidence of Year 1 children building on
each other’s ideas to find a solution to a problem was demonstrated. Venville (2002,
p. 49) further stated that children were able to solve problems that they were initially
unable to solve through interaction with more knowledgeable peers and with their
teachers.

4.4 The implications of Piaget’s and Vygotsky’s theories

What contributions does Piaget’s theory of cognitive development offer to teachers
seeking to improve the quality of their teaching?
Shayer and Adey (1981, p. 49) provided some answers to the question stated above.
First, it helped teachers select from all of the curriculum material available. Second,
it helped ensure that the material selected was the most appropriate for their particular pupils. Third,
for teachers and others who developed their own curricula, it helped in the selection
of objectives and the construction of activities that were realistic and stimulating.
Shayer and Adey (1981) undertook a detailed curriculum analysis using the
taxonomy suggested by Piaget. The results showed that different raters generally
agreed about the level of demand of a curriculum activity, provided that they agreed
about just what it was that they were rating. Where the objectives, and teaching
methods, were spelt out in detail in the teachers’ guide there was less room for
ambiguity, but in most cases the user had to decide what teaching methods would be
used, and just what objectives were implied. Much curriculum material allowed for
more than one method of presentation, and more than one expected response from
the pupil. The rating would then depend on the presentation and expectations chosen
(Shayer and Adey, 1981, p…).

4.4.1 The contributions of Piaget’s and Vygotsky’s concepts for teaching

The problem that had encouraged Shayer and Adey (2002, p. 1) to conduct
longitudinal research was that they were convinced that some 80 per cent of the
school population (in England) performed academically well below their potential.

This implied that student academic performance could be raised by using appropriate
interventions according to the students’ age level that would maximize cognitive
development. Thus they created Cognitive Acceleration (CA) interventions that were
based on the theories of both Piaget and Vygotsky. The main objective of the study
was to investigate how the CA interventions worked out in a variety of curricula and
age contexts.
In conducting their study, they advanced three important questions: (a) was it valid to
work on the basis of some general intellectual function in children that lay beneath
any particular context (subject)-dependent component? (b) did this general
intellectual function develop with age? and (c) was the development of this general
intellectual function influenced both by the environment and by maturation?
Moreover, Shayer and Adey (2002, pp. 2-6), in designing the principles that would be
applied in cognitive acceleration (CA), used the theories of Piaget (developmental
psychology) and of Vygotsky (socio-cultural psychology). Based on these bodies
of theory, they proposed six pillars that would be implemented in CA: schema theory,
concrete preparation, cognitive conflict, social construction, metacognition, and
bridging. Thus, they used these pillars in arranging a CA lesson.

4.4.2 Assumption

Based on the basic principles above, it was concluded that the assumption behind all
CA interventions was that the ability to process many aspects of reality simultaneously
was the key to high performance in any sphere; conversely, any context-related
intervention that developed this ability was likely to affect the learning ability of a child
generally (Shayer and Adey, 2002, p. 7).

4.4.3 Shayer and Adey’s (2002) use of theories of both Piaget and
Vygotsky in designing their cognitive acceleration program

The Piagetian aspect could be seen in providing the context for the intervention; and
the Vygotskyan concepts could be seen in ‘the collective’ that involved the
interventions being conducted in small group cooperation and discussion. In the idea
of ‘the collective’, Vygotsky argued that it was not a remnant of communist
mysticism but it was a true description of where most child development lay. The
reason why a child should learn collectively, according to Vygotsky was that no one
child had the whole of the world to invent and construct for themselves; in any
activity most of their gains would come from appropriating a higher level of
performance that they had witnessed in a member of their collective.
Briefly, in implementing the CA interventions, the students were grouped according
to their age ranges. Here, each group was engaged in different activities that
contained a distinct level of difficulty depending on their age group. Hence, it can be
seen how Shayer and Adey (2002) combined the educational task and a
psychological model into educational practice.

4.4.4 Training for teachers

Before CA lessons were implemented, teachers were provided with training about
how to implement CA interventions. The coaching work took place in the teachers’ own
classrooms so that the teachers could see how the methods worked. This was because
both Shayer and Adey (2002, p. 25) believed that effective delivery of the activities
depended on the teachers having a good understanding of the underlying theory,
much practice in generating cognitive conflict, and encouraging social construction
as well as metacognition involving each child.

4.4.5 Research findings associated with CA interventions

The success of the CA interventions had been published widely in the United
Kingdom. This led to many schools wishing to apply the CA interventions.
Another impact of the success of the CA interventions was that other researchers
conducted experimental studies associated with cognitive acceleration, using
concepts similar to those of the CA intervention project conducted by Shayer and
Adey (2002, pp. 1-17). For example, Venville conducted an experimental study on
enhancing the quality of thinking in Year 1 classes. In this research, Venville
(2002, p. 49) used the concept of the Zone of Proximal Development (ZPD) from
Vygotsky’s theory, namely that a child’s learning could be assessed by the additional
problems that the child could solve with social assistance, together with Piaget’s
concept of concrete operational thinking. Venville (2002, p. 49)
claimed that the kind of thinking that occurred in the CASE@KSI lessons provided
potential for enhanced cognitive development, and that children were stimulated to
develop concrete operational thought patterns during the course of the activities.
Venville (2002, p. 50) found that Year 1 children were engaged in so-called ‘good
thinking’ while participating in the CASE@KSI activities. From the findings of this
study Venville (2002, p. 50) gave three suggestions about how teachers could foster
good thinking in Year 1. (1) Difficulty should be an accepted part of the classroom
and children should be encouraged to undertake challenging problems and helped
with strategies for solving the problems. (2) Talk that explored and explained the task
at hand was a critical aspect of good thinking in Year 1, and teachers should
encourage children to explain problems, their ideas, actions, misunderstandings,
agreements, questions and possible solutions. (3) Thinking needed to be a discernible
part of the classroom environment, and children should be given time to think,
teachers should model thinking out loud, talk about their thinking and encourage
children to do the same. Teachers should also use open-ended questions that required
the pupils to engage in their own individual and original thought before they were
required to answer.
Another study that aimed to enhance cognitive development was conducted by
Larkin (2002) and concerned creating metacognitive experiences among five- and
six-year-old children. Similar to the previous studies (e.g. Venville, 2002), this
research also used CASE lessons, or Cognitive Acceleration interventions, as
promoted by Shayer and Adey (2002). In Larkin’s study, the students were
provided with complex interactions that encouraged them to create metacognitive
experiences. The question addressed in this study was why metacognition was
important for learning. Larkin (2002) stated that metacognition
involved self-regulation and reflection on learning. Additionally, Hartman (1998,
cited in Larkin, 2002, p. 66) highlighted that metacognition was especially important
because it affected acquisition, comprehension, retention and application of what was
learned, in addition to affecting learning efficiency, critical thinking and problem
solving. Metacognitive awareness enabled control or self-regulation over thinking
and learning processes and products.
Using CA (Cognitive Acceleration) programs, another study was conducted by
Hodgen (2002) who investigated how teachers’ mathematical knowledge was
transformed through reflection. Thus, the programs provided many and various ways
in which to reflect. Hodgen (2002, p. 130), therefore, contended that experiences like
this were crucial for promoting fundamental changes in primary teachers’
mathematical knowledge for teaching. Yet, according to him, to replicate such
intense experiences for the majority of primary teachers would be an extremely
difficult task.

Adey (2002, p. 34) concluded that the experiment described in the Cognitive
Acceleration project had demonstrated that a cognitive program based on Piagetian
ideas of cognitive conflict and the schemata of concrete operations and on
Vygotskyan ideas of social construction and scaffolding could have a significant
effect on the rate of cognitive development of five- and six-year-old children in Year
1 classes in a disadvantaged inner city environment.

4.5 John Dewey’s View of knowledge

John Dewey, cited in McNeil (1985, pp. 332-333), insisted that the Herbartian
interpretations of morality were too narrow and too formal. He protested against the
teaching of particular virtues without regard for the motives of children. Whereas the
Herbartians relied on ideas as the basic guide to conduct and conceived of
knowledge as something to be acquired, Dewey thought more in terms of the child’s
discovery and evaluation of knowledge than of mere acquisition. He recommended
that the learner become the link between knowledge and conduct. In contrast to the
Herbartians’ assumption that there was a body of known knowledge which was
indispensable and which could be made interesting to pupils, Dewey argued that
subject matter was interesting only when it served the purposes of the learners.
Hence, he emphasized the learners’ participation in formulating the purposes, and
these principles were the basis for the selection of subject matter.
Thus, Dewey would not have the curriculum start with the facts and truth that were
outside the range of experience of those taught. Rather, he would start with materials
for learning that were consistent with the experience learners already had and then
introduce new objects and events that would stimulate new ways to observe and to
judge. On the other hand, Plato cited in Lovat and Smith (1993, p. 78) defined
knowledge as a fixed, unchanging commodity that no human endeavour could adjust
or alter in any way. The application of this theory was teacher-centred, with specific
objectives.

4.6 Conclusion

It is clearly necessary for teachers to draw attention to the stages of intellectual
development when they design learning materials, objectives, and activities for
students. This relates to students’ readiness. If the teachers ignore the students’
readiness, this can lead to learning problems. Consequently, the learning materials
may be too difficult or too easy for the students. From my perspective, what the
students need is something that can challenge them. Thus the zone of proximal
development should always be appropriate to the students’ level of knowledge and to
the conceptual level of the knowledge being taught to the students.
Moreover, it can be argued that the social environment as well as social interaction
can influence a student’s cognitive ability. This implies that engaging students in
non-individual activities, for instance, pair work or group work has a positive effect.
I consider that when a student works with his or her friends, he or she is more likely
to construct his or her knowledge from them and with them. A student can learn from
what he listens to and observes from his friends. Besides, this may help a struggling
student to learn without feeling frustrated. However, the individual also learns from
experience in the real world and from other actors in physical and practical settings, as
is implied in Dewey’s idea of ‘discovery’.
Consequently, it is claimed that from the work of Piaget “learning by doing” is more
effective than learning merely by “acquiring information”. In “learning by doing”,
students are led to be more active, solve a problem, and be involved in both
exploration and discovery. These kinds of activities encourage students to become
independent learners.

Chapter 5
Methods of Inquiry

5.1 Introduction

This chapter discusses several issues pertaining to the methods of research that were
employed in the study. First, it points out what research methods were used. Second,
it describes interviews as the main method of data collection; and highlights
questionnaires as the secondary method of data collection within this study. Third, it
describes how the data were analyzed in this study. Fourth, this chapter describes the
sample and rationale for identifying the selection of the people to be interviewed.
Finally, this chapter discusses the procedures that were used in drawing inferences
for this study as well as the procedures used in reporting the results.

5.2 Research methods

This study was designed using a qualitative method that was considered appropriate
for an exploratory study. Exploratory research involves an attempt to determine
whether or not a phenomenon exists (Dane, 1990). Dane (1990, p. 5) further
highlighted that often, the procedures of analysis used for exploratory research were
qualitative analyses. Qualitative research thus played an important role in exploring
and understanding the world and human behaviour or action, and was
complementary to other methods of building knowledge (Maggs, 2001; Darlington &
Scott, 2002). Similarly, Gilgun (2005) argued that qualitative approaches were useful for
accomplishing many tasks, such as theory building, model and hypothesis formation
and testing, obtaining descriptions of lived experiences, forming typologies, giving
reasons for human behaviour, and providing examples of natural behaviour and
thought in order to answer questions that surveys could not. Thus, in this study, the
researcher explored how primary school teachers in Adelaide implemented the
Standards in the SACSA Framework and the Benchmarks in the Literacy and
Numeracy Test, what they did and how they thought ahead about what they did.

In order to explore how primary school teachers in Adelaide implemented the
Standards in the SACSA Framework and the Benchmarks in the LAN Test, ten
interview questions were addressed to the interviewees, as follows:
1. What does your school understand by the terms of Standards in the SACSA
Framework and the Benchmarks in the Literacy and Numeracy Test?
2. How are the Standards in the SACSA Framework and the Benchmarks in
the Literacy and Numeracy Test currently used in your school by the
principal and the teachers? Is the use of the Standards and the Benchmarks
influenced by the context in which your school is placed?
3. What is your personal opinion about the Standards in the SACSA
Framework and Benchmarks in the Literacy and Numeracy Test?
4. What teaching methods do you use for your students to achieve the
Standards in the SACSA Framework and the Benchmarks in the Literacy
and Numeracy Test?
5. Do you think the Standards and the Benchmarks that are associated with
outcome statements are appropriate for Year 3/ Year 5/ Year 7 students?
6. Are there any conflicting ideas and practices in trying to achieve both the
Standards and the Benchmarks?
7. Are there any ways in which you think the Standards and the Benchmarks
should be changed or modified?
8. How do you report to parents information about their children achieving the
Standards and the Benchmarks? Do you draw attention to the context in
which your school is set in discussion with parents?
9. How do you provide feedback to the children about their achieving the
Standards and the Benchmarks?
10. How do you report to the Year 4/ Year 6/ Year 8 teachers and secondary
school teachers about the students’ performance on the Standards and
Benchmarks at Year 3/ Year 5/Year 7, and do they use the information you
provide?
Before the main interviews were started, the researcher conducted a pilot interview
with two primary school teachers. The results of the pilot interviews were very
helpful and provided input that enabled the researcher to improve and extend the
interview questions. Interview Questions 1, 2, 3, 5, 6, and 7 were used to answer
Research Question 1, ‘how do the primary school teachers in Adelaide react to the
specification of standards in the SACSA Framework and the Benchmarks in the
Literacy and Numeracy Testing Program?’ Interview Questions 4, 8, 9, and 10 were
used to answer Research Question 2, ‘what teaching methods do the teachers use in
applying the standards in the SACSA Framework and the Benchmarks in the
Literacy and Numeracy Testing Program?’ For the analysis of the data bearing on
Research Questions 1 and 2, the researcher used the ‘Node Search’ and ‘Assay
Tool’ facilities in NVivo. The reasons for choosing these types of analysis were: (a)
the researcher needed to show the actual descriptions that the interviewees gave, so
that readers could see what the original descriptions looked like; and (b) the
researcher considered it important to show matrix tables of similar descriptions, so
that it would be easier to see which respondents argued certain points.
Moreover, to answer the last research question, ‘do the teachers from different
schools have different teaching methods in implementing the standards in the
SACSA Framework and the Benchmarks in the Literacy and Numeracy Testing
Program?’, the researcher used the responses to Interview Questions 4, 8, 9, and 10,
analyzed by means of a ‘Boolean Search’.
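Although the Node Search, Assay Tool and Boolean Search were run within NVivo itself, the underlying logic of the analysis can be illustrated outside the software. The short Python sketch below is not NVivo code; it is a hypothetical illustration, assuming that each interview passage has already been coded to a respondent label and one or more theme nodes, of how an Assay-style respondent-by-theme matrix and a Boolean (intersection) query combining a theme with a school attribute might be produced. The respondent labels and themes are taken from the tables in Chapter 6; the school attribute values shown here are illustrative only.

# Illustrative sketch only -- not NVivo's API. It assumes interview passages have
# already been coded manually to a respondent label and one or more theme nodes.
from collections import defaultdict

# Hypothetical coded passages: (respondent, theme) pairs from the coding stage.
coded_passages = [
    ("BeChr", "Outcome expectation"),
    ("BeFrtn", "Outcome expectation"),
    ("daKty", "Reports to parents"),
    ("daKty", "South Australian base"),
]

# Hypothetical respondent attribute (school), as might be recorded from the
# background questionnaire; the A/B assignments here are illustrative.
school_of = {"BeChr": "A", "BeFrtn": "A", "daKty": "B"}

# Assay-style matrix: which respondents were coded at which themes.
matrix = defaultdict(set)
for respondent, theme in coded_passages:
    matrix[theme].add(respondent)

for theme, respondents in sorted(matrix.items()):
    print(f"{theme}: {len(respondents)} respondent(s) -> {sorted(respondents)}")

# Boolean-style query: respondents coded at a theme AND attached to one school.
target_theme = "Outcome expectation"
school_a_hits = {r for r in matrix[target_theme] if school_of.get(r) == "A"}
print(f"School A respondents coded at '{target_theme}': {sorted(school_a_hits)}")

In NVivo these steps correspond broadly to coding passages at nodes, assaying the interview documents against those nodes, and combining node and attribute criteria in a Boolean Search, which is how the comparison between the two schools was made for the third research question.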

5.2.1 Data collection

This study used interviews as the main source for collecting data, and used a brief
questionnaire to obtain information about the teachers who were interviewed. The
researcher used face-to-face interviews and interviewed the participants individually.
Related to this, Fontana and Frey (2005) pointed out that the most common form of
interviewing involved individual, face to face verbal interchange, but interviewing
could also take the form of face-to-face group interchange of ideas.
Collecting data using interviews had many advantages. By using interviews,
information could be gathered that went deeper than a superficial level. Usually a
respondent provided additional and useful information without being asked. Perakylä
(2005) claimed that by using interviews, the researcher could reach areas of reality at
a depth that would otherwise remain inaccessible. This could involve people’s
subjective experiences and attitudes. The interview was also a very convenient way
of overcoming distances both in space and in time. Past events and the reasons for
actions could only be studied by interviewing people who carried out those actions,
and who needed to give an account and explain these reasons.
Consequently, the researcher considered that the most appropriate method for
collecting data in this investigation was by structured interviews. Furthermore, the
researcher believed that it would be better to interview the respondents individually
than in groups, since it would help the researcher to transcribe the results of the
recording. Another reason was that it would enable the respondents to talk freely and
provide much more information. In addition, the main reason for the researcher using
face-to-face interviews was that it was not hard to get into the schools to interview
the participants since the schools were near to Flinders University. However,
opportunities arose for some of those teachers who were interviewed to prepare a
written account of their actions and how they thought. This provided an alternative
method of obtaining information from the respondents and enabled the two
methods of obtaining respondents’ answers to the same questions to be compared.
Moreover, a questionnaire was used in this research in order to provide supporting
information about the background of each respondent. The questionnaire is given in
Appendix 5. The responses to items in this background questionnaire could be used
as attributes of the respondents in further analysis of the data.

5.2.2 Data analysis

Why was NVivo used in analyzing these interview data?


Today NVivo is popular software for analyzing qualitative data. It is widely
considered to have many advantages for qualitative analysis: it is very good for
organizing large bodies of data, and it provides a facility referred to as the ‘modeler’
that allows models to be built and tested against the data collected in the project.
Richards (1999, p. 4) stated that NVivo was designed to introduce a
researcher to each of the data structures and research processes the software supports,
and to the research design and analysis strategies the researcher may use as he or she
develops his or her own project. In addition, Welsh (2002), cited in Ozkan (2004),
concluded that using qualitative data analysis software (QDAS) basically helped and
assisted the researchers during the labour-intensive process of qualitative data
analysis. Furthermore, Kelle (2004), who reviewed the significance of IT-supported
methods for qualitative research, argued that the use of appropriate software programs
might be valuable in: (a) examining differences, similarities and relationships
between passages of text; (b) developing typologies and theories; (c) testing
theoretical assumptions using qualitative data material and the integration of
qualitative and quantitative methods.

Moreover, Ozkan (2004) in his study employed a qualitative approach by using the
NVivo program. He described how a qualitative data package, NVivo, was used in a
study of authentic and constructivist learning and teaching in the classroom. He also
described how NVivo was used in the analysis of observational (video) data,
interviews, and field notes. Additionally, Ozkan (2004) in his article that was titled
‘Using Qualitative Classroom Data on Constructivist Learning Environments’,
described clearly how to use NVivo software in analyzing qualitative data. From the
statements presented, it is argued that the use of the NVivo program in analyzing the
data of this study is highly appropriate.

5.2.3 The sample and the rationale for the selection

First, it was decided to conduct the interviews with 11 respondents from two primary
schools in the southern suburbs of Adelaide. Furthermore, the participants were the
school Principals, the Deputy Principals, Year 3 teachers, Year 5 teachers, and Year 7
teachers. In addition, some teachers volunteered to provide written answers to the
questions that were used in the interview. Four teachers were very helpful in this
way. This additional response from these teachers added another dimension to the
study, since it allowed for a simple comparison between responses received from the
teachers through interviewing and the responses received in written answers to the
questions asked. This became an important and interesting secondary issue to this
study, that was undertaken to provide findings that were additional to the research
questions listed in Chapter 1.
The reason for choosing the schools was that those schools were located near
Flinders University, so that it would be easier for the researcher to collect data; and
the reason for choosing the Year 3 teachers, Year 5 teachers, and Year 7 teachers was
that the Literacy and Numeracy Test was given to pupils in Years 3, 5, and 7. The
Principals and the Deputy Principals were involved because the researcher
considered that these people would know more at the school level about the
implementation of the Standards in SACSA and the Benchmarks in the Literacy and
Numeracy Test.

5.2.4 Procedure

Prior to conducting this study, several Ethics Committee requirements had to be
satisfied. Ethics Committee approval for this study was obtained from The Flinders
University of South Australia Social Behavioural Research Committee (See
Appendix 1). The school principals were also given a Consent Form to be signed.
After ethics approval had been obtained, in February 2006, the South Australian
Department of Education and Children’s Services (DECS) was contacted. Subsequently,
the researcher contacted the schools. At the same time the researcher provided a
statement on the interview questions that would be addressed. Thus, the teachers
could consider whether they would participate in the study or not.
Then, the school Principal and the Deputy Principal contacted a number of teachers
in their schools and asked them whether they would like to participate in the study.
The researcher then arranged an appointment with the participants. The interviews
were held in the schools. Each interview lasted not more than 30 minutes.

School A
In School A, the school Principal was initially reluctant to participate in this study.
However, the number of teachers who participated was more than the researcher
required, that is, more than five participants. Thus, the total number of respondents
from this school was seven, namely the Deputy Principal, two Year 3 teachers, two
Year 5 teachers, and one Year 7 teacher. The Deputy Principal contacted
the teachers of Years 3, 5, and 7 in the school, and asked if any of them would like to
take part in the project.

School B
In this school, the Principal, the Deputy Principal, and the teachers in Years 3, 5, and
7 showed no reluctance to take part in this study. The School Principal contacted all
the teachers in Years 3, 5, and 7, and asked them if they would like to participate in
the study. One teacher from each year level was willing to be interviewed. First, the
researcher interviewed the School Principal, and the following day the researcher
interviewed the Deputy Principal and the three teachers.

5.3 Conclusion

Since the research focused on two schools, it was believed that the study clearly
could not represent the whole population of primary schools in Adelaide. However,
through detailed interviews with the 11 teachers from two schools, the researcher
was able to obtain both comprehensive and valuable information that related to the
implementation of the Standards in the SACSA Framework and the Benchmarks in
the Literacy and Numeracy Test.

Chapter 6
Findings and Discussions

6.1 Introduction

The findings presented in this chapter are based on the results of the Node Search
and Assay Tool analyses. The results from the Node Search are presented as text,
while the results from the Assay Tool are displayed in tables. These results aim to
answer the three research questions. The three research questions addressed are: (a)
how do the primary school teachers in Adelaide react to the specification of standards
in the SACSA Framework and the Benchmarks in the Literacy and Numeracy
Testing Program?; (b) what teaching methods do the teachers use in applying the
standards in the SACSA Framework and the Benchmarks in Literacy and Numeracy
Testing Program?; and (c) do the teachers from different schools have different
teaching methods in implementing the standards in the SACSA Framework and the
Benchmarks in the Literacy and Numeracy Testing Program?.

6.2 How do the primary school teachers in Adelaide react to the specification of standards in the SACSA Framework and the Benchmarks in the Literacy and Numeracy Testing Program?

In order to answer this question, the researcher used six interview questions namely
Questions Number 1, 2, 3, 5, 6 and 7 (See Appendix 4).

6.2.1 Question 1: What does your school understand by the terms of Standards in SACSA and the Benchmarks in the Literacy and Numeracy Test?

Standards in the SACSA Framework

Based on the data analysis, it can be concluded that the teachers defined the
Standards in the SACSA Framework as: (a) a guideline, (b) a level of instruction, (c)
outcome assessment; (d) outcome expectation, (e) reports to parents, (f) a South
Australian base, (g) talking about Australian students, and (h) a two year band. These
definitions are displayed in the following results that were drawn from both Node
Search and Assay Tool analysis in Table 6.1.

Guidelines
Document 'BeChr', 1 passage. Section 2.1.2, Paragraph 8, 53 characters.
I take as a guide from this book, teaching resources.
Document 'BeFv', 1 passage. Section 2.1, Paragraph 4, 136 characters.
Our school uses SACSA as a guideline to curriculum planning and implements the
Literacy and Numeracy test as DECS requires to gain data.
Document 'BeVrc', 1 passage. Section 2.1, Paragraph 5, 30 characters.
The formal criteria in there.
Document 'daDsy', 1 passage. Section 2.1, Paragraph 5, 99 characters.
We understand that we have to use the Standards in SACSA to program and plan for
students’ learning
Level of instruction
Document 'BeSvn', 1 passage. Section 2.1, Paragraph 5, 71 characters.
The Standards or levels of instruction aimed at particular Year Levels,
Outcome assessment
Document 'BeChr', 1 passage. Section 2.1, Paragraph 5, 396 characters.
At the end of Year 5, I expect that those students who will have achieved
the outcomes that I’m expecting for a certain learning area… and… and those
outcomes that I’m naming for I can assess or I’ll assess but not everyone. Some of
them I‘ll do orally, some of them I’ll do in written ways. And those come up
checking to see whether the students have achieved the Standards what I’m
expecting.
Document 'daDsy', 1 passage. Section 2.1, Paragraph 5, 81 characters.
There is an outcome in Year 3 level that students have to make in over two years.
Outcome expectation
Document 'BeChr', 1 passage. Section 2.1, Paragraph 5, 261 characters.
Most of the students that I teach are Year 5 students. In terms of the school and
myself, what I understand by the terms Standards in SACSA is that what I expect
the students to achieve at a certain age or certain year level, for instance Year 5
because I teach in Year 5.
Document 'BeFrtn', 1 passage. Section 2.1, Paragraph 5, 113 characters.
Standards is the expected outcomes for year levels, and Benchmarks are minimum
level two years behind Standards.
Document 'BelFftn', 1 passage. Section 2.1, Paragraph 5, 142 characters.
Standards are the outcomes that have to be achieved by learners at particular year
levels along a continuum across all areas of the curriculum
Document 'BeSvn', 1 passage. Section 2.1, Paragraph 5, 106 characters.
It is the expected outcomes for year levels, and Benchmarks are the minimum level
two years behind Standards.
Document 'BeVrc', 1 passage. Section 2.1, Paragraph 5, 142 characters.
In the curriculum standards with the respect to...They respect not just the minimum.
Most kids can get through. And they are outcomes based.
Document 'daDsy', 1 passage. Section 2.1, Paragraph 5, 201 characters.
And there is an outcome in Year 3 level that students have to make in over two
years. In primary schools may be in band one, and band two and sometimes three.
We have to plan out lessons to make bands.
Reports to parents
Document 'daKty', 1 passage. Section 2.1, Paragraph 5, 94 characters.
The Standards in SACSA Framework we use them as ours of the Benchmarks for
reporting parents.
South Australian base
Document 'BeVrc', 1 passage. Section 2.1, Paragraph 5, 52 characters.
The Standards in SACSA are South Australian Based
Document 'daKty', 1 passage. Section 2.1, Paragraph 5, 57 characters.
The SACSA Framework Standards are just South Australian
Talking about Australian children
Document 'daKty', 1 passage. Section 2.1, Paragraph 5, 123 characters.
The SACSA Framework Standards are just South Australian and they are more to do
with teaching talk about Australian kids.
Two year bands
Document 'daBrn', 1 passage. Section 2.1, Paragraph 5, 180 characters.
Standards in SACSA are those standards that have been set that children should
achieve in two year blocks. Standard 1, Standard 2, so those standards that children
should achieve.
Document 'daCrln', 1 passage. Section 2.1, Paragraph 7, 338 characters.
For me, the standards are about what we believe in the two years of primary which
kids should be travelling. And the kids who are struggling might be moving
towards. At the beginning of the standards, kids have done very well, some are in
between, that’s the standards, kids are doing very well. I will change and start
moving to the next standard.
Document 'daJo', 1 passage. Section 2.1, Paragraph 5, 117 characters.
The Standards, they align to two levels with curriculum context, but standards are
over two year period within SACSA.
Additional comments
Document 'daCrln', 1 passage. Section 2.1, Paragraph 5, 411 characters.
I don’t think the implementation of SACSA at a whole department has been done
very well. And I believe that a lot of teachers disengage with it. And that’s why now
there is a new push around looking with SACSA, and I think what has created that
is the fact that we have to do our reporting against the standards. And always there
is something that makes people move around that they have to really address.
Document 'daKty', 1 passage. Section 2.1, Paragraph 5, 121 characters.
The Standards in SACSA Framework are in Year 2, Year 4, Year 6, and Year 8. So,
in between not anything that designed.

Table 6.1 displays how the teachers defined the Standards in SACSA. From the table,
it can be seen that most of the teachers defined the Standards in SACSA as an
outcome expectation: six of the 11 teachers, or 55 per cent, considered that the
Standards were an outcome expectation. The Standards in SACSA were also defined
as a guideline for programming lessons, a definition given by four teachers (36 per
cent). Other definitions of the Standards in SACSA were a South Australian base and
outcome assessment, each supported by two respondents (18 per cent), while three
teachers (27 per cent) described the Standards as a two-year band. Finally, the three
remaining definitions of the Standards in SACSA, namely (a) talking about Australian
students, (b) a level of instruction, and (c) reports to parents, were each given by one
teacher (9 per cent).

Table 6.1 Definitions of the Standards in the SACSA Framework


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

SA base 0 0 0 0 0 1 0 0 0 0 1 2 18

Outcome assessment 1 0 0 0 0 0 0 0 1 0 0 2 18

Talking about Aus kids 0 0 0 0 0 0 0 0 0 0 1 1 9

Outcome expectation 1 1 0 1 1 1 0 0 1 0 0 6 55

Two year band 0 0 0 0 0 0 1 1 0 1 0 3 27

Reports to parents 0 0 0 0 0 0 0 0 0 0 1 1 9

Guideline 1 0 1 0 0 1 0 0 1 0 0 4 36

Levels of instruction 0 0 0 0 1 0 0 0 0 0 0 1 9

Totals 3 1 1 1 2 3 1 2 3 1 3 22

Note: First row - labels of respondents; First column - names of used themes
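The Percent column in Table 6.1 (and in the later tables) appears to be the number of respondents coded at a theme expressed as a proportion of the 11 interviewees; because respondents could mention more than one theme, the percentages sum to more than 100. The short sketch below is offered only as an illustration of that arithmetic, using the counts from Table 6.1; the rounding shown (e.g. 6/11 to 55 per cent) reproduces the figures quoted above.

# Illustrative arithmetic for the Percent column: respondents per theme out of 11.
counts = {
    "Outcome expectation": 6,
    "Guideline": 4,
    "Two year band": 3,
    "SA base": 2,
    "Outcome assessment": 2,
}
total_respondents = 11
for theme, count in counts.items():
    percent = round(100 * count / total_respondents)
    print(f"{theme}: {count}/{total_respondents} = {percent}%")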

As has been mentioned, the majority of the respondents defined the Standards in the
SACSA Framework as an outcome expectation. This definition accorded with the
statements issued by DECS (2000) and DETE (2000) that SACSA was a standard
referenced framework, which described the curriculum expected for all learners.
Furthermore, DECS (2000, online) and DETE (2000, online) stated that curriculum
standards were performance milestones that depicted what would reasonably be
expected of learners along a continuum of ever-improving performance.
Three definitions of the Standards in the SACSA Framework, namely the Standards
as a guideline for programming lessons, as a South Australian base, and as reports to
parents, were relevant to the results of a survey conducted by the Australian
Curriculum, Assessment and Certification Authorities of South Australia, or ACACA
(2000, online). It was found that the teachers who took part in the survey defined
‘curriculum standard’ as a common reference point for teachers and other educators
to use in monitoring, judging and reporting on learner achievement (in clearly
defined skills, knowledge and dispositions) over time. The next definition of the
Standards was outcome assessment. This definition accorded with DECS’ (2000,
online) and DETE’s (2000, online) description, which stated that teachers assessed
attainment of a curriculum standard for a learning area when a student demonstrated
achievement of all the outcomes comprising the standard: “Teachers assessed and
reported according to the attainment of the specified requirements” (DECS, 2000,
online; DETE, 2000, online). Other definitions of the Standards in the SACSA
Framework were a two year band and a level of instruction. These definitions were
supported by DECS (2000, online) and DETE (2000, online), which stated that the
SACSA Framework contained five curriculum standards that were placed at two-
year intervals and aligned with years of schooling, i.e. Years 2, 4, 6, 8 and 10.
The Standards in the SACSA Framework were also defined as referring to Australian
children. This finding was associated with DECS’ (2000, online) and DETE’s (2000,
online) description of SACSA as the framework from which educators from early
childhood to the end of secondary schooling constructed learning programs.

Benchmarks

Based on the data analysis, it can be concluded that the teachers defined the
Benchmarks in the LAN Test as: (a) high or low band, (b) minimum requirement, (c)
national tests, and (d) support acquirement. These definitions can be seen in the
following results that were drawn from both Node Search and Assay Tool analysis in
Table 6.2.

Students’ band
Document 'BeSvn', 1 passage. Section 2.1, Paragraph 5, 70 characters.
Benchmarks are where students should be at their particular year level
Document 'daCrln', 1 passage. Section 2.1, Paragraph 9, 114 characters.
I guess the kit is that the Benchmarks allow you to know where our kids should be
at a certain time in the school.
Document 'daDsy', 1 passage. Section 2.1, Paragraph 5, 58 characters.
And it tells you whether the kids’ bands are high or low.
Minimum requirement
Document 'BeChr', 1 passage. Section 2.1.2, Paragraph 10, 116 characters.
And in terms of the benchmarks, I understand they are the minimum requirements
that the students should reach.
Document 'BeFrtn', 1 passage. Section 2.1, Paragraph 5, 57 characters.
Benchmarks are minimum level two years behind Standards.
Document 'BeSvn', 1 passage. Section 2.1, Paragraph 5, 57 characters.
Benchmarks are minimum level two years behind Standards.
Document 'BeVrc', 1 passage. Section 2.1, Paragraph 5, 57 characters.
All the performance can be measured to minimum standards.
Document 'daBrn', 1 passage. Section 2.1, Paragraph 5, 130 characters.
The Benchmark is something that has been set as a minimum standard that children
need to attain in the Literacy and Numeracy Test.
Document 'daJo', 1 passage. Section 2.1, Paragraph 5, 216 characters.
And the Benchmarks in the LAN Test state levels to determine, I’m not sure
whether they are issued by the Education Department or the Australian Government
that they are basic level that the children are functioning out.
National test
Document 'BeVrc', 1 passage. Section 2.1, Paragraph 5, 95 characters.
The Benchmarks, the LAN Test, it is the National. The tests are given to the Years 3,
5, and 7.
Document 'daKty', 1 passage. Section 2.1, Paragraph 5, 54 characters.
And the Benchmarks, the Statewide and Australian wide,
Support indicator
Document 'BeSvn', 1 passage. Section 2.1, Paragraph 5, 40 characters.
Whether they need support or extension.
Document 'daKty', 1 passage. Section 2.1, Paragraph 5, 124 characters.
The Benchmarks, we actually use when we are looking at allocating children with
support, for literacy and numeracy support.
Additional comments
Document 'BeChr', 1 passage. Section 2.1.2, Paragraph 10, 154 characters.
And the number I think is very low for the majority of the students in the class.
Therefore we expect all students in the class to reach those Benchmarks.
Document 'daCrln', 1 passage. Section 2.1, Paragraph 9, 168 characters.
I think one thing that is harder, Ros, that they across two years. And we still think
very much of school. And only early bases, and teachers have travelled with that.
Document 'daDsy', 1 passage. Section 2.1, Paragraph 5, 78 characters.
But we don’t teach the Literacy and Numeracy Test; it is a different standard.
Document 'daKty', 2 passages. Section 2.1, Paragraph 5, 131 characters.
The Benchmarks in the LAN Test, are not very clear. We haven’t actually got the
Benchmarks, the LAN Test, we just get the results.
Section 2.1, Paragraph 5, 45 characters.
Also, the Benchmarks are in Years 3, 5, and 7

Table 6.2 presents information about how the teachers defined the Benchmarks in the
LAN Test. It is clearly shown that most of the teachers defined the Benchmarks as a
minimum requirement (55%) that students should achieve. Another definition of the
Benchmarks was a high or low band. This definition was supported by three teachers
(27%). Furthermore, the Benchmarks were also defined as a national test. This was
agreed by two teachers (18%). Interestingly, there were two teachers (18%) who
believed that the Benchmarks were aimed at seeing whether the students needed
support or an extension program.

Table 6.2 Definitions of the Benchmarks in the LAN Test


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Minimum requirement 1 1 0 0 1 1 1 0 0 1 0 6 54
National test 0 0 0 0 0 1 0 0 0 0 1 2 18
Support, Extension indicator 0 0 0 0 1 0 0 0 0 0 1 2 18
Kids’ band 0 0 0 0 1 0 0 1 1 0 0 3 27
Totals 2 1 0 0 3 2 1 2 2 1 3 17

Note: First row - labels of respondents; First column - names of used themes

The majority of the respondents defined the Benchmarks as a minimum requirement.
This finding accorded with the Ministers’ statement that they agreed to develop
national benchmarks for use in reporting minimum acceptable standards of literacy
and numeracy achievement in support of the national goal (DECS, 2000, p. iii). In
addition, the results of the survey conducted by ACACA across Australia also
supported this definition. According to the respondents of the survey, Benchmarks
were the descriptions of nationally agreed minimum acceptable standards of literacy
and numeracy performance at Years 3, 5 and 7 (ACACA of SA, and ACACA of
Queensland 2000, online).
The next definition of the Benchmarks in the LAN Test was a Students’ band. This
definition accorded with the concept of the benchmarks introduced and agreed by the
Ministers for Education in the States, the Territories and the Commonwealth in 1996,
‘every child leaving primary school should be numerate, and be able to read, write
and spell at an appropriate level’ (DECS, 2000, p. iii).
The next definition of the Benchmarks in the LAN Test was a National Test. This
definition was associated with Australian Council of State School Organization
(2006, p. 1, online). It stated that the Benchmarks were nationally agreed minimum
standards concerned with the essential elements of literacy and numeracy.

The next definition of the Benchmarks in the LAN Test was a support indicator. Two
respondents considered that the LAN Test could be used when teachers wanted to see
whether a student needed learning support in literacy and numeracy. This point does
not seem to be mentioned explicitly in DECS’ (2000) or DETE’s (2000) documents.
However, the Australian Council of State School Organization (2006, p. 2) pointed
out that the LAN Test would help systems to examine whether their strategies to
improve literacy and numeracy were working.

6.2.2 Question 2: How are the Standards in the SACSA framework and
the Benchmarks in the Literacy and Numeracy Test currently used
in your school by the principal and the teachers? Is the use of
Standards and the Benchmarks influenced by the context in
which your school is placed?

Standards in SACSA

The results show that the implementation of the Standards in SACSA involved: (a)
creating a reporting system, (b) a new reporting system ‘A, B, C, D, E’, (c) no tie to
the Standards and language in SACSA, (d) teaching the Standards in SACSA, and (e)
using the Standards to program lessons. These uses can be seen in the following
results that were drawn from both Node Search and Assay Tool analysis in Table 6.3.

Creating reporting system


Document 'BeVrc', 1 passage. Section 4.1, Paragraph 10, 217 characters.
I need to show you something. Here are some tables to put data about students’
learning in literacy and numeracy. And we give them spelling and reading tests….
And we use all those data to look at what students need.
Document 'daCrln', 1 passage. Section 4.1, Paragraph 20, 302 characters.
I guess what we have done is creating a reporting system for kids and parents. And
closely I guess not closely but loosely with SACSA. Now we have on the stage that
we haven’t changed very closely and consistently against Standards. And that’s
going to change, what’s in there report and how it works.
New reporting system ‘A, B, C, D’.
Document 'daCrln', 1 passage. Section 4.1, Paragraph 17, 508 characters.
The Standards have, I guess they’ve got really new…a little bit focus on the
Standards at the moment, and that’s because of the new reporting system when we
have to put A, B, C, D, E, F and we have to report, and strongly against the
Standards. I would say that most of them have the reporting system that with
consistent with SACSA, but not we believe that we tie it to the Standards and the
language in SACSA. And the reason is because the language of SACSA is difficult
for teachers and also for parents.

No tie to Standards and language in SACSA
Document 'daCrln', 1 passage. Section 4.1, Paragraph 17, 550 characters.
We very much, we use like I said to you….The Standards have, I guess they’ve got
really new…a little bit focus on the Standards at the moment, and that’s because of
the new reporting system when we have to put A, B, C, D, F and we have to report,
and strongly against the Standards. I would say that most of them have the reporting
system that with consistent with SACSA, but not we believe that we tie it to the
Standards and the Language in SACSA. And the reason is because the Language of
SACSA is difficult for teachers and also for parents.
Teaching Standards in SACSA
Document 'daDsy', 1 passage. Section 4.1, Paragraph 10, 41 characters.
Basically we teach the Standards in SACSA
Used to program lessons
Document 'BelFftn', 1 passage. Section 4.1, Paragraph 9, 102 characters.
The Standards and the Benchmarks influence how we set up curriculum planning
and programming lessons.
Document 'BeSvn', 1 passage. Section 4.1, Paragraph 9, 52 characters.
The Standards can be used when programming lessons.
Document 'BeVrc', 1 passage. Section 4.1, Paragraph 10, 163 characters.
We can try to work out good strategies to help to improve literacy and numeracy
standards. They can use them to work out. What …make them to make a little
progress
Document 'daBrn', 1 passage. Section 4.1, Paragraph 9, 81 characters.
Well, the Standards are used by the teachers to plan what they are going to teach
Document 'daJo', 1 passage. Section 4.1, Paragraph 9, 81 characters.
The standards in SACSA are used by the staff to plan and program their students.
Document 'daKty', 1 passage. Section 4.1, Paragraph 10, 227 characters.
We use the standards in SACSA Framework to program. So, everything that we
teach based on the program due to SACSA Framework Standards. We also use it
just start to develop a report format using the SACSA Framework Standards.
Table 6.3 shows the implementation of the Standards in the SACSA Framework. It
can be seen that more than half of the participants (six teachers, 54%) stated that the
Standards in the SACSA Framework were used to program lessons. Interestingly,
two respondents (18%) stated that they created their own reporting systems
addressed to the students and parents. In addition, one participant (9%) stated that a
new reporting system, ‘A, B, C, D, E, F’, was to be applied in making reports about
the students. One respondent noted that when her school made reports, it did not tie
them to the Standards and the language in the SACSA Framework. Lastly, one
participant stated that she taught the Standards in her school.

Table 6.3 Implementation of the Standards in the SACSA Framework
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Creating Reporting System 0 0 0 0 0 1 0 1 0 0 0 2 18
Not tie to standard & lang 0 0 0 0 0 0 0 1 0 0 0 1 9
Use to program lessons 0 0 0 1 1 1 1 0 0 1 1 6 54
New report Sys A, B, … 0 0 0 0 0 0 0 1 0 0 0 1 9
Teaching standards in SACSA 0 0 0 0 0 0 0 0 1 0 0 1 9
Totals 0 0 0 1 1 2 1 3 1 1 1 11

Note: First row - labels of respondents; First column - names of used themes

In implementing the Standards in the SACSA Framework, the majority of the
teachers used them to program lessons. This idea was supported by the statements
issued by ACACA (2000, online), DECS (2000, online) and DETE (2000, online).
They highlighted that SACSA was the framework from which educators from early
childhood to the end of secondary schooling constructed learning programs, assessed
progress and reported on the standards achieved.
In addition to the Standard implementation, two respondents argued that they created
their own reporting systems that would be addressed to the students and parents. One
of the two respondents mentioned clearly that what they put in the reporting system
was about students’ test results. The argument for creating a reporting system is
relevant to Tindal and Marston’s (1990) argument: Tindal and Marston (1990, p.
205) stated that the information that teachers used and needed most for teaching did
not come from standardized tests but from tests they made themselves and from
structured performance samples. In addition, Stiggins (1985, cited in Tindal and
Marston, 1990, p. 285) argued that locally-made tests had the advantage of increased
relevance and utility.
Related to the implementation of the Standards, one participant highlighted a new
reporting system, ‘A, B, C, D, E, and F’. She stated that teachers were expected to
report using this new system. Bloom, et al. (1971) and Terwilliger (1971) raised an issue
regarding such systems of marking. Bloom, et al. (1971, p. 7) argued that the system
of categorizing students was generally designed to approximate a normal distribution
of marks, such as A, B, C, D, E, and F at each grade or level. However, both Bloom
(1971) and Terwilliger (1971) acknowledged that this reporting system resulted in
disadvantages. Bloom, et al. (1971, p. 7) claimed that it was not likely that this
continual labelling had beneficial consequences for the individual’s educational
development, and it was likely that it had an unfavourable influence on a student’s
self-concept. Furthermore, Terwilliger (1971, p. 9) argued that the statement
‘students were more interested in their grades than what they learn’ had become a
cliché with some teachers.
Moreover, one of the respondents argued that when her school made a report, it was
not tied to the Standards and the language in the SACSA Framework, because the
language of the SACSA Framework was difficult for teachers and also for parents.
This implied that when the teacher reported her students’ attainment, she used
common terminology that was easy for parents to understand. On the
importance of using commonly understandable terminology, Bloom, et al. (1971, p.
20) argued that an educator had to choose words that conveyed the same meaning to
all intended readers, since statements of objectives could be interpreted differently by
different readers and gave them no direction in selecting materials, organizing
content, and describing obtained outcomes, nor did they provide a common basis for
instruction or evaluation.
The argument from a respondent that the Standards were taught in school was
relevant to the statement issued by DECS (2000) and DETE (2000) that the essence
of curriculum accountability was the construction of learning programs and reporting
of learner achievement on the basis of these standards.

Benchmarks in LAN

The results show that the implementation of the Benchmarks in the LAN Test
involved: (a) the Benchmarks influencing the curriculum, (b) an extension program
or support, (c) difficulty for a struggling child, (d) the implementation being
questioned, (e) no teaching relating to the Benchmarks, and (f) teaching according to
the LAN Test. These are presented in the following results that were drawn from
both Node Search and Assay Tool analysis in Table 6.4.

Benchmarks influenced curriculum


Document 'BelFftn', 1 passage. Section 4.1, Paragraph 9, 83 characters.
The Benchmarks influence how we set up curriculum planning and programming
lessons.
Document 'daCrln', 1 passage. Section 4.1, Paragraph 22, 210 characters.
We use them as looking for kids that we need to find out, more about, maybe the
kids are measured by testing around, developing a social program. Those kids who
are below the Benchmarks or not achieve the Benchmarks.

Extension program or support
Document 'daBrn', 1 passage. Section 4.1, Paragraph 10, 138 characters.
We use the Benchmarks to identify children who haven’t reached the levels of
Literacy and Numeracy understanding according to their level.
Document 'daJo', 1 passage. Section 4.1, Paragraph 9, 170 characters.
And the Benchmarks we use are those from the Literacy and Numeracy Test to
identify those students who would be a part of learning support or learning
extension program.
Document 'daKty', 1 passage. Section 4.1, Paragraph 10, 310 characters.
And we use the Benchmarks with the LAN Test to allocate support for children who
need supports. Some kids are, any kids may guess what a whole group of children
who are achieving to the Standards. And the kids, we use the LAN Test to focus on
certain support to get what they need.
Hard for a struggling child
Document 'daDsy', 1 passage. Section 4.1, Paragraph 10, 146 characters.
It is really hard for the kids who are struggling in Year 3 who are still in low band.
They can’t cope with the test. They don’t do it very well.
Implementation should be questioned
Document 'BeSvn', 1 passage. Section 4.1, Paragraph 9, 37 characters.
The Benchmarks should be questioned.
No teaching relating to Benchmarks
Document 'daDsy', 1 passage. Section 4.1, Paragraph 10, 126 characters.
But we don’t teach the LAN Test. I mean the children do it but we don’t need to
teach as a curriculum for students the Benchmarks.
Teaching according to LAN Test
Document 'BeFrtn', 1 passage. Section 4.1, Paragraph 9, 66 characters.
Teachers are aware and will teach according to the results of LAN.
Table 6.4 presents information about the implementation of the Benchmarks in the
LAN Test. Three participants (27%) believed that the Benchmarks in the LAN Test
were used to identify the students who needed support or an extension program.
Furthermore, two participants (18%) stated that the Benchmarks influenced their
curriculum. Interestingly, one participant claimed that the Benchmarks were hard for
a struggling student, while another considered that the implementation of the
Benchmarks had to be questioned. Lastly, one respondent argued that the
Benchmarks were not taught as a curriculum, while another stated that teaching took
place according to the results of the LAN Test.
In relation to the implementation of the Benchmarks in the LAN Test, it was argued
that the Benchmarks were implemented to identify the students who needed support
or an extension program. This argument
was associated with section 2.1.2.4 about definition of the Benchmarks in the LAN
Test ‘support acquirement’. Similarly, this point can be linked to the Australian
Council of State School Organization (2006, p. 2) in which it pointed out that the
main purpose of introducing the benchmarks was to assess and report the
performance of school systems to the Australian community using an agreed
benchmark.

Table 6.4 Implementation of Benchmarks in the LAN Test


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Bench. Influencing curriculum 0 0 0 1 0 0 0 1 0 0 0 2 18
Implementation be questioned 0 0 0 0 1 0 0 0 0 0 0 1 9
Hard for struggling kids 0 0 0 0 0 0 0 0 1 0 0 1 9
No teaching relating Bench. 0 0 0 0 0 0 0 0 1 0 0 1 9
Extension program or support 0 0 0 0 0 0 1 0 0 1 1 3 27
Teaching according to LAN T 0 1 0 0 0 0 0 0 0 0 0 1 9

Totals 0 1 0 1 1 0 1 1 2 1 1 9

Note: First row - labels of respondents; First column - names of used themes

The argument that the Benchmarks influenced the curriculum and the argument that
teaching took place according to the LAN Test would seem to have the same
meaning. However, these arguments were in marked contrast to the argument that no
teaching took place that related to the Benchmarks. This might imply that some
schools used the results of the LAN Test as feedback to improve or remake their
programs. On the other hand, there was a statement that the results of the LAN Test
did not impact on the curriculum. From these arguments, we can see schools’
freedom in developing their teaching programs. Associated with these statements,
Keeves (1999, p. 115) argued ‘the curriculum is now left to the teachers and is not
prescribed in the statements and profiles.’ Furthermore, the Education Department of
South Australia (1977, p. 3) commented that teachers at that time had a large
measure of freedom in choosing the outcomes of education that they wished to teach.
One respondent argued that the implementation of the Benchmarks had to be
questioned. Unfortunately, this respondent did not provide any reason why the
Benchmarks had to be questioned. However, there was an argument that the
Benchmarks were hard for a struggling child. Although these arguments derived
from different teachers, they could be related. Whatever the reasons, this
implies that these teachers considered that the implementation of the Benchmarks was
not working effectively.

School context

The results concerning school context show two positions: (a) the context influencing
the Standards and the Benchmarks, and (b) the context not influencing the Standards
and the Benchmarks. These are presented in the following results that were drawn
from both Node Search and Assay Tool analysis in Table 6.5.

School context influenced Standards and Benchmarks


Document 'BelFftn', 1 passage. Section 4.1, Paragraph 9, 154 characters.
The Standards and the Benchmarks influence how we set up curriculum planning
and programming lessons. As we have a NAP unit, a high level of ESL students.
Document 'daCrln', 2 passages. Section 4.1, Paragraph 19, 263 characters.
And the context about this school, we have sixty six kids’ school cards and along
with… Some of the parents are literate. So, where the literacy is very strong, and
you know all the stuff about language in SACSA and the language in Standards
have been written in.
Section 4.1, Paragraph 22, 422 characters.
As far as the Benchmarks, the Benchmarks in other school context, they all give a
good feel about the kids. I guess it’s natural. …. We use them as looking kids that
we need to find out more about. Maybe the kids are measured by testing around,
developing social program. Those kids who are below the Benchmarks or not
achieving the Benchmarks.
School context did not influence Standards and Benchmarks
Document 'daBrn', 1 passage. Section 4.1, Paragraph 12, 156 characters.
And I don’t think the use of Standards and Benchmarks are influenced by the
context in which our school is placed. (No not really. No, I wouldn’t say this.)
Document 'daJo', 1 passage. Section 4.1, Paragraph 9, 87 characters.
Mm ..no, they’re not. They’re not influenced by the context in which the school
placed.
Additional comments
Document 'BeChr', 2 passages. Section 4.1, Paragraph 24, 417 characters.
Because of previous curriculum in terms of Statement and Profiles, which was
before SACSA and before the attainment levels, so I think teachers have become
very cynical about how long they would survive. I think they survived longer than
those previous ones. And I think it was very slow in the first few years. People
started using them and understanding them. I’m not talking about every teacher but
just generally.
Section 4.1, Paragraph 27, 370 characters.
I think it depends very much on the Principal. We have changed the Principal this
year. So that’s very much …just routines maintenance while we’re waiting for a new
Principal to come along. So in this school in particular, not like my previous school,
we haven’t done lots of work with them. So it’s really left that on the individual
teacher to use that they want to.
Document 'BeFv', 1 passage. Section 4.1, Paragraph 8, 74 characters.
Tests are given and inferences are made on issues which would be improved.

Table 6.5 shows whether school context influenced the implementation of the
Standards and the Benchmarks. It can be seen that two participants claimed that
the school context influenced the implementation of the Standards and the
Benchmarks. On the other hand, two other respondents argued that the school
context did not influence the implementation of the Standards and the Benchmarks;
whereas the rest of the participants did not provide information. DECS (2000, online)
and DETE (2000, online) claimed that Curriculum Accountability was demonstrated
through the construction of a curriculum that was responsive to a diversity of
learners. This implied that school context influenced the Standards in the SACSA
Framework. Thus, the findings that the school context influenced the standards was
supported by the statement issued by DECS and DETE (2000, online).
In contrast, there was the opinion that the school context did not influence the
Standards in the SACSA Framework. This implied that the respondents or the school
did not follow the DECS’ guidelines.

Table 6.5 The implementation of both the Standards and the Benchmarks in
terms of school context
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Context influenced Standards and Bench. 0 0 0 1 0 0 0 1 0 0 0 2 18
Context did not influence Standards and Bench. 0 0 0 0 0 0 1 0 0 1 0 2 18

Totals 0 0 0 1 0 0 1 1 0 1 0 4

Note: First row - labels of respondents; First column - names of used themes

6.2.3 Question 3: What is your personal opinion about the Standards in
SACSA and Benchmarks in the Literacy and Numeracy Test?

Standards in SACSA

In terms of teachers’ personal opinions towards the Standards in the SACSA
Framework, the results show several themes: (a) difficult terminology, (b) general
and broad, (c) good outcomes, (d) hard to teach, (e) one year band, (f) only some
Standards were useful, (g) programming work, (h) too hard and too easy, and (i)
useful. These are presented in the following passages, drawn from both the Node
Search and the Assay Tool analysis, and summarized in Table 6.6.

Difficult terminology
Document 'daBrn', 1 passage. Section 6.1, Paragraph 19, 459 characters.
But the problem is that it is created by the Standards that they go in two year blocks.
That makes it more difficult for teachers to terminate difficult terms, and to plan
what they are doing. And also it is something that parents don’t understand very

well because parents look at children’s daily needs who and in Year 5, and that is it.
So to achieve what they need to know in Year 5, they can’t always get their head
around the idea that goes over two years.
Document 'daDsy', 1 passage. Section 6.1, Paragraph 15, 112 characters.
Sometimes they are hard to understand because the standards are very broad and
terminology is hard to understand
General and broad
Document 'BelFftn', 1 passage. Section 6.1, Paragraph 14, 48 characters.
I think the Standards are very general and broad
Document 'daBrn', 1 passage. Section 6.1, Paragraph 19, 189 characters.
But the problem is that it is created by the Standards that they go to two year blocks.
That makes it more difficult for teachers to use difficult terms, and to plan what they
are doing.
Document 'daDsy', 1 passage. Section 6.1, Paragraph 15, 144 characters.
Sometimes they are hard to understand because the standards are very broad and
terminology is hard to understand. It is very difficult to read.
Good outcomes
Document 'daDsy', 1 passage. Section 6.1, Paragraph 15, 110 characters.
My opinion about SACSA is, it is good, like the outcomes are good. Like we need
them and playing around them.
Hard to teach
Document 'BeChr', 1 passage. Section 6.1.1.1, Paragraph 41, 278 characters.
I think there are so many outcomes in there and you can’t teach all of them. And so
you have to be quite selective about what you use. So they’re just too
comprehensive. Some of the outcomes are too hard for the children, for my year
level, Year 5, and they couldn’t reach them.
Document 'daBrn', 1 passage. Section 6.1, Paragraph 19, 454 characters.
The problem is that it is created by the Standards that they go to two year blocks.
That makes it more difficult for teachers to terminate difficult terms, and to plan
what they are doing. And also it is something that parents don’t understand very
well because parents look at children’s daily needs who being in Year 5, and that is
it. So to achieve what they need to know in Year 5, they can’t always get their head
around the idea that goes over two years.
Document 'daDsy', 1 passage. Section 6.1, Paragraph 16, 382 characters.
The bands are across two years. So Band I is for Years 1 and 2, and Band 2 is for
Years 3 and 4. So, it is really hard to plan for two years. You don’t have the kids for
two years. That makes it difficult as well. When you are assessing, it is very hard to
give a report to the children; it is really hard to give them marks when you haven’t
covered because they haven’t reached the outcomes.
Document 'daJo', 1 passage. Section 6.1, Paragraph 14, 537 characters.
I will much prefer to see them over one year period than two year period. I think
because we have complex classes and they arrange abilities of students that the
Standards over two year period is very hard to be able to use it in planning and also
then have to…….6 and 7 year teacher and they have another 6 and 7 year teacher.
So we work it out between us. What we can teach within those Standards. Actually

in the period, I’ve got students in Years 6 and 7. And it’s hard to work out within
two year period rather than just one year period.
One year band
Document 'daKty', 1 passage. Section 6.1, Paragraph 16, 753 characters.
My personal opinion, the SACSA Framework is quite useful. It would be better if it
was instead of being mm Standard 1 is for Years 1 and 2. Two Year is divided into
two is much better because we have it every year. Standard 1 was for Year 1,
Standard 2 was for Year 2. It is very difficult because all classes, in most classes in
South Australia are composite classes. So, we have Years 4 and 5. If I look at the
program at the time, I’ll look at Standard 3. But also I have to look at Standard 4.
And when I’m reporting to a developer, I will report as well as to parents, it’s
difficult because each report needs different language and different Standard. And in
my class, I have to have two separate reports, some for Year 4 and some for Year 5.
Only some Standards were useful
Document 'BeChr', 2 passages. Section 6.1, Paragraph 34, 198 characters.
I found they were very useful for my planning and programming such as
mathematics, society, environment, and the English 1. Some of them, like you said
that had were schedule occasionally, but I don’t
Section 6.1.1.1, Paragraph 39, 118 characters.
But in terms of finding them useful yes, three, four of them are very useful. But
three, four of them are not useful.
Programming work
Document 'BelFftn', 1 passage. Section 6.1, Paragraph 14, 150 characters.
I use the Standards for programming work. As a teacher it is useful to know what
the Standards are in mainstreams and prepare my students accordingly.
Document 'BeVrc', 1 passage. Section 6.1, Paragraph 23, 92 characters.
Standards in SACSA help us develop programs, for the teachers to look at
developed programs.
Document 'daBrn', 1 passage. Section 6.1, Paragraph 19, 64 characters.
I guess, in terms of in determining what children need to know.
Some were too hard, and too easy
Document 'BeChr', 1 passage. Section 6.1.1.1, Paragraph 41, 134 characters.
Some of the outcomes are too hard for the children, for my year level, Year 5, and
they couldn’t reach them. Some of them are too easy
Useful
Document 'BeFrtn', 1 passage. Section 5.2, Paragraph 13, 21 characters.
Standards are useful,
Document 'BeVrc', 1 passage. Section 6.1, Paragraph 23, 177 characters.
Standards in SACSA help us develop programs, for the teachers to look at
developed programs now whereas the LAN Test in the Benchmarks are useful. The
useful things about SA
Document 'daBrn', 1 passage. Section 6.1, Paragraph 19, 34 characters.
The Standards of SACSA are useful,

Document 'daCrln', 1 passage. Section 7, Paragraph 27, 241 characters.
I think they are very useful in sense that inconsistency across side like …I guess you
know you can go along my way, and I could think, perhaps a student manages well
in my class, is really achieving everything that they need to achieve.
Document 'daJo', 1 passage. Section 6.1, Paragraph 14, 103 characters.
I think they are useful, but I will much prefer to see them over one year period than
two year period.
Document 'daKty', 1 passage. Section 6.1, Paragraph 16, 57 characters.
My personal opinion, the SACSA Framework is quite useful.
Table 6.6 records the teachers’ personal opinions regarding the Standards in the
SACSA Framework. These opinions fall into two categories: positive and negative.
From the table it can be seen that more than half of the respondents (55%)
considered that the Standards in the SACSA Framework were useful. One
respondent (9%) stated that the Standards had good outcomes, and three respondents
(27%) used the Standards for programming work. On the other hand, four
respondents (36%) claimed that the Standards were hard to teach. Furthermore, three
participants (27%) argued that the Standards were very general, and two of them
(18%) considered that the Standards had difficult terminology. One respondent (9%)
stated that only some of the Standards were useful, and that some were too hard and
some too easy. Moreover, one respondent believed that it would be better if the
Standards were arranged in one-year bands rather than two-year bands.
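For clarity, the ‘Percent’ column in Table 6.6 (and in the other theme tables in this
chapter) appears to be calculated against the full sample of 11 respondents rather than
only those who commented on a particular question. A worked example, under that
assumption, is shown below.

% Assumed calculation of the 'Percent' column, using the 'useful' theme
% from Table 6.6 as an example (6 of the 11 respondents coded to the theme).
\[
\text{Percent} = \frac{\text{number of respondents coded to the theme}}{11} \times 100,
\qquad \text{e.g. } \frac{6}{11} \times 100 \approx 55\%.
\]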

Table 6.6 Personal opinion about the Standards in the SACSA Framework
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Good outcomes 0 0 0 0 0 0 0 0 1 0 0 1 9

General 0 0 0 1 0 0 1 0 1 0 0 3 27

One year band 0 0 0 0 0 0 0 0 0 0 1 1 9

Some too hard 1 0 0 0 0 0 0 0 0 0 0 1 9

For programming 0 0 0 1 0 1 1 0 0 0 0 3 27

Hard to teach 1 0 0 0 0 0 1 0 1 1 0 4 36

Only some useful 1 0 0 0 0 0 0 0 0 0 0 1 9

Difficult terminology 0 0 0 0 0 0 1 0 1 0 0 2 18

Useful 0 1 0 0 0 1 1 1 0 1 1 6 55

Totals 3 1 0 2 0 2 5 1 4 2 2 22

Note: First row - labels of respondents; First column - names of used themes

Two respondents argued that the Standards in the SACSA Framework used difficult
terminology. This argument is similar to the argument in Table 6.3 about the
implementation of the Standards in the SACSA Framework, in which a respondent
claimed that when her school made a report, it was not tied to the Standards and the
language in the SACSA Framework, because the language of the SACSA Framework
was difficult for teachers and also for parents. Again, this argument can be linked to
Bloom et al.’s argument (1971, p. 20) that an educator had to choose words that
convey the same meaning to all intended readers, since a statement of objectives that
could be interpreted differently by different readers gave them no direction in
selecting materials, organizing content, and describing obtained outcomes, nor did it
provide a common basis for instruction or evaluation.
One of the respondents considered that some of the Standards in SACSA were too
hard, while others were too easy. This implies that the problem lay with the difficulty
level of the outcomes or objectives in the Standards. Related to this argument, Shayer
and Adey (2002, p. 5) argued that cognition could be stimulated by the presentation
of intellectual challenges of moderate difficulty.
The argument that the Standards were hard to teach would seem to be related to the
arguments that the Standards were very general and broad and that they should be
changed to one-year bands. Thus, the breadth and generality of the outcomes,
together with the two-year bands, can be regarded as the two main factors that made
the Standards hard to teach and to achieve. This implies that the ideas in the
Standards document are complex, which may influence how learning units or tasks
can be planned. In relation to this issue, Bloom et al. (1971, p. 16) highlighted the
idea that instructional material and processes had to be organized into smaller units
than an entire course, grade, or program.
Many respondents acknowledged that the Standards in the SACSA Framework were
useful. One reason for this was that they used the Standards to program their work or
lessons. Unfortunately, some of the participants did not clarify why they deemed the
Standards useful. In addition, one respondent stated that the Standards had good
outcomes and were therefore needed. In terms of using the Standards to program
lessons, this can be linked to the earlier section, ‘definitions of the Standards in the
SACSA Framework’, about Standards as guidelines. Again, the argument that the
Standards were useful was consistent with the results of a survey conducted by the
Australian Curriculum, Assessment and Certification Authorities of South Australia,
or ACACA (2000, online), which found that a ‘curriculum standard’ was a common
reference point for teachers and other educators to use in monitoring, judging and
reporting on learner achievement over time.

One teacher argued that only some Standards were useful. This was relevant to
Bloom et al.’s (1971) suggestions about what a teacher should do when linking his or
her own objectives to nationally developed objectives. Bloom et al. (1971, p. 36)
stated that: (a) teachers could compare their own objectives, what they thought
important, with those of the national group; and (b) teachers had to study carefully
the objectives of the given curriculum in relation to their particular situations.

Benchmarks

In terms of teachers’ personal opinions towards the Benchmarks in the LAN Test, the
results identified several themes: (a) disagreed, (b) gathering data level, (c) giving
limited ideas, (d) giving teachers ideas, (e) not useful, (f) too low, and (g) useful. The
findings are presented in the following passages, drawn from both the Node Search
and the Assay Tool analysis, and summarized in Table 6.7.

Disagreed
Document 'daKty', 1 passage. Section 6.1, Paragraph 19, 33 characters.
The LAN test, I really disagree.
Gathering data level
Document 'BeFv', 1 passage. Section 6.1, Paragraph 13, 63 characters.
The Literacy and Numeracy Test is useful to gather data levels.
Giving limited ideas
Document 'BeSvn', 1 passage. Section 6.1, Paragraph 13, 72 characters.
The Benchmarks can only give a very limited idea of students’ abilities.
Document 'daJo', 1 passage. Section 6.1, Paragraph 15, 62 characters.
I don’t think it gives good enough information about students
Document 'daKty', 1 passage. Section 6.1, Paragraph 22, 209 characters.
So, children particularly in my class, Mathematics became very hard. Unless I train
them how to read the LAN Test which I will not do. They were misspelling and Art.
So, it is not a true indication where kids at.
Giving teachers ideas
Document 'daCrln', 1 passage. Section 7, Paragraph 33, 531 characters.
It also gives me an idea. I can look at what it is like in other schools, how they are
performing. So, it gives me a sense of, as the school, how we are going now like in
carrying the literacy and numeracy. And you know, our performing as well as other
schools. And if we are not performing, maybe it’s necessary to be looking at what
we need to change, and doing reviewing. And also at the class level, at the student
level, I can get some feedback about classes that might be covered by kids that are
not doing well I can also get sense from year to year.

Not useful
Document 'daJo', 1 passage. Section 6.1, Paragraph 15, 114 characters.
I don’t think it gives good enough information about students and I don’t think the
Benchmarks are really useful.
Too low
Document 'BeFrtn', 1 passage. Section 5.2, Paragraph 13, 23 characters.
Benchmarks are too low.
Document 'daBrn', 1 passage. Section 6.1, Paragraph 21, 46 characters.
But they are ridiculously or they are very low
Document 'daJo', 1 passage. Section 6.1, Paragraph 15, 28 characters.
I think they’re far too low.
Useful
Document 'BeVrc', 1 passage. Section 6.1, Paragraph 23, 44 characters.
The LAN Test in the Benchmarks are useful.
Document 'daBrn', 1 passage. Section 6.1, Paragraph 21, 192 characters.
With the Benchmarks, hhhh..mmm (breath). Yea I guess the Benchmarks are a
useful indicator of what is determined to be the minimum if you like, but they are
ridiculously or they are very low.
Document 'daCrln', 1 passage. Section 7, Paragraph 33, 671 characters.
I think the Benchmarks are useful, the LAN Test is useful as long as they are seen
as one piece of information alongside other information. It also gives me an idea. I
can look at what it is like in other schools, how they performing. So, it gives me a
sense of, as the school, how we are going now like in carrying out literacy and
numeracy. And you know, we are performing as well as other schools. And if we are
not, maybe it’s necessary to be looking at what we need to change, and doing
reviewing. And also at the class level, at the student level, I can get some feedback
about classes that might be covered by kids that are not doing well. I can also get
sense from year to year.
Additional comments
Document 'BeChr', 1 passage. Section 6.1.2.1, Paragraph 47, 397 characters.
It depends on the school Principal how much we will use. And the other thing too, is
that they haven’t been for universal reporting system, and the requirement. They are
allowed to develop their own assessment procedures. So that also doesn’t encourage
teachers I supposed to use a lot. I suppose to use them lots. Because they know the
best school will modify and change as they want to.
Document 'daBrn', 1 passage. Section 6.1, Paragraph 21, 376 characters.
I guess one is cynical, one tends to wonder whether they set low standards. So the
government doesn’t have to provide extra support for children. So, they can say if
they have reached a particular mark, and that’s fine we are not gonna give you
anymore support. So, the lower that could be the better because we know that most
children are going to achieve that lower standard.

Table 6.7 shows the teachers’ opinions associated with the Benchmarks in the LAN
Test. Their opinions about the Benchmarks are classified into two categories: positive
and negative. Three teachers (27%) considered that the Benchmarks were useful. In
terms of the usefulness of the Benchmarks, one teacher (9%) stated that the
Benchmarks could give teachers ideas, and another (9%) that they were useful for
gathering data levels. On the other hand, one teacher claimed that the Benchmarks
were not useful, and one teacher totally disagreed with them. Interestingly, the
numbers of teachers who claimed that the Benchmarks were too low and that they
gave limited ideas were the same (27% each).

Table 6.7 Personal opinion about Benchmarks


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Giving teachers ideas 0 0 0 0 0 0 0 1 0 0 0 1 9

Giving limited ideas 0 0 0 0 1 0 0 0 0 1 1 3 27

Disagreed 0 0 0 0 0 0 0 0 0 0 1 1 9

Useful 0 0 0 0 0 1 1 1 0 0 0 3 27

Too low 0 1 0 0 0 0 1 0 0 1 0 3 27

Gathering data level 0 0 1 0 0 0 0 0 0 0 0 1 9

Not useful 0 0 0 0 0 0 0 0 0 1 0 1 9

Totals 1 1 1 0 1 1 3 2 0 3 2 15

Note: First row - labels of respondents; First column - names of used themes

Some of the respondents stated that the Benchmarks in the LAN Test were useful, for
example for gathering data levels and for giving teachers ideas. These arguments can
be linked to Terwilliger’s (1971) discussion of the important aspects of educational
evaluation. Terwilliger (1971, p. 4) stated that the aspects that had to be considered
in educational evaluation were: (a) the need for value judgments concerning the merit
of methods and materials used in education, and (b) the judgment of the merit of
personnel responsible for the educational enterprise.
On the other hand, there were arguments that the Benchmarks were not useful, gave
limited ideas, and were too low. Moreover, one respondent stated plainly that she
disagreed with the Benchmarks in the LAN Test.

6.2.4 Question 5: Do you think the Standards and the Benchmarks that
are associated with outcome statements are appropriate for Year
3, or Year 5, or Year 7 students?

Standards in SACSA

In terms of the appropriateness of the Standards in the SACSA Framework, the
results are grouped into three categories based on Year level: Year 3, Year 5, and
Year 7. The findings are presented in the following descriptions, drawn from both
the Node Search and the Assay Tool analysis.

Year 3

There were three main types of opinion towards the appropriateness of the Standards
in the SACSA Framework, namely ‘appropriate’, ‘inappropriate’, and ‘unsure’. These
are presented in the following descriptions and in Table 6.8.

Appropriate
Document 'BeFrtn', 1 passage. Section 10, Paragraph 22, 58 characters.
The standards are associated with the outcome statements,
Document 'BeVrc', 1 passage. Section 10.1, Paragraph 37, 66 characters.
I think most the Standards are appropriate, it depends on the kids.
Document 'daCrln', 1 passage. Section 11.1, Paragraph 45, 50 characters.
From my experience, I think they are appropriate.
Document 'daDsy', 1 passage. Section 10.1, Paragraph 28, 63 characters.
Yes, I think they are. I think they are appropriate for Year 3.
Document 'daKty', 1 passage. Section 11.1, Paragraph 38, 205 characters.
The Benchmarks both in the SACSA and the LAN Test………kids already know
how to write their names; put on their shoes; feed themselves, dress themselves.
When they come into Reception, a lot of kids cannot.
Inappropriate
Document 'BeSvn', 1 passage. Section 10.1, Paragraph 22, 94 characters.
No, I don’t think the standards and the Benchmarks are associated with the outcome
statements.
Not sure
Document 'BeFv', 1 passage. Section 11, Paragraph 22, 9 characters.
Not sure.
Table 6.8 presents information about the appropriateness of the Standards in the
SACSA Framework for Year 3. The majority of the respondents who commented on
the appropriateness of the Standards in Year 3 considered that the Standards were
appropriate (45%). On the other hand, one respondent (9%) claimed that the
Standards in Year 3 were inappropriate, one respondent (9%) was unsure, and the rest
of the respondents provided no answer.

Table 6.8 The appropriateness of the Standards in Year 3


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Unsure 0 0 1 0 0 0 0 0 0 0 0 1 9

Inappropriate 0 0 0 0 1 0 0 0 0 0 0 1 9

Appropriate 0 1 0 0 0 1 0 1 1 0 1 5 45

Totals 0 1 1 0 1 1 0 1 1 0 1 7

Note: First row - labels of respondents; First column - names of used themes

There were two contradictory opinions concerning the Standards in the SACSA
Framework in Year 3. The majority of the respondents argued that the Standards in
Year 3 were appropriate, while one participant claimed that they were inappropriate
without providing any reason.
The argument that the Standards in the SACSA Framework in Year 3 were
appropriate was consistent with Piaget’s theory of intellectual development, which
points to students’ readiness for learning (cited in the Education Department of South
Australia, 1977, p. 53). On the other hand, the respondent who claimed that the
Standards were inappropriate unfortunately did not specify any reason for this view.
Moreover, one participant was unsure whether the Standards in Year 3 were
appropriate or not.

Year 5

It was found that in Year 5, there were only two categories of opinion: ‘appropriate’
and ‘inappropriate’. These are presented in the following statements and in Table 6.9.

Appropriate
Document 'daCrln', 1 passage. Section 11.1, Paragraph 45, 48 characters.
From my experience, I think they are appropriate
Inappropriate: too hard, too easy
Document 'BeChr', 1 passage. Section 10.2, Paragraph 62, 92 characters.
In terms of the Standards, some of them I think they are too hard, some of them are
too easy
Table 6.9 displays the appropriateness of the Standards in Year 5. Only two Year 5
teachers participated in this study, and they clearly held different opinions regarding
the appropriateness of the Standards in Year 5. One believed that the Standards in
Year 5 were appropriate, whereas the other claimed that they were inappropriate.

Table 6.9 The appropriateness of the Standards in Year 5


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Appropriate 0 0 0 0 0 0 0 1 0 0 0 1 9

Inappropriate 1 0 0 0 0 0 0 0 0 0 0 1 9

Totals 1 0 0 0 0 0 0 1 0 0 0 2

Note: First row - labels of respondents; First column - names of used themes

These two contradictory opinions concern the Standards in the SACSA Framework
in Year 5. One respondent argued that the Standards in Year 5 were appropriate,
while the other claimed that they were inappropriate on the grounds that some of the
Standards were too hard and others were too easy or too low.
The argument that the Standards in the SACSA Framework in Year 5 were
appropriate was not a problem, since it meant that the teaching materials and ideas
used were matched to the stages of intellectual development proposed by Piaget.
Related to this, the Education Department of South Australia (1977, p. 53) argued
that Piaget’s theory of stages of intellectual development was closely linked to
students’ readiness for learning, and further suggested that the materials and ideas
used for learning should be associated with these stages.
On the other hand, the respondent who claimed that the Standards were inappropriate
because some of them were too hard and others too easy or too low implied that the
teaching materials were either unchallenging or beyond students’ reach, contrary to
the theory of intellectual development that the Education Department of South
Australia (1977, p. 53) claimed underpinned the South Australian Curriculum. In
terms of challenging teaching materials, Shayer and Adey (2002, p. 5) raised a
similar issue, arguing that cognition could be stimulated by the presentation of
intellectual challenges of moderate difficulty. Thus, providing teaching materials of
moderate difficulty for students is very important.

Year 7

In Year 7, the findings show that no respondent claimed that the Standards in Year 7
were inappropriate. However, some of the teachers offered no opinion. These
findings are given in the following descriptions and in Table 6.10.

Appropriate
Document 'BelFftn', 1 passage. Section 10.1, Paragraph 23, 94 characters.
For Year 7, the Standards and the Benchmarks seem to be associated with the
outcome statement.
Document 'daCrln', 1 passage. Section 11.1, Paragraph 45, 50 characters.
From my experience, I think they are appropriate,
Document 'daJo', 1 passage. Section 10.1, Paragraph 28, 30 characters.
The standards are associated,
Inappropriate
(N/A)

Table 6.10 presents information about the appropriateness of the Standards in Year 7.
Only three teachers commented on Year 7, and all three argued that the Standards in
Year 7 were appropriate.
The comments stating that the Standards in Year 7 were appropriate imply that there
was no problem in Year 7 in terms of the appropriateness of the Standards in the
SACSA Framework.

Table 6.10 The appropriateness of the Standards in Year 7


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Appropriate 0 0 0 1 0 0 0 1 0 1 0 3 27

Inappropriate 0 0 0 0 0 0 0 0 0 0 0 0 0

Totals 0 0 0 1 0 0 0 1 0 1 0 3

Note: First row - labels of respondents; First column - names of used themes

Benchmarks in the LAN

Year 3

In Year 3, it was found that there were three themes in terms of the appropriateness
of the Benchmarks in the LAN Test: ‘appropriate’, ‘inappropriate’, and ‘unsure’.
These findings are shown in the following comments and in Table 6.11.

Appropriate
Document 'daCrln', 1 passage. Section 11.1, Paragraphs 44-45, 335 characters.
From my experience, I think they are appropriate, but at the same time, I’m not a
classroom teacher who’s working with Years 3, 5, and 7, and looking at some
comparison. But I’m not having anyone saying that this doesn’t work. But they are
not saying that to you. But probably they have been handled around the LAN
Testing.
Document 'daDsy', 1 passage. Section 10.1, Paragraph 28, 64 characters.
Yes, I think they are. I think they are appropriate
Inappropriate
Document 'BeFrtn', 1 passage. Section 10, Paragraph 22, 59 characters.
…however the Benchmarks are not. The Benchmarks are too low.
Document 'BeSvn', 1 passage. Section 10.1, Paragraph 22, 94 characters.
No, I don’t think the standards and the Benchmarks are associated with the outcome
statements.
Document 'BeVrc', 1 passage. Section 10.1, Paragraph 34, 226 characters.
But one thing that the benchmarks I don’t think they are appropriate for ESL. We
have lots of ESL kids here. Some of them find the LAN Test are too hard, because
they find English is very hard in order to work in the LAN Test.
Document 'daKty', 2 passages. Section 11.1, Paragraph 38, 227 characters.
I think the Year 3 finds things very intimidating, a lot of kids try to catch up. They
come not for learning, they come to school, and they can’t read and write their
name. So, they do a lot of things to catch up.
Section 11.1, Paragraph 40, 71 characters.
When you are in Year 3, some children may not have done very many tests.
Not sure
Document 'BeFv', 1 passage. Section 11, Paragraph 22, 9 characters.
Not sure.
Table 6.11 presents information about the appropriateness of the Benchmarks in Year
3. Of the seven teachers who commented on the appropriateness of the Benchmarks
in Year 3, four considered that the Benchmarks were ‘inappropriate’, two stated that
they were ‘appropriate’, and one was ‘not sure’.

Table 6.11 The appropriateness of the Benchmarks in Year 3


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty totals Percent

Inappropriate 0 1 0 0 1 1 0 0 0 0 1 4 36

Not sure 0 0 1 0 0 0 0 0 0 0 0 1 9

Appropriate 0 0 0 0 0 0 0 1 1 0 0 2 18

Totals 0 1 1 0 1 1 0 1 1 0 1 7

Note: First row - labels of respondents; First column - names of used themes

The majority of the participants who commented argued that the Benchmarks in
Year 3 were inappropriate for three main reasons: (a) the Benchmarks were hard for
ESL students, (b) the Benchmarks were too low, and (c) the Benchmarks were
intimidating for students who were not accustomed to tests. As in the previous
section on the appropriateness of the Standards in the SACSA Framework, the
appropriateness of the Benchmarks in the LAN Test can be associated with the issue
of intellectual challenges of moderate difficulty discussed by Shayer and Adey
(2002, p. 5), who argued that cognition could be stimulated by the presentation of
such challenges. This implies that in designing a curriculum or tests, it is essential to
consider the difficulty level of the materials; otherwise, the materials or the tests will
not be appropriately challenging.

Year 5

In Year 5, it was found that there were two themes in terms of the appropriateness of
the Benchmarks in the LAN Test: ‘appropriate’ and ‘inappropriate’. These findings
are shown in the following comments and in Table 6.12.

Appropriate
Document 'daCrln', 1 passage. Section 11.1, Paragraph 45, 190 characters.
I think they are appropriate. I’m not having any staff who are saying that this
doesn’t work. But they are not saying that to you. But probably they have been
handled around the LAN Testing.
Document 'daKty', 1 passage. Section 11.1, Paragraph 40, 34 characters.
So I think it’s much better in Year 5
Inappropriate: very low
Document 'BeChr', 1 passage. Section 10.2, Paragraph 60, 177 characters.
I think the Benchmarks are very low, and I expect that ..learning difficulties for
students. And I expect all students will achieve the Benchmark sets because they are
very low.
Table 6.12 presents information about the appropriateness of the Benchmarks in Year
5. The results show that of the three respondents who commented on the
appropriateness of the Benchmarks in Year 5, two considered that they were
appropriate, while the other claimed that the Benchmarks in Year 5 were very low.

Table 6.12 The appropriateness of the Benchmarks in Year 5


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Appropriate 0 0 0 0 0 0 0 1 0 0 1 2 18
Inappropriate: Very Low 1 0 0 0 0 0 0 0 0 0 0 1 9

Totals 1 0 0 0 0 0 0 1 0 0 1 3

Note: First row - labels of respondents; First column - names of used themes

As with the Benchmarks in Year 3, the respondent who claimed that the Benchmarks
in Year 5 were inappropriate stated that they were very low; this was the only reason
given for their inappropriateness. However, some teachers argued that the
Benchmarks in Year 5 were appropriate.
The situation for the Benchmarks in Year 5 is the same as that for Year 3. Therefore,
the concept of intellectual challenges of moderate difficulty proposed by Shayer and
Adey (2002, p. 5) can again be applied to these opinions.

Year 7

In Year 7, it was found that there were three themes in terms of the appropriateness
of the Benchmarks in the LAN Test: ‘appropriate’, ‘inappropriate’, and ‘Benchmarks
should be higher’. These findings are shown in the following comments and in
Table 6.13.

Appropriate
Document 'BelFftn', 1 passage. Section 10.1, Paragraph 23, 65 characters.
The Benchmarks seem to be associated with the outcome statement.
Document 'daCrln', 1 passage. Section 11.1, Paragraph 45, 49 characters.
From my experience, I think they are appropriate.
Document 'daKty', 1 passage. Section 11.1, Paragraph 40, 206 characters.
Especially Year 7, the LAN Test in Year 7 is much more accurate. Children are
ready for testing. I’ve done this twice already. So, I know. They are ready for
testing, so they know what we expect from them.
Inappropriate
Document 'daJo', 1 passage. Section 10.1, Paragraph 28, 78 characters.
But the Benchmarks are not. They do not correlate. The Benchmarks are too low.
Table 6.13 presents information about the appropriateness of the Benchmarks in Year
7. The results show that of the four respondents who commented on the
appropriateness of the Benchmarks in Year 7, three stated that the Benchmarks were
appropriate, and only one claimed that they were inappropriate.

Table 6.13 The appropriateness of the Benchmarks in Year 7


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Appropriate 0 0 0 1 0 0 0 1 0 0 1 3 27

Inappropriate 0 0 0 0 0 0 0 0 0 1 0 1 9

Totals 0 0 0 1 0 0 0 1 0 1 1 4

Note: First row - labels of respondents; First column - names of used themes

Three respondents argued that the Benchmarks in Year 7 were appropriate. However,
there was also a statement that the Benchmarks in Year 7 were inappropriate. As
with the mixed views on the Benchmarks in Years 3 and 5, these comments can be
interpreted in the light of Shayer and Adey’s (2002, p. 5) argument about intellectual
challenges of moderate difficulty.

Benchmarks should be higher
Document 'daBrn', 1 passage. Section 10.1, Paragraph 31, 249 characters.
The Benchmarks as I said are set so the children are going to achieve those levels,
but those benchmarks should be higher. I think in some cases they should be.
Whether the Benchmarks should be higher, and that I think in some cases they
should be.
For those years: Some are appropriate
Document 'daBrn', 1 passage. Section 10.1, Paragraph 31, 167 characters.
In some ways, the Standards are appropriate. The Standards are appropriate, for
each of those doesn’t matter which you are talking about, in any of the Years, I
guess.
Table 6.14 displays information on the appropriateness of the Standards and the
Benchmarks in Years 3, 5, and 7 as a whole. One respondent argued that some of
them were ‘appropriate’, but that in some cases the Benchmarks should be higher.

Table 6.14 Some of the Standards and the Benchmarks were appropriate
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Should be higher 0 0 0 0 0 0 1 0 0 0 0 1 9
Some were appropriate 0 0 0 0 0 0 1 0 0 0 0 1 9

Totals 0 0 0 0 0 0 2 0 0 0 0 2

Note: First row - labels of respondents; First column - names of used themes

6.2.5 Question 6: Are there any conflicting ideas and practices on trying
to achieve both the Standards and the Benchmarks?

The Standards in the SACSA Framework

The responses to the question, ‘Were there any conflicting ideas and practices in
achieving the Standards in SACSA?’, involve two main themes: (a) there were
conflicting ideas (this theme involved ‘hard to achieve’ and ‘hard to assess’); and
(b) there were no conflicting ideas. These findings are given in the following
comments and in Table 6.15.

Hard to achieve
Document 'BeChr', 1 passage. Section 12.1, Paragraph 69, 132 characters.
There are some children with learning difficulties in the class, and it makes difficult
for those students to achieve the Standards.
Document 'BeFrtn', 1 passage. Section 12.1, Paragraph 26, 46 characters.
But achieving the Standards is more difficult.
Hard to assess
Document 'BeChr', 1 passage. Section 12.1, Paragraph 69, 249 characters.

In terms of the Standards that come from the SACSA documents, it is a little bit
harder to assess because there are so many of them. And I guess I don’t have enough
hard data over the years that I collected. But it is a difficult question to answer.
No conflicting ideas
Document 'BelFftn', 1 passage. Section 12.1, Paragraph 27, 23 characters.
Not that I am aware of.
Document 'daBrn', 1 passage. Section 12.1, Paragraph 35, 226 characters.
No, I don’t think any conflict there. I mean teachers need…..They need basic forms
to work to that SACSA provides that format in terms of what children are expected
to know. So I don’t believe that there is a conflicting idea.
Not creating unit work
Document 'daDsy', 1 passage. Section 12.1, Paragraph 33, 85 characters.
The Standards don’t teach….about creating a unit work of and producing unit of
work.
Standards and Benchmarks need to match
Document 'daJo', 1 passage. Section 12.1, Paragraph 32, 145 characters.
Yes, but they don’t correlate. The Benchmarks are much lower. The Standards are
obviously put at eight levels, so they can continue to elaborate.
Document 'daKty', 1 passage. Section 13.1, Paragraph 44, 478 characters.
The conflicting ideas between the SACSA Framework and the Benchmarks: the
SACSA Framework, kids are set up in Yeas 2, 4 6 and 8. And the Benchmarks are in
between years; Years 3, 5, and 7. The Benchmarks need to match up the SACSA
Framework, so it’s 2, 4, 6 or the SACSA Framework in 3, 5, and 7. Because the
Standards in the Benchmarks are here, I’m not pleased as a teacher. We get the
results, and we can get certain information about the Benchmarks. We don’t get
everything.
Various pedagogy
Document 'daCrln', 1 passage. Section 13.1, Paragraph 50, 106 characters.
I guess that what I have to say is that the pedagogy that we use in SACSA to achieve
Standards is various.
Table 6.15 presents information about whether there were any conflicting ideas in
implementing the Standards in the SACSA Framework. The results show that two of
the 11 respondents stated that there were no conflicting ideas in implementing the
Standards, and one respondent stated that various pedagogies were used in
implementing the Standards. On the other hand, two respondents claimed that the
Standards were hard to teach, and two argued that the Standards needed to match the
Benchmarks in the LAN Test. In addition, one participant stated that the Standards
did not assist in creating a unit of work, and another stated that they were hard to
assess.

Table 6.15 Conflicting ideas in Standards


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Hard to teach 1 1 0 0 0 0 0 0 0 0 0 2 18

No conflicting idea 0 0 0 1 0 0 1 0 0 0 0 2 18

Various pedagogies 0 0 0 0 0 0 0 1 0 0 0 1 9

Hard to assess 1 0 0 0 0 0 0 0 0 0 0 1 9
Bench. & Stan. need to match 0 0 0 0 0 0 0 0 0 1 1 2 18

Not creating unit work 0 0 0 0 0 0 0 0 1 0 0 1 9

Totals 2 1 0 1 0 0 1 1 1 1 1 9

Note: First row - labels of respondents; First column - names of used themes

Related to those arguments, Tomlinson (2002) raised a similar issue. Tomlinson
(2002, p. 38) suggested that academic standards should not be in conflict with good
teaching. She further argued that all professions clearly articulate standards. In
schools, according to her, such standards could help establish a common direction,
ensure some equity in learning goals, provide a ready means of communicating
among educators as well as with parents and the community, and serve as common
benchmarks that allow teachers to record a student’s learning journey.

Benchmarks in LAN Test

The responses to the question, ‘Were there any conflicting ideas and practices in
achieving the Benchmarks in the LAN Test?’, involved six main themes: (a)
confusing, (b) no conflicting idea, (c) psychological burden, (d) the Standards and
the Benchmarks needed to match, (e) teachers disagreed, and (f) too easy or too low.
These findings are displayed in the following statements and in Table 6.16.

Confusing
Document 'daKty', 1 passage. Section 13.1, Paragraph 46, 179 characters.
In the Benchmarks, all tests are done whether it ended wrong or right, not the
standards of contextual learning and experience. It is completely complex and
confused kids a lot.
No conflicting idea
Document 'BeFv', 1 passage. Section 14.1, Paragraph 27, 96 characters.
Teachers begin teaching to the Literacy and Numeracy Test so that children achieve
good results.
Document 'BelFftn', 1 passage. Section 12.1, Paragraph 27, 23 characters.
Not that I am aware of.
Document 'daBrn', 1 passage. Section 12.1, Paragraph 35, 37 characters.
No, I don’t think any conflict there.
Psychological burden
Document 'daCrln', 1 passage. Section 13.1, Paragraph 50, 137 characters.
And the way teachers teach in classroom is very different from the way children are
having to access learning when they do the LAN Test. So, there is a conflict there.

Document 'daDsy', 1 passage. Section 12.1, Paragraph 33, 298 characters.
When you do the Literacy and Numeracy Test, a different environment for the kids
to work in. I think they might not feel very well working in a room that they are not
used to and doing it without a teacher helping them. So, they might feel nervous.
They might get depressed. And they find it scary.
Standards and Benchmarks need to match
Document 'daJo', 1 passage. Section 12.1, Paragraph 32, 145 characters.
Yes, but they don’t correlate. The Benchmarks are much lower. The Standards are
obviously put at eight levels, so they can continue to elaborate.
Document 'daKty', 1 passage. Section 13.1, Paragraph 44, 241 characters.
The conflicting ideas between the SACSA Framework and the Benchmarks: the
SACSA Framework, kids are set up in Yeas 2, 4 6 and 8. And the Benchmarks are in
between years; Years 3, 5, and 7. The Benchmarks need to match up the SACSA
Framework.
Teachers disagreed
Document 'BeVrc', 1 passage. Section 12.1, Paragraph 41, 220 characters.
Lots of teachers disagree with implementing the LAN Test, because we’re worried
that the results will be put on a list and we will be marked as a bad school. And
parents will just to send their kids to favourite schools.
Document 'daKty', 1 passage. Section 13.1, Paragraph 44, 186 characters.
Because the Standards in the Benchmarks are here, I’m not pleased here as a
teacher. We get the results, and we can get certain information about the
Benchmarks. We don’t get everything.
Too easy or too low
Document 'BeChr', 1 passage. Section 12.1, Paragraph 69, 187 characters.
Generally I find that when LAN came back, the Literacy and Numeracy tests came
back that there are very few students who do not achieve the Benchmarks. They
always achieve the Benchmarks.
Document 'BeFrtn', 1 passage. Section 12.1, Paragraph 26, 32 characters.
Achieving the Benchmarks is easy
Document 'daJo', 1 passage. Section 12.1, Paragraph 32, 30 characters.
The Benchmarks are much lower.
Table 6.16 presents information about whether there were any conflicting ideas in
implementing the Benchmarks in the LAN Test. The results show that three of the 11
respondents (27%) stated that there were no conflicting ideas in trying to achieve the
Benchmarks. This opinion would seem to be supported by three other respondents
(27%) who argued that the Benchmarks were too easy or too low. On the other hand,
two of the teachers (18%) claimed that they disagreed with the Benchmarks. This
opinion was related to that of two other respondents (18%) who highlighted that the
Benchmarks in the LAN Test could place a psychological burden on children; and
one respondent (9%) considered that the Benchmarks were confusing.

Table 6.16 Conflicting ideas in Benchmarks
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

No conflicting idea 0 0 1 1 0 0 1 0 0 0 0 3 27

Teachers disagreed 0 0 0 0 0 1 0 0 0 0 1 2 18

Confusing 0 0 0 0 0 0 0 0 0 0 1 1 9

Too easy or too low 1 1 0 0 0 0 0 0 0 1 0 3 27

Psychological burden 0 0 0 0 0 0 0 1 1 0 0 2 18
Bench. & Stan. need to match 0 0 0 0 0 0 0 0 0 1 1 2 18

Totals 1 1 1 1 0 1 1 1 1 2 3 13

Note: First row - labels of respondents; First column - names of used themes

6.2.6 Question 7: Are there any ways in which you think the standards
and the Benchmarks should be changed or modified?

Standards in the SACSA Framework

The responses provided several suggestions for changing or modifying the
Standards in the SACSA Framework. However, there was also a suggestion not to
change or modify the Standards. The findings identified eight main themes: (a) hard
to score, (b) make them flexible, (c) no change or modification needed, (d) one year
band, (e) select the key ones, (f) there should be some core, basic keys, (g) useful as
a guide, and (h) very busy curriculum. These findings are displayed in the following
comments and in Table 6.17.

Hard to score
Document 'daDsy', 1 passage. Section 15.1, Paragraph 41, 222 characters.
A lot of teachers find it hard when we want to do grades with A, B, C, D, E. It is a
little bit hard to give them a score in Year 3, or 5, because at the beginning of the
Standards, you are not coming toward the end of it.
Make them flexible
Document 'BeChr', 2 passages. Section 14.1, Paragraph 75, 304 characters.
Because I have been teaching quite long, so I think certain standards are important
and outcomes are important that students have to cover and achieve, and there are
others that are not important and from year to year you might change, what you
teach and what the outcomes might be. So it can be flexible
Section 14.1, Paragraph 77, 138 characters.
And then there are perhaps others that can be more flexible and you choose. And
that one way that I would like to see them changed or modified.
No change or modification needed
Document 'daBrn', 1 passage. Section 14.1, Paragraph 40, 219 characters.
I don’t think the Standards and the Benchmarks need to be changed or modified in
terms of the context of the school. I believe that they need to be made uniform. I
don’t think they need to be changed to fit the context.

One year band
Document 'daDsy', 1 passage. Section 15.1, Paragraph 38, 121 characters.
I think the only thing that I’d like to change with the Standards is maybe to have it
over one year instead of two years.
Document 'daKty', 1 passage. Section 15.1, Paragraph 50, 168 characters.
The Standards should be modified into one year, rather than two years. That would
make, it much easier for teachers to actually went, like okay I’ve got this enforced.
Select the key ones
Document 'BeChr', 1 passage. Section 14.1, Paragraph 75, 43 characters.
I think they should pick out the key ones.
There should be some core, basic keys
Document 'BeChr', 1 passage. Section 14.1, Paragraph 75, 94 characters.
There should be some core, basic keys, Standards, and outcomes that students are
heading towards.
Useful as a guide
Document 'BeSvn', 1 passage. Section 14.1, Paragraph 31, 48 characters.
The Standards can be useful as a guide, so okay.
Very busy curriculum
Document 'BeChr', 1 passage. Section 14.1, Paragraph 75, 54 characters.
It is a very busy curriculum, very crowded curriculum.
Document 'daKty', 1 passage. Section 15.1, Paragraph 50, 119 characters.
And now I’ve got Year 4 need to know all it is harder for the next one. so that is
definitely should be modified a lot.
Table 6.17 displays data about whether the Standards in SACSA should be changed
or modified. Of the 11 participants, only one believed that the Standards did not need
to be changed or modified. This opinion was supported by another respondent who
stated that the Standards were useful as a guide. On the other hand, two of the
respondents criticized the Standards as a very busy curriculum, and one respondent
complained that the Standards were hard to score. Furthermore, one respondent
added that the Standards had to be questioned. To improve the Standards, two
respondents (18%) suggested that the two-year bands should be changed to one-year
bands. Additionally, one participant suggested that the Standards should be made
flexible and that teachers should select the key Standards.

Table 6.17 Suggestions for changing or modifying the Standards in the SACSA
Framework
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent

Select the key ones 1 0 0 0 0 0 0 0 0 0 0 1 9

Be questioned 1 0 0 0 0 0 0 0 0 0 0 1 9

Useful as a guide 0 0 0 0 1 0 0 0 0 0 0 1 9

Very busy curriculum 1 0 0 0 0 0 0 0 0 0 1 2 18

Make them flexible 1 0 0 0 0 0 0 0 0 0 0 1 9

Hard to score 0 0 0 0 0 0 0 0 1 0 0 1 9

1 year not 2 year band 0 0 0 0 0 0 0 0 1 0 1 2 18

No change needed 0 0 0 0 0 0 1 0 0 0 0 1 9

Totals 4 0 0 0 1 0 1 0 2 0 2 10

Note: First row - labels of respondents; First column - names of used themes

When teachers suggested that the two-year bands should be changed to one-year
bands, it could be assumed that this was related to the argument that the Standards
constituted a very busy curriculum, and hence that the Standards were hard to score
and had to be questioned. Related to these issues, Tomlinson (2001, p. 39) argued
that lists of standards were often presented to teachers with little or no effective
guidance on how to use them in curricular planning. She further highlighted that
standards documents provided virtually no modelling of the connectivity of
knowledge, and that the topic of construction was absent from most standards
documents.
The suggestions that the Standards had to be made flexible and that the key ones had
to be selected were relevant to DECS’ (2000) and DETE’s (2000) comments that the
SACSA Framework provided the context in which educators, site leaders, and state
office personnel planned, monitored, allocated resources, took appropriate action and
accounted for the quality of learning programs. However, the fact that teachers
offered such suggestions implied that the implementation of the Standards was
inflexible and too broad, so that teachers had to be selective.
The argument that the Standards were useful and did not have to be changed or
modified suggests that the objectives of the Framework determined by DECS (2000)
and DETE (2000), as described above, were being achieved.

Benchmarks in LAN Test

The results showed several suggestions for changing or modifying the Benchmarks
in the LAN Test. However, there was also a suggestion not to change or modify the
Benchmarks. The findings identified seven main themes: (a) no change or
modification needed, (b) not applicable to ESL, (c) no work with the Benchmarks,
(d) refuse the Benchmarks, (e) respect for learning styles, (f) setting classroom for
support, and (g) terminology should be changed. These findings are displayed in the
following statements and in Table 6.18.

No change needed
Document 'daBrn', 1 passage. Section 14.1, Paragraph 40, 219 characters.
I don’t think the Standards and the Benchmarks need to be changed or modified in
terms of the context of the school. I believe that they need to be made uniform. I
don’t think they need to be changed to fit the context.
Not applicable to ESL
Document 'BeSvn', 1 passage. Section 14.1, Paragraph 31, 51 characters.
Benchmarks are not applicable to ESL at all, so no!
No work with Benchmarks
Document 'daDsy', 1 passage. Section 15.1, Paragraph 38, 42 characters.
We don’t really work with the Benchmarks.
Refused Benchmarks
Document 'BeFrtn', 1 passage. Section 14.1, Paragraph 32, 47 characters.
Yes, I think the Benchmarks should be scrapped.
Document 'BelFftn', 1 passage. Section 14.1, Paragraph 31, 215 characters.
I would throw out the LAN Test. It is irrelevant and teachers are quite capable of
assessing their students work continuously. The LAN Test is too narrow. Subjective
and does not reflect a students’ true capability.
Document 'daKty', 1 passage. Section 15.1, Paragraphs 52-53, 423 characters.
And the Benchmarks, I don’t agree with them at all. So, no qualification. I’m sure
they’re serving a purpose for us; very slow. A small purpose for us in that we can
see where children need help sometimes. But by the time the Benchmarks are done,
the testing done and we already know where our kids are at because we test at the
beginning of the year. And the testing is not a true indication about what where the
kids are.
Respected to learning styles
Document 'daJo', 1 passage. Section 14.1, Paragraph 36, 83 characters.
I don’t believe that it highly respects to learning styles or is this an advantage.
Setting classroom for support
Document 'daCrln', 1 passage. Section 16.1, Paragraph 55, 143 characters.
It should be that this is what we need to be doing more in the classroom setting for
support. Do you know what I mean? So, I’m not sure at all.
Terminology should be changed
Document 'daCrln', 1 passage. Section 16.1, Paragraph 55, 175 characters.
Sometimes in the LAN Test, some of the language is what the kids find it hard to
communicate due to limited life experiences or something like that.
Document 'daJo', 1 passage. Section 14.1, Paragraph 36, 520 characters.
I don’t believe that the information that is given or communicated to parents is, you
know, that explains the scores or assessments. I think some of them would be too
complicated for parents. Especially, output to be able to understand it; to be printed
out. You know some of the answers or assessments are too difficult to understand.
Some of the parents haven’t got much discussion or some of them haven’t looked at

anything like that. So, I don’t think that it helps, and all of that should be changed
and simplified.
Table 6.18 displays data on whether the Benchmarks should be changed or modified.
Most of the respondents who answered suggested that the Benchmarks had to be
changed or modified, and three of them rejected the Benchmarks altogether.
However, one participant suggested that the Benchmarks did not have to be changed.
In terms of the ways of changing or modifying the Benchmarks, two respondents
suggested that the Benchmark terminology had to be changed. In addition, one
respondent emphasized that it was necessary to do more in the classroom setting for
support. This opinion was related to that of another respondent who argued that the
Benchmarks did not take account of differences in learning styles. For this reason,
the respondent suggested that the Benchmarks had to consider learning styles and
had to be applicable to students learning English as a Second Language (ESL).
The argument that the Benchmarks did not need to be changed or modified implied
that there was no problem with the Benchmarks, and those who supported this
argument might consider that the existence of the Benchmarks was appropriate,
especially in terms of uniformity.
On the other hand, the majority of the respondents suggested changes or
modifications, in terms of terminology, applicability to ESL, the classroom setting
for support, and not working with the Benchmarks. The teachers’ requests for change
or modification were consistent with Eggleston’s concept (1980, cited in Campbell,
1985, p. 33) that the appropriate style for developing the curriculum was
co-operative, with staff working together to produce plans for change.

Table 6.18 Suggestions for changing or modifying the Benchmarks in the LAN
Test
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Setting classroom for support 0 0 0 0 0 0 0 1 0 0 0 1 9
Refused Bench. 0 1 0 1 0 0 0 0 0 0 1 3 27
Not applicable to ESL 0 0 0 0 1 0 0 0 0 0 0 1 9
No change needed 0 0 0 0 0 0 1 0 0 0 0 1 9
Respect for learning styles 0 0 0 0 0 0 0 0 0 1 0 1 9
Change terminology 0 0 0 0 0 0 0 1 0 1 0 2 18
No work with Bench. 0 0 0 0 0 0 0 0 1 0 0 1 9
Totals 0 1 0 1 1 0 1 2 1 2 1 10

Note: First row - labels of respondents; First column - names of used themes
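The Percent column in this and the subsequent theme tables appears to express the number of respondents coded to each theme as a proportion of the eleven interviewed teachers listed in the header row, rounded to the nearest whole number. Assuming this reading, a worked example for the 'Refused Bench.' row is:

\[ \text{Percent} = \frac{\text{respondents coded to the theme}}{11} \times 100, \qquad \text{e.g. } \frac{3}{11} \times 100 \approx 27. \]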

The argument that the terminology in the Benchmarks had to be changed because it was difficult to understand was similar to the argument of ‘no tie to the Standards and language in SACSA’ regarding the implementation of the Standards in the SACSA Framework, and to the statement about ‘difficult terminology’ in the teachers’ personal opinions of the Standards in the SACSA Framework. This argument was also supported by Bloom et al.’s (1971, p. 20) contention that an educator had to choose words that conveyed the same meaning to all intended readers, since statements of objectives that could be interpreted differently by different readers gave them no direction in selecting materials, organizing content, and describing obtained outcomes, nor did they provide a common basis for instruction or evaluation.
The participant who argued that the Benchmarks were not applicable to ESL students did not specify why. However, it is likely that this was related to the difficult terminology, as discussed previously.
The argument that teachers needed the classroom setting for support implied that creating a supportive environment in the classroom is necessary to help students learn. A supportive classroom environment might involve: (a) appropriate and interesting materials, and (b) the use of appropriate teaching methods. Piaget, cited in Shayer and Adey (2002, p. 6), stated that the environment that created cognitive conflict and stimulated cognitive growth was essentially a physical environment as much as a social one.
One respondent argued that the Benchmarks had to respect learning styles. This implied that the teacher deemed that the Benchmarks had so far paid little attention to students’ learning styles. With regard to the importance of respecting learning styles, DECS (1994, p. 1) proposed providing appropriate teaching and learning methods for students in groups and as individuals.
The argument of ‘no work with the Benchmarks’ related to the argument in the section ‘Benchmarks were only tests’ and to the argument in the section ‘no teaching relating to the Benchmarks’ concerning the teaching methods used in achieving the Benchmarks. This implied that the Benchmarks were not involved in daily teaching at all, since they were merely tests. The argument ‘no teaching relating to the Benchmarks’ appeared twice: first in the section on the teaching methods used in achieving the Benchmarks, and second in the section on how the Benchmarks were implemented. Thus, these arguments, derived from different parts of the interview and different questions, support the idea that the Benchmarks and the Standards are two different things and are not related.
Three teachers clearly rejected the Benchmarks. Two of them gave reasons for their rejection: that the Benchmarks had no qualification, were irrelevant, too narrow and subjective, and did not reflect a student’s true capability. This argument related to the argument in the section ‘teachers disagreed with the Benchmarks’ concerning the teachers’ personal opinions of the Benchmarks. From the reasons provided, it could be inferred that the teachers criticized the content of the Benchmarks, and this criticism can be related to evaluation in education. Terwilliger (1971, p. 4) stated that there was a need for value judgments concerning the merit of methods and materials used in education.

6.3 What teaching methods do the teachers use in applying the Standards in the SACSA Framework and the Benchmarks in the Literacy and Numeracy Testing Program?

In order to answer this question, the researcher used interview questions 4, 8, 9 and 10 (see Chapter 5).

6.3.1 Question 4: What teaching methods do you use for your students
to achieve the Standards in the SACSA Framework and the
Benchmarks in the Literacy and Numeracy Test?

Standards in the SACSA Framework

The findings show that there were ten teaching methods used by the teachers in achieving the Standards in the SACSA Framework. These were: (a) using the companion document, (b) considering current pedagogy, (c) different teaching methods, (d) discussion, (e) doing tests but not relating to the SACSA Framework, (f) explicit teaching, (g) independent learning, (h) an integrating approach, (i) thinking methodology, and (j) using websites. These findings can be seen in the following comments and in Table 6.19.

Companion document
Document 'BeChr', 1 passage. Section 8.1, Paragraph 54, 59 characters.
I’m planning to use the companion documents for planning.
Current pedagogy
Document 'daCrln', 2 passages. Section 9.1, Paragraph 37, 83 characters.

We’re constantly keeping abreast with what we consider to be current pedagogy.
Section 9.1, Paragraph 39, 202 characters.
We are looking at the voice of education, we’re looking at primary research as a part
of learning, with a guide of reading, we run phonological awareness nearly every
year. So there are many pedagogies.
Different teaching methods
Document 'daBrn', 1 passage. Section 8.1, Paragraph 26, 459 characters.
I think the teaching methods that they use often vary from teacher to teacher, because each teacher has a particular way of teaching, or a particular method. They also vary depending upon the best way in which children learn, because all children learn in different ways. They all have different strengths and different intelligences. So, therefore, teachers are going to use a variety of ways for different children. We won’t use the same thing for all children.
Document 'daJo', 1 passage. Section 8.1, Paragraph 23, 121 characters.
And the Standards in SACSA obviously they became a part of our programming.
And we use lots of different teaching methods
Document 'daKty', 1 passage. Section 9, Paragraph 27, 82 characters.
The assessment that I use for achieving the Standards in the SACSA Framework is varied.
Discussion with kids
Document 'BeChr', 1 passage. Section 8.1, Paragraph 54, 334 characters.
And also I discussed with the students the criteria for the assessment …what I’m
looking for. So they have an idea before they start a unit of work. What I’m
expecting as an outcome or outcomes. It doesn’t mean that I tell them what is in the
book, because even I, as a teacher, sometimes doesn’t understand. But I put it in
simple terms then. So, they know what they should achieve and how I will assess
them.
Document 'daKty', 1 passage. Section 9, Paragraph 27, 16 characters.
Talking to kids,
Doing tests but not about SACSA Framework
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 346 characters.
I mean we don’t really do testing. We do testing but not the SACSA because it’s
something different. So, I do a lot of pieces of work, like I do lots of checklists,
speaking, grammar, punctuation, and spelling. So, I can use a checklist to see
whether they can do it or not by looking at their pieces of work. We do the testing
but not the SACSA.
Explicit teaching
Document 'BeFrtn', 1 passage. Section 7, Paragraph 18, 18 characters.
Explicit teaching.
Document 'daBrn', 1 passage. Section 8.1, Paragraph 26, 41 characters.
We do a lot of explicit teaching as well.
Independent learning
Document 'daJo', 1 passage. Section 8.1, Paragraph 23, 63 characters.

To them that might be hands on group work independent learning.
Integrating approach
Document 'BeChr', 1 passage. Section 8.1, Paragraph 54, 28 characters.
We do integrating approach.
Thinking methodology
Document 'daJo', 1 passage. Section 8.1, Paragraph 23, 63 characters.
To them that might be hands on group work independent learning.
Using websites
Document 'BeChr', 1 passage. Section 8.1, Paragraph 54, 123 characters.
Also there are some very good ideas in the websites. I used some words from the
websites. Some useful information is there.
Table 6.19 displays data about the teaching methods that the teachers used in achieving the Standards in the SACSA Framework. Generally, the responses can be divided into two groups: (a) how the teachers designed lessons, and (b) how they taught in class. In terms of designing lessons, the teachers highlighted the use of the companion document, current pedagogies, and websites. In terms of the teaching methods used in class, the teachers used different methods, such as independent learning, an integrating approach and explicit teaching. Interestingly, two teachers emphasized the importance of discussing with the kids the criteria for assessment and the expected outcomes. However, one teacher commented that the tests conducted were not associated with the SACSA Framework.

Table 6.19 Teaching methods used in achieving the standards in the SACSA
Framework
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Explicit teaching 0 1 0 0 0 0 1 0 0 0 0 2 18
Current pedagogy 0 0 0 0 0 0 0 1 0 0 0 1 9
Discussion with kids 1 0 0 0 0 0 0 0 0 0 1 2 18
Thinking methodology 0 0 0 0 0 0 1 0 0 0 0 1 9
Different methods 1 0 0 0 0 0 0 0 0 0 0 1 9
Websites 0 0 0 0 0 0 1 0 0 1 1 3 27
Independent learning 0 0 0 0 0 0 0 0 0 1 0 1 9
Integrating teaching 1 0 0 0 0 0 0 0 0 0 0 1 9
Companion doc. 1 0 0 0 0 0 0 0 0 0 0 1 9
Tests but not SACSA 0 0 0 0 0 0 0 0 1 0 0 1 9
Totals 4 1 0 0 0 1 3 1 1 3 3 17

Note: First row - labels of respondents; First column - names of used themes

The arguments that emphasized the importance of ‘discussion with kids’, an ‘integrating approach’ and ‘explicit teaching’ in achieving the Standards in SACSA were relevant to Vygotsky’s ideas (cited in Robertson, 2002, p. 61). Vygotsky contended that students needed to use language in order to integrate learning and make it explicit. Robertson (2002, p. 61) appeared to support this view, questioning how the process of learning could become integrated for each student if there was no opportunity to discuss, in a safe environment, how learning took place. Thus, discussing with the kids what they would learn and what they had to achieve was important. In relation to the importance of using an integrating approach, Barton and Smith (2000, p. 54) argued that instruction should focus on integrated, interdisciplinary activities that revolved around a set of important ideas.
The ideas of independent learning, modes of thinking and the use of websites were
supported by the statements of DECS (2000, online) and DETE (2000, online). They
claimed that the SACSA Framework was a cohesive curriculum that enabled learners
to develop values, skills, dispositions and understandings to: (a) respond to change
and plan for the future, (b) develop a positive sense of self and group, (c) work well
with a variety of others, (d) be independent critical thinkers, … . Furthermore, the
use of websites could be linked to part (a) about ‘respond to change and plan for the
future’. In addition, the argument for achieving the Standards by emphasizing modes of thinking was supported by Venville (2002, p. 49), who claimed that the kind
of thinking that occurred in the CASE@KSI lessons provided potential for enhanced
cognitive development, and that children were stimulated to develop concrete
operational thought patterns during the course of the activities.
The opinion that it was necessary to consider current pedagogy was also supported by Shayer and Adey (2002, p. 25), who believed that the effective delivery of the activities depended on the teachers having a good understanding of the underlying theory, much practice in generating cognitive conflict, and encouraging social construction as well as metacognition involving each child. Glasscott and Crews (1998, p. 232) also discussed the importance of understanding current pedagogy, stating that the teacher’s pedagogy, the ways in which the teacher intervened with children, depended on the teacher’s beliefs about children and the child-teacher relationship.
One respondent stated that he used the companion documents when making a lesson plan. This implied that the teacher did not depend only on the SACSA document but also used other relevant documents.

There was an argument that teachers used different teaching methods based on
students’ characteristics. With regard to the importance of using various methods of
teaching, Bloom et al. (1971, p. 17) argued that it was necessary to use a great
variety of instructional methods, and the choice of methods had to be dependent on
the objectives of the instruction. Furthermore, Bloom et al. (1971, p. 15) raised the issue of specifying outcomes, stating that it was important to have in mind the kinds of students, or students’ characteristics, that were likely to allow these outcomes to be attained in a reasonable period of time under the learning conditions planned.
One teacher stated that tests were done but did not relate to the SACSA Framework. The section ‘doing tests but not about the SACSA Framework’ gives more information about what this teacher meant. She stated, ‘I do lots of checklists, speaking, grammar, punctuation, and spelling’. This implies that she did a lot of work or evaluation associated with literacy. Implicitly, this teacher did test objectives covered in the Standards in the SACSA Framework, yet she did not use the SACSA terminology explicitly, since it was difficult to understand. This can be seen in the section on teachers’ personal opinions.

Benchmarks in LAN Test

The findings show two themes concerning the teaching methods used by the teachers in achieving the Benchmarks in the LAN Test: (a) the Benchmarks were only tests, and (b) no teaching relating to the Benchmarks. These findings can be seen in the following descriptions and in Table 6.20.

Benchmarks are only tests


Document 'daDsy', 1 passage. Section 8.1, Paragraph 23, 265 characters.
Assessing the Literacy and Numeracy Test, we just give them the test and that’s it.
We just get the results of the test, and then forget it. Sometimes we look at the test to
see what the kids got wrong. Basically, we don’t really do it, but we do it with
SACSA.
No teaching relating to Benchmarks
Document 'BeChr', 1 passage. Section 8.1, Paragraph 54, 62 characters.
I don’t teach the Benchmarks because I think they are very low.
Document 'daJo', 1 passage. Section 8.1, Paragraph 21, 183 characters.
I don’t teach the Benchmarks in Literacy and Numeracy Test. We don’t look at the
Benchmarks. I’ve even never seen the document about what the Benchmarks are.
So, I never teach those.

Document 'daKty', 1 passage. Section 9, Paragraph 30, 90 characters.
The LAN Test, I don’t teach it at all. I don’t employ teaching methods for the LAN
Test.
Table 6.20 presents information about teaching methods used in achieving the
Benchmarks in the LAN Test. The results show that the majority of the respondents
did not provide information about the teaching methods used in achieving the
Benchmarks. Three of the four respondents who provided comments regarding this
topic stated that they did not teach the Benchmarks. This was related to a teacher’s
opinion that the benchmarks were merely tests.

Table 6.20 Teaching methods used in achieving the Benchmarks


BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Benchmarks only as tests 0 0 0 0 0 0 0 0 1 0 0 1 9
Not teaching Benchmarks 1 0 0 0 0 0 0 0 0 1 1 3 27
Totals 1 0 0 0 0 0 0 0 1 1 1 4

Note: First row - labels of respondents; First column - names of used themes

Both arguments, ‘no teaching relating to the Benchmarks’ and ‘the Benchmarks were only tests’, could be taken to have the same meaning. They were similar to the argument in the section ‘no teaching relating to the Benchmarks’ regarding the implementation of the Benchmarks, and to the argument in the section ‘no work with the Benchmarks’ regarding the ways of changing or modifying the Benchmarks. Again, this implied that the Benchmarks were not involved in daily teaching at all, since they were merely tests.

How to assess

The findings show several ways of assessment used: (a) checklists, (b) grammar, (c) observation, (d) punctuation, (e) speaking, (f) spelling, (g) testing activities, (h) tests but not about SACSA, and (i) writing. The results are given in the following comments and in Table 6.21.

Checklist
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 25 characters.
I do lots of checklists,…
Grammar
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 21 characters.
I do lots of grammar.

Observation
Document 'BeSvn', 1 passage. Section 8.1, Paragraph 18, 18 characters.
I use observation.
Document 'daBrn', 1 passage. Section 8.1, Paragraph 27, 46 characters.
…to assess that is done through observation,..
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 25 characters.
…I do many observations.
Punctuation
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 26 characters.
…I do lots of punctuation.
Speaking
Document 'daBrn', 1 passage. Section 8.1, Paragraph 27, 34 characters.
…through discussion with children.
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 24 characters.
…I do lots of speaking.
Document 'daKty', passages. Section 9, Paragraph 27, 18 characters.
…talking to kids,
Spelling
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 22 characters.
I do lots of spelling.
Document 'daKty', 1 passage. Section 9, Paragraph 27, 71 characters.
We use the written (Warrington) test with the Australian spelling test.
Testing activities
Document 'BeSvn', 1 passage. Section 8.1, Paragraph 18, 41 characters.
I use observation and testing activities.
Document 'daBrn', 1 passage. Section 8.1, Paragraph 27, 102 characters.
So teaching method that we used but to assess that is done through observation, and
standardized tests
Tests but not about SACSA
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 100 characters.
I mean we don’t really do testing. We do testing but not the SACSA because it’s
something different.
Writing
Document 'daDsy', 1 passage. Section 8.1, Paragraph 21, 50 characters.
Basically, we do, in SACSA we do a lot of writing.
Table 6.21 also presents information about the teaching methods used in achieving the Standards in the SACSA Framework in terms of assessment. The results could be split into two groups: (a) language skills, and (b) assessment methods. Regarding the language skills, speaking was the criterion most often used in assessment (27%), followed by spelling (18%) and then by other skills such as writing, punctuation, and grammar. Regarding the assessment methods, observation was the most often used (27%), followed by the use of checklists. To sum up, the results show that the teachers administered testing activities; however, these were not associated with the SACSA Framework.

Table 6.21 The assessment used in the Standards in the SACSA Framework
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Speaking 0 0 0 0 0 0 1 0 1 0 1 3 27
Punctuation 0 0 0 0 0 0 0 0 1 0 0 1 9
Checklist 0 0 0 0 0 0 0 0 1 0 0 1 9
Testing activities 0 0 0 0 1 0 1 0 0 0 0 2 18
Writing 0 0 0 0 0 0 0 0 1 0 0 1 9
Observation 0 0 0 0 1 0 1 0 1 0 0 3 27
Tests but not SACSA 0 0 0 0 0 0 0 0 1 0 0 1 9
Grammar 0 0 0 0 0 0 0 0 1 0 0 1 9
Spelling 0 0 0 0 0 0 0 0 1 0 1 2 18
Totals 0 0 0 0 2 0 3 0 8 0 2 15

Note: First row - labels of respondents; First column - names of used themes

The importance of using observation when assessing students’ progress towards the Standards in the SACSA Framework has been broadly acknowledged. Closely related to the use of observation was the use of checklists; usually, checklists were used at the same time as observation. Glasscott and Crews (1998, p. 232) raised the importance of using observation in the classroom, arguing that the learner defined the interaction and that the professional educator had the ability to identify and follow the interaction. This implied that students shaped the learning process, while teachers observed how the learning process had gone.
There were some particular ways used by some of the respondents in achieving and assessing the Standards: speaking, spelling, punctuation and writing. Related to these, Tindal and Marston (1990, p. 181) stated that the assessment of spelling was necessary to ensure that students learnt to communicate in writing consistently and conventionally. Furthermore, Wallace (1978, cited in Tindal and Marston, 1990, p. 205) described written expression as a complex skill that required fluency in many areas, including speaking, reading, spelling, handwriting, capitalization, word usage, and grammar.
The argument of ‘tests but not about the SACSA Framework’ implied that the teachers did not use SACSA terminology when giving students testing activities. This seemed to relate to the previous discussion about ‘difficult terminology in SACSA’.

6.3.2 Question 8: How do you report to parents information about their children achieving the Standards and the Benchmarks? Do you draw attention to the context in which your school is set in discussion with parents?

Standards in the SACSA Framework

The findings show several ways of reporting to parents information about children’s achievement of the Standards: (a) ESL scope and scales, (b) following the DECS guideline, (c) general reports, (d) discussion or interviews, (e) no Standards terminology, (f) notes to home or written reports, and (g) school reports or folders. The results can be seen in the following statements and in Table 6.22.

ESL scope and scale


Document 'BeFrtn', 1 passage. Section 17.1, Paragraph 37, 27 characters.
I use ESL scope and scales.
Document 'BelFftn', 1 passage. Section 17.1, Paragraph 36, 28 characters.
I use ESL scope and scales.
Following DECS guideline
Document 'BeChr', 1 passage. Section 17.1, Paragraph 88, 113 characters.
The School Department says ‘this is what you have to do’ and I think that lots of
teachers are happy with that.
General report
Document 'BeChr', 1 passage. Section 17.1, Paragraph 85, 68 characters.
As I said the report that we have here is a little bit more general.
Interviews, discussion
Document 'BeChr', 1 passage. Section 17.1, Paragraph 86, 43 characters.
A lot of discussion and then some concerns…

Document 'BelFftn', 1 passage. Section 17.1, Paragraph 36, 108 characters.
….parent-teacher interviews and show work samples. Explain what their kids have
achieved and what they need to concentrate on.
Document 'BeSvn', 1 passage. Section 17, Paragraph 35, 27 characters.
…parents-teacher interviews
Document 'daBrn', 1 passage. Section 17.1, Paragraph 45, 219 characters.
We talk about what they have achieved in Standards. Again, when we talk to
parents, we wouldn’t be talking about the context of the school in that situation,
because we’re focusing on the child, what they have achieved.
Document 'daJo', 1 passage. Section 17.1, Paragraph 41, 118 characters.
They have interviews. And if any other issues, I think we need to discuss with
parents using the folders or on phones.
No standards terminology
Document 'daDsy', 1 passage. Section 17.1, Paragraph 45, 69 characters.
We don’t talk about the standards; we talk about how they are going.
Notes to home, written reports
Document 'BelFftn', 1 passage. Section 17.1, Paragraph 36, 37 characters.
And I usually provide written report
Document 'BeSvn', 1 passage. Section 17, Paragraph 35, 12 characters.
…notes home.
Document 'BeVrc', 1 passage. Section 16.1, Paragraph 49, 61 characters.
We usually give the report to kids, and they bring them home.
Document 'daKty', 1 passage. Section 18.1, Paragraph 59, 87 characters.
We designed a report that is based on the Standards and that would go home to
parents.
School reports, folder
Document 'BeSvn', 1 passage. Section 17, Paragraph 35, 23 characters.
We have a school report,..
Document 'daJo', 1 passage. Section 17.1, Paragraph 41, 81 characters.
We have school folders, we make reports, and we have the first and second terms.
Document 'daKty', 1 passage. Section 18.1, Paragraph 59, 200 characters.
I will show a report using what sort of portfolio, where children’s work goes to in
the report folio and catch the work in task card. On the task card, it is written the
Standards that should be at.
Additional comments
Document 'BeChr', 2 passages. Section 17.1, Paragraph 83, 157 characters.
My previous school we had a quite good reporting system in identifying a key and
outcomes that children have to achieve. So we didn’t try to cover them all.
Section 17.1, Paragraph 89, 71 characters.

Most teachers want some guidance and some direction and so in some way
Document 'daCrln', 1 passage. Section 19.1, Paragraph 65, 618 characters.
The problem I have is the language of SACSA will be challenging about this new
reporting and trying to be able to convey that in way that parents can understand.
Even the teachers are struggling with the language of SACSA. The language is very
high. And I found it very funny, this is me, that we have fixed SACSA documents.
And now we have, and then we had to make companion documents. So, if we
actually are students, and then we have said SACSA a lot. So, we can actually
manipulate Music to any say note. So, there is something to be about the design or
framework or thing that they need to reflect on.
Table 6.22 displays data about how teachers report to parents information about their children’s achievement of the Standards. The results show that the ways teachers report to parents can be split into two categories: ‘verbal’ and ‘written’. The number of respondents who used verbal reports was nearly equal to the number who used written reports. In terms of verbal reports, interviews and discussions were often used. For written reports, the teachers used different forms, such as notes to home, school folders, the ESL scope, the DECS guidelines, and general reports. Again, no SACSA terminology was used in reporting information to parents.
As discussed previously, one respondent stated that in making reports she followed the DECS guideline. This accorded with what ACACA (2000), DECS (2000) and DETE (2000) conveyed: that the Standards provided a common reference point for educators to use in monitoring, judging and reporting on learner achievement over time. In addition, DECS (2000, online) and DETE (2000, online) further highlighted that the developmental learning outcomes and curriculum standards detailed in the SACSA Framework provided a common basis for describing learners’ achievements.

Table 6.22 The ways of reporting to parents about children’s achievement of the Standards in the SACSA Framework
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
ESL scope 0 1 0 1 0 0 0 0 0 0 0 2 18
Following DECS guidelines 1 0 0 0 0 0 0 0 0 0 0 1 9
Interviews, discussion 1 0 0 1 1 0 1 0 0 1 0 5 45
School reports, folders 0 0 0 0 1 0 0 0 0 1 1 3 27
No standards terminology 0 0 0 0 0 0 0 0 1 0 0 1 9
General report 1 0 0 0 0 0 0 0 0 0 0 1 9
Notes to home, written rep. 0 0 0 1 1 1 0 0 0 0 1 4 36
Totals 4 1 0 3 3 1 1 1 1 2 2 19

Note: First row - labels of respondents; First column - names of used themes

Benchmarks in LAN Test

The findings show several ways of reporting to parents in terms of the Benchmarks: (a) a copy of the test results was sent home, (b) interviews or discussion, and (c) no reports about the Benchmarks. The results can be seen in the following comments and in Table 6.23.

Copy of test results is sent home


Document 'daDsy', 1 passage. Section 17.1, Paragraph 45, 174 characters.
In Term 3, the Benchmarks’ results go home to the parents after they did the tests in
Years 3, 5, and 7. So, the results are going home to parents. They have a copy of
them.
Interview, discussion
Document 'daCrln', 1 passage. Section 19.1, Paragraph 67, 268 characters.
With parents we provide opportunities for them to come to discuss, the LAN results.
Because they all produce then students’ results we talk to them about. So, we invite
the parents and they like to come and talk to us about the results.
Document 'daJo', 1 passage. Section 17.1, Paragraph 41, 99 characters.
If there are other issues, I think we need to discuss with parents using the folders or
on phones.
Document 'daKty', 1 passage. Section 18.1, Paragraph 61, 149 characters.
The Benchmarks, I would talk to parents personally about the tests; and I would
let the parents know, ‘don’t panic, because this is only a test’.
No reports about LAN Test
Document 'BeFv', 1 passage. Section 19.1, Paragraph 36, 158 characters.
I don’t do this. The LAN Test is not understood by parents. They often just want a
number of where their child stands. This is not good, it doesn’t tell much.
Document 'BeSvn', 1 passage. Section 17, Paragraph 35, 38 characters.
And the Benchmarks are not applicable.
Document 'daBrn', 1 passage. Section 17.1, Paragraph 45, 227 characters.
We don’t talk about Benchmarks so much, all right. The only time we talk about
Benchmarks is used in Literacy and Numeracy Test. In terms of other times when
we report that to parents, we don’t talk about the Benchmarks at all.
Additional comments
Document 'daCrln', 1 passage. Section 19.1, Paragraph 60, 373 characters.
But I will be saying that oh that’s only Department of Education gets the
advantages. We wouldn’t do the LAN Test at all. Your children won’t achieve it as
well as SACSA. And there is, for some students, their social economic situation
actually depend on how they perform or something like that. We know that, the
research shows that. That’s why we call it fashioned funds.
Document 'daKty', 1 passage. Section 18.1, Paragraph 61, 39 characters.
It is not the way our children learn.

Table 6.23 provides data about how teachers report to parents information about their children’s achievement of the Benchmarks in the LAN Test. The results show that three teachers (27%) did not report the Benchmark results to parents. Interestingly, this number equals the number who stated that they reported the Benchmarks to parents through discussion or interviews. Additionally, one teacher pointed out that a copy of the Benchmark results in the LAN Test was sent home to parents.

Table 6.23 The ways of reporting to parents about children’s achievement of the Benchmarks in the LAN Test
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Copy of test results sent home 0 0 0 0 0 0 0 0 1 0 0 1 9
Interviews, discussion 0 0 0 0 0 0 0 1 0 1 1 3 27
No reports about LAN Test 0 0 1 0 1 0 1 0 0 0 0 3 27
Totals 0 0 1 0 1 0 1 2 1 1 2 9

Note: First row - labels of respondents; First column - names of used themes

Three respondents argued that no report was made about the Benchmarks or the LAN Test, because the Benchmarks were not understood by the parents and the LAN Test was only a test; it was not taught in the classroom. This relates to the previous section stating that ‘the Benchmarks did not influence the curriculum program’, and implies that these teachers did not consider the Benchmarks important.
Another argument was ‘reporting to parents about the results of the LAN Test by interviews or speaking’. This argument contrasts with the previous argument that no report was made about the Benchmarks. Reporting by interviews or speaking has many advantages; for example, in informal interviews parents can ask teachers directly if anything is unclear. One teacher argued that she reported the Benchmarks to parents by sending a copy of the test results home. Thus, this teacher preferred giving reports in written form.

Attention to context

There was attention to the context


Document 'daDsy', 1 passage. Section 17.1, Paragraph 49, 84 characters.
Yes, definitely when we discuss with the parents. We usually do it at the beginning
of the year. We find it hard and if they are doing, ok. We let them know.
There was no attention given to the context
Document 'daBrn', 1 passage. Section 17.1, Paragraph 45, 135 characters.
We wouldn’t be talking about the context of the school in that situation, because
we’re focusing on the child, what they have achieved.

Document 'daCrln', 1 passage. Section 19.1, Paragraph 60, 28 characters.
No, we don’t. I wouldn’t say that.
Document 'daJo', 1 passage. Section 17.1.1.1, Paragraph 45, 13 characters.
No, we don’t.
Table 6.24 presents information about whether or not the teachers drew attention to the context when they reported to parents. The results show that most of those who commented did not draw attention to the context, while one teacher argued that she did draw attention to the context when she reported to parents.

Table 6.24 Whether the teachers drew attention to the school context
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Attention to context 0 0 0 0 0 0 0 0 1 0 0 1 9
No attention to context 0 0 0 0 0 0 1 1 0 1 0 3 27
Totals 0 0 0 0 0 0 1 1 1 1 0 4

Note: First row - labels of respondents; First column - names of used themes

More respondents argued that they did not draw attention to the school context when they reported to parents. One reason was that teachers were more focused on what students had achieved. This might imply that the teachers deemed that talking about the school context was not very important in reporting to parents. On the other hand, one teacher argued that she drew attention to the school context when reporting to parents, in order to let the parents know about it.

6.3.3 Question 9: What way do you provide feedback to the children about their achieving the Standards and the Benchmarks?

Standards in the SACSA Framework

The teachers used various ways to provide feedback regarding the Standards in the
SACSA Framework to children. The findings showed 11 ways: (a) different ways,
(b) discussion, interviews, (c) ESL scope, (d) how kids work in the class, (e) kids’
activities in the class, (f) no feedback about assessment, (g) no SACSA terminology,
(h) paper feedback, (i) rubrics, (j) talk about expected outcomes, and (k) feedback given at the same time as reporting to parents.

Different ways
Document 'daJo', 1 passage. Section 19.1, Paragraph 49, 22 characters.
In very different ways

Discussion, interviews
Document 'BeSvn', 1 passage. Section 22, Paragraph 41, 21 characters.
I provide discussion.
Document 'daJo', 1 passage. Section 19.1, Paragraph 49, 115 characters.
...we verbally; in reaching our report; in interview, invite the children to come to
interviews with their parents…
ESL scope
Document 'BeFrtn', 1 passage. Section 19.1, Paragraph 41, 45 characters.
I use ESL Scopes which are aligned with SACSA
How kids work in the class
Document 'BeChr', 1 passage. Section 19.1, Paragraph 101, 53 characters.
…whether they are doing okay or are not getting there
Document 'daKty', 1 passage. Section 20.1, Paragraph 67, 423 characters.
We talk about what they have done, what they have achieved, and how, what they
can do to improve it. That would be in conjunction with the Standards, but not with
the Benchmarks at all, unless they ask me. If the parents show me the scores for the
Benchmarks and will come to ask me, I will explain to them. I will explain what the
Benchmarks are. But I will say ‘Don’t worry about it. It is a test and it’s just one
test.
Kids activities in the class
Document 'BeChr', 1 passage. Section 19.1, Paragraph 101, 178 characters.
And as means of depending on what we are doing and the way we work. I would let
them know whether they are doing a really good job and achieving what they are
trying to do in the class,
Document 'daBrn', 1 passage. Section 19.1, Paragraph 49, 147 characters.
Like through the whole year, teachers are talking about how a child might have done
in particular areas of the curriculum, what they have achieved.
Document 'daKty', 1 passage. Section 20.1, Paragraph 67, 60 characters.
We talk about what they have done, what they have achieved…
No feedback about assessment
Document 'daCrln', 1 passage. Section 21.1, Paragraph 75, 192 characters.
We don’t do a lot in talking with kids about assessment. I think we do it in formal
way in the classroom. There hasn’t been really a strong focus here about
formalizing that.
No SACSA terminology
Document 'BeChr', 1 passage. Section 19.1, Paragraph 101, 42 characters.
But I wouldn’t use any SACSA terminology.
Document 'BelFftn', 1 passage. Section 19.1, Paragraph 40, 45 characters.
I don’t refer to them at all in formal terms.

Paper feedback
Document 'daKty', 1 passage. Section 20.1, Paragraph 67, 180 characters.
The feedback to students is daily, they get marks, they also get scores, they get
paper feedback, they also get handout that we give comments on that. I do a lot of
paper feedback.
Rubrics
Document 'daCrln', 1 passage. Section 21.1, Paragraph 75, 148 characters.
And on occasions… in that social emotional domain using rubrics. And I know that
some teachers use rubrics for challenging children with assessment.
Document 'daDsy', 1 passage. Section 19.1, Paragraph 53, 302 characters.
Sometimes we use rubrics, have you read rubrics? Like statements Greek. The
things that you want to achieve, for instance, when I write, I want to have capital
letters; I want to do full-stops and so on. So, when you assess it, you just tick the
box to see whether they have done it or haven’t done it.
Talk about expected outcomes
Document 'BelFftn', 1 passage. Section 19.1, Paragraph 40, 36 characters.
I talk more about expected outcomes.
Document 'daBrn', 1 passage. Section 19.1, Paragraph 49, 33 characters.
…what they’re supposed to achieve.
At the same time when reporting to parents
Document 'daBrn', 1 passage. Section 19.1, Paragraph 49, 267 characters.
Okay, that is through reporting process with parents, because in most cases, children
are a part of the reporting process with parents. They are there so they can hear what
has been said and they can contribute to that and it is done also individually with
children.
Table 6.25 displays data about how feedback was provided regarding children’s achievement of the Standards in the SACSA Framework. The results show that three participants (27%) provided feedback by linking the Standards in the SACSA Framework to students’ activities in the classroom. Furthermore, one teacher argued that different ways were used to provide feedback relating to the Standards in SACSA. This was supported by other teachers who highlighted rubrics and the ESL scope as tools for providing feedback. Additionally, discussion or interviews and talking about expected outcomes were other ways to provide feedback. However, it should be emphasized that no SACSA terminology was used, and no feedback was given relating to assessment.
There was a statement that teachers provided feedback to students in different ways; examples of these can be seen in the following discussion. In general, the feedback provided can be classified into two main themes: the techniques of providing the feedback and the content of the feedback.

The comment that teachers provided feedback to students through ‘discussions or interviews’ was similar to one of the ways teachers provided reports to parents. Yet, in this case the interviews involved the teacher, the parent and the student; the teacher provided feedback to the student and at the same time reported to the parent. This is similar to the next argument, that feedback was given to students at the same time as reports were made to parents. Furthermore, there was a statement that no SACSA terminology was involved when providing feedback to students. This comment is similar to the one in the section ‘no Standards terminology’ in reports made to parents. Thus, there are some similarities between the ways of reporting to parents and providing feedback to students.

Table 6.25 The ways to provide feedback to children regarding the Standards in
SACSA
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Paper feedback 0 0 0 0 0 0 0 0 0 0 1 1 9
Rubrics 0 0 0 0 0 0 0 1 1 0 0 2 18
No feedback assessment 0 0 0 0 0 0 0 1 0 0 0 1 9
How kids work 1 0 0 0 0 0 0 0 0 0 1 2 18
Same time report to parents 0 0 0 0 0 0 1 0 0 0 0 1 9
Kids’ activities 1 0 0 0 0 0 1 0 0 0 1 3 27
Talk expec. outcomes 0 0 0 1 0 0 1 0 0 0 0 2 18
No SACSA terminology 1 0 0 1 0 0 0 0 0 0 0 2 18
Different ways 0 0 0 0 0 0 0 0 0 1 0 1 9
ESL scope 0 1 0 0 0 0 0 0 0 0 0 1 9
Discussion, interviews 0 0 0 0 1 0 0 0 0 1 0 2 18
Totals 3 1 0 2 1 0 3 2 1 2 3 18

Note: First row - labels of respondents; First column - names of used themes

Other ways of providing feedback, according to the respondents, were by using ‘paper feedback’ and ‘rubrics’. Both of these involved feedback in written form and provided information about students’ scores or assessment results. In terms of marking or grading students, Ebel and Frisbie (1986, p. 243) argued that grades or marks were used as self-evaluative measures and also to report students’ educational status to parents.
There was also the comment of ‘no feedback about assessment’. Bloom et al. (1971, p. 70) raised the issue of the impact of giving feedback on students’ assessment results, arguing that feedback of scores or grades to students had little effect in changing a student’s behaviour.

In terms of the content of the feedback, there were statements that the ESL scope, how students work in the class, students’ activities in the class, and expected outcomes were the main topics in providing feedback to students.

Benchmarks in LAN Test

The teachers used various ways to provide feedback regarding the Benchmarks in the LAN Test to children. The findings showed four ways: (a) different ways, (b) no Benchmarks terminology used, (c) no feedback concerning the LAN Test unless asked, and (d) only when students found a problem.

Different ways
Document 'daJo', 1 passage. Section 19.1, Paragraph 49, 22 characters.
….in very different ways.
No benchmarks terminology used
Document 'BeChr', 1 passage. Section 19.1, Paragraph 101, 186 characters.
In terms of knowing whether they are achieving the Benchmarks and the outcomes,
I don’t specifically talk about new terminology in these books, I modify that and put
it in common words.
Document 'BelFftn', 1 passage. Section 19.1, Paragraph 40, 46 characters.
I don’t refer to them at all in formal terms.
No feedback concerning LAN Test unless asked
Document 'BeVrc', 1 passage. Section 18.1, Paragraph 53, 64 characters.
I don’t really think they provide feedback about the Benchmarks
Document 'daCrln', 1 passage. Section 21.1, Paragraph 73, 97 characters.
It is not a common practice for me to sit down with the children and tell them LAN
Test results.
Document 'daKty', 1 passage. Section 20.1, Paragraph 67, 322 characters.
That would be in conjunction with the Standards, but not with the Benchmarks at
all, unless they ask me. If the parents show me the scores for the Benchmarks and
will come to ask me, I will explain to them. I will explain what the Benchmarks are.
But I will say ‘Don’t worry about it. It is a test and it’s just one test.
Only when kids find a problem
Document 'daCrln', 1 passage. Section 21.1, Paragraph 73, 55 characters.
I will talk to them if they are stuck to the LAN Test.
Additional comments
Document 'BeFv', 1 passage. Section 21.1, Paragraph 41, 465 characters.
I don’t value the LAN Test because it puts children into a box. For example, the
dummies group, this does a lot of damage to children’s self-esteem which is very
difficult to overcome. It is better to communicate what children can do well and
what they need more work in to improve. Numbers from data collected in the LAN

Test are often misunderstood, generalized and do not convey much about the
individual’s ability and strengths or weaknesses, or how to help them.
Document 'BeVrc', 1 passage. Section 18.1, Paragraph 53, 155 characters.
Lots of teachers have their own criteria that come from the Standards. And they
assess based on the criteria. Then they report the results to the parents.
Table 6.26 shows data about how teachers provide feedback to children regarding their achievement of the Benchmarks in the LAN Test. The results show that three teachers pointed out that they did not provide feedback unless it was asked for. This was consistent with a teacher’s opinion that feedback was only given if the student found a problem. However, one teacher commented that different ways were used to provide feedback to children.

Table 6.26 The ways to provide feedback to children regarding the Benchmarks
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
No Bench. terminology 1 0 0 1 0 0 0 0 0 0 0 2 18
Only when kids find a problem 0 0 0 0 0 0 0 1 0 0 0 1 9
No feedback LAN unless asked 0 0 0 0 0 1 0 1 0 0 1 3 27
Different ways 0 0 0 0 0 0 0 0 0 1 0 1 9
Totals 1 0 1 1 0 2 0 2 0 1 1 9

Note: First row - labels of respondents; First column - names of used themes

Similar to the ways of providing feedback about the Standards in SACSA, when the teachers provided feedback regarding the Benchmarks in the LAN Test they argued that no Benchmarks terminology was used and that no feedback associated with the LAN Test was provided unless asked for. A related argument was that feedback concerning the Benchmarks would only be provided if the student found a problem in the LAN Test. On the other hand, there was a comment that feedback associated with the Benchmarks was provided in different ways; however, the respondent did not give examples of these.

6.3.4 Question 10: How do you report to the Year 4/ Year 6/ Year 8
teachers and secondary school teachers about the students’
performance on the Standards and Benchmarks at Year 3/ Year 5/
Year 7, and do they use the information you provide?

How to provide information

The teachers used various ways of providing information about the Standards in SACSA and the Benchmarks in the LAN Test to the Year 4/ Year 6/ Year 8 teachers and secondary school teachers. The findings showed eight ways: (a) database, (b) folders, (c) forms from the School of Education, (d) interviews or discussions, (e) literacy statement, (f) no policy to pass on the information, (g) reports using comments, and (h) yellow book.

Data base
Document 'BeVrc', 1 passage. Section 20.1, Paragraph 57, 200 characters.
Yes, we have information about that on the database. And teachers get copies of
that. We highlight the students who have difficulties, or those who need extra help or those at the lower level.
Folders
Document 'daBrn', 1 passage. Section 21.1, Paragraph 54, 123 characters.
There are records, we pass on children’s folders containing Literacy and Numeracy
Test. And teachers hopefully will read that.
Document 'daCrln', 1 passage. Section 23.1, Paragraph 81, 743 characters.
What I want to say is that all the reports are maintained in the main files. So they
report, and we’ve got the student in the main files they have all their reports and the
LAN Testing and all of that; so, that moves constantly up with them. We do think
around the LAN the data. We do, some analysis of that, so we will be looking at the
data. You know the teachers in Years 3, 5, and 7. And for the students who are any
place……. Those kids are most at risk. Or those I’m concerned about, that they
might not have in any place to go to, but I’m really worried about them. I will
always like to give an educational plan. And what we do is this. I would go back to
the folder, see their record about their achievement in the Standards in SACSA.
Document 'daDsy', 1 passage. Section 21.1, Paragraph 59, 325 characters.
Okay, we’ve got their reports in their folders. So, we put all the copies of the
Benchmarks’ tests in the folders. And that goes to the next teacher. So, the teacher
can get information about the previous things the children have done. So, all the
major information is in the folders. Go to the folders and everything is in there.
Document 'daJo', 1 passage. Section 21.1, Paragraph 54, 243 characters.
Each kid has got a folder which is used by the teachers throughout the years and
that has the information and the report about Literacy and Numeracy test results,
assessment and anything else that we think is relevant for the next teacher.
Document 'daKty', 1 passage. Section 22.1, Paragraph 73, 82 characters.
And also in the blue folder are all test that we’ve done, including the Benchmarks.
Forms from the School of Education
Document 'BeChr', 1 passage. Section 21.1, Paragraph 106, 106 characters.
There is a chart we have to fill out, we have to concern about Key areas such as
literacy and numeracy.
Interviews, discussion
Document 'BeFv', 1 passage. Section 24.1, Paragraph 46, 22 characters.
I report by interviews
Document 'daBrn', 2 passages, Section 21.1, Paragraph 54, 93 characters.

With some individual children, there might be discussion between Year 3 and Year
4 teachers.
Section 21.1, Paragraph 57, 192 characters.
…again with the transition from Year 7 to Year 8, there is some discussion held with
the high school about children who are coming up to them. And that they will only
know particular children.
Literacy statement
Document 'daJo', 1 passage. Section 21.1, Paragraph 56, 64 characters.
Last year we developed a literacy statement for those students.
No policy to pass on information
Document 'BeChr', 1 passage. Section 21.1, Paragraph 106, 118 characters.
A high school doesn’t have a policy whether we have to pass lots of information to
the next teachers about students.
Reports using comments
Document 'BeFv', 1 passage. Section 24.1, Paragraph 46, 55 characters.
…reports using comments, not numbers or grades assigned.
Yellow book
Document 'daKty', 1 passage. Section 22.1, Paragraph 73, 385 characters.
…teachers can look at mm, we also have a yellow book, and a score of a piece of
paper double sided for each child; and look at things that tests don’t show: the
family situation; whether they are in a single parent family; whether they’re from a
family who doesn’t speak English; whether they are Aboriginal; whether they’ve got
hearing problems. Sight problems and all that kind of stuff.
Table 6.27 presents information about the ways the teachers provided information about the Standards in the SACSA Framework and the Benchmarks in the LAN Test to the Year 4/ Year 6/ Year 8 teachers and secondary school teachers. These ways can be grouped into two categories, ‘formal’ and ‘informal’. The responses show that folders were the most popular way of storing data about the kids and of providing information for the next teachers or school. Other formal ways of providing information included the yellow book, forms from the School of Education, the literacy statement and the database, although one teacher noted that there was no policy to pass on the information. The informal ways used to provide information for the next teachers were interviews or discussions, and reports using comments.

Table 6.27 How to provide information for the next teachers or next school
teachers
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Folders 0 0 0 0 0 0 1 1 1 1 1 5 45
Yellow book 0 0 0 0 0 0 0 0 0 0 1 1 9
No policy to pass on info. 1 0 0 0 0 0 0 0 0 0 0 1 9
Forms from School of Education 1 0 0 0 0 0 0 0 0 0 0 1 9
Literacy statement 0 0 0 0 0 0 0 0 0 1 0 1 9
Database 0 0 0 0 0 1 0 0 0 0 0 1 9
Using comments 0 0 1 0 0 0 0 0 0 0 0 1 9
Interviews, discussions 0 0 1 0 0 0 1 0 0 0 0 2 18
Totals 2 0 2 0 0 1 2 1 1 2 2 13

Note: First row - labels of respondents; First column - names of used themes

Using folders to provide information seemed to be the most popular way, compared to the use of a database, forms from the School of Education, the literacy statement, and the yellow book. This might imply that the teachers preferred providing the information formally. Other statements referred to providing information by interviews or discussions and by reports with comments, which shows that the teachers also provided information informally.

Usefulness

In terms of the usefulness of the information provided, the interviewees identified five main categories: (a) no particular report provided, (b) not sure, (c) not useful, (d) useful for certain students, and (e) very useful.

No particular report provided


Document 'BeFrtn', 1 passage. Section 22.1, Paragraph 46, 19 characters.
No report provided.
Document 'BeSvn', 1 passage. Section 26, Paragraph 46, 3 characters.
N/A
Document 'daBrn', 1 passage. Section 21.1, Paragraph 54, 249 characters.
There is mm. I think what you are asking here is in terms of how the teachers in
Years 3, 5, and 7 report to the next teachers about language testing. Probably,
nothing. They don’t hold particular conversation with the Year 4, 6, and 8 teachers,
Not sure
Document 'daBrn', 1 passage. Section 21.1, Paragraph 57, 186 characters.
Again the records are passed on to the higher school. Whether they read them or not,
is another issue, we don’t know. But the records are passed on to them about the
particular children.
Document 'daCrln', 1 passage. Section 23.1, Paragraph 81, 29 characters.
I’m not sure if they use it.
Not useful
Document 'BeChr', 2 passages. Section 21.1, Paragraph 108, 128 characters.
For a while schools provided lots of information but after a while we found that they never used it. And it became just wasting time.

Section 21.1, Paragraph 114, 49 characters.
So, a lot of teachers don’t use the information.
Useful for certain students
Document 'BeChr', 1 passage. Section 21.1, Paragraph 114, 232 characters.
For some particular students, it can be useful. Those students with key learning
difficulties with them, it is important to have their back ground early about what the
difficulties are. But for lots of students, it is not necessary.
Document 'daJo', 1 passage. Section 21.1, Paragraph 56, 159 characters.
And I think it’s useful. But there are other (the majority) students that don’t have
learning difficulties, they don’t need such information provided for them.
Very useful
Document 'BeVrc', 1 passage. Section 20.1, Paragraph 57, 108 characters.
We highlighted the students who have difficulties, or those who need extra help or
those at the lower level.
Table 6.28 presents data about the usefulness of the information provided to the next teachers or the next school. The responses show that the teachers’ answers varied. However, a number of the teachers stated that no particular reports were provided for the next teachers or next school. This implied that providing information for the next teachers or next school was not considered very important, except for certain students. This assumption was strongly supported by a teacher who argued that the information was not useful.
The comments that no particular report was provided, ‘not sure’, and ‘not useful’ conveyed a similar meaning: to sum up, providing information to the Year 4/ Year 6/ Year 8 teachers and secondary school teachers was not considered very useful. In contrast, there were arguments that the information was useful for certain students, or even very useful, which shows the advantage of possessing the information.

Table 6.28 The usefulness of the information provided for the next teachers or
next school
BeChr BeFrtn BeFv BelFftn BeSvn BeVrc daBrn daCrln daDsy daJo daKty Totals Percent
Very useful 0 0 0 0 0 1 0 0 0 0 0 1 9
Useful for certain kids 1 0 0 0 0 0 0 0 0 1 0 2 18
Not useful 1 0 0 0 0 0 0 0 0 0 0 1 9
Unsure 0 0 0 0 0 0 1 1 0 0 0 2 18
No particular reports 0 1 0 0 1 0 1 0 0 0 0 3 27
Totals 2 1 0 0 1 1 2 1 0 1 0 9

Note: First row - labels of respondents; First column - names of used themes
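The Percent column in Table 6.28 appears to be calculated over the eleven interviewees listed in the header row (for example, 1/11 rounds to 9 per cent and 3/11 to 27 per cent). The minimal Python sketch below reproduces the column from the row totals; it is offered only as a check on this reading, and the variable names are illustrative rather than taken from the study.

# Reproducing the Percent column of Table 6.28.
# The counts are the row totals from the table; 11 is the number of
# interviewees listed in the header row.
theme_totals = {
    "Very useful": 1,
    "Useful for certain kids": 2,
    "Not useful": 1,
    "Unsure": 2,
    "No particular reports": 3,
}
NUMBER_OF_INTERVIEWEES = 11
for theme, count in theme_totals.items():
    percent = round(count / NUMBER_OF_INTERVIEWEES * 100)
    print(f"{theme}: {count} ({percent}%)")  # e.g. "No particular reports: 3 (27%)"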

6.4 Do the teachers from different schools have different teaching
methods in implementing the standards in SACSA and the Benchmarks
in the Literacy and Numeracy Testing Program?

6.4.1 Teaching methods

Table 6.29 presents information on how School A was different from School B
associated with assessment methods used in achieving the Standards or the
Benchmarks. The results show that both School A and School B used observation
and testing activities. Moreover, School B seemed to use a larger variety of
assessment methods in achieving the Standards or the Benchmarks.

Table 6.29 Comparison between the two schools associated with assessment
methods used in achieving the Standards or the Benchmarks
Themes School A School B
Administering tests but not in the SACSA Framework 0 1
Writing 0 1
Checklists 0 1
Speaking, discussion 0 2
Grammar 0 1
Punctuation 0 1
Spelling 0 1
Observation 1 2
Testing activities 1 1

Table 6.30 presents information about how School A was different from School B
regarding teaching methods used in achieving the Standards. The results show that
these schools had only one similarity: both schools used ‘explicit teaching’.

Table 6.30 Comparison between the two schools associated with teaching
methods used in achieving the Standards
Themes School A School B
Explicit teaching 1 1
Used for planning 1 0
Integrating approach 1 0
Use websites 1 0
Discussing with children 1 0
Doing tests but not SACSA 0 1
Different teaching methods 0 2
Independent learning 0 1
Thinking methodology 0 1
Considering current pedagogy 0 1

Table 6.31 presents information about how School A was different from School B
regarding teaching methods used in achieving the Benchmarks in the LAN Test. The
results show that both schools agreed that no teaching occurred concerning the
Benchmarks. School B considered the Benchmarks in the LAN Test as only tests.
Although School A did not state this explicitly, it was implied.

Table 6.31 Comparison between the two schools associated with teaching
methods used in achieving the Benchmarks
Themes School A School B
Not teaching benchmarks 1 1
Benchmarks only as tests 0 1

6.4.2 The ways teachers reported to parents

Table 6.32 presents information about how School A was different from School B
relating to the ways used in reporting to parents about the Standards in the SACSA
Framework. The results show that School A used a variety of ways in reporting to
parents. For both schools, interviews or discussions were the most popular way of
reporting to parents.

Table 6.32 Comparison between the two schools associated with the ways
teachers reported to parents in terms of the Standards in SACSA
Themes School A School B
General reports 1 0
Following DECS' guideline 1 0
ESL scope 2 0
Interviews or discussions 3 2
Notes to home 3 0
School reports 1 1
No standards terminology 0 1

Table 6.33 presents information about how School A was different from School B
relating to the ways used in reporting to parents about the Benchmarks in the LAN
Test. The results show that School A did not provide reports about the LAN Test to
the parents. On the other hand, School B provided information in different ways.
One comment stated that in reporting to parents, teachers used interviews or
discussions and a copy of the test results was sent home. Similar to School A, there
was one comment that no reports were sent home providing information about the
LAN Test results.

Table 6.33 Comparison between the two schools associated with how the
teachers reported to parents in terms of the Benchmarks in the LAN Test
Themes School A School B
Copy of test results was sent home 0 1
No reports about LAN Test 2 1
Interviews or discussions 0 2

Table 6.34 presents information about how School A was different from School B
relating to how the teachers reported to parents in terms of the school context. The
results show that School A did not provide any information relating to this topic. On
the other hand, School B provided two different answers. The majority of the
teachers interviewed stated that attention was not drawn to the school context when
reporting to parents. However, one teacher commented that there was attention
drawn to the context when reporting to the parents.

Table 6.34 Comparison between the two schools associated with how the
teachers reported to parents in terms of the school context
Themes School A School B
No attention to the school context 0 3
Yes, there was attention to the school context 0 1

6.4.3 The ways teachers provided feedback to students

Table 6.35 presents information about how School A was different from School B
with respect to how the teachers provided feedback to students in terms of Standards
in the SACSA Framework. The results show that both schools had some similarities
in providing feedback to students: (a) students’ activities, (b) discussion, and (c)
talking about expected outcomes.

Table 6.35 Comparison between the two schools associated with how the
teachers provided feedback to students in terms of Standards in the SACSA
Framework
Themes School A School B
ESL scope 1 0
No SACSA terminology 2 0
Students' activities 1 1
How kids worked 1 0
Rubrics 0 2
Discussion 1 1
Different ways 0 1
Talking about expected outcomes 1 1
The same time as reporting to parents 0 1
No feedback 0 1
Paper feedback 0 0

Table 6.36 presents information about how School A was different from School B
relating to the ways teachers provided feedback to students in terms of the
Benchmarks in the LAN Test. The results show that neither school provided
feedback relating to the Benchmarks. However, School B provided feedback
regarding the Benchmarks when it was asked for.

Table 6.36 Comparison between the schools associated with the ways teachers
provided feedback to students in terms of the Benchmarks in the LAN Test
Themes School A School B
No benchmarks terminology 2 0
No feedback relating to the Benchmarks 1 1
Only when it was asked 0 1
Different ways 0 1

6.4.4 The ways of providing information to the teachers in Year 4/6 and
to the secondary school teachers

Table 6.37 presents information about how School A was different from School B
with respect to how the teachers provided the information to the teachers in Year 4/6
and to the secondary school teachers. The results show that these schools were
similar in terms of providing information by interviews or discussions. Interestingly,
School B seemed to prefer using folders.

Table 6.37 How the information was provided to the next teachers
Themes School A School B
Database 1 0
Forms from the School of Education 1 0
No policy to pass on information 1 0
Folders 0 4
Interviews or discussions 1 1
Reports using comments 1 0
Literacy quote 0 1
Yellow book 0 0

Table 6.38 presents information about how School A was different from School B
with respect to the usefulness of the information provided to the next teachers. The
results show that both schools agreed that no particular reports were provided. In
addition, they also argued that the information was only useful for certain students.

Table 6.38 Comparison between the two schools associated with the usefulness of
the information provided for the next teachers
Themes School A School B
Not useful 1 0
Useful for certain students 1 1
Very useful 1 0
No particular report provided 2 1
Not sure 0 2

Chapter 7
Conclusion and Implications

7.1 Introduction

This concluding chapter outlines the main findings of this study and also highlights
the implications for theory and practice. Finally, it points to the need for further
investigation into the implementation of change to the curriculum and student
assessment.

7.2 Conclusion

The findings address the issues that have been the main concern of the research
questions, namely: (a) how do the primary school teachers in Adelaide react to the
specification of standards in SACSA and the Benchmarks in the Literacy and
Numeracy Testing Program, (b) what teaching methods do the teachers use in
applying the standards in the SACSA Framework and the Benchmarks in the
Literacy and Numeracy Testing Program, (c) do the teachers from different schools
have different teaching methods in implementing the standards in SACSA and the
Benchmarks in the Literacy and Numeracy Testing Program?

7.2.1 The primary school teachers’ reaction to the specification of Standards in
SACSA and the Benchmarks in the Literacy and Numeracy Testing Program

Definitions of the Standards and the Benchmarks

All the teachers involved in this study understood the definition of the Standards in
the SACSA Framework and the Benchmarks in the LAN Test. They defined the
Standards as outcome expectations, guidelines for programming lessons, South
Australian based, outcome assessment, talking about South Australian students, level
of instruction, and reports to parents. In terms of the Benchmarks, the teachers
defined them as the minimum requirement, students’ bands, a National Test, and
students’ need for support. All these definitions are in accordance with the published
statements issued by the Department of Education and Children’s Services or the
Department of Education, Training and Employment (DECS, 2000 or DETE, 2000).

How the Standards in SACSA and the Benchmarks in the Literacy and
Numeracy Test were used in elementary schools by the principal and the
teachers. Whether the use of the Standards and the Benchmarks was influenced
by the context in which the schools were placed.

Associated with the Standards implementation in elementary schools, five main
points could be identified: (a) most teachers used the Standards to program lessons,
(b) the teachers created their own reporting system in order to report their students’
attainment of the Standards, (c) the use of a new reporting system ‘A, B, C, D and F’
issued by DECS, (d) when a school made a report, it was not tied to the Standards
and the language in the SACSA Framework, and (e) the Standards were formally
taught in the school. To sum up, these five main points can be grouped into two
categories. First, in implementing the Standards, some teachers used the Standards
for programming work and the reporting system issued by DECS. Secondly, other
teachers did not use the Standards and the language of the SACSA Framework;
instead, they created their own reporting system.
In terms of the implementation of the Benchmarks in the LAN Test, six main points
could be identified: (a) the LAN Test was implemented to identify the students who
needed support or an extension program, (b) the Benchmarks influenced the
curriculum, (c) the Benchmarks were difficult for a struggling student to attain, (d)
the implementation of the Benchmarks had to be questioned, (e) there was no
teaching in relation to the Benchmarks, and (f) there was teaching according to the
LAN Test. To sum up, some teachers supported the implementation of the
Benchmarks in the LAN Test, but some other teachers did not.
The implementation of the Standards in the SACSA Framework and the Benchmarks
in the LAN Test, in terms of the school context gave rise to two opposing views: (a) the
school context influenced the implementation of the Standards and the Benchmarks,
and (b) the school context did not influence the implementation of the Standards and
the Benchmarks. For one particular school, school context was regarded as an
important aspect to consider in designing the curriculum program. However, for the
other school, the school context was not important in planning the curriculum.

Teachers’ personal opinion about the Standards in SACSA and Benchmarks in
the Literacy and Numeracy Test

In terms of the Standards in the SACSA Framework, the teachers’ opinions could be
grouped into two categories namely positive and negative opinions. The positive
opinions involved: (a) that the Standards in the SACSA Framework were useful, and
(b) that the Standards had good outcomes. On the other hand, the negative opinions
regarding the Standards involved: (a) that the Standards were hard to teach, (b) that
Standards were very general, (c) that Standards used very difficult terminology, (d)
that only some Standards were useful, (e) that some of the Standards were too hard
and others were too easy, and (f) that the Standards should be arranged in one-year
bands not two-year bands.
Likewise, the teachers’ opinion relating to the Benchmarks could be classified into
two categories of positive and negative opinion. The teachers’ positive opinions
involved: (a) that the Benchmarks were useful, (b) that the Benchmarks could give
teachers ideas, and (c) that the Benchmarks were useful for gathering data about
student performance. On the other hand, the teachers’ negative opinion towards the
Benchmarks involved: (a) the Benchmarks were not useful, (b) teachers disagreed
with the Benchmarks, (c) the Benchmarks were too low, and (d) the Benchmarks
gave teachers very limited ideas.

Whether the Standards and the Benchmarks that were associated with outcome
statements were appropriate for Years 3, 5, 7 students.

In terms of the appropriateness of the Standards in the SACSA Framework, in Year 3,
the majority of the group that provided comments associated with the question
considered that the Standards were appropriate. However, there were some arguments
advanced that the Standards in Year 3 were inappropriate. The remainder of the
comments stated that the teachers were unsure, or preferred not to give any comment.
Similar to Year 3, in Year 5 the teachers provided very different opinions. One
comment indicated that the Standards in Year 5 were appropriate, and another one
stated the opposite. However, all the teachers who commented on the Standards in
Year 7 considered that the Standards were appropriate.
Similar to the appropriateness of the Standards in Year 3, the statements made
concerning the Benchmarks in Year 3 varied. One comment stated that the
Benchmarks in Year 3 were inappropriate. Another comment indicated that the
teachers were unsure whether the Benchmarks in Year 3 were appropriate or not.
However, there was also a comment made that the Benchmarks in Year 3 were
appropriate. Furthermore, in Year 5, there were also two different views. One view
stated that the Benchmarks in Year 5 were appropriate, and the other stated that the
Benchmarks in Year 5 were inappropriate because they were very low.
Again, in Year 7 a similar situation occurred. There were two contrasting opinions.
One opinion argued that the Benchmarks in Year 7 were appropriate, while the other
stated that they were inappropriate.

Whether there were any conflicting ideas and practices in trying to achieve both
the Standards and the Benchmarks.

Again, there were two distinctly different ideas associated with the Standards,
namely the claim that there were no conflicting ideas and practices, and the
argument that there were conflicting ideas and practices in trying to achieve the
Standards. Some teachers pointed out several examples of the conflicting ideas
and practices: (a) that various pedagogies were used in implementing the Standards,
(b) that the Standards needed to match the Benchmarks in the LAN Test, (c) that the
Standards were hard to teach, (d) that the Standards did not create a unit of work, and
(e) that the Standards were hard to assess.
Similarly, there were two distinct views associated with the Benchmarks, namely the
view that there were no conflicting ideas and practices, because the Benchmarks were
too easy or too low; and the view that there were conflicting ideas and practices in
trying to achieve the Benchmarks. The examples of the conflicting ideas and practices
associated with the Benchmarks were that: (a) the teachers disagreed with the
Benchmarks, (b) the Benchmarks could lead to a psychological burden for children,
(c) the Benchmarks were confusing, and (d) the Benchmarks and the Standards
needed to match.

Whether there were any ways in which the teachers thought that the standards
and the Benchmarks should be changed or modified.

The majority of the persons interviewed agreed with the idea that the Standards
needed to be changed or modified. There were several reasons provided for
undertaking the modification or changes namely: (a) the Standards gave rise to a very
busy curriculum, (b) the Standards were hard to score, (c) the Standards had to be
questioned, (d) two-year bands had to be changed to be one-year bands, (e) the
Standards had to be flexible, and (f) teachers had to select the key Standards. On the
other hand, there was a view expressed that the Standards did not need to be changed
or modified because they were very useful.
Likewise, there were two distinct opinions advanced in terms of changing or
modifying the Benchmarks. The majority of the teachers suggested that the
Benchmarks had to be changed or modified. The changes or modifications proposed
were: (a) the Benchmark terminology, (b) it was necessary to do more in the
classroom setting for support, (c) the Benchmarks did not represent differences in
learning styles and, thus had to be changed, (d) the Benchmarks had to be applicable
to ESL, and (e) some teachers rejected the Benchmarks.

7.2.2 The teaching methods used in applying the Standards in the SACSA
Framework and the Benchmarks in the Literacy and Numeracy Testing Program

The teaching methods used for students to achieve the Standards in SACSA and
the Benchmarks in the Literacy and Numeracy Test

Generally, the teaching methods used can be divided into two groups: (a) how they
designed lessons, and (b) how the teachers taught in the class. In terms of how the
teachers designed lessons, the teachers used a companion document, current
pedagogies, and the available websites. Related to the teaching methods
used, the teachers applied different methods, such as independent learning,
integrated teaching and explicit teaching. There was also an argument that highlighted
the importance of discussing with students the criteria for assessment and the
expectations that they had to fulfil. However, there was a belief that the classroom
tests conducted were not associated with the SACSA Framework. Associated with
the teaching methods used in achieving the Benchmarks in the LAN Test, no teaching
methods were used, since the teachers did not teach the Benchmarks. Thus, the
Benchmarks were merely associated with the LAN Tests.
Associated with the methods used in achieving the Standards in the SACSA
Framework in terms of assessment, there were methods relating to the language
skills, and those regarding the assessment methods. The language skills used in
assessment involved speaking, spelling, writing, punctuation, and grammar. The
assessment methods involved observation, and the use of checklists. In general, the
teachers conducted testing activities to assess the students. However, these were not
related to the SACSA Standards.

The ways the teachers reported to parents, information about their children
achieving the Standards and the Benchmarks. Whether they drew attention to
the context in which their school was set in discussion with parents.

The ways teachers reported to parents could be split into two categories: verbal and
written. In terms of verbal reports, interviews and discussion were often used.
However, the teachers used different ways of reporting when they made written
reports, such as notes to home, school folders, ESL scope, following DECS
guidelines, and general reports. Again, there was no SACSA terminology used in
reporting information to parents.
In terms of the Benchmarks in the LAN Test, the ways to report the Benchmarks to
parents were by: (a) discussion in interviews and (b) the results of the Benchmarks in
the LAN Test being sent home. However, there were some teachers who did not
report the LAN results to parents. Associated with the school context, some teachers
drew attention to the school context when they reported to parents. On the other
hand, some others did not draw attention to the school context.

How the teachers provided feedback to the children about their achieving the
Standards and the Benchmarks

The teachers provided feedback to students associated with the Standards in SACSA
concerning students’ activities in the classroom. Different ways were also used for
providing feedback. The examples given were rubrics and the ESL scope as tools for
providing feedback. Additionally, discussion in interviews and talk about expected
outcomes were other ways of providing feedback. However, it is necessary to
emphasize here that no SACSA terminology was used, and no feedback was given
relating to assessment.
Moreover, in terms of the Benchmarks in the LAN Test, some teachers did not
provide feedback unless it was specifically asked for, and others provided feedback only
if the student had a problem. However, there was also a comment that stated that
different ways were used to provide feedback to children. This indicates that
providing feedback was definitely carried out.

How the teachers reported to the Year 4, Year 6, Year 8 teachers and secondary
school teachers about the students’ performance on the Standards and
Benchmarks at Year 3, Year 5, and Year 7 respectively, and whether they used
the information provided.

The teachers provided the information about the Standards in the SACSA Framework
and the Benchmarks in the LAN Test to the Year 4, Year 6, and Year 8 teachers and
the secondary school teachers both formally and informally. Providing the
information formally was done by using folders, Yellow books, forms from the
School of Education, literacy quote, and the database. Providing the information
informally was done by interviews or discussions, and reports using comments.
Additionally, there was a statement made that there was no policy to pass on the
information.
Moreover, in terms of the usefulness of the information, the teachers’ answers varied.
There were three main points that can be highlighted relating to this topic: (a) the
information was useful, (b) there were no particular reports provided for the next
teachers or the next school, and (c) the information was not useful except for certain
students. Additionally, there was great uncertainty as to whether the information
was useful or not.

7.2.3 Whether the teachers from different schools have different teaching methods
in implementing the standards in SACSA and the Benchmarks in the Literacy and
Numeracy Testing Program

In terms of assessment methods, both School A and School B used observation and
testing activities. Moreover, School B seemed to use a larger variety of assessment
methods in achieving the Standards or the Benchmarks. Associated with the teaching
methods used in achieving the Standards, both Schools A and B used ‘explicit
teaching’. Furthermore, related to the teaching methods used to achieve the
Benchmarks, both schools agreed with the idea that there was no teaching with
respect to the Benchmarks. School B considered the Benchmarks in the LAN Test as
only tests. Although School A did not state this explicitly, it was implied.

With respect to reporting, School A used various ways of reporting to parents. The
two schools used interviews or discussions as the most popular way of reporting to
parents regarding the Standards. In terms of the Benchmarks, School A did not

136
provide reports about the LAN Test to the parents. On the other hand, some teachers
in School B provided reports associating with the LAN Test while some others did
not. In reporting to parents, the teachers in school B used interviews or discussions
and the copy of the test results was sent home. Moreover, with respect to the school
context, School A did not provide any answers relating to this issue. On the other
hand, School B provided two different answers. The majority of the teachers
interviewed stated that no attention was drawn to the school context when reporting
to parents. However, one teacher argued that there was attention drawn to the context
when reporting to the parents.
With respect to how the teachers provided feedback to students about the Standards
in the SACSA Framework, School A and School B had three similar answers: (a)
students’ activities, (b) discussion, and (c) talking about expected outcomes.
Furthermore, both schools did not provide feedback relating to the Benchmarks.
However, School B provided feedback regarding the Benchmarks when it was asked
for.
With respect to how the teachers provided the information to the teachers in Year 4,
Year 6, and Year 8 and to the secondary school teachers, both School A and School B
were similar in terms of providing information by interviews or discussions. In
addition, School B seemed to prefer using folders, while School A did not use them.
Moreover, in terms of the usefulness of the information provided, both schools
agreed that no specific reports were provided. In addition, they also agreed that the
information was only useful for certain students.

7.3 Implications for theory

There are clear implications in this study for theory in relation to curriculum
implementation. It is important for educators and curriculum makers to be more
aware of the need to consider the concepts of educational psychology and pedagogy,
such as Vygotsky’s, Piaget’s, and Bloom’s ideas. Those concepts were involved in
creating curriculum standards, appropriate objectives and outcomes. Thus,
pedagogical and psychological concepts have had a significant influence on
developing a scale of learning and performance and thus on curriculum design and
implementation.
Moreover, in curriculum implementation, it is also important for teachers to be more
aware of the need to place emphasis on the content and processes involved. This implies
that the contents of the teaching materials should be considered carefully in terms of
the appropriateness of content as well as the processes of teaching and learning.
Thus, the processes of learning are as important as the content of the teaching
materials.

7.4 Implications for practice and policy

For curriculum makers, it is necessary to evaluate the implementation of a new
curriculum in order to examine the effectiveness of the curriculum implementation.
Identifying how teachers react to the implemented curriculum is very important.
Listening to the teachers’ personal opinions and asking them about how they used the
curriculum in their school can provide significant information about how they react
to changes in the curriculum. It is very unlikely that the teachers can implement the
curriculum effectively if they are not happy with the change.
For teachers, it is important to be flexible and selective towards the
curriculum outcomes developed by the curriculum makers. If the teachers consider
that the curriculum outcomes are inappropriate for their students, they will try to
modify them.

7.5 Implications for further research

This study involved only two elementary schools. In further research, the number
of schools involved should be extended. In addition, the investigation could also
involve secondary schools, so that comparisons could be made between the
elementary and secondary school teachers in implementing the curriculum.
Moreover, this research could serve as a pilot study for other countries that are
implementing a new curriculum; for example, Indonesia is implementing a new
curriculum called the ‘Competency-Based Curriculum’.

7.6 Concluding comment

The teachers from School A and School B understood the terms ‘the Standards in
SACSA and the Benchmarks in the LAN Test’ very well. However, in terms of
reporting systems, there was no uniformity. Some teachers preferred using their own
reporting system to the reporting system issued by DECS. Furthermore, there were
very different reactions among the teachers to the Benchmarks. Whereas some
teachers supported the Benchmarks, some others refused to consider them.

Associated with the school context, there were two different views among the
teachers. Some teachers believed that the school context was important to consider in
implementing both the Standards and the Benchmarks, while others did not.
The teachers found the Standards in the SACSA Framework useful; however,
there were some important points suggested for modification, such as the difficult
terminology and the two-year bands, which should be changed to one-year bands.
Related to the Benchmarks, some teachers agreed with the Benchmarks, while others
apparently disagreed with the idea of the Benchmarks.
In terms of the appropriateness of the Standards in SACSA, there were different
views towards the Standards in Years 3 and 5. Some teachers considered that the
Standards were appropriate, while some others stated that they were inappropriate.
However, in Year 7, all the teachers interviewed had the same opinion that the
Standards were appropriate.
The majority of the teachers believed that there were conflicting ideas and practices
in trying to achieve both the Standards and the Benchmarks, while some others
claimed that there were no conflicting ideas and practices in trying to achieve both
the Standards and the Benchmarks. However, a majority of the teachers suggested
that both the Standards and the Benchmarks needed to be changed or modified.
There were no teaching methods used in trying to achieve the Benchmarks since they
were merely tests. Thus, the teachers did not teach using the Benchmarks.
Furthermore, when the teachers conducted testing activities, the tests were not about
the SACSA Standards.
In reporting to parents about students’ attainment in the Standards in the SACSA
Framework, the teachers used informal and formal ways or verbal and written ways.
In reporting to parents, the teachers provided feedback about students’ achievement
in the Standards. However, the majority of the teachers did not provide any feedback
concerning the Benchmarks unless it was asked for.
Finally, teachers provided information to teachers in Years 4, 6, and 8 and in
secondary schools about students’ performance on the Standards and the Benchmarks
either formally or informally. However, with respect to the usefulness of this
information, they were very uncertain except in special cases. This investigation,
while only a pilot study, indicates that there may be a need for in-service education
programs on the use of the Standards in the SACSA Framework and the Benchmarks
in the LAN Test if teachers are to use effectively the new approaches to assessment
advanced in the changes to the school curriculum that have been introduced during
the past decade.

REFERENCES

Australian Council of State School Organization Inc. (2006). Literacy and numeracy.
Answers to parents’ questions about benchmarks. Curtin ACT: Australian
Parents Council Inc. (Online) Available:
http://www.austparents.edu.au/PDF%Files/lit_num/literacynumeracy.pdf.
Printed date: 26/06/06.
ACACA (Assessment and Certification Authorities). (2000). Curriculum Trends
across Australia in 2000 - State by State. Response to Survey on Priority
Areas for 2000, New South Wales. NSW: the Board of Studies NSW for the
Australian Curriculum. (Online) Available:
http://www.boardofstudies.nsw.edu.au/acaca2/acaca_nsw_priorities.html
ACACA (Assessment and Certification Authorities). (2000). Curriculum Trends
across Australia in 2000 - State by State. Response to Survey on Priority
Areas for 2000, Queensland. NSW: the Board of Studies NSW for the
Australian Curriculum. (Online) Available:
http://www.boardofstudies.nsw.edu.au/acaca2/acaca_qld_priorities.html
ACACA (Assessment and Certification Authorities). (2000). Curriculum Trends
across Australia in 2000 - State by State. Response to Survey on Priority
Areas for 2000, South Australia. NSW: the Board of Studies NSW for the
Australian Curriculum. (Online) Available:
http://www.boardofstudies.nsw.edu.au/acaca2/acaca_sa_priorities.html
Adey, P and Shayer, M. (2002). Cognitive acceleration comes of age. In Shayer, M
and Adey, P. (Eds). Learning intelligence. Cognitive Acceleration across the
curriculum from 5 to 15 Years. (pp. 1-17). Buckingham: Open University
Press.
Adey, P. (2002). Cognitive Acceleration with 5-year-olds. In Shayer, M. and Adey, P.
(Eds). Learning intelligence. Cognitive Acceleration across the curriculum
from 5 to 15 Years. (pp. 18-34). Buckingham: Open University Press.
Barton, K.C. and Smith L.A. (2000). Themes or motifs? Aiming for coherence
through interdisciplinary outlines. The Reading teacher, Academic Research
Library, 54 (1), 54-63.
Bidell, T.R. and Fischer, K.W. (1992). Cognitive development in educational
contexts. Implications of skill theory. In Shayer, M. and Efklides, A. (Eds).
Neo-Piagetian theories of cognitive development. Implication and
applications for education. London: Routledge.
Black, P.; Harlen, W. and Orgee, T. (1984). Standards of performance-expectations
and reality. A study of the problem of interpreting the APU science surveys.
APU Occasional Paper No. 3. London: The Department of Education and
Science.
Bloom, B. S.; Hastings, J.T. and Madaus, G.F. (1971). Handbook on formative and
summative evaluation of student learning. New York: McGraw-Hill.
Board of Studies. (1996). LAP reporting guide 1996. Victoria: the Board of Studies.

Brady, L. (1992). Curriculum Development. Fourth Edition. New York: Prentice
Hall.
Campbell, R.J. (1985). Developing the primary school curriculum. England: Holt,
Rinehart and Winston Ltd.
Committee of Enquiry into Education in South Australia. (1982). Final Report.
Education and change in South Australia.
Cronbach, L. J. (1964). Learning research and curriculum development. In Ripple,
R.E. and Rockcastle, V.N. (Eds). Piaget rediscovered. A report of the
conference on cognitive studies and curriculum development. California:
School of Education. Cornell University.
Dane, F. C. (1990). Research methods. California: Brooks/Cole Publishing
Company.
Darlington, Y. and Scott, D. (2002). Qualitative research in practice. Stories from the
field. New South Wales: Allen & Unwin.
Demetriou, A; Shayer, M; and Efklides, A. (Eds) (1992). Neo-Piagetian theories of
cognitive development. Implications and Applications for education. London:
Routledge.
Department of Education and Children’s Services (DECS). (1994). Statements and
profiles into practice. Improving student learning outcomes. Adelaide:
Gillingham Printers Pty Ltd.
Department of Education and Children’s Services (DECS). (1995). An Assessment,
recording and reporting resource. Adelaide: Gillingham Printers Pty Ltd.
Department of Education and Children’s Services (DECS), NSW. (1997). Year 5
linking Basic Skill Tests to the curriculum. NSW: The New South Wales
Department of Education.
Department of Education and Children’s Services (DECS). (2000). Literacy
benchmarks Years 3, 5 and 7. Writing, spelling and reading. Sydney:
Curriculum Corporation.
Department of Education, Training and Employment (DETE). (2000). South
Australian curriculum, standards and accountability framework.
Implementation plan 2001 to 2002. South Australia: Hyde Park Press.
(Online) Available: http://www.sacsa.sa.edu.au/index_fsrc.asp?=EL
Department of Education and Training (Government of Western Australia). (2006).
Western Australian literacy and numeracy assessment. (Online) Available
http://www.eddept.wa.edu.au/walna/. Printed date 26/06/06.
Department of Education and Children’s Services (DECS). (2000). South Australian
curriculum, standards and accountability framework. An Overview. (Online)
Available: http://www.sacsa.sa.edu.au/index_fsrc.asp?=EL
Department of Education of Queensland. (1995). Aspects of Science: Overall
Results. Assessment of Performance Program 1994. Quality Assurance and
School Review. Queensland: Directorate, Department of Education.
Ebel, R.L and Frisbie, D.A. (1986). Essentials of educational measurement. Fourth
Edition. New York: Prentice-Hall.

Education Department of South Australia. (1977). The do it yourself curriculum
guide. Secondary Science Curriculum Committee. For Junior Secondary
Science. Adelaide: D.J. Woolman, Government Printer.
Education Department of SA. (a 1993). Monitoring student achievement. Attainment
levels and national profiles: the background. Resource 1 Paper. Adelaide:
Darlington Materials Development Centre.
Education Department of SA. (b 1993). Monitoring student achievement. Attainment
levels and national profiles: Some questions and answers. Resource 2 Paper.
Adelaide: Darlington Materials Development Centre.
Education Department of SA. (c 1993). Monitoring student achievement. Attainment
levels and national profiles: Training and development.. Resource 4 Paper.
Adelaide: Darlington Materials Development Centre.
Education Department of SA. (d 1993). Monitoring student achievement. Attainment
levels and national profiles: Establishing curriculum standards for South
Australian schools. Resource 10 Paper. Adelaide: Darlington Materials
Development centre.
Ellis, R. (2003). Task-based language learning and teaching. New York: Oxford
University Press
Finch, C.R. and Crunkilton, J.R. (1989). Curriculum development in vocational and
technical education. (3rd Edition). New York: Allyn and Bacon.
Fraser, B.J. (1978). CDC Professional Series. Review of research on Australian
science education project. Canberra: The Curriculum Development Centre.
Fontana, A. and Frey, J.H. (2005). The interview. From neutral stance to political
involvement. In Denzin N.K. and Lincoln, Y.S. (Eds). The SAGE handbook
of qualitative research. Third Edition, (pp. 695-728). California: SAGE .
Gibbons, J.A. (2004). On reflection. Adelaide: Shannon Research Press.
Gilgun, J.F. (2005). Qualitative research and family psychology. American
Psychology Association, 19(1), 40-50.
Glascott, K.P and Crews, N.N. (1998). A teaching philosophy: Rhetoric or reality?.
Childhood Education, 74 (4), 232-233.
Harman, G. (1999). Politics of education. In Keeves, J.P. and Marjoribanks, K. (Eds).
Australian education: review research 1965-1998. (pp. 32-57). Camberwell,
Victoria: Australian Council for Educational Research.
Harvey-Beavis, A.; Macaskill, G and Wu, M. (2000). Calibration of the South
Australian Outcome Statements. Summary Report. June 2000. Adelaide:
ACER (Australian Council for Educational Research).
Hodgan, J. (2002). Primary teachers and cognitive acceleration in mathematics
education: transforming teachers’ mathematical knowledge through
reflection. In Shayer, M. and Adey, P. (Eds). Learning intelligence. Cognitive
Acceleration across the curriculum from 5 to 15 Years. (pp. 118-133).
Buckingham: Open University Press.
Hornibrook, M and Wallace, M. (2001). Report on the independent evaluation of the
development of the South Australian curriculum standards and accountability
framework. Executive summary. (Online) Available:
http://www.sacsa.sa.edu.au/index_fsrc.asp?=EL
Keeves, J.P. (1999). Research into curriculum change. In Keeves, J.P. and
Marjoribanks, K. (Eds). Australian Education: Review research 1965-1998.
Kelle, U. (2004). Computer-assisted analysis of qualitative data. In Flick, U;
Kardorff, E.v.; and Steinke, I. (Eds). A companion to qualitative research, pp.
276-283. London: SAGE.
Kemmis, S. and McTaggart, K. (1993). Critical curriculum research. In Smith, D.L.
(Ed). Australian curriculum reform: Action and reaction, (pp. 125-140).
Australian Curriculum Studies Association.
Kim, H. (1975). Evaluation of the mastery learning project in Korea. Lewy, A.
(1975). Studies in educational evaluation. Israel: School of Education, Tel-
Aviv University.
Larkin, S. (2002). Creating metacognitive experiences for 5- and 6-year-old children.
In Shayer, M. and Adey, P. (Eds). Learning intelligence. Cognitive
Acceleration across the curriculum from 5 to 15 Years. (pp. 65-79).
Buckingham: Open University Press.
Lindvall, C.M. and Nitko, A.J. (1975). Measuring pupil achievement and aptitude.
Second Edition. New York: Harcourt Brace and Jovanovich, Inc.
Lovat, T.J. and Smith, D.L. (1993). Curriculum. Action on reflection revisited. Third
Edition. NSW: Social Science Press.
Maggs, R.F. (2001). Best practice: in pursuit of methodological rigour. Blackwell
Science ltd, 35(3), 373-383.
Miller, W.L. & Crabtree, B.F. (1992). Primary care research: A multimethod typology
& qualitative road map. In Miller, W.L. & Crabtree, B.F. (Eds). Doing
qualitative research, (pp. 3-28), London: Sage Publication.
Maier, H.W. (1978). Three theories of child development. Third Edition. New York:
Harper & Row.
Marsh, C.J. & Willis, G. (2003). Curriculum. Alternative approaches, ongoing issues.
Third Edition. New Jersey: Merrill Prentice Hall.
Marsh, C.J. (1992). Key concepts for understanding curriculum. London: The Falmer
Press.
McNeil, J.D. (1985). Curriculum. A comprehensive introduction. Third edition.
Boston: Little, Brown and Company.
Morgan, J.W. (1978). Some perspectives on school based curriculum development.
London: the University of New England.
Morris, R.W. and Howson. G. (1972). Curriculum development. In Howson G. (Ed).
Developing a new curriculum, (pp.1-34). London: Heinemann.
NSW Department of School Education. (1997). Year 3 linking Basic Skill Tests to
the curriculum. NSW: Assessment and Reporting and Curriculum
Directorates.

NSW Department of School Education. (1997). Year 5 linking Basic Skill Tests to
the curriculum. NSW: Assessment and Reporting and Curriculum
Directorates.
Ozkan, B. (2004). Using NVivo to analyze qualitative classroom data on
constructivist learning environments. The qualitative Report, 9(4), 589-603.
Payne, D.A. (1973). The assessment of learning. Cognitive and affective. New York:
D.C. Heath and Company.
Piaget, J. (1964). Development and learning. In Ripple, R.E. and Rockcastle, V.N.
(Eds). Piaget rediscovered. A report of the conference on cognitive studies
and curriculum development. California: School of Education. Cornell
University.
Peräkylä, A. (2005). Analyzing talk and text. In Denzin N.K. and Lincoln, Y.S. (Eds).
The SAGE handbook of qualitative research. Third Edition, (pp. 869-886).
California: SAGE.
Print, M. (1987). Curriculum Development and Design. Sydney: Allen & Unwin.
QSR International Pty. Ltd. (2002). QSR NVivo (Version 2.0.163): QSR
International Pty. Ltd
Rennie, L.J.; Fraser, B.J.; and Treagust, D. F. (1999) In Keeves, J.P. and
Marjoribanks, K. (Eds). Australian Education: Review of research 1965-
1998. Camberwell, Victoria: Australian Council for Educational Research.
Richards, L. (1999). Using NVivo in qualitative research. London: SAGE.

Ripple, R.E. and Rockcastle, V.N. (1964). Piaget rediscovered. A report of the
conference on cognitive studies and curriculum development. California: School of
Education. Cornell University.

Robertson, A. (2002). Pupils’ understanding of what helps them learn. In Shayer, M


and Adey, P. (Eds). Learning intelligence. Cognitive Acceleration across the
curriculum from 5 to 15 Years, (pp. 51-64). Buckingham: Open University
Press.
Shayer, M. and Adey, P. (1981). Towards a science of science teaching. Cognitive
development and curriculum demand. London: Heinemann Educational
Books.
Sturman, A. (1989). Decentralization of curriculum decision making. Effects of the
devolution of curriculum decision making in Australia. Hawthorn, Victoria:
Australian Council for Educational Research.
Terwilliger, J.S. (1971). Assigning grades to students. New York: Scott, Foresman
and Company.
Tindal, G.A. and Marson, D.B. (1990). Classroom-based assessment. Evaluating
instructional outcomes. New York: Macmillan Publishing Company.
Tomlinson, C.A. (2001). Standards and the art of teaching: Crafting high-quality
classrooms. National Association of Secondary School Principals. NASSP
Bulletin, Academic Research Library, 85 (622), (pp. 38-47).

Whitton, D; Sinclair, C; Barker, K; Nanlohy, P; and Nosworthy, M. (2004). Learning
for teaching. Teaching for learning. Victoria: Thomson Social Science Press.
Venville, G.J. (2002). Enhancing the Quality of Thinking in Year 1 Classes. In
Shayer, M and Adey, P. (Eds). Learning intelligence. Cognitive Acceleration
across the curriculum from 5 to 15 Years, (pp. 35-50). England: Open
University Press.
Zais, R.S. (1981). Conceptions of curriculum and the curriculum field. In Giroux,
H.A.; Penna, A.N.; and Pinar, W.F. (Eds). Curriculum and instruction,
(pp. 32-46). New York: McCutchan Publishing Corporation.

APPENDIX 1

APPENDIX 2

FLINDERS UNIVERSITY ADELAIDE • AUSTRALIA


Social and Behavioural Research Ethics Committee

CONSENT FORM FOR PARTICIPATION IN RESEARCH


(by interview)

I …............................................................................................................................
being over the age of 18 years hereby consent to participate as requested in the
Letter for the research project on the exploration of the implementation of
Standards in SACSA and the Benchmarks in the Basic Skill Tests in primary schools
in South Australia.
1. I have read the information provided.
2. Details of procedures and any risks have been explained to my satisfaction.
3. I agree to my information and participation being recorded on tape.
4. I am aware that I should retain a copy of the Information Sheet and Consent

Form for future reference.

5. I understand that:
• I may not directly benefit from taking part in this research.
• While the information gained in this study will be published as explained,
I will not be identified, and individual information will remain confidential.

6. I am aware that I am free to withdraw at any time and decline to answer any
particular question.

Participant’s signature……………………………………Date…………………

I certify that I have explained the study to the volunteer and consider that she/he
understands what is involved and freely consents to participation.

Researcher’s name Rosmawati

Researcher’s signature…………… …………..Date 21/03/2006

APPENDIX 3

LETTER OF INTRODUCTION

Dear Sir/Madam/Name

This letter is to introduce Rosmawati who is a Postgraduate student in the School of
Education at Flinders University. She will produce her student card, which carries a
photograph, as proof of identity.
Rosmawati is undertaking research leading to her Postgraduate thesis under my
supervision. Her research addresses the subject of the implementation of the
Standards in SACSA and the Benchmarks in the Literacy and Numeracy Tests (LAN)
in primary schools in South Australia. Her thesis aims to explore how the primary
school teachers in South Australia implement the Standards in SACSA and the
Benchmarks in the LAN Tests. She is hoping to gain information from those teachers
who implement the Standards in SACSA and the Benchmarks in the LAN Tests in their
schools. The purpose of this letter is to ask whether you are interested in
participating in this study.
Rosmawati would be most grateful if you would volunteer to spare the time to assist
in this project by granting an interview which touches upon certain aspects of this
topic. No more than half an hour on one occasion would be required in the
interview.
Be assured that any information provided will be treated in the strictest confidence
and none of the participants will be individually identifiable in the resulting thesis,
report or other publications.
Since she intends to make a tape recording of the interview, she will seek your
consent, on the attached form, to record the interview and to use the recording in
preparing the thesis, report or other publications, on condition that your name or
identity is not revealed. You are free to decline to take part in this study if you do not
wish to do so, and you are free to withdraw from the interview at any stage.
Additionally, you may decide the time when you will be ready for the interview at
school.
Any enquiries you may have concerning this project should be directed to me at the
address given above or by telephone on (+618) 82012392 or e-mail
john.keeves@flinders.edu.au.
This research project has been approved by the Flinders University Social and
Behavioural Research Ethics Committee. The secretary of the Committee can be
contacted by telephone on 8201 5962, fax 8201-2035, or by email
sandy.huxtable@flinders.edu.au.
Thank you for your attention and assistance.

Yours sincerely,

John P. Keeves
Professorial Fellow
School of Education

APPENDIX 4

Procedures of Data Analysis Using NVivo Version 2

Preparation

As soon as the interviews were completed, the researcher transcribed the recordings.
First, the interview protocols were created as Word documents, but they were saved
in ‘rtf’ format. The documents had to be saved in ‘rtf’ to enable NVivo version 2 to
import them.
Creating documents

Figure 1. Creating documents step 1


After the figure above was shown, the researcher chose the first button ‘locate and
import readable external text file’, and then clicked ‘Next’ to continue. As a result,
the figure below appeared.

Figure 2. Creating documents step 2
When this figure was shown, the researcher selected one file and clicked ‘Open’.
Automatically, Figure 3 would appear, allowing the researcher to select ‘Finish’.
The researcher followed the same procedure for the next files. The results are
displayed in Figure 4.

Figure 3. Creating
documents step 3

Figure 4. Documents were already created.
Figure 4 displays the 11 documents with their details, including when the documents
were created and modified.
Creating nodes
To create a node, the researcher needed to read the interview protocols carefully.
Afterwards, she identified the important themes for each question. When these
themes were ready, node creation was started.

Figure 5. Creating a node step 1

The researcher clicked ‘Create a Node’ and Figure 6 appeared. Then, she chose the
‘Tree’ node. The reason for choosing this type was to arrange the data
systematically.

Figure 6. Creating Tree Nodes

Figure 7. The nodes were already created.

Figure 7 shows the result of creating nodes. As can be seen in the figure, there are
three types or models of nodes: ‘Free’, ‘Tree’ and ‘Case’. Here, the researcher used
the ‘Tree’ model and created nodes as they appeared on the screen.

Figure 8. Another way to create nodes

This was another way to create nodes. It could be started by opening a document
and then clicking ‘Coder’. The result is shown in Figure 9.

Analysis using ‘Node Search’

Figure 1

The first step was to display the project pad. Then the ‘Search’ button was clicked
to start the data analysis using ‘Node Search’. This was used to search the text coded
using the selected node. It was important to determine whether to use ‘no spread’ or
‘enclosing paragraph’, since this would influence the results. ‘No spread’ was used to
show just the coded text, while ‘enclosing paragraph’ was used to show the whole
paragraph. An example can be seen in Figure 9.

In this study, the researcher preferred using ‘no spread’, since she wanted to display
the emphasized texts. However, readers who would like to read the whole paragraphs
can refer to Appendix 3.
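To make the distinction between the two options concrete, the sketch below illustrates the underlying idea in Python. It is not NVivo’s own interface; the document, the coded passage and the function are invented for the example only.

# Illustration of 'No Spread' versus 'Enclosing Paragraph'.
# The document is stored as a list of paragraphs, and a coded passage
# records the document, the paragraph and the character span of the coding.
documents = {
    "ExampleDoc": [
        "First paragraph of the transcript.",
        "The coded words sit inside this longer paragraph of interview text.",
    ],
}
passage = {"document": "ExampleDoc", "paragraph": 1, "start": 4, "length": 11}

def retrieve(p, spread="no spread"):
    """Return only the coded text ('no spread') or its whole paragraph."""
    paragraph = documents[p["document"]][p["paragraph"]]
    if spread == "no spread":
        return paragraph[p["start"]:p["start"] + p["length"]]
    return paragraph  # 'enclosing paragraph' also shows the un-coded text

print(retrieve(passage))                         # -> "coded words"
print(retrieve(passage, "enclosing paragraph"))  # -> the full paragraph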

Figure 2

The next step involved clicking ‘Node’, choosing ‘No Spread’ and clicking ‘Run
Search’. As a result, Figure 3 emerged. Then, by clicking ‘Choose’, the researcher
could select the type of node wanted. Here the researcher chose the ‘Trees’ node,
because she had previously set her data in ‘Trees’ nodes.

Figure 3

Figure 4

The researcher started from the first node ‘Definition’. This node contained two main
descendants, and each of them had children or sub-descendants. The researcher
needed to analyze each sub-descendant individually, because these would become the
answers to the research questions. An example can be seen in Figure 5, where the
researcher chose the first main node ‘Definition’, and this node had two descendants;
one of them was ‘Standards in the SACSA Framework’. The node ‘Standards in the
SACSA Framework’ had many sub-descendants; one of them was ‘Two year band’.
The emphasis here was ‘Two year band’. Thus, the researcher wanted to see the
respondents who defined ‘Standards in the SACSA Framework’ as ‘Two year band’.
A similar procedure was followed for the remaining nodes until they were completed.
After clicking the ‘Ok’ button as in Figure 5, Figure 6 emerged.
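The descendant structure described here can be pictured as a small tree. The Python sketch below is illustrative only: the node names follow those mentioned in the text, while the respondent codes attached to ‘Two year band’ are hypothetical.

# Illustration of a 'Tree' node hierarchy using nested dictionaries.
# The leaf lists hold the respondents coded at that node (hypothetical here).
node_tree = {
    "Definition": {
        "Standards in the SACSA Framework": {
            "Two year band": ["daBrn", "BeChr"],
            "Outcome expectations": [],
        },
        "Benchmarks in the LAN Test": {
            "Minimum requirement": [],
        },
    },
}

def coded_at(tree, path):
    """Walk down the tree along 'path' and return what is stored at that node."""
    node = tree
    for name in path:
        node = node[name]
    return node

print(coded_at(node_tree, ["Definition",
                           "Standards in the SACSA Framework",
                           "Two year band"]))  # -> ['daBrn', 'BeChr']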

Figure 5

Figure 6

The next step was to select ‘No Spread’ and then click ‘Run Search’. As a result,
Figure 7 appeared. The results can be seen in Figure 9.

Figure 7

These were the results of the data analysis using ‘Node’ search with ‘No Spread’.

Figure 8.

These are the results of the data analysis using ‘Node’ search with ‘Enclosing
Paragraph’. This displays the whole paragraph including the un-coded texts.

Figure 9

Lastly, to move these results to a Word document, the researcher selected all the texts
and copied them to a Word document file. The copied results can be seen in the
Findings chapter.

Analysis using ‘Assay Tool’

APPENDIX 5
Interview questions
In these questions I am referring to the Standards in SACSA and the Benchmarks in
the Literacy and Numeracy Test.

11. What does your school understand by the terms of Standards in SACSA and
the Benchmarks in the Literacy and Numeracy Test?
12. How are the Standards in SACSA and the Benchmarks in the Literacy and
Numeracy Test currently used in your school by the principal and the
teachers? Is the use of Standards and the Benchmarks influenced by the
context in which your school is placed?
13. What is your personal opinion about the Standards in SACSA and
Benchmarks in the Literacy and Numeracy Test?
14. What teaching methods do you use for your students to achieve the Standards
in SACSA and the Benchmarks in the Literacy and Numeracy Test?
15. a. Do you think the Standards and the Benchmarks that are associated with
outcome statements are appropriate for Year 3 students?
or
b. Do you think the Standards and the Benchmarks that are associated with
outcome statements are appropriate for Year 5 students?
or
c. Do you think the Standards and the Benchmarks that are associated with
outcome statements are appropriate for Year 7 students?
2. Are there any conflicting ideas and practices on trying to achieve both the
Standards and the Benchmarks?
3. Are there any ways in which you think the standards and the Benchmarks
should be changed or modified?
4. How do you report to parents information about their children achieving the
Standards and the Benchmarks? Do you draw attention to the context in
which your school is set in discussion with parents?

5. How do you provide feedback to the children about their achieving the
Standards and the Benchmarks?
6. How do you report to the Year 4/ Year 6/ Year 8 teachers and secondary
school teachers about the students’ performance on the Standards and
Benchmarks at Year 3/ Year 5/Year 7, and do they use the information you
provide?

APPENDIX 6
Questionnaire
This questionnaire aims to get information about the background of the participants
in order to support the information acquired from the interviews.
Would you please spare some time to fill in this questionnaire?

Please tick the appropriate box and briefly provide details

1. Gender: Female Male

Please fill in the following boxes

2. Age:

3. Years of experience:

4. Type of education and details:

Pre-service Education

In-service Education

Post-graduate Education

5. Have you participated in any in-service or postgraduate education programs that

relate to the Standards in SACSA and the Benchmarks in the Basic Skill Tests? If

‘yes’ please provide details.

Yes

No

Thanks for your participation
