
Teaching and Teacher Education 36 (2013) 143–152. http://dx.doi.org/10.1016/j.tate.2013.07.010


What matters for student learning outcomes: A meta-analysis of studies exploring factors of effective teaching

Leonidas Kyriakides*, Christiana Christoforou, Charalambos Y. Charalambous
Department of Education, University of Cyprus, Nicosia, Cyprus

* Corresponding author. Department of Education, University of Cyprus, P.O. Box 20537, 1678 Nicosia, Cyprus. Tel.: +357 22892947; fax: +357 22894488. E-mail address: kyriakid@ucy.ac.cy (L. Kyriakides).

Highlights

• A meta-analysis was used to examine teaching factors related to student outcomes.
• The dynamic model of educational effectiveness was used as a guiding framework.
• An integrated approach of effective teaching is empirically supported.
• The need for conducting more experimental and longitudinal studies is highlighted.
• The study findings suggest teaching practices that can be used in teacher education.

a r t i c l e i n f o a b s t r a c t

Article history: Meta-analysis comprises a powerful tool for synthesizing prior research and empirically validating
Received 12 February 2013 theoretical frameworks. Using this tool and the dynamic model of educational effectiveness as a guiding
Received in revised form framework, in this paper we present a meta-analysis of 167 studies investigating the impact of teaching
29 May 2013
factors on student achievement. The factors of the dynamic model were found to be moderately asso-
Accepted 18 July 2013
ciated with student achievement; in contrast, factors not included in the model were weakly associated
with student learning, with the exception of two factors associated with constructivism. In discussing the
Keywords:
study findings, we consider their theoretical, methodological, and practical implications.
Quality of teaching
Meta-analysis
Ó 2013 Elsevier Ltd. All rights reserved.
Teaching factors
Educational effectiveness
Theory testing and development

1. Introduction

Several effectiveness studies conducted in different countries over the past three decades have consistently shown that the classroom level is more important than the school level in terms of explaining the variance in student achievement (e.g., Kyriakides, Campbell, & Gagatsis, 2000; Scheerens & Bosker, 1997). It has also been demonstrated that a large proportion of the classroom-level variance can be attributed to teachers' behavior in the classroom – namely what teachers do in the classroom and how they interact with their students – rather than to teacher personal characteristics, such as their beliefs (Creemers & Kyriakides, 2006; Muijs & Reynolds, 2010). In fact, without effective teacher guidance and instruction in the classroom, learning cannot be achieved. Despite the progress made over these years in highlighting the role that teachers have in promoting student learning, more work is needed to unpack and understand what exactly teachers do that promotes student outcomes. Aiming to contribute to the ongoing effort to understand how teachers – through instruction – contribute to student learning, in this paper we present the results of a meta-analysis exploring the effect of different teaching factors on student learning outcomes.

Meta-analyses are a powerful tool for providing an account and synthesis of the knowledge that has been accumulated in a given field and for yielding directions for future theoretical and empirical work. As scholars in any field have long known, a single study, no matter how seminal, cannot on its own satisfactorily contribute to the development and testing of theories that explain complex phenomena, such as that of teaching (cf. Leinhardt, 1993). By summarizing, integrating, and interpreting findings across studies, meta-analyses can help better unpack and understand multifaceted phenomena by providing a more robust basis for theory development than what a single study alone might accomplish (Creemers, Kyriakides, & Sammons, 2010; Lipsey & Wilson, 2001). Moreover, by calculating average effect sizes, meta-analysis can correct for the distorting effects of different error types (sampling

or measurement errors) that often produce the illusion of conflicting findings and, therefore, obscure "real" underlying patterns (Hunter & Schmidt, 2004).

Meta-analyses are usually conducted for two main reasons. First, researchers are interested in figuring out the state of accumulated knowledge in a given field. In this respect, the main aim of a meta-analysis is to outline the state of the art, which can be used by researchers and practitioners to advance work in this field from a theoretical, empirical, and practical standpoint. Second, researchers may also be interested in using the findings of meta-analyses to build a new theory, refine existing theories, or design new studies.

In this paper, we argue that meta-analyses can also be conducted to test and empirically validate educational effectiveness theoretical frameworks/models. This can be done by using these frameworks and models to inform the selection of studies and determine the classification of the factors/variables used within each study into different categories. Pursuing this approach helps researchers not only to integrate findings across different studies, but also to test the validity of a proposed theoretical framework/model more systematically. This type of meta-analysis may also reveal which aspects of the theoretical work under consideration have not yet been addressed by empirical studies and therefore need to be examined in future studies. Furthermore, such meta-analyses may contribute significantly to providing robust answers to policy makers, by helping them prioritize their efforts. This can be accomplished by showing which of the factors examined have a stronger effect on student learning and therefore should be prioritized, especially when fiscal constraints do not allow for incorporating multiple factors in intervention programs or in teacher preparation/education programs.

The meta-analysis reported in this manuscript uses the dynamic model of educational effectiveness (Creemers & Kyriakides, 2008) as a framework to organize and structure our exploration of teaching factors and their contribution to student learning. Through this meta-analysis, we do not only explore the effect sizes that different factors have on student learning, but mainly investigate the validity of the dynamic model at the teacher level, by examining the extent to which the findings of this meta-analysis justify the importance of the teaching factors included in the model. We also search for ways in which this theoretical model can be further developed, by identifying areas in which it could be improved.

Recognizing that effect sizes are likely to vary due to differences in procedures, instrumentation, study contexts, and treatments, this meta-analysis also aims to identify moderators that may account for the variation in the observed effect sizes. Identifying the impact of such moderators provides further insight into the potential influence of different teaching factors on student learning, for it can point to the conditions under which each factor operates. For example, such a meta-analysis may reveal that a specific teaching factor has differential effects on student learning, depending on the outcomes considered each time (e.g., cognitive or affective) or the educational level under consideration (e.g., primary, secondary, or tertiary education). In what follows, we provide a short account of the dynamic model of educational effectiveness upon which this meta-analysis was based. Before doing so, it is, however, informative to also consider two recent meta-analyses (i.e., Hattie, 2009; Seidel & Shavelson, 2007) that have also examined the impact of teacher factors on student achievement. In doing so, we highlight similarities and differences between the work reported in this paper and those meta-analyses in terms of their frameworks. In subsequent sections, we also compare these meta-analyses to that presented here in terms of their methodological approaches and results.

In terms of the frameworks employed in the three meta-analyses, we note that in our approach we focus on the impact of generic teaching factors on student learning, whereas in the other two meta-analyses the focus is broader. In particular, in Seidel and Shavelson's meta-analysis both generic and domain-specific teaching factors are taken into account, while in Hattie's meta-analysis the focus is even broader, covering not only teacher behaviors in the classroom, but also factors situated at the student level (e.g., student personal and background characteristics, motivation, persistence) and the school level (e.g., curriculum, school composition effects). Our focus on only specific teaching factors relates to the aim of this paper to use meta-analyses to test the validity of certain theoretical models. Given that the dynamic model examined in this study focuses only on generic teaching factors, in this meta-analysis we zoomed in on these factors.

Our approach is also more deductive, because it uses an already existing framework to classify the studies sampled, while in Hattie's meta-analysis, for example, the framework is generated more inductively by synthesizing available studies. Also, in our approach and in Seidel and Shavelson's work, the relatively limited focus on only teaching factors allows for going into more depth and examining specific teaching behaviors, instead of considering broader teaching approaches (e.g., reciprocal teaching, direct instruction, inquiry-based teaching). As will be evident in the next section, the dynamic model of educational effectiveness upon which this meta-analysis is based integrates teaching factors from different teaching approaches in an attempt to better understand what contributes to student learning.

2. Teaching factors of the dynamic model of educational effectiveness

The dynamic model takes into account the multilevel character of the influences on student learning, as revealed by a multitude of effectiveness studies conducted in several countries (Teddlie & Reynolds, 2000). Therefore, the model is multilevel in nature and refers to four different levels: the context, the school, the classroom/teacher, and the student level. Teaching has a central focus in the model: through its classroom/teacher level, the model examines eight teacher behaviors in the classroom that have the potential to promote student learning (see below). The model also assumes that factors at the school and context level have both direct and indirect effects on student achievement because of their potential to influence student learning either directly or through having an impact on the teaching and learning environment. Given that this meta-analysis is concerned with teaching factors, a brief description of the dynamic model at the classroom/teacher level is provided below.

Based on the main findings of teacher effectiveness research (e.g., Brophy & Good, 1986; Frazer, Walberg, Welch, & Hattie, 1987; Muijs & Reynolds, 2000; Rosenshine & Stevens, 1986), the dynamic model refers to eight factors which describe teachers' instructional role and are associated with student outcomes. These factors refer to observable instructional teacher behaviors in the classroom rather than to factors that may explain such behaviors, such as teacher beliefs/knowledge or their interpersonal competences. The eight factors included in the model are as follows: orientation, structuring, questioning, teaching-modeling, application, management of time, the teacher's role in making the classroom a learning environment, and classroom assessment. A short description of each factor follows.

A) Orientation: Refers to teacher behavior in providing the objectives for which a specific task, lesson, or series of lessons takes place and/or challenging students to identify the reason(s) for which the lesson involves a particular activity. It is anticipated

that the orientation process can make tasks and lessons meaningful to students, which in turn encourages their active participation in the classroom (e.g., De Corte, 2000; Paris & Paris, 2001).

B) Structuring: Rosenshine and Stevens (1986) pointed out that student achievement is maximized when teachers actively present materials and structure them by: (a) beginning with overviews and/or reviews of objectives; (b) outlining the content to be covered and signaling transitions between lesson parts; (c) calling attention to main ideas; and (d) reviewing main ideas at the end. Provision of summary reviews was also found to be important for student achievement, since these reviews integrate and reinforce the learning of major points (Brophy & Good, 1986). These structuring elements facilitate memorizing of information and allow for its apprehension as an integrated whole with recognition of the relationships between parts. Moreover, student achievement levels tend to be higher when information is presented in the form of repeating and reviewing general views and key concepts. The structuring factor also refers to the ability of teachers to increase the difficulty level of their lessons or series of lessons gradually (Creemers & Kyriakides, 2006).

C) Questioning: Based on the results of studies on teacher questioning skills and their association with student achievement, in the dynamic model this factor is defined according to five elements. First, teachers are expected to offer a mix of product questions (i.e., expecting a single response from students) and process questions (i.e., expecting students to provide more detailed explanations). Second, the length of pause following questions is taken into account, and it is expected to vary according to the level of difficulty of the questions. Third, question clarity is measured by investigating the extent to which students understand what is required of them (i.e., what the teacher expects them to do or find out). Fourth, with regard to the appropriateness of the difficulty level of the questions, it is expected that most questions should elicit correct answers and that most of the other questions should elicit overt, substantive responses (incorrect or incomplete answers), rather than failure to respond at all (Brophy & Good, 1986). In addition, optimal question difficulty should vary with context. Fifth, the way teachers deal with student responses to questions is investigated. Correct responses should be acknowledged as such, because this benefits not only their contributors but other students as well. In responding to students' partly correct or incorrect answers, effective teachers acknowledge whatever part may be correct and try to elicit an improved response (Rosenshine & Stevens, 1986). Therefore, effective teachers are more likely than other teachers to sustain the interaction with the original respondent by rephrasing the question and/or giving clues to its meaning, rather than terminating the interaction by providing the student with the answer or calling on another student to respond.

D) Teaching modeling: Although there is a long research tradition on teaching higher-order thinking skills and problem solving, these teaching and learning activities have received unprecedented attention during the last two decades due to the policy emphasis on the achievement of new goals of education. Thus, the teaching-modeling factor is associated with findings of studies revealing that effective teachers help pupils use strategies and/or develop their own strategies that can help them solve different types of problems (Grieve, 2010). In defining this factor, the dynamic model also addresses the properties of teaching-modeling tasks and the role teachers are expected to play to help students devise problem-solving strategies. Teachers may either present students with a clear problem-solving strategy or invite students to explain how they themselves would approach or resolve a particular problem and then use that information for promoting the idea of modeling. Recent research suggests that the latter approach may encourage students to not only use, but also develop, their own problem-solving strategies (Aparicio & Moneo, 2005; Gijbels, Van de Watering, Dochy, & Van den Bossche, 2006).

E) Application: Effective teachers use seatwork or small-group tasks to provide students with practice and application opportunities (Borich, 1992). Beyond looking at the number of application tasks given to students, the application factor investigates whether students are simply asked to repeat what has already been covered by the teacher or whether the application task is set at a more complex level than the lesson. It also examines whether the application tasks are used as starting points for the next step of teaching and learning.

F) The classroom as a learning environment: This factor comprises five elements, i.e., teacher–student interaction, student–student interaction, students' treatment by the teacher, competition between students, and classroom disorder. Classroom environment research has shown that the first two of these elements are important components of measuring classroom climate (e.g., Cazden, 1986; Den Brok, Brekelmans, & Wubbels, 2004; Harjunen, 2012). However, according to the dynamic model, what should be examined are the types of interactions that exist in a classroom, rather than how students perceive their teacher's interpersonal behavior. Specifically, the dynamic model is concerned with the immediate impact teacher initiatives have on establishing relevant interactions in the classroom and investigates the extent to which teachers are able to establish on-task behavior through promoting interactions. The other three elements refer to teachers' attempts to create an efficient and supportive environment for learning in the classroom (Walberg, 1986). These elements are measured by taking into account teachers' behavior in establishing rules and persuading students to respect and abide by the rules in order to create and sustain an effective learning environment in the classroom.

G) Management of time: According to the dynamic model, effective teachers are able to organize and manage the classroom environment as an efficient learning environment and thereby maximize engagement rates (Creemers & Reezigt, 1996). Therefore, management of time is considered an important indicator of teachers' ability to manage the classroom effectively (Kyriakides & Tsangaridou, 2008).

H) Assessment: Assessment is seen as an integral part of teaching (Stenmark, 1992), and formative assessment, in particular, has been shown to be one of the most important factors associated with effectiveness at all levels, especially at the classroom level (e.g., De Jong, Westerhof, & Kruiter, 2004; Kyriakides, 2005; Shepard, 1989). Therefore, information gathered from assessment is expected to be used to enable teachers to identify their students' needs, as well as to evaluate their own practice. In addition to the quality of the data emerging from teacher assessment (i.e., whether they are reliable and valid), the dynamic model is also concerned with the extent to which the formative rather than the summative purpose of assessment is achieved.

Table 1 presents sample indicators used to identify the eight factors in the studies selected for the meta-analysis. It also shows that the eight factors of the dynamic model do not refer to a single teaching approach. Instead, the model incorporates aspects from different instructional approaches, given that effective teaching can combine elements of different approaches. In this respect, the model adopts a more integrated approach in which both direct instruction (cf. Joyce, Weil, & Calhoun, 2000) and approaches associated with constructivism are considered. In particular, the dynamic model refers to skills associated with direct teaching and mastery learning, such as structuring and questioning, but also incorporates practices such as orientation and modeling, which are associated with constructivist theories (cf. Brekelmans, Sleegers, & Fraser, 2000; Choi & Hannafin, 1995; Savery & Duffy, 1995; Vermunt & Vershaffel, 2000). This integration of approaches was purposive,

because of their different affordances. For example, direct and active teaching may be effective for achieving objectives related to acquiring knowledge and skills, while more constructivist approaches can be more effective for accomplishing aims related to higher-order cognitive activities (Campbell, Kyriakides, Muijs, & Robinson, 2003). The effectiveness of these approaches may also hinge on learners' characteristics, such as age, developmental stage, and abilities (Kyriakides & Creemers, 2009). For example, more structured ways of teaching, as included in direct and active instruction, may be more suitable for younger students in their earlier stages of learning and for more disadvantaged students (Muijs & Reynolds, 2010). On the other hand, goal orientation or self-regulation of learning may be more appropriate for high-ability students or students at later stages of learning (Tynjälä, 1999).

The meta-analysis reported here aims to test these assumptions and, in doing so, examine the extent to which there is a need for an integrated approach in defining effective teaching. Moreover, this meta-analysis investigates the extent to which the effect sizes of factors associated with one or the other teaching approach depend on students' developmental level and/or the type of learning outcomes examined. In this way, the abovementioned assumptions about the added value of each teaching approach can be explored.

Table 1
Sample indicators of the eight teaching/classroom-level factors of the dynamic model.

1) Orientation: making explicit the importance of engaging students in certain tasks/activities; providing students opportunities to identify the significance of engaging in certain tasks.
2) Structuring: summarizing the main points of the lesson; gradually increasing the level of difficulty of the assigned tasks during the lesson; connecting previous lessons to the lesson of the day.
3) Questioning: the type of the questions asked (process vs. product); the clarity of the questions asked; the type of feedback provided.
4) Teaching modeling: strategies for solving problems; strategies for preparing the outline of an essay/summary.
5) Application: opportunities to practice a skill or a procedure presented in the lesson; opportunities to apply a formula to solve a problem; opportunities to transfer knowledge to solve everyday problems.
6) The classroom as a learning environment: opportunities for students to interact in different settings; teachers' dealing with misbehavior; interactions between the teacher and the student; students' perceived treatment by the teacher (e.g., fairness, caring).
7) Management of time: finishing the lesson on time; minimizing transition time; maximizing student time on task.
8) Assessment: frequency of administering various assessment forms; formative use of assessment; reporting to parents.

3. Methods

3.1. Selecting studies

For the selection of studies included in this meta-analysis we searched documentary databases containing abstracts of empirical studies. The following databases were considered: Educational Resources Information Centre, Educational Research Abstracts Online, Social Sciences Citation Index, Educational Administration Abstracts, SCOPUS, ProQuest 5000, and PsycArticles. We also paged through volumes of education-focused peer-reviewed journals with an interest in effective teaching, including the journals School Effectiveness and School Improvement, British Educational Research Journal, Teaching and Teacher Education, Learning and Instruction, Oxford Review of Education, and Learning Environment Research. Finally, relevant reviews of teacher effectiveness studies (e.g., Frazer et al., 1987; Hattie, 2009; Scheerens & Bosker, 1997; Seidel & Shavelson, 2007) and handbooks focusing on educational effectiveness (e.g., Teddlie & Reynolds, 2000; Townsend, 2007) were scrutinized for references to empirical studies.

3.2. Setting inclusion criteria

A study was included in our sample if it met three criteria. First, we selected studies conducted during 1980–2010 that had been purposely designed to investigate the contribution of teacher classroom behaviors to student outcomes. Second, the selected studies should include explicit and valid measures of student achievement in relation to cognitive, affective, or psychomotor outcomes of schooling. Studies that used more global criteria for academic outcomes, such as dropout rates, grade retention, and enrollment in top universities, were also included. Finally, given the focus of this meta-analysis, a study was included if it also had measures of specific teaching factors and provided information on the methods used to measure each factor. The studies that met all three criteria and were therefore included in our analysis are listed in Appendix A.

We note here that, to include more studies in this meta-analysis, we decided to err on the side of being more inclusive rather than exclusive. Therefore, we used minimal quality criteria for study selection with respect to both the methods used to measure student learning outcomes and the teaching factors. This approach enabled us to explore whether our decision to be more inclusive had any impact on the observed effect sizes. We did so by treating methodological variation among the selected studies as an empirical matter to be examined and by investigating the effect of methodological characteristics of the selected studies on the study findings (cf. Lipsey & Wilson, 2001). We recognize, however, that this decision might have an impact on the magnitude of the effect sizes yielded in the study. To test for this possibility, we examined the moderating effect of both the type of outcomes examined in the study and the type of studies employed in the meta-analysis. As will be shown later on, the type of outcomes had no significant effect on the functioning of the factors examined in the study, while the type of study did have an effect, which we discuss toward the end of the paper.

3.3. Calculating effect sizes

To calculate the effect of each teaching factor, the Fisher's Z transformation of the correlation coefficient was employed. Since not all studies presented their results in terms of correlations, all other effect size measures were transformed into correlations (r) using the formula presented by Rosenthal (1994). For small values of the correlation coefficient, Zr and r do not differ significantly (see also Hunter & Schmidt, 2004). Because meta-analyses are often based on effect sizes expressed as Cohen's d, it was taken into account that Cohen's d (Cohen, 1988) is approximately twice the size of the correlation coefficient when the latter is small (i.e., −0.20 < r < 0.20). Specifically, the three statistics d, t, and r are all algebraically transformable from one to the other (Creemers et al., 2010).
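As a rough illustration of these conversions (not the authors' own code), the Python sketch below converts Cohen's d and t statistics to correlations and applies the Fisher's Z transformation; the equal-group-size assumption behind the d-to-r formula and the example values are assumptions of the sketch.

```python
import numpy as np

def d_to_r(d):
    """Convert Cohen's d to a correlation r (assuming roughly equal group sizes)."""
    return d / np.sqrt(d ** 2 + 4.0)

def t_to_r(t, df):
    """Convert an independent-samples t statistic to a correlation r."""
    return np.sqrt(t ** 2 / (t ** 2 + df))

def fisher_z(r):
    """Fisher's Z transformation of a correlation coefficient."""
    return np.arctanh(r)  # equivalent to 0.5 * ln((1 + r) / (1 - r))

# For small effects, r, Zr, and d/2 stay close to one another,
# which is the approximation noted in the text above.
for d in (0.2, 0.4, 0.8):
    r = d_to_r(d)
    print(f"d = {d:.1f}  ->  r = {r:.3f},  Zr = {fisher_z(r):.3f}")
```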

The meta-analysis was conducted using MLwiN (Goldstein et al., 1998). Specifically, we adopted the procedure suggested by Lamberts and Abrams (1995), which was also used in a number of recent meta-analyses in the area of teacher and school effectiveness (e.g., Kyriakides, Creemers, Antoniou, & Demetriou, 2010; Scheerens & Bosker, 1997; Witziers, Bosker, & Krüger, 2003). For the interested reader, Appendix B presents more specific information on the multilevel model used to conduct the meta-analysis. It is also important to note that the multilevel analysis was conducted twice. In the first round, all studies were considered; in the second round, the so-called sensitivity analysis, outlier studies were removed from the sample to examine the robustness of the findings yielded from the first round.
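The actual multilevel specification is given in Appendix B and was fitted in MLwiN; purely as an illustration of the underlying logic, the sketch below pools Fisher's Z effect sizes for one factor under a simple random-effects model (DerSimonian–Laird estimation). It is a simplified stand-in rather than the authors' procedure, and the example effect sizes and sample sizes are hypothetical.

```python
import numpy as np

def pool_random_effects(z, n):
    """Pool Fisher's Z effect sizes with a DerSimonian-Laird random-effects model.

    z : array of study-level Fisher's Z effect sizes
    n : array of study sample sizes (sampling variance of Z taken as 1/(n - 3))
    """
    z, n = np.asarray(z, float), np.asarray(n, float)
    v = 1.0 / (n - 3.0)                         # within-study (sampling) variances
    w = 1.0 / v                                 # fixed-effect weights
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)          # heterogeneity statistic Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(z) - 1)) / c)     # between-study variance estimate
    w_star = 1.0 / (v + tau2)                   # random-effects weights
    z_pooled = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return z_pooled, se, tau2

# Hypothetical effect sizes for one teaching factor (e.g., structuring)
z_effects = [0.31, 0.42, 0.28, 0.39, 0.35]
sample_sizes = [420, 1150, 310, 980, 560]
print(pool_random_effects(z_effects, sample_sizes))
```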
3.4. Reflection on the methods employed

We pause here to consider the methodological similarities and differences among the meta-analysis presented in this paper and the two recent meta-analyses discussed above (i.e., Hattie, 2009; Seidel & Shavelson, 2007). First, we note that in both the work presented here and in Seidel and Shavelson's meta-analysis, different types of student outcomes were examined, be they cognitive, affective, or psychomotor; in contrast, Hattie's meta-analysis focuses only on student cognitive outcomes. By including multiple types of student outcomes, we tested whether specific teaching factors have an impact on various types of outcomes, while other factors contribute more to specific types of student outcomes. To address this question, we made use of multi-level modeling techniques, which enable testing whether there was variation in the reported effect sizes for each factor, depending on the type of the outcome examined. By employing multi-level approaches in our work (see Appendix B) – approaches not used in the other two meta-analyses – we were also able to test whether variation in reported effect sizes could be explained by the different types of studies sampled for the meta-analysis.

Second, in our work and in Seidel and Shavelson's, the studies sampled were of mixed type (i.e., cross-sectional, experimental, quasi-experimental, and longitudinal); we speculate that, by referring to "innovation" studies (p. 6), Hattie's meta-analysis was also based on quasi-experimental and experimental studies. Hattie (2009) himself warned that the effects reported in such studies might not generalize, since "these effects from innovations may not be the same as the effects of teachers in regular classrooms" (p. 6) – but did not examine this possibility. In contrast to Hattie's and Seidel and Shavelson's meta-analyses, we empirically test for this potential, by exploring the moderating role of the type of the studies on the effect sizes explored in this meta-analysis.

Finally, we note that the three meta-analyses differ considerably in the number of studies employed and the countries in which these studies were conducted. Whereas Hattie's meta-analysis included thousands of studies (since it was a meta-analysis of about 800 meta-analyses), in Seidel and Shavelson's and in our work a much smaller number of studies was considered (112 in the former and 167 in the latter). However, this significant difference in the studies analyzed was due to Hattie's broader focus. We also note that in our meta-analysis we have included at least 16 studies not incorporated in Hattie's meta-analysis (see footnote 1); additionally, because our meta-analysis is more recent than that of Seidel and Shavelson, about two thirds of the studies reviewed in this paper – indicated with asterisks in Appendix A – were not included in Seidel and Shavelson's work. By including studies conducted in several countries in our work, we were also able to examine whether the country of origin moderated the effects of the teacher factors examined (see Appendix B) – something that was not pursued in the other two meta-analyses.

1. This is a very conservative estimate, since it takes into consideration the publication year of the latest meta-analysis in Hattie's work (i.e., 2008). Of course, given that his work was based on meta-analyses and not single studies, it is highly likely that the studies considered in his work were published earlier than 2008.

4. Results

The study findings are organized into two parts. In the first part, we present the findings of the meta-analysis. Given our aim to examine the validity of the dynamic model, the teaching factors considered in the meta-analysis were classified into two broad categories: those included (Category A) and those not included (Category B) in the dynamic model. In the second part, we consider whether the results of the meta-analysis were moderated by a set of other factors.

Table 2 presents the results of the meta-analysis carried out in this study. As this table shows, besides the eight factors of the dynamic model, our analysis incorporated five additional factors: self-regulation, concept mapping, computer use, interpersonal behavior, and classroom organization. All these five factors are concerned with teacher behaviors – which is the focus of this meta-analysis. Their inclusion was considered important even though they did not pertain to the factors of the dynamic model because, by identifying factors that had significant effects, one could consider possible ways of modifying the model.

Apart from listing the number of studies explored per factor and classifying these studies per type of learning outcome and educational level, Table 2 also reports the average effect sizes of all the factors examined. The following observations arise from this table. First, we observe an uneven number of studies across the 13 factors under exploration. While concept mapping was examined in only three studies, the classroom as a learning environment was examined in 78 studies. In general, with the exception of self-regulation, the studies concerning the factors not included in the dynamic model were much fewer than those pertaining to the teaching factors of the model. Second, experimental studies were a small portion of the studies examined in this meta-analysis. Third, among the types of learning outcomes examined, cognitive and meta-cognitive outcomes had the lion's share, with much fewer studies addressing affective, psychomotor, or behavioral outcomes. Fourth, most of the studies were conducted at the primary and secondary level; we identified only 10 studies (about 6% of the studies examined) that were conducted at the tertiary level. All these observations point to directions for further research, which are discussed in the next section. It is also important to notice that the majority of the studies under exploration investigated differences in teacher effectiveness at a single point in time or collected data at two time points only. Thus, longitudinal studies with more data collection points are needed.

Turning to the average effect sizes of the factors examined in the meta-analysis, we observe that seven of the eight factors of the dynamic model had moderate effect sizes, ranging from 0.346 to 0.457. Even the factor with the lowest effect size (application: 0.189) still had a significant effect on student learning. Taken together, then, these findings empirically support the validity of the dynamic model at the teacher/classroom level. Regarding the factors not included in the dynamic model, the bottom panel of Table 2 shows that three of the examined factors (i.e., computer use, interpersonal behavior, and classroom organization) were weakly associated with student learning (range of effects: 0.054–0.126). In contrast, two factors had notable average effect sizes (concept-mapping techniques: 0.754; self-regulation: 0.477). However, in the former case, it should not escape notice that only three studies were examined and all of them were experimental in nature; hence, the strong average effect size reported for concept mapping should not be dissociated from the nature of the studies considered for this factor. Further studies are therefore needed to investigate the generalizability of the observed effect size just reported.

Table 2
Characteristics of studies investigating the effect of teacher-level factors on student achievement and types of effects identified.

Teacher factor | Average effect | Participants | Studies: total / experimental(a) | Outcomes(b): cognitive (meta-cognitive) / affective / psychomotor / behavior | Education level: primary / secondary / both / tertiary

A) Included in the dynamic model
Orientation | 0.36 | 42,850 | 14 / 2 | 12 (0) / 5 / 1 / 0 | 5 / 8 / 1 / 0
Structuring | 0.36 | 157,783 | 35 / 1 | 32 (0) / 7 / 2 / 2 | 16 / 19 / 0 / 0
Modeling | 0.41 | 126,606 | 35 / 18 | 32 (5) / 7 / 0 / 0 | 16 / 17 / 1 / 1
Application | 0.18 | 146,010 | 27 / 1 | 26 (0) / 4 / 2 / 2 | 13 / 12 / 1 / 1
Questioning | 0.34 | 19,218 | 12 / 3 | 11 (0) / 3 / 0 / 2 | 6 / 6 / 0 / 0
Assessment | 0.34 | 166,593 | 30 / 2 | 27 (0) / 5 / 0 / 0 | 15 / 13 / 2 / 0
Time management | 0.35 | 124,178 | 30 / 0 | 28 (0) / 2 / 2 / 1 | 18 / 11 / 1 / 0
Classroom as a learning environment | 0.45 | 232,286 | 78 / 24 | 74 (1) / 18 / 0 / 5 | 35 / 36 / 2 / 5

B) Not included in the dynamic model
Self-regulation | 0.47 | 17,651 | 16 / 11 | 11 (5) / 8 / 0 / 1 | 7 / 9 / 0 / 0
Concept-mapping | 0.75 | 640 | 3 / 3 | 3 (1) / 0 / 0 / 0 | 1 / 1 / 0 / 1
Computer use | 0.20 | 5,812 | 8 / 6 | 6 (1) / 2 / 0 / 1 | 1 / 5 / 0 / 2
Interpersonal behavior | 0.16 | 35,020 | 18 / 0 | 9 (0) / 8 / 0 / 1 | 8 / 9 / 1 / 0
Classroom organization | 0.05 | 28,862 | 9 / 1 | 8 (0) / 2 / 0 / 2 | 4 / 5 / 0 / 0

(a) Some studies reported more than one observed effect.
(b) Some studies explored the effects of more than one type of outcome of schooling.

The next step in our meta-analysis was to examine whether the effect sizes reported above were moderated by factors including the type of the learning outcomes considered in the studies [i.e., (mathematics),2 language, science, affective, all other outcomes]; the educational level [i.e., (primary), secondary, tertiary]; the country in which the study was conducted [i.e., (USA), European countries, Asian countries, and other countries]; the study design employed [i.e., (cross-sectional studies), longitudinal, experimental, quasi-experimental, and outlier studies]; and the statistical techniques employed in the analysis [i.e., (single-level) or multi-level analysis]. To this end, we considered the mean effect sizes of six teacher factors included in the dynamic model (orientation, structuring, modeling, assessment, time management, and classroom as a learning environment) and of self-regulation, which is not included in the model. We did not include six of the factors considered above (two from the dynamic model and four not included in the model), because there were relatively few studies focusing on these factors and hence detecting any moderating effect of the contribution of these factors to student outcomes would be difficult. Our analysis showed a relatively large variation in effect sizes within and across studies, suggesting that multilevel analysis could identify moderators explaining the variation across and within studies. Table 3 reports the results of the multilevel analysis conducted toward this end.

2. Listed in parentheses is the reference group.
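The moderator analysis itself was run as a multilevel model in MLwiN (see Appendix B); as a rough illustration of the idea, the sketch below regresses study-level effect sizes on dummy-coded moderators using inverse-variance weights (a simple weighted least-squares meta-regression). It is a simplified stand-in for the authors' model, and the moderator codings and example data are hypothetical.

```python
import numpy as np

def meta_regression(z, v, moderators):
    """Weighted least-squares meta-regression of effect sizes on moderators.

    z          : array of Fisher's Z effect sizes
    v          : array of effect-size variances (e.g., 1/(n - 3), optionally plus tau^2)
    moderators : 2-D array of dummy-coded moderators (one column per contrast)
    """
    z = np.asarray(z, float)
    w = 1.0 / np.asarray(v, float)
    X = np.column_stack([np.ones_like(z), np.asarray(moderators, float)])
    W = np.diag(w)
    xtwx_inv = np.linalg.inv(X.T @ W @ X)
    beta = xtwx_inv @ X.T @ W @ z          # intercept + moderator contrasts
    se = np.sqrt(np.diag(xtwx_inv))        # approximate standard errors
    return beta, se

# Hypothetical data: effect sizes, variances, and two dummy moderators
# (secondary vs. primary phase, experimental vs. cross-sectional design)
z = [0.28, 0.41, 0.35, 0.52, 0.22, 0.44]
v = [0.010, 0.006, 0.012, 0.008, 0.015, 0.007]
mods = [[0, 0], [1, 0], [0, 1], [1, 1], [0, 0], [1, 1]]
beta, se = meta_regression(z, v, mods)
print("estimates:", beta.round(3), "standard errors:", se.round(3))
```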
The following observations arise from Table 3. First, the results show that only in a few cases did moderators have a significant relationship with the reported effect size of the seven factors under examination. Moreover, no moderator was found to have a significant relationship with the effect size of all factors. However, if we look across different educational levels, it appears that the effect sizes of two factors (modeling and self-regulation) are larger when considering older students; in contrast, the application factor seems to be more important for younger students. These findings seem to partly support the argument that factors associated with constructivism, such as modeling and self-regulation, are more important for older students. Although the effects of these three factors on different age groups of students vary, each factor has a significant effect on each age group, and hence all of them could be treated as generic factors. The differences in the effect sizes of these three factors could also be attributed to differences in the developmental stages of students in different levels of education as well as to differences in the functioning and the curriculum at each schooling level (Kyriakides & Creemers, 2009). Second, none of the country effects included in this analysis was found to predict the reported effect sizes; this supports the assumption that the factors included in the dynamic model, as well as self-regulation (which is not included in the model), seem to be generic. Third, the figures of Table 3 reveal that the effect sizes of the factors under exploration were not moderated by any of the schooling outcomes examined in this analysis. In contrast, we found that the study design did have a moderating effect on the average effect sizes: longitudinal studies were found to report larger effect sizes for the orientation factor, while experimental studies were shown to report larger effects for the modeling and the self-regulation factors. These differences do not question the generic nature of any of these factors, since each type of study shows that the factors had a significant impact on learning, although the magnitude of the effect varied. However, these findings seem to reveal that educational effectiveness researchers should consider the possibility of conducting experimental and longitudinal studies when exploring the impact of teacher factors. These results are also in line with an argument made earlier that the strong effect size found for concept mapping might have been an "artifact" of the type of studies (i.e., only experimental studies) considered for this teaching factor – a finding that is consistent with Hattie's meta-analysis, in which experimental studies were used and strong effect sizes were reported.

As mentioned above, the final step of this meta-analysis was to repeat the abovementioned statistical procedure with the outlier studies removed from the sample to check for the robustness of the findings (i.e., the sensitivity analysis).

Table 3
Predicting difference in effect sizes of each of the seven teacher factors. Cell entries are estimates (p values).

Predictor | Structuring | Modeling | Application | Assessment | Time management | Classroom as learning environment | Self-regulation
Intercept | 0.28 (.01) | 0.33 (.03) | 0.25 (.03) | 0.29 (.03) | 0.31 (.03) | 0.41 (.04) | 0.38 (.05)
Outcome
  Language | 0.05 (.71) | 0.07 (.61) | 0.11 (.41) | 0.08 (.54) | 0.05 (.78) | 0.07 (.61) | 0.09 (.14)
  Science | 0.04 (.81) | 0.05 (.72) | 0.09 (.52) | N.A. | 0.07 (.49) | 0.08 (.51) | N.A.
  Affective | 0.09 (.66) | 0.11 (.46) | 0.06 (.56) | 0.09 (.46) | N.A. | 0.13 (.06) | 0.06 (.78)
  All other outcomes | 0.04 (.23) | 0.06 (.53) | 0.06 (.33) | 0.08 (.53) | 0.06 (.39) | 0.09 (.43) | 0.04 (.82)
Phase of schooling
  Secondary | 0.09 (.14) | 0.22 (.01)* | 0.15 (.04)* | 0.05 (.33) | 0.08 (.21) | 0.07 (.19) | 0.15 (.03)*
  Tertiary | N.A. | N.A. | N.A. | N.A. | N.A. | N.A. | 0.24 (.01)*
Country
  European countries | 0.04 (.58) | 0.07 (.44) | 0.08 (.22) | 0.06 (.41) | 0.07 (.42) | 0.04 (.67) | 0.04 (.88)
  Asian countries | 0.08 (.45) | N.A. | 0.05 (.55) | 0.03 (.89) | 0.06 (.34) | 0.03 (.75) | N.A.
  All other countries | 0.06 (.34) | 0.04 (.88) | N.A. | 0.01 (.95) | N.A. | 0.05 (.42) | 0.03 (.91)
Type of study
  Longitudinal | 0.12 (.05)* | 0.08 (.12) | 0.05 (.29) | 0.11 (.05)* | 0.05 (.31) | 0.06 (.17) | N.A.
  Experimental | N.A. | 0.12 (.05)* | N.A. | N.A. | N.A. | 0.12 (.05)* | 0.13 (.04)*
  Quasi-experimental | N.A. | 0.19 (.01)* | N.A. | 0.06 (.21) | N.A. | 0.07 (.15) | 0.11 (.09)
  Outlier | N.A. | N.A. | 0.19 (.03)* | 0.14 (.04)* | 0.11 (.05)* | 0.14 (.05)* | N.A.
Analysis of data (multilevel) | 0.04 (.35) | 0.06 (.21) | 0.03 (.81) | 0.04 (.41) | 0.05 (.29) | 0.06 (.21) | 0.02 (.81)

N.A.: It was not possible to test the effect of this explanatory variable since almost all the studies which assessed the impact of this factor belong to only one of the two groups that are compared. *: A statistically significant effect at the .05 level was identified.

According to this analysis, only the effect of modeling was reduced when outlier studies were removed from the sample; yet, a positive and significant relationship between teaching modeling and student outcomes was still reported. Therefore, the sensitivity analysis seems to suggest that the effects of the factors of the dynamic model, as well as of self-regulation, are robust to outlier studies. Future analyses could also explore the robustness of the findings for other variables as well.
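As a rough sketch of such a sensitivity check (again, not the MLwiN procedure used in the study), the code below flags studies with large standardized residuals under an inverse-variance pooling and re-estimates the pooled effect without them; the residual threshold of 2 and the example data are assumptions.

```python
import numpy as np

def pool(z, v):
    """Inverse-variance pooled effect and its standard error."""
    w = 1.0 / v
    return np.sum(w * z) / np.sum(w), np.sqrt(1.0 / np.sum(w))

def sensitivity_check(z, v, threshold=2.0):
    """Re-pool after dropping studies whose standardized residuals exceed the threshold."""
    z, v = np.asarray(z, float), np.asarray(v, float)
    est_all, _ = pool(z, v)
    resid = (z - est_all) / np.sqrt(v)     # standardized residuals per study
    keep = np.abs(resid) <= threshold      # flag potential outlier studies
    est_trim, se_trim = pool(z[keep], v[keep])
    return est_all, est_trim, se_trim, np.where(~keep)[0]

# Hypothetical effect sizes and variances for one factor
z = [0.35, 0.31, 0.90, 0.28, 0.40]
v = [0.010, 0.008, 0.009, 0.012, 0.007]
print(sensitivity_check(z, v))
```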
studies reviewed were experimental, which, as explained below
5. Discussion and implications and as found in other meta-analyses (e.g., Seidel & Shavelson, 2007)
lend themselves better to identifying effects, should these exist.
Meta-analysis is a powerful tool for summarizing, integrating, One of the main findings of this meta-analysis is that the factors
and interpreting quantitative empirical research works. In this found to have an effect on student outcomesdbe they (meta)
paper, this tool was used to not only synthesize existing works on cognitive, affective, or psychomotordwere not associated only
the contribution of teaching factors to student learning, but to also with either direct and active teaching approaches or more
provide empirical validity to a theoretical model, namely the dy- constructivist approaches. For example, the analysis showed factors
namic model of educational effectiveness. To this end, the dynamic related to direct instruction (e.g., time management, structuring) or
model was used as a framework to select and organize 167 studies constructivism (e.g., orientation, modeling) to both contribute to
investigating the impact of several teaching factors on student student outcomes. This finding empirically corroborates the theo-
achievement. In this final part of the paper, we summarize the main retical underpinnings of the dynamic model which, by pursuing an
results of the meta-analysis and simultaneously consider their integrated approach, incorporates factors from both instructional
theoretical, methodological, and practical implications. Below, we perspectives at the teacher/classroom level (see Kyriakides, 2008).
discuss each in turn. More than that, from a theoretical standpoint, this finding sug-
Before doing so, it is important to note that our meta-analysis, gests that when it comes to teaching and the factors contributing to
like any other meta-analysis is limited in the sense that it de- it, imposing unnecessary dichotomies between different teaching
pends on the availability of existing studies. Further, as Hattie approaches might be counterproductive. Instead, by being agnostic
(2009) acknowledges, since the findings of meta-analyses are to the teaching approach pursued in instruction and by considering
based on past studies, it might be the case that other factors that are what exactly the teacher and the students do during the lesson and
important for teaching and which have not yet been or are how they interactdregardless of whether their actions and in-
currently investigated are omitted. Another limitation often plague teractions resonate more with the one or the other approach-
meta-analyses is the so-called publication bias: non-published dmight be more productive. Grossman and McDonald (2008) speak
studies (e.g., doctoral dissertations are seldom included in the to this idea, when they suggest that attempts to develop a frame-
analysis). To account for this limitation, in the present meta- work for studying and understanding teaching and its effects should
analysis we did search for doctoral dissertations, but with only a not prioritize the one over the other approach and should, instead,
handful of exceptions (five studies), in most cases, these disserta- be more inclusive and selective. As these scholars explain, often
tions either did not meet the criteria for inclusion or have already times, research on teaching and its effects suffers from “a kind of
been published and therefore were already included in the meta- historical amnesia, forgetting the past in the rush to invent the
analysis. Finally, although for several factors included in our future” (p. 200): researchers are often inclined to focus on “new”

ideas, tending to forget that other theoretical models and research paradigms that have been proven to "work" in the past should not be underestimated. Along these lines, this meta-analysis underscores the importance of focusing on teaching factors themselves rather than on the teaching approach with which they might be more associated. We note that in meta-analyses focusing more broadly on teaching approaches rather than on specific teaching behaviors (e.g., Hattie, 2009), one could impose unnecessary dichotomies by opting for the one or the other approach – even though this might not be the intention of the meta-analysis. This stands in sharp contrast to what practitioners themselves know very well from their daily practice: that good teaching is not necessarily associated with a particular teaching approach; instead, its quality resides in making judicious choices and uses of different components from different approaches in ways that benefit and reinforce student learning (Creemers, Kyriakides, & Antoniou, 2013).

This meta-analysis also showed two other factors not originally included in the dynamic model to have notable effect sizes in terms of student outcomes. Because one of these factors might have been an artifact of the nature of the studies reviewed – namely experimental studies, a point to which we return below – here we focus on the second factor: self-regulation. A closer consideration of this factor suggests that it bears resemblances to other factors already included in the dynamic model. For example, the orientation factor included in the model attends to the extent to which the teacher provides information to orient students to the importance of learning the new content. This could be considered a component of teachers' attempts to encourage self-regulation and help students understand the reasons for which they should be engaged in certain learning tasks. From a theoretical standpoint, then, such connections suggest that including self-regulation in the dynamic model might be a natural extension of the model. This is because this factor can help better capture the extent to which teaching not only gives students the opportunity to apply approaches presented in the lesson (i.e., application) or to develop certain strategies for dealing with particular problems (i.e., modeling), but can also help students gradually become independent learners and autonomous thinkers.

Finally, from a theoretical perspective, the findings of this meta-analysis – and particularly the weak to inexistent effect of several moderators on the factors examined in this meta-analysis – seem to suggest that the teaching factors examined in this study are, at least to some extent, generic in nature. This generic character was suggested by the fact that no statistically significant differences were found across different subject matters (e.g., mathematics, language, or science) or across different countries. In this respect, it can be argued that these factors might transcend different educational contexts and different subject matters. This argument can be further examined in future studies incorporating results from even more subject matters and from a greater range of countries.

Future studies could also further probe into the moderating effect of the educational level. In particular, this study showed some of the factors to have greater effects in primary education and some others in secondary education. To the extent that this pattern holds, it can then inform teacher preparation and professional development programs, a point we consider below. Future meta-analyses could also shed more light on the moderating effect of other student characteristics, be they individual traits or collective student characteristics at the classroom level. For example, is orientation or modeling equally productive in classes with a high variation in terms of student abilities or socioeconomic background? Is self-regulation equally conducive to learning regardless of the student composition? Additionally, given the recent emphasis on domain-specific teaching practices (e.g., Chen, Hendricks, & Archibald, 2011; Hill et al., 2008), the work reported here could also be expanded to include the effects of more subject-specific teaching factors on student learning. Future meta-analyses could also examine the effect of teacher knowledge and/or beliefs on teaching quality. These studies were not included in our analysis for two reasons. First, they were not part of the dynamic model, which focuses on observable teacher behaviors. Second, for the time being, studies exploring the effect of teacher knowledge on the quality of instruction, and consequently on student outcomes, are scarce (cf. Schechtman, Roschelle, Haertel, & Knudsen, 2010).

From a methodological standpoint, the findings of the present study offer three implications/directions for future work. First, the study suggests an additional purpose for conducting meta-analyses. Meta-analyses are typically conducted for two main reasons. The first reason relates to researchers being interested in appraising the accumulated knowledge in a field, with the main aim being to give specific answers to which factors or interventions contribute to some other variables. Researchers may also be interested in capitalizing on the existing findings to build a new theory or to design future studies. The approach employed in the present meta-analysis points to a third avenue for using such quantitative syntheses: it underlines the importance of using a theory-driven approach in selecting, organizing, and synthesizing existing studies. The dynamic model of educational effectiveness employed in this study offered such a framework, which guided the structure and classification of factors included in the analysis. This strategic organization allowed us to empirically validate the model, while also offering insights into ways in which the model can be extended and enriched. What this study then suggests is that a theory-driven meta-analysis not only helps in summarizing and integrating the findings across different studies, but can also contribute toward theory-building and modification (Creemers et al., 2010).

The second methodological implication of this study pertains to the need for conducting more longitudinal and experimental studies. If one considers the studies employed for examining the effect of concept mapping, one can easily conclude that experimental studies are particularly powerful in uncovering effects – should these effects exist. This finding is in line with Hattie's meta-analysis (2009), which largely drew on experimental/innovation studies and which yielded notably higher effect sizes compared to those yielded by the cross-sectional or the quasi-experimental studies employed in our meta-analysis. Longitudinal studies, which can help trace the impact of teaching factors over a period of time, are also very promising in exploring the effects of teaching on student learning. Yet, our meta-analysis suggested that both types of studies were very scarce. In pointing to the need for conducting more experimental and longitudinal studies, we are dismissive neither of the logistical and other difficulties associated with running such studies nor of Hattie's caution (2009) that "the mere involvement in asking questions about the effectiveness of any innovation [/experiment] may lead to an inflation of the effects" (p. 6). However, as our meta-analysis and others (e.g., Hattie, 2009; Scheerens & Bosker, 1997; Seidel & Shavelson, 2007) clearly suggest, the benefits accrued by running such studies can counterbalance the difficulties inherent in conducting them; additionally, careful designs of such studies might help alleviate some of the concerns raised above.

Third, from a methodological viewpoint, this meta-analysis suggests that future studies should also move beyond simply exploring cognitive outcomes, which occupied the lion's share in this meta-analysis, to also explore other student outcomes. Future studies could also consider extending their scope to incorporate tertiary education, which appeared to be the stepchild among the three educational levels examined in the study.

The findings of this meta-analysis may also provide implications for policy makers and practitioners, especially with regard to
The findings of this meta-analysis also have implications for policy makers and practitioners, especially with regard to teacher preparation and education programs. Recently, scholars have increasingly attended to what they define as "high-leverage" or "core teaching practices" (Ball, Sleep, Boerst, & Bass, 2009; Grossman, Hammerness, & McDonald, 2009), that is, teaching practices that can be enacted irrespective of the curricula used or the instructional approaches employed; can be practiced, learned, and improved; and have the potential to improve student learning. This meta-analysis provides empirical evidence about teaching factors/practices which have been shown, across different studies, to influence student outcomes. As such, the study findings can inform teacher pre-service and in-service programs. In particular, teacher educators, as well as those involved in professional development efforts, could enrich their programs by engaging pre-service and in-service teachers in discussions regarding the importance of these factors for student learning. More critically, however, they can also give prospective and practicing teachers the opportunity to rehearse and practice these factors in their teaching. For pre-service teachers, such opportunities can be afforded in microteaching environments, where, while working in a relatively "safe" environment with other fellow students and without the pressure of actual teaching conditions, novice teachers can experiment with incorporating different factors in their practice and receive specific and detailed feedback on their performance (Antoniou & Kyriakides, 2013). For in-service teachers, such opportunities arise when teachers have the opportunity to plan lessons that are undergirded by considerations of such factors, enact these lesson plans, reflect on them, and receive feedback on how they can further improve their practice (Creemers et al., 2013). In incorporating such factors into teacher preparation/education programs, attention should also be paid to the extent to which these factors are equally effective across different educational levels, something that the present meta-analysis seems to partly question and which definitely calls for more research and exploration.

Writing about the challenges faced by those involved in teacher preparation and ongoing education, Grossman et al. (2009) argued that one of the main obstacles in redefining and improving our programs lies in reaching consensus among teacher educators, supervisors, and practicing teachers regarding a set of core practices around which teacher preparation and professional development programs should be built. Results from this meta-analysis and from other similar meta-analyses can help these stakeholders make informed decisions: decisions that are both theory-driven and evidence-based. Perhaps it is only through such decisions that we might be able to restructure and improve teacher education and respond to our critics, who often question the effectiveness of teacher preparation/education programs in improving teacher abilities and, consequently, student learning.

Appendices A and B. Supplementary data

Supplementary data related to this article can be found at http://dx.doi.org/10.1016/j.tate.2013.07.010.

References

Antoniou, P., & Kyriakides, L. (2013). A dynamic integrated approach to teacher professional development: impact and sustainability of the effects on improving teacher behavior and student outcomes. Teaching and Teacher Education, 29(1), 1–12.
Aparicio, J. J., & Moneo, M. R. (2005). Constructivism, the so-called semantic learning theories, and situated cognition versus the psychological learning theories. Spanish Journal of Psychology, 8(2), 180–198.
Ball, D. L., Sleep, L., Boerst, T. A., & Bass, H. (2009). Combining the development of practice and the practice of development in teacher education. The Elementary School Journal, 109(5), 458–474.
Borich, G. D. (1992). Effective teaching methods (2nd ed.). New York: Macmillan Publishing Company.
Brekelmans, M., Sleegers, P., & Fraser, B. (2000). Teaching for active learning. In P. R. J. Simons, J. L. van der Linden, & T. Duffy (Eds.), New learning (pp. 227–242). Dordrecht: Kluwer Academic Publishers.
Brophy, J., & Good, T. L. (1986). Teacher behavior and student achievement. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 328–375). New York: Macmillan.
Campbell, R. J., Kyriakides, L., Muijs, R. D., & Robinson, W. (2003). Differential teacher effectiveness: towards a model for research and teacher appraisal. Oxford Review of Education, 29(3), 347–362.
Cazden, C. B. (1986). Classroom discourse. In M. C. Wittrock (Ed.), Handbook of research on teaching (pp. 432–463). New York: Macmillan.
Chen, W., Hendricks, K., & Archibald, K. (2011). Assessing pre-service teachers' quality teaching practices. Educational Research and Evaluation: An International Journal on Theory and Practice, 17(1), 13–32.
Choi, J. I., & Hannafin, M. (1995). Situated cognition and learning environments: roles, structures, and implications for design. Educational Technology Research and Development, 43(2), 53–69.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). New York: Academic Press.
Creemers, B. P. M., & Reezigt, G. J. (1996). School level conditions affecting the effectiveness of instruction. School Effectiveness and School Improvement, 7(3), 197–228.
Creemers, B. P. M., & Kyriakides, L. (2006). Critical analysis of the current approaches to modeling educational effectiveness: the importance of establishing a dynamic model. School Effectiveness and School Improvement, 17(3), 347–366.
Creemers, B. P. M., & Kyriakides, L. (2008). The dynamics of educational effectiveness: A contribution to policy, practice and theory in contemporary schools. London and New York: Routledge.
Creemers, B. P. M., Kyriakides, L., & Antoniou, P. (2013). Teacher professional development for improving quality in teaching. Dordrecht, the Netherlands: Springer.
Creemers, B. P. M., Kyriakides, L., & Sammons, P. (2010). Methodological advances in educational effectiveness research. London and New York: Routledge.
De Corte, E. (2000). Marrying theory building and the improvement of school practice: a permanent challenge for instructional psychology. Learning and Instruction, 10(3), 249–266.
De Jong, R., Westerhof, K. J., & Kruiter, J. H. (2004). Empirical evidence of a comprehensive model of school effectiveness: a multilevel study in mathematics in the 1st year of junior general education in the Netherlands. School Effectiveness and School Improvement, 15(1), 3–31.
Den Brok, P., Brekelmans, M., & Wubbels, T. (2004). Interpersonal teacher behaviour and student outcomes. School Effectiveness and School Improvement, 15(3/4), 407–442.
Fraser, B. J., Walberg, H. J., Welch, W. W., & Hattie, J. A. (1987). Syntheses of educational productivity research. International Journal of Educational Research, 11, 145–252.
Gijbels, D., Van de Watering, G., Dochy, F., & Van den Bossche, P. (2006). New learning environments and constructivism: the students' perspective. Instructional Science, 34(3), 213–226.
Goldstein, H., Rasbash, J., Plewis, I., Draper, D., Browne, W., Yang, M., et al. (1998). A user's guide to MLwiN. London: Institute of Education.
Grieve, A. M. (2010). Exploring the characteristics of "teachers for excellence": teachers' own perceptions. European Journal of Teacher Education, 33(3), 265–277.
Grossman, P., Hammerness, K., & McDonald, M. (2009). Redefining teaching, re-imagining teacher education. Teachers and Teaching: Theory and Practice, 15(2), 273–289.
Grossman, P., & McDonald, M. (2008). Back to the future: directions for research in teaching and teacher education. American Educational Research Journal, 45(1), 184–205.
Harjunen, E. (2012). Patterns of control over the teaching–studying–learning process and classrooms as complex dynamic environments: a theoretical framework. European Journal of Teacher Education, 35(2), 139–161.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.
Hill, H. C., Blunk, M., Charalambous, C. Y., Lewis, J., Phelps, G. C., Sleep, L., et al. (2008). Mathematical knowledge for teaching and the mathematical quality of instruction: an exploratory study. Cognition and Instruction, 26, 430–511.
Hunter, J. E., & Schmidt, F. L. (2004). Methods of meta-analysis: Correcting error and bias in research findings (2nd ed.). Thousand Oaks, CA: Sage.
Joyce, B., Weil, M., & Calhoun, E. (2000). Models of teaching. Boston: Allyn & Bacon.
Kyriakides, L. (2005). Extending the comprehensive model of educational effectiveness by an empirical investigation. School Effectiveness and School Improvement, 16(2), 103–152.
Kyriakides, L. (2008). Testing the validity of the comprehensive model of educational effectiveness: a step towards the development of a dynamic model of effectiveness. School Effectiveness and School Improvement, 19(4), 429–446.
Kyriakides, L., Campbell, R. J., & Gagatsis, A. (2000). The significance of the classroom effect in primary schools: an application of Creemers' comprehensive model of educational effectiveness. School Effectiveness and School Improvement, 11(4), 501–529.
Kyriakides, L., & Creemers, B. P. M. (2009). The effects of teacher factors on different outcomes: two studies testing the validity of the dynamic model. Effective Education, 1(1), 61–85.
Kyriakides, L., Creemers, B., Antoniou, P., & Demetriou, D. (2010). A synthesis of studies searching for school factors: implications for theory and research. British Educational Research Journal, 36(5), 807–830.
Kyriakides, L., & Tsangaridou, N. (2008). Towards the development of generic and differentiated models of educational effectiveness: a study on school and teacher effectiveness in physical education. British Educational Research Journal, 34(6), 807–838.
Lamberts, P. C., & Abrams, K. R. (1995). Meta-analysis using multilevel models. Multilevel Modeling Newsletter, 7(2), 17–19.
Leinhardt, G. (1993). On teaching. In R. Glaser (Ed.), Advances in instructional psychology (Vol. 4, pp. 1–54). Hillsdale, NJ: Lawrence Erlbaum Associates.
Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Thousand Oaks, CA: Sage.
Muijs, D., & Reynolds, D. (2000). School effectiveness and teacher effectiveness: some preliminary findings from the evaluation of the mathematics enhancement programme. School Effectiveness and School Improvement, 11(3), 247–263.
Muijs, D., & Reynolds, D. (2010). Effective teaching. Evidence and practice. London: Sage.
Paris, S. G., & Paris, A. H. (2001). Classroom applications of research on self-regulated learning. Educational Psychologist, 36(2), 89–101.
Rosenshine, B., & Stevens, R. (1986). Teaching functions. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 85–107). New York: Macmillan.
Rosenthal, R. (1994). Parametric measures of effect size. In H. Cooper, & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 231–245). New York: Russell Sage.
Savery, J. R., & Duffy, T. M. (1995). Problem based learning: an instructional model and its constructivist framework. Educational Technology, 35(5), 31–38.
Schechtman, N., Roschelle, J., Haertel, G., & Knudsen, J. (2010). Investigating links from teacher knowledge, to classroom practice, to student learning in the instructional system of the middle-school mathematics classroom. Cognition and Instruction, 28(3), 317–359.
Scheerens, J., & Bosker, R. J. (1997). The foundations of educational effectiveness. Oxford: Pergamon.
Seidel, T., & Shavelson, R. J. (2007). Teaching effectiveness research in the past decade: the role of theory and research design in disentangling meta-analysis results. Review of Educational Research, 77(4), 454–499.
Shepard, L. A. (1989). Why we need better assessment. Educational Leadership, 46(2), 4–8.
Snijders, T., & Bosker, R. (1999). Multilevel analysis: An introduction to basic and advanced multilevel modeling. London: Sage.
Stenmark, J. K. (1992). Mathematics assessment: Myths, models, good questions and practical suggestions. Reston, VA: NCTM.
Teddlie, C., & Reynolds, D. (2000). The international handbook of school effectiveness research. London: Falmer Press.
Townsend, T. (2007). International handbook of school effectiveness and improvement. Dordrecht, the Netherlands: Springer.
Tynjälä, P. (1999). Towards expert knowledge? A comparison between a constructivist and a traditional learning environment in the university. International Journal of Educational Research, 31(5), 357–416.
Vermunt, J., & Verschaffel, L. (2000). Process-oriented teaching. In R. J. Simons, J. van der Linden, & T. Duffy (Eds.), New learning (pp. 209–225). Dordrecht, the Netherlands: Kluwer.
Walberg, H. J. (1986). Synthesis of research on teaching. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 214–229). New York: Macmillan.
Witziers, B., Bosker, R. J., & Krüger, M. L. (2003). Educational leadership and student achievement: the elusive search for an association. Educational Administration Quarterly, 39(3), 398–423.

Further reading

Raudenbush, S. W. (1994). Random effects models. In H. Cooper, & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 301–323). New York: Russell Sage.
Raudenbush, S. W., & Bryk, A. S. (1985). Empirical Bayes meta-analysis. Journal of Educational Statistics, 10, 75–98.