
IMPROVING EDUCATIONAL QUALITY THROUGH ENHANCING COMMUNITY PARTICIPATION: RESULTS FROM A RANDOMIZED FIELD EXPERIMENT IN INDONESIA

Menno Pradhan+, Amanda Beatty*, Armida Alisjahbana++, Daniel Suryadarma#, Maisy Wong^, Arya Gaduh**, Rima Prama Artha^^

April 2013

Abstract
Education ministries worldwide have promoted community engagement through school committees consisting of parents, teachers, school officials, and community members. This paper presents results from a large field experiment testing alternative approaches to strengthening school committees in public schools in Indonesia. Two novel treatments focus on institutional reforms. First, some schools were randomly assigned to implement elections of school committee members. Another treatment facilitated joint planning meetings between the school committee and the village council (linkage). Two more common treatments, grants and training, provided resources to existing school committees. We find that institutional reforms, in particular linkage and elections combined with linkage, are the most cost-effective at improving learning.

JEL codes: I25, O1

+ VU University Amsterdam & University of Amsterdam, m.p.pradhan@vu.nl. # Australian National University, daniel.suryadarma@anu.edu.au. * Mathematica Policy Research, abeatty@mathematica-mpr.com. ^ University of Pennsylvania, maisy@wharton.upenn.edu. ++ Universitas Padjadjaran, Bandung, Indonesia, armida.alisjahbana@fe.unpad.ac.id. ** University of Southern California (USA), abgaduh@gmail.com. ^^ National Graduate Institute for Policy Studies (Japan), rimaprama@yahoo.com.

I. Introduction1

There is growing interest in improving the quality of education in developing countries by enhancing community participation (Mansuri and Rao, 2012; Stiglitz, 2002). The argument is that communities can contribute to improved service delivery because they can easily observe its quality and have a direct incentive to improve it (World Bank, 2003). However, organizing effective community participation is a time-intensive task (Banerjee and Duflo, 2008). Many countries have created local institutions to coordinate community participation in education, such as school committees, parent-teacher associations, or Village Education Committees (VECs) in India. Yet these government-sponsored institutions often do not live up to expectations (Banerjee and Duflo, 2008; Bruns, Filmer and Patrinos, 2011). Why do these institutions fail? And is investing in these institutions a viable strategy to improve learning outcomes? There are several randomized evaluations aimed at enhancing community participation through these school

1 A large number of people contributed to the design, implementation and supervision of this research project. From the Indonesian Ministry of National Education: Bambang Indriyanto, Sri Renani Pantjastuti, Sri Amien, Dasim Budimansyah, Agus Haryanto, Yadi Haryadi, Neneng Kodri, Suparlan, Anen Tumanggung, Yudistira Widiasana, Diana Sufa, Ismulyanto Apriwibowo. From the World Bank and affiliates: Vivi Alatas, Desmond Ang, Chris Bjork, Esther Duflo, Scott Guggenheim, Djoko Hartono, Dedy Junaedi, Siswantoro, Rosfita Roesli, Chris Thomas, Jeremy Tobacman, Tri Yuwono. We are grateful to Matt Stevens, Deborah Cobb-Clark, Tue Gørgens, Andrew Leigh, Matthew Grant Wai-Poi, Astrid Zwager, Claudio Ferraz and Harry Patrinos for helpful comments and suggestions. We also thank anonymous referees and the editor for their comments. We acknowledge financial support from the Japan Social Development Fund and the Dutch Government. Maisy Wong is grateful for support from the Zell/Lurie Real Estate Center. The firm PPA and Moores Roland implemented the intervention and survey, respectively. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent. Ying Chen provided excellent research support.

committees, but the results are mixed. Some studies find that interventions providing additional resources, such as block grants and training, improve learning outcomes, while others find no effect.2 Merely informing communities about the state of service delivery seems insufficient to improve services (Banerjee et al., 2010; Kremer and Holla, 2009). This paper investigates the role of school committees in improving education quality in public schools in Indonesia. Our study is a randomized evaluation comprising 520 schools in Central Java, from January 2007 to October 2008. We have four main treatments. The first two are novel amongst the many field experiments in the community participation literature. These treatments are institutional reforms that improve the social capital of the school committee by strengthening its trustworthiness and its relationship with the community (Ostrom and Ahn, 2009). The first treatment supported schools to implement democratic elections of school committee members. The second treatment linked school committees to the democratically elected village council by facilitating a series of joint planning meetings between the two groups (we call this linkage). We benchmarked these two novel treatments against more common treatments that improve the financial and human capital of existing school

2 Gertler, Patrinos and Rodríguez-Oreggia (2010) find that providing block grants in Mexico improves school quality, but Blimpo and Evans (2010) find that providing block grants, or grants and training, in The Gambia has no effect on test scores. Khattri, Ling and Jha (2010) find small effects. Duflo, Dupas and Kremer (2009) find that training school committee members in Kenya only impacts learning if interacted with another treatment that provides contract teachers.

committees: providing block grants and training. One reason our institutional reforms are attractive is their low financial cost. Compared to the treatments that provide resources, linkage and elections are extremely cheap to implement (the linkage intervention cost $125 and elections cost $174 per school). The training and grant treatments, on the other hand, were rather expensive (the grant was $870 and the training cost $360 per school). We contribute to the literature on community participation and education in several ways.3 First, our institutional reform treatments are, to our knowledge, the first of their kind in this literature. Many countries already have school committees devoted to enhancing community participation. The puzzle is why these institutions are failing. Analyzing the impact of reforming institutions is hard because randomly assigning a political process is seldom possible (Acemoglu, Johnson and Robinson, 2001). A unique opportunity arose for this study because the Ministry of Education in Indonesia was our partner. This partnership allowed us to introduce institutional reforms to these government-sponsored institutions that would not have been possible in Indonesia without the support of the government. For example, our linkage treatment had school committees and school heads in public schools sign a joint action plan with another government-sponsored entity, the village council.

3 See Banerjee et al. (2010), Blimpo and Evans (2010), Bruns, Filmer and Patrinos (2011), Duflo, Dupas and Kremer (2009), Khattri, Ling and Jha (2010), and Pandey, Goyal and Sundararaman (2009). Our study also relates to the broader literature on community participation and public service delivery; see Björkman and Svensson (2009), Olken (2007), and Olken, Onishi and Wong (2012).

Another novel institutional reform treatment is the introduction of elections of school committee members. To our knowledge, this is the first paper that credibly estimates the causal impact of such elections using a randomized evaluation.4 Schools randomly assigned the election treatment showed an increase in the community's awareness of the school committee and substantial turnover in committee membership, with many newly elected members. We also contribute to the literature by evaluating a rich set of interventions in a common context. There is a large group of randomized evaluations of community participation, but it is hard to compare treatments across countries because context, especially institutional features, can differ. We use a rich set of interventions and a large collection of intermediate outcome variables that allow us to unpack the dynamics of community participation interventions, something that has been lacking in earlier studies, which focused largely on single interventions. We evaluate seven interventions, including the four treatments and three combinations of treatments (linkage and elections, linkage and training, elections and training). To do so, schools were randomly assigned into nine comparison groups: one control group, one group that was assigned the grant only, and seven groups that were assigned the grant plus combinations of

4 Olken (2010) is a randomized evaluation of the impact of plebiscites (community votes on which categories of public goods to spend on). Martinez-Bravo et al. (2011) study the impact of elections of village leaders in China by using variation in the timing of the introduction of these elections.

training, linkage and/or elections. These latter groups were always assigned the grant treatment as well, to ensure that the school committee would have money available to support initiatives that might arise from the training, linkage and/or election treatments. We find that institutional reforms of school committees that enhance their social capital in the community show positive effects on learning, while interventions that enhance financial and human capital are less cost-effective. Two years after the start of the project, linkage and linkage plus elections show a positive impact on learning. Indonesian test scores increase by 0.17 standard deviations for linkage and 0.23 standard deviations for linkage plus elections. Training, on the other hand, shows no effect on learning, and the effect of grants alone, while positive in point estimate, is usually statistically indistinguishable from zero. We further analyze five possible mechanisms that could enhance learning: increased awareness of school committees, enhanced school-based management, and increased contributions from parents, teachers, and communities. Our analysis of these possible mechanisms suggests that linkage and linkage plus elections mostly improve learning by increasing community-level inputs, by 0.14 and 0.13 standardized units respectively. These effects are driven by reported increases in the village council's collaboration with the school and in the school principal's satisfaction with the village council's attention to education in the village. Financial and in-kind donations do not increase differentially for these two interventions. Rather than functioning only as a passive

fundraising vehicle, the school committee used the joint planning meetings with the village council to develop co-sponsored education initiatives such as the hiring of contract teachers and the establishment of village study hours. Strikingly, linkage and linkage plus elections do not raise awareness of school committees but do impact learning, while interventions that increase awareness show limited effects. This is consistent with findings in the literature suggesting that information campaigns that raise school committee awareness alone may not improve learning (Banerjee et al., 2010). Our results shed light on why some community participation programs improve service delivery while others do not. For example, Björkman and Svensson (2009) find that community participation improved health care delivery by Health Unit Management Committees (HUMC) in Uganda, but Banerjee et al. (2010) find that enhancing community participation through VECs in India did not improve school quality. In a survey article on community participation, Banerjee and Duflo (2008) conjecture that HUMCs are politically more powerful than VECs. Our findings corroborate this. Linkage possibly has strong learning effects because the school committee in Indonesia, like VECs in India, has no power. Engaging the more powerful village council leads to concrete actions on the ground and increases the legitimacy of the co-sponsored initiatives developed as a result of the joint meetings. By contrast, elections increase awareness and the representativeness of school committee members, but enhancing this dimension of community

participation alone does not endow the school committee with sufficient power to make a difference to learning. Our results also speak to well-known cross-country evidence showing that changes in educational spending are weakly correlated with changes in learning outcomes (World Bank, 2003). Like Duflo, Dupas and Kremer (2009), we find that providing resources alone is not cost-effective. However, substantial gains are obtained when grants are combined with relatively cheap institutional reforms. In short, coordinating a community to impact service delivery is indeed a time-intensive task. Our results suggest that fostering ties between the school committee and powerful local groups through joint planning activities could be the most cost-effective way to improve learning. In the following section we discuss the motivation for the field experiment and describe the interventions in detail. Section III outlines the sampling strategy, timing, and the information that was collected. We then present the approach used in the empirical analysis (Section IV), followed by the results (Section V). We conclude in Section VI.

II. Motivation, intervention design and implementation

After achieving universal primary school enrollment in the 1980s (gross enrollment stood at 114 percent in the mid-nineties; Behrman, Deolalikar and Soon, 2002), Indonesia began to shift attention to quality with reforms such as teacher training and upgrading, curricula revision, facility improvements, and

later on school-based management (Kristiansen and Pratikno, 2006). Despite these initiatives, Indonesia has yet to see marked progress in learning.5 Hanushek and Woessmann (2008) find that nearly 30 percent of a cohort of grade nine Indonesian students had achieved full literacy. Against this backdrop of national efforts at promoting education quality, the Government of Indonesia in 2002 instituted a decree that gave school committees a greater role in advising and supporting school management, and encouraged greater engagement with the community.6 The decree stipulated that school committees would replace existing school-level committees known in Indonesian as BP3 (Badan Pembantu Penyelenggaraan Pendidikan). The primary function of the BP3 was to raise funds from parents and the community to support the school, yet the funds were largely handed over to principals. The school committee would go a few steps further, making recommendations on school expenditures, teacher qualifications, and school facilities. In addition, the school committee was expected to act as a mediator between the school and the community, and to promote community, especially parental, involvement in the school. Although the decree had been passed in 2002, four years later it had had limited effect on the actual functioning of school committees: they

5 In reading, Indonesia ranks 57th out of 65 countries that participated in the Program for International Student Assessment (PISA) in 2009 (Organisation for Economic Co-operation and Development, 2010); and in the Trends in International Mathematics and Science Study 2007, only half of Indonesia's students performed above the lowest international benchmark in mathematics (Mullis, Martin and Foy, 2008).
6 See Lampiran I Keputusan Menteri Pendidikan Nasional, Nomor 044/U/2002, Tanggal 2 April 2002.

were largely still operating under the BP3 model (Fearnley-Sander et al., 2008). This raised the question of what could be done to help school committees realize the role envisioned in the decree in ways that were cost-effective and scalable. Field visits and further discussions with the Ministry regarding lessons learned from other education projects led to the development of the four approaches tested as part of this experiment. Here we discuss the motivation for choosing each of the interventions, and how they were implemented in practice. This experiment was funded by a grant from the Japanese government to the Ministry, and the Ministry contracted out intervention implementation to a consulting firm, Pusat Pengembangan Agribisnis (PPA),7 for a total contract value of 2.9 billion Rupiah (US$315,000).

Intervention 1: Grant and facilitation

All 420 treatment school committees received a block grant of eight million Rupiah (US$870). The grant provided, for the first time, money that was directly under the school committee's control.8 The grant was small relative to school budgets. The average 2005/2006 school budget recorded in

7 The World Bank supported PPA by making a consultant available for about two months, with the task of assisting in planning the interventions.
8 Reported parental contributions collected through the school committee at baseline were on average 1 million Rupiah (US$109). School committees transferred this money to the school and it entered the school budget. The project grant was the only budget under their control.


the baseline survey was 200 million Rupiah (US$22,000), of which 146 million (US$15,900) was for personnel-related expenditures.9 It was thought that the grant could help the school committee catalyze change by bringing stakeholders together, participating in school budget discussions, and making small initial investments in education. The grant would allow committees to reach out more easily to parents, community members, and school management because they had money to hold meetings. Planning how to spend the grant was also an occasion to meet. The project supported planning for the use of the grant: in the linkage treatment, this topic was included in the discussions with the village council and school management; in the training, it was used as an exercise in thinking ahead about how to put the material covered in the training into practice. The grant also provided the school committee an opportunity to contribute to school activities, and thus to be a more active participant in the school's planning and budgeting process. The school committee did not receive the money without strings; rather, it was expected to develop a plan for expenditure (together with the village council for schools that were assigned the linkage treatment, see below), and the committee was required to be transparent by posting expenditure categories on the school notice board. The school committee developed an

9 This grant amount is comparable to grant treatments in the literature. Blimpo and Evans (2010) use a grant of US$500, or five percent of the average school budget. Gertler, Patrinos and Rodríguez-Oreggia (2010) provide grants of US$500 to US$700 per year to schools in Mexico.


expenditure plan with the assistance of the facilitators, who coached school committees on how they might address problems at the school with the block grant (but only problems that could be addressed within two years, the life of the experiment), approved expenditure proposals from school committees, authorized transfer of the block grant once they approved expenditure proposals, ensured transfer of the grant to school committees' bank accounts, and monitored the use of the block grant. On average, one facilitator was assigned to ten schools and visited each school committee 13 times.10 Using these estimates of the number of visits, the time facilitators spent in the schools, and staff salaries to break down the facilitation costs, and considering other treatment-specific costs, we estimate that the cost of implementing the grant treatment was about US$321 (excluding the grant itself) per school. The block grant was transferred directly from the Ministry into a bank account held by the school committee, in two tranches, with the first tranche amounting to three million Rupiah (US$326). This first tranche was disbursed in January 2008, three months later than planned, due to budgeting problems at the Ministry. In the midline survey, in April 2008, school committee respondents reported an average spending of 2.3 million Rupiah (US$250). The biggest expenditure category was meetings, averaging almost 1 million Rupiah (US$109).

10 Based on interviews by the authors with two district facilitators after completion of the project. According to PPA, the project employed 50 facilitators for a period of 15 months, and six district facilitators who managed the district teams. The interventions were implemented consecutively: elections, linkage, and then training.


The second tranche also faced government budgeting delays. It was to be disbursed to the schools subject to sufficient progress by the school committee in using the first tranche of the grant; but, in practice, all schools received the second tranche, and received it ten months late, in December 2008, soon after the endline survey.11 Thus, these results measure the impact of the first tranche and the anticipation of receiving the remaining funds.

Intervention 2: Training

Other factors hypothesized to be holding back school committees from realizing their role were information, such as their lack of knowledge about the decree, and capacity, such as how to engage the community, how to play a role in school management, and how to promote student learning. Thus, a two-day, district-level training attended by four school committee members (principal, teacher, parent, and one village representative)12 covered planning, budgeting, and steps the school committee could take to support education quality. The budget session focused on a plan for spending the block grant. Materials drew heavily on the Creating Learning Communities for Children (CLCC) model developed by UNICEF, which provides prolonged training and facilitation to schools on active learning, school-based management and

11 The endline survey had to be conducted before the second tranche was disbursed because the grant from the Japan Social Development Fund that financed the survey was about to expire.
12 For the schools that were also assigned the linkage treatment, one additional representative from the village council was invited.


community participation, and has served as the foundation for several donor projects promoting school-based management in Indonesia. Naturally, it was not replicated fully for this project because the cost per school for such intensive work would have been too high. The training also included a visit to a model school committee that had been successful in applying school-based management practices. Appendix 2 provides more details on the training components and their departures from the CLCC model. Schools assigned the training intervention received three additional visits beyond those provided with the grant. One visit was to announce the training and to agree on who would participate; another to deliver an official invitation stamped by the district education office; and a final visit, which took place just before the training, to ensure that those invited would come. Implementing the training cost US$360 (including the cost of the training itself) per school.

Intervention 3: Election

The primary concern to be addressed by the election intervention was that school committee members were often handpicked by school management and did not represent parents or the broader community. With a democratic mandate and greater diversity in membership, it was hoped that the school committee would gain legitimacy, better communicate with parents and community members, and act in their interest with more authority and voice. The intervention introduced two primary changes to the process outlined in the decree: a quota for different types of members rather than a minimum or


maximum number of members, and an election committee's role as a facilitator of elections rather than as an election body.13 The 2002 decree stipulates that the school committee include at least nine members, including community representatives (with a maximum of three from the village government), teachers, parents, and the principal (although he/she cannot be the head); and the community must propose these candidates. The intervention tightened these guidelines, designating that the committee be composed of six parents, three community members, one teacher, the principal, and the head of the village council. The rationale for the quota was that it would ensure greater parental representation and allow parent and community stakeholder groups to directly choose their own representatives. The intervention also redefined the role of the election committee and gave more structure to the election process. The 2002 decree actually requires an appointment of members by an election committee. But because the election committee members are often chosen by principals, the appointed members also usually do not represent a variety of education stakeholders. The intervention instead tasked the election committee with mobilizing voters, and held an election to pick the election committee members. Facilitators from the project disseminated information about the

13 The election process was modeled after that used in the World Bank/Government of Indonesia Urban Poverty Project (UPP), known by the Indonesian acronym P2KP. The experience of UPP pilots, in which membership of village government was put to a popular vote, was that elites stood for election and were elected. Thus, the UPP project modified election processes to mobilize candidates from different sectors of the community, which is the model used in this experiment.


new process and assisted the two stakeholder groups, parents and community members, in selecting an election committee.14 Once the election committees were established, the facilitators, along with election committee members, undertook human resource mapping for the community and parental groups,15 which led to candidate selection. After candidates were proposed, the community and parent groups elected their members.16 Subsequent meetings were held to sign a decree establishing the school committee and to develop a work plan. The election process generally took facilitators five visits to schools beyond those necessitated by grant implementation, and took place in three batches to spread out facilitator workloads.17 The intervention cost approximately US$174 per school. Despite the efforts of the implementer, PPA, to encourage communities to remain faithful to the design outlined above, some schools refused to conduct an election. As shown in Table 1, 48 percent of the schools randomly assigned to implement an election actually did as intended. Of those schools

14 The election committee was selected through two separate but similar processes at the school and village levels. At the school, two parents and one teacher were selected based on a plurality of votes cast by the principal and teachers. The teacher selected in this process automatically becomes a member of the school committee. Similarly, at the village level, two individuals elected during a village-level meeting are designated to the election committee.
15 The village human resource mapping involved village organizations and community leaders, and allowed groups to propose potential candidates as community representatives. Representatives at this meeting recommended five people per organization or group as potential candidates, who were then invited to the next community meeting. A similar, separate meeting was held for parents of children in grades one to five, where the desired qualifications of school committee members were discussed and potential candidates suggested.
16 Instead of a ballot, voters wrote down names of candidates, and the ones with the most votes won.
17 Batches took place 15 April to 31 May 2007, 1 June to 14 July 2007, and 15 July to 31 August 2007.


that were assigned to the election intervention but did not fully comply, about seven percent of committees refused to change any members, while the remainder agreed to a compromise of electing representatives of previously unrepresented groups. Some of those that refused or partially refused explained that their school committees were only starting their terms, and thus they did not want to start over with new membership after a new pool had just been appointed or elected. Nevertheless, schools assigned to hold elections did experience substantial changes in school committee membership compared to other schools. In the former group, 54 percent of school committee members recorded in the endline survey had started their term after the baseline, while in the latter group only 1 percent had done so. In the results section we present intent-to-treat results, which estimate the impact of being assigned to hold an election.

Intervention 4: Linkage

The aim of the linkage intervention was to increase the engagement of an external stakeholder, the village council (known in Indonesian as Badan Perwakilan Desa or BPD). The village council is a democratic village organization elected by villagers; it has the power to draft village legislation, approve the village budget, and monitor the village government, and it can even propose to the district head that the village head be removed (Antlov, 2003). By facilitating a series of planning meetings between the school committee and


the village council to discuss potential measures to address education issues in the village, it was believed that the school committee could more effectively form a bridge between the community and the school. It was hypothesized that this linkage of the school committee with the powerful village council would, first, increase the stature of the school committee vis-à-vis school management, improving the school committee's ability to exert influence to improve services; and, second, result in concrete support from the village council for measures addressing education problems that could not be solved by the school committee and school management alone. The intervention represents the spirit of the decree, since the decree even envisions village representation in the school committee; but findings from field visits indicated, and baseline data confirmed, that there was little evidence of this collaboration before the linkage intervention. At baseline, 22 percent of all school committee representatives reported collaboration with the village council. The intervention cost US$125 per school, mainly covering the two additional visits to the school beyond those provided with the grant. The first facilitated meeting was between the school principal and school committee members to identify measures for improving education quality that they would then propose to the village council. These measures were discussed in a subsequent meeting with village council representatives and other village officials, and the results of the meeting were documented in a memorandum of


understanding signed by the head of the school committee, the head of the village council, and the school principal. Examples of measures that parties collaborated on included building school facilities, establishing village study hours (two hours in the evening when households would turn off televisions and computer game kiosks would be closed), hiring contract teachers, making land available for school infrastructure expansion, resolving conflicts between two schools in a community and encouraging social and religious activities at school. In some cases, collaboration even extended to village council representatives becoming school committee members (Bjork, 2009).

III. Sample and timeline

This study took place in six districts in Central Java and Yogyakarta, a region chosen because there were few large education projects active in the area, enabling the results to be relatively free from the risk of contamination from other projects. Moreover, conditions were hypothesized to be ripe for community engagement to flourish: the area is peaceful, has reasonably high levels of existing social capital, and schools are relatively well equipped (high levels of electricity, adequate numbers of teachers, etc.). The evaluation also focuses on rural public primary schools: public because this evaluation was designed by the Ministry, which has the authority over public schools,18 and

18 The Ministry provides some support to private schools, but has direct oversight over public schools.


rural because the majority of schools in the country are in rural or semi-rural areas, and it was hypothesized that accountability would be easier to engender in smaller, closer-knit areas. From six districts in two provinces, the sampling frame was further restricted by excluding sub-districts containing fewer than eight villages19 and schools with parallel classes in grade four.20 We also dropped schools with extremely good or bad average sixth grade examination scores in mathematics or Indonesian. We obtained school-level average grade six scores for mathematics and Indonesian. They ranged from a minimum of 0 to a maximum of 9.65. We dropped schools where the average scores for mathematics or Indonesian were below four (this cutoff corresponds to the seventh percentile for average mathematics scores and the third percentile for average Indonesian scores) or above eight (this corresponds to the 99th percentile for mathematics and the 97th percentile for Indonesian). While this could introduce external validity issues, we selected on test scores ex ante because we thought that weak schools lacked the resources to fully benefit from our institutional reforms and that schools with high test scores would have less need for our interventions.
19 This restriction was imposed because the initial design envisioned facilitated meetings with sub-district government education officials, and too few villages per meeting would make this intervention ineffective. However, this idea was never implemented, making the restriction unnecessary.
20 Parallel classes are grades with more than one section or teacher. This restriction was imposed because the evaluation was not planning on assigning student IDs or ensuring that the student population was identical over time. With only one class per grade, and low dropout and repetition rates, the evaluation team was confident that the same children interviewed in grade four would be in grade six two years later. However, this actually became an issue, since several schools merged, but the team was able to match student names; see below.


To gauge the extent of the external validity problem due to this selection criterion, we compared the average scores for the selected schools and the full sample. The average mathematics score for selected (all) schools was 6.0 (5.8) and the average Indonesian score for selected (all) schools was 6.9 (6.8). The standard deviations are 1.3 for mathematics and 1.2 for Indonesian average scores. The medians are also similar. From this sampling frame, we first sampled 44 sub-districts. To avoid spillovers between treatment and comparison schools within a village, we sampled one school per village.21 Out of the villages in the 44 sub-districts, we selected 520 villages and randomly selected one school from each of these villages. The resulting sample of 520 schools was then stratified into three groups using their average test scores.22 Within each stratum, schools were randomly assigned into the nine treatment and comparison groups according to Table 2.23 A disproportionate share of the sample was allocated to the cells assigned nothing (control group) and the cells assigned just the block grant, in order to separately identify the effect of the block grant. The linkage, election and training interventions were always implemented in combination with the grant. The cell size for the training intervention was made slightly smaller

21 The sampling probability was increased accordingly for schools located in villages with more than one school, to keep the probability of being sampled equal across schools.
22 Calculated as 0.5 * mathematics score + 0.5 * Indonesian score.
23 Random allocation was conducted by the authors using Stata.


than the non-training cells because training is a relatively costly intervention. We account for these different cell sizes using weights (see the next section). The baseline survey took place in January 2007, the midline in April 2008, and the endline survey in October 2008, as shown in Table A1.24 Tests in mathematics and Indonesian, designed by the Ministry,25 were administered to all students in grade four at baseline and grade six at endline.26 Attrition occurred both in terms of schools and students. Three out of 520 schools were not included in the endline survey. One school refused interviews. In two other schools, the implementer delivered the treatment in schools different from those surveyed in the baseline. These schools were assigned to the grant plus linkage, grant plus election, and control groups, respectively. Our analysis is based on the 517 remaining schools. We matched students on the basis of student names written on the test sheets and school ID. We were able to match 10,941 students, equal to 87 percent of the tests administered at baseline in grade four and 88 percent of the tests administered at endline in grade six in the 517 schools that participated in both rounds. To investigate the potential impact of the match on the results, we estimated the probability of being able to find a student
24 The survey was conducted by Moores Roland, a survey firm selected by MoNE.
25 We relied on the test development department of the Ministry of Education (Puspendik) to design the tests, as it has more experience developing tests and to ensure ownership of the results by the Ministry.
26 In the original sample selection, schools with multiple parallel classes were excluded (so all sample schools started out with one grade four class), and thus we did not assign student IDs; but in the endline survey, it was discovered that several schools had more than one grade six class due to schools merging. We only have data on the number of parallel grade six classes in 240 sample schools, and found that 13 schools had two parallel grade six classes (5 percent). However, this issue is remedied by matching student names.


match as a function of the interventions, the baseline test score, and the baseline values of the summary indices (see equation 2 in the next section for a definition of the indices). As shown in Table A2, there is no impact of the interventions on the probability of finding a student match. The interaction between the baseline test score and the intervention is not significant, and neither are all but one of the baseline indices.27 We do find that students with lower baseline scores have a statistically significantly higher probability of not being matched, but the size of the effect is small. For instance, for students in the lowest decile of the baseline test score distribution, 84 percent could be matched. We believe that most of the matching problems arise from problems in writing names. School records show low repetition and drop-out rates of around 2 percent. We use the matched panel in the estimations. We also designed surveys centered around hypothesized mechanisms that could improve learning outcomes. Broadly, these intermediate outcomes relate to awareness of school committees, school-based management, parent, community and teacher inputs to education, and perceptions of student learning. We interviewed parents, teachers, students, school committee members, and principals. Administrative data and interviewer observations on infrastructure and teacher activities at the start of the visit were also recorded. To track the teachers of the students tested, the teacher sample was restricted to teachers teaching grade four at baseline and grade six at endline. We then
27 The teacher-level inputs index has a significantly positive effect on the probability of finding a student match. We checked whether the interaction between the teacher-level inputs index and the intervention was significant, and it is not.


randomly selected three students from their classes, and these students' parents, for interviews.

IV. Impact Evaluation Strategy

In this section we discuss the empirical framework and how we treat the issue of having many intermediate outcomes.

Pairwise impact evaluation

As discussed above, the objective of this study is to evaluate the effects of four treatments, independently and combined with each other. We analyze seven pairwise comparisons (see Table 3). The grant comparison compares the schools in the control group with those that were assigned the grant only, while all other comparisons measure the effect of assignment to the other interventions (election, linkage and training) and their combinations, conditional on being assigned the grant treatment. We chose to use seven bivariate comparisons that correspond to the hypotheses we set out to test. We excluded the comparison that tests all interventions against grant only for lack of power, as the former group consists of only 45 schools. All comparisons are intent-to-treat effects; adherence to the grant, training, and linkage treatments is close to perfect, but adherence to elections is not (see the discussion in Section II). We need to apply weights in our regressions because the assignment of schools to the treatment and comparison groups is not balanced (see Table 2). In particular, the cell with grants only has almost twice as many schools as the


other seven treatment cells because we wanted to evaluate the impact of the grant-only intervention by comparing this cell with the control group (the 100 schools assigned nothing).29 We would not need weights if we had a randomized controlled trial with cross-treatments and balanced assignment. For example, when evaluating the impact of the linkage intervention, balanced assignment ensures that the share of schools assigned to the training and election treatments is equal for the linkage and no-linkage groups. This is not the case for us. Table 2 shows that the linkage group has 95 out of 190 schools assigned the election treatment, but the no-linkage group has 95 out of 230 schools assigned the election treatment. This is because the no-linkage group includes the grant-only treatment cell, which has more schools than the other treatment cells in Table 2. In our analysis, we assign lower weights to cells in Table 2 that have more schools. Specifically, the weights are calculated as the total number of schools divided by the number of schools in the cell. This way, observations in the grant-only cell, which has more schools, receive lower weights so that the weighted number of observations is balanced for each cell, as if we had balanced assignment. In the example above, the weighted number of observations assigned the election treatment would be balanced across the linkage and no-linkage groups. This procedure is akin to using weights to correct for oversampling. Here, we have over-assigned schools to the grant

29 Our initial power calculations suggested that each hypothesis required approximately 90 schools.


only cell. We also ran regressions without weights that include controls for the cross-treatments, and the results are similar, suggesting that the weights are not driving the results.
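To illustrate the weighting scheme described above, the sketch below (an illustration only, not the project's code; the data frame and cell labels are hypothetical) computes a per-school weight as the total number of sample schools divided by the number of schools in that school's design cell, so that the weighted number of observations is equal across cells:

```python
import pandas as pd

# Hypothetical design-cell assignments; the grant-only cell is deliberately larger.
schools = pd.DataFrame({
    "school_id": range(1, 11),
    "cell": ["control", "grant_only", "grant_only", "grant_only",
             "grant_linkage", "grant_linkage",
             "grant_election", "grant_election",
             "grant_training", "grant_linkage_election"],
})

total_schools = len(schools)
cell_size = schools.groupby("cell")["school_id"].transform("count")

# Weight = total number of schools / number of schools in the cell,
# so larger cells (e.g., grant-only) receive smaller per-school weights.
schools["weight"] = total_schools / cell_size

# Each cell's weighted size is now identical (equal to total_schools).
print(schools)
print(schools.groupby("cell")["weight"].sum())
```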

Impact on test scores

The impact of the intervention on test scores is estimated by

y_{i,j,\text{endline}} = \alpha_k + \beta\,(\text{treatment}_j) + \gamma\, y_{i,j,\text{baseline}} + \varepsilon_{ij} \qquad (1)

where y_{i,j} denotes the standardized test score of student i in school j in stratum k. The standardized test scores are calculated by subtracting the mean and dividing by the standard deviation observed in the control group schools. Note that the baseline value of y is observed when the students were in grade 4, while by endline these students were in grade 6. Standard errors are clustered at the school level. The treatment variable equals 1 for the treatment group and 0 for the comparison group. All regressions include strata dummies because random assignment was within each stratum.
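As a concrete illustration of equation (1), the sketch below (our own illustration with simulated data and hypothetical variable names, not the authors' code) standardizes scores using the control group's mean and standard deviation and then estimates the treatment effect by weighted least squares with strata dummies and school-clustered standard errors:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600  # simulated student-level observations

df = pd.DataFrame({
    "school_id": rng.integers(0, 60, n),
    "stratum": rng.integers(0, 3, n),
    "score_base": rng.normal(12, 4, n),
    "weight": 1.0,  # design weights, as in the previous sketch
})
df["treatment"] = df["school_id"] % 2  # toy school-level assignment
df["score_end"] = df["score_base"] + 2 + 0.5 * df["treatment"] + rng.normal(0, 4, n)

# Standardize both scores using the control group's mean and standard deviation.
ctrl = df[df["treatment"] == 0]
for var in ["score_base", "score_end"]:
    df[var + "_z"] = (df[var] - ctrl[var].mean()) / ctrl[var].std()

# Equation (1): endline z-score on treatment and the baseline z-score,
# with strata dummies, design weights, and school-clustered standard errors.
eq1 = smf.wls("score_end_z ~ treatment + score_base_z + C(stratum)",
              data=df, weights=df["weight"])
res = eq1.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(res.params["treatment"], res.bse["treatment"])
```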

Impact on intermediate outcomes

We not only wanted to understand effects on student learning, but also changes in other intermediate outcomes that were hypothesized to be precursors to learning improvements.


To address the issue of having too many intermediate outcomes, we follow Kling, Liebman and Katz (2007) and Banerjee et al. (2010) and construct summary indices for each domain of intermediate outcome variables. The advantage is that adding more outcomes to a summary index does not increase the number of hypotheses. The disadvantage is that the summary indices are harder to interpret and less transparent. We report results for the summary indices in Table 6 and results for the per-comparison hypothesis tests in the appendix. We define the summary index score for school j over the set of N_D outcome variables in group D as the mean of the z-scores of the non-missing outcome variables in the group.30 Each variable is constructed such that it contributes positively to the header or overall concept used for the domain.

y_{jD} = \frac{1}{N_D} \sum_{d=1}^{N_D} \frac{y_{jd} - \bar{y}_d}{\sigma_d} \qquad (2)

where \bar{y}_d and \sigma_d are the mean and standard deviation of variable y_{jd}, estimated from the control group schools. The summary index gives equal weight to each variable that enters it, correcting for the natural variation observed in the control group.

30 There were some difficulties with non-response in some of the schools; we have complete information on 508 schools and partial information on nine schools.


Following Banerjee et al. (2010), we condition on the baseline values of all variables included in the summary index. For instance, for outcome variable d included in summary index D, the impact is estimated by

y_{j,d,\text{endline}} = \alpha_k + \beta\,(\text{treatment}_j) + \sum_{d'=1}^{N_D} \gamma_{d'}\, y_{j,d',\text{baseline}} + \varepsilon_{jd} \qquad (3)

For estimating the impact on the summary index itself, y_{j,d,\text{endline}} is replaced by y_{j,D,\text{endline}}.
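A small sketch of the index construction in equation (2), again with simulated data and hypothetical variable names (each component is assumed to be coded so that higher values contribute positively to the domain): every outcome is converted to a z-score using the control group's mean and standard deviation, and the index is the mean of the non-missing z-scores for each school.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_schools = 20

# Hypothetical school-level outcomes belonging to one domain D (e.g., community-level inputs).
df = pd.DataFrame({
    "school_id": range(n_schools),
    "control": rng.integers(0, 2, n_schools),
    "outcome_a": rng.normal(5, 2, n_schools),
    "outcome_b": rng.normal(0.4, 0.2, n_schools),
    "outcome_c": rng.normal(10, 3, n_schools),
})
df.loc[3, "outcome_c"] = np.nan  # a missing component is allowed, as in the paper

outcomes = ["outcome_a", "outcome_b", "outcome_c"]
ctrl = df[df["control"] == 1]

# Equation (2): z-score each variable against the control group,
# then average the non-missing z-scores within the domain for each school.
z_scores = (df[outcomes] - ctrl[outcomes].mean()) / ctrl[outcomes].std()
df["index_D"] = z_scores.mean(axis=1, skipna=True)

print(df[["school_id", "control", "index_D"]])
```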

Our challenge was to group the outcome variables in a way that made intuitive sense as separate pathways to learning. Our strategy was to define domains according to stakeholders (parents, teachers, and the community), soliciting answers from various respondents on how they support education. Two separate indices, focused on school committee awareness and school-based management, were also included. Each impact domain is represented by a table (see Tables A6-A10). The list of variables in each domain, their definitions, and the corresponding questionnaire are found in Table A5. The grouping is as follows:
Awareness of school committee (Table A6)
Parent-level inputs to education (Table A7)
Teacher-level inputs to education (Table A8)
School-based management (Table A9)
Community-level inputs to education (Table A10)


V. Results

In this section we check pre-treatment balance, and then discuss the impacts on education outcomes and on the intermediate outcomes as defined in the summary indices.

Checking baseline, pre-treatment differences

In order to check whether our baseline outcome values are balanced across intervention groups, we report the estimates of \beta in (1) where the dependent variable is replaced by its baseline value and only the treatment variable and strata dummies are included on the right-hand side (Table 4). To save space, we present the results for the language and mathematics test scores and the summary indices only. No difference can be found for the baseline test score comparisons, and the same holds for most summary measures. Only one coefficient is significant at the five percent level or below. Treatment schools in the linkage plus elections comparison have a higher value for the school-based management summary measure at baseline. This baseline difference is controlled for in our main results below.

Education outcomes (Tables 5, A3 and A4)

We begin by measuring the impact of the interventions on the main education outcome variables: dropout rate, repetition rate, and test scores, shown in Table 5. We do not find any significant effects on dropout and repetition rates, which is not surprising given the very low rates at


baseline (2 percent for both). The lack of effects on dropout and repetition rates makes us more confident about comparing test scores of grade 4 and grade 6 students. If there had been effects, we would have been worried about endogenous attrition causing sample selection bias in our results. Looking at learning, we find substantial effects on Indonesian test scores and no effects on mathematics. Linkage improves Indonesian scores by 0.17 standard deviations, while linkage plus elections increases Indonesian test scores by 0.23 standard deviations. These two estimates are not statistically significantly different from each other. The first two columns of Table 5 report the means and standard deviations of the unstandardized test scores for the control group at baseline and endline. The unstandardized test scores range from 0 to 30 because each test had 30 questions and each correct answer was awarded a point. Therefore, the statistics in Table 5 can be used to convert effect sizes in standard deviations to the number of correct answers in each test, by multiplying by the standard deviation and adding the mean of the control group, respectively. We find no impacts on learning for training or elections alone, nor for the other combinations of interventions. The treatment effects for linkage and linkage plus elections are robust to adding various controls (Table A3). The treatment effects for training and elections and their combinations remain insignificant with additional controls. We tried adding district fixed effects, controlling for quadratic terms in the baseline test score, controlling separately for baseline language and


mathematics scores, controlling for baseline summary indices, and saturating the model with all these baseline controls. The estimated treatment effect for the grant-only intervention is smaller (0.129) and is not statistically significant. The grant comparison has the least statistical power because it includes the smallest number of schools (190). In our robustness checks (Table A3), the effect remains insignificant in three of the checks. Only when we add controls for baseline summary indices does the grant-only treatment effect on Indonesian test scores increase to 0.17, significant at the 10% level. However, grants are at least 2.5 times more costly to implement than the incremental cost of linkage, and at best only as impactful on Indonesian test scores.31 We think it is harder to detect a treatment effect using mathematics scores because the endline mathematics test was much harder than the baseline test. We can see this in the data in several ways. Note that the maximum number of points that could be obtained was 30 for each test. First, the unstandardized test score means for mathematics are 16.4 and 8.9 for the baseline and endline tests respectively (Table 5), meaning the mean for the endline is almost half of that for the baseline. For Indonesian, the baseline and endline means are more similar (11.7 and 13.1 respectively). Second, there is less variation in the endline mathematics test. The standard deviation is lower

31 Calculated using $321 as the cost to implement the grant (see Section II), compared to $125, the incremental cost to implement linkage. If we included the cost of the grant itself ($870), grants would be even less cost-effective. The effect size, 0.17, is the largest estimate for the grant treatment in Table A3.


(3.2 instead of 5.6 for the baseline). Conversely, the standard deviation for the endline language score (6.3) is higher than that for the baseline (4.3). Third, in a regression framework, we find that an indicator for post-treatment scores can explain almost half of the variation in mathematics test scores, but not in language test scores. When we regress math test scores on school fixed effects, we can explain 14% of the variation. But when we add an indicator for post-treatment test scores, the R-squared increases to 0.59. For language scores, the R-squared only increases from 0.23 to 0.26. In other words, the post-treatment dummy explains much of the variation in the math tests, even more than school fixed effects. These findings suggest that the endline mathematics test was not as useful at demonstrating a range of abilities as the Indonesian test because it was too hard.

The effects are generally larger for girls than for boys.33 For linkage, the effect sizes are 0.16 standard deviations for boys and 0.19 for girls. For linkage plus elections, the effects are 0.19 and 0.27 standard deviations for boys and girls respectively. For girls, we also find a significantly positive effect of 0.11 standard deviations of the linkage intervention on math scores. We also looked at treatment effects by quintile of the baseline test scores. We find that impacts are generally higher for students who already did well in the baseline test (Table A4).

Figure 1 shows probability density functions of the scores for all the tests. A very large proportion of the students scored around 7 points for the
33 Students' gender was constructed ex post using the student's name.


mathematics endline test. There is no evidence of ceiling or floor effects for the mathematics scores in either year: the distributions have the expected bell-shaped curves. The same is true for Indonesian scores at baseline, but the endline distribution has two humps, where the right hump is censored above by the maximum score. There is, however, no clear excess mass at the maximum score, suggesting limited truncation. The two bells in the curve are in line with the finding that the strongest effects were found for the higher-scoring students.

Cost Effectiveness

Using the costs discussed in the implementation section above,34 we find that the linkage intervention is the most cost-effective, at $0.44 per child per tenth of a standard deviation.35 Linkage plus elections costs $0.78 per child per tenth of a standard deviation. These incremental gains are achieved at much lower cost than when the grant is provided without the other interventions.36 We use a tenth of a standard deviation because this is widely considered to be the minimum effect size relevant for policy, and has been

34 See Section II for details on the cost of each intervention. The only cost not mentioned in Section II above was overhead costs (team leader, office space, etc.) allocated equally to all intervention schools, which amounted to US$140 per school.
35 Calculated as $125.47/(164*10*0.173).
36 We calculate that the best possible cost-effectiveness estimate for the grant-only treatment is $2.32, which is much higher than the cost-effectiveness of linkage conditional on grants. The $2.32 estimate was calculated as total outlays for grants as of the endline survey ($326 of grants disbursed plus $321 general facilitation cost), divided by the largest possible effect per child per tenth of a standard deviation (0.17*164*10), where 0.17 is the largest effect size for the grant-only treatment in our robustness checks (Table A3). If we included the full grant amount in the numerator ($870 instead of $326), the estimate for grants would be even higher.


used in other literature looking at the impacts on test scores relative to cost (Burde and Linden, 2012; He, Linden and MacLeod, 2008; Linden, 2008). We consider intervention and facilitation costs. Specifically, as discussed above, linkage alone costs $125.47 per school, and linkage plus elections costs $300.43 per school. The per-student calculations use 164 as the average school size, since the entire school was affected by the intervention, not just the students who were tested, although we recognize that effects could differ across grades. Several caveats are in order. First, we did not include the cost of general facilitation and the grant. We think this is appropriate since the linkage and linkage plus elections effects are estimated conditional on the grant treatment. That is, the numerator in our cost-effectiveness calculation differences out the cost of the grant treatment because the denominator differences out the effect of the grant treatment. While it is possible that the linkage, election and training interventions would not be effective without the general facilitation and grant being available, this hypothesis is not testable as the interventions were implemented simultaneously, or at least closely parallel to each other. Second, we did not factor in the opportunity cost of the school committee members' time, which would decrease cost effectiveness. Third, we did not amortize the cost of the interventions over time, which could increase cost effectiveness. We decided against the last two adjustments because we wanted our cost-effectiveness estimates to be comparable to as large a set of estimates as possible, and most cost-effectiveness estimates for test scores do not account for these factors.37
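Written out, the calculation behind the headline linkage figure (our rendering of the arithmetic reported in footnote 35, not an equation appearing in the original) is:

```latex
% Cost per child per tenth of a standard deviation, for the linkage treatment:
\[
\frac{\text{incremental cost per school}}{\text{average school size} \times 10 \times \text{effect size}}
  = \frac{\$125.47}{164 \times 10 \times 0.173} \approx \$0.44 .
\]
```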

34

possible and most cost effective estimates for test scores do not account for these factors.37 Compared to other learning interventions, linkage and linkage plus elections are extremely cost-effective. A village-based school construction program in Afghanistan costs $4.80 per child per tenth of a standard deviation (Burde and Linden, 2012), an after-school computer-based learning program in India costs $4.59 per child per tenth of a standard deviation (Linden, 2008), and a girls scholarship in Kenya costs between $1.77 to $3.53 per child per tenth of a standard deviation (Kremer, Miguel and Thornton, 2009). Other programs with similar cost effectiveness are machine or flash card-based activities aimed at teaching English to Indian children in grades 1 to 5 that cost as low as $0.22 per student per tenth of a standard deviation (He, Linden and MacLeod, 2008), and an Indian remedial teacher education program that cost $1 per tenth of a standard deviation (See Banerjee et al. (2007) cited in Linden (2008)).
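For readers who want to reproduce the arithmetic, the per-child figures above follow from the calculation in footnote 35, applied with the linkage and linkage plus elections effect sizes from Table 5 (this is a restatement of the paper's arithmetic, not an additional result):

$$
\text{cost per child per } 0.1\ \text{SD} \;=\; \frac{\text{cost per school}}{164 \times 10 \times \hat{\beta}},
\qquad
\frac{\$125.47}{164 \times 10 \times 0.173} \approx \$0.44,
\qquad
\frac{\$300.43}{164 \times 10 \times 0.234} \approx \$0.78,
$$

where $\hat{\beta}$ is the estimated treatment effect in standard deviations (0.173 for linkage and 0.234 for linkage plus elections) and 164 is the average school size.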

Intermediate Outcomes (Table 6, Tables A6 to A10)

In this section, we focus on why linkage and linkage plus elections improve learning, and then briefly discuss the mechanisms driven by the other interventions. Improvements in these intermediate outcomes may be goals in themselves, and they were hypothesized to ultimately lead to improved learning. We only discuss estimates that are significant at the 1% and 5% levels unless stated otherwise.

Linkage and Linkage Plus Elections

Linkage increases community level inputs by 0.14 standardized units, while the other summary indices are not significant (Table 6). Linkage plus elections increases community level inputs by 0.13 standardized units, but the estimate is only significant at the 10% level. Table A10 reports estimates for individual components of the community level index. All linkage-related interventions enhance collaboration between the school committee and the village council (SCbpd, SPbpd). Principals report greater satisfaction with the village council's attention to education in the village (SPsatbpd). Parents share the principals' more positive view of community engagement under the linkage intervention (Psatcomm). Cooperation with the village council appears to be central: school committees in the linkage treatment groups do not report more cooperation with any non-educational community organizations other than the village council (SCnonbpd). Interestingly, with the linkage intervention, meetings between the school committee and the local education office decline significantly (SCmeetdinas, Table A6), suggesting that the local education office may have pulled back as a result of greater school engagement by the village council.


The increased community support appears to be unrelated to donations. School committee representatives perceive no changes in community support in areas such as in-kind or financial contributions (SCsatcomm, SCcominkind). This is consistent with our hypothesis that the linkage intervention improved learning because the joint planning meetings translated into co-sponsored education initiatives. It is also in marked contrast to the school committee's predecessor, the BP3, which was largely perceived as a passive fundraising arm for the school. While the summary index for teacher level inputs is insignificant, an additional factor that could have contributed to the learning gains of the linkage plus elections intervention is teacher work hours (Thours, see Table A8), which increase by one hour per week for this intervention. We highlight this particular result because teacher work hours are closely related to the learning experience students have in school.

Other Interventions

Grants increase awareness of school committees by 0.15 standardized units (Table 6). Elections and training plus elections also increase awareness, but those effects are only significant at the 10% level. The awareness-of-the-school-committee summary index measures whether respondents knew members of the school committee, interacted with them, and considered the school committee effective (Table A6).


Grants are highly effective at increasing parents' awareness of the school committee. Parents are 16 percentage points more likely to know that the school committee exists and 10 percentage points more likely to know the names of school committee members (Pknow_scexist, Pknow_scmem). The increased awareness, however, does not translate into more interaction between parents and school committees. Parents do not meet more often with the school committee (SCmeetparents); only internal meetings of the school committee are positively affected by the grant (SCintmeettot).

The rationale behind many community participation programs is that community members themselves should have the strongest incentive to improve service delivery. In particular, parents should care most about their children's education; yet we do not see much traction in the index of parent level inputs. Elections and training plus elections increase parent level inputs, but the effects are only significant at the 10% level. Parents could support education by contributing to the school or by supporting their children's education directly, and the summary index for parent level inputs captures both aspects (Table A7). There is an increase in the total number of minutes that household members accompany a child studying at home in the past week (Pallhh_min); with the election intervention, this increases by 81 minutes per week.38

Training increases in-kind contributions of parents to the school committee (SCparinkind) but decreases the likelihood that parents ever visit the school to observe a class (Pvisit). However, the effects on parent level inputs are limited to these variables. There is no impact on stakeholder (parents themselves, school committee, teachers or principals) satisfaction with parents' support for pupils' learning (Psatparents, SCsatparents, Tsatpar, SPsatpar), nor do we see any increase in the number of times parents come to school to meet a teacher (SPparentsinvolve, Pmeet_teacher).

The summary index for teacher inputs captures both the number of teachers and their work effort (Table A8). We see some increases in teacher hours (Thours) for elections, and decreases in the fraction of classes observed to have teachers (OBfractwithteach) for elections and training plus elections, but no impact on the overall summary index.

We also measured variables related to school-based management, especially financial accountability and teacher management (Table A9). An objective of the interventions was to promote accountability for the school's routine financial decisions. We see no effects on the summary index for school-based management. Grants do appear to have a positive effect on the accountability of principals to parents and the school committee: principals report providing more information to parents about school funding and budgeting (SPinviterapbs, SPparentsrapbs). Finally, training appears to have an impact on community level inputs, but only if paired with linkage (Table A10).

38 The survey asked fathers, mothers and anyone else in the household to report the number of hours per day and the number of days per week that they accompanied the child studying. Field workers were then instructed to convert hours per day to minutes, so there was the potential for rounding errors, and we see this in peaks in the data on the hour (e.g., 60, 120, 180 minutes).

VI. Conclusion

This paper studied the impact of four treatments aimed at increasing the effectiveness of school committees in public schools in Indonesia. We find that measures that reinforce existing school committees, the grant and training, have limited effects, while measures that foster ties between school committees and other parties, linkage and elections combined with linkage, lead to greater engagement by education stakeholders and, in turn, to greater learning. The results suggest that these institutional reforms can be highly cost effective in an environment where school committees have access to resources.

In considering the mechanisms from treatment to greater learning, we find that the results arise mostly from increased community support. The election intervention was the most promising at reforming the school committee into an institution that engages community members and improves service delivery: elections raised awareness of the school committee, increased parental support for homework, and led teachers to report more work hours. However, elections alone were insufficient to raise learning outcomes.


We think that the success of the linkage intervention results from the fact that a more powerful community institution, the village council, was involved in planning the activities. This provided the legitimacy needed to ensure that actions that could improve learning were implemented. The grant and training interventions, which focused on fortifying the existing committee, showed little promise. This resonates with the often-voiced view that increasing resources, in this case financial and human resources, is not by itself sufficient to make an institution perform better.


References

Acemoglu, D.; S. Johnson and J. A. Robinson. 2001. "The Colonial Origins of Comparative Development: An Empirical Investigation." American Economic Review, 91(5), pp. 1369-401.

Antlov, H. 2003. "Village Government and Rural Development in Indonesia: The New Democratic Framework." Bulletin of Indonesian Economic Studies, 39(2), pp. 193-214.

Banerjee, Abhijit V.; Rukmini Banerji; Esther Duflo; Rachel Glennerster and Stuti Khemani. 2010. "Pitfalls of Participatory Programs: Evidence from a Randomized Evaluation in Education in India." American Economic Journal: Economic Policy, 2(1), pp. 1-30.

Banerjee, Abhijit V.; Shawn Cole; Esther Duflo and Leigh Linden. 2007. "Remedying Education: Evidence from Two Randomized Experiments in India." Quarterly Journal of Economics, 122(3), pp. 1235-64.

Banerjee, Abhijit V. and Esther Duflo. 2008. "Mandated Empowerment: Handing Antipoverty Policy Back to the Poor?" Reducing the Impact of Poverty on Health and Human Development: Scientific Approaches, 1136, pp. 333-41.

Behrman, Jere R.; Anil B. Deolalikar and Lee-Ying Soon. 2002. "Promoting Effective Schooling through Education Decentralization in Bangladesh, Indonesia, and Philippines." ERD Working Paper No. 23. Manila: Asian Development Bank.

Bjork, Christopher. 2009. "Improving Educational Quality through Community Participation: Qualitative Study." Report for the World Bank.

Björkman, Martina and Jakob Svensson. 2009. "Power to the People: Evidence from a Randomized Field Experiment on Community-Based Monitoring in Uganda." The Quarterly Journal of Economics, 124(2), pp. 735-69.

Blimpo, M. and D. Evans. 2010. "School Based Management, Local Capacity, and Educational Outcomes: Lessons from a Randomized Field Experiment." http://www.stanford.edu/~mpblimpo/research.html.

Bruns, Barbara; Deon Filmer and Harry Anthony Patrinos. 2011. Making Schools Work: New Evidence on Accountability Reforms. Washington DC: World Bank.

Burde, Dana and Leigh L. Linden. 2012. "The Effect of Village-Based Schools: Evidence from a Randomized Controlled Trial in Afghanistan." NBER Working Paper 18039. Boston.

Dhaliwal, Iqbal; Esther Duflo; Rachel Glennerster and Caitlin Tulloch. 2012. "Comparative Cost-Effectiveness Analysis to Inform Policy in Developing Countries: A General Framework with Applications for Education." Abdul Latif Jameel Poverty Action Lab (J-PAL), MIT.

Duflo, Esther; Pascaline Dupas and Michael Kremer. 2009. "Additional Resources Versus Organizational Changes in Education: Experimental Evidence from Kenya." Working paper.

Fearnley-Sander, Mary; Pahala Nainggolan; Mike Ratcliffe; Abby Riddell; Simone Seper and George Taylor. 2008. "Support to Basic Education in Indonesia: Report of the Joint European Community/AusAID Pre-Feasibility Study Mission."

Gertler, Paul; Harry Patrinos and Eduardo Rodríguez-Oreggia. 2010. "Parental Empowerment in Mexico: Randomized Experiment of the Apoyo a la Gestión Escolar (AGE) in Rural Primary Schools in Mexico: Preliminary Findings." Washington, DC: World Bank.

Hanushek, Eric A. and Ludger Woessmann. 2008. "The Role of Cognitive Skills in Economic Development." Journal of Economic Literature, 46(3), pp. 607-68.

He, Fang; Leigh L. Linden and Margaret MacLeod. 2008. "How to Teach English in India: Testing the Relative Productivity of Instruction Methods within the Pratham English Language Education Program." Working paper.

Khattri, N.; C. Ling and S. Jha. 2010. "The Effects of School-Based Management in the Philippines: An Initial Assessment Using Administrative Data." World Bank Policy Research Working Paper. Washington DC: World Bank.

Kling, Jeffrey R.; Jeffrey B. Liebman and Lawrence F. Katz. 2007. "Experimental Analysis of Neighborhood Effects." Econometrica, 75(1), pp. 83-119.

Kremer, M. and A. Holla. 2009. "Improving Education in the Developing World: What Have We Learned from Randomized Evaluations?" Annual Review of Economics, pp. 513-42.

Kremer, Michael; Edward Miguel and Rebecca Thornton. 2009. "Incentives to Learn." Review of Economics and Statistics, 91(3), pp. 437-56.

Kristiansen, S. and Pratikno. 2006. "Decentralising Education in Indonesia." International Journal of Educational Development, 26(5), pp. 513-31.

Linden, Leigh L. 2008. "Complement or Substitute? The Effect of Technology on Student Achievement in India." Downloaded from http://www.povertyactionlab.org/publication/complement-substitute-effecttechnology-student-achievement-india.

Mansuri, Ghazala and Vijayendra Rao. 2012. Localizing Development: Does Participation Work? Washington DC: World Bank.

Martinez-Bravo, Monica; Gerard Padró i Miquel; Nancy Qian and Yang Yao. 2011. "Do Local Elections in Non-Democracies Increase Accountability? Evidence from Rural China." NBER Working Paper.

Mullis, I.V.S.; M.O. Martin and P. Foy. 2008. "TIMSS 2007 International Mathematics Report: Findings from IEA's Trends in International Mathematics and Science Study at the Fourth and Eighth Grades." Boston: TIMSS & PIRLS International Study Center, Boston College.

Olken, B. A. 2007. "Monitoring Corruption: Evidence from a Field Experiment in Indonesia." Journal of Political Economy, 115(2), pp. 200-49.

Olken, Benjamin A. 2010. "Direct Democracy and Local Public Goods: Evidence from a Field Experiment in Indonesia." American Political Science Review, 104(2), pp. 243-67.

Olken, Benjamin A.; Junko Onishi and Susan Wong. 2012. "Should Aid Reward Performance? Evidence from a Field Experiment on Health and Education in Indonesia." NBER Working Paper.

Organisation for Economic Co-operation and Development. 2010. PISA 2009 Results. Paris: OECD Publishing.

Ostrom, Elinor and T.K. Ahn. 2009. "The Meaning of Social Capital and Its Link to Collective Action." In Handbook of Social Capital, ed. G. T. Svendsen and G. L. H. Svendsen. Edward Elgar.

Pandey, Priyanka; Sangeeta Goyal and Venkatesh Sundararaman. 2009. "Community Participation in Public Schools: Impact of Information Campaigns in Three Indian States." Education Economics, 17(3), pp. 355-75.

Stiglitz, Joseph E. 2002. "Participation and Development: Perspectives from the Comprehensive Development Paradigm." Review of Development Economics, 6(2), pp. 163-82.

World Bank. 2003. "Making Services Work for Poor People." World Development Report, ed. S. Devarajan and R. Reinikka-Soininen. Washington DC: Oxford University Press, The World Bank.


Table 1: Adherence to design (percent of intent to treat)

              Fully Implemented (1)   Partially Implemented (2)   Not Implemented (3)
Election              47.9                     44.7                      7.4
Grant                 98.8                      0                        1.2
Training             100                        0                        0
Linkage               98.4                      0                        1.6

Note: Partial implementation for the election means previously underrepresented groups were elected while other members stayed on.

Table 2: Allocation of schools to treatments (number of schools)

                         Receiving block grant
                 No election                Election
              Linkage   No Linkage    Linkage   No Linkage     Total
No Training      50          90          50          50          240
Training         45          45          45          45          180
Total            95         135          95          95          420

Control group (not receiving block grant, no intervention): 100 schools

Table 3: Impact evaluation framework

Comparison             Treatment                        Number of schools   Control                              Number of schools
Grant                  Grant-only                              90           No grant                                  100
Election               Grant + Election                       190           Grant + No Election                       230
Linkage                Grant + Linkage                        190           Grant + No Linkage                        230
Training               Grant + Training                       180           Grant + No Training                       240
Linkage + Election     Grant + Linkage + Election              95           Grant + No Linkage + No Election          135
Linkage + Training     Grant + Linkage + Training              90           Grant + No Linkage + No Training          140
Training + Election    Grant + Training + Election             90           Grant + No Training + No Election         140


Table 4: Tests of pre-treatment balance in observables across interventions

                                      Grant, G        Election, E      Linkage, L       Training, T      L+E              L+T              T+E
                                      OLS (1)         OLS (2)          OLS (3)          OLS (4)          OLS (5)          OLS (6)          OLS (7)
Language test scores                  0.133 (0.093)   -0.047 (0.064)   0.001 (0.064)    -0.054 (0.065)   -0.047 (0.088)   -0.050 (0.092)   -0.101 (0.087)
Mathematics test scores               0.104 (0.102)   -0.059 (0.067)   0.015 (0.068)    0.004 (0.068)    -0.046 (0.101)   0.022 (0.098)    -0.052 (0.090)
Awareness of school committee         -0.047 (0.058)  0.026 (0.038)    0.008 (0.038)    -0.040 (0.038)   0.033 (0.055)    -0.033 (0.052)   -0.014 (0.049)
Parent level inputs to education      -0.005 (0.046)  -0.029 (0.038)   0.002 (0.038)    -0.007 (0.038)   -0.028 (0.056)   -0.005 (0.053)   -0.036 (0.050)
Teacher level inputs to education     0.017 (0.041)   0.035 (0.030)    -0.002 (0.030)   -0.014 (0.030)   0.033 (0.041)    -0.015 (0.038)   0.021 (0.043)
School based management               -0.042 (0.049)  0.061* (0.035)   0.049 (0.035)    -0.043 (0.036)   0.109** (0.049)  0.005 (0.046)    0.018 (0.046)
Community level inputs to education   -0.077 (0.061)  -0.017 (0.045)   0.020 (0.045)    -0.086* (0.045)  0.003 (0.061)    -0.066 (0.063)   -0.103* (0.055)

* p<0.10, ** p<0.05, *** p<0.01

Each cell reports the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. All standard errors for regressions with test scores are clustered at the school level.


Table 5: Impact on drop out, repetition and test scores Baseline Mean/SD (1) Endline Mean/SD (2) Grant, G OLS (3) Election, E OLS (4) Linkage, L Training, T OLS OLS (5) (6) L+E OLS (7) L+T OLS (8) T+E OLS (9)

Panel A: Drop out and repetition rates Drop out 0.002 0.010 -0.005 0.010 0.050 (0.005) Repetition 0.022 0.032 -0.004 0.040 0.061 (0.007) Panel B: Language test scores (average, by gender) Average 11.662 13.088 0.129 4.324 6.339 (0.094) Boys 11.379 13.162 0.085 4.134 6.572 (0.105) Girls 11.927 13.038 0.167* 4.476 6.128 (0.093) Panel C: Mathematics test scores (average, by gender) Average 16.416 8.876 -0.015 5.607 3.166 (0.080) Boys 16.378 8.981 0.002 5.673 3.101 (0.085) Girls 16.494 8.782 -0.032 5.546 3.231 (0.088) * p<0.10, ** p<0.05, *** p<0.01

-0.003 (0.006) -0.001 (0.005) 0.053 (0.069) 0.025 (0.078) 0.073 (0.069) -0.008 (0.050) -0.030 (0.058) 0.009 (0.053)

-0.002 (0.006) 0.007 (0.005) 0.173** (0.068) 0.156** (0.078) 0.191*** (0.068) 0.070 (0.050) 0.026 (0.058) 0.113** (0.053)

0.007 (0.006) -0.006 (0.005) -0.042 (0.069) -0.044 (0.078) -0.040 (0.069) -0.029 (0.050) -0.021 (0.059) -0.035 (0.053)

-0.005 (0.011) 0.007 (0.008) 0.234** (0.094) 0.187* (0.101) 0.271*** (0.099) 0.061 (0.075) -0.003 (0.089) 0.121 (0.074)

0.003 (0.006) 0.001 (0.009) 0.134 (0.087) 0.115 (0.101) 0.152* (0.088) 0.040 (0.068) 0.001 (0.080) 0.079 (0.072)

0.004 (0.006) -0.007 (0.008) 0.015 (0.103) -0.020 (0.115) 0.039 (0.101) -0.036 (0.066) -0.051 (0.071) -0.025 (0.074)

Dropout and repetition are fractions obtained from aggregate, school level administrative data, test scores are z-scores standardized using the baseline and endline means and standard deviations for the control group. Columns 1 and 2 report means and standard deviations for the control group in the baseline and endline, respectively. For the test scores, we report the mean and standard deviation of the unstandardized test scores for the control group (what we used to calculate the z-scores). The unstandardized scores are on a scale of 0 to 30 because each test had 30 questions and each correct question was awarded one point. The following columns report the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. All standard errors for regressions with test scores are clustered at the school level.
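As a reading aid for the estimation described in the note above (z-scores standardized against the control group, OLS with stratum fixed effects, school-clustered standard errors), the sketch below shows one way such a regression could be run. It uses synthetic data and hypothetical variable names, with statsmodels as an assumed tool; it is an illustration, not the authors' code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the pupil-level data; all column names are hypothetical.
rng = np.random.default_rng(0)
n_schools, pupils_per_school = 60, 20
n = n_schools * pupils_per_school
school = np.repeat(np.arange(n_schools), pupils_per_school)
stratum = school % 6                                    # treatment was randomized within strata
linkage = np.repeat(rng.integers(0, 2, n_schools), pupils_per_school)
math_baseline = rng.normal(16, 4, n)
math_endline = 8 + 0.3 * math_baseline + 0.5 * linkage + rng.normal(0, 3, n)
df = pd.DataFrame({"school_id": school, "stratum": stratum, "linkage": linkage,
                   "math_baseline": math_baseline, "math_endline": math_endline})

# Standardize raw scores against the control group, as in the table note:
# z = (raw score - control-group mean) / control-group standard deviation.
ctrl = df[df["linkage"] == 0]
for col in ["math_baseline", "math_endline"]:
    df["z_" + col] = (df[col] - ctrl[col].mean()) / ctrl[col].std()

# Treatment effect: OLS of the endline z-score on the treatment dummy, the
# baseline z-score, and stratum fixed effects, with school-clustered SEs.
fit = smf.ols("z_math_endline ~ linkage + z_math_baseline + C(stratum)",
              data=df).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(fit.params["linkage"], fit.bse["linkage"])
```

Clustering at the school level matches the table note, since pupils within a school share the same treatment assignment.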


Table 6: Impact on summary index of intermediate outcomes Grant, G OLS (1) 0.154*** (0.054) 0.023 (0.054) -0.027 (0.042) 0.055 (0.058) 0.008 (0.073) Election, E OLS (2) 0.074* (0.041) 0.062* (0.034) 0.011 (0.032) -0.030 (0.035) -0.019 (0.051) Linkage, L Training, T OLS OLS (4) (5) -0.012 0.049 (0.042) (0.041) 0.025 0.026 (0.032) (0.033) 0.022 -0.001 (0.030) (0.030) 0.017 -0.018 (0.035) (0.036) 0.141*** 0.095* (0.051) (0.051) L+E OLS (6) 0.048 (0.060) 0.070 (0.043) 0.015 (0.046) -0.018 (0.050) 0.126* (0.072) L+T OLS (7) 0.025 (0.060) 0.048 (0.044) 0.024 (0.039) 0.006 (0.050) 0.237*** (0.066) T+E OLS (8) 0.110* (0.060) 0.074* (0.044) 0.016 (0.041) -0.068 (0.054) 0.062 (0.070)

Awareness of school committee Parent level inputs to education Teacher level inputs to education School based management Community level inputs to education * p<0.10, ** p<0.05, *** p<0.01

Each cell reports the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. See Tables A6 to A10 for effects on individual outcomes of each summary index.
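Table 6 reports effects on summary indices in standardized units. Assuming these indices follow the usual construction for families of outcomes, an equally weighted average of components z-scored against the control group in the spirit of Kling, Liebman and Katz (2007), which the paper cites, a minimal and entirely hypothetical sketch is:

```python
import numpy as np
import pandas as pd

def summary_index(df, components, is_control):
    """Equally weighted average of the components, each z-scored against the
    control group; the average is itself re-standardized against the control
    group so treatment effects are expressed in control-group SD units."""
    z = pd.DataFrame({c: (df[c] - df.loc[is_control, c].mean())
                         / df.loc[is_control, c].std()
                      for c in components})
    raw = z.mean(axis=1)
    return (raw - raw[is_control].mean()) / raw[is_control].std()

# Hypothetical community-level components (names echo Table A10).
rng = np.random.default_rng(1)
df = pd.DataFrame({"SCbpd": rng.integers(0, 2, 400).astype(float),
                   "SPbpd": rng.integers(0, 2, 400).astype(float),
                   "Psatcomm": rng.random(400)})
is_control = pd.Series(rng.integers(0, 2, 400).astype(bool))
df["community_index"] = summary_index(df, ["SCbpd", "SPbpd", "Psatcomm"], is_control)
print(df["community_index"].describe())
```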


Figure 1: Probability density functions of test scores

[Figure: probability density of the number of correct answers (0 to 30) for four series: Math Baseline, Math Endline, Language Baseline, and Language Endline. Horizontal axis: number of correct answers; vertical axis: density.]


Appendix 1

Table A1: Study timeline

Activity                                Period
Baseline survey                         January to February 2007
Training of school committees           July to September 2007
Linkage                                 June to October 2007
Elections                               April to August 2007
Disbursement of first block grant       January 2008
Midline survey                          April 2008
Qualitative study                       July 2008
Endline survey                          October to November 2008
Disbursement of second block grant      December 2008

Table A2: Attrition analysis (linear probability model) Grant, G (1) 0.002 (0.023) -0.040*** (0.012) 0.007 (0.017) 0.064* (0.036) 0.039 (0.047) 0.083** (0.036) 0.033 (0.034) -0.034 (0.035) Election, E (2) -0.019 (0.014) -0.038*** (0.008) 0.010 (0.012) 0.030 (0.023) 0.041 (0.027) 0.094*** (0.026) 0.003 (0.023) -0.016 (0.017) Linkage, L (3) 0.006 (0.014) -0.028*** (0.008) -0.010 (0.011) 0.031 (0.023) 0.045 (0.028) 0.093*** (0.026) 0.001 (0.024) -0.018 (0.017) Training, T (4) 0.002 (0.014) -0.024*** (0.009) -0.017 (0.011) 0.031 (0.023) 0.044 (0.027) 0.092*** (0.025) 0.001 (0.023) -0.018 (0.016) L+E (5) -0.011 (0.020) -0.038*** (0.012) -0.001 (0.018) 0.048 (0.030) 0.078* (0.042) 0.094** (0.042) -0.021 (0.028) -0.022 (0.027) L+T (6) 0.008 (0.019) -0.017* (0.009) -0.023* (0.013) 0.026 (0.029) 0.031 (0.032) 0.075** (0.033) 0.042 (0.035) -0.025 (0.021) T+E (7) -0.019 (0.017) -0.040*** (0.010) -0.009 (0.013) 0.081*** (0.028) 0.027 (0.027) 0.075** (0.029) 0.010 (0.024) -0.024 (0.027)

Intervention Baseline score Baseline score* intervention Baseline summary indices Awareness of school committee Parental level inputs to education Teacher level inputs to education School based management Community level inputs to education * p<0.10, ** p<0.05, *** p<0.01

Estimated impact on attrition using OLS. Baseline score is the sum of language and mathematics scores. Estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. All standard errors clustered at the school level.

Table A3: Robustness checks for main results on test scores Grant, G OLS (3) Panel B: Language test scores (average) (A): Language baseline test score, strata dummies (A)+(B): Language baseline test score squared (A)+(C): Mathematics baseline test score (A)+(D): District fixed effects (A)+(E): Baseline summary indices (A)+(B)+(C)+(D)+(E) Panel C: Mathematics test scores (average) (A): Mathematics baseline test score, strata dummies (A)+(B): Mathematics baseline test score squared (A)+(C): Language baseline test score (A)+(D): District fixed effects (A)+(E): Baseline summary indices (A)+(B)+(C)+(D)+(E) * p<0.10, ** p<0.05, *** p<0.01 Robustness check for main specification reported in columns 3 to 9 in Panels B and C of Table 5. Specification A is identical to the results in Table 5 for average test scores. The specification controls for the baseline test score and strata dummies (equation 1 in the paper); the second row adds the baseline test score squared; the third row is similar to specification A but controls for both baseline test scores separately; the fourth row adds district fixed effects to A; the fifth row adds baseline summary indices; the last row includes all controls in the rows above. 0.129 (0.094) 0.134 (0.094) 0.127 (0.094) 0.130 (0.091) 0.173* (0.092) 0.157* (0.088) -0.015 (0.080) 0.005 (0.078) -0.025 (0.079) -0.023 (0.076) 0.014 (0.077) 0.005 (0.072) Election, E OLS (4) 0.053 (0.069) 0.052 (0.069) 0.057 (0.069) 0.069 (0.066) 0.054 (0.069) 0.074 (0.066) -0.008 (0.050) -0.018 (0.049) -0.006 (0.049) -0.015 (0.048) -0.003 (0.050) -0.010 (0.046) Linkage, L OLS (5) 0.173** (0.068) 0.175** (0.068) 0.172** (0.067) 0.137** (0.066) 0.168** (0.068) 0.135** (0.065) 0.070 (0.050) 0.065 (0.048) 0.071 (0.049) 0.068 (0.048) 0.065 (0.049) 0.063 (0.046) Training, T OLS (6) -0.042 (0.069) -0.044 (0.069) -0.046 (0.069) -0.065 (0.065) -0.032 (0.068) -0.063 (0.063) -0.029 (0.050) -0.032 (0.049) -0.019 (0.049) -0.051 (0.048) -0.025 (0.049) -0.044 (0.046) L+E OLS (7) 0.234** (0.094) 0.235** (0.093) 0.237** (0.093) 0.230** (0.092) 0.210** (0.092) 0.223** (0.090) 0.061 (0.075) 0.048 (0.073) 0.064 (0.073) 0.065 (0.070) 0.053 (0.072) 0.055 (0.068) L+T OLS (8) 0.134 (0.087) 0.133 (0.087) 0.127 (0.086) 0.087 (0.085) 0.132 (0.084) 0.083 (0.085) 0.040 (0.068) 0.033 (0.065) 0.052 (0.066) 0.020 (0.065) 0.037 (0.067) 0.023 (0.063) T+E OLS (9) 0.015 (0.103) 0.012 (0.102) 0.014 (0.102) 0.004 (0.099) 0.038 (0.103) 0.011 (0.100) -0.036 (0.066) -0.049 (0.065) -0.023 (0.064) -0.070 (0.062) -0.011 (0.064) -0.046 (0.058)

Table A4: Impact on test scores by quintile (quintiles defined using baseline test score) Baseline Endline Mean/SD Mean/SD (1) (2) Panel A: Language test scores by quintiles 1 (Low base score) 6.984 12.306 2.162 6.110 2 9.419 12.164 2.051 5.697 3 11.302 13.052 1.953 6.969 4 13.737 13.331 2.073 6.391 5 (High base score) 17.696 14.713 2.409 6.232 Panel B: Mathematics test scores by quintiles 1 (Low base score) 9.347 8.464 2.928 2.929 2 13.858 8.475 2.666 2.877 3 17.233 8.220 2.602 3.018 4 19.929 9.056 2.570 3.335 5 (High base score) 23.089 10.189 2.574 3.286 * p<0.10, ** p<0.05, *** p<0.01 Columns 1 and 2 report means and standard deviations for the control group in the baseline and endline, respectively. For the test scores, we report the mean and standard deviation of the unstandardized test scores for the control group (what we used to calculate the z-scores). The unstandardized scores are on a scale of 0 to 30 because each test had 30 questions and each correct question was awarded one point. The following columns report the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. All standard errors are clustered at the school level. Quintiles are defined using the standardized language score plus the standardized mathematics score. Grant, G OLS (3) -0.046 (0.155) 0.073 (0.100) 0.098 (0.141) 0.201 (0.122) 0.302** (0.144) -0.173 (0.115) -0.071 (0.103) 0.141 (0.098) -0.035 (0.117) 0.103 (0.165) Election, E OLS (4) 0.095 (0.109) 0.057 (0.086) 0.088 (0.093) -0.104 (0.084) 0.116 (0.108) -0.103 (0.104) -0.000 (0.063) -0.002 (0.056) 0.073 (0.067) -0.040 (0.088) Linkage, L OLS (5) 0.140 (0.110) 0.088 (0.087) 0.132 (0.092) 0.246*** (0.081) 0.267** (0.107) 0.053 (0.103) 0.131** (0.060) 0.022 (0.055) 0.083 (0.067) 0.066 (0.089) Training, T OLS (6) -0.072 (0.111) -0.060 (0.085) -0.093 (0.093) 0.009 (0.085) -0.048 (0.109) 0.010 (0.097) -0.054 (0.061) -0.025 (0.055) -0.024 (0.067) -0.059 (0.089) L+E OLS (7) 0.217 (0.148) 0.144 (0.112) 0.221 (0.134) 0.144 (0.108) 0.424*** (0.153) -0.054 (0.151) 0.134 (0.097) 0.018 (0.083) 0.167* (0.085) 0.016 (0.134) L+T OLS (8) 0.081 (0.153) 0.027 (0.114) 0.038 (0.113) 0.253** (0.105) 0.213 (0.142) 0.092 (0.109) 0.105 (0.091) -0.005 (0.073) 0.067 (0.097) 0.017 (0.126) T+E OLS (9) 0.035 (0.157) 0.020 (0.115) -0.002 (0.132) -0.094 (0.128) 0.079 (0.147) -0.081 (0.106) -0.037 (0.084) -0.031 (0.077) 0.054 (0.094) -0.098 (0.121)
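The quintile definition in the note above (quintiles of the standardized baseline language score plus the standardized baseline mathematics score) can be made concrete with a short, hypothetical pandas sketch; the column names are illustrative only.

```python
import numpy as np
import pandas as pd

# Hypothetical baseline z-scores for illustration.
rng = np.random.default_rng(2)
df = pd.DataFrame({"z_lang_baseline": rng.normal(size=500),
                   "z_math_baseline": rng.normal(size=500)})

# Quintiles of the sum of the two standardized baseline scores:
# quintile 1 = lowest baseline performers, quintile 5 = highest.
df["baseline_total"] = df["z_lang_baseline"] + df["z_math_baseline"]
df["quintile"] = pd.qcut(df["baseline_total"], q=5, labels=[1, 2, 3, 4, 5])
print(df["quintile"].value_counts().sort_index())
```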

Table A5: Intermediate outcome definitions Variable and index Variable description Table A6: Awareness of school committees Panel A: Parents' awareness of school committees Pknow_scexist Parents know there is a school committee (1=Yes) Pknow_scmem Parents know names of school committee members (1=Yes) Pscanswer Parents are able to answer series of questions about school committee activities and performance (1=Yes) Panel B: Stakeholder opinions about school committee effectiveness SPsceffective Index of school committees cooperation, support, outreach and involvement in the school and community, according to principals [0,1] SCposcontr Whether school committee helped meet schools needs during the first semester of previous school year (1=Yes) Tscperception Index of teachers evaluation of school committee effectiveness, openness, and cooperation with school principal [0,1] Panel C: The number of school committee meetings with education stakeholders SCmeettripartite Number of formal meetings with school committee, principal, parents in previous year SCmeetprincipaltot SCintmeettot SCmeetparents SCmeetdinas SCmeetcomm SCmeetbpd SPmeetsc Number of informal and formal meetings with school committee, principal to discuss school issues/problems in the past year Number of internal formal and informal school committee meetings without principal or parents in the past year Number of formal meetings with school committee and parents, but principal not invited in the past year Number of formal meetings between school committee and Dinas kab/kota/keca (invited by Dinas) in the past six months Index of school committee meeting with any set of community groups in the first semester of the previous school year [0,1] Whether school committee has ever had a meeting with village council (1=Yes) Number of informal meetings with principal and school committee representative + number of formal meetings with principal and school committee members + number of formal meetings with entire school committee in the past month Number of times school committee invited teachers to discuss issues and problems at the school in the previous school year

Tscmeet

Table A7: Parent level inputs to education Panel A: Parents financial and in-kind support for school committees SCparfundraise Parental contributions in the first semester of previous school year (Rupiah in millions) SCparinkind Whether parents provided in-kind donations in the first semester of previous school year School committees subjective assessment of in kind contributions of parents to school committee in past semester (1=Large) Pcont Amount of voluntary financial and in-kind donations from parents to school committee in past year (Rupiah in thousands) Pcont_physical Whether parents contributed in-kind to school committee in past year (1=Yes) Panel B: Parents' support for and involvement in education SCsizeinkind

Variable and index Variable description Pmeet_teacher Number of times parents met with teacher in the last three months to discuss childs performance (other than to pick up report card) Pvisit Whether parents have ever come to school to observe class (1=Yes) Pallhh_min Total number of minutes all household members accompanied child studying at home in past week Psatparents Parents satisfaction with parents involvement in school and learning (1=Yes) Pchildatt Index of emphasis parents put on childs education (compilation of five opinion questions) [0,1] SCsatparents School committee representatives satisfaction with parents support for pupils education (1=Satisfied) SPsatpar Principals satisfaction with parents support for pupils education (1=Satisfied) SPparentsinvolve Index of principals assessment of parents involvement in school and learning [0,1] Tsatpar Teachers satisfaction with parents support for pupils education (1=Satisfied) Tparentsperception Index of teachers assessment whether parents of her/his pupils can help students improve achievement [0,1] Tparentsperception1 Teachers perception about parents involvement (actual and desired) Shomesupport Index of whether someone in the household promotes, accompanies and answers questions relating to home study [0,1] Table A8: Teacher level inputs to education Panel A: Number of teachers PNSteach Number of civil servant teachers GTTteach_govt Number of contract teachers hired by government GTTteach_school Number of contract teachers hired by school directly Panel B: Teacher effort SCsatteachers School committee representatives satisfaction with quality and performance of teachers (1=Satisfied) SCteachnoprob School committee representatives perception of whether teacher quality has been a problem (1=Not problematic) SPsatteach Principals satisfaction with quality and performance of teachers (1=Satisfied) Tsatteach Teachers satisfaction with quality and performance of teachers (1=Satisfied) Psatteachers Parents satisfaction with quality and performance of teachers (1=Satisfied) Pteacherperception Index of parents perceptions of teacher effort and approachability [0,1] Thours Number of hours worked per day in past week on teaching activities Tmeetparents Number of times in past three months that teacher met with parents to discuss student learning OBfractwithteach Fraction of classrooms with teachers (of those classrooms with teachers)

Variable and index Variable description Table A9: School based management Panel A: Financial accountability of school management to parents and school committees SCrapbs Index of involvement of school committee in developing school budget (according to school committee) [0,1] SCrecrapbs Whether school committee received the school budget in previous school year (1=Yes) Whether materials about school funding and budgeting were distributed to parents in previous school year (1=Yes) SPinviterapbs Index of involvement of school committee and community in developing school budget, according to principal [0,1] SPparentsrapbs Whether parents were told about school funding and budgeting in the previous school year (1=Yes) Pmtgrapbs Whether there was a meeting at the school about the budget (1=Yes) Prapbs Whether parents were told about school funding and budgeting in the previous school year (1=Yes) Panel B: Principals performance and management of teachers SPmeetteach Number of meetings between principal and teachers during previous school year Tprincmeet Number of routine meetings between principal and teachers in past year SPteacheval Index of whether principal conducts oral or written evaluations of teacher performance beyond compulsory yearly evaluation and whether results are given to teacher verbally or in writing [0,1] Tprinceval Index of whether principal conducts evaluations of teacher performance beyond compulsory yearly evaluation to teachers [0,1] Tprincipal Index of teachers overall assessment of principal (principal rated on seven areas of performance) [0,1] SPteachaward Whether principal rewards teachers who perform well (through recognition or gift/money), according to principals (1=Yes) SPteachaccount Whether principal sanctions teachers who dont perform well (trough warnings or training), according to principals (1=Yes) Treward Whether principal rewards teachers who perform well (through recognition or gift/money), according to teachers (1=Yes) Whether principal sanctions teachers who dont perform well (through warnings or training), Taccount according to teachers (1=Yes) SCprinceffort School committee representatives perception of whether principal has taken measures to address issues that are holding back learning (1=Yes) Table A10: Community level inputs to education Panel A: Village council's collaboration with schools and overall support for education in the village SCbpd Whether the school worked with the village council in the previous school year (1=Yes) SCsatbpd SPbpd SPsatbpd
School committee representatives satisfaction with village councils attention to education in the village (1=Satisfied) Whether the school worked together with the village council in previous school year (1=Yes)

SCdistrapbs

Principals assessment of extent of village councils attention to education in village (conditional on principal knowing there is a village council in the village) (1=Satisfied)

Panel B: Community support for schools and school committees
SCsatcomm  School committee representatives' satisfaction with support from community (1=Satisfied)
SCnonbpd  Whether school committee cooperated with any non-educational community organizations other than the village council in the previous school year (1=Yes)
SCcomfundraise  Community, private sector and other contributions in the first semester of previous school year (Rupiah in millions)
SCcominkind  Whether community, private sector or any other private person/organization provided in-kind donations in the first semester of previous school year (1=Yes)
SPsatcomm  Principals' satisfaction with support from community (1=Satisfied)
SPnonbpd  Whether school cooperated with any non-educational community organizations other than the village council in the previous school year (1=Yes)
Psatcomm  Parents' satisfaction with support from community (1=Satisfied)
Tsatcomm  Teachers' satisfaction with support from community (1=Satisfied)

Letters before the variable name indicate type of questionnaire from which variable was drawn. SC = school committee. P = parents. T = teachers. SP = school principal. S = student. OB = school-level observational questionnaire.

Table A6: Awareness of school committee Baseline Mean/SD (1) Endline Mean/SD (2) Grant, G OLS (3) Election, E OLS (4) Linkage, L OLS (5) 0.029 (0.030) 0.032* (0.019) -0.006 (0.029) 0.001 (0.010) 0.034 (0.044) 0.016 (0.012) -0.060 (0.144) 0.275 (0.374) -0.351 (0.318) -0.115* (0.064) -0.214** (0.087) 0.037 (0.033) 0.035 (0.052) -0.413 (0.715) -0.077 (0.262) -0.012 (0.042) Training, T OLS (6) 0.023 (0.031) 0.002 (0.019) 0.025 (0.029) -0.006 (0.010) -0.017 (0.043) 0.015 (0.012) -0.112 (0.142) 0.426 (0.381) 0.461 (0.301) 0.104 (0.066) -0.003 (0.086) -0.021 (0.033) -0.003 (0.052) -0.266 (0.734) 0.123 (0.236) 0.049 (0.041) L+E OLS (7) 0.085** (0.042) 0.045 (0.028) 0.018 (0.040) 0.015 (0.014) 0.040 (0.062) 0.026 (0.018) -0.006 (0.181) 0.261 (0.516) -0.044 (0.436) -0.117 (0.112) -0.344*** (0.126) 0.046 (0.045) 0.076 (0.070) 0.351 (0.924) 0.330 (0.273) 0.048 (0.060) L+T OLS (8) 0.039 (0.041) 0.019 (0.027) 0.015 (0.042) -0.012 (0.014) 0.020 (0.064) 0.029 (0.018) -0.183 (0.179) 0.846 (0.534) 0.191 (0.465) -0.016 (0.103) -0.244** (0.123) 0.009 (0.044) 0.021 (0.074) -0.555 (0.889) 0.037 (0.366) 0.025 (0.060) T+E OLS (9) 0.071* (0.042) 0.012 (0.025) 0.038 (0.040) 0.005 (0.014) -0.001 (0.059) 0.035** (0.016) 0.076 (0.205) 0.543 (0.611) 0.678 (0.462) 0.102 (0.096) -0.115 (0.122) -0.019 (0.046) 0.027 (0.072) 0.368 (1.070) 0.569 (0.385) 0.110* (0.060)

Panel A: Parents' awareness of school committees Pknow_scexist 0.535 0.541 0.159*** 0.046 0.383 0.352 (0.046) (0.030) Pknow_scmem 0.215 0.222 0.098*** 0.011 0.219 0.210 (0.027) (0.019) Pscanswer 0.612 0.574 0.088* 0.013 0.347 0.347 (0.049) (0.028) Panel B: Stakeholder opinions about school committee effectiveness SPsceffective 0.599 0.575 0.027 0.012 0.130 0.110 (0.018) (0.010) SCposcontr 0.758 0.768 -0.027 0.026 0.431 0.424 (0.070) (0.042) Tscperception 0.818 0.836 -0.029 0.024** 0.106 0.110 (0.023) (0.012) Panel C: The number of school committee meetings with education stakeholders SCmeettripartite 2.263 2.273 -0.043 0.128 1.418 1.766 (0.258) (0.142) SCmeetprincipaltot 3.222 3.576 0.229 0.126 3.049 3.918 (0.431) (0.377) SCintmeettot 1.646 1.222 0.708** 0.355 2.508 1.782 (0.340) (0.305) SCmeetparents 0.101 0.172 -0.010 -0.008 0.562 0.475 (0.091) (0.068) SCmeetdinas 0.919 0.525 0.272 -0.090 0.865 0.941 (0.173) (0.083) SCmeetcomm 0.269 0.239 0.044 0.016 0.300 0.270 (0.040) (0.033) SCmeetbpd 0.354 0.343 0.043 0.041 0.481 0.477 (0.071) (0.051) SPmeetsc 4.697 5.394 1.112 0.864 4.186 5.825 (0.994) (0.791) Tscmeet 1.424 1.808 0.127 0.358 1.566 2.427 (0.379) (0.232) Summary Index 0.154*** 0.074* (0.054) (0.041) * p<0.10, ** p<0.05, *** p<0.01

Columns 1 and 2 report means and standard deviations for the control group in the baseline and endline, respectively. The following columns report the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. Letters before the variable name indicate type of questionnaire from which variable was drawn. SC = school committee. P = parents. T = teachers. SP = school principal. S = student. OB = school-level observational questionnaire.

Table A7: Parent level inputs to education Baseline Mean/SD (1) Endline Mean/SD (2) Grant, G OLS (3) Election, E OLS (4) -0.317 (0.438) -0.040 (0.042) -0.012 (0.030) 4883.991 (3150.762) 0.029 (0.023) 0.018 (0.188) -0.010 (0.019) 81.175** (32.577) 0.013 (0.009) 0.009 (0.010) 0.007 (0.017) -0.010 (0.020) 0.017 (0.013) 0.008 (0.020) 0.010 (0.051) 0.016 (0.018) 0.018 (0.017) 0.062* (0.034) Linkage, L OLS (5) 0.458 (0.424) 0.015 (0.041) 0.012 (0.030) 1991.739 (2288.924) -0.047** (0.023) 0.159 (0.193) 0.007 (0.019) 9.015 (32.449) 0.008 (0.009) -0.004 (0.010) 0.026 (0.017) 0.016 (0.019) -0.003 (0.013) 0.019 (0.020) 0.017 (0.052) 0.002 (0.018) -0.032* (0.017) 0.025 (0.032) Training, T OLS (6) 0.699 (0.453) 0.085** (0.041) 0.054* (0.029) -375.462 (3067.236) -0.006 (0.024) -0.164 (0.158) -0.039** (0.019) 25.020 (38.026) 0.005 (0.008) -0.007 (0.010) 0.012 (0.016) -0.008 (0.020) 0.010 (0.014) 0.022 (0.020) 0.023 (0.051) 0.008 (0.018) -0.003 (0.017) 0.026 (0.033) L+E OLS (7) 0.147 (0.316) -0.046 (0.057) -0.010 (0.044) 5955.505 (4143.527) -0.021 (0.032) 0.126 (0.211) -0.009 (0.027) 77.263 (52.735) 0.020 (0.013) 0.010 (0.014) 0.024 (0.023) 0.009 (0.025) 0.021 (0.016) 0.031 (0.029) 0.012 (0.070) 0.017 (0.025) -0.019 (0.023) 0.070 (0.043) L+T OLS (8) 1.415 (0.925) 0.070 (0.056) 0.046 (0.039) 2119.113 (1654.624) -0.051 (0.032) -0.094 (0.219) -0.032 (0.025) 50.291 (58.424) 0.015 (0.011) -0.012 (0.014) 0.033 (0.024) 0.011 (0.029) 0.014 (0.020) 0.044 (0.029) 0.020 (0.070) 0.011 (0.025) -0.030 (0.022) 0.048 (0.044) T+E OLS (9) 0.408 (0.295) 0.034 (0.053) 0.038 (0.037) 4223.308 (3280.726) 0.027 (0.034) -0.128 (0.287) -0.039 (0.029) 106.454* (55.458) 0.017 (0.012) -0.003 (0.014) 0.017 (0.022) -0.029 (0.026) 0.024 (0.017) 0.020 (0.026) 0.027 (0.072) 0.024 (0.026) 0.008 (0.023) 0.074* (0.044)

Panel A: Parents' financial and in-kind support to school committees: SCparfundraise 0.619 2.201 -1.419 3.912 12.797 (1.526) SCparinkind 0.152 0.131 0.031 0.360 0.339 (0.051) SCsizeinkind 0.106 0.098 0.009 0.260 0.255 (0.037) Pcont 7097.640 6405.720 -2.7e+03 24292.800 23436.000 (3309.804) Pcont_physical 0.101 0.108 0.008 0.245 0.222 (0.036) Panel B: Parents' support for and involvement in education: Pmeet_teacher 0.690 0.548 0.181 1.497 1.584 (0.157) Pvisit 0.138 0.101 0.056* 0.223 0.187 (0.029) Pallhh_min 263.596 268.455 4.557 321.546 341.264 (49.346) Psatparents 0.614 0.619 0.002 0.107 0.078 (0.015) Pchildatt 0.708 0.712 0.002 0.105 0.104 (0.015) SCsat parents 0.589 0.613 -0.052* 0.189 0.170 (0.027) SPsatpar 0.529 0.556 -0.042 0.191 0.184 (0.027) SPparents involve 0.523 0.511 0.014 0.122 0.123 (0.018) Tsatpar 0.488 0.478 0.027 0.209 0.193 (0.033) Tparents perception 0.566 0.556 -0.048 0.498 0.499 (0.079) Tparents perception1 0.540 0.508 0.022 0.191 0.197 (0.028) Shome support 0.814 0.774 0.048* 0.164 0.173 (0.027) Summary index 0.023 (0.054) * p<0.10, ** p<0.05, *** p<0.01

Columns 1 and 2 report means and standard deviations for the control group in the baseline and endline, respectively. The following columns report the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. Letters before the variable name indicate type of questionnaire from which variable was drawn. SC = school committee. P = parents. T = teachers. SP = school principal. S = student. OB = school-level observational questionnaire.


Table A8: Teacher level inputs to education Baseline Mean/SD (1) Panel A: Number of teachers PNSteach 7.091 1.642 GTTteach_govt 0.475 0.812 GTTteach_school 1.212 1.272 Panel B: Teacher effort SCsatteachers 0.616 0.129 SCteachnoprob 0.808 0.396 SPsatteach 0.626 0.119 Tsatteach 0.663 0.112 Psatteachers 0.654 0.089 Pteacherperception 0.593 0.115 Thours 5.796 2.807 Tmeetparents 1.404 1.228 OBfractwithteach 0.820 0.348 Summary index * p<0.10, ** p<0.05, *** p<0.01 Columns 1 and 2 report means and standard deviations for the control group in the baseline and endline respecitvely. The following columns report the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. Letters before the variable name indicate type of questionnaire from which variable was drawn. SC = school committee. P = parents. T = teachers. SP = school principal. S = student. OB = school-level observational questionnaire. Endline Mean/SD (2) 7.596 2.208 0.717 1.098 1.909 1.629 0.626 0.128 0.798 0.404 0.613 0.123 0.630 0.104 0.636 0.100 0.615 0.068 5.784 3.182 2.919 7.341 0.796 0.365 Grant, G OLS (3) -0.120 (0.166) -0.007 (0.129) 0.150 (0.196) -0.017 (0.021) 0.028 (0.056) -0.004 (0.021) -0.004 (0.018) 0.001 (0.011) -0.014 (0.012) 0.199 (0.431) -0.957 (0.812) 0.018 (0.030) -0.027 (0.042) Election, E OLS (4) -0.152 (0.106) 0.080 (0.084) -0.141 (0.126) -0.007 (0.014) 0.017 (0.042) 0.021 (0.015) 0.006 (0.015) 0.002 (0.008) 0.004 (0.009) 0.587** (0.279) -0.644 (0.591) -0.053** (0.021) 0.011 (0.032) Linkage, L OLS (5) -0.026 (0.098) -0.010 (0.085) -0.139 (0.125) 0.010 (0.014) 0.011 (0.041) 0.011 (0.014) 0.021 (0.015) 0.003 (0.008) -0.009 (0.009) 0.554* (0.284) -0.534 (0.421) 0.004 (0.021) 0.022 (0.030) Training, T OLS (6) 0.125 (0.104) -0.051 (0.082) -0.075 (0.122) 0.005 (0.014) -0.005 (0.040) 0.000 (0.014) -0.015 (0.015) 0.008 (0.007) -0.002 (0.009) 0.150 (0.285) 0.570 (0.423) -0.019 (0.023) -0.001 (0.030) L+E OLS (7) -0.147 (0.137) 0.112 (0.133) -0.346* (0.179) -0.005 (0.021) 0.026 (0.057) 0.023 (0.021) 0.024 (0.022) 0.005 (0.012) -0.006 (0.013) 1.000*** (0.369) -1.346 (0.891) -0.046 (0.029) 0.015 (0.046) L+T OLS (8) 0.143 (0.139) -0.047 (0.107) -0.213 (0.178) 0.017 (0.020) 0.011 (0.057) 0.010 (0.018) 0.006 (0.021) 0.008 (0.009) -0.013 (0.013) 0.760* (0.430) 0.118 (0.391) -0.009 (0.026) 0.024 (0.039) T+E OLS (9) -0.041 (0.128) 0.015 (0.108) -0.186 (0.165) 0.000 (0.017) 0.020 (0.055) 0.022 (0.019) -0.012 (0.022) 0.010 (0.010) 0.004 (0.012) 0.644 (0.397) 0.314 (0.453) -0.072** (0.032) 0.016 (0.041)


Table A9: School based management Baseline Mean/SD (1) Endline Mean/SD (2) Grant, G OLS (3) Election, E OLS (4) Linkage, L OLS (5) Training, T OLS (6) -0.020 (0.037) -0.049* (0.029) -0.008 (0.039) -0.014 (0.018) -0.042 (0.035) 0.018 (0.041) -0.029 (0.030) 0.185 (0.714) -0.901 (0.717) -0.020 (0.027) 0.044 (0.041) 0.014 (0.013) 0.018 (0.011) 0.002 (0.012) 0.001 (0.009) 0.004 (0.008) -0.034 (0.035) -0.018 (0.036) L+E OLS (7) -0.086 (0.052) 0.006 (0.038) -0.022 (0.057) 0.007 (0.025) 0.045 (0.046) 0.012 (0.056) 0.022 (0.047) 1.155 (0.860) 2.076** (0.939) -0.035 (0.034) -0.041 (0.060) 0.020 (0.020) 0.023 (0.016) 0.006 (0.017) -0.029* (0.015) -0.037*** (0.013) -0.049 (0.047) -0.018 (0.050) L+T OLS (8) -0.071 (0.055) -0.043 (0.044) 0.008 (0.059) -0.007 (0.025) 0.037 (0.050) -0.006 (0.053) 0.003 (0.039) 1.687* (0.876) 0.191 (0.930) -0.024 (0.036) 0.025 (0.058) 0.031* (0.017) 0.048*** (0.016) 0.007 (0.016) -0.014 (0.014) -0.017 (0.011) -0.083* (0.044) 0.006 (0.050) T+E OLS (9) -0.065 (0.054) -0.061 (0.047) -0.094 (0.058) -0.018 (0.026) -0.066 (0.047) 0.001 (0.059) -0.066 (0.045) -0.732 (0.894) 0.332 (1.004) -0.045 (0.041) 0.044 (0.060) 0.013 (0.020) 0.010 (0.017) 0.012 (0.017) -0.003 (0.012) -0.002 (0.010) -0.021 (0.049) -0.068 (0.054)

Panel A: Financial accountability of school management to parents and school committees SCrapbs 0.727 0.762 0.040 -0.043 -0.047 0.377 0.380 (0.056) (0.037) (0.037) SCrecrapbs 0.949 0.869 0.049 -0.002 0.005 0.220 0.339 (0.046) (0.027) (0.026) SCdistrapbs 0.793 0.719 -0.013 -0.058 0.023 0.335 0.374 (0.057) (0.039) (0.038) SPinviterapbs 0.591 0.558 0.047* -0.005 0.006 0.169 0.147 (0.024) (0.018) (0.019) SPparentsrapbs 0.778 0.722 0.114** -0.042 0.080** 0.321 0.359 (0.051) (0.034) (0.034) Pmtgrapbs 0.390 0.403 0.008 -0.002 -0.014 0.383 0.382 (0.057) (0.040) (0.041) Prapbs 0.347 0.322 0.074* -0.019 0.029 0.322 0.313 (0.043) (0.030) (0.030) Panel B: Principals' performance and management of teachers SPmeetteach 6.556 11.606 -0.243 -0.532 1.665*** 3.654 4.618 (0.760) (0.627) (0.593) Tprincmeet 10.717 12.722 -1.875* 1.856** 0.988 3.452 8.719 (1.045) (0.760) (0.674) SPteacheval 0.771 0.771 0.007 -0.020 -0.004 0.259 0.246 (0.038) (0.027) (0.027) Tprinceval 0.715 0.731 -0.087 0.007 -0.037 0.341 0.352 (0.060) (0.040) (0.040) Tprincipal 0.844 0.850 -0.020 -0.000 0.016 0.109 0.096 (0.021) (0.013) (0.013) SPteachaward 0.262 0.242 -0.001 -0.008 0.028** 0.131 0.125 (0.020) (0.011) (0.012) SPteachaccount 0.283 0.273 0.000 0.005 0.003 0.151 0.110 (0.018) (0.012) (0.012) Treward 0.027 0.024 0.027* -0.004 -0.019** 0.074 0.068 (0.014) (0.009) (0.010) Taccount 0.028 0.036 0.021* -0.005 -0.023*** 0.076 0.096 (0.013) (0.009) (0.009) SCprinceffort 0.872 0.832 -0.029 0.015 -0.055 0.283 0.278 (0.045) (0.034) (0.034) Summary index 0.055 -0.030 0.017 (0.058) (0.035) (0.035) * p<0.10, ** p<0.05, *** p<0.01

Columns 1 and 2 report means and standard deviations for the control group in the baseline and endline, respectively. The following columns report the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. Letters before the variable name indicate type of questionnaire from which variable was drawn. SC = school committee. P = parents. T = teachers. SP = school principal. S = student. OB = school-level observational questionnaire.


Table A10: Community level inputs to education


Baseline Mean/SD (1) Endline Mean/SD (2)

Grant, G OLS (3)

Election, E OLS (4)

Linkage, L OLS (5)

Training, T OLS (6) 0.045 (0.060) 0.025 (0.026) 0.120** (0.059) 0.027 (0.027) -0.003 (0.019) -0.007 (0.060) -0.019 (0.172) 0.013 (0.016) 0.004 (0.022) 0.115** (0.054) 0.017 (0.011) -0.001 (0.022) 0.095* (0.051)

L+E OLS (7) 0.303*** (0.081) -0.028 (0.037) 0.304*** (0.082) 0.091** (0.037) -0.014 (0.026) -0.019 (0.085) -0.154 (0.114) -0.027 (0.021) -0.018 (0.032) 0.069 (0.081) 0.020 (0.014) -0.003 (0.034) 0.126* (0.072)

L+T OLS (8) 0.239*** (0.080) -0.008 (0.035) 0.434*** (0.071) 0.103*** (0.039) 0.012 (0.025) -0.031 (0.080) 0.138 (0.158) -0.000 (0.017) 0.006 (0.036) 0.135* (0.075) 0.043*** (0.015) -0.015 (0.032) 0.237*** (0.066)

T+E OLS (9) 0.085 (0.085) 0.032 (0.034) 0.126 (0.079) 0.060 (0.038) -0.032 (0.025) -0.049 (0.085) -0.292 (0.278) -0.007 (0.023) -0.029 (0.030) 0.133* (0.075) 0.019 (0.015) 0.015 (0.031) 0.062 (0.070)

Panel A: Village councils' collaboration with schools and overall support for education in the village SCbpd 0.274 0.242 0.095 0.092 0.190*** 0.448 0.431 (0.078) (0.060) (0.058) SCsatbpd 0.447 0.480 0.024 0.020 -0.038 0.254 0.217 (0.041) (0.026) (0.026) SPbpd 0.182 0.152 0.123* 0.008 0.281*** 0.388 0.360 (0.072) (0.060) (0.056) SPsatbpd 0.408 0.457 -0.049 0.026 0.066** 0.259 0.217 (0.039) (0.027) (0.026) Panel B: Community support for schools and school committees SCsatcomm 0.640 0.657 -0.009 -0.031 0.017 0.132 0.154 (0.028) (0.020) (0.019) SCnonbpd 0.434 0.404 0.003 -0.028 0.003 0.498 0.493 (0.088) (0.060) (0.060) SCcomfundraise 0.549 0.255 -0.050 -0.275 0.195 3.108 1.615 (0.094) (0.182) (0.195) SCcominkind 0.067 0.054 -0.025 -0.017 -0.002 0.150 0.132 (0.022) (0.017) (0.016) SPsatcomm 0.593 0.596 -0.020 -0.030 0.006 0.147 0.160 (0.026) (0.022) (0.022) SPnonbpd 0.636 0.667 -0.047 0.063 0.004 0.483 0.474 (0.086) (0.056) (0.055) Psatcomm 0.624 0.627 -0.002 -0.003 0.022** 0.121 0.084 (0.015) (0.010) (0.011) Tsatcomm 0.576 0.551 0.023 0.009 -0.010 0.164 0.194 (0.033) (0.023) (0.022) Summary index 0.008 -0.019 0.141*** (0.073) (0.051) (0.051) * p<0.10, ** p<0.05, *** p<0.01

Columns 1 and 2 report means and standard deviations for the control group in the baseline and endline, respectively. The following columns report the estimated treatment effects using OLS. All estimations include stratum fixed effects because assignment of treatment was within each stratum. Robust standard errors reported in the parentheses. Letters before the variable name indicate type of questionnaire from which variable was drawn. SC = school committee. P = parents. T = teachers. SP = school principal. S = student. OB = school-level observational questionnaire.


Table A11: Training modules

Module 1: Community participation and school quality (180 minutes)
Enable participants to:
1. Explain how community participation can improve school quality.
2. Explain the importance of eliminating the negative myths that prevented effective community participation.
3. Utilize the nine steps to evaluate the most effective forms and degrees of community participation for improving learning.

Module 2: The role of the school committee (180 minutes)
Enable participants to:
1. Explain the essence of the school committee.
2. Understand the objective, role, function and organizational structure of a school committee.
3. Understand the relationships between the school, school committee, and the district education council.
4. Identify concrete steps to empower the school committee.

Module 3: School budgets and exploring local potentials (180 minutes)
Enable participants to:
1. Analyze school problems and needs.
2. Identify and explore community potential.
3. Formulate the school budget.
4. Build collaboration networks to improve school quality.
5. Strengthen collaborations between stakeholders.

Module 4: Sustaining community participation (180 minutes)
Enable participants to:
1. Understand the urgency of sustaining community participation.
2. Choose and apply effective strategies in building and sustaining community participation.
3. Maintain and improve community trust of the school.

Module 5: Active, joyful and effective learning (AJEL) (195 minutes)
Enable participants to:
1. Analyze the weaknesses of conventional learning.
2. Identify the advantage of AJEL over conventional learning.
3. Identify and explore the potential of the school committee in supporting the implementation of AJEL.


APPENDIX 2. Training of school committee members The training material used in this experiment is an abridged version of training developed for the Creating Learning Communities for Children (CLCC) program.1 CLCC consists of three main components (Ministry of National Education, 2003). The first, namely school-based management, trains school principals, teachers, and community members to collaboratively develop school development plans. Meanwhile, the second component, active, joyful, and effective learning, focuses on moving classroom teaching away from rote memorization to more interactive learning. Finally, the community participation component trains communities and schools to work together to implement their plans. The idea is to have schools involve communities in planning and organizing their activities, and encourage communities to support them. Table A11 provides a breakdown of the time spent on each of the modules.

All three components of the CLCC program are included in the training modules for this experiment.2 Unlike the regular CLCC, however, the training in this experiment is geared only towards school committee members, with no separate modules for school principals and teachers. The training materials are also significantly shorter than those used in a regular CLCC training: the training in this experiment consists of two days of sessions plus a visit to a model school, whereas the recommended length of the first of the four packages of a regular CLCC training (inclusive of the school visit) is six days (Ministry of National Education, 2005a), and the remaining three packages would require an additional nine training days for school committee members (Ministry of National Education, 2005b, 2006a, 2006b).
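As a rough back-of-the-envelope comparison of training intensity, using only the figures cited above:

\[
\underbrace{6 \text{ days}}_{\text{CLCC package 1, incl. school visit}} \;+\; \underbrace{9 \text{ days}}_{\text{CLCC packages 2--4}} \;=\; 15 \text{ days}
\quad \text{versus} \quad
2 \text{ days} + \text{school visit in this experiment.}
\]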
1 CLCC began in 1999 as an effort to create a model of school management and, by 2002, had been adopted by the Ministry of National Education as the official model for school-based management in primary schools (Ministry of National Education, 2003). The program, whose development was supported by UNICEF, UNESCO, and New Zealand AID, trained school principals, teachers, and communities to plan for the development of their schools and to mobilize local resources. 2 More specifically, the training for this experiment comprises five modules: (i) community participation and school quality; (ii) the role of the school committee; (iii) school budgets and exploring local potentials (i.e., school-based management); (iv) sustaining community participation; and (v) active, joyful, and effective learning.


In addition, our approach differs significantly from CLCC with regard to its reliance on school clusters. CLCC encourages the formation of working groups of school principals and teachers, as well as school committee fora, among CLCC schools located close to each other in a school cluster. This experiment, however, examines the effects of training on individual schools: treatment schools, rather than school clusters, were randomly selected (within strata) to receive the training. We were therefore unable to implement CLCC's school-cluster approach in the training for this experiment.

References:

Ministry of National Education, 2003, Creating Learning Communities for Children: Improving Primary Schools through School-Based Management and Community Participation. Jakarta, Indonesia: UNICEF, UNESCO, Government of Indonesia.

Ministry of National Education, 2005a, Paket Pelatihan 1: Peningkatan Mutu Pendidikan Dasar Melalui Manajemen Berbasis Sekolah, Peran Serta Masyarakat, Pembelajaran Aktif, Kreatif, Efektif dan Menyenangkan. Downloaded from http://mbeproject.net/download.html.

Ministry of National Education, 2005b, Paket Pelatihan 2: Peningkatan Mutu Pendidikan Dasar Melalui Manajemen Berbasis Sekolah, Peran Serta Masyarakat, Pembelajaran Aktif, Kreatif, Efektif dan Menyenangkan. Downloaded from http://mbeproject.net/download.html.

Ministry of National Education, 2006a, Paket Pelatihan 3: Peningkatan Mutu Pendidikan Dasar Melalui Manajemen Berbasis Sekolah, Peran Serta Masyarakat, Pembelajaran Aktif, Kreatif, Efektif dan Menyenangkan. Downloaded from http://mbeproject.net/download.html.

Ministry of National Education, 2006b, Paket Pelatihan 4: Peningkatan Mutu Pendidikan Dasar Melalui Manajemen Berbasis Sekolah, Peran Serta Masyarakat, Pembelajaran Aktif, Kreatif, Efektif dan Menyenangkan. Downloaded from http://mbeproject.net/download.html.

Tape, S. and B. Irianto, 2010, Petunjuk Teknis Implementasi Manajemen Sekolah, PAKEM, dan Peran Serta Masyarakat Melalui Gugus Sekolah. Jakarta, Indonesia: Ministry of National Education, UNICEF, UNESCO.

