Report No. 66069
South Asia Human Development Sector, 2011
Discussion Paper Series, Report No. 35
South Asia: Human Development Unit

An Impact Evaluation of Sri Lanka's Policies to Promote the Academic Performance of Primary School Students through School Improvement and Report Card Programs

May 2011

Discussion Paper Series

Discussion Papers are published to communicate the results of the World Bank's work to the development community with the least possible delay. The typescript of this paper therefore has not been prepared in accordance with the procedures appropriate to formally edited texts. Some sources cited in the paper may be informal documents that are not readily available. The findings, interpretations, and conclusions expressed herein do not necessarily reflect the views of the International Bank for Reconstruction and Development / The World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of the World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

Table of Contents

Acknowledgements
List of Acronyms
Introduction
Section One: School-Based Management
Section Two: The Programme for School Improvement
Section Three: The Analytical Framework
Section Four: Findings and Results
Section Five: Conclusions
Section Six: Future Development of the PSI and the SRCP for School Improvement

List of Tables (in Text)

Table 1: Pilot Zones of the PSI Program (2006-2007)
Table 2: Standardized Test Scores by Year and Treatment Group – Grade IV
Table 3: Grade IV Test Score and Household Education Expenditures (School-Level Fixed Effects Estimation with Clustered Standard Errors)
Table 4: Grade IV Teacher Variables (School-Level Fixed Effects Estimation with Clustered Standard Errors)
Table 5: Grade IV Principals' Management of School Needs Variables (Probit Estimation with Clustered Standard Errors)

List of Boxes

Box 1: Focus Group Discussions with Stakeholders
Acknowledgements

The team acknowledges with sincere gratitude the assistance of several World Bank colleagues and Sri Lankan counterparts in the preparation of this impact evaluation. In particular: Diaretou Gaye (Country Director for Sri Lanka and the Maldives), Naoko Ishii (former Country Director for Sri Lanka and the Maldives), Michal Rutkowski (Sector Director, Human Development), and Amit Dar (Education Sector Manager); a number of government officials, particularly Mr. H.M. Gunasekera (Secretary, Ministry of Education), Mr. S.U. Wijeratne (Additional Secretary, Planning and Performance Review), Mr. Hemantha Premkumara (Additional Secretary, Education Quality Development), Ms. Madura Wehella (Director, Planning), Dr. Jayantha Balasuriya (Deputy Director, Planning), Ms. Kamani Perera (Director of Social Sciences and Humanities, School Activities Branch), Mr. Salahuddin (Deputy Director, School Activities Branch), Mr. Ariyaratane Hewage (Chairman, Finance Commission), Mr. Asoka Gunewardena (former Chairman, Finance Commission), and Mr. W.H. Munasinghe (Secretary, Finance Commission); several officials of the Finance Commission, including Mr. Mahinda Gammampila, Mr. Sisira Liyanage, Mr. Phillip Senaratne and Mr. Wilfred Perera; and officials from the Provinces, who were all helpful at various stages of the study. The National Education Research and Evaluation Centre of the University of Colombo undertook the surveys for the evaluation. The peer reviewers were Harry Patrinos, Lead Education Economist, HDNED, and Rajendra Joshi, Senior Education Specialist, AFTED. The study was financed by the World Bank and the Education Program Development Fund (EPDF) of the Education for All: Fast Track Initiative.

List of Acronyms

ESDFP   Education Sector Development Framework and Programme
ESDP    Education Sector Development Project
GCE A/L General Certificate of Education Advanced Level
GCE O/L General Certificate of Education Ordinary Level
NEREC   National Education Research and Evaluation Centre
PSI     Programme for School Improvement
SDC     School Development Committee
SMT     School Management Team
SRCP    School Report Card Programme
UNESCO  United Nations Educational, Scientific and Cultural Organization

Executive Summary

The Government of Sri Lanka's Education Sector Development Framework and Programme (ESDFP) initiated a major development innovation for primary and secondary education for the period 2006-2010. The strategy was organized around four key themes: improving equitable access to basic and secondary education; improving the quality of basic and secondary education; enhancing the economic efficiency and equity of resource allocation; and strengthening service delivery (Ministry of Education, 2007). Under each theme there were a number of key development initiatives. Among these, the Programme for School Improvement, through which the government sought to introduce school-based management, constituted an important innovation. A smaller intervention, the School Report Card Programme, was also introduced to inform schools on their performance.

The Programme for School Improvement (PSI)

The PSI was designed to bring about radical change in the culture of schools through the establishment of management structures and the provision of training and support services in which: (a) the participation of parents and the community in the work of the school was increased; and (b) the quality of student learning became a major focus.
More specifically, the PSI was designed to achieve:

a) Active involvement of the school community (parents, teachers, and past pupils) in the running of the school.
b) Planned development of the school.
c) Effective utilization of resources.
d) Improved performance in curricular and co-curricular activities through cooperation between schools and communities.
e) Creation of congruence between staff training and school needs.
f) Strengthening of school-community relationships.
g) Entrusting responsibility for the school to the School Development Committee, thus ensuring accountability (Ministry of Education, 2005).

The School Report Card Programme (SRCP)

The SRCP was implemented on a relatively small scale at the same time that the PSI was introduced. The premise was that the school community—principals, teachers, parents, and students—should receive regular information on their school's performance through a "report card" to enable schools to improve their performance, either by stimulating low-performance school communities to action, or by encouraging high-performance schools to strive even harder. The "report cards" were to be filled out by school personnel at the end of the school year and distributed to parents and School Development Committee (SDC) members. These progress reports contained basic institutional information, and some record of teacher and student performance such as average teacher/student attendance, failure and dropout rates, school pass percentages on the Grade 5, General Certificate of Education Ordinary Level and Advanced Level (GCE O-Level and GCE A-Level) examinations, and funds available to—and activities of—the school's SDC.

Evaluation Design

The PSI began implementation in 2006 in all schools in the eight[1] selected 'education zones' in Sri Lanka. The SRCP was implemented on a pilot basis in selected schools in the eight zones, as well as in eight other education zones. The two initiatives were designed to allow researchers to estimate programmatic impact on students' educational outcomes. To this end, our sample of 200 schools was divided into four sets of 50 schools each. One group consisted of schools in which the PSI was to be implemented. The second group comprised schools in which the SRCP was to be implemented, and the third group, in which both programs were to be implemented. The fourth cluster served as the "control" group of schools, in which neither program was implemented. The division of these 200 schools into the four groups was made in a purely randomized manner, which facilitates estimation of the impact of the two programs. Baseline data[2] were collected in 2006 for all 200 schools before either program was implemented, and follow-up data were collected in 2008, two years after program implementation.

[1] In 2006, there were only 8 provinces in Sri Lanka. Following a court decision, a 9th province came into being in 2007.
[2] Baseline data is basic information gathered on a program before its commencement. It is used later on to provide a comparison for assessing program impact.

The objective of this study was to gauge the impact of the Education Sector Development Framework and Programme's (ESDFP) two significant interventions—the PSI and the SRCP—on the educational outcomes of Grade IV Sri Lankan students, with special emphasis on English language and Mathematics performance. It further sought to compare performance in each treatment group with that of the control group. As per the evaluation design, our sample was divided into the above-mentioned four groupings of schools. At the commencement of the study, the students and schools in the four groups exhibited similar characteristics and learning outcome levels.
It should be noted here that our research design incorporated controls for student, family background, and school characteristics in such a way that, if student learning outcomes in the different groups of schools were significantly different from those of the control group at the time of the final survey, this difference could be attributed to the impact of the relevant program.

Results and Findings

Overall, our findings illustrate that the schools in which the PSI was implemented performed well in terms of improving the cognitive abilities of their primary school students. This is an encouraging and positive finding. Discussions with stakeholders (education officials, school principals, teachers, parents and students) suggested that a range of processes—including better teacher and parental involvement with the children, both at school and in the home—are likely to have contributed to this outcome.

In the local communities, stakeholders, including parents, past pupils and well-wishers, involved themselves closely with various aspects of school administration through school development committees (SDCs). This presented the schools with the additional management support critical to improving the learning potential of their pupils. Additionally, the committees were proactive in resource mobilization for school development projects, over and above the funds received from provincial councils or the central government. To improve the English language skills of the children, the committees helped to build a stock of children's books, and facilitated theater and musical events. For mathematical skills development, the committees implemented numeracy-advancing games and activities with a special focus on the fun element, important for engaging primary school children.

The formation of SDCs provided the schools with a sense of order and method. The committees' regular meetings and continual interactions with the school management teams contributed to enhanced management of the schools, particularly with regard to the goal of promoting learning. In poorer communities, it was observed that the parents of children also benefited from the school-community interactions. Parents came to appreciate the importance of facilitating activities at home to enable their children to study and learn better. For instance, they learned to set aside dedicated study time for their children. Further, it was noticed that parents whose children studied in schools that had school development committees helped their children with learning activities more than parents in schools that did not have them.

The SRCP, however, was not as successful as the PSI. The results do not show a statistically significant effect of the SRCP on school performance. This may partly be due to the relatively low weight accorded to the SRCP in relation to the PSI by policy makers. The PSI was viewed as a flagship program, and was strongly supported by both the central government and provincial councils. In contrast, the SRCP was put into operation mainly for the purposes of the evaluation. Little effort was made to enable schools to use the SRCP's mechanisms to obtain information about schools and to seek an improvement in institutional performance.
Future Development and Expansion of the PSI and the SRCP

Considerable effort and resources have been invested in recent years in the promotion of the decentralization of decision making, and in increasing parental and community involvement in the education system, through the PSI. The findings of this impact evaluation reveal that the PSI has had a positive and significant impact on increasing local community participation in school administration, the implementation of school development projects through resources raised from local communities, and the cognitive achievement levels of primary school students. The following steps now seem appropriate to extend and consolidate the reforms supported by the PSI.

• Expand the PSI to all schools in the country.
• Consolidate the PSI in schools where structures have been established, but activity is low. This may require greater clarity in specification of roles, capacity building, and continuing personal and monetary support for schools.
• Support the development of capacity to exercise real devolution, not just decentralization or delegation. This may involve reforms to make teachers more accountable to local school communities, for such matters as school attendance. Eventually, teacher recruitment, currently centralized, could be devolved to school level.
• Review regulations regarding School Development Committees and School Management Teams to establish their appropriateness for all types and sizes of school.
• Empower lower levels of governance with clearly defined functions that do not overlap with higher levels. This will involve strengthening zonal capacity through training for advisers and networking of principals.
• Particular attention should be paid to schools serving children in socioeconomically disadvantaged areas, as effective governance is particularly important for the marginalized and disadvantaged. Additional funding may be required which takes account of school size, level of schooling, special education needs, location and type of school.
• Support schools in developing parent involvement, which to date has been largely restricted to distal activities (attending meetings, involvement in school committees, engagement in voluntary work, or making financial contributions to maintain or improve physical conditions, resources, and services), into a program in which attention is paid to proximal activities. The program should focus on (a) developing parents' understanding that the home environment has a profound impact on the school learning of children and that they have the power to change it; (b) developing parents' self-confidence and sense of efficacy in establishing a home environment that will provide rich learning experiences for children; and (c) demonstrating specific behaviors that parents can use (e.g., how to interact with pupils regarding homework, or having pupils read to them).
• Extend the involvement of communities to contribute to the development of "competencies for life", so-called "soft skills" or generic skills which are necessary for effective functioning in personal life, interpersonal relationships, and employment/economic activities (critical and divergent thinking, problem solving, creativity, initiative, leadership, responsibility, teamwork). Experience in community activities is often more relevant and appropriate in developing these skills than school-based experience, which is often preoccupied with covering syllabuses and preparing students for examinations.
Community activities also provide opportunities to develop social cohesion through learning to live with others in harmony, respecting the diversity of a multi-ethnic, multi-religious and multi-cultural society (see Ministry of Education, 2004b).

The School Report Card Programme

The SRCP in its present form has demonstrated little impact on school performance. This is not surprising, as it has received scant attention from policy makers. The SRCP could be revised so that the information contained in it is used by the SDCs and government authorities to improve schools. As a first step, the SDCs and local governments would need to be trained in the judicious use of school report cards. Once the SRCP is implemented as a full-fledged program, its impact could be carefully studied. It is entirely possible that mainstreaming the SRCP in the government reform program would result in positive effects at the school level.

Introduction

The Government of Sri Lanka's Education Sector Development Framework and Programme (ESDFP) initiated a major development innovation for primary and secondary education for the period 2006-2010. The strategy was organized around four key themes: improving equitable access to basic and secondary education; improving the quality of basic and secondary education; enhancing the economic efficiency and equity of resource allocation; and strengthening service delivery (Ministry of Education, 2007). Under each theme there were a number of key development initiatives. Among these, the Programme for School Improvement, through which the government sought to introduce school-based management, constituted an important innovation. A smaller intervention, the School Report Card Programme, was also introduced to inform schools on their performance.

The World Bank supported the ESDFP through a programmatic sector-wide operation, the Education Sector Development Project (ESDP), and a range of analytical activities on key initiatives. The PSI was a central element of the World Bank's support under the ESDP. The World Bank undertook an impact evaluation of the PSI, as part of its analytical assistance to this program. This report presents the findings of this assessment, and comprises six sections. First, there is a concise description of school-based management. Second, there is a discussion of the PSI and the SRCP in Sri Lanka. Third, the analytical framework of the impact evaluation is described. Fourth, the findings and results of the analysis are discussed. Fifth, conclusions are drawn. Sixth and finally, a set of recommendations is presented for the future development of the PSI and the SRCP.

Section One: School-Based Management

Rationale for School-Based Management

A variety of reasons have been posited by policy makers and practitioners in support of school-based management:

a) School-based management is democratic, as it engenders power distribution between the various education partners when regulating institutional and individual behavior, and in funds allocation. When parents and community members are involved, it contributes to their empowerment.
b) School-based management facilitates the recognition of, and responsiveness to, local needs. Large bureaucracies tend to overlook peripheral needs and ignore ethnic, linguistic, and regional cultural variation, while school-based management allows local decision makers to adapt education policies to local realities and to determine the appropriate mix of inputs.
c) School-based management has the potential to lead to more effective educational delivery and prudent use of school, local, and regional level resources. This view, in part, reflects the business concept of total quality management, according to which decisions made close to the actual product will produce better results. Research suggests that local management is most appropriate in business organizations where the work is complex and is carried out in a continually changing environment, and where there is uncertainty in its day-to-day tasks, all of which characterize the teaching-learning situation (Wohlstetter & Mohrman, 1993). It should, for example, be possible for local actors to work more effectively than a central authority in mobilizing local resources (including from private parties), in improving inter-agency cooperation, and in integrating services.
d) School-based management should lead to improved communication between stakeholders, and facilitate principals' awareness of teacher and parent concerns.
e) School-based management should result in greater accountability of schools and teachers to their pupils, parents, and local communities.
f) School-based management is more transparent, significantly reducing opportunities for corruption.
g) School-based management provides for group decision-making, which tends to be more considered than decisions made by individuals.
h) School-based management contributes to the development of high levels of professionalism in schools.
i) School-based management should ultimately lead to improved student retention and learning. This was not a focus in the early reform period, when school-based management was interpreted to mean political activities that transferred power and authority to individual schools over their budgets, personnel, and curriculum. This changed in the late 1990s, when overall school improvement became a major objective of school-based management initiatives.
j) Training (when provided) for parents and other stakeholders in shared decision-making, interpersonal skills, and management proficiency can benefit the community as a whole.
k) The development of school-based management is relatively inexpensive, as it involves a change in the locus of decision-making rather than a large increase in resources (Abu-Duhou, 1999; Caldwell, 2005; World Bank, 2006, 2007, 2008).

School-based management—and indeed other forms of decentralization—has also been associated with a number of possible disadvantages. Notably, it can have adverse effects on equity. It may, for example, result in disparities between schools in economically advantaged and disadvantaged areas with regard to resource availability (including the capacity to manage). According to Bray (2001), "A major question for policy makers concerns ways to harness the resources and energies of prosperous communities while protecting and encouraging their less prosperous counterparts" (p. 3). Within communities, better functioning families may take up opportunities to become involved in school activities, leaving the neediest even more excluded (Corter & Pelletier, 2005). Thus, it becomes imperative to monitor the impact of decentralized management on income and social groups, and to identify measures to mitigate possible adverse effects. This, in turn, points to the need for a central authority to retain its power to implement policies that discriminate in favor of the areas most in need.
Secondly, there is the danger of unnecessary duplication in a decentralized system (e.g., in policies to address the special needs of students with disabilities). To get around this issue, it will be important to specify decision-making categories that are most appropriately dealt with at the national level.

Thirdly, it may take longer in a decentralized system to implement certain types of reforms and innovations. Forms of decentralization that involve intermediate tiers can create problems for schools as sources of conflict multiply and bureaucracy increases (Perera, 2000).

Fourthly, school-based management is vulnerable to elite capture. For instance, aggressive and well-connected parents may make use of their influence on school boards or committees to further the interests of their own children at the expense of other children.

Characteristics of School-Based Management

School-based management systems vary across a number of characteristics (Abu-Duhou, 1999; Caldwell, 2002, 2005; Cheng, 1996; Hill, Smith, & Spinks, 1990; World Bank, 2006, 2007). Only the first two items outlined in the bullet points below apply to all systems.

• The school has the authority and responsibility to make decisions on one or more of the following:
a) Use, maintenance, and improvement of the school building.
b) Intended curriculum (range of subjects taught, syllabus content).
c) Implemented curriculum (methods of instruction; choice of textbooks).
d) Budget/expenditure.
e) Procurement of educational materials.
f) Management (deployment of teachers, assigning students to classes, school calendar, classroom hours).
g) Human resources (employment, remuneration, and conditions of employment of teachers and other staff); professional development.
h) Admission, suspension, and expulsion of students.
i) Monitoring and evaluation of student performance (judgment of student achievement/failure; certification of student achievement).
j) Quality assurance (supervision and evaluation of teacher performance).
k) Publication of information about a school's performance.
• School decision-making is carried out within a centrally determined framework of goals and policies. School-level actors have to conform to, or operate within, a set of centrally determined policies and procedures.
• An internal school management group comprising the principal, teachers and, in some cases, students, is constituted either: (a) to advise the principal; or (b) to take decisions.
• Parents and other community members are provided with the opportunity of participating in school management, planning, and development, usually through the creation of a school council. Again, the council may: (a) advise the principal; or (b) take decisions.
• School principals and teachers are considered accountable to: (a) education authorities for adhering to policy and rules; (b) their peers for adhering to standards of instruction; and (c) students, parents, and the general public for student achievement. There is great variation in accountability systems. In some, information on student achievement is published in league tables, and sanctions or rewards—including monetary rewards—are attached to school performance. The use of monetary rewards, however, has proved controversial, and in most cases, has not lasted very long. Furthermore, rewarding successful schools at the expense of increasing resources to schools that are failing would not contribute to overall school improvement.
Non-monetary rewards (working in an environment conducive to learning, seeing positive results in student learning, or responding to parent pressure) can be motivating.
• School-based management may be implemented in conjunction with other reforms. It is not unusual to regard such management as only one of several strategies designed to improve student learning.

Section Two: The Programme for School Improvement

In 2006, the Ministry of Education embarked on the Programme for School Improvement (Ministry of Education, 2007). The objective of the PSI was for schools to become increasingly empowered through strong community involvement in school management. The PSI was designed to bring about a radical change in the culture of schools through the establishment of management structures and the provision of training and support services in which: (a) the participation of parents and community in the work of the school was increased; and (b) the quality of student learning became a major focus. More specifically, the PSI was designed to achieve:

i) Active involvement of the school community (parents, teachers, and past pupils) in the running of the school.
ii) Planned development of the school.
iii) Effective utilization of resources.
iv) Improved performance in curricular and co-curricular activities through cooperation between schools and communities.
v) Creation of congruence between staff training and school needs.
vi) Strengthening of school-community relationships.
vii) Entrusting responsibility for the school to the School Development Committee, thus ensuring accountability (Ministry of Education, 2005).

Each school was required to set up two bodies: a School Development Committee (SDC) and a School Management Team (SMT). The SDC consists of the principal (as Chair), a deputy principal, and representatives of teachers, parents, past pupils, and the Education Authority. The number of teachers, parents and past pupils varies from three to five members from each category, depending on the size of the school. Teachers, parents, and past pupil representatives are elected members. The SDC is expected to meet at least once a month during its three-year term of office. The committee is charged with preparing a five-year school development plan based on the Manual of Instruction for School Level Planning (Ministry of Education, 2004a) and an annual implementation plan. The school development and implementation plan should: (a) address student access and participation; (b) focus on improving student achievement; and (c) attend to the school plant and physical resources (Ministry of Education, 2004b). Grants are provided to enable implementation of the planned activities.

The SDC has the power to undertake projects and make purchases (to a maximum value of Rs. 200,000). It is required to prepare an annual budget and monthly and annual financial statements; operate a bank account; be responsible for the maintenance and development of the school plan; and be accountable to the relevant authorities and to the school community.

The SMT, which is established within the school, consists of all the school staff members of the SDC, the other Deputy Principals, Assistant Principals, and Sectional Heads. The SMT should work closely with the SDC and, following consultation, may appoint subcommittees.
The consultation process between the SDC, which consists of selected school officials and elected stakeholders in the school, and the SMT, which consists only of school officials, is mediated through the principal, who chairs both the SDC and the SMT. The consultations vary by school type, and are flexible. For instance, in wealthy, educated urban communities SMTs may decide to approach the SDCs with an appeal for funds for educational activities, or may ask the SDCs to assist with the organization of activities such as literary festivals, theater productions, and musical and cultural events. In poorer, less educated rural communities the SMTs may approach the SDC to instruct parents on the importance of setting aside time for their children to study on a daily basis, or to provide labor to clean the school premises. The interactions between the SDCs and SMTs vary according to the school circumstances and the characteristics of the school stakeholders.

The PSI was launched in one zone in each of eight provinces in 2006. Additional zones were added in each succeeding year. By 2009, a total of 5,222 schools were participating in the scheme. Over the years, the program has shown school-to-school variation in its interpretation and implementation. A high degree of program operationalization, defined as application of the powers given by the PSI circulars, was associated with:

a) Strong commitment of the principal and other teachers to the values of the PSI.
b) Prepared school plans, and monitored implementation.
c) A wide range of extra-curricular activities provided for pupils (e.g., dance, music, sport, gardening).
d) Increased community involvement in the school.
e) Regular (monthly) meetings between teachers and parents to monitor and discuss the progress of individual pupils.
f) Regular visits to the schools by zonal and divisional officers to participate in committees, to advise on teaching methods, and to assist in development and implementation of the school plan.
g) A shift in teachers' mindsets from inputs (e.g., resources) to the quality of student learning.

In schools in which implementation of the PSI was less successful, structures might have been established but it is highly likely these did not operate effectively to promote the initiative's objectives. In such cases, observations in—and reports from—schools, together with limited evaluation findings (Dias, 2008; Gunasekara et al., 2010; Kularatne, 2008) point to:

a) Inadequate commitment to, or interest in, meeting the challenges that the program had been designed to address.
b) Efforts to implement management changes that were not sufficiently grounded in institutional political analysis.
c) Ambiguities in the responsibilities specified for different levels.
d) Inadequate funds.
e) Lack of support to schools in helping them understand the PSI messaging and how that might be translated into action in their schools. There is some evidence that this situation may arise from the inability of zonal officers, often because of lack of time, to engage in meaningful collaboration with schools.
f) Failure to connect the PSI with curriculum and instructional reforms and, in particular, with student learning outcomes.
g) Reluctance of some administrators and teachers to authorize others to take over decision-making.
h) Additional management roles and responsibilities that were not always welcomed in schools.
i) Lack of stakeholders' knowledge of what school-based management is, and how it works.
j) A tradition of weak management, decision-making, and communication skills in a school.
k) Problems in getting full participation in meetings.
l) Lack of support from parents and the community.
m) Lack of a culture of accountability within a community (no one would question the actions of school teachers, for example) (see World Bank, 2007).

A variety of measures are in place to address the problems associated with low or marginal levels of implementation:

a) PSI committees have been established in all zones.
b) Technical Assistants have been appointed in all provinces to support schools.
c) Meetings have been held between Ministry officials, Zonal Officers, and Technical Assistants to review progress.
d) Training of SDC personnel has been carried out at the provincial level.
e) Schools serving pupils from disadvantaged areas that need additional support or assistance in implementing the PSI have been identified.
f) "Seed grants" have been given to "difficult" and "very difficult" schools (Ministry of Education, 2008).

The School Report Card Programme

The SRCP was implemented on a relatively small scale at the same time that the PSI was introduced. The premise was that the school community—principals, teachers, parents, and students—should receive regular information on their school's performance through a "report card" to enable schools to improve their performance, either by stimulating low-performance school communities to action, or by encouraging high-performance schools to strive even harder. The "report cards" were to be filled out by school personnel at the end of the school year and distributed to parents and School Development Committee (SDC) members. These progress reports contained basic institutional information, and some record of teacher and student performance such as average teacher/student attendance, failure and dropout rates, school pass percentages on the Grade 5, General Certificate of Education Ordinary Level and Advanced Level (GCE O-Level and GCE A-Level) examinations, and funds available to—and activities of—the school's SDC.

Section Three: The Analytical Framework

The PSI began implementation in 2006 in all schools in the eight selected education zones in Sri Lanka. The SRCP was implemented on a pilot basis in selected schools in the eight zones, as well as in eight other education zones. The two initiatives were designed to allow researchers to estimate programmatic impact on students' educational outcomes. To this end, our sample of 200 schools was divided into four sets of 50 schools each. One group consisted of schools in which the PSI was to be implemented. The second group comprised schools in which the SRCP was to be implemented, and the third group, in which both programs were to be implemented. The fourth cluster served as the "control" group of schools, in which neither program was implemented. The division of these 200 schools into the four groups was made in a purely randomized manner, which facilitates estimation of the impact of the two programs. Baseline data were collected in 2006 for all 200 schools before either program was implemented, and follow-up data were collected in 2008, two years after program implementation.

The purpose of our analysis was to estimate the impact of the PSI program and the SRCP on the educational outcomes of Sri Lankan students studying in Grade IV.
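For readers who find the design easier to see in code, the two-by-two assignment described above can be sketched as follows. The school identifiers and random seed are hypothetical placeholders, not the study's actual assignment procedure.

    import random

    # Illustrative sketch of the 2 x 2 evaluation design: 200 sampled schools
    # are divided at random into four groups of 50 schools each.
    random.seed(2006)  # any fixed seed; illustrative only

    schools = [f"school_{i:03d}" for i in range(1, 201)]  # placeholder identifiers
    random.shuffle(schools)

    groups = {
        "PSI only": schools[0:50],
        "SRCP only": schools[50:100],
        "PSI and SRCP": schools[100:150],
        "Control": schools[150:200],
    }

    for name, members in groups.items():
        print(name, len(members))  # each group holds exactly 50 schools

Because every sampled school has the same chance of landing in each group, differences that later emerge between the groups can be attributed to the programs rather than to pre-existing school characteristics.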
Table 1: Pilot Zones of the PSI Program (2006-2007)

Province        Pilot Zone (2006)   District        Schools
Western         Colombo             Colombo         125
Central         Wattegama           Kandy            82
Southern        Ambalangoda         Galle            82
North-Western   Chilaw              Kurunegala      158
Northern        Vavuniya-South      Vavuniya         97
Eastern         --                  --               --
North-Central   Tambuttegama        Anuradhapura     69
Uva             Wellawaya           Monaragala       87
Sabaragamuwa    Kegalle             Kegalle         163

Note: In 2006, the Northern and Eastern Provinces were merged, and the Vavuniya Zone was selected from the North-Eastern Province. In 2007, the Northern and Eastern Provinces were de-merged and set up as separate provinces. Hence, separate zones were selected from each of these provinces.

Selection of the Four Groups of Schools

This analysis is based on data from 200 schools: 50 schools in which the PSI was implemented in 2006; another 50 in which the SRCP was actioned; 50 schools in which both programs were implemented; and 50 "control" schools that were not selected for either program. This section explains how all four sets of schools were chosen.

As explained above, Sri Lanka comprises nine provinces (although for a short time in 2006, two were merged, so that there were only eight provinces). Each province is further divided into districts, of which there are 25, and each district is sub-divided into education zones. There are a total of 93 education zones in the country. The research team selected one district in each of Sri Lanka's nine provinces, and the 200 schools covered in this paper are from those districts. Within each selected district, one education zone was selected by the government in March/April 2006 to implement the PSI, and all schools in that zone implemented the PSI later that year, starting around July. The baseline data, collected in March 2006, were intended to reflect school outcomes at the end of the 2005 school year. Therefore, data collection was done when the Grade IV students were in the first semester of Grade V.[3] Since all the schools in the selected education zones implemented the PSI in 2006, all the "control schools" had to be selected from other education zones, but always from the same district.[4]

In each province, the sole education zone to implement the PSI was chosen by Sri Lanka's Ministry of Education in consultation with provincial authorities—in a somewhat ad hoc manner—perhaps due to political considerations, as opposed to random selection using standard sampling methods. After the government shortlisted the education zone in each province for implementing the PSI, 11 to 12 schools were randomly selected for the purposes of this study from among all PSI-instituted schools in each selected zone. This random selection of schools within education zones was done in a stratified manner; in each zone schools were classified according to the four types of schools in Sri Lanka (see footnote [5]) and in each of these four strata, schools were selected with an equal probability.

The assessment team initially shortlisted 100 schools implementing the PSI from all nine provinces. This was then whittled down to 50 (see the following paragraph for an explanation of how and why this was done). One hundred control schools were shortlisted from the same districts, but from different education zones, as explained above. While these schools were not arrived at in a strictly random fashion, they were selected to match as closely as possible—given observable characteristics—the 100 PSI schools. More specifically, each PSI school was compared with all non-PSI schools in the same district (but in a different education zone), all with the same "level" and "race"[5] characteristics. Of the non-PSI schools, the one with an enrolment level closest to that of the PSI school was selected as the "match."[6] Finally, 50 of the 100 PSI schools, and 50 of the 100 control schools, were randomly chosen to participate in the study.
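The nearest-enrolment matching step just described can be sketched as follows. The field names are hypothetical and the snippet is illustrative rather than the study's actual matching code.

    def find_match(psi_school, candidate_schools):
        """Return the non-PSI school most similar to a given PSI school.

        Candidates are restricted to schools in the same district but a
        different education zone, with the same 'level' and 'race' category;
        among those, the school with the closest enrolment is chosen.
        Field names are illustrative, not the study's actual data layout.
        """
        candidates = [
            c for c in candidate_schools
            if c["district"] == psi_school["district"]
            and c["zone"] != psi_school["zone"]
            and c["level"] == psi_school["level"]
            and c["race"] == psi_school["race"]
        ]
        return min(candidates,
                   key=lambda c: abs(c["enrolment"] - psi_school["enrolment"]))

Applying such a rule to each of the 100 shortlisted PSI schools yields the pool of matched control candidates from which the 50 control schools were ultimately drawn.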
Data Collection: 2006 and 2008

A range of information was collected in 2006 and 2008 from the 150 schools that implemented the PSI or the SRCP (or both), and from the 50 control schools. The data was derived from academic tests administered to students in Grade IV of those schools, as well as to their teachers, and from a set of questionnaires that were administered to students, teachers, section heads, school principals, zone directors, and in-service advisors.[7] Within each school, up to 20 students of Grade IV were randomly chosen to take the exams and complete the questionnaires; if a school had fewer than 20 students, all pupils participated in the study.

[3] Sri Lanka's school year runs from January to December.
[4] The education zones chosen for implementing the PSI in 2007 were in different districts from the education zones chosen for the PSI in 2006, so none of the control zones chosen in 2006 were selected for the PSI in 2007. Similarly, 16 additional zones implemented the PSI in 2008, and another 8 did so in 2009, but none of these 24 zones are the control zones selected in 2006.
[5] All Sri Lankan primary and secondary schools are divided into four types: 1AB, 1C, 2 and 3. Type 1AB schools teach Grades I-XIII and offer all three curriculum streams (arts, commerce, and science). Type 1C schools also teach Grades I-XIII but offer only two streams (arts and commerce). Type 2 schools offer only Grades I-XI, and small Type 3 schools offer only Grades I-V or I-VIII. There are five types of school "levels": very congenial, congenial, uncongenial, difficult, and very difficult. Finally, there are three race "categories" in Sri Lanka: Singhalese, Tamil, and Muslim.
[6] All "national" and "Divisional Secretariat Division" (DSD) schools were excluded from the sample because the Ministry of Education decided to implement the PSI in all national and DSD schools. Therefore, there is no control group for those schools. National schools, which make up about 3% of Sri Lanka's schools, are run directly by the national government, while provincial governments run other schools. DSD schools comprise another 3% of schools that have recently been designated by the central government for a major improvement in their physical quality.
[7] In Sri Lanka, about half of the schools are in effect combinations of primary and secondary schools, and so have Grades from I-XIII. These schools have one Principal, but are also divided into primary and secondary "sections", each of which has a "section head." Education zones also have in-service advisors (school inspectors), who visit schools to supervise and provide support for teachers.

Table A1 in Annex One summarizes the type of information available from these 200 schools. The variables shown in that table are those that are most directly related to learning and academic performance. Table A2 in Annex One compares the descriptive statistics of the main explanatory variables among child and teacher characteristics.
According to the information in Table A2, the differences between the four samples on these key variables were not statistically significant at the time the study commenced.

The students were administered academic tests in both rounds, and the study was also able to gather a significant amount of data on student characteristics. In each round, the fourth graders were administered academic tests of their skills in English and Mathematics: in March 2006 (when the Grade IV students were in the first semester of Grade V), and, for a new set of Grade IV students, in October 2008 (again, the students were tested after they had entered the next grade). The tests were designed by the University of Colombo's National Education Research and Evaluation Centre. For each grade and each year, the test scores for each student were standardized by subtracting the mean, and then dividing by the standard deviation of the control group. Thus, the control group test scores have a mean of zero and a standard deviation of one for each subject, grade, and year. Table 2 shows the average test scores, by year and treatment group.

Table 2: Standardized Test Scores by Year and Treatment Group – Grade IV

                          2006                        2008
Math                  N      Mean     SD          N      Mean     SD
PSI and Report Card   788   -0.105    1.041       751   -0.173    1.164
PSI Only              687   -0.286    1.080       654   -0.138    1.080
Report Card Only      756   -0.104    1.031       703   -0.233    1.247
Control               696    0        1           673    0        1

English               N      Mean     SD          N      Mean     SD
PSI and Report Card   787   -0.108    1.048       750   -0.143    1.007
PSI Only              669   -0.285    0.979       659   -0.158    0.978
Report Card Only      752    0.060    1.050       708   -0.094    1.006
Control               699    0        1           663    0        1

First Language        N      Mean     SD          N      Mean     SD
PSI and Report Card   784   -0.256    1.012       751   -0.590    1.192
PSI Only              688   -0.258    1.026       641   -0.500    1.117
Report Card Only      755   -0.210    1.030       707   -0.590    1.396
Control               699    0        1           673    0        1

Note: The sample is restricted to include only those schools that were in both rounds of data collection.
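Stated compactly (the notation here is ours, not the report's), the standardization applied to each score is

    z_{isgt} = (x_{isgt} - \mu_{sgt}^{control}) / \sigma_{sgt}^{control}

where x_{isgt} is the raw score of student i in subject s, grade g, and year t, and \mu_{sgt}^{control} and \sigma_{sgt}^{control} are the mean and standard deviation of control-group scores for that subject, grade, and year. By construction the control group has mean zero and standard deviation one in every cell, so the treatment-group means in Table 2 can be read directly as differences from the control group in standard deviation units.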
The teacher questionnaire collects information on each teacher's personal characteristics (age; sex; living accommodation; distance from, and commuting time to, school; education; and work experience), the teacher's classroom (materials received; sufficiency of supplies; and number of students), support provided by education administrators, teaching methods used, and teachers' opinions on the official syllabus and curriculum. In 2006, two Grade IV teachers were surveyed from each school—the class teacher and the English teacher. The data was averaged across all surveyed teachers in each school. In 2008, only the class teacher was surveyed. The key teacher variables used in the analysis are summarized in Table A2 of Annex One.

Additional information on students is available from the student and parent questionnaires. The student questionnaire collates basic demographic information (age, sex, ethnicity, religion, number of siblings), language(s) spoken at home, food availability at home, transportation mode and travel time to school, availability of a desk and chair at school, accessibility of textbooks, workbooks and exercise books at school (English, First Language, Math, and Science), attitudes about school, time spent doing various activities at home (watching TV, listening to the radio, reading), and whether any grades were repeated.

The parent questionnaire collects basic demographic information on both parents, some description of the home (type of building, ownership, utilities, and ownership of durable goods), educational levels of parents and of the child's siblings, parents' occupation, household income, spending on educational items for the child, availability of reading materials in the home, time spent by the child in various activities, parent participation at school and in helping the child with schoolwork, student participation in tuition (tutoring) classes, and educational aspirations for their child. In 2006, one parent for each student was surveyed. In 2008, data was collected from both parents, although in many cases, only one parent responded. Table A3 in Annex One describes the key variables used in the regression analysis, including control variables for student gender and race, parent's educational level, and household income and expenditures on the child's education across the sample for the two years.

The principal's questionnaire also begins by requesting personal information, followed by questions on the teaching staff, school facilities, financial resources, opinions on various education issues, management training and practices, the activities and composition of the SDC,[8] and some information on student performance on recent national tests. Table A4 in Annex One describes several groups of variables used in the regression analysis below, including the principal's management of teachers, the principal's plans and assessment of school needs, and the funding received from different sources for the school.

The questionnaire for section heads also gathers personal information, and inquires about facilities and teaching supplies, pedagogical practices, methods to evaluate teachers, satisfaction with the current teachers, and opinions on new educational policies, the current curriculum and other matters pertaining to schooling. Data were also collected from an in-service advisor and zonal education director. Finally, teacher tests were administered to teachers in 2006 and 2008. These were designed by NEREC. This data is not currently used in the regression analysis below, and a full description is omitted for the sake of brevity.

[8] School Development Committees (SDCs) are headed by the principal and include members of the teaching staff and other stakeholders in the educational system, such as parents. SDCs are dominated by the principal, while school management committees spread power more evenly between the principal, teachers and local communities.

Section Four: Findings and Results

The analysis framework is described in detail in Annex Two of this report. The sample was divided into four groups: a) the control group schools, in which neither the PSI nor the SRCP had been initiated by the government; b) PSI-implemented schools; c) schools running the SRCP; and d) schools in which both the PSI and the SRCP were actioned.

The impact analysis sought to understand whether, over time, students benefited from the educational reform initiatives instituted by the two government programs (PSI and SRCP) in their schools. Specifically, the study aimed to assess whether the students exhibited improved Mathematics and English language performance, when compared to the control group.
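The estimation framework itself is set out in Annex Two and is not reproduced in this section. Consistent with the variables reported in Tables 3-5, however, the school-level fixed effects specification can be sketched as follows; the notation is illustrative rather than the report's own:

    y_{ist} = \alpha_s + \beta_1 \, 2008_t + \beta_2 (2008_t \times PSI_s) + \beta_3 (2008_t \times SRCP_s)
              + \beta_4 (2008_t \times PSI_s \times SRCP_s) + \gamma' X_{ist} + \varepsilon_{ist}

Here y_{ist} is the standardized test score of student i in school s and survey year t, \alpha_s is a school fixed effect, 2008_t indicates the follow-up round, PSI_s and SRCP_s indicate the school's assigned treatment group, and X_{ist} collects the student and household controls added in columns 3-4 of Table 3. Standard errors are clustered at the school level, and \beta_2 is the double-difference estimate of the PSI effect. For the binary school-level outcomes in Table 5 (for example, whether a school development committee had been formed), an analogous probit specification with clustered standard errors is estimated.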
The study also controls for student, family background and school characteristics in such a way that, if student learning outcomes in the various groups of schools are significantly different from those of the control group at the time of the final survey, this difference can be attributed to the impact of the relevant program(s). At the commencement of the study, the students and schools in the four groups displayed similar characteristics and learning outcome levels.

Results

This section presents estimates for a variety of outcome variables of interest. The first subsection examines student and household-level variables, beginning with students' test scores, while the second subsection examines the impact of the PSI and the SRCP on school-level variables. The regression results reveal several interesting findings:

a) Student and Household Variables. Table 3 presents estimates of the impact of the PSI and the SRCP on student and household outcomes of interest for Grade IV students. Columns 1-2 examine the impact of those programs on Math and English test scores. The findings reveal that students from schools that implemented the PSI alone score significantly higher in both subjects. These scores increase by 0.20 and 0.18 standard deviations, respectively, among students from the PSI schools. However, the schools that implemented the SRCP alone display impacts that are not statistically significant. Finally, schools that implemented both programs had impacts that were not statistically significant.

Table 3: Grade IV Test Score and Household Education Expenditures (school-level fixed effects estimation with clustered standard errors)

VARIABLES                 (1) Math      (2) English    (3) Math      (4) English
Year = 2008               -0.0684       -0.0584        -0.0385       -0.0469
                          (0.0482)      (0.0458)       (0.0525)      (0.0495)
2008×PSI                   0.199***      0.177***       0.220***      0.226***
                          (0.0720)      (0.0662)       (0.0767)      (0.0712)
2008×Report Card          -0.0673       -0.102          0.0321       -0.0806
                          (0.0708)      (0.0645)       (0.0789)      (0.0715)
2008×RepCd×PSI            -0.112        -0.0209        -0.120        -0.0448
                          (0.102)       (0.0921)       (0.110)       (0.101)
Male                                                   -0.238***     -0.344***
                                                       (0.0289)      (0.0266)
Sinhala                                                 0.349***      0.0286
                                                       (0.103)       (0.0841)
Tamil                                                   0.0400       -0.0644
                                                       (0.0975)      (0.0828)
Income                                                  0.0313***     0.0421***
                                                       (0.0119)      (0.0116)
Mother's Education                                      0.0281***     0.0269***
                                                       (0.00635)     (0.00545)
Father's Education                                      0.0101        0.00962*
                                                       (0.00625)     (0.00540)
Constant                  -0.114***     -0.0756***     -0.668***     -0.337***
                          (0.0173)      (0.0163)       (0.104)       (0.0880)
Observations               5709          5688           4746          4727
R-squared                  0.003         0.004          0.038         0.058
Number of schools (scid)   196           196            196           196

Robust standard errors in parentheses. *** p<0.01, ** p<0.05, * p<0.1

The regressions in columns 3-4 in Table 3 repeat those in columns 1-2, except that additional explanatory variables are added. To the extent that adoption of the program is correlated with these additional variables, this will reduce bias in the double difference results. In addition, adding variables with high explanatory power may reduce the standard errors of the estimated program effects, leading to more precise estimation. However, the sample size drops by almost 20% when these variables are added (due to missing data on these variables), and this reduction in sample size offsets any increased precision from adding these variables. In fact, the estimated impact of the programs is largely the same when these variables are added, namely, the PSI program increases Math and English language scores. Overall, the estimates in Table 3 suggest that the PSI has significantly increased Math and English test scores among Grade IV students.

b) School Variables.
Tables 4-5 examine the impact of the PSI and SRCP programs on school-level variables. Table 4 begins by examining Grade IV teacher behavior variables. Neither program, nor the combination of both programs, had any impact on teacher absences, homework assignments, or teachers' perception of whether more money was allocated for higher quality inputs. This reflects inadequate accountability of teachers (who are employed by the central government) to local communities. For instance, the local communities do not monitor teacher attendance. Also, schools receive little money as grants from higher levels of government for quality improvements. Most school-based spending is from resources raised from local parents, past pupils, and well-wishers from local communities.

Table 4: Grade IV Teacher Variables (school-level fixed effects estimation with clustered standard errors)

VARIABLES                 (1) Teacher Absence   (2) Homework   (3) Money from Government
Year = 2008               -12.88***             -0.0263         0.0213
                          (4.722)               (0.0928)        (0.0381)
2008×PSI                   9.592                -0.117           0.0740
                          (6.490)               (0.149)         (0.0606)
2008×Report Card           6.505                -0.0570          0.0263
                          (5.866)               (0.152)         (0.0620)
2008×RepCd×PSI             7.625                -0.0213          0.000946
                          (5.958)               (0.142)         (0.0719)
Constant                   23.40***              2.808***        0.907***
                          (1.435)               (0.0371)        (0.0169)
Observations               384                   349             371
R-squared                  0.075                 0.018           0.029
Number of schools (scid)   196                   191             195

Robust standard errors in parentheses. *** p<0.01, ** p<0.05, * p<0.1

Table 5: Grade IV Principals' Management of School Needs Variables (probit estimation with clustered standard errors)

VARIABLES            (1) Project   (2) Needs Analysis   (3) Priorities   (4) Long-Term Plan   (5) Formed School Dev. Com.
Year = 2008          -0.122         0.0621               0.0621           0.738**              -0.519*
                     (0.217)       (0.311)              (0.310)          (0.345)              (0.307)
2008×PSI              0.628**       0.534                0.219            0.327                 1.685***
                     (0.292)       (0.517)              (0.447)          (0.541)              (0.513)
2008×Report Card      0.323         0.153               -0.584           -0.0678               -0.253
                     (0.301)       (0.453)              (0.383)          (0.482)              (0.440)
2008×RepCd×PSI        0.501*        0.153                0.463            --                    1.296***
                     (0.294)       (0.453)              (0.526)           --                   (0.448)
Constant             -0.239**       1.364***             1.364***         0.894***              0.773***
                     (0.0934)      (0.127)              (0.127)          (0.105)              (0.102)
Observations          331           346                  345              307                   296

Robust standard errors in parentheses. *** p<0.01, ** p<0.05, * p<0.1
Note: For column (4), all schools with both the PSI and the SRCP initiatives had a long-term plan, so that coefficient in effect goes to infinity, and must therefore be dropped from the regression.

Table 5 examines certain variables concerning the management of school needs. The combination of both programs and the PSI by itself increased the probability that the principal had implemented some kind of project without financial support from the central or provincial governments, but with capital from local communities. This suggests that schools assigned to the PSI were initiating programs financed by local resources, which is one of the program's goals. It also increased the probability that a school development committee had been formed.

Section Five: Conclusions

Overall, the findings indicate that schools which implemented the PSI performed well in terms of improving the cognitive achievement levels of their primary school students. This is an encouraging and positive finding. Discussions with stakeholders suggested that a range of processes, including better teacher and parental involvement with children, both at school and in the home, are likely to have contributed to this outcome (see Box 1).
The education stakeholders in local communities, such as parents, past pupils and well-wishers, involved themselves closely in the administration of the schools through the school development committees. This provided the institutions with additional management support to advance the learning standards of children. This involvement also resulted in school development committees raising additional resources for schools, over and above the funds received from the central government or the provincial councils, to undertake school development projects. In particular, the committees promoted the stocking of school libraries with children's books, and encouraged theater- and music-related activities to improve the English language skills of children. Further, the committees helped schools to implement enjoyable activities, such as numeracy games, that promoted the mathematical skills of students. The element of "fun" in these games was important for children at the primary level.

Box 1: Focus Group Discussions with Stakeholders

The evaluation team conducted focus group discussions with officials from the national Ministry of Education, provincial education authorities, school principals, teachers, parents, and local community representatives. The discussions revealed that the school development committees had directed their efforts at increasing resources, both cash and in-kind, for their local schools. These resources contributed to the incorporation of co-curricular and extra-curricular activities such as theater and literary events, and sports and athletic events. The resources were also used for curriculum-related activities such as the provision of children's storybooks for the school library, and school trips to places of cultural or historical interest.

Additionally, it was seen that parents attached high value to education. Principals and teachers stated that the active involvement of parents, past pupils and other local community representatives gave "life" to their schools, especially those in small and remote rural communities. Parents were pleased to be involved in school affairs, which they stated gave them greater ownership of, and commitment to, the educational activities of their children. Past pupils stated that they saw their support as "giving something back" to the schools from which they had benefited as children.

A recurring theme in the focus group discussions was the importance of dynamic and inspiring leadership by the school principal. Principals who demonstrated strong leadership and managerial qualities were viewed as having developed their schools from very humble origins, in some cases even in the midst of jungles, into institutions of respect and worth in their areas. Parents and past pupils were inspired to support schools with impressive principals. Overall, the Programme for School Improvement was extremely popular among school stakeholders.

The formation of school development committees, with their regular meetings and continuing interactions with the school management teams, also provided a sense of order and method to schools. This contributed to better school management, particularly with the goal of promoting learning. In poorer communities, it was observed that the parents of children also benefited from the school-community interactions. Parents realized the importance of facilitating at-home activities to enable children to study and learn better.
For instance, parents learned to set aside study time for their children and, to the extent possible, mothers joined in to help their children with learning activities. It was noted that parental involvement was far lower in schools which did not have school development committees.

The SRCP, however, was not as successful as the PSI. The results do not show a statistically significant effect of the SRCP on school performance. This may be partly due to the relatively low priority accorded to the SRCP, in relation to the PSI, by policy makers. The PSI was viewed as a flagship program, and was strongly supported by both the central government and the provincial councils. In contrast, the SRCP was put into operation mainly for the purposes of the evaluation. Little effort was made to enable schools to use the SRCP mechanisms to obtain information about their performance and to act on it to improve institutional performance.

The interaction of the PSI and the SRCP showed insignificant results. This is somewhat puzzling, as the PSI alone demonstrated a positive effect. The combination may have been less successful because school management committees need more time and effort to absorb and act on the information received from the school report cards. This could be especially challenging for poorer communities, which have less educated parents and stakeholders. This is an area for further research.

Section Six: Future Development of the PSI and the SRCP for School Improvement

Considerable effort and resources have been invested in recent years in decentralizing decision making, and in expanding parental and community involvement in the educational system through the PSI. The findings of this impact evaluation reveal that the PSI has had a positive and significant impact on local community participation in school administration, the implementation of school development projects through resources raised from local communities, and the cognitive achievement levels of primary school students.

The Programme for School Improvement

The following steps seem appropriate to extend and consolidate the reforms supported by the PSI:

• Expand the PSI to cover all schools in the country.
• Consolidate the PSI in schools where structures have been established but activity is low. This may require greater clarity in the specification of roles, capacity building, and continuing personal and monetary support for schools.
• Support capacity development to exercise real devolution, as opposed to mere decentralization or delegation. This may involve reforms to make teachers more accountable to local school communities in such matters as school attendance. Eventually, the currently centralized teacher recruitment practices could be devolved to the school level.
• Promote decentralization, not as an end in itself, but as a means through which school-level decision makers can implement practices that improve teaching and learning.
• Review regulations regarding School Development Committees and School Management Teams to establish their appropriateness for all types and sizes of school.
• Empower lower levels of governance with clearly defined functions that do not overlap with higher levels. This will involve strengthening zonal capacity through training programs for advisors and networking of principals.
• Pay particular attention to schools serving children in socio-economically disadvantaged areas, as good governance is indispensable for the marginalized (UNESCO, 2008).
Deploy additional funding to such schools, commensurate with school size, level of schooling provided, special education needs, location, and type of school (Ross & Levacic, 1999).
• Support schools in moving from a parental involvement program that, to date, has been largely restricted to distal activities toward one that pays attention to proximal activities. Encourage parents to attend meetings, get involved in school committees, engage in voluntary work, or make financial and other contributions to maintain or improve physical conditions, resources, and services. The program should focus on: (a) developing parents' understanding that the home environment has a profound impact on the learning abilities of children and that parents have the power to change it; (b) developing parents' self-confidence and sense of efficacy in establishing a home environment that provides rich learning experiences for children; and (c) demonstrating specific behaviors that parents can follow, such as learning how to interact with pupils regarding homework, and having students and parents read to each other, depending on the age and abilities of the children (Hoover-Dempsey & Sandler, 1997).
• Consider ways of involving parents and other community members in reaching out to disengaged parents.
• Extend the involvement of communities to contribute to the development of "competencies for life", the "soft skills" or generic skills that are necessary for effective functioning in personal life, interpersonal relationships, and employment and economic activities (for example, critical and divergent thinking, problem solving, creativity, initiative, leadership, responsibility, and team work). Experience in community activities is often more relevant and appropriate for developing these skills than school-based experience, which is often consumed with covering syllabuses and preparing students for examinations. Community activities also provide opportunities to develop social cohesion through learning to live with others in harmony, respecting the diversity of a multi-ethnic, multi-religious, and multi-cultural society (see Ministry of Education, 2004b).
• Schools need support in setting objectives, assessing student achievements, determining what learning experiences are necessary to ensure success, and measuring and reporting on the outcomes to parents (Caldwell, 2005). To support schools in this activity, it is proposed that standardized tests in core curriculum areas, which would provide normative data, be developed and made available to schools for use by teachers.
• School accountability to parents and communities should be expressed in an annual report describing the school's activities. This should include data on student achievement measured by standardized tests.

The School Report Card Programme

The SRCP in its present form has demonstrated little impact on school performance. This is not surprising, as it has received scant attention from policy makers. The SRCP could be revised so that the information contained in it is used by the SDCs and government authorities to improve schools. As a first step, the SDCs and local governments would need to be trained in the judicious use of school report cards. Once the SRCP is implemented as a full-fledged program, its impact could be carefully studied. It is entirely possible that mainstreaming the SRCP in the government reform program would result in positive effects at the school level.
Annex One

Table 1: Variables Available from the PSI and non-PSI Schools

Variable | Source | Question numbers, Grade IV (2006) | Question numbers, Grade IV (2008)

Student indicators
Test scores | Student test | |
Participation in tutoring (tuition) classes | Student questionnaire | 22 | 20
Time spent studying | Parent questionnaire | 44 | 44
Grade repetition | Student questionnaire | 31 | N/A

Teacher indicators
Subject knowledge (test scores) | Teacher test | |
Classroom supplies (books, texts, desks, etc.) | Teacher questionnaire | 15, 22 | 16, 22
Teacher training (≥ 14 days in last 2 years) | Teacher questionnaire | 14 | 14
Teacher absences | Teacher questionnaire | 30 | 30
Adequate guides on student-centered learning | Teacher questionnaire | 40 | 40
Teachers allocated funds to buy school inputs | Teacher questionnaire | 23 | 23

Parental and community indicators
Parent/teacher meeting attendance | Parent questionnaire | 40 | 40
Parent helps child with schoolwork | Parent and student questionnaire | 14 (P), 21 (S) | 43 (P), 19 (S)
Parental expectations of child achievement | Parent questionnaire | 50 | 50
Parent participation in school events | Parent questionnaire | 41 | 41

Principal/Section Head/School indicators
School management practices | Principal questionnaire | 35, 36, 39, 42, 43, 44, 55, 58, 62, 68, 70, 71, 79 | 35, 36, 39, 45, 52, 55, 60, 61, 79, 82, 90
Finances | Principal questionnaire | 31, 32 | 32, 33
School facilities | Principal questionnaire | 22, 23, 24 | 23, 24, 25
Teacher/Principal meetings | Section Head questionnaire | 8 | 8

Table 2: Descriptive Statistics of Key Variables – Child and Parent Control Variables, Grade IV

Household Income (parent report of household income: =1 if less than 3,000; =2 if 3,000-5,000; =3 if 5,001-10,000; =4 if 10,001-20,000; =5 if 20,001-30,000; =6 if above 30,000)

Sample | 2006 N | 2006 Mean | 2006 SD | 2008 N | 2008 Mean | 2008 SD
Whole sample | 2847 | 1.930 | 1.167 | 2345 | 2.507 | 1.316
PSI and Report Card | 768 | 1.914 | 1.156 | 638 | 2.524 | 1.314
PSI Only | 679 | 1.897 | 1.125 | 567 | 2.430 | 1.281
Report Card Only | 737 | 1.909 | 1.184 | 561 | 2.621 | 1.354
Control | 663 | 2.008 | 1.202 | 579 | 2.453 | 1.309

Educational Expenses ("Which of the following needs of your child's education do you spend on?" 1 = less than 500, 2 = between 500-1,000, 3 = more than 1,000)

Sample | 2006 N | 2006 Mean | 2006 SD | 2008 N | 2008 Mean | 2008 SD
Whole sample | 2923 | 8.945 | 5.750 | 2813 | 9.366 | 7.125
PSI and Report Card | 790 | 8.773 | 5.863 | 758 | 9.958 | 7.421
PSI Only | 689 | 8.874 | 5.249 | 662 | 9.330 | 6.669
Report Card Only | 761 | 9.059 | 5.835 | 713 | 8.880 | 7.317
Control | 683 | 9.088 | 6.007 | 680 | 9.249 | 6.983

Note: F-tests for equality of means across groups in 2006; * indicates a difference that is statistically significant at the 5% level.

Table 2 (continued): Descriptive Statistics of Key Variables – Teacher Variables, Grade IV

Teacher Absence (number of days the teacher took leave in 2005 or 2007 for vacation, medical, maternity, no pay, or other reasons)

Sample | 2006 N | 2006 Mean | 2006 SD | 2008 N | 2008 Mean | 2008 SD
Whole sample | 191 | 23.558 | 24.253 | 191 | 16.419 | 19.062
PSI and Report Card | 48 | 23.771 | 22.598 | 48 | 18.542 | 22.471
PSI Only | 47 | 22.649 | 19.121 | 48 | 19.031 | 24.216
Report Card Only | 47 | 21.319 | 22.077 | 47 | 14.606 | 12.509
Control | 49 | 26.367 | 31.575 | 48 | 13.458 | 14.166

Homework (frequency with which the teacher gives homework to students: =3 if always, =2 if seldom, =1 if once in a while)

Sample | 2006 N | 2006 Mean | 2006 SD | 2008 N | 2008 Mean | 2008 SD
Whole sample | 174 | 2.799 | 0.443 | 175 | 2.743 | 0.464
PSI and Report Card | 45 | 2.778 | 0.517 | 44 | 2.750 | 0.438
PSI Only | 47 | 2.766 | 0.428 | 43 | 2.628 | 0.536
Report Card Only | 40 | 2.800 | 0.405 | 44 | 2.773 | 0.476
Control | 42 | 2.857 | 0.417 | 44 | 2.818 | 0.390

Money (teacher's opinion on whether money was allocated for quality inputs for Grade IV students in 2005 or 2007: =1 if yes, =0 if no)

Sample | 2006 N | 2006 Mean | 2006 SD | 2008 N | 2008 Mean | 2008 SD
Whole sample | 188 | 0.904 | 0.295 | 183 | 0.956 | 0.205
PSI and Report Card | 46 | 0.913 | 0.285 | 47 | 0.936 | 0.247
PSI Only | 46 | 0.891 | 0.315 | 45 | 0.978 | 0.149
Report Card Only | 47 | 0.894 | 0.312 | 44 | 0.955 | 0.211
Control | 49 | 0.918 | 0.277 | 47 | 0.957 | 0.204

Note: F-tests for equality of means across groups in 2006; * indicates a difference that is statistically significant at the 5% level.

Table 3: Child and Parent Control Variables – Grade IV

Variable | Description | 2006 N | 2006 Mean | 2006 SD | 2008 N | 2008 Mean | 2008 SD
Gender | Child's report: 1 = male, 0 = female | 2881 | 0.500 | 0.500 | 2687 | 0.505 | 0.500
Ethnic group:
Sinhala | Child's report: 1 = yes, 0 = no | 2859 | 0.721 | 0.449 | 2662 | 0.739 | 0.439
Tamil | Child's report: 1 = yes, 0 = no | 2859 | 0.153 | 0.360 | 2662 | 0.148 | 0.355
Other | Child's report: 1 = yes, 0 = no | 2859 | 0.127 | 0.333 | 2662 | 0.113 | 0.317
Household Income 1 | Parent report of household income: =1 if less than 3,000; =2 if 3,000-5,000; =3 if 5,001-10,000; =4 if 10,001-20,000; =5 if 20,001-30,000; =6 if above 30,000 | 2847 | 1.930 | 1.167 | 2346 | 2.507 | 1.316
Mother's Education 2 | Categorical variable taking discrete values from 1 to 15: 1 = postgraduate degree, 15 = never attended school | 2843 | 9.867 | 2.959 | 2233 | 9.399 | 3.120
Father's Education 2 | Categorical variable taking discrete values from 1 to 15: 1 = postgraduate degree, 15 = never attended school | 2843 | 9.503 | 3.059 | 2360 | 9.646 | 2.977
Educational Expenses 1, 3 | "Which of the following needs of your child's education do you spend on?" 1 = less than 500, 2 = between 500-1,000, 3 = more than 1,000 | 2923 | 8.945 | 5.750 | 2817 | 9.357 | 7.126

Notes: The sample is restricted to the schools that were in both rounds of data collection.
1 In the 2008 parent questionnaire, data were collected from both mothers and fathers. Where data from both parents were available, household income and educational expenditures were taken as the average of the two reports.
2 In the 2008 parent questionnaire, data were collected from both mothers and fathers. Where data from both parents were available, the father's education was taken from the father's report and the mother's education from the mother's report.
3 The variable was created by summing across the following expenditure categories: school fees, transport, books, pens and pencils, school uniform, shoes, instruments, additional reading books, hostel fees, repair and maintenance of school buildings, library fees, sports equipment, sports functions, concerts, educational tours, societies, and other.
Table 4: Grade IV Principal Variables

Variable | Description | 2006 N | 2006 Mean | 2006 SD | 2008 N | 2008 Mean | 2008 SD

Principal's management of teachers
Appraisal | Do you have an appraisal system for your teachers? =1 if yes, =0 if no | 189 | 0.803 | 0.398 | 143 | 0.853 | 0.356
Reward | Do you have a system of rewards for teachers? =1 if yes, =0 if no | 189 | 0.301 | 0.460 | 147 | 0.347 | 0.478
Self Evaluation | Do you have a self-evaluation scheme for teachers? =1 if yes, =0 if no | 189 | 0.577 | 0.495 | 142 | 0.606 | 0.490
Activities | Have you introduced any activities for professional development of teachers? =1 if yes, =0 if no | 168 | 0.732 | 0.444 | 143 | 0.755 | 0.431
Review | Do you review performance and monitor progress of your school? =1 if yes, =0 if no | 185 | 0.768 | 0.424 | 139 | 0.842 | 0.366
Observe | How often do you observe teaching? =1 if daily, =2 if weekly, =3 if fortnightly, =4 if monthly, =5 if occasionally | 188 | 2.872 | 1.532 | 143 | 2.783 | 1.3222

Principal's management of school needs
Project | Did you undertake any project, programs or repair without financial assistance from the Central Government or Provincial Government in 2005 or 2007? =1 if yes, =0 if no | 185 | 0.405 | 0.492 | 146 | 0.500 | 0.502
Needs Analysis | Have you done a needs analysis for your school? =1 if yes, =0 if no | 197 | 0.914 | 0.281 | 149 | 0.946 | 0.226
Priorities | Have you prioritized the needs that you have identified? =1 if yes, =0 if no | 197 | 0.914 | 0.281 | 148 | 0.912 | 0.284
Long-Term Plan | Do you have a long-term plan (2005) or 5-year plan (2007)? =1 if yes, =0 if no | 194 | 0.814 | 0.390 | 149 | 0.966 | 0.181
Formed School Dev. Committee | Have you formed a School Development Committee in your school? =1 if yes, =0 if no | 191 | 0.780 | 0.412 | 105 | 0.826 | 0.379

Financial assistance received (level received: 3 = highest, 2 = average, 1 = not enough)
Facility Fees | 177 | 1.4011 | 0.546 | 136 | 1.419 | 0.524
School Dev. Society | 189 | 1.354 | 0.511 | 148 | 1.426 | 0.561
Past Pupil Assoc. | 171 | 1.094 | 0.347 | 133 | 1.053 | 0.256
Other | 174 | 1.115 | 0.354 | 138 | 1.159 | 0.267
NGO | 175 | 1.183 | 0.416 | 156 | 1.110 | 0.314
State Assistance | 193 | 2.316 | 0.558 | 148 | 2.399 | 0.491

Note: The sample is restricted to the schools that were in both rounds of data collection for Grade IV.

Annex Two

Analytical Methodology

The objective of this paper is to estimate the average treatment effect for students in the 200 Sri Lankan schools that were assigned to either a "program" group or a "control" group. More specifically, this paper attempts to estimate the impact of the PSI and the SRCP on Sri Lankan students' educational outcomes. This section explains the methodology used, highlighting the assumptions needed to ensure unbiased estimation of average treatment effects.

To begin, consider the case of a single program to be evaluated. Let Yi(1) denote the value of Y, an outcome variable of interest, if student i is enrolled in a school that participates in the program, and let Yi(0) denote the value of Y if student i is enrolled in a school that does not participate in the program. The average treatment effect can be defined as:

ATE = E[Yi(1) – Yi(0)]     (1)

Since ATE is not conditional on any student characteristic, "student i" represents the "average student" in the population of students who attend the program and control schools. This paper estimates the impact of the program using a standard "double difference" estimator.
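To make the potential-outcomes notation in equation (1) concrete, the short simulation below (purely illustrative, not the study's data) generates hypothetical Yi(1) and Yi(0) values for each student, computes the ATE as the mean of the individual differences, and shows that in real data only one of the two potential outcomes is ever observed for a given student.

```python
# Purely illustrative simulation of potential outcomes; not the study's data.
import numpy as np

rng = np.random.default_rng(42)
n_students = 10_000

y0 = rng.normal(0.0, 1.0, n_students)        # Yi(0): outcome without the program
y1 = y0 + rng.normal(0.2, 0.1, n_students)   # Yi(1): outcome with the program

ate = np.mean(y1 - y0)                       # equation (1): E[Yi(1) - Yi(0)]
print(f"ATE in the simulated population: {ate:.3f}")

# In practice only one potential outcome per student is observed.
in_program_school = rng.integers(0, 2, n_students).astype(bool)   # Wi
observed_y = np.where(in_program_school, y1, y0)                  # Yi
```

Because Yi(1) and Yi(0) are never both observed for the same student, any estimator has to compare different groups of students, which is where the double-difference logic developed below comes in.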
To see how this estimator works, assume that Yi(0) is determined as follows:

Yi(0) = α + βTi + Σs γsGis + εi     (2)

where Ti is a time dummy variable that equals 0 for the year 2006 and 1 for the year 2008, Gis is a set of dummy variables for each of the 200 schools (s = 1, …, S) that equal 1 if student i is enrolled in school s and 0 otherwise, and εi is a residual term that measures student-specific deviations from the school means, which are measured by the γs parameters. Thus, for each school, this definition of εi implies that E[εi] = 0. Finally, assume that εi is independent of the Ti and Gis variables. [9]

In fact, the ATE could vary depending on how long the program has been operating. The data from 2008 can be used to measure the impact of the program after two years, so this paper estimates the average treatment effect after (almost) two years, which can be defined as:

ATE(2 years) = E[Yi(1) – Yi(0) | Ti = 1]     (3)

where Ti = 1 simply indicates that the impact of the program is measured in 2008, i.e., two years after the program commenced.

Footnote 9: Since εi is defined as the within-school deviation from school means, it is uncorrelated with the Gis variables (as those variables do not vary within schools); note that independence is a somewhat stronger assumption than lack of correlation. The assumption that εi is independent of Ti holds if εi does not change over time, although independence does not require that εi be unchanged over time.

The goal of this paper is to estimate the expression in equation (3). The standard double difference estimator assumes that the impact of the program after a given amount of time is the same for all observations. Denoting this program impact by τ, this assumption implies the following relationship between Yi(1) and Yi(0):

Yi(1) = Yi(0) + τ = α + βTi + Σs γsGis + εi + τ   for all i     (4)

where the expression after the second equality simply uses equation (2). The assumption that τ does not vary over students is a strong assumption that is made here primarily for convenience; the implications of its not holding are discussed below.

Suppose that data are available only for the year 2008. It is not possible to estimate the average treatment effect after two years, ATE(2 years), without making further assumptions. With data only from 2008, one observes Yi(1) for students in schools that participated in the program and Yi(0) for students in schools that did not participate. Let Yi denote the observed value of Y in 2008 for student i, so that Yi = Yi(0) for students in schools that do not participate in the program and Yi = Yi(1) for students in schools that do participate. The most obvious way to estimate ATE(2 years) is to use the observed data on Y for students in schools that participated in the program to estimate E[Yi(1)], and the observed data on Y for students in schools that did not participate to estimate E[Yi(0)]. Yet this could lead to bias if the "assignment" of schools to the treatment and control groups was not random. For example, if the schools that participated in the program had better than average students (which implies that the schools that did not participate had worse than average students), then this approach would overestimate ATE(2 years); if the participating schools had worse than average students, then it would underestimate ATE(2 years).
However, if the assignment of schools to the treatment and control groups is uncorrelated with Yi(1) and Yi(0), then the following holds:

E[Yi(1) | Ti = 1] = E[Yi(1) | Ti = 1, Wi = 1]     (5)
E[Yi(0) | Ti = 1] = E[Yi(0) | Ti = 1, Wi = 0]

where the variable W denotes program participation, so that Wi = 1 indicates that student i is enrolled in a program school and Wi = 0 indicates that student i is enrolled in a control school. Since both E[Yi(1) | Ti = 1, Wi = 1] and E[Yi(0) | Ti = 1, Wi = 0] are observed in the data from 2008, the assumption in equation (5) allows one to estimate the average treatment effect after two years as follows, where ATE-hat denotes the estimate:

ATE-hat(2 years) = E[Yi(1) | Ti = 1, Wi = 1] – E[Yi(0) | Ti = 1, Wi = 0]     (6)

If the assignment of schools to the treatment and control groups was not randomized, then the estimate in equation (6) of the average treatment effect could be biased. To see the source of bias more directly, insert the expressions for Yi(0) and Yi(1) from (2) and (4) into equation (6):

ATE-hat(2 years) = E[α + βTi + Σs γsGis + τ + εi | Ti = 1, Wi = 1] – E[α + βTi + Σs γsGis + εi | Ti = 1, Wi = 0]     (6′)

Note that because εi is the within-school deviation from the school means, it is uncorrelated with any variable that does not vary within schools, which implies that E[εi | Wi] = 0 and that E[εi | Ti = 1, Wi = 1] = E[εi | Ti = 1, Wi = 0] = 0. Then we can write:

ATE-hat(2 years) = α + β + E[Σs γsGis | Ti = 1, Wi = 1] + τ – (α + β + E[Σs γsGis | Ti = 1, Wi = 0])     (6″)
                 = τ + E[Σs γsGis | Ti = 1, Wi = 1] – E[Σs γsGis | Ti = 1, Wi = 0]

If Wi had been randomly assigned, the two expectation terms in the last line of (6″) would both equal E[Σs γsGis | Ti = 1] and would cancel each other out, so that ATE-hat(2 years) would be an unbiased estimate of τ (indeed, ATE-hat(2 years) would equal τ). However, if assignment was not random, so that certain types of schools, as indicated by the Gis dummy variables, are correlated with Wi, then the two expectation terms are not equal and ATE-hat(2 years) is a biased estimate of τ.

Double difference estimation gets around the problem that Wi may be correlated with the Gis dummy variables by using data from before the program started (in this context, data from 2006) and by assuming that the associated γs parameters do not change over time. The double difference estimator can be defined as:

ATE-hat_DD(2 years) = {E[Yi(1) | Ti = 1, Wi = 1] – E[Yi(0) | Ti = 0, Wi = 1]} – {E[Yi(0) | Ti = 1, Wi = 0] – E[Yi(0) | Ti = 0, Wi = 0]}     (7)

Intuitively, this estimator compares the change over time in Yi for students in schools that started participating in the program between 2006 and 2008 with the change over time in Yi for students in schools that did not participate in the program at any time between 2006 and 2008.
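As a concrete illustration of the comparison in equation (7), the sketch below computes a double-difference estimate from the four group-by-year means (program and control schools, 2006 and 2008). The data frame and its values are hypothetical, chosen only to show the arithmetic; they are not taken from the evaluation.

```python
# Hypothetical numbers, used only to illustrate the double-difference arithmetic.
import pandas as pd

scores = pd.DataFrame({
    "year":    [2006, 2006, 2006, 2006, 2008, 2008, 2008, 2008],
    "program": [1,    1,    0,    0,    1,    1,    0,    0   ],  # Wi
    "y":       [0.02, 0.06, 0.01, 0.03, 0.25, 0.29, 0.05, 0.07],  # outcome
})

m = scores.groupby(["program", "year"])["y"].mean()
dd = (m[1, 2008] - m[1, 2006]) - (m[0, 2008] - m[0, 2006])   # equation (7)
print(f"double-difference estimate of the program effect: {dd:.3f}")
```

Differencing out each group's 2006 level removes stable differences between the two groups of schools, which is the same role played by the school fixed effects in the regression form of the estimator discussed below.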
To see why this estimator is unbiased given the above assumptions, substitute equations (2) and (4) into (7):

ATE-hat_DD(2 years) = {E[α + βTi + Σs γsGis + τ + εi | Ti = 1, Wi = 1] – E[α + βTi + Σs γsGis + εi | Ti = 0, Wi = 1]}
                      – {E[α + βTi + Σs γsGis + εi | Ti = 1, Wi = 0] – E[α + βTi + Σs γsGis + εi | Ti = 0, Wi = 0]}     (7′)

= {α + β + τ + E[Σs γsGis | Ti = 1, Wi = 1] – α – E[Σs γsGis | Ti = 0, Wi = 1]}
  – {α + β + E[Σs γsGis | Ti = 1, Wi = 0] – α – E[Σs γsGis | Ti = 0, Wi = 0]}

= {β + τ + E[Σs γsGis | Wi = 1] – E[Σs γsGis | Wi = 1]} – {β + E[Σs γsGis | Wi = 0] – E[Σs γsGis | Wi = 0]}

= (β + τ) – β = τ

where the third line uses the assumptions that the composition of the program (Wi = 1) and control (Wi = 0) groups of schools does not change between 2006 and 2008 and that the γs parameters are constant over time, so the school terms within each group cancel.

Another potential benefit of double difference estimation is that it is likely to provide more precise estimates (in the statistical sense of having a lower standard error) than estimates based on 2008 data only, even when the latter estimates are not biased.

Using these assumptions for double difference estimation, it is convenient to estimate τ using OLS regression. The regression equation is:

Yit = α + βTi + Σs γsGis + τ(Ti × Wi) + εi     (8)

where Yit is the observed value of Y for student i at time t (2006 or 2008). In this regression, β estimates the "general" change in Yi over time for all students, and τ estimates the impact of the program.

In fact, two distinct programs were implemented in the 200 schools analyzed in this paper, the PSI and the SRCP, with one quarter of the schools having adopted both programs, which can be thought of as a third treatment. Equation (8) can be extended to the case where three distinct programs are assessed:

Yit = α + βTi + Σs γsGis + τ1(Ti × W1i) + τ2(Ti × W2i) + τ3(Ti × W3i) + εi     (8′)

In this regression, W1i is a dummy variable that indicates whether student i attends a school that participated in the first program, and W2i and W3i indicate whether student i attends a school that participated in the second or the third program, respectively. Similarly, τ1, τ2 and τ3 estimate the impacts of the first, second and third programs. The S γs terms are school fixed effects. [10]

Footnote 10: In many double difference estimations the same students are observed over time, which allows the regression to use the change in each student's test score over time as the dependent variable; this automatically differences out school fixed effects and, indeed, student fixed effects. However, the test score data from Sri Lanka come from different students in different years, although from the same schools, so school fixed effects must be included explicitly in the regression equation.

References

Abu-Duhou, I. (1999). School-Based Management. Paris: International Institute for Educational Planning, UNESCO.

Aturupane, H. (2009). The Pearl of Great Price: Achieving Equitable Access to Primary and Secondary Education and Enhancing Learning in Sri Lanka. CREATE Monograph 29, London, U.K.

Barrera-Osorio, F., Fasih, T., & Patrinos, H.A. (2009). Decentralized Decision-Making in Schools: The Theory and Evidence on School-Based Management. Washington, D.C.: World Bank.

Bray, M. (1996). Decentralization of Education: Community Financing. Washington, D.C.: World Bank.

Bray, M. (2001). Community Partnerships in Education: Dimensions, Variations and Implications. Paris: UNESCO.

Caldwell, B.J. (2002). Autonomy and Self-Management: Concepts and Evidence. In Bush, T., & Bell, L. (Eds.), The Principles and Practice of Educational Management (pp. 21-40). London: Paul Chapman.

Caldwell, B.J. (2005). School-Based Management. Brussels: International Academy of Education; Paris: International Institute for Educational Planning.
Chaudhury, N., Hammer, J., Kremer, M., Muralidharan, K., & Rogers, H. (2006). Missing in Action: Teacher and Health Worker Absence in Developing Countries. Journal of Economic Perspectives, 20(1), 91-116.

Cheng, Y.C. (1996). School Effectiveness and School-Based Management: A Mechanism for Development. London: Falmer.

Corter, C., & Pelletier, J. (2005). Parent and Community Involvement in Schools: Policy Panacea or Pandemic? In Bascia, N., Cumming, A., Datnow, A., Leithwood, K., & Livingstone, D. (Eds.), International Handbook of Educational Policy (pp. 295-327). Dordrecht, Netherlands: Springer.

Deaton, A. (1997). The Analysis of Household Surveys: A Microeconomic Approach. Baltimore: Johns Hopkins University Press.

De Silva, T.H.D.C. (2007). Study on School Based Management. Colombo, Sri Lanka. Processed.

Dias, M.A.A.S. (2008). Case Studies on Schools Exposed to the Programme on School Improvement (PSI). Colombo, Sri Lanka. Processed.

Epstein, J.L., & Saunders, M.G. (2002). Family, School, and Community Partnerships. In Bornstein, M.H. (Ed.), Handbook of Parenting (2nd ed.). Mahwah, NJ: Lawrence Erlbaum.

Finn, J.D. (1998). Parental Engagement that Makes a Difference. Educational Leadership, 55, 20-24.

Gertler, P., Patrinos, H.A., & Rubio-Codina, M. (2006). Empowering Parents to Improve Education: Evidence from Rural Mexico. World Bank Policy Research Working Paper 3955. Washington, D.C.: World Bank.

Graue, N.E., Weinstein, T., & Walberg, H.J. (1983). School-Based Home Instruction and Learning: A Quantitative Synthesis. Journal of Educational Research, 76, 351-360.

Grolnick, W.S., & Slowiaczek, M.L. (1994). Parents' Involvement in Children's Schooling: A Multi-Dimensional Conceptualization and Motivational Model. Child Development, 65, 237-252.

Gunasekara, T.A.R.J., Dias, M.A.A.S., Ratnayaka, D.A.S.D., Dissanayake, N.D., de Silva, G.H.R.T., & Chinthani, R.B.N. (2010). Impact Evaluation of Programme for School Improvement in Sri Lanka, 2006-2009. Maharagama: Faculty of Research Planning and Development, National Institute of Education.

Guthrie, J.W. (1986). School-Based Management: The Next Needed Education Reform. Phi Delta Kappan, 68, 305-309.

Herath, T.N. (2009). Decentralization of Governance and Economic Development. South Asia Economic Journal, 10, 157-185.

Hill, D., Smith, B., & Spinks, J. (1990). Local Management of Schools. London: Paul Chapman.

Ho, E., & Williams, J.D. (1996). Effects of Parental Involvement on Eighth Grade Achievement. Sociology of Education, 69, 126-141.

Hoover-Dempsey, K.V., & Sandler, H.M. (1997). Why Do Parents Become Involved in Their Children's Education? Review of Educational Research, 67, 3-42.

Jeynes, W.H. (2005). A Meta-Analysis of the Relation of Parent Involvement to Urban Elementary School Student Academic Achievement. Urban Education, 40, 237-269.

Jeynes, W.H. (2007). The Relationship Between Parental Involvement and Urban Secondary School Student Academic Achievement. Urban Education, 42, 82-110.

Kellaghan, T. (2001). Family and Schooling. In Smelser, N.J., & Baltes, P.B. (Eds.), International Encyclopedia of the Social and Behavioral Sciences (pp. 5303-5307). Oxford: Pergamon.

Kellaghan, T., Sloane, K., Alvarez, B., & Bloom, B.S. (1993). The Home Environment and School Learning: Promoting Parental Involvement in the Education of Children. San Francisco: Jossey-Bass.

Kularatne, W.G. (2008). Handbook for School Improvement Partners. Battaramulla: Ministry of Education/Secondary Education Modernization Project II.
Lo, W.Y.W. (2010). Educational Decentralization and its Implications for Governance: Explaining the Differences in Four Asian Newly Industrialized Economies. Compare: A Journal of Comparative and International Education, 40, 63-78.

Ministry of Education and Higher Education. (1996). National Education Policy. Battaramulla: Ministry of Education.

Ministry of Education. (2004a). Manual of Instructions for School Level Planning. Battaramulla: Ministry of Education.

Ministry of Education. (2004b). The Development of Education: National Report. Battaramulla: Ministry of Education.

Ministry of Education. (2005). Programme on School Improvement. Battaramulla: Ministry of Education.

Ministry of Education. (2007). Education Sector Development Framework and Programme. Colombo, Sri Lanka.

Ministry of Education. (2008, September). Review of PSI. World Bank Mission. Battaramulla: Ministry of Education.

Naik, C. (1994). Education for All Summit of Nine High-Population Countries: Final Report. Paris: UNESCO.

National Education Commission. (1995). An Action Oriented Strategy Towards a National Education Policy. Colombo, Sri Lanka.

National Education Commission. (1997). Education Reforms. Colombo: National Education Commission.

National Education Commission. (2003). Envisioning Education for Human Development: Proposals for a National Policy Framework on General Education in Sri Lanka. Colombo, Sri Lanka.

Ozler, B. (2001). Decentralization and Student Achievement: The Case of Nicaragua's School Autonomy Reform. Working Paper on Impact Evaluation of Education Reforms. Washington, D.C.: World Bank.

Patall, E.A., Cooper, H., & Robinson, J.C. (2008). Parent Involvement in Homework: A Research Synthesis. Review of Educational Research, 78, 1039-1101.

Perera, L., Wijetunge, S., Navaratna, A.A., & Karunanithy, M. (2007). National Assessment of Achievement of Grades 8 and 10 Students in Sri Lanka: Patterns and Trends in Performance. National Report. Colombo: National Education Research and Evaluation Centre, University of Colombo.

Perera, W.J. (1997). Changing Schools from Within: A Management Intervention for Improving School Functioning in Sri Lanka. Paris: International Institute for Educational Planning.

Perera, W.J. (2000). School Autonomy Through School-Based Management: The Case of Sri Lanka. In Göttelmann-Duret, G. (Ed.), Improving School Efficiency: The Asian Experience (pp. 33-74). Paris: International Institute for Educational Planning.

Perera, W.J. (2006). Efforts Toward Decentralization: Ideology vs. Reality – The Sri Lankan Case. In Bjork, C. (Ed.), Educational Decentralization (pp. 211-222). New York: Springer.

Postlethwaite, T.N., & Ross, K.N. (1992). Effective Schools in Reading: Implications for Educational Planners. The Hague: IEA.

Ross, K.H., & Levacic, R. (Eds.). (1999). Needs-Based Resource Allocation in Education via Formula Funding of Schools. Paris: International Institute for Educational Planning.

Shaeffer, S. (1994). Participation for Educational Change: A Synthesis of Experience. Paris: International Institute for Educational Planning.

Summers, A.A., & Johnson, A.W. (1995). Doubts About Decentralized Decisions. School Administrator, 52(3), 24-32.

Swift-Morgan, J. (2006). What Community Participation in Schooling Means: Insights from Southern Ethiopia. Harvard Educational Review, 76, 339-368.

UNESCO. (2008). Overcoming Inequality: Why Governance Matters. Education for All Global Monitoring Report. Paris: UNESCO.
Van Der Werf, G., Creemers, B., & Guldemond, H. (2001). Improving Parental Involvement in Primary Education in Indonesia: Implementation, Effects and Costs. School Effectiveness and School Improvement, 12, 497-466.

White, K.R. (1982). The Relation Between Socioeconomic Status and Academic Achievement. Psychological Bulletin, 91, 461-481.

Wohlstetter, P., & Mohrman, S.A. (1993, January). School-Based Management: Strategies for Success. CPRE (Consortium for Policy Research in Education) Briefs.

World Bank. (2005). Treasures of the Education System in Sri Lanka: Restoring Performance, Expanding Opportunities and Enhancing Prospects. Human Development Unit, South Asia Region. Washington, D.C. and Colombo, Sri Lanka: World Bank.

World Bank. (2006). From Schooling Access to Learning Outcomes: An Unfinished Agenda. An Evaluation of World Bank Support to Primary Education. Washington, D.C.: World Bank. http://www.worldbank.org/ieg

World Bank. (2007). What is School-Based Management? Washington, D.C.: World Bank.

World Bank. (2008). What Do We Know About School-Based Management? Washington, D.C.: World Bank.