Policy Research Working Paper 9053

Measuring and Explaining Management in Schools: New Approaches Using Public Data

Clare Leaver, Renata Lemos, Daniela Scur

Education Global Practice, November 2019

Abstract: Why do some students learn more in some schools than others? One consideration receiving growing attention is school management. To study this, researchers need to be able to measure school management accurately and cheaply at scale, and also explain any observed relationship between school management and student learning. This paper introduces a new approach to measurement using existing public data, and applies it to build a management index covering 15,000 schools across 65 countries, and another index covering nearly all public schools in Brazil. Both indices show a strong, positive relationship between school management and student learning. The paper then develops a simple model that formalizes the intuition that strong management practices might be driving learning gains via incentive and selection effects among teachers, students and parents. The paper shows that the predictions of this model hold in public data for Latin America, and draws out implications for policy.

This paper is a product of the Education Global Practice. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://www.worldbank.org/prwp. The authors may be contacted at rlemos@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Measuring and explaining management in schools: New approaches using public data

Clare Leaver (Blavatnik School of Government, University of Oxford; CEPR), Renata Lemos (World Bank; CEP-LSE), Daniela Scur (Dyson School of Applied Economics and Management, Cornell University; CEPR; CEP-LSE)

Keywords: management, teacher selection, teacher incentives, cross-country
JEL codes: M5, I2, J3

Acknowledgements: We thank Melissa Adelman and Justin Sandefur for helpful discussions and comments on earlier drafts. We also thank participants at the World Bank's Regional Study authors' workshop, RISE Annual Conference 2018, SIOE 2019, Cornell University Development seminar, Nikki Shure, Chris Barrett, John Hoddinott, Vicente Garcia for useful suggestions. José Mola, Raissa Ebner, Maria José Vargas, Claudia Rivas, and Ildo Lautharte provided excellent research assistance. Leaver is grateful for the hospitality of Toulouse School of Economics, 2018-19.

1 Introduction

Despite global calls for improvements in education, progress towards learning for all is slow. This deficit is particularly pronounced for poor children and children in low-income countries [Akmal and Pritchett, 2019].
But why do some students learn more in some schools than others? While there are many contributing factors at system, school, and household level, one consideration receiving growing attention is school management—the processes and practices used by principals day-to-day as they run their schools [World Bank, 2018]. Academics and practitioners interested in this issue face two challenges: how to measure school management accurately and cost-effectively at scale across schools and countries; and how to explain any observed relationship between school management and learning outcomes in a way that elucidates the underlying mechanisms to guide policy. This paper addresses both of these challenges.

Our first contribution is to develop a new approach to measurement that can, in principle, be used with any existing public dataset containing items about school management. We illustrate using two public datasets as examples: the OECD's Programme for International Student Assessment (PISA), and the Brazilian school census survey, Prova Brasil. The essence of our approach is to benchmark against the "state of the art", but expensive, World Management Survey (WMS) in Bloom et al. [2015a]. We show how questions from these public surveys can be classified into WMS topics (53 PISA questions into 14 WMS topics and 33 Prova Brasil questions into 8 WMS topics), how the responses can be coded using the WMS scoring rubric, and finally how these grades can be built into a school management index. Our PISA-based index covers over 15,000 schools across 65 countries, and our Prova Brasil-based index covers nearly all public schools in Brazil. These indices are well-validated and can be used by researchers interested in studying the role of management in education systems across a far wider range of countries and schools than was previously possible.1 All three indices, WMS, PISA, and Prova Brasil, show a strong, positive (within-country) correlation between school management and student learning outcomes, echoing recent causal evidence from randomized controlled trials in the U.S. [Fryer, 2014, 2017].

1 For example, see Wössmann [2016] for a review of education systems research using large, cross-country surveys.

Our second contribution is to develop a framework to explore why management matters for schools. We set out, in general terms, how the impact of school management can be decomposed into learning gains that arise because given actors (teachers, students and parents) become more productive, and learning gains that arise because different actors join the school. To explore why these incentive and selection effects might arise, we turn to a specific model that captures key features of education systems in Latin America. This model has two main building blocks. The first is the education production function: we assume that student learning depends on teacher ability, teacher effort, and household effort. The second is the impact of management practices where, considering the personnel policy restrictions the public sector faces, we distinguish between operations and people management. Good people management practices enable managers to observe and contract on the performance of their employees, as well as to cultivate the intrinsic motivation of their staff.
Good operations management practices enable managers to use resources efficiently and hence offer a higher level of teacher compensation and a more stimulating environment for students.2

2 This assumption echoes the observation made by Baker et al. [1988] that compensation plans featuring explicit financial rewards seldom account for all of a worker's rewards.

Our framework predicts that good people management practices increase expected test scores through two channels. A teacher with a given ability and intrinsic motivation to teach exerts more effort because these practices provide extrinsic, and cultivate intrinsic, incentives. Compounding this, good people management practices improve selection: a teacher with high ability and high intrinsic motivation prefers a school with performance pay over alternative employments because she anticipates that she will work hard and be rewarded for producing student learning. We focus on Latin American countries and find support for both mechanisms in our PISA data. Principals in schools with higher PISA-based people management scores (predominantly private schools) are less likely to report experiencing teacher shortages and also report higher levels of teacher motivation and effort, compared to principals in schools with lower PISA-based people management scores.

Our framework also predicts that good operations management practices increase expected test scores through two channels. There is no teacher incentive effect but the selection effect remains, now driven by the level rather than structure of compensation. This is reinforced by a household incentive effect that arises because strong operations management practices encourage both students and parents to increase their inputs. We also find evidence of these mechanisms in our PISA data for Latin America. Principals in public schools with higher PISA-based operations management scores are less likely to report experiencing teacher shortages and also report higher levels of teacher motivation, teacher effort and household effort, compared to principals in public schools with lower PISA-based operations management scores.

While this is not definitive causal evidence, this combination of theory and descriptive empirical analysis offers a novel insight into why management matters in schools and we therefore move on to consider policy implications. People management practices such as performance pay, while common in the private sector, may not be possible in public schools. But there would seem to be fewer barriers to conducting assessments to judge teacher effectiveness, and letting such appraisals lead to changes in public recognition, opportunities for professional development, likelihood of career advancement, and/or greater responsibilities. That is, these people management practices help to attract, develop and reward good performers, and, our analysis suggests, should improve both teacher selection and incentives. There is also substantial variation in the strength of operations management practices within the public sector. This suggests a role for government to encourage principals in public schools with weak operations management to follow best practices. Specific areas suggested by our analysis include processes that facilitate: personalization of learning; dialogue among staff, students and parents focused on continuous improvement; and collection and use of student assessment data.

Related literature.
Our first contribution—a new approach to measure management practices in schools—relates to two bodies of work. The first is the literature that has evolved since the creation of the WMS dataset first described in Bloom and Van Reenen [2007]. The WMS methodology has been adapted to a range of public sector institutions, including schools and universities [Bloom et al., 2015a, McCormack et al., 2014], healthcare facilities [Bloom et al., 2015b, 2019b], social programs [Delfgaauw et al., 2011, McConnell et al., 2009], and the civil service [Rasul and Rogger, 2016], as well as to low-income settings [Lemos and Scur, 2016]. However, it is expensive and time-consuming to implement at scale; our approach is a feasible alternative. The second is the literature studying the role of education systems and institutions in determining student performance across countries [Wössmann, 2016]. Many recent papers use PISA data and have looked at this issue through the lens of autonomy [Hanushek et al., 2013, Wössmann et al., 2007], competition [West and Wössmann, 2010], student tracking [Hanushek and Wössmann, 2006, Ruhose and Schwerdt, 2016], external exams [Wössmann, 2005], and instructional time [Lavy, 2015]. Our PISA-based index enables researchers to consider school management in such studies.

Our second contribution—a framework to explain why management matters in schools—relates to the literature in personnel economics exploring incentives and selection. These channels have featured in prior work seeking to explain the performance of private sector employees [Bender et al., 2018, Cornwell et al., 2019, Lazear, 2000], public sector employees [Finan et al., 2017, Prendergast, 2007] and politicians [Besley, 2004, 2006, Gagliarducci and Nannicini, 2013, Martinez-Bravo, 2014]. Most closely related is Lazear [2003], who emphasises the potential selection margin of teacher performance pay, albeit without fully working up a formal model.3 A selection margin also features in the dynamic occupational model of Rothstein [2015] and the Roy model of Biasi [2019]. We study a wider range of management practices (beyond just performance pay) and provide an intuitive decomposition of the impact of these practices on student learning into incentives and selection.

3 See also Dohmen and Falk [2010] who briefly sketch out theoretical reasons why fixed wage contracts and piece rates might be expected to have different impacts on sorting by ability.

The remainder of this paper is organized as follows. In Section 2, we set out our approach to measure management practices in schools, illustrating with the construction and validation of PISA-based and Prova Brasil-based management indices. In Section 3, we describe our theoretical framework, its testable predictions, a series of corroborative descriptive analyses from across Latin America, and the policy implications of these results. Section 4 concludes.

2 How to measure management in schools?

Until the early 2000s, management was typically viewed as an unmeasurable productivity shifter, to be relegated to the residual in any performance regression [Bloom and Van Reenen, 2007]. Since then, improvements in survey methodology and data access have allowed for advances in measurement. The current "state of the art" approach uses a dedicated survey—the World Management Survey (WMS)—to measure establishments' adoption of structured management best practices.
While the WMS offers uniquely rich information about management practices, it costs approximately USD 400 per interview and takes about 4 months to conduct a single country wave [Bloom et al., 2016]. In view of these costs, it may not be well-suited to every context. In this section, we propose an alternative three-step approach that can, in principle, be used with any existing public dataset containing information on management practices. The first step is to use the original WMS phone survey as a benchmark, and to look for questions in the public survey that elicit information on the management practices already measured by the WMS.4 The second step is to code answers in line with the WMS methodology. And the final step is to create a management index.

4 Our approach follows the spirit of the re-casting of the original phone-based World Management Survey into the US Census Management and Organizational Practices Survey (MOPS) administered to the population of US manufacturing establishments as a self-reported questionnaire [Bloom et al., 2019a]. The MOPS has been replicated in a number of other countries. Its questions follow the WMS topics and look to measure similar practices, but with self-reported answers.

In Section 2.1, we provide a brief overview of the WMS questions and coding. In Section 2.2, we describe our approach using two existing public datasets as examples: PISA and the Brazilian school census survey, Prova Brasil. Since Brazil and several other PISA countries are part of the Bloom et al. [2015a] sample, we can compare the (within-country) distribution of each index with the corresponding (within-country) distribution of the WMS index. Both indices are well-validated and can therefore be used by researchers interested in studying management across a wider range of countries and schools than was previously possible.

2.1 Overview of the World Management Survey methodology

The WMS was developed to measure adoption of structured management best practices in establishments across a range of countries and industries.5 The rigorous data collection is based on double-blind, semi-structured interviews conducted by highly-trained analysts and monitored by supervisors experienced in the survey methodology. Following its successful implementation in the private sector, the WMS was subsequently extended to public sector organizations [Bloom et al., 2015a, 2019b]; in this paper, we focus on the latter.

The public-sector WMS covers 20 topics across two main areas: operations management and people management. Broadly speaking, operations management in schools covers practices including: whether the school has standardization of instructional processes across classrooms while allowing for within-classroom personalization of learning; whether and how the school uses assessments and data; and whether and how the school sets and uses targets and keeps track of progress. People management covers practices in handling good and bad performance, measuring whether there is a systematic approach to identifying good and bad performance, rewarding school teachers proportionately, dealing with underperformers, and promoting and retaining good performers.

For each WMS topic, there is a scoring grid ranging from 1 to 5, which serves as a guide to evaluate answers to questions during the semi-structured interviews.
A score between 1 and 2 refers to a school with practically no structured management practices or very weak management practices implemented; a score between 2 and 3 refers to a school with some informal practices implemented, but these practices consist mostly of a reactive approach to managing the school; a score between 3 and 4 refers to a school where a good, formal management process is in place (though not yet consistent enough) and these practices consist mostly of a proactive approach to managing a school; and a score between 4 and 5 refers to well-defined, strong processes in place which are often seen as best practices in education. The overall management index, which measures the level of adoption of structured management best practices, is simply the average of the scores for these 20 topics.

5 See Bloom and Van Reenen [2007] for the survey's inception and Bloom et al. [2016] for a recent review.

The practices measured by the survey seem to matter: Bloom et al. [2015a] show that their school management score is strongly positively correlated with school-level student outcomes across 6 WMS countries (Brazil, Canada, India, Sweden, UK and US).6 They find a strong positive correlation for these countries: moving from the bottom to the top quartile of management is associated with a large increase in student learning outcomes, equivalent to approximately 0.4 standard deviations.

6 We replicate the primary figure from Bloom et al. [2015a] in Figure B.3 in Appendix B. It plots the school-level student learning outcomes by each quartile of school management score.

2.2 A new approach using existing public datasets

We now describe our approach, illustrating with the examples of PISA and Prova Brasil.7

7 We provide details to enable replication in Appendix C.

Construction. In 2012, alongside its famous student proficiency tests, PISA ran school principal surveys across 65 countries which included a wide range of questions on both operations and people management.8 As a first step, we classified each of the PISA questions that could fall under one of the WMS topics, identifying 53 PISA questions that fit into 14 of the WMS topics.9 As a second step, we manually assigned scores for each of these PISA questions following the spirit of the scoring grid of the WMS and the US Census Management and Organizational Practices Survey (MOPS). As a final step, we built the overall management index, and the operations and people management sub-indices, following Anderson [2008]. This methodology weights the impact of the included variables by the sum of their row in the inverse variance-covariance matrix, thereby assigning greater weight to questions that carry more "new information".10

8 Our main focus is on the 2012 data because that survey wave contains a richer set of questions, particularly relating to people management. See Appendix C for a mapping of the 2015 PISA data.

PISA data is excellent for cross-country analysis, but it precludes in-depth analyses within countries as the sample of schools per country is typically small and does not include the necessary identifiers. Many countries, however, conduct their own national detailed surveys with school principals, teachers, and students in addition to administering standardized tests across grades. Latin America is particularly prolific: for example, Brazil's Prova Brasil, Colombia's SABER, Chile's SIMCE, and Peru's ECE are all available to researchers. These questionnaires provide rich information about
practices at the school, as reported by a range of actors. In addition, the samples are usually large (often census-based) and contain school identifiers, thereby enabling researchers to explore heterogeneity and answer a wide range of policy-relevant questions. We illustrate how our approach can be applied widely to other national surveys using the example of Prova Brasil. This national survey plays a significant role in Brazil's education policy because its test results, along with promotion, dropout, and retention rates, are the main inputs to the Índice de Desenvolvimento da Educação Básica (IDEB), a national index representing educational quality at the school, municipality, and state levels. We followed the same steps to create a Prova Brasil-based management index: we classified 33 questions (14 from the principal questionnaire and 19 from the teacher questionnaire) into 8 WMS topics, coded responses following the same rubric, and used the Anderson [2008] method to build a school management index.

9 The set of WMS topics that had matching PISA questions is detailed in Appendix C.
10 We also built the indices using alternative methods (straightforward standardization, factor analysis, and factor analysis with Bartlett correction) which yielded similar results.

Potential concerns. One of the key differences between the WMS survey and PISA or school census surveys is that the WMS is administered and analyzed by an independent interviewer, while the latter surveys are self-reported. There are a number of issues with self-reported data: for example, problems with translation and interpretation, and/or measurement equivalence. To address measurement error arising from cross-cultural differences in understandings and norms when answering questions, we standardize our PISA-based management index within countries. This has an important implication: since all 65 countries have a mean score of zero, our index cannot be used to construct cross-country rankings of school management. Instead, the value of our PISA-based index lies in enabling academics and practitioners to study the (within-country) correlation between management and other variables for a far wider set of countries than was previously possible.11 This is not a concern for country-specific national surveys.

11 In the 2012 PISA the dataset included a PISA-built 'leadership and management' measure. This is distinctly different from ours, as it was based on a section of the questionnaire that was titled 'management' and contained only a small subset of questions. This index fails to take advantage of the full questionnaire and the information available elsewhere that also speaks to managerial practices used in the schools. More pertinently, PISA's measure does not compare well to the (empirically robust) management index derived from the World Management Survey (see Liberto et al. [2015]).

Another concern with self-reported data is that it is difficult to assess whether respondents are being accurate and truthful. In the WMS there are several strategies to elicit truthful information during the interview (such as always asking open-ended questions and asking for examples), but these are not available in self-reported questionnaires. We address this issue by focusing on the topics that have a direct equivalent in the WMS to allow for a clear benchmark for our new index.
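To make the aggregation step concrete, the snippet below sketches an Anderson [2008]-style inverse-covariance weighting of coded items into a summary index. It is a minimal illustration under assumed inputs (the file path and item names are hypothetical), not the exact code used to build the indices.

```python
# Minimal sketch of an Anderson (2008)-style inverse-covariance weighted index.
# Item names and the input file are hypothetical placeholders.
import numpy as np
import pandas as pd

def anderson_index(items: pd.DataFrame) -> pd.Series:
    """Inverse-covariance-weighted average of standardized items (one row per school)."""
    z = (items - items.mean()) / items.std()                    # standardize each coded item
    sigma_inv = np.linalg.inv(np.cov(z.values, rowvar=False))   # inverse covariance matrix
    weights = sigma_inv.sum(axis=1)                             # row sums give item weights
    weights = weights / weights.sum()                           # normalize to a weighted mean
    return pd.Series(z.values @ weights, index=items.index, name="mgmt_index")

# Hypothetical usage with school-level coded items on a WMS-style 1-5 scale
# df = pd.read_csv("coded_pisa_items.csv")
# df["mgmt_index"] = anderson_index(df[["monitoring", "target_setting", "people_rewards"]])
```

Because items that are highly correlated with the rest of the set receive smaller weights, the aggregate rewards questions that carry more "new information", in the sense described above.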
If principals are reporting "good" information in these surveys, allowing us to capture signals similar to the WMS scores, we should see similar distributions of scores across the common countries and a similar overall relationship between management and student test scores across countries. For PISA, we compare the distribution of scores and the performance correlations for the common countries as there are no school identifiers available. For Prova Brasil, we use school identifiers to match schools directly and hence provide a one-to-one comparison of the index standardized values.

Validation of new indices. As a first validation exercise, we compare the distribution of our PISA-based management index with the distribution of school management as measured by the WMS data in Figure 1 for all countries in which the WMS has collected data.12 The PISA and WMS distributions are reassuringly similar. The Kolmogorov-Smirnov test for equality of distributions rejects in only one of the 9 cases, Italy, where the PISA-based index is somewhat more dispersed.13

12 Independent researchers conducted the WMS in Colombia and Mexico during 2015, with guidance and supervision from the original WMS team. These data were not available to Bloom et al. [2015a]. India is not included in Figure 1 because it did not participate in PISA in 2012.
13 We show comparisons between the WMS and PISA sub-indices for operations management in Figure B.1 and people management in Figure B.2 in Appendix B and confirm that the distributions are consistent for both sub-indices.

As a second validation exercise, and to ensure we are picking up important variation with our management index, we conduct a basic check of the correlation between our measure and student performance. For each country we separate schools into quartiles of the management measure, and in Figure 2 we show, for each quartile, the average PISA test scores for math, reading and science (in deviations from the global mean). The graph includes all students and schools across the 65 countries available in the 2012 PISA dataset. This simple relationship suggests that students in schools in the bottom quartile of management within their country score, on average, about 6 points lower than the PISA global mean, while students in schools in the top quartile of management within their country score, on average, about 5.5 points higher than the PISA global mean. To put this into context, 41 PISA points in math are the equivalent of a year of learning. The range of our results mirrors how much, for example, the UK average science score changed between 2009 and 2015 (5 points), and how much the Brazilian average science score decreased over the same period (4 points).

Unlike PISA, the data in Prova Brasil includes school identifiers that allow for a one-to-one match with the schools surveyed in the 2013 WMS wave.14 In total, we have 262 matched schools in the public sector. We use this matched sample in Figure 3 where we show a school-level binned scatter plot of WMS management score against the Prova Brasil-based management score. There is a positive and significant correlation of 0.19, suggesting reasonable internal validation of the Prova Brasil index. As with the PISA index, we repeat the exercise of correlating the new index with student performance in secondary schools and find the same pattern (see Figure 4).

In Table 1 we formalize these relationships by reporting the average correlations between student learning and our management indices.
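The two validation exercises described above can be sketched roughly as follows. The data files and column names are hypothetical, and the snippet is illustrative rather than a reproduction of the authors' pipeline.

```python
# Illustrative validation checks, assuming hypothetical files with columns
# "country", "management" (WMS), "mgmt_index" (PISA-based), and "pisa_math".
import pandas as pd
from scipy.stats import ks_2samp

wms = pd.read_csv("wms_schools.csv")    # hypothetical WMS school-level scores
pisa = pd.read_csv("pisa_schools.csv")  # hypothetical PISA-based index and test scores

# 1) Distributional comparison (Figure 1 style): two-sample KS test, country by country
for country in sorted(set(wms["country"]) & set(pisa["country"])):
    w = wms.loc[wms["country"] == country, "management"]
    p = pisa.loc[pisa["country"] == country, "mgmt_index"]
    w = (w - w.mean()) / w.std()        # compare standardized shapes
    p = (p - p.mean()) / p.std()
    stat, pval = ks_2samp(w, p)
    print(f"{country}: KS statistic={stat:.3f}, p-value={pval:.3f}")

# 2) Quartile check (Figure 2 style): average math score deviation by
#    within-country management quartile
pisa["mgmt_quartile"] = pisa.groupby("country")["mgmt_index"].transform(
    lambda s: pd.qcut(s, 4, labels=[1, 2, 3, 4])
)
global_mean = pisa["pisa_math"].mean()
print(pisa.groupby("mgmt_quartile")["pisa_math"].mean() - global_mean)
```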
For the student-level PISA dataset, we run OLS regressions via the OECD's repest Stata command, which uses the five available test score plausible values for each student and subject. We report the standard errors in parentheses and p-values in square brackets. The standard errors are clustered at the school level and use the appropriate survey weights.15 In the PISA specifications we include country fixed effects, and successively introduce school controls (dummies for school location, student-teacher ratio, log of the number of students, share of government funding relative to total funding the school receives, and ratio of computers connected to the web used as a proxy for school resources) and then student controls (gender, grade, socio-economic status and immigration status). All panels use the same sample but have different subject outcome variables. The R-squared for each set of regressions is reported within each panel, while the common sample characteristics and controls included are reported at the bottom of the table.

Column (1) shows the raw relationship between the PISA-based school management index and student performance, only controlling for country fixed effects. The raw relationship ranges from just over 4 to almost 5 points on the PISA scale.16 Recall that 41 points on the PISA scale for math is equivalent to about one year of learning, and thus the raw correlation is equivalent to about one month of learning for math (similar for the other subjects). Column (2) includes school controls, which absorb little of the variation, and Column (3) shows the fully-specified regression including student controls. These controls account for a further point of the estimated relationship with student performance. While we refrain from ranking management indices across countries, Figure 5 plots the coefficients of country-level regressions of PISA math test scores on management using the specification of Column (1). The estimation loses precision once we restrict to individual country samples but still broadly supports the positive relationship found in Table 1.

14 Prova Brasil was also administered in 2013.
15 See Jerrim et al. [2017] for a thorough review of how to best use PISA scores and survey weights.
16 PISA is standardized across years and countries such that the mean is 500 and the standard deviation is 100.

For the Prova Brasil student-level dataset, we run standard OLS regressions, also clustering the standard errors at the school level. Results are reported in standard deviations. In these Prova Brasil specifications, we include state fixed effects, and successively introduce school controls (student-teacher ratio, log of the number of students, dummy variables indicating the presence of an IT lab, science lab, and library, a dummy for male principals, dummies for educational attainment, and dummies for experience as principal), and then student controls (gender, race, socio-economic status, and mothers' educational attainment). Column (4) shows the raw relationship between the Prova Brasil-based school management index and student performance, only controlling for state fixed effects. One standard deviation higher score in the management index is strongly correlated with 0.068 standard deviations higher Portuguese scores and 0.078 standard deviations higher math scores. Column (5) shows that including school characteristics absorbs very little of the variation, while Column (6) shows that including student characteristics absorbs only slightly more.
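As a rough illustration of the Prova Brasil-style specification just described (student-level OLS with state fixed effects and standard errors clustered at the school level), the sketch below uses hypothetical variable names. The PISA regressions in the paper additionally rely on the repest routine, the five plausible values, and survey weights, none of which are reproduced here.

```python
# Illustrative Prova Brasil-style regression, assuming a hypothetical student-level file
# with columns: math_score_std, mgmt_index, state, school_id, plus a few controls.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("prova_brasil_students.csv")  # hypothetical file

# Standardized math score on the management index, with state fixed effects and a
# subset of school and student controls; standard errors clustered at the school level.
model = smf.ols(
    "math_score_std ~ mgmt_index + C(state)"
    " + student_teacher_ratio + log_enrollment"
    " + female + mother_education",
    data=df,
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(result.summary())
```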
The fully specified regression supports the general positive relationship between the school management index and student performance.

3 Why does management matter in schools?

There are myriad uses of our new indices. In this paper, we push the frontier of understanding the mechanisms behind the management and performance relationship by focusing on teachers. Our aim is not to provide a theoretical contribution per se, but rather to formalize a policy discussion around teacher incentive and selection mechanisms and their relationship to management practices and student performance. We take wider system-level factors—in particular hiring and firing autonomy, admissions autonomy and competition between schools—as given and assume that teachers and students make choices within the confines of this environment.

Real-world education systems are diverse, and in particular the dynamics of the public and private sector—and the type of private sector offerings—are different across countries. In some contexts, private schools target affluent households, and jobs in private schools are often seen as more attractive than public sector jobs, typically providing some form of performance-based compensation. In other contexts, there has been a growth of 'low-cost private schools' that deliberately cater for the lower end of the income distribution and, in these settings, jobs in the public sector typically confer significant rents relative to the private sector.

In view of this diversity, we focus our model and empirical test on one particular regional system: Latin America. We choose Latin America because its education systems are reasonably homogeneous across countries in terms of the character of public and private schools. Specifically, the private school system caters to the middle (and upper) classes and accounts for about one-fifth of high school students. Private schools tend to be better funded (via costly school fees) and in turn pay higher teacher salaries and offer better facilities. Public schools, on the other hand, are often poorly funded and operate in highly centralized environments. Teacher pay is set on a rigid scale dictated by strong unions. Focusing on a region that has such systems makes the applied theory exercise substantially less complex, and the prevalence of large-scale national surveys in the region opens many possibilities for future empirical work.

Table 2 reports the correlation exercise in Columns (1) to (3) of Table 1 for Latin American countries only, and confirms that the relationship between management and student performance is strong in the region. Further, the coefficient on private schools indicates that students in private schools achieve higher scores, by about 55 points, than students in public schools. This affords a suitable empirical environment to study the channels of interest in this section.

3.1 Overview of the theoretical framework

The analysis is built around a student-level education production function. A common, general formulation is y = A(L, K) + ε, where y is a measure of student learning, L and K are respectively labour and physical capital inputs into the student's education, A is a (school-specific) productivity parameter, and ε is an error term. Here, we specialize to y = θe + a + ε, where θ is teacher ability, e is teacher effort, and a is household (student and/or parent) effort.
That is, we enrich the specification of labour to allow for (additively separable) teacher and household inputs but abstract from the role of physical capital and school-level productivity.17

17 In principle, parents could play a further role by selecting between schools. Since our PISA data cannot speak to this issue, we leave the analysis of management-induced household selection for future work. Note that we assume management practices change effective labour inputs. In their study of the IT industry, Schivardi and Schmitz [2019] assume that management is an additional input, alongside capital and labour, in an approach that they term "management as a production technology".

Using a theoretical framework built around this education production function, we show that school management structures can impact student learning outcomes via the following three channels:

1. Teacher selection: schools with high management scores offer compensation packages that select in more able (higher θ) and more intrinsically motivated (lower effort cost) teacher types.

2. Teacher incentives: schools with high management scores offer compensation packages that extrinsically incentivize, and adopt practices that intrinsically motivate, more effort from any given teacher type that selects in.

3. Household incentives: schools with higher management scores institutionalize a strong work ethic and culture of high achievement among students and encourage greater parental involvement within the school (higher a).

In Section 3.2, we present a simple model that suffices to make the points above. In Section 3.3 and Section 3.4 we explain, intuitively, how the above selection and incentive effects are driven by people management and operations management. Then, in Section 3.5 we draw together these results and discuss implications for policy. We also briefly comment on how predictions would change in an alternative model featuring 'low-cost private schools'.

3.2 The model

We focus on a teacher who must decide whether to accept a job offer in her assigned public school, or decline it and apply to a private school or the outside sector.

Preferences. The teacher is risk neutral and cares about her compensation w and effort e. When working in the education sector, the teacher's preferences are w − (e² − ce). The parameter c captures her intrinsic motivation. This is because for e < c/2 she derives a marginal benefit from exerting an extra unit of effort in teaching; it is only when e > c/2 that effort costs kick in. We assume that c = τ + ∆. The first component τ denotes the teacher's baseline intrinsic motivation. This can be thought of as the realization of a random variable with density function f. The teacher observes this realization perfectly, while (at the time of hiring) employers observe nothing. The second component ∆ is a motivational increment that, as we describe below, is determined by the people management practices in the teacher's chosen school. When working in the other sector, the teacher's preferences are simply w − e²; intrinsic motivation plays no role.

We abstract from differences within classes and focus on a representative household (student plus parents). This household cares only about its effort level a, and has preferences −(a² − γa). The parameter γ is a motivational increment that is also determined by management practices.

Performance metrics.
Let y₁ denote a representative student's learning outcome in a school that hires the teacher, and y₀ denote a representative student's learning outcome in a school that does not hire the teacher. To the extent that teachers contribute to learning, one would expect y₁ > y₀. We capture this in a simple way by assuming y₁ = θe + a + ε and y₀ = a + ε. If the teacher is not hired by a school but instead chooses to work in the outside sector, her performance is z = θe + ε. The component θ denotes the teacher's ability. This can be thought of as the realization of a random variable with density function g, and is drawn independently of τ. The teacher observes this realization perfectly, while (at the time of hiring) employers observe nothing. Draws of the error term ε are independent across employments. We assume throughout that ε is mean zero and distributed uniformly on $[\underline{\varepsilon}, \bar{\varepsilon}]$. At times, for the purposes of illustration, we also assume a specific (uniform) distribution for θ, as part of a numerical example that we discuss at the end of this section.18

18 Note that the basic production function is the same across schools; management practices matter by affecting which θ-types are hired, the teacher's choice of e (which depends on which τ-types are hired), and the household's choice of a.

Compensation schemes. Schools offer either a performance-pay contract or a fixed wage contract. Under the former, the teacher receives a base wage of W plus a bonus B if her performance exceeds a threshold $\bar{y}$. Under the latter, the teacher simply receives a base wage of G. The outside sector offers a performance-pay contract with a low base wage (normalized to zero) and a bonus β if performance exceeds a threshold $\bar{z}$.

The impact of management practices. We assume that people management has two effects. The first relates to the structure of compensation: good people management practices enable managers to observe, and contract on, the performance of their employees—i.e. to offer a performance-pay contract. The second relates to teacher motivation: good people management practices enable managers to cultivate the intrinsic motivation of their staff—i.e. to increase ∆. We assume that operations management also has two effects. The first relates to the level of compensation: good operations management practices free up resources and enable managers to offer a higher level of base pay. The second relates to household effort: good operations management practices help to create a stimulating environment for students and parents—i.e. to increase γ.

We classify schools into three management types: high (strong people and strong operations management), intermediate (weak people but strong operations management), and low (weak people and weak operations management). Performance metrics are indexed accordingly by i = H, I, L. We assume that high management schools are found exclusively in the private sector, while the public sector consists of a mix of intermediate and low management schools. This implies that performance-pay contracts are only offered by private schools (and the outside sector). Figure 6 provides evidence that this key assumption is well-supported in our PISA data. Here, we plot empirical cumulative distribution functions (CDFs) of the PISA-based people management score by sector and find that the private sector CDF (dashed blue plot) clearly first order stochastically dominates the public sector CDF (solid red plot).19

Timing. The timing of the game is as follows.
19 We show empirical CDFs by country in Latin America in Figure B.4 in Appendix B. The private sector CDF first order stochastically dominates the public sector CDF in Colombia, Peru, Costa Rica and Mexico.

1. Nature chooses the teacher's two-dimensional type. This realization (τ, θ) is observed by the teacher but not by employers.

2. Employers announce management structures and compensation schemes.

3. The teacher is assigned (by government) to a public school and decides whether to accept this post or decline it and apply either to a private school or the outside sector.20

4. Having made an occupational choice, the teacher chooses an effort level. Simultaneously, if the teacher is in the education sector, households choose effort levels.

5. A performance metric is realized. The teacher is rewarded in accordance with the compensation scheme announced at Stage 2.

20 In assuming this timing, we abstract from applicant choice between schools in the public sector. As Tables B.4 and B.5 in Appendix B show, the degree to which teachers can choose among public schools varies across Latin America, yet our model generally fits the reality in these countries.

Numerical example. At times in the analysis below, we will invoke specific distributional and parameter assumptions. In this numerical example, teacher intrinsic motivation is distributed τ ∼ U[0, 10], and teacher ability is distributed θ ∼ U[1, 5]. These random variables are independent of each other and the error term in the production functions. In a high management private school: teacher pay is W + B = 55 if y₁^H ≥ 4.5 and W = 15 otherwise, and the motivational increments are ∆ = 0.5 and γ = 2. In an intermediate management public school: teacher pay is G = 35, and the motivational increments are ∆ = 0 and γ = 2. And in a low management public school: teacher pay is G = 30, and the motivational increments are ∆ = 0 and γ = 1. Pay in the outside sector is β = 50 if z ≥ 1 and 0 otherwise.

Our interest lies in establishing the impact of management practices on student learning via teacher occupational choice and effort level, and household effort level. We do not model the government's assignment rule, or the school principal's choice of management structure, simply treating these as exogenous parameters. The model is straightforward to solve (see Appendix A for details) and yields the insights summarized in the next two sections.

3.3 The impact of good people management

In this subsection, we use the theoretical framework to give a possible explanation for why schools with good people management may produce better student outcomes. In Section 3.3.1, we decompose the test score gain from people management into two effects: teacher selection and teacher incentives. If this decomposition is correct, then we should see evidence of these mechanisms in intermediate school outcomes. We develop this argument, and present corroborative evidence from our PISA dataset, in Section 3.3.2.

3.3.1 Decomposing the test score gain into teacher selection and incentives

A school of management type i hires the teacher if, given her (τ, θ) type, she expects to receive a higher payoff teaching in this school compared to other schools or working in the outside sector. Let the set of (τ, θ) types hired by a school of management type i be denoted by T^i. The expected learning outcome of a representative student (i.e.
ex ante, prior to occupational and effort choices) can therefore be written as

\[
E\left[y^i\right] = E\left[y_1^i \cdot \mathbf{1}\{(\tau,\theta)\in T^i\}\right] + E\left[y_0^i \cdot \mathbf{1}\{(\tau,\theta)\notin T^i\}\right],
\]

where $\mathbf{1}\{(\tau,\theta)\in T^i\}$ and $\mathbf{1}\{(\tau,\theta)\notin T^i\}$ are indicator functions for the hiring and not hiring events. In keeping with the empirical application, we will refer to $E[y^i]$ as the expected test score in school i.

The difference in expected test score across high and intermediate management schools—that is, the impact of people management holding operations management constant—can be written as

\[
E\left[y^H\right] - E\left[y^I\right] = E\left[y_1^H \cdot \mathbf{1}\{(\tau,\theta)\in T^H\}\right] - E\left[y_1^I \cdot \mathbf{1}\{(\tau,\theta)\in T^I\}\right],
\]

where the equality follows from the fact that people management only impacts test scores when the teacher is hired (the effect of household effort and the error term difference out). It is helpful to decompose this difference as follows

\[
E\left[y^H\right] - E\left[y^I\right] =
\underbrace{E\left[(y_1^H - y_1^I) \cdot \mathbf{1}\{(\tau,\theta)\in T^H\}\right]}_{\text{teacher incentives}}
+ \underbrace{E\left[y_1^I \cdot \left(\mathbf{1}\{(\tau,\theta)\in T^H\} - \mathbf{1}\{(\tau,\theta)\in T^I\}\right)\right]}_{\text{teacher selection}}. \tag{1}
\]

The first term on the RHS of equation (1) captures what we will term the teacher incentive effect of good people management practices. Here, we compare the expected test score outcome in a high management private school with a teacher in the event that the teacher is hired to such a school against the expected test score outcome in an intermediate management public school with a teacher in the counterfactual event that the teacher is hired to a high management private school. In this way, we hold the set of (τ, θ) types fixed and just consider how the incentive environment produces test scores.

In Lemma 1 in Appendix A, we derive teacher effort in high and intermediate management schools. Respectively, these are $e^H = \frac{\theta B}{2(\bar{\varepsilon}-\underline{\varepsilon})} + \frac{\tau+\Delta}{2}$ and $e^I = \frac{\tau}{2}$. Substituting, we can write the first incentive term in (1) as

\[
E\left[(y_1^H - y_1^I) \cdot \mathbf{1}\{(\tau,\theta)\in T^H\}\right]
= \int\!\!\int \theta \Big( \underbrace{\tfrac{\theta B}{2(\bar{\varepsilon}-\underline{\varepsilon})}}_{\text{extrinsic}} + \underbrace{\tfrac{\Delta}{2}}_{\text{intrinsic}} \Big) \cdot \mathbf{1}\{(\tau,\theta)\in T^H\}\, f(\tau)\,g(\theta)\, d\theta\, d\tau. \tag{2}
\]

We see from this expression that there are two teacher incentive channels. Part of the reason that test scores are higher in schools with good people management practices is because any given (τ, θ) type exerts more effort due to: (i) an extrinsic incentive from the bonus B, and (ii) additional intrinsic motivation arising via the shift term ∆.

The second term in equation (1) captures what we will term the teacher selection effect of good people management practices. Here, we compare the expected test score outcome in an intermediate management public school with a teacher in the event that the teacher is hired to such a school against the expected test score outcome in an intermediate management school with a teacher in the counterfactual event that the teacher is hired to a high management school. In this way, we hold the incentive environment fixed and just consider how the selection of (τ, θ) types produces test scores. Substituting for e^I, we can write this second selection term as

\[
E\left[y_1^I \cdot \left(\mathbf{1}\{(\tau,\theta)\in T^H\} - \mathbf{1}\{(\tau,\theta)\in T^I\}\right)\right]
= \int\!\!\int \underbrace{\theta}_{\text{ability}} \cdot \underbrace{\tfrac{\tau}{2}}_{\text{effort}} \cdot \left(\mathbf{1}\{(\tau,\theta)\in T^H\} - \mathbf{1}\{(\tau,\theta)\in T^I\}\right) f(\tau)\,g(\theta)\, d\theta\, d\tau. \tag{3}
\]

We see from this expression that there are also two selection channels. A further part of the reason that test scores are higher in schools with good people management practices is because: (i) the τ-types selected in are intrinsically motivated to exert more effort, and (ii) the θ-types selected in are of greater ability. To see this, consider the numerical example illustrated in Figure 7.
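Before turning to the figure, the numerical example can also be explored computationally. The sketch below is illustrative only: the support of ε is not stated in the text, so ε ∼ U[−10, 10] is assumed; the teacher is assumed simply to choose whichever option (assigned public school, private school, or outside sector) yields the highest expected payoff; and optimal effort is found by grid search rather than by the closed forms of Lemma 1, so corner solutions in the bonus probability are handled automatically.

```python
# Computational sketch of the numerical example (tau ~ U[0,10], theta ~ U[1,5],
# private pay 55 if y1 >= 4.5 and 15 otherwise, public pay 35 or 30, outside pay
# 50 if z >= 1, Delta and gamma as stated in the text). The epsilon support and the
# "pick the highest expected payoff" occupational choice are assumptions.
import numpy as np

EPS_LO, EPS_HI = -10.0, 10.0                 # assumed support of epsilon
EFFORT_GRID = np.linspace(0.0, 12.0, 1201)   # candidate effort levels

def p_bonus(mean_perf, threshold):
    """Probability that performance (mean_perf + eps) clears the bonus threshold."""
    return np.clip((mean_perf + EPS_HI - threshold) / (EPS_HI - EPS_LO), 0.0, 1.0)

def best_utility(tau, theta, option):
    """Maximized expected utility of a (tau, theta) teacher in a given employment."""
    e = EFFORT_GRID
    if option == "H":    # private: base 15, bonus 40 if y1 >= 4.5, Delta = 0.5, a = 1
        c = tau + 0.5
        u = 15 + 40 * p_bonus(theta * e + 1.0, 4.5) - (e**2 - c * e)
    elif option == "I":  # intermediate public: fixed wage 35, Delta = 0
        u = 35 - (e**2 - tau * e)
    elif option == "L":  # low public: fixed wage 30, Delta = 0
        u = 30 - (e**2 - tau * e)
    else:                # outside sector: bonus 50 if z >= 1, no intrinsic motivation
        u = 50 * p_bonus(theta * e, 1.0) - e**2
    return u.max()

def acceptance_stats(assigned):
    """Share and average type of teachers accepting their assigned public school."""
    records = []
    for tau in np.linspace(0, 10, 101):
        for theta in np.linspace(1, 5, 81):
            u_pub = best_utility(tau, theta, assigned)
            u_alt = max(best_utility(tau, theta, "H"), best_utility(tau, theta, "O"))
            records.append((tau, theta, float(u_pub >= u_alt)))
    records = np.array(records)
    kept = records[records[:, 2] == 1.0]
    return kept.shape[0] / records.shape[0], kept[:, 0].mean(), kept[:, 1].mean()

for assigned in ("I", "L"):
    share, mean_tau, mean_theta = acceptance_stats(assigned)
    print(f"assigned {assigned}: accept share={share:.2f}, "
          f"mean tau={mean_tau:.2f}, mean theta={mean_theta:.2f}")
```

The script reports, for each possible public school assignment, the share of (τ, θ) types that accept the public post and their average motivation and ability, which can be compared with the hired sets depicted in Figure 7.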
In the top panel, the grey shaded area depicts the set of (τ, θ) types that are hired by a high management private school. The unshaded area depicts the set of (τ, θ) types that are hired by an intermediate management public school. It is clear that the intermediate management public school experiences negative selection on both dimensions. More able teachers prefer the performance-contingent compensation schemes available either in private schools or the outside sector. And more intrinsically motivated teachers prefer private schools because they anticipate exerting higher effort (and hence higher pay).

3.3.2 Predictions for intermediate school outcomes and evidence from PISA

Our theoretical framework suggests two mechanisms, teacher selection and teacher incentives, that could explain the positive correlation between people management scores and student learning outcomes apparent in the WMS, PISA and Prova Brasil data. If these mechanisms are correct, then we should see behavioural responses in intermediate school outcomes. In this section, we set out these predictions and then explore empirically whether they hold in our PISA data for Latin America.21

21 These predictions are based on the numerical example illustrated in Figure 7 and are derived (via numerical integration) in Remarks 1 and 2 in Appendix A. We present the main results with our preferred specification, but include additional variations of the main explanatory variable in Table B.3 in Appendix B. The results are robust to alternative specifications.

Teacher shortages. The probability of hiring the teacher in a high management private school is higher than the probability of hiring the teacher in an intermediate management public school (via teacher selection). In the numerical example shown in the top panel of Figure 7, the area of the grey region is bigger than the area of the unshaded region.

The PISA dataset does not contain objective information on school-level vacancies, so we use a series of 4 questions in the school principal questionnaire that ask the principal whether he/she feels that the school's capacity is hindered by a lack of qualified teachers in each of math, science, language and 'other subjects'.22 It is worth emphasising that these questions are open to considerable interpretation. For instance, a principal might answer 'a lot' because he/she feels that the school needs more new posts even if there are few vacancies for existing posts. Conversely, he/she might answer 'not at all' because of a belief (or desire to say) that the school is coping despite there being vacancies.23 With this caveat in mind, it can still be instructive to examine the data. Column (1) of Table 3 shows that, consistent with the theory, the teacher shortage index is 0.535 standard deviations lower among private schools than public schools, significant at the 1 percent level. Column (4) repeats the specification but with the people management index instead of the private school dummy. The relationship is consistently negative, suggesting that a one standard deviation increase in the people management index is correlated with a 0.062 standard deviation lower teacher shortage index, marginally significant at the 10 percent level.

Teacher motivation. The expected intrinsic motivation of a teacher hired to a high management private school is higher than the expected intrinsic motivation of a teacher hired to an intermediate management public school (via teacher selection and augmentation of teacher intrinsic motivation).
In the numerical example in the top panel of Figure 7, the vertical height of the black point is greater than the vertical height of the blue point.

We explore this prediction by using the school climate section of the school principal questionnaire (questions relating to the perception of teachers' expectations of their students and meeting student needs, as well as the morale, enthusiasm, pride and valuation of academic achievement) to construct an index of teacher motivation. Column (2) of Table 3 shows that, consistent with the theory, the teacher motivation index is 0.591 standard deviations higher among private schools than public schools, significant at the 1 percent level. Using the people management index also yields a consistent relationship, shown in Column (5). The coefficient suggests a one standard deviation higher people management score is associated with a 0.238 standard deviation higher teacher motivation score, also significant at the 1 percent level.

22 We describe how this teacher shortage index, and the other intermediate outcome indices for teacher motivation, teacher effort, and household involvement, are constructed in Appendix C. All indices are standardized.
23 Consistent with this, by far the most common answer given by principals in both sectors is 'not at all', as reflected in the density of the standardized score in Figure B.5 in Appendix B.

Teacher effort. The expected effort level of a teacher hired to a high management private school is higher than the expected effort level of a teacher hired to an intermediate management public school (via teacher selection, extrinsic teacher incentives, and augmentation of teacher intrinsic motivation on-the-job):

\[
E\left[\tfrac{\theta B}{2(\bar{\varepsilon}-\underline{\varepsilon})} + \tfrac{\tau+\Delta}{2} \,\Big|\, (\tau,\theta)\in T^H\right] > E\left[\tfrac{\tau}{2} \,\Big|\, (\tau,\theta)\in T^I\right].
\]

We explore this prediction by using the school climate section of the school principal questionnaire (questions relating to how often teachers are absent, late and/or unprepared) to construct an index of teacher effort. Column (3) of Table 3 shows that, consistent with the theory, the teacher effort index is 0.792 standard deviations higher among private schools than public schools, significant at the 1 percent level. Column (6) reports the same specification for the people management index, suggesting that a one standard deviation increase in the management score is associated with a 0.074 standard deviation higher teacher effort index, significant at the 5 percent level. For completeness, Figure 8 plots the coefficients for country-level regressions using the same specifications reported in Columns (4) to (6) of Table 3. While there is some variation across countries, the results broadly hold.

3.4 The impact of good operations management

In this subsection, we use the theoretical framework to give a possible explanation for why schools with good operations management may produce better student outcomes. In Section 3.4.1 we decompose the test score gain from operations management into teacher selection, teacher incentives, and household incentives; in Section 3.4.2 we set out the resulting predictions for intermediate school outcomes and present corroborative evidence from our PISA dataset.
3.4.1 Decomposing the test score gain into teacher selection, teacher incentives, and household incentives

The difference in expected test scores across intermediate and low management public schools—that is, the impact of operations management holding people management constant—is

\[
E\left[y^I\right] - E\left[y^L\right]
= E\left[y_1^I \cdot \mathbf{1}\{(\tau,\theta)\in T^I\}\right] - E\left[y_1^L \cdot \mathbf{1}\{(\tau,\theta)\in T^L\}\right]
+ E\left[y_0^I \cdot \mathbf{1}\{(\tau,\theta)\notin T^I\}\right] - E\left[y_0^L \cdot \mathbf{1}\{(\tau,\theta)\notin T^L\}\right].
\]

Letting a^I and a^L respectively denote household effort in these schools, and using the same decomposition as before, we can rewrite this difference as

\[
E\left[y^I\right] - E\left[y^L\right]
= \underbrace{E\left[y_1^L \cdot \left(\mathbf{1}\{(\tau,\theta)\in T^I\} - \mathbf{1}\{(\tau,\theta)\in T^L\}\right)\right]}_{\text{teacher selection}}
+ \underbrace{\left(a^I - a^L\right)}_{\text{household incentives}}. \tag{4}
\]

There is no teacher incentive term because both extrinsic teacher incentives and augmentation of teacher intrinsic motivation depend on people management and this is assumed to be constant across these schools. We can write the teacher selection term as:

\[
E\left[y_1^L \cdot \left(\mathbf{1}\{(\tau,\theta)\in T^I\} - \mathbf{1}\{(\tau,\theta)\in T^L\}\right)\right]
= \int\!\!\int \underbrace{\theta}_{\text{ability}} \cdot \underbrace{\tfrac{\tau}{2}}_{\text{effort}} \cdot \left(\mathbf{1}\{(\tau,\theta)\in T^I\} - \mathbf{1}\{(\tau,\theta)\in T^L\}\right) f(\tau)\,g(\theta)\, d\theta\, d\tau. \tag{5}
\]

Again, there are two teacher selection channels. Part of the reason that test scores are higher in schools with good operations management practices is because: (i) the τ-types selected in are intrinsically motivated to exert more effort, and (ii) the θ-types selected in are of greater ability. To see this, consider the numerical example illustrated in Figure 7. The unshaded area in the top panel depicts the set of (τ, θ) types that are hired by an intermediate management public school, while the unshaded area in the bottom panel depicts the set of (τ, θ) types that are hired by a low management public school. It is clear that the intermediate management public school hires both more and better types.

In contrast to the people management case, there is a channel operating via household incentives. A further part of the reason that test scores are higher in schools with good operations management practices is because households (students plus parents) exert more effort due to additional intrinsic motivation arising via the shift term γ.

3.4.2 Predictions for intermediate school outcomes and evidence from PISA

Again, we set out predictions relating to intermediate school outcomes (see Remark 2 in Appendix A), and then explore empirically whether they hold in our PISA data.24

Teacher shortages. The probability of hiring the teacher in an intermediate management public school is higher than the probability of hiring the teacher in a low management public school (via teacher selection). In the numerical example shown in Figure 7, the unshaded area is larger in the top panel relative to the bottom panel. We take this prediction to the data using the 4 questions in the PISA school principal questionnaire that ask the principal whether they feel that the school's capacity is hindered by a lack of qualified teachers (see Appendix C). Column (1) of Table 4 shows a negative and statistically significant correlation between the operations management index and the teacher shortage index. A one standard deviation increase in operations score is associated with a 0.076 standard deviation decrease in the teacher shortage index, significant at the 10 percent level.

Teacher motivation.
Teacher motivation. The expected intrinsic motivation of a teacher hired into an intermediate management public school is higher than the expected intrinsic motivation of a teacher hired into a low management public school (via teacher selection). In the numerical example shown in Figure 7, the vertical height of the blue point in the top panel is greater than the vertical height of the orange point in the bottom panel. Column (2) of Table 4 shows that, consistent with the theory, the partial effect of the operations score on the teacher motivation index is positive: a one standard deviation increase in the operations score is associated with a 0.238 standard deviation increase in the teacher motivation index, significant at the 1 percent level.

Teacher effort. The expected effort level of a teacher hired into an intermediate management public school is higher than the expected effort level of a teacher hired into a low management public school (via teacher selection):
\[
E\!\left[\frac{\tau}{2}\,\middle|\,(\tau,\theta)\in T^{I}\right]
> E\!\left[\frac{\tau}{2}\,\middle|\,(\tau,\theta)\in T^{L}\right].
\]
Column (3) of Table 4 shows that a one standard deviation higher operations score is correlated with a 0.076 standard deviation higher teacher effort, significant at the 5 percent level.

Household effort. The expected level of household effort in an intermediate management public school is higher than the expected level of household effort in a low management public school (via augmentation of student intrinsic motivation): $a^{I} > a^{L}$. We explore this prediction by using the school climate section of the school principal questionnaire to construct an index of household effort, combining student behaviour questions (relating to how often students are truant, late, disrespectful and/or disruptive) and parental involvement questions (relating to the extent to which parents: are interested in, and discuss, their child's progress and behaviour; volunteer for school activities; and participate in school governance or other forms of accountability). Column (4) of Table 4 shows that, consistent with the theory, a one standard deviation higher operations management score is correlated with a 0.160 standard deviation higher household effort, significant at the 1 percent level. For completeness, Figure 9 plots the coefficients from country-level regressions using the same specifications as Columns (1) to (4) of Table 4. Again, the results broadly hold at the country level.

3.5 Summary of theoretical predictions

We developed a simple theoretical framework, built around a student-level education production function, to explore why management practices might matter in schools. Using this framework, we showed that people management practices may be contributing to higher student test scores through two channels: teacher selection and teacher incentives. The predictions that these channels imply for intermediate school outcomes—fewer teacher shortages, higher teacher motivation, and higher teacher effort in schools with strong people management than in schools with weak people management—are all well supported in our PISA data for Latin America. We also showed that operations management practices may be contributing to higher student test scores via two channels: teacher selection and household incentives.
The empirical support for the predictions that these channels imply for intermediate school outcomes—fewer teacher shortages, higher teacher motivation, higher teacher effort, and higher household effort in schools with strong operations management than in schools with weak operations management—is also strong. While this does not represent definitive causal evidence, this combination of theory and descriptive empirical analysis offers insight into why management appears to matter in schools, and so we cautiously move on to policy. For example, what types of school management practices might be changed to drive improved student learning? Our analysis suggests that people management is a good place to start. While it may not be feasible (on political or budgetary grounds) for governments to introduce performance pay in public schools, it may be possible to conduct assessments to judge teacher effectiveness, and for these appraisals to lead to public recognition, opportunities for professional development, a greater likelihood of career advancement, and greater responsibilities and leadership. Such practices reward and develop good performers, create a good employee value proposition, and could improve both teacher selection and teacher incentives.

Beyond the difference in people management across the private and public sectors, a striking feature of the PISA data is that there is substantial variation in the strength of operations management practices within the public sector. Some public schools are adopting management practices that appear to be driving student learning, while others within the same public education system are not. This suggests a role for government in encouraging schools with weak operations management to follow best practice. As we observe from our mapping exercise, 'strong' operations management practices do two things: they actively promote quality of delivery in the classroom (e.g. via personalization of learning, and encouragement to follow best educational practice); and they put processes in place to review school performance and drive change (e.g. dialogue and meetings focused on continuous improvement, and the collection and use of student assessment data). Such practices could be adopted more widely in the public sector and, our analysis suggests, should improve both teacher selection and household incentives.

To be sure, our goal has not been to produce a global theory; given the diversity of real-world education systems, we have deliberately focused on one region, Latin America, and developed a theoretical framework for that context. One could adapt this framework to study different settings, for instance South Asia and parts of East Africa, where there is a preponderance of 'low-cost private schools'. Such a model would need to allow for the possibility of 'queues' for jobs in public schools and, as a result, to explicitly model demand-side selection. To the extent that strong management practices enable public school principals to offer higher levels of compensation and then choose motivated and able teachers from the resulting queue, qualitatively similar predictions would likely still apply. We hope that our new indices will provide fertile ground for further research.

4 Conclusion

Policy makers have begun to set ambitious, universal learning goals. To achieve these targets, it will be necessary to understand why, within and across current education systems, some students are learning more in some schools than others.
Although there are likely many factors at work, it has been suggested that part of this variation in learning might stem from differences in school management. To explore this issue and develop policy, academics and practitioners need to be able to measure school management accurately and cost-effectively at scale across schools and countries, and be in a position to postulate mechanisms behind any observed relationship between school management and student learning outcomes. This paper has responded to these observations by developing new approaches to measurement, as well as a simple theoretical framework that captures key features of education systems in Latin America.

The first application of our new measurement approach used publicly available data from the school principal surveys conducted by PISA to construct a school management index spanning 65 countries. This PISA-based school management index can be well validated against the more detailed (though also much more expensive) index based on the WMS. As such, it has clear value in settings where cross-country coverage is important, enabling researchers to study and compare the (within-country) correlation between management and student/school-level outcomes for a far wider set of countries than was previously possible.

Our second application used publicly available data from a national administrative survey to construct a school management index spanning all public schools in a single country: Brazil. This Prova Brasil-based index was also well validated against the WMS. This second application has value in settings where within-country coverage, and the ability to merge with other administrative data sets, is important.

It is striking that both of our new school management indices confirm the strong positive correlation of school management scores with school-level student outcomes first reported in Bloom et al. [2015a]. A positive relationship holds in the global PISA sample and in the census of schools in Brazil. Our theoretical framework, for the first time, formalizes the possible causal mechanisms in one of these regions: Latin America. We argued that strong people management practices may be improving student learning through a combination of teacher selection and incentive effects, and that schools could be encouraged to adopt practices that reward good performers, develop good performers, and create a good employee value proposition. Looking to operations management, we argued that strong operations practices may be improving student learning through a combination of teacher selection and household incentives, and that schools could be encouraged to adopt practices promoting quality of delivery in the classroom and processes to review school performance and drive change. We also provided a suggestive set of evidence for these channels.

Improvements to management practices present an untapped opportunity for potentially large gains in educational outcomes, particularly in cash-strapped regions of the world. One possible way of effecting change is to support existing school principals in introducing stronger people and operations management practices, for instance via training and resources. Fryer [2014, 2017] reports positive results from RCTs injecting best management practices into U.S. public schools. Another possibility is to contract new managers into existing public schools.
Romero et al. [forthcoming] report mixed results from an RCT in Liberia in which (non-governmental) management teams were contracted to run public schools: contracting-in raised learning outcomes, but new managers spent more and may have engaged in strategic behaviour. Investigating how to implement strong people and operations management practices to drive learning for all is an important area for future research.

Table 1: School management and student performance
Columns (1)-(3): PISA; columns (4)-(6): Prova Brasil.

Panel A: Reading (PISA points in columns (1)-(3); Portuguese scores in SDs in columns (4)-(6))
Management Index: 4.904 3.947 3.019 0.068 0.066 0.059
  standard errors: (1.193) (1.172) (0.980) (0.003) (0.003) (0.002)
  p-values: [0.000] [0.000] [0.002] [0.000] [0.000] [0.000]
Private (reported for two of the PISA columns): 11.514 2.911; standard errors (2.889) (2.560); p-values [0.000] [0.255]
R-squared: 0.24 0.29 0.42 0.03 0.04 0.10
Observations: 410701 410701 410701 9891822 9891822 9891822

Panel B: Math (PISA points in columns (1)-(3); Math scores in SDs in columns (4)-(6))
Management Index: 4.689 3.937 2.800 0.078 0.075 0.068
  standard errors: (1.267) (1.272) (1.060) (0.003) (0.003) (0.002)
  p-values: [0.000] [0.001] [0.008] [0.000] [0.000] [0.000]
Private (reported for two of the PISA columns): 11.467 2.001; standard errors (2.874) (2.655); p-values [0.000] [0.451]
R-squared: 0.31 0.34 0.45 0.05 0.05 0.10
Observations: 410701 410701 410701 9891822 9891822 9891822

Panel C: Science (PISA points in columns (1)-(3); n.a. for Prova Brasil)
Management Index: 4.283 3.601 2.553
  standard errors: (1.187) (1.217) (0.982)
  p-values: [0.000] [0.003] [0.009]
Private (reported for two of the PISA columns): 10.215 1.245; standard errors (2.751) (2.377); p-values [0.000] [0.600]
R-squared: 0.30 0.33 0.43
# Observations: 410701 410701 410701 9890704 9890704 9890704
# Schools: 15196 15196 15196 33148 33148 33148
Location FE*: Y in all columns. School controls: columns (2), (3), (5), (6). Student controls: columns (3) and (6).

Notes: Standard errors in parentheses, p-values in square brackets. OLS regressions for PISA were run on the student-level PISA dataset using the OECD's repest Stata command; standard errors are clustered at the school level, and estimates use all 5 plausible values for each subject and student final weights. Prova Brasil regressions are run with standard OLS; standard errors are clustered at the school level, and the dependent variables are student learning outcomes on national tests at Grade 9 (the same exercise can be done with primary schools and tests at Grade 5). All specifications include location fixed effects (countries for PISA and states for Prova Brasil). PISA controls: school controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, and ratio of computers connected to the web as a proxy for school resources; student controls include gender, grade, socio-economic status and immigration status. Prova Brasil controls: school controls include student-teacher ratio, log of the number of students, and dummies indicating the presence of an IT lab, science lab, and library as proxies for school resources; given the availability of principal characteristics, school controls also include a dummy for male principals, dummies for educational attainment, and dummies for experience as principal; student controls include a dummy for male students, a dummy for white students, the student household's consumption index, and dummies for mother's educational attainment (grades 1-5, grades 6-9, secondary grades 10-12, and college). For control variables, missing values are replaced with a value of -99, and for each variable with imputed values an indicator variable equal to 1 for imputed observations is added to the specification. All panels use the same sample but have different subject outcome variables. Summary statistics for PISA dependent variables and controls are presented in Table B.1.
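As a rough illustration of the Table 1 specification for PISA, the sketch below runs a weighted least squares regression of a single plausible value on the management index with country fixed effects, school controls, and school-clustered standard errors. This is only an approximation under stated assumptions: the paper itself uses the OECD's repest Stata routine with all five plausible values and replicate weights, and all variable names below are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

def run_table1_column(df: pd.DataFrame, outcome: str = "pv1_read"):
    """Simplified analogue of a Table 1 PISA column.

    Assumes one row per student with the school-level management index
    merged in and no missing values; 'private' and the school controls
    mirror the controls listed in the table notes.
    """
    model = smf.wls(
        formula=f"{outcome} ~ mgmt_index + private + school_loc"
                " + stu_teacher_ratio + log_enrolment + govt_fund_share"
                " + computers_web + C(country)",
        data=df,
        weights=df["student_final_weight"],
    )
    # Cluster standard errors at the school level, as in the paper.
    result = model.fit(cov_type="cluster",
                       cov_kwds={"groups": df["school_id"]})
    return result.params["mgmt_index"], result.bse["mgmt_index"]
```

Using a single plausible value understates the imputation variance relative to the paper's approach; the point here is only to show the structure of the specification.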
Table 2: PISA management index and student performance: Latin America
Columns (1)-(3): Reading PISA points; columns (4)-(6): Math PISA points; columns (7)-(9): Science PISA points.

Management Index: 8.255 2.681 2.212 7.442 2.432 1.764 7.859 3.092 2.509
  standard errors: (1.610) (1.252) (1.008) (1.576) (1.230) (1.039) (1.421) (1.144) (0.973)
  p-values: [0.000] [0.032] [0.028] [0.000] [0.048] [0.089] [0.000] [0.006] [0.009]
Private (values as reported in the original layout): 56.807 31.921 55.695 32.589 0.000 55.428 33.077
  standard errors: (3.301) (2.956) (3.713) (3.121) (3.735) (3.327)
  bracketed values as reported: [0.000] [0.000] [0.000] [0.000] [8.161] [2.736]
R-squared: 0.032 0.173 0.342 0.041 0.185 0.350 0.040 0.172 0.312
# Observations: 78144 in each column. # Schools: 3075 in each column. # Countries: 8 in each column.
Country FE: Y in all columns. School controls: columns (2), (3), (5), (6), (8), (9). Student controls: columns (3), (6), (9).

Notes: Standard errors in parentheses, p-values in square brackets. OLS regressions for PISA were run on the student-level PISA dataset for 8 Latin American countries using the OECD's repest Stata command; standard errors are clustered at the school level, and estimates use all 5 plausible values for each subject and student final weights. All specifications include country fixed effects. School controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, and ratio of computers connected to the web as a proxy for school resources. Student controls include gender, grade, socio-economic status and immigration status. For control variables, missing values are replaced with a value of -99, and for each variable with imputed values an indicator variable equal to 1 for imputed observations is added to the specification.

Table 3: People management and intermediate outcomes, public and private schools in Latin America
Columns (1) and (4): z-teacher shortage; columns (2) and (5): z-teacher motivation; columns (3) and (6): z-teacher effort.

Private School (columns (1)-(3)): -0.535 0.591 0.792
  standard errors: (0.122) (0.139) (0.128)
  p-values: [0.000] [0.000] [0.000]
People Index (columns (4)-(6)): -0.062 0.238 0.074
  standard errors: (0.035) (0.040) (0.033)
  p-values: [0.077] [0.000] [0.026]
R-squared: 0.152 0.142 0.154 0.139 0.169 0.123
Observations: 3035 3043 3043 3035 3043 3043
School controls: Y in all columns. Country FE: Y in all columns.

Notes: The first row reports the coefficients on a binary indicator (coded to 1 if the school is a private school, 0 otherwise) from regressions of each of three standardized intermediate school outcomes—teacher shortage, teacher motivation and teacher effort—on this indicator. The second row reports the coefficients on the standardized people management index from regressions of each intermediate school outcome on this index. The people management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. All specifications include PISA school final weights and country fixed effects. School controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, ratio of computers connected to the web as a proxy for school resources, and average student socio-economic status. For control variables, missing values are replaced with a value of -99, and for each variable with imputed values an indicator variable equal to 1 for imputed observations is added to the specification.
Table 4: Operations management and intermediate outcomes, public schools in Latin America
Columns: (1) z-teacher shortage; (2) z-teacher motivation; (3) z-teacher effort; (4) z-household effort.

Operations Management Index: -0.080 0.238 0.076 0.160
  standard errors: (0.043) (0.041) (0.038) (0.054)
  p-values: [0.061] [0.000] [0.044] [0.003]
R-squared: 0.0787 0.171 0.154 0.242
Observations: 2407 2414 2414 2414
School controls: Y in all columns. Country FE: Y in all columns.

Notes: All regressions use data from public schools only. The table reports the coefficients on the standardized operations management index from regressions of each intermediate school outcome on this index. The operations management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. All specifications include PISA school final weights and country fixed effects. School controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, ratio of computers connected to the web as a proxy for school resources, and average student socio-economic status. For control variables, missing values are replaced with a value of -99, and for each variable with imputed values an indicator variable equal to 1 for imputed observations is added to the specification.

Figure 1: Distribution of overall management scores, PISA vs WMS
[Nine kernel density panels, one per country (PISA 2012 index overlaid on the WMS index), with Kolmogorov-Smirnov test p-values: BRA 0.181, CAN 0.887, COL 0.116, GBR 0.965, GER 0.700, ITA 0.008, MEX 0.350, SWE 0.870, USA 0.956.]
Note: Data for the World Management Survey index for all countries except Mexico and Colombia can be found at www.worldmanagementsurvey.org. Distributions of the overall management indices are standardized within countries. Kernel density curves are estimated using WMS sampling weights (calculated as the inverse probability of being interviewed, based on the log of the number of students, public status, and population density by state, province, or NUTS 2 region as a measure of location) for the WMS data, and school final weights for the PISA data. Samples include both public and private secondary schools for both datasets, with the exception of Colombia, where WMS data are only available for public primary schools. Number of WMS/PISA observations (WMS/PISA): Brazil = 510/561, Canada = 129/770, Colombia = 468/268, Great Britain = 89/422, Germany = 102/158, Italy = 284/926, Mexico = 157/1327, Sweden = 85/179, United States = 263/136.

Figure 2: PISA-based management index by quartile x PISA student outcomes
[Bar chart of PISA scores as deviations from the global mean, by quartile of the PISA-based management score (Math, Reading, Science): bottom quartile -5.91, -5.92, -6.71; 2nd quartile 0.01, -0.25, -0.10; 3rd quartile 0.91, 1.23, 0.77; top quartile 5.32, 5.54, 5.32.]
Note: Number of observations: 15,196 schools from 65 countries available in the PISA 2012 data. Student outcomes are estimated using five plausible values and collapsed at the school level using PISA's senate weights. Quartiles of management are built at the country level. Test scores are presented as deviations from the global mean.
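The Kolmogorov-Smirnov p-values reported in Figure 1 compare, country by country, the distribution of the PISA-based index with that of the WMS index. A minimal, unweighted analogue using scipy is sketched below; the paper's comparison uses WMS sampling weights and PISA school weights, which this basic two-sample test ignores, and the input arrays are hypothetical.

```python
import numpy as np
from scipy.stats import ks_2samp

def compare_indices(pisa_scores: np.ndarray, wms_scores: np.ndarray):
    """Unweighted two-sample KS comparison of two within-country
    standardized management indices (a simplified analogue of the
    weighted comparison behind the Figure 1 p-values)."""
    z = lambda x: (x - x.mean()) / x.std()
    stat, pval = ks_2samp(z(pisa_scores), z(wms_scores))
    return stat, pval

# Hypothetical usage with simulated data for a single country.
rng = np.random.default_rng(1)
print(compare_indices(rng.normal(size=561), rng.normal(size=510)))
```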
Figure 3: Prova Brasil-based management index x WMS management index
[Binned scatter of the Prova Brasil-based management index (vertical axis) against the WMS management index (horizontal axis).]
Note: This graph is a binned scatter plot. Each circle represents the average of 5 schools. The sample contains 262 schools which have data for both Prova Brasil and the WMS in 2013. Correlation of 0.19 (p-value: 0.00).

Figure 4: Prova Brasil-based management index by quartile x student outcomes
[Bar chart of standardized Prova Brasil scores (Math and Reading) by quartile of the Prova Brasil-based management score: bottom quartile -0.38 and -0.40; 2nd quartile -0.03 and -0.01; 3rd quartile 0.17 and 0.15; top quartile 0.40 and 0.36.]
Note: The sample contains the 33,148 public secondary schools in Prova Brasil in 2013 for which data are available. For simplicity, and to compare the results with those from PISA and the WMS, we use student learning outcomes on national tests in Portuguese and Math at Grade 9. The same exercise can be repeated with primary schools and national tests in Portuguese and Math at Grade 5.

Figure 5: Coefficient plot of PISA-based management index x math PISA points, by country
[One marker per country/economy showing the coefficient on management, grouped by region: E Europe & Central Asia, East Asia & Pacific, Latin America & Caribbean, Middle East & North Africa, Southeast Asia, W Europe & N America.]
Note: PISA 2012 data. Regressions are estimated using the OECD's repest command in Stata, by country. The specification includes all five plausible values for PISA 2012 and student final weights. Each marker represents the coefficient (and each vertical spike the associated 95% confidence interval) of the management index on math scores.

Figure 6: Cumulative distribution of people management, Latin America
[Cumulative distribution functions of the PISA-based people management index for private and public schools.]
Notes: Cumulative distribution of the PISA-based people management index for private and public schools in 8 Latin American countries. The people management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. Sample consists of 3075 schools: 2432 in the public sector and 637 in the private sector.

Figure 7: Teacher selection
Note: The blue point in the top panel shows average teacher ability θ and baseline intrinsic motivation τ among teacher types who select into an intermediate management public school; the black point in the same panel shows average θ and τ among teacher types who select into a competing high management private school. The orange point in the bottom panel shows average θ and τ among teacher types who select into a low management public school; the black point in the same panel shows average θ and τ among teacher types who select into a competing high management private school. Both panels are plotted for the numerical example set out in Section 3.2.
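Figure 3 above is a binned scatter in which each circle averages five schools. A minimal sketch of that construction is below, assuming a data frame with one row per school and hypothetical column names wms_index and pb_index.

```python
import pandas as pd
import matplotlib.pyplot as plt

def binned_scatter(df: pd.DataFrame, bin_size: int = 5):
    """Sketch of the Figure 3 construction: sort schools by the WMS index,
    group consecutive runs of `bin_size` schools, and plot the bin means of
    the Prova Brasil index against the bin means of the WMS index."""
    ordered = df.sort_values("wms_index").reset_index(drop=True)
    ordered["bin"] = ordered.index // bin_size          # consecutive bins of 5 schools
    means = ordered.groupby("bin")[["wms_index", "pb_index"]].mean()
    plt.scatter(means["wms_index"], means["pb_index"])
    plt.xlabel("Management Index, WMS")
    plt.ylabel("Management Index, Prova Brasil")
    plt.show()
```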
Figure 8: Coefficient plot of PISA-based people management index x intermediate outcomes, by country
[Three panels—Teacher Shortage Index, Teacher Motivation Index, Teacher Effort Index—each plotting, for the eight Latin American countries, the coefficient on people management with its 95% confidence interval.]
Notes: Each marker represents the coefficient (and each vertical spike the associated 95% confidence interval) on the people management index from regressions of each intermediate school outcome index on this index, estimated separately for each country in Latin America. The people management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. All specifications include country fixed effects, school controls and PISA school final weights. School controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, ratio of computers connected to the web as a proxy for school resources, and average student socio-economic status. For control variables, missing values are replaced with a value of -99, and for each variable with imputed values an indicator variable equal to 1 for imputed observations is added to the specification.

Figure 9: Coefficient plot of PISA-based operations management index x intermediate outcomes, by country
[Four panels—Teacher Shortage Index, Teacher Motivation Index, Teacher Effort Index, Household Effort Index—each plotting, for the eight Latin American countries, the coefficient on operations management with its 95% confidence interval.]
Notes: Each marker represents the coefficient (and each vertical spike the associated 95% confidence interval) on the operations management index from regressions of each intermediate school outcome index on this index, estimated separately for each country in Latin America. The operations management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. All specifications include country fixed effects, school controls and PISA school final weights. School controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, ratio of computers connected to the web as a proxy for school resources, and average student socio-economic status. For control variables, missing values are replaced with a value of -99, and for each variable with imputed values an indicator variable equal to 1 for imputed observations is added to the specification.

References

M. Adelman, R. Lemos, and T. Teodorovicz. Appointing versus locally electing school directors: Evidence from Brazil. Mimeo, World Bank, 2019.
M. Akmal and L. Pritchett. Learning equity requires more than equality: Learning goals and achievement gaps between the rich and the poor in five developing countries. Working Paper 19/028, RISE, 2019.
M. L. Anderson. Multiple inference and gender differences in the effects of early intervention: A reevaluation of the Abecedarian, Perry Preschool, and Early Training projects. Journal of the American Statistical Association, 103(484):1481–1495, 2008.
G. P. Baker, M. C. Jensen, and K. J.
Murphy. Compensation and incentives: Practice vs. theory. The Journal of Finance, 43(3):593–616, 1988. S. Bender, N. Bloom, D. Card, J. Van Reenen, and S. Wolter. Management practices, workforce selection, and productivity. Journal of Labor Economics, 36(S1):S371–S409, 2018. T. Besley. Paying politicians - theory and evidence. Journal of the European Economic Association, 2(23):193–215, 2004. T. Besley. Principled agents? The political economy of good government. Oxford University Press, Oxford, 2006. B. Biasi. The labour market for teachers under different pay schemes. Working Paper 24813, NBER, 2019. N. Bloom and J. Van Reenen. Measuring and explaining management practices across firms and countries. The Quarterly Journal of Economics, 122:1351–1408, 2007. N. Bloom, R. Lemos, R. Sadun, and J. Van Reenen. Does management matter in schools? The Economic Journal, 125:647–674, May 2015a. N. Bloom, C. Propper, S. Seiler, and J. V. Reenen. The impact of competition on management quality: Evidence from public hospitals. Review of Economic Studies, 82(2):457–489, 2015b. N. Bloom, R. Lemos, R. Sadun, D. Scur, and J. Van Reenen. International data on measuring 42 management practices. American Economic Review: Papers & Proceedings, 106(5):152–156, 2016. N. Bloom, E. Brynjolfsson, L. Foster, R. Jarmin, M. Patnaik, I. Saporta-Eksten, and J. Van Reenen. What drives differences in management practices? American Economic Review, 109(5):1648–83, May 2019a. N. Bloom, R. Sadun, R. Lemos, and J. V. Reenen. Healthy business? Managerial education and management in healthcare. The Review of Economics and Statistics, 0(ja):1–45, 2019b. C. Cornwell, I. M. Schmutte, and D. Scur. Building a productive workforce: The role of structured management practices. CEP discussion papers, Centre for Economic Performance, LSE, 2019. J. Delfgaauw, R. Dur, C. Propper, and S. Smith. Management practices: Are not for profits different? SSRN Electronic Journal, 07 2011. T. Dohmen and A. Falk. You get what you pay for: Incentives and selection in the education system. The Economic Journal, 120:1–27, May 2010. F. Finan, B. A. Olken, and R. Pande. The personnel economics of the developing state. In A. V. Banerjee and E. Duflo, editors, Handbook of Economic Field Experiments, volume 2 of Handbook of Economic Field Experiments, pages 467 – 514. North-Holland, 2017. R. Fryer. Injecting charter school best practices into traditional public schools: Evidence from field experiments. The Quarterly Journal of Economics, 129(3):1355–1407, 2014. R. Fryer. Management and student achievement: Evidence from a randomized field experiment. Working Paper Series 23437, NBER, May 2017. S. Gagliarducci and T. Nannicini. Do better paid politicians perform better? Disentangling incen- tives from selection. Journal of the European Economic Association, 11(2):369–398, 2013. E. A. Hanushek and L. Wössmann. Does educational tracking affect performance and inequality? Differences- in-differences evidence across countries. The Economic Journal, 116(510):63–76, March 2006. 43 E. A. Hanushek, S. Link, and L. Wössmann. Does school autonomy make sense everywhere? Panel estimates from PISA. Journal of Development Economics, 104(C):212–232, 2013. J. Jerrim, L. A. Lopez-Agudo, O. D. Marcenaro-Gutierrez, and N. Shure. What happens when econometrics and psychometrics collide? An example using the PISA data. Economics of Edu- cation Review, 61(C):51–58, 2017. V. Lavy. Do differences in schools’ instruction time explain international achievement gaps? 
Evidence from developed and developing countries. The Economic Journal, 125(588):F397–F424, 2015.
E. P. Lazear. Performance pay and productivity. American Economic Review, 90(5):1346–1361, December 2000.
E. P. Lazear. Teacher incentives. Swedish Economic Policy Review, 10(3):179–214, 2003.
R. Lemos and D. Scur. Developing management: An expanded evaluation tool for developing countries. Working Paper 16/07, RISE, 2016.
A. D. Liberto, F. Schivardi, and G. Sulis. Managerial practices and student performance. Economic Policy, 30(84):683–728, 2015.
M. Martinez-Bravo. The role of local officials in new democracies: Evidence from Indonesia. American Economic Review, 104(4):1244–87, April 2014.
K. J. McConnell, K. A. Hoffman, A. Quanbeck, and D. McCarty. Management practices in substance abuse treatment programs. Journal of Substance Abuse Treatment, 37(1):79–89, 2009. doi: 10.2139/ssrn.1886387.
J. McCormack, C. Propper, and S. Smith. Herding cats? Management and university performance. The Economic Journal, 124(578):F534–F564, 2014.
C. Prendergast. The motivation and bias of bureaucrats. American Economic Review, 97(1):180–196, 2007.
I. Rasul and D. Rogger. Management of bureaucrats and public service delivery: Evidence from the Nigerian civil service. The Economic Journal, 128(608):413–446, 2016.
M. Romero, J. Sandefur, and W. Sandholtz. Outsourcing education: Experimental evidence from Liberia. American Economic Review, forthcoming.
J. Rothstein. Teacher quality policy when supply matters. American Economic Review, 105(1):100–130, 2015.
J. Ruhose and G. Schwerdt. Does early educational tracking increase migrant-native achievement gaps? Differences-in-differences evidence across countries. Economics of Education Review, 52(C):134–154, 2016.
F. Schivardi and T. Schmitz. The IT revolution and southern Europe's two lost decades. Working Paper 18/05, EIEF, 2019.
M. R. West and L. Wössmann. Every Catholic child in a Catholic school: Historical resistance to state schooling, contemporary private competition and student achievement across countries. The Economic Journal, 120(546):F229–F255, 2010.
World Bank. World Development Report 2018: Learning to Realize Education's Promise. World Bank, Washington, DC, 2018.
L. Wössmann. The effect heterogeneity of central examinations: Evidence from TIMSS, TIMSS-Repeat and PISA. Education Economics, 13(2):143–169, 2005.
L. Wössmann. The importance of school systems: Evidence from international differences in student achievement. Journal of Economic Perspectives, 30(3):3–32, September 2016.
L. Wössmann, E. Lüdemann, G. Schütz, and M. R. West. School accountability, autonomy, choice, and the level of student achievement. OECD Publishing, Paris, (13), 2007.

A Appendix: Theoretical derivations

Lemma 1. Assume that the government assigns the teacher to an intermediate management public school.
1. If the teacher accepts the government's offer, then she exerts effort $e^{I}=\tau/2$.
2. If the teacher declines the government's offer and is hired by a high management private school, then she exerts effort $e^{H}=\frac{\theta B}{2(\bar{\varepsilon}-\underline{\varepsilon})}+\frac{\tau+\Delta}{2}$.
3. If the teacher declines the government's offer and is hired by an outside employer, then she exerts effort $e^{O}=\frac{\theta\beta}{2(\bar{\varepsilon}-\underline{\varepsilon})}$.

Proof. Part 1. When working in an intermediate management public school, a teacher with baseline motivation $\tau$ chooses effort to solve
\[
\max_{e}\; G-\big(e^{2}-\tau\cdot e\big).
\]
Differentiating to obtain the first order condition yields the solution stated above. (Here, as in the cases below, the second order condition necessary for a maximum holds.)
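The first-order conditions in this proof (and in Parts 2 and 3 below) can be checked symbolically. The sketch below is not part of the paper; it simply verifies, under the functional forms stated in the proof, that the three optimization problems deliver the effort levels in Lemma 1.

```python
import sympy as sp

e, tau, theta, Delta, G, B, beta, W = sp.symbols(
    "e tau theta Delta G B beta W", positive=True)
a, ybar, zbar, eps_hi, eps_lo = sp.symbols(
    "a ybar zbar epsbar epslow", positive=True)

# Part 1: intermediate management public school, payoff G - (e**2 - tau*e).
e_I = sp.solve(sp.diff(G - (e**2 - tau*e), e), e)[0]

# Part 2: high management private school; P is the (uniform-error)
# probability that theta*e + a + eps exceeds the threshold ybar.
P = (eps_hi + theta*e + a - ybar) / (eps_hi - eps_lo)
e_H = sp.solve(sp.diff(P*B + W - (e**2 - (tau + Delta)*e), e), e)[0]

# Part 3: outside sector; P_O is the probability that theta*e + eps_O
# exceeds the threshold zbar.
P_O = (eps_hi + theta*e - zbar) / (eps_hi - eps_lo)
e_O = sp.solve(sp.diff(P_O*beta - e**2, e), e)[0]

print(e_I)                                                             # tau/2
print(sp.simplify(e_H - (theta*B/(2*(eps_hi - eps_lo)) + (tau + Delta)/2)))  # 0
print(sp.simplify(e_O - theta*beta/(2*(eps_hi - eps_lo))))                   # 0
```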
Part 2. When working in a high management private school, a teacher with baseline motivation $\tau$ and ability $\theta$ chooses effort to solve
\[
\max_{e}\; P\cdot B + W - \big(e^{2}-(\tau+\Delta)\cdot e\big),
\]
where $P$ is the probability that $y_{1}^{H}$ exceeds the threshold $\bar{y}$ given $e$ (and student attention $a$). Given the uniform distribution for $\varepsilon$, we can rewrite this probability as
\[
P=\Pr\big(\theta e + a + \varepsilon > \bar{y}\big)=\Pr\big(\theta e + a - \bar{y} > -\varepsilon\big)=\frac{\bar{\varepsilon}+\theta e + a - \bar{y}}{\bar{\varepsilon}-\underline{\varepsilon}}.
\]
The first order condition for this optimization problem is
\[
\frac{\theta B}{\bar{\varepsilon}-\underline{\varepsilon}}=2e-(\tau+\Delta),
\]
which yields the solution stated above.

Part 3. When working in the outside sector, a teacher chooses effort to solve
\[
\max_{e}\; P^{O}\cdot\beta - e^{2},
\]
where $P^{O}$ is the probability that $z$ exceeds the threshold $\bar{z}$ given $e$. We can rewrite this probability as
\[
P^{O}=\Pr\big(\theta e + \varepsilon^{O} > \bar{z}\big)=\Pr\big(\theta e - \bar{z} > -\varepsilon^{O}\big)=\frac{\bar{\varepsilon}+\theta e - \bar{z}}{\bar{\varepsilon}-\underline{\varepsilon}}.
\]
The first order condition for this optimization problem is
\[
\frac{\theta\beta}{\bar{\varepsilon}-\underline{\varepsilon}}=2e,
\]
which yields the solution stated above.

Lemma 2. Assume that the government assigns the teacher to an intermediate management public school. There exist functions
\[
\tau^{G}(\theta)=\frac{56}{8\theta+1}-2\theta-\frac{1}{4},\qquad
\tau^{O}(\theta)=\sqrt{25\theta^{2}-60},\qquad
\tau^{P}(\theta)=\sqrt{25\theta^{2}-4}-4\theta-\frac{1}{2},
\]
such that:
1. The teacher accepts the government's offer with probability $\Pr\!\left[(\tau,\theta)\in T^{I}\right]$, where $T^{I}\equiv\{(\tau,\theta):\tau^{O}(\theta)\le\tau\le\tau^{G}(\theta)\}$.
2. The teacher declines the government's offer and accepts an offer from a private school with probability $\Pr\!\left[(\tau,\theta)\in T^{H}\right]$, where $T^{H}\equiv\{(\tau,\theta):\tau\ge\max\{\tau^{G}(\theta),\tau^{P}(\theta)\}\}$.

Proof. Part 1. The function $\tau^{G}$ traces out the loci of $(\tau,\theta)$-types who, anticipating subsequent teacher effort and household effort levels, are indifferent between accepting their government job offer and declining it in favour of a job in a high management private school, i.e. types for whom
\[
G-(e^{L})^{2}+\tau e^{L}
= B\,\frac{\bar{\varepsilon}+\theta e^{H}+a^{H}-\bar{y}}{\bar{\varepsilon}-\underline{\varepsilon}}-(e^{H})^{2}+(\tau+\Delta)\,e^{H}.
\]
Substituting for $e^{L}$ and $e^{H}$ from Lemma 1, together with the parameters in our numerical example (implying $a^{H}=1$), and rearranging yields
\[
\tau^{G}=\frac{56}{8\theta+1}-2\theta-\frac{1}{4}.
\]
Fixing $\theta$, for any $\tau<\tau^{G}(\theta)$, the teacher's payoff from accepting the government's offer is strictly higher than her expected payoff from declining and accepting a job in a high management private school. The function $\tau^{O}$ traces out the loci of $(\tau,\theta)$-types who, anticipating subsequent teacher effort levels, are indifferent between accepting their government job offer and declining it in favour of a job in the outside sector, i.e. types for whom
\[
G-(e^{L})^{2}+\tau e^{L}
= \beta\,\frac{\bar{\varepsilon}+\theta e^{O}-\bar{z}}{\bar{\varepsilon}-\underline{\varepsilon}}-(e^{O})^{2}.
\]
Substituting for $e^{L}$ and $e^{O}$ from Lemma 1, together with the parameters in our numerical example, and rearranging for $\tau$ yields
\[
\tau^{O}=\sqrt{25\theta^{2}-60}.
\]
Fixing $\theta$, for any $\tau>\tau^{O}(\theta)$, the teacher's payoff from accepting the government's offer is strictly higher than her expected payoff from declining and accepting a job in the outside sector. All that remains is to confirm that there exist values of $\theta$ such that $\tau^{O}\le\tau^{G}$. Clearly, $\tau^{G}$ is decreasing and $\tau^{O}$ is increasing. Straightforward calculations show that $\tau^{G}-\tau^{O}$ is positive and decreasing on $[1,1.56]$, which establishes that there exists a set $T^{I}\equiv\{(\tau,\theta):\tau^{O}\le\tau\le\tau^{G}\}$. For any pair $(\tau,\theta)$ in this set, the payoff from accepting the government job (weakly) exceeds both the expected payoff of declining and accepting a job in a high management private school and the expected payoff of declining and accepting a job in the outside sector. Part 2.
The function $\tau^{P}$ traces out the loci of $(\tau,\theta)$-types who, anticipating subsequent teacher effort and household effort levels, and having declined their government job offer, are indifferent between a job in a high management private school and a job in the outside sector, i.e. types for whom
\[
B\,\frac{\bar{\varepsilon}+\theta e^{H}+a^{H}-\bar{y}}{\bar{\varepsilon}-\underline{\varepsilon}}-(e^{H})^{2}+(\tau+\Delta)\,e^{H}
= \beta\,\frac{\bar{\varepsilon}+\theta e^{O}-\bar{z}}{\bar{\varepsilon}-\underline{\varepsilon}}-(e^{O})^{2}.
\]
Substituting for $e^{H}$ and $e^{O}$ from Lemma 1, together with the parameters in our numerical example, and rearranging for $\tau$ yields
\[
\tau^{P}=\sqrt{25\theta^{2}-4}-4\theta-\frac{1}{2}.
\]
Fixing $\theta$, for any $\tau>\tau^{P}(\theta)$, the teacher's expected payoff from declining the government's offer and accepting a job in a high management private school is higher than her expected payoff from declining the government's offer and accepting a job in the outside sector. Straightforward calculations show that $\tau^{P}-\tau^{G}$ is positive and increasing on $[1.56,5]$, which establishes that there exists a set $T^{H}\equiv\{(\tau,\theta):\tau\ge\max\{\tau^{G},\tau^{P}\}\}$. For any $(\tau,\theta)$ in this set, the expected payoff from declining the government offer and accepting a job in a high management private school exceeds both the payoff of accepting the government job and the expected payoff of declining and accepting a job in the outside sector.

Lemma 3. Assume that the government assigns the teacher to a low management public school. There exist functions
\[
\tau^{G\prime}(\theta)=\frac{36}{8\theta+1}-2\theta-\frac{1}{4},\qquad
\tau^{O\prime}(\theta)=\sqrt{25\theta^{2}-40},\qquad
\tau^{P}(\theta)=\sqrt{25\theta^{2}-4}-4\theta-\frac{1}{2},
\]
such that:
1. The teacher accepts the government's offer with probability $\Pr\!\left[(\tau,\theta)\in T^{L}\right]$, where $T^{L}\equiv\{(\tau,\theta):\tau^{O\prime}(\theta)\le\tau\le\tau^{G\prime}(\theta)\}$.
2. The teacher declines the government's offer and accepts an offer from a private school with probability $\Pr\!\left[(\tau,\theta)\in T^{H\prime}\right]$, where $T^{H\prime}\equiv\{(\tau,\theta):\tau\ge\max\{\tau^{G\prime}(\theta),\tau^{P}(\theta)\}\}$.

Proof. Analogous to Lemma 2.

Remark 1. Assume that the government assigns the teacher to an intermediate management public school. In the numerical example:
1. $\Pr\!\left[(\tau,\theta)\in T^{H}\right]=0.741>\Pr\!\left[(\tau,\theta)\in T^{I}\right]=0.031$.
2. $E\!\left[\tau+\Delta\mid(\tau,\theta)\in T^{H}\right]=6.722>E\!\left[\tau\mid(\tau,\theta)\in T^{I}\right]=1.311$.
3. $E\!\left[\frac{\theta B}{2(\bar{\varepsilon}-\underline{\varepsilon})}+\frac{\tau+\Delta}{2}\mid(\tau,\theta)\in T^{H}\right]=8.851>E\!\left[\frac{\tau}{2}\mid(\tau,\theta)\in T^{I}\right]=0.655$.

Proof. Calculated via numerical integration, using Lemmas 1 and 2. The Mathematica notebook file is available upon request.

Remark 2. Compare an intermediate management public school and a low management public school. In the numerical example:
1. $\Pr\!\left[(\tau,\theta)\in T^{I}\right]=0.031>\Pr\!\left[(\tau,\theta)\in T^{L}\right]=0.007$.
2. $E\!\left[\tau\mid(\tau,\theta)\in T^{I}\right]=1.311>E\!\left[\tau\mid(\tau,\theta)\in T^{L}\right]=0.545$.
3. $E\!\left[\frac{\tau}{2}\mid(\tau,\theta)\in T^{I}\right]=0.655>E\!\left[\frac{\tau}{2}\mid(\tau,\theta)\in T^{L}\right]=0.301$.
4. $a^{I}=1>a^{L}=\frac{1}{2}$.

Proof. Calculated by numerical integration, using Lemmas 1 and 3. The Mathematica notebook file is available upon request.
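The Remarks above are computed by numerical integration over teacher types using the threshold functions in Lemmas 2 and 3. The sketch below shows the structure of such a calculation by Monte Carlo. It is illustrative only: the type distributions and parameter ranges of the Section 3.2 numerical example are not reproduced here, so the uniform draws (and the clipping of negative square-root arguments) are assumptions, and the resulting numbers will not match the values reported in Remarks 1 and 2.

```python
import numpy as np

rng = np.random.default_rng(42)

# Threshold functions from Lemmas 2 and 3 (numerical example).
tau_G  = lambda th: 56 / (8*th + 1) - 2*th - 0.25           # gov't vs private, intermediate school
tau_Gp = lambda th: 36 / (8*th + 1) - 2*th - 0.25           # gov't vs private, low school
tau_O  = lambda th: np.sqrt(np.maximum(25*th**2 - 60, 0.0))  # gov't vs outside, intermediate school
tau_Op = lambda th: np.sqrt(np.maximum(25*th**2 - 40, 0.0))  # gov't vs outside, low school

# Assumed type distributions: theta and tau drawn uniformly on [1, 5].
# This range is an illustrative assumption, not the paper's f(theta), g(tau).
n = 1_000_000
theta = rng.uniform(1, 5, n)
tau = rng.uniform(1, 5, n)

in_TI = (tau >= tau_O(theta)) & (tau <= tau_G(theta))    # hired by intermediate mgmt public school
in_TL = (tau >= tau_Op(theta)) & (tau <= tau_Gp(theta))  # hired by low mgmt public school

print("Pr[(tau,theta) in T^I] ~", in_TI.mean())
print("Pr[(tau,theta) in T^L] ~", in_TL.mean())
print("E[tau/2 | T^I]         ~", (tau[in_TI] / 2).mean())
print("E[tau/2 | T^L]         ~", (tau[in_TL] / 2).mean())
```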
5 ONLINE APPENDIX NOT INTENDED FOR PUBLICATION for Leaver, Lemos and Scur “Measuring and explaining management in schools: new approaches using public data,” November 5, 2019 B Additional tables and figures Table B.1: Summary statistics Standard 10th 25th 50th 75th 90th Mean N Deviation pct pct pct pct pct School Private school 0.19 (0.39) 0.00 0.00 0.00 0.00 1.00 410200 Rural 0.32 (0.46) 0.00 0.00 0.00 1.00 1.00 410209 Student-teacher ratio 17.47 (12.20) 8.70 11.94 15.57 20.17 28.00 380244 Enrolment total 983.89 (789.31) 222.00 439.00 813.00 1317.00 1861.00 394664 Share of govt funding 0.78 (0.32) 0.14 0.68 0.95 1.00 1.00 377927 Location: village 0.13 (0.33) 0.00 0.00 0.00 0.00 1.00 410209 Location: small town 0.19 (0.39) 0.00 0.00 0.00 0.00 1.00 410209 Location: town 0.28 (0.45) 0.00 0.00 0.00 1.00 1.00 410209 Location: city 0.26 (0.44) 0.00 0.00 0.00 1.00 1.00 410209 Location: large city 0.15 (0.35) 0.00 0.00 0.00 0.00 1.00 410209 Computers with internet 0.87 (0.29) 0.35 1.00 1.00 1.00 1.00 389971 Student grade 0.11 (0.32) 0.00 0.00 0.00 0.00 1.00 330163 Students Math score (PV1) 457.47 (103.02) 329.42 381.84 450.47 528.28 598.15 410701 Reading score (PV1) 465.67 (100.83) 335.44 395.65 465.71 536.32 596.87 410701 Science score (PV1) 466.83 (100.85) 339.18 394.01 463.20 537.52 602.14 410701 Female student 0.51 (0.50) 0.00 0.00 1.00 1.00 1.00 410701 Student age 15.41 (0.53) 14.33 15.25 15.58 15.83 15.92 410586 Student: non-immigrant 0.93 (0.26) 1.00 1.00 1.00 1.00 1.00 399606 Student: second-gen 0.04 (0.21) 0.00 0.00 0.00 0.00 0.00 399606 Student: first-gen 0.03 (0.16) 0.00 0.00 0.00 0.00 0.00 399606 Socio-economic status index 0.71 (0.48) 0.12 0.31 0.65 1.02 1.36 175060 App. 1 Table B.2: Above median people management and intermediate outcomes, public and private schools in Latin America (1) (2) (3) z-teacher z-teacher z-teacher shortage motivation effort Above median people -0.186 0.403 0.060 (0.077) (0.073) (0.072) [0.015] [0.000] [0.406] R-squared 0.148 0.151 0.132 Above 75th pct people -0.245 0.482 0.065 (0.088) (0.098) (0.086) [0.005] [0.000] [0.451] R-squared 0.150 0.152 0.132 Observations 3067 3074 3044 School controls Y Y Y Country FE Y Y Y Notes: The first row reports the coefficient from regressions of a binary indicator Above median people (coded to 1 if the school’s PISA-based people management score is above the median, 0 otherwise) on the standardized index of three intermediate school outcomes: teacher shortage, teacher motivation and teacher effort. The second row reports coefficients from regressions of a binary indicator Above 75th pct people (coded to 1 if the school’s PISA-based people management score is above 75th percentile, 0 otherwise) on each of the intermediate school outcomes. The people management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. All specifications include PISA school final weights and country fixed effects. School controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, ratio of computers connected to the web as a proxy for school resources, and average student socio-economic status. For control variables, missing variables are replaced with a value of -99 and we include an indicator variable with a value of 1 for each imputed value, for each variable with imputed values dummies are added to the specifications. App. 
2 Table B.3: Above median operations management and intermediate outcomes, public schools in Latin America (1) (2) (3) (4) z-teacher z-teacher z-teacher z-household shortage motivation effort effort Above median ops -0.165 0.323 0.130 0.197 (0.082) (0.083) (0.075) (0.088) [0.044] [0.000] [0.083] [0.025] R-squared 0.0790 0.151 0.153 0.234 Above 75pct ops -0.121 0.498 0.221 0.348 (0.101) (0.103) (0.092) (0.121) [0.233] [0.000] [0.016] [0.004] R-squared 0.0759 0.164 0.156 0.240 Observations 2407 2414 2414 2414 School controls Y Y Y Y Country FE Y Y Y Y Notes: The row reports the coefficient from regressions of a binary indicator Above median ops (coded to 1 if the school’s PISA-based people management score is above the median, 0 otherwise) on the standardized index of three intermediate school outcomes: teacher shortage, teacher motivation and teacher effort. The second row reports coefficients from regressions of a binary indicator Above 75th pct ops (coded to 1 if the school’s PISA-based people management score is above 75th percentile, 0 otherwise) on each of the intermediate school outcomes. The operations management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. All specifications include PISA school final weights and country fixed effects. School controls include school location, student-teacher ratio, log of the number of students, share of government funding relative to total school funding, ratio of computers connected to the web as a proxy for school resources, and average student socio-economic status. For control variables, missing variables are replaced with a value of -99 and we include an indicator variable with a value of 1 for each imputed value, for each variable with imputed values dummies are added to the specifications. App. 3 Table B.4: Process of entering the public basic education teaching career in Latin American countries in 2012 Country Relevant Eligibility to Apply Process for Job offer and Allocation Legislation in 2011-2012 Argentina Law 10.579, Professional degree or 1) District government within Province announces vacancies. Estatuto del equivalent in Teaching or 2) Candidate submits application along with supporting documentation. Candidate may choose a (Buenos Docente Education in accordance to maximum of 3 districts per application (with no limit on the number of applications submitted). Aires) educational stage. 3) Decentralized classification tribunal (Tribunales de Clasificación Descentralizados) scores and ranks Candidates cannot be older candidates based on supporting documentation (candidates applying to district of residency receive than 50 years old. Foreign bonus points), following guidance and supervision of centralized classification tribunal at the applicants must have at provincial level. least 5 years of residency in 4) District government makes an offer to suitable candidates based on ranking, and offers permanent the country. assignment after the candidate passes an assessment carried out during the first year of work. Brazil Municipal Law Professional degree in 1) State government announces vacancies. 2391/1995 Teaching or Education. 2) Candidate registers in a specific district, pays an enrollment fee, takes state examination, and (Rio de submits supporting documentation. Janeiro) **Subsequent 3) State Office of Examination, Statistics and Public Service scores candidate’s supporting legislation was documentation, conditional on passing state examination. 
approved in 2013 4) State government publishes candidates’ final score. (Plano de Cargos, 5) Regional office under the state government (Coordenadoria Regional de Educação) makes an offer App. 4 Carreira e based on candidate’s score, and offers permanent assignment after no more than 3 years of Remuneração- PCCR, Municipal probationary period. Law No 5623/2013) Chile Law 19.070 passed Professional degree in 1) Municipal government announces vacancies. in 1991. Teaching or Education in 2) Candidate submits individual applications for each vacancy (with no limit on the number Subsequent accordance to educational applications) along with supporting documentation. reform was stage. 3) Municipal evaluation committee selects 2 to 5 candidates for each vacancy. carried in 2011 4) Candidate presents a school work proposal to the committee. The proposal is ranked by the (Law 20.501) with municipal evaluation committee (additional assessments may be requested). focus on school 5) Municipal evaluation committee recommends candidates suitable for its vacancies, based on its principals. ranking. 6) Mayor makes an offer to suitable candidates, according to the recommendation received from the municipal evaluation committee. Colombia Law Decree 1278, Professional degree or 1) Local government (Entidad Territorial) announces vacancies. 2002, Estatuto de equivalent. Non-educators 2) Candidate registers, submits supporting documentation, and takes national examination. Profesionalización are required to hold a 3) Local government committee evaluates the candidate based on a psycho-technical assessment, Docente specialization program on conditional on passing the national examination. Pedagogy or similar field. 4) Public Service National Authority decentralized committees (Comisión Nacional del Servicio Civil) score candidate’s supporting documentation and carry out interviews. 5) Public Service National Authority decentralized committees ranks candidates in each locality by educational stages and school modalities. Ranking is valid for 2 years. 6) Local government makes an offer to suitable candidates based on ranking. 7) Local government offers candidates permanent assignment after 1 year of probationary period, based on the school principal’s evaluation. Table B.5: Process of entering the public basic education teaching career in Latin American countries in 2012 Country Relevant Eligibility to Apply Process for Job offer and Allocation Legislation in 2011-2012 Costa Rica 2005 Nueva Professional degree or 1) National government announces vacancies. Carrera equivalent in Teaching or 2) Candidate submits application along with supporting documentation, and indicates region(s) of Profesional Education. preference. Docente 3) Public Service National Authority (Autoridad Nacional de Servicio Civil) ranks each candidate, allocates vacancies to suitable candidates, according to ranking and candidates’ preferred region. 4) National government offers candidate a permanent assignment after 3 months of probationary period, based on the school principal’s evaluation. Mexico 2008, Alianza por Professional degree or 1) State government announces vacancies. la Calidad de la equivalent in Teaching or 2) Candidate register, submits supporting documentation, and takes national examination. Educación. Education in accordance to 3) National government grades examinations and sorts candidates into “Accepted”, “Eligible” and “Not educational stage. States accepted” based on score. 
Cut-off points are determined by the Independent National Evaluation **An education have discretion to only Body (Órgano de Evaluación Independiente con Carácter Federalista) and might differ across States. reform took place in accept candidates from 4) National government ranks “Accepted” and “Eligible” candidates based on their scores and 2013: Ley General specific teacher-training publishes ranking. “Eligible” candidates will require additional training if accepted. del Servicio institutes or with a 5) State government makes an offer for a permanent assignment based on its ranking. Profesional Docente minimum time of residency App. 5 in the State. Peru Law 29062, Ley de Professional degree in 1) National government announces vacancies. la Carrera Pública Teaching or Education in 2) Candidate registers and takes national examination. Magisterial. accordance to educational 3) Candidate applies to a single school vacancy, conditional on meeting eligibility criteria and passing the stage. Years of experience national examination, and submits supporting documentation. **An education might be a requirement for 4) Evaluation committee (set up at school, municipality, local or regional level) scores candidate based reform took place in candidates to vacancies in on supporting documentation, school interview and classroom teaching practices. Nov2012, and its special or alternative 5) Evaluation committee publishes a ranking of candidates per vacancy. regulation was later education schools. 6) School makes an offer for a permanent assignment based on ranking provided by evaluation issued in Mar2013. committee. Uruguay Estatuto Docente Professional degree in 1) National council announces vacancies. Regulation No 45, Teaching or Education in 2) Candidate registers and submits supporting documentation. Candidate may apply to multiple approved by Act accordance to educational vacancies in up to two municipalities but must submit individual applications. No 68, Resolution stage. 3) Council sets up 3 Evaluation Committees per every 50 to 100 applicants. Committees scores No 9, 1993; candidates based on national level examination, classroom teaching practices, and supporting modified in 2008 documentation. 4) Evaluation Committees rank candidates based on their total score. 5) Nacional Council makes an offer for a permanent assignment based on ranking for each vacancy. Figure B.1: Distribution of management scores, PISA 2012 vs WMS: operations BRA CAN COL .2 .4 .2 .4 .2 .4 Density Density Density 0 0 0 -4 -2 0 2 4 -4 -2 0 2 4 -4 -2 0 2 4 KS test p-value: 0.930 KS test p-value: 0.748 KS test p-value: 0.472 GBR GER ITA .2 .4 .2 .4 .2 .4 Density Density Density 0 0 0 -4 -2 0 2 4 -2 0 2 4 -4 -2 0 2 4 KS test p-value: 0.742 KS test p-value: 0.718 KS test p-value: 0.193 MEX SWE USA .2 .4 .2 .4 .2 .4 Density Density Density 0 0 0 -4 -2 0 2 4 -4 -2 0 2 4 -3 -2 -1 0 1 2 KS test p-value: 0.158 KS test p-value: 0.635 KS test p-value: 0.327 PISA 2012 index WMS index Note: Data for the World Management Survey index for all countries except for Mexico and Colombia can be found at www.worldmanagementsurvey.org. Distribution of operations management indices standardized within countries. Kernel density curves estimated using WMS sampling weights (calculated as the inverse probability of being interview on log of number of students, public status, and population density by state, province, or NUTS 2 region as a measure of location) for the WMS data and school final weights for the PISA data. 
Samples include both public and private secondary schools for both datasets, with the exception of Colombia where WMS data is only available to public primary schools. Number of WMS/PISA observations are as follow (WMS/PISA): Brazil = 510/561, Canada = 129/770, Colombia = 468/268, Great Britain = 89/422, Germany = 102/158, Italy = 284/926, Mexico = 157/1327, Sweden = 85/179, United States = 263/136. App. 6 Figure B.2: Distribution of management scores, PISA 2012 vs WMS: operations BRA CAN COL Density Density Density 0 .2 .4 0 .2 .4 0 .2 .4 -2 0 2 4 -2 0 2 4 6 -2 0 2 4 6 KS test p-value: 0.001 KS test p-value: 0.750 KS test p-value: 0.142 GBR GER ITA Density Density Density 0 .2 .4 0 .2 .4 0 .2 .4 -2 0 2 4 -2 0 2 4 -4 -2 0 2 4 KS test p-value: 0.548 KS test p-value: 0.792 KS test p-value: 0.255 MEX SWE USA Density Density Density 0 .2 .4 0 .2 .4 0 .2 .4 -2 0 2 4 6 -4 -2 0 2 4 -4 -2 0 2 4 KS test p-value: 0.015 KS test p-value: 0.412 KS test p-value: 0.999 PISA 2012 index WMS index Note: Data for the World Management Survey index for all countries except for Mexico and Colombia can be found at www.worldmanagementsurvey.org. Distribution of people management indices standardized within countries. Kernel density curves estimated using WMS sampling weights (calculated as the inverse probability of being interview on log of number of students, public status, and population density by state, province, or NUTS 2 region as a measure of location) for the WMS data and school final weights for the PISA data. Samples include both public and private secondary schools for both datasets, with the exception of Colombia where WMS data is only available to public primary schools. Number of WMS/PISA observations are as follow (WMS/PISA): Brazil = 510/561, Canada = 129/770, Colombia = 468/268, Great Britain = 89/422, Germany = 102/158, Italy = 284/926, Mexico = 157/1327, Sweden = 85/179, United States = 263/136. App. 7 Figure B.3: WMS score by quartile x country-specific student outcomes Note: Reproduced from Bloom et al. [2015a]. Performance measures for 1002 observations: 472 for Brazil, 77 for Canada, 152 for India, 82 for Sweden, 86 for the UK and 133 for the US. At the time of writing, the authors of Bloom et al. [2015a] had conducted the WMS in 8 countries, the listed 6 plus Germany and Italy. The latter two countries are not included in this figure because data on student learning outcomes was not available to the authors. App. 8 Figure B.4: Cumulative distribution of people management: by country in Latin America ARG BRA CHL COL 1 1 1 1 .75 .75 .75 .75 .5 .5 .5 .5 .25 .25 .25 .25 0 0 0 0 -2 0 2 4 6 -4 -2 0 2 4 -2 -1 0 1 2 3 -2 -1 0 1 2 3 CRI MEX PER URY 1 1 1 1 .75 .75 .75 .75 .5 .5 .5 .5 .25 .25 .25 .25 0 0 0 0 -2 -1 0 1 2 3 -4 -2 0 2 4 -2 -1 0 1 2 3 -2 -1 0 1 2 3 Private Public Notes: Cumulative distribution of the PISA-based people management index for private and public schools for each one of the 8 Latin American countries in the PISA 2012 dataset. The people management index is built out of the school questionnaire from PISA 2012 using the methodology from Anderson [2008]. Sample sizes are as follows: Argentina: 183 schools (63 private, 120 public). Brazil: 561 schools (79 private, 482 public). Chile: 201 schools (137 private, 64 public). Colombia: 268 schools (62 private, 106 public). Costa Rica: 158 schools (22 private 136 public). Mexico: 1327 schools (196 private, 1131 public). Peru: 207 schools (50 private, 157 public). Uruguay: 164 schools (28 private, 136 public). App. 
Figure B.5: Standardized teacher shortage scores, by sector
[Figure: kernel density of the standardized Teacher Shortage index for private and public schools.]
Notes: The Teacher Shortage index is built out of four PISA 2012 questions: "Is your school's capacity to provide instruction hindered by any of the following issues? A lack of qualified [science, math, language, other subjects] teachers." The responses are scored as 1 = not at all, 2 = very little, 3 = to some extent and 4 = a lot. The index is built using the methodology in Anderson [2008]. Measures are standardized.

C Data

C.1 Construction of the PISA-based indices

To construct a PISA-based school management index, we followed a three-step approach. First, we classified each of the PISA questions either under one of the WMS topics or under "not management". We were able to classify 53 PISA 2012 questions into 14 WMS topics and, using this mapping as a starting point, we further classified 32 PISA 2015 questions into 12 WMS topics. For operations management, we classified 40 PISA 2012 questions into 11 WMS topics and, using this mapping as a starting point, we further classified 30 PISA 2015 questions into 11 WMS topics. For people management, we classified 13 questions into 3 WMS topics using the 2012 questionnaire, and 2 questions into 1 WMS topic using the 2015 questionnaire. Table C.6 provides a summary of the mapping for PISA 2012 and Prova Brasil 2013 used in this paper, as well as a mapping for PISA 2015.

Table C.6: Mapping of Management Practices in Publicly Available Survey Data
Each row lists a WMS management practice, its WMS description, and the number of questions mapped to it (PISA 2012 / PISA 2015 / Prova Brasil 2013); "-" indicates that no questions were mapped.

Operations Management
1) Standardization of Instructional Processes: School uses meaningful processes that allow students to learn over time. (7 / 1 / 9)
2) Personalization of Instruction and Learning: School incorporates teaching methods that ensure all pupils can master the learning objectives. (3 / 3 / 9)
3) Data-Driven Planning and Student Transitions: School uses assessment and easily available data to verify learning outcomes at critical stages. (3 / 3 / -)
4) Adopting Educational Best Practices: School incorporates and shares teaching best practices and pupil strategies across classrooms accordingly. (5 / 3 / 3)
5) Continuous Improvement: School implements processes towards continuous improvement and encourages lessons to be captured and documented. (8 / 7 / -)
6) Performance Tracking: School performance is regularly tracked with useful metrics. (- / - / -)
7) Performance Review: School performance is reviewed with appropriate metrics. (5 / 4 / 2)
8) Performance Dialogue: School performance is discussed with appropriate content, depth and communicated to teachers. (3 / 2 / -)
9) Consequence Management: School has mechanisms in place to follow up on performance issues. (- / - / -)
10) Target Balance: School covers a sufficiently broad set of targets at the school, department and individual levels. (4 / 4 / -)
11) Target Inter-Connection: School establishes well-aligned targets across all levels. (1 / 1 / -)
12) Time Horizon of Targets: School takes a rational approach to planning and setting targets. (- / - / -)
13) Target Stretch: School sets targets with the appropriate level of difficulty. (1 / 1 / -)
14) Clarity and Comparability of Targets: School sets understandable targets and openly communicates and compares school, department and individual performance. (7 / 6 / -)

People Management
15) Rewarding High Performers: School implements a systematic approach to identifying good and bad performance, rewarding teachers proportionately. (5 / 2 / -)
16) Removing Poor Performers: School deals with underperformers promptly. (- / - / -)
17) Promoting High Performers: School promotes employees based on job performance. (4 / - / 2)
18) Managing Talent: School nurtures and develops teaching and leadership talent. (- / - / 1)
19) Retaining Talent: School attempts to retain employees with high performance. (- / - / -)
20) Attracting Talent/Creating a Distinctive Employee Value Proposition: School has a thought-through approach to attract employees. (3 / - / 5)

Second, we manually assigned scores following the conceptual guidelines of the scoring grid of the World Management Survey, similar to the exercise conducted in census-based management surveys such as the US Census Management and Organizational Practices Survey (MOPS), where values indicating best practices receive higher scores than values indicating poor practices. Values are normalized from 0 to 1.[25]

Third, to build the overall management index, and the operations and people management sub-indices, we follow Anderson [2008]. This methodology weights the impact of the included variables by the sum of their row in the inverse variance-covariance matrix, thereby assigning greater weight to questions that carry more "new information". Given that the importance (weight) of one question is relative to the importance of all the others, we conservatively drop schools missing more than one management question (approximately 15% of schools are dropped, yet all countries are still included in the final sample). We also built the indices using alternative methods (straightforward standardization, and factor analysis, including the Bartlett correction), which yielded similar results. A minimal illustrative sketch of the inverse-covariance weighting is given below.

[25] MOPS has since been replicated in a number of other countries. Its questions follow the WMS topics and look to measure similar practices, but with self-reported answers.
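As an illustration of this third step, the sketch below computes an inverse-covariance-weighted index in the spirit of Anderson [2008]. It is not the authors' code: the input matrix of normalized question scores is assumed to be complete (schools missing more than one question already dropped), and the final normalization of the weights may differ from the implementation used in the paper.

import numpy as np

def anderson_index(scores):
    """Inverse-covariance-weighted index in the spirit of Anderson (2008).

    scores: 2-D array with one row per school and one column per
    management question, each question already normalized to the 0-1
    scale described in the second step above.
    """
    # Standardize each question (column) to mean 0, standard deviation 1
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
    # Weight each question by the sum of its row in the inverse
    # variance-covariance matrix, so questions carrying more "new
    # information" (less correlated with the others) receive more weight
    sigma_inv = np.linalg.inv(np.cov(z, rowvar=False))
    weights = sigma_inv.sum(axis=1)
    index = z @ (weights / weights.sum())
    # Standardize the resulting school-level index
    return (index - index.mean()) / index.std()

# Hypothetical usage: mgmt = anderson_index(score_matrix), where score_matrix
# stacks the mapped PISA 2012 question scores for the estimation sample.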
PISA 2015 has a reduced number of questions relative to the 2012 questions we used to measure people management. Several questions were moved to the new teacher questionnaire, which was not mandatory for countries in 2015, preventing us from building an identically rich index across both years. For this reason, we focus on the richer 2012 data. A visual inspection of the distributions of the 2015 PISA-based management indices for operations and people management, shown in Figures C.6 and C.7, compared with the distributions of the 2012 indices in Figures B.1 and B.2, confirms that the 2012 indices, especially the people management index, are a better fit for this exercise.

Figure C.6: Distribution of management scores, PISA 2015 vs WMS: operations
[Figure: kernel density curves of the PISA 2015 operations index and the WMS operations index, one panel per country. KS test p-values: BRA = 0.885; CAN = 0.793; COL = 0.333; GBR = 0.975; GER = 0.816; ITA = 0.687; MEX = 0.860; SWE = 0.406; USA = 0.853.]
Note: Data for the World Management Survey index for all countries except Mexico and Colombia can be found at www.worldmanagementsurvey.org. Distribution of operations management indices standardized within countries. Kernel density curves estimated using WMS sampling weights (calculated as the inverse probability of being interviewed, based on the log of the number of students, public status, and population density by state, province, or NUTS 2 region as a measure of location) for the WMS data, and school final weights for the PISA data. Samples include both public and private secondary schools for both datasets, with the exception of Colombia, where WMS data are only available for public primary schools. The number of WMS/PISA observations is as follows (WMS/PISA): Brazil = 510/421, Canada = 129/562, Colombia = 468/258, Great Britain = 89/381, Germany = 102/156, Italy = 284/291, Mexico = 157/138, Sweden = 85/192, United States = 263/158.

Figure C.7: Distribution of management scores, PISA 2015 vs WMS: people
[Figure: kernel density curves of the PISA 2015 people management index and the WMS people management index, one panel per country. KS test p-values: BRA = 0.000; CAN = 0.035; COL = 0.000; GBR = 0.000; GER = 0.138; ITA = 0.003; MEX = 0.000; SWE = 0.179; USA = 0.000.]
Note: Distribution of people management indices standardized within countries. Data sources, weights, samples, and numbers of observations are as described in the note to Figure C.6.

We run two additional exercises to validate our index. First, we test whether the results are driven by one specific question in the management index. To do this, we estimate the partial correlation of each of the 53 management questions with student performance, controlling for a partial index that takes into account all of the remaining management questions, standardized using Anderson [2008]. We find that the partial indices are positive and statistically significant throughout all individual regressions, suggesting that no single question is driving our results. Second, we test the importance of having both operations and people management questions in the index; this validates whether it is feasible to use the 2015 PISA data. To do this, we run a regression that includes both the operations and the people management indices in the specification. This specification indicates whether each of the indices contains additively separable, relevant information to explain student performance. We find that the coefficients remain positive and statistically significant across all subjects when including country fixed effects (the same specification as Column (1) of Table 1), and remain positive and statistically significant in reading, and marginally non-significant in math and science, when fully specified (the same specification as Column (3) of Table 1). A sketch of the first of these exercises is given below.
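The leave-one-out validation exercise can be sketched as follows, again purely as an illustration: the data layout, the column names (country, score_read, and the individual question columns, assumed to be valid identifiers such as q01 to q53), and the use of OLS with country dummies are assumptions, and the partial index reuses the hypothetical anderson_index helper from the previous sketch.

import statsmodels.formula.api as smf

def leave_one_out_checks(df, question_cols, outcome="score_read"):
    """For each management question, regress the outcome on that question
    and a partial index built from all remaining questions, with country
    fixed effects, and record the partial-index coefficient and p-value."""
    results = {}
    for q in question_cols:
        others = [c for c in question_cols if c != q]
        data = df.copy()
        # Partial index from the remaining questions (anderson_index is the
        # helper defined in the previous sketch)
        data["partial_index"] = anderson_index(data[others].to_numpy())
        fit = smf.ols(f"{outcome} ~ {q} + partial_index + C(country)",
                      data=data).fit()
        results[q] = (fit.params["partial_index"], fit.pvalues["partial_index"])
    return results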
Overall, these results suggest that both measures are still meaningful (tables are available upon request).

The list of questions included in the PISA 2012 management index, and their mapping to the individual WMS topics and questions, is described below. The original appendix table has six columns: WMS question; PISA 2012 question; variable name in the questionnaire; value label; value; and management score. To keep the listing compact, the recurring response scales are abbreviated as follows. Yes/No items: Yes (1) = 1.00, No (2) = 0.00. Frequency items (stem: "Please indicate the frequency of the following activities and behaviours in your school during [...]"): Did not occur (1) = 0.00, 1-2 times during the year (2) = 0.20, 3-4 times during the year (3) = 0.40, Once a month (4) = 0.60, Once a week (5) = 0.80, More than once a week (6) = 1.00. Appraisal items (stem: "To what extent have appraisals of and/or feedback to teachers directly led to ..."): No change (1) = 0.00, Small change (2) = 0.33, Moderate change (3) = 0.66, Large change (4) = 1.00. Square brackets [...] mark placeholders in the original questionnaire wording that are not reproduced in the source table.

PISA 2012 management index

1) Standardisation of Instructional Processes

a) How structured or standardised are the instructional planning processes across the school?
- Which of the following options describe what your school does for students in mathematics classes? Answer: Mathematics classes study similar content, but at different levels of difficulty. (Some classes (2) = 0.50; Not for any class (3) = 1.00)
- Which of the following options describe what your school does for students in mathematics classes? Answer: Different classes study different content or sets of mathematics topics that have different levels of difficulty. (Some classes (2) = 0.50; Not for any class (3) = 1.00)
- SC39Q10: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: Implementation of a standardised policy for mathematics (i.e. school curriculum with shared instructional materials accompanied by staff development and training). (Yes/No)

b) What tools and resources are provided to teachers (e.g. standards-based lesson plans) to ensure a consistent level of quality in delivery across classrooms?
- SC40Q01: Which of the following statements apply in your school? Answer: The school has a policy on how to use computers in mathematics instruction (e.g. amount of computer use in mathematics lessons, use of specific mathematics computer programs). (Yes/No)
- Which of the following statements apply in your school? Answer: All mathematics classes in the school use the same textbook. (No (2) = 0.00)
- SC40Q03: Which of the following statements apply in your school? Answer: Mathematics teachers in the school follow a standardised curriculum that specifies content at least on a monthly basis. (Yes/No)

2) Personalization of Instruction and Learning

b) How do you ensure that teachers are effective in personalising instruction in each classroom across the school?
- Which of the following options describe what your school does for students in mathematics classes? Answer: In mathematics classes, teachers use pedagogy suitable for students with heterogeneous abilities (i.e. students are not grouped by ability). (Some classes (2) = 0.50; Not for any class (3) = 0.00)

3) Data-Driven Planning and Student Transitions

a) Is data used to inform planning and strategies? If so, how is it used, especially in regard to student transitions through grades/levels?
- SC18Q01: Are assessments of students in [...] used to inform parents about their child's progress? (Yes/No)
- SC18Q02: Are assessments of students in [...] used to make decisions about students' retention or promotion? (Yes/No)

b) What drove the move towards more data-driven planning/tracking?
- SC39Q03: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: Systematic recording of data including teacher and student attendance and graduation rates, test results and professional development of teachers. (Yes/No)

4) Adopting Educational Best Practices

a) How does the school encourage incorporating new teaching practices into the classroom?
- SC18Q07: Are assessments of students in [...] used to identify aspects of instruction or the curriculum that could be improved? (Yes/No)
- SC34Q05: I promote teaching practices based on recent educational research. (frequency)

b) How are these learning or new teaching practices shared across teachers? What about across grades or subjects? How does sharing happen across schools (community, state-wide etc.), if at all?
- SC34Q18: I set aside time at faculty meetings for teachers to share ideas or information from in-service activities. (frequency)
- SC39Q08: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: Teacher mentoring. (Yes/No)
- SC34Q17: I lead or attend in-service activities concerned with instruction. (frequency)

5) Continuous Improvement

a) When problems (e.g. within school/teaching tactics/etc.) do occur, how do they typically get exposed and fixed?
- SC34Q07: When a teacher has problems in his/her classroom, I take the initiative to discuss matters. (frequency)

c) Who within the school gets involved in changing or improving processes? How do the different staff groups get involved in this?
- SC34Q11: I engage teachers to help build a school culture of continuous improvement. (frequency)
- SC34Q19: I conduct informal observations in classrooms on a regular basis (informal observations are unscheduled, last at least 5 minutes, and may or may not involve written feedback or a formal conference). (frequency)
- SC39Q05: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: Internal evaluation/self-evaluation. (Yes/No)
- SC39Q06: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: External evaluation. (Yes/No)
- SC39Q07: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: Seeking written feed-back from students (e.g. regarding lessons, teachers or resources). (Yes/No)

7) Performance Review

a) How often do you review (school) performance, formally or informally, with teachers and staff?
- SC30Q01: During the last year, have any of the following methods been used to monitor the practice of mathematics teachers at your school? Answer: Tests or assessments of student achievement. (Yes/No)
- SC30Q02: During the last year, have any of the following methods been used to monitor the practice of mathematics teachers at your school? Answer: Teacher peer review (of lesson plans, assessment instruments, lessons). (Yes/No)
- SC30Q03: During the last year, have any of the following methods been used to monitor the practice of mathematics teachers at your school? Answer: Principal or senior staff observations of lessons. (Yes/No)
- SC30Q04: During the last year, have any of the following methods been used to monitor the practice of mathematics teachers at your school? Answer: Observation of classes by inspectors or other persons external to the school. (Yes/No)
- SC34Q22: I evaluate the performance of staff. (frequency)

8) Performance Dialogue

a) How are these review meetings structured?
- SC34Q12: I ask teachers to participate in reviewing management practices. (frequency)
- SC34Q13: When a teacher brings up a classroom problem, we solve the problem together. (frequency)
- SC34Q16: I discuss academic performance results with the faculty to identify curricular strengths and weaknesses. (frequency)

10) Target Balance

a) What types of targets are set for the school to improve student outcomes? Which staff levels are held accountable to achieve these stated goals?
- SC18Q04: Are assessments of students in [...] used to compare the school to [...] performance? (Yes/No)
- SC18Q05: Are assessments of students in [...] used to monitor the school's progress from year to year? (Yes/No)
- SC18Q08: Are assessments of students in [...] used to compare the school with other schools? (Yes/No)
- SC34Q03: I make sure that the professional development activities of teachers are in accordance with the teaching goals of the school. (frequency)

11) Target Inter-Connection

a) How are these goals cascaded down to the different staff groups or to individual staff members?
- SC34Q14: I discuss the school's academic goals with teachers at faculty meetings. (frequency)

13) Target Stretch

a) How tough are your targets? How pushed are you by the targets?
- SC34Q02: I use student performance results to develop the school's educational goals. (frequency)

14) Clarity and Comparability of Targets

a) If I asked one of your staff members directly about individual targets, what would they tell me?
- SC34Q04: I ensure that teachers work according to the school's educational goals. (frequency)
- SC34Q15: I refer to the school's academic goals when making curricular decisions with teachers. (frequency)
- SC39Q01: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: Written specification of the school's curricular profile and educational goals. (Yes/No)
- SC39Q02: Which of the following measures aimed at quality assurance and improvement do you have in your school? Answer: Written specification of student performance standards. (Yes/No)

c) How do people know about their own performance compared to other people's performance?
- SC19Q01: In your school, are achievement data used in any of the following [...]? Answer: Achievement data are posted publicly (e.g. in the media). (Yes/No)
- SC19Q02: In your school, are achievement data used in any of the following [...]? Answer: Achievement data are tracked over time by an administrative authority. (Yes/No)

15) Rewarding High Performers

a) How does your evaluation system work? What proportion of your employees' pay is related to the results of this review?
- SC34Q22: I evaluate the performance of staff. (frequency)
- SC18Q06: Are assessments of students in [...] used to make judgements about teachers' effectiveness? (Yes/No)

b) Are there any non-financial or financial bonuses/rewards for the best performers across all staff groups? How does the bonus system work (for staff and teachers)?
- SC31Q01: To what extent have appraisals of and/or feedback to teachers directly led to a change in salary? (appraisal)
- SC31Q02: To what extent have appraisals of and/or feedback to teachers directly led to a financial bonus or another kind of monetary reward? (appraisal)
- SC31Q05: To what extent have appraisals of and/or feedback to teachers directly led to public recognition from you? (appraisal)
- SC34Q06: I praise teachers whose students are actively participating in learning. (frequency)

17) Promoting High Performers

b) How do you identify and develop your star performers?
- SC31Q03: To what extent have appraisals of and/or feedback to teachers directly led to opportunities for professional development activities? (appraisal)

c) What types of professional development opportunities are provided? How are these opportunities personalised to meet individual teacher needs?
- SC31Q06: To what extent have appraisals of and/or feedback to teachers directly led to changes in work responsibilities that make the job more attractive? (appraisal)
- SC31Q07: To what extent have appraisals of and/or feedback to teachers directly led to a role in school development initiatives (e.g. curriculum development group, development of school objectives)? (appraisal)

d) How do you make decisions about promotion/progression and additional opportunities within the school (such as performance, tenure, other)? Are better performers likely to be promoted faster, or are promotions given on the basis of tenure/seniority?
- SC31Q04: To what extent have appraisals of and/or feedback to teachers directly led to a change in the likelihood of career advancement? (appraisal)

20) Attracting Talent/Creating a Distinctive Employee Value Proposition

b) How do you monitor how effectively you communicate your value proposition and the following recruitment process?
- SC34Q01: I work to enhance the school's reputation in the community. (frequency)
- SC35Q01: What percentage of all staff in your school has attended a programme of professional development with a focus on mathematics? (0 = 0.00; 1-25 = 0.25; 26-50 = 0.50; 51-75 = 0.75; 76 or more = 1.00)
- SC35Q02: What percentage of math teachers in your school has attended a programme of professional development with a focus on mathematics? (0 = 0.00; 1-25 = 0.25; 26-50 = 0.50; 51-75 = 0.75; 76 or more = 1.00)

C.2 Construction of teacher shortage, teacher motivation, teacher effort, and household effort indices

We use the Anderson [2008] methodology to build each of the intermediate teacher and household outcome indices below. The PISA 2012 questions and variable names entering each index are listed next; the source table reports value labels and values but assigns no management score to these items. Recurring response scales are abbreviated as follows. Hindrance items (stem: "In your school, to what extent is the learning of students hindered by the following phenomena?"): Not at all = 1, Very little = 2, To some extent = 3, A lot = 4. Agreement items: Strongly agree = 1, Agree = 2, Disagree = 3, Strongly disagree = 4. Parental participation items (stem: "During [...], what proportion of students' parents participated in the following school-related activities?") are recorded as a percentage.

Teacher Shortage
- SC14Q01: Is your school's capacity to provide instruction hindered by any of the following issues? Answer: A lack of qualified science teachers. (Not at all = 1; Very little = 2; To some extent = 3; A lot = 4)
- SC14Q02: Is your school's capacity to provide instruction hindered by any of the following issues? Answer: A lack of qualified mathematics teachers. (same scale)
- SC14Q03: Is your school's capacity to provide instruction hindered by any of the following issues? Answer: A lack of qualified [...] teachers. (same scale)
- SC14Q04: Is your school's capacity to provide instruction hindered by any of the following issues? Answer: A lack of qualified teachers of other subjects. (same scale)

Teacher Motivation
- SC22Q13: Teachers' low expectations of students. (hindrance)
- SC22Q14: Teachers not meeting individual students' needs. (hindrance)
- SC25Q02: Discussed their child's behaviour on the initiative of one of their child's teachers. (parental participation, percentage)
- SC25Q04: Discussed their child's progress on the initiative of one of their child's teachers. (parental participation, percentage)
- SC26Q01: Think about the teachers in your school. How much do you agree with the following statements? Answer: The morale of teachers in this school is high. (agreement)
- SC26Q02: Think about the teachers in your school. How much do you agree with the following statements? Answer: Teachers work with enthusiasm. (agreement)
- SC26Q03: Think about the teachers in your school. How much do you agree with the following statements? Answer: Teachers take pride in this school. (agreement)
- SC26Q04: Think about the teachers in your school. How much do you agree with the following statements? Answer: Teachers value academic achievement. (agreement)
- SC27Q01: How much do you agree with these statements about teachers in your school? Answer: Mathematics teachers are interested in trying new methods and teaching practices. (agreement)
- SC27Q02: How much do you agree with these statements about teachers in your school? Answer: There is a preference among mathematics teachers to stay with well-known methods and practices. (agreement)
- SC28Q01: How much do you agree with these statements about teachers in your school? Answer: There is consensus among mathematics teachers that academic achievement must be kept as high as possible. (agreement)
- SC28Q02: How much do you agree with these statements about teachers in your school? Answer: There is consensus among mathematics teachers that it is best to adapt academic standards to the students' levels and needs. (agreement)
- SC29Q01: How much do you agree with these statements about teachers in your school? Answer: There is consensus among mathematics teachers that the social and emotional development of the students is as important as their acquisition of mathematical skills and knowledge in mathematics classes. (agreement)
- SC29Q02: How much do you agree with these statements about teachers in your school? Answer: There is consensus among mathematics teachers that the development of mathematical skills and knowledge in students is the most important objective in mathematics classes. (agreement)

Teacher Effort
- SC22Q11: Teachers having to teach students of heterogeneous ability levels within the same class. (hindrance)
- SC22Q15: Teacher absenteeism. (hindrance)
- SC22Q17: Teachers being too strict with students. (hindrance)
- SC22Q18: Teachers being late for classes. (hindrance)

Household Effort
- SC22Q01: Student truancy. (hindrance)
- SC22Q02: Students skipping classes. (hindrance)
- SC22Q03: Students arriving late for school. (hindrance)
- SC22Q04: Students not attending compulsory school events (e.g. sports day) or excursions. (hindrance)
- SC22Q05: Students lacking respect for teachers. (hindrance)
- SC22Q06: Disruption of classes by students. (hindrance)
- SC22Q08: Students intimidating or bullying other students. (hindrance)
- SC22Q10: Poor student-teacher relations. (hindrance)
- SC24Q01: Which statement below best characterises parental expectations towards your school? (There is constant pressure from many parents, who expect our school to set very high academic standards and to have our students achieve them = 1; Pressure on the school to achieve higher academic standards among students comes from a minority of parents = 2; Pressure from parents on the school to achieve higher academic standards among students is largely absent = 3)
- SC25Q01: Discussed their child's behaviour with a teacher on their own initiative. (parental participation, percentage)
- SC25Q03: Discussed their child's progress with a teacher on their own initiative. (parental participation, percentage)
- SC25Q05: Volunteered in physical activities, e.g. building maintenance, carpentry, gardening or yard work. (parental participation, percentage)
- SC25Q06: Volunteered in extra-curricular activities, e.g. book club, school play, sports, field trip. (parental participation, percentage)
- SC25Q07: Volunteered in the school library or media centre. (parental participation, percentage)
- SC25Q08: Assisted a teacher in the school. (parental participation, percentage)
- SC25Q09: Appeared as a guest speaker. (parental participation, percentage)
- SC25Q10: Participated in local school [...], e.g. parent council or school management committee. (parental participation, percentage)
- SC25Q11: Assisted in fundraising for the school. (parental participation, percentage)
- SC25Q12: Volunteered in the school [...]. (parental participation, percentage)

C.3 Construction of the Prova Brasil-based school management index

To construct the Prova Brasil-based school management index, we followed the three steps detailed in the construction of the PISA-based index. However, as we map variables from both the school director and teacher questionnaires, we take one further step: we collapse the teacher dataset to the school level by taking the average of all teacher responses within each school, combine these with the school principal responses, and compute the school-level index (a minimal sketch of this aggregation step is given below). The WMS-Prova Brasil 2013 mapping is detailed below.
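The aggregation step can be illustrated as follows, purely as a sketch: the identifier school_id and the example item lists are hypothetical, not the variable names used in the actual Prova Brasil files.

import pandas as pd

def build_school_level_data(teachers, principals, teacher_items, principal_items):
    """Collapse Prova Brasil teacher responses to the school level and
    combine them with the principal questionnaire.

    teachers: one row per teacher; principals: one row per school. Both are
    assumed to contain a 'school_id' column and already-scored item columns
    on the 0-1 scale described in Appendix C.1.
    """
    # Average all teacher responses within each school
    teacher_school = (teachers.groupby("school_id")[teacher_items]
                      .mean()
                      .reset_index())
    # Merge with the principal responses to obtain one row per school
    school = principals[["school_id"] + principal_items].merge(
        teacher_school, on="school_id", how="inner")
    return school

# Hypothetical usage: the school-level index is then built from the merged
# item columns, e.g. with the anderson_index sketch from Appendix C.1.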
For a harmonized version of the Prova Brasil mapping across 2007 to 2017, see Adelman et al. [2019].

Prova Brasil 2013 questionnaire mapping. The original appendix table has six columns: WMS question; Prova Brasil 2013 question; questionnaire (teacher or principal) and variable name; response option; value label; and management score. To keep the listing compact, two recurring response formats are abbreviated: "problem attribution" items share the teacher-questionnaire stem "Na sua percepção, os possíveis problemas de aprendizagem dos alunos das série(s) ou ano(s) avaliado(s) ocorrem, nesta escola, devido à/ao(s): ..." and are scored A Sim = 0.00, B Não = 1.00; "frequency" items are scored A Nunca = 0.00, B Algumas vezes = 0.33, C Frequentemente = 0.66, D Sempre ou quase sempre = 1.00.

Operations Management

1) Standardisation of Instructional Processes

a) How structured or standardised are the instructional planning processes across the school?
- Teacher TX_RESP_Q73: Não cumprimento dos conteúdos curriculares ao longo da trajetória escolar do aluno. (problem attribution)
- Teacher TX_RESP_Q74: Sobrecarga de trabalho dos professores, dificultando o planejamento e o preparo das aulas. (problem attribution)
- Teacher TX_RESP_Q106: Quanto do conteúdo previsto você conseguiu desenvolver com os alunos desta turma neste ano? (A Menos de 20% = 0.00; B De 20% a menos de 40% = 0.20; C De 40% a menos de 60% = 0.50; D De 60% a menos de 80% = 0.75; E 80% ou mais = 1.00)

b) What tools and resources are provided to teachers (e.g. standards-based lesson plans and textbooks) to ensure a consistent level of quality in delivery across classrooms?
- Principal TX_RESP_Q086: Como se deu a escolha do livro didático neste ano? (A Não sei = 0.00; B Foi escolhido de forma participativa pelos professores = 1.00; C Foi escolhido por somente alguns membros da equipe escolar = 0.50; D Foi escolhido por órgãos externos à escola = 0.50; E Foi escolhido de outra maneira = missing)
- Teacher TX_RESP_Q99: Os alunos desta turma têm livros didáticos? (A Não, esta turma não recebeu o livro didático = 0.00; B Sim, menos da metade da turma tem = 0.25; C Sim, metade da turma tem = 0.50; D Sim, a maioria tem = 0.75; E Sim, todos têm = 1.00)
- Teacher TX_RESP_Q44: Gostaríamos de saber quais os recursos que você utiliza para fins pedagógicos, nesta turma: Jornais e revistas informativas. (A Não utilizo porque a escola não tem = 0.00; B Nunca = 0.00; C De vez em quando = 0.50; D Sempre ou quase sempre = 1.00)
- Teacher TX_RESP_Q45 (same stem): livros de literatura em geral. (same scoring as TX_RESP_Q44)
- Teacher TX_RESP_Q48 (same stem): máquina copiadora (xerox). (same scoring as TX_RESP_Q44)

d) How does the school leader monitor and ensure consistency in quality across classrooms?
- Teacher TX_RESP_Q61: Nesta escola e neste ano, indique a frequência com que: O(A) diretor(a) dá atenção especial a aspectos relacionados com a aprendizagem dos alunos. (frequency)

2) Personalization of Instruction and Learning

a) How much does the school attempt to identify individual student needs? How are these needs accommodated within the classroom?
- Teacher TX_RESP_Q72: Conteúdos curriculares inadequados às necessidades dos alunos. (problem attribution)
c) What about students, how does the school ensure they are engaged in their own learning? How are parents incorporated in this process?
- Principal TX_RESP_Q045: Indique com qual frequência são desenvolvidas as seguintes atividades para minimizar as faltas dos alunos neste ano e nesta escola: Os professores conversam com os alunos para tentar solucionar o problema. (frequency)
- Principal TX_RESP_Q046 (same stem): Os pais/responsáveis são avisados por comunicação da escola. (frequency)
- Principal TX_RESP_Q047 (same stem): Os pais/responsáveis são chamados à escola para conversar sobre o assunto em reunião de pais. (frequency)
- Principal TX_RESP_Q048 (same stem): Os pais/responsáveis são chamados à escola para conversar sobre o assunto individualmente. (frequency)
- Principal TX_RESP_Q049 (same stem): A escola envia alguém à casa do aluno. (frequency)
- Teacher TX_RESP_Q78: Falta de assistência e acompanhamento dos pais na vida escolar do aluno. (problem attribution)
- Teacher TX_RESP_Q80: Desinteresse e falta de esforço do aluno. (problem attribution)
- Teacher TX_RESP_Q82: Alto índice de faltas por parte dos alunos. (problem attribution)

4) Adopting Educational Best Practices

a) How does the school encourage incorporating new teaching practices into the classroom?
- Principal TX_RESP_Q027: Qual foi a quantidade de docentes desta escola que participou das atividades de formação continuada que você organizou nos últimos dois anos? (A Não foram organizadas atividades de formação = 0.00; B Poucos professores = 0.25; C Um pouco menos da metade dos professores = 0.50; D Um pouco mais da metade dos professores = 0.75; E Quase todos ou todos os professores = 1.00)
- Teacher TX_RESP_Q65: Nesta escola e neste ano, indique a frequência com que: O(A) diretor(a) estimula atividades inovadoras. (frequency)
c) How does the school ensure that teachers are utilising these new practices in the classroom? How often does this happen?
- Teacher TX_RESP_Q61: Nesta escola e neste ano, indique a frequência com que: O(A) diretor(a) dá atenção especial a aspectos relacionados com a aprendizagem dos alunos. (frequency)

7) Performance Review

a) How often do you review (school) performance, formally or informally, with teachers and staff?
- Principal TX_RESP_Q031: Conselho de classe é um órgão formado por todos os professores que lecionam em cada turma/série. Neste ano, quantas vezes se reuniram os conselhos de classe desta escola? (A Não existe Conselho de Classe nesta escola = 0.00; B Nenhuma vez = 0.25; C Uma vez = 0.50; D Duas vezes = 0.75; E Três vezes ou mais = 1.00)
- Teacher TX_RESP_Q52: O Conselho de Classe é um órgão formado por todos os professores que lecionam em cada turma/série. Neste ano e nesta escola, quantas vezes se reuniu o Conselho de Classe? (A Não existe Conselho de Classe nesta escola = 0.00; B Nenhuma vez = 0.00; C Uma vez = 0.33; D Duas vezes = 0.66; E Três vezes ou mais = 1.00)

People Management

17) Promoting High Performers

b) How do you identify and develop your star performers?
- Teacher TX_RESP_Q75: Insatisfação e desestímulo do professor com a carreira docente. (problem attribution)
- Principal TX_RESP_Q026: Nos últimos dois anos, você organizou alguma atividade de formação continuada (atualização, treinamento, capacitação etc.) nesta escola? (A Não = 0.00; B Sim = 1.00)

18) Managing Talent

b) How do you ensure you have enough teachers of the right type in the school?
- Principal TX_RESP_Q040: Neste ano, qual foi o principal critério para a atribuição das turmas aos professores? (A Preferência dos professores = 0.00; B Escolha dos professores, de acordo com a pontuação por tempo de serviço e formação = 0.50; C Professores experientes com turmas de aprendizagem mais rápida = 1.00; D Professores experientes com turmas de aprendizagem mais lenta = 1.00; E Manutenção do professor com a mesma turma = 0.50; F Revezamento dos professores entre as séries = 0.50; G Sorteio das turmas entre os professores = 0.50; H Atribuição pela direção da escola = 0.50; I Outro critério = missing; J Não houve critério = 0.00)

20) Attracting Talent/Creating a Distinctive Employee Value Proposition

a) What makes it distinctive to teach at your school, as opposed to other similar schools? If you were to ask the last three candidates, would they agree? Why?
- Teacher TX_RESP_Q64: Nesta escola e neste ano, indique seu grau de concordância: O(A) diretor(a) me anima e me motiva para o trabalho. (frequency)
- Teacher TX_RESP_Q66: Nesta escola e neste ano, indique a frequência com que: sinto-me respeitado(a) pelo(a) diretor(a). (frequency)
- Teacher TX_RESP_Q67: Nesta escola e neste ano, indique a frequência com que: tenho confiança no(a) diretor(a) como profissional. (frequency)
- Teacher TX_RESP_Q68: Nesta escola e neste ano, indique a frequência com que: participo nas decisões relacionadas com o meu trabalho. (frequency)
- Teacher TX_RESP_Q69: Nesta escola e neste ano, indique a frequência com que: a equipe de professores leva em consideração as minhas idéias. (frequency)