WPS5890
Policy Research Working Paper 5890

Assessment Testing Can Be Used to Inform Policy Decisions: The Case of Jordan

Husein Abdul-Hamid
Khattab M. Abu-Lebdeh
Harry Anthony Patrinos

The World Bank
Human Development Network
Education Team
December 2011

Abstract

Over the past two decades, the Jordanian education system has made significant advances. Net enrollment in basic education increased from 89 percent in 2000 to 97 percent in 2006. Transition rates to secondary education increased from 63 to 79 percent in the same period. At the same time, Jordan made significant gains on international surveys of student achievement, with a particularly impressive gain of almost 30 points on the science portion of the Third International Mathematics and Science Study. Changes in test scores over time are presented and analyzed using decomposition analysis. The trends are related to policy changes over time. It is argued that benchmarking education systems and constant feedback between researchers and policymakers contributed to this achievement.

This paper is a product of the Education Team, Human Development Network. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://econ.worldbank.org. The author may be contacted at hpatrinos@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Produced by the Research Support Team

Assessment Testing Can Be Used to Inform Policy Decisions: The Case of Jordan

Husein Abdul-Hamid, Khattab M. Abu-Lebdeh, and Harry Anthony Patrinos(1)

JEL: I21, J24
Keywords: Education quality, learning assessment, benchmarking education systems, Jordan

(1) Abdul-Hamid is at the University of Maryland; Abu-Lebdeh is at the National Center for Human Resources Development (NCHRD), Jordan; and Patrinos is at the World Bank (hpatrinos@worldbank.org). All views expressed here are those of the authors and should not be attributed to their respective organizations.

Introduction

Many countries are struggling with the quality of their education systems. Efforts to reform education are often met with resistance and hampered by the lack of a model to follow. Many of the top performers in the world are high-income countries with many years of system development behind them. Many middle-income and even high-income countries are only just starting to undertake important reforms. Resource-rich countries, such as the Gulf Cooperation Council (GCC) countries, are making significant investments in their systems but have yet to see results. What is often lacking is experience from middle-income countries that have made progress. A useful starting point is the entry of many GCC countries into student assessments that offer them a benchmark on results.
Few examples exist, however, of countries making use of such assessments to inform their reform efforts. Success stories are scarcer still. Jordan provides a useful case of a country that used an international assessment to benchmark and reform its system; more importantly, Jordan made great strides not only in the implementation of the program but also in improving the system.

The literature on the effectiveness of education initiatives in developing countries is scarce. It is also not clear how assessments themselves affect the improvement of national educational policies. Jordan is one of the few developing countries that have taken student assessment seriously. It is a small country that invests extensively in improving its education system because human capital is the major resource Jordan has, especially in comparison to its oil-rich neighbors. The role of education is important in producing students equipped with the knowledge and skills crucial for Jordan's growth and development, especially as the country is actively attempting to attract foreign investment. Policymakers in Jordan have always wanted to know what works in their education system and have been experimenting with different educational interventions, including comprehensive enhancements to the curricula, assessment tools, and technology, as well as restructuring of the education system and its institutions.

Jordan's investments in improving the quality of education in past decades seem to have paid off. There has been a noticeable impact on student learning since the early 1990s. In the 1991 International Assessment of Educational Progress (IAEP), out of 20 participating countries, Jordan finished ahead of only Brazil and Mozambique in the mathematics and science tests for 13-year-olds. By the late 1990s, there was a marked change, as seen in the 1999 Third International Mathematics and Science Study (TIMSS) where, out of 38 countries, Jordan finished ahead of six (Iran, Indonesia, Chile, the Philippines, Morocco and South Africa) in mathematics and ahead of eight (Iran, Indonesia, Turkey, Tunisia, Chile, the Philippines, Morocco and South Africa) in science. The progress does not stop there. In 2003, Jordan improved its TIMSS science score to 475 from 450 in 1999, an increase of 25 points, or 0.25 standard deviations (the TIMSS scale has a standard deviation of 100), a significant gain equivalent to about a whole year of learning. In 2007, Jordan continued to improve, surpassing several countries that had similar or slightly higher performance in 1999 and ending up significantly above the international average. In fact, between 1999 and 2007, no other country improved as much in science as Jordan did.

Researchers have used international assessments to analyze the determinants of learning (Hanushek and Luque 2003; Hanushek and Kimko 2000; Barro 2001; Lee and Barro 2001; Afonso and Aubyn 2006; Bedard and Ferrall 2002; Hanushek and Woessmann 2006; Alvarez, Garcia-Moreno and Patrinos 2007; Nabeshima 2003; Fertig 2003; Fertig and Schmidt 2002; Woessmann 2003; Fuchs and Woessmann 2007). While most analyses are cross-country, there is an increasing trend to look at individual countries in depth. This paper documents the ascent of Jordan in international assessments and describes the process of preparing for the numerous assessments in which Jordan takes part.
The change in scores over time is analyzed using the decomposition methodology usually applied in wage regression research, here used to measure the effects of resources versus efficiency in explaining score changes. It is shown that a significant part of the overall increase in scores is associated with Jordan's educational inputs becoming more efficient.

Background

Jordan's assessment of the status of student learning outcomes through international comparisons started in the early 1990s; its first participation, in 1991, made it the first Arab country to take part in such studies. At the same time that the International Assessment of Educational Progress (IAEP II) was launched, Jordan began its review of the education system and a comprehensive reform. Jordan has been participating in the major international exams: the International Assessment of Educational Progress (IAEP) in 1991, the Trends in International Mathematics and Science Study (TIMSS-R) in 1999, TIMSS in 2003, the Program for International Student Assessment (PISA) in 2006, TIMSS in 2007, and PISA in 2009. Using these tests and its national assessments, Jordan has benchmarked its performance: with IAEP it assessed performance at the end of the primary cycle in science and math; with TIMSS the focus is on science and math for students in grade 8, in parallel to education reforms; and with PISA and NAfKE it assesses structural diagnostics of skills at the end of the compulsory school stage.

Early IAEP results in 1991 were alarming, as Jordan ranked 18th among the 20 participating countries. IAEP II provided crucial data on educational performance but also allowed the country the opportunity to learn assessment techniques (sample selection, test administration, implementation monitoring). IAEP was thus instrumental in building national capacity for conducting surveys of student achievement.

Jordan's students ranked near the bottom in IAEP II, and the results came as a shock. Almost 75 percent of students in mathematics and 67 percent of students in science scored lower than the international average. Jordan ranked third from the bottom in both subjects among the 20 participating countries.

While the education reforms and projects prompted by the assessment results were expected to take time to show effects, significant positive improvement in TIMSS started to appear in 2003, especially in science, after serious interventions and follow-up on the gaps in the curriculum and in teacher training. Continuous, significant improvement has been observed for female students (see Figure 1).

Figure 1: Jordan's Performance in the Trends in International Mathematics and Science Study
[Line chart, 1999-2007: science scores (scale 400-560) for all students, private schools, public schools, males, and females.]

Figure 2: Jordan's Math Performance in the Trends in International Mathematics and Science Study
[Line chart, 1999-2007: math scores (scale 400-520) for all students, private schools, public schools, males, and females.]

Jordan's 8th-grade students perform relatively well in science but still lag in mathematics. While there was some improvement for females, math remains a problem, as no serious improvement is seen overall; using math to solve practical, real-life problems is still a challenge (see Figure 2).

Impetus for Reform

The results were alarming, as performance was extremely poor. That was a wake-up call.
As a follow-up, Jordan sped up its efforts to reform education and went through consecutive comprehensive reforms of its system. The curriculum was targeted and reviewed, and new textbooks were developed. Teacher qualifications were reviewed and evaluated; to this end, massive teacher upgrading through a university bridging program was implemented. The two-year institutions where pre-service teacher training was conducted were hard hit: all these certification providers were mandated to consolidate within the university system, and no new teachers with two-year degrees were permitted.

The actions taken by the authorities in the aftermath of the IAEP results can be summarized as follows:

- Expert committees were established to investigate the causes of poor performance
- The IAEP test was examined item by item and compared to the curricula
- The entire examination was re-administered (the results were identical to those obtained during the first round of testing, so officials accepted the results)
- Benchmarks for 13-year-olds' achievement were established
- Strengths and weaknesses in each subject were identified
- The performance of different groups of students was compared
- Results were used to inform teacher training
- Characteristics related to achievement were analyzed
- Negative and positive influences were targeted

A national center with a focus on assessment and education research was established by the government in 1990 and commissioned to follow up on the education initiatives. The center, initially the National Center for Educational Research and Development (NCERD) and later renamed the National Center for Human Resources Development (NCHRD), was given autonomy and designed a longitudinal system to monitor the learning achievement of students and assess the instructional quality of basic education.

Over the years NCHRD conducted systematic national assessment studies and produced and disseminated several reports, which were widely circulated in the country. NCHRD also guided Jordan's participation in international exams to supplement its efforts in the assessment of student learning.

In recent years, NCHRD has been in charge of developing and executing a comprehensive evaluation framework for Jordan's largest education reforms. It is based on a mixed-method approach. It uses continuous and systematic assessment of students' performance based on national assessments and international studies, incorporating the Trends in International Mathematics and Science Study (TIMSS) and the Program for International Student Assessment (PISA). It has also supplemented the national assessment program with a new assessment tool, conducted biannually since 2008, that focuses on skills needed for the knowledge economy: the National Assessment for Knowledge Economy Skills (NAfKE). The approach also includes regular observation and evaluation of what is happening in schools and classrooms, in addition to evaluations of different designs and experiments (such as the Jordan Education Initiative's Discovery Schools, which utilize technology-rich instruction).

Initial analysis of TIMSS 1999 (Abdul-Hamid 2001) indicated that socioeconomic and family characteristics related to education continued to have the biggest influence on student achievement.
Between-school differences in achievement were associated with school authority (public versus private, with private schools superior), school location (urban versus rural, with urban locations producing better results), and school climate (including teacher morale). Gender was also a significant factor in achievement, to the advantage of girls. School resources and teacher qualifications were also investigated and tended to have a positive influence on achievement.

Jordan also used the international results to compare itself with the world's best achievers. It reviewed systems such as those of Japan, Singapore, and Chinese Taipei, and organized study tours to Korea, Japan and Singapore. It used these benchmarking activities to guide the educational reforms within Jordan.

Following up on the analyses, the Ministry of Education, in collaboration with NCHRD, developed teacher guides and initiated nationwide discussions and teacher training to overcome the lack of understanding of specific topics in the curriculum and to correct misconceptions. Jordanian authorities developed a feedback loop between those researching the education system and those implementing change. Even more testing was conducted, now on a continuous basis. The results of such research were used to identify gaps and propose solutions. Thus, teacher guides were developed, teacher training was improved, and workshops were organized for teachers.

While noticeable improvements appeared in TIMSS, PISA identified new challenges. Results of the 2006 PISA indicated a need to improve the quality of instruction to prepare students to use reading, math and science skills to synthesize and solve problems (see Figure 3). Mastery of higher-order thinking and life skills is still a big challenge and has been an objective of the latest two education reforms. The major goal is to raise the overall level and reduce the percentage of students at the lowest international benchmarks.

Figure 3: Jordan's Performance in the Program for International Student Assessment (PISA 2006)
[Stacked bar chart: percentage of students at each proficiency level (Below Level 1 through Level 6) in reading, math, and science.]

Table 1: Performance of Students on the National Assessment of Knowledge Economy Skills

| Content Domain | Knowledge Economy Skill | Grade 5, 2006 | Grade 5, 2008 | Grade 9, 2006 | Grade 9, 2008 | Grade 11, 2006 | Grade 11, 2008 |
| Math | Communication | 39.1 | 39.6 | 39.1 | 42.5* | 38.4 | 40.9* |
| Math | Information Management | 28.0 | 28.4 | 36.3 | 37.1 | 17.2 | 18.1 |
| Math | Using Symbols | 22.3 | 22.9 | 42.4 | 45.3* | 33.1 | 35.8* |
| Math | Problem Solving | 21.4 | 21.5 | 28.2 | 28.9 | 21.3 | 21.8 |
| Science | Communication | 44.3 | 44.7 | 48.3 | 53.5* | 38.4 | 40.6* |
| Science | Information Management | 51.0 | 51.2 | 42.1 | 46.3* | 43.2 | 43.4 |
| Science | Using Symbols | - | - | - | - | 54.1 | 54.6 |
| Science | Problem Solving | 47.2 | 47.1 | 31.2 | 31.5 | 33.0 | 33.7 |
| Arabic | Communication | 50.3 | 53.9* | 51.0 | 60.1* | 53.1 | 63.4* |
| Arabic | Information Management | 40.0 | 45.3* | 49.1 | 57.3* | 56.0 | 65.3* |
| Arabic | Problem Solving | 37.2 | 40.1 | 38.2 | 45.9* | 55.0 | 56.1 |

* indicates significant improvement

As a result of PISA 2006, the assessment focus turned to skills, for which the specialized national assessment, NAfKE, was created by NCHRD. The first implementation helped focus attention on the main issues. Table 1 shows the main results on the different skill dimensions, with improvements between 2006 and 2008 in some areas.

Methodology and Estimation

The improvement in TIMSS results in 2003 and 2007 relative to 1999 was the most noticeable. Hence, in this paper we investigate what contributed to the change, using the regression decomposition methodology.
The first step is to specify and estimate student achievement in relation to individual, family, school and institutional inputs. We then decompose the over-time test score gap into an explained component (accounting for student, family, school and institutional characteristics) and an "unexplained" component (the returns, or the efficiency with which the country converts characteristics into student learning outcomes as measured by test scores), using the traditional Oaxaca (1973)-Blinder (1973) decomposition method.

The model specification for the estimation of the achievement function is as follows:

T_{ija} = T_a(A_{ija}, F_{ija}, S_{ija}, I_{ija}) + \epsilon_{ija}    (1)

where T_{ija} is the observed TIMSS score of student i in household j at time a (the time of the test); A_{ija} is a vector of individual (student) characteristics; F_{ija} is a vector of parent characteristics; S_{ija} is a vector of school-related inputs; I_{ija} is a vector of institutional characteristics; and \epsilon_{ija} is an additive error, which includes all the omitted variables, including those relating to the history of past inputs, endowed mental capacity, and measurement error. Todd and Wolpin (2003) discuss in detail the assumptions that would justify this specification, in which the achievement test score depends solely on contemporaneous measures of family, school and other inputs. These assumptions state that: (a) current input measures capture the entire history of inputs or, alternatively, only contemporaneous inputs matter; and (b) contemporaneous inputs are unrelated to endowed mental capacity. The linear specification of our estimation model, after dropping the subscript a for convenience, is given by:

T_{ij} = \beta_0 + \beta_1 A_{ij} + \beta_2 F_{ij} + \beta_3 S_{ij} + \beta_4 I_{ij} + \epsilon_{ij}    (2)

where \beta_0 to \beta_4 are coefficients to be estimated.

The standard procedure for analyzing the determinants of test score differences over time is to fit equations between test scores and observed characteristics. The observed test score differential can be decomposed as:

\bar{T}_{2003} - \bar{T}_{1999} = (\bar{X}_{2003} - \bar{X}_{1999})\beta_{2003} + \bar{X}_{1999}(\beta_{2003} - \beta_{1999})    (3)

where T is the standardized test score, X_i is a vector of student, family, school and institutional characteristics for the i-th individual, \beta is a vector of coefficients, and the subscripts 1999 and 2003 identify the TIMSS scores in those years. The overall test-score increase can therefore be decomposed into two components: one is the portion attributed to differences in characteristics (\bar{X}_{2003} - \bar{X}_{1999}), evaluated at the 2003 coefficients (\beta_{2003}); the other is attributable to differences in the effects on performance (\beta_{2003} - \beta_{1999}) that 1999 and 2003 students derive from the same characteristics. This second component, while more difficult to interpret in the present context than in an earnings gap decomposition framework, can be assigned more than one interpretation. An obvious one is that the unexplained portion of the test score increase may reflect certain unobserved characteristics that are correlated with achievement over time; or it could be the returns to the observed characteristics, meaning how productively the given resources were used to produce educational outputs, measured here as student test scores.

Certain of the above coefficient estimates may be subject to biases. For example, if a school characteristic is correlated with unobserved family characteristics that influence achievement (such as family wealth and parents' motivation), then the effect of attending a school with such characteristics may be biased. A stylized sketch of the two-step procedure is given below.
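To make the mechanics concrete, the following is a minimal Python sketch of the two-step procedure, offered as an illustration rather than the authors' actual code. The file names, the DataFrames df_1999 and df_2003, and the regressor list X_COLS are hypothetical stand-ins, not the variables of the actual TIMSS files.

```python
# Illustrative Oaxaca-Blinder decomposition of the 1999-2003 TIMSS score gap.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical student-level extracts of the two TIMSS rounds.
df_1999 = pd.read_csv("timss_jordan_1999.csv")
df_2003 = pd.read_csv("timss_jordan_2003.csv")

# Illustrative subset of the characteristics used in the paper.
X_COLS = ["public", "urban", "stu_teacher_ratio", "female", "mother_univ"]

def fit_achievement(df):
    """Step 1: estimate the linear achievement function T = b0 + Xb + e (equation 2)."""
    X = sm.add_constant(df[X_COLS])
    return sm.OLS(df["score"], X).fit()

res_99 = fit_achievement(df_1999)
res_03 = fit_achievement(df_2003)

# Mean characteristics in each year, with a leading 1 for the constant.
xbar_99 = np.r_[1.0, df_1999[X_COLS].mean().to_numpy()]
xbar_03 = np.r_[1.0, df_2003[X_COLS].mean().to_numpy()]

# Step 2: split the gap as in equation (3):
#   T03 - T99 = (X03 - X99)'b03  +  X99'(b03 - b99)
explained = (xbar_03 - xbar_99) @ res_03.params.to_numpy()
unexplained = xbar_99 @ (res_03.params - res_99.params).to_numpy()

gap = df_2003["score"].mean() - df_1999["score"].mean()
print(f"gap: {gap:.1f} points, explained: {explained / gap:.1%}, "
      f"unexplained: {unexplained / gap:.1%}")
```

Because an OLS fit passes through the sample means, the two components sum exactly to the observed gap. The sketch omits the sampling weights and plausible-value handling that full TIMSS analysis requires.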
Modified Decomposition

An alternative decomposition is possible using a modified Oaxaca-Blinder method, in which the unexplained part of the test-score differential is captured by a year indicator taking the value of 1 for 2003 and 0 otherwise (1999). Consider a production function for cognitive achievement:

T_{ija} = T_a(Y2003_{ij}, A_{ija}, F_{ija}, S_{ija}, I_{ija}) + \epsilon_{ija}    (4)

where Y2003_{ij} is a dummy variable equal to 1 if the test was taken in 2003 and 0 otherwise. In implementing a modified Oaxaca decomposition of the test score gap, and assuming a linear specification, the difference in mean test scores between 2003 and 1999 students is given by:

\bar{T}_{2003} - \bar{T}_{1999} = \beta_1 + \beta_2(\bar{A}_{2003} - \bar{A}_{1999}) + \beta_3(\bar{F}_{2003} - \bar{F}_{1999}) + \beta_4(\bar{S}_{2003} - \bar{S}_{1999})    (5)

where the coefficient \beta_1 is an estimate of the portion of the gap that remains (the gain in efficiency) after accounting for the differences in mean characteristics. The unexplained and explained proportions are:

\beta_1 / (\bar{T}_{2003} - \bar{T}_{1999}) = unexplained

[\beta_2(\bar{A}_{2003} - \bar{A}_{1999}) + \beta_3(\bar{F}_{2003} - \bar{F}_{1999}) + \beta_4(\bar{S}_{2003} - \bar{S}_{1999})] / (\bar{T}_{2003} - \bar{T}_{1999}) = explained

and the components of the explained portion are:

\beta_2(\bar{A}_{2003} - \bar{A}_{1999}) = individual characteristics
\beta_3(\bar{F}_{2003} - \bar{F}_{1999}) = family
\beta_4(\bar{S}_{2003} - \bar{S}_{1999}) = school/teacher

While test scores and individual and family information are at the individual level, school resources and other school-related inputs are at the school level. In choosing the estimation method we recognize that observed test scores are expected to be correlated at the school level due to clustering effects; therefore, the assumption that disturbances are independently and identically distributed with fixed conditional variance does not hold. OLS with standard errors clustered at the school level is used. A minimal sketch of this pooled estimation is given below.
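As an illustration of the pooled specification, the sketch below continues the hypothetical example above; the pooled DataFrame, the year2003 dummy, and the school_id cluster variable are assumed names, not those of the actual data files.

```python
# Illustrative modified decomposition (equations 4-5): pooled OLS with a
# year dummy and standard errors clustered at the school level.
import pandas as pd
import statsmodels.formula.api as smf

# Stack the two hypothetical rounds and flag the 2003 observations.
pooled = pd.concat([df_1999.assign(year2003=0), df_2003.assign(year2003=1)],
                   ignore_index=True)

# Scores are correlated within schools, so i.i.d. errors cannot be assumed;
# cluster the standard errors on the (assumed) school identifier.
formula = "score ~ year2003 + " + " + ".join(X_COLS)
res = smf.ols(formula, data=pooled).fit(
    cov_type="cluster", cov_kwds={"groups": pooled["school_id"]}
)

# beta_1, the coefficient on the year dummy, estimates the unexplained
# (efficiency) portion of the score gap after controlling for characteristics.
gap = (pooled.loc[pooled["year2003"] == 1, "score"].mean()
       - pooled.loc[pooled["year2003"] == 0, "score"].mean())
print(f"unexplained share of the gap: {res.params['year2003'] / gap:.1%}")
```

Constraining the slope coefficients to be equal across years is what distinguishes this pooled variant from the two-equation decomposition sketched earlier.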
Results

The decomposition results are summarized in Table 2. A significant proportion of the increase in scores over time, about two-thirds, is unexplained by changes in observed characteristics. Sixteen percent of the total difference is due to the following improvements: higher teacher confidence, higher student self-confidence, and more emphasis on problem solving in classroom instruction. The attention the country gave to empowering teachers with training and materials for tackling problem solving increased teachers' confidence and effectiveness, and that was reflected in the improvement observed.

Most of the difference, however, is "unexplained." Here, "unexplained" refers to the returns to observable characteristics. That is, for the same level of resources, Jordan's schools are producing more output (student test scores); put another way, Jordan's teachers are able to add more value with a given level of resources. More than two-thirds of the improvement is due to more effective use of resources, or the increased value added of Jordan's teachers. This is reflected in the fact that the urban advantage disappeared over time. Moreover, while the student-teacher ratio increased slightly over time, teachers' effectiveness in handling large classes improved; that is, the system became much more efficient, educating more children while improving their test scores at the same time. The female advantage more than doubled over this short period.

Table 2: TIMSS scores decomposition (1999-2003)

| Determinant | b1999 | b2003 | X1999 | X2003 | b2003(X2003-X1999) | X1999(b2003-b1999) | Explained (% of total diff.) | Unexplained (% of total diff.) |
| Constant | 450.3 | 476.8 | 1.0 | 1.0 | 0.0 | 26.5 | 0.0% | 38.7% |
| Public school | -36.2 | -32.6 | 0.8 | 0.8 | 0.3 | 2.9 | 0.5% | 4.2% |
| Urban | 15.7 | -14.3 | 0.7 | 0.6 | 0.6 | -19.8 | 0.8% | -28.9% |
| Student-teacher ratio | 0.6 | 1.2 | 23.5 | 24.6 | 1.3 | 14.1 | 1.9% | 20.6% |
| School size | 0.0 | 0.0 | 749.1 | 753.1 | 0.0 | 0.0 | 0.0% | 0.0% |
| Teacher qualification (univ.) | 14.3 | 17.7 | 0.9 | 0.9 | 0.0 | 3.1 | 0.0% | 4.5% |
| Training certificate | 4.6 | 6.2 | 0.5 | 0.7 | 1.2 | 0.8 | 1.8% | 1.2% |
| School resources (shortage) | -4.2 | -5.6 | 0.7 | 0.6 | 0.6 | -1.0 | 0.8% | -1.4% |
| Total hours teaching | 1.4 | 1.9 | 21.9 | 21.3 | -1.1 | 11.0 | -1.7% | 16.0% |
| High morale | 9.3 | 19.7 | 0.5 | 0.6 | 2.0 | 5.2 | 2.9% | 7.6% |
| Homework per week | 7.1 | 6.4 | 3.6 | 3.9 | 1.9 | -2.5 | 2.8% | -3.7% |
| Computer for instruction | 2.6 | 2.8 | 0.2 | 0.2 | 0.0 | 0.0 | 0.0% | 0.1% |
| High teacher confidence | 9.4 | 22.7 | 0.3 | 0.5 | 4.5 | 4.0 | 6.6% | 5.8% |
| Emphasis on problem solving | 7.7 | 11.5 | 0.2 | 0.5 | 3.3 | 0.8 | 4.9% | 1.2% |
| Female | 7.6 | 18.6 | 0.5 | 0.5 | 0.0 | 5.5 | 0.0% | 8.1% |
| Self-confidence | 11.1 | 18.3 | 0.2 | 0.4 | 3.7 | 1.4 | 5.3% | 2.1% |
| Mother: lower secondary | 5.6 | 7.5 | 0.1 | 0.2 | 0.7 | 0.2 | 1.1% | 0.3% |
| Mother: upper secondary | 13.7 | 12.9 | 0.3 | 0.4 | 1.3 | -0.2 | 1.9% | -0.4% |
| Mother: university | 19.6 | 21.2 | 0.1 | 0.2 | 2.1 | 0.2 | 3.1% | 0.2% |
| 11-100 books | 6.3 | 1.8 | 0.7 | 0.7 | 0.0 | -3.2 | 0.0% | -4.6% |
| 101-500 books | 15.2 | 1.3 | 0.2 | 0.1 | -0.1 | -2.8 | -0.2% | -4.1% |
| Total | | | | | 22.3 | 46.1 | 32.6% | 67.4% |

Overall difference: 68.4 points (100.0%)

But perhaps the greatest proof that the Jordanian reforms paid off is the large size of the returns to total hours teaching. There is no real difference in the number of hours devoted to teaching, but there is a significant positive change in the returns to hours teaching, which alone accounts for 16 percent of the improvement in test scores over time. This shows that Jordanian teachers have become more effective at conveying the material in the classroom.

Conclusions

Over the last two decades, the Jordanian education system made significant advances. Significant gains were made on international surveys of student achievement, with a particularly impressive gain of almost 30 points in science. This paper shows that several policy actions, spurred by initial reactions to the shock of low scores in international comparison, together with practical guidance from policymakers to implementers and teachers, were responsible for the gains in student achievement. Benchmarking the education system and constant feedback between researchers and providers contributed to this achievement. Other education systems therefore stand to learn from the Jordanian experience.

The proper use of assessment results can provide significant returns, and the cost of assessment is worthwhile given the significant benefits the system receives. While assessments, national and international, have many uses, primarily as part of the effort to evaluate the education system, this case shows that they can be a wake-up call for action, a tool for informing the system, and an objective metric for monitoring progress over time. Assessments, therefore, can be used to establish benchmarks, both in international comparison and as national standards. Most importantly, assessments can be used to inform policy responses and to generate real-time, useful information for providers.

Jordan's experience suggests how countries can use international assessments and education reforms to improve the quality of their education systems. First, participation in international assessments is a must.
It provides the country with useful international benchmarks and a wealth of information. Second, rigorous analysis of the determinants of learning and comparison with top performers is needed. Third, the benchmarks and analyses must be carried into curriculum development and teacher training; there must be feedback loops among research, curriculum and professional development as part of a comprehensive reform. Finally, monitoring of implementation and results must be continuous and meaningful.

References

Abdul-Hamid, H. 2001. "What Jordan Needs to Do to Prepare for the Knowledge Economy: Lessons Learned from TIMSS-R." University of Maryland (processed).

Afonso, A. and M. St. Aubyn. 2006. "Cross-Country Efficiency of Secondary Education Provision: A Semi-Parametric Analysis with Non-Discretionary Inputs." Economic Modelling 23(3): 476-491.

Alvarez, J., V. Garcia-Moreno and H.A. Patrinos. 2007. "Institutional Effects as Determinants of Learning Outcomes: Exploring State Variations in Mexico." Well-Being and Social Policy 3(1).

Barro, R.J. 2001. "Human Capital and Growth." American Economic Review, Papers and Proceedings 91(2): 12-17.

Blinder, A. 1973. "Wage Discrimination: Reduced Form and Structural Estimates." Journal of Human Resources 8(4): 436-455.

Fertig, M. 2003. "Who Is to Blame? The Determinants of German Students' Achievement in the PISA 2000 Study." IZA Discussion Paper No. 739.

Fertig, M. and C.M. Schmidt. 2002. "The Role of Background Factors for Reading Literacy: Straight National Scores in the PISA 2000 Study." IZA Discussion Paper No. 545.

Fuchs, T. and L. Woessmann. 2007. "What Accounts for International Differences in Student Performance? A Re-Examination Using PISA Data." Empirical Economics 32(2-3): 433-464.

Hanushek, E. and J. Luque. 2003. "Efficiency and Equity in Schools around the World." Economics of Education Review 22(5): 481-502.

Hanushek, E. and D. Kimko. 2000. "Schooling, Labor-Force Quality, and the Growth of Nations." American Economic Review 90(5): 1184-1208.

Hanushek, E. and L. Woessmann. 2006. "Does Educational Tracking Affect Performance and Inequality? Differences-in-Differences Evidence across Countries." Economic Journal 116(510): C63-C76.

Lee, J.-W. and R. Barro. 2001. "Schooling Quality in a Cross-Section of Countries." Economica 68(272): 465-488.

Nabeshima, K. 2003. "Raising the Quality of Secondary Education in East Asia." World Bank Policy Research Working Paper No. 3140.

Oaxaca, R. 1973. "Male-Female Wage Differentials in Urban Labor Markets." International Economic Review 14(3): 693-709.

Todd, P. and K. Wolpin. 2003. "On the Specification and Estimation of the Production Function for Cognitive Achievement." Economic Journal 113(485): F3-F33.

Woessmann, L. 2003. "Schooling Resources, Educational Institutions, and Student Performance: The International Evidence." Oxford Bulletin of Economics and Statistics 65(2): 117-170.

World Bank. 2005. Mexico: Determinants of Learning Policy Note (Report No. 31842-MX). Latin America and the Caribbean, Human Development.