A WORLD BANK STUDY

Improving Learning in Uganda Vol. II
Problematic Curriculum Areas and Teacher Effectiveness: Insights from National Assessments

Innocent Mulindwa Najjumba and Jeffery H. Marshall

Washington, D.C.

© 2013 International Bank for Reconstruction and Development / The World Bank
1818 H Street NW, Washington DC 20433
Telephone: 202-473-1000; Internet: www.worldbank.org

Some rights reserved
1 2 3 4  16 15 14 13

World Bank Studies are published to communicate the results of the Bank's work to the development community with the least possible delay. The manuscript of this paper therefore has not been prepared in accordance with the procedures appropriate to formally edited texts.

This work is a product of the staff of The World Bank with external contributions. Note that The World Bank does not necessarily own each component of the content included in the work. The World Bank therefore does not warrant that the use of the content contained in the work will not infringe on the rights of third parties. The risk of claims resulting from such infringement rests solely with you.

The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

Nothing herein shall constitute or be considered to be a limitation upon or waiver of the privileges and immunities of The World Bank, all of which are specifically reserved.

Rights and Permissions

This work is available under the Creative Commons Attribution 3.0 Unported license (CC BY 3.0) http://creativecommons.org/licenses/by/3.0.
Under the Creative Commons Attribution license, you are free to copy, distribute, transmit, and adapt this work, including for commercial purposes, under the following conditions:

Attribution—Please cite the work as follows: Najjumba, Innocent Mulindwa and Jeffery H. Marshall. 2013. Improving Learning in Uganda Vol. II: Problematic Curriculum Areas and Teacher Effectiveness: Insights from National Assessments. Washington, DC: World Bank. doi:10.1596/978-0-8213-9850-0 License: Creative Commons Attribution CC BY 3.0.

Translations—If you create a translation of this work, please add the following disclaimer along with the attribution: This translation was not created by The World Bank and should not be considered an official World Bank translation. The World Bank shall not be liable for any content or error in this translation.

All queries on rights and licenses should be addressed to the Office of the Publisher, The World Bank, 1818 H Street NW, Washington, DC 20433, USA; fax: 202-522-2625; e-mail: pubrights@worldbank.org.

ISBN (paper): 978-0-8213-9850-0
ISBN (electronic): 978-0-8213-9860-9
DOI: 10.1596/978-0-8213-9850-0

Cover photo: A student in primary school in Kampala, Uganda. © Arne Hoel / World Bank

Library of Congress Cataloging-in-Publication Data has been requested.
Contents

Acknowledgments xi
Abbreviations and Acronyms xiii
Executive Summary xv
  Analytical Framework xv
  Main Findings xvi
  Suggestions for Next Steps xxii
Chapter 1 Introduction and Methodology 1
  Introduction 1
  School Curriculum in Uganda 1
  Background to This Work 3
  Rationale and Objectives of This Report 4
  Methodology 5
  Report Outline 12
  Notes 12
Chapter 2 Learning Outcomes and Problematic Curriculum Areas 13
  Overall Achievement Levels in Numeracy and Literacy 13
  Student Achievement Levels in Literacy 13
  Importance of and Overall Achievement in Numeracy 20
  Overall Achievement Levels in Biology 25
  Summary of Assessment Results 28
  Problematic Curriculum Areas 29
  Problem Areas in Numeracy 41
  Problem Areas in Biology 55
  Note 57
Chapter 3 Teacher Knowledge and Effectiveness 59
  Primary Student and Teacher Achievement Levels in Numeracy 61
  Predictors of Teacher and Student Performance 77
  Linkages with the Primary Leaving Exams (PLE) 90
  Note 93
Chapter 4 Discussion and Suggestions on Next Steps 95
  Emerging Issues in Literacy and Numeracy Achievement 95
  Teacher Content Knowledge and Effectiveness 98
  Suggestions for Next Steps 100
Appendix A P3 Literacy Test Blueprints 2006 103
Appendix B Summary Tables for English Literacy 107
Appendix C Summary Tables for Numeracy 119
Appendix D 133
References 141

Figures

Figure 1.1: Knowledge Components of Effective Teaching 10
Figure 2.1: Summary of P3 English Literacy, Overall Percentage Correct, Uganda, 2009/10 13
Figure 2.2: Summary of P3 Average Literacy Scores by Level, Uganda, 2009/10 14
Figure 2.3: Summary of P6 English Literacy, Overall Percent Correct, Uganda, 2009/10 14
Figure 2.4: Summary of P6 Average Literacy Scores, Uganda, 2009/10 15
Figure 2.5: Summary of S2 English Literacy, Overall and Within Common Content Areas, Uganda, 2008–10 15
Figure 2.6: Summary of S2 English Literacy by Proficiency Levels, Uganda, 2008–10 15
Figure 2.7: Female-Male Difference in Overall English Literacy by Grade, Uganda, 2010 16
Figure 2.8: Female-Male Difference in Overall English Literacy by Grade, Uganda, 2006–09 16
Figure 2.9: Urban-Rural Difference in Overall English Literacy by Grade, Uganda, 2010 17
Figure 2.10: Urban-Rural Difference in Overall English Literacy, Uganda, 2006–08 18
Figure 2.11: Government-Private School Difference in Overall English Literacy, Uganda, 2010 19
Figure 2.12: Government-Private School Difference in Overall English Literacy, Uganda, 2006–09 20
Figure 2.13: Summary of Numeracy, Overall Percent Correct in P3, P6, and S2, Uganda, 2006–10 22
Figure 2.14: Summary of Overall Numeracy Proficiency Levels, Uganda, 2006–10 22
Figure 2.15: Female-Male Difference in Numeracy by Grade, Uganda, 2010 23
Figure 2.16: Female-Male Difference in Numeracy by Grade, Uganda, 2006–09 23
Figure 2.17: Urban-Rural Difference in Overall Numeracy by Grade, Uganda, 2010 24
Figure 2.18: Urban-Rural Difference in Overall Numeracy by Grade, Uganda, 2006–09 25
Figure 2.19: Government-Private School Difference in Numeracy by Grade, Uganda, 2010 25
Figure 2.20: Government-Private School Difference in Numeracy by Grade, Uganda, 2006–09 26
Figure 2.21: Overall Achievement Levels in Biology at S2, Uganda, 2008–10 26
Figure 2.22: Overall Proficiency Levels in Biology at S2, Uganda, 2008–10 26
Figure 2.23: Summary of S2 Biology Achievement by Gender, Uganda, 2008–10 27
Figure 2.24: Summary of S2 Biology Overall Achievement Levels, Uganda, 2008–10 28
Figure 2.25: P3 Reading Comprehension Subcontent Areas, Uganda, 2009/10 30
Figure 2.26: Summary of P6 Reading Comprehension in Detail, Uganda, 2010 31
Figure 2.27: Summary of P6 Reading Comprehension Subcontent Areas, Uganda, 2010 31
Figure 2.28: S2 Reading Comprehension Subcontent Areas Summary, Uganda, 2008–10 32
Figure 2.29: Summary of P3 Writing Content Areas, Uganda, 2009/10 33
Figure 2.30: Detailed Summary of P6 Writing Content Areas, Uganda, 2010 33
Figure 2.31: S2 Writing Subcontent Areas Summary, Uganda, 2008–10 34
Figure 2.32: Detailed Summary of P6 Grammar Content Areas, Uganda, 2010 36
Figure 2.33: S2 Grammar Subcontent Areas Summary, Uganda, 2010 36
Figure 2.34: Summary of P3 Literacy Content Areas, Uganda, 2006–08 37
Figure 2.35: Summary of P6 Literacy Content Areas, Uganda, 2006–08 38
Figure 2.36: Female-Male Difference in English Literacy Content Areas by Grade, Uganda, 2010 39
Figure 2.37: Urban-Rural Difference in English Content Areas by Grade, Uganda, 2010 39
Figure 2.38: Urban-Rural Difference in English Content Areas, Uganda, 2006 and 2008 40
Figure 2.39: Government-Private School Difference in English Literacy Content Areas, Uganda, 2010 41
Figure 2.40: Summary of P3 Numeracy by Curriculum Area, Uganda, 2006–09 42
Figure 2.41: Summary of P3 Numeracy Content Areas by Proficiency Level, Uganda, 2009 43
Figure 2.42: Detailed Summary of P6 Numeracy by Subcontent Areas, Uganda, 2010 43
Figure 2.43: Summary of P6 Numeracy Subcontent Areas by Proficiency Level, Uganda, 2010 44
Figure 2.44: Detailed Summary of P6 Numeracy by Subcontent Areas, Uganda, 2006–09 45
Figure 2.45: Summary of S2 Numeracy by Content Areas, Uganda, 2008–10 46
Figure 2.46: Summary of S2 Numeracy Subcontent Areas by Proficiency Level, Uganda, 2010 46
Figure 2.47: Female-Male Differences in P3 Numeracy Content Areas, Uganda, 2009 47
Figure 2.48: Female-Male Differences in P6 Numeracy Content Areas, Uganda, 2010 48
Figure 2.49: Female-Male Differences in S2 Numeracy Content Areas, Uganda, 2010 48
Figure 2.50: Summary of P3 Numeracy in Operations on Numbers, Uganda, 2006–09 50
Figure 2.51: Summary of P6 Operations on Numbers Subcontent Area, Uganda, 2010 51
Figure 2.52: Summary of P6 Operations on Numbers Subcontent Area, Uganda, 2006–09 51
Figure 2.53: Female-Male Differences in P3 Operations Subcontent Areas, Uganda, 2009 52
Figure 2.54: Female-Male Differences in P6 Operations Subcontent Areas, Uganda, 2010 53
Figure 2.55: Urban-Rural Difference in Operations on Numbers in P3, Uganda, 2009 53
Figure 2.56: Urban-Rural Difference in Operations on Numbers in P6, Uganda, 2010 54
Figure 2.57: Private-Government School Difference in Operations on Numbers in P3, Uganda, 2009 54
Figure 2.58: Private-Government School Difference in Operations on Numbers in P6, Uganda, 2009 55
Figure 2.59: Summary of S2 Biology Subcontent Areas, Uganda, 2008–10 56
Figure 2.60: Summary of S2 Biology Subcontent Areas by Proficiency Level, Uganda, 2010 56
Figure 3.1: Summary of P6 Student and P3-P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011 62
Figure 3.2: Summary of Primary Teacher Numeracy Scores, by Grade and Subject Specialty, Uganda NAPE, 2011 63
Figure 3.3: Summary of P3-P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011 64
Figure 3.4: Summary of S2 Student and Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011 66
Figure 3.5: Frequency Summary of S2 Student and Teacher Overall Numeracy Achievement, Uganda NAPE, 2011 67
Figure 3.6: Summary of Primary Teacher Literacy Scores, by Grade and Subject Specialty, Uganda NAPE, 2011 69
Figure 3.7: Summary of P6 Student and P3-P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011 70
Figure 3.8: Summary of Overall Literacy Scores by Frequency Category, P6 Students and P3-P6 Teachers, Uganda NAPE, 2011 71
Figure 3.9: Summary of S2 Student and Teacher English Literacy, Overall and by Content Area, Uganda NAPE, 2011 72
Figure 3.10: Summary of S2 Student and Teacher Grammar Achievement, Overall and by Subcontent Area, Uganda NAPE, 2011 73
Figure 3.11: Summary of S2 Student and Teacher Reading and Writing Achievement, Overall and by Subcontent Area, Uganda NAPE, 2011 73
Figure 3.12: Frequency Summary of S2 Student and Teacher Overall Numeracy Achievement, Uganda NAPE, 2011 74
Figure 3.13: Summary of S2 Student and Teacher Biology Achievement, Overall and by Content Area, Uganda NAPE, 2011 76
Figure 3.14: Frequency Summary of S2 Student and Teacher Biology Achievement, Uganda NAPE, 2011 77
Figure 3.15: Summary of Teacher Knowledge Effect Sizes on Student Achievement, Literacy Content Areas in P3-P6-S2, Uganda NAPE, 2011 90
Figure 3.16: Summary of Teacher Knowledge Effect Sizes on Student Achievement, S2 Numeracy Content Areas, Uganda NAPE, 2011 90
Figure D.1: Summary of P3-P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011 134

Tables

Table 1.1: Summary of the NAPE Sample Sizes, 2006–10 6
Table 3.1: Detailed Summary of P6 Student and P3–P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011 61
Table 3.2: Detailed Summary of S2 Student and Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011 65
Table 3.3: Detailed Summary of P6 Student and P3–P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011 68
Table 3.4: Detailed Summary of S2 Student and Teacher English Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011 71
Table 3.5: Detailed Summary of S2 Student and Teacher Biology Achievement, Overall and by Content Area, Uganda NAPE, 2011 75
Table 3.6: Covariates of P3–P6 Teacher Overall Numeracy and Literacy Achievement (T-Statistics in Parentheses), Uganda NAPE, 2011 78
Table 3.7: Covariates of S2 Teacher Achievement, by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011 83
Table 3.8: Covariates of P3–P6 Student Numeracy, Literacy, and Reading Achievement, Uganda NAPE, 2011 85
Table 3.9: Covariates of S2 Student Achievement by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011 87
Table 3.10: Covariates of School Average Pass Rate in 2008, UNPS, 2009/10 92
Table A.1: Relative Weights Allocated to Each Skill Area of 2006 P3 Written Literacy in English 103
Table A.2: Relative Weights Allocated to Each Skill Area of 2007 P3 Written Literacy in English 103
Table A.3: Weight Allocated to Each Skill Area of 2007 P3 Oral Reading 104
Table A.4: Weights Allocated to Each Skill Area of P3 Literacy in English, 2008 104
Table A.5: Weight Allocated to Each Skill Area of P3 Reading 104
Table A.6: Weight Allocated to Each Skill Area of P3 Literacy in English, 2009 105
Table A.7: Competences of P3 Literacy (Reading Comprehension), 2010 105
Table A.8: Competencies of P3 Literacy (Writing) 105
Table B.1: P3 English Literacy Overall Score Summary, Uganda NAPE, 2006–10 107
Table B.2: P6 English Literacy Overall Score Summary, Uganda NAPE, 2006–10 108
Table B.3: S2 English Literacy Overall Score Summary, Uganda NAPE, 2008–10 108
Table B.4: P3 Reading Comprehension Summary, Uganda NAPE, 2006–10 109
Table B.5: P6 Reading Comprehension Summary, Uganda NAPE, 2006–10 110
Table B.6: S2 English Reading Comprehension Summary, Uganda NAPE, 2008–10 110
Table B.7: P3 Reading Comprehension Detailed Summary, Uganda NAPE, 2009/10 111
Table B.8: P6 Reading Comprehension Detailed Summary, Uganda NAPE, 2010 112
Table B.9: S2 Reading Comprehension Detailed Summary, Uganda NAPE, 2008–10 112
Table B.10: P3 Writing Summary, Uganda NAPE, 2006–10 113
Table B.11: P6 Writing Summary, Uganda NAPE, 2006–10 113
Table B.12: S2 English Writing Summary, Uganda NAPE, 2008–10 114
Table B.13: P3 Writing Detailed Summary, Uganda NAPE, 2009/10 115
Table B.14: P6 Writing Detailed Summary, Uganda NAPE, 2010 115
Table B.15: S2 Writing Detailed Summary, Uganda NAPE, 2008–10 116
Table B.16: P3 Grammar Summary, Uganda NAPE, 2006–08 116
Table B.17: P6 Grammar Summary, Uganda NAPE, 2006–10 117
Table B.18: S2 Grammar Detailed Summary, Uganda NAPE, 2010 118
Table C.1: P3 Numeracy Overall Score Summary, Uganda NAPE, 2006–10 119
Table C.2: P6 Overall Numeracy Summary, Uganda NAPE, 2006–10 120
Table C.3: S2 Numeracy Summary, Uganda NAPE, 2008–10 120
Table C.4: P3 Number System and Place Value Summary, Uganda NAPE, 2006–09 121
Table C.5: P3 Number Patterns and Sequences Summary, Uganda NAPE, 2006–09 122
Table C.6: P3 Fractions Summary, Uganda NAPE, 2006–09 122
Table C.7: P3 Measures and Geometry Summary, Uganda NAPE, 2006–09 123
Table C.8: P3 Operations on Numbers Summary, Uganda NAPE, 2006–09 124
Table C.9: P3 Operations on Numbers ADDITION Summary, Uganda NAPE, 2006–09 124
Table C.10: P3 Operations on Numbers SUBTRACTION Summary, Uganda NAPE, 2006–09 125
Table C.11: P3 Operations on Numbers MULTIPLICATION Summary, Uganda NAPE, 2006–09 125
Table C.12: P3 Operations on Numbers DIVISION Summary, Uganda NAPE, 2006–09 126
Table C.13: P6 Number System and Place Value Summary, Uganda NAPE, 2006–10 127
Table C.14: P6 Number Patterns and Sequences Summary, Uganda NAPE, 2006–10 127
Table C.15: P6 Fractions Summary, Uganda NAPE, 2006–10 128
Table C.16: P6 Measures Summary, Uganda NAPE, 2006–10 129
Table C.17: P6 Geometry Summary, Uganda NAPE, 2006–10 129
Table C.18: P6 Operations on Numbers Summary, Uganda NAPE, 2006–10 130
Table C.19: P6 Graphs and Interpretation Summary, Uganda NAPE, 2006–10 131
Table C.20: S2 Numeracy Detailed Summary, Uganda NAPE, 2008–10 132
Table D.1: Detailed Summary of P3-P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011 133
Table D.2: Detailed Summary of P3-P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011 133
Table D.3: Summary of Variables Used in Teacher Analysis P3-P6-S2 (Standard Deviations in Parentheses), Uganda NAPE, 2011 135
Table D.4: Covariates of P3-P6 Teacher Numeracy Achievement, by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011 135
Table D.5: Covariates of P3-P6 Teacher Literacy Achievement, by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011 137
Table D.6: Covariates of School Average Pass Rate in 2007, UNPS 2009/10 138
Table D.7: Covariates of School Average Pass Rate—2007/08 Pooled, UNPS 2009/10 138

Acknowledgments

This work benefitted from financial support from the United Kingdom's Department for International Development Trust Fund for the Implementation of the National
Development Plan.

This report has been authored by Innocent Mulindwa Najjumba and Jeffrey H. Marshall. The former conceptualized and executed this work, including report writing, while the latter supported all the data analysis for this report. Special thanks to Abraham Owino, who undertook an extensive review of the end-of-cycle (EOC) examiners' reports generated by the Uganda National Examination Board (UNEB). A detailed report exists and can be accessed by interested stakeholders.

Valuable technical support was provided by peer reviewers at various stages of this work, from conceptualization to final report writing. These include Ayesha V. Vawda, Cristina Santos, Helen Craig, Marguerite Clarke, Marie-Helene Cloutier, Sukhdeep Brar, and Susan Opper. Strategic guidance from Peter Nicolas Materu and Harriet Nannyonjo significantly improved the scope of this report. The management direction from Ahmadou Moustapha Ndiaye and Sajitha Bashir enabled quality finalization of this work.

The Government of Uganda counterpart team of the Ministry of Education and Sports (MoES) is greatly appreciated. The Uganda National Examination Board (UNEB)/NAPE team greatly helped with access to the national assessment data, including execution of the national assessment exercise for teachers that was undertaken with the national assessment cycle of 2011. The core team comprised Sylvia Acana, Daniel Kyagaba, and Omara Kizito, under the overall leadership of the Executive Secretary, Mathew Bukenya. From the teacher education unit, special appreciation is extended to Margaret Nsereko and Janet Florence Aguti, under the overall leadership of the Permanent Secretary, F.X. Lubanga, for the support provided throughout this exercise.

Administrative support provided by Agnes Kaye, as well as the day-to-day DFID Trust Fund management support from Kasper Dalten (former Trust Fund Manager), Catherine Ssekimpi, and Annet Alitusabira, with overall guidance from Bee Pang, was very valuable.
Abbreviations and Acronyms

CCs Coordinating Centers
CCTs Centre Coordinating Tutors
CFS Child Friendly Schools
CPD Continuous professional development
CTEP Certificate in Teacher Education Proficiency
DFID Department for International Development, UK
EFA Education for All
EGRA Early Grade Reading Assessment
EMIS Education Management Information System
EOC End-of-cycle
EPRC Education Policy Review Commission
ESSP Education Sector Strategic Plan
FE Fixed effects
FRESH Focusing Resources on Effective School Health
HLM Hierarchical linear model
IRT Item response theory
JBSF Joint Budget Support Framework
MAASC Minds Across Africa Schools Club
MDGs Millennium Development Goals
MoES Ministry of Education and Sports
NAPE National Assessment of Progress in Education
NCDC National Curriculum Development Center
NER Net enrollment ratio
OECD Organisation for Economic Co-operation and Development
OLS Ordinary least squares
PCK Pedagogical content knowledge
PISA Program for International Student Assessment
PLE Primary Leaving Examination
PTR Pupil-teacher ratio
SABER Systems Approach for Better Education Results
SACMEQ Southern and Eastern Africa Consortium for Monitoring Education Quality
SES Socioeconomic status
TTC Teacher Training College
UACE Uganda Advanced Certificate of Education
UBOS Uganda Bureau of Statistics
UCE Uganda Certificate of Education
UNEB Uganda National Examination Board
UNHS Uganda National Household Survey
UNPS Uganda National Panel Survey
UPE Universal primary education
UPPET Universal Post-Primary Education and Training
USE Universal secondary education
UWEZO Swahili word for "capability"

Executive Summary

Uganda is one of the few African countries with a functional national assessment system. Established in 2003, the National Assessment of Progress in Education (NAPE) Program is executed by the Uganda National Examination Board (UNEB).
The program uses a learning outcomes measurement framework to measure achievement in literacy and numeracy annually, on the basis of a cross-sectional, nationally representative sample of learners from the primary three (P3) and primary six (P6) grades. In 2008, the framework was extended to the senior two (S2) grade of lower secondary education for English, math, and biology. However, use of national assessment results to inform improvements in student learning remains weak. These data can nevertheless be used to search for solutions to the challenge of low-quality education in Uganda.

The objective of this study is to generate a comprehensive, consolidated evidence base about student learning outcomes and teacher effectiveness in primary and secondary schools in Uganda, grounded in existing, nationally owned NAPE assessment data. In specific terms, this analytical work attempts to establish the following: (a) the performance levels and patterns of students in P3, P6, and S2; (b) problematic curriculum areas in the respective grades; (c) teacher competency; and (d) predictors of student and teacher performance levels. The goal is not to reanalyze existing data, but rather to provide additional analysis that complements the very useful summary reports provided by NAPE for individual years. This analysis is also supported by findings from the qualitative end-of-cycle (EOC) curriculum examination reports generated by UNEB chief examiners.

Analytical Framework

The report analyzes NAPE data sets, starting with the most recent data available. The data cover a yearly average of 500 schools and about 8,000 learners per grade in each of P3, P6, and S2. Based on test blueprints and test items for the respective years, the analysis covers a series of general content and subcontent areas in the national literacy and numeracy curricula, as well as in the biology curriculum for S2.
The first analytical section of the report (chapter 2) details the overall performance levels of learners and identifies problematic curriculum areas by subcontent area. Among the data challenges faced by the authors were (a) changes in the test blueprints between 2006 and 2010; (b) observable changes in the format of test items, including replacement of multiple-choice questions with more open-ended questions to enable the transition to a more competency-based assessment; (c) perceived alterations in the difficulty levels of test items; and (d) a lack of common test items, which would allow for accurate estimation of learning achievement trends over time.

As a quality assurance measure, a two-pronged analytic approach is used. The report examines the percentage of correct answers on NAPE assessments, calculated as the total points scored on each item divided by the total available. At the same time, it analyzes proficiency, or adequacy, levels based on assigned cutoff points. The resulting combined data effectively communicate student achievement levels. The two approaches are technically acceptable in light of the internationally known challenges of national assessment systems. It is important to note that the two approaches generated consistent findings, creating reasonable grounds for comparisons of estimated learning levels over time. Nevertheless, the analysis relies mostly on the most recent data, with special emphasis on years with similar curriculum blueprints. An analysis of EOC examiners' reports was also conducted to complement the NAPE findings.

The second analytical section (chapter 3) analyzes teacher effectiveness and the predictors of learning outcomes. This chapter uses 2011 NAPE data, plus data generated by additional test items developed for teachers with support from the World Bank and the U.K. Department for International Development.
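The two-pronged scoring approach described above can be sketched in a few lines. This is an illustrative reconstruction, not NAPE's actual scoring code; the item-level data and the 50 percent cutoff used below are assumptions for the example.

```python
# Sketch of the two-pronged approach: a percent-correct score plus a
# proficiency classification against an assigned cutoff point.

def percent_correct(points_scored, points_available):
    """Overall score: total points earned divided by total points available."""
    return 100.0 * sum(points_scored) / sum(points_available)

def proficiency_level(score, cutoff=50.0):
    """Classify a percent-correct score against an assigned cutoff."""
    return "proficient" if score >= cutoff else "below proficiency"

# One hypothetical learner's item-level results (points earned vs. possible).
earned = [2, 0, 1, 3, 1, 0]
possible = [2, 1, 2, 4, 2, 1]

score = percent_correct(earned, possible)
print(f"{score:.1f}% -> {proficiency_level(score)}")
```

The two measures need not agree for an individual, but aggregated over a representative sample they give the consistent picture the report describes.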
The main focus of this chapter is to explain variation between students and schools, as well as between teachers. Hence, a series of multivariate models were incorporated into the analysis. The authors were able to access the actual test items in each grade, enabling the application of item response theory (IRT) software, the approach most widely used internationally for scaling student assessments. The quality of test items was also validated on the basis of their fit, with very few items (fewer than 5) eliminated, which reflects well on the quality of the data generated by NAPE. Both percentages and IRT-scaled scores are used in the comparison of overall student achievement in 2011.

Main Findings

Student achievement levels in English literacy and numeracy at the primary level are still low and fall short of expected levels. In 2010, the average achievement score in literacy at the P3 and P6 levels was 47 percent and 40 percent, respectively. In addition, 60 percent of learners in P3 and about 70 percent in P6 scored below the 50 percent literacy proficiency level for their respective grades. Overall, achievement in literacy at the lower secondary level is higher than that observed at the primary level (52 percent in 2010), although a significant decline in performance has been observed since 2008 (from about 63 percent in 2008 to 52 percent in 2010).

In numeracy, average student achievement in P6 in 2010 was only 40 percent; worse still, 70 percent of learners in this grade performed below the 50 percent mark. Nor are S2 students outperforming earlier grades (P3 and P6) in numeracy (scoring an average of only 40 percent in 2010 and 2009), indicating that numeracy is a challenging subject for learners even beyond primary education.
Earlier results from the Southern and Eastern Africa Consortium for Monitoring Education Quality (SACMEQ) indicate that Uganda's P6 performance in 2007 was below the SACMEQ average scores in reading (an average of 511.8 against Uganda's 478.7) and mathematics (an average of 509.5 against Uganda's 481.9). In addition, declines were registered in SACMEQ scores for both reading and numeracy between 2000 and 2007: reading declined by 3.7 points, and numeracy by 24.4 points.

Problematic Curriculum Areas

Students in P3 and P6 are, on the whole, struggling to achieve required proficiency levels within all three content areas of literacy (reading comprehension, writing, and grammar), although students in P3 performed slightly better in writing than they did in reading comprehension. For students in P6, performance in all three areas is equally low (with an average score of 30–40 percent), a pattern that has also held in the past.

Students score highest in literacy content areas that require them to respond to simple guided instructions, compared to content areas that require creativity and imagination. In reading comprehension, for example, primary students perform best when given simple tasks, such as matching (average P3 student performance of 80 percent) and associating pictures (average P6 student performance of 97 percent). The worst student performance was observed in the subcontent areas of recognizing and describing, with average P3 student scores of only 20 percent and 10 percent, respectively. Within the writing subcontent area, P6 students perform best in learning areas such as copying and writing patterns, and lowest in highly demanding areas such as writing composition. These findings indicate that classroom instruction and learner support should aim not only at simple foundational literacy skills, but also at developing the literacy competencies that foster students' creativity and critical thinking.
These latter skills augment the lifelong learning and innovation needed to promote Uganda's national growth.

The most problematic numeracy areas for P6 learners are geometry, measures, and fractions. Geometry appears to be a challenge even at S2, as do functions, transformations, and statistics. It is important to note that geometry is the least problematic area at the P3 level, which may signal the emergence of pedagogical challenges as students progress from lower to upper levels of basic education. Subtraction is a problem area for P3, with average student performance of roughly 20 percent; and division emerges as a problem area for P6, with average student performance of 41 percent. As was observed with literacy, student test scores are highest in areas related to basic concepts and operations. This conclusion does, however, come with one caveat: operations on numbers have the lowest average of all P3 numeracy content areas.

The wide gaps between best and worst student performance in literacy and numeracy, and in the subcontent areas of these two subjects, imply inequitable mastery of the comprehensive range of skills that the curriculum is designed to impart. For example, within the reading comprehension subcontent area of literacy, P3 students score 80 percent in matching, but only 12 percent in describing. Likewise, in P6, the average student score for associating pictures is 97 percent, compared to an average of only 12 percent for sequencing pictures. Within numeracy subcontent areas, average student performance in P6 is 70 percent for operations on numbers, compared to a low of 18 percent in geometry. The same variations exist at S2, with the only difference being that they are not as wide.
This finding generates questions about how curriculum delivery is structured in the respective learning areas, including time allocation, sequencing, and pacing, as well as how sensitive teachers are to learner needs in various subcontent areas.

Overall performance in biology at S2 is very low, averaging 27 percent in 2010. Almost no student performed above the 50 percent mark between 2008 and 2010, meaning that virtually no student gave correct answers for half of the test items drawn from the biology curriculum at that level. Scores this low suggest that students have very little understanding of most biology concepts. Given S2 scores in literacy, this finding cannot be attributed to problems with basic skills such as reading, and it highlights the challenge of improving science instruction at the lower secondary level in Uganda. Students in S2 are struggling with all subcontent areas of biology, scoring worst in soil (average score of 15 percent), plant structures (average of 25 percent), and diversity of living things (average of 25 percent), together with microscopes and lenses (average of 30 percent). Comparisons of biology learning achievement by type of student and school are mixed. For gender, there is a consistent, and at times substantial, advantage of boys over girls. However, differences by school location and type of school (public versus private) are not as significant. The major challenge is lifting student achievement in biology at all levels, in all schools, for all students.

The fact that so many primary 3 and primary 6 children are unable to answer a majority of questions drawn from the official curriculum is troubling, but this result needs to be considered in a larger context. Uganda is one of the poorest countries in the world (its latest Human Development Index ranking is 143 out of 169 countries). However, most Ugandan children are now reaching grade 3 and increasingly moving toward grade 6.
The net enrollment rate was 96.1 percent in 2010; and the dropout rate, 4.4 percent. The student survival rate to grade 5 is 62 percent and the completion rate, 54 percent. The combination of high poverty and high participation in primary education (especially in grade 3) puts substantial pressure on the system, particularly in terms of educational quality.

The existence of the same kinds of problems year after year suggests that either the teachers are unaware of the deficiencies in student achievement based on their own assessments, or they do not receive adequate feedback from sources such as NAPE and end-of-cycle (EOC) examination reports. It is also possible that teachers are generally aware of the issues but are not provided the necessary support to address them (hence their recurrence); alternatively, it could be that ensuring improvement is beyond the reach of these teachers.

Predictors of Student Performance

Teacher attendance, school size as determined by enrolment, and the availability of toilets and first aid services at school explain 13 percent of the variation in the proportion of students who pass the Primary Leaving Exam in grades 1–3. Key inputs such as trained teachers and textbooks showed no significant association with the percentage of students passing this exam. The impact of the availability of toilets and first aid services is probably the most striking finding in this analysis and clearly stresses the need to promote healthy and hygienic learning environments. Hence, ongoing initiatives aimed at improving school sanitation facilities deserve greater traction—as does passing, adequately resourcing, and effectively implementing a school health policy. Strategies in the current Ugandan draft school health plan are well aligned with the Child Friendly Schools (CFS) and the Focusing Resources on Effective School Health (FRESH) frameworks.
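The “13 percent of the variation” figure above is an R-squared statistic from a school-level regression. The following is a minimal sketch of how such a figure is computed, using a single invented predictor (teacher attendance) and invented school data rather than the study’s actual model or variables:

```python
# Illustrative sketch only: computing the share of variation explained
# (R-squared) by an ordinary least squares fit. All variable names and
# values below are hypothetical, not NAPE data.

def ols_r_squared(x, y):
    """R-squared of a one-predictor least-squares fit of y on x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2
                 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical school records: teacher attendance rate vs. share of
# pupils passing the Primary Leaving Exam.
attendance = [0.70, 0.80, 0.85, 0.90, 0.95, 0.60]
pass_rate = [0.40, 0.55, 0.50, 0.65, 0.70, 0.45]

r2 = ols_r_squared(attendance, pass_rate)
print(round(r2, 2))
```

With several predictors, the same logic applies to a multiple regression: R-squared is still one minus the ratio of the residual to the total sum of squares.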
Student achievement is consistently higher (in literacy, numeracy, and biology) when students study with teachers who have higher levels of content knowledge. The effect sizes are small and, on average, show that a standard deviation increase in teacher content knowledge is associated with only about a 0.05 standard deviation increase in student achievement. Nevertheless, the findings reinforce the argument that teachers play an important role in affecting student achievement, and their knowledge of the subject matter for which they are responsible is one of the components of effective teaching.

As is typical in sub-Saharan Africa and other developing regions, there is a large and persistent difference between urban and rural student outcomes in literacy and numeracy at the primary level, in favor of urban students. Moreover, this gap appears to be widening. In both P3 and P6, the urban-rural gap is about 20 percentage points in literacy and about 10 percentage points in numeracy. In S2, however, the urban-rural gap in literacy and numeracy is very small, which suggests either that access to S2 in rural areas is restricted to relatively few students or that rural schools at this level are of roughly the same quality as urban lower secondary schools, or perhaps both.

Private schools significantly outperform public primary schools in numeracy and literacy in both P3 and P6, but the gap—though wide—has been stable over time. For example, in 2010, private school students in P3 and P6 scored an average of about 18 percentage points higher in numeracy and between 25 and 35 percentage points higher in literacy. However, the reverse is true at the secondary level, with public schools outperforming private schools, although by a narrower margin.
These differences in the performance of private and government schools are likely attributable to the very different types of service provision characteristic of primary and secondary schools in Uganda.

At the primary level, girls in P3 typically have large and significant learning advantages over boys in literacy; however, the pattern reverses in favor of boys in P6. This finding calls for greater understanding of the gender dynamics of progression through the primary cycle. With respect to age, older students score significantly lower on the P3 and P6 NAPE exams.

At the lower secondary level (S2), gender (in favor of boys), age (in favor of younger students or students the correct age for the grade), and whether a school is a day or boarding institution (in favor of boarding schools) significantly affect student performance. Students attending USE schools performed significantly higher than others, and those attending single-shift schools also performed significantly higher than those attending regular schools. These results will be further validated by an ongoing impact evaluation study of the double-shift policy reform.

Teacher Effectiveness and Predictors of Learning Outcomes

As expected, the performance of Ugandan teachers is significantly higher than that of their students on test items drawn from the same curriculum. However, the gap between the two generates concern, particularly because teachers are clearly not transmitting their superior knowledge to students. In other words, although teachers have the required content knowledge to deliver the numeracy and literacy curriculum, they are not well grounded in content pedagogical practices. The average P6 teacher in Uganda scores twice as high as his or her P6 students in numeracy; in literacy, teachers score about 2.5 times higher than their students.
For example, P6 teachers were able to answer about 86 percent of the P6 student literacy test items correctly, compared to only 30 percent for their students. This wide performance gap extends to the secondary level, estimated at two full standard deviations in numeracy and literacy.

Primary teachers’ overall performance on NAPE literacy and numeracy assessments in 2010 was approximately 85 percent. Although the ideal performance level for teachers on test items drawn from the curriculum that they teach is expected to be 100 percent, this expectation must be balanced against capacity development and service delivery challenges in developing countries. Hence, although the overall performance level of Ugandan teachers is considered high, this finding does not mean that teachers’ content knowledge does not need to be enhanced. For example, a significant group (18 percent) of P3 and P6 teachers scored in the 50–75 percent range in numeracy. Those teachers deserve attention to enable them to improve their subject content knowledge. And only 18 percent of teachers scored in the very top performance range (90–100 percent) in literacy, which also points to needed work on teacher content knowledge.

Teacher subject matter knowledge is particularly important to student performance in literacy at lower levels. For example, teacher knowledge of the P6 literacy curriculum significantly affects student literacy learning achievement in P3, indicating the impact of higher content knowledge on teacher effectiveness. It is curious to note that the literacy achievement of both teachers and students varies to a greater degree in grammar than in reading comprehension and writing, but the problem curriculum areas of the two groups are different. For example, the difference between the subcontent area with the best (prepositions) and worst (pronouns) performance by students is about 32 percentage points.
Teachers’ performance, however, differs by about 25 percentage points between the subcontent area with the highest scores (adverbs) and that with the lowest scores (patterns or pronouns).

With respect to numeracy, teachers and students share the same areas of relative strength and weakness; the only difference is that the lowest average score of teachers is above 70 percent, compared to below 40 percent for students. Both score lowest in geometry, measures, number patterns, and fractions. Similarly, students score worst in the same subcontent areas of literacy where teachers score the worst (average lows of 70 percent). Both teachers and learners have exceptionally high comfort in one area—associating pictures with words—with learners scoring as high as 85 percent and almost all teachers scoring 100 percent. It is significant that the proficiency gap between teachers and learners is wider in the subcontent areas where both groups have the lowest scores. For example, P6 teacher performance in geometry (76 percent) was over four times that of students (17 percent).

S2 biology specialist teachers have a significant content knowledge problem, answering correctly only 65 percent of the questions drawn from the curriculum that they teach. The teacher knowledge gap vis-à-vis students in this subject is roughly three full standard deviations, indicating a problem in teacher preparation. Consistent with literacy and numeracy results, student average scores were lowest in the subcontent areas of biology in which teachers had the hardest time answering questions.

Teacher knowledge levels are higher in wealthier districts only for numeracy. Schools located in wealthier districts do have higher student achievement levels, but this finding is most likely associated with the socioeconomic status of learners. With regard to gender, female teachers score between 0.39 and 0.54 standard deviations lower than their male counterparts in numeracy.
This result, consistent with global norms, holds even when grade level and subject specialty are controlled for. Teacher gender differences, however, do not extend to literacy, which is unusual because women normally have a literacy advantage over men. Yet female teachers are more effective at the secondary level than are their male counterparts.

Although teachers with higher levels of academic preparation have higher levels of content knowledge than their peers, experience (as measured by years in the teaching service) is not significantly associated with higher content knowledge levels at the primary level. This suggests that little experiential learning is taking place at this level. There are, however, very strong and significant effects for the controls that combine teaching area (specialty) and grade. In both literacy and numeracy subject areas, P6 specialist teachers have subject matter knowledge levels that are substantially higher than those of all other teachers. Unlike the primary-level findings, secondary education teachers’ experience is positively related to numeracy and biology content knowledge.

At the secondary education level, however, the distribution of teacher quality as measured by content knowledge levels is significantly more variable. Teachers working in private schools have significantly lower content knowledge than do their public school counterparts (on average, between 0.25 and 0.45 standard deviations), which may explain the lower performance of S2 learners in private schools. The limited content knowledge of private secondary school teachers is likely to affect a substantial share of the secondary school population, thereby compromising the realization of quality outcomes at this level. Teachers in schools implementing the universal secondary education (USE) program have significantly lower content knowledge in numeracy and biology.
However, teachers in double-shift schools have significantly higher content knowledge in the same subject areas. There are no significant variations in secondary teachers’ content knowledge by gender or teacher preparation, which could be the result of specialist training.

On the whole, the results relating to teacher content knowledge raise more questions than they answer. A key challenge for the education system remains how to transform existing teachers into effective teachers. Their high content knowledge is a very good starting point—although this holds only in literacy and numeracy, and is not equally distributed among content areas. Clearly, content knowledge alone is not a magic bullet in terms of ensuring teaching quality. But when teachers on average miss almost 20 percent of questions on student exams, this lack of knowledge certainly raises questions about their effectiveness. The near total absence of students scoring above 75 percent (in any subject) indirectly reinforces this contention.

Suggestions for Next Steps

On the basis of the findings just discussed, the following are suggested next steps for the government to consider:

• Refocus ongoing teacher development efforts on strategies that improve teacher effectiveness in the classroom through ongoing pre- and in-service training programs. Intensifying pedagogy and enhancing teacher content knowledge in the identified problematic curriculum areas are evident imperatives. The recently launched investigation of teacher pedagogical practices will greatly inform these improvements.
• Conduct further investigation of (a) a situational analysis of teacher effectiveness policies against global norms, so that missing or weak links in teacher policies or strategies are identified (for example, the World Bank’s Systems Approach for Better Education Results [SABER-Teachers] could be applied); (b) curriculum coverage in the classroom, in order to better understand the extent to which the official curriculum is covered by learners, with attention to time allocation, pacing, and sequencing; and (c) challenges in science instruction, including teacher preparation, drawing on the biology results.

• Strengthen ongoing interventions to improve school-level sanitation and hygiene, including parliamentary approval, adequate resourcing, and implementation of a school health policy.

• Improve NAPE systems data to enable effective tracking of learner performance over time.

• Improve mainstream teacher assessments to facilitate regular progress monitoring of teacher competency; improve the manner in which EOC examiners’ reports are prepared in order to enable more strategic informational feedback into the education system.

CHAPTER 1

Introduction and Methodology

Introduction

Uganda has registered tremendous success in the school system expansion that has arisen from the ongoing Universal Primary Education (UPE) reform launched in 1997. Primary enrollment is estimated at 8.7 million children, resulting in a Net Enrollment Ratio (NER) of 83.2 percent (UBOS 2010). Completion and achievement rates are, however, still low. More than half of primary pupils in grades 3 and 6 perform below the desired minimum average of 50 percent in numeracy and literacy. The government is thus faced with the dual challenge of maintaining high enrollment levels and ensuring quality service delivery for the realization of national development goals and the Millennium Development Goals (MDGs) on education.
Government and development partners’ efforts are currently focused on improving the provision of key inputs for quality teaching and learning processes—especially qualified teachers, instructional materials, and curriculum reforms—which will be reinforced by school infrastructure developments to support the expansion. The continued low performance points to the low quality of teaching-learning processes, an aspect that continues to feature in the education sector dialogue. With support from the U.K. Department for International Development, the World Bank initiated exploratory analytical work under the broader theme of “improving learning in Uganda.” Subthemes under investigation are anchored in issues that dominate the education sector discourse, including school feeding, school-based management, and continued low learning outcomes. This specific analytical report attempts to use existing national assessment data to generate more scientific evidence about student performance levels, with a special focus on problematic curriculum areas, as well as on teacher effectiveness.

School Curriculum in Uganda

The policies, purposes, and programs concerning education in Uganda today officially stem from the 1992 Government White Paper on Education, which resulted from the Education Policy Review Commission (EPRC) of 1989. Education goals for the different levels are articulated in the Education Sector Strategic Plan (ESSP) of 2005–15, which was recently updated to 2010–15 in order to align it with the National Development Plan’s five-year cycle. These goals include (a) primary-level pupils who can master basic literacy (reading and writing), numeracy, and basic life skills; (b) postprimary students who are prepared to enter the workforce and obtain further education; and (c) tertiary graduates who are prepared to be innovative, creative, and entrepreneurial in the private and public sectors.
The government is also focused on efforts toward the attainment of the MDGs and the Education for All (EFA) goals. Primary education in Uganda is aimed at the following:

• Enabling individuals to acquire functional literacy, numeracy, and communication skills in Ugandan languages and English
• Developing and maintaining the sound mental and physical health of learners
• Instilling values of living and working cooperatively with other people and of caring for others in the community
• Developing cultural, moral, and spiritual values for life
• Inculcating an understanding of, and appreciation for, the protection and use of the natural environment through scientific and technological knowledge
• Developing a sense of patriotism and unity, an understanding of one’s rights and responsibilities, and an appreciation of the need to participate actively in civic matters
• Developing the prerequisites for continuing education and development
• Developing adequate practical skills for making a living
• Developing an appreciation of the dignity of work and of making a living by one’s honest effort
• Developing the ability to use a problem-solving approach to various life situations
• Developing discipline and good manners.

To deliver on these objectives, the primary curriculum is structured in a manner that enhances achievement of the aims and objectives mentioned earlier. At the lower primary level of education—primary one (P1) to primary three (P3)—the curriculum emphasis is on the rapid development of literacy, numeracy, life skills, values, and positive attitudes, which have been developed along a thematic model. The thematic approach was adopted to allow for holistic treatment of concepts under themes that have immediate meaning and relevance to the learner.
Successful implementation of the thematic curriculum is expected to enable (a) early breakthrough to literacy, (b) mastery of numeracy skills, (c) learner empowerment that will also provide students a head start in the acquisition of higher-order numeracy and literacy skills, and (d) development of critical skills that enable lifelong learning for Uganda’s national growth.

Subject-based instruction starts at primary 4 and continues through to the terminal grade of the primary school cycle: primary 7. The content acquired at each grade level is aimed at reinforcing and consolidating earlier-acquired knowledge and skills. Competencies acquired by grade are indicated in national subject-specific syllabuses for the respective grades, including content and suggestions for the activities that a teacher could conduct at the classroom level to facilitate the teaching-learning process.

The National Curriculum Development Center (NCDC) has overall responsibility for developing the national curriculum and providing the requisite technical guidance to line departments in the Ministry of Education and Sports (MoES). Those departments are charged with ensuring quality delivery of the curriculum. Curriculum reforms that are in place were informed by technical review exercises conducted as early as 1989. Results are featured in the Government Report on the Education Policy Review Commission of 1989 and the Government White Paper on Education of 1992.

Uganda has been—and still is—grappling with the education quality challenge since the introduction of the UPE reform program in 1997 and the Universal Post-Primary Education and Training (UPPET) Program 10 years later in 2007. The reforms are aimed primarily at expanding access to basic education and enabling equitable mass acquisition of foundational skills by Ugandans, and hence the realization of the MDGs on education.
The high costs of education were identified as major barriers to access, and the mass reforms have enabled the poor to access basic education through government provision of tuition, learning materials, and teachers for public primary and lower secondary schools, where girls and boys can freely enroll. As a consequence, primary enrollment increased from about 5.2 million in 1995 to about 7.8 million in 2009. The Net Enrollment Ratio for primary education in Uganda is estimated at 83.2 percent, based on the most recent 2009/10 Uganda National Household Survey (UNHS), although data generated by the national Education Management Information System (EMIS) put it at more than 92 percent, with parity between boys and girls. At the lower secondary education level, enrollment increased from 728,393 in 2006 to 1,088,744 in 2009, raising the NER from 15.4 percent to 23.5 percent over the same period. The expansion has, however, not been matched by quality education.

Background to This Work

It is commonly recognized that educational quality is measured not by inputs or simple outputs alone but also by learning achievement indicative of what has been learned and the knowledge and skills acquired in the course of the learning cycle. National assessments represent an overall shift in assessing educational quality: from a concern about inputs to a concern about learning outcomes. The assessments describe the achievement of students in a curriculum area, which is then aggregated to provide an estimate of the achievement level in the education system as a whole at a particular age or grade level (Greaney and Kellaghan 2008). Measurement of learning outcomes, therefore, provides information that can be used to improve education planning, management, and teaching (Stephens and Moskowitz 1999).
Student learning is unlikely to improve unless national assessment findings are used to develop policies and strategies directed at changing school and classroom practices (Greaney and Kellaghan 2008).

Uganda is one of the few African countries with a functional and well-established national assessment system. Established in 2003, the National Assessment of Progress in Education (NAPE) program is executed by the Uganda National Examination Board (UNEB). The learning outcomes measurement framework annually measures achievement in literacy and numeracy proficiency on the basis of a cross-sectional, nationally representative sample of learners from primary three (P3) and primary six (P6). In 2008, the framework was extended to the lower secondary education level to assess English, math, and biology at the senior two (S2) level of the lower secondary education system.

The NAPE assessment framework is categorized as a low-stakes measurement of system outcomes because the results are used to hold education sector players accountable and to guide strategic policy and program reforms in the sector, rather than to determine student transition from one grade or school level to another. The framework generates annual statistics about learner proficiency for policy guidance and strategic actions by MoES and development partners. Although the assessment is not really designed for accurate trend analyses (as will be explained later), its results are used to report on status, identify past performance patterns, and establish future targets, as is the case with many national cross-sectional sample surveys.

What students have learned or can do has a direct bearing on instructional practice (Schubert and Prouty-Harris 2003), which is framed largely by what teachers know and by the learning environment and supportive structures that enable instruction, both at school and at home.
National assessment data provide useful information on both strong and problem areas of the curriculum on the basis of learning achievement results. Those data can, therefore, serve as a rich source of additional information for identifying strategic actions that are necessary to improve the quality of education. Such actions can be targeted at resource allocation, teacher training, accountability, monitoring changes in achievement, and other variables over time (Greaney and Kellaghan 2008).

Rationale and Objectives of This Report

According to the NAPE results, learning outcomes are still very low, with more than half of primary pupils in grades 3 and 6 performing below the desired minimum average of 50 percent in numeracy and literacy, while completion levels stand at only 57 percent. The government’s desire to improve learning outcomes is reflected in a number of policy reforms, including (a) the recently launched thematic curriculum for lower primary, which is aimed at improving learner proficiencies in literacy and numeracy; (b) ongoing reforms in teacher training, where minimum entry requirements for teacher trainees were raised by requiring entrants to have obtained credits 1–6 in English and mathematics; and (c) reforms in the preservice primary teacher curriculum. Provision of instructional materials has also been given high priority. Such efforts are yet to bear fruit, and the need to improve educational quality continues to dominate socioeconomic and political discourse within the country.

The general objective of this study is to generate a comprehensive and consolidated evidence base on learning outcomes and teacher effectiveness in Uganda that is grounded in the rich NAPE assessment data.
In specific terms, this analytical work attempts to establish the following: (a) performance levels and patterns of students at P3, P6, and S2; (b) problematic curriculum areas at the respective grade levels; (c) teacher competencies; and (d) predictors of the performance levels of both students and teachers. The purpose is not to reanalyze the data, but rather to provide additional analysis that can help complement the very useful summary reports provided by NAPE for individual years. The analysis is also supported by findings from the qualitative end-of-cycle examination reports generated by the UNEB chief examiners.

Methodology

This subsection details the NAPE system in Uganda because it is the main source of the data used in this analytical work. Also presented are sampling procedures, together with test item and administration procedures. The NAPE system is used to ascertain pupils’ learning achievement levels and monitor changes in these levels over time. Its establishment was occasioned by the realization that the country lacked reliable and up-to-date, low-stakes systems data about learning outcomes indicators. Since 2003, UNEB has been conducting annual national assessments that are based on the learning outcomes measurement framework for literacy and numeracy and that draw from a nationally representative sample of learners selected from primary three (P3) and primary six (P6). In 2008, the NAPE system was extended to the lower secondary grade of senior two (S2) as a follow-up to the launch of the Universal Post-Primary Education and Training (UPPET) program in Uganda. In specific terms, the system generates comprehensive information on what pupils know and can do in different areas of the curriculum; it also provides annual statistics about learner performance for use by various stakeholders.

Sampling Procedures and Data Collection for the NAPE

The NAPE uses a two-stage, stratified-sampling survey design.
The first stage involves random selection of a sample of schools, stratified by region (using the Uganda National Bureau of Statistics [UBOS] sampling frame) and by district, as determined by the NAPE program team guided by the Education Management Information System (EMIS) of MoES. At least 12 schools are selected per district. In the second stage, 20 pupils from P3, P6, and S2 are randomly selected from the children present on the day of the assessment. Test administrators are provided with guidelines for quality execution of the random selection procedures at the school level. Sampling weights are also used to reduce bias in the estimates.

Results in table 1.1 show the NAPE sample sizes by grade and year between 2006 and 2010. In most years, the samples include about 500 schools and 8,000 students drawn from all over the country (for more detail on the sampling frameworks, see the individual NAPE reports [2006–10]). In 2010, the P3 and P6 samples were considerably larger than in previous years (as can be seen from the table). This increase was due to the need to improve sample representativeness at the district level. Education service delivery in Uganda at the primary and secondary education levels is decentralized to districts and lower local governments. The focus on district statistics is, therefore, of great importance for improving data use in planning and program implementation.

This analytical report is based on the most recent data, with a special focus on years that show consistency in the test blueprints, while the previous years’ data are used to establish whether or not the observed patterns prevailed in the past. This approach is further clarified in chapters 3 and 4, which detail the findings by curriculum area.

Data collection is facilitated by a well-trained team of more than 700 officers with representation at various levels.
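The two-stage design and sampling weights described above can be sketched in a few lines of code. This is a simplified illustration with invented school names and enrolments, not the actual NAPE sampling frame or procedure; a real design would compute inclusion probabilities from the full frame and adjust for nonresponse.

```python
# Illustrative sketch only: a two-stage stratified sample in the spirit
# of the NAPE design. Schools are stratified (here by region), a fixed
# number of schools is drawn per stratum, then a fixed number of pupils
# is drawn per selected school. Each pupil's sampling weight is the
# inverse of the probability of selection, so that weighted estimates
# are approximately unbiased for the pupil population.
import random

random.seed(42)

# Stage 0: a toy frame of (school id, enrolment) pairs, by region.
frame = {
    "Central": [(f"C{i}", random.randint(20, 60)) for i in range(10)],
    "Northern": [(f"N{i}", random.randint(20, 60)) for i in range(8)],
}

SCHOOLS_PER_STRATUM = 3
PUPILS_PER_SCHOOL = 5

sample = []
for region, schools in frame.items():
    # Stage 1: simple random sample of schools within the stratum.
    chosen = random.sample(schools, SCHOOLS_PER_STRATUM)
    p_school = SCHOOLS_PER_STRATUM / len(schools)
    for school_id, enrolment in chosen:
        # Stage 2: simple random sample of pupils within the school.
        n_pupils = min(PUPILS_PER_SCHOOL, enrolment)
        p_pupil = n_pupils / enrolment
        weight = 1.0 / (p_school * p_pupil)
        for pupil in random.sample(range(enrolment), n_pupils):
            sample.append({"region": region, "school": school_id,
                           "pupil": pupil, "weight": weight})

# The sum of weights should roughly reproduce the total enrolment
# in the frame (exactly so in expectation over repeated draws).
total_enrolment = sum(e for schools in frame.values() for _, e in schools)
weighted_total = sum(rec["weight"] for rec in sample)
print(len(sample), total_enrolment, round(weighted_total))
```

The key design choice this illustrates is that weights are built from the product of the stage-one and stage-two selection probabilities, which is why unequal school sizes do not bias weighted national estimates.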
The training is based on a manual that covers all NAPE assessment processes. National coordinators are selected from key education institutions at the national level, including the National Curriculum Development Center (NCDC), UNEB, the schools of education at public universities, and MoES. At the district level, test administrators are selected from among (a) tutors at primary teacher training colleges, (b) secondary school teachers, and (c) professional staff members of local district governments. Those individuals form district-level teams that are charged with school-level sampling and test administration. Schools are visited once to administer tests to selected students. Team leaders also interview head teachers at the respective sampled schools. The tests are then scored at a central venue by teachers and college tutors.

Table 1.1  Summary of the NAPE Sample Sizes, 2006–10

Year    Primary 3 (P3)    Primary 6 (P6)    Lower Secondary 2 (S2)
2010    21,653            21,413            19,288
2009     9,856             9,618            12,939
2008     8,436             8,396             6,326
2007     8,053             8,015             n.a.
2006     8,023             8,166             n.a.

Source: Uganda National Assessment of Progress in Education (NAPE) data sets 2006–10.
Note: The secondary NAPE started in 2008; information was not available for 2006 and 2007. n.a. = not available.

Analysis of Learning Outcomes and Problematic Curriculum Areas

The assessment is based on tests that are developed by the NAPE-UNEB team. Those tests, in turn, are based on the national curriculum for the respective grades, in accordance with test frameworks and item specifications prepared by a team of experts. The experts’ team comprises schoolteachers, tutors from teacher training colleges, and staff members of NCDC and UNEB. Appendix A provides sample test blueprints for P3 for the different years, yielding insight into the curriculum areas within the broader learning areas that form the basic structure of this analytical report.
In each subject (by year), student tests are designed to cover a series of general content areas (for example, operations on numbers) and subcontent areas (for example, within the operations on numbers content area there are addition, subtraction, division, multiplication, and so forth). A review of the test blueprints for the various years and grade levels indicated the following challenges with the data sets:

1. Significant test blueprint changes took place during 2006–10. The most consequential instance is when the structure of the curriculum changed between years, meaning that content and subcontent areas were added or discarded. For example, in 2009 the P3 curriculum (in literacy and numeracy) dropped the area of grammar and was reconstituted as reading comprehension and writing only, but with some different subcontent areas.

2. Item formats changed, based on the need to measure specific conceptual skills at a certain point in time, as considered desirable by the test design teams. For instance, the P3 literacy test in 2006 included 50 items worth one point each; in 2007, even with the same curriculum blueprint, the number of items increased to 66 with a maximum score of 90 points. In later years, the replacement of multiple-choice questions with more open-ended questions became even more pronounced, occasioned by the need to shift from theory- to competency-based assessment modalities. For example, the 2009 and 2010 P3 literacy exams have only 22 questions, but collectively they are worth a total of 100 points (in each year).

3. Perceived levels of test item difficulty changed over time, even within the same specific content area. This is a more difficult source of change to assess because test blueprints do not contain information on the difficulty levels of individual items.

4. Student tests do not include similar kinds of items, which would allow for simple comparisons of proficiency levels or percentages correct over time.
In other words, there is an absence of linking (or anchor) items that are identical across years. Hence, item response theory (IRT) analysis, which would be the most appropriate way to equate test scores over the years, is not possible. These issues have consequences for assessing student achievement trends over time and for identifying those curriculum areas where the most work remains. A two-pronged analytical approach has, therefore, been used in this report to mitigate the observed challenges with the NAPE data sets. This approach includes use of "percentage correct" and "proficiency/adequacy levels." Percentage correct was calculated as the total points scored on each item divided by the total available. The total maximum points for each item are based on actual student results, not on the test blueprints. In some years, NAPE has defined proficiency levels and assigned a cutoff point for adequate student knowledge within each curricular area. In most cases, this cutoff is between 50 percent (that is, 2 points out of a possible 4) and 67 percent (that is, 2 points out of a possible 3). Adequacy levels are an effective way of communicating student achievement results. As noted earlier, the analysis relies mostly on the most recent data, with a special emphasis on years with similar curriculum blueprints. Data for other years that do not conform to this rule have been used to establish whether or not the observed pattern had prevailed in the past. Finally, the analysis is limited to the period since 2006. It is important to note that the results generated using the two approaches were consistent, which provides a reasonable rating of the quality of the data and the appropriateness of some comparative analyses. Chapter 2 of the analytical report includes a number of comparisons of achievement levels by student and school characteristic within each year.
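The two metrics described above reduce to a few lines of arithmetic. A minimal sketch follows; the function names and the default cutoff are illustrative, and NAPE's actual cutoffs vary by curricular area between 50 and 67 percent:

```python
def percent_correct(points_scored, points_available):
    """Total points scored across items divided by the total available."""
    return 100.0 * sum(points_scored) / sum(points_available)

def adequacy_level(pct):
    """Bucket a percent-correct score into the proficiency bands used in the report."""
    if pct <= 25:
        return "0-25%"
    if pct <= 50:
        return "26-50%"
    if pct <= 67:
        return "51-67%"
    return "68-100%"

def is_adequate(pct, cutoff=50.0):
    # NAPE cutoffs for adequate knowledge fall between 50% (2 of 4 points)
    # and 67% (2 of 3 points), depending on the curriculum area.
    return pct >= cutoff
```

For example, a pupil scoring 2 of 4 points on one item and 1 of 2 on another has a percent correct of 50, which falls in the 26–50 percent band.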
Those kinds of comparisons were also carried out in previous NAPE reports and are extended here to more content areas. The within-year differences between boys and girls, private and government schools, and rural and urban schools are also presented to enable identification of curriculum areas where achievement gaps appear to be increasing or decreasing. Tests of significance ("t-tests") are used both for "across years" comparisons (such as individual years versus all others) and "within years" comparisons (such as boys versus girls, private versus government schools). The end-of-cycle (EOC), or curriculum, reports, which are produced annually by the UNEB chief examiners for both primary and secondary levels, were also reviewed from 1999 to 2007; this information was used to complement NAPE analysis outcomes. For consistency with the NAPE, only English and mathematics reports for the primary and secondary education leaving exams have been reviewed. The limitation of those reports is that they are generic and do not specify the categories of students affected, whether by gender or region. Also, the reports are qualitative and do not quantify the extent of the problems identified in the various content areas. They are, however, useful as a complementary source of information on problem areas that need to be addressed. Therefore, information from those reports is used here to augment the main findings based on NAPE test information.

Analysis of Teacher Competencies and Predictors of Learning Outcomes

The data used in the analysis of teacher competencies and predictors of learning outcomes are taken from the NAPE 2011 national test application, which was also extended to the teachers of the three respective grades with support from the World Bank and the U.K. Department for International Development (DFID).
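The significance testing mentioned earlier in this section (for example, boys versus girls within a year) is a two-sample comparison of means. A self-contained Welch's t-statistic, computed on invented scores rather than the actual NAPE data, might look like:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t-statistic for a difference in group means."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n-1)
    se = math.sqrt(va / na + vb / nb)                # standard error of the gap
    return (mean(sample_a) - mean(sample_b)) / se

# Illustrative data only: literacy percent-correct scores for two groups.
girls = [47, 52, 44, 50, 49, 55]
boys = [43, 48, 41, 46, 45, 50]
t_stat = welch_t(girls, boys)
```

In practice the t-statistic is compared against a reference distribution to judge significance; with the large NAPE samples, even small gaps can be statistically significant.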
The samples are quite large, allowing for reliable comparisons across strata such as location (urban-rural), school type (private-public), and zone (16 in all). For P3 and P6, a total of 1,230 schools were visited, with approximately 24,000 students taking exams in numeracy and literacy and 900 teachers drawn from each grade. For S2, a total of 524 schools were visited, with 19,000 total students and one teacher drawn from each school in the subjects of numeracy, literacy, and biology.

Overall Test Design

For each subject and grade, the test blueprints that describe the overall breakdown of the tests in terms of content area (that is, geometry, reading comprehension, and so on) enabled the authors to drill down into the data to obtain an assessment of teachers' and students' knowledge levels by learning area. In P3, the test items include 57 questions for numeracy, 27 questions for literacy, and 28 questions for reading. For P6, the test booklets include 68 items for mathematics and 80 items for literacy. All P3 and P6 teachers answered the P6 student questions in numeracy and literacy, as well as a supplemental group of 28 questions for reading that do not appear in any of the student test files. The overall testing design for the primary level makes it possible to compare student and teacher knowledge on the same items only in P6. Primary three teachers did not answer the P3 student test questions, but rather the P6 items, because the P3 test items would be too simple for this category of teacher. The assessment of teachers' content knowledge is, therefore, based on P6 content because all teachers are trained as primary teachers, as opposed to grade-specific teachers.
Although a more flexible design would have added higher-level content knowledge items to the P3–P6 teacher test booklets, preferably anchored in the S2 student and teacher test booklets, this approach was not possible because of the need to adhere to the national specifications of NAPE for the primary level. For S2, the overall design strategy was similar. Students answered 46 questions in numeracy, 37 items in biology, and 77 items in literacy. Teachers answered all of those same test questions. In addition, 25 numeracy items measuring still more advanced content were added to the student and teacher booklets. For student and teacher background, basic information was collected to avoid prolonging the test duration beyond standard requirements. As noted earlier, this information includes data such as location, type of school, and zone (as will be presented in the analysis). For the teachers, some background questions about training, education levels, and marital status were added, together with socioeconomic status. In sum, the testing design and the data generated make it possible to address teacher competencies and effectiveness. Additional information on the methodological aspects of the analysis on teachers is presented in chapter 3. Attempts have also been made to explore other school-based factors that are likely to affect the overall performance of learners on the EOC exams, an exploration based on the most recent Uganda National Panel Survey (UNPS) data. Those data capture school attributes, including teacher presence, that are not captured in the NAPE.

Conceptual Framework

The conceptual framework is based on existing theories about teacher preparation, teacher knowledge, and teaching practices (Ball and Bass 2000; Hill, Rowan, and Ball 2005; Shulman 1986). It draws still more specifically on recent studies (Marshall and Sorto 2012; Sorto et al.
2009), as well as Marshall's (2009) framework for analyzing teacher effectiveness. Figure 1.1 provides a simple diagram that draws together different elements of teacher knowledge as presented in the abovementioned framework. Each of the three circles represents a different domain of knowledge that teachers draw upon when teaching. The left-hand circle represents general pedagogical knowledge, which refers to knowledge of "how teachers manage their classrooms, organize activities, allocate time and turns, structure assignments, ascribe praise and blame, formulate the levels of their questions, plan lessons, and judge general student understanding" (Shulman 1986). Teachers accumulate such pedagogical skills through pre- and in-service pedagogical training courses, experiential learning from trial and error in their own classrooms, and mentoring effects that result from watching other teachers or working closely with other school personnel (for example, head teachers, mentors, support providers). The right-hand side of the figure is the content knowledge circle, divided into lower and higher elements. Lower content knowledge refers to the level that is being taught (such as third grade), while higher content knowledge is for more advanced grades or levels beyond the grade for which the teacher is responsible.

Figure 1.1  Knowledge Components of Effective Teaching
[Venn diagram: pedagogical knowledge, content at the level taught, and content at a higher level; effective teaching lies at their intersection as pedagogical content knowledge (PCK), for example, of math, language, etc.]
Source: Sorto et al. 2009.

At the intersection of pedagogical and content knowledge lies a specialized form of knowledge that is highly prized, especially by education researchers and educators.
This domain is commonly referred to as pedagogical content knowledge (PCK) (Shulman 1986), and its evolution reflects a growing emphasis on practice-based metrics for analyzing teaching effectiveness in the classroom. Examples of pedagogical content knowledge include (a) the explanations that teachers use to develop a deep understanding of concepts that are part of the curriculum, (b) the ways in which they draw linkages with other elements of the subject matter they teach (such as mathematics), and (c) the questions they pose to students. These kinds of skills, it is argued, can be accumulated only through practice or very specialized training activities (Ball, Hill, and Bass 2005; Hill, Schilling, and Ball 2004).2 By definition, the PCK element is made up of critical strands of knowledge that most directly influence a teacher's ability to develop curriculum, and it draws on all three of the knowledge domains. However, each element of teachers' knowledge potentially helps determine their effectiveness. The most obvious component of effective teaching is content knowledge at the level being taught, which is what this report addresses. This part of the statistical analysis is, hence, divided into two stages. First, within each grade and subject area, test items were analyzed using IRT software. Because the test items included open-ended items with a range of scores, it was necessary to use the partial-credit IRT extension. In P6 and S2, this process of reviewing item performance also involved grouping students and teachers together in order to get comparable scores, which was facilitated by the fact that they all answered the same test items. This work was carried out using the Construct Map software (Version 4.6) developed at the University of California. Items were reviewed according to their fit.
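Item review of the kind described above is often approximated, before or alongside IRT scaling, with classical item statistics: difficulty as the item's mean score fraction and discrimination as the item-total correlation. The sketch below uses invented responses and is not the Construct Map fit procedure itself:

```python
from statistics import mean

def item_difficulty(item_scores, max_points):
    """Mean score on the item as a fraction of its maximum (higher = easier)."""
    return mean(item_scores) / max_points

def item_discrimination(item_scores, total_scores):
    """Pearson correlation between item score and total test score."""
    mx, my = mean(item_scores), mean(total_scores)
    cov = sum((x - mx) * (y - my) for x, y in zip(item_scores, total_scores))
    vx = sum((x - mx) ** 2 for x in item_scores)
    vy = sum((y - my) ** 2 for y in total_scores)
    return cov / (vx * vy) ** 0.5

# Invented data: one 2-point partial-credit item and total scores for six takers.
item = [0, 1, 1, 2, 2, 2]
total = [10, 14, 13, 18, 20, 22]
difficulty = item_difficulty(item, max_points=2)
discrimination = item_discrimination(item, total)
```

An item with very low discrimination (near zero or negative) would be flagged for removal, which mirrors the screening of poorly fitting items described in the text.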
In a handful of cases, problematic items were removed from the analysis, which enabled validation of data quality, a process that was not possible with the data used in the analysis of learning outcomes and problematic curriculum areas (presented in chapter 2). In general, the items performed within a suitable range for difficulty and discrimination, which reflects well on the work of the NAPE-UNEB team. The IRT-scaled scores were used mainly for comparing overall achievement levels. For content-area-specific comparisons, the raw percentage correct results were used, although those calculations do not include the problematic items identified in the IRT work. The second statistical activity focuses on explaining variation, both between students and schools and between teachers. To do this, a series of multivariate models were estimated. The analysis for S2 teachers was again based on the standardized measure derived from the IRT analysis, which incorporated all test items answered by teachers. The results are presented by subject, once again with two models. Ordinary Least Squares (OLS) was used in the first model instead of the Hierarchical Linear Model (HLM) because teachers in secondary schools are not clustered; in the second model, fixed effects (FE) were also used. Table D.1 in appendix D provides a descriptive overview of all variables used in the analysis.

Report Outline

In light of the objectives identified earlier, the report is structured into four main chapters. Chapter 2 analyzes levels and patterns of learning outcomes in Uganda and explores problematic curriculum areas, drawing on NAPE results for the years 2006–10. Chapter 3 presents an analysis of teacher competencies and effectiveness, together with predictors of performance among learners and teachers on the basis of 2011 NAPE data. Chapter 4 concludes the report with emerging issues and suggestions for next steps.

Notes

1.
The 1992 white paper's articulation of the purposes of Uganda's education system continues to provide supreme guidance for the sector. The paper articulates the education system's aims as to (a) promote citizenship; (b) instill moral, ethical, and spiritual values; (c) promote scientific, technical, and cultural knowledge, skills, and attitudes; (d) eradicate illiteracy; and (e) equip individuals with basic skills and knowledge so they gain the ability to contribute to the building of an integrated, self-sustaining, and independent national economy.

2. For example, Ball, Hill, and Bass (2005) argue that a mathematically literate person would struggle to answer questions that they as researchers created in order to measure specialized knowledge.

CHAPTER 2

Learning Outcomes and Problematic Curriculum Areas

Overall Achievement Levels in Numeracy and Literacy

This chapter briefly analyzes the overall achievement levels of learners in Uganda, with a special focus on literacy, numeracy, and biology. Its purpose is to establish a background for the analysis that follows. As indicated earlier, this report is based on cross-sectional national assessment data generated every year for primary grades 3 and 6, together with senior two (S2) of the lower secondary system.

Student Achievement Levels in Literacy

As a result of data limitations identified in the methodology section, the data for two years (2009 and 2010) have been used to analyze literacy achievement because of the consistency observed in the test items for this period. Lower secondary (S2) has only three years of results to analyze, since its assessment began in 2008, and there have been no reported curriculum changes within the literacy in English area. Results for all three years are presented. Learning achievement in literacy at P3 was estimated at 47 percent in 2010, an improvement from the 43 percent recorded in 2009 (figure 2.1).
The percentage of low-scoring students (0–25 percent) declined by about 30 percent between 2009 and 2010, while the percentage that scored in the highest proficiency category (68–100 percent) increased, although not by the same margin (figure 2.2). Despite observed improvements, it is important to note that the percentage of P3 students who scored above 50 percent had reached only about 40 percent as of 2010.

Figure 2.1  Summary of P3 English Literacy, Overall Percentage Correct, Uganda, 2009/10
[figure: percent correct by year, 2009 and 2010]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Figure 2.2  Summary of P3 Average Literacy Scores by Level, Uganda, 2009/10
[figure: percent of student samples in the 0–25%, 26–50%, 51–67%, and 68–100% bands, by year]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Literacy achievement at P6 was estimated at 40 percent in 2010. Comparing this percentage with the 2009 level of 35 percent may also signal improvement in literacy achievement for this grade. This finding is affirmed by the reduction in the proportion of learners scoring in the lowest category (0–25 percent), which fell by a noticeable margin (from 38 percent to 24 percent). However, the proportion that scored in the highest category held at only 10 percent. It is important to note that in 2010, less than 30 percent of P6 students scored above 50 percent in English literacy (figures 2.3 and 2.4). Overall, literacy achievement at S2 is estimated at 52 percent, and a relatively high proportion of lower secondary students scored in the highest proficiency level (figures 2.5 and 2.6). Despite the identified weaknesses in the NAPE data sets, the rapidly declining trend in lower secondary achievement levels since 2008 cannot be ignored, including an almost 50 percent reduction in the proportion of students in the highest-performing category.
Figure 2.3  Summary of P6 English Literacy, Overall Percent Correct, Uganda, 2009/10
[figure: percent correct by year, 2009 and 2010]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Figure 2.4  Summary of P6 Average Literacy Scores, Uganda, 2009/10
[figure: percent of student samples in the 0–25%, 26–50%, 51–67%, and 68–100% bands, by year]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Figure 2.5  Summary of S2 English Literacy, Overall and Within Common Content Areas, Uganda, 2008–10
[figure: percent correct by year (2008–10), for all questions and for common content]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Figure 2.6  Summary of S2 English Literacy by Proficiency Levels, Uganda, 2008–10
[figure: percent of students in the 0–25%, 26–50%, 51–67%, and 68–100% bands, by year]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Literacy and Gender

Gender comparisons for the overall test score in English literacy across all three grades and NAPE survey years are presented in figures 2.7 and 2.8. The differences between males and females are calculated for each year by subtracting the overall sample average for males from the overall sample average for females. Each figure is divided into two halves: one above zero and one below zero. Bars above zero refer to categories where females have higher average test scores; bars below zero mean that males have the advantage.
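The female-male differences plotted in those figures reduce to a subtraction of group means. A sketch with invented records (the data and field labels are illustrative, not the NAPE file layout):

```python
from statistics import mean

def gender_gap(scores):
    """Female mean minus male mean: positive favors girls, negative favors boys."""
    girls = [s for sex, s in scores if sex == "F"]
    boys = [s for sex, s in scores if sex == "M"]
    return mean(girls) - mean(boys)

# Illustrative records of (sex, literacy percent correct) for one grade-year.
records = [("F", 48), ("F", 52), ("M", 47), ("M", 49)]
gap = gender_gap(records)  # positive value: girls score higher on average
```

The same subtraction, applied grade by grade and year by year, produces the bars in the figures; the statistical tests in appendix B then judge whether each gap is significant.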
The overall literacy scores by gender indicate that although girls have been significantly outperforming boys since 2006 (except for a very marginal differential in favor of boys at P6 in that year), the pattern appears to change in 2010, but the margin is too narrow and insignificant to be conclusive. However, boys score consistently lower than girls at S2, and once again some (although not all) of the differences are statistically significant (see table B.3 in appendix B). According to initial comparisons of overall English literacy, it is safe to say that there is not a large issue with gender equity in this specific area of the curriculum in Uganda. The improvement of boys vis-à-vis girls between P3 and P6 is somewhat of a concern, but because that finding does not continue into lower secondary, it is not of serious concern. Also, it should be restated that these differences are, on average, very small and never go beyond 2.5 percent in any year or grade. Finally, gross and net enrollment rates for boys and girls are similar in the grades tested, so comparisons are not affected by different rates of participation.

Figure 2.7  Female-Male Difference in Overall English Literacy by Grade, Uganda, 2010
[figure: difference in percent, by grade (P3, P6, S2)]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Figure 2.8  Female-Male Difference in Overall English Literacy by Grade, Uganda, 2006–09
[figure: difference in percent, by grade (P3, P6, S2)]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

Urban-Rural Comparisons

One of the more common findings in educational research is that urban students score significantly higher than their rural counterparts on standardized tests. There are two general explanations for this advantage.
The first is related to socioeconomic status and family background, which includes household as well as community (including peer) effects. The second is that urban students are also likely to attend schools of higher quality, which includes the critical work of teachers in the classroom (for example, better staffing levels). This combination of higher socioeconomic status (SES) and access to better schools can result in very large differences in average levels of achievement. The differences in achievement may be mediated, however, by differences in participation. In higher grades especially, access may be relatively limited in rural areas, so the resulting urban-rural gap can be distorted somewhat when comparing a relatively select group of rural children against urban schools that enroll a much higher percentage of eligible children. It is evident that there is a very large and significant difference between urban and rural student achievement in English literacy at the two primary grade levels. The English literacy "gap" between urban and rural students is summarized in figure 2.9. The differences are calculated by subtracting the rural average from the urban average; positive numbers, therefore, mean that urban students score higher. A further look at past years (2006 and 2008) indicates that this pattern has been consistent, averaging about 20 percentage points in favor of urban students (figure 2.10). The size of these differences reinforces the importance of family background and school quality, although it is impossible with NAPE data to sort out which influence is more important.

Figure 2.9  Urban-Rural Difference in Overall English Literacy by Grade, Uganda, 2010
[figure: percentage difference, by grade (P3, P6, S2)]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–08, 2010.
This lack of specificity implies that rural students with equal amounts of education as their urban counterparts are leaving the primary cycle with far lower levels of real preparation, which, in turn, has consequences for their secondary schooling and beyond. In his work on urban-rural gaps in education in Sub-Saharan Africa—based on data from the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ), of which Uganda is a participating country—Zhang (2006) found a similar pattern. Rural students underperformed their urban peers by large margins in most of the 14 participating countries.1 Compared to those peers, rural students had lower levels of family SES, were older, were more likely to have repeated a school grade, and had less home support for their academic work. In addition, rural schools had fewer and lower-quality resources than did urban schools. Although the pattern also prevails at the lower secondary level according to S2 results, the average S2 student in urban Uganda scores only about 5 percent higher than his or her rural counterpart, a much smaller difference when compared with P3 and P6. This result is most likely driven by the very different rates of participation in lower secondary education in urban and rural Uganda. More children are now making it to the end of the primary cycle, but in rural areas especially, only a relatively elite group of students continues on to postprimary schooling. Also, secondary schools located in rural areas are likely to be better staffed and equipped than the average rural primary school. Thus, it is not surprising that those children are able to keep up with their urban counterparts.

Figure 2.10  Urban-Rural Difference in Overall English Literacy, Uganda, 2006–08
[figure: percentage difference by year (2006, 2007, 2008) for P3, P6, and S2]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–08, 2010.
However, this is another result that could change in coming years, as secondary education participation rates continue to improve in the country. The pattern for lower secondary may not conform to the global pattern for certain countries of the Organization for Economic Co-operation and Development (OECD), according to the 2009 Program for International Student Assessment (PISA). Students in urban OECD schools performed far better than did students in other schools, even after accounting for differences in socioeconomic background. In Chile, Italy, Mexico, the Slovak Republic, and Turkey, as well as in partner countries such as Albania, Argentina, Peru, Romania, and Tunisia, the performance gap between students in urban and rural schools was more than 45 points, after accounting for differences in socioeconomic background. Wider differentials of 80 points or even more were observed in countries such as Bulgaria, Hungary, the Kyrgyz Republic, and Panama. This observation is not to say that Uganda is performing better, but to indicate that high-quality systems such as those in Finland, Canada, and the Netherlands do not have such differentials.

Comparisons by Public and Private School

Another common comparison in educational research is between private and public schools. In P3 and P6, there are very large gaps between the literacy achievements of students attending public and private schools, in favor of the latter. The gap is upward of 30–35 percent, and the private advantage is fairly stable over time for those grades (see figure 2.11).

Figure 2.11  Government-Private School Difference in Overall English Literacy, Uganda, 2010
[figure: difference in percent, by grade (P3, P6, S2)]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

These results are not surprising. As noted earlier, private Ugandan primary schools tend to serve a
fairly elite population, which helps explain why less than 10 percent of grade 3 and grade 6 children are enrolled in such schools. In S2, however, public secondary school students performed slightly better in English literacy in 2010 than did their private counterparts, while in 2006 and 2008 (see figure 2.12 and table B.6 of appendix B) performance shows the reverse, although the margin is very small and almost insignificant (less than 0.5 percent in both years). Compared with primary schooling, the results for S2 are surprising. However, given substantial private participation at this level, it is less likely that these are elite schools serving a special clientele. The finding is not unusual internationally, because at higher levels of education (secondary and tertiary), the public presence is smaller but of a generally high quality, while the private sector presence is more diverse and includes both high- and low-quality offerings.

Figure 2.12  Government-Private School Difference in Overall English Literacy, Uganda, 2006–09
[figure: difference in percent, by year (2006–09) for P3, P6, and S2]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

Importance of and Overall Achievement in Numeracy

In the day-to-day world, the literacy competence of a population is discussed more than numeracy because it is more widely recognized that low levels of literacy constrain the effective functioning of adults, including their overall management of adult life. It is, therefore, often assumed that numeracy incompetence (the inability to manage numbers, tables, and graphs) is acceptable because numeracy is less important than literacy. Longitudinal studies of numeracy in the United Kingdom indicate, however, that poor numeracy is a problem in its own right.
Bynner and Parsons (1997 and 2000) reveal that people with poor numeracy skills tended to leave full-time education at the earliest opportunity—usually without qualifications—followed by patchy employment with periods of casual work and unemployment. Further work by Parsons and Bynner (2005) shows that men with poor numeracy skills, irrespective of their level of literacy, were more likely to have been suspended from school, were more at risk of depression, and had little interest in politics. The negative impact for women with poor numeracy skills was, however, worse, because they were less likely to be in full-time work. If they did work, they were more likely to be engaged in home care, to report poor physical health in the past 12 months, to have low self-esteem, and to feel that they lacked control over their lives. Such studies reflect the importance of numeracy skills enhancement within a population, as well as the relevance of such skills to the poverty reduction agenda, which amplifies their importance in Uganda today. For both P3 and P6 numeracy, the test blueprints are similar in design in 2006 and 2009. This means that the blueprints have identical subcontent areas (for example, geometry, measures), roughly the same number of test questions per content area, and a total value (in points) that is fairly constant. However, in 2010 there was a fairly significant shift in blueprint coverage and item format for P3 and, to a lesser degree, P6. This change resulted mainly from more open-ended, multiple-point test questions being incorporated into the 2010 assessment, which was justified by the need to strengthen competency assessment as opposed to knowledge assessment.
This means that the scores across years are comparable to some degree, but some caution is required in comparing 2010 with earlier years, given the possibility that the kinds of skills being tested in 2010 were different—especially if there were differences in item format. In light of the need to use the most recent statistics for analysis, this report uses 2010 data. The summary results on numeracy achievement for all the grades are shown in figures 2.13 and 2.14 for ease of presentation.

Figure 2.13  Summary of Numeracy, Overall Percent Correct in P3, P6, and S2, Uganda, 2006–10
[figure: percent correct by year (2006–10) for P3, P6, and S2]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.

In 2010, the overall numeracy average for P3 was significantly higher than that realized in earlier years. For the other grades, average student performance in numeracy was about 40 percent in 2010, which means that the average P6 or S2 student in Uganda cannot answer even half of the questions related to material that is supposed to have been taught in the classroom at those levels. Results also show that this performance pattern was not unique to 2010 but also existed in 2009 and earlier years and was actually much lower for the primary grades in 2006 and 2007. P3 registered the highest proportion of students performing above the 50 percent mark (67 percent), compared to about 30 percent for P6 and S2. Compared to earlier years, the results for P3 point to improvements in numeracy performance since 2006, while a noticeable stagnation is observed for P6. Further, S2 students are not outperforming the other grades in terms of overall scores, which is a significant difference when compared with the results for
Figure 2.14  Summary of Overall Numeracy Proficiency Levels, Uganda, 2006–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.

English literacy. In two out of three years, scores are near the 40 percent mark. This finding reinforces the difficulty of achieving numeracy in Uganda, as even more advanced students are struggling to reach the 50 percent level. The chapters that follow explore areas of relative student strength and weakness in the numeracy curriculum in order to identify guided interventions in this area.

Differentials by Gender

Figures 2.15 and 2.16 provide a general summary of female-male differences in numeracy achievement in all three grades, with negative values indicating a male advantage for the respective grades. It is evident from the results that boys in Uganda are scoring persistently higher than girls in numeracy achievement tests in all three grades, with the magnitude of the gender gap in numeracy much larger than that found for English literacy. Results also show that the female-male difference in numeracy widens by grade—about 4 percentage points in P6 (equivalent to about 0.20–0.30 standard deviations) and almost 8 percentage points in S2. This observation is consistent with another common pattern of gender achievement: the differences in numeracy achievement between boys and girls are small in earlier grades, then increase in later grades. It is thus evident that the overall gender

Figure 2.15  Female-Male Difference in Numeracy by Grade, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.
Figure 2.16  Female-Male Difference in Numeracy by Grade, Uganda, 2006–09
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

gap in numeracy does not appear to be closing, either over time within grades or across grade levels.

More detailed tables in appendix C present standard deviations together with statistical tests of significance. The findings explored there help unmask large differentials that may appear to be small differences of a few percentage points. For example, the difference in overall numeracy achievement in P3 is usually between 2 and 3 percentage points. This is not a large gap, but it nonetheless amounts to about 0.10–0.15 of a standard deviation (see table C.1 in appendix C).

Differentials by Rural-Urban Residence

In regard to school location, urban students in P3 and P6 score significantly higher (by about 9.5 and 12 percentage points, respectively) in numeracy than do their rural counterparts, while the differences in S2 are negligible (about 2 percentage points, as shown in figure 2.17). Although the observed pattern is consistent with the literacy results, the important difference is that the gap between urban and rural averages for the two primary education grades (P3 and P6) is considerably smaller in numeracy than it is for literacy. Results for the previous years (figure 2.18) reveal that this differential, which has prevailed for the past four years, is also on a gradual decline, especially in the primary grades. This narrowing reflects improving rural scores rather than a decline in urban learning levels.

Differentials by Public-Private School

Both P3 and P6 display very large differences in numeracy achievement between the average government school student and the average private school student (figure 2.19 and appendix C). On average, this difference is about 20 percentage points, which amounts to about 1.0–1.2 standard deviations.
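The conversion used above—a raw gap in percentage points divided by the standard deviation of scores—can be sketched as follows. The gap and standard-deviation values below are illustrative assumptions chosen to fall in the ranges reported in this chapter; they are not NAPE figures.

```python
# Illustrative only: expressing a raw percentage-point gap as an
# effect size (gap divided by the standard deviation of scores).
# The input values are assumptions, not actual NAPE data.

def effect_size(gap_points, score_sd):
    """Express a mean difference in standard-deviation units."""
    return gap_points / score_sd

# A P3-style example: a 2.5-point gender gap with an assumed SD of
# about 20 points yields 0.125 SD, in the 0.10-0.15 range cited above.
print(round(effect_size(2.5, 20.0), 3))   # 0.125

# A government-private example: a 20-point gap with the same assumed
# SD corresponds to a 1.0 SD difference.
print(effect_size(20.0, 20.0))            # 1.0
```

Dividing by the standard deviation is what allows a seemingly small gap of a few percentage points to be recognized as a meaningful difference when the spread of scores is also small.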
In other words, private school students are far ahead of their government school counterparts at the end of the primary cycle, although it is important to note that this gap has been higher in P3 in the past (as high as 25 percentage points in 2007 and 2009) compared to P6 (figure 2.20). In lower secondary (S2) there is no sizeable private school advantage; in fact, government school students scored marginally higher not only in 2010 but also in the two previous years.

Figure 2.17  Urban-Rural Difference in Overall Numeracy by Grade, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Figure 2.18  Urban-Rural Difference in Overall Numeracy by Grade, Uganda, 2006–09
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

Figure 2.19  Government-Private School Difference in Numeracy by Grade, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Overall Achievement Levels in Biology

Overall student performance in biology in S2 is very low: only 27 percent (figure 2.21). Performance in earlier years is, however, mixed. In 2009, a relatively higher performance level of 43 percent was realized, while in 2008, performance levels were as low as the levels registered in 2010. Proficiency levels further confirm the mixed results (see figure 2.22). Almost no student

Figure 2.20  Government-Private School Difference in Numeracy by Grade, Uganda, 2006–09
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Figure 2.21  Overall Achievement Levels in Biology at S2, Uganda, 2008–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Figure 2.22  Overall Proficiency Levels in Biology at S2, Uganda, 2008–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

performed above the 50 percent mark in biology in 2008 and 2010. This implies that virtually no S2 student could correctly answer even half of the test items drawn from the biology curriculum at that level. In 2009, however, the proportion that could not achieve the 50 percent mark was lower (73 percent), so 27 percent of students scored above it. It is noteworthy that biology was chosen for inclusion in the national assessment because teaching it requires relatively limited laboratory facilities. Laboratory coverage in secondary schools in Uganda is about 52 percent. Assessing subjects such as chemistry and physics (which demand laboratories) at S2 was thus considered unsuitable. The extremely low performance levels in biology therefore signify that Uganda faces a great challenge in improving science instruction at the lower secondary level. Of importance to note, however, is that public and private schools performed at the same level in this subject.

Differentials by Other Attributes

Boys consistently score higher than girls on biology assessments (figure 2.23); the difference is statistically significant in all three years. The boys' advantage over girls throughout the years is actually larger than that suggested in the figure, amounting to 0.35–0.45 standard deviations.
This is clearly a significant advantage, although again, both categories of students scored low overall in 2008 and 2010.

Comparisons of the biology achievement of private and government schools are presented in figure 2.24. The results are consistent with other subjects at the S2 level—and very different from the pattern in P3 and P6. The results show that, in general, there is not much difference between public and private schools in S2 biology. In both the 2008 and 2010 tests, the averages across those categories are virtually identical. In 2009, however, public school students scored about 0.20 standard deviations higher than did their private school counterparts (the difference is significant).

Urban students scored significantly higher than did rural students in the 2008 and 2010 tests. But the differences were not as large as in the other comparisons (about 0.15 standard deviations), and in 2009 there was no significant difference between students by location of their school.

Figure 2.23  Summary of S2 Biology Achievement by Gender, Uganda, 2008–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Figure 2.24  Summary of S2 Biology Overall Achievement Levels, Uganda, 2008–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Summary of Assessment Results

In sum, several results stand out. First, student achievement levels at the primary level are far below intended levels. NAPE tests are constructed on the basis of the official (or intended) curriculum—that is, they cover content that children are supposed to learn.
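Throughout this chapter, percent-correct scores are grouped into the four proficiency bands shown in the figures (0–25, 26–50, 51–67, and 68–100 percent). As a minimal sketch, this banding amounts to a simple threshold classification; the sample scores below are made up for illustration.

```python
# Illustrative only: assigning percent-correct scores to the four
# proficiency bands used in the NAPE figures. Band edges come from
# the figure legends; the sample scores are invented.

def proficiency_band(percent_correct):
    """Return the proficiency band label for a percent-correct score."""
    if percent_correct <= 25:
        return "0-25%"
    elif percent_correct <= 50:
        return "26-50%"
    elif percent_correct <= 67:
        return "51-67%"
    else:
        return "68-100%"

# Share of a hypothetical sample scoring at or below the 50 percent mark.
scores = [18, 34, 42, 49, 55, 61, 72, 88]
at_or_below = sum(1 for s in scores if s <= 50)
print(proficiency_band(42))                     # 26-50%
print(f"{at_or_below / len(scores):.0%}")       # 50%
```

Statements in the text such as "roughly 60–70 percent of students score at 50 percent or below" are exactly this kind of band-share calculation applied to the full assessment samples.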
The fact that roughly 60–70 percent of P3 and P6 students are scoring at 50 percent or below overall highlights the challenges of ensuring that a critical mass of (future) workers acquire the foundational skills essential for national growth. Second, there is tentative evidence of improvement at the primary level over the two-year period 2009/10, although the pace is still very slow. The standard deviations for the two primary grade levels also declined between 2009 and 2010, which suggests increasing levels of achievement and decreasing inequality between students in the samples. This is most apparent in P3, but P6 also shows some signs that average achievement levels are increasing, which is encouraging.

For S2, the results in English literacy are very different; overall averages are much higher and relatively few students score at the lowest achievement levels. Nevertheless, a decline in performance prevails, with average achievement levels having decreased between 2008 and 2010. The results are consistent, to some degree, with a relatively select group of students that is slowly expanding, as more children enter lower secondary school following the introduction of universal secondary education (USE) in 2007. The fact that the average student at this level scores much closer to the minimum expected achievement levels is encouraging. The declining trend is, however, worrisome. It will therefore be important in future years to monitor progress in S2 as more and more children make the transition from primary to lower secondary education.

Overall, comparisons by location provide few surprises. The very large gaps in P3 and P6, while alarming from an equity and policy perspective, are not surprising when compared to research and assessment results in other countries (also see the NAPE reports for Uganda).
There is no simple explanation for this advantage; the reality is that even poor urban children tend to enjoy meaningful advantages over their rural peers in a number of areas relevant to education. The much smaller gap in S2 is somewhat surprising, especially given the size of achievement differences in P3 and P6. But when one takes into account who is being compared in each location, the result is not surprising. It is also potentially misleading; as more children in rural areas continue to lower secondary school, these results may change substantially. The wide differential between private and public schools in overall literacy and numeracy achievement at the primary level also deserves attention.

Overall scores in biology are very low. In two years (2008 and 2010), the average assessment score was below 30 percent, although in 2009, it was above 40 percent. Comparisons by type of student and school are mixed. For gender, there is a consistent—and at times substantial—advantage for boys over girls. However, the differences by school location and type of school (public versus private) are not as significant. So overall, the main conclusion remains generally the same: the major challenge in this content area is to lift the achievement levels of all students in all schools.

Problematic Curriculum Areas

This chapter identifies problem areas in the curriculum as revealed by the NAPE data sets, with the aim of providing feedback to inform ongoing quality improvement programs in Uganda. This feedback is intended to ensure that efforts are geared toward addressing the factors that could be responsible for observed achievement patterns, low learning achievement levels notwithstanding. Problematic curriculum areas are those where students are struggling most, as seen in their performance on the various curriculum content areas for the respective assessment years.
The purpose is not to reanalyze these data but rather to provide additional analysis that can complement the very useful summary reports provided by NAPE for individual years. The analysis is supported by findings from the qualitative EOC examination reports generated by UNEB chief examiners. The underlying assumption is that persistently low scores in the respective content areas also serve as a proxy for learning areas that pose challenges for learners. The findings should thus trigger more focused action and more effective use of resources to address gaps in educational quality improvement. In a bid to identify problematic curriculum areas, the sections that follow present NAPE findings by curriculum content area, subcontent area, and grade.

Problem Areas in Literacy

Learning areas in literacy include reading comprehension, writing, and grammar. Each of those areas has subcontent areas that are explored in the respective sections that follow.

Reading Comprehension

The ability to read with understanding is a fundamental skill, and learning to read both early and at a sufficient rate is essential for effective learning. Figures 2.25–2.27 depict learning achievement patterns for the respective subcontent areas within the reading comprehension curriculum. Although P3 maintained the same curriculum blueprints for 2009 and 2010, the P6 blueprints changed between 2009 and 2010, which explains the use of only 2010 averages for P6. Moreover, S2 has fewer areas to consider, so all three years are included.

Results indicate that the highest scores by grade level were in the following subcontent areas: (a) matching in P3 (about 80 percent of questions correct), (b) associating pictures in P6 (97 percent), and (c) reading passages in S2 (70 percent correct).
However, these are the only subcontent areas across all three grades where students scored above the 66 percent level, which has been used as one of the cutoff points for proficiency.

Figure 2.25  P3 Reading Comprehension Subcontent Areas, Uganda, 2009/10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Figure 2.26  Summary of P6 Reading Comprehension in Detail, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Figure 2.27  Summary of P6 Reading Comprehension Subcontent Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

The next cutoff point is 50 percent; in P3, this level is attained in only one additional area (identifying), while P6 has two subcontent areas with averages above 50 percent (calendar and poems). Another key finding is that the difference between the highest- and lowest-scoring subcontent areas is very wide (80 percent for matching against 10 percent for describing in P3).
By comparison, the highest-scoring area in P6 was associating pictures at 97 percent, against the lowest-scoring area, sequencing pictures, at only 12 percent. Such wide differentials within a subcontent curriculum area are worrisome and point to uneven curriculum coverage of the subject matter, which compromises learner mastery of the whole set of requisite proficiencies.

In regard to difficult curriculum areas, the results are indicative, especially when one focuses on students at the bottom of the range in P3 and P6. For example, P3 students struggled most in the areas of recognizing (18 percent) and describing (11 percent). In P6, the most troubling results are in the subcontent areas of interpreting cartoons (21.5 percent correct) and sequencing pictures (12 percent correct). A handful of subcontent areas for both grades have averages in the middle. These include completing sentences, comprehension, and completing words for P3, as well as describing activities, telling time, and telling a story for P6.

Achievement in P6 is, in some instances, used as a proxy for likely end-of-primary-cycle performance; hence, further analysis is done for this grade (figures 2.26 and 2.27), using the same proficiency levels used earlier. The results provide a slightly better picture of how many students are below expected levels of achievement by content area (expected levels being in the 50–67 percent range). The focus is thus on the proportion (by area) of students scoring at or below 50 percent. (Tables B.7 and B.8 in appendix B provide a more detailed overview of reading comprehension subcontent area results in P3 and P6.)

Not only are the overall averages for S2 higher than those for the other grades, there is also much less variation between subcontent areas in reading comprehension (figure 2.28).
All of the subcontent areas are above 50 percent for 2008 and 2009, and only two areas have averages below 50 percent during any of the three years.

Figure 2.28  S2 Reading Comprehension Subcontent Areas Summary, Uganda, 2008–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Writing

Once again, the P3 results for 2009 and 2010 have been analyzed, while the P6 writing results were analyzed for 2010 only as a result of the curriculum blueprint changes identified earlier. For S2, the test structure appears to be relatively similar across all three years of coverage. Additional information on the writing subcontent areas is provided in tables B.9–B.14 in appendix B.

A wide gap exists between the high- and low-scoring subcontent areas of writing in both P3 and P6 (figures 2.29 and 2.30). For example, the highest average

Figure 2.29  Summary of P3 Writing Content Areas, Uganda, 2009/10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Figure 2.30  Detailed Summary of P6 Writing Content Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.
score in P3 (copying) is almost 60 percentage points above the lowest-scoring subcontent area (naming), while for P6, the best-performed subcontent area (draw and name) was about 50 percentage points higher than the lowest-scoring area (writing a composition).

In both grades, students generally scored higher in areas where writing activities respond to guided instructions: for example, copying (80 percent) and writing patterns (74 percent) in P3, together with draw and name (59 percent) in P6. Students scored lowest in areas where creativity and compositional skills were required, such as letter writing (32 percent) and writing a composition (12 percent) in P6. This finding is not unusual in literacy assessments because guided or structured activities can be relatively straightforward to complete, while activities that provide only general instructions and require the student to create a response can be much more difficult. Nevertheless, development of the skills that elicit creative and imaginative writing at the foundational stages of education should be a central learning objective.

There is less variation across the different categories of writing skills in S2, despite the emerging declining trend (figure 2.31). The relatively low scores for composition in S2 reinforce the earlier observation that students perform guided writing exercises best, as opposed to writing that requires creativity, such as compositions. Creative thinking as reflected in written material is hence still a challenge for Ugandan learners. The EOC qualitative reports further confirm that students have challenges with nonguided writing.
At the end of the primary level, students are required to do the following:

• Rewrite sentences as instructed in brackets without changing the meaning
• Construct sentences correctly according to the instructions given in each question
• Use common English patterns from everyday life
• Give the plural form of words
• Read a passage and answer questions in full sentences
• Interpret charts, pictures, and notices or advertisements.

Figure 2.31  S2 Writing Subcontent Areas Summary, Uganda, 2008–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

The problems identified by the chief examiners among students who performed below the average expectation at the end of the primary education cycle were (a) poor use of common English patterns; (b) failure to construct complete sentences; (c) poor interpretation of the expressions used in conversation, and hence a failure to make logical conclusions from given statements; (d) misplacement of responses; and (e) poor interpretation of instructions.

For guided writing and composition, students were required to sequence given sentences in proper order to form a good composition. The identified challenge in this case was the inability of students to understand and interpret sentences correctly, thereby leading to their failure to rearrange them in a manner that constitutes a logical composition. For formal letter writing, students were required to format, rewrite, and punctuate a provided text so that it constituted a formal letter. On the basis of the EOC reports that summarize issues related to poor learner performance, the following issues were identified: (a) students' failure to distinguish between formal and informal writing and (b) their lack of knowledge about the key parts of a letter, in addition to grammatical errors.
In conclusion, based on both the NAPE achievement test results and the EOC reports, the results for writing are troubling. In terms of overall test scores, writing does not appear to be especially problematic, at least relative to other content areas. But when the results are broken down by more specific skills, it becomes clear that this area requires a lot of work in Uganda. Significant numbers of students can complete basic writing activities, although many P3 and P6 students are unable to do so. But when the tasks require more creative elements, the scores are generally much lower, and this is true at all grade levels. These kinds of skills can be developed only through practice and feedback from teachers. This is a challenging area to address in the classroom, especially in heterogeneous situations with large class sizes. But it is imperative that students be exposed to more applied writing activities, especially those that require them to go beyond copying and filling in forms, so that they can effectively engage in more creative forms of expression.

Grammar

The final content area in English literacy is grammar. For P3, grammar was included as an explicit content area only during 2006–08, and it has not been a distinct content area in the NAPE since 2009. Thus, this chapter analyzes the grammar subcontent areas for P6 and S2 only.

An average P6 student is not very comfortable with grammar content and can answer only about one of every three questions correctly. In fact, the majority

Figure 2.32  Detailed Summary of P6 Grammar Content Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.
of scores was in the 30–50 percent range (figure 2.32). In one subcontent area (use correct tenses), the average score was very low (22 percent), which clearly indicates an area where greater learner support is required. Average scores above 50 percent were found in only a couple of areas (use comparatives and give plurals), and even those highest-scoring areas clearly have room for improvement.

For S2 students (figure 2.33), the areas of most concern—where learners have averages below 50 percent correct—are punctuation, adjectives, and structural patterns. Students appear to be comfortable with grammar areas related to nouns, pronouns, and tenses, all of which have averages above 70 percent correct. Interestingly, using correct tenses was the lowest-scoring area in P6, while in S2 it was one of the highest scoring. A second group of grammar subcontent areas that have averages above 50 percent, and that thus emerge as the least problematic for this grade, are articles and words, prepositions, and adverbs (tables B.14 and B.16 in appendix B).

Figure 2.33  S2 Grammar Subcontent Areas Summary, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

According to EOC reports, the underlying problem with English at the primary level—which appears to cut across the most problematic areas—is that candidates are not able to sufficiently read, understand the questions, and follow instructions. This same problem with basic English skills was reported in the EOC English report for secondary education in 2008.
This concern is confirmed by the fact that the problem areas occurring most frequently—nearly every year—were word formation and transformation; structures, sentence formation, and plurals; and vocabulary or dictionary work. It is clear that pupils need basic reading and writing skills, starting with the formation of words, their meanings, pronunciation, spelling, use in simple sentences, and so forth, all of which are central to the acquisition of the necessary literacy skills.

Another important finding from the EOC reports is the link between problems identified at the end of secondary education and similar problems identified at the end of primary education. The 2008 end-of-lower-secondary EOC report confirms that these problems were recurrent and had been cited in earlier reports. Examiners reported poor grammar as reflected in (a) wrong use of words, (b) inappropriate use of punctuation marks, (c) writing of incomplete sentences, (d) inappropriate tenses, and (e) use of unconventional abbreviations. Similar problems were also identified by the primary EOC report.

A further look at NAPE data for past years (2006–08) reveals that these problems have persisted despite changes in the test format observed in the test blueprints (figures 2.34 and 2.35).

Figure 2.34  Summary of P3 Literacy Content Areas, Uganda, 2006–08
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–08.

Figure 2.35  Summary of P6 Literacy Content Areas, Uganda, 2006–08
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–08.

On the whole, there is general agreement that basic reading and writing in a second language is not easy and requires competencies in phonemic awareness, phonics, vocabulary development, fluency, and comprehension.
In their study of literacy practices in primary schools in Uganda, Muwanga et al. (2007) identified good practices that were typical of schools performing well in literacy. These included (a) explicitly stated objectives for developing a reading culture among learners, (b) an effective materials circulation system supported by a library, (c) staff capacity-development initiatives with a specific focus on reading, (d) a philosophy of rewarding pupils and staff members on the basis of their performance, (e) organized structures such as readers' clubs and debating clubs, (f) literacy teaching corners, (g) collections of reading materials made by teachers and pupils, and (h) special attention for those with special needs. These practices are, however, not very common in most primary schools in Uganda, which may largely explain the observed achievement levels.

Variations in Problematic Areas by Other Attributes

Considering the content areas for English literacy, the results in figure 2.36 clearly show definite variations by gender across content areas and grade levels. Girls perform better than boys in writing and grammar in all grades, but worse than boys in reading comprehension. The differences are not very large, though, and in some instances are not statistically significant, as indicated in the tables in appendix B.

Nevertheless, this systematic variation in performance between boys and girls across the content areas of English literacy clearly merits further attention. For example, is the content of reading comprehension classroom, homework, and textbook activities more interesting for boys than for girls? Or do girls devote more time to some areas (such as writing) and, as a result, spend less time on other literacy areas (such as reading)?
The results of the 2009 PISA, which reports average scores for 15-year-olds in reading, mathematics, and science, also show that—for all participating countries—girls outperform boys in the entire literacy area, including reading (OECD 2010). Hence, the emerging pattern for Uganda at the S2 level may not be consistent with the global pattern; although the margins are still very low, there is a need to further monitor this area.

Figure 2.36  Female-Male Difference in English Literacy Content Areas by Grade, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Comparisons between boys and girls are further analyzed within each of the content areas by subcontent in appendix B (table B.7), which shows even more fine-grained differences between boys and girls. For P3 reading comprehension in 2010, for example, boys have significant advantages in three areas: identifying, matching, and completing picture-words. By contrast, female achievement is not significantly higher in any of the subcontent areas in 2010.

Figure 2.37  Urban-Rural Difference in English Content Areas by Grade, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Figure 2.38  Urban-Rural Difference in English Content Areas, Uganda, 2006 and 2008
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–08.

Urban-rural comparisons for each of the three main content areas in English literacy are presented in figures 2.37 and 2.38. At the P3 level, the urban-rural
gap is fairly consistent across years and content areas, with the exception of writing in 2006 (which is very large). In P6, there is more variation. In the area of writing, the urban advantage in P6 steadily increases across the 2006–10 period. For reading comprehension there is some evidence of an increasing advantage for urban students, although this trend is not linear, as evidenced by the slightly smaller urban-rural gap in 2008. For grammar, the results are even more inconsistent over time in P6, but the important point remains that there are very large (and significant) gaps in achievement between these groups of students. Finally, the results for S2 confirm that the urban-rural difference—while significant—is much less substantial when compared with the other grades. There is also not much variation by content area; in fact, by 2010 the urban-rural gap was virtually identical in all three content areas.

Overall, comparisons by location provide few surprises. The very large gaps in P3 and P6—while alarming from an equity and policy perspective—are not surprising when compared to research and assessment results in other countries (also see the NAPE reports for Uganda). There is no simple explanation for this advantage. The reality is that even poor urban children tend to enjoy meaningful advantages over their rural counterparts in a number of areas relevant to education. The much smaller gap in S2 is somewhat surprising, especially given the size of the difference in P3 and P6. But when one takes into account who is being compared in each location, the result is not surprising. It is also potentially misleading. As more and more children in rural areas continue to lower secondary school, these results may change substantially. The detailed tables in appendix B provide additional comparisons of urban and rural test score differences in English literacy across all three grades.
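The urban-rural gaps discussed above are differences in mean percent-correct between the two groups of pupils, which can then be checked for statistical significance. As a minimal sketch (the scores and group sizes below are synthetic, not NAPE microdata), the gap and a Welch-style t statistic can be computed as:

```python
# Hedged illustration: these scores are invented, not actual NAPE data.
from math import sqrt
from statistics import mean, stdev

def gap_with_t(urban, rural):
    """Urban-minus-rural gap in percentage points and a Welch-style t statistic."""
    mu, mr = mean(urban), mean(rural)
    su, sr = stdev(urban), stdev(rural)
    se = sqrt(su**2 / len(urban) + sr**2 / len(rural))  # SE of the difference
    return mu - mr, (mu - mr) / se

# Synthetic percent-correct scores for two small groups of pupils.
urban = [55, 62, 48, 70, 66, 59, 73, 61]
rural = [34, 41, 28, 45, 39, 30, 47, 36]
gap, t = gap_with_t(urban, rural)
```

In practice, NAPE-style analyses would also account for the survey's sampling design, which this sketch ignores.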
Government-private school comparisons within English literacy content areas are presented in figure 2.39. In P3, the differences across the content areas range from about 15 percent to 30 percent. In P6, the gap is wider than in P3, ranging from close to 25 percent in reading comprehension to about 32 percent for grammar. In S2, the content area results are consistent with English literacy comparisons overall, and the results show that none of the differences are very large. Other comparisons between private and government schools are found in detailed tables in appendix B.

Figure 2.39  Government-Private School Difference in English Literacy Content Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Problem Areas in Numeracy

The numeracy content areas in primary education include measures and geometry, the number system, fractions, number patterns, and operations on numbers. Graphs as a learning area is added to this list at the P6 level, in addition to the five areas already listed. In S2, the numeracy content areas include numerical concepts, measures, statistics, set theory, numeracy, geometry, and functions. Of importance to note is that although numeracy has more content areas than does literacy, there are no subcontent areas within each subject to analyze, with the exception of operations on numbers, which includes addition, subtraction, multiplication, and division subcontent results. It is also important to mention here that for P3, the 2010 numeracy test blueprint refers only to individual item properties rather than content areas. This is a notable limitation because the results from 2010 are the most relevant for current policy discussions.
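Since the blueprint is what maps individual test items to content areas, the content-area averages reported throughout this chapter can be thought of as item-level scores aggregated through that mapping. A minimal sketch of the aggregation, using an invented blueprint and 0/1 item scores (not the actual NAPE items):

```python
# Hedged sketch: item ids and the blueprint mapping are hypothetical.
from collections import defaultdict

blueprint = {  # item id -> content area
    "q1": "measures and geometry", "q2": "number system",
    "q3": "fractions", "q4": "operations on numbers",
    "q5": "operations on numbers", "q6": "number patterns",
}

def area_averages(responses, blueprint):
    """Percent correct per content area, pooled over pupils' 0/1 item scores."""
    hits, counts = defaultdict(int), defaultdict(int)
    for pupil in responses:
        for item, score in pupil.items():
            area = blueprint[item]
            hits[area] += score
            counts[area] += 1
    return {a: 100.0 * hits[a] / counts[a] for a in counts}

pupils = [
    {"q1": 1, "q2": 1, "q3": 0, "q4": 1, "q5": 0, "q6": 0},
    {"q1": 1, "q2": 0, "q3": 0, "q4": 1, "q5": 1, "q6": 1},
]
avgs = area_averages(pupils, blueprint)
```

The P3 limitation noted above corresponds to a blueprint in which the mapping from items to content areas is unavailable, so only item-level properties can be summarized.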
But as mentioned earlier, the discussion and analysis for this particular grade will dwell more on the previous four years (2006–09) for meaningful summaries of achievement levels in the numeracy content and subcontent areas. Overall, P3 results show fairly stable averages in the 40–50 percent correct range across numeracy content areas for the years 2008 and 2009 (figure 2.40). Better performance for the two years was recorded in the measures and geometry content area, where learners' average performance was in the 50–60 percent range. Surprisingly, this content was the most difficult for P3 students in the earlier years of 2006 and 2007. It is, therefore, hard to believe that student achievement levels improved so dramatically in this area over a two-year period; the more likely explanation is that NAPE test difficulty levels in this content area could have been altered.

Figure 2.40  Summary of P3 Numeracy by Curriculum Area, Uganda, 2006–09
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

Of importance to note is that performance levels in the number operations subcontent area were relatively low (an average of 40 percent in 2008 and 2009), yet these operations form the core of mathematics. Learner improvements in this subcontent area deserve great attention to promote better math learning in upper grades. The averages for number patterns also fluctuated somewhat during this period. But if one focuses just on 2009, the averages for all five numeracy content areas vary by less than 20 percent between the highest- and lowest-scoring areas. This relatively similar performance has implications for identifying areas of relative strength and weakness. But the main point remains that the scores are significantly lower than desired or expected.
Consistent with the average score results for P3, figure 2.41 shows that measures and geometry registered the highest proportion of high-scoring pupils (68–100 percent), while number patterns and fractions—together with the number system subcontent areas—had the highest proportions of low-scoring students. For P3 (figure 2.42), the results for the four subcontent areas of number operations are generally consistent with the simple average results. One interesting finding is that division has the highest proportion of high-scoring students (68–100 percent). But the overall average for this operation is lower than for addition because addition has significantly fewer students scoring in the lowest category (0–25 percent). This finding suggests that in the area of division, P3 students are likely to fall into one of two categories: those who are very comfortable with the content, and those who are struggling a great deal and need to bridge this gap.

Figure 2.41  Summary of P3 Numeracy Content Areas by Proficiency Level, Uganda, 2009
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009.

Figure 2.42  Detailed Summary of P6 Numeracy by Subcontent Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Students in P6 performed the worst on the geometry and measures subcontent areas, which is contrary to the pattern observed in P3 (figure 2.42).
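The proficiency bands used in figures such as 2.41 and 2.43 group pupils by percent correct into four categories (0–25, 26–50, 51–67, and 68–100 percent). A small sketch of that grouping, using made-up scores rather than actual assessment data:

```python
# Cut points follow the four NAPE proficiency bands cited in the text.
def band(pct):
    """Map a percent-correct score to its proficiency band."""
    if pct <= 25:
        return "0-25%"
    if pct <= 50:
        return "26-50%"
    if pct <= 67:
        return "51-67%"
    return "68-100%"

def band_shares(scores):
    """Share of pupils (in percent) falling in each proficiency band."""
    n = len(scores)
    shares = {b: 0.0 for b in ("0-25%", "26-50%", "51-67%", "68-100%")}
    for s in scores:
        shares[band(s)] += 100.0 / n
    return shares

# Hypothetical percent-correct scores for eight pupils.
shares = band_shares([10, 22, 40, 55, 70, 90, 30, 15])
```

This grouping makes the bimodal pattern described for division visible: a distribution can have a large 68–100 percent share and a large 0–25 percent share at the same time, even when its simple average is unremarkable.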
The numeracy content area with the best student performance according to the 2010 NAPE results is operations on numbers (with an average of about 70 percent correct), followed by the number system (an average of 60 percent correct), which is the more likely performance pattern in mathematics. Operations on numbers include the basic mathematics areas of addition, multiplication, division, and subtraction. The fact that P6 students are scoring relatively well in this area is not surprising because learners normally find number operations, together with the number system, simpler than other concepts, although in some areas (such as division) there are still significant numbers of students who are struggling.

Of importance to note is the very large variation between the highest and lowest averages for each content area. The gap between the highest-scoring area of operations on numbers and the lowest-scoring area of geometry in 2010 is nearly 40 percentage points, which points to inequitable learning within the various subcontent areas of numeracy. With the exception of operations on numbers, the number system, and place value, no other results in the numeracy content areas surpass 45 percent correct in P6, with the probable additional exception of graphs, which is close to this margin. The correct averages for fractions and number patterns hover around 30–40 percent, while for geometry the average is about 20 percent. Averages in the 20–30 percent range suggest that students do not have a basic grasp of the content that is being covered. The averages further confirm that overall achievement in numeracy in this grade is far below what is expected. There is also a possibility that this content is not even being introduced in (or prior to) P6, or if it is being taught, then the depth and quality of instruction could be severely lacking.
Numeracy proficiency levels by numeracy content areas are provided in figure 2.43; these findings confirm the earlier findings on the average performance of P6 learners. Although the highest average scores in the numeracy content areas in P6 were recorded under operations on numbers, the highest proportion of best performers (68–100 percent) was in the number system or place values content area, with operations on numbers coming second. Fractions, number patterns, measures, and (most important) geometry still emerge as key areas with which most students in P6 struggle—four out of every five students do not score above 50 percent, meaning they are not comfortable with half of the examination content in this area.

Figure 2.43  Summary of P6 Numeracy Subcontent Areas by Proficiency Level, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Results for the earlier assessment years for P6 are consistent with such findings (see figure 2.44), and there is some evidence that student achievement levels in numeracy at P6 are improving steadily. Nevertheless, more work remains, with a greater focus on the weak areas indicated here, in order to ensure better performance in the future.

Figure 2.45 summarizes results by numeracy content area for S2 students for the years 2008–10. (As previously noted, there were no observed changes in the test blueprint.) What cannot be missed is the fact that performance levels point to a declining trend within the curriculum areas of numerical concepts, statistics, and functions, notwithstanding NAPE data issues and the challenges NAPE data present for trend analyses.
There are, however, steady improvements in performance at this level within the set theory content area. The second key finding relates to the fact that, in 2010, average performance above the 50 percent mark was recorded only in the content areas of set theory (with an average of about 55 percent) and measures (a roughly 52 percent average). These were still the areas with the best student performance in 2010, although the pattern has not been consistent over the past two years. Nevertheless, the content area with the worst student performance, where students are struggling most, is functions, with an average of only 20 percent correct in the past two years. Other content areas with low scores include statistics, numeracy, and geometry, with averages in the 30–40 percent range. The gap between the content area with the best performance in S2 in 2010 (numerical concepts) and the worst performance (functions) is 25 percentage points.

Figure 2.44  Detailed Summary of P6 Numeracy by Subcontent Areas, Uganda, 2006–09
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

Figure 2.45  Summary of S2 Numeracy by Content Areas, Uganda, 2008–10
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Figure 2.46 concludes with S2 numeracy proficiency levels that are based on 2010 data. The results are similar to those for P6.
In some areas—namely set theory, measures, and, to a lesser degree, numerical concepts—sizeable proportions of students are scoring at the two highest levels. This is not to say that achievement levels are sufficiently high, because earlier findings clearly demonstrate that work remains across the numeracy curriculum in S2. Rather, the areas with good performance are those for which a good base of knowledge exists. At the other extreme in S2 numeracy are the areas of functions and transformations, statistics, and—to some extent—geometry. These are not surprising results, because those areas contain content that is generally considered the most difficult for students at this level. For example, in functions and transformations, nearly 60 percent of the S2 sample scored between 0 percent and 25 percent correct on the exam, and less than 10 percent got more than half of the test questions correct.

Figure 2.46  Summary of S2 Numeracy Subcontent Areas by Proficiency Level, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

The EOC reports also provide useful comments about numeracy achievement in Uganda. Problematic primary content areas that were identified in the EOC mathematics curriculum reports between 2000 and 2010 include the following: construction/geometry, area and volume, sets, angles, substitution, place values, fractions, probability, equivalents, graphs, equations, numerals, percentages and/or interest, bases, sequences, magic squares, and time. The most pronounced area of difficulty, which seems to recur nearly every year, was construction/geometry.
Variations in Problematic Areas by Gender

The emerging pattern, based on P3 and largely on P6 data for 2010, is that the gender gap is narrowest in the content areas with the poorest student performance, which may simply reflect the fact that both girls and boys are struggling equally with these subjects, while score differences are widest in the content areas with the best performance. Fractions and geometry register a difference of about 2.4 percentage points in favor of boys, while other areas that did not register the same degree of poor performance on the whole actually have percentage differences that are much higher—roughly 5 percent, which is equivalent to about 0.35 standard deviations (figures 2.47 and 2.48).

Figure 2.47  Female-Male Differences in P3 Numeracy Content Areas, Uganda, 2009
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009.

For S2, results show that the largest gender achievement gaps are in the areas of measures and geometry (figure 2.49). These very large differences suggest that boys are much more comfortable with this content than are girls. In other areas, the differences—while still significant—are not as dramatic for this grade.

Research on teaching and learning mathematics also identifies curriculum areas, such as functions and geometry, as the most difficult for both teachers and learners. For example, Eisenberg (1991) argues that the functions concept is one of the most difficult concepts to master in the learning of school mathematics, which he attributes to the symbolic notation that is often used for this subject area.
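The conversion quoted above (a gap of roughly 5 percentage points equated to about 0.35 standard deviations) implies a score standard deviation of roughly 14 points. As a back-of-envelope check, with both the gap and the standard deviation treated as assumptions rather than values taken from the NAPE files:

```python
# Illustrative arithmetic only; 5.0 and 14.3 are assumed values.
def effect_size(gap_pp, sd_pp):
    """Mean difference expressed in standard-deviation units (Cohen's d)."""
    return gap_pp / sd_pp

d = effect_size(5.0, 14.3)     # a 5-point gap against an assumed 14.3-point SD
implied_sd = 5.0 / 0.35        # SD implied by the quoted 0.35-SD figure
```

Expressing gaps in standard-deviation units makes them comparable across tests with different scales and difficulty levels, which is why assessment reports often provide both.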
The difficulty is also related to movement between the dependent and independent variables, which in itself is an advanced idea because the dependent variable could be continuously changing relative to the independent variable. Research conducted in Nigeria (Telima 2011) also revealed that geometry was difficult not only for learners but also for teachers at the secondary level, resulting in mass failure rates in examinations. Core topics in geometry that pose the greatest challenges were reported as plane and solid shapes and measurement aspects; polygons; geometrical ratio; geometrical transformation; and latitude and longitude.

Figure 2.48  Female-Male Differences in P6 Numeracy Content Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Figure 2.49  Female-Male Differences in S2 Numeracy Content Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Low learning achievement within subcontent areas has been reported to be caused by learning difficulties that are likely to be further compounded by poor pedagogical approaches. For example, the symbolic notation and variable movement aspects within the functions subcontent area have been identified by Eisenberg (1991) as rendering this area problematic from both the teachers' and the learners' side.
This finding calls for a complex teaching-learning process that first develops and then deepens the understanding of "variables" as a mathematical concept before the introduction of "functions." Dubinsky (1991) further suggests that an important way of understanding the concept of a function is to construct a process that relates to the function's process, which would include the use of visual representations in the form of arrow diagrams, tables, input-output boxes, and more. Widmer and Sheffield (1994), however, emphasize that the introduction of manipulatives such as calculators at the school level goes a long way toward addressing the learning difficulties associated with the functions concept. The implication here is that poor instructional methods and students' lack of calculators may greatly affect achievement in this learning area. The degree to which these concerns prevail in the Ugandan classroom is, however, not known. Results in chapter 3 about teacher competencies may contribute to greater understanding of this aspect.

With regard to geometry, Pyke (2003), for example, notes that learners' use of symbols, words, and diagrams to communicate mathematical ideas is another source of challenge. The teaching implication is that before students are required to use or manipulate trigonometric expressions, the meanings of symbols, diagrams, or letters as they appear in different contexts must be well established by the teacher. In her study of schools in Virginia, the United States, Lenhart (2010) established that there is a positive correlation between teachers' pedagogical content knowledge and students' learning performance in geometry and measurement.
In her view, content knowledge embraces knowledge of (a) subject-specific difficulties and misconceptions; (b) representations of the content; (c) developmental levels; (d) connections among "big math ideas"; and (e) the appropriateness of students' proofs, justifications, and mathematical discourse.

Cognitive difficulties that generate learning difficulties from the learners' side feature in an earlier work by Sfard and Linchevski (1994), which focused on geometry and algebra. These included a deletion error (illustrated when students simplify an expression as the result of the overgeneralization of certain mathematically valid operations) and nonuse of the conventions of algebraic syntax, resulting from learners' failure to master and appreciate symbolism in mathematics. In their study of the sources of mathematical difficulties in Turkey, Bingolbali et al. (2011) captured views from a sample of pre- and in-service teachers, who cited a number of epistemological and psychological causes. Epistemological factors included students' difficulties with the concepts because of either their abstract nature or their failure to link mathematics with real-life situations. Psychological factors, conversely, include learners' lack of prior knowledge, negative attitudes and prejudice, and lack of motivation, interest, and/or self-confidence.

Problem Areas Concerning Operations on Numbers

The only content area with subcontent curriculum areas at the primary level is operations on numbers. Figure 2.50 indicates that P3 students consistently scored the highest in the area of addition, although in no year was the average above 55 percent correct. Although subtraction is expected to be simpler than multiplication and division when one considers levels of difficulty, the results show that the averages in subtraction are near (or below) 30 percent in all four years.
In the other two areas of multiplication and division, the results show substantial improvements—most especially in 2008 and 2009, when compared to 2006 and 2007. For P6 (figure 2.51), addition was again the area with the best student performance, with averages approaching 90 percent. This is followed by subtraction (74 percent), multiplication (68 percent), and division (41 percent), which is the order of difficulty that would be expected. The averages for division are, however, significantly lower than in the other areas.

Figure 2.50  Summary of P3 Numeracy in Operations on Numbers, Uganda, 2006–09
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

Figure 2.51  Summary of P6 Operations on Numbers Subcontent Area, Uganda, 2010 (percent correct: addition 89.6, subtraction 73.8, multiplication 68.0, division 40.5, operations average 68.0)
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Figure 2.52  Summary of P6 Operations on Numbers Subcontent Area, Uganda, 2006–09
Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.

Data for the earlier years further confirm this pattern (figure 2.52), although there are noticeable improvements in the four areas between 2008 and 2009. Students' scores still indicate that performance in the division subcontent area of number operations has been a struggle; average levels were below the 30 percent mark for the period 2006/07.
The remarkable improvements in this subcontent area could have been due to reduced levels of test difficulty, which cannot be ascertained by this analytical report.

A comparison shows that overall averages in operations on numbers for P6 are considerably higher than for P3. In other words, the average P6 student is relatively comfortable using these basic building blocks in numeracy, while in P3 the very low scores are a source of concern. Given the importance of the subcontent area of operations on numbers to the bigger learning area of numeracy, further analysis of the female-male differences was undertaken for P3 and P6 using the most recent data only (2009 for P3 and 2010 for P6). It is surprising to note that the female-male differential was so significant in mathematics in P3. In P6, the pattern is consistent with the already-observed performance pattern. The female-male difference is much wider in the problematic areas of multiplication and division (an average of 7 percent) than it is for addition and subtraction, although the gap between the latter and the former is also double (figures 2.53 and 2.54).

Overall, comparisons by gender in numeracy provide additional insight into the challenges that face educators and policymakers in Uganda. Unlike English literacy, where the differences (favoring girls) were relatively small, three results in numeracy are troubling. First, the size of the gender gap is not negligible, and it suggests that boys are far ahead of girls in some areas. Second, the gaps are larger in higher grades, at least when comparing P6 and S2 against P3. And third, the differences do not appear to be closing over time, at least not at a significant pace. As noted throughout this chapter, these three results are not uncommon in assessment studies in the developing or developed world.
At the very least, they should serve as a catalyst for additional research into the underlying causes of such differences, together with research into classroom activities that can help close the gaps in geometry, angles, sets, and substitution. Finally, all the areas mentioned as having problems at the primary level were also identified at the secondary level.

Figure 2.53  Female-Male Differences in P3 Operations Subcontent Areas, Uganda, 2009
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Figure 2.54  Female-Male Differences in P6 Operations Subcontent Areas, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Figure 2.55  Urban-Rural Difference in Operations on Numbers in P3, Uganda, 2009
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009.

Comparisons between rural and urban students in the four areas that make up operations on numbers for P3 and P6 are presented in figures 2.55 and 2.56. The significance of the differences at these levels is that they relate to the central basic mathematical operations needed to prepare young people for more advanced mathematics curricula. In P3, the residential gaps are generally largest in the easiest content areas (namely addition and subtraction) and
are smaller in multiplication and division. At P6, the pattern is not as clear, with subtraction registering the widest gap, followed by division and then multiplication.

Figure 2.56  Urban-Rural Difference in Operations on Numbers in P6, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.

Figure 2.57  Private-Government School Difference in Operations on Numbers in P3, Uganda, 2009
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009.

Government versus private school differentials in operations on numbers for P3 in 2009 and for P6 in 2010 were also analyzed, as presented in figures 2.57 and 2.58. For P3, the gaps are wider (close to the 25 percent margin) for the three areas of addition, subtraction, and multiplication. However, the gaps narrow for division (about 12 percent), which indicates that both public and private students struggle more in division than in the other three areas. For P6, however, the gap between government and private schools increases as one moves from the simpler to the more complex number operations (addition to division). Table C.12 in appendix C shows that the government school average in division increased somewhat during the 2006–09 period, while the private school average in this period steadily decreased.

Figure 2.58  Private-Government School Difference in Operations on Numbers in P6, Uganda, 2009
Source: National Assessment of Progress in Education (NAPE), Uganda, 2009.

Problem Areas in Biology

Decomposition of student performance in biology by subcontent areas was done to establish the problematic curriculum areas in this subject that deserve attention by the education system.
Going by the 2010 results, students scored above the 50 percent correct mark only in the insects content area; it is also in this area that 30 percent of students scored in the highest performance category (68–100 percent), as shown in figures 2.59 and 2.60. Students are struggling with all the remaining subcontent areas, with the worst performance reported in the subcontent area of soil, with an average estimated score of 15 percent. Most worrying is that about 85 percent of S2 students scored not more than 25 percent in this subcontent area. Other weak subcontent areas included plant structures and diversity of living things (25 percent), together with microscopes and lenses (30 percent). There is, however, a slight difference between proficiency levels. Although 54 percent of S2 students scored in the lowest category (0–25 percent) in the plant structures subcontent area, the proportion that scored in this category in diversity of living things was 38 percent.

As has been the case with other learning areas, it is evident that there is a very wide gap between the subcontent areas of the S2 biology curriculum with the best and worst student performance. Although students could perform at 54 percent correct in the subcontent area of insects, the same students could achieve only 15 percent correct in the subcontent area of soil.

Figure 2.59  Summary of S2 Biology Subcontent Areas, Uganda, 2008–10 (microscopes and lenses, insects, diversity of living things, soil, plant structures, biology overall)
Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.

Figure 2.60  Summary of S2 Biology Subcontent Areas by Proficiency Level, Uganda, 2010
Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.
Inequitable performance within content areas undermines learners' overall mastery of requisite skills for broader curriculum areas. Further analysis by additional attributes was not conducted because of the similarity of findings in 2008 and 2010, as reflected in the comparative analysis presented in the "overall achievement levels" section for biology.

In conclusion, low but improving test score averages are a common feature of primary education systems in developing countries. The fact that so many primary 3 and 6 children are not able to answer a majority of questions drawn from the official curriculum is troubling, but this result needs to be considered in a larger context. Uganda is one of the poorest countries in the world (its latest Human Development Index ranking was 143 out of 169 countries). However, most Ugandan children are now reaching grade 3 and increasingly moving toward grade 6. The Net Enrollment Ratio was 96.1 percent in 2010, and the dropout rate 4.4 percent. The student survival rate to grade 5 is 62 percent; the completion rate is 54 percent. This combination of high poverty and high participation in primary education (especially in grade 3) puts substantial pressure on the system, particularly in terms of educational quality. The existence of the same kinds of problems year after year suggests that teachers either are unaware of the deficiencies in student achievement revealed by their own assessments, or do not receive adequate feedback from sources such as NAPE and EOC reports. It is also possible that they are generally aware of the issues but are not provided with the necessary support to address them (hence the recurrence), or that the mechanisms for ensuring improvements are beyond the reach of these teachers.

Note

1. SACMEQ countries include Botswana, Kenya, Lesotho, Malawi, Mauritius,
Mozambique, Namibia, Seychelles, South Africa, Swaziland, Tanzania (Mainland), Uganda, Zambia, Zanzibar, and Zimbabwe.

CHAPTER 3

Teacher Knowledge and Effectiveness

Teachers are the crucial players in the teaching-learning process and key determinants of learning outcomes. Uganda's efforts to improve the quality of education have prioritized teacher development and allocation, although the results of these efforts have yet to be realized. Approximately 90 percent of teachers in public primary schools are qualified. The pupil-to-teacher ratio (PTR) has also improved from a national average of 1:85 in the early 2000s to the current 1:57, notwithstanding wide differentials across and within districts, as well as within schools at the lower and primary education levels. Current efforts undertaken by the government of Uganda are meant to address these concerns and make the approach to their resolution consistent with the Joint Budget Support Framework (JBSF). The performance levels for learners discussed in chapter 2 raise a number of questions regarding teacher competency in Uganda, but little evidence exists to guide strategic efforts in this area. Scientific evidence shows that teacher content knowledge influences how teachers engage students with a given subject matter (see Fennema et al. 1993; Spillane 2000) and how they evaluate and use instructional materials (Manouchehri and Goodman 2000; Sherin 2002). That knowledge is most important when related to what students learn (Hill, Rowan, and Ball 2005). This chapter of the report attempts to provide insights into teachers' content knowledge in Uganda and the extent to which it predicts learning outcomes. The focus is on teacher achievement levels in the same P3, P6, and S2 grades. The report analyzes tests that are similar to those taken by students in the 2011 National Assessment of Progress in Education (NAPE) cycle.
This work is guided by three overarching research objectives: (a) providing a descriptive overview of teacher knowledge levels in the same curriculum areas covered by the student exams, by grade and subject matter, in a bid to answer one of the most discussed issues in Uganda: the competency levels of teachers; (b) understanding more about the relationship between teacher background and subject-matter knowledge; and (c) analyzing the relationship between teacher background (including subject knowledge) and student achievement level by grade, subject, and curriculum content area. Existing literature about Uganda in regard to teacher content knowledge and its relation to student achievement levels draws on the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) testing project in Uganda and other countries in the region. The results are for grade 6 only, and this report reviews data only through the 2005 testing. But, in general, two results stand out for Uganda. First, primary school P6 teachers have much higher achievement levels than their students on the basis of the SACMEQ tests, which combine P6-level content with higher-level test questions and use item response theory (IRT) techniques to equate scores across nonstandardized test booklets. Second, according to SACMEQ test data, there is not much of a relationship between teachers' knowledge of the mathematics curriculum and students' achievement level in P6. This result is more of a surprise, although it is important to restate that subject-matter knowledge is likely a necessary but insufficient component of teaching effectiveness. Also, given the nonrandom distribution of students and teachers, this relationship can require a very powerful research design to be fully tested.
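The 500/100 IRT metric referred to above can be illustrated with a minimal sketch. This is not SACMEQ's or NAPE's actual scaling pipeline, which involves IRT ability estimation and equating across booklets; the sketch only shows the final linear transformation to a mean of 500 and a standard deviation of 100, applied to a few illustrative ability values.

```python
# Illustrative sketch (assumption: abilities have already been estimated
# by an IRT model on an arbitrary scale; only the rescaling is shown).
def rescale_500_100(thetas):
    """Linearly transform scores to mean 500, standard deviation 100."""
    n = len(thetas)
    mean = sum(thetas) / n
    var = sum((t - mean) ** 2 for t in thetas) / n  # population variance
    sd = var ** 0.5
    return [500 + 100 * (t - mean) / sd for t in thetas]

scaled = rescale_500_100([-1.2, -0.4, 0.0, 0.5, 1.1])
print(round(sum(scaled) / len(scaled)))  # mean is 500 by construction
```

Because the transformation is linear, score gaps expressed in standard-deviation units are unchanged by the rescaling.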
In sum, there is very good justification for assessing teacher subject-matter knowledge to inform improvements in teacher training policy, as well as to help fill existing gaps in knowledge. Among those gaps is the lack of insight into the kinds of investigations needed to achieve a deeper understanding of how to improve the quality of Uganda's education system. At the same time, it is important not to focus solely on this particular characteristic of teachers but to include additional measures of teacher background, such as education, training, and experience. The evidence to date in Uganda—which is not extensive and comes mainly from P6—provides some tentative support in this area. This chapter continues the preceding analysis using teacher (and student) test scores from the most recent 2011 NAPE assessment, then extends the analysis to other grades and subjects. The general research themes on teachers focus on questions consistent with the overall objectives of this analytical work: how do student and teacher achievement levels compare when their answers to identical sets of questions are examined? As explained, this question can be answered only for P6 and S2 (separately). Consistent with previous research, it is expected that P6 teachers would score considerably higher than their P6 students. For S2, the use of subject-matter specialists would also predict higher teacher achievement levels, but there is less previous research to build on. However, in both cases, it is useful to deepen the analysis by comparing achievement levels in specific content areas. Additional research questions explored in this chapter include (a) what kinds of factors predict higher teacher achievement levels? and (b) how does teacher knowledge predict variation in student knowledge?
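Research question (b), how teacher knowledge predicts variation in student knowledge, is typically examined by regressing student scores on teacher scores. The sketch below uses entirely synthetic data and a simple bivariate least-squares fit; it is illustrative only and is not the model estimated in this report, which also controls for other teacher background measures.

```python
import random

# Synthetic data (assumption: all values are invented for illustration).
random.seed(1)
n = 500
teacher = [random.gauss(84, 12) for _ in range(n)]   # teacher % correct
student = [20 + 0.25 * t + random.gauss(0, 15) for t in teacher]

# Bivariate least squares: slope = cov(x, y) / var(x).
mx = sum(teacher) / n
my = sum(student) / n
cov = sum((x - mx) * (y - my) for x, y in zip(teacher, student)) / n
var = sum((x - mx) ** 2 for x in teacher) / n
slope = cov / var
intercept = my - slope * mx
print(round(slope, 2))  # estimate of the teacher-knowledge coefficient
```

With nonrandom sorting of students and teachers across schools, such a naive fit would be biased, which is the "powerful research design" caveat raised above.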
Primary Student and Teacher Achievement Levels in Numeracy

Primary teachers have high content knowledge of numeracy, with the combined overall performance of P3 and P6 teachers rated at 84 percent. These results are presented in table 3.1, which summarizes numeracy achievement levels for P6 students together with P3 and P6 teachers (jointly in the second column and separately in the third and fourth columns). The performance of teachers by subcontent area indicates that their strongest content knowledge is in number operations, with 94 percent of the P6 test items in this area answered correctly, followed by the number system (89 percent).

Table 3.1  Detailed Summary of P6 Student and P3–P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011

Achievement area                      P6 students   All P3–P6 teachers   P3 only   P6 only
Overall percent correct               42.1          83.5**               81.0      85.8**
                                      (18.0)        (12.8)               (13.5)    (11.7)
IRT-scaled score                      485.7         699.6**              681.2     716.6**
                                      (84.8)        (80.9)               (81.7)    (76.4)
By content area
Operations on numbers
  Addition (6)                        85.5          96.3**               95.9      96.6
  Subtraction (6)                     70.7          95.3**               94.6      95.9*
  Multiplication (6)                  65.4          95.7**               95.4      95.9
  Division (5)                        45.0          95.3**               92.9      95.6**
  Symbols/brackets (3)                52.9          83.1**               82.7      83.4
  Overall operations (24)             65.9          94.0**               93.4      94.6**
Number system/place value (10)        60.2          88.6**               86.8      90.2**
Number pattern-seq. (14)              35.2          71.1**               67.2      74.7**
Measures (18)                         29.5          88.5**               85.9      90.9**
Graphs-stats (8)                      38.9          81.4**               79.8      82.8**
Fractions
  Drawing-writing (3)                 68.2          92.3**               91.9      92.7
  Addition (5)                        49.2          90.3**               88.7      91.9**
  Subtraction (5)                     49.9          89.7**               88.2      91.1**
  Multiplication (4)                  44.9          84.6**               80.5      88.6**
  Division (4)                        15.2          75.9**               69.0      82.3**
  Operations (7)                      18.9          76.9**               73.7      79.8**
  Overall fractions (28)              38.3          84.2**               81.2      86.9**
Geometry (12)                         16.8          70.3**               65.4      75.8**
Sample size                           23,944        1,714                821       893

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variables for content area results are percentages correct between 0 percent and 100 percent. For each content area, the number of possible points or questions is included in parentheses. Standard deviations are presented only for the overall scores (top part). Asterisks denote teacher achievement averages that are significantly different from student levels (all comparisons are significant). ** = significant at 0.05 level; * = significant at 0.10 level.

The subcontent area with the worst teacher performance was geometry (70 percent), which indicates that some work still needs to be done in this subcontent area. Teachers' performance is significantly higher than that of their learners on similar test items drawn from the same curriculum. On average, P6 teachers were able to answer about 86 percent of the P6 student numeracy test items correctly, compared to only 42 percent for P6 learners. This finding means that the average P6 primary teacher in Uganda scores roughly twice as high as the average P6 student. That teachers score higher than their pupils is an expected pattern and reaffirms a basic condition of effective teaching (familiarity with the content being taught). However, such a wide margin between the two points to low teacher effectiveness.
The observed differential translates into about two full standard deviations, although the gap is not as wide when one uses an IRT-scaled score. The results are consistent with earlier work by Kasirye (2009) based on 2005 SACMEQ data for P6 using the IRT 500/100 score; however, the gap in the 2011 data used for this study was much wider (closer to three standard deviations). Teachers and learners share the same areas of relative strength and weakness in the numeracy subcontent areas, with the only difference being that the lowest average for teachers is still above 70 percent, while the problematic subcontent areas for learners register below 40 percent (table 3.1 and figure 3.1). For either group, performance is highest in number operations and the number system or place values, and lowest in geometry, followed by measures, number patterns, and fractions. It is important to note that the proficiency gap between teachers and learners is even greater in the numeracy subcontent areas with the worst performance. For example, in geometry, P6 teachers' performance (76 percent) was more than 4 times higher than that of learners (17 percent); even for measures, P6 teachers scored 91 percent compared to only 30 percent for their learners. A comparison of P3 and P6 teachers shows that the latter performed significantly better than the former.

Figure 3.1  Summary of P6 Student and P3–P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011
[Bar chart of percent correct for P6 students, P3 teachers, and P6 teachers across overall numeracy, operations, number system, graphs-stats, fractions, number patterns, measures, and geometry]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
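The standard-deviation comparisons used throughout this chapter can be reproduced directly from the summary statistics in table 3.1. A small check, expressing the teacher-student gap in student standard-deviation units (the choice of denominator is our assumption; the report does not state which standard deviation it uses):

```python
# Values copied from table 3.1; the gap is divided by the student SD.
def gap_in_sd_units(teacher_mean, student_mean, student_sd):
    return (teacher_mean - student_mean) / student_sd

pct = gap_in_sd_units(83.5, 42.1, 18.0)    # percent-correct metric
irt = gap_in_sd_units(699.6, 485.7, 84.8)  # IRT 500/100 metric
print(round(pct, 2), round(irt, 2))  # → 2.3 2.52
```

Pooling the student and teacher standard deviations instead would give somewhat larger values, which may explain the "closer to three" characterization.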
The difference amounts to about one-third of a standard deviation for the simple percentage correct comparison (81 versus 86 percent) and almost one-half of a standard deviation using the IRT-scaled score (681 versus 716 points). As mentioned earlier, these results are fairly compressed, so differences by grade are not enormous. But there is a clear advantage for P6 teachers. Not surprisingly, the P6 teacher advantage tends to be largest in the more demanding areas of the numeracy curriculum (geometry, measures, and fractions) and smaller in areas such as operations. As indicated in the multivariate analysis, P6 teachers are more educated, have more experience, and have a higher level of preservice training than their P3 counterparts, which would predict higher levels of overall mathematics knowledge. Another important factor is that P6 teachers are more familiar with the P6 curriculum because they work with it on a more regular basis than do P3 teachers. One way to deepen the analysis of teacher achievement levels is to split teachers into four groups: P3 numeracy, P3 literacy, P6 numeracy, and P6 literacy, consistent with the box-and-whiskers plots presented in figure 3.2 (which, in turn, are based on the IRT scores for the respective groups). Results indicate that P6 numeracy teachers have the highest average achievement and substantially less variation within their category, which is not surprising because they are more exposed to the P6 curriculum in their day-to-day classroom work with learners.

Figure 3.2  Summary of Primary Teacher Numeracy Scores, by Grade and Subject Specialty, Uganda NAPE, 2011
[Box-and-whisker plots of IRT-scaled numeracy scores for P3 numeracy, P3 literacy, P6 numeracy, and P6 literacy teachers]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
Their score is significantly higher than those of other teachers, with an overall advantage of about 0.75 of a standard deviation, which is certainly a significant difference. This finding is, however, also unsurprising and shows the importance of regular exposure to the curriculum in enhancing teachers' content knowledge. (Table D.1 in appendix D provides a more detailed summary of comparisons of teacher achievement levels across teacher specialty categories.) It is important to identify groups of low- and high-scoring students and teachers, as indicated in figure 3.3. For students, only about 5 percent of the P6 sample scored below 15 percent correct on the numeracy test. But relatively large groups of P6 students had results in the 15–25 percent and 25–35 percent ranges; clearly, these three groups are the students who are struggling the most. At the other extreme, only about 5 percent of P6 students scored at 75 percent or above, with very few in the top category (90–100 percent). For teachers the results are very different. Most important, a very small percentage of P3–P6 teachers scored below 50 percent on the P6 numeracy exam. A significant group (about 18 percent) scored in the 50–75 percent range; these are the teachers who appear to have the most difficulty in this area. The remaining teachers are concentrated in the top two groups, with nearly 40 percent scoring in the 90–100 percent range, although the ideal performance level expected of teachers serving in any education system is 100 percent.

Figure 3.3  Summary of P3–P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011
[Bar chart of percent correct for P3 numeracy, P6 numeracy, P3 literacy, and P6 literacy teachers across overall numeracy, operations, number system, measures, graphs-stats, number patterns, fractions, and geometry]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
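The frequency categories used here and in the figures that follow can be written as a simple binning rule. The band edges are taken from the figure labels (0–15, 15–25, 25–35, 35–50, 50–75, 75–90, 90–100 percent correct); how boundary scores are assigned is our assumption, since the report does not specify it.

```python
# Assign a percent-correct score to one of the report's frequency bands.
# Assumption: boundary values fall in the higher band (e.g., 25 -> 25-35).
def category(pct):
    """Return the label of the band containing a percent-correct score."""
    edges = [(15, "0-15"), (25, "15-25"), (35, "25-35"), (50, "35-50"),
             (75, "50-75"), (90, "75-90"), (101, "90-100")]
    for upper, label in edges:
        if pct < upper:
            return label
    raise ValueError("score outside 0-100")

print(category(42.1), category(85.8))  # → 35-50 75-90
```

Applying this rule to each examinee's score and tabulating the labels yields the distributions reported for figure 3.3.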
Table 3.2 presents the results for S2 numeracy for students and teachers. As noted earlier, students and teachers in S2 have three numeracy results to summarize. This summary begins with the test items that were created by NAPE and drawn strictly from the S2 curriculum. In addition, 24 questions drawing on international tests were added to the test papers of students and numeracy teachers by the World Bank; these were labeled "supplemental" in the test.1 Taken together, these items provide an overall numeracy score. The results show that students were able to answer only 35 percent of the S2 curriculum test questions correctly. Their scores were marginally lower on the supplemental items, hence the overall student average of 34 percent. As expected, S2 mathematics teachers scored significantly higher than did S2 students. It is important to note that the achievement gap between S2 students and teachers is substantially larger than that between P6 students and teachers. For both the percentage averages and the IRT-scaled score, the difference between students and teachers is about two full standard deviations, which is a very large gap. This variance could be attributed to the fact that secondary school teachers in Uganda—as is mostly the case elsewhere—are subject specialists and, hence, have some specialized training in the area that they are teaching.
It could thus be expected that they would be more familiar with the curriculum for which they are responsible. Nevertheless, there is still room for improvement for Ugandan secondary school teachers: because the ideal score is 100 percent, 83 percent should not be considered sufficient.

Table 3.2  Detailed Summary of S2 Student and Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011

Achievement area                        S2 students   S2 teachers
Overall percent correct
  Student test items (45)               34.5          83.0**
                                        (15.2)        (8.1)
  Supplemental items (24)               32.0          77.0**
                                        (14.3)        (16.0)
  All test items (69)                   34.0          81.9**
                                        (14.1)        (8.5)
IRT-scaled score (all items)            490.6         788.1**
                                        (80.6)        (62.1)
By content area
Set theory, relations-mapping (9)       31.0          71.9**
Numerical concepts (26)                 50.2          86.0**
Cartesian coordinates-graphs (15)       17.7          76.2**
Measures (16)                           41.6          92.8**
Geometry (15)                           33.6          70.2**
Transformations and functions (12)      24.5          92.7**
Statistics (12)                         28.3          87.4**
Sample size                             19,790        507

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variables for content area results are percentages correct between 0 percent and 100 percent. For each content area, the number of possible points or questions is included in parentheses. Standard deviations are presented only for the overall scores. ** = significant at 0.05 level.

Results in table 3.2 and figure 3.4 further demonstrate the importance of summarizing results by content area in order to have a more complete diagnosis of the challenges facing the system. If one uses 50 percent correct as the minimum average that students should aim for on any test item, then the S2 numeracy results by subcontent area show that students are struggling with mathematics: all their results were below the 50 percent mark with the exception of numerical concepts, where they answered 50 percent of the test items correctly. The lowest scores were in Cartesian coordinates/graphs (18 percent).
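The significance stars in table 3.2 can be sanity-checked from the summary statistics alone. The sketch below computes a Welch-type t statistic for the overall student-teacher gap (all test items); NAPE's actual testing procedure may differ (for example, it may account for the clustered sample design), so this is only an illustrative calculation.

```python
import math

# Welch's t from summary statistics: means, SDs, and sample sizes
# copied from table 3.2 (overall S2 numeracy, all 69 test items).
def welch_t(m1, sd1, n1, m2, sd2, n2):
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (m1 - m2) / se

t = welch_t(81.9, 8.5, 507, 34.0, 14.1, 19790)  # teachers vs. students
print(round(t, 1))  # far beyond any conventional critical value
```

With a gap of almost 48 percentage points and thousands of examinees, the statistic is enormous, consistent with the table's note that all comparisons are significant.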
Students were able to answer between 25 percent and 30 percent of the test items correctly in statistics and in transformations and functions; in set theory, they got only about 30 percent correct. In addition, figure 3.4 shows that there is not much of a relationship between teacher and student achievement. Teacher achievement levels are below 85 percent in only three content areas: geometry (70 percent), set theory (72 percent), and Cartesian coordinates/graphs (76 percent). These appear to be areas where there is a need to improve teacher achievement levels, but—as was the case in P6—in no single area are teacher scores so worryingly low as to account for such low levels of student achievement. Figure 3.5 presents a frequency summary of S2 student and teacher numeracy results. For students, the results approximate a normal curve, as was also found for P6 student numeracy achievement. S2 student averages in numeracy are fairly concentrated in the 25–35 percent correct range (30 percent of students), followed by the 35–50 percent correct range (about 28 percent of students).

Figure 3.4  Summary of S2 Student and Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011
[Bar chart of percent correct for S2 students and teachers in overall numeracy, numerical concepts, measures, set theory, geometry, statistics, transformations, and Cartesian coordinates]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Figure 3.5  Frequency Summary of S2 Student and Teacher Overall Numeracy Achievement, Uganda NAPE, 2011
[Bar chart of the share of S2 students and teachers in the 0–15, 15–25, 25–35, 35–50, 50–75, 75–90, and 90–100 percent correct bands]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
There is not a large group of students in the 0–15 percent range (only 6 percent); conversely, very few S2 students score above 75 percent on these tests (less than 1 percent). For teachers, however, the results show that roughly two-thirds (67 percent) score in the 75–90 percent range in S2 numeracy, which indicates relatively high content knowledge of S2 subject matter. One surprise is that only 15 percent scored in the 90–100 percent range. Nevertheless, almost no teachers are in the really problematic categories (below 50 percent), and relatively few (17 percent) are in the 50–75 percent range.

Literacy Comparisons

Consistent with numeracy, primary teachers have high content knowledge in literacy. On average, P3 and P6 teachers could answer 84 percent of the literacy test items correctly. Teachers' scores were highest in the subcontent area of grammar (87 percent), followed by writing (84 percent); scores were lowest in the subcontent area of reading (81 percent). Teachers also took a supplemental 40-point reading test, with an average score of 77 percent correct, making it the area in literacy with the worst performance (table 3.3). On the whole, primary teachers scored close to three times higher than their students in literacy on the P6 test items. The observed score for P6 teachers was 85 percent, compared to only 30 percent for P6 pupils, which translates into a learner-teacher performance ratio of only 0.35. The IRT-scaled score confirms this finding, although that differential translates into less than two standard deviations.
Interestingly, the IRT-scaled score difference between students and teachers using NAPE data is nearly identical to that found by the 2005 SACMEQ project (Kasirye 2009). Although this pattern is expected (teachers should ideally outperform their students), the extremely high differential in content knowledge deserves attention by educators in Uganda.

Table 3.3  Detailed Summary of P6 Student and P3–P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011

Achievement area                        P6 students   All P3–P6 teachers   P3 only   P6 only
Overall percent correct                 29.9          83.7**               82.7      84.7**
                                        (19.2)        (9.5)                (9.5)     (9.3)
IRT-scaled score                        485.4         707.2**              700.9     713.0**
                                        (85.4)        (50.5)               (50.3)    (50.1)
By content area
Reading
  Associating objects and words (2)     85.3          99.1**               99.0      99.2
  Describing activities (4)             33.8          88.6**               88.1      89.1
  Reading and interpreting (18)         26.7          80.6**               79.7      81.5**
  Reading and answering (16)            29.9          76.4**               75.4      77.2*
  Overall reading (40)                  31.6          80.6**               79.8      81.4**
Reading supplement (40)                 —             76.5                 76.2      76.8
Writing
  Drawing objects (3)                   47.9          84.6**               83.6      85.6
  Writing words (3)                     49.4          94.9**               94.6      95.1
  Completing forms/cards (17)           26.1          88.8**               88.0      89.6*
  Naming objects (2)                    11.3          70.4**               70.4      70.4
  Writing activities (15)               27.2          78.4**               76.7      80.0**
  Overall writing (40)                  29.1          84.1**               83.0      85.2**
Grammar (20)                            27.9          87.2**               87.6      90.6**
Sample size                             23,928        1,687                809       878

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variables for content area results are percentages correct between 0 percent and 100 percent. For each content area, the number of possible points or questions is included in parentheses. Standard deviations are presented only for the overall scores (top part). — = not available. *** = significant at 0.01 level; ** = significant at 0.05 level; * = significant at 0.10 level.
In the primary literacy curriculum for P6, fewer significant differences exist between P3 and P6 teachers than in numeracy. This result is confirmed in figure 3.6. The results show that, on average, the four groups of teachers (P3 numeracy, P3 literacy, P6 numeracy, P6 literacy) have similar overall literacy scores, and there is not much variation within each group (that is, small standard deviations). As expected, P6 literacy teachers have the highest overall score. But again, the gap with the other teachers is substantially smaller than in numeracy. The results confirm that teacher achievement levels in literacy are very similar across teacher categories. In only one content area (grammar) is the gap between the highest- and lowest-scoring teacher category greater than 5 percent. (Table D.2 and figure D.1 in appendix D present a more detailed summary of teacher achievement levels in literacy overall and by content area, with teachers divided into the four aforementioned groups.)

Figure 3.6  Summary of Primary Teacher Literacy Scores, by Grade and Subject Specialty, Uganda NAPE, 2011
[Box-and-whisker plots of IRT-scaled literacy scores for P3 numeracy, P3 literacy, P6 numeracy, and P6 literacy teachers]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Teachers' subject-matter knowledge across the three literacy content areas follows a different pattern than that of learners. For example, P6 teachers scored best on the test items from the grammar curriculum area (an average of 91 percent), followed by writing (85 percent), and scored lowest (81 percent) in reading. For P6 learners, however, performance was nearly the same in all three areas: reading comprehension (32 percent), writing (29 percent), and grammar (28 percent). Results by subcontent area further show that both teachers and learners have exceptionally high comfort in one area: associating objects with words.
It is significant to note that this is the only aspect of the curriculum where students scored above 50 percent overall (85 percent), which may be explained by the fact that these questions simply required students to match pictures with words. Relating this result to teacher performance, it is also the only area where teacher performance reached the almost 100 percent expected of teachers. In regard to areas of weakness, the results show that P6 teacher performance was lower (in the range of 70 percent) in subcontent areas where student performance was also in the lowest range, which indicates that teacher content knowledge needs to be improved in those very areas. It can be further observed from the subcontent area results that teacher content knowledge is reasonably high overall (at least 70 percent). For example, only 11 percent of P6 pupils could name an object correctly, but their teachers' performance was more than 6 times higher (70 percent). Nevertheless, teachers' content knowledge in curriculum areas where their scores are around the 70 percent mark deserves attention by teacher trainers. Although P6 teachers performed significantly better than P3 teachers, as expected (figure 3.7), the difference across the three respective content areas of literacy was much narrower than was observed in numeracy. Within subcontent areas, however, P3 teachers seem to struggle more with reading and writing (naming and writing activities) than their counterparts in P6, again pointing to areas where content knowledge needs improvement. Figure 3.8 shows a frequency summary of student and teacher scores and reveals that a substantial percentage of P6 students scored in the lowest two categories (0–15 percent and 15–25 percent correct), with the 0–15 percent correct category taking up the largest proportion of learners (about 28 percent).
Only 16 percent of P6 pupils performed above the 50 percent correct mark. In relation to teachers, very few scored in the lowest categories, which is certainly not a surprise. But in literacy, not as many teachers (only about 18 percent) are in the top category (90–100 percent).

Figure 3.7  Summary of P6 Student and P3–P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011
[Bar chart of percent correct for P6 students, P3 teachers, and P6 teachers in overall literacy, basic reading, the reading supplement, writing, and grammar]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Figure 3.8  Summary of Overall Literacy Scores by Frequency Category, P6 Students and P3–P6 Teachers, Uganda NAPE, 2011
[Bar chart of the share of P6 students and P3–P6 teachers in the 0–15, 15–25, 25–35, 35–50, 50–75, 75–90, and 90–100 percent correct bands]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Table 3.4 presents literacy achievement for S2 students and teachers. Students' literacy performance in S2 is estimated at 46 percent. However, their English teachers scored 81 percent, roughly two standard deviations higher than their students. Again, the results suggest a general comfort level for teachers in regard to the curriculum for which they are responsible; but at the same time, it is hard to conclude that teacher content knowledge at this level is sufficient when it is below the 85 percent mark.

Table 3.4  Detailed Summary of S2 Student and Teacher English Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011

Achievement area                        S2 students   S2 teachers
Overall percent correct                 45.9          81.1**
                                        (15.6)        (8.0)
IRT-scaled score                        494.4         721.2**
                                        (94.3)        (62.6)
By content area
Reading comprehension
  Passage (12)                          44.4          78.8**
  Poetry (8)                            41.0          73.3**
  Dialog (8)                            63.5          89.6**
  Cartoons (8)                          40.4          67.1**
  Overall reading comprehension (36)    47.0          77.4**
Writing
  Conversation (10)                     49.4          93.7**
  Formal letter (12)                    49.4          86.2**
  Composition (12)                      49.3          85.2**
  Overall writing (34)                  49.4          88.0**
Grammar
  Nouns (1)                             36.4          91.7**
  Pronouns (2)                          27.0          87.2**
  Adverbs (1)                           53.7          96.2**
  Prepositions (2)                      60.5          92.8**
  Adjectives (3)                        44.0          77.3**
  Tenses (5)                            47.8          90.3**
  Punctuation (4)                       29.2          72.9**
  Structural patterns (8)               35.7          72.6**
  Articles and words of quantity (4)    52.7          76.2**
  Overall grammar (30)                  41.6          80.3**
Sample size                             19,687        497

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variables for content area results are percentages correct between 0 percent and 100 percent. For each content area, the number of possible points or questions is included in parentheses. Standard deviations are presented only for the overall scores (top part). *** = significant at 0.01 level; ** = significant at 0.05 level; * = significant at 0.10 level.

The other result for teachers is that at the S2 level there is less variation in their scores, as indicated by a standard deviation of about 8 percentage points, compared with 10–13 percentage points for P3 and P6 teachers; this may be the result of the specialized nature of secondary education teaching. Table 3.4, as well as figure 3.9, provide a detailed summary of content and subcontent area results for students and teachers. Overall averages for students in reading comprehension and writing are 47 percent and 49 percent correct, respectively, with the lowest scores in grammar (42 percent).
There are, however, significant differentials in the teachers' scores for the three broad content areas in literacy for S2. The highest performance was registered for writing (88 percent), followed by grammar (80 percent); the worst performance was in reading comprehension (77 percent). This analysis clearly indicates that some work needs to be done with teachers in reading comprehension and grammar.

Figure 3.9  Summary of S2 Student and Teacher English Literacy, Overall and by Content Area, Uganda NAPE, 2011
[Bar chart of percentage correct for S2 students and S2 teachers on overall literacy, reading comprehension, writing, and grammar.]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Figure 3.10  Summary of S2 Student and Teacher Grammar Achievement, Overall and by Subcontent Area, Uganda NAPE, 2011
[Bar chart of percentage correct for S2 students and S2 teachers on overall grammar and the subcontent areas of nouns, pronouns, adverbs, prepositions, adjectives, tenses, punctuation, structural patterns, and articles.]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Figure 3.11  Summary of S2 Student and Teacher Reading and Writing Achievement, Overall and by Subcontent Area, Uganda NAPE, 2011
[Bar chart of percentage correct for S2 students and S2 teachers on passage, poetry, dialog, cartoons, and overall reading, and on conversation, formal letter, composition, and overall writing.]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Results for the literacy subcontent areas presented in the last two panels of table 3.4 and in figures 3.10 and 3.11 reveal that for both teachers and students, there is more variation in achievement in grammar than in reading comprehension and writing.
For example, the difference between the subcontent areas with the best (prepositions) and worst (pronouns) performance by students is about 33 percentage points. Similarly, teachers record a differential of about 24 percentage points between the subcontent area with the highest scores (adverbs) and those with the lowest scores (structural patterns and punctuation). On the whole, the results show that the problematic curriculum areas in grammar for students are not the same as those for teachers, with a few exceptions; this is not consistent with the general findings of this chapter.

Figure 3.12 concludes with a frequency summary of S2 literacy scores for students and teachers. Once again, the most important result is that very few S2 students scored in the lowest two categories (0–15 percent and 15–25 percent). Also, a substantial proportion (37 percent) scored in the 50–75 percent range. This finding compares with only 13 percent of S2 students who scored in this category on the numeracy exam. Taken together, the results for S2 students suggest that there is a good base of knowledge to build on at this level, meaning that most students have some basic grasp of the curriculum and that a fairly large group is already performing at a level close to acceptable. For teachers, the 14 percent who scored in the 35–50 percent and 50–75 percent ranges require attention in terms of content knowledge.

S2 Biology Achievement

In regard to learning achievement in biology, table 3.5 shows low average scores for students in this subject. S2 students answered only 25 percent of the biology test questions correctly, which is the lowest score of all grades and subjects summarized in this report.
This suggests that students in this content area are, on average, far behind expected or desired levels of knowledge. In fact, scores this low suggest that students have very little understanding of most biology concepts. Given the S2 scores in literacy, this finding cannot be attributed to problems with basic skills such as reading. The more likely explanations deal with other factors, such as teaching approaches or pedagogy, actual instruction time (including teacher time on task), and many others.

Figure 3.12  Frequency Summary of S2 Student and Teacher Overall Numeracy Achievement, Uganda NAPE, 2011
[Bar chart of the percentage of the total sample of S2 students and S2 teachers in each overall-percentage-correct category: 0–15, 15–25, 25–35, 35–50, 50–75, 75–90, 90–100.]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Table 3.5  Detailed Summary of S2 Student and Teacher Biology Achievement, Overall and by Content Area, Uganda NAPE, 2011

Achievement area                          S2 students    S2 teachers
Overall percent correct                      24.9           64.6**
                                             (9.4)          (12.3)
IRT-scaled score                            492.8          783.8**
                                            (89.1)          (90.0)
By content area
Diversity of living things
  Introduction to biology (6)                56.9           77.2**
  Classification (18)                        17.0           45.2**
  Microscopes and hand lenses (10)           30.5           70.2**
  Animal and plant cell (7)                  29.6           82.8**
  Features of flowering plants (34)          21.3           65.8**
  Features of insects (16)                   34.2           70.3**
  Overall diversity (91)                     26.7           65.3**
Soil properties, etc. (29)                   16.7           55.8**
Sample size                                19,748             498

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variables for the content area results are percentages correct between 0 percent and 100 percent. For each content area, the number of possible points or questions is included in parentheses. Standard deviations are presented only for the overall scores (top part). *** = significant at 0.01 level; ** = significant at 0.05 level; * = significant at 0.10 level.
The second result that stands out is that S2 specialist biology teachers answered only 65 percent of the questions correctly. In other words, this is the first area summarized in this report where there appears to be a widespread problem with teachers' content knowledge. The S2 biology results also suggest that comparatively low levels of teacher knowledge may help explain low student scores; again, this idea will be tested in a more rigorous framework in the next section. In terms of the teacher knowledge advantage vis-à-vis students, the gap is actually one of the largest, amounting to roughly three full standard deviations. But considering the very low levels of student knowledge in this subject area, the deviation is potentially misleading. An average of 65 percent for teachers is clearly insufficient, especially in light of the fact that all teachers should ideally score 100 percent on items drawn from the curriculum content they teach in class; this raises important questions about teacher preparation in this subject.

The results by content area provide few surprises. With an overall score below 25 percent, it is unlikely that students are scoring well in specific aspects of the curriculum. Indeed, the summary in table 3.5 confirms that S2 student averages are above 35 percent in only one area: introduction to biology (57 percent). Their scores are very low in critical areas of the curriculum, namely soil properties and classification (17 percent), as well as features of flowering plants (21 percent). Figure 3.13 compares student and teacher scores by content area. The results do show some correlation between the two groups, because the areas where student averages are the lowest are also (to some degree) the areas where teachers had the hardest time answering the questions. This finding is true for the areas of classification, soil, and plants.
The relationship is not a perfect one, as evidenced by high teacher averages in the area of cells. Nonetheless, the results suggest that one of the underlying reasons for low student scores in biology may simply be that their teachers are not adequately versed in the basic components of the curriculum.

Figure 3.13  Summary of S2 Student and Teacher Biology Achievement, Overall and by Content Area, Uganda NAPE, 2011
[Bar chart of percentage correct for S2 students and S2 teachers on overall biology and the content areas of introduction to biology, classification, microscopes, cells, insects, plants, and soil.]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Figure 3.14  Frequency Summary of S2 Student and Teacher Biology Achievement, Uganda NAPE, 2011
[Bar chart of the percent of the total sample of S2 students and S2 teachers in each overall-percentage-correct category: 0–15, 15–25, 25–35, 35–50, 50–75, 75–90, 90–100.]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Figure 3.14 concludes the analysis of biology with a frequency distribution of student and teacher scores; as expected, student scores are concentrated in the left-hand side of the figure. It is important to note that, even in this very low-scoring subject, the percentage of students in the 0–15 percent range is only 11.7 percent; in P6 literacy, this proportion was nearly 30 percent. Nearly half of S2 students are concentrated in the 15–25 percent category, which is consistent with their overall average score of 25 percent. Fewer than 15 percent of the students scored above 35 percent on the biology test.

For teachers, the results clearly demonstrate a lack of sufficient content knowledge at this level. None of the S2 biology specialist teachers scored above 90 percent, and less than 20 percent scored in the 75–90 percent range.
Instead, the bulk of teachers scored in the 50–75 percent range, with another group of nearly 10 percent scoring below 50 percent. In sum, overall student achievement levels in biology are significantly lower than expected and suggest that very little learning is taking place. However, far fewer S2 students score in the lowest range (in comparison with P6 literacy), which suggests that they do at least have some of the basic skills needed to advance. The challenge, therefore, is to raise achievement levels by building on this base of knowledge. In biology, teacher results point to much deeper problems, which may be explainable (at least in part) either by challenges in obtaining teachers with specific training or by weak preservice training in this area. Furthermore, of all the subjects analyzed in this report, this is the one area where student and teacher scores seem to be closely correlated, at least at the macro level of measuring overall averages by content area. This finding, in turn, highlights the need to understand more about the underlying determinants of teacher knowledge levels and the positive and negative effects of teacher characteristics (including subject matter knowledge) on student achievement levels.

Predictors of Teacher and Student Performance

This section summarizes the results of two sets of multivariate statistical analyses. The first focuses on teacher academic achievement as a dependent variable in order to consider policy-relevant influences on this important teacher characteristic. The second set focuses on student achievement and includes a number of student, family, school, and teacher variables as covariates, including teachers' subject matter knowledge. The emphasis of this report is on the general achievement scores of numeracy, literacy, and biology. Some content-specific results (for teachers) are presented in appendix D.
The purpose of this chapter is to stimulate further debate about the likely determinants of teacher and student performance and to encourage further investigation of these issues in a manner that can generate conclusive results about causal factors. In the meantime, the results from these analyses provide initial insights into this widely discussed issue, which can inform ongoing efforts to enhance the quality of education in Uganda.

Covariates of Teacher Achievement

These results are presented in two categories: the first focuses on primary teachers, and the second on lower secondary teachers.

P3 and P6 Teachers

Earlier results indicate that average teacher content knowledge was 80 percent, but with standard deviations in the 8–12 percentage point range, which justifies further exploration of the kinds of variables associated with higher and lower levels of teacher subject matter knowledge. Two groups of influences have been considered in the teachers' analysis. The first includes school- and community-level indicators, such as rural location, average socioeconomic status (SES), and private school status. These variables provide information about the distribution of teacher quality (measured by content knowledge) across different kinds of schools. But they do not represent variables that are likely to directly influence outcomes. Those kinds of influences are instead (potentially) captured by characteristics that are specific to each individual teacher, including the teacher's education level, professional qualifications, gender, and experience, which make up the second group of variables; hence the use of Hierarchical Linear Models (HLM), as indicated in table 3.6.
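The dependent variable in these models is the IRT-equated test score standardized to mean 0 and standard deviation 1, so that coefficients read directly as changes in standard deviations. A minimal sketch of that rescaling (the helper function and the sample scores are illustrative, not taken from the NAPE data):

```python
import math

def standardize(scores):
    """Rescale scores to mean 0 and (population) standard deviation 1,
    so that a regression coefficient on the rescaled outcome reads
    directly as a change in standard deviations."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / n)
    return [(x - mean) / sd for x in scores]

# Illustrative IRT-style scores only
z = standardize([480.0, 500.0, 520.0])
# z has mean 0 and standard deviation 1 by construction
```

In practice the standardization is applied within each subject, so that effects are comparable across the numeracy and literacy estimations.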
Table 3.6  Covariates of P3–P6 Teacher Overall Numeracy and Literacy Achievement (T-Statistics in Parentheses), Uganda NAPE, 2011

                                          Numeracy                                            Literacy
Independent variable           HLM              HLM              FE                HLM              HLM              FE
School characteristics
  Rural                     –0.09 (–1.01)    –0.07 (–0.85)    –0.03 (–0.28)     –0.07 (–0.76)    –0.08 (–0.82)    –0.08 (–0.68)
  Private                   –0.06 (–0.67)    –0.08 (–0.81)    –0.06 (–0.63)      0.13 (1.29)      0.14 (1.39)      0.06 (0.48)
  Average SES (district)     0.01 (0.14)     –0.01 (–0.08)        —              0.17* (1.68)     0.15 (1.51)         —
Teacher characteristics
  Female                    –0.54*** (–10.90) –0.39*** (–7.75) –0.39*** (–7.34) –0.02 (–0.35)     0.08 (1.44)      0.04 (0.69)
  Academic qual. = UACE (a)  0.12* (1.90)     0.08 (1.29)      0.08 (1.33)      0.04 (0.55)      0.01 (0.07)     –0.03 (–0.43)
  Teacher qualification (b)
    Grade 5 primary          0.12** (2.18)    0.12** (2.10)    0.11* (1.82)     0.19*** (3.11)   0.17*** (2.60)   0.18*** (2.64)
    Grade 5 secondary       –0.47** (–2.18)  –0.46** (–2.23)  –0.30 (–1.24)    –0.25 (–1.04)    –0.23 (–0.94)    –0.02 (–0.10)
    Bachelor's and above     0.35** (2.26)    0.34** (2.27)    0.23 (1.52)      0.15 (0.84)      0.14 (0.82)      0.11 (0.81)
    Other                   –0.11 (–0.63)    –0.12 (–0.70)    –0.09 (–0.53)     0.02 (0.10)      0.08 (0.39)      0.03 (0.18)
  Teacher experience (c)
    2–3 years                   —             0.04 (0.40)      0.03 (0.24)         —             0.05 (0.41)      0.04 (0.39)
    4–6 years                   —             0.13 (1.14)      0.13 (1.01)         —             0.27** (2.12)    0.21* (1.67)
    7–10 years                  —            –0.12 (–1.12)    –0.12 (–0.82)        —             0.19 (1.52)      0.16 (1.37)
    11+ years                   —            –0.06 (–0.59)    –0.05 (–0.38)        —             0.12 (0.94)      0.03 (0.29)
  Grade-subject
    P3 numeracy                 —            –0.53*** (–9.72) –0.53*** (–9.99)     —            –0.48*** (–7.42) –0.48*** (–7.18)
    P3 literacy                 —            –0.68*** (–11.06) –0.69*** (–10.70)   —            –0.37*** (–5.34) –0.36*** (–5.82)
    P6 numeracy                 —                —                —                —            –0.36*** (–5.59) –0.38*** (–5.65)
    P6 literacy                 —            –0.60*** (–11.03) –0.60*** (–10.79)   —                —                —
    P3–P6 numeracy           0.37*** (8.83)      —                —                —                —                —
    P3–P6 literacy              —                —                —             0.27*** (5.61)      —                —
District fixed effects?        No               No               Yes              No               No               Yes
Explained variance (R2)         —                —               0.27              —                —               0.16
Sample size                  1,553            1,553            1,553            1,534            1,534            1,534

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variable for all estimations (by subject) is the standardized measure (mean = 0, standard deviation = 1.00) of the IRT-equated test score. Coefficients, therefore, represent changes in standard deviations. HLM refers to a mixed model with a random school effect; FE is for fixed effects at the district level. T-statistics (in parentheses) are based on robust standard errors that correct for the clustering of teachers at the school level. Additional included variables (not presented) include "Teacher is married" and controls for missing data about teacher experience and qualifications. — = not available.
a. Excluded category: UCE-only academic qualification.
b. Excluded category: Grade 3 teaching qualification.
c. Excluded category: Experience = 1 year.
*** = significant at 0.01 level; ** = significant at 0.05 level; * = significant at 0.10 level.

Teachers' results are combined, or pooled, for P3 and P6 because they took the same exams. The dependent variable is a standardized measure (mean = 0, standard deviation = 1.0) of the IRT scores that were created for each subject. The coefficients, therefore, represent standardized effects that measure the change in the dependent variable, in standard deviations, for each unit change in the independent variables. The HLM estimations include a random effect at the school level, while the fixed effects (FE) model controls for all differences between districts and focuses on variation within districts. As discussed in the methodology section, fixed effects models are likely to capture more unmeasured influences than do the random effects models.
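The district fixed-effects specification can be understood as a within-district comparison: demeaning every variable by its district average removes all between-district differences before the regression is run. A minimal sketch of that logic, with hypothetical district labels and scores:

```python
from collections import defaultdict

def demean_within_group(values, groups):
    """Subtract each group's mean from its own observations, leaving only
    within-group variation (the logic behind a fixed-effects model)."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for v, g in zip(values, groups):
        totals[g] += v
        counts[g] += 1
    means = {g: totals[g] / counts[g] for g in totals}
    return [v - means[g] for v, g in zip(values, groups)]

# Hypothetical teacher scores in two districts
scores = [70.0, 80.0, 50.0, 60.0]
districts = ["A", "A", "B", "B"]
within = demean_within_group(scores, districts)
print(within)  # [-5.0, 5.0, -5.0, 5.0]
```

Note that any district-level regressor (such as average district SES) is constant within a district, so it is absorbed by the demeaning; this is why the SES coefficient is unavailable in the FE columns of table 3.6.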
First, when one controls for teacher background and district poverty, there is not much evidence that the distribution of teacher quality, as measured by teacher subject knowledge, varies significantly by location (rural) or school type (private). For teacher literacy achievement, the measure of district poverty is significant (and positive) in one estimation, which suggests that teacher knowledge levels are higher in wealthier districts. But this result does not hold for numeracy, so it cannot be concluded that there is a strong relationship between the distribution of teacher knowledge and SES.

Second, female teachers score between 0.39 and 0.54 standard deviations lower than their male counterparts in numeracy. This is a large difference, but the result is generally consistent with mathematics research internationally. It is important to note that this result holds even when controlling for the grade and subject specialty reported by the teacher (see the bottom half of table 3.6), and it hence compares equally educated and qualified male and female teachers who are working in the same grades and subjects. From a policy standpoint, the teacher gender results in numeracy should not be viewed simply as a question of training or recruitment. They are instead likely to be part of a larger cultural phenomenon that begins at an early age and is related to (a) how girls view mathematics and (b) the kinds of signals they receive as they move through school. This issue is not unique to Uganda.

Gender differences do not extend into literacy, where there are no significant differences between male and female primary teachers. This finding is somewhat surprising, because it is not unusual for men to have a significant advantage in numeracy but for women to have the advantage in literacy. The general focus on primary-level content could be a contributor to this finding.
The results provide some evidence that teachers with higher levels of academic preparation and teaching qualifications have higher levels of content knowledge than do their counterparts with lower levels of education. The variable that measures Uganda Advanced Certificate of Education (UACE) preparation (versus the Uganda Certificate of Education [UCE]) is generally positive, although significant in only one estimation. The results for teacher training qualifications are more consistent, showing that teachers with a grade 5 primary training level consistently score between 0.10 and 0.20 standard deviations higher on P6 content knowledge tests than teachers who have only grade 3-level training. This difference is large enough to qualify as an important result, especially because it suggests that preservice training is associated with content knowledge and because it further grounds Uganda's adoption of grade 5 teacher training as the minimum requirement for primary education teachers. Other teacher qualification categories are not consistently related to teacher subject matter knowledge, which is not surprising given the very small number of primary teachers who have other qualification levels (table D.1 in appendix D).

The results are somewhat mixed for variables related to teaching level and experience. For experience, the results are consistently insignificant except for teachers with 4–6 years of experience, whose scores were significantly higher than those of other teachers by about 0.20 standard deviations. Nevertheless, the results on teacher experience suggest that, for primary-level content knowledge, no experiential learning is taking place: teachers who are exposed to this content for longer periods of time are not able to answer more test questions than their counterparts with less experience, which is a challenging result.
There are very strong and significant effects for the controls that combine teaching specialty and grade. In both subject areas (numeracy and literacy), P6 specialist teachers have subject matter knowledge levels that are substantially higher than those of all other teachers. The largest gap is in numeracy, where P6 numeracy teachers (the excluded category in table 3.6) score almost 0.70 standard deviations higher than do P3 literacy teachers. But the difference is also large when this group is compared with fellow numeracy teachers who work in P3: 0.53 standard deviations. The size of the difference among numeracy specialists is somewhat surprising, and it suggests that P6 numeracy teachers' exposure to the P6 curriculum helps explain why they score so much higher than other teachers. Similarly, the differences between teachers in the literacy content area are still significant, but not as large. Once again, the largest difference (about 0.50 standard deviations) is between P6 specialist teachers and their P3 nonspecialist counterparts (in this case, numeracy teachers). But P6 literacy teachers also score higher than do their P3 specialist counterparts (by 0.36–0.37 standard deviations). Because the models control for teacher qualification levels, the results again suggest that the P6 teachers' exposure to this same curriculum plays a substantial role.

The tables in appendix D summarize the multivariate results for teacher knowledge by content area in numeracy (table D.2) and literacy (table D.3). In general, the results are consistent with those presented in table 3.6, but a few specific findings merit mention. For gender, the results confirm that female teachers score below their male counterparts in numeracy, and the result is fairly constant across content areas. In literacy, however, female teachers score significantly higher than do male teachers in content knowledge related to reading.
Also, the effect of teacher qualifications is inconsistent across content areas and is instead relatively specific to a handful of areas. Finally, for teacher literacy content knowledge, there is more evidence of differences across school types for indicators such as rural location, private school, and district SES.

S2 Teachers

The distribution of teacher quality (measured by content knowledge) is significantly more variable at the S2 level than at the primary level. For example, S2 teachers working in private schools have significantly lower levels of content knowledge. The results are very consistent across subjects and average between 0.25 and 0.45 standard deviations. This is an important result because it may help explain the observed lower performance of S2 students in private schools compared with their counterparts in public schools (as presented in chapter 2 of this report). Teacher specialists in schools implementing the Universal Secondary Education (USE) reform program also have significantly lower content knowledge scores in numeracy and biology, with effects of between 0.28 and 0.51 standard deviations. However, teachers in double-shift schools have significantly higher content knowledge in numeracy and biology, with effects that average 0.21–0.29 standard deviations. The result for USE schools is hard to explain in light of the fact that USE and double-shift schools are prioritized by the government in the allocation of qualified public service teachers, which explains the positive result in the double-shift schools. Further research can help to unearth the issues that underpin the negative result in USE schools. There are no significant differences in S2 teacher achievement by residence (urban or rural) or between schools with different levels of SES, with the exception of the negative result for numeracy.
In regard to teacher characteristics, there are no significant differences in secondary teachers' content knowledge by gender or by teacher preparation. The gender result could be due to the more demanding and competitive filtering process for teacher specialists at this level. The absence of a relationship between teachers' preparation levels and their content knowledge is a surprising finding, especially when one takes into account the effect of this variable in the primary grades. The results in table 3.7 show that more educated teachers are not more proficient than teachers with the standard grade 5 secondary training. The coefficients for highly educated teachers (that is, those with master's or doctorate degrees) are generally very large, but this variable was significant in only one estimation. Teachers' experience is positively related to numeracy and biology content knowledge; in some categories, the effect size is substantial (above 0.50 standard deviations). This is another result that is likely related to teachers' exposure to the curriculum, rather than to age effects, which would be related to the quality of the training that older teachers received in earlier eras. Given the difficulty level of the S2 curriculum, teachers are able to improve their own knowledge during instruction as they devise ways to help students.
Table 3.7  Covariates of S2 Teacher Achievement, by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011

                                   Numeracy                          Literacy                          Biology
Independent variables          OLS              FE               OLS              FE               OLS              FE
School characteristics
  Rural                     –0.09 (–0.90)    –0.01 (–0.04)    –0.15 (–1.33)    –0.19 (–1.63)    –0.14 (–1.46)    –0.08 (–0.81)
  Private                   –0.45*** (–3.83) –0.40** (–3.08)  –0.25** (–2.36)  –0.32*** (–2.74) –0.42*** (–4.45) –0.45*** (–4.46)
  USE school                –0.47*** (–3.97) –0.51*** (–3.51) –0.09 (–0.61)    –0.13 (–0.81)    –0.29*** (2.88)  –0.28** (–2.17)
  Double shift               0.29** (2.26)    0.29* (1.90)    –0.17 (–1.02)    –0.30 (–1.29)     0.29*** (2.95)   0.21* (1.79)
  Average SES (school)      –0.14* (–1.68)   –0.13 (–1.17)     0.12 (1.44)      0.07 (0.64)      0.01 (0.04)      0.02 (0.20)
Teacher characteristics
  Female                     0.07 (0.46)      0.11 (0.59)      0.10 (1.03)      0.03 (0.27)      0.05 (0.40)     –0.11 (–0.74)
  Teacher qualification (a)
    Grade 5 primary         –0.09 (–0.31)    –0.03 (–0.09)    –0.35 (–0.61)    –0.22 (–0.33)    –0.45 (–0.77)    –0.78 (–1.32)
    Bachelor's              –0.04 (–0.37)    –0.07 (–0.58)     0.07 (0.84)      0.18 (1.59)     –0.02 (–0.29)    –0.06 (–0.74)
    Master's or doctorate    0.46 (1.13)      0.93 (1.60)      0.57 (1.27)      0.05 (0.10)      0.30** (2.07)   –0.05 (–0.22)
    Other                   –0.37*** (–2.78) –0.35** (–2.06)  –0.28 (–1.53)    –0.33* (–1.69)   –0.27 (–1.38)    –0.03 (–0.18)
  Teaching experience (b)
    2–3 years                0.22 (1.49)      0.14 (0.79)      0.35* (1.88)     0.32 (1.46)      0.28 (1.45)      0.32* (1.85)
    4–6 years                0.42** (2.25)    0.35* (1.76)     0.19 (1.03)      0.16 (0.68)      0.36** (2.29)    0.53*** (3.03)
    7–10 years               0.53*** (3.09)   0.38* (1.82)     0.21 (0.83)      0.23 (0.79)      0.44*** (2.55)   0.49*** (2.73)
    11+ years                0.52*** (3.15)   0.34 (1.57)      0.23 (1.06)      0.30 (1.17)      0.41** (2.01)    0.49** (2.36)
District fixed effects?        No               Yes              No               Yes              No               Yes
Explained variance (R2)       0.16             0.35             0.07             0.32             0.13             0.43
Sample size                   479              479              466              466              460              460

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variable for all estimations (by subject) is the standardized measure (mean = 0, standard deviation = 1.00) of each content area percentage score. Coefficients, therefore, represent changes in standard deviations. OLS models are used (rather than HLM) because of a lack of clustering of teachers within schools; FE is for fixed effects at the district level. T-statistics (in parentheses) are based on robust standard errors that correct for the clustering of teachers at the school level. Additional included variables (not presented) include "Teacher is married" and controls for missing data on teacher experience and qualifications.
a. Excluded category: Grade 5 secondary qualification.
b. Excluded category: Experience = 1 year.
*** = significant at 0.01 level; ** = significant at 0.05 level; * = significant at 0.10 level.

Covariates of Student Achievement

Earlier work on Uganda by Nannyonjo (2007) identifies a number of factors that affect learning outcomes at the P6 level. Those factors are categorized into learners' background characteristics, availability of school inputs, teacher characteristics, teaching characteristics, and administrative strategies. In regard to learners' background characteristics, the number of books in the home, pupils' punctuality, regular attendance, parental interest, and the presence of reliable lighting at home (including electricity) were positively associated with higher test scores in English and mathematics in P6.
In regard to school inputs, funding per pupil, time spent by teachers teaching a subject, and greater availability of textbooks to pupils were positively correlated with performance in English and math at the P6 grade level. However, the same study revealed mixed results for teacher qualifications. Math scores for P6 learners increased as teacher qualifications decreased, except for university education, while English test scores increased with an increase in teacher qualifications. In addition, pupil performance increased with teacher experience, with a peak realized between 6 and 10 years. In-service teacher training was also positively related to English performance, but negatively related to math performance. In regard to teaching and administrative strategies, the following were observed to be positively associated with high test scores: (a) frequency of homework, (b) going over homework with the teacher in class, (c) mode of handling tests, (d) grouping, (e) out-of-school time spent by the teacher on a pupil's academic issues, and (f) pupil-teacher interactions in the classroom. The findings of this work, therefore, supplement those observations and provide a comprehensive view of the challenges faced by the system.

P3 and P6 Students

Predictors of student achievement were determined using HLM models in order to account for the clustering of students in schools, together with a school-specific random effect, as presented in table 3.8. The variables are divided into three groups: student, school, and teacher characteristics. As indicated earlier, the dependent variable is the standardized version of the IRT-scaled score.

Results by gender are mixed, with different patterns in P3 and P6. In P3, female students have very large significant advantages over males in literacy and reading (on average, between 0.47 and 0.56 standard deviations).
This pattern, however, is reversed by the time students reach P6, with girls performing worse (by 0.08 standard deviations). There is no significant difference between boys and girls in P3 numeracy, but by the time students reach P6, the results show that boys have opened up a sizeable advantage vis-à-vis girls (0.27 standard deviations). More surprisingly, boys actually score significantly higher than girls in P6 literacy. The pattern in numeracy is not unusual: research often finds that the numeracy advantage for boys increases over time. But the results for literacy are surprising, although it should be pointed out that the P6 literacy difference is not substantial (0.08 standard deviations).

Table 3.8  Covariates of P3–P6 Student Numeracy, Literacy, and Reading Achievement, Uganda NAPE, 2011

Independent variables          Numeracy P3       Numeracy P6        Literacy P3       Literacy P6        P3 reading
Student is female              0.05 (0.29)       –0.27*** (–13.50)  0.47** (2.47)     –0.08*** (–4.18)   0.56*** (3.19)
Student age                    0.04*** (7.63)    –0.04*** (–4.81)   –0.01 (–1.29)     –0.09*** (–12.22)  –0.03*** (–5.63)
School characteristics
  Rural                        –0.23** (–2.32)   –0.42*** (–3.91)   –0.50*** (–4.74)  –0.57*** (–4.88)   –0.41*** (–4.21)
  Private                      0.52*** (5.34)    0.67*** (5.46)     0.78*** (7.11)    0.89*** (5.56)     0.68*** (6.81)
  Average SES (district)       0.46*** (4.28)    0.34*** (2.61)     0.79*** (6.21)    0.11 (0.84)        0.75*** (6.41)
Teacher characteristics
  Female                       0.08 (0.96)       –0.08 (–0.69)      –0.03 (–0.29)     –0.02 (–0.18)      –0.07 (–0.75)
  Academic qual. = UACE(a)     –0.05 (–0.55)     –0.04 (–0.44)      0.31*** (2.76)    –0.01 (–0.12)      0.23** (2.29)
  Teacher qualification(b)
    Grade 5 primary            0.08 (0.89)       0.02 (0.23)        0.06 (0.60)       0.18* (1.91)       –0.03 (–0.30)
    Grade 5 secondary          1.00** (2.22)     –0.83* (–1.92)     –0.25 (–0.91)     0.36 (1.31)        –0.04 (–0.15)
    Bachelor's and above       0.27 (0.99)       0.15 (0.65)        0.08 (0.30)       0.46** (2.29)      –0.09 (–0.38)
    Other                      0.30 (1.07)       –0.74*** (–2.56)   –0.18 (–0.67)     –0.42 (–1.11)      0.21 (0.81)
  Teaching experience(c)
    2–3 years                  0.02 (0.08)       –0.09 (–0.59)      0.11 (0.61)       0.09 (0.42)        0.29* (1.76)
    4–6 years                  –0.11 (–0.54)     0.12 (0.75)        –0.11 (–0.62)     0.16 (0.82)        0.07 (0.49)
    7–10 years                 –0.22 (–1.08)     –0.09 (–0.59)      –0.07 (–0.44)     –0.05 (–0.28)      0.08 (0.50)
    11+ years                  –0.28 (–1.42)     –0.21 (–1.34)      –0.12 (–0.76)     0.01 (0.02)        0.06 (0.40)
  Teacher content knowledge    –0.01 (–0.02)     0.03 (0.57)        0.15*** (3.37)    0.03 (0.65)        0.12*** (3.25)
Sample size                    8,487             5,656              6,473             5,066              6,418

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variable for all estimations (by subject) is the standardized measure (mean = 0, standard deviation = 1.00) of each content area percentage score. Coefficients therefore represent changes in standard deviations. All estimations incorporate a mixed model (HLM) with a random school effect. T-statistics (in parentheses) are based on robust standard errors that correct for clustering of teachers at the school level. Additional included variables (not presented) include "Teacher is married" and controls for missing data on teacher experience and qualifications.
a. Excluded category: UCE-only academic qualification.
b. Excluded category: Grade 3 teaching qualification.
c. Excluded category: Experience = 1 year.
*** = significant at 0.01 level; ** = significant at 0.05 level; * = significant at 0.10 level.
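As the notes to table 3.8 state, each dependent variable is a percentage score standardized to mean 0 and standard deviation 1, so that coefficients read directly as changes in standard deviations. A minimal sketch of that transformation, using made-up scores:

```python
import numpy as np

# Hypothetical IRT-scaled percentage scores for a handful of pupils
# (invented numbers, purely for illustration).
raw = np.array([34.0, 47.0, 52.0, 61.0, 28.0, 55.0])

# Standardize to mean 0, standard deviation 1: on this scale a regression
# coefficient of, say, 0.47 means a 0.47 standard deviation difference.
z = (raw - raw.mean()) / raw.std()
```

On this scale, a coefficient such as the 0.47 for "Student is female" in P3 literacy means girls score about half a standard deviation of the score distribution above boys, other covariates held constant.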
More research is required to understand these gender dynamics as young people move through the primary cycle.

Age exerts a significant influence on student performance, with older students scoring significantly lower on the P3 and P6 achievement exams. The one exception is P3 numeracy, where the point estimate is actually positive. The negative relationship with age likely reflects the struggles of late-entry and repeating students who, despite having a physical advantage in terms of maturity, may face other kinds of sociocultural problems, compounded by physiological changes that may affect their learning. This finding calls for heightened efforts to ensure the timely enrollment of children in school: Uganda's official age of entry into primary one is 6.

There are very significant differences in student achievement levels across different kinds of schools and communities. Rural students score between 0.23 and 0.57 standard deviations lower than urban students when other factors are held constant. Private school students have a very large advantage (0.52–0.89 standard deviations) over their public school counterparts. And schools located in wealthier districts have higher achievement levels.

Teacher characteristics are generally less significant predictors of variations in student achievement than student and school variables. For education levels, qualifications, and experience, the results are almost always insignificant, and the few significant findings do not form a clear pattern. One exception is the variable measuring UACE education (versus UCE), which has a significant effect on P3 literacy and reading. In general, however, it cannot be concluded that there is strong evidence linking teaching credentials with student achievement levels. The results on teacher experience run counter to a finding commonly reported in a number of countries.
In their study of fifth graders in North Carolina, for example, Clotfelter, Ladd, and Vigdor (2006) found that teacher experience is consistently associated with learning achievement. The generally insignificant results for teacher credentials and background are not unusual in education research, for the simple reason that credentials alone may not capture a teacher's teaching ability (or dedication). Hence the importance of data that capture actual teaching activities, which can be gathered through further research on teacher performance.

Teacher subject matter knowledge significantly affects literacy and reading achievement in P3, and the effect size is fairly large: between 0.12 and 0.15 standard deviations. This finding clearly shows the importance of strong content knowledge for teacher effectiveness. The significant results for teacher content knowledge serve as an important reminder that teachers can make a difference, and they also highlight the need to go beyond simple credential and experience measures when analyzing teacher characteristics. The results also point to a policy chain or sequence: the previous chapter established a link between primary teacher training levels and teachers' content knowledge. That chain continues by linking student achievement with teacher knowledge, even though the link holds only for P3 reading and literacy.

S2 Students
Gender, age, family socioeconomic status, and whether the school is a day or boarding institution significantly affect the performance of lower secondary students in Uganda (table 3.9). Student background variables are consistently significant. For gender, the results for numeracy are consistent with those for primary and show that boys' advantage over girls continues to increase in lower secondary school. As noted before, this finding is not unusual in mathematics research. Boys also score much higher in biology.
For literacy, the differences are very small and only marginally significant. The variables for age and SES again show that older children have lower scores, ceteris paribus, while students from wealthier families have higher results.

Table 3.9  Covariates of S2 Student Achievement by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011

                                       Numeracy                            Literacy                            Biology
Independent variables          HLM                FE                 HLM                FE                 HLM                FE
Student is female              –0.38*** (–29.10)  –0.40*** (–23.45)  –0.02* (–1.81)     –0.03* (–1.82)     –0.38*** (–28.84)  –0.40*** (–22.91)
Student age                    –0.06*** (–12.38)  –0.08*** (–10.83)  –0.13*** (–28.14)  –0.15*** (–15.42)  –0.06*** (12.22)   –0.08*** (10.59)
Student family SES             0.09*** (10.64)    0.09*** (9.52)     0.18*** (21.80)    0.17*** (16.03)    0.09*** (10.39)    0.09*** (9.21)
Student is boarder             0.26*** (12.36)    0.32*** (9.56)     0.24*** (11.74)    0.31*** (9.49)     0.26*** (12.09)    0.33*** (9.38)
School characteristics
  Rural                        –0.09** (–2.09)    –0.09** (–2.37)    –0.24*** (–5.70)   –0.18*** (–4.58)   –0.10** (–2.26)    –0.10** (–2.49)
  Private school               –0.88*** (–8.43)   –0.93*** (–9.02)   –0.70*** (–6.92)   –0.75*** (–8.68)   –0.93*** (–8.51)   –0.94*** (–8.20)
  Private partnership (PPP)    0.73*** (6.43)     0.88*** (8.11)     0.56*** (5.06)     0.67*** (7.11)     0.76*** (6.39)     0.88*** (7.29)
  USE school                   0.91*** (9.03)     0.92*** (9.04)     0.95*** (9.73)     0.92*** (11.28)    0.96*** (9.20)     0.94*** (8.50)
  Single shift                 0.11* (1.73)       0.02 (0.41)        –0.05 (–0.82)      –0.02 (–0.49)      0.10* (1.61)       0.03 (1.01)
  Average SES (school)         0.04 (1.09)        –0.01 (–0.04)      –0.01 (–0.32)      –0.03 (–0.81)      0.03 (0.65)        –0.01 (–0.09)
Teacher characteristics
  Female                       0.09 (1.37)        0.10** (2.20)      0.05 (1.36)        0.07** (1.96)      –0.01 (–0.05)      0.04 (0.87)
  Teacher qualification(a)
    Grade 5 primary            –0.41** (–1.98)    –0.29* (–1.87)     0.01 (0.03)        0.02 (0.10)        –0.31* (–1.68)     –0.30** (–2.40)
    Bachelor's degree          0.02 (0.46)        0.02 (0.55)        0.09** (2.24)      0.02 (0.71)        –0.04 (–0.77)      –0.01 (–0.07)
    Master's or doctorate      0.37* (1.70)       0.21 (1.17)        0.04 (1.12)        –0.18** (–2.24)    –0.11 (–0.57)      –0.18* (–1.80)
    Other                      –0.05 (–0.92)      –0.05 (–0.88)      –0.13* (–1.83)     –0.09 (–1.43)      –0.22*** (–2.86)   –0.13** (–2.18)
  Teaching experience(b)
    2–3 years                  –0.01 (–0.13)      0.01 (0.20)        –0.15** (–1.99)    –0.04 (–0.54)      0.05 (0.64)        –0.03 (–0.56)
    4–6 years                  –0.06 (–0.90)      0.02 (0.55)        –0.14* (–1.77)     –0.01 (–0.02)      0.04 (0.47)        0.05 (0.66)
    7–10 years                 –0.08 (–1.03)      –0.04 (–0.67)      –0.12 (–1.40)      0.04 (0.56)        0.04 (0.49)        0.05 (0.61)
    11+ years                  –0.06 (–0.80)      –0.05 (–0.84)      –0.10 (–1.20)      0.05 (0.73)        0.04 (0.51)        0.04 (0.48)
  Teacher content knowledge    0.07*** (3.22)     0.05*** (3.02)     0.06*** (3.44)     0.04*** (2.72)     0.04* (1.84)       0.03 (1.40)
District fixed effects?        No                 Yes                No                 Yes                No                 Yes
Explained variance (R²)        —                  0.22               —                  0.28               —                  0.22
Sample size                    18,974             18,974             18,524             18,524             18,637             18,637

Source: National Assessment of Progress in Education (NAPE), 2011.
Notes: The dependent variable for all estimations (by subject) is the standardized measure (mean = 0, standard deviation = 1.00) of each content area percentage score. Coefficients therefore represent changes in standard deviations. OLS models are used (rather than HLM) because of a lack of clustering of teachers within schools; FE is for fixed effects at the district level. T-statistics (in parentheses) are based on robust standard errors that correct for clustering of teachers at the school level. Additional included variables (not presented) include "Teacher is married" and controls for missing data on teacher experience and qualifications.
a. Excluded category: Grade 5 secondary qualification.
b. Excluded category: Experience = 1 year.
*** = significant at 0.01 level; ** = significant at 0.05 level; * = significant at 0.10 level.
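The FE columns of table 3.9 replace the random school effect with district fixed effects. A standard way to implement fixed effects is to include a full set of district dummies, which is numerically equivalent to demeaning the variables within districts. The sketch below (simulated data and invented sizes, not the NAPE sample) illustrates that equivalence:

```python
import numpy as np

rng = np.random.default_rng(0)
n_districts, per = 20, 30                      # illustrative sizes
district = np.repeat(np.arange(n_districts), per)

# Simulated data: the score depends on a covariate (true slope 0.5 by
# construction) plus a district-level shift that fixed effects must absorb.
x = rng.normal(size=n_districts * per)
shift = rng.normal(size=n_districts)[district]
y = 0.5 * x + shift + rng.normal(scale=0.5, size=x.size)

# Route 1: OLS with a full set of district dummies.
D = (district[:, None] == np.arange(n_districts)).astype(float)
beta_dummy = np.linalg.lstsq(np.column_stack([x, D]), y, rcond=None)[0][0]

# Route 2: demean y and x within districts (the "within" transformation).
def demean(v, g):
    return v - (np.bincount(g, weights=v) / np.bincount(g))[g]

xd, yd = demean(x, district), demean(y, district)
beta_within = (xd @ yd) / (xd @ xd)
# Both routes recover the same slope, close to the true 0.5.
```

The design choice matters because fixed effects absorb all district-level confounders (observed or not), whereas the HLM random effect treats them as draws from a distribution; table 3.9 reports both so the reader can see how sensitive the coefficients are to that assumption.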
Finally, S2 students who report boarding at the school score about 0.25 standard deviations higher on average than nonboarders.

The results on school characteristics also reveal a number of significant relationships. As expected, rural students score significantly lower than urban students, by between 0.09 standard deviations (numeracy) and 0.24 standard deviations (literacy) on average. Private secondary school students also score significantly lower than their public counterparts in S2, with gaps as large as 0.93–0.94 standard deviations in numeracy and biology; the smallest differential, in literacy, averages 0.70 standard deviations. These effects are very large, considering that the multivariate modeling takes into account differences in family SES and teacher variables between school types.

Students attending schools that participate in the delivery of the USE program performed significantly higher, with large effects on the order of 0.91–0.96 standard deviations. S2 students attending single-shift schools also performed significantly higher in numeracy and biology, but only in the HLM analysis. Those results will be further validated by the ongoing impact evaluation of the USE program, with a special focus on the double-shift policy reform and the public-private partnership in delivery.

In relation to teacher characteristics, there is some evidence that female teachers are more effective in numeracy and literacy, although the point estimates are not very large (0.10 standard deviations for numeracy and 0.07 for literacy under the FE model). Of importance to note is that this result could be due to the few female teachers observed in the numeracy teacher sample and may thus be driven by unmeasured variables that differ in schools that have female teachers.
Finally, the results show that student achievement is consistently higher (in all three subjects) when students study with teachers who have higher levels of content knowledge. The effect sizes are small but, on average, show that a one standard deviation increase in teacher content knowledge is associated with about a 0.05 standard deviation increase in student achievement. Given the small effect sizes, however, it cannot be concluded that teacher subject matter knowledge alone is a critical driver of student achievement in this sample. But taken together with the results for primary education, these findings support the argument that teachers play an important role in affecting student achievement. They also support the argument that one of the components of effective teaching is teachers' knowledge of the subject matter for which they are responsible.

This analysis can be expanded by focusing more on specific content areas. For example, the models summarized in tables 3.8 and 3.9 can be estimated by content and subcontent area with matching measures of teacher content knowledge in the specific content or subcontent areas. This approach makes it possible to test for specific areas of the curriculum where the teacher's content knowledge is more closely related to student achievement. One limitation of the aggregate measures is that the teacher content knowledge effect is portrayed as the same across all aspects of the curriculum, while in practice this may not be the case. Figures 3.15 and 3.16 provide some preliminary results from this line of analysis, focusing on the effect of the teacher's knowledge within the specific content and subcontent areas. Each bar represents an effect size, measured in standard deviations and taken from the regression models.
The results in figure 3.15, for example, show that the effects of teacher content knowledge in literacy are largest in P3 relative to the other grades. Also, among the different content areas within literacy, the effect sizes are not much different. Figure 3.16 digs even deeper by examining the relationship between teacher content knowledge and student achievement in the subcontent areas of S2 numeracy. The results show that the measure of the teacher's knowledge is marginally more important for student learning in geometry and set theory, while in areas such as statistics and transformations, the teacher's knowledge of the specific content does not appear to be as important. This line of analysis, however, may not be necessary for this general report.

[Figure 3.15  Summary of Teacher Knowledge Effect Sizes on Student Achievement, Literacy Content Areas in P3-P6-S2, Uganda NAPE, 2011. Bar chart of effect sizes in standard deviations for P3, P6, and S2 across literacy content areas: overall, writing, grammar, reading, and reading supplement. Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.]

[Figure 3.16  Summary of Teacher Knowledge Effect Sizes on Student Achievement, S2 Numeracy Content Areas, Uganda NAPE, 2011. Bar chart of effect sizes in standard deviations across S2 numeracy subcontent areas, including geometry, set theory, statistics, and transformations. Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.]

Linkages with the Primary Leaving Exams (PLE)
The analyses in figures 3.15 and 3.16 clearly indicate that teachers' content knowledge is high in literacy and numeracy, but their effectiveness is low.
The contributors to this pattern require further investigation beyond the scope of this work, although the immediate thinking points to teachers' pedagogical content knowledge levels, in light of the conceptual framework presented earlier, which guided the analytics in this chapter. In other words, although teachers have the content knowledge required to deliver the numeracy and literacy curriculum, they are not well grounded in the pedagogical content practices that would enable them to effectively transfer their knowledge to learners in the course of the teaching-learning process. Another likely contributor is limited contact time. Uganda has high teacher absenteeism, estimated at 21 percent in 2009/10 (a decline from 27 percent in 2007). This work would have benefited greatly from estimates of the variation in learning outcomes that arises from teacher absenteeism, but the NAPE data do not capture school-level dynamics in teacher attendance to enable this kind of analysis.

Initial steps have, however, been made in this work by applying the most recent 2009/10 Uganda National Panel Survey (UNPS) data, which provide a rich national sample survey of teacher attendance. Attempts to link this data set with NAPE by identifying overlapping schools in the two surveys proved futile because only seven schools overlapped. Nevertheless, the UNPS data capture the number of students who passed in grades 1–3 in the sampled primary schools in the two years before the survey (2007 and 2008). Drawing on the end-of-cycle (EOC) connection in chapter 2 of this report, the analysis presented in table 3.10 explores school-level factors, including teacher attendance, that are likely to predict school pass rates on the EOC or Primary Leaving Examination (PLE) at the primary level.
Using the 2008 data, which represent the most recent year before the survey, the results clearly indicate that teacher attendance significantly predicts the percentage of students at a school who pass in grades 1–3 on the EOC exam, other factors held constant. Ongoing efforts to reduce teacher absenteeism should thus be intensified.

Model 2 further indicates that (a) teacher attendance, (b) school size as measured by enrollment, (c) availability of toilets at the school, and (d) availability of first aid services in the school explain 13 percent of the variation in the proportion of students who pass the PLE in grades 1–3. Of importance to note is that the availability of toilets and first aid services remain significant predictors of student pass rates on the PLE even with the introduction of other explanatory variables (see models 3–6 in table 3.10). The interplay between school size and pass rates could be mediated through location externalities enjoyed by large schools rather than through any other factors. Large schools are located mainly in urban areas, which confer other benefits associated with urban environs, including an ability to attract better-qualified teachers and children with higher SES, along with a whole host of other factors. The availability of toilets and of first aid services is probably the most exciting finding in this analysis and clearly stresses the need to promote healthy and hygienic learning environments.
Hence, ongoing initiatives aimed at improving school sanitation facilities deserve greater traction, as do the passing, adequate resourcing, and effective implementation of a school health policy. Strategies in the draft school health plan are well aligned with the Child Friendly Schools (CFS) and the Focusing Resources on Effective School Health (FRESH) frameworks.

Table 3.10  Covariates of School Average Pass Rate in 2008, UNPS, 2009/10

Independent variable              Model 1        Model 2          Model 3         Model 4         Model 5          Model 6
Teacher attendance (%)            0.39** (2.13)  0.29** (1.97)    0.29 (1.61)     0.27 (1.31)     0.29 (1.34)      0.31 (1.40)
Total enrollment                  —              0.001*** (2.87)  0.001* (1.84)   0.001* (1.71)   0.001* (1.86)    0.0007 (0.82)
Availability of toilets           —              0.14*** (2.58)   0.13** (1.08)   0.09* (1.75)    0.07* (1.48)     0.03 (0.80)
Availability of first aid         —              0.17** (2.42)    0.18*** (2.66)  0.15*** (2.64)  0.16*** (2.55)   0.15*** (2.57)
Teacher education(a)
  Percentage with grade 5         —              —                0.04 (0.12)     0.05 (0.18)     0.05 (0.18)      –0.14 (–0.44)
  Percentage untrained            —              —                0.19 (0.36)     0.26 (0.05)     0.35 (0.52)      0.64 (0.95)
Provide textbooks                 —              —                —               0.03 (0.40)     0.04 (0.48)      0.07 (0.74)
Private secondary in community    —              —                —               0.40 (1.57)     0.45* (1.66)     0.39 (1.37)
School type (excluded: public)
  Private                         —              —                —               —               0.10 (1.00)      –0.04 (–0.36)
  NGO                             —              —                —               —               0.03 (0.05)      —
  Other                           —              —                —               —               –0.54** (–2.15)  –0.40 (–1.54)
School pass rate 2007             —              —                —               —               —                0.73*** (6.38)
Sample size                       302            299              299             281             283              273
Explained variance (R²)           0.03           0.13             0.12            0.20            0.22             0.30

Source: Uganda National Panel Survey, 2009/10.
Notes: The dependent variable is the percentage of students who passed the P7 exam. All models are based on weighted data. T-statistics are in parentheses.
a. Excluded category for teacher education: Grade 3.
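The notes to table 3.10 indicate that all models are estimated on weighted data. A minimal weighted least squares sketch follows; the data are simulated and the variable names are loosely modeled on the table's covariates, not taken from the UNPS files:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300                                        # illustrative sample size
attendance = rng.uniform(0.5, 1.0, size=n)     # hypothetical teacher attendance share
toilets = rng.integers(0, 2, size=n).astype(float)
weights = rng.uniform(0.5, 2.0, size=n)        # hypothetical survey weights

# Simulated pass rates with known coefficients (0.3 and 0.1 by construction).
pass_rate = 0.3 * attendance + 0.1 * toilets + rng.normal(scale=0.1, size=n)
X = np.column_stack([np.ones(n), attendance, toilets])

# Weighted least squares: beta = (X'WX)^-1 X'Wy ...
XtWX = X.T @ (weights[:, None] * X)
XtWy = X.T @ (weights * pass_rate)
beta = np.linalg.solve(XtWX, XtWy)

# ... which equals ordinary least squares on sqrt(weight)-scaled data.
sw = np.sqrt(weights)
beta_check = np.linalg.lstsq(sw[:, None] * X, sw * pass_rate, rcond=None)[0]
```

Weighting in this way makes the school-level estimates representative of the sampling design rather than of the raw sample, which is why the UNPS models apply it.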
Worth noting is that key inputs such as trained teachers and textbooks do not seem to matter; but, as indicated earlier, this analytical attempt with the UNPS data was exploratory and is meant primarily to highlight areas where additional scientific investigation could be useful. Tables D.7 and D.8 in appendix D further show results based on the 2007 pass rates for schools, together with the pooled estimates for 2007 and 2008.

In conclusion, student achievement levels are still very low, a finding that is, of course, consistent with the previous chapter of this report. Teacher achievement levels are generally high, but a more detailed review shows that more work remains in this area. Such work is especially needed given the expectation that effective teachers need to know the curriculum in great detail, meaning they should be able to answer all of the assessment and supplemental questions very easily.

There is some evidence linking preservice teacher training regimes with teachers' content knowledge. This finding is an important validation of training and a good reminder that training can affect teacher quality and capacity. But there are other influences on teacher subject matter knowledge. In addition, there is a significant link between teacher knowledge and student performance on standardized tests. This relationship is not very strong, but the result is still important because it validates the emphasis of this analysis on teachers and, to a lesser degree, teacher knowledge. At the very least, the results highlight one aspect of effective teaching, and there is an interesting policy chain that links preservice training with teacher knowledge and teacher knowledge with student achievement.

Note
1. The number of items is 24 rather than the 25 that were created because one of the items (number 24) had an error and could not be used, as stipulated in the IRT analysis.
CHAPTER 4

Discussion and Suggestions on Next Steps

A number of issues that deserve attention emerge from the analytical findings presented in chapters 2 and 3 of this report if the education quality agenda for Uganda is to be pursued effectively. These issues are presented in the two subsections that follow: literacy and numeracy, then biology.

Emerging Issues in Literacy and Numeracy Achievement
Student achievement levels in English literacy and numeracy at the primary level are still low and fall short of expected levels. For example, the average achievement scores for literacy in primary 3 (P3) and primary 6 (P6) in 2010 were 47 percent and 40 percent, respectively. In addition, 60 percent of learners in P3 and about 70 percent in P6 scored below the 50 percent proficiency level for literacy in their respective grades. In numeracy, average achievement in P6 in 2010 was only 40 percent; worse still, 70 percent of learners in this grade performed below the 50 percent mark. This finding implies that the average student in Uganda cannot answer even half of the questions related to material that is supposed to have been taught in the classroom.

Such findings are consistent with other studies of learning achievement in these specific areas conducted in Uganda. For example, according to household surveys conducted by UWEZO Uganda, 9 out of every 10 children in P3 in more than 80 districts of the country could not read and understand an English story at the P2 level of difficulty, while 7 out of every 10 at the same grade could not solve numerical sums at the P2 level of difficulty. Earlier results from SACMEQ, in which 15 African countries participate, indicate that Uganda's performance in P6 in 2007 was below the SACMEQ average scores for reading (an average of 511.8 against Uganda's 478.7) and mathematics (an average of 509.5 against Uganda's 481.9).
In addition, declines were registered in SACMEQ scores for both reading and numeracy between 2000 and 2007; reading declined by 3.7 points and numeracy by 24.4.

Overall achievement in literacy at the lower secondary level is higher than that observed at the primary level, but a decline in performance is recorded (from an average of 60 percent in 2008 to 52 percent in 2010). The data also indicate an increase between 2008 and 2010 in the proportion of students who scored in the lower performance brackets of 0–25 percent and 26–50 percent. In numeracy, however, S2 students are not outperforming other grades (with an average score of only 40 percent correct for 2009 and 2010), which indicates that numeracy is not an easy area even beyond primary education.

In regard to literacy, students at the P3 and P6 levels are, on the whole, struggling to achieve the required proficiency levels within the curriculum content areas for literacy (reading comprehension, writing, and grammar), although students in P3 performed slightly better in writing than in reading comprehension. For students in P6, performance within the three content areas of reading comprehension, writing, and grammar is equally low (with an average score of 30–40 percent) and has been so in the past as well.

Literacy content areas that require learners to apply creativity and imagination are more problematic than those where learners respond to simple guided instructions. In the reading comprehension subcontent area, for example, results reveal that primary students perform best when given simple tasks, such as matching (average student performance is 80 percent in P3) and associating pictures (average student performance is 97 percent in P6). However, the worst performance was observed in the subcontent areas of recognizing and describing, with average student performance in P3 of only 20 percent and 10 percent, respectively.
Similarly, average attainment in the interpreting and sequencing of pictures (subcontent areas of reading comprehension) at the P6 level was only 22 percent and 12 percent, respectively, with more than 60 percent of students scoring in the 0–25 percent band. Within the writing subcontent area in P6, performance is highest in learning areas such as copying and writing patterns and lowest in highly demanding areas such as composition writing. This finding means that classroom instruction and learner support should aim not only at simple and foundational literacy skills but also at enabling learners to develop all literacy competencies within the subcontent areas that enhance their creativity and innovation. Creativity and critical thinking, together with lifelong learning, are central to the promotion of the Ugandan growth agenda.

In regard to the numeracy curriculum area, the most problematic subcontent areas at the P6 level, as assessed from learning outcomes, are geometry, measures, and fractions. Geometry appears to be a challenge even at S2, in addition to functions and transformations and statistics. Of importance to note is that geometry features as the least problematic area at the P3 level, which may signal the emergence of pedagogical challenges as students transition from the lower to the upper levels of basic education. Within the number operations subcontent area, subtraction is a problem area for P3, with average performance estimated at only 20 percent, while for P6, division emerges as an even larger problem area, with average performance estimated at 41 percent.
As was observed with literacy, student test scores are highest in the areas related to basic numerical concepts and operations. This conclusion, however, comes with one caveat: operations on numbers has the lowest average score of all P3 numeracy content areas.

The gaps between the best and worst student performance in literacy and numeracy, and between subcontent areas of these two subjects, are extremely wide, which implies inequitable mastery of the comprehensive range of skills structured in the curriculum. For example, within the reading comprehension subcontent area of literacy, the P3 student score in the best area, matching, was 80 percent, compared to a low of only 12 percent for describing, while at the P6 level, the average score for associating pictures was 97 percent, compared to an average of 12 percent for sequencing pictures. Again, within the numeracy subcontent areas, average performance among P6 students for operations on numbers was 70 percent, compared to a low of 18 percent for geometry. Likely contributors to this pattern deserve more investigation, with a special focus on (a) curriculum coverage, pacing, and sequencing across different subcontent areas, including the distribution of instruction time among content and subcontent areas; (b) teachers' pedagogical content knowledge; (c) availability of support materials; and (d) the success of the skills transfer process from teachers to learners. (Chapter 2 of this report explores teachers' content knowledge levels in an effort to shed more light on teacher competencies.)

There is a significantly large and persistent difference between urban and rural student outcomes in literacy and numeracy at the primary level, in favor of urban students, and the gap is wider in literacy. At the two grade levels of P3 and P6, the urban-rural gap is about 20 percentage points for literacy and about 10 percentage points for numeracy. Moreover, the pattern not only has persisted over the years but appears to be widening. For S2, however, the urban-rural gap in literacy and numeracy is very small, which suggests either that access to S2 in rural areas is restricted to relatively few students or that rural schools at this level are of roughly the same quality as urban lower secondary schools.
Private schools significantly outperform public primary schools in numeracy and literacy at the P3 and P6 levels, and the gap, though wide, has been stable over time. For example, in 2010, private school students in P3 and P6 scored an average of about 18 percentage points higher in numeracy and between 25 and 35 percentage points higher in literacy. This learning gap between private and government schools by grade level is likely attributable to very different service provision in the two types of schools in Uganda.

Boys are scoring significantly higher than girls in numeracy, while the margin between the two in literacy is very narrow and in favor of girls. There are, however, some meaningful gender differences by content area and grade level within literacy. For example, girls perform consistently better than boys in writing and grammar in all grades, but they perform consistently worse than boys in reading comprehension. For numeracy, the differences between boys and girls are small in earlier grades but increase in later grades. The data clearly demonstrate that the overall gender gap in numeracy does not appear to be closing, either over time within grades or across grade levels. This finding confirms a pattern that is not uncommon throughout the world: boys do better in numeracy than girls, and girls do better in literacy than boys.

The EOC reports are consistent with the results of national assessments, as indicated earlier. In numeracy, for example, the most pronounced areas of difficulty, which seem to recur nearly every year, are construction, geometry, angles, sets, and substitution at both the primary and lower secondary education levels, while for literacy, reading comprehension and composition writing were regularly identified as core challenges to improved EOC scores.

Achievement in Biology
Overall scores in biology are very low.
In two of the years examined (2008 and 2010), the average is below 30 percent; in 2009 the average did climb above 40 percent overall, while in 2011 results are much lower (only 25 percent). There is, however, significant variation in student achievement in biology across content areas. Students are struggling especially in the areas of diversity of living things, soil, and plant structures. Comparisons by type of student and school are mixed. For gender, there is a consistent, and at times substantial, advantage of boys over girls. However, differences by school location and type of school (public versus private) are not as significant. Overall, the main conclusion remains generally the same: the major challenge in this content area is lifting student achievement at all levels, in all schools, for all students.

Teacher Content Knowledge and Effectiveness
Teachers' content knowledge is high in the curriculum areas that they teach. Although the ideal expected performance for any teacher, when tested on items drawn from the curriculum that they teach, is 100 percent, Ugandan primary teachers' average of 84 percent in literacy and numeracy is considered high. Equally high are secondary teachers' scores of 83 percent in numeracy and 81 percent in literacy. This finding raises one fundamental question: why is student achievement in Uganda so low if teachers have such high levels of content knowledge for the grades they teach?

Content knowledge of S2 biology teachers, however, was observed to be very limited. It is in this subject area that the performance gap between teachers and learners was widest: three standard deviations. Despite the teachers being subject specialists, their performance on test items drawn from the S2 curriculum was only 65 percent, far below the expected standard. This very troubling result clearly indicates the limited content knowledge base of these teachers.
Discussion and Suggestions on Next Steps

A significantly wide gap exists between the performance of teachers and learners on similar test items drawn from the same curriculum. For example, the average P6 teacher in Uganda scores 100 percent higher than the average P6 student. Even in subjects with poor performance, such as biology, teacher performance was 65 percent, compared to just 25 percent for learners. Although teachers are expected to outperform their students, such a wide gap between the two scores points to low teacher effectiveness and immediately calls into question not only teachers' pedagogical knowledge but also their pedagogical content knowledge (PCK). Teachers' PCK is what should enable the effective transfer of their high content knowledge to students in the teaching-learning transactional process. In addition, teachers and learners share the same areas of relative strength and weakness in numeracy and biology (classification, soil properties, and plants), which underscores the link between teachers' content knowledge and their ability to deliver in the classroom.

Female primary teachers scored lower than their male counterparts, while teachers with higher levels of academic preparation (as reflected in teacher qualifications) recorded higher content knowledge than others. Surprisingly, primary teachers' experience does not enhance their content knowledge in any way, a finding that challenges the assumption that experience builds one's knowledge and skills.

The significantly lower content knowledge of private secondary school teachers also deserves attention. Secondary school enrollment in Uganda is split almost evenly between private and public schools. The limited content knowledge of private school teachers is therefore likely to affect a substantial share of the secondary school population, thereby compromising the realization of quality outcomes at this level.
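The teacher-learner gap described above can be expressed in standard deviation units of the learner score distribution. A minimal sketch of that calculation, using the biology averages from this chapter (65 percent for teachers, 25 percent for learners) and a hypothetical learner standard deviation, since the exact figure behind the "three standard deviations" statement is not restated here:

```python
def gap_in_sd_units(teacher_mean: float, learner_mean: float, learner_sd: float) -> float:
    """Express the teacher-learner score gap in units of the learner standard deviation."""
    return (teacher_mean - learner_mean) / learner_sd

# Biology example from the text: teachers average 65%, learners 25%.
# The learner SD of 13.3 is hypothetical, chosen so the gap comes out
# near the three standard deviations reported in the chapter.
biology_gap = gap_in_sd_units(65.0, 25.0, 13.3)
```

Expressing gaps in standard deviation units, rather than raw percentage points, makes them comparable across tests with different score spreads.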
Because practically all teachers in Uganda attend the same pre-service teacher training colleges and universities at the secondary level, it is not clear why those with limited content knowledge end up in private secondary schools. What does seem to be common knowledge is the heavy reliance of private secondary schools on part-time teachers. Furthermore, the lower performance of private secondary schools relative to public ones comes as a surprise, especially in light of the extremely wide and persistent gap in favor of private schools at the primary level. This finding signals a need to look into the teaching-learning environments of private secondary schools in Uganda if quality enhancement efforts are to move in tandem across the two streams. How such schools are financed is, however, another challenge altogether.

In regard to covariates of students' learning outcomes, the results revealed the following: (a) female teachers are more effective in numeracy and literacy, although the point estimates are not very large; (b) teacher qualifications matter, in that student achievement is significantly lower in the small number of classrooms where teachers report having only a grade 5 primary qualification; (c) teacher experience shows no meaningful pattern linking it with student achievement; and (d) student achievement is consistently higher, in all three subjects, when students study with teachers who have higher levels of content knowledge. The effect sizes are small: on average, a standard deviation increase in teacher content knowledge is associated with about a 0.05 standard deviation increase in student achievement. Nevertheless, taken together, the results reinforce the argument that teachers play an important role in affecting student achievement, and that knowledge of the subject matter for which they are responsible is one component of effective teaching.
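The 0.05 figure above is a standardized effect size: the slope obtained after converting both teacher content knowledge and student achievement to z-scores. A minimal sketch of that computation on synthetic data (the report's own estimates come from models with additional covariates; the values below are purely illustrative):

```python
from statistics import mean, pstdev

def standardized_effect(x, y):
    """Slope of z-scored y on z-scored x.

    For a simple bivariate regression this equals the Pearson
    correlation between x and y, since var(zx) == 1.
    """
    mx, my = mean(x), mean(y)
    sx, sy = pstdev(x), pstdev(y)
    zx = [(v - mx) / sx for v in x]
    zy = [(v - my) / sy for v in y]
    # OLS slope of zy on zx is cov(zx, zy) / var(zx); var(zx) is 1.
    return mean(a * b for a, b in zip(zx, zy))

# Illustrative only: teacher content knowledge scores and class mean achievement.
teacher_ck = [70, 75, 80, 85, 90]
student_ach = [40, 42, 41, 44, 43]
effect = standardized_effect(teacher_ck, student_ach)
```

A value of 0.05 would mean that classrooms whose teacher scores one standard deviation above average are predicted, all else equal, to score 0.05 standard deviations higher.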
On the whole, the results relating to teacher content knowledge raise more questions than answers. A key challenge for the education system remains how to transform existing teachers into effective teachers. Their high content knowledge is a very good starting point, although this holds only in literacy and numeracy, and even there it is not evenly distributed across content areas. As discussed earlier with respect to the conceptual framework, content knowledge alone is not a magic bullet for ensuring teaching quality. But when teachers on average miss almost 20 percent of the questions on student exams, the knowledge gap certainly raises questions about just how effective they can be. The near-total absence of students scoring above 75 percent (in any subject) indirectly reinforces this contention, because there are apparently limits on how much of the curriculum their teachers are familiar with. The work in this report also leaves ample space for future research to build on these results. The next step currently being undertaken by the Teacher Education Department, which is a good start, is an exploration of the pedagogical practices used by teachers in schools, together with other binding constraints on their performance.

Suggestions for Next Steps
Drawing on the preceding discussion, the following are suggested next steps for the government to consider:
• Refocus ongoing teacher development efforts on strategies that improve teacher effectiveness in the classroom through ongoing pre- and in-service training programs. Intensifying pedagogy and enhancing teacher content knowledge in the identified problematic curriculum areas are evident imperatives. The recently launched investigation of teacher pedagogical practices will greatly inform such improvements.
• Conduct further investigation of the following areas: (a) a situational analysis of teacher effectiveness policies based on global norms, so that missing or weak links in teacher policies or strategies are identified (application of the World Bank's Systems Approach for Better Education Results [SABER-Teachers] could be considered); (b) curriculum coverage in the classroom, for a better understanding of the extent to which the official curriculum is covered by learners, including issues related to time allocation, pacing, and sequencing; and (c) challenges with science instruction, including teacher preparation, drawing on the biology results.
• Strengthen ongoing interventions to improve school-level sanitation and hygiene, including parliamentary approval of resourcing and implementing a school health policy.
• Improve National Assessment of Progress in Education (NAPE) systems data to enable effective tracking of learner performance over time.
• Improve mainstream teacher assessments to facilitate regular progress monitoring of teacher competencies, and improve the manner in which EOC examiners' reports are prepared to enable more strategic informational feedback into the education system.

APPENDIX A
P3 Literacy Test Blueprints 2006

Table A.1  Relative Weights Allocated to Each Skill Area of 2006 P3 Written Literacy in English

Skill area               Subskill area        Item             Score  Subtotal
Reading comprehension    Picture              Q: 30–35         6      15
                         Conversation         Q: 23–24         2
                         Story (prose)        Q: 5–8           4
                         Calendar             Q: 36–38         3
Writing                  Drawing and copying  Q: 46–47         2      10
                         Guided composition   Q: 29            4
                         Picture story        Q: 39–42         4
Grammar                  Alphabet             Q: 1–2           2      25
                         Tenses               Q: 25–28, 43     5
                         Vocabulary           Q: 20–22, 44–45  5
                         Plurals              Q: 3–4           2
                         Parts of speech      Q: 9–19          11
Total                                                          50

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006.
Table A.2  Relative Weights Allocated to Each Skill Area of 2007 P3 Written Literacy in English

Skill area               Subskill area        Item             Score  Subtotal
Reading                  Picture              Q: 30–35         6      15
                         Conversation         Q: 23–24         2
                         Story (prose)        Q: 5–8           4
                         Tabular information  Q: 36–38         3
Writing                  Drawing and copying  Q: 46–47         2      10
                         Guided composition   Q: 29            4
                         Picture story        Q: 39–42         4
Grammar                  Alphabet             Q: 1–2           2      25
                         Tenses               Q: 25–28, 43     5
                         Vocabulary           Q: 20–22, 44–45  5
                         Plurals              Q: 3–4           2
                         Parts of speech      Q: 9–19          11
Total                                                          50

Source: National Assessment of Progress in Education (NAPE), Uganda, 2007.

Table A.3  Weight Allocated to Each Skill Area of 2007 P3 Oral Reading

Subskill                      Items      Score
Reading the English alphabet  Q: 1–3     3
Reading words                 Q: 4–9     6
Reading sentences             Q: 10–12   15
Reading a story               Q: 13–16   16
Total                                    40

Source: National Assessment of Progress in Education (NAPE), Uganda, 2007.

Table A.4  Weights Allocated to Each Skill Area of P3 Literacy in English, 2008

Skill areas              Subskill area        Items                  Score  Total (score)
Grammar                  Alphabet             Q: 1–2                 2      15
                         Tenses               Q: 23–24               2
                         Vocabulary           Q: 3–4                 2
                         Opposites            Q: 36–37               2
                         Articles             Q: 20                  1
                         Parts of speech      Q: 6–7, 18–19, 21–22   6
Writing                  Drawing and copying  Q: 5, 8                4      15
                         Guided composition   Q: 32                  4
                         Picture story        Q: 28–31               4
                         Spelling             Q: 9–10                3
Reading comprehension    Picture              Q: 15–17, 38–42        10     20
                         Conversation         Q: 33–35               3
                         Story/prose          Q: 11–14               4
                         Time table           Q: 25–27               3
Total                                                                50

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008.

Table A.5  Weight Allocated to Each Skill Area of P3 Reading

Subskill                      Items      Score
Reading the English alphabet  Q: 5–9     5
Reading words                 Q: 10–19   10
Reading sentences             Q: 20–22   15
Reading a story               Q: 23–26   16
Listening comprehension       Q: 1–4     4
Total                                    50

Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.
Table A.6  Weight Allocated to Each Skill Area of P3 Literacy in English, 2009

Skill areas              Subskill area         Items        Score  Total (score)
Reading comprehension    Describing            Q: 15, 17    10     47
                         Recognizing           Q: 5         4
                         Comprehension         Q: 22        5
                         Identifying           Q: 16        6
                         Matching              Q: 1, 8, 11  9
                         Completing pictures   Q: 3         4
                         Completing words      Q: 9         4
                         Completing sentences  Q: 19        5
Writing                  Naming                Q: 18, 21    10     53
                         Drawing               Q: 10        6
                         Copying               Q: 6         5
                         Writing letters       Q: 4         4
                         Writing words         Q: 12, 14    9
                         Writing patterns      Q: 2         4
                         Writing sentences     Q: 7, 20     10
                         Writing stories       Q: 13        5
All                                                         100

Source: National Assessment of Progress in Education (NAPE), Uganda, 2009.

Table A.7  Competences of P3 Literacy (Reading Comprehension), 2010

Competences            Items     Score
Describing             15, 17    14
Recognizing            5         6
Comprehension          22        10
Identifying            16        9
Matching               1, 8, 11  9
Completing pictures    3         4
Completing words       14        6
Completing sentences   19        5
Subtotal                         63

Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.

Table A.8  Competencies of P3 Literacy (Writing)

Competencies           Items     Score
Naming                 18, 21    13
Drawing                10        6
Copying words          6         5
Writing letters        4         6
Writing words          9, 12     8
Writing patterns       2         6
Writing sentences      7, 20     14
Copying story          13        5
Subtotal                         63
Total                            126

Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.
APPENDIX B
Summary Tables for English Literacy

Table B.1  P3 English Literacy Overall Score Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                             2006      2007      2008      2009       2010       Overall average
National average                     26.9      30.6      34.7      43.4       47.4       40.2
  Standard deviation                 23.9      23.9      20.8      21.6       20.5       —
  Number of questions (points)       25 (25)   25 (25)   31 (35)   22 (100)   22 (100)   —
Boys                                 26.4      29.4      34.5      42.9       47.6       36.2
Girls                                27.5      31.8      34.9      43.6       47.2       37.0
  Difference                         +1.1      +2.4      +0.4      +0.7       −0.4       +0.8
Rural                                23.5      27.4      31.6      40.1       45.3       33.6
Urban                                45.8      49.5      50.0      57.7       62.9       52.9
  Difference                         +22.3     +22.1     +19.4     +17.6      +17.6      +19.3
Government                           24.2      27.8      32.6      41.2       45.3       34.3
Private                              57.1      61.9      63.3      72.7       75.3       65.6
  Difference                         +32.9     +34.1     +30.7     +31.5      +30.0      +31.3
Central                              33.6      39.9      43.0      51.3       57.3       44.6
East                                 22.8      20.1      27.9      37.1       39.9       29.6
North                                15.6      19.0      26.0      33.7       40.6       27.4
West                                 35.1      39.6      41.0      50.9       53.2       44.1
Kampala                              63.2      64.4      59.6      74.5       81.4       67.1
  High/low gap                       +47.6     +45.4     +33.6     +40.8      +41.5      +39.7

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
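The Overall Average column in Table B.1 (and in the tables that follow) is a mean across survey years with weights that correct for imbalanced sampling. A minimal sketch of such a weights-adjusted mean, using hypothetical year weights since the survey's actual sampling weights are not reproduced in this volume:

```python
def weighted_mean(values, weights):
    """Weighted mean: sum(w_i * x_i) / sum(w_i)."""
    if len(values) != len(weights):
        raise ValueError("values and weights must be the same length")
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# P3 literacy national averages, 2006-10 (Table B.1), combined with
# hypothetical year weights standing in for the survey's
# sampling-correction weights.
yearly_averages = [26.9, 30.6, 34.7, 43.4, 47.4]
year_weights = [1.0, 1.0, 1.0, 1.2, 1.2]
overall = weighted_mean(yearly_averages, year_weights)
```

Because sample sizes differ across years, such a weighted mean can sit above or below the simple average of the yearly figures, which is why the printed Overall Average need not equal the unweighted mean of the five columns.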
Table B.2  P6 English Literacy Overall Score Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                             2006       2007        2008       2009       2010       Overall average
National average                     31.9       41.6        34.5       35.3       39.7       36.2
  Standard deviation                 22.1       23.6        20.5       21.5       18.5       —
  Number of questions (points)       92 (101)   113 (141)   94 (101)   72 (101)   84 (110)   —
Boys                                 32.0       41.3        34.4       35.0       39.8       36.5
Girls                                31.7       41.9        34.5       35.8       39.6       36.7
  Difference                         –0.3       +0.6        +0.1       +0.8       –0.2       +0.2
Rural                                28.4       38.5        31.3       32.5       37.1       33.6
Urban                                49.6       57.9        51.1       50.1       58.2       53.1
  Difference                         +21.2      +19.4       +20.8      +17.6      +21.1      +19.5
Government                           29.6       39.3        32.5       34.1       37.9       34.7
Private                              57.4       65.3        62.2       55.3       64.0       61.0
  Difference                         +27.8      +26.0       +29.7      +21.2      +26.1      +25.3
Central                              31.4       42.1        33.7       37.1       42.2       37.1
East                                 28.5       38.5        31.9       31.9       35.0       33.1
North                                25.5       34.5        29.1       31.1       36.5       31.5
West                                 40.6       50.0        41.6       42.0       43.9       43.6
Kampala                              51.8       61.0        53.2       52.8       73.6       57.3
  High/low gap                       +26.3      +26.5       +24.1      +21.7      +38.6      +26.8

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table B.3  S2 English Literacy Overall Score Summary, Uganda NAPE, 2008–10
Percent correct (0–100%)

Category                             2008       2009       2010       Overall average
National average                     61.3       56.9       52.4       56.9
  Standard deviation                 16.5       14.8       14.3       —
  Number of questions (points)       96 (101)   66 (100)   87 (107)   —
Boys                                 60.8       56.6       51.8       56.4
Girls                                61.9       57.4       53.1       57.4
  Difference                         –1.1       –0.8       –1.3       –1.0
Rural                                59.2       55.4       50.6       55.1
Urban                                65.8       59.2       55.9       60.1
  Difference                         +6.6       +5.8       +5.3       +5.0
Government                           61.7       58.4       51.9       57.3
Private                              61.0       55.8       52.7       56.5
  Difference                         –0.7       –2.6       +0.8       –0.8
Central                              63.7       56.3       52.8       57.8
East                                 58.6       55.4       50.0       54.6
North                                58.6       55.0       49.6       54.7
West                                 60.1       58.4       53.9       57.4
Kampala                              73.4       68.0       63.0       68.4
  High/low gap                       +14.8      +13.0      +13.4      +13.8

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2008–10, with weights to correct for imbalanced sampling across years and differences in the percentage of private schools. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table B.4  P3 Reading Comprehension Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                             2006      2007      2008      2009      2010      Overall average
National average                     19.7      26.7      26.3      42.1      45.3      32.0
  Standard deviation                 22.1      22.8      20.8      23.3      20.4      —
  Number of questions (points)       15 (15)   15 (15)   18 (20)   11 (44)   11 (47)   —
Boys                                 19.4      25.9      26.1      42.1      46.0      31.9
Girls                                20.1      27.6      26.3      42.1      44.6      32.1
  Difference                         +0.7      +1.7      +0.2      0.0       –1.4      +0.2
Rural                                16.7      23.9      23.1      38.8      43.3      29.2
Urban                                36.6      44.0      42.3      57.2      59.9      47.8
  Difference                         +20.0     +20.1     +19.2     +18.4     +16.6     +18.6
Government                           17.3      24.1      24.1      39.9      43.3      29.8
Private                              46.9      56.2      56.1      73.7      71.4      60.2
  Difference                         +29.6     +32.1     +32.0     +33.8     +28.1     +30.4
Central                              24.0      34.1      32.2      48.7      53.7      38.1
East                                 15.1      19.5      20.3      36.2      38.9      26.0
North                                11.5      17.8      19.4      33.8      40.6      25.0
West                                 25.7      33.7      31.2      48.8      48.7      37.9
Kampala                              59.7      57.9      53.9      76.9      77.6      61.9
  High/low gap                       +48.2     +40.1     +34.5     +43.1     +38.7     +36.9

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table B.5  P6 Reading Comprehension Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                             2006      2007      2008      2009      2010      Overall average
National average                     46.8      47.6      40.4      40.8      43.1      43.7
  Standard deviation                 26.2      28.7      26.9      26.2      17.3      —
  Number of questions (points)       39 (40)   37 (40)   39 (40)   39 (40)   38 (50)   —
Boys                                 47.8      48.3      41.3      41.6      43.5      44.5
Girls                                45.8      46.9      39.5      40.0      42.6      43.0
  Difference                         –2.0      –1.4      –1.8      –1.6      –0.9      –1.5
Rural                                42.9      44.0      36.7      37.3      40.8      43.0
Urban                                66.6      66.8      59.5      59.2      59.7      62.5
  Difference                         +23.7     +22.8     +22.8     +21.9     +28.9     +19.5
Government                           44.4      45.0      38.2      39.4      41.5      41.7
Private                              73.7      74.5      70.8      61.9      64.7      69.7
  Difference                         +29.3     +29.5     +32.6     +22.5     +24.2     +28.0
Central                              47.3      48.9      39.6      43.5      44.8      44.9
East                                 41.8      44.2      36.9      35.7      39.1      39.5
North                                40.0      38.8      36.0      37.9      40.9      38.8
West                                 57.1      56.8      47.2      47.3      46.2      50.7
Kampala                              67.4      71.0      61.5      60.8      74.9      67.2
  High/low gap                       +27.4     +32.2     +25.5     +25.1     +35.8     +28.4

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table B.6  S2 English Reading Comprehension Summary, Uganda NAPE, 2008–10
Percent correct (0–100%)

Category                             2008      2009      2010      Overall average
National average                     61.9      71.7      54.1      62.6
  Standard deviation                 18.3      13.7      15.9      —
  Number of questions (points)       35 (35)   35 (35)   28 (38)   —
Boys                                 62.0      71.9      54.0      62.7
Girls                                61.9      71.5      54.2      62.4
  Difference                         +0.1      +0.4      –0.2      +0.3
Rural                                59.4      70.3      52.3      60.3
Urban                                67.4      73.8      57.6      66.7
  Difference                         +8.0      +3.5      +5.3      +6.4
Government                           62.0      73.2      54.0      63.1
Private                              61.9      70.5      54.1      62.2
  Difference                         –0.1      –2.7      +0.1      –0.9
Central                              65.1      71.3      55.4      63.4
East                                 58.2      70.6      50.8      60.5
North                                60.7      70.5      52.5      62.0
West                                 59.4      72.1      55.2      61.8
Kampala                              76.4      82.0      63.8      74.6
  High/low gap                       +18.2     +11.5     +13.0     +14.1

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2008–10, with weights to correct for imbalanced sampling across years and differences in the percentage of private schools. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table B.7  P3 Reading Comprehension Detailed Summary, Uganda NAPE, 2009/10
Percent correct (0–100%)

                     Describing     Recognizing    Comprehension  Identifying    Matching       Completing Pictures/Words  Completing Sentences
Category             2009    2010   2009    2010   2009    2010   2009    2010   2009    2010   2009    2010               2009    2010
National average     11.3    12.7   21.8    17.4   32.2    43.2   36.8    59.9   82.6    81.0   44.8    41.0               33.8    47.6
  Standard
  deviation          22.8    21.7   33.2    29.3   35.9    38.1   41.4    40.6   22.1    22.3   33.6    25.6               37.8    39.1
  Number of
  questions (points) 2 (10)  2 (10) 1 (4)   1 (4)  1 (5)   1 (5)  1 (6)   1 (6)  3 (9)   3 (9)  2 (8)   2 (9)              1 (5)   1 (5)
Boys                 10.9    12.5   22.1    17.3   31.0    43.0   38.0    63.2   83.0    81.7   45.2    41.7               32.9    47.8
Girls                11.7    13.0   21.6    17.4   33.5    43.4   35.5    56.4   82.2    80.2   44.4    40.3               34.6    47.5
  Difference         +0.8    +0.5   –0.5    +0.1   +2.5    +0.4   –2.5    –6.8   –0.8    –1.5   –0.8    –1.4               +1.7    –0.3
Rural                8.4     10.4   17.7    14.7   28.1    40.3   33.8    58.7   80.7    79.8   41.8    38.8               29.9    44.9
Urban                24.6    30.1   40.9    36.7   51.3    64.4   50.4    68.3   91.3    89.8   58.5    57.1               51.6    66.9
  Difference         +16.2   +19.7  +23.2   +20.0  +23.2   +24.1  +16.6   +9.6   +10.6   +10.0  +16.7   +18.3              +21.7   +21.9
Government           8.9     10.2   19.1    14.5   29.3    40.4   35.2    59.0   81.5    79.9   42.9    38.8               31.2    45.1
Private              45.6    46.9   61.1    55.4   74.5    79.6   59.4    72.0   97.7    95.6   72.3    70.5               70.7    81.0
  Difference         +36.7   +36.7  +42.0   +40.9  +45.2   +39.2  +24.2   +13.0  +16.2   +15.7  +29.4   +31.7              +39.5   +35.9

Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.
Notes: Averages represent percentage correct (0–100%). Tests of significance are carried out across years (comparing 2010 with 2009) for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Table B.8  P6 Reading Comprehension Detailed Summary, Uganda NAPE, 2010
Percent correct (0–100%); content areas across columns

Category                        Tell time  Associate Pictures  Describe Activities  Story   Calendar  Poem     Sequence Pictures  Interpret Cartoons
National average                33.2       97.3                44.2                 33.0    56.7      52.1     12.0               21.5
  Standard deviation            36.8       13.0                32.8                 23.7    30.7      24.7     20.5               25.8
  Number of questions (points)  2 (2)      2 (6)               1 (4)                9 (9)   5 (5)     9 (10)   4 (8)              6 (6)
Boys                            40.3       97.3                43.7                 33.1    58.9      51.8     11.8               22.1
Girls                           25.7       97.4                44.7                 32.9    54.4      52.5     12.2               20.8
  Difference                    –14.6      +0.1                +1.0                 –0.2    –4.5      +0.7     +0.4               –1.3
Rural                           31.3       97.1                41.0                 30.5    54.3      49.7     9.6                18.0
Urban                           46.6       99.2                66.7                 50.6    74.2      69.3     29.0               46.6
  Difference                    +15.3      +2.1                +25.7                +20.1   +19.9     +19.6    +19.4              +28.6
Government                      31.6       97.3                42.2                 31.4    54.9      50.5     10.0               19.1
Private                         53.8       99.0                70.6                 53.8    80.8      73.3     38.2               53.7
  Difference                    +22.2      +1.7                +28.4                +22.4   +25.9     +22.8    +28.2              +34.6
Central                         32.3       98.7                43.5                 32.4    63.4      53.2     17.5               21.6
East                            29.3       97.3                38.2                 29.7    50.7      46.8     8.0                17.5
North                           33.1       95.7                43.3                 32.4    50.8      49.9     9.8                18.2
West                            35.9       98.4                48.9                 35.4    62.8      57.1     11.2               26.0
Kampala                         60.6       99.3                85.3                 61.5    86.3      82.0     60.1               67.0

Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.
Notes: Averages represent percentage correct (0–100%). Tests of significance are carried out within years (see Difference) between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Table B.9  S2 Reading Comprehension Detailed Summary, Uganda NAPE, 2008–10
Percent correct (0–100%)

                                Passage                        Poetry                      Dialog
Category                        2008      2009      2010       2008    2009    2010       2008    2009    2010
National average                70.4      73.3      66.4       54.6    59.2    46.7       66.5    73.7    39.9
  Standard deviation            19.6      16.0      23.3       26.6    25.2    24.1       24.0    18.1    19.9
  Number of questions (points)  10 (10)   10 (10)   7 (11)     7 (7)   7 (7)   6 (8)      8 (8)   8 (8)   6 (8)
Boys                            70.4      73.4      64.9       54.4    58.9    47.3       65.3    73.1    39.4
Girls                           70.5      73.0      68.0       54.9    59.6    46.0       67.8    74.4    40.4
  Difference                    +0.1      –0.4      +3.1       +0.5    +0.7    –1.3       +2.5    +1.3    +1.0
Rural                           68.1      71.2      65.6       51.8    57.3    44.2       64.4    72.4    37.1
Urban                           75.5      76.3      67.9       60.7    62.1    51.7       70.8    75.6    45.3
  Difference                    +7.4      +5.1      +2.3       +8.9    +4.8    +7.5       +6.4    +3.2    +8.2
Government                      70.7      74.1      65.9       54.9    61.1    46.8       66.9    75.0    39.9
Private                         70.2      72.6      66.7       54.4    57.7    46.6       66.1    72.6    39.8
  Difference                    –0.5      –1.5      +0.8       –0.5    –3.4    –0.2       –0.8    –2.4    –0.1

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.
Notes: Averages represent percentage correct (0–100%). Tests of significance are carried out across years for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Table B.10  P3 Writing Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                             2006      2007      2008      2009      2010      Overall average
National average                     37.7      36.3      45.9      46.7      50.7      43.5
  Standard deviation                 30.7      29.5      24.4      23.2      22.2      —
  Number of questions (points)       10 (10)   10 (10)   13 (15)   11 (53)   11 (52)   —
Boys                                 36.8      34.6      45.6      46.0      50.5      42.7
Girls                                38.6      38.1      46.2      47.4      50.9      44.2
  Difference                         +1.8      +3.5      +0.6      +1.4      +0.4      +1.5
Rural                                33.8      32.8      43.1      43.5      48.5      40.3
Urban                                59.5      57.7      60.3      61.3      66.3      60.8
  Difference                         +25.7     +25.0     +17.2     +17.7     +17.8     +20.5
Government                           34.6      33.3      44.0      44.6      48.5      41.0
Private                              72.3      70.5      72.9      76.0      79.1      74.0
  Difference                         +37.6     +37.2     +29.0     +31.3     +30.6     +33.0
Central                              48.0      48.7      57.4      56.4      61.8      54.3
East                                 29.4      26.0      37.9      40.0      42.3      35.1
North                                21.9      20.8      34.8      35.5      42.1      31.3
West                                 49.3      48.3      54.0      55.5      58.3      53.2
Kampala                              79.1      74.1      67.2      76.6      84.8      75.4
  High/low gap                       +57.2     +53.3     +32.4     +41.1     +42.7     +44.1

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Table B.11  P6 Writing Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                             2006      2007      2008      2009      2010      Overall average
National average                     14.1      29.3      27.8      28.8      36.3      27.2
  Standard deviation                 17.3      24.4      22.7      21.9      20.7      —
  Number of questions (points)       23 (32)   19 (28)   15 (21)   3 (30)    27 (40)   —
Boys                                 13.6      28.2      26.8      27.1      36.2      26.4
Girls                                14.6      30.3      28.9      30.6      36.4      28.1
  Difference                         +1.0      +2.1      +2.1      +3.5      +0.2      +1.7
Rural                                11.5      26.2      24.8      26.4      33.5      24.6
Urban                                27.1      45.6      43.6      41.3      56.5      42.0
  Difference                         +15.6     +19.4     +18.8     +14.9     +23.0     +17.4
Government                           12.3      26.9      25.9      27.3      34.2      25.4
Private                              33.6      53.1      54.4      50.7      63.4      50.5
  Difference                         +21.3     +26.2     +28.5     +23.4     +29.2     +25.1
Central                              12.7      28.2      25.7      29.2      40.1      26.7
East                                 12.2      27.8      24.6      26.0      30.7      24.2
North                                9.6       22.3      21.9      24.1      32.5      22.5
West                                 20.6      37.1      37.2      35.9      40.9      34.6
Kampala                              28.5      47.6      49.6      47.0      74.6      46.3
  High/low gap                       +18.9     +25.3     +27.7     +22.9     +33.9     +23.8

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Table B.12  S2 English Writing Summary, Uganda NAPE, 2008–10
Percent correct (0–100%)

Category                             2008      2009      2010      Overall average
National average                     60.0      47.8      49.8      52.5
  Standard deviation                 21.2      20.3      17.6      —
  Number of questions (points)       31 (36)   31 (35)   29 (36)   —
Boys                                 58.8      47.1      48.5      51.4
Girls                                61.4      48.6      51.3      53.8
  Difference                         +2.6      +1.5      +2.8      +2.4
Rural                                57.8      46.1      48.1      50.9
Urban                                64.6      50.2      53.3      55.5
  Difference                         +6.8      +4.1      +5.2      +4.6
Government                           60.9      49.0      48.7      52.9
Private                              59.3      46.7      50.7      52.2
  Difference                         –1.6      –2.3      –2.0      –0.7
Central                              62.1      47.7      49.6      53.6
East                                 57.2      45.3      47.2      49.4
North                                57.1      46.4      46.7      50.2
West                                 59.6      49.6      52.5      53.9
Kampala                              71.1      59.0      60.5      63.7
  High/low gap                       +14.0     +13.7     +13.8     +14.3

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2008–10, with weights to correct for imbalanced sampling across years and differences in the percentage of private schools. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Summary Tables for English Literacy

Table B.13  P3 Writing Detailed Summary, Uganda NAPE, 2009/10
Percent correct (0–100%)

                                  Naming         Drawing        Copying words  Writing letters  Writing patterns  Writing sentences  Writing story
Category                          2009    2010   2009    2010   2009    2010   2009    2010     2009    2010      2009    2010       2009    2010
National average                  35.1    24.6   44.6    63.1   78.2    82.1   42.5    51.9     73.6    73.8      36.5    33.5       50.6    69.4
  Standard deviation              30.2    27.3   41.4    36.4   36.1    33.0   25.7    27.3     31.0    28.5      31.5    33.4       35.7    31.1
  Number of questions (points)    2 (10)  2 (10) 1 (6)   1 (6)  1 (5)   1 (5)  3 (13)  3 (12)   1 (4)   1 (4)     2 (10)  2 (10)     1 (5)   1 (5)
Boys                              33.7    24.2   44.3    62.5   77.4    82.4   42.0    51.9     74.1    74.3      36.4    33.5       48.3    67.9
Girls                             36.5    25.0   45.0    63.7   79.1    81.8   43.0    51.9     73.0    73.3      36.7    33.5       52.9    71.0
  Difference                      +2.8    +0.8   +0.7    +1.2   +1.7    –0.6   +1.0    0.0      –1.1    –1.0      +0.3    0.0        +4.6    +3.1
Rural                             30.7    21.7   40.2    61.2   76.7    81.3   39.5    49.6     72.4    73.1      33.2    30.5       47.7    67.9
Urban                             55.1    45.9   64.9    76.7   84.9    88.3   56.1    68.3     78.9    78.8      52.0    55.2       68.3    80.3
  Difference                      +24.4   +24.2  +24.7   +15.5  +8.2    +7.0   +16.6   +18.7    +6.5    +5.7      +19.8   +24.7      +20.6   +12.4
Government                        32.3    21.5   41.8    61.3   77.4    81.2   40.4    49.6     73.0    73.3      34.4    30.4       48.9    68.2
Private                           74.7    65.7   85.8    86.1   89.6    93.6   72.5    82.0     82.1    81.0      67.4    73.8       74.4    85.3
  Difference                      +42.4   +44.2  +44.0   +24.8  +12.2   +12.4  +32.1   +32.4    +9.1    +7.7      +33.0   +43.4      +25.5   +17.1

Source: National Assessment of Progress in Education (NAPE), Uganda, 2009/10.
Notes: Averages represent percentage correct (0–100%). Tests of significance are carried out across years (comparing 2010 with 2009) for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Table B.14  P6 Writing Detailed Summary, Uganda NAPE, 2010
Percent correct (0–100%)

                                  Content areas
Category                          Name Items  Draw and Name  Complete Forms  Write Words  Guided   Write Comp.  Write Letter
National average                  46.9        59.3           54.3            44.6         53.6     11.6         32.3
  Standard deviation              35.7        35.0           24.8            37.3         39.0     19.0         28.1
  Number of questions (points)    2 (2)       1 (3)          7 (7)           3 (3)        1 (5)    6 (10)       7 (10)
Boys                              46.4        61.7           54.3            45.6         54.1     11.5         31.2
Girls                             47.5        56.9           54.2            43.6         53.1     11.7         33.5
  Difference                      +1.1        –4.8           –0.1            –2.0         –1.0     +0.2         +2.3
Rural                             44.4        56.7           52.0            41.4         50.0     9.3          29.5
Urban                             66.2        78.4           70.3            67.6         79.5     27.7         52.5
  Difference                      +21.8       +21.7          +18.3           +26.2        +29.5    +18.4        +23.0
Government                        44.8        57.7           52.8            42.3         51.3     9.9          30.3
Private                           75.3        81.9           74.6            75.6         84.5     34.0         60.4
  Difference                      +30.5       +24.2          +21.8           +33.3        +33.2    +24.1        +30.1
Central                           59.7        63.0           57.7            51.5         55.1     13.1         35.6
East                              40.9        56.9           47.7            37.4         46.3     8.3          26.0
North                             38.1        55.4           51.6            36.5         52.4     10.5         26.2
West                              52.1        62.0           59.7            54.0         58.7     12.4         40.9
Kampala                           77.6        85.9           81.7            81.5         90.5     48.9         69.7
  High/low gap                    +39.5       +30.5          +34.0           +45.0        +44.2    +40.6        +43.7

Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.
Notes: Averages represent percentage correct (0–100%). When test booklets include items that are worth more than one point the points are totaled and divided by the possible total to form the percentage. Comparisons are made within years (see Difference) between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
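The notes explain how multi-point items are scored: points earned are totaled and divided by the total points possible. A minimal sketch of that rule, with hypothetical item scores:

```python
# Percent-correct score when items carry different point values:
# points earned divided by points possible, times 100.
def percent_correct(earned, possible):
    assert len(earned) == len(possible)
    return 100.0 * sum(earned) / sum(possible)

# Hypothetical pupil: items worth 1, 5, and 10 points, scoring 1, 3, and 4
print(percent_correct([1, 3, 4], [1, 5, 10]))  # -> 50.0
```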
Table B.15  S2 Writing Detailed Summary, Uganda NAPE, 2008–10
Percent correct (0–100%)

                                  Formal letter               Composition                 Report
Category                          2008     2009     2010      2008     2009     2010     2008    2010
National average                  65.8     53.4     54.0      50.1     31.4     53.9     61.1    53.5
  Standard deviation              26.2     27.4     25.8      30.4     23.8     25.2     30.4    28.9
  Number of questions (points)    10 (12)  12 (12)  11 (12)   9 (12)   11 (11)  7 (10)   6 (6)   7 (8)
Boys                              64.7     52.5     53.3      47.0     30.0     52.7     60.9    51.6
Girls                             67.0     54.4     54.9      53.6     33.0     55.2     61.3    55.7
  Difference                      +2.3     +1.9     +1.6      +6.6     +3.0     +2.5     +0.4    +4.1
Rural                             63.9     52.6     52.4      47.1     29.3     52.1     58.9    51.9
Urban                             69.8     54.6     57.6      56.5     34.5     57.5     65.8    56.6
  Difference                      +5.9     +2.0     +5.2      +9.4     +5.2     +5.4     +6.9    +4.7
Government                        67.8     54.4     53.6      48.6     32.2     51.8     63.1    52.7
Private                           64.2     52.5     54.3      51.4     30.8     55.6     59.5    54.2
  Difference                      –3.6     –1.9     +0.7      +2.8     –1.4     +3.8     –3.6    +1.5

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.
Notes: Averages represent percentage correct (0–100%). Tests of significance are carried out across years for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
Table B.16  P3 Grammar Summary, Uganda NAPE, 2006–08
Percent correct (0–100%)

Category                                2006     2007     2008     Overall average  2008/2006
National average                        43.0     43.3     44.2     43.5             +1.1
  Standard deviation                    23.0     21.1     28.6     —                —
  Number of questions (total points)    25 (25)  25 (25)  15 (15)  —                —
Boys                                    42.5     42.6     44.0     43.1             +1.5
Girls                                   43.6     44.0     44.3     44.0             +0.7
  Difference                            +1.1     +1.3     +0.7     +0.9             —
Rural                                   40.5     41.1     40.6     40.7             +0.2
Urban                                   57.4     56.6     62.0     58.8             +4.6
  Difference                            +16.9    +15.5    +17.4    +18.1            —
Government                              40.9     41.2     41.8     41.3             +0.9
Private                                 67.2     66.7     76.9     69.8             +9.7
  Difference                            +26.3    +25.4    +35.1    +28.5            —
Central                                 48.0     49.5     56.6     51.4             +8.6
East                                    38.4     38.4     37.5     38.1             –0.9
North                                   34.4     34.7     28.6     32.5             –5.8
West                                    49.4     49.5     54.0     50.9             +4.6
Kampala                                 71.0     66.1     69.4     68.8             –1.4
  High/low gap                          +36.6    +31.4    +40.8    +36.3            —

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–08.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–08, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
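The notes report significance tests on within-year gaps (boys-girls, urban-rural, government-private) at p < .05. A rough sketch of such a test from summary statistics follows; the sample sizes are hypothetical (they are not reported in these tables), both groups are assumed to share the national standard deviation, and the survey's cluster sampling, which the actual tests would account for, is ignored here.

```python
import math

# Two-sample z-statistic for a subgroup gap, computed from summary statistics.
def z_stat(mean1, sd1, n1, mean2, sd2, n2):
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)  # standard error of the difference
    return (mean1 - mean2) / se

# Urban vs. rural P3 grammar, 2006 (Table B.16): means 57.4 vs. 40.5.
# Sample sizes (300 urban, 1,200 rural) and the common SD (23.0, the
# national SD) are assumptions for illustration only.
z = z_stat(57.4, 23.0, 300, 40.5, 23.0, 1200)
print(abs(z) > 1.96)  # -> True: significant at p < .05 under these assumptions
```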
Table B.17  P6 Grammar Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                          2006     2007     2008     2009     2010     Overall average
National average                  30.0     32.2     35.4     36.0     37.9     34.3
  Standard deviation              23.5     23.6     20.3     19.5     24.9     —
  Number of questions (points)    30 (30)  30 (30)  30 (30)  30 (30)  19 (19)  —
Boys                              29.7     31.5     34.7     35.3     37.2     33.7
Girls                             30.2     32.9     36.0     36.8     38.6     34.9
  Difference                      +0.5     +1.3     +1.3     +1.5     +1.4     +1.2
Rural                             26.2     28.6     32.7     33.6     34.7     31.2
Urban                             49.2     51.6     49.0     48.6     60.4     51.3
  Difference                      +23.0    +23.0    +16.3    +15.0    +25.7    +20.1
Government                        27.3     29.5     33.6     34.9     35.6     32.2
Private                           59.0     60.8     60.0     53.0     68.0     60.3
  Difference                      +31.7    +31.3    +26.4    +18.1    +32.4    +28.1
Central                           29.2     32.8     35.2     37.4     40.7     34.8
East                              27.5     29.8     34.3     34.2     33.0     31.8
North                             22.4     23.4     28.1     30.5     33.1     27.7
West                              38.5     40.5     42.3     42.6     43.6     41.6
Kampala                           54.0     59.1     51.1     49.7     77.9     57.2
  High/low gap                    +31.6    +35.7    +23.0    +19.2    +44.9    +29.5

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table B.18  S2 Grammar Detailed Summary, Uganda NAPE, 2010
Percent correct (0–100%)

                                  Content areas
Category                          Nouns  Pronouns  Tenses  Articles/words  Preposition  Adverbs  Punctuation  Adjectives  Structural patterns
National average                  85.0   73.4      72.9    62.3            60.5         51.4     40.8         39.4        32.4
  Standard deviation              36.5   28.6      26.5    23.0            21.7         30.0     39.2         49.9        21.2
  Number of questions (points)    1 (1)  2 (2)     3 (3)   5 (5)           6 (6)        2 (2)    1 (4)        1 (1)       8 (8)
Boys                              84.5   72.2      72.4    65.7            61.2         64.7     40.3         40.4        35.9
Girls                             85.7   74.8      72.7    65.9            61.9         67.6     41.5         38.3        37.0
  Difference                      +1.2   +2.6      +0.3    +0.2            +0.7         +1.9     +1.2         –2.1        +1.1
Rural                             84.5   71.6      70.5    64.3            59.4         65.6     39.1         37.3        34.7
Urban                             86.2   77.1      76.6    68.9            65.8         67.0     44.4         43.5        39.9
  Difference                      +1.7   +5.5      +6.1    +4.6            +6.4         +1.4     +5.3         +6.2        +5.2
Government                        85.1   73.7      72.6    66.0            61.4         66.0     41.2         39.7        35.7
Private                           85.0   73.2      72.4    65.6            61.6         66.1     40.5         39.1        37.0
  Difference                      –0.1   –0.5      –0.2    –0.4            +0.2         +0.1     –0.7         –0.6        +1.3

Source: National Assessment of Progress in Education (NAPE), Uganda, 2010.
Notes: Averages represent percentage correct (0–100%). Tests of significance are carried out within years (see Difference) between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
APPENDIX C

Summary Tables for Numeracy

Table C.1  P3 Numeracy Overall Score Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                          2006     2007     2008     2009     2010      Overall average
National average                  37.7     39.4     52.1     44.7     58.2      46.4
  Standard deviation              21.2     21.6     21.5     21.4     19.3      —
  Number of questions (points)    46 (50)  45 (50)  37 (50)  37 (50)  60 (100)  —
Boys                              39.1     40.3     53.4     46.2     59.7      47.8
Girls                             36.2     38.5     50.7     43.2     56.5      45.0
  Difference                      –2.9     –1.8     –2.7     –3.0     –3.2      –2.8
Rural                             35.9     37.6     50.2     42.8     57.0      44.8
Urban                             47.5     50.3     61.7     53.3     66.4      55.5
  Difference                      +11.6    +12.7    +11.5    +10.5    +9.4      +10.7
Government                        35.9     37.4     50.6     43.1     56.9      44.8
Private                           58.0     62.0     71.9     67.7     74.9      66.4
  Difference                      +22.1    +24.6    +21.3    +24.6    +18.0     +21.6
Central                           41.3     45.6     58.9     48.6     64.0      51.4
East                              31.7     31.8     44.9     38.4     51.0      39.5
North                             29.1     28.7     44.7     39.4     55.3      39.8
West                              47.9     51.5     61.3     53.4     63.5      55.7
Kampala                           59.2     60.2     62.3     61.4     75.8      62.7
  High/low gap                    +30.1    +31.5    +17.6    +23.0    +24.8     +23.2

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
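The "High/low gap" row in these tables is simply the highest regional average minus the lowest in a given year. A short sketch using the 2006 regional averages from Table C.1:

```python
# Regional high/low gap: highest regional average minus lowest.
# Values are the 2006 P3 numeracy regional averages from Table C.1.
regions = {"Central": 41.3, "East": 31.7, "North": 29.1, "West": 47.9, "Kampala": 59.2}
gap = max(regions.values()) - min(regions.values())
print(round(gap, 1))  # -> 30.1 (Kampala minus North), matching the published gap
```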
Table C.2  P6 Overall Numeracy Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                          2006      2007      2008      2009      2010      Overall average
National average                  30.7      35.6      39.9      39.8      42.2      37.7
  Standard deviation              14.8      18.0      17.6      17.3      16.8      —
  Number of questions (points)    66 (101)  68 (101)  69 (101)  69 (101)  71 (115)  —
Boys                              33.0      37.7      42.2      42.3      44.3      39.9
Girls                             28.3      33.6      37.6      37.1      40.1      35.3
  Difference                      –4.7      –4.1      –4.6      –5.2      –4.2      –4.6
Rural                             28.9      34.0      38.3      38.7      40.7      36.2
Urban                             40.0      44.2      47.9      45.3      53.4      45.8
  Difference                      +11.1     +10.2     +9.6      +6.6      +12.7     +9.6
Government                        29.1      34.0      38.4      38.9      41.0      36.3
Private                           48.4      52.1      59.8      53.0      59.6      54.2
  Difference                      +19.3     +18.1     +21.4     +14.1     +18.6     +17.9
Central                           29.8      35.5      38.2      40.6      43.2      37.1
East                              27.2      32.4      37.2      36.6      37.9      34.2
North                             26.4      29.7      35.3      36.2      40.0      33.8
West                              39.8      45.0      49.0      47.1      47.1      45.7
Kampala                           38.4      45.9      46.8      43.2      62.8      46.0
  High/low gap                    +13.4     +16.2     +13.7     +10.9     +24.9     +12.2

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.3  S2 Numeracy Summary, Uganda NAPE, 2008–10
Percent correct (0–100%)

Category                          2008     2009      2010      Overall average
National average                  55.8     39.7      42.6      46.0
  Standard deviation              19.0     16.4      15.8      —
  Number of questions (points)    38 (88)  38 (100)  46 (102)  —
Boys                              58.5     41.6      44.7      48.2
Girls                             52.8     37.5      40.3      43.6
  Difference                      –7.7     –4.1      –4.4      –4.6
Rural                             55.2     39.0      41.9      45.6
Urban                             57.2     40.7      44.1      46.8
  Difference                      +2.0     +1.7      +2.2      +1.2
Government                        57.4     41.5      43.1      47.4
Private                           54.5     38.3      42.2      44.9
  Difference                      –2.9     –3.2      –0.9      –2.5
Central                           56.5     38.4      42.7      46.5
East                              52.4     38.3      39.6      42.9
North                             52.3     37.7      40.8      43.6
West                              59.6     43.5      45.7      49.6
Kampala                           61.3     44.5      48.2      51.6
  High/low gap                    +9.0     +6.8      +8.6      +8.7

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2008–10, with weights to correct for imbalanced sampling across years and differences in the percentage of private schools. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.4  P3 Number System and Place Value Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006     2007     2008    2009    Overall average
National average                  44.9     44.6     50.0    46.8    46.6
  Standard deviation              21.8     23.0     23.0    22.8    —
  Number of questions (points)    10 (10)  10 (10)  8 (8)   8 (8)   —
Boys                              45.9     45.0     50.8    47.9    47.4
Girls                             44.0     44.3     49.3    45.7    45.8
  Difference                      –1.9     –0.7     –1.5    –2.2    –1.6
Rural                             42.9     42.8     47.8    44.5    44.5
Urban                             56.1     55.6     61.3    57.4    57.7
  Difference                      +13.2    +12.8    +13.3   +12.9   +13.2
Government                        43.1     42.7     48.5    45.1    44.9
Private                           65.2     66.9     71.2    70.9    68.3
  Difference                      +22.1    +24.3    +22.6   +25.7   +23.4
Central                           48.2     50.4     56.6    52.6    51.9
East                              39.7     38.3     42.7    39.6    40.1
North                             35.8     33.8     42.3    40.6    38.2
West                              55.2     56.0     59.2    54.8    56.3
Kampala                           67.3     63.8     67.3    72.2    67.5
  High/low gap                    +31.5    +30.0    +25.0   +32.6   +29.3

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.5  P3 Number Patterns and Sequences Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006    2007    2008    2009    Overall average
National average                  32.3    29.1    44.7    44.9    37.8
  Standard deviation              34.9    34.0    28.0    28.1    —
  Number of questions (points)    5 (5)   5 (5)   3 (5)   3 (5)   —
Boys                              35.1    31.2    46.4    47.0    40.0
Girls                             29.3    26.9    42.8    42.8    35.4
  Difference                      –5.8    –4.3    –3.6    –4.2    –4.6
Rural                             31.1    28.3    43.5    43.7    36.6
Urban                             38.5    34.1    50.6    50.4    44.0
  Difference                      +7.4    +5.8    +7.1    +6.4    +7.4
Government                        30.7    27.3    43.7    43.8    36.4
Private                           49.9    48.8    58.8    60.2    53.9
  Difference                      +19.2   +21.5   +15.1   +16.4   +17.5
Central                           36.4    34.6    50.6    47.4    42.1
East                              22.7    18.6    37.8    37.4    29.2
North                             22.7    18.8    38.5    42.2    30.7
West                              48.5    46.6    54.4    54.6    51.0
Kampala                           49.3    40.6    46.2    53.1    47.1
  High/low gap                    +26.6   +28.0   +16.6   +17.2   +17.9

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.6  P3 Fractions Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006    2007    2008    2009    Overall average
National average                  38.8    34.3    53.0    45.3    42.8
  Standard deviation              30.5    30.0    35.9    34.2    —
  Number of questions (points)    7 (7)   7 (7)   4 (4)   4 (4)   —
Boys                              40.3    34.6    54.4    46.6    44.0
Girls                             37.3    34.0    51.5    43.8    41.6
  Difference                      –3.0    –0.6    –2.9    –2.8    –2.4
Rural                             36.7    32.0    50.4    43.1    40.5
Urban                             50.4    48.4    66.2    55.0    55.3
  Difference                      +13.7   +16.4   +15.8   +11.9   +14.8
Government                        36.6    31.7    51.1    42.8    40.6
Private                           63.4    63.3    79.3    79.6    70.6
  Difference                      +26.8   +21.6   +28.2   +36.8   +30.0
Central                           47.0    44.6    65.0    52.7    52.3
East                              30.6    23.9    43.6    38.4    34.2
North                             23.1    18.5    35.0    32.5    27.4
West                              54.3    51.0    69.7    59.2    58.5
Kampala                           67.6    58.9    76.5    67.5    67.7
  High/low gap                    +44.5   +40.4   +41.5   +35.0   +40.3

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.7  P3 Measures and Geometry Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006    2007    2008    2009    Overall average
National average                  20.0    30.2    59.8    57.5    41.9
  Standard deviation              22.6    25.4    27.1    27.1    —
  Number of questions (points)    8 (8)   8 (8)   7 (9)   7 (9)   —
Boys                              21.4    30.5    61.4    59.0    43.1
Girls                             18.5    29.9    58.1    56.0    40.6
  Difference                      –2.9    –0.6    –3.3    –3.0    –2.5
Rural                             17.9    27.7    57.6    55.5    39.5
Urban                             31.3    45.2    70.8    66.8    54.6
  Difference                      +12.4   +17.5   +13.2   +11.3   +15.1
Government                        17.8    28.0    58.4    55.9    40.2
Private                           44.0    55.0    79.1    80.2    63.0
  Difference                      +26.2   +27.0   +20.7   +24.3   +22.8
Central                           21.7    35.6    65.9    62.0    46.0
East                              13.4    23.2    52.9    50.7    35.2
North                             15.3    18.7    52.8    52.9    35.1
West                              28.3    41.7    68.9    65.1    50.9
Kampala                           43.3    61.1    70.6    77.9    62.8
  High/low gap                    +34.9   +42.4   +17.8   +17.2   +27.7

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.8  P3 Operations on Numbers Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006     2007     2008     2009     Overall average
National average                  34.6     36.7     44.9     41.2     39.4
  Standard deviation              22.9     23.6     25.0     25.0     —
  Number of questions (points)    12 (16)  12 (16)  12 (17)  12 (17)  —
Boys                              36.0     38.0     46.4     42.7     40.8
Girls                             33.1     35.3     43.4     39.8     37.9
  Difference                      –2.9     –2.7     –3.0     –2.9     –2.9
Rural                             32.9     35.1     43.0     39.6     37.6
Urban                             43.8     46.1     54.6     48.8     48.5
  Difference                      +10.9    +11.0    +11.6    +9.2     +10.9
Government                        32.8     34.7     43.3     39.7     37.7
Private                           54.6     58.5     67.4     63.2     60.5
  Difference                      +21.8    +23.8    +24.1    +23.5    +22.8
Central                           37.0     42.5     51.3     43.8     43.6
East                              29.8     29.7     37.6     35.4     33.2
North                             27.9     28.2     39.2     36.9     33.1
West                              42.5     46.3     53.1     50.0     47.9
Kampala                           55.3     55.0     54.8     54.3     54.9
  High/low gap                    +27.4    +26.8    +17.2    +18.9    +21.8

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.9  P3 Operations on Numbers ADDITION Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006    2007    2008    2009    Overall average
National average                  50.7    55.2    55.3    52.5    53.4
  Standard deviation              30.3    30.4    28.4    29.4    —
  Number of questions (points)    4 (5)   4 (5)   4 (6)   4 (6)   —
Boys                              52.5    57.0    57.0    54.3    55.2
Girls                             48.8    53.3    53.6    50.6    51.6
  Difference                      –3.8    –3.7    –3.4    –3.7    —
Rural                             48.8    53.9    53.7    50.8    51.8
Urban                             61.1    62.6    63.6    60.0    61.8
  Difference                      +12.3   +11.3   +10.1   +9.2    —
Government                        48.9    53.4    53.9    50.9    51.8
Private                           70.6    74.7    75.0    74.5    73.6
  Difference                      +21.7   +21.3   +21.1   +23.6   —

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). When test booklets include items that are worth more than one point (2008/09) the points are totaled and divided by the possible total to form the percentage. Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.10  P3 Operations on Numbers SUBTRACTION Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006    2007    2008    2009    Overall average
National average                  33.2    33.4    32.9    23.3    30.7
  Standard deviation              28.0    28.4    28.1    29.1    —
  Number of questions (points)    4 (5)   4 (5)   4 (6)   4 (6)   —
Boys                              34.8    34.6    34.0    24.0    31.9
Girls                             31.6    32.6    31.8    22.5    29.5
  Difference                      –3.2    –2.0    –2.2    –1.5    —
Rural                             31.4    31.7    30.7    21.8    28.9
Urban                             43.3    44.0    43.6    30.7    40.0
  Difference                      +11.9   +12.3   +12.9   +8.9    —
Government                        31.4    31.4    31.1    21.7    28.9
Private                           54.0    56.3    58.2    45.5    53.7
  Difference                      +22.6   +24.9   +27.1   +23.8   —

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). When test booklets include items that are worth more than one point (2008/09) the points are totaled and divided by the possible total to form the percentage. Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.11  P3 Operations on Numbers MULTIPLICATION Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006    2007    2008    2009    Overall average
National average                  23.4    25.2    51.1    51.6    37.8
  Standard deviation              26.5    29.1    41.8    41.0    —
  Number of questions (points)    2 (3)   2 (3)   2 (2)   2 (2)   —
Boys                              24.7    26.7    53.2    53.7    39.6
Girls                             22.2    23.6    49.0    49.5    36.1
  Difference                      –2.5    –3.1    –4.2    –4.2    —
Rural                             22.5    23.7    49.2    50.1    36.2
Urban                             29.1    33.9    61.2    58.5    46.7
  Difference                      +6.6    +10.2   +12.0   +8.4    —
Government                        21.9    23.2    49.6    50.2    36.4
Private                           41.2    46.2    72.3    71.4    56.3
  Difference                      +19.3   +23.0   +22.7   +21.2   —

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). When test booklets include items that are worth more than one point (2008/09) the points are totaled and divided by the possible total to form the percentage. Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.12  P3 Operations on Numbers DIVISION Summary, Uganda NAPE, 2006–09
Percent correct (0–100%)

Category                          2006    2007    2008    2009    Overall average
National average                  21.0    22.7    22.0    23.9    22.4
  Standard deviation              29.2    31.3    19.6    19.8    —
  Number of questions (points)    2 (3)   2 (3)   2 (3)   2 (3)   —
Boys                              22.0    23.4    22.7    24.6    23.2
Girls                             20.0    22.0    21.4    23.2    21.7
  Difference                      –2.0    –1.4    –1.3    –1.4    —
Rural                             19.3    20.8    21.1    23.0    21.1
Urban                             30.6    34.5    27.2    27.9    29.9
  Difference                      +11.4   +13.7   +6.1    +4.9    —
Government                        19.1    20.6    21.2    23.1    21.0
Private                           42.3    47.2    33.9    35.4    40.2
  Difference                      +23.1   +26.6   +12.7   +12.3   —

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–09.
Notes: Averages represent percentage correct (0–100%). When test booklets include items that are worth more than one point (2008/09) the points are totaled and divided by the possible total to form the percentage. Overall Average refers to the mean across 2006–09, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for each category (national average, boys, rural, and so forth). These comparisons are made against the Overall Average. Comparisons are also made within years between boys-girls, urban-rural, and government-private. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.13  P6 Number System and Place Value Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                          2006     2007     2008     2009     2010     Overall average
National average                  32.5     45.1     40.7     41.2     60.2     43.9
  Standard deviation              21.2     20.8     23.2     22.1     22.9     —
  Number of questions (points)    19 (20)  19 (19)  17 (18)  18 (19)  10 (10)  —
Boys                              35.2     47.5     42.8     44.4     62.8     46.5
Girls                             29.6     43.0     38.7     37.7     57.6     41.3
  Difference                      –5.6     –4.5     –4.1     –6.7     –5.2     –5.2
Rural                             32.8     43.6     39.0     40.0     58.9     42.6
Urban                             41.0     52.9     49.6     47.0     69.8     51.1
  Difference                      +8.2     +9.3     +10.6    +7.0     +10.9    +8.5
Government                        30.8     43.6     39.1     40.1     59.2     42.6
Private                           50.2     60.8     63.6     57.8     74.2     60.9
  Difference                      +19.4    +17.2    +24.5    +17.7    +15.0    +18.3
Central                           30.8     44.0     39.4     42.4     60.6     42.8
East                              29.8     43.3     37.1     37.6     55.7     40.3
North                             28.4     39.7     35.4     37.8     59.0     40.8
West                              41.1     53.1     51.1     48.4     64.8     52.1
Kampala                           38.6     54.6     49.2     44.2     77.2     50.8
  High/low gap                    +12.7    +14.9    +15.7    +10.8    +21.5    +11.8

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.14  P6 Number Patterns and Sequences Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

Category                          2006     2007    2008    2009    2010     Overall average
National average                  20.3     25.3    41.8    42.4    35.3     33.0
  Standard deviation              21.0     20.9    25.3    25.6    24.5     —
  Number of questions (points)    6 (10)   8 (9)   7 (8)   7 (8)   9 (12)   —
Boys                              22.7     27.7    44.9    45.4    37.4     35.7
Girls                             17.8     23.0    38.9    39.2    33.1     30.3
  Difference                      –4.9     –4.7    –6.0    –6.2    –4.3     –5.4
Rural                             19.1     24.3    40.7    41.6    33.6     31.9
Urban                             26.1     30.7    47.7    46.4    47.2     39.2
  Difference                      +7.0     +6.4    +7.0    +4.8    +5.6     +7.3
Government                        19.2     24.4    40.6    41.7    33.8     32.0
Private                           31.8     35.0    58.7    53.8    55.3     45.5
  Difference                      +12.6    +10.6   +18.1   +12.1   +21.5    +13.5
Central                           20.4     25.6    40.6    40.0    36.4     32.2
East                              18.0     23.8    39.5    39.9    30.6     30.5
North                             17.8     22.6    37.8    40.2    32.8     30.4
West                              24.8     29.4    50.4    49.2    40.5     39.1
Kampala                           27.0     29.2    43.5    53.8    58.6     38.3
  High/low gap                    +9.2     +6.8    +12.6   +13.9   +18.0    +8.7

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall Average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type. The average for each year (by category) is compared against the Overall Average for that category. Comparisons are also made within years (see Difference) between boys-girls, urban-rural, and government-private. For the regional categories the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.15  P6 Fractions Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

| Category | 2006 | 2007 | 2008 | 2009 | 2010 | Overall average |
|---|---|---|---|---|---|---|
| National average | 15.5 | 18.1 | 26.0 | 26.8 | 38.2 | 24.9 |
| Standard deviation | 20.0 | 20.1 | 22.0 | 22.0 | 17.8 | — |
| Number of questions (points) | 9 (15) | 9 (15) | 9 (15) | 9 (15) | 16 (29) | — |
| Boys | 17.0 | 18.9 | 27.7 | 28.0 | 39.3 | 26.2 |
| Girls | 13.9 | 17.2 | 24.3 | 25.4 | 37.0 | 23.5 |
| Difference | –3.1 | –1.7 | –3.4 | –2.6 | –2.3 | –2.7 |
| Rural | 13.7 | 16.5 | 24.1 | 25.8 | 36.7 | 23.5 |
| Urban | 24.5 | 26.1 | 35.5 | 31.5 | 48.6 | 32.4 |
| Difference | +10.8 | +9.6 | +11.4 | +5.7 | +11.9 | +8.9 |
| Government | 13.8 | 16.3 | 24.3 | 25.7 | 37.0 | 23.5 |
| Private | 34.7 | 36.1 | 49.7 | 43.0 | 54.0 | 42.8 |
| Difference | +20.9 | +19.8 | +24.6 | +17.3 | +17.0 | +19.3 |
| Central | 13.8 | 18.6 | 24.6 | 26.4 | 39.4 | 23.9 |
| East | 12.3 | 14.7 | 23.7 | 25.2 | 34.6 | 21.9 |
| North | 11.2 | 12.2 | 20.4 | 22.3 | 35.7 | 21.0 |
| West | 24.8 | 26.9 | 35.1 | 33.7 | 42.3 | 32.9 |
| Kampala | 24.4 | 27.7 | 34.3 | 29.3 | 59.7 | 32.7 |
| High/low gap | +13.6 | +15.5 | +14.7 | +11.4 | +25.1 | +11.9 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type: the average for each year (by category) is compared against the overall average for that category. Comparisons are also made within years (see Difference) between boys–girls, urban–rural, and government–private. For the regional categories, the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.16  P6 Measures Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

| Category | 2006 | 2007 | 2008 | 2009 | 2010 | Overall average |
|---|---|---|---|---|---|---|
| National average | 23.5 | 30.1 | 18.7 | 19.5 | 28.8 | 24.1 |
| Standard deviation | 24.2 | 26.8 | 17.9 | 18.3 | 18.9 | — |
| Number of questions (points) | 6 (12) | 6 (12) | 7 (13) | 8 (15) | 10 (21) | — |
| Boys | 26.0 | 33.5 | 21.3 | 21.9 | 31.7 | 26.9 |
| Girls | 20.8 | 26.7 | 16.1 | 16.9 | 25.7 | 21.3 |
| Difference | –5.2 | –6.8 | –5.2 | –5.0 | –6.0 | –5.6 |
| Rural | 21.4 | 28.4 | 17.4 | 18.7 | 27.2 | 22.6 |
| Urban | 34.1 | 39.9 | 25.2 | 23.6 | 39.8 | 32.1 |
| Difference | +12.7 | +11.5 | +7.8 | +4.9 | +12.6 | +9.5 |
| Government | 21.6 | 28.1 | 17.6 | 18.6 | 27.4 | 22.6 |
| Private | 44.2 | 50.5 | 34.6 | 33.1 | 46.8 | 42.6 |
| Difference | +22.6 | +22.4 | +17.0 | +14.5 | +19.4 | +20.0 |
| Central | 23.2 | 31.1 | 17.5 | 20.3 | 29.7 | 24.3 |
| East | 19.1 | 25.4 | 16.0 | 16.9 | 24.9 | 20.3 |
| North | 17.8 | 21.7 | 15.3 | 16.2 | 26.4 | 19.7 |
| West | 33.6 | 42.6 | 26.2 | 25.3 | 33.3 | 32.1 |
| Kampala | 36.0 | 42.6 | 24.5 | 25.3 | 49.9 | 35.5 |
| High/low gap | +18.2 | +20.9 | +10.9 | +9.1 | +25.0 | +15.8 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type: the average for each year (by category) is compared against the overall average for that category. Comparisons are also made within years (see Difference) between boys–girls, urban–rural, and government–private. For the regional categories, the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.17  P6 Geometry Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

| Category | 2006 | 2007 | 2008 | 2009 | 2010 | Overall average |
|---|---|---|---|---|---|---|
| National average | 15.1 | 16.5 | 18.5 | 29.4 | 18.7 | 19.6 |
| Standard deviation | 19.5 | 17.9 | 21.1 | 23.5 | 18.9 | — |
| Number of questions (points) | 5 (7) | 5 (7) | 7 (8) | 9 (13) | 8 (12) | — |
| Boys | 15.5 | 16.4 | 19.9 | 31.3 | 19.8 | 20.7 |
| Girls | 14.6 | 16.6 | 17.2 | 27.4 | 17.5 | 18.6 |
| Difference | –0.9 | +0.2 | –2.7 | –3.9 | –2.3 | –2.1 |
| Rural | 13.3 | 15.3 | 17.1 | 28.3 | 17.1 | 18.2 |
| Urban | 23.8 | 23.0 | 26.2 | 35.1 | 29.6 | 27.4 |
| Difference | +10.5 | +7.7 | +9.1 | +6.8 | +12.5 | +9.2 |
| Government | 13.9 | 15.2 | 16.8 | 28.4 | 17.3 | 18.4 |
| Private | 27.8 | 29.7 | 42.6 | 45.1 | 37.0 | 35.5 |
| Difference | +13.9 | +14.5 | +25.8 | +16.7 | +19.7 | +17.1 |
| Central | 14.9 | 18.6 | 19.3 | 33.3 | 20.5 | 21.0 |
| East | 13.1 | 14.4 | 16.3 | 25.4 | 15.6 | 17.1 |
| North | 12.4 | 10.8 | 12.2 | 22.7 | 14.6 | 14.7 |
| West | 20.1 | 22.7 | 26.8 | 38.9 | 23.7 | 26.6 |
| Kampala | 19.2 | 22.5 | 24.8 | 30.0 | 39.2 | 25.0 |
| High/low gap | +7.7 | +11.9 | +14.6 | +16.2 | +24.6 | +11.9 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type: the average for each year (by category) is compared against the overall average for that category. Comparisons are also made within years (see Difference) between boys–girls, urban–rural, and government–private. For the regional categories, the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.18  P6 Operations on Numbers Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

| Category | 2006 | 2007 | 2008 | 2009 | 2010 | Overall average |
|---|---|---|---|---|---|---|
| National average | 52.5 | 55.1 | 64.2 | 63.3 | 68.0 | 60.6 |
| Standard deviation | 24.3 | 23.9 | 21.2 | 20.2 | 20.4 | — |
| Number of questions (points) | 16 (26) | 16 (26) | 15 (26) | 15 (26) | 12 (20) | — |
| Boys | 55.6 | 57.9 | 66.5 | 66.2 | 70.4 | 63.3 |
| Girls | 49.3 | 52.3 | 61.9 | 60.0 | 65.5 | 57.8 |
| Difference | –6.3 | –5.6 | –5.4 | –6.2 | –4.9 | –5.5 |
| Rural | 50.4 | 53.4 | 62.9 | 62.0 | 66.7 | 59.1 |
| Urban | 62.8 | 63.9 | 71.0 | 69.7 | 77.3 | 68.5 |
| Difference | +12.4 | +10.5 | +8.1 | +7.7 | +10.6 | +9.4 |
| Government | 50.7 | 53.4 | 63.0 | 62.7 | 67.0 | 59.4 |
| Private | 71.7 | 71.8 | 81.0 | 72.0 | 81.1 | 75.2 |
| Difference | +21.0 | +18.4 | +18.0 | +9.3 | +14.1 | +15.8 |
| Central | 51.4 | 54.7 | 61.0 | 63.5 | 68.6 | 59.4 |
| East | 48.1 | 50.9 | 61.9 | 59.1 | 63.5 | 56.6 |
| North | 48.1 | 49.3 | 61.3 | 61.4 | 67.3 | 57.9 |
| West | 63.1 | 65.6 | 72.6 | 70.5 | 71.8 | 68.8 |
| Kampala | 59.5 | 65.5 | 69.7 | 62.3 | 82.0 | 66.8 |
| High/low gap | +15.0 | +16.3 | +11.6 | +11.4 | +18.5 | +12.2 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type: the average for each year (by category) is compared against the overall average for that category. Comparisons are also made within years (see Difference) between boys–girls, urban–rural, and government–private. For the regional categories, the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
Table C.19  P6 Graphs and Interpretation Summary, Uganda NAPE, 2006–10
Percent correct (0–100%)

| Category | 2006 | 2007 | 2008 | 2009 | 2010 | Overall average |
|---|---|---|---|---|---|---|
| National average | 26.7 | 27.7 | 37.7 | 38.7 | 43.6 | 34.9 |
| Standard deviation | 25.8 | 26.3 | 26.1 | 25.7 | 29.7 | — |
| Number of questions (points) | 4 (10) | 4 (10) | 5 (10) | 5 (10) | 4 (10) | — |
| Boys | 28.3 | 28.9 | 41.1 | 41.8 | 46.0 | 37.3 |
| Girls | 25.0 | 26.5 | 34.3 | 35.3 | 41.0 | 32.3 |
| Difference | –3.3 | –2.4 | –6.8 | –6.5 | –5.0 | –5.0 |
| Rural | 24.4 | 25.3 | 35.5 | 37.5 | 41.2 | 32.9 |
| Urban | 38.2 | 40.2 | 48.6 | 45.0 | 60.6 | 45.8 |
| Difference | +13.8 | +14.9 | +13.1 | +7.5 | +19.4 | +12.9 |
| Government | 25.0 | 25.8 | 36.2 | 37.8 | 41.7 | 33.4 |
| Private | 45.9 | 46.9 | 57.6 | 51.4 | 68.6 | 53.4 |
| Difference | +20.9 | +21.1 | +21.4 | +13.6 | +26.9 | +20.0 |
| Central | 26.1 | 25.2 | 35.7 | 40.6 | 43.5 | 33.7 |
| East | 21.6 | 23.7 | 34.0 | 33.5 | 35.1 | 29.5 |
| North | 20.4 | 20.2 | 31.9 | 33.0 | 41.1 | 29.8 |
| West | 39.7 | 40.9 | 49.2 | 49.6 | 53.0 | 46.7 |
| Kampala | 36.0 | 45.6 | 48.4 | 45.0 | 70.3 | 46.8 |
| High/low gap | +19.3 | +25.4 | +17.3 | +16.6 | +35.2 | +17.3 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2006–10.
Notes: Averages represent percentage correct (0–100%). Overall average refers to the mean across 2006–10, with weights to correct for imbalanced sampling across years. Tests of significance are carried out across years for gender, location, and school type: the average for each year (by category) is compared against the overall average for that category. Comparisons are also made within years (see Difference) between boys–girls, urban–rural, and government–private. For the regional categories, the tests of significance are within each year and measure the regional average for that year against the average in the other regions. Shading is used to denote averages/differences that are significantly different at the p < .05 level. — = not available.
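The Overall average column in the tables above is described in the Notes as a weighted mean of the yearly averages. A minimal sketch of that calculation follows; the year weights NAPE actually uses are not published in these tables, so equal weights are assumed purely for illustration:

```python
def weighted_mean(values, weights):
    """Weighted mean of yearly averages. The NAPE analysis uses weights
    that correct for imbalanced sampling across years; the weights passed
    in here are the caller's assumption."""
    total_w = sum(weights)
    return sum(v * w for v, w in zip(values, weights)) / total_w

# P6 number system/place value national averages, 2006-10 (Table C.13).
yearly = [32.5, 45.1, 40.7, 41.2, 60.2]
overall = weighted_mean(yearly, [1, 1, 1, 1, 1])  # equal weights, for illustration
```

With equal weights the Table C.13 national series comes to 43.9, agreeing with the published overall average to one decimal place; the true sampling weights would shift the result slightly for categories where year-to-year samples were more imbalanced.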
Table C.20  S2 Numeracy Detailed Summary, Uganda NAPE, 2008–10
Percent correct (0–100%), by content area. NC = numerical concepts; Me = measures; St = statistics; Ge = geometry; FT = functions-transformations.

| Category | NC 2008 | NC 2009 | NC 2010 | Me 2008 | Me 2009 | Me 2010 | St 2008 | St 2009 | St 2010 | Ge 2008 | Ge 2009 | Ge 2010 | FT 2008 | FT 2009 | FT 2010 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| National average | 72.7 | 55.1 | 49.1 | 57.1 | 41.6 | 53.3 | 67.7 | 38.8 | 36.3 | 48.7 | 36.1 | 43.8 | 31.6 | 23.8 | 23.7 |
| Standard deviation | 19.1 | 23.0 | 17.1 | 24.5 | 20.8 | 21.6 | 25.9 | 27.1 | 26.0 | 29.5 | 23.0 | 22.8 | 25.8 | 19.5 | 18.7 |
| Number of questions (points) | 8 (17) | 9 (22) | 13 (25) | 8 (21) | 9 (22) | 6 (15) | 4 (10) | 5 (12) | 4 (14) | 3 (19) | 4 (14) | 8 (18) | 4 (10) | 3 (10) | 6 (13) |
| Boys | 74.6 | 57.6 | 51.3 | 61.7 | 44.0 | 56.5 | 70.2 | 40.5 | 38.2 | 52.0 | 37.7 | 46.6 | 33.1 | 24.7 | 24.7 |
| Girls | 70.7 | 52.1 | 46.8 | 52.0 | 38.8 | 49.8 | 64.8 | 36.9 | 34.2 | 45.0 | 34.2 | 40.7 | 29.9 | 22.7 | 22.5 |
| Difference | –3.9 | –5.5 | –4.5 | –9.7 | –5.2 | –6.7 | –5.4 | –3.6 | –4.0 | –7.0 | –3.5 | –5.9 | –3.2 | –2.0 | –2.2 |
| Rural | 71.8 | 54.0 | 48.7 | 56.5 | 40.8 | 52.9 | 67.3 | 38.1 | 35.2 | 47.8 | 35.4 | 42.4 | 30.8 | 23.0 | 22.7 |
| Urban | 74.8 | 56.8 | 50.0 | 58.6 | 42.7 | 54.6 | 68.4 | 39.8 | 38.6 | 50.6 | 37.1 | 46.7 | 33.4 | 24.9 | 25.5 |
| Difference | +3.0 | +2.8 | +1.3 | +2.1 | +1.9 | +1.7 | +1.1 | +1.7 | +3.4 | +2.8 | +1.7 | +4.3 | +2.6 | +1.9 | +2.8 |
| Government | 73.7 | 56.0 | 49.6 | 58.8 | 43.4 | 54.2 | 69.0 | 41.5 | 36.5 | 51.4 | 38.9 | 44.5 | 33.0 | 24.9 | 23.5 |
| Private | 72.0 | 54.4 | 48.8 | 55.7 | 40.2 | 52.6 | 66.6 | 36.7 | 36.2 | 46.4 | 33.8 | 43.3 | 30.4 | 22.9 | 23.8 |
| Difference | –1.7 | –1.6 | –0.8 | –3.1 | –3.2 | –1.6 | –2.4 | –4.8 | –0.3 | –5.0 | –5.1 | –1.2 | –2.6 | –2.0 | +0.3 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2008–10.
Notes: Averages represent percentage correct (0–100%). Tests of significance are carried out across years for each category (national average, boys, rural, and so forth); these comparisons are made against the overall average. Comparisons are also made within years (see Difference) between boys–girls, urban–rural, and government–private. Shading is used to denote averages/differences that are significantly different at the p < .05 level.
APPENDIX D

Table D.1  Detailed Summary of P3-P6 Teacher Numeracy Achievement, Overall and by Content Area, Uganda NAPE, 2011
(Standard deviations in parentheses)

| Achievement area | Numeracy P3 | Numeracy P6 | Literacy P3 | Literacy P6 |
|---|---|---|---|---|
| Overall percent correct | 82.0 (12.5) | 89.0 (10.3) | 79.6 (14.7) | 82.1 (12.2) |
| IRT-scaled score | 686.9 (75.6) | 740.6 (67.1) | 673.4 (88.9) | 688.8 (77.2) |
| Operations on numbers: | | | | |
|  Addition (6) | 96.2 | 96.9 | 95.4 | 96.3 |
|  Subtraction (6) | 95.2 | 96.8 | 93.6 | 94.9 |
|  Multiplication (6) | 95.8 | 97.0 | 95.0 | 94.6 |
|  Division (5) | 94.1 | 96.7 | 91.2 | 94.3 |
|  Symbols/brackets (3) | 83.5 | 85.2 | 81.6 | 81.3 |
|  Overall average (24) | 94.0 | 95.5 | 92.5 | 93.5 |
| Number system/place value (10) | 88.0 | 93.0 | 85.1 | 87.0 |
| Number pattern-seq. (14) | 69.1 | 80.1 | 64.4 | 68.5 |
| Measures (18) | 86.6 | 92.6 | 85.1 | 89.0 |
| Graphs-stats (8) | 79.7 | 84.8 | 79.8 | 80.5 |
| Fractions: | | | | |
|  Drawing-writing (3) | 93.6 | 92.9 | 89.6 | 92.4 |
|  Addition (5) | 89.8 | 94.7 | 87.1 | 88.6 |
|  Subtraction (5) | 89.1 | 93.3 | 86.9 | 88.5 |
|  Multiplication (4) | 82.2 | 93.6 | 78.2 | 82.1 |
|  Division (4) | 71.0 | 91.7 | 66.2 | 71.4 |
|  Operations (7) | 75.6 | 85.4 | 70.7 | 73.4 |
|  Overall average | 82.8 | 91.4 | 79.0 | 81.8 |
| Geometry (12) | 65.9 | 81.3 | 64.6 | 67.3 |
| Sample size | 475 | 479 | 346 | 414 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
Note: Shading is used to denote statistical significance at the p < 0.05 level.
Table D.2  Detailed Summary of P3-P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011
(Standard deviations in parentheses)

| Achievement area | Literacy P3 | Literacy P6 | Numeracy P3 | Numeracy P6 |
|---|---|---|---|---|
| Overall percent correct | 83.1 (8.1) | 86.1 (8.1) | 82.2 (10.5) | 83.4 (10.8) |
| IRT-scaled score | 704.2 (42.3) | 721.9 (41.8) | 698.1 (56.0) | 703.9 (56.0) |
| Reading: | | | | |
|  Associating objects/words (2) | 99.6 | 99.5 | 98.5 | 98.8 |
|  Describing activities (4) | 88.4 | 89.9 | 87.9 | 88.2 |
|  Reading and interpreting (18) | 80.1 | 82.9 | 79.4 | 80.1 |
|  Reading and answering (16) | 75.7 | 78.8 | 75.2 | 75.5 |
|  Overall reading (40) | 80.2 | 82.8 | 79.5 | 80.0 |
| Reading supplement (40) | 76.1 | 78.4 | 76.3 | 75.1 |
| Writing: | | | | |
|  Drawing objects (3) | 84.1 | 86.3 | 83.1 | 84.9 |
|  Writing words (3) | 94.9 | 95.5 | 94.3 | 94.7 |
|  Completing form/cards (17) | 88.4 | 90.3 | 87.6 | 88.9 |
|  Naming objects (2) | 72.1 | 73.7 | 68.9 | 67.0 |
|  Writing activities (15) | 77.2 | 81.2 | 76.2 | 78.7 |
|  Overall writing (40) | 83.6 | 86.2 | 82.6 | 84.1 |
| Grammar (20) | 88.3 | 92.6 | 87.0 | 88.6 |
| Sample size | 372 | 443 | 437 | 435 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.

Figure D.1  Summary of P3-P6 Teacher Literacy Achievement, Overall and by Content Area, Uganda NAPE, 2011
[Figure: percent correct (60–95) by content area — overall, basic reading, reading supplement, writing, grammar — for P3 literacy, P6 literacy, P3 numeracy, and P6 numeracy teachers.]
Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
Table D.3  Summary of Variables Used in Teacher Analysis, P3-P6 and S2 (Standard Deviations in Parentheses), Uganda NAPE, 2011

| Variable | P3-P6 teachers | S2 numeracy | S2 literacy | S2 biology |
|---|---|---|---|---|
| School characteristics: | | | | |
|  Rural | 0.86 | 0.72 | 0.72 | 0.72 |
|  Private | 0.11 | 0.37 | 0.37 | 0.37 |
|  Average SES | 0.00 (0.29) | –0.02 (0.52) | –0.02 (0.52) | –0.02 (0.52) |
|  Private partnership (PPP) | — | — | — | — |
|  USE school | — | 0.78 | 0.78 | 0.78 |
|  Single shift | — | 0.88 | 0.88 | 0.88 |
| Teacher characteristics: | | | | |
|  Female | 0.30 | 0.09 | 0.40 | 0.11 |
|  Academic qual. = UACE | 0.16 | — | — | — |
|  Teacher qualification: | | | | |
|   Grade iii primary | 0.65 | — | — | — |
|   Grade V primary | 0.22 | 0.01 | 0.01 | 0.01 |
|   Grade V secondary | 0.01 | 0.42 | 0.46 | 0.52 |
|   Bachelor's degree + | 0.02 | 0.33 | 0.35 | 0.27 |
|   Masters-doctorate | — | 0.01 | 0.01 | 0.01 |
|   Other | 0.02 | 0.14 | 0.08 | 0.08 |
|   Missing | 0.09 | 0.10 | 0.10 | 0.11 |
|  Teaching experience: | | | | |
|   1 year | 0.05 | 0.14 | 0.07 | 0.09 |
|   2–3 years | 0.12 | 0.22 | 0.19 | 0.19 |
|   4–6 years | 0.15 | 0.19 | 0.23 | 0.23 |
|   7–10 years | 0.22 | 0.17 | 0.15 | 0.20 |
|   11+ years | 0.36 | 0.19 | 0.23 | 0.18 |
|   Missing | 0.11 | 0.09 | 0.13 | 0.11 |
| Sample size | 1,811 | 513 | 513 | 513 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
Note: — = not available.

Table D.4  Covariates of P3-P6 Teacher Numeracy Achievement, by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011

| Independent variables | Operations | Fract. | Geometry | Measures | Stats. |
|---|---|---|---|---|---|
| School characteristics: | | | | | |
|  Rural | 0.11 (1.35) | 0.05 (0.52) | –0.05 (–0.52) | –0.01 (–0.13) | 0.14 (1.61) |
|  Private | –0.02 (–0.20) | –0.10 (–1.05) | 0.02 (0.24) | –0.01 (–0.08) | 0.11 (1.10) |
|  Average SES (district) | 0.06 (0.65) | –0.01 (–0.10) | –0.11 (–1.06) | 0.06 (0.60) | –0.06 (–0.56) |
| Teacher characteristics: | | | | | |
|  Female | –0.19*** (–3.51) | –0.30*** (–5.67) | –0.22*** (–3.86) | –0.33*** (–6.24) | –0.04 (–0.64) |
|  Academic qual. = UACE (a) | –0.04 (–0.65) | 0.11* (1.64) | 0.05 (0.69) | 0.06 (0.85) | 0.10 (1.44) |
|  Teacher qualification (b): | | | | | |
|   Grade V primary | 0.11* (1.82) | –0.01 (–0.01) | 0.02 (0.33) | 0.12** (2.11) | 0.06 (0.95) |
|   Grade V secondary | –0.09 (–1.42) | –0.82*** (–3.66) | –0.30 (–1.26) | –0.33 (–1.49) | 0.03 (0.12) |
|   Bachelor's degree + | 0.29* (1.86) | 0.14 (0.87) | 0.42*** (2.54) | 0.07 (0.45) | 0.08 (0.45) |
|   Other | 0.08 (0.47) | –0.27 (–1.51) | 0.09 (0.48) | –0.14 (–0.76) | –0.07 (–0.36) |
|  Teaching experience (c): | | | | | |
|   2–3 years | 0.06 (0.50) | –0.05 (–0.43) | –0.06 (–0.46) | –0.06 (–0.48) | 0.05 (0.38) |
|   4–6 years | 0.03 (0.22) | –0.06 (–0.50) | 0.01 (0.10) | 0.02 (0.21) | 0.16 (1.25) |
|   7–10 years | –0.15 (–1.34) | –0.27** (–2.29) | –0.23* (–1.88) | –0.19 (–1.61) | 0.01 (0.08) |
|   11+ years | –0.12 (–1.06) | –0.13 (–1.14) | –0.28** (–2.30) | –0.10 (–0.85) | 0.05 (0.43) |
|  Grade-subject controls: | | | | | |
|   P3 numeracy | –0.07 (–1.14) | –0.34*** (–5.69) | –0.47*** (–7.62) | –0.30*** (–5.24) | –0.19*** (–2.89) |
|   P3 literacy | –0.22*** (–3.30) | –0.53*** (–7.98) | –0.52*** (–5.50) | –0.34*** (–5.26) | –0.15** (–2.02) |
|   P6 numeracy | — | — | — | — | — |
|   P6 literacy | –0.20*** (–3.28) | –0.47*** (–7.94) | –0.47*** (–7.63) | –0.21*** (–3.56) | –0.18*** (–2.79) |
| Sample average percent (std. deviation) | 94.0 | 84.2 | 70.3 | 88.5 | 81.4 |
| District fixed effects? | No | No | No | No | No |
| Sample size | 1,553 | 1,553 | 1,553 | 1,553 | 1,553 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
Notes: The dependent variable for all estimations (by subject) is the standardized measure (mean = 0, standard deviation = 1.00) of each content-area percentage score (see bottom of table for a summary of the dependent variables). Coefficients therefore represent changes in standard deviations. All estimations incorporate a mixed model (HLM) with a random school effect. T-statistics (in parentheses) are based on robust standard errors that correct for clustering of teachers at the school level. — = not available.
a. Excluded category: UCE-only academic qualification.
b. Excluded category: Grade iii teaching qualification.
c. Excluded category: experience = 1 year.
*** Significant at the 0.01 level; ** significant at the 0.05 level; * significant at the 0.10 level.

Table D.5  Covariates of P3-P6 Teacher Literacy Achievement, by Content Area (T-Statistics in Parentheses), Uganda NAPE, 2011

| Independent variables | Reading | Writing | Grammar | P6 supplemental reading |
|---|---|---|---|---|
| School characteristics: | | | | |
|  Rural | –0.18** (–2.05) | 0.02 (0.20) | 0.02 (0.20) | –0.05 (–0.49) |
|  Private | 0.04 (0.43) | 0.06 (0.51) | 0.26*** (2.64) | 0.05 (0.42) |
|  Average SES (district) | 0.07 (0.73) | 0.07 (0.60) | 0.25*** (2.74) | 0.38*** (3.07) |
| Teacher characteristics: | | | | |
|  Female | 0.11* (1.80) | 0.06 (0.92) | –0.01 (–0.10) | 0.10** (1.95) |
|  Academic qual. = UACE (a) | 0.01 (0.18) | 0.01 (0.04) | –0.01 (–0.06) | 0.14** (2.17) |
|  Teacher qualification (b): | | | | |
|   Grade V primary | 0.18*** (2.84) | 0.10 (1.56) | 0.18*** (2.73) | 0.07 (1.24) |
|   Grade V secondary | –0.30 (–1.25) | –0.13 (–0.55) | –0.14 (–0.55) | –0.06 (–0.29) |
|   Bachelor's and above | 0.25 (1.41) | 0.09 (1.50) | 0.15 (0.86) | –0.01 (–0.04) |
|   Other | 0.06 (0.45) | 0.25 (1.25) | 0.12 (0.60) | –0.10 (–0.54) |
|  Teaching experience (c): | | | | |
|   2–3 years | –0.01 (–0.02) | –0.01 (–0.03) | 0.01 (0.07) | 0.20* (1.71) |
|   4–6 years | 0.28** (2.24) | 0.07 (0.57) | 0.29** (2.23) | 0.18 (1.57) |
|   7–10 years | 0.16 (1.28) | 0.02 (0.17) | 0.29** (2.27) | 0.16 (1.39) |
|   11+ years | –0.06 (–0.45) | –0.02 (–0.15) | 0.28** (2.24) | 0.36*** (3.09) |
|  Grade-subject controls: | | | | |
|   P3 numeracy | –0.31*** (–4.68) | –0.31*** (–4.82) | –0.46*** (–6.68) | –0.35*** (–6.10) |
|   P3 literacy | –0.26*** (–3.57) | –0.22** (–3.15) | –0.37*** (–4.98) | –0.19*** (–3.20) |
|   P6 numeracy | –0.27*** (–4.09) | –0.19*** (–3.01) | –0.32*** (–4.70) | –0.28*** (–4.93) |
|   P6 literacy | — | — | — | — |
| Sample average percent (standard deviation) | 80.6 | 84.1 | 87.2 | 76.5 |
| District fixed effects? | No | No | No | No |
| Sample size | 1,534 | 1,534 | 1,534 | 1,521 |

Source: National Assessment of Progress in Education (NAPE), Uganda, 2011.
Notes: The dependent variable for all estimations (by subject) is the standardized measure (mean = 0, standard deviation = 1.00) of each content-area percentage score (see bottom of table for a summary of the dependent variables). Coefficients therefore represent changes in standard deviations. All estimations incorporate a mixed model (HLM) with a random school effect. T-statistics (in parentheses) are based on robust standard errors that correct for clustering of teachers at the school level. — = not available.
a. Excluded category: UCE-only academic qualification.
b. Excluded category: Grade iii teaching qualification.
c. Excluded category: experience = 1 year.
*** Significant at the 0.01 level; ** significant at the 0.05 level; * significant at the 0.10 level.
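The dependent variables in Tables D.4 and D.5 are standardized content-area scores, so each coefficient reads as a change in standard deviations. A minimal sketch of that construction, using a hypothetical mini-sample (the report does not state whether the population or sample standard deviation was used, so the population form is assumed here):

```python
import statistics

def standardize(scores):
    """Convert content-area percent scores to z-scores (mean 0, SD 1),
    the dependent-variable form used in Tables D.4 and D.5."""
    mu = statistics.mean(scores)
    sd = statistics.pstdev(scores)  # population SD; an assumption
    return [(s - mu) / sd for s in scores]

z = standardize([94.0, 84.2, 70.3, 88.5, 81.4])  # hypothetical mini-sample
```

Under this scaling, the Female coefficient of –0.19 in the Operations column of Table D.4, for example, means female teachers score roughly one-fifth of a standard deviation lower on that content area, holding the other covariates fixed.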
Table D.6  Covariates of School Average Pass Rate in 2007, UNPS 2009/10

| Independent variable | (1) | (2) | (3) | (4) | (5) |
|---|---|---|---|---|---|
| Teacher attendance (%) | 0.08 (0.79) | 0.02 (0.27) | –0.03 (–0.41) | –0.05 (–0.56) | –0.03 (–0.42) |
| Total enrollment | — | 0.001*** (4.14) | 0.001*** (3.31) | 0.001*** (3.15) | 0.001*** (3.48) |
| Availability of toilets | — | 0.04 (0.96) | 0.02 (0.69) | 0.02 (0.61) | 0.01 (0.38) |
| Availability of first-aid | — | 0.07 (1.59) | 0.06 (1.50) | 0.06 (1.37) | 0.05 (1.29) |
| Teacher education (a): | | | | | |
|  Percent with Grade V | — | — | 0.28** (2.37) | 0.27** (2.30) | 0.28** (2.30) |
|  Percent untrained | — | — | –0.37** (–1.94) | –0.39** (–2.03) | –0.35** (–2.12) |
| Provide textbooks | — | — | — | –0.03 (–1.09) | –0.03 (–0.95) |
| Private secondary in community | — | — | — | 0.08** (2.12) | 0.11*** (2.83) |
| School type (excluded: public): | | | | | |
|  Private | — | — | — | — | 0.16*** (2.66) |
|  NGO | — | — | — | — | — |
|  Other | — | — | — | — | –0.23*** (–3.00) |
| Sample size | 305 | 302 | 299 | 284 | 284 |
| Explained variance (R2) | 0.01 | 0.17 | 0.23 | 0.25 | 0.29 |

Source: Uganda National Panel Survey, 2009/10.
Notes: Dependent variable is the percentage of students who passed the P7 exam. All models based on weighted data. T-statistics in parentheses. NGO = nongovernmental organization; — = not available.
a. Excluded category for teacher education: Grade iii.
*** Significant at the 0.01 level; ** significant at the 0.05 level; * significant at the 0.10 level.

Table D.7  Covariates of School Average Pass Rate—2007/08 Pooled, UNPS 2009/10

| Independent variable | (1) | (2) | (3) | (4) | (5) |
|---|---|---|---|---|---|
| 2008 control | –0.02 (–0.51) | –0.02 (–0.53) | –0.02 (–0.50) | –0.02 (–0.32) | –0.02 (–0.39) |
| Teacher attendance (%) | 0.26* (1.90) | 0.17* (1.73) | 0.16 (1.29) | 0.13 (0.94) | 0.15 (1.02) |
| Total enrollment | — | 0.001*** (4.04) | 0.001*** (2.61) | 0.001** (2.31) | 0.001*** (2.57) |
| Availability of toilets | — | 0.09** (2.27) | 0.08* (1.86) | 0.06 (1.51) | 0.05 (1.28) |
| Availability of first-aid | — | 0.12** (2.53) | 0.12*** (2.66) | 0.11*** (2.54) | 0.11*** (2.51) |
| Teacher education (a): | | | | | |
|  Percent with Grade V | — | — | 0.14 (0.80) | 0.15 (0.84) | 0.14 (0.79) |
|  Percent untrained | — | — | –0.08 (–0.23) | –0.06 (–0.19) | –0.03 (–0.09) |
| Provide textbooks | — | — | — | 0.01 (0.01) | 0.01 (0.13) |
| Private secondary in community | — | — | — | 0.25* (1.74) | 0.29* (1.90) |
| School type (excluded: public): | | | | | |
|  Private | — | — | — | — | 0.13* (1.81) |
|  NGO | — | — | — | — | 0.26 (0.95) |
|  Other | — | — | — | — | –0.40*** (–2.69) |
| Sample size | 610 | 604 | 599 | 568 | 568 |
| Explained variance (R2) | 0.02 | 0.12 | 0.22 | 0.16 | 0.19 |

Source: Uganda National Panel Survey, 2009/10.
Notes: Dependent variable is the percentage of students who passed the P7 exam. All models based on weighted data. T-statistics in parentheses. — = not available.
a. Excluded category for teacher education: Grade iii.
*** Significant at the 0.01 level; ** significant at the 0.05 level; * significant at the 0.10 level.
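Tables D.6 and D.7 report weighted linear regressions of school pass rates on school characteristics. For intuition about where such coefficients come from, a one-predictor OLS fit can be written in closed form; the sketch below uses entirely hypothetical data (the published models are multivariate and weighted, so this is an illustration of the estimator, not a replication):

```python
def ols_fit(x, y):
    """Closed-form simple OLS: slope = cov(x, y) / var(x),
    intercept = mean(y) - slope * mean(x). The report's models add
    further covariates and sampling weights."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Hypothetical data: pass rate (as a proportion) against total enrollment.
enroll = [200, 400, 600, 800]
passed = [0.40, 0.55, 0.75, 0.90]
intercept, slope = ols_fit(enroll, passed)
```

A per-student slope of this magnitude (thousandths) is why the enrollment coefficients in the tables appear as 0.001: the effect is tiny per student but accumulates over hundreds of students.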