Policy Research Working Paper 9288

Technology in the Classroom and Learning in Secondary Schools

Moussa P Blimpo, Ousman Gajigo, Solomon Owusu, Ryoko Tomita, Yanbin Xu

Africa Region, Office of the Chief Economist & Education Global Practice
June 2020

Abstract
This paper studies the impact of a computer-assisted learning program on learning outcomes among high school students in The Gambia. The program uses innovative technologies and teaching approaches to facilitate the teaching of mathematics and science. Since the pilot schools were not randomly chosen, the study first used administrative and survey data, including a written test, to build a credible counterfactual of comparable groups of control students. It used these data to conduct a pre-analysis plan prior to students taking the high-stakes certification exam. The study later used the certification exam data on the same students to replicate the results. The findings show that the program led to a 0.59 standard deviation gain in mathematics scores and an increase of 15 percentage points (a threefold increase) in the share of students who obtained credit in mathematics and English, a criterion for college admission in The Gambia. The impact is concentrated among high-achieving students at the baseline, irrespective of their gender or socioeconomic background.

This paper is a product of the Office of the Chief Economist, Africa Region and the Education Global Practice. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://www.worldbank.org/prwp. The authors may be contacted at mblimpo@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Produced by the Research Support Team

Technology in the Classroom and Learning in Secondary Schools 1

Moussa P Blimpo (World Bank), Ousman Gajigo (African Development Bank), Solomon Owusu (United Nations University - MERIT), Ryoko Tomita (World Bank), Yanbin Xu (Georgetown University)

Keywords: Secondary Education, Technology, Computer Assisted Learning, Pedagogy
JEL Classification: C93, I21, I28, O15

1 We thank The Ministry of Basic and Secondary Education of The Gambia (Honorable Claudiana Cole, Minister of Basic and Secondary Education; Ebrima Sisawo, Permanent Secretary, Ministry of Basic and Secondary Education; Momodou Jeng, Director of Science and Technology, Ministry of Basic and Secondary Education; Alpha Bah, System Analyst, Ministry of Basic and Secondary Education; and Abdoulie Sowe, Project Manager, Ministry of Basic and Secondary Education). Thanks to Meskerem Mulatu and Tanya June Savrimootoo of the World Bank for facilitating this study. We thank Robert Goodman and his colleagues at NJCTL who designed and implemented the PSI-PMI program.
Progressive Science Initiative® (PSI®) and Progressive Mathematics Initiative® (PMI®) are registered trademarks of Dr. Robert Goodman, and the New Jersey Center for Teaching and Learning is the exclusive licensee of these marks. We thank seminar participants at the World Bank for their comments and suggestions on earlier versions of this study. All errors are those of the authors.

1. Introduction

Over the past 20 years, primary school enrollment has increased dramatically in Sub-Saharan Africa, rising from a gross enrollment rate of 81.8 percent at the turn of the century to 97.3 percent in 2017. However, the quality of learning has not kept pace, despite the rapidly expanding enrollment levels. 2 Compared with low- and middle-income countries in other regions, the measured level of learning in Sub-Saharan Africa is quite low. In particular, students' performance in mathematics and science is well below that of students in comparator countries in all internationally comparable assessments (Bashir et al. 2018; Evans, Santos, and Arias 2019; Bold et al. 2017). The situation of education in The Gambia is a microcosm of this continent-wide phenomenon. Specifically, although the country has registered impressive improvements in enrollments, learning outcomes have not progressed satisfactorily.

2 Our World in Data: https://ourworldindata.org/search?q=gross+enrollemt+ration+primary+education+Africa. Our World in Data: https://ourworldindata.org/search?q=gross+enrollment+ratio+secondary+education+africa. Our World in Data: https://ourworldindata.org/tertiary-education. All accessed August 13, 2019.

To improve mathematics and science education in secondary schools, The Gambia has been implementing a new pedagogical innovation, the Progressive Science Initiative® (PSI®) and Progressive Mathematics Initiative® (PMI®), since 2012. The program has been piloted in 24 schools, of which one is a basic cycle school (grades 1 to 9), seven are upper basic schools (middle schools), nine are senior secondary schools (high schools), and seven are combined upper basic and senior secondary schools. PSI-PMI incorporates technology in the teaching of mathematics and science subjects (physics, chemistry, and biology) in schools. The PSI-PMI model creates a student-centered environment with interactive teaching and learning methods. Central to this model is the use of technologies such as interactive whiteboard (IWB) software and student polling devices (responders), as well as an emphasis on group discussions. Rather than being a remedial tool like other computer-assisted learning (CAL) programs, the PSI-PMI model is directly integrated into the teaching and learning curriculum, replacing the way science and mathematics subjects are traditionally taught in the country. The Gambia was the first country to adopt this program in Africa, but many other countries have followed since (Lesotho, Nigeria, Niger, and Rwanda), and an increasing number of countries are considering its implementation. This paper assesses the impacts of the PSI-PMI program on students' learning outcomes.
To assess the program’s impact, we focus mainly on the performance of 12th grade students on the compulsory high school leaving examination, the West African Secondary School Certificate Examination (WASSCE). We used matching procedures to construct two groups of non-program students to serve as the control group. In the first control group, given that the selection of pilot schools was not random, we used administrative data and propensity score matching to select a comparable group of control schools. We then obtained student-level data through a survey, significantly oversampling control group students to further match at the student level. The second control group exploits the fact that within the pilot schools, not all students were exposed to the program due, among other things, to capacity constraints. As a result, we were able to build a within-school control group, which addresses the limitation of the matching at the school level given the small sample. At the time of data collection, the sampled students were assessed on an exam based on the Gambian curriculum but designed by educators and the researchers. In addition to our survey data, we have access to a significant amount of administrative data, including the students’ performance on their nationwide grade 9 examination, the Gambia Basic Education Certificate Examination (GABECE). The results of our analyses show that the PSI-PMI program in The Gambia significantly improved student performance in mathematics and English. 3 The program improved students’ average mathematics score by 0.54 standard deviation or 11.9 percentage points, from 38.8 to 50.7, on the researcher designed exam and by 0.59 standard deviation or 9.2 percentage points on the WASSCE exam. The program also improved students' English score by 3.87 percentage points. Although these results are impressive, they were driven by high-performing students, as reflected in their scores on the GABECE, the 9th grade national exit exam. The analyses further show that the interactive nature of the program contributed to greater student participation in the classroom and increased student interest in learning science and mathematics. Teachers also 3 These two subjects are compulsory for all students. 3 reported that the program substantially improved the effectiveness of teaching, as it allowed teachers to monitor students’ performance in real-time, adjust teaching methods when necessary, and give special attention to students with unique learning needs. The paper contributes to a growing strand of literature examining the impact of the use of CAL programs in developing countries. Although the history of the use of such programs in developed countries is much wider (Angrist and Lavy 2002; Rouse and Krueger 2004; Machin, McNally, and Silva 2006; Goolsbee and Guryan 2006; Barrow, Markman, and Rouse 2009; Roschelle et al. 2010), the list of studies of developing countries, including those in Sub-Saharan Africa, is growing in recent years but remains relatively thin (Escueta et al. 2017). Some of these studies focus on programs in which technology is introduced in schools to supplement the regular curriculum, while others focus on contexts in which it is fully integrated into the curriculum. Many of the studies are forms of remedial education such as Banerjee et al (2007) in India or Lai et al (2015; 2016) in China. Both studies reported a positive impact on learning outcomes, with notable gains among poor performers and disadvantaged students. 
The CAL is also often used to help students with varying learning speed to progress at their own pace. In a recent study, Muralidharan, Singh, and Ganimian (2019) evaluated the impact of a personalized technology-aided after-school instruction program in middle-school grades in urban India and found that attending the program for 90 days would increase math and Hindi test scores by 0.6 and 0.39 standard deviations respectively, with academically weaker students benefiting most. Unlike in our study, these studies often come in the form of targeted supplemental instruction or using technologies to facilitate students learning the right level, rather than at scale integration or change in the instructional methods. Linden (2008) captured the importance of this distinction in a CAL program in Gujarat, India, showing a positive effect when it was implemented as a supplement to regular instruction, but no significant effect when it was fully integrated to the schools. The closest study to our paper is a study in Ecuador, which was more integrated within the school system (Carrillo, Onofa, and Ponce 2011). Like in our paper, they found a positive effect, also driven by high performers. Nevertheless, a distinction with our study is that the 4 program primarily aimed at allowing the students to learn at their own pace and, like most of the other papers, it was also in primary school. In the developing countries context, these technologies may be more promising for higher grades where most of the students have prior contact with the technologies and where classroom management may be less difficult. Additionally, the CAL in our context not only is integrated with the curriculum, but it also induced a change in pedagogical approach toward more student-centered instruction. Several studies have evaluated other various programs attempting to introduce computers in various ways in developing countries. They often find limited to no impact along with implementation challenges (Bet et al 2010; Cristia, Czerwonko, and Garofalo 2010; Barrera-Osorio and Linden 2009). As these studies suggest, the effect of CAL programs on learning varies quite widely. This is not surprising. A priori, it is unclear whether the introduction of such technology would improve student learning overall or worsen existing inequalities among students. On the one hand, the incorporation of technology represents additional inputs in teaching, similar to the introduction of teaching aids such as books and posters, which could improve students' learning. On the other hand, the effects of such programs on learning may be limited by the availability of complementary technology or other deficiencies in implementation (Barrera-Osorio and Linden 2009). Even with proper implementation, some students may be better positioned to take advantage of the introduction of such technologies, through greater resources at home or a stronger foundation, thereby aggravating the inequalities in learning outcomes instead of ameliorating them. Consequently, the impact of the introduction of technology on educational outcomes is mostly an empirical matter. The rather limited evidence so far, especially in developing countries, underscores the importance of generating evidence to inform policy makers on the merits and potential pitfalls of technology innovations in classrooms in developing countries. The remainder of the paper is organized as follows. 
Section 2 provides the contextual background of the Gambian education system and the design and implementation of the PSI-PMI program. Section 3 presents the research design, data, and methodology used in the study. The 5 results of our analyses, robustness checks, and heterogeneous impact analysis are presented in section 4. Section 5 discusses the qualitative assessment. Section 6 concludes. 2. Contextual Background 2.1 The Gambia’s Education System The Gambia’s current formal education system follows a 6-3-3-4 structure. There are six years of lower basic education (grades 1 to 6), which officially begins at age 7, followed by three years of upper basic education (grades 7 to 9). This is followed by three years of senior secondary education (grades 10 to 12) and two to four years of tertiary or higher education. This structure has a unified basic education level with automatic promotion in grades 1 to 9. The local education system carries out periodic evaluations to assess students' learning. Between grades 1 and 9, students take three exams, the National Assessment Test at grades 3, 5, and 8. These are low-stakes exams, as the results are only used by the government to gauge learning, rather than to determine progress or placement in the next higher grade. The two high- stakes exams are the GABECE, which is taken at grade 9 and determines entry into senior secondary schools, and the WASSCE, which is taken in grade 12 and determines placement in tertiary institutions. 4 The students taking the GABECE can take up to 10 subjects, but four of those must include the core subjects of mathematics, English, science, and social studies. The required subjects for the WASSCE in The Gambia are English and mathematics. Although The Gambia has made impressive progress in gross enrollment and realized the elimination of gender disparity in access at the basic cycle and senior secondary school levels, and in recent years girls have had better access than boys, the improvements in overall access rates have not been matched by increases in learning outcomes (Tomita and Savrimotoo 2016). For instance, grade 8 students scored an average of about 50 percent on the mathematics portion 4 These high-stakes exams (GABECE and WASSCE) are administered by the West African Examination Council. While the GABECE is specific to The Gambia, the WASSCE is also taken by high school students in Ghana, Liberia, Nigeria, and Sierra Leone. 6 of the National Assessment Test in 2017, which was an improvement of only 6 percentage points over the preceding five years. 5 2.2 PSI-PMI Program in The Gambia Stemming from concerns about students’ performance on national and international examinations, the Ministry of Basic and Secondary Education of The Gambia and the World Bank entered into an agreement in 2012 to implement the PSI-PMI as a pilot project. 6 The program was implemented with support from the New Jersey Center for Teaching and Learning (NJCTL), the organization that designed the program. Although the pilot has been implemented in a total of 24 basic cycle, upper basic, and senior secondary schools, this evaluation focuses on senior secondary schools (map 1). And even within the pilot schools, not all students were exposed to the PSI-PMI program. 
Map 1: PSI-PMI Program Schools (Senior Secondary Schools) in The Gambia The PSI-PMI program seeks to improve the quality of mathematics and science education through the facilitation of teacher instruction and monitoring and incorporation of technology 5 The 2017 National Assessment Test for grade 8 was designed to be aligned with the 2012 National Assessment Test for grade 8, by linking the test items so that the mean scores of the tests are comparable. 6 The Gambian government obtained approximately US$500,000 in July 2012 from the Institutional Development Fund through the World Bank, to implement the Teaching Math and Physics through e-learning project (P129888). 7 that enhances students’ participation. The program provided computers for teachers, scripted lessons, and customized software; equipped classrooms with smart projectors (smartboards) and handheld devices (smart responders) that students can use to respond to teachers; as well as provided textbooks for students. Rather than being a remedial tool, the program is incorporated into the curriculum by substantially altering the way science and mathematics subjects are traditionally taught. The program, which originated in New Jersey, in the United States, has gained traction within and outside the country. The Gambia was the first country in Sub-Saharan Africa to implement this program. The central part of the PSI-PMI technology is the IWB, which is a platform that allows teachers to develop digital course content. No internet connection is required at the school level since all the modules are downloadable; however, with internet access, the content can be available for other teachers in a way that enhances teacher collaboration. A complementary technology is the student responders, which are battery operated, wireless handheld devices that allow students to provide responses simultaneously. The responders have an interface with the IWB that enables teachers to monitor and track students’ responses in real-time. In addition, the responders enable teachers to assess students’ understanding of the material in real-time. For instance, as the teacher is discussing a particular topic, all students can respond to any question posed, which lets the teacher know what proportion of the students understood the material. This encourages participation in a way that is not possible in a traditional setting where only one student at a time can respond to a question. Another major part of the PSI-PMI program is the emphasis on a student-centered interactive teaching philosophy. The program is structured so that the delivery of the subject matter is frequently punctuated with brief student assessments. Even the seating of the students is optimized to reinforce greater student participation and collaboration. For example, the program recommends round tables for students, to encourage collaboration. In theory, some aspects of the PSI-PMI approach can be taught even in the absence of the technology. Prior to the launch of the program in each school, teachers took a pretest to gauge their content knowledge in the subject they were teaching. The first training, which lasted two weeks 8 in August 2012, focused on identified weak areas on the pretest. Then the same teachers attended training in December 2012, which was the last training they took before starting classroom instruction. NJCTL provided the training for the first cohort. 
The subsequent cohorts of teachers (cohorts 2 and 3) were trained by the first cohort of local teachers who performed well on the NJCTL training program, initially under NJCTL supervision. They trained the cohort 3 teachers without NJCTL supervision and started training in other countries, such as Nigeria, Niger, and Rwanda. In all the pilot schools, trained teachers taught the PSI-PMI courses.

Table 1 presents the timeline of the implementation of the program. Although the program was implemented according to the plan for the most part, there were some implementation challenges. For example, due to limited internet access and unreliable electricity supply, there was limited collaboration or peer review among teachers during program implementation in The Gambia. Similarly, due to limited funding, most of the schools were not equipped with round student tables; therefore, the program was implemented with traditional desk configurations in the classrooms. In addition, program implementation was halted in two pilot schools for at least a year due to technology breakdowns and the departures of trained teachers.

Table 1: Timeline of the PSI-PMI Program
August 2012 (Teacher training 1): Two-week training for 24 9th and 10th grade teachers in 12 upper basic and senior secondary program schools. Students in 12 pilot schools took the PSI algebra-based physics and PMI algebra exams (pretest exams).
December 2012 (Teacher training 2): A week-long follow-up training provided to teachers in the 12 program pilot schools.
January 2013: Students in program pilot schools began receiving PSI and PMI instruction.
April 2013 (Teacher training 3): Almost all the same teachers from the August and December trainings continued the training by NJCTL. It focused on algebra I and algebra-based physics courses.
Spring 2013: PSI and PMI instruction continued. Roll-out continued as additional schools received the technology.
June 2013: Upper basic school students in cohort 1 took a modified GABECE.
August 2013 (Teacher training 4): Teachers from cohort 1 had 10 additional days of PMI/PSI training. Under NJCTL supervision, teachers from cohort 1 trained the 29 cohort 2 9th and 10th grade teachers.
January 2014: One school from cohort 2 began instruction.
April 2014 (Teacher training 5): Cohort 1 completed algebra-based physics training and was also trained in geometry. Four selected cohort 1 Gambian teachers returned to train cohort 2 teachers.
Spring 2015: Cohort 1 PSI and PMI students took the WASSCE. Students in cohort 1 who took the WASSCE had received approximately two years of PMI/PSI instruction.
Spring 2016: Students in PSI and PMI cohorts 1 and 2 took the WASSCE. Students in cohort 1 who received PSI-PMI training from grade 9 and took the WASSCE had 3.5 years of PSI-PMI instruction; those in cohort 2 received approximately two years of PSI-PMI instruction.
Spring 2017: Students in PSI and PMI cohort 2 took the WASSCE.
2018: Gambian trainers trained trainers in Rwanda (February 2018), Nigeria (August 2018), and Niger (October 2018).
Present: The pilot program continues and is ready to scale up.
Note: The table mainly describes the training of cohort 1 teachers. Teachers and students in cohort 2 started receiving PSI and PMI training and instruction one year later than teachers and students in cohort 1. The WASSCE is organized by the West African Examinations Council and attended by candidates residing in Anglophone West African countries.
In these countries, access to university or other higher education depends on performance on the WASSCE. GABECE = Gambia Basic Education Certificate Examination; NJCTL = New Jersey Center for Teaching and Learning; PMI = Progressive Mathematics Initiative; PSI = Progressive Science Initiative; WASSCE = West African Secondary School Certificate Examination. 3. Study Design and Methodology The goal of the study is to estimate the impact of the PSI-PMI program on learning outcomes. Specifically, we aim to estimate the impact of the program on grade 12 students’ 10 performance. These students received the program in varying degrees, but every year for 3 years. Since we did not have student-level administrative data from the beginning, we first used school- level administrative data and propensity score matching to create a comparable group of schools. We then designed a pre-analysis plan using survey data we collected in all program and control schools, including on student performance on a test we designed. The survey data allowed us to build a second control group from within program schools. We consider the latter as our preferred control group because it addresses potential additional biases stemming from the differences in the quality of schools after the matching at the school level. We later used students’ high-school exit exam performance to replicate the results and control for student-level characteristics. Figure A2 in the appendix shows the steps of the study design timeline. 3.1. Data and Descriptive Statistics We used three main data sets for the analysis in this paper. The first source is the Gambia’s Education Management Information System (EMIS) data as of 2014. EMIS is a system- wide data set collected by MoBSE annually and covers all schools from early childhood development (ECD) to senior secondary education across the country, both public and private. EMIS data aim at facilitating better education system planning and policy dialogue. We used the EMIS data as the main data source for information about school characteristics at the baseline. The second data source is the 2014 and 2018 WASSCE data, obtained from the West African Examinations Council (WAEC). WASSCE is a type of standardized test in West Africa. It is administered by the WAEC. It is only offered to candidates residing in Anglophone West African countries (Nigeria, Ghana, Sierra Leone, Liberia and The Gambia). Admission to university education is based on student WASSCE exam scores. The WASSCE score is graded on a 9-point scale with 1 being the best score while 9 is a fail. A satisfactory grade locally known as “credit” is a score between 1 and 6, inclusive. Scoring “credit” in both Math and English” is a pre-requisite for entering university in The Gambia. We used the 2014 WASSCE data along with the EMIS data 11 as baseline characteristics and the 2018 7 data of WASSCE test outcomes as our main outcome variables in the analysis. The third data source is a survey designed by the team, following the use of administrative data to build a control group. The survey collected data on students in grade 12 in both the program and control schools. The survey also gathered information on learning outcomes through written assessments in English and Math. The questions in the assessment were designed by teachers designated by MoBSE and based on the curriculum. We additionally gathered information on the socio-economic characteristics of students, teachers’ background, and school principal’s background. 
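To make the grading rules above concrete, the short sketch below shows how the "credit in both mathematics and English" indicator used as an outcome later in the paper could be derived from raw WASSCE grades. This is an illustrative reconstruction rather than the authors' actual data-processing code, and the column names (math_grade, english_grade) are hypothetical.

```python
import pandas as pd

# Hypothetical student-level WASSCE records. Grades are on the 1-9 scale,
# where 1 is the best score, 9 is a fail, and 1-6 (inclusive) counts as "credit".
wassce = pd.DataFrame({
    "student_id": [101, 102, 103],
    "math_grade": [4, 7, 6],
    "english_grade": [5, 5, 8],
})

CREDIT_MAX = 6  # grades 1 through 6 are "credit"

# Indicator for credit in BOTH mathematics and English, the prerequisite
# for university entrance in The Gambia described above.
wassce["credit_both"] = (
    (wassce["math_grade"] <= CREDIT_MAX) & (wassce["english_grade"] <= CREDIT_MAX)
).astype(int)

print(wassce[["student_id", "credit_both"]])
```

In this toy example, only the first student meets the credit-in-both criterion.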
We also collected qualitative data to shed light on evaluation questions and topics that were not well-suited for quantitative measurement and analysis, using semi-structured interviews with program teachers, school headmasters, and project beneficiaries. The survey allowed us to collect the unique identifiers of students, assigned for the purpose of the WASSCE 2018, and we later used the identifiers to match the survey with students' WASSCE performance. Additionally, we collected students' GABECE performance data to serve as the student-level baseline performance. To allow for further matching at the student level, we oversampled control students, surveying two to three times as many students in the control groups as in the program group.

3.2. Identification

The identification and the causal interpretation of the findings rely on a combination of two main methods. First, the Ministry of Basic and Secondary Education used the following criteria to select the pilot schools: availability of electricity, availability of a sufficient number of science teachers, and the presence of a computer lab. As a result, a simple comparison of the program students and non-program students is ruled out. From Table 2, we see that PSI-PMI pilot schools are generally better off than the non-program schools but not across the board: the program pilot schools tend to be larger, with more enrolled students, but at a higher student-teacher ratio; they are more likely to have electricity (which is essential for the PSI-PMI program), a library, and a computer lab; and a higher share of students in program schools take the WASSCE exam.

7 The study team was able to obtain student-level WASSCE 2018 data for the 32 schools in the sample. We then used the WASSCE ID collected during the survey to match students' WASSCE scores with other information. The match rate was 99.81%.

To construct a reliable control group, we used extensive administrative data from 2014, 8 in addition to detailed information on the selection process of program schools, to select a list of non-program schools that would have been ex-ante comparable to the program schools, through propensity score matching. Although there were 135 senior secondary schools in The Gambia in 2014, only 71 of these schools comprise our sampling frame. This is because the other schools are religious schools or follow a nonstandard curriculum. As such, students in those schools do not take the grade 12 WASSCE exam, which is our main outcome variable. This means that the total number of senior secondary schools from which we could select our control schools was 55. Table 2 provides the average differences between the pilot schools and this universe of control schools.

After applying the propensity score to match schools based on the above pretreatment variables (using nearest neighbor matching), 16 control schools were selected. The descriptive statistics on these schools and how they compare with the pilot schools are presented in Table 3. The table shows that our matching produces a comparable control group at the school level. We also used the Kolmogorov-Smirnov test to check the equality of the distribution for each variable before and after matching. Figure 1 shows the effects of matching.

Although there is one set of 16 control schools, there are two different sets of control students, as stated before. This is because even in the PSI-PMI pilot schools, some students were not exposed to the program due to budgetary constraints.
As a result, there were within-school control students (unexposed students in pilot schools) and between-school control students (students in schools that were not part of the pilot program). The within-school control group alleviates the concerns of bias at the school level, especially given the small sample of schools. Therefore, we use the within-school control as our reference.

8 In spring 2014, all program schools had been selected and had started PSI-PMI instruction, but there were no beneficiaries taking the WASSCE until 2015.

Table A1, in the appendix, presents a comparison of the descriptive statistics of GABECE scores, socioeconomic characteristics, WASSCE scores, and researcher designed exam scores for the treatment group and the within- and between-school control groups before matching. It also shows the descriptive statistics of all the students in the study together. Regardless of the group, the grade 12 students are generally about age 20 years with similar socioeconomic backgrounds. However, the PSI-PMI group has fewer female students than the other two groups. Students in the pilot program performed better on the GABECE in mathematics and science, compared with the within-school control students, and the pilot program students were comprehensively superior (in all subjects) to the between-school control group. Tables A1a and A1b in the appendix present GABECE scores and socioeconomic characteristics after matching at the student level, showing overall that the different groups are comparable pre-intervention.

This matching method, however, relies on the assumption that unobserved characteristics are also orthogonal to participation in the program, conditional on the observables, a somewhat stringent assumption. We supplemented this method with a pre-analysis that builds further confidence in the estimates. Prior to students taking the final regional exam, the WASSCE 2018, we administered a test and used that test along with student-level data to conduct a first analysis and predict the impact of the program. When the students took the WASSCE at the end of the year, we then used those data to replicate the findings, building confidence in the estimates and obtaining similar results from different data sources made available at different points in time.

Table 2: Baseline Characteristics of PSI-PMI Pilot Senior Secondary Schools and Other Senior Secondary Schools before Matching, 2014
Columns: Variable; PSI-PMI pilot senior secondary schools (N, Mean, Std. dev.); Other senior secondary schools (N, Mean, Std. dev.); Diff; P-value.
School hardware characteristics Good desks (%) 16 0.92 0.34 55 0.91 0.26 0.96 0.82 Has library (%) 16 1.00 0.00 55 0.85 0.05 0.15 0.00 Has computer lab (%) 16 1.00 0.00 55 0.76 0.06 0.24 0.00 Has water tap (%) 16 0.88 0.09 55 0.89 0.04 -0.01 0.86 Has electricity (%) 16 0.88 0.09 55 0.69 0.06 0.19 0.08 School software characteristics Total enrollment 16 868.36 168.06 55 509.09 51.41 359.29 0.04 Student-teacher ratio 16 34.53 2.87 55 26.10 1.75 8.43 0.01 Qualified teachers (%) 16 0.97 0.55 55 0.95 0.75 2.14 0.21 Female teachers (%) 16 0.92 0.19 55 0.95 0.13 -0.24 0.91 Students’ performance Mathematics Students passing the test (score 1-8) (%) 16 0.16 0.05 55 0.13 0.03 0.03 0.72 Students with excellent or good scores (1-3) (%) 16 0.01 0.01 55 0.01 0.01 0.01 0.41 Students earning credit (score 4-6) (%) 16 0.05 0.02 55 0.03 0.01 0.02 0.34 Students with passing scores (7-8) (%) 16 0.09 0.03 55 0.10 0.02 -0.01 0.80 English Students passing the test (score 1-8) (%) 16 0.37 0.06 55 0.27 0.03 0.01 0.11 Students with excellent or good scores (1-3) (%) 16 0.01 0.00 55 0.00 0.00 0.004 0.17 Students earning credit (score 4-6) (%) 16 0.11 0.03 55 0.06 0.01 0.05 0.17 Students with passing scores (7-8) (%) 16 0.26 0.04 55 0.21 0.02 0.05 0.20 Students taking science subjects in the WASSCE 16 0.34 0.34 55 0.18 0.30 0.16 0.09 2014 (%) Note: The WASSCE is graded on a score of 1 to 9, with lower values being better. The P-value is based on using the t-test to examine whether these two groups (program versus non-program schools) have equal means for the WASSCE scores, by clustering at the school level. Robust standard errors are presented. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. 15 Table 3: Baseline Characteristics of PSI-PMI Pilot Senior Secondary Schools and Other Senior Secondary Schools after Matching, 2014 PSI-PMI pilot senior Matched senior Variable secondary schools secondary schools Diff P-value N Mean Std. dev. N Mean Std. dev. School hardware characteristics Good desks (%) 16 0.92 0.34 16 0.93 0.55 -0.85 0.90 Has library (%) 16 1.00 0.00 16 0.94 0.06 0.06 0.33 Has computer lab (%) 16 1.00 0.00 16 0.94 0.06 0.06 0.33 Has water tap (%) 16 0.88 0.09 16 0.94 0.06 -0.06 0.56 Has electricity (%) 16 0.88 0.09 16 1.00 0.00 -0.12 0.15 School software characteristics Total enrollment 16 868.38 168.06 16 677.69 98.69 190.69 0.34 Student-teacher ratio 16 34.53 2.87 16 32.18 3.85 2.35 0.63 Qualified teachers (%) 16 0.97 0.55 16 0.97 0.14 0.39 0.84 Female teachers (%) 16 0.92 0.19 16 0.12 0.31 -2.52 0.50 Students’ performance Mathematics Student passing the test (score 1-8) (%) 16 0.16 0.05 16 0.15 0.01 0.01 0.89 Students with excellent or good scores (1-3) (%) 16 0.01 0.01 16 0.01 0.00 0.01 0.26 Students earning credit (score 4-6) (%) 16 0.05 0.02 16 0.04 0.01 0.01 0.56 Students with passing scores (7-8) (%) 16 0.09 0.03 16 0.10 0.04 -0.01 0.74 English Students passing the test (score 1-8) (%) 16 0.37 0.06 16 0.40 0.05 -0.03 0.70 Students with excellent or good scores (1-3) (%) 16 0.01 0.003 16 0.00 0.00 0.003 0.39 Students earning credit (score 4-6) (%) 16 0.11 0.03 16 0.10 0.03 0.01 0.96 Students with passing scores (7-8) (%) 16 0.26 0.04 16 0.30 0.03 -0.04 0.45 Students taking science subjects in the WASSCE 16 0.34 0.08 16 0.40 0.10 -0.06 0.64 2014 (%) Note: The WASSCE is graded on a score of 1 to 9, with lower values being better. 
The P-value is based on using the t-test to examine whether these two groups (program versus non-program schools) have equal means for the WASSCE scores, by clustering at the school level. Robust standard errors are presented. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination.

Figure 1: PSI-PMI Schools and Non-PSI-PMI Schools
[Two panels, "Before Matching" and "After Matching," showing program and non-program schools along a school quality index.]
Note: The school quality index is illustrative and not an actual constructed index. To construct the propensity score matching to create a comparable group of schools, we used the following school-level variables. School hardware characteristics: the presence of electricity, good desks in school, a library, a computer lab, and running water in the school. School software characteristics: student enrollment, student-teacher ratio, percentage of qualified teachers, and percentage of female teachers. Students' performance: the share of students with high performance in mathematics, English, and science on the WASSCE. PSI-PMI schools do very well on these indicators. The non-PSI-PMI schools used in the study were assumed to do as well on these indicators as the program schools. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination.

3.3. Model Specification

To estimate the impact of the PSI-PMI program on student learning outcomes, we estimate the following model on the sample of the matched program and non-program students:

$$Y_{ist} = \beta_0 + \beta_1 T_s + \beta_2 X_{is} + \varepsilon_{ist} \qquad (1)$$

where $Y_{ist}$ is the performance outcome of student $i$ in school $s$ on test $t$. Our main outcome variables are standardized mathematics scores on the WASSCE and a dummy variable for having satisfactory scores9 in mathematics and English on the WASSCE. The variable $T_s$ is the treatment variable, which takes the value 1 if the student is in the treatment group and 0 otherwise. $X_{is}$ is a vector of covariates that includes the student- and school-level variables, and $\varepsilon_{ist}$ is the idiosyncratic error term. In our estimations, we also control for any preexisting differences between the treatment and control groups. We estimated the program effect in equation 1 using OLS and directly through the propensity score matching estimator.

The student-level variables used as controls include the following: (i) students' performance in mathematics, English, and science three years earlier in grade 9; (ii) whether the student has repeated a grade; (iii) socioeconomic status, as measured by household asset ownership (TV, fridge, and car); and (iv) gender. The school-level variable is administrator capacity, as measured by the principal's level of education and experience. The standard errors are clustered at the school level.

Grade 12 students can take up to nine subjects in the WASSCE. However, only mathematics and English are compulsory for all students. Students' choice of subjects could introduce a potential bias, since the choice of subjects is not random. Hence, we focus on mathematics and English scores for the impact of the PSI-PMI program, although the program is designed for mathematics and science subjects (biology, chemistry, and physics).

9 A satisfactory score is defined as a score of at least 50 percent. On the WASSCE scale, this corresponds to a score between 1 and 6, inclusive.

4. Results and Discussion

4.1. Main Results: OLS Estimates

Table 4 presents the results of the estimation of equation (1).
The outcome variable is the mathematics score (standardized) from the exam we administered to the grade 12 students. Columns 1 and 2 show the program effect, comparing program students with within-school control students. Column 2 controls for the student- and school-level covariates. Columns 3 and 4 show the program effect, comparing program students with between-school control students. Column 4 controls for the student- and school-level covariates. In all the estimations, we find significantly positive effects of the PSI-PMI program on students’ performance in mathematics. As stated before, our preferred results are reported in column 2, which not only includes student- and school-level controls but also uses within-school control groups. In other words, compared with the within-school control group, the PSI-PMI program improves students’ mathematics scores by 0.54 standard deviation or 11.9 percentage points, from 38.8 to 50.7 on a scale of 0-100. Nevertheless, the magnitudes of the estimated effects of the PSI-PMI program on student scores using the research design exam are not statistically different from each other. Compared with the between-school control group, the result is similar; PSI-PMI improves students’ mathematics performance by 12.1 percentage points (0.65 SD). We also find similar robust evidence of the positive effect, albeit smaller, of the PSI-PMI program on students’ performance in English (see Table A2, in appendix). 19 Table 4: Effects of the PSI-PMI Program on Student Learning Outcomes: Using the Researcher Designed Exam Score Outcome variable: Standardized mathematics score Within-school control Between-school control Variable group group [1] [2] [3] [4] PSI-PMI program 0.634** 0.544*** 0.848*** 0.655*** (0.247) (0.150) (0.306) (0.207) Student-level covariates NO YES NO YES School-level covariates NO YES NO YES Average mathematics score for control 38.78 38.78 33.37 33.37 Standard deviation of mathematics score for 21.91 21.91 18.51 18.51 control Observations 875 875 1,044 1,044 R-squared 0.065 0.349 0.122 0.24 Note: The results are based on equation 1. The outcome variable is the standardized mathematics score on the researcher designed exam. Robust standard errors clustered at the school level are in parentheses. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative.*** p < 0.01, ** p < 0.05, * p < 0.1 School covariates variables include: presence of electricity in school, good desks in school, school has a library, a computer lab, running water in the school, student enrollment, student-teacher ratio, percentage of qualified teachers, percentage of female teachers, experience of the school principal, age of school principal, administration experience of school principal, level of education of school principal, school principal is female, share of students with high performance in mathematics, English, and science on the WASSCE. Student-level covariates variables include the following: percentage of students not repeating any grade, percentage of students earned credit and above on GABECE mathematics, percentage of students earned credit and above on GABECE English, percent of students earned credit and above on general science, all at baseline. And socioeconomic characteristics of students—percent of students in households that has a quality roof, a wall, television, car, access to grid electricity, parents have some education. We later estimated equation (1) using the official WASSCE scores from 2018. 
The results are presented in Table 5. The result from the preferred specification is strikingly similar to the pre-analysis using the survey and researcher-administered test (0.59 versus 0.54 as reported in Table 4). The estimates from the between-school control are positive and similar in magnitude, although they are not statistically significant. The lack of statistical significance, in this case, is likely due to the small sample size offering limited statistical power. The estimated effects from the preferred specification correspond to a real significant gain in learning improvement that is equivalent to an increase of about 9 percentage points from a low base of 20.18 in the control group. In Table 5, columns 5 to 8, we use a different outcome measure, which is a dummy variable indicating whether a student received a satisfactory score on both English and 20 mathematics in the WASSCE. Using this outcome variable is particularly relevant for policy makers, as English and mathematics are the core subjects that all students must take. Furthermore, a satisfactory score 10 in these two subjects is among the requirements for university entrance in The Gambia. As in the other columns, the results show that PSI-PMI led to a gain of between 15 and 21 percent more students who met the basic criteria for attending university. These results are statistically significant at the 10 percent level. Table 5: Effects of the PSI-PMI Program on Student Learning Outcomes: Using the WASSCE Exam Score Outcome variable: Received Outcome variable: Standardized WASSCE satisfactory (“credit”) score on the mathematics score WASSCE in both mathematics and English Variable Within-school Between-school Within-school Between-school control group control group control group control group [1] [2] [3] [4] [5] [6] [7] [8] PSI-PMI program 0.788** 0.585** 0.717 0.443 0.188* 0.154* 0.207* 0.157* (0.334) (0.206) (0.445) (0.324) (0.101) (0.079) (0.118) (0.086) Student-level covariates NO YES NO YES NO YES NO YES School-level covariates NO YES NO YES NO YES NO YES Average WASSCE mathematics 20.18 20.18 20.74 20.74 0.05 0.05 0.03 0.03 score for control Standard deviation of WASSCE 15.6 15.6 15.15 15.15 0.23 0.23 0.18 0.18 mathematics score for control Observations 873 873 1,037 1,037 873 873 1,034 1,034 R-squared 0.096 0.434 0.077 0.247 0.073 0.258 0.097 0.181 Note: The results are based on equation 1. The outcome variables are the standardized WASSCE mathematics and English exam scores and students receiving satisfactory (“credit”) score on the WASSCE in both mathematics and English. Robust standard errors clustered at the school level are in parentheses. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. *** p < 0.01, ** p < 0.05, * p < 0.1. School covariates variables include: presence of electricity in school, good desks in school, school has a library, a computer lab, running water in the school, student enrollment, student-teacher ratio, percentage of qualified teachers, percentage of female teachers, experience of the school principal, age of school principal, administration experience of school principal, level of education of school principal, school principal is female, share of students with high performance in mathematics, English, and science on the WASSCE. 
Student-level covariates include the following: percentage of students not repeating any grade; percentage of students who earned credit and above on GABECE mathematics, on GABECE English, and on general science, all at baseline; and socioeconomic characteristics of students: percentage of students in households that have a quality roof, a wall, a television, a car, and access to grid electricity, and whose parents have some education.

10 The WASSCE is graded on a nine-point scale, with 1 being the best score and 9 being a fail. A satisfactory grade is locally known as "credit," which is a score between 1 and 6, inclusive.

4.2. Main Results: Propensity Score Matching Estimates

Rather than using the OLS approach to estimate the treatment effect after constructing the control groups, we now use the propensity score matching estimator directly to capture the treatment effect. Since this estimator allows for a large set of covariates without concern about dimensionality issues, we expanded the covariates used in the OLS estimation.11 The results are presented in the appendix in Table A3 and Table A4 with an expanded set of covariates. The results still show that the PSI-PMI program had a statistically significant positive effect on student performance. The PSM estimates are larger than the OLS estimates, both for the standardized math score (0.79 vs. 0.59 SD) and for the share of students passing both math and English (23 vs. 15 percentage points). We consider these results to reinforce the finding that the program led to large gains in students' performance.

11 The OLS estimator controls only for variables that are potentially highly unbalanced between treatment and control. Any additional controls in the OLS estimations serve only the purpose of increasing the precision of the coefficient of interest. Table A3 uses the same covariates as in the OLS, and Table A4 expands to include additional covariates.

4.3. Heterogeneity Analysis

In this subsection, we assess the heterogeneous impact of the program to uncover whether the program had larger or smaller impacts among students of different socioeconomic backgrounds, gender, and initial performance (three years earlier). To analyze this question, we estimate the following model, an augmented version of equation (1) with an interaction term:

$$Y_{ist} = \beta_0 + \beta_1 T_s + \beta_2 (T_s \times I_i) + \beta_3 I_i + \varepsilon_{ist} \qquad (2)$$

where $I_i$ denotes gender, socioeconomic status, or grade 9 performance, and $T_s \times I_i$ is an interaction term. We find that the estimated effect of the program was driven largely by high-performing students who scored highly on their grade 9 exam. Figure 2 shows that the students who scored satisfactory grades (known locally as "credit") accounted for most of the gain from the program. In other words, the benefit of the PSI-PMI program, as measured by students' performance in grade 12, accrues to high-performing students. As such, additional measures are needed to ensure that the benefits of the program are more inclusive along the entire spectrum of student performance. Figure A1, in the appendix, provides the results for the between-school control group.
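For readers who want to see how a specification like equation (2) could be taken to data, the sketch below estimates the program effect and its interaction with a baseline characteristic by OLS, with standard errors clustered at the school level as described in section 3.3. It is a minimal illustration on simulated data under assumed variable names (wassce_math_std, treated, female, school_id), not the authors' estimation code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the student-level analysis file (hypothetical variable names).
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "school_id": rng.integers(0, 32, n),   # 32 schools, as in the matched sample
    "treated": rng.integers(0, 2, n),      # PSI-PMI exposure dummy T
    "female": rng.integers(0, 2, n),       # interaction characteristic I (here: gender)
})
# Outcome with an arbitrary treatment effect, for illustration only.
df["wassce_math_std"] = 0.5 * df["treated"] + rng.normal(0, 1, n)

# Equation (2): Y = b0 + b1*T + b2*(T x I) + b3*I + e, with school-clustered SEs.
model = smf.ols("wassce_math_std ~ treated + treated:female + female", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(result.summary())
```

As a check on magnitudes, an effect expressed in standard deviations converts to points on the 0-100 scale by multiplying by the control group's standard deviation: the 0.585 SD WASSCE effect in Table 5, times the control standard deviation of 15.6, is roughly 9 points, consistent with the gain of about 9 percentage points reported in section 4.1.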
Figure 2: Effects of the PSI-PMI Program on Student Learning Outcomes: By Baseline Performance
[Two panels titled "Impact of PSI-PMI Program on Students with Varied Baseline Performance, Program Students vs. Within-School Control Students." The vertical axis shows the improvement/effect on the standardized WASSCE mathematics score; the horizontal axis shows the GABECE mathematics grade (1 to 9) in the first panel and the GABECE mathematics grade band (credit and above, pass, fail) in the second panel.]
Note: The standardized WASSCE mathematics score is the dependent variable. The results are for the program group versus the within-school control group, controlling for covariates such as gender, baseline performance, socioeconomic background, and school characteristics. GABECE = Gambia Basic Education Certificate Examination; PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. The GABECE is graded on a score of 1 to 9, with lower values being better.

A more concerning equity consideration may also arise if students' performance at baseline is correlated with their socio-economic status (SES). In that case, the heterogeneous effect could imply that the program benefits students of high socio-economic status more. However, unlike for baseline academic performance, we do not find significant interaction effects of gender or socioeconomic background with the PSI-PMI program on student performance (Table 6). Table A5, in the appendix, provides the results when we interact the PSI-PMI program with socioeconomic background, in which case we find no statistically significant interaction effect.

Table 6: Effects of the PSI-PMI Program on Student Learning Outcomes: Interaction Effect
Columns: (1) standardized mathematics score (researcher designed exam), within-school control group; (2) standardized WASSCE mathematics score, within-school control group; (3) standardized mathematics score (researcher designed exam), between-school control group; (4) standardized WASSCE mathematics score, between-school control group.
PSI-PMI program: 0.477** (0.204); 0.477* (0.262); 0.632** (0.259); 0.279 (0.357)
Interaction effect (Program*Gender: 1=Female): -0.020 (0.147); 0.164 (0.131); 0.004 (0.164); 0.215 (0.169)
Student-level covariates: YES; YES; YES; YES
School-level covariates: YES; YES; YES; YES
Average mathematics score for control: 38.78; 20.18; 33.37; 20.74
Standard deviation of mathematics score for control: 21.91; 15.60; 18.51; 15.15
Observations: 875; 873; 1,044; 1,037
R-squared: 0.265; 0.334; 0.226; 0.235
Note: The results are based on equation (2). The outcome variables are the standardized mathematics score from the researcher designed exam and the standardized WASSCE mathematics exam score. Robust standard errors are in parentheses. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. *** p < 0.01, ** p < 0.05, * p < 0.1. School covariates include: presence of electricity in school, good desks in school, school has a library, a computer lab, running water in the school, student enrollment, student-teacher ratio, percentage of qualified teachers, percentage of female teachers, experience of the school principal, age of school principal, administration experience of school principal, level of education of school principal, school principal is female, and share of students with high performance in mathematics, English, and science on the WASSCE.
Student-level covariates include the following: percentage of students not repeating any grade; percentage of students who earned credit and above on GABECE mathematics, on GABECE English, and on general science, all at baseline; and socioeconomic characteristics of students: percentage of students in households that have a quality roof, a wall, a television, a car, and access to grid electricity, and whose parents have some education.

The fact that the program impact is not heterogeneous along the gender and socio-economic dimensions alleviates equity concerns about the impact of the program. Nevertheless, the program could still be disproportionately benefiting high-SES students if they are disproportionately represented in the group of high performers. We assess this in our sample by looking at the composition of students who scored credit and above on the GABECE. Column 1 in Table A6 shows that the share of students who scored credit and above is similar in each of the first three quartiles of student SES (ranging from 42 to 43 percent) and is 53 percent in the top quartile. There is also a 5 percentage point gap in favor of boys (47% vs. 42%). Neither of these variations is as disproportionately large as one might have anticipated. Furthermore, column 2 in the same table shows that of all the students who scored credit and above on the GABECE, 25.4% are in the bottom quartile of SES and 61% are in the bottom half of SES, and the gender composition within these groups is about balanced, as shown in the bottom panel of Table A6. Surprisingly, there are more girls than boys among the high performers at baseline: of all the high performers at baseline, 60.4% were girls and 39.6% were boys, and thus the program benefited more girls than boys. If the observation from our sample holds at the national level, then we can conclude that the program does not pose a significant equity issue in terms of students' socio-economic background and benefits girls more than boys. This may not be surprising, because poor children who make it to high school are often among the high performers.

5. Qualitative Assessment

To arrive at a fuller picture of the program's impact and draw relevant policy implications, we complement the rigorous estimation of the program's impact on test scores with qualitative analysis. The qualitative analysis includes a qualitative survey, as well as focus group interviews with teachers, school administrators, and Ministry of Education officials. The goal of the qualitative study was to understand teachers' perceptions of the technology, whether any implementation challenges occurred, and to what extent the challenges would help explain the results obtained in the rigorous quantitative evaluation.

School principals and teachers have highly positive views about the program. For instance, about 80 percent of the mathematics teachers and 95 percent of the science teachers who used the PSI-PMI program to teach believe that it improves the effectiveness of their teaching. One way in which the teachers indicated that the program changed their teaching practice is that the nature of the program required them to spend more time preparing for their classes. Similarly, the teachers indicated that the program forces them to spend more time going through the same material relative to their traditional teaching methods.
In accordance with the program design, teachers indicated that their emphasis and the speed with which they cover material are guided by students' comprehension, which is relayed in real time by the handheld responders. However, more rigorous study is needed to establish how the program, and CAL programs in general, interact with teachers' effort and affect pedagogy (Escueta et al. 2017: 30).

Teachers' and administrators' concerns about the program are driven mostly by the frequency of breakdowns of the technologies (the responders and the IWB), which were not always repaired in a timely manner. Another program-related problem has been insufficient materials, particularly textbooks and handheld responders; this problem was limited to large schools with high average class sizes. A major problem that was not specific to the program but nonetheless affected its functioning was the country's unreliable electricity supply. Although the PSI-PMI program can in principle be delivered without power or the technology, in practice most teachers did not use the program when the power was out or when the technology broke down.

The Ministry of Education appears constrained in its ability to monitor the program's implementation in the pilot schools. This was reflected mainly in the lack of timely responses to malfunctioning equipment. The choice of technology often did not take into account how easily replacement parts could be acquired locally when breakdowns occurred. Given these implementation gaps, the program might have had an even greater impact on student learning in their absence. It is also possible that these challenges would be magnified if the program were scaled nationally.

6. Conclusions

This study analyzed the impact of the PSI-PMI, a CAL program, on the learning outcomes of Gambian senior secondary school students. The program was directly integrated into classroom teaching and learning, in contrast to other CAL programs that are designed as remedial tools. The program was implemented as a pilot in several Gambian senior secondary schools. Given the nonrandom selection of the participating schools, we used propensity score matching to select a comparable group of control students and estimate the program's impact on student learning. Our evaluation focuses on final-year senior secondary (grade 12) students' performance on the compulsory nationwide examination.

Our estimation results show that the PSI-PMI program improved student performance in mathematics. Specifically, the program improved students' mathematics scores by 11.9 percentage points, from 38.8 to 50.7 percent, on the researcher-designed exam, and by 9.2 percentage points on the WASSCE.
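As a back-of-the-envelope conversion (ours, using the control-group standard deviations reported in Table 6; the paper's standardization may differ slightly), these percentage-point gains correspond to effect sizes of roughly

\[
\frac{11.9}{21.91} \approx 0.54 \ \text{SD (researcher-designed exam)}, \qquad
\frac{9.2}{15.60} \approx 0.59 \ \text{SD (WASSCE mathematics)}.
\]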
We find no interaction effects of the program with gender or socioeconomic background. However, the gains in exam performance are largely driven by high-performing students, based on how well they performed three years earlier in grade 9. Although the program has had an overall positive effect, additional measures are needed to ensure that its benefits are more inclusive of students at the lower end of the baseline performance distribution. At the same time, the fact that the impact does not vary with students' gender or socioeconomic status suggests that the program likely raises no major equity concerns. Furthermore, among the students most likely to benefit from the program, the majority are girls and students from the bottom half of the socioeconomic status index. The qualitative assessment shows that teachers and administrators viewed the program's introduction quite positively.

The lessons from both the quantitative and qualitative analyses suggest several recommendations to strengthen the program. First, program implementation should build in careful evaluation along the way to allow for timely adjustment and promote learning by doing. It is therefore critical to document the implementation process carefully, with reliable data, and, if possible, to adopt a randomized controlled trial design that permits more rigorous evaluation. Second, the PSI-PMI content requires continuous effort to align it with the existing curriculum and student assessments. Many teachers reported that the content was not sufficiently adapted, and it may need to be adjusted to match the Gambian curriculum on which the WASSCE is based. Third, to address the variation in impact across students and ensure that the program benefits all of them, implementers should monitor implementation at the classroom level and set up classroom peer observations and exchange visits among teachers within and across schools. More generally, experience sharing among countries' administrators and teachers can help avoid challenges encountered elsewhere. Fourth, a highly responsive technical support unit per school (or group of schools) is needed to handle technical glitches and replace defective equipment, ensuring implementation continuity. The experience of The Gambia also demonstrates that reliable electricity is essential for smooth implementation; countries should therefore consider school-level investments in backup energy supply, such as generators or solar panels.

References

Angrist, J., & Lavy, V. (2002). New evidence on classroom computers and pupil learning. Economic Journal, 112, 735–65.
Banerjee, A. V., Cole, S., Duflo, E., & Linden, L. (2007). Remedying education: evidence from two randomized experiments in India. Quarterly Journal of Economics, 122(3), 1235–64.
Barrera-Osorio, F., & Linden, L. (2009). The use and misuse of computers in education: evidence from a randomized experiment in Colombia. Policy Research Working Paper 4836, World Bank, Washington, DC.
Barrow, L., Markman, L., & Rouse, C. (2009). Technology's edge: the educational benefits of computer-aided instruction. American Economic Journal: Economic Policy, 1(1), 52–74.
Bashir, S., Lockheed, M., Ninan, E., & Tan, J.-P. (2018). Facing forward: schooling for learning in Africa. Washington, DC: World Bank.
Bet, G., Ibarraran, P., & Cristia, J. (2010). Access to computers, usage, and learning: evidence from secondary schools in Peru. Mimeograph, Inter-American Development Bank Research Department.
Bold, T., Filmer, D., Martin, G., Molina, E., Stacy, B., Rockmore, C., & Wane, W. (2017). Enrollment without learning: teacher effort, knowledge, and skill in primary schools in Africa. Journal of Economic Perspectives, 31(4), 185–204.
Carrillo, P. E., Onofa, M., & Ponce, J. (2011). Information technology and student achievement: evidence from a randomized experiment in Ecuador. Washington, DC: Inter-American Development Bank.
Cristia, J. P., Czerwonko, A., & Garofalo, P. (2010). Does ICT increase years of education? Evidence from Peru. Washington, DC: Inter-American Development Bank.
Escueta, M., Quan, V., Nickow, A. J., & Oreopoulos, P. (2017). Education technology: an evidence-based review. NBER Working Paper No. 23744, National Bureau of Economic Research, Cambridge, MA.
Evans, D. K., Santos, I., & Arias, O. (2019). The skills balancing act in Sub-Saharan Africa: investing in skills for productivity, inclusivity, and adaptability. Washington, DC: World Bank.
Goolsbee, A., & Guryan, J. (2006). The impact of internet subsidies in public schools. The Review of Economics and Statistics, 88(2), 336–347.
Lai, F., Luo, R., Zhang, L., Huang, X., & Rozelle, S. (2015). Does computer-assisted learning improve learning outcomes? Evidence from a randomized experiment in migrant schools in Beijing. Economics of Education Review, 47, 34–48.
Lai, F., Zhang, L., Bai, Y., Liu, C., Shi, Y., Chang, F., & Rozelle, S. (2016). More is not always better: evidence from a randomized experiment of computer-assisted learning in rural minority schools in Qinghai. Journal of Development Effectiveness, 8(4), 449–472.
Linden, L. L. (2008). Complement or substitute? The effect of technology on student achievement in India. Working Paper No. 17, Jameel Poverty Action Lab, Columbia University, New York.
Machin, S. J., McNally, S., & Silva, O. (2006). New technology in schools: is there a payoff? IZA Discussion Paper No. 2234, Institute of Labor Economics, Bonn.
Muralidharan, K., Singh, A., & Ganimian, A. J. (2019). Disrupting education? Experimental evidence on technology-aided instruction in India. American Economic Review, 109(4), 1426–1460.
Roschelle, J., Shechtman, N., Tatar, D., Hegedus, S., Hopkins, B., Empson, S., Knudsen, J., & Gallagher, L. P. (2010). Integration of technology, curriculum, and professional development for advancing middle school mathematics: three large-scale studies. American Educational Research Journal, 47(4), 833–78.
Rouse, C. E., & Krueger, A. B. (2004). Putting computerized instruction to the test: a randomized evaluation of a "scientifically based" reading program. Economics of Education Review, 23(4), 323–38.
Tomita, R. T., & Savrimootoo, T. J. (2016). Improving education performance in math and science in The Gambia. Washington, DC: World Bank.

Appendix: Additional Figures and Tables

Figure A1: Effects of PSI-PMI on Student Learning Outcomes: Heterogeneity Impact Analysis

[Figure A1 comprises two panels titled "Impact of PSI-PMI Program on Students with Varied Baseline Performance: Program Students vs. Between-School Control Students." Both panels plot the improvement (effect) in the standardized WASSCE mathematics score against baseline GABECE mathematics performance: the first panel by GABECE mathematics grade (1 to 9), the second by grade band (Credit and Above, Pass, Fail).]

Note: The standardized WASSCE mathematics score is the dependent variable. The results are for the program group versus the between-school control group, controlling for covariates such as gender, baseline performance, socioeconomic background, and school characteristics. GABECE = Gambia Basic Education Certificate Examination; PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. The GABECE is graded on a scale of 1 to 9, with lower values being better.
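Figures 2 and A1 plot one program-effect estimate for each baseline GABECE mathematics grade. The sketch below, with hypothetical variable names and an abbreviated covariate set, illustrates one way such grade-by-grade estimates could be produced; the authors' exact specification may differ.

```python
# Illustrative sketch only: hypothetical variable names, abbreviated covariate set.
import statsmodels.formula.api as smf

def effects_by_baseline_grade(df):
    """One program-effect estimate per baseline GABECE mathematics grade (1-9)."""
    model = smf.ols(
        "wassce_math_z ~ C(gabece_math_grade):psi_pmi + C(gabece_math_grade)"
        " + female + ses_index",
        data=df,
    ).fit(cov_type="HC1")
    # Each C(gabece_math_grade)[g]:psi_pmi coefficient is the estimated program effect
    # for students who scored grade g on GABECE mathematics at baseline.
    return model.params.filter(like=":psi_pmi")
```

Plotting these coefficients, with their confidence intervals, against the baseline grade reproduces the pattern emphasized in the text: gains concentrated among students with better (lower) baseline GABECE grades.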
Figure A2: Timeline of the Research Activities

[Figure A2 is a timeline chart of the research activities, covering November and December 2017; April, May, and June 2018; and March, April, May, and June 2019.]

Table A1: Comparison of Descriptive Statistics of WASSCE Scores, Researcher Exam Scores, GABECE Scores, and Socioeconomic Characteristics

For each variable, the table reports the number of observations (N), mean, and standard deviation (SD) for all students, PSI-PMI students, within-school control students, and between-school control students, together with the difference between the PSI-PMI group and each control group (Diff) and the associated p-value.

Variable | All students (N, mean, SD) | PSI-PMI (N, mean, SD) | Within-school control (N, mean, SD) | Diff | P-value | Between-school control (N, mean, SD) | Diff | P-value

Socioeconomic characteristics and GABECE
Age | 2137, 19.89, 1.54 | 285, 19.80, 1.50 | 746, 19.94, 1.58 | -0.15 | 0.18 | 1106, 19.89, 1.56 | -0.10 | 0.36
Girls (%) | 1299, 0.60, 0.50 | 146, 0.51, 0.50 | 458, 0.61, 0.49 | -0.10 | 0.00 | 695, 0.63, 0.48 | -0.11 | 0.00
HH has a quality roof (%) | 2132, 0.73, 0.44 | 288, 0.79, 0.40 | 750, 0.74, 0.43 | 0.05 | 0.08 | 1094, 0.70, 0.45 | 0.09 | 0.00
HH has quality walls (%) | 2126, 0.76, 0.43 | 286, 0.74, 0.43 | 748, 0.74, 0.43 | 0.00 | 0.90 | 1092, 0.76, 0.42 | -0.02 | 0.48
HH has access to grid electricity (%) | 2142, 0.77, 0.42 | 288, 0.78, 0.41 | 750, 0.73, 0.44 | 0.05 | 0.10 | 1104, 0.79, 0.40 | -0.01 | 0.60
Parents have some education (%) | 2118, 0.68, 0.46 | 281, 0.68, 0.46 | 745, 0.66, 0.47 | 0.02 | 0.52 | 1092, 0.69, 0.46 | 0.00 | 0.96
HH has a TV (%) | 2134, 0.77, 0.42 | 286, 0.76, 0.42 | 750, 0.77, 0.41 | -0.01 | 0.70 | 1098, 0.78, 0.41 | -0.02 | 0.51
HH has a fridge (%) | 2116, 0.59, 0.49 | 285, 0.60, 0.49 | 743, 0.57, 0.49 | 0.03 | 0.41 | 1088, 0.60, 0.48 | -0.01 | 0.75
HH has a car (%) | 2107, 0.35, 0.48 | 283, 0.32, 0.46 | 740, 0.29, 0.45 | 0.02 | 0.42 | 1084, 0.40, 0.49 | -0.08 | 0.02
Students not repeating any grade (%) | 2144, 0.66, 0.47 | 287, 0.71, 0.45 | 753, 0.67, 0.46 | 0.04 | 0.17 | 1104, 0.63, 0.48 | 0.08 | 0.01
Students earned credit and above on GABECE mathematics (%) | 1923, 0.44, 0.49 | 264, 0.67, 0.47 | 712, 0.49, 0.50 | 0.17 | 0.00 | 947, 0.33, 0.47 | 0.33 | 0.00
Students earned credit and above on GABECE English (%) | 1894, 0.79, 0.40 | 261, 0.83, 0.37 | 710, 0.84, 0.36 | -0.01 | 0.90 | 923, 0.74, 0.43 | 0.09 | 0.00
Students earned credit and above on general science (%) | 1861, 0.71, 0.45 | 259, 0.86, 0.34 | 700, 0.71, 0.45 | 0.15 | 0.00 | 902, 0.66, 0.47 | 0.19 | 0.00

Researcher-designed exam
Average mathematics score (%) | 2109, 0.37, 0.21 | 287, 0.507, 0.26 | 729, 0.388, 0.219 | n.a. | n.a. | 1092, 0.334, 0.185 | n.a. | n.a.
Average English score (%) | 2188, 0.42, 0.16 | 288, 0.467, 0.169 | 732, 0.428, 0.17 | n.a. | n.a. | 1097, 0.412, 0.163 | n.a. | n.a.

WASSCE exam (2018)
Average mathematics score (%) | 2110, 0.22, 0.17 | 283, 0.293, 0.238 | 735, 0.202, 0.156 | n.a. | n.a. | 1091, 0.207, 0.152 | n.a. | n.a.
Average English score (%) | 2146, 0.53, 0.14 | 286, 0.585, 0.163 | 754, 0.548, 0.136 | n.a. | n.a. | 1105, 0.517, 0.136 | n.a. | n.a.
Student got credit and above (both WASSCE math and English) (%) | 2107, 0.07, 0.24 | 283, 0.208, 0.411 | 735, 0.05, 0.23 | n.a. | n.a. | 1088, 0.030, 0.178 | n.a. | n.a.

Note: The p-values are from t-tests of equal means between the treatment group and the respective control group. For the GABECE scores, clustering is at the school level. Robust standard errors are presented. GABECE = Gambia Basic Education Certificate Examination; HH = household.
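The p-values in Table A1 come from t-tests of equal means, with clustering at the school level for the GABECE variables. One standard way to implement such a comparison is to regress the covariate on a program indicator with cluster-robust standard errors; the sketch below uses hypothetical variable names and is not necessarily the authors' implementation.

```python
# Illustrative sketch only: hypothetical variable and column names.
import statsmodels.formula.api as smf

def clustered_mean_diff(df, covariate, treat_col="psi_pmi", cluster_col="school_id"):
    """Mean difference between program and control students with school-clustered SEs."""
    res = smf.ols(f"{covariate} ~ {treat_col}", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df[cluster_col]}
    )
    return res.params[treat_col], res.pvalues[treat_col]  # difference in means and its p-value
```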
Table A1a: Comparison of Students' Characteristics after Matching, Within-School

Variable | PSI-PMI (N, mean, SD) | Within-school control (N, mean, SD) | Diff | P-value
Age | 252, 19.78, 1.52 | 253, 19.84, 1.53 | 0.06 | 0.66
Girls (%) | 256, 0.52, 0.50 | 256, 0.53, 0.50 | 0.01 | 0.79
HH has a quality roof (%) | 256, 0.81, 0.39 | 255, 0.76, 0.43 | -0.05 | 0.16
HH has quality walls (%) | 254, 0.73, 0.44 | 254, 0.73, 0.44 | 0.04 | 1.00
HH has access to grid electricity (%) | 256, 0.79, 0.41 | 255, 0.76, 0.43 | -0.03 | 0.39
Parents have some education (%) | 250, 0.69, 0.46 | 252, 0.64, 0.48 | -0.05 | 0.21
HH has a TV (%) | 254, 0.78, 0.42 | 256, 0.79, 0.40 | 0.02 | 0.56
HH has a fridge (%) | 253, 0.60, 0.49 | 252, 0.61, 0.49 | 0.01 | 0.74
HH has a car (%) | 252, 0.31, 0.46 | 252, 0.31, 0.46 | 0.00 | 1.00
Students not repeating any grade (%) | 256, 0.75, 0.43 | 256, 0.74, 0.44 | -0.01 | 0.76
Students earned credit and above on GABECE mathematics (%) | 256, 0.69, 0.46 | 256, 0.70, 0.46 | 0.01 | 0.77
Students earned credit and above on GABECE English (%) | 256, 0.84, 0.36 | 256, 0.86, 0.35 | 0.01 | 0.71
Students earned credit and above on general science (%) | 256, 0.86, 0.34 | 256, 0.88, 0.33 | 0.01 | 0.69

Note: HH = household; PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative.

Table A1b: Comparison of Students' Characteristics after Matching, Between-School

Variable | PSI-PMI (N, mean, SD) | Between-school control (N, mean, SD) | Diff | P-value
Age | 252, 19.78, 1.52 | 256, 19.77, 1.61 | 0.00 | 0.98
Girls (%) | 256, 0.52, 0.50 | 256, 0.51, 0.50 | 0.00 | 1.00
HH has a quality roof (%) | 256, 0.81, 0.39 | 252, 0.76, 0.42 | 0.04 | 0.24
HH has quality walls (%) | 254, 0.73, 0.44 | 253, 0.79, 0.40 | -0.06 | 0.10
HH has access to grid electricity (%) | 256, 0.79, 0.41 | 254, 0.79, 0.40 | -0.01 | 0.86
Parents have some education (%) | 250, 0.69, 0.46 | 251, 0.68, 0.46 | 0.01 | 0.80
HH has a TV (%) | 254, 0.78, 0.42 | 251, 0.77, 0.41 | 0.00 | 0.97
HH has a fridge (%) | 253, 0.60, 0.49 | 250, 0.62, 0.49 | -0.02 | 0.90
HH has a car (%) | 252, 0.31, 0.46 | 249, 0.43, 0.49 | -0.11 | 0.01
Students not repeating any grade (%) | 256, 0.75, 0.43 | 256, 0.75, 0.43 | 0.00 | 1.00
Students earned credit and above on GABECE mathematics (%) | 256, 0.69, 0.46 | 256, 0.69, 0.46 | 0.00 | 1.00
Students earned credit and above on GABECE English (%) | 256, 0.84, 0.36 | 256, 0.84, 0.36 | 0.00 | 1.00
Students earned credit and above on general science (%) | 256, 0.86, 0.34 | 256, 0.81, 0.34 | 0.00 | 1.00

Note: HH = household; PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative.

Table A2: Effects of the PSI-PMI Program on Student Learning Outcomes: English Score

The outcome variable is the standardized researcher-designed English score in columns [1] to [4] and the standardized WASSCE English score in columns [5] to [8]. Columns [1], [2], [5], and [6] use the within-school control group; columns [3], [4], [7], and [8] use the between-school control group.

Variable | [1] | [2] | [3] | [4] | [5] | [6] | [7] | [8]
PSI-PMI program | 0.236* | 0.205** | 0.349* | 0.242 | 0.359 | 0.285*** | 0.546 | 0.494*
 | (0.128) | (0.088) | (0.195) | (0.161) | (0.235) | (0.088) | (0.344) | (0.256)
Student-level covariates | NO | YES | NO | YES | NO | YES | NO | YES
School-level covariates | NO | YES | NO | YES | NO | YES | NO | YES
Average English/WASSCE English score for control | 42.8 | 42.8 | 41.2 | 41.2 | 54.8 | 54.8 | 51.7 | 51.7
Standard deviation of English/WASSCE English score for control | 17.0 | 17.0 | 16.3 | 16.3 | 13.6 | 13.6 | 13.6 | 13.6
Observations | 879 | 879 | 1,046 | 1,046 | 894 | 894 | 1,047 | 1,047
R-squared | 0.011 | 0.187 | 0.022 | 0.138 | 0.025 | 0.467 | 0.051 | 0.274

Note: The results are based on equation (1). The outcome variables are the standardized English score on the researcher-designed exam and the standardized WASSCE English exam score. Robust standard errors clustered at the school level are in parentheses. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. *** p < 0.01, ** p < 0.05, * p < 0.1.
School-level covariates include: presence of electricity in the school, good desks, a library, a computer lab, running water, student enrollment, the student-teacher ratio, the percentage of qualified teachers, the percentage of female teachers, the experience, age, administrative experience, education level, and gender of the school principal, and the share of students with high performance in mathematics, English, and science on the WASSCE. Student-level covariates include indicators for: not repeating any grade; earning credit and above on GABECE mathematics, GABECE English, and general science, all at baseline; and socioeconomic characteristics of the student's household (a quality roof, quality walls, a television, a car, access to grid electricity, and parents with some education).

Table A3: PSI-PMI Program School Students and PSI-PMI School Subgroup of Non-Program Students

Variable | [1] Standardized mathematics score, researcher-designed exam | [2] Standardized mathematics score, WASSCE | [3] Received "credit" in both mathematics and English on WASSCE
PSI-PMI program (ATE) | 0.385** | 0.792*** | 0.232***
 | (0.167) | (0.275) | (0.090)
Observations | 875 | 873 | 873

Note: The results are based on equation (1). The outcome variables are the standardized mathematics score from the researcher-designed exam, the standardized WASSCE mathematics exam score, and an indicator for receiving a satisfactory ("credit") score on the WASSCE in both mathematics and English. The estimated treatment effect is based on the caliper matching method and is robust to using other matching methods (nearest-neighbor and radius matching). Abadie-Imbens robust standard errors are in parentheses. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. *** p < 0.01, ** p < 0.05, * p < 0.1. Covariates include: presence of electricity in the school, good desks, a library, a computer lab, running water, student enrollment, the student-teacher ratio, the percentage of qualified teachers, the percentage of female teachers, the experience, age, administrative experience, education level, and gender of the school principal, the share of students with high performance in mathematics, English, and science on the WASSCE, and student-level indicators for not repeating any grade and for earning credit and above on GABECE mathematics, GABECE English, and general science, all at baseline, as well as household socioeconomic characteristics (a quality roof, quality walls, a television, a car, access to grid electricity, and parents with some education).
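The note to Tables A3 and A4 indicates that the treatment effect is estimated by caliper matching on the propensity score. The following is an illustrative sketch of that idea, not the authors' implementation: hypothetical variable names, a logit propensity score, and 1:1 nearest-neighbor matching of program students to non-program students within a caliper. It computes an ATT-style matched difference, whereas the tables report the ATE, and it omits the Abadie-Imbens standard errors reported in the paper.

```python
# Illustrative sketch only: hypothetical variable names; not the authors' implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

def caliper_matched_effect(df, treat_col, outcome_col, covariates, caliper=0.05):
    """Match each treated student to the nearest control on the propensity score within a caliper."""
    X = df[covariates].to_numpy(dtype=float)
    t = df[treat_col].to_numpy().astype(bool)
    y = df[outcome_col].to_numpy(dtype=float)

    # Propensity score: estimated probability of being a PSI-PMI program student.
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    treated, controls = np.where(t)[0], np.where(~t)[0]
    diffs = []
    for i in treated:
        gaps = np.abs(ps[controls] - ps[i])
        j = int(gaps.argmin())
        if gaps[j] <= caliper:  # treated units with no control inside the caliper are dropped
            diffs.append(y[i] - y[controls[j]])
    return float(np.mean(diffs)), len(diffs)  # matched mean difference and number of matches
```

In practice, alternative calipers and matching methods should be checked, as the authors do with nearest-neighbor and radius matching.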
Table A4: PSI-PMI Program School Students and PSI-PMI School Subgroup of Non-Program Students

Variable | [1] Standardized mathematics score, researcher-designed exam | [2] Standardized mathematics score, WASSCE | [3] Received "credit" in both mathematics and English on WASSCE
PSI-PMI program (ATE) | 0.321** | 0.997*** | 0.311***
 | (0.168) | (0.344) | (0.109)
Observations | 875 | 873 | 873

Note: The results are based on equation (1). The outcome variables are the standardized mathematics score from the researcher-designed exam, the standardized WASSCE mathematics exam score, and an indicator for receiving a satisfactory ("credit") score on the WASSCE in both mathematics and English. The estimated treatment effect is based on the caliper matching method and is robust to using other matching methods (nearest-neighbor and radius matching). Abadie-Imbens robust standard errors are in parentheses. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. *** p < 0.01, ** p < 0.05, * p < 0.1. Covariates include: presence of electricity in the school, a library, a computer lab, student enrollment, the percentage of qualified teachers, the percentage of female teachers, the experience, age, administrative experience, education level, and gender of the school principal, whether the student is a girl, and student-level indicators for not repeating any grade and for earning credit and above on GABECE mathematics, GABECE English, and general science, all at baseline, as well as household socioeconomic characteristics (a quality roof, quality walls, a television, a fridge, a car, access to grid electricity, and parents with some education).

Table A5: Effects of the PSI-PMI Program on Student Learning Outcomes: Interaction Effect

Columns (1) and (2) use the within-school control group; columns (3) and (4) use the between-school control group. The outcome variable is the standardized mathematics score from the researcher-designed exam in columns (1) and (3) and the standardized WASSCE mathematics score in columns (2) and (4).

Variable | (1) | (2) | (3) | (4)
PSI-PMI program | 0.485** | 0.632** | 0.421* | 0.101
 | (0.189) | (0.236) | (0.222) | (0.424)
Interaction effect (Program x socioeconomic background) | 0.078 | -0.069 | 0.404 | 0.546
 | (0.241) | (0.345) | (0.249) | (0.534)
Student-level covariates | YES | YES | YES | YES
School-level covariates | YES | YES | YES | YES
Average mathematics score for control | 38.78 | 20.18 | 33.37 | 20.74
Standard deviation of mathematics score for control | 21.91 | 15.60 | 18.51 | 15.15
Observations | 922 | 921 | 1,111 | 1,104
R-squared | 0.325 | 0.416 | 0.227 | 0.228

Note: The results are based on equation (2). The outcome variables are the standardized mathematics score from the researcher-designed exam and the standardized WASSCE mathematics exam score. Robust standard errors are in parentheses. PSI-PMI = Progressive Science Initiative and Progressive Mathematics Initiative; WASSCE = West African Secondary School Certificate Examination. *** p < 0.01, ** p < 0.05, * p < 0.1.
School-level covariates include: presence of electricity in the school, good desks, a library, a computer lab, running water, student enrollment, the student-teacher ratio, the percentage of qualified teachers, the percentage of female teachers, the experience, age, administrative experience, education level, and gender of the school principal, and the share of students with high performance in mathematics, English, and science on the WASSCE. Student-level covariates include indicators for: not repeating any grade; earning credit and above on GABECE mathematics, GABECE English, and general science, all at baseline; and socioeconomic characteristics of the student's household (a quality roof, quality walls, a television, a car, access to grid electricity, and parents with some education).

Table A6: Students Scoring Credit and Above on the GABECE Mathematics Exam, by Socioeconomic Status and Gender

Group | Within-group share (%) | Overall share (%)
Socioeconomic status
First quartile of SES (poorest) | 41.7 | 25.4
Second quartile of SES | 42.9 | 34.7
Third quartile of SES | 43.3 | 24.8
Fourth quartile of SES (wealthiest) | 53.4 | 15.1
Gender
Girl | 42.4 | 60.4
Boy | 47.2 | 39.6
Socioeconomic status by gender
Girls in first quartile of SES (poorest) | 40.4 | 12.5
Girls in second quartile of SES | 40.8 | 22.0
Girls in third quartile of SES | 40.5 | 16.4
Girls in fourth quartile of SES (wealthiest) | 52.2 | 9.5
Boys in first quartile of SES (poorest) | 42.9 | 12.9
Boys in second quartile of SES | 46.7 | 12.7
Boys in third quartile of SES | 48.7 | 8.4
Boys in fourth quartile of SES (wealthiest) | 55.5 | 5.6

Source: Authors' calculations using the described data. Note: The within-group share is the percentage of students in the group who scored credit and above; the overall share is the group's percentage of all students who scored credit and above. SES = socioeconomic status.
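For reference, the two columns of Table A6 can be computed from student-level data along the lines of the sketch below (hypothetical column names: a grouping column such as SES quartile or gender, and a 0/1 indicator for scoring credit and above on GABECE mathematics).

```python
# Illustrative sketch only: hypothetical column names.
import pandas as pd

def credit_shares(df, group_col, credit_col="gabece_math_credit"):
    """Within-group share of high performers and each group's share of all high performers."""
    within = df.groupby(group_col)[credit_col].mean() * 100            # e.g., 41.7% of the poorest quartile
    overall = (df.loc[df[credit_col] == 1, group_col]
                 .value_counts(normalize=True) * 100)                  # e.g., 25.4% of all high performers
    return pd.DataFrame({"within_group_share": within, "overall_share": overall})
```

Both columns matter for the equity assessment in the text: the within-group share shows how likely each group is to be a high performer, while the overall share shows how the pool of likely beneficiaries is composed.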