Policy Research Working Paper 8665

Re-Kindling Learning: eReaders in Lagos

James Habyarimana
Shwetlena Sabarwal

Education Global Practice
December 2018

Abstract

Empirical literature on digital technologies for student learning is generally unable to identify separately whether learning gains arise from reciprocity in response to the gift of a valuable gadget (the 'gadget effect') or from increasing exposure to relevant materials (the 'content effect'). This paper attempts to disentangle these mechanisms using a randomized control trial in junior secondary schools in Lagos, Nigeria. It estimates three contrasts: (i) the effect of just receiving an eReader with non-curriculum content, (ii) the marginal effects of receiving an eReader with curriculum textbooks, and (iii) the marginal effects (relative to ii) of receiving curriculum with supplementary current and remedial instructional content. The findings show that six to eight months of exposure to eReaders led to modest positive impacts on learning, but only if the devices had curriculum material and were filling input gaps resulting from a lack of textbooks. Consistent with other recent findings, even six to eight months of exposure to eReaders with non-curriculum recreational material reduced student learning outcomes. These results demonstrate that the promise of digital solutions to improve learning depends largely on the extent that these solutions address unmet access to instructional material. The paper also finds that exposure to eReaders improved student retention. However, these impacts are not very robust and could be achieved much more cost-effectively through the provision of information about the economic returns to education.

This paper is a product of the Education Global Practice. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://www.worldbank.org/research. The authors may be contacted at ssabarwal@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Produced by the Research Support Team

Re-Kindling Learning: eReaders in Lagos
James Habyarimana and Shwetlena Sabarwal

JEL Classification: C93, I21, I28
Keywords: Education, Technology, Experiments

Acknowledgements: This work has been done in collaboration with the Lagos Eko Team. We thank Marito Garcia, Michael Koenig, and Olatunde Adekola for being core members of the team and providing strong leadership and support. Financial support from the EFA FTI Education Program Development Fund and the Bank Netherlands Partnership Program is gratefully acknowledged. We are grateful to Niyi Omnikunle, Uttam Sharma, and Aisha Garba Muhammad for ensuring high quality fieldwork.
Malek Abu Jawdeh, Emily Kayser, Lucine Munkyung Park, Celeste Sununtnasuk and Laxman Timilsina provided excellent research assistance. We thank seminar participants at the 3iE/IFPRI seminar, as well as Mariam Adil, Julian Cristia, David Evans, Deon Filmer, Dina N. Abu-Ghaida, and Michael Trucano for discussions and comments that improved the paper.

1. Introduction

Can digital technologies improve student learning in developing countries where enrollments have expanded rapidly, but learning levels remain low?1 This question has sparked a compelling debate, to which this paper contributes. Digital technology – defined as technologies that enable access to large amounts of information stored on portable devices – is increasingly being used for education. These technologies targeted directly at students2 can potentially improve learning through two key channels: (i) through the provision of instructional content delivered in an exciting format that potentially increases student engagement with content; and (ii) by inducing reciprocal student effort in response to the gift of a valuable gadget.

The first of these two mechanisms can help compensate for inadequate or poor-quality school inputs. This might be especially crucial in developing country contexts where textbook shortages remain acute (Fredriksen, Brar, and Trucano 2015),3 teacher knowledge and/or effort are far from optimal (Bold et al. 2017), or textbook materials and/or instruction are too advanced for most pupils (Glewwe, Kremer, and Moulin 2009; Banerjee et al. 2007; Beatty and Pritchett 2012).

Digital technologies could lead to greater student engagement with instructional content in multiple ways. In typical classroom environments, students are often merely passive recipients of knowledge (Hung 2011; Rasku-Puttonen et al. 2003), a problem likely to be exacerbated in overcrowded classrooms. In contrast, digital devices like computers, tablets, and e-readers can potentially make the learning experience more focused and targeted. They can offer highly individualized instruction and allow students to learn at their own pace. These features are particularly beneficial for struggling students who might have problems keeping pace with classroom instruction (Duflo, Dupas, and Kremer 2015; Banerjee et al. 2016), or even for more advanced students who might lose interest with a slower instructional pace.

The second mechanism operates through the activation of social norms of reciprocity induced by the receipt of an expensive device provided in a school setting (Akerlof 1982; Fehr et al. 1998; Fehr and Gächter 2000).4 In this context, getting access to an expensive gadget and the implicit signaling of the importance of learning may prolong student interest in and engagement with learning materials, especially for students who have mastered foundational reading skills.

1 For a discussion of issues in education service delivery in developing countries, see World Bank (2017), Pritchett (2013), and World Bank (2003).
2 While digital technologies can be used to support teacher preparation and instruction, in this paper we focus on technologies targeted directly at students.
3 A survey covering 22 SSA countries found that, in 2010, the "median country" had 1.4 students per textbook in both reading and math in primary education, ranging from less than one pupil per book to 11 for reading and 13 for math.
A 2008 study for secondary education found that in 18 of 19 SSA countries surveyed there was a severe book shortage for most students (Fredriksen, Brar, and Trucano 2015).
4 See Sobel (2005) for a discussion of theoretical models of reciprocity and Gneezy and List (2006) for a test of reciprocity in a set of field experiments.

However, there are reasons to be cautious about the promise of digital technology in education. Technology adoption is prohibitively costly, especially in resource-constrained education systems.5 The costs can be even higher if complementary investments in infrastructure and teacher competence are required for digital technologies to have impact (Butler 2007).6 Such interventions may also be difficult to implement in low-capacity environments. One review finds that many technology interventions fail even before implementation (World Bank 2017).

The promise of technology for education has been almost as dazzling as its measured impacts have been disappointing. Among the subset of education technology interventions that are successfully implemented and evaluated, impacts vary greatly. One summary is that the impact of technology interventions in education is mixed, with a pattern of 'null results' (Bulman and Fairlie 2016).7 Even expensive large-scale interventions like the One Laptop Per Child programs in Peru and Uruguay showed no impacts on learning (Cristia et al. 2017; de Melo, Machado, and Miranda 2014). At the same time, there are interventions showing significantly positive or significantly negative effects. At the positive end of the spectrum is the dynamic computer-assisted learning program for secondary school students in India that increased math and language scores sharply over a 4.5-month period (Muralidharan, Singh, and Ganimian, forthcoming). At the other extreme are studies that show negative impacts on learning (Fuchs and Woessmann 2004; Angrist and Lavy 2002; Vigdor and Ladd 2010; Malamud and Pop-Eleches 2011). High costs, implementation challenges, and mixed, often disappointing, impacts might be why technology has not been particularly disruptive in education so far.8

Given the mixed evidence, can a pattern be detected in terms of when such interventions succeed and when they fail? Some have argued that technologies that complement teachers work better than technologies that substitute for them (World Bank 2017; Snilstveit et al. 2015). Another important determining factor is context-relevance and suitability. For technology interventions to succeed, it is crucial to focus on technologies that are truly feasible in existing systems (World Bank 2017). One other hypothesis is proposed by Muralidharan et al. (forthcoming), who argue that hardware-focused technology interventions, such as the provision of computers/laptops to students or schools, have no positive impact on learning outcomes, while pedagogy-focused technology interventions, which enable students to access content at their own pace or allow for personalized instruction, have positive impacts on learning outcomes.9

5 http://blogs.worldbank.org/edutech/worst-practice
6 There are also concerns about potential exposure to non-academic and adult content (Subrahmanyam et al. 2000; Wartella and Jennings 2000).
7 A partial list includes studies that have shown a positive association between computer access and education outcomes in developed countries (Attewell and Battle 1999; Fairlie 2005; Fairlie and London 2012; Fairlie, Beltran, and Das 2010; Machin, McNally, and Silva 2007) and developing countries (Banerjee et al. 2007; Beuermann et al. 2015; Muralidharan, Singh, and Ganimian forthcoming). Others find little effect (Barrera-Osorio and Linden 2009) or even negative impacts (Fuchs and Woessmann 2004; Angrist and Lavy 2002; Vigdor and Ladd 2010; Malamud and Pop-Eleches 2011). For a detailed discussion, see Bulman and Fairlie (2016) and Annex B of Muralidharan et al. (forthcoming).
8 Pritchett (2013) argues that technology does not show up in a big way in how education is delivered. The processes of the school day and the interactions between teachers and students are very similar to those of a century ago.
9 Pedagogy-focused interventions include computer-assisted learning interventions (Banerjee et al. 2016; Lai et al. 2015) or web- or app-enabled applications like the Khan Academy.

The relatively strong effectiveness of well-designed/implemented software interventions is echoed in other studies (Banerjee et al. 2007; Carrillo, Onofa, and Ponce 2011). In a meta-review of all rigorously evaluated education interventions, McEwan (2015) finds the largest effects for interventions involving instructional technology. In a similar review, Conn (2014) considers "pedagogical interventions" to be the most effective subset of education interventions, and many of these pedagogical interventions are in fact computer-assisted learning programs.

In this paper, we examine the potential effectiveness of digital technologies for student learning by focusing on non-interactive eReaders. The experiment is explicitly designed to disentangle the content effect from the reciprocity effect. Given that the eReaders are non-interactive, we are only able to examine a very limited technology-enabled pedagogical intervention with no room for active feedback or targeting of content. We test the impact of providing eReaders with or without curriculum content. In a third arm, we also examine whether provision of other remedial content can usefully enhance learning outcomes. The paper measures the impact of digital technology on both learning and participation in the next level of secondary schooling.

We conducted a nested randomized control trial (RCT) in which eReaders with different types of content were provided to Grade 8 students in Lagos, Nigeria. A study sample of 497 students was randomly assigned to one of the following groups: a control group that received nothing and three treatment groups that each received eReaders with either a) only non-curriculum reading material; b) non-curriculum reading material and curriculum textbooks; or c) non-curriculum material, curriculum textbooks, and supplementary curriculum-relevant material. We examine short-term student outcomes after a six-month exposure to eReaders and longer-term outcomes obtained from a student tracking survey 2.5 years after the experiment.

For the purposes of benchmarking the impact of eReaders on school participation against a non-technology intervention, we implement a low-cost 'information' intervention (Jensen 2010). A random sub-sample of treatment students was assigned – orthogonal to the eReader randomization – to receive an information intervention.
Students assigned to the information treatment group received a short script on the expected returns to education in the Nigerian labor market. This intervention is extremely low-cost and scalable. As such, it provides a useful benchmark for a more expensive digital technology intervention.

Overall, we found no significant impact of any eReader treatment on learning outcomes in reading, math, and non-verbal reasoning. However, eReaders with curriculum content improve reading outcomes for students with no textbooks (estimates are large but imprecise). In contrast, among students with textbooks, eReaders with curriculum content led to no gains in reading comprehension or a small and imprecise decline in math scores. Further, and consistent with Malamud and Pop-Eleches (2011), eReaders without curriculum material led to a decline in overall reading and math scores. These results suggest that the learning impacts of eReaders depend on the relevance of the content delivered and especially on whether or not the student has access to alternative forms of the same content (textbooks). We do not find systematic impacts on student aspirations or attitudes.

However, eReader provision appears to impact student retention. Around 2.5 years after the experiment ended, we were able to trace 89 percent of the original sample (442 out of 497 students). At this time, we find that students in any of the eReader arms are about 5 percentage points more likely to stay in school. This effect is concentrated among students from poorer households and with low baseline reading scores. However, accounting for attrition in a Lee bounds analysis produces a wide range of possible effects that include zero. We also find that these gains in retention can be more cost-effectively generated through an information intervention, à la Jensen (2010). Students assigned to receive this information on the returns to secondary education in Lagos are 11 percentage points more likely to be in school.

Overall, our results suggest that digital solutions that deliver otherwise unavailable instructional 'content' can boost learning outcomes. However, simply having a shiny gadget with the same instructional content as that available in textbooks has a null or negative impact on learning outcomes. On the other hand, providing a valuable 'gadget' may be a sufficient, but non-cost-effective, option to boost school retention.

The remainder of the paper is organized as follows. In Section 2, we describe the intervention and evaluation design; in Section 3, we outline the empirical strategy. Section 4 describes device usage, Sections 5 and 6 present and discuss the results of the evaluation, and Section 7 concludes.

2. Design

2.1 Background, Sampling, and Experimental Design

In 2011-12, the Government of Lagos, in partnership with the World Bank, implemented the 'Interactive Lagos e-Reader Assessment Program' (iLEAP) to test the impacts of providing eReaders to lower secondary students (Grade 8/Junior Secondary 2 (JS2)) in public schools. The government selected eReaders as the digital device that was both promising and scalable. The first clear advantage was price, both at the time and prospectively.
In 2011, a basic Kindle device had an average cost of US$200 with a range of free non-curriculum age-appropriate books available; in 2018 the average cost of this device was US$80.10 Second, eReaders are a potential option for learning material delivery in developing countries that often struggle with delivering paper textbooks to schools (Fredriksen and Brar 2015). Third, their light weight and orientation flexibility make them highly suitable for digital reading and accessing content. Low-cost, handheld e-reading devices can hold more than a thousand books,11 are more mobile than laptops, and have long battery life. Finally, eReaders have limited functionality beyond the provision of reading content. This makes them the ideal device for examining the relative importance of static 'content' within the technology-for-education question.

Five hundred students were randomly selected in October 2010 from the sampling frame (stratified by the six education districts in Lagos) of all JS1 public school students sitting the end of year exams. Three students were dropped due to missing school records and excluded from analysis. Due to unanticipated delays, the program did not start until the next academic year, when the sampled cohort was in JS2. As a result, students selected in the original sampling frame who did not sit/pass the end of year JS1 exam and did not move to JS2 had to be replaced through a randomly generated replacement list from the original frame. Forty percent of the original sample was replaced.12

10 Amazon Kindle, Black, 6" Glare-Free Touchscreen Display, Wi-Fi.
11 The lowest cost e-reading device from Amazon, the 6" Kindle, holds over 1,000 books (http://www.amazon.com/dp/B007HCCNJU/ref=sa_menu_kdptq).

The study sample was randomly divided into a control group (176 students) and three treatment groups (consisting of 107 students each). Treatment group 1 (T1) received a digital library of freely available (non-curriculum) fiction/non-fiction reading material. Treatment group 2 (T2) received the digital library and core curriculum textbooks in Mathematics and English. Treatment group 3 (T3) received the digital library, the core curriculum textbooks, and supplementary instructional material. Supplementary materials included open source materials curated by curriculum experts to provide both remedial and current instructional content. To benchmark the impact of eReaders on school participation against a non-technology intervention, a random subsample of 70 students from the treatment groups – identified via a cross design orthogonal to the treatment and stratified by treatment group – was read a short script by a trained enumerator on the expected returns to education in Lagos.13 The experimental design is presented in Figure 1.

A baseline school survey covering 214 schools in Lagos was conducted in November 2010; the intervention itself began with a series of workshops in August 2011. During these workshops, students completed a self-administered, 45-minute survey and test to capture data on student characteristics and reading comprehension and ability. The test included publicly released test items used in the Grade 5 English–Language Arts Standards Test (one of the California Standards Tests administered as part of the Standardized Testing and Reporting (STAR) Program). After the survey and test, students in the treatment groups were provided eReaders and given a short orientation on the device.
For control students, a placebo session was organized wherein students participated in a short talk about the potential of technology use in education. Students in the cross-designed information treatment group received the returns to schooling information at the close of the workshop.

Endline student testing was conducted during workshops in April 2012, just before students sat the JS2 end of year exam. All students completed a self-administered survey and were given 45-minute tests on reading comprehension and ability (as was done at baseline), math, and non-verbal intelligence. The math test included publicly released items from various years of the Trends in International Mathematics and Science Study (TIMSS) Grade 4 test. The timed non-verbal intelligence test was based on Raven's Progressive Matrices, which assesses general intelligence or "general cognitive ability" by measuring the ability to form perceptual relations and to reason by analogy (Raven 2003). Students were asked to bring their eReaders to the endline workshops, and information on device use and status was collected through the self-administered student survey, direct observation of the device, and extraction of use information from the device.

12 Nearly half of this "attrition" is due to class repetition. See Table 1.1.
13 Using information from a set of sampled workers from an investment climate survey, we generate information on the returns to schooling for completion of primary, junior secondary, senior secondary school, and tertiary schooling.

Long-term follow-up outcomes were obtained from a two-phase student tracking exercise at the beginning of the 2014 academic year, corresponding to the study cohort's fourth year of secondary school and their first year of senior secondary school. In Phase I, the school participation status of the study sample was collected through Lagos Ministry of Education student record verification requests. In Phase II, enumerators used phone numbers collected during the baseline survey to call missing students. This strategy was augmented by visits to schools where enumerators interviewed head teachers and teachers. These two data sets were combined, giving priority to Phase II information in cases with conflicting information. Eighty-nine percent of the original sample of students were successfully traced.
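To make the assignment procedure concrete, the sketch below illustrates one way a stratified assignment into the control group and the three nested eReader arms, plus the cross-cutting information sub-treatment, could be generated. This is a hypothetical illustration only: the data frame, column names (district, arm, info), group shares, and the use of Python/pandas are assumptions for exposition, not the actual iLEAP assignment code.

```python
# Hypothetical sketch of the assignment logic described above. The data frame,
# column names, and group shares are illustrative; the actual iLEAP assignment
# lists were generated separately by the study team.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=2011)

ARMS = np.array(["control", "T1", "T2", "T3"])
SHARES = np.array([176, 107, 107, 107]) / 497.0   # observed group sizes

def assign_arms(students: pd.DataFrame) -> pd.DataFrame:
    """Shuffle students within each education district (the stratification
    variable) and split them across the four arms in the target proportions."""
    def one_district(g):
        g = g.sample(frac=1.0, random_state=int(rng.integers(1_000_000))).copy()
        cuts = np.round(np.cumsum(SHARES) * len(g)).astype(int)
        cuts[-1] = len(g)                      # guard against rounding drift
        g["arm"] = ARMS[np.searchsorted(cuts, np.arange(len(g)), side="right")]
        return g

    out = students.groupby("district", group_keys=False).apply(one_district)

    # Cross-cutting 'information' sub-treatment: a further random draw among
    # treated students, stratified by eReader arm (roughly 70 of 321 treated).
    out["info"] = False
    for arm in ("T1", "T2", "T3"):
        idx = out.index[out["arm"] == arm]
        n_pick = int(round(len(idx) * 70 / 321))
        out.loc[rng.choice(idx, size=n_pick, replace=False), "info"] = True
    return out
```

The two design features this mirrors are the stratification by education district and the fact that the information draw is taken within each eReader arm, keeping it orthogonal to the eReader assignment.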
2.2 Student Profile

Baseline student characteristics and balance across treatment arms are reported in Table 1. Balance across the information treatment arms is reported in Table 2. The average age of our sample at baseline is 13 years. Approximately half the students are male and 14.7 percent were repeating their current grade. For 30.6 percent of students, the father has more than secondary education, and for 17 percent, the mother has more than secondary education. Nearly all students come from households with electricity connections (95.4 percent), but the frequency of electricity outages is high. A large share of students (35.0 percent) have mobile phones but very few (2.4 percent) have email addresses. Despite ongoing government programs for textbook provision to public schools, a significant share of students did not have textbooks at baseline. The share of students who reported owning textbooks in English and Math is 38.3 and 35 percent, respectively. However, more students report usage than actual ownership of textbooks. Around 74 percent and 71 percent of students claimed to have used English and Math textbooks at least twice in the last seven days, respectively. Students' self-reported interest in extra-curricular reading appears to be high. Roughly half of the students claimed to enjoy reading newspapers and magazines (50.1 and 40.4 percent, respectively).

Student attrition during the experiment is reported in Table 3. Eleven percent of students attrited between baseline and endline. Attrition is unbalanced between the treatment (9.3 percent) and control (13.6 percent) groups. Within treatment groups, student attrition is somewhat higher in the T2 group relative to the T1 and T3 groups. Attrition is primarily driven by student unavailability at endline, either due to transferring schools or travel and health-related absence during the endline workshop.

3. Empirical Strategy

Our empirical strategy exploits random assignment to estimate the causal effect of being assigned to each of the treatment groups on the outcomes of interest. Results are derived from OLS regressions of outcomes of interest on indicators for assignment to treatment groups: coefficients reflect how much an outcome of interest, e.g. the change in test scores, differs between students assigned to the treatment group(s) and those assigned to the control group. We estimate the following specification:

y_i = α + Σ_j β_j Treat_ij + γ′X_i + ε_i        (1)

where y_i is the outcome for student i, j ∈ {1, 2, 3} indexes the three assignments to the eReader treatment, and X_i is a vector of student-level controls. Given the nested nature of the treatment groups (see below), the Treat_j variables are an overlapping set of indicators. This implies that:

- Treat1 takes the value 1 for all students who received an eReader with the non-curriculum digital library and zero otherwise;
- Treat2 takes the value 1 for students who received an eReader with the curriculum textbooks and zero otherwise;
- Treat3 takes the value 1 if a student received an eReader with the supplementary curriculum materials and zero otherwise.

This overlapping definition implies that β_1 captures the impact of receiving an eReader with no curriculum content and β_j captures the marginal impact of the additional content received (for j = 2, 3). In addition to estimating specification (1), we estimate heterogeneous treatment effects by fully interacting the assignment indicators with baseline student characteristics:

y_i = α + Σ_j β_j Treat_ij + Σ_j δ_j (Treat_ij × Z_i) + λZ_i + γ′X_i + ε_i        (2)

Z_i represents baseline attributes that capture a range of sources of heterogeneity motivated by theory and recent empirical research. In particular, we examine the following interactions separately: (i) student scoring above median reading at baseline (Glewwe, Kremer, and Moulin 2009), (ii) student owned a paper textbook at baseline (Das et al. 2013), and (iii) student had aspirations for a high level of education at baseline.

We estimate the impact of the information treatment using a reduced version of equation (1), where the single treatment indicator takes the value 1 if student i received the information treatment and 0 otherwise.

To explore the potential influence of attrition (missing students) on our findings, we calculate Lee (2009) bounds on the potential treatment effect given extreme assumptions on attrition bias. Lee bounds assume monotonicity, i.e., that the likelihood of non-response is monotonically related to receiving the treatment. This assumption rules out the possibility that the treatment may affect sub-groups differently and implies that non-respondents in groups with high response rates would not have responded had their treatment status been changed. To implement the bounds, we follow Tauchmann (2014) and use the leebounds command in Stata 12.
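The estimation and the bounds exercise can be summarized in a few lines of code. The sketch below is a minimal illustration in Python (pandas/statsmodels), not the authors' actual code: the data frame and variable names (treat1–treat3, respond, school_id, controls) are assumed for exposition.

```python
# Illustrative sketch of specification (1) and of Lee (2009) bounds; the data
# frame and variable names (treat1-treat3, respond, school_id) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

TREATS = ["treat1", "treat2", "treat3"]   # overlapping (nested) indicators

def estimate_itt(df, outcome, controls, cluster="school_id"):
    """OLS of the outcome on the overlapping treatment dummies plus baseline
    controls, with standard errors clustered at the school level."""
    d = df.dropna(subset=[outcome, cluster] + TREATS + controls)
    formula = outcome + " ~ " + " + ".join(TREATS + controls)
    return smf.ols(formula, data=d).fit(
        cov_type="cluster", cov_kwds={"groups": d[cluster]}
    )

def lee_bounds(df, outcome, treat, respond):
    """Lee (2009) bounds for a binary treatment: trim the observed outcomes of
    the group with the higher response rate by the excess-response share, then
    recompute the treatment-control difference at both trimming extremes."""
    t, c = df[df[treat] == 1], df[df[treat] == 0]
    p_t, p_c = t[respond].mean(), c[respond].mean()
    assert p_t >= p_c, "this sketch assumes the treated group attrits less"
    trim = (p_t - p_c) / p_t                  # share of treated respondents to drop
    y_t = np.sort(t.loc[t[respond] == 1, outcome].to_numpy())
    y_c = c.loc[c[respond] == 1, outcome].to_numpy()
    k = int(np.floor(trim * len(y_t)))
    lower = y_t[: len(y_t) - k].mean() - y_c.mean()   # drop the k largest treated outcomes
    upper = y_t[k:].mean() - y_c.mean()               # drop the k smallest treated outcomes
    return lower, upper
```

The heterogeneity results in specification (2) would add interactions of the treatment dummies with the baseline characteristic of interest (for example, treat1:owns_textbook) to the same formula.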
4. Device Usage

Descriptive statistics on self-reported eReader usage are presented in Table 4. Information on eReader usage was collected through a self-administered survey as well as direct extraction from the devices. Self-reported data show high usage. Around 73 percent of the treatment sample claimed to have used the device at least once a week or almost every day. The likelihood of using the device at least once a week or more is slightly higher among students in the T1 group. About 45 percent of the sample gave a reason for limited usage; the most frequently cited reason was lack of electricity to charge the device, followed by a broken eReader. About 262 students (87 percent of the interviewed treatment sample) were able to name at least one book they had read on the device, 245 students were able to name two, and 208 students were able to name three books (not shown).14

Most students found the eReader easy and convenient to use. About 63 percent of respondents had to charge their eReaders once or twice a week, with 84 percent claiming that charging had little or no impact on their use. More than 75 percent had no problems with eReader features (e.g. marking text, weight of the device, difficulty of use). Around 89 percent claimed difficulty in reading had little or no impact on their use. Safety concerns about device use in public or breakage concerns were limited; however, nearly half of the students used their eReader only at home. Around 84 percent of the sample reported being comfortable with the device; 90 percent claimed that their overall experience with the device was positive. Despite this, about one-fourth of respondents claimed that their use of the device was limited because they prefer paper textbooks (27 percent).

Information on eReader usage was also derived from direct extraction of usage statistics from the eReaders themselves. The eReaders used in the experiment (Kindle, version 3.2) allowed for limited extraction of usage data – such as books read/opened, highlights, and notes. Two strategies were followed. First, data were collected by transcribing information from the 'menu' of actual eReaders recovered from students at endline. Second, functioning devices were connected to the internet to extract usage data. The extent to which these data can be used for rigorous analysis, however, is constrained by the high rates of device breakage and unavailability. Device status at endline is reported in Table 3. Around 88 percent of devices showed some sign of physical damage (51 percent of which was damage to the screen). Analyzing usage data from 86 functioning devices stored on the cloud, we find that the average number of books read over the 6-8 month treatment period was 4.6. Only 6 percent of devices with extracted data showed no book read. There were no systematic differences across treatment groups in the average number of books read. Just over 85 percent of devices with extracted data had at least one highlight; 56 percent had at least one note.

5. Impacts of eReader Treatment

The results of estimating specifications (1) and (2) (described above) are presented in Tables 5-9.
The first pair of columns in each table corresponds to specification (1) with and without controls. Controls are drawn from a set of baseline student characteristics15 as well as baseline student performance in reading and average school performance at baseline (de Ree et al. 2017). The coefficients of interest capture contrasts between adjacent treatments (T1 vs. control, T2 vs. T1, T3 vs. T2). We then present results of estimating specification (2), where the overlapping treatment indicators are interacted with three student-level attributes: student having above median reading at baseline, student owned a paper textbook at baseline, and student had aspirations for a high level of education at baseline. All specifications cluster standard errors at the school level.

14 Oliver Twist was the most frequently cited book, followed by students' English textbook, Five Little Friends, and students' math textbook.
15 Age, educational attainment of mother and father, household wealth, access to electricity, mobile phones, and proxies for frequency of paper-textbook use.

5.1 Cognitive Outcomes

Our first set of results relates to learning outcomes in reading comprehension, math, and non-verbal ability (Tables 5-7, respectively). Our results for reading comprehension (Table 5) are consistent with prior findings. Using estimates from column (2), the impact of just receiving an eReader without curriculum content is negative and substantive, albeit imprecisely estimated (Malamud and Pop-Eleches 2011). Adding curriculum textbooks to the eReaders produces imprecise gains of nearly 0.2 standard deviation. Finally, adding supplementary materials leads to a small and imprecise negative impact. The examination of heterogeneity produces some interesting insights. First, the "hardware-only" losses appear to be concentrated among students with low baseline reading scores. Second, all the gains generated by eReaders with curriculum content (T2) accrue only to students who do not own a paper textbook at baseline. Finally, and consistent with the heterogeneity by baseline scores, the hardware-only penalty is concentrated among students with low aspirations for higher education at baseline.

The pattern of results for math (Table 6) differs from that for reading. First, we observe negative (albeit imprecise) impacts of adding textbooks to eReaders (T2). On the other hand, the supplementary materials (T3) appear to produce imprecise gains of the same magnitude. Examining heterogeneity suggests that the declines associated with T2 are concentrated among students with math textbooks at baseline. In contrast to reading, the math results suggest eReaders crowd out productive engagement with paper textbooks. An examination of the math textbook content suggests that the inability to zoom in on formulas or diagrams may in part explain the observed results.

Finally, for non-verbal ability (Table 7), we note some interesting patterns. First, in contrast to Malamud and Pop-Eleches (2011) and Cristia et al. (2017), access to digital devices does not boost non-verbal reasoning ability. If anything, we observe declines in non-verbal reasoning, particularly for students with low reading scores at baseline (significant at the 10 percent level).

Overall, our results suggest that even within the 6-8-month exposure, access to technology that provides non-curriculum content can impact student test scores negatively.
However, when technology fills input gaps – as in providing textbook content to those who do not have it – it can improve learning. Such technology also has the potential to crowd-out traditional input use – as in making students substitute away from paper textbooks to digital format – which may in fact impact learning negatively. 5.2 Non-Cognitive Outcomes For students in the sample (age 13-14 years), access to the gadgets and the wide range of content therein may potentially impact the way they think of themselves and their future. In light of this, we examine the impact of eReader provision on student aspirations (Table 8) and self-efficacy (Table 9). We do not find any systematic patterns, although, compared to no eReader, access to non-curriculum content (T1) appears to negatively impact aspirations for higher education. 10   6. Student Retention As mentioned above, about 2.5 years following the intervention a student tracking survey was undertaken to determine which students were still in school. Assuming no repetition, these students would have been in Senior Secondary 1 grade (Grade 10). The methodology for tracking is laid out in Section 2.2. Note that this does not rely on speaking to students directly. Instead we rely on teachers and head teachers at the students’ Junior Secondary Schools. We were unable to trace 11 percent of the original sample (55 of 497 students). Of the students we were able to trace, 365 were still engaged in some kind of educational activity (school, vocational training) while 78 were not. Impacts of eReader provision and informational provision on student retention are presented in Tables 10 and 11, respectively. To explore the potential influence of attrition (missing students) on our findings, we calculate Lee (2009) bounds on the potential treatment effect given extreme assumptions on attrition bias and report the results in Table 12. Consistent with some of the predictions of the reciprocity literature, we find that, compared to the control group, assignment to receive an eReader with non-curriculum content (T1) has a positive impact on student retention of 13 percentage points (significant at the 5 percent level). This positive impact is mostly concentrated among the more disadvantaged students - those with lower than median baseline reading ability, low maternal education, and low wealth. On the other hand, compared to the eReader with just the digital library, receiving additional curriculum content reduces retention in school (Table 10). Finally, we document positive impacts of the information treatment on student retention (Table 11). Those who received the scripted message on returns to secondary education in Lagos were 11 percentage points more likely to continue their education (significant at the 5 percent level). These impacts are also more concentrated among students from poorer households. In addition, the observed gains from the information script are more than twice as large as the effect of assignment to receive an expensive eReader. Further, Lee Bounds analysis (Table 12) shows that student retention impacts from information treatment are robust to extreme assumptions regarding student attrition. 7. Conclusion We examine the impacts of a low-cost technology intervention (eReader provision) on student outcomes through a randomized control trial. Within the overall experiment, we investigate the marginal short-term impacts of different types of content on test scores. 
In addition, we benchmark longer term impacts on school retention against a low-cost non-technology-based alternative. First, we establish that eReaders are a viable/feasible option for content delivery in urban contexts like Lagos. Second, we show that impacts are heavily mediated by content. When eReaders address input gaps – in this case providing curriculum material to students who do not have paper textbooks – they have a positive impact on student learning even with short exposure. On the other hand, when eReaders provide non-curriculum recreational books (age-appropriate English language fiction and non-fiction), student test scores in reading and math decline. Finally, exposure to eReaders appears to improve student retention. However, similar impacts can be achieved at much 11   lower costs through a non-technology intervention that involves dissemination of returns to schooling information directly to students. These impacts demonstrate the need to focus much more on the type of content that is being delivered through digital technology interventions, rather than the gadgets. These results also appear to suggest the importance of clearly defining/measuring the counterfactual – is the technology ‘additive’ or ‘substitutive’ of traditional inputs? Finally, the results demonstrate the need to better quantify the cost-effectiveness of technology interventions by benchmarking them against low-cost non-technology interventions. 12   References Akerlof, George A. (1982), “Labor Contracts as Partial Gift Exchange,” Quarterly Journal of Economics, 97, 543-569. Attewell, Paul, and Juan Battle. 1999. "Home Computers and School Performance." The Information Society 15(1): 1-10. https://doi.org/10.1080/019722499128628 Angrist, Joshua and Victor Lavy. 2002. “New Evidence on Classroom Computers and Pupil Learning.” The Economic Journal, 112(482): 735-765. https://doi.org/10.1111/1468-0297.00068 Banerjee, Abhijit V., Shawn Cole, Esther Duflo, and Leigh Linden. 2007. “Remedying Education: Evidence from Two Randomized Experiments in India.” The Quarterly Journal of Economics, 122(3): 1235-1264. https://doi.org/10.1162/qjec.122.3.1235 Banerjee, Abhijit V., Rukmini Banerji, James Berry, Esther Duflo, Harini Kannan, Shobhini Mukherji, Marc Shotland, and Michael Walton. 2016. “Mainstreaming an Effective Intervention: Evidence from Randomized Evaluations of “Teaching at the Right Level” in India.” NBER Working Paper No. 22746. National Bureau of Economic Research. https://doi.org/10.3386/w22746 Banerjee, Abhijit V., Rukmini Banerji, Esther Duflo, Rachel Glennerster, and Stuti Khemani. 2010. “Pitfalls of Participatory Programs: Evidence from a Randomized Evaluation in Education in India.” American Economic Journal: Economic Policy 2(1): 1–30. http://www.aeaweb.org/articles.php?doi=10.1257/pol.2.1.1 Barrera-Osorio, Felipe and Leigh L. Linden. 2009. “The Use and Misuse of Computers in Education: Evidence from a Randomized Experiment in Colombia.” Impact Evaluation Series, Policy Research Working Paper No. WPS4836. Washington, DC: World Bank. http://documents.worldbank.org/curated/en/346301468022433230/The-use-and-misuse-of-computers-in- education-evidence-from-a-randomized-experiment-in-Colombia Beatty, Amanda, and Lant Pritchett. 2012. "From Schooling Goals to Learning Goals." CGD Policy Paper 012. Washington, DC: Center for Global Development. http://www.cgdev.org/content/publications/detail/1426531 Beuermann, Diether W., Julian Cristia, Santiago Cueto, Ofer Malamud, and Yyannu Cruz-Aguayo. 2015. 
“One Laptop per Child at Home: Short-term Impacts from a Randomized Experiment in Peru.” American Economic Journal: Applied Economics, 7(2): 53-80. http://dx.doi.org/10.1257/app.20130267 Bold, Tessa, Deon Filmer, Gayle Martin, Ezequiel Molina, Brian Stacy, Christophe Rockmore, Jakob Svensson, and Waly Wane. 2017. “Enrollment without Learning: Teacher Effort, Knowledge, and Skill in Primary Schools in Africa.” Journal of Economic Perspectives, 31(4), 185-204. https://doi.org/10.1257/jep.31.4.185 Bulman, George, and Robert W. Fairlie. 2016. “Technology and Education: Computers, Software, and the Internet.” Handbook of the Economics of Education 5: 239-280. https://doi.org/10.1016/B978-0-444- 63459-7.00005-1 Butler, Declan. 2007. “The Race to Wire up the Poor.” Nature 447(6-7). https://doi.org/10.1038/447006a 13   Carrillo, Paul, Mercedes Onofa, and Juan Ponce. 2011. “Information Technology and Student Achievement: Evidence from a Randomized Experiment in Ecuador.” IDB Working Paper No. 78. Washington, DC: Inter-American Development Bank. https://dx.doi.org/10.2139/ssrn.1818756 Conn, Katherine. 2014. “Identifying Effective Education Interventions in Sub-Saharan Africa: A Meta- Analysis of Rigorous Impact Evaluations.” PhD diss., Columbia University. Cristia, Julián, Pablo Ibarrarán, Santiago Cueto, Ana Santiago, and Eugenio Severín. 2017. “Technology and Child Development: Evidence from the One Laptop Per Child Program.” American Economic Journal: Applied Economics 9(3): 295–320. https://doi.org/10.1257/app.20150385 Das, Jishnu, Stefan Dercon, James Habyarimana, Pramila Krishnan, Karthik Muralidharan, and Venkatesh Sundararaman. 2013. “School Inputs, Household Substitution, and Test Scores.” American Economic Journal: Applied Economics 5(2): 29-57. http://dx.doi.org/10.1257/app.5.2.29 de Melo, Gioia, Alina Machado, and Alfonso Miranda. 2014. “The Impact of a One Laptop per Child Program on Learning: Evidence from Uruguay.” IZA Discussion Paper No. 8489. Bonn, Germany: Institute for the Study of Labor. de Ree, Joppe, Karthik Muralidharan, Menno Pradhan, and Halsey Rogers. 2017. “Double for Nothing? Experimental Evidence on an Unconditional Teacher Salary Increase in Indonesia.” Quarterly Journal of Economics 133(2): 993-1039. https://doi.org/10.1093/qje/qjx040 Duflo, Esther, Pascaline Dupas, and Michael Kremer. 2015. “School Governance, Teacher Incentives, and Pupil-teacher Ratios: Experimental Evidence from Kenyan Primary Schools.” Journal of Public Economics 123: 92-110. https://doi.org/10.1016/j.jpubeco.2014.11.008 Glewwe, Paul and Michael Kremer. 2006. “Schools, Teachers, and Education Outcomes in Developing Countries.” Handbook of the Economics of Education 2: 945-1017. https://doi.org/10.1016/S1574- 0692(06)02016-2 Fairlie, Robert W. 2005. “The Effects of Home Computers on School Enrollment.” Economics of Education Review 24(5): 533-547. https://doi.org/10.1016/j.econedurev.2004.08.008 Fairlie, Robert W., Daniel O. Beltran, and Kuntal K. Das. 2010. “Home Computers and Educational Outcomes: Evidence from the NLSY97 and CPS.” Economic Inquiry, 48(3), 771-792. https://doi.org/10.1111/j.1465-7295.2009.00218.x Fairlie, Robert W., and Rebecca A. London. 2012. “The Effects of Home Computers on Educational Outcomes: Evidence from a Field Experiment with Community College Students”. The Economic Journal, 122(561), 727-753. https://doi.org/10.1111/j.1468-0297.2011.02484.x Fehr, Ernst, Erich Kirchler, Andreas Weichbold, and Simon Gächter. 1998. 
"When Social Norms Overpower Competition: Gift Exchange in Experimental Labor Markets," Journal of Labor Economics 16, no. 2: 324-351 14   Fehr, Ernst and Simon Gächter (2000), Fairness and Retaliation: The Economics of Reciprocity, Journal of Economic Perspectives, 14(3), 159-181. Ferrando, Mery, Alina Machado, Ivone Perazzo, and Adriana Vernengo. 2011. “Una Primera Evaluación de los Efectos del Plan CEIBAL en Base a Datos de Panel.” Montevideo, Uruguay: Instituto de Economía de la FCEydeA. Filmer, Deon, Amer Hasan, and Lant Pritchett. 2006. “A Millennium Learning Goal: Measuring Real Progress in Education.” Working Paper 97. Washington, DC: Center for Global Development. Fredriksen, Birger, Sukhdeep Brar, and Michael Trucano. 2015. Getting Textbooks to Every Child in Sub-Saharan Africa: Strategies for Addressing the High Cost and Low Availability Problem. Directions in Development--Human Development. Washington, DC: World Bank. https://openknowledge.worldbank.org/handle/10986/21876 Fredriksen, Birger and Sukhdeep Brar. 2015. Getting textbooks to every child in Sub-Saharan Africa: strategies for addressing the high cost and low availability problem. Washington, DC: World Bank. https://doi.org/10.1596/978-1-4648-0540-0 Fuchs, Thomas and Ludger Woessmann. 2004. “Computers and Student Learning: Bivariate and Multivariate Evidence on the Availability and Use of Computers at Home and at School.” CESifo Working Paper Series No. 1321. CESifo Group Munich. Glewwe, Paul W., Eric A. Hanushek, Sarah D. Humpage, and Renato Ravina. 2011. “School Resources and Educational Outcomes in Developing Countries: A Review of the Literature from 1990 to 2010.” NBER Working Paper No. 17554. National Bureau of Economic Research. https://doi.org/10.3386/w17554 Glewwe, Paul W., Michael Kremer, and Sylvie Moulin. 2009. "Many Children Left Behind? Textbooks and Test Scores in Kenya." American Economic Journal: Applied Economics 1(1): 112-35. http://www.aeaweb.org/articles.php?doi=10.1257/app.1.1.112 Gneezy, U. and List, J. A. 2006. “Putting Behavioral Economics to Work: Testing for Gift Exchange in Labor Markets Using Field Experiments”. Econometrica, 74: 1365-1384. doi:10.1111/j.1468- 0262.2006.00707.x Hung, Woei. 2011. “Theory to Reality: A Few Issues in Implementing Problem-based Learning.” Educational Technology Research and Development 59(4): 529–552. http://dx.doi.org/10.1007/s11423- 011-9198-1 Kremer, Michael and Alaka Holla. 2009. “Improving Education in the Developing World: What Have We Learned from Randomized Evaluations? Annual Review of Economics 1: 513-542. https://doi.org/10.1146/annurev.economics.050708.143323 Lai, Fang, Renfu Luo, Linxiu Zhang, Xinzhe Huang, and Scott Rozelle. 2015. “Does Computer-assisted Learning Improve Learning Outcomes? Evidence from a Randomized Experiment in Migrant Schools in Beijing.” Economics of Education Review 47: 34-48. https://doi.org/10.1016/j.econedurev.2015.03.005 15   Jensen, Robert. 2007. “The Digital Provide: Information (Technology), Market Performance, and Welfare in the South Indian Fisheries Sector.” Quarterly Journal of Economics 122(3): 879-924. https://www.jstor.org/stable/25098864 Jensen, Robert. 2010. " The (Perceived) Returns to Education and the Demand for Schooling." Quarterly Journal of Economics 125(2): 515-548. https://doi.org/10.1162/qjec.2010.125.2.515 Kraemer, Kenneth L., Jason Dedrick and Prakul Sharma. 2009. “One Laptop Per Child: Vision vs. Reality.” Communications of the ACM 52(6): 66-73. https://doi.org/10.1145/1516046.1516063 Lee, David. 
S. 2009. “Training, Wages, and Sample Selection: Estimating Sharp Bounds on Treatment Effects.” The Review of Economic Studies 76(3): 1071-1102. https://doi.org/10.1111/j.1467- 937X.2009.00536.x Machin, Stephen, Sandra McNally, and Olmo Silva. 2007. “New Technology in Schools: Is There a Payoff?” The Economic Journal 117(522): 1145-1167. https://doi.org/10.1111/j.1468-0297.2007.02070.x Malamud, Ofer and Cristian Pop-Eleches. 2011. “Home Computer Use and the Development of Human Capital.” The Quarterly Journal of Economics 126(2): 987-102. https://doi.org/10.1093/qje/qjr008 McEwan, Patrick J. 2015. “Improving Learning in Primary Schools of Developing Countries: A Meta- Analysis of Randomized Experiments.” Review of Educational Research 85(3): 353–394. https://doi.org/10.3102%2F0034654314553127 Mo, Di, Johan Swinnen, Linxiu Zhang, Hongmei Yi, Qinghe Qu, Matthew Boswell, and Scott Rozelle. 2013. “Can One-to-One Computing Narrow the Digital Divide and the Educational Gap in China? The Case of Beijing Migrant Schools.” World Development 46: 14-29. https://doi.org/10.1016/j.worlddev.2012.12.019 Muralidharan, Karthik, Abhijeet Singh, and Alejandro J. Ganimian. Forthcoming. “Disrupting Education? Experimental Evidence on Technology-Aided Instruction in India.” American Economic Review. http://econweb.ucsd.edu/~kamurali/papers/Working%20Papers/Disrupting%20Education%20(Current%2 0WP).pdf Pritchett, Lant. 2013. The Rebirth of Education: Schooling Ain't Learning. Washington, DC: Center for Global Development. Rasku-Puttonen, Helena, Anneli Eteläpelto, Maarit Arvaja, and Päivi Häkkinen. 2003. “Is Successful Scaffolding an Illusion? Shifting Patterns of Responsibility and Control in Teacher-student Interaction during a Long-term Learning Project.” Instructional Science: An International Journal of the Learning Sciences 31(6): 377–393. http://dx.doi.org/10.1023/A:1025700810376 Raven, John. 2003. “Raven Progressive Matrices.” In Handbook of Nonverbal Assessment, edited by R. Steve McCallum, 223-237. Boston: Springer. https://doi.org/10.1007/978-1-4615-0153-4_11 Sharma, Uttam. 2012. “Essays on the Economics of Education in Developing Countries.” PhD diss., University of Minnesota. 16   Snilstveit, Birte, Jennifer Stevenson, Daniel Phillips, Martina Vojtkova, Emma Gallagher, Tanja Schmidt, Hannah Jobse, Maisie Geelen, Maria Grazia Pastorello, and John Eyers. 2015. “Interventions for Improving Learning Outcomes and Access to Education in Low- and Middle-Income Countries: A Systematic Review.” London: International Initiative for Impact Evaluation (3ie). Sobel, J. (2005) “Interdependent preferences and reciprocity,” Journal of Economic Literature, 43(2): 392-436. Subrahmanyam, Kaveri, Robert E. Kraut, Patricia M. Greenfield, and Elisheva F. Gross. 2000. “The Impact of Home Computer Use on Children's Activities and Development.” The Future of Children 10(2): 123-144. https://doi.org/10.2307/1602692 Tauchmann, Harald. 2014. “Lee (2009) treatment-effect bounds for nonrandom sample selection.” The Stata Journal 14(4): 884-894. Vigdor, Jacob L., and Helen F. Ladd. 2010. “Scaling the Digital Divide: Home Computer Technology and Student Achievement.” NBER Working Paper No. 16078. Washington, DC: National Bureau of Economic Research. Wartella, Ellen A., and Nancy Jennings. 2000. “Children and Computers: New Technology—Old Concerns.” The Future of Children 10(2): 31-43. http://psycnet.apa.org/doi/10.2307/1602688 World Bank. 2003. World Development Report 2004: Making Services Work for Poor People. 
World Bank Publications. World Bank. 2017. World Development Report 2018: Learning to Realize Education’s Promise. World Bank Publications. 17   Tables Table 1: Description and balance of baseline student characteristics (1) (2) (3) (4) (5) (6) Selected Characteristic Full sample T1 T2 T3 All treated Control Student Age 13.06 13.09 13.09 13.00 13.06 13.05 (1.86) -0.18 (0.18) (0.16) (0.10) (0.16) Male 47.66 47.66 50.94 44.86 47.81 47.37 (50.00) (4.85) (4.88) (4.83) (2.80) (3.83) Repeated class 14.66 17.76 16.04 11.21 15.00 14.04 (35.41) (3.71) (3.58) (3.06) (2.00) (2.66) Has a mobile phone 35.01* 27.78 31.13 40.19 33.02 38.64 (47.75) (4.33) (4.52) (4.76) (2.63) (3.68) Has an email address 2.41 0.93 2.83 3.74 2.49 2.27 (15.37) (0.93) (1.62) (1.84) (0.87) (1.13) Enjoys reading magazines 40.44 37.96 39.62 45.79 41.12 39.20 (49.13) (4.69) (4.77) (4.84) (2.75) (3.69) Enjoys reading newspapers 50.1 46.30 49.06 54.21 49.84 50.57 (50.05) (4.82) (4.88) (4.84) (2.80) (3.78) Household Mother's Education (Greater than Secondary) 17.1** 10.19 16.98 20.56 15.89 19.32 (37.69) (2.92) (3.66) (3.93) (2.04) (2.98) Father's Education (Greater than secondary) 30.58 27.78 29.25 36.45 31.15 29.55 (46.12) (4.33) (4.44) (4.67) (2.59) (3.45) Wealth index 0.00 -0.13 0.10 -0.08 -0.04 0.07 (1.02) (0.11) (0.08) (0.11) (0.06) (0.07) Electricity connection 95.37* 96.30 96.23 97.20 96.57 93.18 (21.03) (1.83) (1.86) (1.60) (1.02) (1.91) 18   (1) (2) (3) (4) (5) (6) Selected Characteristic Full sample T1 T2 T3 All treated Control Electricity outrages > 2 days per week 77.6 75.70 77.36 77.57 76.88 78.95 (41.74) (4.17) (4.08) (4.05) (2.36) (3.13) Owns English Textbook 38.29 34.58 36.79 41.12 37.50 39.77 (48.66) (4.62) (4.71) (4.78) (2.71) (3.75) Used English Textbook at least twice in past seven days 0.74 0.70 0.72 0.78 0.73 0.75 (0.44) (0.04) (0.04) (0.04) (0.02) (0.03) Owns Math Textbook 0.34 0.32 0.35 0.36 0.34 0.33 (0.47) (0.05) (0.05) (0.05) (0.03) (0.04) Used Math Textbook at least twice in past seven days 71.43* 68.52 65.09 74.77 69.47 75.00 (45.22) (4.49) (4.65) (4.22) (2.57) (3.27) Attending special exam prep sessions in Math 72.84** 69.44 74.53 80.37 74.77 69.32 (44.52) (4.45) (4.25) (3.86) (2.43) (3.49) Attending special exam prep sessions in English 73.04* 72.22 73.58 79.44 75.08 69.32 (44.42) (4.33) (4.30) (3.93) (2.42) (3.49) Baseline Reading Comprehension, % correct 27.51 28.10 26.47 28.26 27.62 27.32 (9.55) (0.91) (0.88) (0.95) (0.53) (0.74) Baseline Reading Score (School Mean) 27.52 27.83 27.43 27.54 27.60 27.37 (6.00) (0.56) (0.59) (0.63) (0.34) (0.44) T1, eReader with library; T2, eReader with library and curriculum books; T3, eReader with library, curriculum books and supplementary material. Standard errors are reported in parentheses. *, **, and *** indicate that treatment arms differed at the 10, 5 and 1 percent levels, respectively. 
19   Table 2: Information treatment balance for student and household characteristics (3) (1) (2) Selected Characteristic Information Full sample Control treatment1 Student Age 13.06 13.06 13.06 (1.86) (1.89) (1.73) Male 47.66 48.1 44.93 (50.00) (50.02) (50.11) Repeated class 14.66 14.45 15.94 (35.41) (35.21) (36.87) Have a mobile phone 35.01 34.81 36.23 (47.75) (47.69) (48.42) Have an email address 2.41 2.1 4.35 (15.37) (14.36) (20.54) Enjoy reading magazines 40.44 40.19 42.03 (49.13) (49.08) (49.72) Enjoy reading newspapers 50.1 50 50.72 (50.05) (50.06) (50.36) Household Mother's Education(Greater than Secondary) 17.1 17.06 17.39 (37.69) (37.66) (38.18) Father's Education(Greater than secondary) 30.58 30.14 33.33 (46.12) (45.94) (47.49) Wealth index 0.00 -0.02 0.1 (1.02) (1.03) (0.92) with Electricity connection 95.37 95.09 97.1 (21.03) (21.63) (16.90) Electricity outrage > 2 days per week 77.6 78.2 73.91 (41.74) (41.34) (44.23) Own English Textbook 38.29 38.86 34.78 20   (3) (1) (2) Selected Characteristic Information Full sample Control treatment1 (48.66) (48.80) (47.98) Used English Textbook at least twice in past seven days 0.74 0.73 0.77 (0.44) (0.44) (0.43) Own Math Textbook 0.34 0.33 0.36 (0.47) (0.47) (0.48) Used Math Textbook at least twice in past seven days 71.43 70.56 76.81 (45.22) (45.63) (42.51) Attending special exam prep sessions in Math 72.84 71.96 78.26 (44.52) (44.97) (41.55) Attending special exam prep sessions in English 73.04* 71.73 81.16 (44.42) (45.08) (39.39) Baseline Reading Comprehension, % correct 27.51** 27.89 25.15 (9.55) (9.62) (8.85) Baseline Reading Score 27.52 27.64 26.74 (6.00) (5.96) (6.25) Number of Observations 497 428 69 1 Information treatment arms received a short script on the expected returns to education in the labor market. Standard errors are reported in parentheses and clustered at the school level. *, **, and *** indicate that information treatment arms differed at the 10, 5 and 1 percent levels, respectively. 21   Table 3. Attrition rates for students and eReaders status at endline (1) (2) (3) (4) (5) (6) Full sample T1 T2 T3 T4 Control Student Attrition Student interviewed at baseline 497 108 106 107 321 176 Student interviewed at endline 443 99 93 99 291 152 Student attrition (%) 10.9 8.3 12.3 7.5 9.3 13.6 Device status at endline Devices distributed at baseline 107 107 107 321 Devices recovered at endline 91 79 91 261 Device attrition (%) 15.0 26.2 15.0 18.7 Data extraction from devices Extraction of cloud data possible 28 26 32 86 No data extraction possible 8 11 17 36 Broken devices 55 42 42 139 T1, eReader with library; T2, eReader with library and curriculum books; T3, eReader with library, curriculum books and supplementary material. 22   Table 4. Self-reported eReader use (1) (2) (3) (4) T1 T2 T3 All treated How often did you use your eReader in the last 6 months? 
                                                             (1) T1    (2) T2    (3) T3    (4) All treated
How often did you use your eReader in the last 6 months?
  Never                                                      1.03      7.95      7.14      5.30
  Once or twice                                              7.22      4.55      4.08      5.30
  A few times                                                12.37     20.45     17.35     16.61
  At least once a week                                       17.53     17.05     14.29     16.25
  Almost every day                                           61.86     50.00     57.14     56.54
What were the reasons you didn't use your eReader?1
  eReader broken                                                                           23.62
  eReader got lost                                                                         8.66
  No electricity to charge the eReader                                                     43.31
  Lent out the eReader to a friend/family member                                           14.17
  eReader kept at my school                                                                10.24
Did you use the eReader at school?
  Yes                                                        56.48     49.06     53.27     52.96
Hours each day, on average, spent using the eReader for reading books
  Less than 1 hour                                           15.05     21.79     14.61     16.92
  1-2 hours                                                  45.16     37.18     35.96     39.62
  2-4 hours                                                  24.73     28.21     29.21     27.31
  More than 4 hours                                          15.05     12.82     20.22     16.15
What has motivated you to use the eReader in the last one month?
  Prepare for tests                                          52.63     45.24     34.41     44.12
  Find the book or topic interesting                         12.63     7.14      7.53      9.19
  Instructed to do so                                        18.95     14.29     16.13     16.54
  Boredom and having nothing else to do                      16.85     12.20     7.69      12.21
Did the following have a major impact on your use of the eReader?
  Charging                                                   6.74      6.10      7.69      6.87
  Difficulty in reading                                      2.25      1.23      4.40      2.68
  Difficulty in marking/highlighting text                    13.48     9.88      11.96     11.83
  Learning how to use the device                             3.33      6.02      2.22      3.80
  Lack of training/information on using the eReader          3.41      3.70      6.59      4.62
  Safety concerns about using the eReader in public          8.05      6.10      10.99     8.46
Which of the following would you prefer to read?
  Paper textbooks                                            13.54     7.06      9.47      10.14
  eReader                                                    53.13     62.35     62.11     59.06
  Indifferent between textbooks and eReader                  27.08     20.00     24.21     23.91
  Don't know                                                 6.25      10.59     4.21      6.88
T1, eReader with library; T2, eReader with library and curriculum books; T3, eReader with library, curriculum books and supplementary material.
1 Forty-five percent of the treatment sample provided a reason for limited usage.

Table 5. Impact of treatments on reading

                                          No interaction                   Above median reading              Owns English textbook             Aspire for tertiary education
                                          ITT               Controls       ITT               Controls        ITT               Controls        ITT               Controls
T1                                        -0.169 (0.132)    -0.177 (0.132) -0.330* (0.157)   -0.295 (0.157)  -0.249 (0.171)    -0.254 (0.167)  -0.945* (0.414)   -1.078** (0.383)
T2                                        0.154 (0.139)     0.188 (0.138)  0.233 (0.174)     0.224 (0.172)   0.379* (0.172)    0.393* (0.172)  0.209 (0.476)     0.370 (0.458)
T3                                        -0.043 (0.128)    -0.043 (0.123) 0.043 (0.161)     0.047 (0.156)   -0.137 (0.152)    -0.136 (0.143)  0.164 (0.363)     0.208 (0.407)
Main effect for interacting variable                                       0.474** (0.160)   0.147 (0.213)   0.001 (0.002)     0.001 (0.002)   0.416* (0.196)    0.330 (0.234)
Interaction * T1                                                           0.569* (0.271)    0.483 (0.273)   0.002 (0.003)     0.002 (0.003)   0.809 (0.435)     0.939* (0.410)
Interaction * T2                                                           -0.282 (0.274)    -0.214 (0.268)  -0.006* (0.003)   -0.005 (0.003)  -0.063 (0.497)    -0.191 (0.482)
Interaction * T3                                                           -0.323 (0.254)    -0.298 (0.242)  0.002 (0.003)     0.002 (0.003)   -0.208 (0.386)    -0.262 (0.429)
Baseline reading score                    N                 Y              N                 Y               N                 Y               N                 Y
Baseline reading score (school average)   N                 Y              N                 Y               N                 Y               N                 Y
Baseline controls                         N                 Y              N                 Y               N                 Y               N                 Y
R-squared                                 0.00              0.11           0.03              0.11            0.02              0.12            0.02              0.11
No. of observations                       438               438            438               438             438               438             438               438
Note: Treatment indicators are defined as an overlapping (nested) set, with T1 corresponding to all students assigned any eReader, T2 to all students assigned an eReader with curriculum textbooks, and T3 to students assigned an eReader with supplementary instructional material. Baseline controls include: age, mother's education (greater than secondary), father's education (greater than secondary), household wealth index, whether the household has an electricity connection, used English textbook at least twice in the last week, and attending special exam prep in English. Standard errors clustered at the school level are reported in parentheses. *, **, and *** represent significance at the 10, 5, and 1 percent levels, respectively.
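Because the treatment indicators in Tables 5-7 are overlapping (T1 = any eReader, T2 = curriculum books or more, T3 = curriculum plus supplementary material), each coefficient is the marginal effect of the added content relative to the previous arm. A minimal sketch of this specification, with synthetic data and hypothetical variable names rather than the authors' code, is:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 438  # endline reading sample in Table 5

    df = pd.DataFrame({
        "school_id": rng.integers(0, 20, n),
        "arm": rng.choice(["control", "lib", "lib_curr", "lib_curr_supp"], n),
        "reading_score": rng.normal(size=n),       # endline outcome (standardized)
        "baseline_reading": rng.normal(size=n),
    })
    # Overlapping (nested) indicators, as described in the notes to Tables 5-7.
    df["T1"] = (df["arm"] != "control").astype(int)                       # any eReader
    df["T2"] = df["arm"].isin(["lib_curr", "lib_curr_supp"]).astype(int)  # + curriculum
    df["T3"] = (df["arm"] == "lib_curr_supp").astype(int)                 # + supplementary

    itt = smf.ols("reading_score ~ T1 + T2 + T3 + baseline_reading", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["school_id"]})
    print(itt.params[["T1", "T2", "T3"]])  # marginal effects of each added layer

The interacted columns in the tables add, for example, an above-median-reading dummy and its products with T1-T3 to the same regression.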
Table 6. Impact of treatments on math

                                          No interaction                   Above median math                 Owns math textbook                Aspire for tertiary education
                                          ITT               Controls       ITT               Controls        ITT               Controls        ITT               Controls
T1                                        -0.004 (0.140)    -0.028 (0.141) -0.048 (0.174)    -0.044 (0.174)  -0.090 (0.184)    -0.102 (0.186)  -0.089 (0.430)    -0.061 (0.461)
T2                                        -0.201 (0.153)    -0.200 (0.152) -0.180 (0.192)    -0.220 (0.186)  0.090 (0.163)     0.082 (0.161)   -0.357 (0.432)    -0.365 (0.388)
T3                                        0.193 (0.145)     0.220 (0.148)  0.261 (0.182)     0.296 (0.180)   -0.003 (0.145)    0.024 (0.147)   -0.238 (0.640)    -0.265 (0.679)
Main effect for interacting variable                                       0.214 (0.184)     0.045 (0.250)   -0.007 (0.173)    -0.014 (0.172)  -0.286 (0.587)    -0.370 (0.634)
Interaction * T1                                                           0.182 (0.277)     0.102 (0.277)   0.248 (0.274)     0.229 (0.276)   0.088 (0.454)     0.032 (0.482)
Interaction * T2                                                           -0.113 (0.306)    -0.007 (0.307)  -0.856* (0.350)   -0.843* (0.352) 0.162 (0.461)     0.171 (0.422)
Interaction * T3                                                           -0.232 (0.306)    -0.238 (0.305)  0.579 (0.345)     0.593 (0.348)   0.454 (0.658)     0.511 (0.698)
Baseline English score                    N                 Y              N                 Y               N                 Y               N                 Y
Baseline English score (school average)   N                 Y              N                 Y               N                 Y               N                 Y
Baseline controls                         N                 Y              N                 Y               N                 Y               N                 Y
R-squared                                 0.01              0.03           0.01              0.04            0.03              0.05            0.01              0.04
No. of observations                       408               408            408               408             408               408             408               408
Note: Treatment indicators are defined as an overlapping (nested) set, with T1 corresponding to all students assigned any eReader, T2 to all students assigned an eReader with curriculum textbooks, and T3 to students assigned an eReader with supplementary instructional material. Baseline controls include: age, mother's education (greater than secondary), father's education (greater than secondary), household wealth index, whether the household has an electricity connection, used math textbook at least twice in the last week, and attending special exam prep in math. Standard errors clustered at the school level are reported in parentheses. *, **, and *** represent significance at the 10, 5, and 1 percent levels, respectively.

Table 7. Impact of treatments on non-verbal reasoning (Raven's Matrices)

                                          No interaction                   Above median reading              Aspires for tertiary education
                                          ITT               Controls       ITT               Controls        ITT               Controls
T1                                        -0.249 (0.137)    -0.263 (0.140) -0.381* (0.156)   -0.338* (0.159) -0.417 (0.414)    -0.590 (0.508)
T2                                        0.167 (0.144)     0.191 (0.145)  0.311 (0.173)     0.312 (0.177)   -0.412 (0.531)    -0.180 (0.633)
T3                                        0.001 (0.133)     -0.001 (0.127) -0.051 (0.163)    -0.040 (0.159)  0.387 (0.584)     0.371 (0.618)
Main effect for interacting variable                                       0.346* (0.174)    0.396 (0.213)   0.455 (0.436)     0.394 (0.450)
Interaction * T1                                                           0.450 (0.317)     0.304 (0.324)   0.173 (0.437)     0.335 (0.529)
Interaction * T2                                                           -0.437 (0.315)    -0.423 (0.317)  0.604 (0.551)     0.391 (0.651)
Interaction * T3                                                           0.108 (0.275)     0.138 (0.253)   -0.400 (0.600)    -0.389 (0.634)
Baseline English score                    N                 Y              N                 Y               N                 Y
Baseline English score (school average)   N                 Y              N                 Y               N                 Y
Baseline controls                         N                 Y              N                 Y               N                 Y
R-squared                                 0.01              0.10           0.04              0.11            0.02              0.11
No. of observations                       439               439            439               439             439               439
Note: Treatment indicators are defined as an overlapping (nested) set, with T1 corresponding to all students assigned any eReader, T2 to all students assigned an eReader with curriculum textbooks, and T3 to students assigned an eReader with supplementary instructional material. Baseline controls include: age, mother's education (greater than secondary), father's education (greater than secondary), household wealth index, whether the household has an electricity connection, used science textbook at least twice in the last week, and attending special exam prep in science. Standard errors clustered at the school level are reported in parentheses. *, **, and *** represent significance at the 10, 5, and 1 percent levels, respectively.
Table 8. Impact of treatments on aspirations1

                                          Would like to complete           Inadequate requirements is       Lack of motivation is not
                                          higher institution or beyond     not the main obstacle to goal    the main obstacle to goal
T1                                        -0.103** (0.039)                 0.110* (0.049)                   0.043 (0.030)
T2                                        0.043 (0.033)                    -0.092 (0.048)                   -0.046 (0.029)
T3                                        0.045 (0.037)                    0.012 (0.034)                    0.056* (0.025)
Baseline English score                    Y                                Y                                 Y
Baseline English score (school average)   Y                                Y                                 Y
Baseline controls                         Y                                Y                                 Y
R-squared                                 0.24                             0.08                              0.05
No. of observations                       420                              327                               327
1 Only significant factors are presented. Other factors that were tested but show no impact include: certainty of achieving educational goals, the financial situation is not the main obstacle to goals, and the lack of programs close to home is not the main obstacle to goals. T1, eReader with library; T2, eReader with library and curriculum books; T3, eReader with library, curriculum books and supplementary material. Baseline controls include: age, mother's education (greater than secondary), father's education (greater than secondary), household wealth, whether the household has an electricity connection, and baseline values. Standard errors clustered at the school level are reported in parentheses. *, **, and *** represent significance at the 10, 5, and 1 percent levels, respectively.

Table 9. Impact of treatments on self-efficacy

                                          "I don't feel that I do not      "I take a positive attitude      "On the whole, I am
                                          have much to be proud of."       toward myself."                  satisfied with myself."
T1                                        0.156* (0.073)                   -0.148* (0.062)                  -0.045 (0.051)
T2                                        0.011 (0.072)                    0.118 (0.062)                    0.104* (0.051)
T3                                        -0.037 (0.065)                   -0.034 (0.059)                   -0.073 (0.048)
Baseline English score                    Y                                Y                                 Y
Baseline English score (school average)   Y                                Y                                 Y
Baseline controls                         Y                                Y                                 Y
R-squared                                 0.03                             0.05                              0.07
No. of observations                       430                              430                               430
Only significant factors are presented. Other factors that were tested but show no impact include the following statements: "I feel that I am a person of worth, at least on an equal basis with others"; "I feel that I have a number of good qualities"; "All in all, I am not inclined to feel that I am a failure"; "I am able to do things as well as most people"; "I do not wish I could have more respect for myself"; "I do not feel useless at times"; and "At times I do not think I am no good at all." T1, eReader with library; T2, eReader with library and curriculum books; T3, eReader with library, curriculum books and supplementary material. Baseline controls include: age, mother's education (greater than secondary), father's education (greater than secondary), household wealth, and whether the household has an electricity connection. Standard errors clustered at the school level are reported in parentheses. *, **, and *** represent significance at the 10, 5, and 1 percent levels, respectively.
Table 10. Impact of treatments on the likelihood of staying in school (at the 2-year follow-up)

                                          No interaction                 Above median reading           Aspires for tertiary education   Mother's education             Wealth index
                                          ITT              Controls      ITT              Controls      ITT              Controls        ITT              Controls      ITT              Controls
T1                                        0.105 (0.057)    0.128* (0.056) 0.153* (0.060)  0.176** (0.060) 0.179 (0.289)  0.172 (0.276)   0.141* (0.061)   0.161** (0.061) 0.121* (0.057)  0.132* (0.057)
T2                                        -0.145** (0.054) -0.155** (0.054) -0.148* (0.062) -0.149* (0.063) -0.286 (0.231) -0.264 (0.229) -0.149* (0.062) -0.156* (0.063) -0.155** (0.054) -0.164** (0.055)
T3                                        0.099* (0.045)   0.081 (0.045)  0.098 (0.054)   0.082 (0.054)  -0.143 (0.133)  -0.155 (0.151)  0.098 (0.053)    0.072 (0.052)  0.076 (0.045)    0.082 (0.045)
Main effect for interacting variable                                      -0.001 (0.070)  0.043 (0.088)  -0.180*** (0.033) -0.185*** (0.049) 0.072 (0.077) 0.056 (0.083) 0.029 (0.049)    0.023 (0.052)
Interaction * T1                                                          -0.156 (0.141)  -0.173 (0.140) -0.074 (0.294)   -0.052 (0.282)   -0.274 (0.182) -0.274 (0.169) -0.126* (0.063)  -0.117 (0.067)
Interaction * T2                                                          -0.004 (0.128)  -0.007 (0.127) 0.151 (0.238)    0.117 (0.235)    0.030 (0.118)  0.009 (0.122)  0.109 (0.066)    0.101 (0.068)
Interaction * T3                                                          -0.027 (0.098)  -0.009 (0.098) 0.224 (0.141)    0.246 (0.159)    -0.003 (0.097) 0.042 (0.100)  -0.030 (0.056)   -0.022 (0.056)
Baseline English score                    N                Y              N               Y              N                Y               N                Y              N                Y
Baseline English score (school average)   N                Y              N               Y              N                Y               N                Y              N                Y
Baseline controls                         N                Y              N               Y              N                Y               N                Y              N                Y
R-squared                                 0.02             0.04           0.03            0.05           0.03             0.05            0.03             0.05           0.03             0.05
No. of observations                       442              436            440             436            436              436             442              436            436              436
T1, eReader with library; T2, eReader with library and curriculum books; T3, eReader with library, curriculum books and supplementary material. Baseline controls include: age, mother's education (greater than secondary), father's education (greater than secondary), household wealth, and whether the household has an electricity connection. Standard errors clustered at the school level are reported in parentheses. *, **, and *** represent significance at the 10, 5, and 1 percent levels, respectively.

Table 11. Impacts of the information treatment

                                          No interaction     Above median reading   Aspires for tertiary education   Mother's education   Wealth index
Information treatment                     0.110** (0.037)    0.074 (0.045)          0.017 (0.219)                    0.097* (0.043)       0.115** (0.035)
Main effect for interacting variable                         -0.029 (0.065)         0.035 (0.096)                    0.038 (0.057)        0.010 (0.025)
Interaction * Information treatment                          0.155* (0.063)         0.101 (0.223)                    0.078 (0.065)        -0.047 (0.028)
eReader                                   0.044 (0.041)      0.046 (0.041)          0.041 (0.041)                    0.046 (0.041)        0.047 (0.041)
Control average                           0.80
Baseline English score                    Y                  Y                      Y                                Y                    Y
Baseline English score (school average)   Y                  Y                      Y                                Y                    Y
Baseline controls                         Y                  Y                      Y                                Y                    Y
R-squared                                 0.04               0.04                   0.04                             0.04                 0.04
No. of observations                       436                436                    436                              436                  436
Information treatment arms received a short script on the expected returns to education in the labor market. Baseline controls include: age, mother's education (greater than secondary), father's education (greater than secondary), household wealth, and whether the household has an electricity connection. Standard errors clustered at the school level are reported in parentheses. *, **, and *** represent significance at the 10, 5, and 1 percent levels, respectively.

Table 12. Lee bounds analysis of 2014 status on treatment and information treatment

                     Treatment          Information treatment
Full impact          0.038 (0.038)      0.133** (0.051)
Lower bound          0.025 (0.041)      0.13*** (0.038)
Upper bound          0.103* (0.052)     0.18*** (0.053)
Treatment = eReader with library, curriculum books, or supplementary material (pooled across arms). "Full impact" is the OLS regression of 2014 status on the treatment indicator. Lower and upper bounds are the trimming bounds computed with the Stata 'leebounds' command.
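The bounds in Table 12 follow Lee (2009): the arm with less attrition is trimmed by the excess share of respondents, from the top of the outcome distribution for the lower bound and from the bottom for the upper bound. The hand-rolled sketch below only illustrates the idea behind the Stata 'leebounds' command cited in the table note; it is not the authors' code, and all names are hypothetical.

    import numpy as np

    def lee_bounds(y_t, y_c, resp_t, resp_c):
        """Lee (2009) trimming bounds on a treatment-control difference in means.

        y_t, y_c       : outcomes for treated / control respondents only
        resp_t, resp_c : 0/1 response indicators for everyone originally assigned
        """
        r_t, r_c = np.mean(resp_t), np.mean(resp_c)
        if r_t >= r_c:                                   # trim the treated group
            q = (r_t - r_c) / r_t                        # excess response share
            ys = np.sort(np.asarray(y_t, dtype=float))
            k = int(np.floor(q * len(ys)))
            lower = ys[:len(ys) - k].mean() - np.mean(y_c)   # drop the largest outcomes
            upper = ys[k:].mean() - np.mean(y_c)             # drop the smallest outcomes
        else:                                            # otherwise trim the control group
            q = (r_c - r_t) / r_c
            ys = np.sort(np.asarray(y_c, dtype=float))
            k = int(np.floor(q * len(ys)))
            lower = np.mean(y_t) - ys[k:].mean()
            upper = np.mean(y_t) - ys[:len(ys) - k].mean()
        return lower, upper

    # Toy example using the response counts from Table 3 (291 of 321 treated,
    # 152 of 176 control re-interviewed) and made-up 0/1 outcomes.
    rng = np.random.default_rng(2)
    lo, hi = lee_bounds(rng.integers(0, 2, 291), rng.integers(0, 2, 152),
                        np.r_[np.ones(291), np.zeros(30)],
                        np.r_[np.ones(152), np.zeros(24)])
    print(round(lo, 3), round(hi, 3))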
Figure 1: Experimental Design

[Flow diagram] 497 JS 2 students (randomly selected), assigned to:
  Control group, no intervention (176 students)
  Treatment group, receive eReaders (321 students), further split into:
    eReaders with digital library (107 students)
    eReaders with digital library + curriculum books (107 students)
    eReaders with digital library + curriculum books + supplementary material (107 students)
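The cell sizes in Figure 1 imply a simple two-stage assignment; the sketch below reproduces it mechanically from the reported counts. It ignores any stratification (for example, by school) the actual randomization may have used and is not the authors' code.

    import numpy as np

    rng = np.random.default_rng(42)
    students = np.arange(497)        # stand-in IDs for the sampled JS 2 students
    rng.shuffle(students)

    assignment = {
        "control": students[:176],                                   # no intervention
        "T1_library": students[176:283],                             # 107 students
        "T2_library_curriculum": students[283:390],                  # 107 students
        "T3_library_curriculum_supplementary": students[390:],       # 107 students
    }
    print({arm: len(ids) for arm, ids in assignment.items()})
    # {'control': 176, 'T1_library': 107, 'T2_library_curriculum': 107,
    #  'T3_library_curriculum_supplementary': 107}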