WPS6439
Policy Research Working Paper 6439

Why Is Voluntary Financial Education So Unpopular? Experimental Evidence from Mexico

Miriam Bruhn
Gabriel Lara Ibarra
David McKenzie

The World Bank
Development Research Group
Finance and Private Sector Development Team
May 2013

Policy Research Working Paper 6439

Abstract

Take-up of voluntary financial education programs is typically extremely low. This paper reports on randomized experiments around a large financial literacy course offered in Mexico City to understand the reasons for low take-up, and to measure the impact of financial education. It documents that the general public displays little interest in such courses and that participation is low even among individuals who express interest in financial education. The paper experimentally investigates barriers to take-up, and finds no impact of relaxing reputational or logistical constraints and no evidence that time inconsistency is the reason for limited participation. Even relatively sizeable monetary incentives get less than 40 percent of interested individuals invited to training to attend. Using a randomized encouragement design, the authors measure the impact of the course on financial knowledge and behavior. Attending training results in a 9 percentage point increase in financial knowledge and a 9 percentage point increase in saving outcomes, but no impact on borrowing behavior. Administrative data indicate that the savings impact is relatively short-lived. The results suggest people are making optimal choices not to attend financial education courses, and point to the limits of using general purpose courses to improve financial behavior for the general population.

This paper is a product of the Finance and Private Sector Development Team, Development Research Group. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://econ.worldbank.org. The authors may be contacted at mbruhn@worldbank.org, glaraibarra@worldbank.org, and dmckenzie@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Produced by the Research Support Team

Why is voluntary financial education so unpopular? Experimental evidence from Mexico#

Miriam Bruhn, World Bank
Gabriel Lara Ibarra, World Bank
David McKenzie, World Bank

Keywords: Financial literacy; financial capability; encouragement design; low take-up.
JEL codes: D14, O12, G28.

# We thank the Russian Trust Fund for Financial Literacy for financing this work; Pablo Antón Díaz, Eder González Ramos and the staff at Innovations for Poverty Action for their assistance in implementing the surveys and impact evaluation; and the staff at our partnering financial institution for their support for this project.
All opinions expressed in this work are those of the authors alone and do not necessarily represent those of the World Bank, IPA, or the partnering financial institution. 1 1. Introduction Access to finance is strongly associated with poverty reduction worldwide (World Bank, 2008). For Mexico, evidence suggests that many poor and lower middle-class individuals are credit- constrained and that relaxing these constraints has high returns. For example, McKenzie and Woodruff (2008) show that small injections of working capital funding have monthly returns of 15 percent or more for microenterprise owners. Bruhn and Love (forthcoming) find that a rapid expansion of bank credit serving low-income individuals led to increases in income and employment. Lack of access to finance has also meant Mexican households have been unable to smooth consumption in the face of income shocks (Attanasio and Székely, 2004, McKenzie, 2006). These factors illustrate the great potential for increases in access to savings and credit to benefit low-income individuals. However, as access to financial services expands around the world, there is also a growing concern that many consumers may not have sufficient information and financial acumen to use these new financial products responsibly. Such concerns are particularly pressing in middle- income countries such as Mexico, which have seen rapid expansions of access to finance in recent years, bringing many low-income individuals into the formal financial system for the first time. The number of credit cards has doubled during the past five years, and in urban areas an increasing number of credit card holders are now poorer and less-educated individuals. According to the 2008 Mexican Income and Expenditure Survey (ENIGH), 42 percent of households making a credit card payment had a head with less than high school education, and one in six had only primary education or less. Lower educated households are more likely to make unsophisticated decisions (Lusardi and Mitchell, 2011). These unsophisticated decisions can lead to lower efficiency and competition in the credit card market and, as this market grows, they could also have implications for the stability of the financial system as a whole. The recent global financial crisis has emphasized the importance of over-indebtedness on systemic risk. Low levels of financial literacy were arguably one important factor leading many homeowners in the U.S. to take out mortgages exceeding their means. 1 During the subprime mortgage crisis, individuals with lower financial literacy were more likely to be delinquent or to default on their mortgage (Gerardi, Goette, and Meier, 2010). 2 In Mexico, nearly 10 percent of all credit card holders fell delinquent on their payments in February 2009, double the U.S. rate.3 Many of these are low-income individuals who are first-time card holders. In addition, the financial crisis highlighted the need for building up sufficient savings to smooth consumption in 1 Although see Willis (2011) for a dissenting opinion. 2 In contrast, individuals with higher financial literacy are more likely to strategically default on underwater mortgages after the crisis (Burke and Mihaly, 2012). 3 Source: “Reigning in the Credit Card Industry�, PRI’s The World, May 11, 2009 [http://www.pri.org/theworld/?q=node/26252] the face of adverse shocks. However, savings rates tend to be low in many countries, including Mexico (4.2 percent). 
4 In response to these concerns, many governments, employers, non-profit organizations and even commercial banks have started to provide financial literacy courses with the aim to improve financial education. However, participation rates for non-compulsory financial education programs are typically extremely low. Willis (2011, p. 430) sums this up as “Voluntary financial education is widely available today, yet seldom used�. For example, Brown and Gartner (2007) examine pilot experimental efforts by three credit card providers in the United States to provide online financial literacy training to delinquent and at-risk credit card holders, as well as to college students who had newly received credit cards. Target Financial Services made calls to 80,982 at-risk cardholders, reaching only 6,417 of them, of which half were invited to use a credit education website: only 684 of these requested a code to log-on, of which only 28 used the code to log on and only 2 people completed the course. U.S. Bank had only 384 cardholders out of the 42,000 it attempted to reach complete its online program (0.9 percent). Wells Fargo offered college students a 60-minute phone card as an incentive to do the training, and had the highest response rate, with 6.7 percent of college student cardholders offered the treatment logging into the website, and 6.5 percent completing the training. Thus despite financial education programs becoming increasingly popular among policy-makers and financial providers, they appear to be deeply unpopular among customers. This raises two interrelated questions which are important for research and policy. The first is whether there are economic or behavioral constraints which prevent more individuals participating in such programs? The second question is whether there are any benefits to these marginal individuals from doing so, or whether they are rationally choosing not to participate in such training? We investigate these questions through the context of randomized experiments conducted in Mexico City. We collaborated with a financial institution to evaluate a free financial literacy course offered on a large-scale. We document that there is relatively little interest in such a program among the general population, and then screen recruited subjects on interest in participating. We then randomly selected half of the interested individuals to be invited to the course. The initial participation rate in the course was low (18 percent) even among the sample of interested individuals. Motivated by different theoretical and logistics reasons why individuals may not attend training, we randomized the treatment group into different subgroups, which received treatments designed to provide evidence on some key barriers to take-up. These treatments included monetary payments for attendance, deferred payments, 4 This number is based on data from the 2008 Mexican Income and Expenditure Survey (ENIGH). It corresponds to the average household savings rate (total income-total expenditures/total income) using sampling weights and dropping those with savings rates below -100 percent and above 100 percent. The average household saving rate using monetary income and expenditures is 11.4 percent. 3 free cost transportation to the training location, and a video CD with positive testimonials about the training. 
We find that the latter two incentives did not increase take-up, but both current and deferred monetary payments boosted attendance rates by about 10 percentage points (from 18 to between 27 and 33 percent, depending on the payment size). There was no significant difference in attendance between those who got the payment immediately and those who received a deferred payment, suggesting that time-inconsistency or high discount rates are not the main constraint to participation. A follow-up survey conducted six months after the course is then used to measure the impacts of the training on financial knowledge, behaviors and outcomes. We find attending training results in a 9 percentage point increase in financial knowledge, and a 9 percentage point increase in saving outcomes, but no impact on credit card behavior, retirement savings, or borrowing. Moreover, administrative data suggests that the savings impact is relatively short- lived. Data on credit card balances and repayment rates show no systematic differences across the treatment and control groups related to the course. In addition to limited overall impacts of training, we find relatively little heterogeneity in impacts. We do not find significantly different effects by gender or whether the individual is a customer of our partner financial institution, but we do find stronger savings impacts for individuals with a bachelor’s degree than those without. Our results therefore suggest that the answer to the question of why voluntary financial literacy programs are so unpopular is that they do not offer that much in the way of benefits to those who are not currently attending them. That is, it appears individuals may be making rational decisions not to attend such financial literacy training programs. We cannot rule out that such programs offer benefits to the individuals who voluntarily choose to go without being given any additional information or incentives, but our results suggest there are limited gains to trying to encourage more people to attend. There is a growing literature attempting to assess the causal impacts of financial literacy trainings, which we review in the next section. Much of the existing literature has looked at training provided in a compulsory manner, such as to students (Bernheim et al, 2001; Cole and Shastry, 2008; Lührmann et al, 2012; Bruhn et al, 2013), or otherwise has focused on specialized content given to specific populations such as migrants and their families (Gibson et al, 2012; Seshan and Yang 2012; Doi et al, 2012), and farmers (Cai, 2012). We contribute to this literature by focusing on a large-scale voluntary financial literacy program offered to the general population in a large urban city – precisely the type of effort that is being increasingly adopted in a number of countries around the world. As well as evaluating the impact of such a 4 program, we are able to experimentally test the importance of different constraints to participation in such programs, which existing literature has not done. The remainder of this paper is structured as follows. Section 2 provides background information on financial literacy in Mexico and reviews the existing literature. Section 3 describes the financial literacy course. Section 4 lays out the experimental design and initial take-up. Section 5 then describes our experiments to overcome barriers to take-up. Section 6 presents estimates of the impact of the course on financial knowledge, behavior, and outcomes. Section 7 concludes with policy recommendations. 
2. Background and Literature Review 2.1 Financial Literacy in Mexico During the last two decades Mexicans have experienced significant changes in access to finance, banking products, and exposure to risk. In 1997, the Mexican Pension System switched from a pay-as-you-go system to one based on individual retirement accounts. Under the new system, all private sector workers have to choose a pension fund administrator to manage their account and invest in portfolios with different levels of risk. A similar pension system was launched for public sector workers in 2007. Banking services and financial product availability has also increased in recent years. While in 2004 around 25 percent of the population had a bank account (World Bank 2008), this number had more than doubled by 2009 (Saldivar, 2009), reaching 58 percent. The number of credit cards has increased from just below 15 million to almost 28 million during the past five years – so that there are now as many credit cards as households in Mexico.5 As access to financial services expands and many consumers become users of formal financial products for the first time, there is a growing concern that many individuals may not have sufficient information and financial acumen to use these financial products responsibly. The National Commission for the Protection of Financial Services Users (CONDUSEF) found that 62 percent of Mexicans lack financial education and are not aware of their rights and responsibilities with respect to financial institutions. 6 In the 2012 Visa International Financial Literacy Barometer, Mexico ranks in the lowest third of a sample of 28 countries on questions 5 The INEGI Population Count (Conteo de Población y Vivienda) reports that there were 28 million households in Mexico in 2010. 6 http://www.cnnexpansion.com/mi-dinero/2010/10/05/condusef-tras-la-educacion-financiera 5 related to having a household budget, or savings set aside for an emergency. 7 Coupled with the evidence of low financial literacy, there is also evidence suggesting that individuals may not be managing their financial products correctly. For example, a recent study by Ponce, Seira, and Zamarripa (2009) analyzing credit card use and payment behavior in Mexico suggests that consumers make financially unsophisticated decisions in the credit card market. Also, Mexican workers tend to choose their pension fund administrator based on features not related to fees or returns (Hastings and Tejeda-Ashton, 2008, and Lara-Ibarra, 2011). The Mexican government has addressed the concern for low levels of financial education in several ways. CONDUSEF has held a Financial Education Week annually since 2008. The Ministry of Finance has also launched campaigns to improve financial literacy and financial education awareness. Furthermore, one of the objectives of the 2008-2012 National Development Financing Program was “to develop financial culture and consumer protection, promoting that people realize that they can save, get financing for their productive products, be responsible with their loans and protect against risks� (Saavedra 2012). The government’s efforts have also been matched by the development bank BANSEFI and by private sector, e.g. commercial bank, initiatives. In 2009, a total of 59 programs throughout the country were promoting financial education among different audiences (Treviño Garza, 2011). 
2.2 Literature Review Financial education programs act under the assumption that lack of knowledge may be preventing individuals from making sound financial decisions. This is in part due to growing evidence showing strong associations between financial literacy measures and financial decisions. For example, Lusardi and Tufano (2008) find that individuals who have low measured levels of financial literacy tend to pay minimum balances on credit cards, incur late fees on cards, and use informal sources of credit. Stango and Zinman (2009) show that people who make mistakes in interest and future value calculations tend to borrow more and save less. Lusardi and Mitchell (2009) illustrate that people with low levels of financial literacy think less about retirement and most of them have not planned for retirement at all. And in Mexico, Hastings and Tejeda-Ashton (2008) conducted a survey that reveals that less literate individuals tend to choose mutual pension funds with higher fees. 7 Overall, the Barometer ranked Mexico in the top 3. However, this result may be biased by “taste�-related questions such as: At what age should the government require financial literacy? How well do you think teens understand finance? How often do you speak to your kids about money? http://www.practicalmoneyskills.com/summit2012/decks/bodnar.pdf 6 There are three main problems with moving from these studies to developing policy recommendations. The first is that the results are not necessarily causal. They show a correlation between proxies for financial literacy and outcomes of interest, but these correlations may simply reflect unobserved characteristics of individuals such as their numeracy, ability, self-discipline, parental background, or other such features. Second, it is often conceptually difficult to understand whether individuals are truly under-savings and over- borrowing, so even if we see increases in savings it is unclear whether this represents a welfare improvement. Thirdly, even if we knew that individuals were over-borrowing and under-saving, and that financial literacy had a causal impact on these outcomes, it is still an open question as to whether and how financial literacy skills can be taught and improved. A growing literature tries to address the first and third of these issues by relying on quasi experimental or experimental variation in the provision of financial education programs. The studies vary widely in terms of the context of the study. In addition, they face the challenge that the concept of financial literacy is often measured and defined differently (see Xu and Zia, 2012, and Lewis and Messy, 2012). Compulsory financial education classes taught in high schools have been subject to a number of studies. Bernheim et al. (2001) use exogenous variation in high school financial education mandates across U.S. states to show that students exposed to financial education classes save more as adults. However, Cole and Shastry (2008) cast doubt on these findings, showing that they are not robust to controlling for state fixed effects and examining effects over time. Shorter-term evidence come from Bruhn et al. (2013), who conducted a randomized experiment providing financial education in Brazilian high schools. They find positive effects on financial knowledge, attitudes and behaviors, and an increase in savings rates, although the impacts are small in absolute magnitude: a 3 percentage points increase in knowledge, and 1 percentage point increase in savings. In Germany, Lührmann et al. 
(2012) find that teenagers given financial literacy training show increased interest in and knowledge of financial matters, and save more in a hypothetical task, but they do not measure actual savings.

Other studies have focused on providing financial education to working adults, recognizing the differences in households' financial needs and exposure across developed and developing countries. The literature in developed countries tends to study the impact of financial education on planning for retirement or investment portfolio choices. Participation in voluntary seminars for retirement savings also tends to be low. Duflo and Saez (2011) show that offering monetary rewards can increase attendance substantially, but that this large increase leads to less than a 1.5 percentage point difference in retirement plan participation.

In the developing country context, impact evaluations of financial literacy training have studied the unbanked, insurance take-up, and migrants. One of the first papers to examine the impact of financial education in a developing country is Cole, Sampson, and Zia (2009). The authors implemented a field experiment in Indonesia where they offered randomly selected unbanked households a financial literacy course geared towards opening a bank savings account. They find that the financial literacy course had no effect on the likelihood of opening a bank savings account in the full sample, but it had modest effects for uneducated and financially illiterate households. Cai (2012) used a randomized experiment to show that farmers in rural China are more likely to take up crop insurance and become less price-sensitive after attending financial education sessions. Gibson, McKenzie, and Zia (2012), Doi, McKenzie, and Zia (2012), and Seshan and Yang (2012) analyze how providing information and financial education affects the behavior of migrants and their households. Gibson et al. (2012) work with migrants in New Zealand and Australia, and find that financial education increased knowledge about remittance transaction costs, but did not lead to changes in the amount of remittances sent or in the use of the cheapest remittance method. Using a sample of Indonesian migrants, Doi et al. (2012) find that impacts on financial knowledge, behavior, and savings are largest when both the migrants and their families receive financial education. The results show that financial education can have large effects when provided at a teachable moment, but that this impact varies with who receives training. GAO (2004) has also recognized the importance of providing education during "teachable moments." Seshan and Yang (2012) find that Indian migrants in Qatar increase savings after financial literacy training, but only if they had low financial literacy to begin with.

A common assumption in most financial literacy interventions is that people are not saving enough. Indeed, George Akerlof (2001) goes so far as to state in his Nobel address that "it is common wisdom that people save too little". He notes that in standard economic models, saving is the outcome of utility maximization decisions, so saving too little or too much is not possible. However, behavioral economics offers several reasons for undersaving: present consumption is more salient than future consumption, so individuals may procrastinate about saving and have time-inconsistent preferences. He notes that the best evidence of undersaving is the observation that upon retirement, consumption falls substantially.
However, the view that people do not save enough for retirement has been challenged by Aguiar and Hurst (2005), who show that although expenditure falls dramatically upon retirement, consumption does not as people increase home production. Nevertheless, evidence from lab experiments (e.g. Brown et al., 2009) supports two views of undersaving: that people don’t know how to save optimally in complex environments; and that even when they do know how to save, they can’t resist short-term temptations to spend. Financial education in principle can reduce under-saving by 8 helping with these two problems: by giving individuals more information and knowledge about savings strategies, and by providing tools and ideas to help resist temptations to spend. In a recent review, Gale and Levine (2010) find that none of the traditional approaches 8 have generated unambiguous evidence of positive impacts of financial literacy efforts. The literature is now moving towards exploring whether innovative channels for providing financial education may affect behavior. Ongoing studies in India, Peru, South Africa and in the U.S. (among others) are testing whether the provision of information via videos, radio, mass media or video games are effective in improving individuals’ financial decisions. This study builds on and complements this existing literature in several ways. First, we work with a population that is already banked. Thus, we do not study whether financial education makes individuals more likely to use formal financial products, such as savings accounts or credit cards, but rather on whether they use these products to their advantage and responsibly. Second, the financial education course we analyze covers topics that are relevant for a range of users of financial services and is delivered at scale by a financial institution, rather than being a pilot program or government program. Third, we examine interest for such training in a general population, and then focus on individuals who state an interest in attending a financial literacy course. We show that even among this population training attendance rates are low and we examine the barriers to participation experimentally through the use of different incentive treatments. Finally, we obtain administrative data on savings and credit card usage from a financial institution to validate the conclusions drawn from survey data. 3. The Financial Literacy Training Product The financial literacy training course studied in this paper is a large-scale program offered to the general public. The goal of the program is to convey basic knowledge and tools that individuals may need to manage their personal finances responsibly. It is targeted at adults and is offered free of charge. The program was first launched a few years ago and has trained over 300,000 individuals. In 2011, the course won a regional award for innovativeness in fostering financial education. During the time of our study, the course was being taught at several locations in Mexico City, in a number of other cities in Mexico, as well as through an online platform. Our study focuses on 8 The authors mention four approaches: employer-provided financial education, state-mandated financial and consumer education for students, credit and mortgage counseling, and community-based programs providing general financial education. 9 the training locations in Mexico City. Courses are offered on a continuous basis, with one or two sessions per day Monday through Saturday. 
Each session has capacity for twenty participants, although typical attendance at the most central location is only four or five people per day on weekdays and more on weekends; and attendance is even lower at the other locations, with participation rates of about one person per day during our pre-treatment monitoring. Training is administered via individual computer terminals, with an instructor present to show videos and facilitate interactive exercises that are used to strengthen the concepts taught in the material. Participants also receive workbooks that contain the information being presented, as well as exercises to be completed during the course. At the end of the session, participants take a short test and receive a certificate conditional on completing the test successfully. They also receive a CD to take home. This CD includes the tools used in the exercises performed during the course. The course lasts half a day and consists of modules on saving, retirement, credit cards, and responsible use of credit. The course explains why savings is important and discusses different savings instruments and steps individuals can take to increase the amount they save, such as setting savings goals and keeping a household budget. Saving for retirement and pension funds are also covered. The course then discusses the use of credit cards, associated fees, and how to decipher a credit card statement. Finally, information is provided on good credit card debt management practices, an individual’s credit score and credit history, and steps individuals can take to preserve and improve their credit management. 4. Experimental Design and the Low Demand for Financial Literacy Training To estimate the impact of the training on financial knowledge, behavior, and outcomes, we conducted a randomized experiment. This section describes the design and implementation of the initial experiment, including sample selection, randomization, and take-up. The next section then discusses our approaches to experimentally examining the barriers to take-up, while follow-up data collection is discussed in Section 6. Appendix A provides a timeline to summarize when each project phase took place. Many programs in finance and private sector development face low take-up (McKenzie, 2010), and prior voluntary financial literacy programs have also experienced low take-up issues (Brown and Gartner, 2007). While the 300,000 individuals taking part in the financial literacy training program studied here is a substantial number, this is country-wide and over four years. Assuming at most 100,000 were trained in Mexico City in any one year, this represents less than 10 0.6 percent of Mexico City’s adult population obtaining the training annually. 9 Low take-up has severe consequences for statistical power. We therefore tried several approaches to screen individuals for their interest in participating in financial literacy training and thereby measure the impacts of such training on these individuals. This process gives insights into the demand for financial literacy training, and provides training impacts for a policy relevant group: since it is difficult to make adult financial education mandatory, a key policy question is whether policy efforts should try and encourage more people who are interested in attending such programs to actually attend them. 
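As noted above, low take-up has severe consequences for statistical power: with one-sided non-compliance, the intention-to-treat effect equals the effect on attendees diluted by the take-up rate, so the sample size needed to detect a given effect on attendees grows with the inverse square of take-up. The sketch below illustrates this with a standard two-arm power calculation; the 9 percentage point effect size corresponds to our estimated impact on attendees, while the outcome variance, significance level, and power are illustrative assumptions rather than figures from our design.

```python
# Illustrative sketch (not part of the paper's analysis): how low take-up inflates
# the sample size needed to detect a given effect on attendees in an ITT design.
# Assumes one-sided non-compliance (a fraction `takeup` of the treatment group
# attends, nobody in the control group does) and a binary outcome with variance sigma2.
from scipy.stats import norm

def n_per_arm(effect_on_attendees, takeup, sigma2=0.25, alpha=0.05, power=0.80):
    """Sample size per arm to detect the diluted ITT effect takeup * effect."""
    itt_effect = takeup * effect_on_attendees       # ITT effect is diluted by take-up
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # 1.96 + 0.84 for 5% size, 80% power
    return 2 * sigma2 * (z / itt_effect) ** 2

for takeup in (1.0, 0.6, 0.18):                     # full compliance vs. observed rates
    print(f"take-up {takeup:.0%}: ~{n_per_arm(0.09, takeup):,.0f} per arm")
# Required N scales with 1/takeup**2: an 18 percent take-up rate requires roughly
# 30 times the sample needed under full compliance for the same effect on attendees.
```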
4.1 Approach One: Low Demand for Financial Literacy Demonstrated through a Mailing Campaign Our first approach to obtaining a sample of people interested in financial literacy training was to send a screener survey to clients of our partner financial institution. The institution agreed to partner with us and to provide us with a de-identified list of all their clients in Mexico City who have a savings account and a credit card. By conditioning on these characteristics we intended to identify individuals for whom debt management and saving advice is likely to be relevant and who have credit card and savings behavior that we could study. We further narrowed down the list of clients to those who lived in a municipality with a financial literacy training location or less than 5 km away. From this sample, we randomly selected 40,000 clients to receive a mailing with the screener survey. The randomization was stratified by gender and age. The mailings were sent by our partner institution through their usual provider between January 7 and 12, 2011. The delivery company confirmed delivery of 98.8 percent of the letters. Non- deliveries were due to clients having moved or the address not being found. The mailing contained a letter informing clients we are partnering with their financial institution to help research ways people are managing their savings and credit card debt, and that we would like to see if they are interested in participating in a financial education training session. The letter only mentioned the training in general terms and did not refer to the specific program we are studying or the locations where this training was being offered. The mailing also contained a two-page screener survey that clients could mail back to us in a pre-paid envelope to indicate their interest in the training. This short screener survey collected information on name, address, phone number, sex, age, education level, occupation, household income and expenditure (in bins), as well as basic usage of savings accounts and credit cards. Clients also had the option of responding to the survey by going to a website or by calling a toll- free number. In order to increase response to the screener survey, half of the letters (20,000) 9 CONAPO projects that in 2012 there were 17,372,952 individuals 17 years old or older living in the Metropolitan Zone of Mexico City. http://www.conapo.gob.mx/es/CONAPO/De_las_Entidades_Federativas_2010-2050 11 were randomly selected to include an offer for a monetary payment of 75 Pesos (about US$5) to the first 200 clients who submitted their answer. The total number of letters sent (40,000) was chosen based on information from our partner financial institution that typically only 2-3 percent of their clients reply to any sort of mail offer sent by them. This expected response rate would yield 800-1,200 clients who would form the sample for our randomized experiment. However, we received much fewer responses than anticipated – only 42 responses. That is, only 0.1 percent of clients expressed interest in a financial literacy training program. We suspect that this low response rate is in part due to the climate of insecurity that has prevailed in Mexico City during the past few years and that has made people distrustful of unsolicited requests for data. Around the time of our study, fake phone and mail extortions had become a threat to the general population. 
In fact, both the federal government and commercial banks launched a campaign asking people to avoid giving personal information to strangers, especially through phone or mail communications offering money or prizes in return for personal data. Our letters were sent out by our partner financial institution, bearing their logo, but people may still have been suspicious of the request for data. 10 4.2 Approach Two: Low Demand for Financial Literacy Demonstrated through an Online Campaign Our second strategy for obtaining a study sample was to conduct a screener survey through Facebook. We created a Facebook page for financial literacy and launched a Facebook ad that pointed to this page. The Facebook page included the same information as the letters sent in the mail, mentioning the importance of financial literacy. The page invited people to indicate their interest in participating in a (generic) financial literacy course by clicking on a link that redirected them to a page where they could answer our screener survey online. This survey contained the same questions as one mailed to our partner financial institution’s clients. We did not offer a financial incentive for completing the online survey. The Facebook ad was targeted to individuals residing in Mexico City and ran for two months, from mid-February to mid-April, 2011. 11 It was displayed about 16 million times. We obtained a total of 1,240 fans of our Facebook page and 119 responses to the online survey. Since this 10 All letters included a toll free number where people could make enquiries. We did not receive any phone calls enquiring about training in response to this mailer, suggesting a low demand for such training. 11 At the time, Facebook had 7,743,220 registered users who reside in Mexico City (approximately 87 percent of the population). 12 sample was still not large enough for our study, we implemented a third approach to screening people for an interest in a financial literacy course, as described below. 4.3 Approach Three: Street and Branch Surveys We conducted screener surveys on the streets of Mexico City and outside branches of our partner institution. Surveyors were placed in busy locations within the city during a period of eight weeks (from April 25 to June 25, 2011) where they tried to interview people passing by. We also placed surveyors outside branches of our partner institution between July 6 and 19, 2011, where they approached exiting customers and people waiting in line. For this approach, interviewers asked people if they would be interested in participating in a financial literacy course, providing the same information as stated in letters sent during the mailing campaign. As in the letters, the name of the course was not disclosed. If the respondents expressed interest in the course 12, interviewers asked them to fill out the screener survey, so that we could contact them later with further information about it. The questionnaire included the same set of question as the mail and online surveys. We did not offer a monetary incentive for completing the surveys, but people who answered the survey were offered cookies and a pen as a small thank you token. We obtained a total of 6,945 completed questionnaires from the street survey and 2,294 from the branch survey. 
4.4 Treatment Randomization and Balance For all individuals who had expressed interest in a financial literacy course either through the mail, online, street or branch screener survey, we conducted phone audits to verify the contact information they had provided. This eliminated about half of the respondents. We further dropped respondents who lived outside the Mexico City metropolitan area or who had participated in a financial literacy program in the past. We also dropped observations that had missing answers to the questions we stratified on in the randomization, as described below. Our final sample includes eight respondents from the mail survey, five from the online survey, 2,490 from the street survey and 1,000 from the branch survey, giving a total sample of 3,503 people. We divided this sample into a control group of 1,752 individuals, and a treatment group of 1,751 individuals, using stratified randomization. The randomization was conducted by the authors by computer. 12 Subject recruiters reported that, roughly, only 2 out of every 5 people approached expressed an interest in the course and agreed to fill out the survey. 13 We stratified the randomization by whether we obtained the respondent through the branch vs. the mail, online or street survey, by gender, by having at least a bachelor’s degree or not, and by whether the person was (i) a client of our partner financial institution, (ii) a client of another financial institution, (iii) neither. Within the sample of financial institution clients, we further stratified by whether they made a deposit into their savings account during the past month and by whether they have a credit card. For clients with a credit card, we stratified by whether they made more than the minimum payment each month during the past six months. For individuals who were not financial institution clients, we stratified by whether they lived closer than 8 km away from a training location or not. Our original intention with the mail screener survey was to study only individuals who are financial institution clients to learn whether the financial literacy course can improve their financial behavior and outcomes. Through the online, street, and branch surveys, we obtained 1,325 respondents who were not financial institution clients. We decided to keep these in the sample since the course material could in principle also be relevant for them and they may start financial institution relationships as a result of taking the course. However, the percentage of non-financial institution clients in the treatment group who ended up attending the financial literacy training was very low (18.1 percent), making it difficult to detect any effects on this sample. Since the take-up rates were higher among financial institution clients (28.1 percent), as discussed further below, we decided to conduct our follow-up survey only among financial institution clients and we drop non-financial institution clients from the impact analysis, although they are included in our experiments on inducing attendance. Table 1 shows baseline variables collected through the screener survey for the sample of financial institution clients. About half of the individuals were clients of our partner financial institution as opposed to clients of another institution. Close to 65 percent made a savings deposit during the past month. About 40 percent had a credit card at baseline and only half of them made more than the minimum payment in all previous months. 
In fact, about 20 percent made a late payment on their credit card during the past six months. The demographic variables show that about half of the sample is female and 40 percent has at least a bachelor's degree. The average age is 33 years. As expected given the random assignment, all baseline variables are balanced across the treatment and control groups.

4.5 Initial Take-Up: Low Attendance for Training Even among Those Who Say They Are Interested in Attending a Financial Literacy Course

Starting on August 1, 2011, each person in the treatment group was contacted by telephone and invited to participate in the financial literacy training program. If the participant confirmed interest, they were offered the opportunity to choose the training location, date, and schedule that best suited them. The phone operator then enrolled the participant based on this information and confirmed the details of the appointment before ending the call. A second call was made the day before the participant's appointment as a reminder to increase the probability of attendance, and a third and final call was made the day after the appointment to confirm attendance and inquire about their level of satisfaction with the course. If during this third call participants responded that they had missed the appointment, they were offered the opportunity to reschedule for a future date, provided they said they were still interested in attending.

During this phase we signed up 1,049 out of 1,751 individuals (59.9 percent) for the course. About a third of the people who signed up for the course actually attended (312 people). Participants gave a range of reasons for not attending the session they had signed up for, including difficulties attending due to work and family commitments, sickness, and, in some cases, issues with instructors turning up late or with participants arriving late and being turned away. The overall attendance rate for the 1,751 treatment group individuals who had been screened for interest in attending a financial literacy course was thus only 17.8 percent. This number is low compared to attendance rates for business training courses: a study in Mexico found a 65 percent attendance rate for business training, and attendance rates for business training interventions in other countries range from 39 to 92 percent (McKenzie and Woodruff, 2012).

5. Overcoming Barriers to Attendance through Experimental Interventions

We first examine theoretical reasons why individuals may choose not to participate in financial literacy training, and then describe the experiments designed to test these barriers.

5.1 Why People Might Not Want to Participate in Financial Literacy Training

Let c be the cost of attending the financial literacy program. While the program itself is free, individuals would incur transportation costs in getting to and from the program, as well as the opportunity cost of lost income (or lost leisure time). Let b_t be the benefits the individual will realize in period t from participating in the course, such as better financial outcomes, and Eb_t the certainty-equivalent expectation of these benefits. Theory then predicts that an individual will choose to attend a financial literacy course if the expected discounted benefits of the course exceed the costs of attending, i.e.,
∑_{t=0}^{T} β δ^t Eb_t > c        (1)

Revealed preference would suggest that anyone whom it would benefit to take the financial literacy course would have done so, while anyone who chooses not to take the course is doing so because they do not view the benefits as exceeding the costs. This theory then suggests several barriers that may prevent individuals from participating in the financial literacy course, even if it has positive benefits for them (∑_{t=0}^{T} b_t > 0). A first reason is simply that they face costs of attending, so that c is large in magnitude. A second set of reasons concerns the timing of when costs are incurred relative to when benefits are received. In particular, individuals may not participate because the costs are experienced immediately, while the benefits may take time to accrue. Individuals with high discount rates (low δ) may find the discounted value of benefits is less than the current costs. Individuals who are present-biased (β < 1) can have time-inconsistent preferences, and so while they would like to have attended financial literacy training in the future, because the benefits occur in the future and attendance occurs today, they keep putting the course off. Thirdly, individuals may not know the benefits of participating, potentially undervaluing them. Even if their expected benefits are accurate, with risk aversion, uncertainty as to these benefits will still cause Eb_t < b_t. Finally, one could also imagine that liquidity constraints prevent individuals from paying the costs today, even if they see positive net expected benefits. This final explanation seems less relevant in our case, where many individuals have credit cards and most have savings.

5.2 Experimental Interventions to Overcome Barriers to Attendance

We designed a second stage of interventions to explore whether high uncertainty about benefits, transport costs, high opportunity costs, impatience, or low appeal of the course could explain individuals' low attendance, and thereby to learn what policy efforts could spur attendance. The treatment group was divided further into six different groups – one group that received no further assistance, and the following five booster treatment groups:

1. Offered 1,000 pesos (US$72) for completing the training: participants were given a Walmart gift card of 1,000 pesos if they attended,13
2. Offered a 500 pesos (US$36) gift card for completing the training,
3. Offered a 500 pesos (US$36) gift card that they would receive one month after completing the training,
4. Offered a free taxi ride to and from the course location,
5. Provided a video CD containing positive testimonials from people who had attended the course.

13 For comparison, at follow-up, median monthly income in our sample was about 9,000 pesos (US$650).

Treatments 1 and 2 enable us to examine whether individuals are more likely to attend the course as the benefits of attending increase. This helps get at the issue of whether individuals are making rational decisions by responding to changes in the net benefits, as well as allowing us to measure how the demand for training varies with these benefits. The comparison of treatments 2 and 3 enables us to see whether high discount rates or present bias is a reason for a lack of attendance – people might think the course has benefits, but because these benefits occur in the future and attendance occurs today, keep putting the course off. If this were the issue, we would expect a much greater response to Treatment 2 than to Treatment 3. Treatment 4 lowers the costs of attending the training, which enables examination of whether transportation costs, one important component of c, are the binding constraint. Treatment 5 aims to reduce informational constraints that may prevent people from attending the course if they are not sure whether it will be helpful, thereby attempting to reduce the difference between Eb_t and b_t.
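To make the framework concrete, the sketch below (purely illustrative, with hypothetical parameter values) evaluates the attendance rule in equation (1) under quasi-hyperbolic (β-δ) discounting. It shows the comparative static that the comparison of treatments 2 and 3 is designed to test: a present-biased individual responds more to a payment received at the completion of training than to the same payment deferred by one month.

```python
# Illustrative sketch only: the attendance rule of equation (1) under
# beta-delta discounting, with hypothetical benefit, cost, and discount values.
def attends(benefits, cost, beta=1.0, delta=0.95, incentive=0.0, incentive_delay=0):
    """True if discounted expected benefits plus any incentive exceed the cost c.

    benefits: expected per-period benefits E[b_t] for t = 1, 2, ...
    incentive_delay: 0 = paid at completion of training, 1 = paid one period later.
    """
    pv_benefits = sum(beta * delta ** t * b for t, b in enumerate(benefits, start=1))
    pv_incentive = incentive if incentive_delay == 0 else beta * delta ** incentive_delay * incentive
    return pv_benefits + pv_incentive > cost

future_benefits = [100] * 12   # hypothetical monthly benefit stream from the course
cost_today = 1000              # hypothetical cost of attending (time, transport)

# A present-biased individual (beta < 1) skips the course despite positive total benefits,
print(attends(future_benefits, cost_today, beta=0.6))                                        # False
# attends when offered an immediate payment,
print(attends(future_benefits, cost_today, beta=0.6, incentive=500))                         # True
# but not when the same payment is deferred by one period.
print(attends(future_benefits, cost_today, beta=0.6, incentive=500, incentive_delay=1))      # False
# With beta = 1 (no present bias), immediate and deferred payments have very similar effects,
print(attends(future_benefits, cost_today, beta=1.0, incentive=500),
      attends(future_benefits, cost_today, beta=1.0, incentive=500, incentive_delay=1))      # True True
# which is the pattern implied by an equal response to treatments 2 and 3.
```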
The incentive treatments were assigned through stratified randomization using the entire treatment group. We stratified by whether the respondent was screened through the branch vs. the mail, online, or street survey, by income group, by whether the person was (i) a client of our partner financial institution, (ii) a client of another financial institution, or (iii) neither, and by attendance status thus far. The attendance status categories were (i) attended, (ii) was scheduled but did not attend, (iii) was reached but did not want to be scheduled, and (iv) could not be reached. Individuals who had already attended the course did not receive the incentives, but they were included in the randomization to allow us to report post-incentive treatment take-up rates by incentive type for the complete treatment group.

5.3 Impacts of Experiments to Increase Attendance

Figure 1 shows the overall percentage attending in each incentive treatment group (after the initial treatment offer and the booster treatments). Some participants were unreachable by phone, so we show results as a percentage of all participants assigned to a treatment, and as a percentage of all who could be reached to actually offer them the treatment. Table 2 provides regression estimates of the impacts of the various treatment arms on attendance.

As a result of the booster interventions, we succeeded in getting an additional 114 individuals to attend the course, giving a total of 426 attendees out of 1,751 treatment group individuals (24.3 percent). Offering a monetary incentive of US$36 increased the take-up rate from about 18 percent to 27 percent of those assigned to treatment, while the US$72 incentive increased take-up further to 33 percent. While the difference between the two monetary amounts is not statistically significant, both suggest that individuals are rationally responding to higher benefits of training by being more likely to attend. The treatment impact is exactly the same whether the US$36 is offered immediately at the completion of training or one month after training. This suggests that high discount rates or present bias are not the main barriers to participation in training.

In contrast to the monetary incentives, we find that transportation assistance and the testimonials did not significantly increase attendance. One reason for the lack of impact of transportation assistance was security concerns in Mexico City, with people reluctant to take a taxi cab that came to their home. The lack of impact of the testimonials could reflect people updating their beliefs about the efficacy of the training very little after receiving this treatment, or that a lack of information about the benefits and quality of the training was not the main reason for non-participation. Finally, note that even when participants were offered US$72 to attend, the majority of individuals who had initially expressed interest in attending financial literacy training still did not attend.14

5.4 Which Individuals Attend Training?
If we break down the take-up rate by individuals who are clients of a financial institution and those who are not, we find that the attendance rate was 28.1 percent among clients vs. 18.1 percent among non-clients. Since we do not expect to be able to detect treatment effects with a take-up rate of 18.1 percent and since our randomization was stratified by being a financial institution client, we decided to collect follow-up data only on the 2,178 individuals who are clients of a financial institution. From this point on, all tables and analysis in this paper cover only these individuals. Using data from the screener survey, we analyze which individual background characteristics are correlated with training attendance in the treatment group. Table 3 displays this analysis, using two different measures of take-up (i) a dummy variable for whether the individual 14 This dissociation of intentions and follow through has been documented in studies of retirement seminars (Madrian and Shea, 2001). 18 attended the course prior to our booster incentive interventions and (ii) a dummy variable for whether the individual attended either before or after we offered the incentives.15 The bottom row of table 3 shows that the take-up rate increased from 20.8 percent to 28.5 percent after the incentive intervention (in our sample of financial institution clients). Two characteristics are strongly and positively correlated with attendance: education and age. Individuals who have a bachelor’s degree or higher are up to 14 percentage points more likely to have attended the training. The take-up rate among this group was 38 percent after the incentive intervention. Older individuals are also more likely to have participated in the training. In addition, we find weak evidence that females and individuals who are clients of our partner financial institutions as opposed to another institution are slightly more likely to have attended the training. 6. Impacts of Financial Literacy Training We first discuss our data and estimation methodology, and then turn to impacts of the financial literacy training on financial knowledge, behaviors and outcomes. 6.1 Follow-Up Survey We conducted a follow-up survey between February and July 2012 to measure post-training financial knowledge, behavior and outcomes. We kept the questionnaire relatively short (about 15 minutes) to encourage participation. The questions focused on concepts and behaviors taught in the course. We discuss specific questions and outcome variables below. For logistical reasons, we first attempted to conduct the follow-up survey over the phone. If the person did not respond to the survey during the first attempt, we offered them a 500 pesos (US$36) Walmart gift card for completing the survey during the second attempt. If we were still not able to interview the person over the phone, a surveyor visited their house to conduct a face-to-face interview. If the participant was not at home, the surveyor delivered a letter with information about our study and instructions for how to contact us to participate in the survey and to receive the Walmart gift card. Surveyors made two more attempts (three attempts in total) to conduct a face-to-face interview if a respondent was not at home. We were able to interview 72.8 percent of our sample during the follow-up survey. 
The attrition rate was slightly higher in the treatment group (29 percent) than in the control group (25.3 percent). The last two columns of table 1 show baseline characteristics for the sample of individuals interviewed at follow-up. The characteristics are very similar to the full sample and we do not find any statistically significant differences between control and treatment group means in the follow-up sample. This shows that the difference in attrition rates between the two groups is not leading to imbalance on observable characteristics.

15 We do not include income among the variables displayed in Table 3 since this variable has many missing values and including it would thus reduce the sample size. When we add income to the analysis, it is not correlated with take-up and the coefficients on the other variables remain similar to the ones shown in Table 3.

6.2 Administrative Data

We obtained administrative data on savings account balances and credit card outcomes from our partner institution. For confidentiality reasons, our partner cannot disclose individual level data, but they offered to generate summary statistics at the treatment and control group level. This would have been straightforward for the sample we screened through the mail survey, since the list of individuals in that sample came from our partner financial institution, meaning that they could easily be found in its records. The low response rate to this survey implied, however, that our sample now almost entirely consists of individuals who were screened through the street or branch surveys. About half of the current sample (1,034 individuals) reported being clients of our partner financial institution, and 470 of these individuals could be found in the institution's records based on name, address, and phone number.16 The last two columns of table 1 show our baseline statistics for this sample of 470 clients. Overall, this group is similar to the full sample. Also, the characteristics of the treatment and control group clients who were matched with administrative data were not statistically different at baseline.

16 For ethical reasons and given the security concerns in Mexico City, we did not ask for date of birth or national ID numbers in the survey.

6.3 Estimating Treatment Impacts

We estimate the impact of the financial literacy training with the following intention-to-treat (ITT) equation:

y_{i,s,m} = α + β TrainingInvite_{i,s,m} + ∑_s γ_s d_s + ∑_m δ_m d_m + ε_{i,s,m}        (2)

where y_{i,s,m} is a follow-up survey measure of the financial knowledge, behavior, or outcome of individual i, in randomization stratum s, who was surveyed in month m. The variable TrainingInvite_{i,s,m} indicates whether an individual was invited to the course or not and is thus equal to one for the treatment group and zero for the control group. We control for randomization strata dummies d_s, as well as month of follow-up interview dummies d_m. The main coefficient of interest is β, which represents the treatment effect of being invited to a financial literacy course. The coefficient β in equation (2) is also equal to the difference in means of the outcome variable y_{i,s,m} across the treatment and control group, conditional on strata and interview month dummies.

In addition, we estimate the following local average treatment effect (LATE) regression:

y_{i,s,m} = α + β AttendedTraining_{i,s,m} + ∑_s γ_s d_s + ∑_m δ_m d_m + ε_{i,s,m}        (3)

where AttendedTraining_{i,s,m} is equal to one for treatment group individuals who attended a course and zero otherwise. We instrument this variable with our indicator for whether an individual was invited to the training or not (TrainingInvite_{i,s,m}). The coefficient β here represents the local average treatment effect: that is, the impact of the financial literacy course on the individuals who attended a course as a result of being invited, but who would not have attended otherwise.
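As a concrete sketch of how equations (2) and (3) can be taken to data, the code below estimates the ITT effect by ordinary least squares and the LATE by two-stage least squares, instrumenting attendance with the randomized invitation. The data frame and column names (outcome, invited, attended, strata, interview_month) are hypothetical placeholders rather than the actual variable names in our data.

```python
# Minimal sketch of the ITT (equation 2) and LATE (equation 3) regressions,
# assuming a hypothetical data frame with one row per follow-up respondent.
import pandas as pd
import statsmodels.formula.api as smf
from linearmodels.iv import IV2SLS

def itt_and_late(df: pd.DataFrame, outcome: str):
    # Equation (2): regress the outcome on the invitation dummy,
    # controlling for randomization strata and interview-month dummies.
    itt = smf.ols(f"{outcome} ~ invited + C(strata) + C(interview_month)",
                  data=df).fit(cov_type="HC1")

    # Equation (3): instrument actual attendance with the random invitation.
    exog = pd.get_dummies(df[["strata", "interview_month"]].astype(str),
                          drop_first=True, dtype=float)
    exog.insert(0, "const", 1.0)
    late = IV2SLS(
        dependent=df[outcome],
        exog=exog,                  # constant, strata and month dummies
        endog=df["attended"],       # attendance, endogenous
        instruments=df["invited"],  # invitation, randomly assigned
    ).fit(cov_type="robust")

    return itt.params["invited"], late.params["attended"]
```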
We instrument this variable with our indicator variable for whether an individual was invited to the training or not (TrainingInvite_{i,s,m}). The coefficient β here represents the local average treatment effect: that is, the impact of the financial literacy course on the individuals who attended a course as a result of being invited to the course but who would not have otherwise attended.

6.4 Impact on Financial Knowledge

We measure financial knowledge through eight follow-up survey questions based on the material that was taught in the course. For each question, we create a dummy variable indicating whether the respondent gave the correct answer to the question. These eight questions are then aggregated into a financial knowledge index that is the average of the eight dummy variables and thus represents the fraction of questions that the respondent answered correctly. Finally, we asked individuals to rate their own level of financial education on a scale from one to five (1 = excellent, 2 = good, 3 = satisfactory, 4 = unsatisfactory, 5 = no knowledge of the topic). Based on the responses to this question, we coded a dummy variable indicating whether the self-assessed level of financial literacy is satisfactory or higher.

Table 4 lists the eight dummy variables used to measure financial literacy, as well as the knowledge index and self-assessed knowledge level. The table shows the control group mean for each variable and the ITT and LATE estimates that correspond to the coefficients β in equations (2) and (3). The results in Table 4 show that the course had a positive and statistically significant impact on financial knowledge. The knowledge index increased from 0.31 in the control group to 0.34 in the treatment group due to the training, meaning that control group individuals answer 31 percent of the eight questions correctly, while treatment group individuals answer 34 percent correctly. The size of this impact is relatively small, but the effect is larger for some individual questions. For example, only 13 percent of control group individuals know that bank deposits are insured up to 400,000 UDIs (an inflation-adjusted currency unit), but the training increased this number to 20 percent in the treatment group. Similarly, 24 percent of the control group knows what the CAT (total annual cost of credit) is, and this number increased to 28 percent in the treatment group due to the training. The training also increased the self-assessed level of financial literacy, with 63 percent of the treatment group saying their knowledge is satisfactory or higher, compared to only 58 percent in the control group.

When interpreting the magnitude of the effects, it is important to keep in mind that only about 30 percent of the treatment group attended the financial literacy course. The LATE estimates in the last column of Table 4 take this relatively low take-up rate into account. They show much larger impacts on individuals who actually went to the training as a result of being invited through the intervention. The knowledge index increased by 8.7 percentage points due to the course for these individuals, and the probability of answering some of the specific knowledge questions correctly increased by up to 20 percentage points.
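To make the estimation of equations (2) and (3) concrete, the sketch below shows one way the ITT and LATE coefficients reported in Table 4 could be computed. It is only an illustration under assumed data: the file name and the column names (knowledge_index, invited, attended, strata, month) are hypothetical placeholders, not the variable names used for the paper.

    # Illustrative sketch of estimating equation (2) (ITT) and equation (3) (LATE).
    # All file and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf
    from linearmodels.iv import IV2SLS

    df = pd.read_csv("followup_survey.csv")  # hypothetical follow-up survey extract

    # Equation (2): OLS of the outcome on the invitation dummy,
    # controlling for randomization strata and interview-month dummies.
    itt = smf.ols("knowledge_index ~ invited + C(strata) + C(month)",
                  data=df).fit(cov_type="HC1")
    print("ITT:", itt.params["invited"], itt.bse["invited"])

    # Equation (3): attendance instrumented by the invitation dummy (2SLS).
    exog = pd.get_dummies(df[["strata", "month"]].astype(str), drop_first=True)
    exog = exog.astype(float).assign(const=1.0)
    late = IV2SLS(dependent=df["knowledge_index"], exog=exog,
                  endog=df["attended"],
                  instruments=df["invited"]).fit(cov_type="robust")
    print("LATE:", late.params["attended"], late.std_errors["attended"])

As a rough check, the LATE is approximately the ITT effect divided by the first-stage difference in attendance between invited and non-invited individuals; the ratio of the reported estimates for the knowledge index (0.031/0.087) implies a first-stage difference of roughly 0.35 in the estimation sample.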
6.5 Impact on Savings Behavior and Outcomes

The financial literacy course emphasized specific behaviors that may help individuals save more money, including: checking financial institution transactions regularly and keeping track of expenses, making a budget and setting a savings goal, and identifying necessary and unnecessary expenses in order to reduce overall expenditures. The follow-up survey asked whether individuals engage in these behaviors, and we code five dummy variables indicating that they do, as listed in Panel A of Table 5. We aggregate these behaviors into a savings behavior index by taking the average of the five dummies, giving the fraction of the five behaviors that individuals engage in.

The control group means in Panel A of Table 5 show that many individuals check their financial institution transactions regularly (69 percent), keep track of expenses (79 percent), and make a budget (77 percent), even without having taken the financial literacy course. We do not detect a significant treatment effect of the training on these behaviors, perhaps because they were already quite common in the control group. The remaining two savings behaviors we study are not as frequently used in the control group: 57 percent of individuals have a savings goal and 59 percent have cut expenses in the past three months. We find that the course increased the percentage of individuals who cut expenses during the past three months to 63 percent, but this effect is only statistically significant at the ten percent level.

In addition to studying savings behavior, we also examine whether the training had an impact on savings outcomes. Panel B of Table 5 lists three measures of personal savings: (i) a dummy variable indicating whether the individual has any type of savings/money set aside, (ii) a dummy variable for whether the individual reports saving at least some fraction of their income during the past six months, and (iii) a dummy variable indicating whether the respondent said they save more each month than they did a year ago. We construct a savings outcomes index by taking the average of these three dummy variables. All three savings outcomes are higher in the treatment group than in the control group at follow-up, although the differences are not statistically significant for the individual variables. We do, however, find a positive and significant impact of the course on the savings outcomes index. Control group individuals reach on average 65 percent of the three savings outcomes, increasing to 68 percent in the treatment group.

Recall that we provided monetary incentives of 500 or 1,000 pesos ($36 or $72) to some randomly chosen treatment group individuals in order to encourage them to attend the financial literacy course. These amounts are equivalent to 5.5 and 11 percent of median monthly income in our sample. To check whether the positive effect on savings outcomes is driven by the monetary payments instead of the course itself, we add a dummy variable to equations (2) and (3), indicating whether the individual received an incentive payment for course participation. Panel C of Table 5 replicates the results in Panel B, controlling for the monetary incentive dummy variable. The estimated impact of the course on savings outcomes is slightly smaller, but remains statistically significant.
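As an illustration of how the Table 5 indices are built, the short sketch below averages the component dummy variables so that each index equals the fraction of listed items a respondent reports. The column names are hypothetical placeholders for the survey variables described above.

    # Illustrative construction of the savings behavior and savings outcomes
    # indices as row-wise averages of dummy variables (hypothetical column names).
    import pandas as pd

    behavior_items = ["checks_transactions", "tracks_expenses", "makes_budget",
                      "has_savings_goal", "cut_expenses_3m"]
    outcome_items = ["has_any_savings", "saved_past_6m", "saves_more_than_year_ago"]

    df = pd.read_csv("followup_survey.csv")  # hypothetical follow-up survey extract

    # mean(axis=1) gives the fraction of items equal to one for each respondent;
    # by default it averages over non-missing items only.
    df["savings_behavior_index"] = df[behavior_items].mean(axis=1)
    df["savings_outcomes_index"] = df[outcome_items].mean(axis=1)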
The course included material on retirement savings and pension funds. Table 6 shows the impact of the course on retirement savings behavior and outcomes. In the control group, 57 percent of individuals have a pension fund, and the course did not change this percentage in the treatment group. Pension funds are typically provided through an individual's employer, implying that we did not necessarily expect to find an impact of the course on this outcome, since acquiring a pension fund may involve switching jobs. The other variables reported in this table are based on follow-up survey questions that were only answered by individuals who have a pension fund. In order to account for possible selection bias, we fill these variables in with "0" for individuals who do not have a pension fund, so that they are defined for the complete sample, except for individuals who did not know whether they have a pension fund or not.

In terms of retirement savings behaviors, we measure whether (i) individuals choose a pension fund administrator based on fees or returns, as opposed to using the default option, relying on friends' advice, etc., (ii) they check their pension fund statement regularly, and (iii) they have calculated how much money they will need upon retirement. We construct an index of retirement savings behavior that is the average of the three behavior dummy variables. As the retirement savings outcome, we examine whether individuals report saving money for retirement (on top of their pension fund).17 We do not find an impact of the training on any of the retirement savings behaviors or outcomes listed in Table 6.

17 We intended to also ask the questions on having calculated how much money is needed upon retirement and on whether the individual is saving money for retirement in the complete sample, independent of whether the person has a pension fund or not. However, by mistake, these questions were only posed to individuals who reported having a pension fund.

6.6 Impact on Borrowing Behavior and Outcomes

The course also discussed responsible use of credit cards and healthy borrowing behavior. Table 7 shows that 48 percent of the control group had a credit card at follow-up. This percentage was not statistically different in the treatment group. For individuals with a credit card, we define several measures of credit card behavior and outcomes, as listed in Panels A and B of Table 7. The measures are based on questions that were only posed to individuals with a credit card and refer to their most frequently used card. The variables are filled in with "0" for individuals who do not have a credit card to account for possible selection bias.

We examine six credit card behaviors. Three are dummy variables indicating whether the person (i) knows their credit limit, (ii) knows the credit card interest rate, and (iii) checks the credit card statement every month. The other three variables are coded as the fraction of the past six months in which the individual (i) paid the balance in full, (ii) made only the minimum payment, and (iii) got cash through the credit card. We construct a credit card behavior index by first converting all variables to z-scores, using the control group mean and standard deviation, and then taking the average of these z-scores. The last two behaviors on the list are coded with a negative sign in the average since they represent undesirable credit card behavior.

We study three credit card outcomes: (i) whether the issuer blocked the credit card during the past six months, (ii) the fraction of the past six months in which the individual was charged late payment fees, and (iii) the fraction of the past six months in which they were charged overdraft fees. The incidence of all three events is quite low in the sample, ranging from 1 percent to 4 percent. We also construct an index of these three outcomes, using the average of z-scores. The results in Table 7 show no impact of the course on credit card behavior or outcomes.
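Because the credit card components are measured on different scales, these indices average z-scores rather than raw dummies. A minimal sketch of this construction, under hypothetical column names, is below: components are filled with zero for non-cardholders, standardized using control group moments, and the two undesirable behaviors enter with a negative sign.

    # Illustrative construction of the credit card behavior index
    # (file and column names are hypothetical).
    import pandas as pd

    components = ["knows_limit", "knows_rate", "checks_statement",
                  "frac_paid_full", "frac_min_payment", "frac_cash_advance"]
    undesirable = {"frac_min_payment", "frac_cash_advance"}

    df = pd.read_csv("followup_survey.csv")              # hypothetical extract
    df.loc[df["has_credit_card"] == 0, components] = 0   # fill-in for non-cardholders

    control = df[df["invited"] == 0]                     # control group moments
    zscores = []
    for c in components:
        z = (df[c] - control[c].mean()) / control[c].std()
        zscores.append(-z if c in undesirable else z)

    df["cc_behavior_index"] = pd.concat(zscores, axis=1).mean(axis=1)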
Table 8 examines borrowing behavior more broadly. We look at three loan behaviors, coded as dummy variables that indicate whether the individual (i) applied for a loan from any source during the past six months, (ii) went to a pawn shop to get credit during the past six months, and (iii) stopped servicing outstanding debt during the past six months. We aggregate these behaviors into a loan behavior index by taking the average of the three dummy variables. In the control group, 23 percent of individuals applied for a loan during the past six months, 10 percent went to a pawn shop, and 13 percent stopped servicing outstanding debt. These percentages are not statistically different in the treatment group. We also find no effect of the course on the index of loan behavior. Panel B of Table 8 shows the impact of the course on loan outcomes. One third of control group individuals have a loan from any source, and their total outstanding debt represents about 15 percent of annual income. These numbers are not statistically different in the treatment group, implying that the training had no effect on loan outcomes.

6.7 Heterogeneous Treatment Effects

We now ask whether the impact of the course was different for different groups of people. The take-up regressions in Table 3 show that individuals with a bachelor's degree, as well as females and clients of our partner financial institution, were more likely to attend the training compared to other people in the treatment group. The treatment effects may be larger for these groups since more of them were exposed to the course material. It could also be the case that individuals who know that they will benefit more from the course are more likely to attend in the first place. The treatment randomization was stratified by gender, being a client of our partner institution, and having a bachelor's degree. We can thus examine the effect of the training on these groups of people separately by adding an interaction term between the treatment group dummy (TrainingInvite_{i,s,m}) and a dummy variable indicating whether the individual was female, a client of our partner institution, or had a bachelor's degree, to equation (2).

Table 9 reports the heterogeneous treatment effect regressions, using all indices from Tables 4 through 8 as the outcome variables. We do not study the index components here to minimize multiple hypothesis testing. The results in Panel A show that the course had a similar impact on clients of our partner financial institution and on clients of other institutions. The positive effect on financial knowledge and on savings outcomes is not statistically different across these two groups. Panel B similarly shows no statistically significant differences in treatment effects by gender. The only indication of heterogeneity in impacts is found in Panel C, which shows that individuals with a bachelor's degree are more likely to improve savings behavior and retirement savings behavior than individuals without this education level. Since the take-up rates for training were also higher for the more educated, this could just reflect the fact that this group was more likely to attend training.
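The Table 9 specifications thus augment equation (2) with a treatment-by-characteristic interaction, and the bottom rows of the table test whether the sum of the main effect and the interaction is zero. A minimal sketch of this regression, again under hypothetical column names, is the following; the characteristic's main effect is omitted because it is absorbed by the randomization strata dummies.

    # Illustrative heterogeneous-effects regression (Table 9, gender interaction).
    # File and column names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("followup_survey.csv")

    het = smf.ols("savings_behavior_index ~ invited + invited:female"
                  " + C(strata) + C(month)", data=df).fit(cov_type="HC1")

    # Effect for women = main treatment effect + interaction term.
    b, V = het.params, het.cov_params()
    effect_women = b["invited"] + b["invited:female"]
    se_women = np.sqrt(V.loc["invited", "invited"]
                       + V.loc["invited:female", "invited:female"]
                       + 2 * V.loc["invited", "invited:female"])
    print(effect_women, se_women)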
We therefore estimate LATE impacts by education level to see whether the effect of actually receiving training varies with education status. We find that receiving training had a larger effect on individuals with a bachelor's degree for both the savings behavior and the retirement savings behavior index (although the heterogeneous treatment effect on savings behavior is only statistically significant at the 10.4 percent level).

6.8 Comparison of Treatment and Control Group Outcomes with Administrative Data

Our partner financial institution provided monthly data on savings and credit card outcomes at the treatment and control group level for the 470 clients found in their database, from December 2010 through May 2012. This data covers several pre-intervention months, since we started inviting clients to the financial literacy course in August 2011. It also overlaps with the follow-up survey, which was conducted between February and July 2012. Figures 2 to 4 plot the administrative data for the treatment and control group over time.

Figure 2 shows that the median savings account balance in the treatment and control group followed a more or less parallel trend before our intervention started in August 2011. In October 2011, the savings balance starts rising in the treatment group, going from about 900 pesos in September 2011 to 1,600 pesos in December 2011, while the savings balance in the control group stays relatively constant. After January 2012, the savings balance in the treatment group starts falling again and goes back to its original level in April 2012. Although the aggregate data does not allow us to conduct a test of statistical significance for the difference in medians, the observed pattern is consistent with the course having a positive effect on savings account balances. However, this effect appears to be temporary.

Figures 3 and 4 display median credit card balances and the average percentage of credit card debt paid off each month, respectively. The treatment and control group values follow a similar pattern throughout the period, indicating no impact of the course on credit card balances or debt paid off each month. Overall, the patterns in the administrative data are consistent with the follow-up survey results. The course seems to have had no effect on credit card behavior or outcomes, but it led to a small increase in savings. The administrative data suggests that this increase was temporary and dissipated within a few months.
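The comparison underlying Figures 2 to 4 amounts to plotting, month by month, the group-level statistics supplied by the partner institution. A minimal sketch, assuming the aggregates arrive as a table with hypothetical columns month, group, and median_savings:

    # Illustrative plot of monthly group-level medians (as in Figure 2).
    # The input file and column names are hypothetical.
    import pandas as pd
    import matplotlib.pyplot as plt

    agg = pd.read_csv("admin_aggregates.csv", parse_dates=["month"])

    fig, ax = plt.subplots()
    for group, sub in agg.groupby("group"):  # "control" and "treatment"
        sub.sort_values("month").plot(x="month", y="median_savings",
                                      ax=ax, label=group)
    ax.set_xlabel("Month")
    ax.set_ylabel("Median savings account balance (Mexican pesos)")
    plt.show()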
7. Conclusions and Policy Implications

Despite the popularity of financial literacy workshops among policymakers and financial institutions, voluntary participation rates are typically very low for these programs. Working with a half-day training course in Mexico City that aims to teach participants the importance of savings and responsible debt and credit card use, we use randomized experiments to investigate the lack of demand. We find very little interest in participating in financial education among samples of financial institution clients and Facebook users. Screening our sample to focus on those individuals who express interest in attending training, we find that the majority of them also do not attend. Experiments to increase take-up suggest that this low participation rate is not mainly due to high discount rates, time inconsistency, or lack of information, but rather appears to be due to individuals thinking that the benefits of such training are not high enough. Monetary incentives, which increase the benefits, lead to more attendance.

A follow-up survey conducted about six months after the course enables us to measure just what these benefits are. We find that participating in the course leads to increases in financial knowledge and savings, but the increase in savings appears to dissipate quickly, and there is no evidence that the training changed credit card usage or borrowing behavior. The lack of interest in training therefore appears to be a rational choice, since individuals see relatively little benefit from it.

One natural response to the modest impacts of training measured here is to note that the training is only a few hours long, and thus to argue that much longer and more intensive training sessions are needed to really affect financial behavior. However, our study shows that most of the general population has very little interest in attending even a short financial literacy course, and that it is very difficult to get even individuals who state they are interested in participating to show up and attend. Monetary incentives did help boost attendance, but even an immediate payment of $72 got fewer than 40 percent of people who had expressed interest in a financial literacy program to attend. Longer and more intensive programs would appear likely to have even more trouble attracting participants. Moreover, Kim et al. (2003) provide an example suggesting that even 18 months of adult credit counseling has not been effective.

We caution that our study measures local average treatment effects, showing the effect of training for people who are induced to attend it as a result of our interventions, but who would not attend it without the information, logistical support, and incentives we provided. It seems plausible that these individuals, who have self-selected into not attending a program that is available to anyone who wants to participate, will benefit less from such training than the people who choose to attend of their own accord. However, as our study notes, fewer than two percent of the eligible population voluntarily choose to take part in training. Even if the training has large benefits for them, the results of our study suggest that the benefits of encouraging more people to participate in such training are likely to be slight. This concurs with recent skepticism about the value of such training (Willis, 2011; The Economist, 2013).

Finally, we note two issues with the increasing focus of financial institutions and policymakers on extending financial education to the masses through general multi-purpose financial literacy courses. First, the classroom setting for the provision of financial education may not be appealing to the general public. We find higher take-up rates among individuals with a bachelor's degree, who may be more comfortable in such a setting, or may be better at digesting information provided through this channel. Alternative methods to educate the general public should be tested. For example, Berg and Zia (2013) provide evidence of an increase in financial knowledge and reductions in certain types of borrowing following financial education taught through a soap opera in South Africa. Second, the variety of topics covered in the course conflicts somewhat with the idea that some concepts are best taught at teachable moments. Workshops about pension savings provided to workers at the time of making their pension savings allocations, or to college students or young graduates when they receive their first credit card, may be more effective in influencing these outcomes.
In contrast to credit and retirement savings, there is less of an obvious teachable moment for education about savings. Indeed, since individuals make spending and savings decisions on a very regular basis, it could be argued that general financial literacy courses are likely to work better for this topic, since individuals have the ability to quickly translate the concepts learned into practice. This might help explain why we find impacts (albeit transitory ones) only on savings outcomes in our study, and would be consistent with the savings impacts of financial education programs taught in schools.

References

Aguiar, Mark and Erik Hurst (2005) "Consumption versus Expenditure", Journal of Political Economy, 113(5): 919-48.
Akerlof, George (2001) "Behavioral Macroeconomics and Macroeconomic Behavior", Nobel Prize Lecture, http://www.nobelprize.org/nobel_prizes/economics/laureates/2001/akerlof-lecture.pdf
Attanasio, Orazio and Miguel Székely (2004) "Wage shocks and consumption variability in Mexico during the 1990s", Journal of Development Economics 73(1): 1-25.
Berg, Gunhild and Bilal Zia (2013) "Financial Literacy through Mainstream Media: Evaluating the Impact of Financial Messages in a South African Soap Opera", Mimeo, World Bank.
Bernheim, B. Douglas, Daniel M. Garrett, and Dean M. Maki (2001) "Education and Saving: The Long Term Effects of High School Financial Curriculum Mandates", Journal of Public Economics 80(3): 435-65.
Bertrand, Marianne, Dean Karlan, Sendhil Mullainathan, Eldar Shafir and Jonathan Zinman (2010) "What's Advertising Content Worth? Evidence from a Consumer Credit Marketing Field Experiment", Quarterly Journal of Economics, 125(1): 263-306.
Brown, Alexander, Zhikang Chua and Colin Camerer (2009) "Learning and Visceral Temptation in Dynamic Savings Experiments", Quarterly Journal of Economics, 124(1): 197-231.
Brown, Amy and Kimberly Gartner (2007) "Early Intervention and Credit Cardholders." http://cfsinnovation.com/system/files/imported/managed_documents/earlyintervention.pdf [accessed 15 March, 2013].
Bruhn, Miriam, Luciana de Souza Leão, Arianna Legovini, Rogelio Marchetti, and Bilal Zia (2013) "Financial Education and Behavior Formation: Large-Scale Experimental Evidence from Brazil", Mimeo, World Bank.
Bruhn, Miriam and Inessa Love (forthcoming) "The Economic Impact of Banking the Unbanked: Evidence from Mexico", The Journal of Finance.
Burke, Jeremy and Kata Mihaly (2012) "Financial Literacy, Social Perception and Strategic Default", RAND Working Paper WR-937, March 2012.
Cai, Jing (2012) "Social Networks and the Decision to Insure: Evidence from Randomized Experiments in China", Mimeograph.
Cole, Shawn A., Thomas Sampson, and Bilal Zia (2009) "Financial Literacy, Financial Decisions, and the Demand for Financial Services: Evidence from India and Indonesia", Harvard Business School Working Paper, No. 09-117, February.
Cole, Shawn A., and Gauri Kartini Shastry (2008) "Smart Money: The Effect of Education, Cognitive Ability, and Financial Literacy on Financial Market Participation", Harvard Business School Working Paper, No. 09-071, December.
Doi, Yoko, David McKenzie, and Bilal Zia (2012) "Who You Train Matters: Identifying Complementary Effects of Financial Education on Migrant Households", World Bank Policy Research Working Paper 6157, August.
Duflo, Esther and Emmanuel Saez (2011) "The Role of Information and Social Interactions in Retirement Plan Decisions: Evidence from a Randomized Experiment", Quarterly Journal of Economics 118(3): 815-842.
The Economist (2013) "Teacher, leave them kids alone", February 16, http://www.economist.com/news/finance-and-economics/21571883-financial-education-has-had-disappointing-results-past-teacher-leave-them
Gale, William G. and Ruth Levine (2010) "Financial Literacy: What Works? How Could it be More Effective?", Mimeograph, October.
Gerardi, Kristopher, Lorenz Goette, and Stephan Meier (2010) "Financial Literacy and Subprime Mortgage Delinquency: Evidence from a Survey Matched to Administrative Data", Atlanta Fed Working Paper 2010-10, April 2010.
Gibson, John, David McKenzie, and Bilal Zia (2012) "The Impact of Financial Literacy Training for Migrants", World Bank Policy Research Working Paper 6073, May.
Government Accountability Office (2004) "The Federal Government's Role in Improving Financial Literacy: Highlights of a GAO Forum", http://www.gao.gov/assets/210/202486.pdf [accessed 02/08/2013]
Hastings, Justine and Lydia Tejeda-Ashton (2008) "Financial Literacy, Information, and Demand Elasticity: Survey and Experimental Evidence from Mexico", National Bureau of Economic Research Working Paper No. 14538, December.
Kim, Jinhee, Thomas Garman, and Benoit Sorhaindo (2003) "Relationships among Credit Counseling Clients' Financial Well-Being, Financial Behaviors, Financial Stressor Events, and Health", Financial Counseling and Planning, 14(2): 75-87.
Lara-Ibarra, Gabriel (2011) "The Effects of Framing on Retirement Management Decisions: Evidence from a Field Study in Mexico", Mimeograph.
Lewis, S. and Messy, F. (2012) "Financial Education, Savings and Investments: An Overview", OECD Working Papers on Finance, Insurance and Private Pensions, No. 22.
Lührmann, M., Serra-Garcia, M., and Winter, J. (2012) "The Effects of Financial Literacy Training: Evidence from a Field Experiment with German High-School Children", University of Munich Working Paper 2012-24.
Lusardi, Annamaria (2004) "Saving and the Effectiveness of Financial Education", Mimeograph, Dartmouth College, January.
Lusardi, Annamaria and Olivia S. Mitchell (2009) "How Ordinary Consumers Make Complex Economic Decisions: Financial Literacy and Retirement Readiness", Mimeo, March.
Lusardi, Annamaria and Olivia S. Mitchell (2011) "Financial Literacy and Retirement Planning in the United States", CeRP Working Papers, Center for Research on Pensions and Welfare Policies, Turin (Italy), February.
Lusardi, Annamaria and Peter Tufano (2009) "Debt Literacy, Financial Experience and Overindebtedness", Dartmouth College Working Paper, March.
Madrian, Brigitte and Dennis Shea (2001) "Preaching to the Converted and Converting Those Taught: Financial Education in the Workplace", Working Paper, University of Chicago.
McKenzie, David (2006) "The Consumer Response to the Mexican Peso Crisis", Economic Development and Cultural Change, 55(1): 139-172.
McKenzie, David (2010) "Impact Assessments in Finance and Private Sector Development: What Have We Learned and What Should We Learn?", World Bank Research Observer, 25(2): 209-33.
McKenzie, David and Christopher Woodruff (2008) "Experimental Evidence on Returns to Capital and Access to Finance in Mexico", World Bank Economic Review 22(3): 457-82.
McKenzie, David and Christopher Woodruff (2012) "What Are We Learning from Business Training and Entrepreneurship Evaluations around the Developing World?", World Bank Policy Research Working Paper No. 6202.
Ponce, Alejandro, Enrique Seira, and Guillermo Zamarripa (2009) "Do Consumers Borrow on Their Cheapest Credit Card? Evidence from Mexico", working paper available at SSRN: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1364722
Saavedra, Ana Luisa (2012) "Developing National Strategies for Financial Education: The Mexican Case", Colombia-OECD-World Bank Conference on Financial Education "Progress of Global Policies and Practices and Latin American Experiences". http://www.banrep.gov.co/educacion-economica/OECD/009543/documentos/saavedra_oecd_2012.pdf [accessed 01/16/2013]
Saldívar, German (2009) "Mexican National Strategy on Financial Education", SHCP Presentation. http://www.oecd.org/finance/financialeducation/44265373.pdf [accessed 01/16/2013]
Seshan, Ganesh and Dean Yang (2012) "Transnational Household Finance: A Field Experiment on the Cross-Border Impacts of Financial Education for Migrant Workers", Mimeo, University of Michigan.
Stango, Victor and Jonathan Zinman (2009) "Exponential Growth Bias and Household Finance", Journal of Finance 64(6): 2807-2849.
Treviño Garza, Luis (2011) "Financial Capabilities Measurement", Presentation at the AFI-BNM Financial Inclusion Policymakers Forum. http://www.bnm.gov.my/microsites/fipf2011/day2_session_5/06_FIPF%20Session%205_Luis%20Trevino.pdf [accessed 02/08/2013]
Willis, Lauren (2011) "The Financial Education Fallacy", American Economic Review Papers and Proceedings 101(3): 429-34.
World Bank (2008) Finance for All? Policies and Pitfalls in Expanding Access. World Bank Policy Research Report.
Xu, Lisa and Bilal Zia (2012) "Financial Literacy Around the World: An Overview of the Evidence with Practical Suggestions for the Way Forward", World Bank Policy Research Working Paper 6107, June.

Figure 1: Financial Literacy Training Take-Up Rates by Incentive Group
[Bar chart of take-up rates, in percent, by incentive group (no extra incentive, $72 now, $36 now, $36 later, free transportation, testimonials), shown for the full treatment group and for the sub-sample of individuals who could be reached.]
Note: Full treatment group includes 1,751 individuals. The sub-sample who could be reached to offer them the treatment includes 1,485 individuals.

Figure 2: Median Savings Account Balance
[Line chart of the median savings account balance in Mexican pesos, by month from December 2010 to May 2012, for the control and treatment groups.]
Note: Administrative data. Sample includes 470 individuals who are clients of our partner financial institution.

Figure 3: Median Credit Card Balance
[Line chart of the median credit card balance in Mexican pesos, by month from December 2010 to May 2012, for the control and treatment groups.]
Note: Administrative data. Sample includes 470 individuals who are clients of our partner financial institution.
Figure 4: Average Percentage of Credit Card Debt Paid Off Each Month 35.0 30.0 25.0 20.0 % 15.0 10.0 5.0 0.0 Dec-10 May-11 Dec-11 May-12 Apr-11 Jan-11 Mar-11 Jun-11 Aug-11 Apr-12 Oct-11 Nov-11 Jan-12 Mar-12 Feb-11 Jul-11 Sep-11 Feb-12 Control group Treatment group Note: Administrative data. Sample includes 470 individuals who are clients of our partner financial institution. 34 Table 1: Confirming Randomization Using Baseline Data Full Sample in Baseline Sample Interviewed at Follow-up Administrative Data Sample Control Treatment Control Treatment Control Treatment Mean Difference Mean Difference Mean Difference Stratification Variables Baseline survey conducted in branch 0.35 -0.0058 0.37 -0.0045 0.62 -0.0005 Client of partner financial institution (vs. other institution) 0.48 -0.0010 0.48 0.0015 Made savings deposit during past month 0.64 0.0012 0.64 0.0011 0.70 0.0534 Has credit card 0.41 -0.0039 0.42 0.0161 0.56 -0.0005 Paid more than credit card minimum in all past 6 months1 0.51 0.0172 0.52 0.0169 0.56 0.0085 Has bachelor’s degree or higher 0.40 0.0016 0.41 0.0195 0.45 -0.0386 Female 0.47 0.0064 0.50 0.0115 0.48 0.0336 Other Baseline Variables Age 32.69 0.6308 32.97 0.7372 36.14 0.9789 Occupation is employee 0.51 -0.0171 0.49 0.0031 0.50 -0.0282 Paid credit card late in past 6 months1 0.23 0.0124 0.22 0.0214 0.21 0.0087 Monthly household income is above MXP 6,500 0.64 -0.0072 0.63 -0.0111 0.67 0.0367 Monthly household expenditure is above MXP 6,500 0.54 -0.0081 0.54 -0.0141 0.54 -0.0394 Sample Size 1090 1088 814 772 243 227 Notes: *, **, and *** indicate statistically different from control mean at the 10, 5 and 1% levels respectively. Differences for baseline variables not used in the stratification control for randomization strata. Administrative data sample includes clients of our partner financial institution that were found in their database. 1 Conditional on having a credit card 35 Table 2: Impact of Incentive Treatments on Take-Up in Treatment Group Dependent variable: Attended course (1) (2) $72 now dummy 0.1480*** 0.1758*** (0.0352) (0.0401) $36 now dummy 0.0931*** 0.1123*** (0.0343) (0.0393) $36 later dummy 0.0918*** 0.1110*** (0.0343) (0.0392) Free transportation dummy 0.0330 0.0407 (0.0326) (0.0376) Testimonials dummy 0.0142 0.0153 (0.0320) (0.0367) F-test p-value: $72 now = $36 now 0.1435 0.1357 F-test p-value: $36 now = $36 later 0.9725 0.9761 Control group mean of outcome variable 0.1815 0.2129 Observations 1751 1457 Only individuals Full treatment Sample who could be group reached Notes: Robust standard errors in parentheses. *, **, and *** indicate statistical significance at the 10, 5 and 1 percent levels respectively. OLS regressions of a dummy for having attended the course on a set of dummies indicating to which incentive group the individual was randomly assigned (the omitted category is the control group, i.e. received no extra incentives). Regressions control for randomization strata dummies. 36 Table 3: Determinants of Program Take-Up in Treatment Group Dependent variable: Attended Attended course course w/o w/ or w/o incentives incentives (1) (2) Baseline survey conducted in branch 0.0437 0.0250 (0.0298) (0.0330) Client of partner financial institution (vs. 
other institution) 0.0232 0.0540* (0.0271) (0.0300) Made savings deposit during past month 0.0284 0.0359 (0.0258) (0.0285) Has credit card -0.0450 -0.0296 (0.0355) (0.0393) Paid more than credit card minimum in all past 6 months 0.0570 0.0309 (0.0392) (0.0435) Has bachelor’s degree or higher 0.1189*** 0.1420*** (0.0254) (0.0282) Female 0.0445* 0.0405 (0.0245) (0.0272) Age 0.0035*** 0.0039*** (0.0011) (0.0012) Occupation is employee -0.0127 -0.0137 (0.0243) (0.0270) Paid credit card late in past 6 months 0.0524 0.0503 (0.0454) (0.0467) Observations 1081 1081 Mean of outcome variable 0.2077 0.2849 Notes: Robust standard errors in parentheses. *, **, and *** indicate statistical significance at the 10, 5 and 1 percent levels respectively. 37 Table 4: Impact on Financial Knowledge ITT LATE Sample Control Treatment Treatment Size Mean Difference Difference Panel A: Knowledge index and components Knowledge index (average of 8 components below) 1586 0.31 0.0307*** 0.0871*** (0.0094) (0.0261) (1) Knows what UDI is ("Unidad de Inversion")1 1580 0.10 0.0044 0.0125 (0.0150) (0.0426) (2) Knows deposit insurance exists up to 400,000 UDIs 1578 0.13 0.0732*** 0.2073*** (0.0188) (0.0533) (3) Knows what a credit report is 1575 0.39 0.0518** 0.1464** (0.0245) (0.0684) (4) Knows credit card cycle is 30 days 1568 0.46 0.0190 0.0535 (0.0251) (0.0701) (5) Knows they have 20 days to pay credit card w/o interest 1568 0.12 0.0143 0.0402 (0.0168) (0.0472) (6) Knows that what CAT is ("Costo Anual Total")2 1557 0.24 0.0369* 0.1036* (0.0216) (0.0602) (7) Knows what an AFORE (pension fund) is 1559 0.72 0.0499** 0.1409** (0.0219) (0.0613) (8) Knows retirement age is 65 1551 0.29 0.0253 0.0714 (0.0234) (0.0659) Panel B: Self-assessed financial literacy Says their financial knowledge is satisfactory or higher 1550 0.58 0.0537** 0.1500** (0.0248) (0.0683) Notes: Robust standard errors in parentheses. *, **, and *** indicate statistically different from control mean at the 10, 5 and 1 percent levels respectively, after controlling for randomization strata and month of follow-up interview dummies. 1 Unidad de Inversion (UDI) is an inflation adjusting currency unit. 2 Costo Annual Total (CAT) is total annual cost of credit, including all interest rates and fees 38 Table 5: Impact on Savings Behavior and Outcomes ITT LATE Sample Control Treatment Treatment Size Mean Difference Difference Panel A: Savings behavior index and components Savings behavior index (avg. of 5 components below) 1586 0.68 0.0133 0.0376 (0.0124) (0.0348) (1) Checks financial institution transactions regularly 1586 0.69 -0.0235 -0.0666 (0.0226) (0.0644) (2) Keeps track of expenses 1586 0.79 0.0072 0.0204 (0.0206) (0.0582) (3) Makes a budget 1585 0.77 0.0264 0.0748 (0.0211) (0.0596) (4) Has a savings goal 1582 0.57 0.0130 0.0367 (0.0250) (0.0705) (5) Cut expenses in past 3 months 1584 0.59 0.0428* 0.1212* (0.0247) (0.0698) Panel B: Savings outcomes index and components Savings outcomes index (avg. of 3 components below) 1586 0.65 0.0335** 0.0948** (0.0147) (0.0414) Has any type of savings1 1586 0.80 0.0288 0.0814 (0.0200) (0.0566) Saved more than zero during past 6 months 1413 0.83 0.0293 0.0800 (0.0192) (0.0524) Saves more each month than a year ago 1547 0.36 0.0408 0.1151* (0.0250) (0.0697) Panel C: Savings outcomes controlling for monetary incentives Savings outcomes index (avg. 
of 3 components below) 1586 0.65 0.0268* 0.0902* (0.0151) (0.0506) Has any type of savings1 1586 0.80 0.0187 0.0629 (0.0205) (0.0693) Saved more than zero during past 6 months 1413 0.83 0.0253 0.0823 (0.0198) (0.0642) Saves more each month than a year ago 1547 0.36 0.0347 0.1161 (0.0255) (0.0845) Notes: Robust standard errors in parentheses. *, **, and *** indicate statistically different from control mean at the 10, 5 and 1 percent levels respectively, after controlling for randomization strata and month of follow-up interview dummies. Regressions in Panel C additionally include a dummy for whether the individual received a 1 monetary incentive payment for participation in the financial literacy course. Includes savings account, caja de ahorro, tanda and other non-retirement savings 39 Table 6: Impact on Retirement Savings Behavior ITT LATE Sample Control Treatment Treatment Size Mean Difference Difference Has a pension fund 1471 0.57 -0.0031 -0.0088 (0.0263) (0.0739) Panel A: Retirement savings behavior index and components Retirement savings behavior index (avg. of 3 components below) 1471 0.20 0.0114 0.0320 (0.0144) (0.0404) Pension fund administrator choice based on fees or returns 1471 0.16 0.0058 0.0162 (0.0195) (0.0547) Checks pension fund statement 1467 0.34 0.0340 0.0958 (0.0250) (0.0704) Has calculated how much money will need upon retirement 1465 0.11 -0.0026 -0.0074 (0.0158) (0.0447) Panel B: Retirement savings outcomes Is saving money for retirement 1470 0.17 0.0122 0.0343 (0.0199) (0.0559) Notes: Robust standard errors in parentheses. *, **, and *** indicate statistically different from control mean at the 10, 5 and 1 percent levels respectively, after controlling for randomization strata and month of follow-up interview dummies. Variables in panels A and B are based on questions that were only answered by individuals who have a pension fund. To account for potential selection bias, we fill these variables in with "0" for individuals who do not have a pension fund. 40 Table 7: Impact on Credit Card Behavior and Outcomes ITT LATE Sample Control Treatment Treatment Size Mean Difference Difference Currently has at least one credit card 1560 0.48 -0.0287 -0.0811 (0.0210) (0.0596) Panel A: Credit card behavior index and components Credit card behavior index (avg. of z-scores of 6 components below) 1556 0.00 -0.0233 -0.0660 (0.0229) (0.0652) (1) Knows credit limit 1550 0.45 -0.0266 -0.0756 (0.0209) (0.0594) (2) Knows interest rate 1519 0.20 -0.0017 -0.0050 (0.0193) (0.0552) (3) Checks statement every month 1543 0.41 -0.0278 -0.0786 (0.0209) (0.0594) (4) Fraction of past 6 months where paid balance in full 1536 0.20 -0.0180 -0.0505 (0.0170) (0.0479) (5) Fraction of past 6 months where made only the minimum payment1 1539 0.12 0.0022 0.0062 (0.0138) (0.0389) (6) Fraction of past 6 months where got cash through the credit card1 1540 0.07 -0.0094 -0.0267 (0.0098) (0.0281) Panel B: Credit card outcomes index and components Credit card outcomes index (avg. of z-scores of 3 components below) 1554 0.00 0.0434 0.1228 (0.0416) (0.1184) (1) Issuer blocked credit card during past 6 months 1547 0.04 0.0009 0.0026 (0.0096) (0.0270) (2) Fraction of past 6 months where was charged late payment fees 1546 0.03 0.0102 0.0289 (0.0064) (0.0183) (3) Fraction of past 6 months where was charged overdraft fees 1545 0.01 0.0026 0.0075 (0.0034) (0.0098) Notes: All variables in Panel A and B refer to the most frequently used credit card. Robust standard errors in parentheses. 
*, **, and *** indicate statistically different from control mean at the 10, 5 and 1 percent levels respectively, after controlling for randomization strata and month of follow-up interview dummies. 1 Included in credit card behavior index with negative sign. 41 Table 8: Impact on Loan Behavior and Outcomes ITT LATE Sample Control Treatment Treatment Size Mean Difference Difference Panel A: Loan behavior index and components Loan behavior index (avg. of 3 components below) 1570 0.15 0.0075 0.0212 (0.0118) (0.0334) Applied for a loan from any source during past 6 months 1564 0.23 -0.0074 -0.0208 (0.0210) (0.0594) Went to a pawn shop to get credit during past 6 months 1568 0.10 0.0054 0.0152 (0.0152) (0.0429) Stopped servicing outstanding debt during past 6 months 1434 0.13 0.0205 0.0591 (0.0180) (0.0523) Panel B: Loan outcomes index and components Loan outcomes index (avg. of z-scores of 2 components below) 1560 -0.01 -0.0132 -0.0372 (0.0427) (0.1206) Currently has a loan (from any source) 1555 0.33 -0.0058 -0.0165 (0.0234) (0.0660) Total outstanding debt as percentage of annual income 1209 15.38 -0.6563 -1.7753 (1.1899) (3.2220) Notes: Robust standard errors in parentheses. *, **, and *** indicate statistically different from control mean at the 10, 5 and 1 percent levels respectively, after controlling for randomization strata and month of follow-up interview dummies. 42 Table 9: Heterogeneous Treatment Effects (ITT) Dependent variable: Index of Retirement Savings Savings Credit card Credit card Loan Loan Knowledge savings behavior outcomes behavior outcomes behavior outcomes behavior Panel A: Type of financial institution Treatment group dummy 0.0315** 0.0095 0.0343* 0.0099 -0.0550* 0.0157 0.0065 -0.0066 (0.0130) (0.0174) (0.0199) (0.0194) (0.0294) (0.0559) (0.0161) (0.0548) Treatment*Client of partner institution -0.0016 0.0078 -0.0018 0.0030 0.0658 0.0572 0.0021 -0.0135 (0.0188) (0.0246) (0.0295) (0.0287) (0.0460) (0.0836) (0.0237) (0.0859) F-test p-value: Treatment + Interaction = 0 0.028 0.321 0.135 0.543 0.761 0.240 0.622 0.761 Panel B: Gender Treatment group dummy 0.0182 0.0057 0.0546*** -0.0035 -0.0345 0.0587 0.0144 0.0414 (0.0134) (0.0169) (0.0205) (0.0215) (0.0331) (0.0564) (0.0171) (0.0626) Treatment*Female 0.0250 0.0150 -0.0421 0.0297 0.0223 -0.0304 -0.0137 -0.1086 (0.0188) (0.0246) (0.0295) (0.0287) (0.0459) (0.0842) (0.0237) (0.0858) F-test p-value: Treatment + Interaction = 0 0.001 0.249 0.554 0.172 0.702 0.648 0.967 0.249 Panel C: Education Treatment group dummy 0.0288** -0.0067 0.0148 -0.0139 -0.0174 0.0585 0.0017 -0.0421 (0.0116) (0.0164) (0.0192) (0.0179) (0.0284) (0.0575) (0.0161) (0.0546) Treatment*Has bachelor or higher 0.0046 0.0486* 0.0455 0.0613** -0.0144 -0.0369 0.0141 0.0702 (0.0196) (0.0248) (0.0300) (0.0297) (0.0477) (0.0819) (0.0237) (0.0877) F-test p-value: Treatment + Interaction = 0 0.034 0.025 0.009 0.046 0.406 0.713 0.363 0.681 Observations 1586 1586 1586 1471 1556 1554 1570 1560 Notes: Robust standard errors in parentheses. *, **, and *** indicate statistically different from control mean at the 10, 5 and 1 percent levels respectively, after controlling for randomization strata and month of follow-up interview dummies. 
Appendix A: Project Timeline

[Timeline chart covering August 2010 through December 2012, marking the months in which each project stage was active: project setup and preparation; recruitment through the mailing campaign; recruitment through the online campaign; recruitment through the street/branch surveys; invitation to the course (treatment); follow-up survey; analysis; and administrative data.]