WBI Evaluation Briefs
Reporting on Client and Staff Learning Programs--A Special Joint Issue by WBI and OED
Marlaine E. Lockheed, Manager, WBI Evaluation Unit
March 2002

Developing Evaluation Capacity

The Challenge

The shift in the World Bank's development focus towards comprehensive and sustainable development, country ownership, increased transparency, knowledge sharing, and evidence of results has generated an increasing need for adequate evaluation of development policies, programs, and project implementation. This emphasis also brings a growing demand for training in evaluation concepts and methods. Specifically, World Bank client countries need to: 1) develop greater capacity to assess the efficiency of their institutions, policies, and reform processes; 2) achieve greater development effectiveness and results orientation; 3) take ownership of the reform process; and 4) draw lessons from experience to sustain the development process.

World Bank Intervention

To develop the evaluation capacity of both Bank staff and client countries, the World Bank has sponsored a variety of activities, including diagnostic work, technical assistance, and training programs in evaluation. The three training programs discussed in this Brief are: 1) Introduction to Program Evaluation (IPE); 2) Program Evaluation Workshops (PEWs); and 3) the International Program for Development Evaluation Training (IPDET). Since 1998, these three programs have trained close to 1,300 participants (figure 1). Each program is distinct in its training approach, duration (figure 2), mode of delivery, and target audience, but all share common features of content (figure 3). IPE is a broad survey of evaluation methods that focuses on developing basic evaluation skills, PEW is explicitly a "train the trainers" program, and IPDET is an intense skills-building program. IPE and IPDET target both Bank staff and clients.¹ In addition, there is an intentional connection between these two programs: IPDET began with the IPE materials, and its core course can be viewed as an extension and expansion of the IPE course. These evaluation capacity-building initiatives comprise:

· Nine "Introduction to Program Evaluation (IPE)" offerings. IPE has evolved from a classroom course designed for OED staff in 1998 to a distance-learning course for World Bank staff and clients in 2002. In all, about 1,000 participants from 34 countries across the ECA, Africa, EAP, and LAC regions have been trained. In its current delivery format, IPE uses distance learning technology--interactive video-conference (V/C) sessions--to reach participants convened at multiple sites. The V/C sessions last two to three hours each and are delivered over a period of about four weeks, for a total of 12 to 16 hours of instructional time. The primary target audience is practitioners involved in planning and implementing project and program evaluations, and it is an "open access" training program.² The broad objectives of IPE are to offer training in monitoring and evaluation (M&E) concepts and methods; to demonstrate the link between performance measurement and program evaluation; and to provide hands-on experience in developing an evaluation design, helping participants plan for effective program evaluation. There is a specific focus on bringing two key populations--Bank staff and clients--together, with the objective of providing an enriched mutual learning experience. One of the program offerings, which focused on the East Asia region, was funded by the Policy and Human Resources Development (PHRD) Trust Fund of the Government of Japan.

[Figure 1. Total number of participants: IPE (distance learning) 1,008; PEWs 140; IPDET (core course) 65.]

[Figure 2. Hours of instruction per course offering: IPE (distance learning) 16; PEWs 40; IPDET (core course) 80.]
World Bank Institute -- Promoting knowledge and learning for a better world

¹ The Program Evaluation Workshops (PEWs) are funded by donors and therefore do not include Bank staff.

² As opposed to "select access" courses, participants in "open access" courses are not pre-screened against specified educational or professional criteria.

· Four Program Evaluation Workshops (PEWs). PEWs were initiated in 1998 and have trained about 140 participants from 23 countries in regional one-week workshops held in Ghana, Senegal, Turkey, and Burkina Faso. The training focuses on building evaluation capacity in developing countries, primarily through "Training of Trainers" workshops and the institutionalization of evaluation training in the host countries. This is a "select access" program: besides trainers, the target audience includes management teams from training institutions and senior government officials. In an effort to support Poverty Reduction Strategy Papers (PRSP) Initiative countries, the most recent offering, delivered in Burkina Faso in December 2001, adds a variation to the basic course content and target audience by training PRSP team members in how to effectively monitor and evaluate the implementation of the PRSP. Support for PEWs has been provided by the Governments of Canada, France, Japan, and Switzerland. No participant fees were charged.

· One pilot offering of the "International Program for Development Evaluation Training (IPDET)." IPDET was initiated in 2001 and was delivered in partnership with Carleton University, Ottawa, Canada. About 130 participants representing 23 developing and 17 developed countries were trained. This two- to four-week residential program was designed to fill the existing gap between demand and supply for competency-based training in core development evaluation skills, to provide additional professional development training, or, in some cases, to offer more in-depth coverage to development evaluators who already have the basics. The IPDET program built on IPE materials and added a stronger development focus. It contains two key components: 1) an integrated two-week (80-hour) applied core course attended by 65 participants, and 2) two additional weeks of seventeen workshops (ranging in length from one to three days, with three offered concurrently) attended by 74 participants and focusing on additional skills in development evaluation.³ The target audience included Bank Group staff; staff from other development and bilateral organizations, NGOs, and the private sector; and government officials who conduct or manage development evaluations or who manage evaluation systems or units. Participant fees ($2,461 residential) were charged. Support for the program (scholarships for 30 participants) was provided by the World Bank and the Governments of the Netherlands, Canada, and Norway.

³ Some participants from the core course also attended the in-depth workshops; about 130 individuals participated in at least one part of the four-week IPDET program.

Figure 3. Selected World Bank evaluation capacity-building programs

· Introduction to Program Evaluation (IPE): a distance learning course delivered through two-hour video-conference sessions for a total of about 12 to 16 hours of instruction time (3 to 4 DL training days). Stresses basic competency in core skills. Offerings to date: 9. Participants: approx. 1,000.

· Program Evaluation Workshops (PEWs): a residential one-week "Train the Trainers" workshop. Includes a specific focus on building local institutional capacity for the delivery of evaluation training and is currently evolving towards poverty-focused evaluation. Offerings to date: 4. Participants: approx. 140.

· International Program for Development Evaluation Training (IPDET): a residential program of two to four weeks' duration. Includes a two-week course focusing on core evaluation skills and two weeks of optional in-depth workshops. Offerings to date: 1. Participants: 130.

· Common areas of focus for all three programs: 1. Models and Assumptions; 2. Causality and Questions; 3. Design; 4. Measurement; 5. Data Collection (Primary/Secondary); 6. Data Analysis (Quality and Quantity); 7. Evaluation Design Matrix; 8. Communicating Results.

· Common target audience for all three programs: policy makers and practitioners involved in planning, implementing, and managing evaluations, specifically government officials, specialist/technical staff, academics, researchers, evaluators, trainers of partner institutes, and NGO staff.

· Additional target audience for IPE and IPDET: Bank staff from networks, operations, and research who are involved in planning, implementing, and managing evaluations. Additional target audience for the Evaluation Workshops: PRSP team members.

· IPDET also covers the following in its core course: Managing Dev. Evaluations; Major Issues in Dev.; and Sector/Thematic Evaluations. The two additional weeks of electives/in-depth workshops offered by IPDET focus on advanced methods, theory, and applications. The Program Evaluation Workshops also include a focus on planning for workshop delivery and on the evaluation of poverty-focused programs.

Underlying Assumptions

These evaluation capacity-building programs are based on four key assumptions:

1. Evaluation training provides basic tools for social science inquiry and evaluation.
2. Participants equipped with the basics of social science inquiry can contribute to more objective and sound M&E practices.
3. Better M&E helps improve transparency, encourages attention to development effectiveness, and contributes to better management of development initiatives.
4. IPE and IPDET also assume that the capacity-building objective of evaluation training is enhanced by the participation of both Bank staff and clients, as it allows them to share their views and experiences and to learn from each other.

Data and Instruments

Evaluation data came from the WBIES database covering participant⁴ reactions (level 1 evaluation) and learning achievement (level 2 evaluation), and from OED for level 2 evaluations of IPDET. This report is based on the following data:

· level 1 data from the five most recent offerings--two for IPE, two for PEWs, and one for IPDET (two-week core course), and
· level 2 data from four offerings--two IPE, one PEW, and one IPDET (two-week core course only).

The available evaluation data used in this report cover about 35% of the total program offerings to date and do not include results from the two additional weeks of optional in-depth workshops offered by IPDET. Over 85% of participants in each of the courses reviewed completed a level 1 evaluation, and over 75% completed a level 2 evaluation.

Table 1. Demographic characteristics of participants in the most recent offerings of the three training programs

  Participant characteristics   IPE (%)          PEWs (%)         IPDET (%)
  Female                        34               14               54
  With masters degrees          50               58               79
  With Ph.D.                    13               39               9
  World Bank staff              9                0                8
  From developing country       95               100              57
  Modal occupation              Managers (54%)   Trainers (55%)   Operational and evaluation staff (75%)
Evaluation Results

Participant Reactions: Level 1 Evaluation (L1). At the end of each offering, participants were asked to give feedback on course content and design. IPE and the PEWs used a questionnaire that included six standard questions; participants rated each of the following aspects of the course on a scale of 1 (minimum) to 5 (maximum):

· Relevance of this course to your current work or functions?
· Extent to which you have acquired information that is new to you?
· Usefulness for you of the information that you have acquired?
· Focus of this course on what you specifically needed to learn?
· Extent to which the content of this course matched the announced objectives?
· Overall usefulness of this course?

The IPDET program used a different evaluation form from IPE and PEW. Participant responses on the overall quality of, and satisfaction with, the course were sought through the following four questions, the first two requiring scaled responses and the other two "Yes"/"No" responses:

· Likelihood of use of knowledge and skills acquired.
· Degree to which IPDET met your expectations.
· Would you recommend this program to a colleague?
· Would you return again for additional training?

Learning Achievement Tests: Level 2 Evaluation (L2). For IPE and PEW, course participants were given a short test comprising multiple-choice questions (test items) covering the course content. Test items were randomly assigned to two test forms, one administered before the course or module (the pre-test) and the other at the end (the post-test). The pre- and post-test data were matched for individual participants while ensuring participant anonymity. For IPDET, a 19-item pre-test and a 30-item post-test were administered to 56 and 64 participants, respectively.

Program Participants. PEW had the fewest female and World Bank participants, IPE had the fewest participants with advanced degrees, and IPDET had the fewest participants from developing countries. Participants' work varied considerably across groups, with IPE attracting more managers, PEW more trainers, and IPDET more operational and evaluation staff, reflecting differences in the target audience for each course (table 1).

1. Participants in all three programs were satisfied with the training, finding it useful and relevant and agreeing that it met the program objectives. All three programs received positive evaluation ratings, particularly for perceived program usefulness and relevance (table 2). Of the course offerings evaluated, between 86% and 92% of respondents rated the overall usefulness of the IPE and PEW courses as "4" or "5" on a scale of 1 to 5, with an average rating of 4.21 for IPE and 4.41 for PEW. Overall, 87% of IPDET respondents gave a rating of "4" or "5" to the question on the degree to which the program met their expectations, with an average of 4.1. This feedback exceeds the WBI quality benchmark of 85% for client and staff learning programs and is comparable with the performance of WBI core courses over the last few years; 85% of respondents rated the overall usefulness of the core courses as "4" or "5," with an average rating of 4.24. For IPE and the PEWs, the ratings were lowest on the extent to which the information was new to participants and on whether the courses focused on their learning needs.

Table 2. Respondents' average ratings on level 1 indicators for the three programs

  Course data                                             IPE     PEWs    IPDET (core course)
  Offerings                                               2       2       1
  Respondents                                             301     61      56
  Response rate                                           90%     95%     86%

  Level 1 indicators
  Relevance                                               4.16    4.53    --
  New information                                         3.75    3.90    --
  Useful information                                      4.21    4.41    --
  Focus on specific learning needs                        4.05    3.96    --
  Met objectives                                          4.31    4.10    --
  Overall usefulness                                      4.38    4.33    --
  Met expectations                                        --      --      4.10
  Likelihood of using the knowledge and skills acquired   --      --      4.60
  Would you recommend the program to a colleague?         --      --      98% Yes
  Would you return again for additional training?         --      --      96% Yes
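The level 1 summary statistics reported above reduce to two quantities per question: the mean rating and the share of respondents answering "4" or "5" on the five-point scale. A minimal sketch of that calculation in Python; the `summarize` helper and the sample responses are illustrative stand-ins, not actual WBIES data:

```python
def summarize(ratings):
    """Return (mean rating, top-two-box share) for a list of 1-to-5 ratings."""
    mean = sum(ratings) / len(ratings)
    # Share of respondents rating the question "4" or "5"
    top_two = sum(1 for r in ratings if r >= 4) / len(ratings)
    return round(mean, 2), round(top_two, 2)

# Hypothetical responses to "Overall usefulness of this course?"
overall_usefulness = [5, 4, 4, 5, 3, 4, 5, 4, 2, 4]
mean, share = summarize(overall_usefulness)
print(mean, share)  # 4.0 0.8
```

In this form, a benchmark such as "at least 85% rating '4' or '5'" is simply a threshold test on the second value.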
[Figure 4. Efficacy: learning in the three programs (average percent correct, matched respondents). IPE (2 offerings, 246 matched respondents): pre-test 41, post-test 56. PEWs (1 offering, 32 matched respondents): pre-test 39, post-test 68. IPDET core course (56 pre-test and 64 post-test respondents): pre-test 36, post-test 61.]

[Figure 5. Efficiency: learning gain per hour of instruction. IPE (distance learning) 0.96; PEWs 0.72; IPDET (core course) 0.31.]

2. Participants in all three programs learned about evaluation. The extent to which the three programs boosted participants' learning was measured through pre- and post-tests. The tests used in the three programs were not equivalent, but each was designed to capture its program's content. Post-test scores were higher than pre-test scores in all three programs, with the PEWs showing the largest gain (figure 4) and the IPE course showing the largest gain per instructional hour (figure 5). The statistically significant aggregate gains of 15% and 29%, respectively, for IPE and PEW exceed the average learning gain of 11% for WBI core courses in the past few years. For IPDET, the 25 percentage point gain falls within this range, but it comes from non-equivalent forms of the test and unmatched participants.
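The efficiency measure in figure 5 is the pre-to-post gain in average percent correct divided by the hours of instruction. A minimal sketch of the calculation, using the rounded values plotted in figures 2 and 4; because those plotted means are rounded, the recomputed IPE figure (0.94) differs slightly from the published 0.96:

```python
# Rounded bar-chart values: program -> (pre-test mean %, post-test mean %, hours)
programs = {
    "IPE": (41, 56, 16),
    "PEWs": (39, 68, 40),
    "IPDET (core)": (36, 61, 80),
}

for name, (pre, post, hours) in programs.items():
    gain = post - pre                  # percentage-point learning gain (figure 4)
    per_hour = round(gain / hours, 2)  # efficiency: gain per hour (figure 5)
    print(f"{name}: gain {gain} points, {per_hour} points/hour")
# IPE: gain 15 points, 0.94 points/hour
# PEWs: gain 29 points, 0.72 points/hour
# IPDET (core): gain 25 points, 0.31 points/hour
```

The same ordering emerges as in the Brief: PEWs show the largest absolute gain, while the distance-learning IPE course shows the largest gain per instructional hour.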
Implications for the World Bank

1. Distance learning (DL) methodology appears to be more efficient than non-DL formats in building basic monitoring and evaluation skills, but there may be a tradeoff between efficacy and efficiency in evaluation training. While the distance learning course produced smaller learning gains than the non-DL formats, its gains came from much less instructional time; thus DL may be more efficient than face-to-face instruction. Little can be said about cost-effectiveness, however, in the absence of unit costs comparably reported for each program. Future indicators will need to include unit cost measures.

2. Level 3 and 4 evaluations should be conducted as follow-up activities for the three programs to determine the sustainability and impact of the "increase in competencies" on individual and organizational performance.

About WBI

The World Bank Institute (WBI) works to build the capacity of its client countries for poverty reduction and sustainable development. It supports the World Bank's learning and knowledge agenda by delivering learning programs, providing policy services, facilitating action programs, supporting networks of professionals, and creating and managing initiatives for knowledge sharing. The WBI Evaluation Unit (WBIES) works with the Institute's program leaders and with sector managers Bank-wide to prepare, process, and report evaluation results for staff, client, and joint learning events. WBIES also offers distance learning and face-to-face training in program evaluation. WBIES Evaluation Briefs report on the evaluation results, lessons learned, and impact of the Institute's major offerings.

Contacts

WBI Evaluation Briefs: Heidi S. Zia, Evaluation Officer, 202.458.0853 or Hzia@worldbank.org; Shobha Kumar, Program Officer, 202.458.7021 or Skumar1@worldbank.org
IPE Program: William Eckert, Senior Evaluation Officer, 202.458.1584 or Weckert@worldbank.org
PEWs: Marie-Aline Wood, Consultant, 202.473.3877 or Mwood@worldbank.org
IPDET Program: Ray Rist, Senior Evaluation Officer, 202.458.5625 or Rrist@worldbank.org

Visit our web site for more information on the Evaluation Unit of WBI and for electronic copies of all Evaluation Briefs: http://www.worldbank.org/wbi/evaluation/index.html
Sponsors of the Three Programs

IPE, PEWs, and IPDET were designed and implemented by World Bank staff and consultants associated with OED, OEG, and WBI. Present responsibility for IPE and PEW lies with WBI, and for IPDET with OED. Staff from all units have taught in all programs, and the programs share materials to better meet the needs of their target audiences.