40208 Evaluation of FY02-03 WBI Thematic Programs: Urban and City Management Program Jaime B. Quizon WBI Evaluation Studies EG05-109 The World Bank Institute The World Bank Washington, D.C. July 2005 ACKNOWLEDGMENTS The World Bank Institute (WBI) prepared this report under the overall guidance of Nidhi Khattri, Acting Manager, WBI Evaluation Group. Jaime Quizon of the Institute Evaluation Group is the principal author. The author thanks Richard Tobin, William Eckert, and Evangeline Kim Cuenco for reviewing the study and offering suggestions for its improvement. The author also thanks Shreyasi Jha for her assistance with the statistical analysis and Humberto S. Diaz for his help with formatting. The Institute Evaluation Group produces WBI Evaluation Studies to report evaluation results for staff, client, and joint learning events. An objective of the studies is to publish findings quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this study are entirely those of the author and do not necessarily represent the view of the World Bank Group. WBI Evaluation Studies are available online at: http://web.worldbank.org/WBSITE/EXTERNAL/WBI/0,,contentMDK:20252874~pageP K:209023~piPK:335094~theSitePK:213799,00.html Vice President, World Bank Institute Frannie Léautier Manager, Institute Evaluation Group Richard J. Tobin Task Manager Jaime B. Quizon ii ACRONYMS AND ABBREVIATIONS AFR Africa ASTD American Society for Training and Development CRS Client Registration System EAP East Asia and Pacific ECA Europe and Central Asia FGD Focus Group Discussion FY02-03 Fiscal year 2002-2003 K&S Knowledge and Skills LAC Latin America and the Caribbean MNA Middle East and North Africa NGOs Nongovernmental organizations OLS Ordinary least squares SAR South Asia TTLs Task team leaders UCMP Urban and City Management Program WB World Bank WBI World Bank Institute WBIEG World Bank Institute Evaluation Group iii iv TABLE OF CONTENTS ACKNOWLEDGMENTS.............................................................................. ii ACRONYMS AND ABBREVIATIONS................................................................iii TABLE OF CONTENTS............................................................................... v EXECUTIVE SUMMARY............................................................................ vii 1. INTRODUCTION................................................................................. 1 Evaluation Context and Objectives ............................................................................. 1 The Urban and City Management Program: Background........................................... 1 Evaluation Questions................................................................................................... 2 2. EVALUATION METHODS, DESIGN, AND RESULTS......................................... 4 Participant Survey........................................................................................................ 4 Focus Group Discussion and Participant Interviews................................................... 5 Informal Interviews with UCMP Staff and Relevant WB Country Staff.................... 6 Results.......................................................................................................................... 6 Relevance of the Learning Events............................................................................... 
8 Activity Effectiveness.................................................................................................. 9 Participants' Use of Awareness, Knowledge and Skills (K&S) from the UCMP Activity ...................................................................................................................... 11 Follow-Up Activities with UCMP Participants......................................................... 14 Determinants of Activity Effectiveness and Participants' Use of K&S.................... 15 3. CONCLUSIONS AND RECOMMENDATIONS................................................. 22 APPENDICES........................................................................................ 25 Appendix A. List of FY02-03 UCMP Learning Events........................................... 27 Appendix B. UCMP Evaluation: Questions and Data for Answering these Questions by Data Source................................................................... 28 Appendix C. World Bank Institute (WBI) UCMP Evaluation Questionnaire.......... 29 Appendix D. Participant Survey: Evaluation Design and Sampling........................ 35 v vi EXECUTIVE SUMMARY This report analyzes the relevance and effectiveness of WBI's Urban and City Management Program (UCMP) learning activities. It also examines participants' use of the awareness, knowledge, and skills that they obtained from attending UCMP learning activities. A key aim is to recommend ways to improve the impacts of UCMP interventions. The study uses data drawn from a survey of FY02-03 participants in UCMP learning activities. In all, 141 participants, representing 17 UCMP FY02-03 activities responded. Their responses are supplemented by a focus group discussion with participants and discussions with randomly selected participants, with UCMP staff responsible for delivery of these learning activities, and with relevant World Bank (WB) country operations staff. Participants rated the relevance, effectiveness, and impacts of the UCMP learning events they attended as positive as and slightly higher than similar average FY02 ratings for WBI-wide learning activities. Participants' use of the knowledge and skills they acquired from the UCMP learning activities is also positive, but unexceptional. Compared with similar non-UCMP learning events offered in their countries, knowledgeable UCMP alumni rate the usefulness of the UCMP activity they attended as marginally better. The most important determinant of the use of the knowledge and skills that participants acquire from the UCMP learning activity is its effectiveness, as rated by participants. Several participant and activity factors help boost effectiveness. Effectiveness improves when: (a) the learning activity is designed for the participant's country; (b) participants develop an action plan during the learning activity; (c) the level of the participant's proficiency in the technical terminology used in the learning event is high; (d) the participant is from a low-income country; and, (e) the participant is from the Africa region. Net of activity effectiveness, other activity features affect participants' use of the knowledge and skills from the UCMP event. 
Aggregate use is significantly higher when: (a) the participant had discussions after the learning activity of issues raised in the activity; (b) participants are provided with the contact information of other participants (including event presenters/facilitators) for networking purposes; and, (c) the participants' work environment (e.g., work procedures, colleagues, incentive system, resources, and so on) and country-related factors (e.g., country policies, social groups, political groups, readiness for reform, and so on) are rated by participants themselves as vii helpful for reform. Also, participants in Latin America and the Caribbean (LAC), East Asia (EAP), Africa (AFR), and Europe and Central Asia (ECA) regions are more likely to report significantly higher aggregate use, operational use, and academic use than their counterparts from the South Asia (SAR) region. viii 1. INTRODUCTION Evaluation Context and Objectives 1.1 The World Bank Institute (WBI) and its partners deliver learning opportunities in 15 sector or thematic programs to policymakers, academics, business and community leaders, professionals and other civil society stakeholders to foster capacity development for poverty reduction and sustained development. The goal is to build the required individual and institutional capacities in the World Bank's client countries through the dissemination of the Bank's global knowledge and expertise in the key areas of environment and sustainable development, poverty reduction and economic management, finance and private sector development, and human development. 1.2 The WBI Evaluation Group (WBIEG) first assessed the impacts of six of WBI's 15 thematic and sector programs in FY02. These six included the largest of the WBI programs at the time in terms of the number of participants and learning events offered. These thematic evaluations followed a common evaluation design and made use of a standard questionnaire. In FY04, WBIEG initiated similar impact evaluations for four additional WBI thematic programs: Poverty and Growth; Community Empowerment and Social Inclusion; Social Protection; and Urban and City Management. This report represents the FY04 evaluation of the Urban and City Management Program (UCMP). 1.3 The aims of this evaluation are: · To assess the relevance and effectiveness of the UCMP's learning activities; · To examine participants' use of the awareness, knowledge, and skills they obtained from attending UCMP learning activities; · To recommend ways to improve the UCMP's relevance, effectiveness, and usefulness. The Urban and City Management Program: Background 1.4 The UCMP recognizes that a demographic transition is taking place worldwide. The world will be an urbanized planet before 2020. The growth of cities can help the world achieve the Millennium Development Goals, as urban areas have the great potential for accelerating innovations and capacity building. For this to happen, however, urban growth needs more effective institutions and urban management. 1 1.5 The UCMP offers learning activities to enhance the capacity of cities ­ by strengthening national and local policymaking capacities and regulatory systems and improving national and local academic and training establishments ­ to meet the challenges of urbanization. These challenges include a better climate for business, improved governance, enhanced public services, quality infrastructure, and an overall metropolitan environment that is economically competitive and resilient to changing global demands. 
1.6 In FY02, UCMP was one of four components of WBI's Public Finance, Decentralization and Urban Program. In FY03, WBI launched UCMP as a separate thematic program. In FY 02-03, some 1,050 participants participated in UCMP events. The program delivered 17 learning events of longer than one-day duration to participants from 52 countries. This study is focused on these participants and learning events.1 Appendix A lists these learning events. Evaluation Questions 1.7 This evaluation addresses the following four questions:2 a) How relevant are UCMP learning events? b) What is the effectiveness of these learning events?3 c) What are the impacts of these UCMP learning activities?4 d) What features of these learning events are related to better effectiveness and impacts? 1.8 Appendix B provides further details about these questions. It also identifies the main sources of data used. In addition to answering these questions, this study investigates whether UCMP brings any unique and/or special elements to its learning 1 UCMP does more than organize and deliver learning events. This group also publishes books and other documents, coordinates related learning activities within regions, compiles and maintains useful databases, builds and/or maintains networks of practitioners, and promotes new areas of research. This evaluation focuses on the UCMP learning events. 2 This UCMP evaluation is part of a larger evaluation focused on four WBI thematic programs. As such, it makes use of a common evaluation design (including a standard questionnaire) that, although economical, limits the kinds of questions (and analysis) of interest that pertain specifically to the UCMP. For example, this evaluation does not answer specifically whether the UCMP has accomplished its objectives because it is difficult to accommodate this program-specific question within an evaluation design (and questionnaire) that is generic. 3 Effectiveness is defined in terms of raising participants' awareness, increasing their knowledge, helping them develop strategies to address country needs, and assisting them with developing contacts and networking with others in the field. 4 Impact is measured by the degree to which the activity influenced or led to changes in the areas of research, training, legislation and regulation, and country development strategies. Impact measures the contribution of WBI learning programs/activities to building in-country capacity. 2 events. Finally, it also examines whether there are in-country factors that enable or hinder the UCMP's effectiveness and impact. 3 2. EVALUATION METHODS, DESIGN, AND RESULTS 2.1 This evaluation involved three related information collection methods: (a) a survey of FY02-03 UCMP participants; (b) a focus group discussion (FGD) and interviews with select participants; and, (c) conversations with UCMP staff in WBI and relevant WB country staff in Indonesia and in Washington, D.C. Participant Survey 2.2 We conducted a survey to collect information from FY02-03 UCMP participants. We field tested the seven-page instrument on participants before we launched the survey.5 Appendix C includes the final English version of the survey questionnaire. This survey, which collected participants' ratings of the relevance, effectiveness, and impact of WBI activities, is the primary data source used to answer the evaluation questions in Appendix B. 
2.3 There are certain limitations to the use of the survey and of participant ratings as indicators of the relevance, effectiveness, and impact of a learning activity. For one, this approach provides only one perspective, that of the participant. The views of other key stakeholders (e.g., the UCMP learning providers, the organizations to which participants belong, the WB country staff) are not captured by the survey or by these measures. Also, this approach assumes that participants are able to recall past events and to attribute specific impacts to a learning activity. These efforts become more challenging the longer the elapsed time since the learning event.

2.4 To mitigate these difficulties, we complemented the participant survey with other data-gathering approaches (discussed below) that bring other relevant views into this evaluation. We also limited the learning activities covered by this evaluation to UCMP events delivered 6 to 18 months prior to the survey.

2.5 Appendix D describes the participant survey in detail. Table 1 summarizes the distribution of the sampled UCMP respondents by country. As shown, only 141, or 33.5 percent, of the 421 sampled participants completed the survey.5 This is despite having made at least three attempts (by e-mail, phone, or fax) to contact them. There are a number of reasons for this relatively low survey response rate.6 In general, the survey response rates are lowest in countries where we did not hire local consultants to administer the questionnaire and where additional follow-up is therefore more difficult. Even in countries where we had hired local consultants, a common complaint was the wrong (or outdated) participant contact addresses provided in WBI's Client Records System (CRS). Local survey consultants had to find new contact addresses for several participants. When successfully contacted, however, participants themselves, even high-level officials, were willing to complete the surveys.

5 We translated the survey questionnaire to the local language. To assure that the survey questions had the same meaning in all countries, we did either of two things where possible: (a) asked a native speaker to translate the English version of the questionnaire to the local language and then asked another native speaker to translate this back to English; or (b) asked two or more separate translators to work cooperatively in translating the English version to the local language. For the survey, we used five languages in all.

Table 1. UCMP Evaluation: Distribution of Sampled FY02-03 Participants, by Country

Country         Population   Sample Size   No. of Respondents   Response Rate (%)
Brazil               130         130              19                 14.62
Burkina Faso           5           5               4                 80.00
China                 26           6               1                 16.67
Ethiopia               6           6               3                 50.00
Guatemala              8           7               1                 14.29
India                 23          23              19                 82.61
Indonesia             45          40              28                 70.00
Kenya                 22          21               6                 28.57
Russia               144         121              20                 16.53
Thailand              10           9               9                100.00
Others               349          51              31                 60.78
TOTAL                768         419             141                 33.65

2.6 The difficulty with reaching UCMP graduates suggests the strong likelihood of their exclusion from any follow-up learning or other activities stemming from the WBI-led learning event. Clearly, UCMP's task team leaders (TTLs) need to improve their collection and updating of participant data considerably, particularly in the context of "country focus," where the goal is capacity development through longer-term relationships with key local participants. A better CRS is important if the UCMP is to maintain useful and continued engagement with its alumni.
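For readers reproducing Table 1, the per-country response rates are simple ratios of respondents to sampled participants, and the overall rate is the pooled ratio rather than the average of the country rates. A minimal sketch of this arithmetic, using only the figures reported in the table, is shown below; the data frame layout is illustrative.

```python
import pandas as pd

# Figures taken from Table 1: sample size and number of respondents by country.
table1 = pd.DataFrame(
    {"sample": [130, 5, 6, 6, 7, 23, 40, 21, 121, 9, 51],
     "respondents": [19, 4, 1, 3, 1, 19, 28, 6, 20, 9, 31]},
    index=["Brazil", "Burkina Faso", "China", "Ethiopia", "Guatemala", "India",
           "Indonesia", "Kenya", "Russia", "Thailand", "Others"],
)

# Country-level response rates in percent, e.g. Brazil: 19/130 = 14.62 percent.
table1["response_rate_pct"] = 100 * table1["respondents"] / table1["sample"]

# Overall rate: total respondents over total sample, not the mean of country rates.
overall = 100 * table1["respondents"].sum() / table1["sample"].sum()
print(table1["response_rate_pct"].round(2))
print(round(overall, 2))  # 33.65
```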
Focus Group Discussion and Participant Interviews 2.7 The aims of the FGD were to: (a) verify and elaborate on answers to evaluation questions from the survey; (b) obtain more information that may have been omitted from 6In contrast, similar WBI participant surveys in 12 countries (for WBIEG's recently completed country- focused evaluations) had an average response rate of nearly 70 percent. We do not know the extent of any statistical biases that could arise on account of our low response rates. 5 the survey; and, (c) assess the comprehensiveness of the coverage of the participant survey in addressing the relevance, effectiveness, and impacts of WBI activities. We conducted one face-to-face FGD with five Indonesians.7 This FGD lasted for about two and one-half hours and involved a local facilitator and a notes keeper. We also interviewed 10 Brazilian participants via telephone because of difficulties encountered with organizing a FGD.8 Informal Interviews with UCMP Staff and Relevant WB Country Staff 2.8 We interviewed some Washington-based UCMP staff, including some of their contacts/partners in the field. We also interviewed seven WB country staff in Washington, D.C. and in Indonesia who are familiar with UCMP activities. These were unstructured, informal interviews. These conversations were intended to get participant views about the UCMP program and to learn about the program to be assured that nothing substantial was overlooked. Results 2.9 Table 2 shows the distribution (unweighted) of the FY02-03 UCMP survey respondents according to a number of characteristics.9 As shown, the majority of participants were male (58 percent) and worked in government, i.e., mainly in provincial/local governments (54 percent) ­ given the nature of the thematic learning activity ­ rather than in national/central bureaucracies (11 percent). In terms of their job positions at the time of the survey, 46 percent of participants reported having the highest or senior level positions (minister, mayor, full/associate professor, department head, senior researcher), 35 percent claimed middle-level positions (program manager, project leader, assistant professor, technical expert), and the remaining 18 percent recorded junior or entry-level positions. Participants' primary work areas were in management or administration (41 percent), policymaking or legislation (23 percent), and research or teaching (20 percent). In all, UCMP learning events seem to attract a broad and balanced spectrum of participants. 7Our selection of the site for the FGD was not random. We selected Indonesia for the FGD because an ongoing evaluation there made it economical to piggyback on this related activity. Twelve randomly selected participants in this UCMP learning activity, who were residents of Bandung, Indonesia, were invited to the FGD. 8The WBI participants were from different parts of Brazil, so it was thus difficult to get them (even via videoconference) to attend a FGD. The one-on-one interviews via telephone proved to be equally fruitful. 9These participant characteristics may be important variables explaining respondent ratings of the relevance, effectiveness, and usefulness of UCMP learning events. They are used as explanatory variables in our regression analysis shown later. 6 Table 2. 
Urban and City Management Program: Characteristics of Sampled Respondents

Participant Characteristic                                Distribution in %
Gender
  Female                                                        41.8
  Male                                                          58.2
Work Organization
  University/research institution                               18.7
  NGO, not-for-profit                                            7.9
  Private sector                                                 7.9
  National/central government                                   10.8
  Provincial government                                         10.8
  Local/municipal government                                    43.2
  Other                                                          0.7
Primary Type of Work
  Research                                                      12.3
  Policymaking/legislation                                      22.5
  Management/administration                                     41.3
  Teaching                                                       7.8
  Provision of services (e.g., financial, health, etc.)          7.8
  Other                                                          7.8
Level of Position Held
  Highest                                                        8.1
  Senior                                                        38.2
  Middle                                                        35.3
  Junior/entry                                                  18.4

2.10 Sixty-nine percent of survey respondents rated themselves as proficient (i.e., they gave themselves a 6-7 on a 7-point scale) in the language of instruction of the WBI learning event. Sixty-seven percent rated themselves as proficient in the technical terminology used in the learning event.

2.11 Without any points of reference, it is difficult to form an opinion on whether these proficiency ratings, or any other estimates from the survey, are high, average, or low. For this reason, we use the results from a similar, albeit much larger, survey of FY02 participants in WBI activities in 12+ countries (henceforth referred to as the multicountry survey) as benchmarks against which we compare our key UCMP results.10 According to the larger survey, 69 and 59 percent of participants rated themselves as proficient (i.e., 6-7 on a 7-point scale) in the language of instruction of the learning event and in the technical terminology used in the learning event, respectively. This suggests that UCMP participants' proficiencies in both the language of instruction and the technical terminology used in the course are similar to WBI-wide averages.

10 This multicountry survey covers 1,010 FY02 participants in several WBI thematic programs. This is a useful benchmark for comparing how well the UCMP is doing vis-à-vis all WBI learning activities.

2.12 Several participants in the focus group discussion and in the interviews noted that they had difficulties following the UCMP courses they attended. The majority of these courses were delivered in English. A large number of participants were from regional (as opposed to central) localities, so their English and other skills, even with the help of local facilitators, were not as strong as those of participants from central/main offices and were not always sufficient for them to follow all the material presented and discussed in these learning events. Indeed, the conduct of the FGD and one-on-one interviews with UCMP participants required local interpreters. This suggests that some participants may have encountered language difficulties during the UCMP learning events.

RELEVANCE OF THE LEARNING EVENTS

2.13 UCMP survey respondents gave high overall ratings for the relevance of the learning events they attended. On a 7-point scale (where 1 = "not relevant" and 7 = "extremely relevant"), they rated the events' relevance to their own work and to their country's needs as 5.4 and 5.6, respectively. From the larger multicountry survey, the comparable figures are also 5.4 and 5.6, respectively.

2.14 More than half (or 56 percent) of participants noted that the UCMP activity they attended was designed specifically for their country.
This is a useful benchmark against which the focus and impacts of upcoming UCMP learning events, under WBI's new country-focus capacity development strategy, might be compared.11

11 We note that a higher percentage is not necessarily more desirable currently. However, WBI managers may desire to use this figure as one of their performance indicators for WBI's country-focus strategy.

2.15 Relevance might also be gleaned from how respondents answered the question: "Since the learning activity, have you discussed the issues raised in the activity at work, with local partners, government officials, NGOs or with the media?" Again, with options ranging from 1 = "never discussed" to 7 = "thoroughly discussed," participants' average response to this question was 4.9 (compared with 4.8 from the multicountry survey). In all, some 36 percent of respondents answered a 6 or 7 to this question, and another 35 percent responded with a 5 (the corresponding multicountry survey figures are 36 and 28 percent, respectively).

2.16 We need to be cautious when attempting to draw broad conclusions from the FGD and interviews because these were limited to two countries (Indonesia and Brazil) only. In the FGD, participants agreed that the new materials discussed in their WBI events were relevant, although not all materials were of equal relevance to their jobs. FGD participants noted the lack of local case studies that could have added better context to the issues discussed in the learning event. Again, the question of participant selection emerged as a key issue here. Although attempts were made to invite participants facing issues related to the learning event, not all topics were relevant for the participants invited. Nonetheless, these topics may have been relevant for others. In sum, while UCMP graduates gave high overall ratings for the relevance of the learning events they attended, these ratings are not different from similar WBI-wide (or multicountry) ratings. These UCMP relevance ratings are also not exceptional when compared with the WBI benchmark of 85 percent of training participants providing a rating of "4" or "5" (on a 5-point scale) on Level 1 evaluations of activity relevance and effectiveness.12

2.17 These initial UCMP measures of activity relevance are useful benchmarks against which upcoming country-focused WBI interventions might be assessed. As noted, there is ample room for UCMP to improve on these baseline scores under WBI's country-focus strategy.

ACTIVITY EFFECTIVENESS

2.18 We asked respondents a series of questions to obtain some measure of the "effectiveness" of the learning event. We asked them to self-rate, again on a seven-point scale, the impacts of their participation in a UCMP learning activity in six areas: raising awareness and understanding of the development issues; providing the essential knowledge or skills; enhancing understanding of their role as agents of change in their country's development; providing approaches and strategies for addressing their country's development needs; offering approaches for addressing the needs of their organization; and introducing respondents to others also interested in the UCMP learning activity.13

2.19 Figure 1 shows that respondents rated UCMP activities above the mid-rating (of 4) in all six areas of effectiveness. The range of these ratings varies from "not effective at all" (= 1) to "extremely effective" (= 7).
On average, participants rated UCMP activities as follows: raising individual awareness of their country's development needs (= 5.5 vs. 5.3 for the multicountry survey); providing essential knowledge or skills (= 5.5 vs. 5.5); enhancing understanding of their role in their country's development (= 4.8 vs. 5.1); offering approaches for addressing the needs of their organization (= 5.3 vs. 4.9); providing approaches and strategies for addressing their country's development needs (= 5.2 vs. 5.0); and offering networking opportunities (= 5.1 vs. 5.0).

12 For a learning activity to be regarded as satisfactorily relevant and effective, WBI uses an American Society for Training and Development (ASTD) benchmark that 85 percent of training participants provide a rating of "4" or "5" (on a 5-point scale) on Level 1 questions about the activity's relevance and effectiveness. A WBI Level 1 evaluation is a participant survey that is done immediately after the completion of the learning activity.

13 The response options ranged from "not effective at all" (= 1) to "extremely effective" (= 7). "Not applicable" was also an option. We note that these are desired outcomes of any learning activity. Not all WBI learning events include these outcomes.

Figure 1. UCMP: Mean ratings of effectiveness of the activity, by area of effectiveness
[Figure: mean ratings for the six areas of effectiveness (raising awareness/understanding of development issues; providing knowledge or skills; helping understand role as agent of change; developing strategies/approaches for organizational needs; developing strategies/approaches for country needs; developing contacts/partnerships and building coalitions in the field).]
Note: Seven-point rating scale used: 1 = "not effective at all"; 7 = "extremely effective."

2.20 The FGD and interviews generally support these results. The majority of participants in the FGD and interviews noted that the activity raised their awareness of the many dimensions of (and corresponding strategies for) reducing urban afflictions, particularly new useful paradigms and innovative experiences in tackling these issues. It offered them a comprehensive perspective for resolving urban development concerns. Improved networking opportunities were also mentioned as something that the UCMP event provided. According to FGD participants, even though many of the attendees in the learning event belonged to the same institution or had met previously because of their shared job interests, the UCMP event allowed them to establish familiarities that facilitated their meetings and/or communications after the event.

2.21 As a measure of aggregate effectiveness of UCMP activities, we averaged participants' scores across the individual effectiveness items. The UCMP composite measure of effectiveness is 5.3 (vs. 5.1 for the multicountry survey). Also, some 50 percent of respondents rated these activities a 6-7 on this aggregate measure.14 In contrast, when compared with the WBI-wide benchmark of 85 percent of training participants providing a rating of "4" or "5" (on a 5-point scale) on Level 1 evaluations of activity relevance, effectiveness, and impact, these estimates appear to be unsatisfactory.15

14 These survey-derived measures make useful baselines for assessing future UCMP performance.
15 See footnote 12.
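Paragraph 2.21 describes the composite effectiveness measure as the simple average of the six effectiveness items; the same construction is used later for the composite "use" measure (paragraph 2.27). A minimal sketch of this scoring is shown below, with hypothetical column names, since the report does not describe the underlying dataset layout. Whether the "rated a 6-7" share is computed on a rounded or unrounded composite is not stated in the report; the sketch uses a simple cutoff at 6.

```python
import pandas as pd

# Hypothetical column names for the six effectiveness items (7-point scale),
# mirroring the areas listed in paragraph 2.18.
EFFECTIVENESS_ITEMS = [
    "raise_awareness", "provide_knowledge_skills", "role_as_change_agent",
    "org_strategies", "country_strategies", "networking",
]

def composite_effectiveness(df: pd.DataFrame) -> pd.Series:
    """Average a respondent's ratings across the six items, ignoring
    'not applicable' answers (coded here as missing values)."""
    return df[EFFECTIVENESS_ITEMS].mean(axis=1, skipna=True)

def share_highly_effective(df: pd.DataFrame) -> float:
    """Share of respondents whose composite score falls in the 6-7 range."""
    comp = composite_effectiveness(df)
    return (comp >= 6).mean()

# Example with made-up ratings for three respondents:
survey = pd.DataFrame(
    [[6, 7, 5, 6, 6, 7], [5, 5, 4, 5, 5, 5], [7, 6, 6, 7, 6, 6]],
    columns=EFFECTIVENESS_ITEMS,
)
print(composite_effectiveness(survey).round(2).tolist())  # [6.17, 4.83, 6.33]
print(round(share_highly_effective(survey), 2))           # 0.67
```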
Figure 2. UCMP: Distribution of respondents scoring WBI activity as "Highly Effective" (6-7), by area of effectiveness
[Figure: percentage of respondents giving a rating of 6-7, for the same six areas of effectiveness as in Figure 1.]

PARTICIPANTS' USE OF AWARENESS, KNOWLEDGE AND SKILLS (K&S) FROM THE UCMP ACTIVITY

2.22 Similar to their ratings of activity effectiveness, participants rated the change brought about by the UCMP activity they attended as positive but not extraordinary. To the question: "How would you rate the change - brought by the activity - in the main topic or issue it addressed?" survey participants responded with an average rating of 5.6 (again based on a 7-point scale), with 51 percent of them responding with a 6-7. These figures are similar to the aggregate effectiveness ratings noted earlier (i.e., 5.3 and 50 percent, respectively).

2.23 How did participants use the K&S that they gained from the UCMP activity that they attended? Again, based on a 7-point scale where 1 = "not at all" and 7 = "very often,"16 the survey asked respondents to rate how often they had used what they had acquired from the learning event in seven specific areas.17 With the actual range of these participant ratings varying from 1 to 7, on average, UCMP respondents rated their use of K&S from the learning activity they attended as follows (see Figure 3): conducting research (= 4.9 vs. 4.6 for the multicountry case); teaching (= 4.9 vs. 4.6); raising public awareness in development issues (= 5.2 vs. 5.0); implementing new practices within their work organization (= 5.1 vs. 4.7); organizing collective initiatives (= 4.7 vs. 4.4); influencing legislation and regulation (= 4.4 vs. 4.0); and implementing country development strategies (= 4.4 vs. 4.3). What might explain these results?

16 "Not applicable" was also an option.
17 Participant ratings are for activities that had taken place 6-18 months prior to the survey.

Figure 3. UCMP: Mean ratings of frequency of use of K & S obtained from the WBI activity, by purpose of use
[Figure: mean ratings of frequency of use for seven purposes: conducting research; teaching; raising public awareness; implementing new practices in work organization; organizing collective initiatives; influencing legislation and regulation; and implementing country development strategies.]
Note: Seven-point rating scale used: 1 = "not at all"; 7 = "very often."

2.24 In the FGD and interviews, participants noted several factors. Even if they wanted to implement what they had learned, they are constrained by the structure of decision making in their organizations and local bureaucracies. The majority felt that the awareness of development issues of their own superiors should be raised for any institutional change to take root and flourish. Given the hierarchical structure typical of most bureaucracies, the strong need for consensus at the top, and the importance of unambiguous guidance for lower level officials before anything new can happen, the careful selection of participants in focused UCMP events is vital. Participants suggested that UCMP's selection of participants should consider local policymaking and implementation conditions for the learning program to have greater influence on institutional change.
2.25 A majority of interviewed participants also noted that the UCMP learning event they attended was more effective for reflection and analysis than for immediate use in specific urban contexts. What they learned from the event was to be applied over time, with the gradual raising of awareness and the building of collective initiatives.

2.26 Finally, participants in the FGD and interviews also mentioned the problem of work overload. There is only a thin layer of qualified high- and middle-echelon officials in many local institutions, particularly in smaller municipalities and in remote regions. On this account, these officials are usually overworked and overburdened with routine work, which makes it difficult to initiate change. The same FGD participants and interviewees also realize, however, that many UCMP learning events aim, correctly, to mitigate this undesirable situation.

2.27 As a summary measure of use of K&S acquired from the UCMP event, we created a composite measure of "use" as the average of the seven specific areas listed in the survey questionnaire. This aggregated measure of "use" averaged 4.8, with some 40 percent of respondents rating a 6-7 on this indicator. Figure 4 summarizes the proportion of participants who rated a 6-7 for each component of this "use" indicator.

Figure 4. UCMP: Distribution of respondents scoring K & S acquired from the WBI activity as "Highly Used" (6-7), by purpose of use
[Figure: percentage of respondents giving a rating of 6-7, for the same seven purposes of use as in Figure 3.]

2.28 The survey asked participants to rate the degree to which the activity influenced or led to changes in the same areas of use listed for the previous question. Based again on a 7-point scale, where "negative influence" = 1, "neither helped nor hurt" = 4, and "greatly helped" = 7, survey respondents reported positive, albeit low, ratings, as follows (see Figure 5):18 research (= 5.8 vs. 5.5 for the multicountry survey); teaching (= 5.8 vs. 5.5); public awareness and development issues (= 4.9 vs. 5.6); new practices within your work organization (= 5.5 vs. 5.4); collective initiatives (= 5.3 vs. 5.2); legislation and regulation (= 5.0 vs. 5.0); and country development strategies (= 4.2 vs. 5.4). The simple average for the aggregate of these components is 5.2 (vs. 5.4 for the multicountry survey), or just below the midpoint between "neither helped nor hurt" and "greatly helped." Although positive, these findings are not comforting.

18 Note that for this question, the neutral to positive ratings range from "neither helped nor hurt" = 4 to "greatly helped" = 7. Anything below a 4 is a negative rating.

Figure 5. UCMP: Mean ratings of WBI activity-led changes, by area of influence
[Figure: mean ratings for seven areas of influence: research; teaching; raising public awareness; new practices in work organization; collective initiatives; legislation and regulation; and country development strategies.]
Note: Seven-point rating scale used: 1 = "negative influence"; 4 = "no influence"; 7 = "positive influence."

2.29 Participants in the FGD and interviews suggested the use of more action-learning exercises tied directly to their work programs.
This might include preparation by participants themselves before the training (e.g., requesting participants to bring specific problems they face into the learning activity) and follow-up activities on the action plans developed by participants during the learning course. Direct application of newly acquired K&S through team-based exercises was also suggested, for instance, requiring participants to produce a plan (or targeted outputs) for public presentation, to their regional bosses or other responsible authorities, upon completion of the UCMP course, with more UCMP-directed follow-up/guidance activities afterwards. Several UCMP events already have these action-plan exercises, even more so than the rest of WBI. From the survey, 62 percent of respondents (vs. 45 percent for the multicountry survey) noted that the WBI activity they attended had action-learning components. Of these respondents, 73 percent (vs. 68 percent for the multicountry survey) noted that they used part or all of the plan/strategy they developed in the learning activity in their work.

2.30 Lack of time for discussion during the learning activity was a key complaint of participants in the FGD and interviews. Participants suggest organizing UCMP events around more focused themes and/or having more follow-up activities (workshops) after the main WBI-led event to explore relevant topics in greater depth.

2.31 UCMP is not the only urban management program in the many countries with UCMP graduates. Indeed, UCMP participants in our FGD and interviews note the need for WBI to coordinate with other donors/partners who provide similar, albeit less well-structured, urban/city management training. Additionally, 48 percent of survey respondents reported that they had participated in learning events offered by other organizations that were similar to the UCMP activity they attended. These participants rated the usefulness of this UCMP learning activity relative to the non-WBI event they attended (again on a 7-point scale where "much less useful" = 1, "about the same" = 4, and "much more useful" = 7) a 4.9. Thus, UCMP-sponsored events appear, on average, to be somewhat more useful to participants than the learning activities offered by others. This slight relative advantage of the UCMP effort, however, is not heartening.

FOLLOW-UP ACTIVITIES WITH UCMP PARTICIPANTS

2.32 The difficulties with contacting past graduates of UCMP events for this evaluation already attest to the problems with any follow-up activities. Additionally, only 23 percent of respondents in the UCMP participant survey (28 percent for the multicountry survey) reported that the WBI program contacted them for follow-up issues regarding the activity. A much smaller 8 percent (12 percent for the multicountry survey) reported having contacted UCMP themselves for follow-up issues or for questions on the content of the learning activity.
A majority of these staff members raised as a key concern the lack of communication/coordination between them and the WBI's TTLs who, at times, deal directly with Indonesian institutions and/or participants. To scale up in-country impacts, any future WBI country-focused capacity-development initiatives should more closely involve resident Bank staff in the selection, design, delivery, and follow-up of these learning opportunities to assure that these are aligned with the Bank's overall country assistance strategy. Better WBI-wide record keeping methods and follow-up efforts with local institutions/individuals are also vital because capacity development is not likely to result from a single WBI learning event. Impacts at the organizational and institutional levels require sustained and targeted efforts. DETERMINANTS OF ACTIVITY EFFECTIVENESS AND PARTICIPANTS' USE OF K&S 2.34 What factors explain the effectiveness of UCMP learning events? What are the determinants of a participant's use of the K&S acquired from these activities? We use a two-stage least squares regression model to answer these questions, where the first stage initially explains what determines activity effectiveness. The second stage relates these estimates of activity effectiveness and other variables to participants' ratings of their use of K&S gained from the learning activity. 2.35 We first define an aggregate measure of effectiveness as a composite variable, i.e., the average of participants' ratings across the six areas of effectiveness, namely, effectiveness with: a) raising individual awareness of their country's development needs; b) providing essential knowledge or skills; c) enhancing understanding of their role in their country's development; d) offering approaches for addressing the needs of their organization; e) providing approaches and strategies for addressing their country's development needs; and, f) offering networking opportunities. 2.36 We note here that we also tried decompositions of this aggregate measure to define more focused effectiveness indicators, namely: academic effectiveness (the average of items (a), (b), and (c) above); operational effectiveness (the average of items (d), (e), and (f) above). These component indicators, however, did not fare any better as 15 dependent variables in first-stage regressions compared with the single aggregate measure of effectiveness. 2.37 We also define separate measures of a participants' use of their acquired K&S from the activity. The first is aggregate use: a composite variable that is the average of a participant's ratings across the seven areas of possible use, namely, use in: a) conducting research; b) teaching; c) raising public awareness in development issues; d) implementing new practices within his/her work organization; e) organizing collective initiatives; f) influencing legislation and regulation; and, g) implementing country development strategies. 2.38 We also decomposed this aggregate measure of use into the following components: academic use (the average of items (a), (b) and (c) above); operational use (the average of items (d) to (g)). 2.39 We explored several variables that might explain activity effectiveness and use. All these independent variables, as explained in Box 2.1. are all from the participant survey, unless otherwise noted. 
Box 2.1: Regression Variables

Activity-level variables:
· face-to-face = 1 if the activity attended was delivered face-to-face (as opposed to via videoconferencing sessions), = 0 otherwise;
· duration = actual number of days of the learning activity (from CRS data);
· designed for country = 1 if the learning activity was thought by the respondent to be designed specifically for his/her country, = 0 otherwise;
· activity location = 1 if the learning activity was held in the respondent's country (from CRS data), = 0 otherwise;
· action plan = 1 if the participant said he/she developed an action plan/strategy during the activity to apply the K&S learned, = 0 otherwise;
· seminar = 1 if the activity was a seminar, = 0 otherwise. Data are from the CRS, where there are four activity types: conference, course, seminar, and workshop, in that order based on two criteria, duration and intensity. Seminars tend to be of longer duration than courses and involve more in-depth coverage of issues than courses or conferences but less than workshops;
· contact information = 1 if the participant was provided contact information for other participants, = 0 otherwise;
· WBI follow-up = 1 if WBI contacted the participant after the learning event, = 0 otherwise;
· participant follow-up = 1 if the participant contacted WBI regarding the activity after the learning event, = 0 otherwise.

Participant-level variables:
· gender = 1 if male, = 0 otherwise;
· proficiency in the language of instruction = participant's self-rated score on a 7-point scale of his/her proficiency in the main language of instruction of the activity;
· proficiency in the technical terminology = participant's self-rated score on a 7-point scale of his/her proficiency in the technical terminologies used in the learning course;
· organization: government = 1 if the participant worked the longest in national, provincial, or local government since the learning activity, = 0 otherwise;
· job position = coded between 0 and 1 as follows: = 0 for entry-level position, = 0.25 for junior-level position, = 0.50 for middle-level position, = 0.75 for senior-level position, and = 1 for highest-level position.

Exogenous variables:
· work and country development environment = participant's average rating of the degree to which work-environment factors (e.g., work procedures, colleagues, incentive system, resources, etc.) and country-level development factors (e.g., country policies, social groups, political groups, readiness for reform, etc.) hurt/help use of the K&S acquired from the learning activity;
· discussions after learning event = participant's self-rating on a 7-point scale, where 1 = never discussed and 7 = thoroughly discussed, of his/her discussion of the issues raised in the learning activity at work, with local partners, government officials, NGOs, or in the media;
· low-income country = 1 if the participant is from a low-income country as defined in the 2004 World Development Report, = 0 otherwise;
· Latin America (LAC) region = 1 if the participant is from the Latin America region, = 0 otherwise;
· East Asia (EAP) region = 1 if the participant is from the East Asia region, = 0 otherwise;
· Africa (AFR) region = 1 if the participant is from the Africa region, = 0 otherwise;
· Europe and Central Asia (ECA) region = 1 if the participant is from the ECA region, = 0 otherwise.
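As an illustration of the coding rules in Box 2.1, the sketch below constructs a few of these regressors from hypothetical raw survey fields; the field names and response codes are assumptions, not the actual survey layout.

```python
import pandas as pd

# Hypothetical raw survey fields; the coding rules follow Box 2.1.
POSITION_CODES = {"entry": 0.0, "junior": 0.25, "middle": 0.50,
                  "senior": 0.75, "highest": 1.0}
REGIONS = ["LAC", "EAP", "AFR", "ECA"]  # SAR is the omitted reference category

def code_regressors(raw: pd.DataFrame) -> pd.DataFrame:
    x = pd.DataFrame(index=raw.index)
    x["gender"] = (raw["sex"] == "male").astype(int)
    x["designed_for_country"] = raw["designed_for_country"].astype(int)
    x["action_plan"] = raw["developed_action_plan"].astype(int)
    x["job_position"] = raw["position_level"].map(POSITION_CODES)
    # Average of the work-environment and country-factor ratings (7-point items)
    x["work_country_env"] = raw[["work_env_rating", "country_env_rating"]].mean(axis=1)
    for r in REGIONS:  # region dummies, with SAR as the omitted category
        x[f"region_{r}"] = (raw["region"] == r).astype(int)
    return x

# Example with two made-up respondents:
raw = pd.DataFrame({
    "sex": ["male", "female"],
    "designed_for_country": [True, False],
    "developed_action_plan": [True, True],
    "position_level": ["senior", "middle"],
    "work_env_rating": [5, 3],
    "country_env_rating": [6, 4],
    "region": ["AFR", "SAR"],
})
print(code_regressors(raw))
```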
2.40 We tested several factors that might explain the composite measure of activity effectiveness, including activity-related variables, participant-related characteristics, and exogenous factors. Table 3 reports the regression estimates for aggregate effectiveness. Among the ordinary least squares (OLS) models tested, this regression has the highest explanatory power (R-squared = 0.30)19 and includes all those explanatory variables that appear to be the most consistently significant and meaningful in the other test regressions.20

2.41 What features of UCMP events and participants are related to effectiveness? As shown, there are not many. Five key variables have positive and significant effects on effectiveness: (a) whether the learning activity was designed specifically for the participant's country; (b) whether participants were asked to develop an action plan during the learning activity; (c) the participant's level of proficiency in the technical terminology used in the learning event; (d) whether the participant is from a low-income country; and (e) whether the participant is from the Africa region.

2.42 The first two of these significant explanatory variables are activity-level variables; they suggest what activity features UCMP might address to raise participants' ratings of activity effectiveness. Action plan development (or action learning) is a feature that improves overall effectiveness. As noted earlier, 62 percent of UCMP respondents already report that an action plan was included in the activity they attended. While this is higher than the WBI average (of 45 percent), there is room for having more of this effort in UCMP learning offerings. Also, designing UCMP activities for the needs of a country significantly raises the program's effectiveness. This suggests that UCMP's overall effectiveness is likely to improve with WBI's country-focus capacity development strategy.

2.43 The other three significant variables suggest the types of participants who are more likely to rate the activity they attended as highly effective and thereby also more likely to benefit from the activity than their counterparts. These participants appear to be those who have some proficiency in the technical terminology of the learning event, who are from a low-income country, and who are from the Africa region.21

19 We also tested several regressions with decomposed measures of activity effectiveness as the dependent variable. These alternative estimates did not show any significant improvements over the result reported in Table 3.

20 We also tested OLS regressions with the activity's relevance ratings as explanatory variables, i.e., where the variable "relevance to work" = participant's rating on a seven-point scale of the degree to which the learning event was relevant to the participant's work, and the variable "relevance to country" = participant's rating on a seven-point scale of the degree to which the learning event was relevant to his/her country's needs. These explanatory variables are highly correlated with the measures of effectiveness, so both variables were highly significant in all OLS results. Their inclusion in the estimated regressions made all other explanatory variables statistically irrelevant.

21 South Asia (SAR) is the omitted region in this Table 3 regression. We note that there were no UCMP FY02-03 participants from MNA in the sample.
For UCMP, then, one challenge is either to select activity participants who already have some proficiency in the technical terminology used in UCMP learning events or to adapt the technical terminologies used in learning activities to the level of proficiency of participants. Involving local partners, including WB resident staff, in the selection of participants might help do this. UCMP can also raise its effectiveness ratings by focusing more on Africa, or perhaps by developing new tailored courses that can appeal more broadly to middle-income countries in Latin America and East Asia.

Table 3. Activity Effectiveness: Regression Estimates
(Dependent variable: aggregate activity effectiveness; weighted OLS)

Variables explaining activity effectiveness                      Weighted OLS
Gender                                                            0.11 [0.57]
Activity specifically designed for respondent's country          0.76 [3.13]***
Action plan                                                       0.60 [2.82]***
Proficiency in the technical terminology of learning event       0.31 [2.26]**
Proficiency in the language of instruction of learning event    -0.34 [1.05]
Duration (in days) of learning event                             -0.07 [1.29]
Seminar                                                           0.18 [0.78]
Activity location                                                -0.29 [0.91]
Face-to-face delivery mode                                       -0.45 [1.10]
Low-income country                                                1.11 [3.99]***
Latin America region                                              0.20 [0.70]
East Asia region                                                  0.33 [0.58]
Africa region                                                     1.53 [3.26]***
Europe and Central Asia region                                   -0.10 [0.18]
Constant                                                          3.64 [2.58]**
Observations                                                      109
R-squared                                                         0.30
Note: t-statistics in brackets. * significant at 10%; ** significant at 5%; *** significant at 1%.

2.44 Table 4 reports second-stage regression estimates explaining activity use. The first estimate explains aggregate use, while the last two report the second-stage regression results for academic use and operational use, respectively. In all three regressions, predicted effectiveness from the equation in Table 3 is used as an explanatory variable. As expected, effectiveness is a positive and significant predictor of aggregate use and of its components, academic use and operational use.

Table 4. Activity Use: Second-Stage Regression Estimates

                                            Aggregate (2SLS)   Academic (2SLS)   Operational (2SLS)
Activity effectiveness: aggregate            0.38 [1.80]*       0.30 [1.65]*      0.45 [1.71]*
Gender                                      -0.11 [0.44]       -0.19 [0.59]      -0.05 [0.17]
Job position                                 0.02 [0.18]        0.03 [0.35]      -0.02 [0.24]
Organization: government                     0.06 [0.25]       -0.19 [0.71]       0.19 [0.62]
Discussions after learning event             0.30 [2.53]**      0.31 [1.98]*      0.26 [1.68]*
Contact information                         -0.50 [2.00]**     -0.37 [1.14]      -0.48 [1.26]
WBI follow-up                                0.16 [0.45]        0.23 [0.60]       0.18 [0.42]
Participant follow-up                        0.62 [1.22]        0.67 [1.13]       0.40 [0.77]
Work and country development environment     0.17 [1.80]*       0.14 [1.45]       0.23 [1.80]*
Low-income country                           1.70 [2.91]***     1.57 [2.74]***    1.72 [2.05]**
Latin America region                         2.72 [3.08]***     2.74 [2.80]***    2.69 [2.81]***
East Asia region                             2.60 [2.73]***     2.58 [2.45]**     2.70 [2.48]**
Africa region                                1.97 [2.57]**      2.24 [2.64]**     1.89 [1.98]*
Europe and Central Asia region               2.92 [3.29]***     2.75 [2.78]***    3.13 [3.08]***
Constant                                    -1.91 [1.61]       -1.28 [1.06]      -2.534 [1.51]
Observations                                 80                 77                77
R-squared                                    0.57               0.5               0.52
Note: t-statistics in brackets. * significant at 10%; ** significant at 5%; *** significant at 1%.
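The report does not include the estimation code, but the two-stage procedure described in paragraph 2.34 and reported in Tables 3 and 4 could be reproduced along the lines sketched below, using statsmodels and hypothetical column names. This manual two-stage setup does not correct the second-stage standard errors for the generated effectiveness regressor, so a dedicated IV/2SLS routine would be preferable for inference; the sketch is meant only to make the structure of the model concrete.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical tidy dataset: one row per respondent, with the Box 2.1 variables.
FIRST_STAGE_X = ["gender", "designed_for_country", "action_plan",
                 "proficiency_terminology", "proficiency_language",
                 "duration_days", "seminar", "activity_location",
                 "face_to_face", "low_income",
                 "region_LAC", "region_EAP", "region_AFR", "region_ECA"]
SECOND_STAGE_X = ["gender", "job_position", "org_government",
                  "discussions_after", "contact_information",
                  "wbi_follow_up", "participant_follow_up",
                  "work_country_env", "low_income",
                  "region_LAC", "region_EAP", "region_AFR", "region_ECA"]

def two_stage(df: pd.DataFrame, weights: pd.Series):
    # Stage 1: weighted OLS of the composite effectiveness score (as in Table 3).
    X1 = sm.add_constant(df[FIRST_STAGE_X])
    stage1 = sm.WLS(df["effectiveness"], X1, weights=weights).fit()

    # Stage 2: regress aggregate use on predicted effectiveness plus the
    # second-stage covariates (as in the aggregate-use column of Table 4).
    X2 = df[SECOND_STAGE_X].copy()
    X2["effectiveness_hat"] = stage1.predict(X1)
    stage2 = sm.OLS(df["use_aggregate"], sm.add_constant(X2)).fit()
    return stage1, stage2

# Usage (assuming a dataframe `df` and survey `weights` are available):
# stage1, stage2 = two_stage(df, weights)
# print(stage1.summary()); print(stage2.summary())
```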
2.45 Whether the participant discussed the issues raised in the activity after the learning event is a positive and significant factor explaining operational use, academic use, and aggregate use. Providing participants with the contact information of other participants (including event presenters/facilitators) raises aggregate use as well. Whether WBI contacted participants after the learning event (WBI follow-up) had no significant bearing on participants' aggregate use of the K&S that they acquired from the learning activity. This may be due to the nature of the follow-up activity and the opportunities that participants have to use their acquired awareness, knowledge, and skills. In a way, these results simply suggest that participants themselves are more likely to initiate and use the K&S from the learning event when provided directly with these opportunities for knowledge sharing and learning.

2.46 Work-environment factors (e.g., work procedures, colleagues, incentive system, resources, etc.) and country-related factors (e.g., country policies, social groups, political groups, readiness for reform, etc.) are significant determinants of increased aggregate and operational use. Low-income country participants are also likely to have significantly higher operational and academic use of the K&S acquired from the learning activity than middle-income country participants. Finally, participants in the LAC, EAP, AFR, and ECA regions are more likely to report significantly higher aggregate, operational, and academic use than participants from the South Asia region, the region omitted in the Table 4 regressions. In brief, SAR stands out, relative to other regions, as having less use of the K&S from UCMP learning events.22 In the context, then, of improving UCMP's aggregate use, there may be lessons to be learned from a closer scrutiny of the program's learning activities specific to SAR vis-à-vis other regions.

2.47 The same appears true for position level: participants in higher positions are more likely than others to report higher and significant operational use, but not academic use. An earlier concern raised in the FGD, namely that organizational hierarchies matter for effecting any changes from the learning activity, supports the latter result. For WBI, therefore, these results suggest that the operational use of the K&S that its graduates acquire may be raised by selecting participants who are in high-level positions and by strongly encouraging UCMP alumni to continue to discuss, after the event, the issues raised during the learning activity.

22 This result is difficult to explain because we did not have FGDs or participant interviews in SAR.

3. CONCLUSIONS AND RECOMMENDATIONS

3.1 Participants' average ratings of the relevance, effectiveness, and impacts of the UCMP learning events they attended are positive but generally not much different from similar average FY02 ratings for WBI-wide learning activities. Also, these same ratings are about average or below the WBI benchmark of 85 percent of training participants providing a rating of "4" or "5" (on a 5-point scale) on Level 1 evaluations of activity relevance, effectiveness, and impact.

3.2 Participants' use of the K&S that they acquired from the UCMP activities is also positive. Compared with similar non-UCMP urban management learning events offered in their countries, knowledgeable UCMP alumni rate the usefulness of the UCMP activity they attended as only slightly better (rating of 4.9) than "about the same" (rating of 4.0).

3.3 The survey and other data collected for this study offer useful benchmarks for assessing the relevance, effectiveness, and impacts of upcoming UCMP activities under the WBI's country-focused capacity development strategy.
And, as noted above, there is room for UCMP to improve on these initial benchmark indicators of its overall performance.

3.4 There are several ways to improve these performance indicators.

· Improve the process of selecting invitees to UCMP learning events. The regression results indicate that selecting participants on the basis of their proficiency in the technical terminology used in the learning activity can improve an activity's effectiveness and usefulness. Adapting the technical terminology to the participants' level should achieve similar results. FGD participants and interviewees also commented that, if capacity development is the goal, the selection of participants for UCMP events should recognize, and therefore take into account, how institutional and policy changes happen within different institutional hierarchies. Finally, a majority of the local Bank staff we interviewed want to be involved more closely in the selection of participants, as well as in the design, delivery, and follow-up activities of WBI interventions.

· Tailor UCMP learning activities to country needs. Our survey results suggest that tailoring UCMP activities to specific countries, in line with WBI's country-focused capacity development strategy, will improve the effectiveness (and impact) of UCMP learning activities. FGD participants also note that incorporating local case studies for discussion will improve the relevance and effectiveness of UCMP activities. Donor coordination is also important so that the same courses or topics are not offered by multiple sources to the same audiences.

· Use more work-related, action-learning activities. Regression results indicate that the effectiveness and usefulness ratings of a UCMP activity improve significantly when participants are asked to develop action plans or strategies as part of the learning activity. FGD participants suggest employing qualified local facilitators for these action-learning activities in distance learning courses. Although this adds to the cost of course delivery, the extra effort should tie in well with the initiation of useful subsequent capacity development activities.

· Encourage participants to discuss the issues among themselves and with others after the learning event. Holding related topical discussions after a UCMP learning event improves the use of the K&S gained from the event, and sharing contact information among participants also raises their use of that K&S. Of particular interest to UCMP graduates are accessible follow-up activities that encourage open discussion of the relevant topics of the learning event. Follow-up activities are nonetheless constrained by the resources available, and there are familiar tradeoffs between broadening and deepening the effectiveness and impact of UCMP-supported events.

· Set clearly focused, more narrowly targeted goals that allow impacts to be evident and easily measured. This recommendation comes from several WB country staff whom we interviewed. These staff note that, except when the learning (or capacity building) interventions are directly related to ongoing operational work, WBI's goals for their countries (and not only UCMP's) are not focused. The impacts of these interventions are therefore not evident and are difficult to measure.

3.5 Finally, it is important to improve the CRS. Client records should be kept more up to date, and the responsibility for their upkeep lies with the program deliverer.
Reliable participant records are vital to UCMP's contribution to WBI's country-focused capacity development agenda, in which systematic follow-up activities are required for institution building.

APPENDICES

APPENDIX A. LIST OF FY02-03 UCMP LEARNING EVENTS

Activity Title | Month and Year | Location | Mode of Delivery | No. of Delivery Days | Total Participants | Respondents (n=141)
Brazil Urban and City Management Core Course | Nov-02 | Brazil | Face to Face | 5 | 42 | 4
City Strategies to Reduce Urban Poverty through Local Economic Development | Nov-01 | Ethiopia | Blended/Multiple Modes | 8 | 6 | 3
City Strategies to Reduce Urban Poverty through Local Economic Development | Nov-01 | Indonesia | Blended/Multiple Modes | 8 | 27 | 41
Curso de Gestion Urbana para Centroamerica | Jun-02 | El Salvador | Face to Face | 10 | 59 | 3
Curso de Gestion Urbana y Municipal | Apr-02 | Honduras | Videoconference | 6 | 20 | 1
Curso en Gestion Urbana | Feb-03 | Peru | Face to Face | 8.5 | 51 | 14
Decentralization and Municipal Management Regional Course | Mar-03 | Georgia | Face to Face | 6 | 47 | 2
Gestion urbaine et municipale en Afrique | Oct-01 | Senegal | Face to Face | 10 | 41 | 7
Minas Gerais Urban & City Management Core Course | Oct-02 | Brazil | Face to Face | 5 | 79 | 10
Municipal Management Course for Russia | Apr-03 | Russia | Face to Face | 6 | 28 | 6
Municipal Management Workshop | Mar-03 | Russia | Face to Face | 2 | 35 | 6
South Asia Urban and City Management Course | Jan-02 | India | Face to Face | 9 | 34 | 25
Urban Poverty Learning Workshop for Asia | Jun-02 | Singapore | Face to Face | 2 | 40 | 1
Urban and City Management Course for CIS | May-02 | Ukraine | Face to Face | 10 | 46 | 10
Urban and City Management Course for Europe and Central Asia II | Aug-01 | Hungary | Face to Face | 6 | 26 | 2
Urban and City Management in a Decentralization Framework | Mar-02 | Uganda | Face to Face | 5 | 42 | 6

APPENDIX B. UCMP EVALUATION: QUESTIONS AND DATA FOR ANSWERING THESE QUESTIONS, BY DATA SOURCE

I. What is the relevance of UCMP learning activities to your country's needs?
· To what degree are the topics covered by the learning activities relevant to your country's specific needs? (Sources: participant surveys, interviews, and FGD; WB country staff interviews)
· Do UCMP learning activities address your country's current issues and needs? (Sources: participant surveys, interviews, and FGD; WB country staff interviews)

II. What is the effectiveness and impact of UCMP learning activities?
· How effective are the learning events in your country? (Effectiveness is defined in terms of three components: knowledge and skills; strategies and approaches; and networking.) (Sources: participant surveys, interviews, and FGD)
· To what degree are participants utilizing the knowledge/skills they learned through UCMP learning activities/products, and in what areas? How useful is the activity? (Sources: participant surveys, interviews, and FGD; WB country staff interviews)
· What are the country-specific facilitators of, and barriers to, utilizing the knowledge/skills? (Sources: participant surveys, interviews, and FGD; WB country staff interviews)
· Has the activity influenced or contributed to changes in the relevant sector/area? (Sources: participant surveys, interviews, and FGD; WB country staff interviews)

III. What is the contribution of UCMP learning programs/activities to building in-country capacity, relative to activities delivered by other members of the development community?
· To what degree do UCMP activities build sustainable in-country capacities for learning and applying new knowledge and skills? [Country-specific indicators of sustainability include local partnerships and follow-up meetings mobilized by participants.] (Sources: participant surveys, interviews, and FGD; WB country staff interviews)
· What is the extent of similar learning activities provided by non-WBI organizations? How do their effectiveness and impact compare with those of WBI training programs? (Sources: participant surveys, interviews, and FGD; WB country staff interviews)

APPENDIX C. WORLD BANK INSTITUTE (WBI) UCMP EVALUATION QUESTIONNAIRE

Instructions

WBI had the pleasure of having you participate in the following learning activity:

Title: _____________________________________________________________
Held from: ________________________ to: ________________________
In: _______________________________________________________________

Getting your opinion of the above-mentioned activity, now that you have had time to reflect on it, is very important in helping WBI improve its programs. For this reason, we ask you to complete this questionnaire. The questionnaire has four sections and should take approximately 20 minutes to complete.

· Section 1 asks about the relevance of the activity.
· Section 2 asks about the usefulness of the activity.
· Section 3 asks you to compare this activity with similar learning activities offered by other organizations.
· Section 4 asks about the characteristics of the activity, its follow-up, and your background.

We need your honest feedback. Please keep in mind that your responses will be kept confidential and will be used for the sole purpose of improving WBI programs. If you have any questions about the questionnaire, please call or send a message to Mr. Jaime B. Quizon at jquizon@worldbank.org.

Thank you for agreeing to complete this questionnaire!

ID: _________________

World Bank Institute (WBI) UCMP Evaluation Questionnaire

I. Relevance of the Activity

1. Since the end of the activity, to what degree has the activity been relevant to your work?
   (1 = not relevant at all ... 7 = extremely relevant)

2. To what degree have the topics covered in the activity been relevant to your country's needs?
   (1 = not relevant at all ... 7 = extremely relevant)

3. Was the activity designed specifically for participants from your country?
   Yes   No   Don't know

4. Was the activity related to the country development goals listed below? (For each goal: Yes / No / Don't know)
   a. Eradicate extreme poverty
   b. Achieve universal primary education
   c. Promote gender equality and empower women
   d. Reduce child mortality
   e. Improve maternal health
   f. Combat HIV/AIDS, malaria, and other diseases
   g. Ensure environmental sustainability
   h. Develop global partnerships for development
   i. Ensure water sanitation and supply
   j. Improve investment climate and finance
   k. Promote trade

II. Usefulness of the Activity

5. Please rate the degree of effectiveness of the activity in each area noted below. (If the area was not an objective of the activity, please mark "Not applicable." 1 = not effective at all ... 7 = extremely effective; NA = not applicable)
   a. Raising your awareness and understanding of the development issues important to your country
   b. Providing you with knowledge or skills
   c. Helping you better understand your role as an agent of change in your country's development
   d. Helping you develop strategies or approaches to address the needs of your organization
   e. Helping you develop strategies or approaches to address the needs of your country
   f. Helping you develop contacts, develop partnerships, and build coalitions in the field

6. How would you rate the change, brought about by the activity, in the main topic or issue it addressed?
   (1 = strong negative change; 4 = no change; 7 = strong positive change; DK = don't know)

7. How often have you used the knowledge and skills you acquired in the activity for the following purposes? (If you have not worked in the given area since this activity, please mark "Not applicable." 1 = not at all ... 7 = very often; NA = not applicable)
   a. Conducting research
   b. Teaching
   c. Raising public awareness of development issues
   d. Implementing new practices within your work organization
   e. Organizing collective initiatives
   f. Influencing legislation and regulation
   g. Implementing country development strategies

8. To what extent did the following factors help or hurt the process of using the knowledge/skills that you acquired at the activity?
   (1 = greatly hurt; 4 = neither helped nor hurt; 7 = greatly helped; NA = not applicable)
   a. Your work environment (e.g., work procedures, colleagues, incentive system, funding, etc.)
   b. Your country's development environment (e.g., country policies, social groups, political groups, readiness for reform, etc.)

9. How has the activity influenced or led to changes in the following areas? (If the area is not relevant to the activity, please mark "Not applicable." 1 = negative influence; 4 = no influence; 7 = positive influence; NA = not applicable)
   a. Research
   b. Teaching
   c. Public awareness of development issues
   d. New practices within your work organization
   e. Collective initiatives
   f. Legislation and regulation
   g. Country development strategies

10. Since the activity, have you discussed the issues raised in the activity at work, with local partners, government officials, NGOs, or in the media?
    (1 = never discussed ... 7 = thoroughly discussed)

III. Comparison of the WBI Activity with Similar Activities Offered by Other Organizations

11. Did you participate in any similar learning activities offered by other (NON-WBI) organizations in your country? (If no, please skip to question 14.)
    Yes   No

12. If yes, please provide the name(s) of the organization(s):
    1.
    2.
    3.

13. How would you rate the usefulness of the WBI activity compared to NON-WBI activities?
    (1 = WBI much less useful; 4 = about the same; 7 = WBI much more useful; or No opinion)

IV. Characteristics of the WBI Activity, its Follow-up, and Your Background

14. How would you describe the type of the WBI learning activity that you attended?
    1. Video (Distance Learning)
    2. Classroom Sessions (Face to Face)
    3. Mix of Video and Face to Face
    4. Conference
    5. Web-based Learning
    6. Study tour

15. How effective was this type of learning activity in helping you learn?
    (1 = not effective at all ... 7 = extremely effective; or No opinion)

16. During the WBI activity, did you develop an action plan/strategy (e.g., work plans, strategy papers, or policy documents) to apply the knowledge and skills you learned? (If no, please mark "No" below, then skip to question 18.)
    Yes   No

17. If yes, did you use part or all of the action plan in your work?
    Yes   No

18. Were you provided with the contact information of other participants in the activity, such as e-mail addresses, telephone numbers, or mailing addresses? (If no, please mark "No" below, then skip to question 20.)
    Yes   No

19. If yes, how did you use it?
    Never used it
    Used it to continue activity-related discussions
    Used it to organize joint follow-up activities
    Other uses (please specify briefly): ________________________________________

20. Was the language of instruction used during the activity the same language you use at work?
    Yes   No

21. At the time of the activity, what was your level of proficiency in the language of instruction?
    (1 = not proficient at all ... 7 = highly proficient)

22. At the time of the activity, what was your level of proficiency in the technical terminology used in the activity?
    (1 = not proficient at all ... 7 = highly proficient)

23. After the activity, did WBI contact you for follow-up issues regarding the activity?
    Yes   No

24. After the activity, did YOU contact WBI for follow-up issues or questions on the content of the activity? (If no, please skip to question 26.)
    Yes   No

25. If yes, please rate WBI's helpfulness in addressing your issues.
    (0 = WBI did not respond; 1 = WBI responded but was not helpful at all ... 7 = WBI responded and was extremely helpful; NA = I did not have follow-up requests for WBI)

26. Which of the following best describes the organization in which you have worked the longest since the activity? (Select one.)
    University/research institution
    National/central government
    Non-governmental organization (not-for-profit)
    Provincial/regional government
    Media
    Local/municipal government
    Private sector
    Other, specify: ____________________________

27. Which of the following best describes the primary type of work you have done the longest since the activity? (Select one.)
    Research
    Teaching
    Policymaking/legislation
    Provision of services (e.g., financial, health, etc.)
    Management/administration
    Other, specify: ____________________________

28. How would you best describe the level of the position you have held the longest since the activity?
    Highest level (e.g., Minister, Deputy Minister, Top Government Official, Full Professor, President of an organization)
    Senior level (e.g., Department Head, Division Head, Associate Professor, Senior Researcher)
    Middle level (e.g., Program Manager, Project Leader, Assistant Professor, Technical Expert)
    Junior level (e.g., Research Associate, Ph.D.-level graduate student, Technical Specialist)
    Entry level (e.g., Intern, Assistant)
    Other, please specify: __________________________________________________________________

29. What is your gender?
    Male   Female

Thank you for your feedback. We very much appreciate your cooperation.

APPENDIX D. PARTICIPANT SURVEY: EVALUATION DESIGN AND SAMPLING

The impacts of short learning events are likely to be negligible or nonexistent, so this study includes only WBI learning events in FY02-03 that were longer than one day in duration. We did, however, consider all one-day events (or courses) that are part of a series of related activities, counting the entire series as a single WBI learning event. We focus only on FY02-03 learning events because asking participants to recall anything beyond two years ago can be difficult. Also, the elapsed time between the WBI event and the survey (between 6 and 18 months) should have been sufficient for any effects of a WBI learning activity to have manifested themselves. In all, there were 17 eligible FY02-03 UCMP events, as listed in WBI's Client Registration System (CRS), in the sample universe (see Appendix A).
In sampling the participants of UCMP learning events for the survey, we first defined an eligible survey participant as one who: (a) had attended an eligible FY02 or FY03 UCMP event; (b) had at least one point of contact (an e-mail address, mailing address, fax number, or telephone number) available in the CRS; and (c) was not a staff member of the World Bank. In all, we identified 768 eligible participants in the CRS out of 1,050 UCMP participants. About 95 percent of those who were not eligible for the survey lacked a contact address in the CRS. Because we do not know the identities of these participants, we can determine neither the nature nor the extent of any bias associated with their exclusion.

We used stratified random sampling to select the UCMP participants for the survey. Because this study was part of a broader evaluation of the country-specific impacts of WBI programs, the participant's country of residence (or country category) served as the initial sampling stratum; that is, we selected certain WBI priority countries from which to draw the UCMP survey respondents. We selected all UCMP participants in these priority countries.1 We then randomly selected survey respondents from the remaining UCMP participants in all other countries. The one exception was Ethiopia, where we sampled all six UCMP participants. Table 1 (in the text) summarizes the distribution of the sampled UCMP respondents by country.2

1 The priority countries were Burkina Faso, Guatemala, India, Indonesia, Kenya, and Thailand. In a few instances, some participants in these countries were not sampled for various reasons: the addresses (or points of contact) of listed participants were unclear, the individual was a speaker and not a participant, the participant was double-listed, the participant was actually a WB staff member, and so on. Russia and China are also WBI priority countries, although we randomly sampled participants from these two countries.

2 In our analysis of the participant survey results, we use both weighted (by the probability of a participant being sampled had we used purely random sampling) and unweighted estimates. Also, the countries listed in Table 1 refer to the countries of residence of UCMP participants, not to the venues of the UCMP learning events (which are shown in Appendix A).

For the survey, we sampled 421 names from the list of 768 eligible participants.3 The actual number of completed surveys returned varies by country (see Table 1) because in some countries (e.g., Burkina Faso, India, Indonesia) we used local consultants to locate sampled participants and interview them, either face-to-face or by phone, whereas in other countries (e.g., Brazil and "others") we had to rely on e-mail to send questionnaires and receive completed surveys. We made at least three attempts, by e-mail, phone, or fax, to contact the majority of sampled participants who had not responded. Despite this effort, only 141 participants, or 33.5 percent of the 421 sampled, completed the survey.

3 This survey was also used in another study in which the thematic category of the WBI offering was important. As a consequence, we used stratified random sampling to guarantee that some randomly drawn respondents had participated in certain key WBI thematic learning programs.
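For readers who want to see the sampling design in concrete terms, the sketch below mimics, under stated assumptions, the stratified selection and weighting described in this appendix. The file name, column names, and the inverse-probability form of the weight are illustrative assumptions rather than the study's actual procedure, and the design is simplified (for example, Russia and China were priority countries but were sampled randomly, and Ethiopia is folded into the take-all set).

```python
# Minimal sketch of the stratified sampling and weighting described in this
# appendix. File and column names are hypothetical; the weight form (inverse of
# the selection probability) is an assumed convention, not the report's code.
import pandas as pd

frame = pd.read_csv("crs_eligible_participants.csv")  # hypothetical list of the 768 eligible participants

# Take-all strata: WBI priority countries plus Ethiopia (a simplification of
# footnote 1; Russia and China were priority countries but sampled randomly).
take_all_countries = {"Burkina Faso", "Guatemala", "India", "Indonesia",
                      "Kenya", "Thailand", "Ethiopia"}
is_take_all = frame["country"].isin(take_all_countries)

take_all = frame[is_take_all]                        # selected with certainty
rest = frame[~is_take_all]                           # pooled remaining countries
n_rest = 421 - len(take_all)                         # fill the sample of 421 names
sampled_rest = rest.sample(n=n_rest, random_state=1)
sample = pd.concat([take_all, sampled_rest])

# Selection probabilities implied by this design: 1 in the take-all strata,
# n_sampled / n_eligible in the pooled remaining countries.
p_rest = n_rest / len(rest)
sample["selection_prob"] = is_take_all.loc[sample.index].map({True: 1.0, False: p_rest})
sample["weight"] = 1.0 / sample["selection_prob"]    # inverse-probability analysis weight
```

Analyses that apply a weight column of this kind would correspond in spirit to the weighted estimates mentioned in footnote 2.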