What Can WBI Learn from the Participants, Task Team Leaders, and Systems Records of its Learning Activities? A Review of Client Learning

Jaime B. Quizon, Joy Behrens, Basab Dasgupta, Cristina Ling, Oliver Rajakaruna, Dawn Roberts

WBI Evaluation Studies No. EG08-138
The World Bank Institute
The World Bank
Washington, D.C.
January 2008

Acknowledgments

The World Bank Institute Evaluation Group (WBIEG) prepared this report under the direction of Richard Tobin (manager until June 2007). Jaime Quizon served as the task team leader. The authors wish to thank the WBI Task Team Leaders (TTLs) who gave generously of their time to participate in interviews for this analysis. Much appreciation also goes to the staff of the World Bank Institute's Office of the Chief Administrative Officer (WBICA), including Bernard Harragan and Manar Eliriqsousi, for help with extracting systems data. The authors thank Hnin Hnin Pyne and Marian S. Delos Angeles, who peer reviewed this study. Nidhi Khattri and Violaine le Rouzic also offered valuable suggestions for its improvement. Finally, the authors thank Humberto S. Diaz for his assistance with formatting and graphics.

WBIEG evaluates learning by staff of the World Bank and activities of the World Bank Institute (WBI). The Institute supports the World Bank's learning and knowledge agenda by providing learning programs and policy services in the areas of governance, knowledge for development, human development, environment and sustainable development, poverty reduction and economic management, and finance and private sector development.
The findings, interpretations, and conclusions expressed in WBI Evaluation Studies are entirely those of the authors and do not necessarily represent the views of the World Bank Group, including WBI.

WBI Evaluation Studies are available at http://www.worldbank.org/wbi/evaluation

Suggested citation: Quizon, Jaime, Joy Behrens, Basab Dasgupta, Cristina Ling, Oliver Rajakaruna, and Dawn Roberts (2008). What Can WBI Learn from its WBICRS Records and Participant Assessments? A Review of Client Learning. Report No. EG08-138. Washington, DC: World Bank Institute.

Vice President, World Bank Institute: Rakesh Nangia, Acting
Manager, Institute Evaluation Group: Nidhi Khattri, Acting
Task Team Leader: Jaime B. Quizon

Contents

EXECUTIVE SUMMARY ................................................................ v
  Participants ..................................................................... v
  Program design and delivery ...................................................... vi
  Partnerships ..................................................................... vi
  Performance within the World Bank Institute ...................................... vi
  Recommendations .................................................................. vii
1. INTRODUCTION .................................................................... 1
2. PARTICIPANTS .................................................................... 5
  Regression results ............................................................... 5
  Reflections from task team leaders ............................................... 7
3. PROGRAM DESIGN AND DELIVERY ..................................................... 9
  Regression results ............................................................... 9
  Reflections from task team leaders ............................................... 11
4. PARTNERSHIPS .................................................................... 12
  Regression results ............................................................... 12
  Reflections from task team leaders ............................................... 12
5. PERFORMANCE WITHIN THE WORLD BANK INSTITUTE ..................................... 15
  Regression results ............................................................... 15
  Reflections from task team leaders ............................................... 16
6. RECOMMENDATIONS ................................................................. 18
APPENDIXES
  Appendix A: WBICRS and Level 1 evaluation variables .............................. 21
  Appendix B: Data and methods ..................................................... 24
  Appendix C: Topic guide and questions for TTL interviews ......................... 28
  Appendix D: List of variables used for regression analysis ....................... 30

Acronyms and abbreviations

AIS     Activity Initiation Summary
OLS     Ordinary least squares
Plato   WBI's electronic Planning Tool
PTDs    Participant training days
SAP     The Bank's cost accounting system
TTL     Task team leader
WBI     World Bank Institute
WBICRS  World Bank Institute Client Registration System
WBIEG   World Bank Institute Evaluation Group
WBIEN   World Bank Institute Environment and Natural Resource Management Division, former name of World Bank Institute Sustainable Development Division
WBIES   World Bank Institute Evaluation and Scholarships Unit, former name of World Bank Institute Evaluation Group
WBIFP   World Bank Institute Finance and Private Sector Development Division
WBIHD   World Bank Institute Human Development Division
WBIKL   World Bank Institute Knowledge and Learning Services
WBIMO   World Bank Institute office in Moscow
WBIPR   World Bank Institute Poverty Reduction and Economic Management Division
WBIRC   World Bank Institute Regional Coordination Unit
WBISD   World Bank Institute Sustainable Development Division
WBIST   World Bank Institute Sector and Thematic Programs

EXECUTIVE SUMMARY

This review identified recommendations for improving the quality of future World Bank Institute (WBI) programs. The recommendations are based on regression analyses of data from FY04-05 on 736 activities in WBI's Client Registration System (WBICRS) and associated assessments collected from participants at the end of each activity. The regressions explored which factors might explain participants' ratings of the overall usefulness (and other indicators of quality) of WBI learning events. Interviews with 26 task team leaders (TTLs) further clarified ideas and key concerns. WBI is moving toward a programmatic approach to capacity development, and learning activities remain an integral part of this strategy.
Although the data analyzed do not reflect activities in FY06 or FY07, the extended effort to clean, merge, and analyze reliable data from past learning activities yielded relevant findings for strengthening WBI programs.

PARTICIPANTS

Data from the WBICRS and participant assessments provided evidence on which participant characteristics tended to improve ratings of learning events. Factors associated with an increase in participants' ratings for overall usefulness included:

· An increase in the share of female participants and
· A concentration of participants from a single region (rather than broader worldwide representation or representation from a single country).

Factors that decreased ratings for overall usefulness included:

· An increase in the number of participants in an activity and
· A greater proportion of participants from international organizations.

Interviews with TTLs highlighted participant characteristics associated with successful activities. Ideal participants should:

· View their participation in the learning activity as something they want to do to accomplish their goals,
· Have the authority to make relevant policy decisions following the event,
· Hold a position in which they can use their acquired knowledge and skills,
· Be higher-level policy makers or researchers who can build capacity for the government, and
· Be part of a network where they can continue to share experiences.

PROGRAM DESIGN AND DELIVERY

Some factors related to the design and delivery of learning events also influenced participants' ratings for overall usefulness and other desired outcomes. Holding a WBI learning activity in a low-income country increased the event's overall usefulness rating. Events that targeted skills building were rated more useful than those focused on general knowledge or policy. Both electronic and blended learning programs were rated higher in overall usefulness than face-to-face delivery.
Action learning did not emerge as a determinant of overall usefulness, but it was also not a clearly defined or consistently used variable in the WBICRS. TTLs described "the key ingredients of a successful activity," frequently emphasizing the use of relevant content and effective materials to match local demand and the general quality of design and delivery. As with the regression analysis, the interview data revealed a lack of common understanding about what qualifies as action learning.

PARTNERSHIPS

Common contributions by partners in WBI events include funding, content, event delivery, and access to knowledge networks or communities of practice. Existing data on WBI learning programs for FY04-05 lack sufficiently reliable details about partnerships to support systematic analysis of these joint efforts, and the regressions indicated that the one partner variable did not influence ratings for overall usefulness or other outcomes. This finding underscores the need for WBI to continue to refine the collection of data on learning activities.

TTLs described the roles of partners in WBI programs and reflected on the factors needed for a successful partnership. Recurrent themes were that ideal partners should (a) make a clearly identified contribution, (b) have a long-term or ongoing commitment to collaborate with WBI, and (c) have a long-term plan for how to generate the income or other funding necessary to continue their work. TTLs also emphasized the importance of working with local partners to ensure that the right participants are selected and the logistics are handled smoothly. These interview reflections revealed that TTLs lack a common approach to defining or characterizing partnerships.

PERFORMANCE WITHIN THE WORLD BANK INSTITUTE

Participant assessment ratings varied across WBI divisions and over time.
Learning events had higher average ratings for overall usefulness when delivered by WBI's Environment and Natural Resource Management Division (WBIEN), Poverty Reduction and Economic Management Division (WBIPR), Human Development Division (WBIHD), or other divisions rather than by WBI's Finance and Private Sector Development Division (WBIFP). In addition, ratings for overall usefulness of WBI learning events increased in the second half of FY05 compared with the first six months of FY04.

Interviews with TTLs solicited advice on how to improve WBI programs, and the resulting recommendations most frequently focused on (i) improving the coordination between WBI and Bank operations, (ii) ending the emphasis on participant training days, and (iii) better integrating evaluation into program design and delivery.

RECOMMENDATIONS

Results of this analysis shed light on the factors that are likely to increase the quality of learning or capacity development. Specifically, the factors identified are associated with higher participant assessment ratings, but they do not necessarily result in better learning outcomes. For program design and delivery, events are likely to be more useful for attendees if they have:

· Fewer participants,
· A greater share of female participants,
· Fewer representatives of international organizations participating,
· A focus on skills building,
· Delivery in a low-income country, and
· A delivery mode of electronic or blended learning, or a level of advance planning and pedagogical structure comparable to that needed for electronic or blended learning.

Findings also warrant organizational improvements within WBI, some of which are already underway. This review highlighted the need for more systematic coordination between WBI TTLs and Bank operational staff.
Interviews underscored the value of decreasing the emphasis on counting the number of participants and instead treating this measure as one part of an overall package of performance indicators. TTLs want evaluation to be more fully integrated into their programs--by defining the program's logic at the outset, collecting baseline data, designing a monitoring and evaluation plan that includes participant assessments, and following up with participants over time. Finally, WBI managers should have direct experience with WBI programs in order to supply the needed guidance and support to TTLs.

This review identified suggestions for increasing data quality to support WBI's programmatic approach. Definitions of variables need to be standardized. Learning activity data should be readily accessible to evaluation officers and accompanied by a transparent codebook. A detailed systems strategy to increase the coordination among WBICRS, Plato, and SAP and to minimize the reporting burden for TTLs will be helpful. Activity-specific data and participant perceptions of the relevance and usefulness of individual events are important building blocks for higher-level indicators linked to organizational development. Overall, the findings and recommendations of this review support WBIEG's engagement in ongoing efforts to refine data systems and to establish comprehensive indicators for assessing WBI's programmatic approach to capacity enhancement.

1. INTRODUCTION

1.1 The World Bank Institute (WBI) has a wealth of data on learning programs for clients from which to derive valuable lessons for task team leaders (TTLs) and managers. This review provides guidance for improving the quality of future learning events through both the analysis of existing data and interviews with TTLs to highlight ideas and key concerns.

1.2 Since FY02, WBI's Client Registration System (WBICRS) has recorded data on learning activities and now has information on several thousand discrete learning events.
Activity-level data on the features of learning events and the characteristics of participants are relevant for managerial use in institutional decision-making.[1] Although not all the WBICRS records are complete or reliable, WBI's Monthly Monitoring Reports as well as communications from systems developers suggest that there has been steady improvement in data quality, with increasing compliance among WBI staff in entering information. WBI asks participants to complete a standard post-activity questionnaire in which they assess the learning event in which they participated.[2] Response data from these questionnaires provide activity-specific feedback to TTLs and populate WBI's Monthly Monitoring Report, periodic Quality Reports, and other reports for WBI managers.

[1] There are ongoing efforts to improve the links between WBI's local systems (WBICRS, Plato) and other Bankwide data systems--such as SAP and Business Warehouse. These efforts are intended to streamline data input and increase the timeliness and quality of information available on specific WBI activities.
[2] These questionnaires gauge participants' immediate reactions to learning events and are often referred to as Level 1 evaluations.

1.3 Data from the WBICRS and participant assessments have already proven their worth as useful learning and managerial tools. However, participant assessment data have rarely been used as outcome variables in regression analysis with WBICRS data or triangulated with TTL interviews.[3] This review merged WBICRS data and participant assessments to explore the following questions:

· What key participant characteristics, activity-level variables, and exogenous factors explain participants' ratings of the overall usefulness (and other indicators of quality) of WBI learning events?
· Are there significant differences in assessment ratings across WBI thematic groups or over time?
· What can WBI TTLs do to improve these assessments of their activities?
· How might the WBICRS/participant assessment database be improved to provide more relevant and useful information for institutional learning?

[3] Another study is Liu, Chaoying, Shreyasi Jha, and Tingting Yang (2006). What Influences the Outcomes of WBI's Learning Programs? Evidence from WBIEG Evaluations. Report No. EG06-117. Washington, DC: World Bank Institute. This study conducted ordinary least squares regressions on a subset of data from the WBICRS and participant assessments to identify which features of client learning programs contribute to improved participant assessments.

1.4 Creating a merged dataset required identifying the current activity-level variables in the WBICRS and then mapping variables from participant assessments (i.e., activity and participant level) to these WBICRS activities. A list of the variables and their definitions for this merged dataset is in appendix A.[4] Because data from the WBICRS and participant assessments are less complete and reliable for early years (FY02-03), this review focused on activities delivered in FY04-05.

1.5 Multiple regression analyses of data on 736 learning activities investigated the key factors that might explain participants' assessments of the events' overall usefulness, relevance to current work, provision of new information, usefulness of this new information, focus on what a participant specifically needed to learn, and extent to which the content matched the announced objectives. In addition, this review explored 27 learning events more extensively by reviewing available data and interviewing the TTLs.[5] This process was designed to develop a qualitative understanding of how factors might explain participant ratings and to explore TTLs' concerns and ideas for improving the impact of learning programs. These 27 offerings represented a random sample of the highest- and lowest-rated FY04-FY05 WBI activities.
Additional details on the methodology for this review are in appendix B, and the questions used for interviews with TTLs are in appendix C.

1.6 Readers will note the limitations of this study--namely, that the data analyzed do not reflect learning activities in FY06 or FY07, and that the scope focuses on individual learning activities in an era when WBI is moving toward a programmatic approach to capacity development, meaning that activities are delivered as part of a coordinated series to build organizational capacity.[6] In fact, activities for individuals remain an integral part of WBI's programmatic approach, and the extended effort to clean, merge, and analyze reliable data from past learning activities yielded relevant findings about participants, program design and delivery features, and partnerships.

1.7 This review represents a learning process for merging participant assessment and learning activity data that can guide continued improvement of WBI's data systems. At the start of this process, participant assessment data for FY06 and FY07 were not yet available, so the analysis focused on data through FY05.[7] A major focus of the data management effort for this paper was to address problems with missing values in the analysis dataset and to merge data originating from various WBI units to allow for a simple time series. The methodology, now tested, could be replicated to clarify further what factors contribute to effectiveness and impact--given adequate access to the WBICRS and Plato datasets and adequate resources to cover time-consuming data scrubbing as well as data analysis.

[4] Included are variables as they exist currently and potential variables whose definitions require clarification from their WBI sources. In some cases, coding is not consistent across the WBICRS and participant assessment datasets, and the recommendations of this review include guidance on improving data quality.
[5] One TTL represented two of the sampled activities, so 26 TTLs were interviewed about 27 activities.
[6] WBI is formalizing this new approach in the FY08 work program planning process. All learning and capacity building activities conducted to achieve a development objective in a country for a subprogram of a thematic group are to be defined by a single concept note.
[7] Aggregate results on participant assessment data for FY06 and FY07 (based on descriptive statistics rather than multiple regression analysis) are available on WBI's Quality Reports intranet site, http://intranet.worldbank.org/WBSITE/INTRANET/UNITS/WBIINT/0,,contentMDK:20326329~pagePK:135700~piPK:135698~theSitePK:136975,00.html.

2. PARTICIPANTS

2.1 Attracting and selecting appropriate participants is an important part of good learning design. Data collected via the WBICRS and from participant assessments provided evidence on which participant characteristics tended to increase ratings of learning events. These data, combined with the reflections of experienced TTLs, underscore the value of reaching the right participants rather than the most participants.

REGRESSION RESULTS

2.2 In designing the regression analysis, we hypothesized that the collective profile of participants in a learning event and the characteristics of individual respondents who assess the event might serve as explanatory factors for activity ratings. Analyses for this review explored such variables as the total number of participants, the share of female participants, the share of international representatives, whether World Bank staff were present, and the degree to which the mix of participants represented the country, the region, or the world. The significance of these factors as explanatory variables is presented in table 1.

2.3 Several variables influenced ratings for an event's overall usefulness and other related indicators of quality.
However, an important caveat accompanies findings throughout this report, relating to an issue that often arises in evaluations based on administrative data: the regression results do identify some explanatory factors, but they also show that most of the variance in participants' assessments of the overall usefulness of WBI learning activities cannot be explained (R2 = 0.17).[8] Factors related to the mix of participants that do explain some of the variance in assessments are highlighted below:

· Total number of participants: The number of participants in a learning event was negatively correlated with participants' assessments of the event's overall usefulness. In other words, larger numbers of participants in events resulted in lower assessment ratings. If the number of participants in a learning event increased by 100, the event's average usefulness ratings would drop by about 2 percent. Thus, any push to raise the number of participants (for example, to increase the number of participant training days or PTDs) comes with a small "quality" tradeoff in terms of the participants' ratings of the learning event's usefulness.[9]

· Gender: An increase in the share of female participants in an activity raised the activity's overall usefulness ratings. Although female participants typically rated WBI activities higher than their male counterparts, this pattern of higher ratings was only part of the story--the increased presence of female participants also raised males' ratings for overall usefulness of the same event.[10] One anecdotal explanation for this finding is that females bring in different perspectives and tend to engage in a more interactive learning style, creating a more highly valued learning experience for all participants. In short, there is a defensible case for including more female participants in WBI learning events, as all participants benefit more than if the event were largely male-dominated.

· Share of participants from international organizations: An increase in the proportion of participants from international organizations decreased the average overall usefulness rating of the learning event, as well as the average ratings on four other indicators. A possible explanation for this finding is that WBI events are tailored, in general, to address the practical needs of in-country clients representing their governments and civil societies, rather than the needs of staff of international organizations with more global or regional interests.

· Regional representation: A concentration of participants from a single region (rather than broader worldwide representation or single-country representation) added to an activity's average ratings on overall usefulness as well as four other indicators.

2.4 These regression results suggest that WBI might improve ratings of overall usefulness and related indicators, on average, by involving fewer participants per event than in FY04-FY05, with relatively more females but fewer representatives from international organizations. Interviews with TTLs supplement this quantitative analysis to provide a comprehensive understanding of what ingredients contribute to successful learning events for clients.

[8] R2, the coefficient of determination, is the proportion of the variation in assessment ratings that is explained by the set of explanatory variables in the regression.
[9] This analysis was not designed to identify the optimal numbers of participants for different types of learning events; however, this issue could be explored further in future analyses.
[10] Appendix B presents all regression results in table B.1, first using the ratings of all participants and then using the ratings for male participants only.

Table 1.
Participant characteristics as explanatory variables: regressions of mean ratings by participants (Extract from Appendix B)

Dependent variables (mean ratings by participants): (1) Overall usefulness, (2) Relevance to work, (3) Acquired new information, (4) Usefulness of new information, (5) Focus on learning need, (6) Matched announced objectives.

Selected explanatory variables          (1)          (2)          (3)          (4)          (5)          (6)
(partial data)
N                                       736          736          736          736          736          736
R2                                      0.17         0.18         0.11         0.17         0.13         0.11
Intercept                               0.8051***    0.8393***    0.7410***    0.7854***    0.7534***    0.8129***
Total number of participants            -0.0001***   -0.0001***   -0.0001      -0.0002***   -0.0001***   -0.0001***
Share of female participants            0.0302**     -0.0127      0.0130       0.0260*      0.0187       0.0320**
Share of international representatives  -0.0610***   -0.0022      -0.0697***   -0.0510**    -0.0829***   -0.0556**
Share of other representatives          0.0042       0.0064       -0.0009      0.0005       -0.0061      0.0035
WB staff participation?
(Dummy: Yes=1, No=0)                    0.0014       -0.0004      -0.0131**    -0.0038      -0.0057      0.0043
Representation (Reference: World representation)
  Country representation                -0.0049      -0.0181**    0.0125*      -0.0031      0.0012       0.0
  Region representation                 0.0163**     0.0016       0.0222***    0.0201***    0.0160**     0.0122

Note: ***, **, * represent levels of significance at 1 percent, 5 percent, and 10 percent, respectively. For the complete regression results, see table B.1 in appendix B.

REFLECTIONS FROM TASK TEAM LEADERS

2.5 The 27 learning activities sampled for further investigation through TTL interviews represented a broad range of participant characteristics. The majority of participants in most activities were male, and two events had no female participants. The number of participants ranged from 9 to 180. Based on data entered in the WBICRS for the sampled activities, participants were most frequently government employees, but five events were dominated by academics or educators and three events were mainly populated by private sector representatives. In addition, World Bank staff were among the audience for five of the 27 events.
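The participant-mix regressors in table 1 (total participants, share of female participants, share of international representatives) are activity-level aggregates of individual registration records. A minimal sketch of that aggregation step, using hypothetical field names (activity_id, gender, org_type) rather than actual WBICRS column names:

```python
# Sketch: collapsing individual registration records into the activity-level
# participant-mix regressors used in Table 1 (number of participants, share
# female, share from international organizations). Field names such as
# "activity_id", "gender", and "org_type" are illustrative, not actual
# WBICRS column names.

def participant_mix(records):
    """Aggregate per-participant records into per-activity profiles."""
    by_activity = {}
    for r in records:
        by_activity.setdefault(r["activity_id"], []).append(r)
    profiles = {}
    for aid, rows in by_activity.items():
        n = len(rows)
        profiles[aid] = {
            "n_participants": n,
            "share_female": sum(r["gender"] == "F" for r in rows) / n,
            "share_intl_org": sum(r["org_type"] == "international" for r in rows) / n,
        }
    return profiles

# Invented example records for two activities.
records = [
    {"activity_id": "A1", "gender": "F", "org_type": "government"},
    {"activity_id": "A1", "gender": "M", "org_type": "international"},
    {"activity_id": "A1", "gender": "M", "org_type": "government"},
    {"activity_id": "A2", "gender": "F", "org_type": "academic"},
]

profiles = participant_mix(records)
# A1: 3 participants, one-third female, one-third from international organizations.
```

One activity-level row per event, built this way, is what the regressions then take as explanatory variables alongside the mean participant ratings.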
2.6 By design, interviews with the TTLs of sampled learning events were structured to explore possible factors contributing to higher or lower participant assessment ratings. This approach elicited in-depth comments and explanations from TTLs about both the individual events in the sample and their broader experiences. Although the sample was designed to enable a comparison of high- and low-rated activities, analysis of the qualitative data from interviews indicated that responses did not differ measurably according to whether they were associated with the highest or the lowest ratings. However, analysis of TTLs' nuanced reflections based on their direct experience with WBI programs enabled this study to provide suggestions on factors that facilitate successful learning programs and recommendations to strengthen WBI in the future.

2.7 TTLs' reflections about the sampled learning events provided a unified view that participants have an important role in determining the success of learning programs. Although no interview question focused specifically on how to select participants or on the best mix to attract, 20 of the 26 TTLs volunteered their opinions on these topics and emphasized the importance of reaching the right audience. Recurrent messages included that ideal participants should:

· View their participation in the learning activity as something they have sought out to help them accomplish their goals,
· Have the authority and expertise to make relevant policy decisions following the learning event,
· Hold a position in which they can use their acquired knowledge and skills to accomplish particular development goals,
· Be higher-level policy makers or researchers who can build capacity for the government, and
· Be part of a network where they can continue to share experiences.
2.8 Developing clear strategies for identifying and reaching the target audiences for specific programs is a critical part of the event planning process, and TTLs' comments collectively outlined tips and challenges for doing this most effectively. When possible, TTLs should work with local partners, who are often best situated to attract, assess, and select appropriate participants. However, some TTLs warned that leaving the selection of participants to partners can have disadvantages, particularly in cases where partners charge a fee for an activity and are likely to accept anyone who pays it. TTLs should therefore facilitate an education process, helping partners to consider the characteristics listed above in developing their participant selection strategy.

2.9 Interviews also highlighted how participants help determine the relevance of the content and the success of the delivery (both discussed in chapter 3), how local partners can play a critical role in selecting optimal participants (chapter 4), and how the indicator of participant training days is not sufficient if the management goal is to measure (or to promote) effective learning practices (chapter 5).

3. PROGRAM DESIGN AND DELIVERY

3.1 There are many variations in the design and delivery of WBI's learning events, and lessons derived from extant data and TTLs' experiences about the roles of different factors can guide program improvements. In addition to having the right audience present, participants' ratings can be influenced by the location and objective of the program, the means of delivery, and the relevance of the content.

REGRESSION RESULTS

3.2 Together, the WBICRS and participant assessment data describe the format and delivery mode of learning events, identify their product lines, and indicate where activities were located and how long they lasted.
As shown in table 2, an analysis to explore which factors could explain participants' ratings indicated the following:

· Event location in a low-income country: Holding a WBI learning activity in a low-income country is associated with higher ratings for the event's overall usefulness compared with holding the same event in a middle- or high-income country.[11]

· Learning objective: Events that targeted skills building were rated more useful than those focused on knowledge exchange or policy service. This focus on practical applications or skills building tailored to address specific concerns is in line with WBI's country focus strategy.

· Mode of delivery: Compared with face-to-face delivery of learning events, both electronic and blended delivery rated higher in overall usefulness. By necessity, electronic or blended learning programs require advance preparation, so this finding could be due--at least in part--to the increased planning and careful preparation inherent in electronic and blended learning.

· Duration of the learning event: The length of an activity did not affect the ratings for its overall usefulness. However, participants were more likely to acquire new information in longer events.

3.3 Where appropriate, guidelines for raising participants' ratings for the overall usefulness of WBI learning offerings call for more learning events in low-income countries within the same region, additional skills-building activities, and wider use of electronic and blended learning delivery modes.

[11] While this result may be more on account of having the right participants, it is likely that the location of the learning activity encourages better participant selection. Unfortunately, we could not test these hypotheses with the available data.

Table 2.
Design and delivery characteristics as explanatory variables: regressions of mean ratings by participants (extract from appendix B)

Selected explanatory variables (partial data); dependent variables are mean ratings by participants:

                                      Overall      Relevance    Acquired new   Usefulness of    Focus on        Matched announced
                                      usefulness   to work      information    new information  learning need   objectives
N                                     736          736          736            736              736             736
R2                                    0.17         0.18         0.11           0.17             0.13            0.11
Intercept                             0.8051***    0.8393***    0.7410***      0.7854***        0.7534***       0.8129***
Duration (in days)                    0.0007       -0.0011      0.0022***      0.0013           0.0005          0.0002
Action learning (dummy: yes=1, no=0)  0.0094       0.0107*      -0.0069        0.0040           0.0096          -0.0005
Location of activity (reference: middle-income country)
  Low-income country                  0.0236***    0.0071       -0.0060        0.0195***        0.0098          0.0031
  High-income country                 0.0041       -0.0184**    -0.0072        -0.0078          -0.0047         0.0050
Learning product line (reference: Knowledge Exchange)
  Policy Service                      0.0101       -0.0049      0.0035         -0.0004          0.0024          0.0009
  Skills Building                     0.0133**     0.0015       -0.0025        0.0125**         0.0089          0.0037
Mode of delivery (reference: face-to-face)
  Distance learning                   -0.0128      -0.0262**    -0.0052        -0.0078          -0.0226**       -0.0153
  Electronic learning                 0.0312**     -0.0067      0.0292**       0.0403***        0.0319***       0.0318***
  Blended offering                    0.0152*      -0.0023      0.0085         0.0115           -0.0014         0.0111

Note: ***, **, * represent levels of significance at 1 percent, 5 percent, and 10 percent, respectively. For the complete regression results, see table B.1 in appendix B.
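To make the magnitudes in table 2 concrete, the coefficients can be combined to compare hypothetical event designs against the reference event. The sketch below is illustrative only: the coefficient values are copied from the overall-usefulness column, the event profiles are hypothetical, and the conversion back to the 1-5 rating scale assumes the ratings were standardized by dividing by 5 (an inference from appendix B, not an explicit statement in this report).

```python
# Illustrative only: coefficients copied from the overall-usefulness column of
# table 2 (standardized 0-1 scale). The event profiles below are hypothetical.
COEF_OVERALL_USEFULNESS = {
    "low_income_country": 0.0236,   # vs. middle-income reference
    "high_income_country": 0.0041,  # vs. middle-income reference
    "skills_building": 0.0133,      # vs. knowledge-exchange reference
    "policy_service": 0.0101,       # vs. knowledge-exchange reference
    "electronic_learning": 0.0312,  # vs. face-to-face reference
    "blended_offering": 0.0152,     # vs. face-to-face reference
}

def predicted_shift(*design_choices):
    """Summed shift in the standardized rating for choices that differ from the reference event."""
    return sum(COEF_OVERALL_USEFULNESS[c] for c in design_choices)

# An electronic skills-building event in a low-income country, compared with the
# reference event (face-to-face knowledge exchange in a middle-income country):
shift = predicted_shift("low_income_country", "skills_building", "electronic_learning")
print(round(shift, 4))      # shift on the standardized 0-1 scale
print(round(shift * 5, 2))  # rough equivalent in 1-5 rating points, assuming x/5 standardization
```

This is only a linear, all-else-equal reading of the regression, not a guarantee that combining design choices produces additive gains in practice.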
3.4 One interesting finding was that the use of action learning did not emerge from the regression analysis as a determinant of overall usefulness, even though there is sufficient a priori reason and other evidence to think otherwise.12 This is because the WBICRS question from which the action learning variable was derived was not accompanied by a specific definition, so TTLs lacked a common understanding of the term.13 The manner in which data on action learning were recorded in the WBICRS was too imprecise to be useful for analysis or planning purposes; however, WBI's development of an electronic activity planning tool (Plato) represents a positive step, and hopefully will allow for clarification of definitions such as "action learning."

12 Earlier evaluations have identified this variable as a significant factor that explains participants' use of knowledge and skills acquired from the WBI learning event.

13 In FY05, WBIEG conducted face-to-face interviews with 34 WBI TTLs on their use of data systems. The interviews revealed that WBI TTLs had different notions of what constituted action learning. This is not surprising because the definition of action learning was complex and had not been finalized. Also, TTLs did not have convenient access to this definition. Furthermore, TTLs--who were in the best position to describe their activity--often delegated to assistants the task of entering activity information into administrative systems. These people were not always sufficiently informed to indicate whether or not an activity used action learning. All these circumstances made the information related to action learning unreliable.

REFLECTIONS FROM TASK TEAM LEADERS

3.5 TTLs were asked to identify "the key ingredients of a successful activity" and most emphasized similar factors. Aside from the importance of having the right participants (discussed previously), the most dominant themes included using relevant content and effective materials to match local demand (17 of 26 TTLs) and the general quality of design and delivery features, often including an action learning component and the use of recognized experts as presenters (14 of 26 TTLs). Examples of these comments are in box 1.

Box 1. Illustrative comments from TTLs on what factors contribute to successful learning events

"We try to do action learning, which is matching the knowledge with something that will produce a result for the client. That doesn't always work: it's a function of the nature of the course. In some cases, it's a dialogue, so action learning doesn't make sense. In other cases, it's skills building, in which action learning makes a lot of sense. So we try to match the pedagogy to the program's objective."

"If you don't have content that is relevant, that is of high quality according to international standards, that adds value, that is trusted, then you won't have a successful activity."

"To have a successful activity, you must have something topical and relevant with interaction--not just people listening to lectures ... presentations [by participants], panels, case studies are different ways of learning."

3.6 As with the regression results, the analysis of interview data revealed a lack of common understanding about what qualified as action learning. When asked directly whether their specific activities in the sample had included an action learning component, 16 TTLs claimed that they did. However, the accompanying descriptions encompassed a range of definitions. For example, one task manager explained that the action learning component was having participants do homework individually in the evening, whereas another TTL had participants conduct economic modeling and computer simulations during the learning session but claimed that no action learning had been included.

4. PARTNERSHIPS

4.1 Learning providers in WBI have many definitions for "partnership." In interviews, TTLs used this term broadly to refer to collaborations with Bank operations, joint efforts with other international donors, agreements with local institutions, and even interactions with clients. Common contributions by partners include providing funding (including in-kind contributions), developing content, participating in event delivery (which could include providing training and/or logistical support), and offering access into knowledge networks or communities of practice. Despite the widespread reliance on and substantial contributions by partners, existing data on WBI learning programs for FY04-05 lack sufficient reliable details about partnerships to support the systematic analysis of these joint efforts. The newer Plato, at least as of June 2007, requests data about partners that may help in future analyses if the Plato data can be merged with other relevant datasets.

REGRESSION RESULTS

4.2 Limited information on partner involvement was recorded in the WBICRS or collected via participant assessment evaluations for learning events in FY04-05.
In fact, the only clear variable related to partnerships came from a field about whether or not a partner was involved "to a large extent" in delivering the offering.14 Given that the basic types of partner involvement were not captured, it is not surprising that the regressions conducted for this study did not find this variable to have any influence on participants' ratings of an event's overall usefulness or other desired outcomes. Rather than supporting any conclusions regarding the importance of partnerships, this finding underscored the need for WBI to continue to refine the collection of data on learning activities to improve their overall reliability and usefulness. WBI has achieved important progress in this area: starting in FY07, Plato prompts providers to list whether partners are involved, to indicate the role or roles the partner organization(s) fill, and to list the partner(s) by name.15

REFLECTIONS FROM TASK TEAM LEADERS

4.3 TTLs described the roles of partners for the sampled activities and noted that these roles were often shaped by political context. In some countries, a task team must collaborate closely with government officials to design and deliver any program. Client governments or local partners frequently played an instrumental or even autonomous role in selecting participants. The variety of ways that collaboration happened among WBI task teams, Bank operations, clients, local partners, and other donors therefore reflected a range of factors that affected partner involvement. Some of these factors stemmed from funding, design, and logistical considerations; others arose through political necessity.

14 See appendix A for descriptions of specific variables in the merged dataset.

15 The roles listed do not link directly back to specific partners, so the precise contribution of individual organizations is still unclear when multiple partners are listed.
4.4 The learning events sampled for TTL interviews reflected various types and intensities of partnerships. Some of the events were delivered entirely by partners, without any Bank staff present. In 19 of the 26 cases, partners had the main responsibility for selecting and screening participants. Eight of the 26 events involved the efforts of multiple partners.

4.5 Despite the varying experiences with collaboration, TTLs expressed similar views on the factors needed for a successful partnership. Recurrent themes were that, in a successful partnership, all partners should:

· Make a clearly identified contribution (that is, they "add value") to the event,
· Have a long-term or ongoing commitment to collaborate with WBI, and
· Have a long-term view of their work and a plan for how to generate the income or other funding necessary to continue their work.

4.6 In addition to these characteristics of successful partnerships in general, nine of the 26 TTLs emphasized that working with a local partner to ensure that the right participants were selected and that logistics were handled smoothly was critical to the success of a learning event. Comments from TTLs illustrating these common themes are in box 2.

Box 2. Illustrative comments from TTLs about partners

Characteristics of a successful partnership

"A successful partnership is where one begins to develop or take on the full responsibility and actually continue on one's own--even after we have left."

"We depend on the partner for delivering or developing content with passion and commitment. We help develop operational sustainability. Our partner has had a relationship with us for about four years and we have seen them grow successfully. The credit should be shared mutually and not be single sided."

"Complementarity with the partner is key... one has to make strategic choice[s] and differentiate between content partners, logistic partners, technical partners.
One should make sure that a mix of these important factors are met in the actual partnership as everybody should ... bring a particular contribution to the table."

"You have to find a reliable partner, a partner that is not only after the money but also contributes towards the objective and that is not one sided and shares risks."

Importance of a local partner

"If you get the local institutions involved in the activity then you have a higher likelihood of succeeding. For training, the local institutions should be involved to develop part of the content as they know the participants, know the problems, the culture. They can easily get the right participants. In terms of preparation, this is the key."

"You need a local partner with appropriate capacity to organize the event...facilities [are] the key in developing countries and contribute 25-30 percent for a successful activity... if the participants are not comfortable, they may lose interest. I am a believer in relying on many local partners as competent as possible to ensure the delivery."

4.7 Overall, the interviews highlighted differences in how WBI partnerships are defined or characterized. Some TTLs characterized their work with counterparts in Bank operations as partnerships, while others specifically noted that they had not collaborated with partners and had teamed with staff in operations instead. One TTL wondered if "partner" was really just another term for their client. Some TTLs identified local or international organizations that played a major role in activity design and delivery, whereas others referred to only peripheral involvement. In at least one case, a partner described as having a "minor role" contributed content, identified local experts and presenters, selected the participants, and handled logistics.

5.
PERFORMANCE WITHIN THE WORLD BANK INSTITUTE

5.1 Although some activity-level data needed improvement (particularly the variables on partnerships and action learning), merging the data from the WBICRS and participant assessments for the 736 FY04-05 activities in the dataset provided a useful vehicle for exploring how well WBI was performing across units and over time. These findings, supplemented with insights from TTLs, inform ideas for increasing the quality and impact of WBI programs, including both external training for individuals and broader capacity development initiatives.

REGRESSION RESULTS

5.2 As shown in table 3 and detailed below, participant assessment ratings varied between WBI divisions and over time:

· WBI division: Participants' average ratings for overall usefulness were explained at least partly by the unit that managed and delivered the learning event. Learning events had higher average ratings when managed and delivered by WBI's Environment and Natural Resource Management Division (WBIEN), Poverty Reduction and Economic Management Division (WBIPR), Human Development Division (WBIHD), or other divisions (WBIOTHER) rather than by WBI's Finance and Private Sector Development Division (WBIFP).16 Of these units, WBIPR had the largest positive effect on overall usefulness, followed by WBIOTHER, WBIEN, and WBIHD. This ordering changed, albeit moderately, when considering alternative success indicators such as the relevance of the activity to participants' current work, the extent to which the activity provided new information, the usefulness of the acquired information, the focus of the activity on the learning needed by the participant, and the degree to which the activity matched the announced objectives.

· Fiscal year: Overall usefulness ratings for WBI learning events rose in the second half of FY05 compared with the first six months of FY04, the base period.
This is a gain of 13 percent in this indicator from what was possible at the base period.17 Similar gains are shown for other success indicators. These results attest to notable improvements, from the viewpoint of participants, in the conduct and delivery of WBI learning events over the FY04-05 period.

16 WBIOTHER includes the following WBI unit codes: WBIST, WBISM, WBIKL, WBIKD, WBIES, WBIEG, WBIMO, and WBIRC.

17 The FY04 (first half) mean for overall usefulness was 4.28, based on a 5-point scale. The possible gain over time from this base is 0.72.

Table 3. WBI division and timing as explanatory variables: regressions of mean ratings by participants (extract from appendix B)

Selected explanatory variables (partial data); dependent variables are mean ratings by participants:

                                      Overall      Relevance    Acquired new   Usefulness of    Focus on        Matched announced
                                      usefulness   to work      information    new information  learning need   objectives
N                                     736          736          736            736              736             736
R2                                    0.17         0.18         0.11           0.17             0.13            0.11
Intercept                             0.8051***    0.8393***    0.7410***      0.7854***        0.7534***       0.8129***
Division (reference: WBIFP)
  WBIEN                               0.0239***    0.0352***    0.0289***      0.0349***        0.0267***       0.0130
  WBIPR                               0.0343***    0.0410***    0.0245***      0.0369***        0.0309***       0.0282***
  WBIHD                               0.0210**     0.0528***    0.0255**       0.0377***        0.0282***       0.0266***
  WBIOTHER                            0.0302**     0.0365**     0.0058         0.0308**         0.0133          0.0231
Fiscal year (reference: first half of FY04)
  Second half of FY04                 -0.0037      0.0075       0.0009         0.0009           -0.0037         -0.0066
  First half of FY05                  0.0103       0.0128       0.0104         0.0104           0.0126          0.0068
  Second half of FY05                 0.0193***    0.0094       0.0167**       0.0167**         0.0168**        0.0130*

Note: ***, **, * represent levels of significance at 1 percent, 5 percent, and 10 percent, respectively. For the complete regression results, see table B.1 in appendix B.
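The "13 percent" gain cited in the text can be reproduced from footnote 17 and table 3. The short check below rests on one inference not stated explicitly in the report: that the 1-5 ratings were standardized to [0, 1] by dividing by 5, so the FY05 coefficient of 0.0193 corresponds to roughly 0.10 rating points out of a possible 0.72.

```python
# A worked check of footnote 17's "13 percent" figure. Assumption (inferred, not
# stated in the report): the 1-5 ratings were standardized to [0, 1] by dividing
# by 5, so a coefficient of 0.0193 equals 0.0193 * 5 rating points on the 1-5 scale.
base_mean = 4.28                        # FY04 (first half) mean overall usefulness
max_rating = 5.0
possible_gain = max_rating - base_mean  # 0.72 rating points, as in footnote 17

coef_fy05_h2 = 0.0193                   # Table 3, second half of FY05, standardized scale
gain_points = coef_fy05_h2 * max_rating # back on the 1-5 rating scale

share = gain_points / possible_gain
print(f"{share:.1%}")                   # about 13 percent, matching the text
```

Under this standardization the arithmetic lands close to the reported figure; a different standardization (such as subtracting 1 and dividing by 4) would not.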
REFLECTIONS FROM TASK TEAM LEADERS

5.3 Interview sessions with TTLs ended with questions on how to improve learning programs and capacity development activities carried out by WBI.18 Recommendations focused on:

· Improving the coordination between WBI and Bank operations (13 of 26 TTLs),
· Ending the emphasis on participant training days (12 of 26 TTLs), and
· Integrating evaluation better into program design and delivery (8 of 26 TTLs), including
  o Developing methods or tools to collect feedback and adjust delivery during longer events (rather than just through a questionnaire at the end),
  o Having WBIEG rather than the unit delivering the activity develop customized questions for participant assessments and analyze the results, and
  o Establishing a system for following up with participants six months or more after an event to ask about the use of what they learned or other ways in which their participation in the WBI activity affected their work.

18 As noted in appendix C, TTLs were asked how they would (a) improve the quality and impact of WBI activities and (b) change the ways WBI carries out its capacity development mission. They were also asked what they would change if they were a Bank leader. Answers to these three questions were interchangeable and are discussed collectively in this chapter.

5.4 Quotes illustrating frequent recommendations by TTLs are in box 3. Other less frequent suggestions by those interviewed included using a multiyear cycle for programming and budgeting (7 of 26 TTLs) and strengthening WBI management (6 of 26 TTLs) by eliminating at least one layer of management and by requiring that all managers serve as task manager for at least one learning event per year and as team member for at least one learning event per year. Finally, four of 26 TTLs highlighted the need to emphasize quality content and delivery to compete better with universities and the importance of tailoring offerings to local demand based on needs assessments.

Box 3.
Illustrative recommendations from TTLs on how to improve the quality and impact of WBI learning programs and capacity development initiatives

End the emphasis on PTDs

"Look at the incentive structure of TTLs. The incentive structure is designed to increase the number to have a better PTD. Instead of a large group of participants, I would rather train a small group of policy makers who actually change policy in a country."

"If WBI ever gets its act together and eliminates the whole concept of PTDs and focuses instead on selection and quality--on what content are we teaching, who are we teaching it to, does it have any impact--it would eliminate half my stupid load. It would allow me to actually follow up. The biggest problem is we have no follow-up capacity. We are obsessed with pushing stuff through the system, training as many people as you can train in any way you can."

Coordinate better with Bank operations

"In operations, people do not know that WBI has the capacity to respond to situations that could be addressed well and they keep going around searching web sites to select training institutes in countries to do the training."

"We need RCET [WBIRC] help to establish and operationalize the link with operations on capacity building. Operations should keep us [WBI] in their radar not just to be informed of the training but to help design the capacity development components of the loan itself, brought in at the very beginning along with loan documents."

Integrate evaluation more fully into programming

"There should be a feedback process to establish a link with the policy makers in order to ensure an impact of the training activity. In simple terms, the training activity should directly produce the desired results, with diagnostic tools provided to be applied to the local data and be used by policy makers in finance, planning, and so on. Otherwise there will be no value added in this type of training activities."
"We do level 1s as required, but we do [daily assessments] in our programs as feedback, and we publish the results the following day...we make necessary adjustments immediately which is more important that the level 1 results which come later. My clients are there for one week to be dealt with on the spot!" 17 6. RECOMMENDATIONS 6.1 The emphasis of WBI programs has evolved since the scope of this review was established, with efforts increasingly focusing on multiyear programmatic initiatives to develop organizational capacity rather than just on PTDs. Although this review is based on earlier data, results of this analysis shed light on the factors that are likely to increase the quality of learning or capacity development. Specifically, the factors identified are associated with higher participant assessment ratings, but they do not necessarily result in better outcomes. WBI managers and TTLs should consider and apply the findings of this review when appropriate. The following recommendations outline advice for programs, suggest organizational improvements within WBI, and identify steps needed to improve the quality of data on learning activities and participant characteristics. · Considerations for program design and delivery. TTLs consider a range of factors in delivering WBI's programs--from available resources and cost effectiveness to feasibility and potential impact. As task teams plan events and encounter trade-offs, findings from this review can inform decisionmaking to maximize programs' usefulness. 
Specifically, participants perceive programs to be more useful if events have:

o Fewer participants overall,
o A greater share of female participants,
o A smaller share of international representatives participating, as opposed to government representatives,
o A focus on skills building,
o A delivery in a low-income country, and
o A delivery mode of electronic or blended learning, or a level of advance planning and pedagogical structure comparable to those needed for electronic or blended learning.

Furthermore, working with local partners is advised to optimize the selection of participants, the adaptation of content, and the logistical details of an event's delivery.

· Organizational improvements within WBI. Several of the changes suggested by TTLs in interviews are already underway in WBI: most notably, the program emphasis is shifting to a multiyear focus on initiatives at the organizational level rather than on learning events for individuals, and management is actively exploring alternative indicators to supplement the use of PTDs. Findings from this review provide further guidance within this context:

o Links between WBI programs and Bank operations. WBI TTLs have varying degrees of interaction with operational staff, and many reported that this coordination happens only on an ad hoc basis. Increased guidance and support from WBIRC is needed to establish more systematic coordination. Suggestions include involving WBI counterparts early in the Bank's lending process or bringing WBI staff into the Bank's matrix whereby they report also to operational managers.

o Performance indicators. TTLs expressed frustration with being assessed through PTDs, but they also demonstrated an understanding that some such indicators are needed to gauge performance. This study underscored the current opportunity to continue decreasing the emphasis on counting the number of participants and to consider this measure as part of an overall package.
Assessments of and incentives for TTLs should focus on reaching the right participants, delivering quality programming, and following up as necessary or appropriate to ensure intended impact.

o Evaluation services and tools. TTLs want evaluation to be more fully integrated into their programs, and this perspective is aligned with WBI's growing focus on coherent programs rather than individual events, as well as on a results framework for capacity development programs. Ideally, evaluation practices should support and guide the programmatic approach by defining the program's logic at the outset, collecting baseline data, designing a monitoring and evaluation plan that includes participants' assessments, and following up with participants over time. This holistic approach will provide more meaningful direction to programs than simply collecting participant assessments at the conclusion of individual activities. At minimum, TTLs want customized questions to ask their participants and guidance on how best to follow up with participants later. Ideally, WBIEG should be engaged at the start of the program and have the capacity to help TTLs shape appropriate evaluation questions, strategies, and analyses.

o Strengthening management. To ensure that WBI managers can supply the needed guidance and support to TTLs, they should be directly familiar with both the overall strategy and the practical details of the relevant WBI learning programs. Such familiarity could enable managers to identify and overcome barriers to optimal performance. A systematic approach of requiring managers to lead or serve on at least one task team may be warranted given the dominance of this theme in TTL interviews.

· Increasing data quality to support WBI's programmatic approach. Although the data for this review were from FY04-05, the preparation and analysis of the dataset highlighted lessons to improve systems and practices in FY08 and beyond:

o Defining variables.
The process of merging data from the WBICRS and participant assessment evaluations revealed several cases where variables with similar coding had different meanings in different datasets (see appendix A). Efforts to standardize the definitions of all variables should continue. Of particular concern are the variables for action learning and partnerships, two important factors in learning programs whose contributions cannot be understood without better data.

o Supporting monitoring and evaluation. The growing repository of data available on WBI activities should be mined and analyzed regularly as part of a more integrated approach to program evaluation. To support such a process and to ensure that analyses such as this one are accurate and up-to-date, learning activity data should be readily accessible to evaluation officers, for example, through a full database query available at any time without the intervention of programmers. In addition, a transparent codebook is needed to promote a common understanding of all variables therein.

o Aligning systems thinking to WBI programming. The WBICRS, Plato, and SAP all contain valuable data on learning activities; however, it is not yet clear how all of this information will be harvested and used. A detailed systems strategy is needed to increase the coordination among data systems and to minimize the reporting burden for TTLs. As part of this strategy, data entry protocols are needed to promote consistency. The changes in the SAP system format for accounting records for external training (TE) activities represent a positive step in this direction. New changes in Plato are also expected to be helpful, and management should emphasize collection and analysis of data from SAP, Plato, and the WBICRS, as well as from participant assessments, to continue learning how to improve WBI's programs.
6.2 Activity-specific data and participant perceptions of the relevance and usefulness of individual events are important building blocks for higher-level indicators linked to organizational development. Continuing to improve the reliability of these data and using them to inform decisions on program design and delivery can contribute to the success of WBI's programmatic approach.

APPENDIX A: WBICRS AND LEVEL 1 EVALUATION VARIABLES

Variable: Additional explanation

Division: Five-letter abbreviation for the WBI division. These are not necessarily the same across fiscal years.
Thematic Program: One of 18 or so WBI thematic programs
WPA Number: Work program agreement number -- corresponds to a thematic program for a given fiscal year
WBS Element: P0 number
Schedule Code: Number that comes from the Learning Catalog -- one schedule code per site
Name of Offering: Activity title
TM: Name of task manager
Location: City and country (or sometimes only country) where the activity actually took place; venue
Start: Start date of the activity
End: End date of the activity
N Participants: Number of participants registered in the activity
N Observers: Number of observers registered in the activity
N Resource People: Number of resource people registered in the activity
1.1 Participants: Response "Participant" to "Which of the following best describes your main role in this activity?"
1.2 Observers: Response "Observer" to "Which of the following best describes your main role in this activity?"
1.3 Resource Persons: Response "Resource person" to "Which of the following best describes your main role in this activity?"
1.4 Other: Response "Other" to "Which of the following best describes your main role in this activity?"
2.1 Attended all: Response "All of it" to "How much of the activity were you able to attend?"
2.2 Attended most: Response "Most of it" to "How much of the activity were you able to attend?"
2.3 Attended half or less: Response "Half or less of it" to "How much of the activity were you able to attend?"
3.1 WB Employee: Response "Yes" to "Are you a World Bank employee?"
3.2 Not WB Employee: Response "No" to "Are you a World Bank employee?"
4.1 Male: Response "Male" to "Are you"
4.2 Female: Response "Female" to "Are you"
5. Relevance: Response to "Relevance of this activity to your current work or functions," rating on a 1-5 scale (1=minimum; 5=maximum)
6. New info: Response to "Extent to which you have acquired information that is new to you," rating on a 1-5 scale (1=minimum; 5=maximum)
7. Useful info: Response to "Usefulness for you of the information that you have acquired," rating on a 1-5 scale (1=minimum; 5=maximum)
8. Learning: Response to "Focus of this activity on what you specifically needed to learn," rating on a 1-5 scale (1=minimum; 5=maximum)
9. Objectives: Response to "Extent to which the content of this activity matched the announced objectives," rating on a 1-5 scale (1=minimum; 5=maximum)
10. Overall useful: Response to "Overall usefulness of this activity," rating on a 1-5 scale (1=minimum; 5=maximum)
Action Learning: Whether the activity was designated as action learning or not; values can be "Yes" or "No"
Mode of delivery: Mode of delivery of the activity; could be face-to-face, distance learning, electronic learning, blended learning, others
Partner involvement from CRS: "Was a partner involved to a large extent in delivering this offering?" Can be Y or N: Y = partner-led; N = WBI-led
Activity Type: Type of activity -- values can be course, seminar, clinic/workshop, conference, global dialogue, study tour, e-discussion, others
Scheduled Duration: Number of days for which this activity is scheduled
Participant Training Days: Number of participant training days recorded for this activity
Actual Duration: Number of days the activity lasted.
There may be several variables of "actual duration" because there may be a different duration under each activity
GDLN: Whether the activity was a GDLN activity -- probably a Yes/No variable. Note that some WBI distance learning courses did not make use of GDLN facilities.
AMS Send: Whether the activity was sent to AMS -- probably a Yes/No variable
Due Level 1: Whether an evaluation was required for this activity or not; in other words, whether this activity would appear on the Due Level 1 report. (Activities lasting 1 day or less or conferences are exempt)
Product Line: Values can be T=skills building; P=policy service; K=knowledge exchange; others
Requested by: Also known as source of demand: EC=external client; OP=operations; OT=others; other
Pillar: Sector & thematic; knowledge; regional; global programs; other
Used 10 questions?: Whether the evaluation used the old six-question format or the current ten-question format
Participant contact info?: How much contact information was in the CRS for the participants in this activity -- values can be: All have contact info; Some have contact info; Participants entered but none have contact info; Summary only (individual participants did not enter)
Country - location of the activity: Country in which the activity took place. DL activities may have more than one location or an unspecified location. EL activities are coded "Worldwide"
Country - where participants live and work: There could be as many countries listed here as there are participants
Number of participants from country: If there is more than one participant from a country, the CRS should indicate how many participants are from that country
Region - location of the activity: Region corresponding to the country in which the activity took place. There may be activities with more than one location or with an unspecified location
Region - where the participants in this activity come from
CRS L1 Variable Additional explanation Participant roles Roles the participants were registered as having, and how many in each role -- values could be Resource: Donor Resource: Observer Resource: Organizer Resource: Lecturer Resource: Other Participant: Client country Participant: IMF Participant: World Bank Participant: Other international organization Participant: Other Participant representations To what type of organization did the participant belong, and how many participants were from each type of organization M: Minister P: Parliamentarian G: Government official A: Academics/education J: Media/journalist H: NGO/research institute V: Private sector B: Bilateral aid donor I: IMF O: International organization Z: Other W: WB Participant education level Education level of participants, and how many participants with each education level Priority country Calculated by the results database based on a table of priority countries Focus country Calculated by the results database based on a table of focus countries Low-income focus country Whether this was one of the six low-income focus countries Probably calculated on the basis of other variables Economy Classification of the economy, from the World Development Report -- LIC=low-income; LMC=lower-middle-income; UMC=upper-middle- income; OECD=high-income OECD countries; OHI=high-income countries other than OECD -- probably calculated by the results database based on a table of countries and economies Note: =Included; = Not included 23 APPENDIX B: DATA AND METHODS SELECTION OF WBI LEARNING ACTIVITIES FOR EVALUATION For both our quantitative and qualitative analysis, we focused on FY04 and FY05 WBI learning activities only. Although we merged and cleaned WBICRS and participant assessment (Level 1) data from FY02-FY05, we excluded activities before FY04 because we found some key variables were not available or were not reliable. For FY04-05, the total number of WBI learning activities delivered was 815. 
From this total, we dropped 76 activities of one day or less in duration, leaving 739 activities for analysis.19

REGRESSION ANALYSIS

We used multiple regression analysis to investigate the key factors that might explain participants' ratings of an activity's overall usefulness and of other desired features and outcomes of the learning event (relevance, provision of new information, and focus of the particular activity on what the respondent specifically needed to learn).20 We used OLS estimation procedures to test the following model:

Y_i(\text{overall usefulness}) = \beta_0 + \sum_{j=1}^{n} \beta_j (\text{activity features})_{ij} + \sum_{j=1}^{n} \gamma_j (\text{respondent characteristics})_{ij} + \sum_{j=1}^{n} \delta_j (\text{exogenous variables})_{ij} + \varepsilon_i

We constructed several important activity-level variables, such as the total number of participants; the shares of government, international, or other representatives; and the share of female participants. For example, to capture the significance of country or regional representation in a particular activity, we created two dummy variables, RCOUNTRY and RREGION. The former equals 1 when 80 percent or more of the participants in an activity come from the same country, and zero otherwise. The latter equals 1 if 80 percent or more of the participants come from the same region. Detailed procedures for creating all the new variables used in the regression analysis are given in Appendix D. In all, the regressions are intended to answer the question: what key participant characteristics, activity-level variables, and exogenous factors can explain participants' ratings of WBI learning events?21

19 However, due to missing values, we used only 736 of the 739 available observations in our regression analysis.
20 We standardized these dependent variables by converting their values to the interval [0, 1].
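The 80-percent rule behind RCOUNTRY and RREGION can be made concrete with a short sketch. This is an illustrative reimplementation, not the study's actual code; the function name, the per-activity list of participant countries, and the country-to-region lookup are all assumptions.

```python
from collections import Counter

def representation_dummies(participant_countries, country_to_region, threshold=0.8):
    """Return (RCOUNTRY, RREGION) for one activity.

    RCOUNTRY = 1 if >= 80% of participants share a single country.
    RREGION  = 1 if RCOUNTRY = 0 but >= 80% share a single region
    (the study computes RREGION only when RCOUNTRY = 0).
    """
    n = len(participant_countries)
    if n == 0:
        return 0, 0
    top_country_share = Counter(participant_countries).most_common(1)[0][1] / n
    rcountry = 1 if top_country_share >= threshold else 0
    rregion = 0
    if rcountry == 0:
        regions = [country_to_region[c] for c in participant_countries]
        top_region_share = Counter(regions).most_common(1)[0][1] / n
        rregion = 1 if top_region_share >= threshold else 0
    return rcountry, rregion

# Hypothetical activity: 4 of 5 participants from the same country (80%)
regions = {"Kenya": "AFR", "Uganda": "AFR"}
print(representation_dummies(["Kenya"] * 4 + ["Uganda"], regions))  # (1, 0)
```

A country-diverse but region-homogeneous activity would instead yield (0, 1), which is why the two dummies can be entered in the same regression without overlap.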
21 We also investigated other possible explanatory variables that are not reported here, i.e., language of instruction, number of sessions, and whether the average activity participant attended all, attended half, or attended most of the learning event. We dropped these variables from the final regressions because they either (a) contained many missing values; (b) had unclear definitions; or (c) could be captured by a more precisely defined variable already used in the regressions (to avoid multicollinearity).

Table B.1. Regressions of mean ratings by all participants and by male participants

Panel A. Dependent variables: mean ratings by all participants

Explanatory variables | Overall usefulness | Relevance to work | Acquired new information | Usefulness of new information | Focus on learning need | Matched announced objectives
N | 736 | 736 | 736 | 736 | 736 | 736
R2 | 0.17 | 0.18 | 0.11 | 0.17 | 0.13 | 0.11
Intercept | 0.8051*** | 0.8393*** | 0.7410*** | 0.7854*** | 0.7534*** | 0.8129***
Total number of participants | -0.0001*** | -0.0001*** | -0.0001 | -0.0002*** | -0.0001*** | -0.0001***
Duration (in days) | 0.0007 | -0.0011 | 0.0022*** | 0.0013 | 0.0005 | 0.0002
Share of female participants | 0.0302** | -0.0127 | 0.0130 | 0.0260* | 0.0187 | 0.0320**
Share of international representatives | -0.0610*** | -0.0022 | -0.0697*** | -0.0510** | -0.0829*** | -0.0556**
Share of other representatives | 0.0042 | 0.0064 | -0.0009 | 0.0005 | -0.0061 | 0.0035
Dummy variables (Yes=1, No=0):
WB staff participation | 0.0014 | -0.0004 | -0.0131** | -0.0038 | -0.0057 | 0.0043
Action learning | 0.0094 | 0.0107* | -0.0069 | 0.0040 | 0.0096 | -0.0005
Partner led | 0.0061 | 0.0035 | 0.0083 | 0.0086* | 0.0016 | 0.0057
Division (reference: WBIFP):
WBIEN | 0.0239*** | 0.0352*** | 0.0289*** | 0.0349*** | 0.0267*** | 0.0130
WBIPR | 0.0343*** | 0.0410*** | 0.0245*** | 0.0369*** | 0.0309*** | 0.0282***
WBIHD | 0.0210** | 0.0528*** | 0.0255** | 0.0377*** | 0.0282*** | 0.0266***
WBIOTHER | 0.0302** | 0.0365** | 0.0058 | 0.0308** | 0.0133 | 0.0231
Location of activity (reference: Middle-income country):
Low-income country | 0.0236*** | 0.0071 | -0.0060 | 0.0195*** | 0.0098 | 0.0031
High-income country | 0.0041 | -0.0184** | -0.0072 | -0.0078 | -0.0047 | 0.0050
Learning product line (reference: Knowledge exchange):
Policy service | 0.0101 | -0.0049 | 0.0035 | -0.0004 | 0.0024 | 0.0009
Skills building | 0.0133** | 0.0015 | -0.0025 | 0.0125** | 0.0089 | 0.0037
Mode of delivery (reference: Face to face):
Distance learning | -0.0128 | -0.0262** | -0.0052 | -0.0078 | -0.0226** | -0.0153
Electronic learning | 0.0312** | -0.0067 | 0.0292** | 0.0403*** | 0.0319*** | 0.0318***
Blended offering | 0.0152* | -0.0023 | 0.0085 | 0.0115 | -0.0014 | 0.0111
Representation (reference: World representation):
Country representation | -0.0049 | -0.0181** | 0.0125* | -0.0031 | 0.0012 | 0
Region representation | 0.0163** | 0.0016 | 0.0222*** | 0.0201*** | 0.0160** | 0.0122
Fiscal year (reference: First half of FY04):
Second half of FY04 | -0.0037 | 0.0075 | 0.0009 | 0.0009 | -0.0037 | -0.0066
First half of FY05 | 0.0103 | 0.0128 | 0.0104 | 0.0104 | 0.0126 | 0.0068
Second half of FY05 | 0.0193*** | 0.0094 | 0.0167** | 0.0167** | 0.0168** | 0.0130*

Panel B. Dependent variables: mean ratings by male participants

Explanatory variables | Overall usefulness | Relevance to work | Acquired new information | Usefulness of new information | Focus on learning need | Matched announced objectives
N | 736 | 736 | 736 | 736 | 736 | 736
R2 | 0.17 | 0.18 | 0.10 | 0.18 | 0.13 | 0.10
Intercept | 4.0202*** | 4.2092*** | 3.6681*** | 3.9009*** | 3.7721*** | 4.1088***
Total number of participants | -0.0007*** | -0.0006*** | -0.0003 | -0.0007*** | -0.0007*** | -0.0009***
Duration (in days) | 0.0046 | -0.0058 | 0.0106*** | 0.0104 | 0.0041 | 0.0004
Share of female participants | 0.1385** | -0.0512 | 0.0529 | 0.0482 | 0.0269 | 0.1246*
Share of international representatives | -0.2772** | -0.0522 | -0.2201* | -0.2143* | -0.2559* | -0.2798**
Share of other representatives | 0.0194 | 0.0607 | 0.0152 | -0.0209 | -0.0384 | 0.0124
Dummy variables (Yes=1, No=0):
WB staff participation | 0.0005 | -0.0451 | -0.0804** | -0.01027 | -0.0457 | 0.0207
Action learning | 0.0527* | 0.0609* | -0.0459 | 0.0188 | 0.0614 | -0.0063
Partner led | 0.0336 | 0.0182 | 0.0733** | 0.0431 | -0.0049 | 0.0182
Division (reference: WBIFP):
WBIEN | 0.1196*** | 0.8152*** | 0.0523*** | 0.1743*** | 0.1342*** | 0.0617
WBIPR | 0.1686*** | 0.2177*** | 0.1372*** | 0.2046*** | 0.1764*** | 0.1398***
WBIHD | 0.0943* | 0.2949 | 0.1372*** | 0.1751*** | 0.1250** | 0.1231**
WBIOTHER | 0.1731** | 0.2405 | 0.0961 | 0.1145 | 0.0982 | 0.1170
Location of activity (reference: Middle-income country):
Low-income country | 0.1265*** | 0.0422 | -0.0062 | 0.1097*** | 0.0502 | 0.0226
High-income country | 0.0061 | -0.0717* | -0.0151 | -0.0515 | -0.0166 | 0.0182
Learning product line (reference: Knowledge exchange):
Policy service | 0.0591 | -0.0124 | 0.0063 | 0.0072 | 0.0162 | 0.0239
Skills building | 0.0840*** | 0.0090 | -0.0137 | 0.0841*** | 0.0659** | 0.0399
Mode of delivery (reference: Face to face):
Distance learning | -0.0473 | -0.1526*** | -0.0603 | -0.0127 | -0.1000* | -0.0547
Electronic learning | 0.1824*** | -0.1103* | 0.1483** | 0.1954** | 0.1426** | 0.1612**
Blended offering | 0.0801* | -0.0418 | 0.0195 | 0.0812* | 0.0035 | 0.0806*
Representation (reference: World representation):
Country representation | -0.0493 | -0.1370*** | 0.0554 | -0.0141 | -0.0106 | -0.0199
Region representation | 0.0669* | -0.0170 | 0.1126*** | 0.1062*** | 0.0895** | 0.0376
Fiscal year (reference: First half of FY04):
Second half of FY04 | -0.0179 | 0.0469 | -0.0132 | 0.0117 | -0.0328 | -0.0482
First half of FY05 | 0.0459 | 0.0582 | 0.0573 | 0.0530 | 0.0515 | 0.0154
Second half of FY05 | 0.0958** | 0.0574 | 0.1013** | 0.0923** | 0.0768* | 0.0413

Note: ***, **, and * denote significance at the 1, 5, and 10 percent levels, respectively.

SELECTION OF WBI TTLS FOR INTERVIEWS

A typical WBI Level 1 evaluation asks participants, immediately after the event, to rate a learning activity on a five-point scale against: (a) relevance, (b) quality of instruction, (c) usefulness and job applicability, (d) achievement of stated objectives, and (e) overall usefulness. For each of these five criteria, we sampled from the 10 percent highest-rated and the 10 percent lowest-rated FY04-05 WBI activities, and we report on the key features of these activities that relate to their extreme ratings.

Table B.2. Proposed vs. actual number of sampled activities for TTL interviews

Participant assessment (Level 1) evaluation criteria | Sampled from highest 10% | Sampled from lowest 10%
(a) Relevance | 3 | 3
(b) Quality of instruction | 3 | 3
(c) Usefulness and job applicability | 3 | 3
(d) Achievement of stated objectives | 3 | 3
(e) Overall usefulness | 3 | 3
TOTAL | 15 | 15

We did not know a priori how an activity's ratings on the above criteria were correlated. Nor did we know whether these criteria really measure different attributes of learning events, as intended. It is also possible that certain activities, or WBI programs, were consistently among the best (or worst) on specific criteria, indicating particular strengths (or weaknesses). We interviewed the WBI TTLs of the sampled learning activities to understand more clearly the features of these events that contributed to their highest (and lowest) rankings. We also investigated whether TTLs share the same understanding of key questions (or variables) used to describe their activities in the WBICRS, such as whether the activity they managed involved "action learning" or "working with partners." This inquiry was intended to clarify, refine, and standardize the definitions used in the WBICRS, thereby making the entire dataset more useful. Appendix C lists the guideline questions used in the TTL interviews.

APPENDIX C: TOPIC GUIDE AND QUESTIONS FOR TTL INTERVIEWS

Factors for a successful learning program
- What, in your opinion, are the key ingredients for a successful activity?

The nature of WBI's partners
- What role do they play in the partnership? What types and levels of involvement?
- What is a successful partnership?
- What has happened with this partnership since the activity (outcome)?

Client/staff composition of participants
- Were there World Bank staff? How many?
- Do you think it makes a difference when staff attend activities? Why or why not?
- What is their level of engagement?
Single versus multicountry offerings
- If single: Why was this offered in country X? According to the CRS summary sheet (provide copy), this activity was / was not broadcast regionally / globally. Is that correct? Why did you decide to broadcast regionally / globally?
- If multicountry: Why was this offered in countries X, Y, and Z? Does the number of countries participating influence the success of the activity?

Presenter and facilitator selection
- How did you select the presenter(s)? Facilitator(s)? Why?

Participant selection
- How were participants selected? Open or by invitation? Advertised? Where? Screened? How? By whom?
- Did participants pay? How much?
- How were partners involved in participant selection?

Action learning
- Did you use any action learning in this activity? Why? What kind? Examples?

Follow-up
- What has happened since the activity? Why?

Recommendations
- What do you think could be done to improve the quality and impact of WBI activities?
- If you could change how WBI carries out its capacity development mission, what would you change and how? (If you were the president of the Bank, what would you change?)
- What advice would you give management (or your peers) on how to change the ways WBI carries out capacity development?

APPENDIX D: LIST OF VARIABLES USED FOR REGRESSION ANALYSIS

Dependent variables

(We consider activity-wise mean ratings by male participants only, as well as by male and female participants together, standardized to [0, 1].)

Overalluseful: Overall usefulness of this activity.
Relevance: Relevance of this activity to your current work or function.
Newinfo: Extent to which you have acquired information that is new to you.
Usefulinfo: Usefulness for you of the information that you have acquired.
Learning: Focus of this activity on what you specifically needed to learn.
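The [0, 1] standardization of these dependent variables is presumably a linear rescaling of the five-point rating scale; the report does not spell out the transformation, so the following sketch is an assumption:

```python
def standardize_rating(mean_rating, lo=1.0, hi=5.0):
    """Linearly rescale a mean rating on the [lo, hi] scale to [0, 1]."""
    return (mean_rating - lo) / (hi - lo)

# A raw mean rating of 4.22 on the five-point scale maps to about 0.805
print(round(standardize_rating(4.22), 3))
```

Under this reading, intercepts near 0.8 in the all-participant panel of Table B.1 correspond to raw ratings near 4.2, while the male-participant panel, with intercepts near 4, appears to report coefficients on the unstandardized five-point scale.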
Independent variables

TPARTICIPANT = total number of participants = INTERNAL_TOTAL + EXTERNAL_TOTAL + DONORS_OBSERVER_TOTAL + RESOURCE_TOTAL

PFEMALE = share of female participants = (external female + internal female) / TPARTICIPANT

TREP = REP_MIN + REP_PARLIAMENTARIANS + REP_GOV_OF + REP_ACADEMICS + REP_JOURNALISTS + REP_NGO + PRIVATE_SECTOR_REPRESENTATIVES + INTERNATIONAL_REPRESENTATIVES + IMF_REPRESENTATIVES + AID_DONORS_REPRESENTATIVES + OTHER__REPRESENTATIVES

PINTREP = share of international representatives = (INTERNATIONAL_REPRESENTATIVES + IMF_REPRESENTATIVES) / TREP

PGOVREP = share of government representatives = (REP_MIN + REP_PARLIAMENTARIANS + REP_GOV_OF) / TREP

POTHREP = share of other representatives = 1 - (PGOVREP + PINTREP)
** The share of government representatives (PGOVREP) is used as the reference.

RCOUNTRY = 1 if 80% or more of the participants are from a single country; = 0 otherwise.
RREGION = 1 if 80% or more of the participants are from a single region; = 0 otherwise. (Computed only for those cases where RCOUNTRY = 0.)

Dummy variables
DSTAFF = 1 if WB staff attended the activity as participants rather than as resource persons; = 0 otherwise.
DACTION = 1 if the activity used action learning; = 0 otherwise.
DPARTNER = 1 if the WBICRS indicates that the activity involved a partner to a large extent; = 0 otherwise.

Mode of delivery
DDL = 1 if distance learning; = 0 otherwise.
DEL = 1 if electronic learning; = 0 otherwise.
DBL = 1 if blended mode of delivery; = 0 otherwise.
** DMF2F (face-to-face delivery) is used as the reference.

Division (cost centers)
CCWBIEN = 1 if the activity was sponsored by the EN division; = 0 otherwise.
CCWBIFP** = 1 if the activity was sponsored by the FP division; = 0 otherwise.
CCWBIHD = 1 if the activity was sponsored by the HD division; = 0 otherwise.
CCWBIOTHER = 1 if the activity was sponsored by another division; = 0 otherwise.
CCWBIPR = 1 if the activity was sponsored by the PR division; = 0 otherwise.
** WBIFP is the reference WBI division. Note: WBIOTHER includes WBIST, WBISM, WBIKL, WBIKD, WBIES, WBIEG, WBIMO, and WBIRC.
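The share variables defined above are simple ratios over the WBICRS registration counts. A short sketch under an assumed record layout (the dict keys follow the field names in the formulas above, except FEMALE_TOTAL, a hypothetical stand-in for the "external + internal female" count, which has no single field name in the list):

```python
def activity_shares(rec):
    """Compute PFEMALE, PGOVREP, PINTREP, and POTHREP for one activity
    record, following the formulas in Appendix D."""
    tparticipant = (rec["INTERNAL_TOTAL"] + rec["EXTERNAL_TOTAL"]
                    + rec["DONORS_OBSERVER_TOTAL"] + rec["RESOURCE_TOTAL"])
    rep_fields = (
        "REP_MIN", "REP_PARLIAMENTARIANS", "REP_GOV_OF", "REP_ACADEMICS",
        "REP_JOURNALISTS", "REP_NGO", "PRIVATE_SECTOR_REPRESENTATIVES",
        "INTERNATIONAL_REPRESENTATIVES", "IMF_REPRESENTATIVES",
        "AID_DONORS_REPRESENTATIVES", "OTHER__REPRESENTATIVES")
    trep = sum(rec[k] for k in rep_fields)
    pgovrep = (rec["REP_MIN"] + rec["REP_PARLIAMENTARIANS"]
               + rec["REP_GOV_OF"]) / trep
    pintrep = (rec["INTERNATIONAL_REPRESENTATIVES"]
               + rec["IMF_REPRESENTATIVES"]) / trep
    return {
        "PFEMALE": rec["FEMALE_TOTAL"] / tparticipant,  # assumed field name
        "PGOVREP": pgovrep,  # used as the reference category in the regressions
        "PINTREP": pintrep,
        "POTHREP": 1.0 - (pgovrep + pintrep),
    }

# Hypothetical activity: 20 participants, 8 of them female
rec = {"INTERNAL_TOTAL": 5, "EXTERNAL_TOTAL": 15, "DONORS_OBSERVER_TOTAL": 0,
       "RESOURCE_TOTAL": 0, "FEMALE_TOTAL": 8,
       "REP_MIN": 1, "REP_PARLIAMENTARIANS": 1, "REP_GOV_OF": 8,
       "REP_ACADEMICS": 2, "REP_JOURNALISTS": 0, "REP_NGO": 2,
       "PRIVATE_SECTOR_REPRESENTATIVES": 2, "INTERNATIONAL_REPRESENTATIVES": 3,
       "IMF_REPRESENTATIVES": 1, "AID_DONORS_REPRESENTATIVES": 0,
       "OTHER__REPRESENTATIVES": 0}
print(activity_shares(rec))  # PFEMALE 0.4, PGOVREP 0.5, PINTREP 0.2, POTHREP 0.3
```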
Income dummies, based on the income status of the represented countries (taken directly from the Level 1 evaluation data)
DLIC = 1 if low-income country; = 0 otherwise.
DMIC** = 1 if upper- or lower-middle-income country (UMC + LMC); = 0 otherwise.
DHIC = 1 if high-income country; = 0 otherwise.
** DMIC is used as the reference.

Learning product line
DLPLK** = 1 if knowledge exchange; = 0 otherwise.
DLPLP = 1 if policy service; = 0 otherwise.
DLPLT = 1 if skills building; = 0 otherwise.
** DLPLK is used as the reference.

Half-year dummies
HALF41** = 1 if FY = 2004 and quarter = 1 or 2; = 0 otherwise.
HALF42 = 1 if FY = 2004 and quarter = 3 or 4; = 0 otherwise.
HALF51 = 1 if FY = 2005 and quarter = 1 or 2; = 0 otherwise.
HALF52 = 1 if FY = 2005 and quarter = 3 or 4; = 0 otherwise.
** HALF41, the first half of fiscal year 2004, is used as the reference.
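With the variable list assembled, the OLS model described in Appendix B can be estimated with any standard least-squares routine. A minimal numpy sketch on synthetic data (the variable subset and the "true" coefficient values are illustrative only, loosely echoing the magnitudes in Table B.1; the actual study uses 736 activity-level observations):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-ins for a few regressors: intercept, TPARTICIPANT,
# PFEMALE, PINTREP, and the electronic-learning dummy DEL.
X = np.column_stack([
    np.ones(n),
    rng.integers(10, 300, n).astype(float),  # TPARTICIPANT
    rng.uniform(0, 1, n),                    # PFEMALE
    rng.uniform(0, 1, n),                    # PINTREP
    rng.integers(0, 2, n).astype(float),     # DEL
])
beta_true = np.array([0.80, -0.0001, 0.03, -0.06, 0.03])
y = X @ beta_true + rng.normal(0.0, 0.02, n)  # [0,1]-scaled mean rating

# OLS: beta_hat minimizes ||y - X b||^2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(np.round(beta_hat, 4), round(float(r2), 2))
```

With this much signal the estimates recover the generating coefficients closely; the modest R-squared values in Table B.1 (0.10-0.18) reflect far noisier real-world ratings.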