Guide to Evaluating Capacity Development Results
A collection of guidance notes to help development practitioners and evaluators assess capacity development efforts
World Bank Institute, Capacity Development and Results

About the World Bank Institute (WBI)
WBI's mission is to be a global facilitator of capacity development for poverty reduction, helping leaders, institutions, and coalitions address their capacity constraints to achieving development results. For more information, visit www.worldbank.org/wbi.

About WBI Capacity Development and Results (WBICR)
WBICR seeks to increase the effectiveness of capacity development by supporting innovation, learning, and knowledge exchange about approaches that are country-led and focused on results. For more information, visit www.worldbank.org/capacity or email capacity4change@worldbank.org.

Copyright © December 2012
The World Bank
1818 H Street, N.W.
Washington, D.C. 20433, USA
All rights reserved

The World Bank Institute's Capacity Development and Results unit prepared this report. Dawn Roberts and Cristina Marosan Ling led the report team. Samuel Otoo provided overall guidance. With special thanks to peer reviews by Patrick Grasso, former advisor to the World Bank Independent Evaluation Group, and Marlaine Lockheed, former manager of the World Bank Institute Evaluation Group.

Design: Sharon Fisher, World Bank Institute
Cover photo: Ray Witlin, World Bank Photo Collection

Contents

Acronyms ........................................................................................................................... 7
Introduction ....................................................................................................................... 9

Guidance Notes
Section I: Mapping the Capacity Development Results Chain ........................................ 12
Overview of the Process
Guidance Note 1: Building Blocks of a Capacity Development Results Story ................. 13
Guidance Note 2: Steps for Reviewing Program Results ................................................. 16
Guidance Note 3: Opportunities for Assessing Country Strategies ................................ 21
Guidance Note 4: Guide to Writing a Results Story ........................................................ 26
Institutional Capacity Change
Guidance Note 5: Understanding Institutional Capacity Change Objectives .................. 30
Guidance Note 6: Checklist for Identifying Targeted Capacity Change Objectives ......... 32
Guidance Note 7: Assigning Indicators and Data Sources for Assessing the Achievement of Capacity Change Objectives ..................................................... 34
Guidance Note 8: Institutional Capacity Indicators Database ......................................... 37
Intermediate Capacity Change
Guidance Note 9: Understanding Intermediate Capacity Outcomes .............................. 39
Guidance Note 10: Checklist for Identifying Targeted Intermediate Capacity Outcomes ... 41
Guidance Note 11: Assigning Indicators and Data Sources for Assessing the Achievement of Intermediate Capacity Outcomes ............................................ 43
Section II: Study Designs and Analytical Techniques to Assess Outcomes ..................... 45
Data Management
Guidance Note 12: Preparing, Storing and Managing Data for Analysis .......................... 46
Qualitative Methods
Guidance Note 13: Interviewing Key Program Stakeholders ........................................... 50
Guidance Note 14: Conducting Group Interviews with Stakeholders .............................. 56
Guidance Note 15: Analyzing Qualitative Data ................................................................ 59
Quantitative Methods
Guidance Note 16: Exploring Opportunities for Quantitative Analysis ........................... 63
Guidance Note 17: Survey Data ....................................................................................... 66

Appendixes
1—References ................................................................................................................... 75
2—Examples of Attributes for Intermediate Capacity Outcomes .................................... 77
3—Example of Survey Questionnaire for WBI Participants ............................................. 79

Acronyms

CAE     Country Assistance Evaluation
CAS     Country Assistance Strategy
CDRF    Capacity Development and Results Framework
CPS     Country Partnership Strategy
CPSCR   Country Partnership Strategy Completion Report
ICO     Intermediate Capacity Outcome
ICR     Implementation Completion Report
IEG     World Bank Independent Evaluation Group
KDI     Korea Development Institute
KSP     Knowledge Sharing Program
M&E     Monitoring and Evaluation
OECD    Organisation for Economic Co-operation and Development
PAD     Project Appraisal Document
PDO     Project Development Objective
PRSP    Poverty Reduction Strategy Paper
TTL     Task Team Leader
WBI     World Bank Institute

Introduction

Background
Despite donor commitments of more than $30 billion per year on capacity development activities, donors lack consensus regarding what these activities include and what results should be expected. Conventional monitoring and evaluation (M&E) systems regularly fail to capture the impact of such activities. As a result, development practitioners are deprived of the opportunity to learn which capacity development interventions are most effective in different situations.

In its evaluation, Capacity Building in Africa, the World Bank Independent Evaluation Group critiqued the World Bank for not having "developed indicators to define capacity building outcomes" and, by extension, not having "developed a body of knowledge on what tools should be applied and how in different country and sector circumstances" (World Bank 2005:44). Other studies and reviews have reached similar conclusions, highlighting such overarching challenges as the absence of a conceptual framework and the poor articulation of a results chain related to capacity development (OECD 2005, 2006; World Bank 2006, 2008; Taylor and Clark 2008).

The World Bank Institute (WBI) developed the Capacity Development and Results Framework (CDRF) in response to these challenges, to provide a systematic approach and a set of tools for development practitioners to design a rigorous yet flexible capacity development strategy, or program logic, to monitor and adaptively manage their interventions and to evaluate their results (World Bank 2011). The CDRF focuses on capacity development as a process of empowering local agents to change constraining or enabling characteristics of institutional conditions, that is, facilitating improvements in institutional capacity dynamics to advance reforms and development goals.

The focus on change and the definition of capacity development as the process whereby change is enabled allows practitioners to apply specialized knowledge to capacity development initiatives from across the spectrum of governance, political economy, social accountability and institutional development. This focus on change, and the intermediate outcomes that drive or facilitate change, also makes the challenge of monitoring and measuring results conceptually and operationally more tractable. The CDRF can be used to test program logic ex ante, and to measure and evaluate results ex post. This guide, which is based on the CDRF, aims to help practitioners to evaluate the ex post results of capacity development work.

Purpose
This set of guidance notes is designed to support practitioners and evaluators in conducting retrospective evaluations of a capacity development intervention or portfolio to assess and document results. Users will enhance their understanding of the capacity development process and of what does and does not work in promoting change, and will be better placed to inform future programs.

The standard M&E approach for assessing capacity development results has not been sufficient. These guidance notes are designed to complement and supplement good M&E practice to more effectively identify capacity development results. Typically, results-based M&E emphasizes the assessment of outcomes and impacts while also tracking inputs, activities and outputs to monitor implementation. A results chain or logic model is used to articulate the sequence from inputs to results.

For example, in World Bank lending operations, a project's results framework specifies the project development objective (PDO), higher-level outcomes that reflect the achievement of this objective, and intermediate outcomes that need to be in place to reach the desired results. M&E arrangements in the Bank's project documents usually specify key outputs and how to track them during project implementation.

The results chains for the capacity components of development projects often remain poorly defined for the following reasons:
• The standard levels of indicators (such as PDO-level and intermediate outcomes) do not necessarily trace the achievement of capacity development objectives. The achievement of a capacity change objective is not the same as the achievement of a PDO. In many cases, institutional capacity changes are required as an intermediate outcome before a PDO can be achieved. Although the targeted capacity change process is key to the success of the overall project in contributing to the related development goal, the milestones needed for achieving this institutional capacity will be largely overlooked. Depending on the specific case, institutional capacity changes might be captured at either the PDO or intermediate levels, but the relevant intermediate capacity outcomes (ICOs) are rarely associated and tracked, creating a missed opportunity for learning about what worked and what did not for the capacity development interventions.
• The role of change agents and the targeted change process(es) need to be identified. Capacity development entails preparing or empowering designated local change agents to initiate and/or manage needed changes. What changes need to be measured cannot be determined without analyzing existing capacity constraints and specifying a capacity change strategy. Only then will it be possible to assign indicators for the desired institutional capacity changes and the ICOs.
• Context matters. Standard sector indicators lack adaptability and assume that institutional arrangements have the same meaning in different contexts. A nuanced understanding of each capacity development change process is needed to identify which indicators are appropriate for assessing targeted institutional capacity changes and ICOs.

Overall, the use of the traditional results framework or logic model for assessing the achievement of a capacity change objective too often leads to the problem of the "black box" of capacity development, wherein the needed improvements in the ability or disposition of stakeholders remain undefined and unmonitored.

This guide focuses on retrospective evaluation for two principal reasons: (1) prospective evaluations such as randomized control trials are often impractical to implement in the context of capacity development interventions and (2) external evaluators are often called in after the fact to assess the results of an intervention. However, in many instances the topics and guidance apply to prospective evaluations and monitoring activities.

Orientation to this Guide
The 17 guidance notes explain and demonstrate how to assess capacity development efforts by reviewing and documenting the results of ongoing or completed capacity development activities, projects, programs or broader strategies. The key concepts in this approach apply to a wide range of development initiatives. The described methods have been tested on capacity development projects within the World Bank's lending portfolio and capacity building programs, on the Korea Development Institute (KDI) Knowledge Sharing Program, and on a knowledge exchange program sponsored by the World Bank's South-South Experience Exchange Facility.

What is a Change Agent?
A change agent is an individual or group that initiates or manages needed change(s) for developing institutional capacity in relation to a particular development goal. Change agents are often participants of a capacity development intervention, but the terms are not synonymous—program participants are not necessarily well positioned to achieve the needed changes and change agents do not always directly participate in program activities. Stakeholders include all who hold an interest in relation to a development goal. A subset of stakeholders are positioned to serve as change agents and/or to be participants of capacity development interventions.

Any project team member can use the methods and approaches described to review capacity development results, explore how specific interventions worked within a defined context and obtain insights into how the design and implementation of future interventions under similar conditions could be improved. For ease of understanding and consistency, this guide refers to "capacity development programs" (capacity development activities, projects, programs or strategies), "users" (users of the guide), "practitioners" (often donor staff involved with capacity development activities, projects, programs or strategies), "agents of change" (typically local stakeholders) and "evaluators" (whoever is using this guide to assess results).

Because this is a comprehensive guide to assessing capacity development, not all of the guidance provided may be relevant for all users. The guide is a flexible resource for supporting practitioners and evaluators in evaluating the results of capacity development efforts in relation to a particular development goal. Some users will choose to work consecutively through all of the notes to identify and document their results, whereas others might employ a more selective approach in consulting one or two notes to learn from test cases and existing examples.

The first set of 11 guidance notes provides instructions on how to map and document a capacity development results chain, giving an overview of the process (notes 1–4) and explaining how to assess the achievement of capacity development objectives (notes 5–8) and ICOs (notes 9–11). The six guidance notes in the second section explore and share analytical techniques (qualitative and quantitative) that can help to address information gaps that are likely to emerge in most program assessments.

These guidance notes aim to help practitioners and evaluators explore systematically the outcomes of a capacity development activity, project or strategy. They help to highlight lessons learned and identify which approaches were successful and unsuccessful within specific contexts. This information provides a practical orientation for designing more effective results frameworks and monitoring arrangements during the project or strategy design stage.

Section I: Mapping the Capacity Development Results Chain

Capacity development entails the purposeful use of knowledge and information to achieve capacity outcomes. These outcomes enable local agents of change to trigger or advance positive changes that contribute to the achievement of a particular development goal.

Understanding the "program theory" or "program logic" underlying a capacity development intervention is a critical early step for discovering or telling a capacity development results story. Practitioners can use this guide to assess whether or not they have achieved targeted capacity development results at the project or strategy level. The guidance notes can help a project team member or evaluator to identify or clarify the program logic of an intervention—the causal chain and key assumptions through which resources, activities and outputs were expected to produce capacity outcomes.

This section provides guidance on how to trace the capacity development change process to define the different levels of outcomes needed to advance toward a targeted development goal. The first set of guidance notes provides an overview of the steps needed to effectively identify, substantiate and communicate a capacity development results story for stakeholders. The subsequent sets provide in-depth guidance for identifying and documenting institutional capacity results and ICOs.

Overview of the Process
Guidance Note 1: Building Blocks of a Capacity Development Results Story ................. 13
Guidance Note 2: Steps for Reviewing Program Results ................................................. 16
Guidance Note 3: Opportunities for Assessing Country Strategies ................................ 21
Guidance Note 4: Guide to Writing a Results Story ........................................................ 26
Institutional Capacity Change
Guidance Note 5: Understanding Institutional Capacity Change Objectives .................. 30
Guidance Note 6: Checklist for Identifying Targeted Capacity Change Objectives ......... 32
Guidance Note 7: Assigning Indicators and Data Sources for Assessing the Achievement of Capacity Change Objectives ..................................................... 34
Guidance Note 8: Institutional Capacity Indicators Database ......................................... 37
Intermediate Capacity Change
Guidance Note 9: Understanding Intermediate Capacity Outcomes .............................. 39
Guidance Note 10: A Checklist for Identifying Targeted Intermediate Capacity Outcomes ... 41
Guidance Note 11: Assigning Indicators and Data Sources for Assessing the Achievement of Intermediate Capacity Outcomes ............................................ 43

Guidance Note 1
Building Blocks of a Capacity Development Results Story

A results chain serves as a roadmap for how desired progress toward a targeted development goal can be achieved by presenting a logically linked sequence—from inputs and activities to intermediate outcomes and longer-term results. Practitioners use results chains to think more analytically about cause and effect (Diagram 1). Specifically, results chains help to identify relationships among program components, clarify program objectives, establish key indicators for M&E, explore key assumptions and visualize a program to identify external factors that might influence outcomes.

Basic results chains can be useful for identifying program outputs, outcomes and impact, but they fall short when it comes to defining or describing the change process(es) targeted by capacity development interventions. A more comprehensive approach is needed to describe the transformative change that occurs when potential change agents gain an improved ability or disposition to affect institutional changes that contribute to a targeted development goal.

This guide is based on a conceptual framework for results-focused capacity development, which is a country-led approach wherein local agents design and implement their own change process. This emphasis on change and the definition of capacity development as the process whereby change is enabled makes the challenge of monitoring and measuring results conceptually more tractable. This approach adds value to M&E practice by providing:
• A structured framework to guide and define a theory of change for capacity development
• A change process logic to facilitate the assignment of measurable results indicators
• Sets of intermediate and final outcome indicators that can be flexibly applied across sectors and countries

Diagram 2 articulates the results chain for capacity development programs—the progression from needs assessments and interventions to outcomes and impact toward development goals.

The individual components of this capacity development change process serve as the building blocks for a results story. Practitioners and evaluators can use these components to explain how a set of capacity development interventions changed the ability or disposition of individuals or groups so that these change agents can affect the institutional changes needed for achieving a development goal. When project teams document only PDO-level and intermediate outcomes in the project results framework, they miss the opportunity for assessing capacity change processes and learning about what worked and what did not. This approach helps to diagnose individual capacity change objectives, identify the targeted change processes and assign indicators to measure ICOs and institutional capacity changes for telling a meaningful and comprehensive capacity development results story (Table 1). The benefits of applying this framework to complement and supplement traditional M&E practice can be understood more clearly by considering a specific example from World Bank lending operations in Diagram 3.

Diagram 1. Basic Results Chain for M&E
Inputs (resources) → Activities (what the program or project does) → Outputs (products or services produced or provided) → Outcomes (results or effects of outputs) → Impacts (long-term effects)

Diagram 2. Capacity Development Process
The diagram shows resources (financial, human, technology, infrastructure) supporting capacity development interventions, which work through agents of change to produce intermediate capacity outcomes: raised awareness, enhanced skills, improved consensus and teamwork, strengthened coalitions, enhanced networks and new implementation know-how. These outcomes drive changes in institutional capacity areas (increase stakeholder ownership, improve efficiency of policy instruments, strengthen effectiveness of organizational arrangements), which contribute to the development goal.

Diagram 3. Tracing a Capacity Development Project Results Chain
Project example: The standard results framework in a World Bank project appraisal document customizes a basic results chain to trace two levels of outcomes for resource investments or capacity development interventions.
Intermediate Outcome: Improved supply of antiretroviral therapy drugs available at treatment centers → PDO: Increased access to HIV/AIDS treatment services → Development Goal (Impact): Mitigated social and economic impact of HIV/AIDS epidemic
Value added by detailing the results chain: The same basic results logic would be applied, but each major capacity development change process would be identified in terms of the ICOs (evidence of the altered disposition, motivation, knowledge or skills of change agents) and the targeted institutional capacity changes. This more precise and detailed articulation of the program logic allows stakeholders to understand whether capacity development objectives are being achieved as planned, and, if not, what adjustments are warranted to achieve targeted changes in the future.
ICO: Strategy implemented at health facilities to improve inventory management → Institutional Capacity Change: Improved operational efficiency of health facilities (as evidenced by the reliable supply of ARV drugs) → PDO: Increased access to HIV/AIDS treatment services → Development Goal (Impact): Mitigated social and economic impact of HIV/AIDS epidemic

Table 1. Components of a Capacity Development Results Story

Development Goal: A beneficiary-centered statement of the desired high-level outcome(s) that articulates what benefits are targeted and for whom.

Institutional Capacity Areas (these serve as the change objectives): The most common challenges to the achievement of the development goal fall into one of three areas:
• Strength of stakeholder ownership: Low or divergent priority is attached to the development goal by key stakeholders
• Efficiency of policy instruments: There are deficiencies in the policy instruments guiding pursuit of the development goal by different stakeholders
• Effectiveness of organizational arrangements: Organizations charged with the achievement of the development goal have weak performance
An effective results story explains how interventions helped to enhance one or more characteristics within these institutional capacity areas to remove or minimize the identified challenge(s).

Change Agents: The critical individuals or groups who could play effective roles in managing or initiating the needed changes.

Intermediate Capacity Outcomes (ICOs): An improvement in the ability or disposition of the local change agents to take actions that will effect institutional changes toward the development goal. There are six standard types of ICOs:
• Raised awareness
• Enhanced knowledge or skills
• Improved consensus and teamwork
• Strengthened coalitions
• Enhanced networks
• New implementation know-how

Capacity Development Interventions: The knowledge services provided to address priority reforms and achieve the targeted changes in the institutional constraints. Interventions typically include a combination of learning programs, technical assistance, knowledge exchange experiences or other services and resources.

Guidance Note 2
Steps for Reviewing Program Results

Using this guidance note to review programs and assess results helps practitioners and evaluators to understand what works in capacity development interventions and compile lessons learned for informing future project design. A retrospective assessment of capacity development results requires both reviewing program documents and interviewing knowledge partners and other key stakeholders.

Ideally, the process for a retrospective assessment of capacity development results will be iterative, with opportunities built in to refine each capacity development change story and to fill information gaps. The basic steps for this approach follow. This method can be adapted to assess the results of any type of capacity development intervention. In each case, the right mix of data collection and analysis steps will need to be determined based on which key informants and data sources are accessible to the reviewer and the level of resources available to carry out any primary data collection activities.

Basic Approach to Retrospectively Assess Capacity Development Results
• Assemble documents and materials from the entire program cycle
• Review the program background, objectives and activities to identify the targeted development goal and institutional capacity change objectives
• Collect data through interviews of change agents and key informants
• Analyze data to trace each capacity development results story by identifying the pre-existing institutional capacity constraint (the basis for the capacity development objective), the related ICOs and the corresponding institutional capacity change(s)
• Follow up on data collection as needed to refine the results stories
• Understand the intervention's results by identifying evidence of intermediate and institutional level outcomes

Adapted versions of this approach have been developed and tested both for Bank operations projects with a capacity development emphasis (Approach A) and knowledge exchange or knowledge sharing programs (Approach B).

Adapted Approach A: Steps to Identify the Outcomes of Bank Lending Operations
1. Assemble the available project documentation, particularly the project appraisal document (PAD), midterm review, implementation status reports (ISRs), implementation completion report (ICR) and aide memoires from supervision visits. Where available, standard documents related to completed projects should also be consulted, including IEG project performance assessment reports, IEG ICR reviews, country partnership strategy completion reports (CPSCR), IEG CPSCR reviews, and IEG country assistance evaluations (CAEs).

2. Review narrative sections of the PAD to understand the project context, looking at the sector and country background sections and the description of project components to understand what institutional capacity characteristics the project was designed to address. Additional documents relevant to the project can be consulted to further develop an understanding of the project context as needed (Table 2).

Table 2. Reviewing Project Background Information to Identify Capacity Challenges

Multi-Sectoral AIDS Project (Malawi)
Narrative description of capacity challenges (excerpted from PAD): "General awareness of the disease is fairly high, but so are misconceptions about how to avoid the disease. As a result, high risk behavior among sexually active youth and adults continues… The immediate impacts are staggering: 70% of all admissions to hospital medical wards are AIDS related, and HIV/AIDS is now the leading cause of death in the most productive age group (20-49 years)."
Capacity area: Strength of stakeholder ownership—Compatibility of social norms and values with the development goal
Targeted characteristic: Widespread changes are needed in the attitudes and behavior of local stakeholders to support the achievement of the development goal (reduce the transmission of HIV/AIDS).

Community and Basic Health Project (Tajikistan)
Narrative description of capacity challenges (excerpted from PAD): "There is also lack of transparency and accountability in the flow of funds for primary care, since PHC [primary health care] funds normally flow through hospitals, polyclinics or jamoats (local village councils) and there is plenty of scope for diversion, especially for any non-salary allocations… Informal payments are rampant."
Capacity area: Efficiency of policy instruments—Resistance to corruption
Targeted characteristic: The government's health financing and budgeting process needs refinement to reduce opportunities for rent seeking behavior by public officials.

Community-Based Rural Development Project (Burkina Faso)
Narrative description of capacity challenges (excerpted from PAD): "Past efforts to tackle the policy, institutional and technological constraints facing the rural poor have been centrally driven and sectorally focused. As a result, their effectiveness has been low. Decentralized decision-making and economic empowerment of beneficiary communities is expected to improve the choice, relevance, cost effectiveness, and maintenance of rural infrastructure… In addition, to meet the demands of the local population, the deconcentration of sectoral ministries… will improve the delivery mechanisms of public goods and services and make them more demand-responsive."
Capacity area: Effectiveness of organizational arrangements—Operational efficiency
Targeted characteristic: Village land management committees (the new structure for village-level governance) need to plan, execute and manage local projects to optimize the delivery of public services and infrastructure relative to cost.

3. Review the project results framework in all available documents (PAD, ISRs and ICR) to identify indicators and measures related to changes in the targeted institutional capacity development objectives. Note data sources, targets, current values and arrangements for monitoring (Table 3).

4. Examine what capacity development interventions were designed and implemented to contribute to targeted changes in the institutional capacity characteristics, noting the specific activities and the targeted participants.

Table 3. Identifying Outcomes and Indicators for Institutional Capacity Change Objectives

Multi-Sectoral AIDS Project (Malawi)
Targeted capacity development outcome: Increased compatibility of the development goal (reducing the transmission of HIV/AIDS) with social norms and values
Measure (evidence of results): Median age at first sex among 15–24 year olds
Data source: Demographic Health Surveys and Multiple Indicator Cluster Surveys
Measured values: 2005—Males: 17 years; Females: 16 years. 2007—Males: 16 years; Females: 16 years. No targets set.

Community and Basic Health Project (Tajikistan)
Targeted capacity development outcome: Increased resistance of the healthcare budgeting process to corruption
Measure (evidence of results): Percentage of household expenditure allocated to health care in project-supported areas (measure was established to confirm that funding diversions and demands for informal payments have been minimized/eliminated)
Data source: Household surveys by Ministry of Health Department for Reform
Measured values: Baseline (2003) for Tajikistan: 5%. 2007—Sughd: 4.8%; Khatlon: 3.1%. No targets set: household survey data collection was scheduled for after project completion.

Community-Based Rural Development Project (Burkina Faso)
Targeted capacity development outcome: Increased operational efficiency of the Commission Villageoise de Gestion des Terroirs (CVGT—Village Land Management Committee)
Measure (evidence of results): Percentage of micro projects that are technically sound and cost-effectively implemented
Data source: Project reporting; cost-efficiency analysis
Measured values: Baseline (2001): 0. At completion (2007): 90%. Target (2007): 75%.

5. Identify measures of ICOs by reviewing the project results framework—the outcomes expected to occur as a direct result of the capacity development interventions. Note the data sources, targets, current values and arrangements for monitoring (Table 4).

6. Assemble each individual capacity development change story to understand whether and how interventions contributed to the expected results and to explore any instances when project implementation was or should have been adjusted to better achieve those results (see Guidance Note 4 for examples).

Table 4. Identifying Intermediate Capacity Outcomes

Multi-Sectoral AIDS Project (Malawi)
Intermediate capacity outcome: Enhanced knowledge for preventing HIV transmission
Measure (evidence of results): Percentage of young people aged 15–24 who correctly identify ways of preventing the sexual transmission of HIV and reject major misconceptions about HIV transmission (by gender and residence)
Data source: Demographic Health Surveys and Multiple Indicator Cluster Surveys
Measured values: 2005—Males: 37%; Females: 25%. 2007—Males: 41.9%; Females: 42.1%. Targets (2010)—Males: 75%; Females: 75%.

Community and Basic Health Project (Tajikistan)
Intermediate capacity outcome: Implementation of new healthcare financing strategy
Measure (evidence of results): Percentage of total primary health care expenditure paid by capitation in project-supported oblasts (this measure reflects the extent to which the new standard transparent financing formula is being implemented versus reliance on the previous pattern of discretionary spending)
Data source: Annual reports of rayon/oblast health departments
Measured values: Baseline (2006)—0% in two target oblasts. 2009—Sughd average: 2.45%, range: 0.24% to 12.81% per rayon; Khatlon average: 6.0%, range: 1.7% to 18.3% per rayon. 2010—Sughd average: 2.05%, range: 0.36% to 12.31%; Khatlon average: 6.18%, range: 2.72% to 14.09%. Target (2012): 100% in Spitamen rayon and 20% in each of the other 43 rayons in the two target oblasts.

Community-Based Rural Development Project (Burkina Faso)
Intermediate capacity outcome: Strengthened coalition for governing local development
Measure (evidence of results): Percentage of villages with representative and participatory bodies (CVGT or village land management committees) assuming their role in local development
Data source: CVGT annual reports
Measured values: Baseline (2001): 0%. At completion (2007): 149% [2,986 CVGTs established, compared to 2,000 targeted at appraisal]. Target (2007): 60%.

Intermediate capacity outcome: Formulation of a local development plan
Measure (evidence of results): Percentage of villages covered by the project that have adopted a local development plan
Data source: CVGT annual reports
Measured values: Baseline (2001): 0%. At completion (2007): 148% [2,961 villages adopted plans, compared to 2,000 targeted at appraisal]. Target (2007): 75%.

Intermediate capacity outcome: Implementation of the local development plan
Measure (evidence of results): Percentage of CVGTs that have substantially completed sub-projects identified in their local development plan
Data source: CVGT annual reports
Measured values: Baseline (2001): 0%. At completion (2007): 98%. Target (2007): 60%.

7. Fill information gaps as needed by contacting the task team leader (TTL) with questions or by reviewing data or publications provided by other donors who collaborated with the Bank on this intervention. In some cases, reviewers might work with the TTL to identify opportunities to collect additional data on project outcomes from beneficiaries or other key stakeholders.

Adapted Approach B: Steps to Identify the Outcomes of Knowledge Sharing Programs

1. Assemble program documents and materials. Collect any available reports from the entire program cycle. This includes not only final outputs and evaluation reports at the end of the cycle, but also needs assessment documents and demand surveys conducted at the beginning of the program and interim monitoring reports. In addition, obtain materials developed by participants during the knowledge sharing program (KSP), such as presentations, action plans or other artifacts.

2. Review the program background, objectives, and activities. Conduct a desk review of program documents and related materials to understand the country context and development goal(s) toward which the program is oriented. Depending on the quality of information available, the reviewer should be able to construct preliminary hypotheses about the key components of the capacity development change story(ies):
• Development goal. Who is (or will not be) better off and how, as a result of the knowledge exchange program and related activities over the longer term?
• Targeted capacity constraints. Which institutional capacity areas that were impeding the achievement of the development goal were targeted for enhancement through program activities? What kind of evidence might be available to measure the needed changes for specific institutional capacity characteristics?
• ICOs. What raised awareness, enhanced knowledge or skills, improved consensus and teamwork, strengthened coalitions, enhanced networks, or new implementation know-how was needed to achieve the desired changes in the targeted institutional capacity characteristics? What evidence might be available to identify these outcomes?
• Change agents. Which individuals or groups initiated or managed the needed changes?

3. Interview the program officer(s) or other lead stakeholder(s). Fill information gaps and continue to develop an understanding of the capacity development change process(es) by interviewing one or more knowledgeable individuals about the program. Explore the validity of current assumptions and identify data sources or data collection opportunities for gaining evidence of intermediate and/or institutional outcomes.

4. Analyze data to confirm or refine hypotheses. Trace each capacity development change story to understand whether and how interventions contributed to the expected results and to identify gaps in understanding where additional information is still needed.

5. Conduct additional interviews of key informants. As possible, collect qualitative data from other program designers, knowledge providers, participants and other stakeholders well positioned to provide useful contextual information or evidence of outcomes. This step could include field visits and in-person interviews or could be limited to email exchanges and telephone interviews.

6. Follow up as needed to collect evidence from key informants. Continue to develop and refine the results stories through an iterative process, constructing a more detailed description of the relevant change process(es) and requesting additional clarification and evidence when possible.

Simple, consistent qualitative data collection techniques work effectively for revealing and compiling the evidence of results. Identifying and analyzing intermediate and institutional outcomes through selected steps will provide a detailed understanding of how specific interventions contributed to capacity development results. Guidance for implementing the steps for a retrospective analysis is provided in Section II: Analytical Techniques to Assess Outcomes.
Guidance Note 3
Opportunities for Assessing Country Strategies

Capacity development interventions are often implemented as part of a broader development strategy to further the achievement of a specific development goal. The CDRF can help practitioners to identify and review the results of such a strategy, to explore how various interventions and measurement practices have worked, and to compile lessons learned about capacity development to inform future strategy design.

International development agencies and other organizations design standard strategies to serve as roadmaps for guiding projects and operations. At the World Bank, for example, two main strategies exist at the country level:
• Poverty reduction strategy. This document describes a country's long-term vision and is prepared by low-income country governments in consultation with various stakeholders including civil society and the private sector. The strategy establishes macroeconomic, structural and social policy goals with clear country priorities and targets.
• Country partnership strategy (also referred to as a country assistance strategy). This document lays out a selective program of World Bank Group support for a particular country. Bank staff develop it, taking as a starting point the country's own long-term vision for development. The strategy is designed to promote collaboration and coordination among development partners in a country and takes into account the Bank's comparative advantages in the context of other donor activities.

Other types of standard development strategies also exist, focused on a specific sector, region or lending environment (such as fragile states). In each of these cases, a key step in drafting a new strategy is to review current challenges, needed institutional capacities and the results of any previous interventions or the existing Bank (or other lender) project portfolio. This guide can therefore serve as a tool in this process for retrospectively assessing capacity development results and informing the new strategy development for a country.

Using this guide to review strategies and assess results helps practitioners and evaluators to understand capacity development change processes and document ICOs and the achievement of capacity change objectives. The process is similar to that applied at the project level (see Guidance Note 2). The right mix of data collection and analysis steps will be determined based on the information available, but the process is usually iterative in any case, with opportunities to test and refine hypotheses and fill gaps in understanding. A generic approach to reviewing the capacity development results of any strategy follows.

Basic Approach to Retrospectively Assessing Capacity Development Results at the Strategy Level
• Assemble documents and materials from the entire portfolio cycle as relevant
• Review any background information, objectives, and interventions of the strategy to identify targeted institutional capacity challenges (that is, key challenges the strategy was designed to overcome to further the achievement of one or more development goals)
• Review existing results frameworks and documentation of progress both at the project and portfolio levels (such as portfolio mid-term reviews, project ICRs, etc.)
• Organize and analyze data by tracing each results story by identifying the pre-existing institutional capacity constraint (the basis for the capacity development objective), the related ICOs and the corresponding institutional capacity change(s)
• Collect data through interviews of stakeholders and key informants as needed to fill gaps in understanding
• Refine results stories and follow up on data collection as needed
• Understand the strategy's results by identifying evidence of intermediate and institutional level outcomes to demonstrate how planned interventions were implemented to achieve progress towards targeted development goal(s).

This guidance note can be integrated into the development strategy cycle to inform standard assessment steps such as those that are implemented for a country partnership strategy (CPS) completion report (conducted by a World Bank country team), or for a CAE (conducted by the World Bank Independent Evaluation Group). This approach complements and supplements the typical M&E process to build understanding about what progress has been achieved in developing needed institutional capacities and what change processes have facilitated this progress. The lessons derived from this analysis can lend insights to the strategy development process.

One example of this application could be for the standard review of the CPS. The traditional CPS Completion Report (CPSCR) reports on progress towards achieving strategic objectives without necessarily building a comprehensive understanding of how or why this progress occurred. The standard template leads users through a self-evaluation to identify CPS outcomes for each area of engagement or country pillar, the lending and non-lending activities perceived to contribute to these outcomes and any lessons or suggestions for the new country strategy.

The steps outlined in this guidance note augment this process by guiding users in mapping capacity changes in terms of movement along a defined results chain that is understood within a local context. This should help practitioners to:
• Understand challenges in terms of specific types of institutional capacity change objectives
• Assign indicators to track changes in these characteristics
• Examine interventions and targeted change agents to define needed ICOs
• Assign indicators for ICOs to assess whether they were achieved
• Assess the local context to identify factors affecting the success or failure of interventions
• Derive lessons learned from the intervention, change agents and targeted outcomes to inform future strategies

This approach can be adapted to enhance the standard review process for any development strategy. The steps outlined below are tailored to guide practitioners in the CPS review process for retrospectively assessing capacity development results.

Suggested Steps for Assessing Capacity Development Results for a CPS Completion Report

1. Assemble the available country portfolio documentation. Key documents are likely to include any or all of the following if available:
- Documents that explain the country context and development goals and priorities, such as the previous CPS and the Poverty Reduction Strategy Paper (PRSP)
- Country Portfolio Review
- Midterm CPS review (CPS Progress Report)
- Any recent CAE conducted by IEG for the relevant country (these do not map capacity development change processes in detail but are helpful for understanding the country context and outcomes across projects)
- Any internal staff reviews of the portfolio or sectors (such as a review of investment lending performance, corruption vulnerabilities, etc.)
- Project documents, particularly PADs and ICRs
- Any reports related to Economic and Sector Work in the country

2. Review background documents (such as the CPS and PRSP) to understand the portfolio context. For an effective and accurate retrospective assessment of capacity development results, the practitioner should review and crosscheck documents as needed to identify:
- The longer-term strategic goals (development goals) to which the country assistance strategy has been designed to contribute
- The institutional capacity challenges that impede the achievement of the development goals and that the country portfolio was designed to address
This step provides the foundation for tracing individual capacity development change processes achieved during the CAS period by articulating the various stages of the change process, including the targeted development goals, and translating these into country and institutional capacity change objectives (see Guidance Note 6).

3. Review the results matrix of the current CPS and all project results frameworks in the country portfolio. Available documents (PADs, ICRs, etc.) should be consulted to identify indicators and measures related to changes in the targeted institutional capacity objectives (characteristics). Note data sources, targets, current values, and arrangements for monitoring to assess any progress and issues in addressing these capacity change objectives during the CAS period.

4. Identify the main capacity development interventions in the portfolio that were designed to influence targeted institutional changes, noting the specific activities and the targeted participants.

5. Determine the measures of ICOs by reviewing the CPS and project results frameworks—these are the outcomes expected to occur as a direct result of the capacity development interventions. ICOs reflect an improvement in the ability or disposition of stakeholders to take needed actions (see Guidance Note 9). Identify the data sources, targets, current values and arrangements for monitoring. Note: ICOs are commonly missing in portfolio and project documents, so supplemental data collection might be required (see Step 7).

6. Assemble the main individual capacity development change stories reflected in the country portfolio. Identifying how projects helped local change agents to initiate or manage needed changes will help build an understanding about whether and how interventions contributed to the expected results. Tracing each capacity development results chain highlights what has worked for achieving the capacity change objectives and provides important information about instances when project implementation was or should have been adjusted.

7. Fill information gaps as needed by contacting project leaders or country teams with questions or by reviewing data or publications provided by other donors who collaborated with the Bank on components of the portfolio. In some cases, reviewers might work with project teams to identify opportunities to collect additional data on project outcomes from beneficiaries or other key stakeholders.

A typical review of a country development strategy is intended to derive practical lessons from the past to inform the development of the new country strategy and, in the case of the CPSCR, the refinement of the Bank's ongoing country portfolio. This guidance note can be applied across all sectors and types of capacity challenges to help ensure that practitioners understand the capacity development results achieved and the processes through which these outcomes occurred. This approach to completing the standard CPSCR exercise will allow country teams to better understand how well capacity development interventions have worked in specific contexts. Table 5 explores how the CPSCR form (self-evaluation) could be populated to share the lessons gained from tracing the results chains for capacity change objectives.

Table 5. Three Examples of Applying this Approach to the CPS Completion Report

The table pairs information from the existing CPSCR self-evaluation with the potential value added by this approach.
From the existing CPSCR self-evaluation: CPS Outcome (the outcomes presented in the strategy's results matrix); Status (the status rating, such as Dropped or Achieved, with supporting data/information); Lessons for the New CPS (suggestions for the team preparing the new CPS).
Potential value added by this approach: Capacity Change Objective (using standard terms to define challenges and desired results); Lessons Gained from Tracing Results Chains for the Capacity Change Objective (exploring change agents' role(s) and targeted ICOs to help explain what worked and why).

Example 1
CPS outcome: Improved access to and quality of water supply
Status: Achieved. 100% of microbiological water quality samples in pilot areas meet target values and collection rates improved—average collection ratio is 92%
Lessons for the new CPS: "Pilot activities increased the collection rates of utilities for sustainable access to quality water and this now needs to extend beyond the pilot towns."
Capacity change objective (effective organizational arrangements): Water and sewer utilities need to achieve targeted outcomes while having financial viability
Lessons gained from tracing results chains: What worked? What steps and conditions are needed for these results to be replicated? Identifying which measurable improvements in the abilities or dispositions of specific stakeholders (the ICOs) contributed to the achievement of target values in the pilot areas would provide critical information for effectively scaling up these results.

Example 2
CPS outcome: Financially and socially viable pension system
Status: Partially achieved. Transparency increased, with international accounting standards introduced and regular financial audits conducted. New public awareness promoted the establishment of individual accounts for all pension contributors, but not all employers comply with requirements.
Lessons for the new CPS: "Ongoing improvements to the pension system are critical for sustained economic growth and poverty reduction."
Capacity change objective (strong stakeholder ownership): Improved transparency and public attitudes are needed to change widespread behavior and encourage pension contributions
Lessons gained from tracing results chains: Why were targeted outcomes not fully achieved? A retrospective assessment starts with a full review of existing institutional capacity challenges. In this case, the review would highlight inconsistent policies. Employers should pay contributions on behalf of employees, but a recent law on contribution amnesty reduced the incentives for compliance among employers. The current policy framework is therefore impeding efforts to increase the compatibility of social norms with the development goal.

Example 3
CPS outcome: Improved health system performance
Status: Achieved. The quality of primary healthcare services improved through strengthened compliance of providers with MOH standards: 100% of performance agreements between central authorities and health care providers include outcome indicators in line with MOH priorities
Lessons for the new CPS: None noted
Capacity change objective (efficient policy instruments): Increased incentives for compliance and more clearly defined roles and responsibilities for providers are needed to assure the delivery of higher quality primary health care services
Lessons gained from tracing results chains: What institutional capacity was needed to contribute effectively to the CPS outcome? Identifying different levels of outcomes is critical for assessing progress along a results chain. This provides a clear roadmap for understanding how the development of new policies and standards (the ICO level) led to increased compliance among stakeholders (the capacity change objective). This compliance in turn led to better coverage and quality of healthcare services (the targeted CPS outcome).

Note: These examples are taken from existing CPSCR self-evaluation tables. Each targeted CPS outcome is linked to a specific development goal to which the CPS has aimed to contribute.
This compliance in indicators in line with turn led to better coverage and MOH priorities quality of healthcare services (the targeted CPS outcome). Note: These examples are taken from existing CPSCR self-evaluation tables. Each targeted CPS outcome is linked to a specific development goal to which the CPS has aimed to contribute. 25 Guidance Note 4 Guide to Writing a Results Story How do you tell the story of how programs ity development results story in an easy-to- or projects have contributed to a particular understand format. Two tips help to ensure development goal? Being able to commu- that stakeholders can derive the needed nicate this story for stakeholders is critical information: for sharing lessons about what worked, • Include only enough summary detail supporting longer-term capacity changes, in the boxes to clarify the main change and being accountable to funders. Once logic. Specific evidence and additional practitioners or evaluators have identi- details for each step can be included in fied the main components of the capacity accompanying text. development change process(es) (Guid- • Provide a chain of boxes for each ance Note 1) and collected evidence of targeted institutional capacity constraint. outcomes (Guidance Notes 2 and 3), they Capacity development efforts often will be able to depict an intervention’s or address more than one challenge strategy’s capacity development change impeding the achievement of a logic. development goal and these separate A diagram based on the model in processes can be depicted in a single Diagram 4 provides an effective means for diagram to reflect a comprehensive conveying a potentially complicated capac- approach. Diagram 4. Template for Showing Capacity Development Change Processes Development Goal Targeted Institutional Capacity Area Description of Specific Institutional Capacity Change Objective Description of Change Agents and Change Process (how the improvement in the stakeholders’ ability or disposition leads to institutional changes that contribute to the development goal) Description of Intermediate Capacity Outcomes List of Capacity Development Interventions 26 Diagram 5. Example of Tracing One Capacity Development Change Process Improve the socioeconomic development of the Dominican Republic by contributing to its export development Strength of stakeholder ownership Commitment of leaders High-level government officials need to envision the trans-formation of the economy that could result from export development to promote the needed changes. Change process and change agents Government officials understand the potential economic growth fostered by export development and the changes necessary to achieve this growth. Intermediate capacity outcome Raised awareness: Key high-level government officials become aware of the potential benefits of export development that could be achieved through reducing electricity losses, fostering public-private collaboration, and creating international trade networks. Consecutive knowledge sharing program projects Export Development for the Dominican Republic Improving the Export Infrastructure and Electric Power System Establishment of the Dominican Export-Import Bank Diagrams 5 and 6 demonstrate how IV. Capacity Development Objectives this approach was applied in a joint study (presented in terms of existing capacity conducted by the Korea Development challenges being targeted) Institute and WBI. The first diagram traces V. Program Description an individual change process and the a. 
Design of the Capacity Develop- second figure shows this one change ment Intervention story within the broader web of change b. Knowledge Partners (explaining why processes being facilitated by the capac- specific consultants or content pro- ity development interventions. viders were selected to empower Depending on the target audience(s), the change agents to manage or a detailed report can be constructed to initiate the needed changes) document the capacity development out- c. Participants (presented in terms of comes and share lessons learned. their positioning as change agents) An example of a report outline for VI. Outcomes documenting capacity development a. Targeted Change Process I results: i. Intermediate Capacity Outcomes I. Overview of Program, Project, or ii. Institutional Capacity Outcomes Strategy b. Targeted Change Process II (repeat II. Development Goal sequence above as needed) III. Strategic Context for [Country or VII. Lessons and Implications Countries] a. Success Factors b. Lessons 27 Diagram 6. Example of Tracing Multiple Change Processes Improve the socioeconomic development of the Dominican Republic by contributing to its export development Strength of stakeholder Efficiency of policy Effectiveness of ownership instruments organizational arrangements Commitment of Clarity in defining Incentives for Consistency Operational leaders roles and compliance of policy efficiency High level govern- responsibilities The policy envi- instruments Dominican Cor- ment officials need Private sector ronment could A consistent poration of State to envision the enterprises could provide better set of policies Electrical Compa- transformation of engage more in incentives and is needed for nies (CDEEE) the economy that export activities fewer barriers strategic plan- could improve op- could result from if the appropriate to encourage ning to reach erational efficiency export develop- lending and insur- private sector country goals. by improving orga- ment to promote ance instruments engagement in nizational arrange- the needed were available. export activi- ments to reduce changes. ties. electricity loss. Intermediate Intermediate Intermediate Intermediate Intermediate capacity capacity capacity capacity capacity outcome outcome outcomes outcome outcomes Raised awareness: Applied Raised Applied Enhanced Key high-level knowledge awareness: knowledge knowledge: government and skills: After Private sector and skills: CDEEE officials become learning about the stakeholders High-level management aware of the Korean model for understand officials at understands potential benefits export financing, the benefits of the Ministry reasonable of export Dominican policies that of Planning, energy loss rates development leaders take the promote export Economy and and methods for that could be steps to create development. Development reducing losses. achieved through an export-import learn how to Strengthened Increased reducing electricity bank—including implement public and implementation losses, fostering a presidential a national private sector know-how: A public-private decree and the strategic coalitions: Gov- law criminalizing collaboration, formulation planning ernment officials energy theft is and creating of a law to process establish formal implemented. international trade clarify roles and that ensures arrangements networks. responsibilities. consistency for gaining private sector among government input. policies. 
Consecutive KSP projects Export Development for the Dominican Republic Improving the Export Infrastructure and electric power system Establishment of the Dominican Export-Import Bank 28 A well-documented capacity devel- opment results story can be adapted to various approaches and formats but should follow a logical progression such as the one described in the above outline. Additional guidance and examples for writing results stories can be found in two publications, posted at www.worldbank. org/capacity: Reviewing Project Results Retrospectively Using a Results-Focused Approach to Capacity Development and Using Knowledge Exchange for Capac- ity Development: What Works in Global Practice?. 29 Guidance Note 5 Understanding Institutional Capacity Change Objectives Capacity development interventions are • Efficiency of policy instruments designed and implemented to address There are deficiencies in the policy challenges at the local, country, or regional instruments guiding pursuit of the level that are impeding the achievement development goal by different stake- of a particular development goal. Iden- holders. tifying the success of these interventions • Effectiveness of organizational arrange- is possible only if the capacity develop- ments ment objective(s) have been articulated so Organizations charged with the achieve- that the targeted effects are specific and ment of the development goal have measurable. Identifying the existing insti- weak performance. tutional capacity challenges at the start of For each of the capacity areas repre- an intervention or strategy is therefore a sented by these challenges, there are char- critical first step for understanding what acteristics—individual change objectives— worked and what did not work for any that can be enhanced through capacity capacity development intervention. development interventions (Table 6). This Capacity development interventions set of 19 capacity change objectives pro- are either explicitly or implicitly designed vides a comprehensive and standardized to address one or more of three types of approach for the measurement of capac- institutional capacity challenges: ity development results. Descriptions and • Strength of stakeholder ownership definitions for these objectives are avail- Low or divergent priority is attached able at www.worldbank.org/capacity. to the development goal by key Whether capacity development is stakeholders. the main focus of a program (such as Table 6. Institutional Capacity Change Objectives Strength of Efficiency of Effectiveness of Stakeholder Ownership Policy Instruments Organizational Arrangements • Commitment of social and • Clarity in defining rights and • Clarity of mission political leaders responsibilities • Achievement of outcomes • Compatibility of social • Consistency • Operational efficiency norms and values • Legitimacy • Financial viability • Stakeholder participation • Incentives for compliance and probity in setting priorities • Ease of administration • Communications and • Stakeholder demand • Risk for negative stakeholder relations for accountability externalities • Adaptability • Transparency of • Suitable flexibility information to stakeholders • Resistance to corruption 30 Table 7. 
Using Descriptions of Capacity Challenges to Identify Change Objectives Sector Focus Narrative Description of Capacity Challenges Generic Capacity Targeted Capacity (excerpted from PAD) Change Objective Development Objective HIV/AIDS “General awareness of the disease is fairly Strength of Reduced high-risk behavior Treatment high, but so are misconceptions about how Stakeholder among sexually active and to avoid the disease. As a result, high risk Ownership— youth and adults Prevention behavior among sexually active youth and Compatibility of adults continues…The immediate impacts are social norms and staggering: 70% of all admissions to hospital values medical wards are AIDS related, and HIV/AIDS is now the leading cause of death in the most productive age group (20–49 years).” Community “There is also lack of transparency and Efficiency of Policy Increased resistance to and Basic accountability in the flow of funds for primary Instruments— corruption of the primary Health care, since PHC [primary health care] funds Resistance to health care budget process normally flow through hospitals, polyclinics corruption or jamoats (local village councils) and there is plenty of scope for diversion, especially for any non-salary allocations… Informal payments are rampant.” Municipal “The [City Council] faces serious constraints in Effectiveness of Improved financial Development both revenue generation and budget planning Organizational management of the city and control… It is estimated that only 20 Arrangements— council percent of brick and with piped water and 5 Financial viability percent of the total properties in the city are and probity being taxed. The municipality lacks an updated cadastre and other tools to increase property tax revenues as well as other local taxes and fees. On the expenditure side…weaknesses remain in planning, execution, and control of expenditures.” a knowledge exchange intervention, • Identify the general institutional capacity technical assistance project, etc.) or just area targeted for change by the project one component of a multifaceted Bank • Refine the understanding of the project (such as a project developing change(s) needed within the institutional transport infrastructure), practitioners capacity area to pinpoint one or two should clearly identify the capacity specific change objectives development change objectives to assess • Translate the generic change objectives whether or not they have been achieved. into specific customized outcomes In retrospective evaluations, practitio- targeted by the project ners or evaluators can review the original Practitioners should not attempt to iden- problem statement and/or the country and tify or assign indicators and data sources sector context to identify which institu- until this basic understanding of the tar- tional capacity challenges the project was geted capacity development outcome(s) designed to address. As demonstrated by has been established. the three case examples in Table 7, World Bank project appraisal documents usually include sufficient details in their narrative description to: 31 Guidance Note 6 Checklist for Identifying Targeted Capacity Change Objectives Defining capacity challenges allows practi- development interventions were tioners or evaluators to assign indicators for designed to address. In any given case, measuring changes in targeted institutional other considerations may exists that capacity areas. 
Using effective indicators for are not included in the checklist, so the standard types of capacity change objec- individual user should exercise judgment. tives will help build a systematic under- Also, an effective capacity development standing over time about what does and intervention will likely target changes in does not work in capacity development. only a small set of characteristics at a time, The questions in Tables 8–10 can help so practitioners should think in terms of practitioners or evaluators to identify the the high priority changes needed when main capacity challenges that capacity using these checklists. Table 8. Strength of Stakeholder Ownership Checklist Capacity Characteristic Check if the answer is “no” in relation to the target development goal Commitment of ❑❑ Was there a clear commitment from relevant leaders (such as, at political and social community, sub-national, national levels) to achieve the targeted leaders development goal? Compatibility with ❑❑ Was the development goal consistent with the current social norms social norms and values and values of local stakeholders? Stakeholder ❑❑ Was there an established mechanism for stakeholders to voice their participation in setting opinions related to the development goal? priorities ❑❑ Was the mechanism supported by the relevant leaders engaged in setting priorities related to the development goal? Transparency of ❑❑ Was information related to the development goal shared regularly information to with stakeholders? stakeholders ❑❑ Was detailed information related to the development goal accessible to stakeholders (such as, available easily on the Internet)? Stakeholder demand ❑❑ Have stakeholders’ demands for government accountability been for accountability affecting the quality of service delivery by the government? 32 Table 9. Efficiency of Policy Instruments Checklist Capacity Characteristic Check if the answer is “no” in relation to the target development goal Clarity in defining rights ❑❑ Was there any established regulatory mechanism that could be and responsibilities used to support efforts and formally guide changes related to the development goal? Consistency ❑❑ Were the policies or regulatory mechanisms which support the development goal consistent (not in conflict) with other policies or regulatory mechanisms needed to achieve development goals of other projects? Legitimacy ❑❑ Was the current process related to the development goal transparent? ❑❑ Was the current process in formulating policies related to the development goal participatory? Incentives for ❑❑ Was there enough compliance by stakeholders for the development compliance goal-related policies to function? Ease of administration ❑❑ Was the current administrative capacity sufficient to implement the policy instrument? Risk of negative ❑❑ Did the policy take into consideration unintended (negative) effects externalities that might occur during the pursuit of the development goal? Flexibility ❑❑ Could the policy instrument accommodate revisions as necessary to adapt to changes in the social and political environment? Resistance to ❑❑ Did the policy include any measures to minimize opportunities for corruption corruption? Table 10. Effectiveness of Organizational Arrangements Checklist Capacity Characteristic Check if the answer is “no” in relation to the target development goal Clarity of mission ❑❑ Did the organization have publications (internal or external) that described the mandate (vision and mission) of the organization? 
❑❑ Did the organization have an annual business plan with clearly defined responsible units and personnel for various tasks?
Achievement of outcomes
❑❑ Did the organization have an annual business plan with clear objectives for its work?
❑❑ Did the organization have a system (informal or formal) to periodically report the progress of its work against the objectives?
Operational efficiency
❑❑ Did the organization have an annual business plan with a defined set of activities accompanied by a budget, timeline, and responsible personnel assigned?
❑❑ Did the organization have a system (informal or formal) to receive confirmation from its stakeholders about the completed work?
Financial viability and probity
❑❑ Did the organization have the funds to sustain its operating costs?
❑❑ Did the organization issue annual income and expenditure reports?
Good communications and stakeholder relations
❑❑ Did the organization have stakeholders' cooperation and support to meet its goals?
Adaptability
❑❑ Was the organization proactive in obtaining up-to-date information on development goal-related areas?
❑❑ Did the organization research innovative ways to improve its processes?

Guidance Note 7
Assigning Indicators and Data Sources for Assessing Achievement of Capacity Change Objectives

Once practitioners or evaluators have defined the specific capacity development outcomes (change objectives) targeted by interventions, they should assign indicators to assess whether desired changes happened as planned. Establishing effective indicators requires thinking through how these changes can be observed and measured to confirm that the capacity development outcomes have been achieved.

Appropriate indicators for capacity development outcomes could be quantitative or qualitative, depending on the nature of the capacity change desired. In either case, they should have the following SMART characteristics:
• Specific. Indicators should reflect simple information that is communicable and easily understood by the provider and the user of the information.
• Measurable. Changes should be objectively verifiable.
• Achievable. Outcomes and indicators must be achievable and sensitive to change during the life of the project.
• Relevant. Indicators should reflect information that is important for assessing outcomes to be used for management or immediate analytical purposes.
• Time-bound. Progress can be tracked at a desired frequency for a set period of time and assessed accordingly.

The process of selecting indicators should always include the consideration of existing data sources and/or the feasibility of collecting the relevant data. Changes in stakeholder perspectives or behaviors are often tracked via surveys, whereas changes in the operational efficiency of an organization might be captured through the analysis of existing administrative records. Practitioners or evaluators may find that existing indicators are adequate for measuring the achievement of targeted change objectives. If existing indicators are not sufficient, it is possible to conduct additional data collection and/or analysis after project completion (see Section II: Analytical Techniques to Assess Outcomes).
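To make the screening step concrete, the short sketch below shows one way a team could record the SMART check for a candidate indicator before committing to data collection. It is only an illustration in Python, not a prescribed WBI tool; the example is loosely patterned on the water and sewerage case shown in Table 13 below, and the CandidateIndicator structure and its yes/no fields are assumptions that simply mirror the SMART characteristics listed above.

from dataclasses import dataclass

@dataclass
class CandidateIndicator:
    # A candidate indicator for one capacity change objective (illustrative only).
    objective: str       # targeted capacity development outcome
    indicator: str       # how the change would be observed or measured
    data_source: str     # existing or planned source of evidence
    specific: bool = False
    measurable: bool = False
    achievable: bool = False
    relevant: bool = False
    time_bound: bool = False

    def is_smart(self) -> bool:
        # Keep the indicator only if every SMART characteristic is confirmed.
        return all([self.specific, self.measurable, self.achievable,
                    self.relevant, self.time_bound])

# Example entry, patterned on the water and sewerage authority case in Table 13.
candidate = CandidateIndicator(
    objective="Increased financial viability of the water and sewerage authority",
    indicator="Operational cost ratio (percentage cost recovery)",
    data_source="Audited annual financial statements",
    specific=True, measurable=True, achievable=True,
    relevant=True, time_bound=True,
)

if candidate.is_smart():
    print("Keep:", candidate.indicator, "-", candidate.data_source)
else:
    print("Revisit this indicator before adding it to the results framework.")

A simple record like this also makes it easy to revisit the feasibility of each data source later, when planning additional data collection.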
Tables 11–13 provide examples of indicators used in existing Bank projects to assess the achievement of capacity change objectives. These cases identify the development goal to which the intervention was expected to contribute and describe the capacity development outcome that was intended as a result of the envisioned capacity development change process. Indicators are provided to show how changes in specific characteristics related to the outcome could be observed and measured to assess whether the targeted objective is being achieved. The examples in Tables 11–13 demonstrate how capacity development results can be assessed across sectors and across institutional capacity areas. Additional examples can be found in the Institutional Capacity Indicators Database (see Guidance Note 8).

Table 11. Examples of Indicators to Assess Changes in the Strength of Stakeholder Ownership

Development Goal: Promote environmentally sustainable urban transport
Capacity Development Objective: Increased use of Bus Rapid Transit System by automobile owners
Indicators: Percentage of surveyed residents who perceive that walking and cycling have become safer and more comfortable in the project area; proportion of Bus Rapid Transit System riders accessing the system through bicycles or on foot
Data Source: Surveys of bus riders

Development Goal: Establish a functioning local government system
Capacity Development Objective: Improved transparency of information regarding decentralization efforts
Indicator: Percentage of households that reported hearing about government efforts from an official source, in response to the question "How do you hear about what the government is doing?" (no source; relatives, friends, neighbors, co-workers; community bulletin board; village headman/headwoman; paramount or section chief/chiefdom officials; newspaper; radio; TV; other)
Data Source: Institutional Reform and Capacity Building Project National Public Services Survey

Development Goal: Improve public services in targeted urban areas
Capacity Development Objective: Increased participation of community stakeholders in decisions regarding local public services
Indicators: Participation rate of poorest and vulnerable community members in planning and decision-making meetings; participation rate of women in planning and decision-making meetings; percentage of kelurahans (urban wards) with established community boards of trustees
Data Source: Surveys and attendance records in Management Information System

Table 12. Examples of Indicators to Assess Changes in the Efficiency of Policy Instruments

Development Goal: Increase the quality of primary education
Capacity Development Objective: Increased compliance with credentialing requirements among teachers
Indicator: Percentage of public school teachers who meet professional standards for licensing
Data Source: Ministry of education teacher licensing data

Development Goal: Improve the health status of the population
Capacity Development Objective: Improved clarity regarding oversight responsibilities for different types of health professional education programs
Indicator: Clear designation for policies and responsibilities on professional accreditation, certification, and school licensure for each profession. Is a body established to provide oversight on standards for accreditation, content, and conduct [yes, no] for: medical education; dental education; nursing education; midwifery education
Data Source: Records of Central Project Coordination Unit

Development Goal: Provide citizens with better public services and infrastructure
Capacity Development Objective: Increased use of automated procedures for field audits
Indicator: Percentage of field audits selected by automated selection procedure
Data Source: Tax committee data

Table 13.
Examples of Indicators to Assess Changes in Effectiveness of Organizational Arrangements Development Capacity Development Indicator Data Source Goal Objective Improve Increased financial Operational cost ratio (percentage cost recovery) Audited Financial sustainable viability of the Water of Water and Sewerage Authority Statements access to safe and Sewerage of Water and water supply Authority Sewerage Authority Improve Increased level of Percentage of respondents, ages 14 and older, Living Standards employment employment of who indicated they found their job through the Measurement rates individuals using labor office Survey labor office services Improve land Reduced cost of land Per unit cost and time of regularization process Intendance titling tenure security registration process information system at pilot project department 36 Guidance Note 8 Institutional Capacity Indicators Database The Institutional Capacity Indicators Data- and failure to effectively track them limit the base* is a practical resource to help moni- possibility for TTLs to make needed, timely tor, evaluate, and report tangible results for adjustments to their programs. capacity development programs. The data- WBI developed the database for TTLs to base is a searchable catalogue of real-world find examples of indicators and measures capacity characteristics and their indicators. for various institutional capacity challenges In this way, project teams can break institu- their projects face. It features examples of tional capacities down into observable and indicators from a review of development measurable units to retrospectively assess databases and approximately 200 existing results. and closed World Bank projects across sec- Task team leaders (TTLs) often need tors and regions. concrete results for capacity development TTLs can search the database to: interventions, to show the viability of their • Identify characteristics of institutional efforts and accountability to stakeholders capacity for exploring the results of and donors. However, evaluation of capac- capacity development ity development activities often focuses on • Identify indicators of those measuring outputs rather than outcomes. characteristics for evaluating institutional Also, the absence of appropriate indicators change *The database is available online for World Bank staff only at http://wbicdrf.worldbank.org. External users can request a searchable Excel document by emailing capacity4change@worldbank.org. 37 • Prioritize characteristics according to the Through the database, the TTL under- most needed results stood how constraints to stakeholder • Expand their understanding of results ownership could be assessed to reflect the management for capacity development program’s ability to achieve locally owned For example, a TTL within an Urban results. She searched the database for Development sector focuses on public examples of capacity characteristic out- sector governance. She is working on come indicators and their measures to help a capacity development program with assign appropriate indicators. Together the goal of improving public services in with stakeholders, she also prioritized which targeted urban areas. Initially, she planned characteristics to measure by defining the to focus primarily on organizational most needed results. capacity development. 
Using the Table 14 gives an example of using the Institutional Capacity Indicators Database database to clarify an institutional capac- to inform her retrospective assessment of ity change objective, its outcome and project results, she identified stakeholder indicator that the TTL could use to assess ownership as a key institutional capacity the results of the capacity development area targeted by project interventions. program. Table 14. Example of Using the Database to Assess Results Database Category Sector is Urban Development Development Goal Improve public services in targeted urban areas Institutional Capacity Area Stakeholder ownership Institutional Capacity Increased stakeholder participation in setting priorities Change Objective Outcome Increased participation of community stakeholders in decisions regarding local public services Indicator Participation rate of poorest and vulnerable community members in planning meetings Data Sources Community survey and meeting minutes 38 Guidance Note 9 Understanding Intermediate Capacity Outcomes An ICO is an improvement in the ability worked and what did not work in capac- or disposition of agents of change to take ity development interventions. actions. This improvement is considered an An ICO is the result of one or several intermediate capacity outcome, because the steps (or deliverables) in the capacity expectation is that the agents of change— development intervention (or initiative). thanks to the improved ability or disposi- These steps can involve different tion—will act to effect institutional changes instruments (or learning approaches), toward the development goal of a capacity including learning-by-doing. The CDRF development program. Being able to under- provides a typology of six standard stand and identify ICOs is critical for accu- ICOs that agents of change can achieve rately tracing capacity development change to contribute to institutional level processes and deriving lessons about what changes. Table 15 presents the ICOs, Table 15. Intermediate Capacity Outcomes, Definitions and Attributes ICO Definition and Operational Attributes Raised Increased disposition to act, through, for example, improved: awareness Understanding, attitude, confidence, or motivation Enhanced Increased ability to act, through: knowledge and Acquisition or application of new knowledge and skills skills Improved Strengthened disposition or ability to act through improved collaboration within a group of consensus and people tied by a common task. This may involve for example, among team members, a stronger teamwork agreement or improved: Communication, coordination, cohesion, or contributions by the team members to the common task Strengthened Strengthened disposition or ability to act through improved collaboration between individuals or coalitions groups with diverse objectives to advance a common agenda. This may involve, for example: Stronger agreement on a common agenda for action, increased commitment to act, improved trust among members, or improved ability of the coalition members to leverage their diverse strengths Enhanced Strengthened disposition or ability to act through improved collaboration between individuals or networks groups with a common interest but not a formal common agenda for action. 
This may involve, for example: Improved processes for collaboration, stronger incentives for participation in the network, or increased traffic or communication among network members Increased Strengthened disposition or ability to act, arising from: implementation Formulation or implementation of polices, strategies, or plans know-how This may involve, for example, discovery and innovation associated with learning by doing. 39 their definitions and attributes. Annex 2 characteristics? Specifically, what provides examples of these attributes. changes in the ability or disposition of Once practitioners have identified stakeholders led to or facilitated the their capacity change objectives (targeted institutional capacity change? changes in institutional capacity), they can • What capacity development trace the change process logic intended to interventions contributed to the achieve these objectives (examples in Table targeted ICOs? 16). The ICOs reflect the initial change ICOs should be identifiable for all results of capacity development interven- capacity development interventions, tions and serve as important milestones regardless of their sector focus or higher- for monitoring progress. In cases where level institutional capacity change objec- targeted ICOs have not been explicitly tives. Even at the sector or country strategy identified during a project or program’s level, capacity change processes can be design stage, it is possible to retrospec- understood only when ICOs have been tively identify the needed initial change clearly articulated. The assessment of a results and assess whether these have been strategy in this case nearly always requires achieved. practitioners to trace capacity development The intermediate stage of the program changes within specific projects to identify change logic can be explored and clarified the ICOs and capacity development results by asking the following questions: for the overall portfolio. • Who were the agents of change who Practitioners need to understand how an initiated or managed the desired ICO serves as an intermediate step toward change process (es)? a needed institutional capacity change • What ICOs led to the measurable before attempting to assign any indicator changes in institutional capacity or data source for assessment. Table 16. 
Examples of Intermediate Capacity Outcomes in World Bank Projects Sector Focus Institutional Capacity Generic ICO Specific ICO Change Objective Public sector Increase the transparency of Raised awareness Awareness among community governance information about decentralization members of decentralized and intergovernmental transfers governance structure Transport Improve the achievement of outcomes Enhanced knowledge Completion of the by transport authorities or skills municipality’s Transport Master Plan Community Increase the operational efficiency Improved consensus Expansion of participatory and basic of primary healthcare facilities in and teamwork process for budgeting to health delivering services to those who need link strategic objectives with them budget allocations Agriculture Foster stakeholder participation in Strengthened coalitions Establishment of inter- and rural priority setting for increasing the municipal road consortia by development productive capacity of the rural sector clusters of municipalities Public financial Increase the demand for the Enhanced networks Establishment of National management accountability of public service Budget Oversight Network providers in public financial management Public sector Increase the operational efficiency New implementation Formulation of strategy by governance of the village land management know-how village land management committee in planning and committee for local implementing rural development development 40 Guidance Note 10 Checklist for Identifying Targeted Intermediate Capacity Outcomes ICOs reflect an improvement in the ability (or should have been) targeted as part of or disposition of change agents to take the a capacity development change process. actions needed to achieve capacity change In cases where task team leaders or other objectives. These improvements too often practitioners lack sufficient information fall within the “black box” of capacity to accurately address these points retro- development—remaining undefined and spectively, the questions should simply be unmonitored despite the fact that they are answered to the extent feasible. intermediate milestones representing criti- Practitioners must identify and under- cal progress towards targeted higher-level stand targeted ICOs before they can suc- institutional capacity changes. cessfully assign indicators and methods or Defining desired ICOs allows practitio- data sources for tracking their achievement ners or evaluators to assign indicators for (described in Guidance Note 11). assessing and documenting the capacity development change process. Such indica- tors can be identified during the project design phase or retrospectively by reexam- ining the project context and interventions. For instance, project teams might have assigned and tracked indicators for individ- ual capacity development components that reflect important ICOs needed to achieve capacity change objectives. The questions in Table 17 can help practitioners or evaluators to explore and possibly identify which ICOs a capacity development intervention was designed to produce. These questions are phrased in the past tense to support a retrospec- tive assessment. However, they could also be asked in the present tense during the design phase to help define which improvements in the ability or disposition of stakeholders interventions will be devel- oped to produce. 
Not all of the questions will be relevant for a specific capacity development inter- vention, but practitioners can use this list to consider retrospectively which ICOs were 41 Table 17. Questions for Clarifying Needed Intermediate Capacity Outcomes ICO Check if the answer is “no” in relation to the target development goal Raised ❑❑ Did the change agents have sufficient knowledge of the ? awareness ❑❑ Did they understand their role in improving the current situation? ❑❑ Were they sufficiently motivated to take the needed actions? ❑❑ Were they confident that they could take the needed actions? Enhanced ❑❑ Did the change agents have adequate technical skills and/or knowledge related to to knowledge and make the current situation better? skills ❑❑ Did they know how to apply the needed knowledge or skills in their work? ❑❑ Did they have the managerial support to apply the needed knowledge or skills? ❑❑ Was the environment in the change agents’ workplace conducive to applying these skills? Improved ❑❑ Were there any problems among or within the change agents related to poor teamwork? consensus and [check if yes] teamwork ❑❑ Were change agents able to work effectively together on ? ❑❑ Were they able to reach agreement on ? ❑❑ Were all key stakeholders (other than change agents) included in the decision-making process related to ? ❑❑ Was there effective and sufficient communication among team members? ❑❑ Were team members committed to improving the situation related to ? Strengthened ❑❑ Did the change agents collaborate in any form with any external partners on ? coalitions ❑❑ Were the roles and responsibilities within established partnerships clear related to ? ❑❑ Did the members of the established partnerships or coalitions share a common agenda for action related to ? ❑❑ Was there sufficient trust among members of the coalition to work effectively together? ❑❑ Were the partnerships or coalition structured appropriately to leverage diversities related to ? ❑❑ Was the structure of the partnership or coalition formal enough to support an effective decision making process related to ? Enhanced ❑❑ Were the relevant stakeholders’ involvement in the decision making process ensured? networks ❑❑ Did individual members have sufficient incentives for participating in the network? ❑❑ Were members committed to the network’s goals? ❑❑ Were the relationships within the network appropriate for effectively addressing ? ❑❑ Was everyone connected to the network who needed to be for addressing ? ❑❑ Did the network effectively bridge differences? ❑❑ Was there a sufficient exchange of information among network members for addressing ? New ❑❑ Did change agents have sufficient understanding of why they needed to develop a strategy/ implementation policy/plan? know-how ❑❑ Was there a new policy or strategy that needed to be developed to make the envisioned changes in the ? ❑❑ Was there a policy/strategy/plan that needed to be implemented to make the envisioned changes in the ? ❑❑ Did the change agents have sufficient know-how to identify and implement the needed action steps related to ? ❑❑ Was there a M&E plan to measure the results of the strategy/policy/plan? 
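As an illustration of how a checklist like Table 17 might be used in practice, the short sketch below records "no" answers against the six generic ICO types and lists which ICOs an intervention appears to have targeted. It is only a sketch in Python; the abbreviated answers are hypothetical stand-ins for the fuller questions in Table 17, and the targeted_icos helper is an assumption rather than part of the CDRF.

# Generic ICO types from the typology used in this guide (Table 15).
ICO_TYPES = [
    "Raised awareness",
    "Enhanced knowledge and skills",
    "Improved consensus and teamwork",
    "Strengthened coalitions",
    "Enhanced networks",
    "New implementation know-how",
]

# Hypothetical, abbreviated checklist answers: True means the answer to a
# Table 17-style question was "no", i.e., an improvement was needed.
checklist_answers = {
    "Raised awareness": [True, False],
    "Enhanced knowledge and skills": [True, True],
    "Improved consensus and teamwork": [False, False],
    "Strengthened coalitions": [False],
    "Enhanced networks": [False],
    "New implementation know-how": [True],
}

def targeted_icos(answers):
    # Flag an ICO type as targeted if any of its questions was answered "no".
    return [ico for ico in ICO_TYPES if any(answers.get(ico, []))]

print("ICOs the intervention appears to have targeted:")
for ico in targeted_icos(checklist_answers):
    print(" -", ico)

Keeping the answers in a simple structure like this makes it easier to compare, across several interventions, which ICO types were most often targeted.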
Guidance Note 11
Assigning Indicators and Data Sources for Assessing the Achievement of Intermediate Capacity Outcomes

Once practitioners or evaluators have identified the targeted ICOs of capacity development interventions, they should identify indicators that can assess whether the desired changes have happened (or are happening) as planned. This process requires thinking beyond the general type of ICO (raised awareness, enhanced knowledge and skills, etc.) to identify one or more specific attributes for which measurable change could be observed during the timeframe of the intervention.

Appropriate indicators for ICOs could be either quantitative or qualitative depending on the context and the characteristic being observed and measured. ICOs are assessed through a broad range of measurement methods, such as:
• Surveys assessing changes in perceptions, understanding, attitudes, motivation, etc.
• Analysis of media content
• Review of records from relevant government offices (administrative data)
• Post-activity tests (tests of learning or skill competency)
• Pre- and post-learning questionnaires
• Analysis of meeting minutes or other documentation of group processes
• Observation of meetings and group interaction
• Application of collaboration and inclusivity checklists
• Individual or group interviews, focus groups
• Social or value network analysis, involving the mapping of relationships or the assessment of the financial and nonfinancial value of assets
• Assessment of the quality of policy, strategy, program and project documents

The process of selecting indicators should include the consideration of existing data sources and/or the feasibility of collecting the relevant data. Data quality is an important consideration in selecting data sources. In all cases, practitioners should ensure that indicators are SMART (specific, measurable, achievable, relevant, and time-bound).

In cases where indicators for ICOs have not been identified during an intervention's design stage, some outcomes captured for individual activities or components might be adequate for confirming the needed ICOs within a results chain. Practitioners or evaluators should therefore review existing documentation for possible evidence of ICOs. Table 18 provides examples of indicators used in existing World Bank projects across sectors and across types of outcomes. Additional guidance for exploring and identifying possible attributes of ICOs to measure is included in Annex 2.

If existing indicators and data are not sufficient for documenting ICOs, then it is often possible to conduct additional data collection and/or analysis after project completion (see Section II: Analytical Techniques to Assess Outcomes).
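One practical way to keep the link between each targeted ICO, its measurement method, its indicator, and its data source explicit during a retrospective assessment is sketched below. It is a hedged illustration in Python: the IcoIndicator structure and field names are assumptions, not a WBI template, and the two entries are loosely patterned on rows shown in Table 18 below.

from dataclasses import dataclass

@dataclass
class IcoIndicator:
    generic_ico: str     # one of the six ICO types in the CDRF typology
    specific_ico: str    # the customized outcome targeted by the project
    method: str          # measurement method (survey, administrative records, etc.)
    indicator: str
    data_source: str

# Entries loosely patterned on the examples in Table 18.
results_chain_indicators = [
    IcoIndicator(
        generic_ico="Raised awareness",
        specific_ico="Community awareness of the decentralized governance structure",
        method="Household survey",
        indicator="Percentage of households that can identify their local council representative",
        data_source="National Public Services Survey",
    ),
    IcoIndicator(
        generic_ico="Strengthened coalitions",
        specific_ico="Establishment of inter-municipal road consortia",
        method="Administrative records",
        indicator="Number of consortia established by clusters of municipalities",
        data_source="Management Information System",
    ),
]

for row in results_chain_indicators:
    print(f"{row.generic_ico}: {row.indicator} [{row.data_source}]")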
Table 18. Examples of Indicators for Intermediate Capacity Outcomes in World Bank Projects

Sector Focus: Public sector governance
Generic ICO: Raised awareness
Specific ICO: Awareness among community members of decentralized governance structure
Indicators: Percentage of households that can identify their representative in the local council; percentage of households who know whom to file complaints with if they are dissatisfied with development decisions in a specific sector (education, health, and agriculture)
Data Source: National Public Services Survey

Sector Focus: Transport
Generic ICO: Enhanced knowledge or skills
Specific ICO: Completion of the municipality's Transport Master Plan
Indicator: Publication of the Transport Master Plan
Data Source: Public documents (Transport Master Plan)

Sector Focus: Community and basic health
Generic ICO: Improved consensus and teamwork
Specific ICO: Expansion of participatory process for budgeting to link strategic objectives with budget allocations
Indicator: Number of sector and subsector strategies included within the Medium Term Budget Framework
Data Source: Annual reports of the Ministry of Health, oblast health departments, and central rayon hospitals

Sector Focus: Agriculture and rural development
Generic ICO: Strengthened coalitions
Specific ICO: Establishment of inter-municipal road consortia by clusters of municipalities
Indicator: Number of inter-municipal road consortia established by clusters of municipalities
Data Source: Management Information System

Sector Focus: Public financial management
Generic ICO: Enhanced networks
Specific ICO: Establishment of National Budget Oversight Network
Indicator: Presence of a network for information sharing and collaboration among non-state actor (NSA) coordinators and government, as evidenced by the following: appointment of a NSA coordination officer; membership that includes NSAs and other accountability institutions; regular meetings for government and NSAs to discuss PFM issues; preparation of PFM information materials at national and local levels, translated into local dialects using non-technical language
Data Source: Report of the NSA coordinator

Sector Focus: Public sector governance
Generic ICO: New implementation know-how
Specific ICO: Formulation of strategy by village land management committee for local development
Indicator: Percentage of villages covered by the project that developed and adopted a local development plan
Data Source: CVGT (village land management committee) annual activities and budget reports

Section II: Study Designs and Analytical Techniques to Assess Outcomes

Practitioners or evaluators conducting a retrospective assessment of a capacity development intervention often rely on project documents to trace one or more capacity development change processes. In some cases, data from existing databases or management information systems can be extracted and analyzed to assess results. In most cases, however, additional data collection will be required to confirm and refine results stories and to document capacity development outcomes.

This section can guide evaluators or practitioners in planning and implementing data collection activities that will be used to analyze the effects of a capacity development intervention at two levels: ICOs and institutional capacity changes. In planning additional data collection and analysis, evaluators must decide whether to pursue a qualitative approach, a quantitative approach, or a mixed-methods approach that uses both. Qualitative techniques usually require less time for developing data collection instruments but typically require more time for analysis of narrative data collected through interviews and focus groups. Such narrative data often yields a detailed understanding of how interventions worked in a specific context.
In contrast, quantitative techniques require substantial time for instrument development (such as surveys and tests) but the data can be analyzed more quickly once it is collected and entered into a database. Quantitative methods using systematic sampling methods provide data that enable the findings to be generalized to the population from which the sample is drawn. Overall, the data collection and analysis methods that are most appropriate for clarifying or confirming capacity development change processes will depend on the nature of the intervention, M&E practices that were used during the intervention, availability of existing data, expertise of the evaluation team, available resources and other factors. A critical aspect of a successful analytical approach is to include data representing multiple perspectives to triangulate findings and confirm assumptions linked to each capacity development results story. In particular, collecting data from both program implementers, and beneficiaries including change agent participants can reduce the biases arising from one particular perspective. A general rule of good practice is to consult the major stakeholder groups as appropriate from government agencies, civil society, and the private sector. This section will help practitioners and evaluators think through what they want to learn and what the most effective means could be for doing so. Tips and references promote good practice in applying qualitative and quantitative methods, and example protocols and templates have been tested in completed case studies. Data Management Guidance Note 12: Preparing, Storing and Managing Data for Analysis................... 46 Qualitative Methods Guidance Note 13: Interviewing Key Program Stakeholders. ...................................... 50 Guidance Note 14: Conducting Group Interviews with Stakeholders........................ 56 Guidance Note 15: Analyzing Qualitative Data............................................................ 59 Quantitative Methods Guidance Note 16: Exploring Opportunities for Quantitative Analysis. ..................... 63 Guidance Note 17: Survey Data..................................................................................... 66 45 Guidance Note 12 Preparing, Storing and Managing Data for Analysis In planning a data collection and analy- scrubbing, entering missing data, recoding sis strategy, practitioners and evaluators variables, and merging data from other should consider early on how they will sources. record and manage the data. Reliably cap- Qualitative data, such as narrative text turing and organizing qualitative or quanti- from interviews, focus groups and desk tative data is a critical step for substantiat- reviews, need to be organized and coded ing a capacity development results chain into categories related to emerging themes. and filling gaps in understanding about the A case study will include data obtained from outcomes associated with the program or multiple sources to enhance understanding intervention being studied. and triangulate findings. Some case studies Three main types of data are worth focus on more than one program, which can considering: data extracted from existing further complicate efforts to organize and systems, data created from surveys and analyze stakeholders’ inputs. data created from interviews. In all cases, A simple approach to managing qualita- the original or “raw”data will need to be tive data is to categorize the types of infor- transformed to be useful for analysis. 
Care- mation collected and enter text from the ful planning regarding data storage and interview notes into a table using the sample management at the start of any data collec- categories presented in this note. Designing tion process, including interviews and focus a data entry template early on in the data groups, will help ensure efficiency. collection process can facilitate the analysis Quantitative data extracted from exist- after the interviews have been completed, ing databases will typically require consoli- and additional variables and coding can dation and/or manipulation, since existing be added during the data analysis process databases will rarely include the precise as needed. indicators that the practitioner or evalua- A program such as Excel could be used tor has identified as relevant to the spe- to allow for the easy filtering and analysis of cific assessment of capacity development qualitative variables. For example, a reviewer results. Quantitative data from surveys must might want to explore whether the types be entered, either manually or automati- of interventions and/or the types of partici- cally through a web-based application, pants appear to be influencing the types of and “cleaned.” outcomes achieved. Potential categories to Although responses to survey questions include in an effective template for docu- are often pre-scored, by assigning numeric menting a results story are described in this values to each response option, cleaning note with illustrative examples provided. The survey data typically requires inspection to format and structure of the template should remove errors and inconsistencies in the be customized based on the scope of the database. In addition, efforts will likely be retrospective assessment, program details needed to refine the data set such as data and level of information likely to be available. 46 1. Basic program details Particularly in case studies where multiple programs will be examined, it will be important to record basic identifier information about the program and the person being interviewed. Title of Capacity Name(s) of Role(s) of Key Program Start Program End Development Key Informants Informants Date Date Program Interviewed Interviewed 2. Information on program context These steps can help to define and confirm capacity development results related to a spe- cific development goal. Information about the overall context, including the development goal, should be captured for the analysis. In most instances, the targeted institutional capacity areas can be identified by asking about the challenges that are impeding the achievement of the development goal and that the intervention has been designed to address. At first, the verbatim description of the challenges can be entered into the table and labeled by question number for reference (see Q2). This description can then be recoded to identify the specific institutional capacity change objectives, with each challenge entered separately for a more detailed analysis of the change process(es) later (see 2a–2b). Overall Q2. Description of Specific 2a. Targeted 2b. Targeted Development Challenges or Problems Institutional Institutional Goal Capacity Capacity Characteristic (1) Characteristic (2) Develop the IT/ 1. (from Anubha) “All countries wanted Strength of Stakeholder Effectiveness of ITES industry to learn about the IT industry and its Ownership— Organizational to stimulate impact on the economy. 
However, Commitment of leaders Arrangements— economic they were not able to envision the Achievement of growth and transformation of the economy and the outcomes employment urban development as a result of the IT industry development…. An attitudinal shift and different mindset was necessary to nurture this industry.” This view is reinforced by the IT/ITES Industry in Africa: A World Bank Supported Study, in which the conclusion of this baseline study is that “all that is required is high level government commitment to taking the necessary steps.” 2. (from GFR, pp5-8) the need for “market-responsive training programs” “NESAP-ICT will seek to introduce innovative skills development models deemed necessary for the country’s present and future needs” 47 3. Program Description Descriptive text from the interview notes can be entered into the template by topic. In the example below, each field references the question number from the interview protocol (see example protocol in Guidance Note 13). Q3. Program Q4.Targeted Participants Q5. Rationale Components (Brief (Roles and Organizations) for Selecting Description) Participants • Three VCs before the • Private sector representative from bodies representing/ Participants were visit involving ICT firms, e.g. chambers of commerce; strategically selected • Visit to India (field association of network providers/ISPs; or other private based on their visits, presentations, a sector IT organization positioning to initiate reflection portion– part • Senior officials in technical institutes or universities or manage the of the action plan to who are engaged in ICT skills training and would be needed change. We take time to document interested in incorporating courses with a focus on IT/ looked for champions the learning, and a ITES into their curriculum in each country conference) • Senior officials from the public or private sector who before the project. • A VC after the visit are involved in or would like to be involved in business We made sure we to discuss lessons process outsourcing included not just the learned. • Government officials from regulatory bodies that leaders but also the • Zambia follow-up address policies in the area of ICT implementers. (e-learning conference) • Government officials or senior officials from ministry using GDLN. Sharing responsible for development of IT of experiences upon • World Bank Project TTLs return was the primary Specific participants from Nigeria included: purpose of the [list of names and titles] conference. 4. Evidence of ICOs Interviewers’ questions about the outcomes of capacity development interventions often first elicit responses at the intermediate outcome level. In other words, informants are likely to talk about how the ability or disposition of individuals or groups changed when they reflect on program or project outcomes. It is therefore suggested that the description of outcomes is entered into the template and then coded into the standard categories of ICOs (with each type of ICO entered separately—see 7a–7d). 7a. Intermediate 7b. Intermediate 7c. Intermediate 7d. 
Intermediate Capacity Outcome Capacity Outcome Capacity Outcome Capacity (1) (2) (3) Outcome (4) New Implementation Know-how Strengthened Coalition Enhanced Network Enhanced knowledge and An action plan was developed and In Nigeria, a new Nigeria is proceeding skills (longer-term it has lead to several follow-up IT-ITES industry in building up outcome) actions, Bank as well as non-Bank: association (called National Research As a key component of this action Outsourcing and Education In Nigeria, the plan, the Bank is supporting a pilot Development Initiative Networks (NREN) ACCESS initiative in Nigeria to assess and certify of Nigeria - ODIN) that will connect is developing a foundational skills for BPO industry. was conceptualized universities and training program, a This is called ACCESS (Assessment and launched in 2009. research institutes. skills set dictionary of Core Competence for Services During the KE, ODIN Efforts to set and curriculum Sector) Nigeria. The ICT Skills TA met with NASSCOM up the NREN in framework, and a for piloting this assessment was and learned what an Nigeria have been training matching launched by a concept review held industry association can ongoing since the grant model to on January 25, 2010. A computer- do. The Access Nigeria Nigeria ICT Forum address the skills based assessment and benchmarking Pilot resulted from this was established in gaps. A preliminary tool was developed and piloted on interaction, reflecting 2005. The possible pool of qualified 300 students in Lagos in December, how the coalition contributions of the training providers has 2010, and is now being rolled out to represented by ODIN KE to furthering the been identified. 3000 students in 5 cities. Results are has strengthened. development of the expected in May 2011. NREN is not yet clear. 48 5. Evidence of Institutional Capacity Outcomes In some cases, interviewers will identify outcomes at the institutional capacity level. An effective approach can be to probe on the specific institutional capacity challenges identi- fied earlier in the interview to explore whether the initiative has had an effect thus far. As with the information about ICOs, this data can be entered first as descriptive text and then recoded into the specific institutional capacity change objectives for later analysis (see 8a–8b). Q8. Institutional Capacity Outcomes 8a. Institutional 8b. Institutional (Description and Evidence) Capacity Outcome Capacity Outcome (1) (2) The ICT Skills TA for piloting an Assessment and Increased strength of Increased effectiveness Benchmarking of IT Enabled Services Foundational stakeholder ownership: of organizational Skills was launched by a concept review held on January commitment of leaders arrangements: 25, 2010. There is high-level commitment to this pilot, to developing a skilled improved achievement as evidenced by the link to the President’s initiative workforce for the IT-ITES of outcomes by training allocating $400 million to create jobs. industry centers preparing graduate for the IT-ITES The ACCESS initiative will establish a full globally- industry benchmarked cycle of Assessment-Training-Certification and job placement of candidates. An increase in the commitment of leaders will allow for this cycle to be scaled up at training centers throughout Nigeria to produce skilled graduates well-prepared for employment in the IT-ITES industry. 6. Other Information Interviews provide opportunities to explore unexpected findings and identify additional data sources. 
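Before turning to the interview guidance, the sketch below illustrates one way the data-entry template described in this note could be set up so that coded interview text can be filtered and summarized later; the note mentions Excel, and a spreadsheet works just as well. This is a minimal example that assumes the pandas library is available; the column names loosely follow the categories above, and the single record is hypothetical apart from the development goal and capacity terms, which are taken from earlier examples in this guide.

import pandas as pd

# Columns loosely mirror the template categories described in this note.
columns = [
    "program_title", "key_informant", "informant_role",
    "development_goal", "capacity_challenge_text",
    "institutional_capacity_area", "ico_type", "ico_evidence",
    "institutional_outcome_evidence", "lessons_learned",
]

# One hypothetical, abbreviated record entered from interview notes.
records = [{
    "program_title": "Example knowledge exchange program",
    "key_informant": "Program officer (name withheld)",
    "informant_role": "Program implementer",
    "development_goal": "Improve public services in targeted urban areas",
    "capacity_challenge_text": "Weak stakeholder participation in setting priorities",
    "institutional_capacity_area": "Strength of stakeholder ownership",
    "ico_type": "Raised awareness",
    "ico_evidence": "Officials requested follow-up workshops on participatory planning",
    "institutional_outcome_evidence": "",
    "lessons_learned": "Engage implementers as well as leaders",
}]

df = pd.DataFrame(records, columns=columns)

# Filtering works like an Excel filter: for example, list all coded evidence
# of a given ICO type across the interviews entered so far.
awareness = df.loc[df["ico_type"] == "Raised awareness",
                   ["program_title", "ico_evidence"]]
print(awareness.to_string(index=False))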
Any relevant questions or probes for gaining such information should be included in the interview protocol (Guidance Note 13), and the resulting qualitative data can then be captured in the data template. Any methods used for managing data can be adapted to meet the needs and purposes of users. However, it is important to establish a clear structure and process at the start to ensure that team members implement a consistent, standard approach to capture any relevant data for documenting capacity development results stories.

Q9. Lessons Learned
Q10. Additional Information Resources
Q11. Other Key Informants to Contact
Q12. Final Comments or Suggestions

Guidance Note 13
Interviewing Key Program Stakeholders

A qualitative approach to data collection can yield a valuable understanding of how program components have worked together to produce specific outcomes. An important early step in this approach is to interview a person who is highly knowledgeable about the intervention and well situated to identify other key informants. This person will most often be the program officer or team leader and will serve as the entry point for locating useful program documents and identifying appropriate informants to cover a range of perspectives, such as program implementers, partners, participants, and beneficiaries.

Some preparation is recommended to help ensure that the evaluation team designs and conducts an effective case study. In particular, two steps should be implemented before any interviews are conducted:
• Review any available program documents or other information to begin mapping the capacity development change processes (see Guidance Notes 1–10). The reviewer or team should explore existing data related to program outcomes to establish one or more hypotheses about the results stories. Thinking through each results story will highlight where additional information is needed and inform the reviewer about what questions need to be asked of program stakeholders during interviews.
• Establish a system for managing and analyzing qualitative data (see Guidance Note 12). To minimize the burden on respondents and evaluators, only data that is expected to be useful for understanding and documenting the capacity development outcomes and the context in which they occurred should be collected. To this end, the team should carefully plan out how they will analyze the responses and in what format the data should be entered to facilitate this analysis process.

There is no single way to conduct a qualitative interview, but the following tips are likely to enhance its effectiveness:
• Interviews should be semi-structured. The interviewer should develop questions or probes beforehand to ensure that key topics are covered. A flexible, facilitated conversation is likely to yield richer data than a more controlled question-and-answer session.
• Questions should focus first on gaining basic descriptive information. Program officers or other key informants could feel uncomfortable if early questions are challenging. The interviewer can learn valuable contextual information and set the informant at ease by asking first about the program background or design.
• Follow-up questions (or probes) often need to be asked to gain more specific details and evidence related to outcomes. A rich results story requires concrete details and documentation, but an informant might not be aware of which details are needed or might be concerned about talking too much.
• The interview should be viewed as part of an iterative process. To this end, the interviewer should include questions related to who (or what) else might serve as a valuable information resource and set the expectation that he or she might have additional questions later.
• Recording information during the interview should not interrupt the flow of the conversation. Tape recording is the least intrusive method for recording information during the conversation if it does not intimidate interviewees from speaking freely. Therefore, it is important to ask permission beforehand to tape the conversation. Taking notes during the conversation is still helpful even if the conversation is taped.

Overall, qualitative interviews are most successful if the interviewer is able to engage the informant in free-flowing discussion and adapt the questions as needed to clarify and document the capacity development change process(es).

A semi-structured interview protocol often needs to be customized to serve as an effective guide for each targeted informant. A typical approach is to establish a set of core questions that will be asked across all of the program stakeholders. This allows practitioners to explore systematically a standard set of data items from multiple perspectives. Additional questions can then be added or adapted for each informant being interviewed based on their unique experiences or areas of expertise. The sidebar lists typical core topics to explore for a case study using this approach.
This allows practitioners to explore Core Topics for Interviews with Program Stakeholders Program Overview • Interviewee’s role in the design and implementation of the project or program—to understand the vantage point of the person being interviewed • Challenges or problems that program was design to address—to identify institutional capacity constraints • Context of program component or knowledge exchange in focus—to understand how the program being explored fit within a larger initiative Participants • Who was targeted or selected to participate and in which country(ies)—to identify the stakeholder groups represented • Rationale for selecting participants—to explore the program design and expected outcomes • Process for selecting knowledge partners—to explore the program design and expected outcomes Outcomes • Short-term results and follow-up activities of the program—to identify and document possible ICOs • Concrete ways that the program helped to address identified barriers and challenges—to identify and document possible institutional capacity outcomes • Other outcomes—to explore other possible outcomes that the interviewer has not yet identified or included in hypotheses about the capacity development results Additional Information • Additional materials or existing resources—to identify other information sources for learning more about the program and documenting or triangulating findings • Other key informants—to learn whom else to contact to explore further the program’s design and possible evidence of outcomes. 51 Example of Interview Protocol from Real-World Situation Assessing the Outcomes of a Knowledge Sharing Program—Program Implementer Perspective [Example of Interview Guide for KDI Program Officer] Notes to interviewer: • The generic protocol for conducting a semi-structured interview should be customized for the specific knowledge sharing project based on a review of any available documents. • The interview guide does not need to be followed in sequence; instead, a collaborative, conversational approach could dictate the order in which topics are addressed. • Probes and instructions are provided in italics to assist the interviewer in eliciting more details as needed. • The interviewer should use this guide to qualitatively explore the capacity development change process(es) supported by the knowledge sharing program. Existing challenges and reported outcomes should be translated into the corresponding institutional capacity constraints and types of ICOs. These standard terms are listed at the end of this guide. Program Overview 1. Why did the Korea Development Institute establish a partnership with [country] in [year] to share knowledge related to [content area]? Probe: What were the considerations that led KDI to this development partnership for [year]? 2. What were the specific challenges or problems that this program was designed to address? Probe as needed to define each challenge in terms of an institutional capacity constraint 3. Could you briefly describe the major components or activities in this knowledge sharing program? Probe as needed to collect basic description of format and content. Program-specific probes: (if not mentioned as part of activities or components listed) • [list names of main program activities described in reviewed report here] • • Participants 4. Who was targeted or selected in [country] to participate in this knowledge sharing program? What were these individuals’ roles and organizations? 5. 
5. What was the rationale for selecting these participants?
Probe as needed to understand how the participants were positioned to help overcome the institutional capacity challenges that the program was designed to address. The interviewer should focus on the specific challenges mentioned in Question 2.
6. How were the experts from Korea selected to participate?
Probe as needed to understand the rationale for selecting these individuals.

Outcomes
7. What did you identify as the main outcomes of this knowledge sharing program?
Probe for existing supporting documentation for each outcome. If no ICOs are identified, the interviewer can probe using the following categories:
• Raised awareness
• Enhanced knowledge and skills
• Improved consensus and teamwork
• Strengthened coalitions
• Enhanced networks
• New implementation know-how
8. To what extent did the program help to address the identified institutional capacity challenge [name challenge listed in Question 2]? What indicators were or could be used to assess a change for the [name institutional capacity characteristic]?
Explore possible indicators for each identified institutional capacity challenge separately. Probe for measure and data source as appropriate.
9. [If outcomes are described only for the knowledge recipient country] Did Korea directly benefit from this knowledge sharing program in terms of enhancing its own institutional capacity? If so, please describe the outcomes that were achieved for Korea.
Probe as needed to understand the type of outcome and specific measure.
10. Were there any predominant lessons learned during this knowledge sharing program that have been used or will be used to inform the design and implementation of future knowledge sharing programs? If so, please describe what was learned.
This question could elicit information on additional unreported outcomes.

Additional Information
11. Are there any additional materials or existing resources that you suggest we review to learn more about this program's design, implementation, and outcomes? In particular, we are interested in:
• Rationale from the initial demand survey used to establish this knowledge sharing program
• Agendas, instructional materials, websites, or other artifacts that would help us to understand the main knowledge sharing activities
• Any additional reports from pilot studies and policy consultations
• Evaluation data related to program activities
12. Who are other key informants whom we should contact to learn more about the program's design, implementation, and outcomes?
13. Are there any final comments or suggestions that you would like to share regarding this knowledge sharing program and its capacity development results?

Checklists for Change Objectives and ICOs for Interviews
Interviewers might find it useful to bring checklists of the standard institutional capacity change objectives and ICOs to the interviews to think through and explore individual capacity development change processes.
Checklist: Institutional Capacity Areas and Their Contributing Characteristics

Strength of Stakeholder Ownership
❑ Commitment of social and political leaders
❑ Compatibility of social norms and values
❑ Stakeholder participation in setting priorities
❑ Transparency of information to stakeholders
❑ Stakeholders' demand for accountability

Efficiency of Policy Instruments
❑ Clarity in defining rights and responsibilities of stakeholders
❑ Consistency
❑ Legitimacy
❑ Incentives for compliance
❑ Ease of administration
❑ Risk for negative externalities
❑ Suitable flexibility
❑ Resistance to corruption

Effectiveness of Organizational Arrangements
❑ Clarity of mission
❑ Achievement of outcomes
❑ Operational efficiency
❑ Financial management (financial viability and probity)
❑ Communications and stakeholder relations
❑ Adaptability

Checklist: Standard Intermediate Capacity Outcomes and Their Operational Attributes

Raised Awareness
❑ Attitude
❑ Confidence
❑ Intention to act
❑ Motivation

Enhanced Knowledge and Skills
❑ Acquisition of new knowledge
❑ Application of new knowledge
❑ Improvement in understanding

Improved Consensus and Teamwork
❑ Communication
❑ Coordination
❑ Contributions
❑ Cohesion

Strengthened Coalitions
❑ Common agenda for action
❑ Commitment to act
❑ Trust
❑ Leveraging diversities

Enhanced Networks
❑ Common interest
❑ Processes for collaboration
❑ Incentives for participation
❑ Generating traffic

New Implementation Know-How
❑ Formulated policies and strategies
❑ Implemented strategies and plans

Resources for Designing and Conducting Effective Interviews
Bamberger, M., V. Rao, and M. Woolcock. 2010. Using Mixed Methods in Monitoring and Evaluation: Experiences from International Development. Policy Research Working Paper. Washington, DC: The World Bank.
Creswell, J. and V. Plano Clark. 2007. Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications.
Morra Imas, L. and R. Rist. 2009. The Road to Results: Designing and Conducting Effective Development Evaluations. Washington, DC: The World Bank.
Patton, M. 2002. Qualitative Research and Evaluation Methods. 3rd ed. Thousand Oaks, CA: Sage Publications.
United States Agency for International Development, Center for Development Information and Evaluation. 1996. Conducting Key Informant Interviews. Washington, DC. Available at http://pdf.usaid.gov/pdf_docs/PNABS541.pdf
W.K. Kellogg Foundation. 2004. Evaluation Handbook. Available at http://www.wkkf.org/knowledge-center/resources/2010/W-K-Kellogg-Foundation-Evaluation-Handbook.aspx

Guidance Note 14
Conducting Group Interviews with Stakeholders

Qualitative data collection efforts often rely on interviews with stakeholders in group settings. In some cases, interviewers select a focus group design because the interactions among informants can help to develop a more nuanced understanding of outcomes and contributing factors. In other cases, the group interview format is the only or best option for interviewing multiple stakeholders within a tight schedule.
Either way, the group interview is an important qualitative data collection method for exploring and documenting a capacity development results story. The first step in conducting an effective group interview is to address practical considerations during the preparation stage.

Participant selection
The composition of the group being interviewed will influence the quality of the responses.
• Relevant perspectives. Those invited to participate should have direct experience with the capacity development intervention. This will help to ensure that they are well informed about the topics being explored. These individuals might be participants, beneficiaries, implementers, partners or others who have served a role related to the program.
• Common interests and experiences. Each group should include individuals with similar roles who can explore a standard set of questions together. For example, an effective group interview could occur with a mix of program participants. The process would go less smoothly if participants are expected to address different questions, such as might occur with a mix of program implementers and participants.
• Equal status. The hierarchy of those in the group should be considered. For example, some participants might be less likely to speak candidly about their ability to use new skills and knowledge in their jobs in situations where their supervisor is present. Variations in gender and ethnicity may also influence group dynamics in some situations.

Interviewers
The individual or team conducting the interview should also be selected carefully to avoid introducing response bias into the interview. External evaluators are more likely than program implementers to obtain candid feedback on sensitive issues related to the program, whereas program implementers are more likely to elicit favorable responses from program participants and sponsors.

Logistics
Basic preparations related to where and how the interview is conducted will also help to ensure a successful session.
• Setting. The facilities selected for group interviews should be quiet and comfortable, providing an atmosphere that is conducive to a candid conversation. The location should be convenient for participants to attend easily.
• Language. Interviews should be conducted in the dominant language of the group when possible. In cases where multiple languages are needed, arrangements for interpreters should be made in advance to ensure clear communication during the session. Sequential translation requires additional time, which reduces the amount of material that can be covered in a given session. Simultaneous translation requires appropriate technology (translation booths, microphones, earphones, transmitters).

Given the realities of an international development context, the specific arrangements of any group interview will need to be adapted to the situation and needs at hand, but a few general tips will help to ensure that the interview progresses smoothly:
• Groups should contain no more than 6 to 10 people when possible, so that all of those present have a chance to share their views.
• The interviewer's role is to focus the discussion. The purpose of the interview is not to reach consensus or to explore differences; instead, it is for the interviewer to gain the participants' perspectives on a specific set of topics.
• The interviewer should prepare a set of questions or probes in advance by reviewing existing data and any assumptions formed thus far. Based on a prior desk review and/or interview with a program officer, the interviewer or team might be able to trace (hypothesize) the capacity development change processes and then design the interview to explore whether the hypothesized change process is correct and to fill any gaps in understanding.
• The focus group discussion guide should include no more than 8 to 10 questions for the group to address. As appropriate, additional probes can be included to ensure that the targeted questions are addressed sufficiently.
• If a team is conducting the interview, one member should be designated as the lead interviewer to minimize confusion. This lead interviewer can ensure that other team members have opportunities during the session to follow up on responses or fill gaps.
• Interviewers need to be flexible. Focus groups often include unexpected aspects, such as a change in the group's composition, a lack of willingness of some participants to talk in front of others, an unanticipated answer that requires substantial follow-up, or even a change in the amount of time allotted for the session.

Core questions
As with the individual interviews (see Guidance Note 13), practitioners should identify a core set of questions or themes that will be explored by all of the groups being interviewed, to integrate multiple perspectives and triangulate findings. Once this standard set of items has been established, the questionnaire can be adapted for each group.
Focus group interview guides tend to be highly customized and are developed specifically for the scheduled participants. One example of a protocol used by WBI and KDI in their joint study of KDI's Knowledge Sharing Program is included below to demonstrate this approach. Selected resources to consult for developing a group interview guide are at the end of this guidance note.

Example of Group Interview Questions for Exploring Outcomes
KDI's Knowledge Sharing Program (KSP) in the Dominican Republic (DR)
Meeting with the National Bank of Housing and Production (BNVP)
(1) Has the KSP helped to build understanding about the need for improved credit and insurance services to support export development? [Probes: For whom? How did this occur?]
(2) Did the KSP facilitate cooperation between BNVP and the Korea EXIM Bank?
(3) What is the current status of efforts to establish the Dominican Export-Import Bank or export credit agency? Is there any documentation available to help us understand this status (such as a memorandum of understanding between the DR and the Korea EXIM Bank, a draft proposal, etc.)?
(4) How has the KSP specifically contributed to this process of establishing an EXIM bank?
(5) What are the main challenges that the Dominican Republic has faced in planning the establishment of an EXIM bank? Has the KSP provided relevant lessons to overcome these challenges?
(6) Are there other outcomes from the KSP not yet mentioned that you experienced or observed?

Lessons Learned
(7) What specific factors contributed to the success of this KSP?
(8) How could the KSP have been more effective? Do you have any suggestions to improve the KSP in the future? [Probe: For example, with respect to modality and content?]

Resources for Designing and Conducting Group Interviews
Bamberger, M., J. Rugh, and L. Mabry. 2006. RealWorld Evaluation: Working Under Budget, Time, and Political Constraints. Thousand Oaks, CA: Sage Publications.
Billson, J. 2004. The Power of Focus Groups: A Training Manual for Social, Policy, and Market Research with a Focus on International Development. Barrington, RI: Skywood Press.
Hennink, M. 2007. International Focus Group Research: A Handbook for the Health and Social Sciences. Cambridge: Cambridge University Press.
Krueger, R. and M. Casey. 2000. Focus Groups. 3rd ed. Thousand Oaks, CA: Sage Publications.
Guidance Note 15
Analyzing Qualitative Data

Good data collection practices for evaluating capacity development require forethought about how the information gathered will be stored and analyzed (as described in Guidance Note 13). To provide meaningful findings, qualitative data should be clear and specific and provide in-depth details regarding the topics being examined. Findings from interviews should also easily be traced back to individual informants or stakeholder groups, and verbatim quotations often serve as compelling examples that add texture and substance to results stories. All of these considerations highlight that practitioners retrospectively assessing capacity development interventions should explore issues related to data management and analysis early on, in the design stage of their study.
How data should be stored and managed depends on its intended use and the plans for analysis. Therefore, practitioners should explore possible approaches to analyzing content from any desk reviews and interviews before any data is collected.
An effective qualitative assessment of capacity development outcomes is likely to be based on an iterative approach that moves through six key steps. After the documents from the program cycle have been assembled, the preparation stage will include reviewing program documents and other background information to understand the program objectives and identify potential capacity development change processes that could serve as the basis for results stories. These hypotheses about potential results stories will in turn inform the data collection strategies and questionnaire design.
As the data collection proceeds, the team will analyze the data and will need to revisit the initial hypotheses to assess how well the emerging themes confirm the proposed change stories. In many cases, the original hypotheses will need to be revised or additional data collection activities will be planned to address inconsistencies and data gaps. This iterative approach leads to richer and better documented results stories (Diagram 6).

Diagram 6. Steps in the Qualitative Assessment Process: (1) Assemble documents and materials from the entire program cycle; (2) Review the program background, objectives, activities, and key informants; (3) Collect data through interviews of stakeholders; (4) Analyze data by tracing the results story using the CDRF methodology; (5) Follow up on data collection as needed to refine the results stories; (6) Understand the knowledge exchange program's results by identifying intermediate and institutional level outcomes.

The main purpose in analyzing qualitative data is to identify common words, themes or ideas. When evaluating a program, a practitioner can explore these common themes to identify ICOs or institutional capacity changes. Where possible, data collection efforts should seek evidence of outcomes that extends beyond the anecdotal reporting of stakeholders. For example, stakeholders' reports that a coalition has developed between state and non-state actors could be corroborated through a memorandum of understanding regarding the new group, regular meeting minutes, a joint product completed by the coalition members, or other types of documentation.
Qualitative data analysis can be inductive or deductive, and both of these approaches serve valuable functions in the mapping and documenting of capacity development change processes.
• Inductive analysis. Research findings emerge from the frequent, dominant or significant themes found in the narratives. The findings are not constrained by structured methodologies, models, frameworks, or other preconceived paradigms. Purposes for using this approach include:
  - To condense extensive and varied narratives into a brief, summary format
  - To establish clear links between the research objectives and the summary findings derived from the narratives, to ensure these links are both transparent (able to be demonstrated to others) and defensible (justifiable given the objectives of the research)
  - To develop a model or theory about the underlying structure of experiences or processes that are evident in the narratives
• Deductive analysis. An existing framework, such as prior research or a logic model, guides the data analysis. The main purpose for using this approach is to review or test the project model or framework.

A reviewer using this guide to evaluate and learn from capacity development interventions will rely to some extent on both of these approaches. The steps outlined are deductive in that a general model of capacity development is prescribed. Evaluators explore what outcomes took place and how, through a defined results chain. Based on desk reviews, evaluators can hypothesize about a results chain and then collect qualitative data to determine whether the hypothesis was correct. However, in cases where there is limited existing data, the approach may need to be inductive, collecting and analyzing information before the program model or theory can be developed. The deductive method can be applied to test hypotheses, while the inductive method can illuminate unintended program outcomes and/or alternative ways of describing capacity development change processes.
The basic steps and a template for implementing an inductive or deductive analysis approach follow.

Basic Template for Deductive Data Analysis
• Review the project or strategy results chain(s)
• Identify categories or groupings for the data prior to data analysis to document intermediate capacity outcomes and progress in achieving institutional capacity change objectives
• Review the qualitative data carefully
• Label statements or phrases in the qualitative data with the appropriate category or grouping based on the project or strategy results chain

Intermediate Capacity Outcomes (specific categories will be determined by the capacity development results chain)
Enter statements or phrases from the data, grouped by topic.
Example: Data from interviews with government officials exploring the results of the Korea Development Institute's Knowledge Sharing Program with the Dominican Republic
Raised awareness—increased motivation to take action
"Our mindset was changed dramatically after our visit to Korea. How important it was to promote outward development. Then we could see the future: if we do what Korea did… if we do many of these things, there is no doubt that this is the future that we will have."
"After we visited Korea and engaged in discussions with those who actually participated in the process of developing Korea's exports, it made us believe that the Dominican Republic could also do it. Seeing was totally different from just reading about it in the literature. We could now clearly see the future of the Dominican Republic. The Dominican Republic can be the Korea of the Caribbean."

Institutional Capacity Outcomes (specific categories will be determined by the capacity development results chain)
Enter statements or phrases from the data, grouped by topic.
Example (continued): Increased commitment by high level government officials
"A presidential decree was issued to hold private-public consultation meetings. Members of the meetings include the president, relevant ministers, leaders of export agencies, and private sector leaders. The president demonstrates his strong support for exports by convening these meetings."
"CEI-RD and the Dominican Ministry of Foreign Relations signed a memorandum of understanding to collaborate on strengthening international trade networks. The first achievement was inviting Dominican representatives from the public and private sectors living abroad to Santo Domingo for training."

Basic Template for Inductive Data Analysis
• Review the qualitative data carefully and fully
• From the statements, identify potential ICOs or progress in achieving capacity change objectives
• For each potential outcome, identify all of the statements that go with that theme
• Determine linkages and relationships across themes and potential outcomes
• Reduce the number of themes or outcomes where possible
• Create a results chain based on the main evidence of outcomes

Example:
Emergent theme or outcome 1: Raised awareness—increased motivation among government officials to take action to develop the export sector
Primary statements or phrases from the data:
• "Our mindset was changed dramatically after our visit to Korea. How important it was to promote outward development. Then we could see the future: if we do what Korea did… if we do many of these things, there is no doubt that this is the future that we will have."
• "After we visited Korea and engaged in discussions with those who actually participated in the process of developing Korea's exports, it made us believe that the Dominican Republic could also do it. Seeing was totally different from just reading about it in the literature. We could now clearly see the future of the Dominican Republic. The Dominican Republic can be the Korea of the Caribbean."
(Additional themes or outcomes, with their supporting statements, are added as they emerge from the data.)
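Teams that capture interview narratives electronically can support, though not replace, this kind of coding with a small script. The sketch below, which uses hypothetical statements and keyword lists rather than material from an actual study, tags each statement with candidate ICO categories in the spirit of the deductive template; a reviewer would still confirm or reassign every label by reading the statement in context.

Illustrative sketch (Python):

# Sketch: first-pass deductive coding of interview statements against
# standard ICO categories. Keywords and statements are illustrative only;
# an evaluator reviews and corrects every assigned label.
ICO_KEYWORDS = {
    "Raised awareness": ["mindset", "believe", "see the future", "motivat"],
    "Enhanced knowledge and skills": ["learned", "training", "skills"],
    "Strengthened coalitions": ["coalition", "association", "common agenda"],
    "Enhanced networks": ["network", "collaborat"],
    "New implementation know-how": ["action plan", "strategy", "pilot"],
}

statements = [
    "Our mindset was changed dramatically after our visit to Korea.",
    "An action plan was developed and it has led to several follow-up actions.",
    "The new industry association brought public and private actors together.",
]

def candidate_codes(statement):
    """Return every ICO category whose keywords appear in the statement."""
    text = statement.lower()
    return [ico for ico, words in ICO_KEYWORDS.items()
            if any(w in text for w in words)] or ["Unclassified"]

for s in statements:
    print(candidate_codes(s), "<-", s)

Keyword matching only surfaces candidates; the analytical judgment about which theme a statement truly supports, and whether it corroborates the hypothesized results chain, remains with the reviewer.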
Guidance Note 16
Exploring Opportunities for Quantitative Analysis

Appropriate application of quantitative methods to representative samples of projects or activities enables evaluators to generalize findings about capacity development outcomes to the broader population of activities targeting relevant capacity development change objectives across projects, sectors and countries. Quantitative techniques tend to focus on survey data, test scores, existing variables in administrative records, or other data sources that typically allow for larger samples than those examined strictly through a qualitative approach (see Guidance Notes 13–15). This guide facilitates the analysis of program or strategy effects at two levels in the results chain: the shorter term ICOs and the longer term progress achieved toward institutional capacity change objectives (see Guidance Note 1). However, key considerations should factor into evaluators' decisions about whether and when to apply quantitative methods:
• Expertise. Quantitative data are analyzed using statistics. These might be descriptive (to describe and summarize data) and/or inferential (to predict a range of values for a variable in a population). In either case, any practitioners or evaluators who wish to apply statistical techniques for analyzing data to identify capacity development results should have training and expertise in the methods applied. The guidance provided in this note is intended to highlight opportunities for quantitative analysis rather than to provide how-to instruction for data analysis. Selected references are listed at the end for practitioners who are interested in learning more about specific methodological approaches.
• Available data for sampling. Sufficient information about capacity development interventions and their beneficiaries must be available in a standard or systematic format to support a rigorous sampling process. In reality, data constraints often present substantial challenges for the study of capacity development interventions (for example, small, biased samples that do not represent the population). Program documentation will not necessarily have captured the relevant details needed to analyze progress over time in a results chain (missing data points would need to be imputed or reconstructed through retrospective data collection). In addition, drawing scientific samples of participants or beneficiaries of programs depends on the reliability and quality of administrative databases (that is, whether or not data are updated regularly). Obtaining comparison groups or using counterfactual models of evaluation could be difficult in cases where there are limited details and contact information available for individuals who did not participate in the program (in other words, for identifying counterparts who did not participate in the project but were similar to those who did).
• Timeframe. A study that produces scientifically generalizable conclusions about the outcomes of capacity development interventions is likely to be a longer term research project, perhaps running two or more years. In most cases, such research is designed to assess impact, which means enough time must elapse for impact to be observable. Other factors affecting the timeframe include:
  - Survey designs and instruments typically need to be pretested (and sometimes designed after extensive consultations or focus group discussions with the target population).
  - Contact information for potential respondents could be insufficient, requiring additional investigation (that is, face-to-face visits, phone calls, etc.).
  - Identifying and hiring local consultants for survey administration, where needed, can be time consuming.

Quantitative analysis aims to determine the relationship between an independent variable and a dependent or outcome variable in a population. Quantitative research designs are either descriptive (measuring subjects once) or experimental (subjects measured before and after a treatment). A descriptive study can establish associations between variables while controlling for multiple factors, whereas an experiment with randomized trials can establish causality.
Pure experimental research designs are rarely feasible, given that evaluators usually have little control over program design. Instead, evaluators are often faced with assessing a project or program during implementation or after completion, with no or limited ability to influence either the interventions or the assignment of individuals to treatment and control groups. A more realistic approach therefore is to examine opportunities for a "natural experiment," a term used in the evaluation literature to refer to evaluation designs that draw on naturally occurring bases for comparison. Using this natural experiment approach, two possible scenarios for retrospective quantitative analysis of capacity development results are described below. Additional descriptions and examples of study designs are available in the references listed at the end of this guidance note.

Tracer study of beneficiaries
If participant information has been collected and regularly updated, tracer studies of a sample of program beneficiaries and their institution(s) could be conducted. This would provide insights about the impact of a capacity intervention on beneficiaries directly, the factors that facilitated the achievement of outcomes, and the barriers encountered. Ideally, this approach would include a quantitative survey followed by qualitative focus group discussions to lend texture and contextual understanding to the survey findings.
For a type of capacity development intervention practiced broadly, a tracer study of a stratified random sample of beneficiaries from the various disciplines and countries in which the targeted intervention had been conducted would provide findings about the effectiveness of the intervention across sectors. Lessons could be derived about the effectiveness of the intervention in varying conditions and according to different contexts. In particular, this approach could provide strong empirical evidence about the types of ICOs that need to be in place for targeted institutional capacity changes to be achieved and sustained.
The cost of a tracer study survey will depend on the countries in which the survey is conducted and the means by which it is administered. It is usually more costly to hire local consultants in the field to administer surveys in person (both for the interviewer's time and the costs to conduct interviews and enter data) than to send emails with a weblink to an electronic survey where data are entered with minimal error. However, the method chosen can influence response rates, which relate to the study's validity.

Quasi-experimental design
For a project that is under implementation or completed, it is no longer possible to conduct a randomized control trial with randomly selected treatment and control groups. However, it is possible to conduct a quasi-experimental evaluation by creating a comparison group that is similar to the group that received the intervention—provided the necessary data exist. If records exist of potential participants of a capacity development intervention (such as a knowledge exchange), it might be possible to create points of comparison—organizations or groups of individuals—that could be matched with similar organizations or groups that did not benefit from the program. Such a study would shed light on what ICOs and institutional capacity changes were achieved by specific interventions and under what conditions.
For this type of study, data are collected only after the project has been implemented, for both a group of project participants or direct beneficiaries and a separate comparable group that did not participate or directly benefit. The comparison group is constructed through statistical analyses to be as equivalent as possible to the treatment group. It can be challenging to identify a strong comparison group: it usually requires data on both the population of beneficiaries and the non-beneficiaries from which the comparison group is identified, based on sophisticated statistical analyses such as propensity score matching. It cannot be assumed that the necessary data exist.
Establishing a quasi-experimental comparison group study utilizing rigorous methodology could potentially provide strong evidence of results at both the intermediate capacity outcome and institutional capacity levels, and the findings would provide important lessons about what works in capacity development and what common pitfalls should be avoided.
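Propensity score matching, mentioned above as one way to construct a comparison group, can be illustrated with a minimal sketch. The example below assumes a hypothetical flat file, participants.csv, that records a treatment flag, a few background covariates, and an outcome score for participants and non-participants alike; the variable names are invented for illustration and a statistician should guide any real application.

Illustrative sketch (Python):

# Sketch: nearest-neighbor propensity score matching for a retrospective
# comparison-group study. Assumes a hypothetical file "participants.csv" with
# columns: treated (0/1), age, years_in_service, org_size, outcome_score.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("participants.csv")
covariates = ["age", "years_in_service", "org_size"]

# 1. Estimate the propensity score: the probability of participating,
#    given observed background characteristics.
model = LogisticRegression(max_iter=1000)
model.fit(df[covariates], df["treated"])
df["pscore"] = model.predict_proba(df[covariates])[:, 1]

treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]

# 2. For each participant, find the non-participant with the closest
#    propensity score (1-to-1 nearest-neighbor matching, with replacement).
matches = []
for _, row in treated.iterrows():
    nearest = (control["pscore"] - row["pscore"]).abs().idxmin()
    matches.append({"treated_outcome": row["outcome_score"],
                    "control_outcome": control.loc[nearest, "outcome_score"]})

matched = pd.DataFrame(matches)

# 3. The average difference in outcomes across matched pairs gives a simple
#    estimate of the program effect on participants.
effect = (matched["treated_outcome"] - matched["control_outcome"]).mean()
print(f"Estimated effect on participants: {effect:.2f}")

Before interpreting such a difference, an evaluator would also check covariate balance and common support between the matched groups, in line with the note's caution that rigorous methodology and adequate data are prerequisites for this design.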
Regardless of the quantitative study design established to identify the outcomes and impact of a capacity development intervention, practitioners should bear in mind the value of using a mixed-method approach where possible. Qualitative approaches—such as case studies or interviews with key informants—can help to triangulate findings and add contextual details. A rigorous quantitative study with qualitative components could provide a valuable in-depth understanding of the capacity development change processes needed to achieve a targeted development goal. Hence, most impact evaluations use mixed-methods approaches, which combine quantitative and qualitative analysis with field visits.

Resources for Designing and Conducting Quantitative Studies
Bamberger, M. and H. White. 2007. "Using Strong Evaluation Designs in Developing Countries: Experience and Challenges." Journal of Multidisciplinary Evaluation 4(8): 58-73.
Khandker, S., G. Koolwal, and H. Samad. 2009. Handbook on Impact Evaluation: Quantitative Methods and Practices (World Bank Training Series). Washington, DC: The World Bank.
Morra Imas, L. and R. Rist. 2009. The Road to Results: Designing and Conducting Effective Development Evaluations. Washington, DC: The World Bank.
Rossi, P., M. Lipsey, and H. Freeman. 2004. Evaluation: A Systematic Approach. 7th ed. London: Sage.
Scriven, M. 2007. Key Evaluation Checklist. Kalamazoo, MI: Western Michigan University, Evaluation Center. See http://www.wmich.edu/evalctr/archive_checklists/kec_feb07.pdf
Wholey, J., H. Hatry, and K. Newcomer. 2010. Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass.
World Bank. 2006. Conducting Quality Impact Evaluations under Budget, Time, and Data Constraints. Washington, DC: Independent Evaluation Group.
World Bank. 2004. Monitoring and Evaluation: Some Tools, Methods, and Approaches. Washington, DC: Independent Evaluation Group.

Guidance Note 17
Survey Data

As evaluators determine the appropriate study design for a retrospective assessment of capacity development results, they are likely to rely on one or more surveys as a data collection methodology. Surveys are popular because they seem like a straightforward method to collect data quickly; however, the data are only meaningful to the extent that the survey is conducted properly.
Surveys are used to collect self-reported information on individuals' knowledge, attitudes, opinions, experiences and behavior from a sample of the population or targeted group. In fact, sometimes asking people is the best way to get information, such as in determining attitude shifts. For instance, surveys are used to gauge individuals' opinions about issues after awareness-raising campaigns. Likewise, questionnaires are used to determine participants' use of knowledge and skills after training programs.
In addition to providing information on unobservable or intangible internal attitudes and beliefs, surveys can be used to assess institutional changes. For example, surveys can reveal how citizens rate their experiences with public services after capacity building interventions, to determine whether improvements in public services took place as an indicator of project impact. Surveys can also be used to measure intermediate outcomes of an intervention: administering questionnaires to government employees who were trained as part of a program, for example, would provide insight on whether participants acquired new knowledge and skills and whether or not they were able to apply what they learned in their ministries.
A well designed survey administered to a properly selected sample can produce representative information about the population at large. This requires a mastery of methodological and statistical expertise that goes beyond the scope of this guidance note. The following sections therefore present the key concepts and basic fundamentals of survey design and administration necessary for supervising the survey process.
Survey design
Developing an effective survey is a multi-stage process that requires pre-testing. Assuming that the survey designer already knows what information needs to be collected from a targeted population, the first step is designing the survey instrument. Diagram 7 illustrates the steps in the survey design process.

Diagram 7. Steps in the Survey Design Process: Focus Group → Design Survey → Pre-test Survey → Revise Survey → Pre-test Survey → Finalize Survey

Ideally, the initial draft and question wording would be developed with input from members of the targeted population through focus group discussions and follow-up pre-testing. Generally speaking, survey instruments should be simple, clear, easy and as short as possible, while long enough to collect the necessary information. Focus groups can be used to understand the content to be explored and to identify context-specific language that resonates with the target population. In other words, it is important that the question wording has the same meaning for all respondents answering the survey.
For example, if a program leader is interested in learning about the outcomes of a public service capacity building project, they may start by holding a focus group with a select number of public servants who participated in the program to learn about whether or not the program has had an effect on their motivation and work. The focus group can provide ideas on what kinds of immediate effects the program has had, which could then be tested in the population at large in a ministry-wide survey. Thus, in this example, the focus group would help to inform the content of the survey as well as the language used, to ensure that questions are worded in a comprehensible way that all respondents will understand consistently.
Once the survey instrument is drafted, it is important to pre-test it with selected individuals from the target population to ensure that the language is correct and consistently understood across respondents. Based on the feedback provided by respondents, the instrument is revised again and, depending on the extent of the revisions, it should be pre-tested again before the survey is finally launched. Pre-testing the questionnaire is critical even if it is not possible to design a survey informed by a focus group discussion.

Sampling
In evaluations where the target population is too large to survey in full, it is possible to select a sample that is representative of the population at large. The sample is a subset of the full population (or the total number of units).
The first step is determining the sample size. The sample size needed depends on the population size and the level of accuracy required for the survey. The level of accuracy is based on two statistics: the confidence level and the sampling error. How big a sample is needed depends on how accurate the survey needs to be and the actual population size. The proportion of the population required for the sample decreases as the population size increases; in other words, larger populations require proportionally smaller samples. For example, national public opinion polls in the United States conduct surveys on a sample of approximately 1,000 people and are representative of the entire citizenry.
There are a variety of sampling methods, both random and non-random. While random samples allow generalizations to be drawn about the population, non-random samples do not, because they are susceptible to bias.
Random samples include simple random samples, stratified random samples and cluster samples. A simple random sample is one where units are selected randomly from a complete list of the population until the sample size is met. A stratified random sample ensures that specific groups are represented in the sample: the population is divided into groups and random selections are then made from each group to make sure that each group is represented in the sample. For instance, a public opinion poll can be stratified by ethnicity to make sure all ethnicities are represented in the sample, and analyses can then be conducted by ethnicity. A cluster sample uses a complete list of clusters from within the population and randomly selects clusters from which sample units are then randomly selected. For example, a survey can use a cluster sample where zip codes are selected randomly and interviews are then conducted within the selected zip codes.
Non-random samples include quota, accidental, snowball, and convenience sampling. Quota samples select a targeted number of respondents within a category. Accidental samples include participants by accident, for example surveys in shopping malls. Snowball samples are often used in interviews, where interviewees recommend other participants. Convenience samples are selections based on the researcher's convenience, such as student samples used for studies conducted by university professors.
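The sample-size and stratification choices described above can be made concrete with a short calculation. The sketch below uses illustrative numbers, not figures from this guide: it computes an approximate sample size for estimating a proportion at a 95 percent confidence level with a 5 percent margin of error, applies a finite-population correction, and then draws a stratified random sample from a hypothetical list of trainees grouped by region.

Illustrative sketch (Python):

# Sketch: approximate sample size for a proportion, plus a stratified random
# draw. Population figures and strata names are illustrative only.
import math
import random

def required_sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Sample size for a proportion at the given confidence level (z) and
    margin of error, with a finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)   # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))      # corrected for population

print(required_sample_size(2000))    # roughly 323
print(required_sample_size(20000))   # roughly 377: a far smaller share of the population

# Stratified random sample: draw from each region in proportion to its size.
random.seed(1)
population = {"North": [f"N{i}" for i in range(800)],
              "South": [f"S{i}" for i in range(700)],
              "East":  [f"E{i}" for i in range(500)]}
total = sum(len(v) for v in population.values())
target = required_sample_size(total)

sample = []
for region, members in population.items():
    k = round(target * len(members) / total)   # proportional allocation per stratum
    sample.extend(random.sample(members, k))

print(len(sample), "units sampled across", len(population), "strata")

The two printed sample sizes illustrate the point made above: the larger population needs only a modestly larger sample, so the required proportion falls as the population grows.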
Survey administration
Conducting a survey for evaluation in developing countries often entails hiring a local consultant or firm to administer the survey. The local practitioner is helpful because they understand the country context and can provide guidance on the best method for administering the survey.
However, hiring, training, and supervising consultants in the field from headquarters can pose certain challenges. If the surveys are to be delivered face to face, it is integral that the interviewers receive adequate training on the content and the survey instrument. To enhance data validity, it is useful to inform them that a random sample of interviews will be verified using the contact information they provide, which discourages any potential fabrication of data. Also, the consultant should be provided with an introduction on letterhead or by email; if the consultant drafts the introduction, it should be carefully reviewed by the project team to avoid any reputational risks. The country office should be informed before any clients are contacted.
Surveys can be administered through a variety of methods: face-to-face, Internet, email, telephone, fax, and mail. The choice of method for administering international surveys should take into consideration the infrastructure and capacity of the country in which the survey is being conducted and the target population of the survey. For instance, some populations may only be reached in villages accessible by foot. Alternatively, high-level government officials are usually not directly accessible, and contacting them about the survey requires going through proper channels.
In some countries, face-to-face surveys are most convenient because of respondents' difficulties in gaining internet access due to a lack of technology infrastructure or of electricity for powering computers. Face-to-face surveys may entail an interviewer asking respondents questions directly from the survey and recording their responses, as done in in-person interviews. (Techniques for optimizing face-to-face interview surveys are covered in the next section.) However, consultants hired to collect survey data may also give the written survey in person to the respondent, let the respondent complete it on his or her own, and collect the completed survey.
Emailing survey questionnaires sometimes works better than internet surveys, for various reasons. In some countries, it is unlikely that respondents will complete a survey on-line because they may have to pay to access the survey, such as at an internet café. Internet surveys may also be difficult to complete when electrical power outages cause unpredictable computer crashes. These challenges make it difficult and expensive for respondents to complete on-line surveys because they would have to spend their valuable internet time on them. Additionally, some standard internet survey packages do not allow users to save partially completed surveys, preventing respondents from working on the questionnaire at various points in time, which is problematic if they need to research their answers. With emailed surveys, however, it is important to keep in mind that an incorrectly spelled email address will result in a failed delivery.
Internet surveys are the most efficient because the data are automatically entered into a database when the respondent answers the questions. Hence, it is easy to monitor responses and to know who has and has not responded.

Non-response and non-coverage
The issue of survey non-response is a key challenge to overcome. Survey non-response occurs when a sampled individual does not respond to the request to be surveyed; in other words, the individual ignores the request to complete the survey. A separate problem is that of non-coverage, which is the failure to contact all members of the target population to request their participation in the survey.
Non-coverage refers to not being able to contact, or deliver a request for participation to, a member of the target population. Some degree of non-coverage is often difficult to avoid and is therefore expected in professional surveys. For example, telephone surveys that target the general population and use random digit dialing (a commonly used data collection method for telephone surveys) necessarily cannot cover respondents or households without a phone. Even when a complete list of the population of potential respondents exists, and when this list contains contact information, non-coverage can still occur when the survey administrator is unable to reach the target respondent because of:
• Erroneous contact information.
• Delivery errors due to transmission failures (e.g., technology issues with web, email, or Internet; problems with postal mail).
• Interception by others (such as administrative staff), followed by a failure to forward the request to the sampled person.
• The sampled person is not present or is inaccessible.
• The sampled person does not read or understand the survey request.
The consequences of both non-coverage and non-response can be serious if those who cannot be contacted or who refuse to respond differ in a systematic way from those who were able to be contacted and do respond. The consequences can include biased results.

Maximizing survey response rates
The objectives are to maximize the response rate by minimizing the costs of responding, to maximize the rewards for doing so, and to establish trust that those rewards will be delivered. Efforts to lower non-response can be thought of in terms of these principles:
• Opportunity costs: From the perspective of the respondent, the benefits of participating must outweigh the costs at the time participation is requested.
• Social exchange: Create an expectation of reciprocity in social interactions, such as by using small gestures or by offering incentives up front.
• Topic saliency: The survey should be (made) relevant to the participant.
• Interviewer competence: Effectively tailor the request to the sampled person to increase that person's propensity to cooperate.

Consider the following suggestions for maximizing response rates at various stages in designing and implementing surveys.

Contacting potential respondents
✓ Call more times, and vary the timing of calls (more calls and at different times are better)
✓ Lengthen the data collection period (longer is better)
✓ Allow for significant interviewer workload (careful personal tailoring of a request requires more time)
✓ Cultivate interviewer observations (face-to-face contact allows better tailoring of requests to participate)

Influencing potential respondents' decisions to participate
✓ Engage a trusted organization for sponsorship (World Bank, government and membership organizations are beneficial)
✓ Prenotify participants (advance letters tend to be beneficial)
✓ Provide incentives (advance cash incentives outperform in-kind or promised incentives)
✓ Reduce participant burden (shorter surveys are better, but perception of length matters)
✓ Attend to interviewer behavior (flexible and tailored introductions appear to work better)
✓ Match the interviewer to the sampled person (match by sex or ethnicity, for example; in general, older women tend to obtain the highest response rates)
A potential disadvan- ✓✓ Switch interviewers (replacing an initially tage to closed-ended questions is that the unsuccessful interviewer might help) options provided may not capture exactly ✓✓ Switch interview modes (mixed mode a respondent’s answer or that having to data collection designs can be more choose from preselected options influences efficient than single mode data respondents’ answers. However, if the collection) questionnaire is well designed, the chances ✓✓ Use follow-up procedures (send of this will be minimized. reminders or persuasion letters that Analysis of qualitative data is generally address expressed concerns) more time consuming to get basic results ✓✓ Use two-phase sampling (follow-up than obtaining summary statistics from sampling of nonrespondents to assess quantitative data (qualitative data analysis bias) is addressed in Guidance Note 15). Typi- ✓✓ Make postsurvey adjustments (weighting cally, both types of questions are used for of data if response bias is known) evaluation surveys with the majority being closed-ended items and a few open-ended Organizational surveys items to dig deeper on key issues where A key to gaining higher response rates in narrative responses can help to reveal organizational surveys is targeted respon- insights. Closed-ended items have a stan- dent selection methods (that is, identifying dard set of response options from which person with relevant knowledge instead respondents select an answer. For example, of generically addressing chief executive a closed-ended question about the useful- which requires getting past gatekeepers ness of a report may be followed up with such as administrative staff) and pre-notifi- open-ended questions asking about what cation (to raise awareness, or to identify the aspects the respondents found most and right respondent). Other factors associ- least relevant. ated with high response rates include an authoritative sponsor (government is better Example of closed-ended item followed than universities), a legal mandate, follow- up with open-ended item up activities (with reported effectiveness Thinking about the recent workshop you between 0 and 32 percentage points), con- attended entitled, “M&E for Results,” current use of multiple response modes, please rate its overall relevance to your and ensuring that the survey topic is viewed work organization. as relevant to the organization’s staff. ❑❑ Very relevant ❑❑ Relevant Survey questionnaire wording ❑❑ Somewhat relevant ❑❑ Irrelevant Question types: open and closed ❑❑ Not relevant at all ended questions Survey questionnaires including two What aspects of the workshop did you basic types of questions: open-ended find most relevant for your work organiza- and closed-ended items. Open-ended tion? _________________________________ items are questions where the respondent ______________________________________ answers in their own words. Open-ended What aspects of the workshop did you items collect qualitative data in the respon- find least relevant for your work organiza- dents’ own words. The advantage to this is tion? _________________________________ that there is no chance that responses are ______________________________________ influenced by having to select from preset options. Closed-ended questions obtain Closed-ended questions can elicit vari- standardized responses which facilitate ous kinds of data including nominal data, data processing and production of basic ordinal, or count data. Nominal variables 70 cannot be rank ordered. 
The response options that are associated with an options designed to collect nominal data ascending or descending order. Note that are usually descriptive categories that do there is no requirement of equal-sized not have values associated with them. For intervals between the response options of example, a question asking respondents ordinal dependent variables, just a simple the type of organization in which they work requirement that moving from one value is purely descriptive and cannot be ranked. to the next is in some way an ascent or In other words, working for a university is descent. not twice as valuable as working for a donor For example, a question asking agency. respondents to report highest level of education obtained is ordinal (such as high Example of closed-ended question for school, college, graduate school at masters nominal data level, graduate school at doctoral level, At the time of the workshop, which of the etc.) because higher values are meaningful following best characterizes the organiza- and indicate higher levels of education. tion in which you worked? Likewise, questions asking respondents to ❑❑ University/Research Institute indicate their agreement with a statement, ❑❑ Private sector or satisfaction with a service or product ❑❑ Donor Agency often use a rating scale. ❑❑ National/Central Government For instance, a question asking a ❑❑ Provincial/Regional Government respondent to rate the extent to which they ❑❑ Local Government are satisfied with a training session they ❑❑ Non-Governmental Organization attended may have a response option scale ❑❑ Other ranging from: not satisfied at all, somewhat dissatisfied, moderately satisfied, satisfied On the other hand, ordinal variables and highly satisfied. In this example, higher can be rank ordered. Questions collecting values indicate greater levels of satisfaction ordinal data have three (or more) response and lower values indicate less satisfaction. Example of closed-ended question for ordinal data Tell us about your experience with the [content/instructional design/community that was developed/etc.] in this [workshop/online course/etc.]. Please indicate the extent to which you agree with the following statements. Strongly Disagree Moderately Agree Strongly disagree agree agree The objectives of the ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ course were clearly stated. The course work was ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ at an appropriate level for me. The course work provided adequate ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ exploration of the content and topics. The scope of the course was ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ appropriate for the time allotted. 71 Such response options produce ordinal satisfied, and highly satisfied. However, level data which provide opportunities to the potential disadvantage is that this calculate descriptive summary statistics and increases the likelihood of question non- conduct statistical analyses to explore the response which reduces the overall sample determinants of satisfaction with training, size to the number who answered the for example. question. There are statistical techniques to impute missing data, however, they Middle Category require advanced statistical methods and In designing the question response additional data sets on similar populations options, one has to decide whether or to estimate the missing information, which not to include a middle category. In the may or may not exist. example above, the “moderately satisfied” option is the middle category. 
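Because ordinal response options map to ascending numeric codes, basic summaries are straightforward to produce. The sketch below tabulates hypothetical responses to one agreement item of the kind shown above; the data are invented for illustration, and the coding of the scale to 1 through 5 is an assumption of the example.

Illustrative sketch (Python):

# Sketch: summarizing closed-ended ordinal (Likert-type) responses.
# Responses are invented; codes 1-5 follow the ascending order of the scale.
from collections import Counter

SCALE = {"Strongly disagree": 1, "Disagree": 2, "Moderately agree": 3,
         "Agree": 4, "Strongly agree": 5}

responses = ["Agree", "Strongly agree", "Moderately agree", "Agree",
             "Disagree", "Agree", "Strongly agree", "Moderately agree"]

codes = sorted(SCALE[r] for r in responses)
counts = Counter(responses)

# Frequency distribution, printed in scale order.
for label in SCALE:
    share = counts.get(label, 0) / len(responses)
    print(f"{label:<18} {counts.get(label, 0):>2}  ({share:.0%})")

# The median is a safer summary than the mean for ordinal data, since the
# intervals between response options are not necessarily equal.
if len(codes) % 2:
    median = codes[len(codes) // 2]
else:
    median = (codes[len(codes) // 2 - 1] + codes[len(codes) // 2]) / 2
print("Median response code:", median)

Reporting the full frequency distribution alongside the median keeps the summary faithful to the ordinal nature of the scale rather than treating the codes as if they were evenly spaced.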
Middle category
In designing the response options, one has to decide whether or not to include a middle category. In the satisfaction example above, the "moderately satisfied" option is the middle category. It provides respondents who do not have extreme opinions in one direction or the other an option that captures their moderate views or ambivalence (that is, an option other than not answering the question). The disadvantage of using a middle category is that you may not get a decisive response in one direction or the other if most people choose it. The alternative is to forgo the middle category so that respondents are forced to take a position. In the example above, that would mean leaving out the moderately satisfied option and having respondents choose among not satisfied at all, somewhat dissatisfied, satisfied, and highly satisfied. The potential disadvantage is that this increases the likelihood of question nonresponse, which reduces the sample for that item to the number who answered the question. There are statistical techniques to impute missing data; however, they require advanced statistical methods and additional data sets on similar populations to estimate the missing information, which may or may not exist.

Example of a closed-ended question with a middle category
To what extent do you agree with the following statements about your experience in the field visit knowledge exchange program?
(Response options for each statement: Strongly disagree, Disagree, Moderately agree, Agree, Strongly agree)
• The topics covered in the seminar were relevant to my work.
• The technical discussions were at the appropriate level.
• The networking opportunities provided were valuable to me.
• The visit was of the right length to cover the topics thoroughly.
• There was too much time allotted for sight-seeing.
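Returning to the nonresponse point above: before deciding whether dropping the middle category (or any other design choice) costs too much sample, it helps to quantify item nonresponse. The sketch below, in Python with pandas, does this for one item; the data and column name are hypothetical and illustrative only.

Illustrative sketch: checking item nonresponse (Python)

# Minimal sketch: how much sample is lost to item nonresponse?
# The ratings are hypothetical; NaN marks a skipped question.
import pandas as pd
import numpy as np

satisfaction = pd.Series(
    [4, 5, np.nan, 3, np.nan, 5, 4, np.nan, 2, 5],
    name="satisfaction_with_training",
)

n_total = len(satisfaction)
n_answered = int(satisfaction.notna().sum())
share_missing = 100 * (n_total - n_answered) / n_total
print(f"Answered: {n_answered} of {n_total} ({share_missing:.0f}% item nonresponse)")

# A complete-case analysis simply drops the skipped answers.
print("Mean among those who answered:", satisfaction.dropna().mean())

If item nonresponse is concentrated among particular groups of respondents, the weighting ideas sketched earlier in this note, or the more advanced imputation techniques mentioned above, become relevant.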
For example, did the awareness raising campaign on citizens’ the intervention have a positive influence, rights to participate in public budgeting negative influence, or no influence in your between January 1 and December 31, community? This includes the possibility of 2011. a project having a negative influence that Example of balanced questions We are interested in finding out if your participation in the program resulted in any changes. Did the program you participated in have a positive influence, negative influence or no influence at all in the following areas? Strong Negative No influence Positive Strong negative influence influence positive influence influence Legislation or ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ regulations Research ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ methodology Publications ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ Teaching ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ materials for courses Work practices ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ in your organization Community- ❑❑ ❑❑ ❑❑ ❑❑ ❑❑ based initiatives 73 is often overlooked. A classic mistake by Examples to avoid question order effects project teams is developing a survey ques- Please indicate your level of satisfaction tion focused on discovering how clients with the conference as a whole. benefited from the project and not allowing ❑❑ Very satisfied for the possibility that the respondent may ❑❑ Satisfied have experienced negative consequences. ❑❑ Somewhat satisfied ❑❑ Dissatisfied Question sequencing ❑❑ Very dissatisfied Responses can be influenced by the order Please indicate your level of satisfaction in which questions are asked. For instance, with the networking session. asking respondents about something spe- ❑❑ Very satisfied cific and then something more general that ❑❑ Satisfied includes the initial specific item can lead ❑❑ Somewhat satisfied respondents to exclude the specific item ❑❑ Dissatisfied in their response to the general question. ❑❑ Very dissatisfied Respondents assume since they already answered about the specific item that the Annex 3 presents an example of a survey general item must be asking about every- questionnaire. thing else but that. For instance, asking a question about satisfaction with a particular event or activ- ity and then asking about satisfaction with the entire program may lead respondents to exclude their attitudes about the particu- lar event from their overall assessment of the program. It is better to ask the general question first and then the specific ques- tions afterwards. Resources for Survey Data Burgess, T. 2001. Guide to the Design of Questionnaires. United Kingdom: University of Leeds. Dillman, D., Smyth, J., and L. Christian. 2008. Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. Hoboken, N.J.: John Wiley and Sons. Fink. A. 2002. How to Design Survey Studies. Thousand Oaks, CA: Sage Publications. Foddy, W. 1993. Constructing Questions for Interviews and Questionnaires. New York: Cambridge University Press. Fowler, F. Jr. 1995. Improving Survey Questions: Design and Evaluation. Applied Social Research Series (38). Thousand Oaks, CA: Sage Publications. Miller, T. 1994. “Designing and Conducting Surveys.” In Wholey, J., Hatry, H, and K. Newcomer, eds. Handbook of Practical Program Evaluation. San Franciso: Jossey Bass. Rea, L. and R. Parker. 2005. Designing and Conducting Survey Research: A Comprehensive Guide. 3rd ed. San Francisco: Jossey Bass. 74 ANNEX 1 References Bamberger, M. and H. White. 2007. “Using Fowler, F. Jr. 1995. 
ANNEX 1
References

Bamberger, M. and H. White. 2007. "Using Strong Evaluation Designs in Developing Countries: Experience and Challenges." Journal of Multidisciplinary Evaluation 4(8): 58-73.
Bamberger, M., J. Rugh, and L. Mabry. 2006. RealWorld Evaluation: Working Under Budget, Time, and Political Constraints. Thousand Oaks, CA: Sage Publications.
Bamberger, M., V. Rao, and M. Woolcock. 2010. Using Mixed Methods in Monitoring and Evaluation: Experiences from International Development. Policy Research Working Paper. Washington, DC: World Bank.
Billson, J. 2004. The Power of Focus Groups: A Training Manual for Social, Policy, and Market Research with a Focus on International Development. Barrington, RI: Skywood Press.
Burgess, T. 2001. Guide to the Design of Questionnaires. Leeds, UK: University of Leeds.
Cresswell, J. and V. Plano Clark. 2007. Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications.
Dillman, D., J. Smyth, and L. Christian. 2008. Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. Hoboken, NJ: John Wiley and Sons.
Fink, A. 2002. How to Design Survey Studies. Thousand Oaks, CA: Sage Publications.
Foddy, W. 1993. Constructing Questions for Interviews and Questionnaires. New York: Cambridge University Press.
Fowler, F. Jr. 1995. Improving Survey Questions: Design and Evaluation. Applied Social Research Series (38). Thousand Oaks, CA: Sage Publications.
Hennick, M. 2007. International Focus Group Research: A Handbook for the Health and Social Sciences. New York: Cambridge University Press.
Khandker, S., G. Koolwal, and H. Samad. 2009. Handbook on Impact Evaluation: Quantitative Methods and Practices (World Bank Training Series). Washington, DC: World Bank.
Korea Development Institute and World Bank (joint study). 2011. Using Knowledge Exchange for Capacity Development: What Works in Global Practice? Three case studies in assessment of knowledge exchange programs using a results-focused methodology. Washington, DC.
Krueger, R. and M. Casey. 2000. Focus Groups. 3rd ed. Thousand Oaks, CA: Sage Publications.
Ling, C. and M. Meffert. "How to Increase Response Rates: Review of Literature on Survey Nonresponse." Working paper. Washington, DC: World Bank Institute.
Miller, T. 1994. "Designing and Conducting Surveys." In Wholey, J., H. Hatry, and K. Newcomer, eds. Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass.
Morra Imas, L. and R. Rist. 2009. The Road to Results: Designing and Conducting Effective Development Evaluations. Washington, DC: World Bank.
Organisation for Economic Co-operation and Development (OECD). 2005. The Paris Declaration on Aid Effectiveness. Paris.
Organisation for Economic Co-operation and Development (OECD). 2006. Survey on Monitoring the Paris Declaration: Overview of the Results. Paris.
Otoo, S., N. Agapitova, and J. Behrens. 2009. The Capacity Development Results Framework: A Strategic and Results-Oriented Approach to Learning for Capacity Development. Washington, DC: World Bank Institute.
Patton, M. 2002. Qualitative Research and Evaluation Methods. 3rd ed. Thousand Oaks, CA: Sage Publications.
Rea, L. and R. Parker. 2005. Designing and Conducting Survey Research: A Comprehensive Guide. 3rd ed. San Francisco: Jossey-Bass.
Roberts, D., C. Ling, and N. Agapitova. 2011. Reviewing Project Results Retrospectively Using a Results-Focused Approach to Capacity Development. Washington, DC: World Bank.
Rossi, P., M. Lipsey, and H. Freeman. 2004. Evaluation: A Systematic Approach. 7th ed. London: Sage Publications.
Scriven, M. 2007. Key Evaluation Checklist. Kalamazoo, MI: Western Michigan University, Evaluation Center. See http://www.wmich.edu/evalctr/archive_checklists/kec_feb07.pdf
Taylor, P. and P. Clarke. 2008. Capacity for a Change. Sussex: Institute of Development Studies.
United States Agency for International Development, Center for Development Information and Evaluation. 1996. Conducting Key Informant Interviews. Washington, DC. Available at http://pdf.usaid.gov/pdf_docs/PNABS541.pdf
W. K. Kellogg Foundation. 2004. Evaluation Handbook. Available at http://www.wkkf.org/knowledge-center/resources/2010/W-K-Kellogg-Foundation-Evaluation-Handbook.aspx
Wholey, J., H. Hatry, and K. Newcomer. 2010. Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass.
World Bank. 2004. Monitoring and Evaluation: Some Tools, Methods, and Approaches. Washington, DC: Independent Evaluation Group.
World Bank Independent Evaluation Group (IEG). 2005. Capacity Building in Africa: An IEG Evaluation of World Bank Support. Washington, DC.
World Bank Independent Evaluation Group (IEG). 2006. Annual Review of Development Effectiveness. Washington, DC.
World Bank Independent Evaluation Group (IEG). 2006. Conducting Quality Impact Evaluations under Budget, Time, and Data Constraints. Washington, DC.
World Bank Independent Evaluation Group (IEG). 2008. Using Training to Build Capacity for Development: An IEG Evaluation of the World Bank's Project-Based and WBI Training. Washington, DC.

ANNEX 2
Examples of Attributes for Intermediate Capacity Outcomes

Table 18. Examples of Attributes for Types of Intermediate Capacity Outcomes
(For each type of ICO, the specific attributes to measure are listed with a description of each.)

Raised Awareness
• Attitude: Beliefs and values about the outcome of planned behavior.
• Understanding: Perceptions of the benefits and constraints of a behavior; feelings and emotions or affect towards a behavior.
• Confidence: An individual's perceived behavioral control or confidence over the resources and skills needed to perform the behavior.
• Strong Motivation: The incentive to perform the behavioral change.

Enhanced Knowledge and Skills
• Acquisition of Knowledge: New knowledge or skills that lead to broader institutional change.
• Application of New Knowledge or Skills: Demonstrated use of new knowledge and skills by change agents in the process of institutional change.

Improved Consensus and Teamwork
• Improved Agreement: An increased level of agreement among participants resulting from a consensus decision-making process.
• Increased Contributions: Input by participants to a shared proposal for a decision that meets the concerns of all group members as much as possible.
• Improved Cohesion: Higher sharing of a common set of values to lead to the best possible decision for the group and all of its members, rather than just having participants compete for personal preferences.
• Improved Inclusion: Involvement of as many stakeholders as possible in the consensus decision-making process.
• Improved Communication: Improved exchange of clear and accurate information and the ability to clarify or acknowledge the receipt of information.
• Improved Group Decision Making or Planning: Improved ability of a team to gather and integrate information, use logical and sound judgment, identify possible alternatives, select the best solution, and evaluate consequences.
• Improved Adaptability or Flexibility: Improved ability to use information and adjust strategies through compensatory behavior and reallocation of team resources.
Table 18 (continued). Examples of Attributes for Types of Intermediate Capacity Outcomes

Strengthened Coalitions
• Shared Purpose and Vision: A common agenda reflecting why the coalition exists and what the desired results are.
• Leadership: Strong leaders, as evidenced by their ability to set a clear direction, keep the coalition moving forward, resolve conflict, ensure trust and accountability from members, and keep the coalition focused on its vision.
• Transparent Decision-Making Process: Clarity among all parties about the model chosen and the commitment to implementing the process.
• Cultural Capacities: Evidence that the coalition exhibits trust, respect for dissent, and sensitivity to internal and external power differentials.
• Membership Diversity and Participation: The broad-based ability to produce diverse resources and expand the reach to a wider audience; members believe they are doing meaningful work, leading to sustained membership.
• Frequent and Productive Communication: The ability to keep members up to date on developments or activities and to communicate clearly to motivate members to action.
• Evaluating Success: Measures of the quality and impact of a coalition's work; the ability to measure progress toward goals and increase strategic capacity.

Enhanced Networks
• Unity of Purpose: Uniting around a compelling idea, with a shared belief that members of a network can achieve more together than they can alone.
• Network Connectivity: The strength of the relationships between and among network members, reflected in how well network members are connected to one another and how well they communicate with one another.
• Value Added: The extent to which a network adds value for its members, for clients served by the network, and in the broader environment.

Increased Implementation Know-How
• Close Engagement and Dialogue: Coordinated action that involves the participation of key stakeholders in all stages of the learning-by-doing process.
• Designated Responsibility for Coordination: The use of a single qualified local institution to coordinate the work with other local participants.

ANNEX 3
Example of Questionnaire for WBI Participants

WORLD BANK INSTITUTE
Unleashing the Power of Knowledge to Enable a World Free of Poverty

World Bank Institute (WBI) Questionnaire

Instructions
WBI had the pleasure of having you participate in the following learning activity:
Title: _____________________________________________________________
Held from: ________________________ to: ________________________
In: _______________________________________________________________

Getting your opinion of the above-mentioned activity, now that you have had time to reflect on it, is very important to help WBI improve its programs. For this reason, we ask you to complete this questionnaire. The questionnaire has four sections and should take approximately 20 minutes to complete.
• Section 1 asks about the relevance of the activity.
• Section 2 asks about the usefulness of the activity.
• Section 3 asks you to compare this activity with similar learning activities offered by other organizations.
• Section 4 asks about the characteristics of the activity, its follow-up, and your background.
We need your honest feedback. Please keep in mind that your responses will be kept confidential and will be used for the sole purpose of improving WBI programs. If you have any questions about the questionnaire, please send a message by e-mail to Email@worldbank.org.
Thank you for agreeing to complete this questionnaire!
ID: _________________

I. Relevance of the Activity
The activity that you are asked to evaluate is mentioned on the first page of this questionnaire.

1. Since the end of the activity, to what degree has the activity been relevant to your work?
(1 = Not relevant at all, 2 = Irrelevant for the most part, 3 = Somewhat irrelevant, 4 = Neither relevant or irrelevant, 5 = Somewhat relevant, 6 = Relevant for the most part, 7 = Extremely relevant, DK = Don't know)

2. To what degree have the topics covered in the activity been relevant to your country's needs?
(Same scale as question 1: 1 = Not relevant at all to 7 = Extremely relevant, DK = Don't know)

3. Was the activity designed specifically for participants from your country?
❑ Yes ❑ No ❑ Don't know

4. Was the activity related to your country's development goals listed below?
a. Eradicate extreme poverty ❑ Yes ❑ No ❑ Don't know
b. Achieve universal primary education ❑ Yes ❑ No ❑ Don't know
c. Promote gender equality and empower women ❑ Yes ❑ No ❑ Don't know
d. Reduce child mortality ❑ Yes ❑ No ❑ Don't know
e. Improve maternal health ❑ Yes ❑ No ❑ Don't know
f. Combat HIV/AIDS, malaria, and other diseases ❑ Yes ❑ No ❑ Don't know
g. Ensure environmental sustainability ❑ Yes ❑ No ❑ Don't know
h. Develop global partnerships for development ❑ Yes ❑ No ❑ Don't know
i. Ensure water sanitation and supply ❑ Yes ❑ No ❑ Don't know
j. Improve investment climate and finance ❑ Yes ❑ No ❑ Don't know
k. Promote trade ❑ Yes ❑ No ❑ Don't know

II. Usefulness of the Activity

5. Please rate the degree of effectiveness of the activity in each area noted below. (If the area was not an objective of the activity, please mark "Not applicable.")
(Rated on a 7-point scale from 1 = Not effective at all to 7 = Extremely effective; NA = Not applicable)
a. Raising your awareness and understanding of the development issues important to your country
b. Providing you with knowledge or skills
c. Helping you better understand your role as an agent of change in your country's development
d. Helping you develop strategies or approaches to address the needs of your organization
e. Helping you develop strategies or approaches to address the needs of your country
f. Helping you develop contacts, develop partnerships, and build coalitions in the field

6. How would you rate the change brought by the activity in the main topic or issue it addressed?
(1 = Strong negative change, 2 = Negative change, 3 = Moderately negative change, 4 = Neither positive or negative change, 5 = Moderately positive change, 6 = Positive change, 7 = Strong positive change, DK = Don't know)

7. How often have you used the knowledge and skills you acquired in the activity for the following purposes? (If you have not worked in the given area since this activity, please mark "Not applicable.")
(1 = Never used, 2 = Rarely, 3 = Occasionally, 4 = Sometimes, 5 = Somewhat often, 6 = Often, 7 = Use all the time, NA = Not applicable)
a. Conducting research
b. Teaching
c. Raising public awareness of development issues
d. Implementing new practices within your work organization
e. Organizing collective initiatives
f. Influencing legislation and regulation
g. Implementing country development strategies

8. To what extent did the following factors help or hurt the process of using the knowledge and skills that you acquired at the activity?
(1 = Greatly hurt, 2 = Hurt, 3 = Somewhat hurt, 4 = Neither helped nor hurt, 5 = Somewhat helped, 6 = Helped, 7 = Greatly helped, NA = Not applicable)
a. Your work environment (e.g., work procedures, colleagues, incentive system, funding, etc.)
b. Your country's development environment (e.g., country policies, social groups, political groups, readiness for reform, etc.)

9. What influence has the activity had in the following areas? (If the area is not relevant to the activity, please mark "Not applicable.")
(Rated on a 7-point scale from Negative influence to Positive influence, including a No influence option; NA = Not applicable)
a. Research
b. Teaching
c. Public awareness of development issues
d. New practices within your work organization
e. Collective initiatives
f. Legislation and regulation
g. Country development strategies

10. Since the activity ended, have you discussed the issues raised in the activity in any of the following instances: at work, with local partners, government officials, NGOs, or in the media?
(1 = Issues never raised, 2 = Issues raised but not discussed, 3 = Issues discussed very briefly, 4 = Issues discussed to a limited extent, 5 = Issues discussed to a moderate extent, 6 = Issues discussed adequately, 7 = Issues discussed extensively)

III. Comparison of the WBI Activity with Similar Activities Offered by Other Organizations

11. Did you participate in any similar learning activities offered by other (non-WBI) organizations in your country? (If no, please skip to question 14.)
❑ Yes ❑ No

12. If yes, please provide the name(s) of the organization(s):
1. .................................................................................................
2. .................................................................................................
3. .................................................................................................

13. How would you rate the effectiveness of the WBI activity compared to activities conducted by other organizations?
(1 = WBI much less effective, 2 = Less effective, 3 = Somewhat less effective, 4 = Neither more nor less effective, 5 = Somewhat more effective, 6 = More effective, 7 = WBI much more effective, or No opinion)

IV. Characteristics of the WBI Activity, Its Follow-up, and Your Background

14. How would you describe the type of the learning activity?
❑ Video sessions (distance learning)
❑ Classroom (face to face)
❑ Mix of video and face to face
❑ Conference
❑ Web-based learning
❑ Study tour

15. How effective was this type of learning activity in helping you learn?
(Rated on a 7-point scale from 1 = Not effective at all to 7 = Extremely effective, or No opinion)

16. During the WBI activity, did you develop an action plan/strategy (e.g., work plans, strategy papers, or policy documents) to apply the knowledge and skills you learned? (If no, please mark "No" below, then skip to question 18.)
❑ Yes ❑ No

17. If yes, did you use part or all of the action plan in your work?
❑ Yes ❑ No

18. Were you provided with the contact information of other participants in the activity, such as e-mail addresses, telephone numbers, or mailing addresses? (If no, please mark "No" below, then skip to question 20.)
❑ Yes ❑ No

19. If yes, how did you use it?
❑ Never used it
❑ Used it to continue activity-related discussions
❑ Used it to organize joint follow-up activities

20. Which of the following best describes the organization in which you have worked the longest since the activity? (Select one.)
❑ University/research institution
❑ National/central government
❑ Non-governmental organization (not-for-profit)
❑ Provincial/regional government
❑ Media
❑ Local/municipal government
❑ Private sector
❑ Other, specify: ____________________________

21. Which of the following best describes the primary type of work you have done the longest since the activity? (Select one.)
❑ Research
❑ Teaching
❑ Policymaking/legislation
❑ Provision of services (e.g., financial, health, etc.)
❑ Management/administration
❑ Other, specify: ____________________________

22. How would you best describe the level of the position you have held the longest since the activity?
❑ Highest level (e.g., Minister, Deputy Minister, top government official, full professor, President, CEO)
❑ Senior level (e.g., department head, division head, associate professor, senior researcher)
❑ Middle level (e.g., program manager, project leader, assistant professor, technical expert)
❑ Junior level (e.g., research associate, Ph.D.-level graduate student, technical specialist)
❑ Entry level (e.g., intern, assistant)
❑ Other, please specify: __________________________________________________________________

23. What is your gender?
❑ Male ❑ Female

Thank you for your feedback. We appreciate your cooperation very much.
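The annex above shows the instrument as fielded. For teams entering completed questionnaires into a dataset for analysis (see Guidance Note 12 on preparing, storing, and managing data), the sketch below illustrates one way to code the 7-point items, treating "Don't know" as missing rather than as an eighth scale point. It is written in Python with pandas; the column names and entries are hypothetical and are not part of the questionnaire itself.

Illustrative sketch: coding questionnaire responses (Python)

# Minimal sketch: entering 7-point items from returned questionnaires,
# with "Don't know" (DK) treated as missing. All data are hypothetical.
import pandas as pd
import numpy as np

raw = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104],
    "q1_relevance_to_work": [6, 7, "DK", 5],     # 1 = Not relevant at all ... 7 = Extremely relevant
    "q2_relevance_to_country": [7, 6, 5, "DK"],
})

items = ["q1_relevance_to_work", "q2_relevance_to_country"]
for col in items:
    raw[col] = pd.to_numeric(raw[col].replace("DK", np.nan))

# Item-level summary: number of valid answers, median, and mean rating.
print(raw[items].agg(["count", "median", "mean"]).round(2))

Keeping a simple codebook alongside the data file (item text, scale labels, and missing-value codes) makes later analysis and reporting much easier.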
This collection of guidance notes explains and demonstrates how to assess capacity development efforts by reviewing and documenting the results of ongoing or completed capacity development activities, projects, programs, or broader strategies. The notes can help practitioners and evaluators to highlight lessons learned and identify which approaches were successful and unsuccessful within specific contexts. This information provides an orientation for designing more effective results frameworks and monitoring arrangements during the project or strategy design stage. Key concepts in this approach apply to a wide range of development initiatives. The methods have been tested on capacity development projects within the World Bank's lending portfolio and capacity building programs, on the Korea Development Institute Knowledge Sharing Program, and on a knowledge exchange program sponsored by the World Bank's South-South Experience Exchange Facility.