FOCUS NOTE No. 99, October 2014

Getting Smart about Financial Inclusion: Lessons from the SmartAid Index

Mayada El-Zoghbi and Barbara Scola

Indices are frequently used to motivate behavior—think of the World Bank's Doing Business reports or the United Nations Development Programme's (UNDP's) Human Development Index. By measuring and benchmarking how countries perform on specific policy objectives, indices aim to influence decision-making and create competition to catalyze reforms. In the same spirit, the SmartAid Index was launched by CGAP in 2007 to measure and improve the effectiveness of funders in supporting financial inclusion. The Index focuses on funders' internal management systems, building on the simple premise that better management systems lead to better projects. Eight years after SmartAid was launched, and with 20 funders participating over those years, what lessons can be drawn, both on the use of indices to motivate behavior change and on the effectiveness of funders in financial inclusion? Has the Index triggered change? What has driven this change?

This Focus Note reflects on these questions and also looks at upcoming challenges for funders in light of evolutions in financial inclusion and the broader aid architecture.1

Racing to the Top: Using an Index to Change Practices

Indices of all types draw on the power of scores to focus attention and drive behavior. Like grades in school, these scores can lead to both positive and negative responses. On the positive side, indices help summarize a complex set of variables into something that is measurable, manageable, and comparable. For those motivated by competition, indices drive their efforts to improve performance and excel. Indices also enable benchmarking and comparison of performance to that of others and to an organization's historical performance.

By focusing attention on a limited number of indicators, indices are purposefully biased toward the kind of behavior they want to influence. In that sense, they are not neutral; they make a value judgment on what "good performance" is. While this helps motivate behavior, there is always a risk that indices focus on the wrong things, since what matters cannot always be easily measured. Experts on indices2 point out that there is a danger that indices focus too much on the things they are measuring and not enough on what we really care about (development outcomes). This can create perverse incentives to spend more time on measured activities than they warrant, or simply fail to take into account the many unmeasured influences on outcomes. Because of this focus on a score and a limited number of indicators, innovation may be discouraged, and listening to other sources of feedback, such as feedback from clients or beneficiaries, can become secondary.

In 2006 the heads of 29 development agencies committed to measure the quality of their support to access to finance. They gave CGAP a mandate to develop an index, recognizing the power of indices to benchmark performance, create incentives, and stimulate the debate about aid effectiveness within their institutions.3 Many years of in-depth research on what makes a funder effective in supporting financial inclusion provided a solid basis for developing an index that would focus on what really matters. Through peer reviews conducted among 17 funders from 2002 to 2004, five elements of effectiveness were identified that most influence how well a funder works in financial inclusion. Those five elements—strategic clarity, staff capacity, accountability for results, knowledge management, and appropriate instruments—were then translated into a set of well-defined indicators that constitute the SmartAid Index.4

1 This paper distills learning from CGAP's work on aid effectiveness from 2007 to 2014 and relies on various sources described in Annex 3.
2 For example, Owen Barder from the Center for Global Development (CGD), interviewed in September 2010.
CGD developed the Commitment to Development Index (CDI) and was one of CGAP's thought partners in developing the SmartAid Index.
3 Twenty-nine heads of development agencies signed the "Compact for Better Aid for Access to Finance" at the Better Aid for Access to Finance High Level Meeting in Paris, France, in 2006. The Compact states their collective commitment to improve effectiveness, transparency, and mutual accountability, and expresses their engagement in refining and piloting an index. The four specific commitments adopted with the Compact are to (1) measure the quality of aid management, (2) implement the good practice guidelines for funders of microfinance, (3) improve field-level coordination, and (4) partner with the private sector.

Moving from a purely qualitative approach, such as peer reviews, to an index based on quantifiable indicators introduced a sense of competitiveness among funders and also increased the analytical rigor of the assessments. After the pilot round of SmartAid in 2007, the set of indicators was significantly streamlined to improve the efficiency and relevance of the Index. In the following three rounds (2009, 2011, and 2013), the set of indicators has proven relevant and, according to feedback from participating funders, gives an accurate picture of a funder's overall institutional effectiveness.

Contrary to project evaluations, which look at the performance of specific interventions a funder supports in the field, SmartAid focuses on funders' internal management systems, a deliberate choice since SmartAid aims at behavior change at the institutional level (see Figure 1). As such, SmartAid does not replace, and is complementary to, other types of evaluations, such as project evaluations, portfolio reviews, and impact evaluations.5,6

The SmartAid Index relies on evidence provided by the funder, which is then reviewed by an independent board that assigns a score to each indicator. The review board comprises four independent financial inclusion experts who have extensive experience working with funders. Participating funders receive a concise report that describes the main strengths and weaknesses and provides recommendations for improvement. Most funders also organize a debriefing meeting with CGAP to discuss the findings and develop an action plan to implement the recommendations.

SmartAid is first and foremost a learning tool that draws on the power of numbers and rankings to motivate and support a process of internal change. SmartAid assesses and ranks institutions that willingly participate and relies on confidential information provided by participating institutions to analyze their internal systems. While transparency is encouraged, funders can choose whether to publish their results. This approach differentiates SmartAid from most other indices used to measure aid effectiveness (see Table 1); these typically use publicly available data to rank the performance of actors and make the results public.7

The voluntary nature of SmartAid also has its limitations, as not all institutions that would benefit from the exercise participate in it. An aid institution's funding source seems to affect its willingness to subject itself voluntarily to an external evaluation of this nature. Donor agencies that fundraise periodically from member countries or organizations seem to be more driven to prove their effectiveness. Bilateral agencies that rely on a political process of lobbying for their funding are more concerned with the changing political priorities of their governments. Thus we see a trend of multilateral funders being more likely to participate in assessments such as
Figure 1.
Spectrum of evaluations serving different goals: the SmartAid Index assesses the quality of management systems; project evaluations assess project performance; portfolio reviews assess portfolio performance; and impact evaluations assess impact.

4 For further information about the five elements of effectiveness and the SmartAid indicators, see Annex 1 or consult El-Zoghbi, Javoy, and Scola (2014).
5 See Scola-Gähwiler and Nègre (2012).
6 See El-Zoghbi and Martinez (2011).
7 For example, CDI ranks 27 donor countries according to their performance on seven policy priorities, including areas such as the quantity and quality of foreign aid, policies that encourage investment and financial transparency, and openness to migration. The Index makes the point that development is not just what happens with aid dollars, but is influenced by many other policy issues such as migration, trade, and environmental decisions.

Table 1. List of major aid effectiveness indices

• Quality of Official Development Assistance (QuODA) assessment. Initiator: Brookings Institution and Center for Global Development (CGD). Focus: assesses 23 donor countries and over 100 aid agencies on four dimensions related to the Paris Declaration on Aid Effectiveness. Data used: survey on monitoring the Paris Declaration on Aid Effectiveness and additional sources. Frequency: annual (since 2009).
• Commitment to Development Index (CDI). Initiator: CGD. Focus: ranks 27 rich countries in 7 policy areas related to development. Data used: official sources and academic research. Frequency: annual (since 2003).
• Best and Worst of Aid Agency Practices. Initiator: William Easterly and Claudia R. Williamson, Development Research Institute at New York University. Focus: rating of bilateral, multilateral, and United Nations aid agencies in terms of transparency, specialization, selectivity, ineffective aid channels, and overhead costs. Data used: OECD Development Assistance Committee database and data reported by donors. Frequency: one-off (2011).
• Aid Transparency Index. Initiator: Publish What You Fund. Focus: ranks donors on how transparent they are; 67 participants in the last round. Data used: public data complemented with a survey. Frequency: annual (since 2011).

SmartAid, and to strive to improve their performance over time.

Change Happens

Lessons from the International Fund for Agricultural Development (IFAD)

When the rural finance team at IFAD, a specialized UN agency working to improve the lives of poor people in rural areas, decided to participate in SmartAid in 2009, it knew it wasn't going to be an easy exercise. Managing change processes in an organization where decision power is highly decentralized and the technical advisory division has only limited influence on project implementation is akin to coaching a football team from the stands. One staff member remembers that it felt "scary opening yourself up and exposing everything you are doing even if you know not everything is perfect."

IFAD had updated its rural finance policy in 2008 and therefore scored well on strategic clarity. However, the new policy had not yet been fully operationalized, and the rural finance team saw SmartAid as an opportunity to help it translate words into action. Following the SmartAid assessment, IFAD developed its operational guidelines, known as the IFAD Decision Tools for Rural Finance, and a number of other practical tools that translated the policy into key principles and practices at every step of the project cycle, such as Technical Notes, Lessons Learned, and "How to do Notes," providing in-depth guidance on specific rural finance topics. Having scored low on staff capacity, IFAD decided to invest heavily in capacity building efforts for staff. Based on a needs assessment, the team designed and delivered a targeted offer of in-house training involving its five regional divisions and the Human Resource Department. IFAD also worked with the Microfinance Information Exchange (MIX)8 to develop an online course for staff on financial and social monitoring and strengthened its partnerships with regional microfinance networks and service providers. These combined efforts show a cohesive approach to bringing harmony to strategy, capacity building, knowledge management, and quality assurance.

8 MIX is a nonprofit organization promoting responsible financial services for underserved communities through data analytics and market insight. For further information see www.themix.org.

The reform that probably had the most significant impact on projects was the introduction of quality enhancement reviews and technical support early in the project cycle. This required IFAD's rural finance experts to be involved in project design and play a stronger role in project supervision. According to IFAD, and evidenced by the documents it submitted for the 2013 round of SmartAid, all these measures combined resulted in a drastic reduction of market-distorting subsidized lending and of generally low-performing and unsustainable credit components within multi-sector programs. It helped IFAD to take a more systemic approach to financial sector development and to promote a wider range of financial institutions that increase long-term access to diverse financial services for the rural poor. And from the perspective of IFAD's rural finance team, transparency was rewarded. SmartAid has led to an increased agency-wide commitment to IFAD's role in rural finance and strengthened the importance of rural finance within the organization's diverse portfolio.

Lessons from the United Nations Capital Development Fund (UNCDF)

According to UNCDF, the UN's capital investment agency for the world's 48 least developed countries, SmartAid has become part and parcel of who it is and what it does. Having participated in all four rounds of the SmartAid Index since 2007 and having reached the label "very good" for its internal systems in 2011, is there still something UNCDF can learn from SmartAid? According to the staff of UNCDF's Financial Inclusion Practice Area, permanent learning is UNCDF's daily business. And indeed, UNCDF is not resting on its laurels; it is taking the recommendations from SmartAid very seriously. The SmartAid results are discussed at the annual staff retreat, and an action plan is prepared to implement the recommendations. Over the past years, this continuous change process has resulted in significant improvements. Staff have been recruited and trained to fill gaps at the technical level, but also to strengthen internal knowledge management following an agency-wide knowledge management strategy. Based on the results of evaluations and a thorough portfolio review, UNCDF has refocused its strategy around an ambitious market development approach that is very much in line with UNCDF's comparative advantage and its local presence in 28 least developed countries.

UNCDF's high-performance culture is also reflected in its relationships with partners, thus showing how doing business differently can lead to improved results on the ground. Over the years, UNCDF has tested and refined the use of standard performance-based agreements9 not only with financial service providers (FSPs), but also with support networks and central banks. Performance-based agreements are used to set clear targets and to ensure results are on track and, if they are not, to trigger enforcement mechanisms. To monitor performance, UNCDF requires all FSPs to report to MIX and uses MIX Business Solutions to analyze performance across the entire portfolio. Standardized processes and templates, which can easily be tailored for each partner institution, led to rapid clearances (2–3 days) of new contracts. The policy of clearly signaling that agreements will be suspended in case of nonperformance, and of actually enforcing sanctions, showed results. Seventy percent of FSPs in UNCDF's portfolio met performance targets, while 21 percent of FSPs did not meet minimum performance thresholds, triggering a temporary suspension of funding. In some cases (5 percent of FSPs in the portfolio), a waiver was given and funding continued, and in only a few cases (4 percent of FSPs in the portfolio) nonperformance persisted and led to termination of the agreement.

9 See El-Zoghbi, Glisovic-Mezieres, and Latortue (2010).

Lessons from all participating funders

The examples cited describe only a few among many smaller and bigger steps taken by funders toward greater effectiveness in supporting financial inclusion. For many funders, developing or updating a financial inclusion strategy is often the natural starting point to improve effectiveness. Operationalizing a new strategy requires action at many different levels and represents a large coordination effort. Since SmartAid was launched, funders have made significant investments toward becoming learning organizations, where lessons from projects are effectively captured, made accessible to staff, and help refine future strategic directions. However, only a few institutions have fully integrated a learning agenda into their accountability systems. Most often, accountability systems are designed from a controller's perspective and tailored to prove that taxpayers' money isn't wasted, rather than to feed learning into new project design or the updating of strategies.

If we analyze broader trends across all participants, we see significant improvements in total scores over time. This is very evident for the agencies that have conducted SmartAid on multiple occasions (see Figure 2). Scores on individual indicators mostly also increased over time. However, some agencies scored lower on individual indicators, usually in cases where an agency's systems did not keep up with the evolution of the portfolio or external trends. While it is impossible to know whether SmartAid was the cause of progress over time, feedback from SmartAid participants and the fact that agencies participate more than once suggest that the Index did indeed stimulate or accelerate change.

Drivers of Change

While change happens at different speeds in different organizations, there are some common trends across SmartAid participants as to why and how change happens.

Leadership of internal champions is crucial for institutional change. Institutions that have successfully used SmartAid to improve their effectiveness tend to have strong internal champions. Participation in SmartAid is based on an organizational commitment; however, the individuals initiating and managing the process are critical for its usefulness and for ensuring that action is taken as a result of the assessment. Typically, the process is driven by the financial inclusion focal point: one or more technical experts responsible for knowledge management and technical advice on financial inclusion within an organization. Designating a financial inclusion focal point was a common recommendation of the peer reviews, the predecessor process of the SmartAid Index. As one focal point noted from the 2009 round, "SmartAid presented us with a nice opportunity to bring the topic of financial systems development to the top of the agenda, and to the minds of our management, so now there is a higher level of expectation, and we have to continue to improve and meet these expectations."

Figure 2.
Evolution of scores for repeating agencies (total SmartAid scores, on a 0–100 scale, for Agencies A–D across the 2009, 2011, and 2013 rounds).

Consequently, most of the organizations participating in SmartAid had a financial inclusion focal point. However, not all focal points had a strong internal standing within the organization. In addition to individual leadership, the responsibilities assigned to focal points determine their influence. For example, focal points play a stronger role in agencies that oblige project managers to take into account the focal point's technical inputs in project design. When technical reviews by focal points are purely voluntary, with no requirement that their opinion be sought or their feedback addressed, the role of the focal point is weakened in relation to program or country staff. Additionally, when organizations require the focal point's input only at the final stages of project approval, this also greatly diminishes their role and influence in the organization. Organizations that integrate the focal point review earlier in project design, requiring technical comments to be addressed by program staff, benefit more systematically from technical reviews of their projects and increase technical know-how throughout the organization.

Top management commitment counts. Participation in the SmartAid Index is open to all funders of financial inclusion that are committed to improving their effectiveness. An explicit expression of commitment from top management, evidenced by the signature of the Compact for Better Aid for Access to Finance and a letter confirming the agency's participation in SmartAid, has proven critical to the usefulness of the Index and its likelihood of leading to internal change. This requirement engages top management from the start, creating a strong incentive for staff to dedicate the necessary resources to the process and to share the results, at least internally. In the few cases where organizations abandoned SmartAid midway or didn't approve the final report, management had not been involved in the decision to participate in SmartAid.

Even more important than stimulating management's commitment to effectiveness upfront, SmartAid draws their attention to priority areas for improvement. The SmartAid report highlights key strengths and areas for improvement and presents concrete recommendations. An external, independent review by recognized experts helped put financial inclusion related issues on agency management's agenda and triggered concrete actions. As one executive director of a participating agency noted, "I think the fact that it's done by an external body (CGAP) that has legitimacy is a key thing. As a manager you may know where weak spots are but having some transparent and external review just inherently strengthens your hand." It also drew attention to weaknesses in internal systems that are common to many funders and therefore easily ignored by staff and management. In particular, results management systems were weak in most participating agencies, and after their participation in SmartAid, many invested significantly in improving results tracking and management systems. In that sense, SmartAid helped to raise the bar on what is expected, and what can be done, to support financial inclusion effectively.

Management commitment proved to be particularly important at institutions that scored low in SmartAid. Low scores can be demotivating. And if nobody cares at the top, there is not much to gain for mid-level management and technical staff in investing time in conducting a SmartAid assessment and advocating for cumbersome reforms. Not surprisingly, several "low performers" have not repeated SmartAid. But in institutions with committed management teams, even those with low SmartAid performance ratings experienced some positive change as a result of the Index. For example, in one agency there was an internal dialogue on why financial inclusion was not receiving the kind of attention management thought it deserved. This was followed by the creation of a task force and a much deeper internal review to explore the topic further. Another agency conducted a portfolio review and as a result has refocused its strategy.

Learning from peers and colleagues matters. Ranking funders does not only serve the purpose of creating competition; it also shows which organizations do best in specific areas of effectiveness. Many examples of exchange between SmartAid participants confirm this type of peer learning taking place. Besides learning events organized by CGAP, SmartAid participants exchanged with peers on areas such as knowledge management and conducting portfolio reviews. For example, UNCDF invited the German development agency Gesellschaft für Internationale Zusammenarbeit (GIZ), the top performer on knowledge management, to participate in its knowledge management strategy development process. Several agencies contacted the French development agency Agence Française de Développement (AFD) to learn how to conduct a portfolio review. IFAD consulted UNCDF to discuss lessons and techniques to inform plans to develop a system for performance-based agreements.

Beyond peer learning, participating in SmartAid provided opportunities for internal learning. In many institutions, multiple departments or operational units design, manage, or evaluate financial inclusion projects. In addition, legal, risk management, and procurement departments are involved in projects. In many cases SmartAid highlighted that there was no common understanding of good practices in financial inclusion across departments and provided opportunities for interdepartmental exchange within the funding agency. Higher-performing funding agencies require all departments to participate in financial inclusion knowledge management and learning events.

Box 1. Do Better Management Systems Lead to Better Projects?
The question most asked about the SmartAid Index is whether better management systems actually lead to improvements in projects on the ground. This is also the most difficult question to answer. SmartAid focuses on internal systems and does not capture the performance of funders' financial inclusion projects on the ground. Some indication can be drawn from comparing SmartAid scores with results from portfolio reviews, which assess project performance throughout a funder's financial inclusion portfolio (see Scola-Gähwiler and Nègre [2012]). However, there is a caveat to this approach. Portfolio reviews use scoring systems developed for each institution, and there is no standard across all funders, making it difficult to compare performance across different funders. Given that only four funders (EIB, AfDB, UNCDF, and AFD) have undertaken portfolio reviews following a similar methodology since 2007, the sample is insufficient to draw any general conclusions at this stage.
However, there is emerging evidence, when comparing results from SmartAid and the portfolio reviews, of a correlation between the quality of management systems and project performance. But agencies that perform highest in SmartAid do not necessarily have consistently high-performing projects on the ground. Likewise, some of the poorly performing SmartAid agencies can still manage to have some good projects on the ground. For example, one of the agencies that scored below average has several investments in very strong banks and microfinance institutions that reach out to rural areas or that integrate new products. These investments were made from the private sector arm, one of the departments that integrated good practices in its operational systems. The majority of the other investments in the portfolio are channeled by the rest of the agency as loans to governments or national apexes, with fairly weak performance overall. Often these kinds of differences are a result of staffing, country context, and partner selection at the project level, highlighting that other factors, not just the quality of management systems, contribute to project performance.
On a qualitative level, portfolio reviews often point out similar strengths and weaknesses as SmartAid. Both SmartAid and portfolio reviews conclude that many funders have weak accountability systems. Funders consistently score lowest on three of the four SmartAid indicators related to accountability for results.

Keeping up with the Times

Drivers of change, such as the ones discussed above, determine whether and at what speed organizational change happens and, ultimately, whether organizations are able to anticipate or adapt to a continuously evolving environment. Organizations must continue to evolve and integrate learning into their projects and portfolios to add value in the markets where they work, or even to stay relevant at all. SmartAid has served as an important tool for funders of financial inclusion to evolve over time and to respond to market changes. But the pace of change may leave even the best of them struggling to catch up.

How does the turtle compete with the hare? By being smarter, not faster. The entire aid industry and the architecture that defines it are rapidly changing. Aid flows are a much less powerful instrument through which OECD countries can wield influence in the developing world. Private capital flows, in particular remittances, now far outstrip official development assistance. New players are redefining global priorities. Countries such as China and the Gulf Cooperation Council countries are using their wealth—private investment as well as aid flows—to set the tone for aid and trade.

At the same time, OECD countries are still reeling from the effects of the 2008 financial crisis. Most countries continue to cut their aid budgets, putting severe strains on how donor agencies can operate. Budget cuts have meant fewer staff managing larger projects. Even when budgets are not the binding constraint, the watchful eyes of parliaments and the media provide little room for maneuverability. Everyone is looking to see results, and fast.

Funders supporting financial inclusion have to shift from funding microfinance institutions to supporting broader market development, as microfinance institutions now attract private investments and technology enables the emergence of new business models for the delivery of financial services, many of which are mainly driven by the private sector. This requires more and more specialized technical expertise and working with an increasing number and diversity of actors.

The broader changes in the aid industry, coupled with the trends in financial inclusion, require funders to reflect on how they can add value, often in technical areas they historically know little about. The SmartAid experience points to several important lessons that can help funders be better prepared for a fast-changing environment:

• Need to refresh strategy to reflect drivers of financial access. While many funders have financial sector strategies, private sector strategies, or even microfinance strategies, the pace of change in the financial inclusion field and the drivers that are propelling it forward may not be sufficiently integrated. Even the top-performing SmartAid agencies can see their strategic relevance erode if they do not refresh their strategies regularly. From the agencies participating in SmartAid, we see that those that update their strategies every four or five years are more likely to stay relevant. One of the most important considerations funders must integrate into their strategies is the role of the private sector in advancing financial inclusion and how limited aid can effectively leverage private resources. While policy makers and others also play important roles in advancing financial inclusion, much of the change is emanating from technological advances used by the private sector. Thus donors working in this space need to have a clear understanding of how their work leverages or influences the private sector. The European Commission (EC), for example, recently took a serious look at its private sector framework, which involved consultation with many actors (nongovernment organizations, bilateral governments, multilateral institutions, academics, think tanks, etc.). This process has helped the EC define and articulate, both internally and to the outside world, how its work will incorporate the private sector.

• There is no substitute for technical competence: invest in it internally or leverage it through partners. Despite budget cuts, the shrinking number of staff, and larger project sizes, financial inclusion is becoming increasingly technical and is not well-served by money alone. Funders contributing to financial inclusion projects increasingly need technical know-how to identify the right partners and provide the value added that partners need. This has significant implications for how funders operate. Funders need to decide if they will build their own technical competence or if they will partner with those that already have it. If they choose the former, they need to hire the right staff and put in place funding mechanisms that allow them to make investments and grants that are large enough to have value but small enough to meet the needs of the types of players operating in the financial inclusion ecosystem. If they choose the latter, they need to have sufficient capacity to select the right partners, but they must also cede control and let their partners sit in the driver's seat.

• Measure, learn, and do. The cycle for learning must get faster and better. Funders can no longer accept inadequate monitoring and evaluation systems that are disconnected from project design and staff development. Systems need to go beyond measuring project results to assessing whether a funder's interventions contribute to market development. Funders must continue to learn from their projects and must integrate this learning regularly into their strategies and the design of new projects. Only when these learning loops are working will we see the full benefits of aid. Beyond internal monitoring and evaluation systems, we must also work collectively to better communicate what it means to be effective and how this is measured.

In a nutshell, organizational change is not always easy for funders to address, but neither is it impossible. Funding agencies have the means to make the changes that matter—years of assessment through SmartAid have shown this. The need to continue on this path of renewal is the only thing that remains constant.

SmartAid remains available as a tool for funders that want to improve their effectiveness in financial inclusion. The SmartAid Index Technical Guide provides a detailed description of the methodology and process. Interested funders can register at www.cgap.org/about/programs/smart-aid.

References

Birdsall, Nancy, Homi Kharas, and Rita Perakis. 2012. "The Quality of Official Development Assistance Assessment 2009: Is Aid Quality Improving?" Washington, D.C.: Brookings and Center for Global Development.

CGD (Center for Global Development). 2013. "Commitment to Development Index 2013." http://www.cgdev.org/sites/default/files/CDI2013/cdi-brief-2013.html

Easterly, William, and Claudia R. Williamson. 2011. "Rhetoric versus Reality: The Best and Worst of Aid Agency Practices." New York: New York University.

El-Zoghbi, Mayada, and Meritxell Martinez. 2011. "Measuring Changes in Client Lives through Microfinance." Brief. Washington, D.C.: CGAP.

El-Zoghbi, Mayada, Emmanuelle Javoy, and Barbara Scola. 2014. "SmartAid Index." Technical Guide. Washington, D.C.: CGAP.

El-Zoghbi, Mayada, Jasmina Glisovic-Mezieres, and Alexia Latortue. 2010. "Performance-based Agreements." Technical Guide. Washington, D.C.: CGAP.

Latortue, Alexia, Mayada El-Zoghbi, and Barbara Gähwiler. 2009. "Improving Effectiveness from Within: SmartAid for Microfinance Index." Brief. Washington, D.C.: CGAP.

Publish What You Fund. "Aid Transparency Index." http://www.publishwhatyoufund.org/index/. Accessed 2014.

Scola-Gähwiler, Barbara, and Alice Nègre. 2012. "Portfolio Reviews: Resource Guide for Funders." Technical Guide. Washington, D.C.: CGAP.

Annex 1: SmartAid indicators

Depending on their performance, funders receive a score ranging from 0 (no systems in place) to 5 (good practice) for each indicator. Indicators account for either 10 or 15 points depending on their relevance for a funder's overall effectiveness in financial inclusion. This results in different weights for the five elements of effectiveness. Funders can reach a maximum of 100 points in the SmartAid Index.

• Strategic Clarity. Indicator 1: Funder has a policy and strategy that addresses financial inclusion, is in line with good practice, and is based on its capabilities and constraints. (15 points)
Staff Capacity 2 Funder has quality assurance systems in place to support financial 10 points inclusion projects and investments. 3 Funder has the staff capacity required to deliver on its financial 15 points inclusion strategy. Accountability 4 Funder has a system in place that identifies all financial inclusion 10 points for Results projects and components. 5 Funder monitors and analyzes performance indicators for financial 10 points inclusion projects and investments. 6 Funder incorporates performance-based elements in standard 10 points agreements with partners. 7 Funder regularly reviews the performance of its financial inclusion 10 points portfolio. Knowledge 8 Funder has systems and resources for active knowledge management 10 points Management for financial inclusion. Appropriate 9 Funder has appropriate instrument(s) to support the development of 10 points Instruments local financial markets. MAXIMUM SCORE 100 points 11 Annex 2: List of participating funders 2007–2013 SmartAid round New participants Repeaters Number SmartAid 2007 • Asian Development Bank (AsDB) 7 (pilot round, scores • Canadian International Development Agency are not comparable (CIDA) with other rounds) • FMO • Gesellschaft für Technische Zusammenarbeit (GTZ) • KfW Entwicklungsbank (KfW) • Swedish International Development Cooperation Agency (Sida) • UNCDF SmartAid 2009 • Agencia Española de Cooperación Internacional • GTZ 11 para el Desarrollo (AECID) • UNCDF • AFD • African Development Bank (AfDB) • EC • IFAD • International Finance Corporation (IFC) • International Labour Organization (ILO) • Multilateral Investment Fund (MIF) • Swiss Development Cooperation (SDC) SmartAid 2011 • Australian Agency for International Development • GIZ (former GTZ) 6 (AusAID) • KfW • European Investment Bank (EIB) • MIF • UNCDF SmartAid 2013 • AFD Group (including AFD and Proparco) • IFAD 5 • European Investment Fund (EIF) • MIF • UNCDF Total number of 29 assessments done Total number of 20 participants 
(unique count) No. 99 October 2014 Annex 3: Sources • Discussions with senior management of Please share this participating funders collected during debriefing Focus Note with your This Focus Note is based on learning from CGAP’s meetings on SmartAid colleagues or request • Inputs and feedback from funders and aid extra copies of this work on aid effectiveness from 2007 to 2014. paper or others in This learning has been captured in multiple ways, effectiveness experts provided during two learning this series. including the following: events (2011 and 2014) • Feedback received during High Level Aid CGAP welcomes Effectiveness Events (Accra High Level Form in your comments on • Documents provided by 20 funders in four rounds this paper. of the SmartAid Index 2008 and High Level Forum on Aid Effectiveness • Feedback survey completed by participating in Busan, 2011) All CGAP publications • Consultations and interviews with aid effectiveness are available on the funders after every round of the SmartAid Index CGAP Web site at • In-depth semi-structured interviews with 11 funder experts: Owen Barder and David Roodman from www.cgap.org. 
staff the Center for Global Development, Homi Kharas • An external evaluation conducted after the from the Brookings Institution, and Elisabeth CGAP Sandor and Brenda Killen from OECD/DAC 1818 H Street, NW SmartAid pilot round in 2008 MSN P3-300 • Scores and comments from the Review Board on • Literature review on aid effectiveness and indices Washington, DC the performance of participating funders 20433 USA • Feedback from the Review Board on the relevance Tel: 202-473-9594 of the SmartAid indicators and the process Fax: 202-522-3744 Email: cgap@worldbank.org © CGAP, 2014 The authors of this Focus Note are Mayada El-Zoghbi, senior finan- Bruett, Claudia Huber, Richard Rosenberg, Heather Clark, Ruth cial sector specialist at CGAP and Barbara Scola, financial sector Goodwin-Groen, Alice Negre, Kathryn Imboden, Lene Hansen, specialist at CGAP. The authors would like to acknowledge the Klaus Maurer and Emmanuelle Javoy. Special thanks also go to the many individuals who advised, conceptualized, participated in and 20 participating agencies in SmartAid and, in particular, to IFAD supported the SmartAid Index including: Alexia Latortue, Tilman and UNCDF for reviewing and contributing to this publication. The suggested citation for this Focus Note is as follows: El-Zoghbi, Mayada, and Barbara Scola. 2014. “Getting Smart about Financial Inclusion: Lessons from the SmartAid Index.” Focus Note 99. Washington, D.C.: CGAP. UKa from the British people