The Changing Landscape of Development Evaluation Training: A Rapid Review
No. 31 | June 2014
Dawn Roberts

www.worldbank.org/ieg/ecd

© 2014 Independent Evaluation Group, The World Bank Group
1818 H St., NW, Washington, DC 20433
http://ieg.worldbankgroup.org

IEG: Improving the World Bank Group's Development Results Through Excellence in Evaluation

The Independent Evaluation Group is an independent unit within the World Bank Group; it reports directly to the Bank's Board of Executive Directors. IEG assesses what works, and what does not; how a borrower plans to run and maintain a project; and the lasting contribution of the Bank to a country's overall development. The goals of evaluation are to learn from experience, to provide an objective basis for assessing the results of the Bank's work, and to provide accountability in the achievement of its objectives. It also improves Bank work by identifying and disseminating the lessons learned from experience and by framing recommendations drawn from evaluation findings.

IEG's Evaluation Capacity Development Working Papers are an informal series to disseminate the findings of work in progress to encourage the exchange of ideas about development effectiveness through evaluation.

The findings, interpretations, and conclusions expressed here are those of the author(s) and do not necessarily reflect the views of the Board of Executive Directors of the World Bank or the governments they represent, or IEG management. IEG cannot guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply on the part of the World Bank any judgment of the legal status of any territory or the endorsement or acceptance of such boundaries.

ISBN-13: 978-1-60244-250-4
ISBN-10: 1-60244-250-9

Contact: IEG Communication, Learning and Strategies
E-mail: ieg@worldbank.org
Telephone: 202-458-4497
Facsimile: 202-522-3125
http://ieg.worldbankgroup.org

Acknowledgments

This note was written by Dawn Roberts, Independent Consultant, as a background paper for the Independent Evaluation Group, World Bank. The paper was prepared under the team task management of Arianne Wessal, team leadership of Nidhi Khattri, Lead Evaluation Officer, and overall guidance of Monika Weber-Fahr, Chief Knowledge Officer and Senior Manager, IEGCS.

Table of Contents

Introduction
Executive Training Programs
  Generic Introductory M&E Courses
  Customized Introductory M&E Courses
  Specialized Workshops for Experienced Practitioners
Formal Graduate Degree and Certificate Programs
Other Training Opportunities
  E-learning and Distance Learning Programs
  The Growing Role of Networks
The Changing Landscape of Evaluation Education
References

Acronyms and Abbreviations

AEA      American Evaluation Association
CDI      Centre for Development Innovation (Wageningen University)
CGU      Claremont Graduate University
CLEAR    Centers for Learning on Evaluation and Results
EPDET    European Program for Development Evaluation Training
ICDC     International Center for Development Communication (Kasetsart University)
IDPMS    International Development and Programme Management Solutions
IEG      World Bank Independent Evaluation Group
INTRAC   International NGO Training and Research Centre
IOCE     International Organization for Cooperation in Evaluation
IPDET    International Program for Development Evaluation Training
IPEN     International Program Evaluation Network
J-PAL    The Abdul Latif Jameel Poverty Action Lab
M&E      Monitoring and Evaluation
MESI     Minnesota Evaluation Studies Institute
MSI      Management Systems International
MS-TCDC  MS Training Centre for Development Cooperation
NFP      Netherlands Fellowship Programme
NGO      Nongovernmental Organization
Nuffic   Netherlands organization for international cooperation in higher education
PIFED    Programme international de formation en évaluation du développement
SHIPDET  Shanghai International Program for Development Evaluation Training
TEI      The Evaluators' Institute
TESA     Teaching Evaluation in South Asia
UNICEF   United Nations Children's Fund
USAID    United States Agency for International Development
VOPE     Voluntary Organization for Professional Evaluators

Introduction

The World Bank Independent Evaluation Group (IEG) works to improve development results through excellence in evaluation. A key part of this mandate focuses on developing the monitoring and evaluation capacities of the Bank's client countries. To this end, IEG developed the International Program for Development Evaluation Training (IPDET) in 2001, and this executive training program has been implemented since then in partnership with Carleton University in Ottawa, Canada. IPDET is managed by Carleton University but has received substantial in-kind (through technical experts) and financial support over the years from IEG. IPDET was conceived as a one-of-a-kind learning program to fill a gap in development evaluation training. However, there is broad recognition that the landscape is changing: increasing numbers of organizations provide monitoring and evaluation (M&E) training in some form, an evolving mix of formal graduate degree and certificate programs prepares evaluators, new technologies support innovations in learning, and local networks and evaluation associations are increasingly engaged in evaluation capacity development. In this context, IEG commissioned a rapid review of the M&E training landscape to better understand the context in which IPDET now operates. To map that landscape, the study identified and reviewed the M&E learning opportunities currently available to practitioners in a developing country context. Given the tight timeline of the study, a desk review of existing studies was coupled with a convenience sampling approach to explore the broad range of evaluation learning opportunities.
M&E learning providers were identified through citations in relevant literature, word-of-mouth referrals, training announcements posted and/or archived by evaluation associations, and broad online searches. Communication with the learning providers by email, telephone, and Skype helped to deepen the understanding of each program's characteristics and history. Given these sampling and data collection methods, the selection of providers and programs profiled in this summary report should be viewed as illustrative rather than exhaustive. It is also important to note that this method focused on examining the supply side of training and learning opportunities and thus provides only indirect information about the characteristics of, or changes in, demand. Finally, training programs focused specifically on development evaluation were of particular interest for this review. The distinction between "evaluation" and "development evaluation" is arguably an important one, with increasing attention focused on the kinds of peer groups and curricula needed to effectively build the M&E skills and knowledge relevant for a developing country context. For example, Ofir (2013) describes the imperative for adapting more traditional evaluation practices and concepts to promote "evaluation for development rather than the evaluation of development" (p. 584). To this end, programs must go beyond "simplistic notions of 'measuring impact' and determining 'value for money' toward enabling…a smart engagement with managing for impact that ensures the capacities to achieve impacts in a sustained manner and is based on the many lessons that have emerged from results-based management and other similar efforts" (p. 585).

The following sections of this report provide an overview of the evaluation training landscape by exploring various types of learning programs, including executive "in-service" training, formal graduate degree and certificate programs, and opportunities available through other forums and modalities. Information about the offerings in each of these categories is then used to reflect on emerging trends and the implications for developing local M&E capacity going forward.

Executive Training Programs

Face-to-face "in-service" programs are common across the training landscape, allowing practitioners to leave their work settings for short periods of time and interact with their peers in developing new knowledge, skills, and awareness. These in-person courses and workshops typically integrate a range of instructional methods such as lectures, small group exercises, case studies, and/or roundtable discussions, and sometimes incorporate online interaction or exercises. To explore the characteristics of this type of training, this review covered a sample of 58 introductory monitoring and evaluation courses and explored offerings by 40 providers on more specialized topics for experienced evaluators. 1

1 Many providers of M&E trainings provide workshops on more than one specialized topic, so the number of workshops is much higher.

Generic Introductory M&E Courses

Overall, executive training related to monitoring and evaluation most commonly focuses on building foundational M&E skills. These introductory courses last anywhere from three days to a month, allowing participants to focus intensively on learning M&E concepts with their counterparts without having to enroll in and commit to a formal academic degree program. IPDET's Core Course is one example of an introductory course offering a comprehensive overview of development monitoring and evaluation.
The 2-week course consists of 15 instructional modules, which follow chapters in The Road to Results: Designing and Conducting Effective Development Evaluations. The generic content is relevant for the needs of practitioners across the full range of sectors, and the program therefore attracts both evaluators and commissioners of evaluations from government, NGOs, bilateral and multilateral development agencies, and the private sector.

This type of longer generic training, lasting more than one week and offered for open enrollment at least once per year, is relatively less common than the shorter courses. Aside from IPDET and programs based on the IPDET curriculum, three other intensive executive training programs in monitoring and evaluation were identified that served more than 50 participants per year with a generic focus across sectors:

• MDF Training and Consultancy offers a 10-day Monitoring, Evaluation, and Learning course at its headquarters in the Netherlands once per year. The training model is replicated for delivery independently at MDF's branch offices in all regions.
• The Centre for Development Innovation (CDI) at Wageningen University and Research Centre offers a 15-day course on participatory planning, monitoring, and evaluation in the Netherlands and in Burkina Faso.
• The MS Training Centre for Development Cooperation (MS-TCDC) in Tanzania offers a 15-day monitoring and evaluation course three times per year.

These examples are shown in Table 1.

Table 1. Examples of Large Generic Comprehensive Introductory M&E Programs with Open Enrollment

Program: International Program for Development Evaluation Training (IPDET)**
Provider(s): Carleton University (headquartered in Ottawa, Canada) and the World Bank IEG
Duration, frequency, location(s): 10-day course, offered once per year in Ottawa, Canada since 2001
Main topics covered: Understanding the evaluation context; building an M&E system; designing the evaluation; data collection; data analysis and management; presenting results; ethics and guiding principles
Participant information (2013): 163 participants in the core course; 419 applicants were turned away in 2013 due to financial need. 2012 data: 75 countries represented, with most participants from Africa (23%), East Asia and the Pacific (19%), and North America; largest share from the public sector (33%), followed by development agencies (31%) and NGOs (10%); ~50% women
Cost information*: ~US$525 per day. Tuition and materials: $5,580 (CAD). Financial assistance provided to 51% of core course participants in 2013

Program: Participatory Planning, Monitoring and Evaluation
Provider(s): Centre for Development Innovation at Wageningen University and Research Centre in Wageningen, Netherlands
Duration, frequency, location(s): 15-day course, offered once per year in the Netherlands since 2002 and once per year in French in Burkina Faso since 2011
Main topics covered: Principles and practices of M&E; situational analysis; logical frameworks and theories of change; developing the right M&E plan based on information needs, indicators, and intended uses; stakeholder engagement; managing for impact
Participant information (2013): 89 participants in 2013; 491 applicants were turned away in 2013 due to financial need. 2013 estimates from the course leader: 26 countries represented (70% from Africa, 14% from East Asia and the Pacific, 11% from South Asia); all types of organizations represented (no breakdown provided); ~50% women
Cost information*: ~US$328 per day. Tuition and materials: €3,600. Most participants (70-80 per year) join the course with a full scholarship from the Netherlands Fellowship Programme by Nuffic (Netherlands organization for international cooperation in higher education)

Program: Monitoring, Evaluation, and Learning (MEL)***
Provider(s): MDF Training and Consultancy (head office in Ede, Netherlands)
Duration, frequency, location(s): 10-day course offered twice per year in Ede, the Netherlands (some version offered since 1999)
Main topics covered: Introduction to M&E; different approaches; design of the monitoring system; basic principles of a participatory system; tracking qualitative and quantitative changes through such methods as Outcome Mapping and Most Significant Change
Participant information (2013): ~70 participants enrolled yearly; the program does not turn away qualified applicants. 2013 estimates from the registrar: worldwide representation, with 40-50% of participants coming from Nuffic-eligible countries; all types of organizations represented (no breakdown provided); ~50% women
Cost information*: ~US$489 per day. Tuition and materials: €3,550. Since 2012, the course is sponsored by Nuffic/NFP: participants from eligible countries receive full financial assistance

Program: MS-TCDC Monitoring and Evaluation
Provider(s): MS Training Centre for Development Cooperation (MS-TCDC) in Arusha, Tanzania
Duration, frequency, location(s): 15-day course offered three times per year near Arusha, Tanzania since 2003
Main topics covered: Developing logframes; designing M&E systems; developing tools for M&E data collection; conducting M&E exercises; doing qualitative and quantitative data analysis; managing information systems; and report writing
Participant information (2013): ~70 participants enrolled yearly; the program does not turn away qualified, paying applicants and tries to add sessions as needed. No participant data available, but targeted participants include practitioners in civil society organizations, government, UN organizations, and the private sector
Cost information*: ~US$139 per day, including lodging. Tuition and lodging: US$2,085. MS-TCDC does not provide scholarships for the M&E course (some other courses are covered)

*Where available, costs reflect only tuition and fees to allow for comparisons across programs to some extent. Costs for travel, room, and board are not included.
**The IPDET training model is replicated in other similar recurring programs (e.g., SHIPDET, EPDET, and PIFED, described in the text).
***The MDF training model is also replicated and delivered independently at some of the MDF branch offices (e.g., Tanzania, Ghana, Vietnam, Colombia, Bangladesh, and Bali).

Each of these face-to-face introductory M&E training programs has been offered regularly with steady enrollment for more than ten years.
However, IPDET is set apart from the others by the sheer volume of participants it serves and the resulting class size: in 2013, the Core Course had 164 participants, nearly double the number of learners reached by any other in-person program on an annual basis.

Some other programs have been established over time based on the IPDET model. The IPDET co-directors provide training in other parts of the world based on the IPDET materials. These programs, listed below, have no formal affiliation with IPDET:

• Shanghai International Program for Development Evaluation Training (SHIPDET). A 6-day course delivered in Shanghai, China (twice per year from 2007 to 2012; once per year currently, serving approximately 60 participants per session).
• European Program for Development Evaluation Training (EPDET). A one-week program delivered annually in the Czech Republic since 2007 that served 56 participants in 2013.

Programme international de formation en évaluation du développement (PIFED) is also based on IPDET, but implemented independently of the IPDET co-directors. It is a two-week program that has been delivered annually in French at the Quebec national school of public administration in Canada since 2010. The program had 81 participants in 2012 and expects to increase to 100 in future years.

Many other face-to-face training programs exist that focus on building foundational skills and knowledge for monitoring and evaluation. 2 Several of these have similar formats (2 to 4 weeks long with a generic focus) but serve a more limited number of participants. Some examples of these offerings are shown in Table 2. These smaller programs appear to belong in one of two categories. First, there are international consulting firms, often headquartered in North America or Europe, that offer multiple course sessions in different regional locations for small groups of learners. An example of this type is IMA International (based in the UK), which has offered generic M&E training for more than a decade but recently developed a new 10-day course that is offered in six locations worldwide. Although the new course served only 39 participants in 2013, the provider expects to scale up enrollment in 2014. Second, there are small locally based programs, often situated at an academic institution, that serve participants mainly from within that institution's region. An example of this type is the Asian Institute of Technology Extension in Pathum Thani, Thailand. A three-week project management, monitoring, and evaluation course has been offered two or three times per year since 2006, and nearly all of the participants are from South Asia, specifically Bangladesh, Pakistan, Nepal, and Sri Lanka.

2 As part of this exercise, a separate matrix will be provided to IEG with the details of all M&E training programs identified.

Table 2. Examples of Smaller Generic Comprehensive M&E Programs with Open Enrollment

Program: Monitoring and Evaluation for Results
Provider(s): IMA International, headquartered in Hurstpierpoint, United Kingdom
Duration, frequency, location(s): New 10-day course, with 6 sessions planned for 2014 in Bangkok, New York, Brighton, Nairobi, Accra, and Kuala Lumpur; some version offered since 2002
Main topics covered: Principles and practice of M&E; qualitative and quantitative approaches to monitoring; logic models and logical frameworks; developing indicators and SMART targets; designing and managing evaluations; using findings for learning
Participant information (2013): 39 participants in 2013 (the only participants turned away are those that do not meet selection criteria). 2013 data: 66% from Africa; fairly even representation from government, NGOs, the private sector, and UN agencies; ~25% women
Cost information: ~US$475 per day. Tuition and materials: £2,900. No financial assistance

Program: Project Management, Monitoring, and Evaluation
Provider(s): Asian Institute of Technology Extension in Pathum Thani, Thailand
Duration, frequency, location(s): 15-day course, offered two or three times per year in Pathum Thani, Thailand since ~2006
Main topics covered: Overview of project planning and appraisal; project M&E; identifying indicators and tools for data collection, with a specific focus on M&E approaches currently used in Thai development projects
Participant information (2013): 25 to 40 participants per year (all sessions). Program estimates: most are from South Asia; most participants are from the public sector; ~40% are women
Cost information: US$233 per day. Tuition and materials: US$3,500. Most participants are funded by their own organizations; those that request a scholarship receive a subsidy of 25% to 50%

Program: Monitoring and Evaluation Systems
Provider(s): SETYM International, headquartered in Montreal, Canada
Duration, frequency, location(s): 15-day course, offered three times per year in Kuala Lumpur, Montreal, and Dar es Salaam
Main topics covered: Monitoring of programs and projects (management); monitoring results (outcomes and impact); and evaluation (lessons learned); developing the indicator base; integrating monitoring processes into an M&E system
Participant information (2013): 25 to 45 participants per year (all open course sessions). Specific data not available, but estimates include: most participants are from the public sector; ~25% are women
Cost information: US$520 per day. Tuition and materials: US$7,800. No financial assistance

Program: Results-based Monitoring and Evaluation
Provider(s): Uganda Management Institute, based in Kampala, Uganda, and the Danida Fellowship Centre (sponsor)
Duration, frequency, location(s): 10-day course, offered one or two times per year in Kampala, Uganda since 2012
Main topics covered: Development perspective of results-based monitoring and evaluation (RBME); RBME within the program management cycle; designing the RBME framework and system; M&E processes; impact evaluation; utilization and sustainability of RBME
Participant information (2013): 20 to 40 participants per year; each session is capped at 20 participants and "a few" applicants are turned away. Program estimates: participants are mainly from Africa, although some participants from South Asia have been selected in the last 2 sessions; more than half of participants are from the public sector, but NGOs are also well represented; no gender breakdown available; all participants have at least bachelor degrees
Cost information: ~US$55 per day (all costs). All essential costs (lodging, travel, food, tuition, materials): DKK (Danish Kroner) 3,000. The course is funded by the Danida Fellowship Programme and some applicants receive fellowships (data not available)

Program: Participatory Monitoring and Evaluation
Provider(s): International Center for Development Communication (ICDC) at Kasetsart University in Bangkok, Thailand
Duration, frequency, location(s): 10-day course, offered once per year in Bangkok, Thailand since 2002
Main topics covered: Learning a cost-effective M&E approach with the continued involvement of the beneficiaries throughout the program cycle; identifying meaningful indicators; using valid measurement and statistical analysis of quantitative data, complemented by qualitative data, to enhance the analysis of the results
Participant information (2013): 11 participants in 2013; no qualified paying applicants are turned away. 80 participants have completed the course since 2002, with characteristics as follows: 22 countries have been represented, with most participants from South Asia (Sri Lanka, Bangladesh, and Bhutan); about 20% come from East Asia and fewer than 10% come from Africa; 33% of all participants to date have been women; development agencies, NGOs, and government ministries are fairly equally represented
Cost information: US$150 per day. Tuition and fees: US$1,500. No financial assistance provided

The cost for enrolling in a comprehensive introductory M&E program ranges from approximately US$55 per day to US$525 per day. While exact cost comparisons across programs are difficult, the highest daily rates for tuition and fees were listed at the time of this review by IPDET (US$525) and the consulting firms SETYM International (US$520), MDF Training and Consultancy (US$489), and IMA International (US$475). In many cases, financial assistance provided by an international development agency or program is available to cover or help defray the cost of participation. For example, both the MDF and Centre for Development Innovation (CDI) programs are heavily supported by the Netherlands Fellowship Programme (NFP) through Nuffic, the Netherlands organization for international cooperation in higher education. In fact, the course leader for CDI estimated that most participants (70 to 80) come from Nuffic-eligible countries and gain full scholarship support each year. Financial support from IEG allows many IPDET Core Course participants to receive needed scholarship assistance, with approximately half of the participants receiving at least some aid in 2013. The two-week course offered regularly by the Uganda Management Institute is heavily funded by the Danida Fellowship Programme and therefore offered to all participants at low cost (US$55 per day for all costs, including travel).
In terms of program quality, IPDET stands out from the other programs for its consistent and transparent efforts to ensure the quality of its offerings, as reflected by the sharing of annual evaluation results on the IPDET website. The Core Course is evaluated annually by an outside evaluator using a standard evaluation framework that includes participant reactions and knowledge tests for learning effectiveness (Trumpower 2012). Annual evaluation results, which have been positive overall, are used to inform plans for program improvements. In addition, a tracer study of participants and in-depth case studies are periodically conducted, with the last set completed in 2010. In contrast, no other comprehensive introductory M&E course identified during this review had publicly available evaluation findings posted on its website or published reports readily available to share. 3

3 The one exception was EPDET, the program offered in the Czech Republic based on the IPDET model, which has annual evaluation reports posted on the Development Worldwide website at http://www.dww.cz/english.php?page=news

When asked to describe their evaluation methods, most programs reported administering end-of-course participant reaction questionnaires. One provider, IMA International, reported also following up with participants 3 to 6 months later.

The vast majority of introductory M&E programs are shorter in duration (2-5 days). Some of these programs are well recognized and adapted for use across the development community. For example, the International Institute for Local Development (IILD) in South Africa plans to offer a 5-day M&E course six times during 2014 in Abuja, Washington DC, Amsterdam, Cape Town, Singapore, and Brazil. The course is modeled on the curriculum developed by the former World Bank Institute Evaluation Group (WBIEG) and facilitated by a former WBIEG manager. Similar content is offered regularly by the same facilitator for the Arab Administrative Development Organization based in Egypt (with recent trainings offered in the United Arab Emirates, Turkey, France, and Morocco) and by other providers on a more intermittent basis.

Shorter introductory M&E programs tend not to be as comprehensive as the programs lasting more than one week, often covering only a subset of the topics addressed in the longer courses or providing a more cursory treatment of key M&E concepts. However, practitioners can piece together shorter offerings from the same or different providers to create a comprehensive executive training program. The Evaluators' Institute (TEI), now situated at George Washington University, is a unique hybrid example of a provider. TEI was specifically created to deliver high-quality instruction for evaluators and offers open-enrollment sessions in the United States in Washington DC, San Francisco, Chicago, and Atlanta. Although individual workshops range from one to four days in length, a participant can piece together workshops for a comprehensive learning opportunity during TEI's two-week summer session. Also, although TEI has demonstrated expertise specifically in development evaluation training and collaborates with development agencies on special initiatives, the sessions regularly held in the U.S. do not necessarily focus on the development evaluation context. TEI enrolls a large number of participants each year (575 for 2013), but only 10 to 12 percent come from outside the U.S.
Thus, with appropriate guidance, a practitioner could gain a comprehensive introduction to monitoring and evaluation through TEI offerings, but this program is difficult to compare to the other prepackaged options developed more specifically for developing country contexts.

Customized Introductory M&E Courses

Many introductory M&E courses serve more homogeneous groups (i.e., trainings specifically targeting government agencies, NGOs, etc.) or are more specialized within a specific sector. Although a systematic search by sector was not possible within the scope and timeline of this review, the general search for monitoring and evaluation training programs indicated that introductory M&E courses are available for a broad range of sectors, with a prevalence of customized courses in particular for the health sector. For example, MEASURE Evaluation, funded by the U.S. Agency for International Development (USAID), has developed a range of regional workshops on M&E in the health sector. These workshops typically serve 15-25 participants at a time and provide a comprehensive overview of M&E for HIV/AIDS programs, malaria programs, or other such areas. Examples of specialized programs for more targeted audiences that are offered via open enrollment are shown in Table 3.

Table 3. Examples of Introductory M&E Courses for More Targeted Audiences

Program title: Monitoring and Evaluation for Water Supply, Sanitation and Hygiene (WASH) Programmes
Providers/main sponsors: International Development and Programme Management Solutions (IDPMS), based in London, UK, with field offices
Duration, frequency, location: 5-day course offered at least 5 times per year. Schedule for 2014 includes courses in Liberia, Somalia, Sudan, and Uganda
Topics covered and focus: Defining M&E; M&E tools; stakeholder analysis; problem tree analysis; developing indicators; purpose and scope of an M&E plan; outcome evaluation; data gathering and analysis; baselines and surveys; disseminating M&E findings
Cost (2013): US$140 per day; tuition: US$700

Program title: Project Monitoring and Evaluation (Legal and Judicial Reform)
Providers/main sponsors: International Law Institute, based in Washington DC, USA
Duration, frequency, location: 2-week course offered once per year in Washington
Topics covered and focus: Results-based management in international development; planning for and executing the M&E process; tools, methods, and approaches; knowledge and learning from the M&E experience
Cost (2013): US$395 per day; tuition: US$3,950

Program title: Monitoring and Evaluation (Health)
Providers/main sponsors: Africa Medical Research Foundation (AMREF), based in Nairobi, Kenya
Duration, frequency, location: 4-week course offered four times per year in Nairobi
Topics covered and focus: Conceptual understanding of M&E; designing and conducting an evaluation; quantitative and qualitative data collection and analysis methods; report writing and presentation; designing the M&E system and change monitoring
Cost (2013): US$70 per day; tuition: US$1,400

Program title: Participatory Monitoring, Evaluation and Learning (Rural Development)
Providers/main sponsors: International Institute of Rural Reconstruction (IIRR), based in the Philippines
Duration, frequency, location: 10-day course offered once per year in Silang, Philippines since 1998
Topics covered and focus: Situational analysis; designing and facilitating the M&E process; strengthening systems for participatory M&E; action planning
Cost (2013): US$180 per day; tuition: US$1,800

Specialized Workshops for Experienced Practitioners

Aside from introductory M&E courses, in-person executive training programs also include a variety of intermediate and advanced workshops. An internet search on development evaluation training indicated a broad proliferation of training opportunities for experienced evaluators.
A brief review of the current and archived training announcements on evaluation associations' websites easily identified dozens of workshop announcements on intermediate and advanced topics. 4 Selected examples from 40 providers were explored during this review, with some examples provided in Table 4.

4 For one example, see the announcements posted by the European Evaluation Society at http://www.europeanevaluation.org/work-opportunities/training-opportunities/training-opportunities-archive.htm

IPDET is one example of a provider offering a range of specialized workshops for experienced practitioners. In addition to the Core Course, IPDET offers 2- and 3-day sessions on a variety of methodological and thematic topics. The Core Course or equivalent is a prerequisite for attending these workshops, which allow participants to drill down deeper into topics of interest, work directly with experts in the field, and learn about new approaches and cutting-edge methods. IPDET changes the mix of workshop offerings each year (26 in 2013) based on participant reactions recorded in the annual evaluations and new developments in the field of evaluation. In addition, IPDET tests participants' interest in potential new, cutting-edge content by offering samples in mini-workshops during parts of the Core Course. The new content areas or practices for which there is clear demand are then developed into full workshops for the following annual session.

Opportunities for evaluators to update or deepen their knowledge and skills vary widely. Available workshops address a broad range of topics, from performance budgeting to communicating results effectively to the use of new technologies in data collection. However, the most frequent focus appeared to be on designing and conducting impact evaluations. The workshops on intermediate and advanced topics also varied widely in duration, but most commonly lasted for 3 to 5 days.

Table 4. Examples of Workshops for Experienced Evaluators

Program title: Impact Evaluation Design
Providers/main sponsors*: Institute of Development Studies in Brighton, UK
Description: 5-day course on how to design an impact evaluation, offered since 2013 (with plans to repeat annually)
Instructional format: Lecture and small group work
Location and frequency: Brighton, UK; annually
Target audience: Experienced researchers and evaluators in a developing country context

Program title: Monitoring and Evaluation for Crisis and Recovery Initiatives
Providers/main sponsors*: Channel Research in Ohain, Belgium
Description: 3-day intermediate course on how to apply M&E skills in a peace-building and conflict prevention context, offered since 2006
Instructional format: Main approach includes sharing participants' case studies and interactive project examples
Location and frequency: Colombo, Sri Lanka; customized versions of the course are offered in different regions several times a year
Target audience: Experienced evaluators carrying out work in peace-building and conflict prevention

Program title: Executive Education: Evaluating Social Programs and Advanced Impact Evaluation
Providers/main sponsors*: The Abdul Latif Jameel Poverty Action Lab (J-PAL) Executive Education, headquartered in Cambridge, MA, USA
Description: 5-day course on impact evaluation and 3-day advanced course on impact evaluation, offered since 2005
Instructional format: Integrated teaching methods by expert researchers, lecture, group work
Location and frequency: Held annually in several locations worldwide: USA, South Africa, Mexico, Belgium, Argentina, and Pakistan
Target audience: Managers and researchers from international development organizations, foundations, governments, and NGOs

Program title: Policy Evaluation and Analysis
Providers/main sponsors*: The Evaluators' Institute at George Washington University in Washington DC, USA
Description: 2-day workshop exploring the types of evaluation likely to be influential in the policy analysis process
Instructional format: Participants develop major components of a professional policy analysis and design a policy evaluation
Location and frequency: Washington D.C. (or other locations in the U.S.); two or three times per year
Target audience: Practicing evaluators and policy analysts

Program title: Mobile Based Data Collection Tools for M&E
Providers/main sponsors*: Indepth Research Services (IRES) in Nairobi, Kenya
Description: 3-day workshop introducing participants to new information and communications technology (ICT) tools for collecting data
Instructional format: Instructor presentations are combined with guided exercises, web-based tutorials, and group work
Location and frequency: Nairobi, Kenya; several times per year
Target audience: Practicing evaluators with a foundational understanding of statistics and M&E

*Most providers of M&E workshops on special topics have a range of offerings covering various topics. One example from each selected provider is described here simply to illustrate the types of offerings available.

In many cases, workshops on special topics are not just for developing technical expertise. Ongoing, well-established training programs also work to foster M&E demand as a central objective. For example, the Abdul Latif Jameel Poverty Action Lab (J-PAL) has regularly held a flagship executive education program on evaluating social programs since 2005. The program is offered by all of J-PAL's independent regional offices, housed at local academic institutions, and serves approximately 250 participants per year. A J-PAL representative explained the objectives of the course for this review, noting that it offers an overview for people who have heard of randomized evaluation. The hope is that participants (from international development organizations, government, foundations, and NGOs) will be able to identify opportunities for when programs should be evaluated and that they can then participate intelligently in discussions about the research design. The course implementers do not intend that participants would actually conduct randomized experiments after completing the 5-day course.

Formal Graduate Degree and Certificate Programs

Academic institutions are a key provider in the landscape of development evaluation education; however, the relevant collection of programs and institutions is difficult to identify or define. In a recent study in the United States exploring how evaluators acquire their skills, Dillman (2013) concluded that "we lack both awareness of how educational experiences are currently offered through graduate education programs nationwide, and an accepted standard of how these opportunities should take form" (p. 280). Indeed, although prospective applicants can consult various resources to find graduate degree programs for major disciplines, there is no clear guide to graduate training programs in evaluation. 5

5 For example, sources of such guides for more common fields of study include Peterson's, the Princeton Review, and U.S. News and World Report.

With the recognition that "evaluators are made, not born, and an extended period of training is necessary," LaVelle and Donaldson (2010) worked to identify university-based evaluation training programs in the United States through an online search and curricular analysis (p. 10).
Although the published research to date had suggested that the number of such training programs had steadily declined since the 1980s, a keyword search allowed the authors to identify 48 master's or doctoral programs in total that offered two or more courses with "evaluation" in the course title. Given that "university programs do not appear to be in decline" and that "the evaluation discipline is gradually becoming distinct," the authors suggested that future studies might "explore the actual effects of the training programs" (LaVelle and Donaldson 2010, pp. 20-21). Similar efforts are underway in other countries to better understand the available opportunities for graduate study in evaluation. For example, the University of Bern has worked since 2004 to identify European university-based study programs via internet searches, interviews, and a survey. The results of these efforts include profiles of 16 European institutions that offer graduate study in evaluation, and this list is currently posted on the European Evaluation Society website to provide guidance for anyone seeking graduate study opportunities (University of Bern 2012; Beywl and Harich 2007).

Traditionally, graduate programs have provided mainly pre-service education, and enrollment in a master's or doctoral program is not always a viable choice for a practitioner looking to update or expand his or her skills. A recent survey of 975 American Evaluation Association (AEA) members revealed that courses related to project management, project logistics, evaluation procedures, and evaluation approaches were most commonly completed through professional development training rather than through graduate degree programs (Christie et al 2013). This finding underscores the need to examine other ways that academic institutions are supporting evaluators' professional development aside from the formal graduate degree programs.

Postgraduate certificate and diploma programs in evaluation are becoming more common worldwide and typically offer working professionals a more flexible alternative to enrollment in a formal degree program for developing core M&E skills. These programs are typically connected to graduate degree programs and require a longer-term commitment than the short-term executive training programs. As shown in the few selected examples in Table 5, these programs usually include a compulsory course on the foundations of program evaluation but otherwise do not share any common format, length, or curriculum. Although certificate programs reflect a growing effort in the United States to formalize evaluation credentials, the increase in programs at universities does not necessarily translate to increased learning opportunities for evaluators working in the World Bank's client countries. For example, the Minnesota Evaluation Studies Institute (MESI) at the University of Minnesota regularly attracts foreign students to its graduate degree programs due to the University's Comparative and International Development Education Track. However, MESI does not recruit candidates residing outside the local metropolitan area for its Program Evaluation Certificate program. The program has served 12-15 participants per year since its start in 1998 but has never had a foreign enrollee.
In contrast, Claremont Graduate University (CGU) typically enrolls 14 students per year in the Certificate in Advanced Study in Evaluation program, with half of them coming from outside the U.S. 6

6 Information for these two examples was collected via telephone interviews with John LaVelle at CGU and Jean King at MESI in November 2013.

Table 5. Examples of Postgraduate Certificate and Diploma Programs in Program Evaluation

Provider(s): Uganda Management Institute in Kampala, Uganda
Program: Postgraduate Diploma in Monitoring and Evaluation
General participant information: 173 participants enrolled for 2013. Target group includes senior and mid-level government officials, development managers, and other professionals covering development trends.
Description: One-year course can be completed via distance learning and/or evening sessions. Course includes 10 required modules to cover key M&E concepts and skills and 2 electives (organizational capacity assessment, statistical software packages, NGO management, or consultancy skills development).

Provider(s): Claremont Graduate University in Claremont, California, USA
Program: Certificate of Advanced Study in Evaluation, offered since 1998
General participant information: Past participants include executives, professionals, and graduate students representing a diverse cross-section of the evaluation community. About 14 students enroll per year, with half coming from outside the U.S.
Description: The program is tailored to a participant's prior training, experience, and career goals and typically requires two or more semesters. It covers 5 key areas: reflective practice, technical practice, situational practice, management practice, and interpersonal practice. The capstone is a competency-based examination. Much of the coursework can be completed via distance learning.

Provider(s): Himgiri Zee University with TESA (Teaching Evaluation in South Asia) in Dehradun, Uttarakhand, India
Program: Postgraduate Diploma in Program Evaluation, offered since 2013
General participant information: Program targets evaluation practitioners in South Asia.
Description: The one-year program includes 8 months of campus learning (required modules) and 4 months of real-world evaluation training in a field setting.

Provider(s): Stellenbosch University in Stellenbosch, South Africa
Program: Postgraduate Diploma in Monitoring and Evaluation Methods, offered since 2006
General participant information: 188 participants completed the program in 2012. For practitioners tasked with the design, management, monitoring, and evaluation of public programs, based in the public sector, private sector, and civil society.
Description: The program is a one-year multimodal course (online course modules plus three contact sessions) that covers the basic principles, concepts, and methods used in M&E studies.

Provider(s): The Centre for Program Evaluation at the University of Melbourne in Australia
Program: Postgraduate Certificate of Evaluation
General participant information: Targeting professionals across Australasia, including graduates in all disciplines.
Description: The program requires 6 months of full-time study (or 1 year part-time), including one compulsory foundation subject and two electives. Most courses are available online.

Provider(s): The Evaluators' Institute at George Washington University, with sessions offered in Washington DC, San Francisco, and Atlanta
Program: Certificate in Evaluation Practice; Certificate in Advanced Evaluation Practice; Certificate in Quantitative Evaluation Methods; Master Evaluator Certificate, offered since 2004
General participant information: Program targets practicing evaluators. Participants in 2013 represented 27 countries.
Description: The Certificate in Evaluation Practice and the Certificate in Quantitative Evaluation Methods each require 30 days of instruction with some required courses. The Certificate in Advanced Evaluation Practice requires an additional 30 days (in addition to completing Evaluation Practice). The Master Evaluator Certificate requires portfolio documentation.

Provider(s): University of Victoria, in Victoria, British Columbia, Canada
Program: Graduate Certificate in Evaluation, offered since 2010
General participant information: 10 to 15 participants enroll each year and typically already have graduate degrees. More than half of students are from Canada (often working as development professionals); others come mainly from Latin America and Eastern Europe.
Description: Completion takes four semesters. Participants earn a diploma rather than a certificate if they complete a capstone project (real-world evaluation). First certificate program in Canada available entirely online.

In 2004, The Evaluators' Institute offered the first certificate program for professional evaluators available entirely through off-campus intensive short courses, meaning that a working practitioner did not need to physically attend programming at an academic institution for several weeks or months before earning a certificate. Currently, three different TEI certificates can be earned through coursework, in addition to the Master Evaluator Certificate, which is portfolio-based (see Table 5).

Certificate programs in evaluation geared to development practitioners have also been developed by organizations outside of academia. For example, Management Systems International, a U.S.-based consulting firm, offers a Certificate Program in Evaluation that is equivalent to a one-semester course at the graduate level at a U.S. university. Since 1997, over 400 developing country professionals have graduated from MSI's program. Findings from follow-up surveys of graduates conducted in 2002 and 2007 were that most graduates were using their evaluation skills to set up evaluation systems in their organizations or to conduct project or program evaluations for organizations in their region.

Regardless of the type of provider, certificate programs are steadily evolving to make use of new technology. Most programs with a website offer at least some courses online, and some programs advertise distance learning methods to reach participants wherever they are. For example, the University of Victoria started offering the Graduate Certificate in Evaluation program entirely online in 2010, which was reported by the program's graduate advisor to be the only such program in Canada.

Other Training Opportunities

Various other programs exist for development practitioners to learn about monitoring and evaluation aside from in-person executive training programs and formal graduate degree and certificate programs. Two common themes that surfaced in particular during this review were the prevalence of programs available through distance learning or online and the growing role of regional (or other) networks in facilitating learning opportunities for evaluators.

E-learning and Distance Learning Programs

An increasing number of programs rely on internet technology and distance learning methods to reach participants wherever they are.
For example, the International NGO Training and Research Centre (INTRAC) started offering a new blended learning course in 2013 that combines trainer-delivered content (via webinar) with coaching (via phone or Skype) and action learning over a six-week period. Examples of such programs for developing core M&E skills are described in Table 6.

Table 6. Examples of Introductory M&E E-Learning Programs

Program title: Monitoring and Evaluation Blended Learning
Providers/main sponsors: International NGO Training and Research Centre (INTRAC), based in Oxford, UK
Duration/frequency: 6-week course offered annually since 2013
Description: Delivered remotely through webinar and an online platform to develop foundational knowledge and skills in M&E. Topics include main M&E terms and concepts; selecting M&E tools for the program and project levels; reflecting on how to improve M&E in participants' organizations (the comparable INTRAC face-to-face course is 5 days)
Cost (2013): £850 (~US$1,390)

Program title: Monitoring and Evaluation
Providers/main sponsors: Capacity Africa Training Institute, based in Nairobi, Kenya
Duration/frequency: 8-week course with 5 to 6 hours required per week, offered once per year since 2013
Description: Online and distance learning course covering an introduction to M&E terms and concepts; developing an M&E framework; the main components of an M&E plan; data collection tools and methods; data analysis; reporting for action and accountability
Cost (2013): US$450

Program title: Monitoring and Evaluation in the NGO Sector
Providers/main sponsors: Human Rights Education Associates, based in Cambridge, MA, USA
Duration/frequency: 6-week course requiring 30 hours, offered at least three times per year
Description: Reading, online working groups, and webinars covering an introduction to M&E, logic models, forming evaluation questions, indicators and measurement, data collection strategies, managing evaluation processes, and reporting results
Cost (2013): US$625

Program title: E-Learning Programme in Development Evaluation
Providers/main sponsors: EvalPartners, under the auspices of the United Nations Children's Fund (UNICEF) and the International Organization for Cooperation in Evaluation (IOCE), with other sponsors
Duration/frequency: Courses are open for a 16-week period three times per year since 2012
Description: For a comprehensive M&E course, participants choose either the introductory course and fixed courses (electives) or the 10 units most relevant for their work. Fixed courses include Equity-Focused and Gender-Responsive Evaluations; National Evaluation Capacity Development for Country-led Monitoring and Evaluation Systems; and Emerging Practices in Development Evaluation, with new modules being added
Cost (2013): Free

The most radical example of the growing emphasis on distance learning and online technology is that of EvalPartners, a global initiative to develop the capacity of Voluntary Organizations for Professional Evaluation (VOPEs). EvalPartners was created under the auspices of the United Nations Children's Fund (UNICEF) and the International Organization for Cooperation in Evaluation (IOCE), with initial funding from the Government of Finland. Under the EvalPartners Initiative, UNICEF, CGU, and IOCE, with support from the Rockefeller Foundation and in partnership with UN Women, started the introductory e-Learning on Development Evaluation program in 2012. The program is free and open to all interested people with internet access anywhere in the world. Courses are open three times a year, and participants either choose the introductory class followed by fixed courses or they can select the 10 units most relevant to their work.
In 2012, almost 13,000 participants from 172 countries registered (reported at http://mymande.org/elearning).

Efforts to increase the accessibility of and participation in development evaluation workshops have led to a growing number of one-day offerings, which are becoming more available online. Part of the rationale for this approach is that busy working professionals have limited time to participate in learning activities, and the use of technology eliminates the costs, time requirements, and logistical demands of travel. Some of these are single offerings provided by one or more organizations to increase understanding in a particular area, but many of the one-day workshops are offered systematically in batches by providers to educate evaluators on a larger scale. An example of this approach is the Professional Development Workshop Series in Evaluation and Applied Research Methods that Claremont Graduate University has offered each summer since 2004. Each workshop lasts one full day. In 2013, 22 workshops were offered, spanning a broad range of topics taught by practitioners from around the globe. Nearly all the workshops in 2013 were offered online as well as in person; they served 350-400 participants in person and an additional 700 participants online. Participants have represented more than 100 countries since the series began.
To this end, CLEAR has established six centers at regional academic institutions in Africa, East Asia, Latin America, and South Asia. Consultations with advisory committees in each region “revealed a specific demand for practical, hands-on capacity development programmes, utilizing case-based approaches, action learning, mentoring, and ongoing engagement” (Khattri and Ordonez 2013). CLEAR Centers are now working to increase the local capacity for supplying these types of learning opportunities. Networks have an instrumental role in building the local demand for monitoring and evaluation capacity, and one increasingly common model for promoting this local demand is the creation of a national M&E learning week, commonly hosted by one or more government ministries or agencies and developed collaboratively with the VOPE and a mix of development partners. For example, the Ministry of State for Planning, National Development and Vision 2030 hosted National Monitoring Week in Kenya to “to create demand for M&E results and use among all stakeholders in the public sector, agencies, and departments; and among non-state actors at both the national and devolved levels” (Described on the Evaluation Society of Kenya website at http://www.esk.co.ke/) Similarly, the Office of the Prime Minister in Uganda organized the National Evaluation Week earlier this year to “strengthen knowledge sharing and awareness of 7 EvalPartners now reports that this number has increased to 158 at http://www.mymande.org/evalpartners/international-mapping-of-evaluation 18 the importance of evidence based evaluation in making policy so as to enhance service delivery by government” (http://opm.go.ug/resource-center/policy-archive/national-evaluation-week-4th- 7th-march-2013.html ). Table 7. Examples of M&E Training Programs Offered or Sponsored by Networks Program Title Network Description Location and Engagement Frequency European Two of the main 6-day face-to-face Introductory M&E course based The Czech Program for sponsors include the on the IPDET model (drawing on same curriculum Republic Development Slovak Evaluation and instructors) Evaluation Society and the Czech Offered once Training Evaluation Society annually (EPDET) CLEAR M&E CLEAR South Asia Sessions in the series are held in person and New Delhi, Roundtable organized the series. videotaped for online viewing. Topics so far have India and Series Collaborators in included innovative techniques and technologies available online selected sessions for data collection, theory of change, evaluation included the methods, equity-focused evaluation methods, Offered at least Community of systematic reviews, scaling up social programs, quarterly Evaluators and 3ie and understanding and measuring women’s empowerment Annual The International The 2-day conference of 90 minute learning Varies--In a Conference of Program Evaluation sessions for evaluators is preceded by a set of pre- CIS member the International Network (IPEN), the conference workshops. Topics vary each year, but country. In Program association for CIS in 2013 these included evaluation for beginners, 2013, in Evaluation countries RUFDATA method for the design of evaluation Chisinau, Network polieis and terms of reference, evaluating Moldova advocacy programs, and new directions in evaluation Held annually The Changing Landscape of Evaluation Education The selected examples of M&E training spotlighted throughout this report reflect only a few of the many programs identified during the review. 
These examples were shared to illustrate the types and variety of programs available and are useful for exploring the current landscape of M&E training. However, it is important to note the limitations of this study before drawing conclusions. First, the review focuses on the supply side of M&E training and therefore provides limited information about demand. Second, the search methods used were most likely to identify programs announced on a website or listed by an evaluation association. These methods therefore likely biased the findings toward international development agencies, academic institutions, and consulting firms, often headquartered in North America and Europe, rather than smaller, more locally based programs. Finally, this review of regularly recurring programs did not identify and include the broad range of sporadic, non-regularly scheduled training programs and other special initiatives that development partners launch to support local evaluation capacity development.

Despite these limitations, the general categorization of programs and the placement of their prominent characteristics along a timeline allow for the identification of emerging trends and the characterization of the changing landscape. Based on these data, major and often interrelated trends in M&E training include the following:

• Generic M&E training. Prior to the 1990s, the evaluation of international development projects often relied on external international consultants, brought into the local context to ensure accountability. The surge in programs for broadly developing foundational knowledge and skills in M&E coincided with the growing emphasis on developing local M&E capacity. All of the major generic M&E programs identified during this review were first established between 1999 and 2003. Without exception, their objectives include not only developing knowledge and skills but also building a local understanding of the value of monitoring and evaluation in order to foster local demand.

• Increased attention to the participation of civil society. Part of the demand for generic M&E training for a multi-stakeholder audience has been fueled by the increasing recognition of the need to empower and engage local civil society organizations. Organizations specifically targeting NGOs, such as INTRAC, have continued to scale up their M&E offerings to build this capacity. The number of VOPEs has increased sharply since 2003.

• The rise of specialization. Introductory training programs on monitoring and evaluation appear to be becoming increasingly specialized, with many programs focusing on practices and methods related to a particular sector or specific country context (e.g., fragile and conflict-affected states, post-disaster settings). With the exception of the health sector, where there is a longer history of specialization, most of these more specialized programs appear to have been established since 2006.

• Hot topics. The available education programs span a broad range of topics, but certain themes dominate the roster of recurring programs listed by evaluation associations and listservs. Dominant themes that emerged during this review included the use of impact evaluation to shape evidence-based policy and the use of technology for data collection and analysis. An increasing number of programs also appear to focus on addressing complexity in evaluation through methods such as outcome mapping.
• E-learning. Amid continuing concern about the need to increase access to training opportunities for local development practitioners, providers increasingly seek to harness the potential of the internet. Academic degrees and M&E workshops are becoming more widely available online. The OpenCourseWare concept is gaining followers, and EvalPartners was launched in 2012 to provide free learning opportunities worldwide. J-PAL plans to begin offering its executive education course on evaluating social programs free online via edX in 2014.

• Decreasing costs. For those local development practitioners who might most benefit from M&E training, the costs of participation remain the greatest barrier. These costs reflect not only tuition fees but also the need to travel and to miss valuable workdays in an organization that cannot spare any capacity. This reality drives many of the current trends: shortening workshops to reduce time away from the office, using technology to remove the need to travel, and specializing content to address the needs of learners more efficiently.

With the use of technology and broadening access to relatively generic M&E content, discussions are increasingly focusing on how to customize content for the local context and how to effectively develop the M&E capacity of local institutions. The representative from J-PAL interviewed for this review aptly described his vision for future M&E training: "M&E training needs to be for organizations rather than individuals. So the open course model now being developed for individuals should quickly move to rapid customization--for example a course for a democracy and governance group in USAID. When one or two people from an organization take an M&E training, there is less likelihood of a culture shift. Instead, the M&E training should be customized and part of the whole unit's professional development."

Despite recognized shifts and visions for the future of M&E training, the evidence is still limited as to whether e-learning or a sequence of shorter targeted workshops over time can yield the same learning outcomes and behavior changes as an in-depth intensive program that requires participants to exit their current work environment in order to learn.

References

Beywl, W. and K. Harich (2007). University-Based Continuing Education in Evaluation: The Baseline in Europe. Evaluation, 13(1): 121-134.

Christie, C., P. Quinones, and L. Fierro (2013). Informing the Discussion on Evaluator Training: A Look at Evaluators' Course Taking and Professional Practice. American Journal of Evaluation.

Cousins, J., C. Elliott, and N. Gilbert (2010). IPDET Evaluation of Program Impact. Volumes 1 and 2. Centre for Research on Educational and Community Services: University of Ottawa.

De Silva, S., R. Goyal, N. Kalimullah, and A. Akundy (2013). Teaching Evaluation in South Asia: Collaboration among Voluntary Organizations for Professional Evaluation (VOPEs), Academia, and Development Partners. In Rugh, J. and M. Segone (eds.), Voluntary Organizations for Professional Evaluation (VOPEs): Learning from Africa, Americas, Asia, Australasia, Europe, and Middle East. UNICEF.

Dillman, L. (2013). Evaluator Skill Acquisition: Linking Educational Experiences to Competencies. American Journal of Evaluation, 34(2): 270-285.

Khattri, N. and X. Ordonez (2013). The Role of the CLEAR Initiative in Country Evaluation Capacity Development. In Rugh, J. and M. Segone (eds.), Evaluation and Civil Society: Stakeholders' Perspectives on National Evaluation Capacity Development. UNICEF.
Kosheleva, N. and M. Segone (2013). EvalPartners and the Role of Voluntary Organizations for Professional Evaluation in the Development of National Evaluation Capacity. In Rugh, J. and M. Segone (eds.), Voluntary Organizations for Professional Evaluation (VOPEs): Learning from Africa, Americas, Asia, Australasia, Europe, and Middle East. UNICEF.

Lavelle, J. and S. Donaldson (2010). University-Based Evaluation Training Programs in the United States 1980-2008: An Empirical Examination. American Journal of Evaluation, 31(1): 9-23.

Ofir, Z. (2013). Strengthening Evaluation for Development. American Journal of Evaluation, 34(4): 582-586.

Regional Centers for Learning on Evaluation and Results (CLEAR) Initiative (2013). Demand and Supply: Monitoring, Evaluation and Performance Management Information and Services in Anglophone Sub-Saharan Africa. A Synthesis of Nine Studies. Washington, DC: World Bank.

Rugh, J. (2013). The Growth and Evolving Capacities of VOPEs. In Rugh, J. and M. Segone (eds.), Voluntary Organizations for Professional Evaluation (VOPEs): Learning from Africa, Americas, Asia, Australasia, Europe, and Middle East. UNICEF.

Trumpower, D. (2012). Evaluation Report: The International Program for Development Evaluation Training (IPDET). Centre for Research on Educational and Community Services: University of Ottawa.

University of Bern (2012). European University-Based Study Programmes in Evaluation: Sixteen Profiles. Bern, Switzerland: Center for University Continuing Education.