103694 VERIFICATION OF PERFORMANCE IN RESULTS-BASED FINANCING (RBF): THE CASE OF AFGHANISTAN DISCUSSION PAPER AUGUST 2015 Cheryl Cashin Lisa Fleisher Tawab Hashemi VERIFICATION OF PERFORMANCE IN RESULTS-BASED FINANCING (RBF) The Case of Afghanistan Cheryl Cashin, Lisa Fleisher, Tawab Hashemi August 2015 Health, Nutrition and Population (HNP) Discussion Paper This series is produced by the Health, Nutrition, and Population Global Practice. The papers in this series aim to provide a vehicle for publishing preliminary results on HNP topics to encourage discussion and debate. The findings, interpretations, and conclusions expressed in this paper are entirely those of the author(s) and should not be attributed in any manner to the World Bank, to its affiliated organizations or to members of its Board of Executive Directors or the countries they represent. Citation and the use of material presented in this series should take into account this provisional character. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries For information regarding the HNP Discussion Paper Series, please contact the Editor, Martin Lutalo at mlutalo@worldbank.org or Erika Yanick at eyanick@worldbank.org. RIGHTS AND PERMISSIONS The material in this work is subject to copyright. Because The World Bank encourages dissemination of its knowledge, this work may be reproduced, in whole or in part, for noncommercial purposes as long as full attribution to this work is given. Any queries on rights and licenses, including subsidiary rights, should be addressed to the Office of the Publisher, The World Bank, 1818 H Street NW, Washington, DC 20433, USA; fax: 202-522- 2422; e-mail: pubrights@worldbank.org. © 2015 The International Bank for Reconstruction and Development / The World Bank 1818 H Street, NW Washington, DC 20433 All rights reserved. ii Health, Nutrition and Population (HNP) Discussion Paper VERIFICATION OF PERFORMANCE IN RESULTS-BASED FINANCING (RBF): The Case of Afghanistan Cheryl Cashina Lisa Fleisherb Tawab Hashemic a Consultant b Consultant c Senior Health Specialist, the World Bank Abstract: The Ministry of Public Health in Afghanistan implements a supply-side results-based financing (RBF) scheme to improve the provision of a standardized basic package of health services (BPHS) to its population. Contracting NGOs and “contracting-in” MOPH providers, this RBF program concentrates on high-priority maternal and child health services in the BPHS such as antenatal care (ANC), post-natal care (PNC), delivery care, nutrition, immunization coverage, tuberculosis (TB), as well as quality of care. It includes an intensive data verification method, focusing on both the quantity and the quality of delivered services, which was implemented between 2010 and 2013 by international third party organizations. This verification method was specifically used to ensure that providers reach performance thresholds and disburse performance payments. This case study describes the major characteristics of this RBF verification method. Taking stock of the experience of the MOPH, it aims at generating possible lessons for other RBF initiatives, thereby expanding knowledge and making RBF verification processes more efficient, sustainable and effective. 
This case study also responds to concerns about the future sustainability of RBF, particularly with regard to the intensive and external nature of verification: can (and should) the intensive verification process be sustained, or will it need to evolve further to match the institutional capabilities of the contracted organization?

Keywords: Results-based financing, Afghanistan, Health, Verification

Disclaimer: The findings, interpretations and conclusions expressed in the paper are entirely those of the authors, and do not represent the views of the World Bank, its Executive Directors, or the countries they represent.

Correspondence Details: Petra Vergeer, The World Bank, 1818 H Street, NW, Washington D.C. 20433, USA; email: pvergeer@worldbank.org

Table of Contents

ACKNOWLEDGMENTS
PART I – INTRODUCTION:
Methodology
PART II – BACKGROUND:
2.1 COUNTRY CONTEXT
PART III – MAJOR CHARACTERISTICS OF THE VERIFICATION METHOD:
3.1 VERIFICATION TEAM
3.2 DATA SOURCES AND FLOWS
3.3 VERIFICATION OF THE QUANTITY OF SERVICES DELIVERED AT THE FACILITY LEVEL
PART IV – FINDINGS FROM THE APPLICATION OF THE VERIFICATION METHOD:
4.1 DEFINITION OF ERROR
4.2 RESULTS OF VERIFICATION FOR THE QUANTITY OF SERVICES
4.3 RESULTS OF VERIFICATION FOR THE QUALITY OF SERVICES
PART V – USE OF VERIFICATION FINDINGS
5.1 OTHER USES OF VERIFICATION DATA
PART VI – VERIFICATION COSTS
PART VII – CHALLENGES
7.1 CHALLENGES RELATED TO PATIENT TRACING
7.2 CHALLENGES RELATED TO QUALITY VERIFICATION
PART VIII – LESSONS
PART IX – CONCLUSION AND RECOMMENDATIONS
ANNEX 1. NATIONAL MONITORING CHECKLIST
ANNEX 2. CONSENT FORM
ENDNOTES AND REFERENCES

ACKNOWLEDGMENTS

The authors of this study extend their gratitude to the many health sector stakeholders who participated in the study and were interviewed for their contributions. Special gratitude also goes to all persons who have reviewed the study, especially Petra Vergeer, Senior Health Specialist, the World Bank Group. The authors gratefully acknowledge financial support from the World Bank Health Results Innovation Trust Fund (HRITF). Finally, the authors are extremely grateful to the World Bank Group for publishing this report as an HNP Discussion Paper.

PART I – INTRODUCTION:

Paying health care providers performance-based incentives is one form of results-based financing (RBF). In RBF programs, verifying that providers have reached performance thresholds is a crucial part of program implementation and key to maintaining the transparency, fairness, and viability of the programs. For the purpose of this case study, verification is defined as the first-order substantiation of results paid for in RBF. Verification is particularly important in settings such as Afghanistan, where routine health information systems are still being developed and governance structures are still taking root. The objectives of this case study are to provide a detailed description of the process for verifying the data on the quantity and quality of health services under the Afghanistan MOPH's RBF program as it existed between 2010 and 2013 and to generate possible lessons for other RBF initiatives to make verification processes efficient, sustainable, and effective in supporting overall performance improvement and health system strengthening. This case study is part of a broader analysis of multiple country case examples of RBF verification to expand knowledge about the verification process and practices to address the immediate design and implementation needs of RBF programs.

In Afghanistan, the contracting of health services has been leveraged to re-establish the Ministry of Public Health (MOPH) as the steward of the health system after severe deterioration of the health sector during the Taliban period. Through the contracting mechanisms, the MOPH aims to ensure that both non-governmental organizations (NGOs) and MOPH providers deliver a standardized basic package of health services (BPHS) with access for the entire population. The MOPH of Afghanistan has been using RBF as part of its contracting of health services delivered both by NGOs, which sub-contract with individual health facilities, and 'contracted-in' MOPH providers in approximately one-third of the country's provinces since 2010. The RBF component of the contracts focuses attention further on high-priority maternal and child health services in the BPHS such as antenatal care (ANC), post-natal care (PNC), delivery care, nutrition, immunization coverage, and tuberculosis (TB), as well as quality of care. Since the end of the Taliban period, Afghanistan has achieved remarkable progress, particularly in the health sector. The contracting arrangement has contributed to this progress. This case study of the verification process in Afghanistan's RBF program provides an opportunity to take stock of the experience and draw lessons.
The verification process has evolved as the focus of RBF expanded to include individual health facilities as well as the performance of NGOs and Provincial Health Offices (PHOs) as the provincial-level ‘implementers’ of the BPHS, and to include quantity as well as quality of services delivered. While the World Bank project that supports Afghanistan’s RBF program and the verification process have changed over time to address the shifting priorities in the health sector, questions about the sustainability of the program overall and the verification component linger. Respondents to interviews conducted as part of this case study raised questions about the sustainability of the verification process given its intensive approach and that it was supported by an international third- party organization paid by donor funds. An important question is whether the intensive verification process can (and should) be sustained or will need to evolve. This case study is organized as follows. Section 2 provides background information on the Afghanistan country context and the history of the RBF approach in Afghanistan. Section 3 describes the major characteristics of the verification method. The findings from the application of the verification method are discussed in Section 4. Sections 5 and 6 discuss the use of the verification findings and costs associated with the verification process, respectively. Section 7 describes challenges and lessons. Section 8 offers recommendations and conclusions. 7 Methodology The methods for carrying out this case study included conducting a desk review of available documents related to the RBF program in Afghanistan and in-depth, semi-structured interviews with a total of 13 respondents. An initial group of three interview respondents was purposively sampled based on recommendations provided by the World Bank task team. Snowball sampling was then used to identify and interview additional respondents. Interviews about the quantity verification process were conducted with seven respondents, including representatives of the MOPH, the third party verifier, and the World Bank by telephone or Skype between January and April 2013 and usually lasted for one hour. In some cases informants provided written responses to key questions and supplementary materials. Data on the verification of quality was collected through semi-structured in-depth interviews with six additional respondents by a local consultant based in Kabul during January and February, 2014. These respondents were also identified by purposive sampling based on the expertise of the World Bank task team and the local consultant. The data obtained from interview respondents was triangulated across and between respondents and with information obtained from the desk review to ensure validity. In particular, the consistency of factual data from the desk review was cross-checked with respondents. In addition, preliminary drafts of this case study were reviewed by the World Bank task team in Washington, DC and Kabul and feedback was incorporated at multiple stages. Although information dating back to 2002 is presented in the Background section to provide context, this case study focuses on the period between 2010 and 2013. This time boundary is important because World Bank support for the RBF program was renewed in February 2013 and the third-party verifier is no longer an international organization. 
Thus, for the sake of consistency and given the retrospective nature of this case study, the information presented herein is described in the past tense regardless of whether elements of the program as it existed from 2010-2013 carried through to the program as it exists today. The information presented below is drawn primarily from the findings obtained from the 13 interview respondents, and to a lesser extent, from the information obtained from the desk review.

PART II – BACKGROUND:

2.1 COUNTRY CONTEXT

Afghanistan has a long history of poor health outcomes and health service coverage, particularly in rural areas. Relative to other countries in the South Asia Region (SAR) and to other low-income countries (LICs), Afghanistan has lagged behind for decades and has some of the worst health statistics in the world (Table 1). However, there have been some important gains in life expectancy and reductions in infant, child, and maternal mortality.

Table 1. Health Outcome Indicators in Afghanistan Relative to South Asia and LICs

Indicator | Afghanistan 1990 | Afghanistan 2013 | South Asia 1990 | South Asia 2013 | Low-income 1990 | Low-income 2013
Health expenditure per capita, PPP | - | 142.8 | - | 192.6 | - | 92.2
Life expectancy at birth, total (years) | 48.6 | 60.9 | 58.9 | 66.9 | 53.4 | 62.0
Mortality rate, infant (per 1,000 live births) | 121.3 | 70.2 | 91.8 | 44.6 | 104.7 | 52.9
Mortality rate, under-5 (per 1,000 live births) | 179.1 | 97.3 | 129.4 | 56.6 | 166.6 | 76.3
Maternal mortality ratio | 1300 | 460 | 620 | 220 | 810 | 410
Fertility rate, total (births per woman) | 7.7 | 4.9 | 4.17 | 2.6 | 5.7 | 4.0
Physicians (per 1,000 people) | 0.1 | 0.3 | 0.4 | - | 0.1 | -
Nurses and midwives (per 1,000 people) | - | - | - | - | - | -

Source: World Bank World Development Indicators. Data on under-five mortality are from the Afghanistan Mortality Survey 2010 and are adjusted to take into account the omission of data from the South Zone. See http://www.measuredhs.com/pubs/pdf/FR248/FR248.pdf, p. 103 for more information. Note: - indicates data not available.

Before 2001, health services were provided by a poorly-coordinated group of health facilities that were managed or supported mainly by NGOs. These NGOs operated largely outside the supervision and regulation of the Government of Afghanistan, including the MOPH. NGO health facilities were located disproportionately near urban or more secure rural areas, leaving the 80 percent of Afghanistan's population living in rural areas largely un-served. In addition to a misdistribution of health facilities, the gender mix and training of the health workforce was not aligned with the needs of the population. Female providers were needed to provide maternal and reproductive health care, but only one rural facility in four had a single trained female provider on staff. The health needs of the population were squarely in the realm of public health, but providers often specialized in other fields. The irrational distribution of health facilities and lack of clear population catchment areas resulted in duplication, inefficiencies, and a lack of accountability for results.

2.2 RBF APPROACH IN AFGHANISTAN

After the fall of the Taliban in 2002, the MOPH asserted its stewardship role starting with the development of a basic package of health services. The MOPH had two goals for the BPHS: (i) develop a standardized package of services that would form the core of primary health service delivery, and (ii) promote equitable access to health services. The BPHS consists of seven elements, six of which are areas of high-priority primary healthcare1.
The seventh element, the regular supply of essential drugs, is a health system-related element that is critical to ensure the successful delivery of the six health service elements. The delivery of the BPHS is the foundation upon which results-based financing is implemented in Afghanistan. Given the long-running history of NGOs providing health services, the MOPH decided to contract the delivery of the BPHS out to NGOs in 2002, except in the three provinces where the MOPH implemented the BPHS. All major donors in the health sector endorsed the approach. The first contracts and grants were signed in 2003. Delivery of the BPHS is supported by three donors in all of Afghanistan’s 34 provinces: the World Bank finances 11 provinces and Kabul City; the United States Agency for International Development (USAID) finances 13 provinces; and the European Commission (EC) finances 10 provinces. Donor support has several common features, including that it is for competitively-selected NGOs that have an assigned catchment area and are responsible for the delivery of the BPHS. However, the specific nature of the contracting mechanism varies by donor. Contracts supported by the European Commission (EC) and the United States Agency for International Development (USAID) are cost-reimbursement contracts against budgeted line items, although if deliverables outlined in contracts supported by USAID are not met, payment can be withheld. EC contracts do not include any type of monetary or non- monetary performance-based incentive. There are two types of contracts supported by World Bank funding. In one type of contract, NGOs are contracted to provide the BPHS. In the other type of contract, used in three provinces surrounding Kabul, the MOPH delivers services through a Strengthening Mechanism (MOPH-SM), which is a type of contracting-in model that gives greater autonomy to MOPH providers. Both types of contracts are subject to the National Salary Policy, so staff working in facilities in the MOPH-SM provinces as well as those working in other NGO contracted provinces have comparable salaries. In both types of contracts, a performance-based payment is made when contracted parties achieve improvements in maternal and child health indicators above a baseline level. These RBF-based contracts are supported by the World Bank in 14 provinces, in which the implementation of the BPHS and essential package of hospital services (EPHS) is supported by different donors. The delivery of EPHS is also parceled out by geography among its several supporters. While it is not the focus of this case study, the EPHS has important linkages with the BPHS through referral mechanisms between primary care and hospital services that should be strengthened over time. RBF is implemented in five provincial hospitals that are delivering the EPHS. The indicators of performance at the hospital level are focused on quality of care and payment is made based on an index of infection prevention measures, an overall mean score for performance on the balanced scorecard (described below), and treatment for severe acute malnutrition. Results are preliminary, with only one data point available at the time of this report. 10 2.2.1 2004–2009 Over the period 2004-2009, the World Bank worked with the Government of Afghanistan to design and implement a contracting mechanism with NGOs to provide the BPHS that included a performance-based component. 
The contracts in the 11 provinces supported by the World Bank were termed “performance-based partnership agreements” (PPAs) and provided a lump sum to NGOs to provide the BPHS with contract renewal and bonus payments subject to performance. Performance was measured using a balanced scorecard (BSC), among other tools such as HMIS data, to assess quality at the provincial level. An international firm was contracted as the third-party monitoring firm and assessed NGO performance against the balanced scorecard nationwide. The firm also led the process of developing the balanced scorecard and supporting manuals and tools. The information on the balanced scorecard was based on an annual assessment of a sample of more than 700 randomly-selected health facilities with a maximum of 25 from each province, including three district hospitals, seven comprehensive health centers, and 15 basic health centers. The sampling frame varied from year to year because of the addition (i.e., construction) of new facilities, the existence of fewer functional facilities than specified by the sampling frame, and poor security, which prevented survey teams from reaching some facilities. The third party firm used quality control mechanisms to ensure the data were valid and reliable2. For example, a small sample of health facilities in each province was selected for a second visit by a different survey team than the original team. This second team would complete the same assessment as the first team and the two sets of assessments were then compared. Performance against the balanced scorecard was used as one input into decisions about contract renewal, payment of bonuses, and for dialogue with NGOs to address obstacles to performance improvement. NGOs could earn a bonus of 1 percent of the contract amount if there was a ten percentage point (or more) increase over the highest score on the balanced scorecard, as measured by the third party firm, in the province served by the NGO. At the end of the contract, NGOs were eligible for an additional bonus of 5 percent of the contract value if they had achieved a 50 percentage point increase in the combined score of indicators listed in their terms of reference, as measured by household and health facility surveys. There is some evidence that the NGO contracting and performance-based payments did lead to improved performance of contracted NGOs34. Early results from the application of the balanced scorecard indicated that it was a useful tool to standardize the monitoring of results across different providers. The balanced scorecard also highlighted specific areas in need of improvement as well as provinces in need of more attention and focus5. Subsequently, during the period from 2010- 2013, the balanced scorecard was used for monitoring and evaluation in the health sector and no longer linked to contracting and payment. 2.2.2 2010–2013 By 2009, there was significant improvement in performance on the indicators tracked by the balanced scorecard. In fact, all benchmarks were reached by 2008. Nonetheless, coverage of priority maternal and child health services continued to lag, and service utilization remained low. For example, while the contraceptive prevalence rate (CPR) increased from 5 percent in 2003, according to the Multiple Indicator Cluster Survey (MICS), to 16 percent in 2006, according to the Afghanistan Health Survey (AHS), the absolute levels of CPR were still very low. 
In addition, administrative data indicated that the trend in service utilization had plateaued and in some cases decreased in 2009. For example, while over 60 percent of 12-23-month-old children received DPT1, there was a 12 percentage point drop in DPT2 and a 14 percentage point drop in DPT3, to 34.6 percent. The MOPH wanted to address the issue of underutilization of services and initiated a new phase of results-based financing, which involved paying a kind of fee-for-service performance bonus based on the additional volume of services delivered, conditional on the quality of care. Payments were made for services above and beyond a certain threshold (baseline). There was still a performance incentive at the provincial level for the managing NGO, which could earn up to ten percent of the payments made to the health facilities it managed if the service utilization data could be verified within the acceptable margin of error.

In 2010, a new RBF program was initiated and supported by the World Bank, which added a health facility-level component to what had previously been provincial-level performance incentives related mainly to structural aspects of quality. In addition, the basis for incentive payments changed: facilities were paid based on marginal increases in the quantity of services delivered above a baseline, conditional on quality measured using the National Monitoring Checklist (NMC). The NMC was developed by the MOPH and is a short health facility survey to track quality of care in specific health facilities (see Annex 1 for the NMC)6. This change in the RBF program was intended to drive further improvements in the coverage and delivery of maternal and child health services. An international third-party firm was selected through a tender process to verify the reported results as well as to conduct the impact evaluation.

Some interview respondents indicated that the introduction of the health facility-level quantity component to the RBF program also was linked to broader changes in the socio-political climate in Afghanistan at that time, as well as to the availability of funding from the Health Results Innovation Trust Fund (HRITF) managed by the World Bank. In addition, a new government had taken power in Afghanistan, and some respondents indicated that there was a perception that the civil service was becoming less performance-based and more politicized. RBF was seen by some as a way to maintain transparency and accountability during the transition. There was an interruption in the procurement process during that time (2009-2010), and further disruption was created by the long process of appointing a new Minister of Health, which took nearly three years. Furthermore, a major change in health financing policy was instituted as user fees were banned at public health facilities. These events in Afghanistan coincided with a movement in the World Bank to increase the knowledge base around RBF, and therefore to encourage and fund pilot activities such as those in Afghanistan.

2.2.3 Current performance domains, indicators, and incentive payments

The 11 indicators eligible for the performance bonus fell within two of the BPHS elements: maternal and newborn health, and child health and immunization. This reflected the policy priority of making progress toward Millennium Development Goals (MDGs) 4 and 5 through the implementation of RBF (Table 2).
Health centers were paid a quarterly per-service bonus payment for each service delivered in the first nine indicator areas above the baseline estimate and contingent on quality. Thus, only nine of these 11 indicators were verified at the community and health facility level: the tenth and eleventh indicators, for facility-based delivery and treatment of malnourishment, respectively, were indicators for hospitals, not health centers, and were verified only at the hospital level. Per-service payments were adjusted by the quality score achieved using the National Monitoring Checklist. For example, if the health facility received $1,250 per quarter based on quantity of services and it scored 75 percent on the NMC, then it would receive an actual payment of $938 for that quarter.7 A short illustrative sketch of this adjustment follows Table 2.

Table 2. Indicators for Performance Bonuses in the Afghanistan RBF Program

(1) First visit for skilled ANC: Number of additional pregnant women over the baseline number who saw a skilled provider the first time for ANC in the catchment area of the HF for the reference period of interest.
(2) Second visit for skilled ANC: Number of additional pregnant women over the baseline number who saw a skilled provider a second time for ANC for the same pregnancy in the catchment area of the HF for the reference period of interest.
(3) Third visit for skilled ANC: Number of additional pregnant women over the baseline number who saw a skilled provider a third time for ANC for the same pregnancy in the catchment area of the HF for the reference period of interest.
(4) Fourth visit for skilled ANC: Number of additional pregnant women over the baseline number who saw a skilled provider a fourth time for ANC for the same pregnancy in the catchment area of the HF for the reference period of interest.
(5) Skilled attendants at delivery: Number of additional pregnant women over the baseline number who used a skilled provider (doctor or midwife) for delivery (at facility or at home) in the catchment area of the HF for the reference period of interest.
(6) First visit for PNC: Number of additional deliveries over the baseline number in the catchment area of the HF that received one PNC visit from a trained attendant (at facility or at home) within 6-12 hours of birth for the reference period of interest.
(7) Second visit for PNC: Number of additional deliveries over the baseline number in the catchment area of the HF that received the second PNC visit from a trained attendant (at facility or at home) for the same delivery within 6 days of birth for the reference period of interest.
(8) Children getting their third dose of DPT before their 1st birthday: Number of additional children over the baseline number receiving DPT3 before their first birthday in the catchment area of the HF in the reference period of interest.
(9) Tuberculosis case detection: Number of additional new smear-positive cases notified over the baseline number for the reference period of interest.
(10) Deliveries occurring within hospital: Number of additional women over the baseline number who delivered (not including complications) in an institution during the reference period.
(11) Successful treatment of severely malnourished children: Number of additional severely malnourished children successfully treated over the baseline number for the reference period of interest.
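To make the quality adjustment described above concrete, the short Python sketch below reproduces the arithmetic of the example in the text ($1,250 earned on quantity and a 75 percent NMC score yielding roughly $938). The function name, rounding rule, and data types are illustrative assumptions; the program itself managed these calculations through its own databases and spreadsheets, not through code like this.

```python
# Illustrative sketch only: quality adjustment of a quarterly quantity-based bonus.
# The function name and rounding behavior are assumptions for illustration.

def quality_adjusted_payment(quantity_bonus_usd: float, nmc_score_pct: float) -> int:
    """Scale the quantity-based bonus by the facility's NMC quality score (0-100)."""
    if not 0 <= nmc_score_pct <= 100:
        raise ValueError("NMC score must be between 0 and 100 percent")
    return round(quantity_bonus_usd * nmc_score_pct / 100)

# Example from the text: $1,250 earned on quantity, 75% NMC score -> ~$938.
print(quality_adjusted_payment(1250, 75))  # 938
```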
2.2.4 Financial management and disbursement arrangements

The Health Economics and Finance Department (HEFD) of the MOPH was responsible for overall management of the RBF program. Its duties included overall contract and project management, approving payments, and coordinating with other relevant departments involved in the program. The development budget department of the MOPH was responsible for financial management and making payments to NGOs. The Financial Management manual prepared for the World Bank-supported Strengthening Health Activities for the Rural Poor (SHARP) project was also used for the financial management of the RBF program.

Payments under the RBF program were based on signed performance contracts between the MOPH and the NGOs. The performance contracts delineated each party's responsibilities and procedures for procurement and financial management. Contracts also defined performance indicators that, when verified, served as triggers for the release of performance payments, after the initial tranche payment was made based on the contract. Performance payments were calculated as a fixed rate per service for the different services monitored. Payment rates increased significantly over the course of the program, almost doubling for some services. A third-party assessor was appointed to verify health facilities' performance claims every quarter. This verification process was overseen by HEFD. NGOs were required to ensure that their financial statements were audited annually, and the audit reports were submitted to the MOPH. Furthermore, the MOPH could also appoint its own auditors to review the financial transactions of any NGO. Financial reports from the NGOs were submitted to HEFD quarterly. The per-service bonuses were awarded only after careful verification of both the quality and quantity of services. The verification method and process are described in more depth in the next section. The remainder of this case study focuses primarily on the RBF program in place since 2010.

PART III – MAJOR CHARACTERISTICS OF THE VERIFICATION METHOD:

As stated above, prior to 2010, the RBF program in Afghanistan focused on measuring the quality of services delivered by the NGO at the provincial level using the balanced scorecard, and the HMIS was used to track the quantity of services delivered. There were quality control measures in place to assess the validity of the data obtained from the balanced scorecard (described above). In 2006, results from the first household survey following the fall of the Taliban indicated substantial discrepancies between (quantity) results reported in the HMIS and services actually received by households. Interview respondents indicated that concerns about these results, as well as stagnating trends in utilization, lingering concerns about the quality of care and gaming, provided the motivation for implementing a strong verification system at the facility and community levels concomitantly with the new RBF program in 2009. In addition to performance and process monitoring, which were carried out by various departments in the MOPH at the national and provincial level, verification of reported results was conducted by a third party firm.

Verification of reports in Afghanistan involved verification of both the quantity and quality of services delivered. There were three core features to the verification method in Afghanistan: verification of the quantity of services provided at facilities; verification of the quantity of services received by communities; and verification of the quality of services provided. The process of verification in Afghanistan's RBF scheme worked as follows.
The health facility submitted a report to the NGO through the HMIS detailing the quantity of services it provided. The NGO compiled the reports from all of its contracted health facilities and submitted them to the MOPH. The quality of these services was then scored on a quarterly basis by the PHO and NGO using the National Monitoring Checklist. The balanced scorecard was used to triangulate the results reported on the NMC at the provincial level. The team from the third party firm verified the provision of services at the facility and community levels on a quarterly basis to ensure that patients actually received the services that facilities reported were provided. Bonus payments were approved by the MOPH based on the verification reports. The NGOs received an advance payment every six months or so based on their account balance (they were expected to have funds in their account equivalent to two quarters of payments). NGOs made payments directly to facilities they managed through a variety of sub-contracting mechanisms once confirmation was received by HEFD. The contracted-in MOPH facilities were also paid on a quarterly basis by the finance department, after approval by HEFD, through the Provincial Health Office.

3.1 VERIFICATION TEAM

The third party firm verified the quantity of services. The third party reported its findings directly to the HEFD and Monitoring and Evaluation (M&E) Directorate of the MOPH and was paid on a contract basis with funding from the World Bank. The team assembled by the third party firm to carry out the three prongs of the verification process was large, with staff from the national to the provincial level, as well as Community Monitors (Figure 2). There were about 200 people in the field conducting verification activities each quarter. The core staff of the third party team (the RBF Manager, Deputy RBF Manager, Provincial Supervisors, and Deputy Provincial Supervisors) were paid on salary. Community monitors received payments for each verification form they completed or the number of households they visited per quarter, with a maximum of 50 household visits per community monitor per year.

The RBF Manager was based in Kabul and was responsible for the overall management of the third party activities related to the verification process. A Deputy RBF Manager, who was the primary focal person in Kabul, managed the day-to-day verification process. There were between eight and 11 Provincial Supervisors based in Kabul who had overall responsibility for verification of the HMIS data in the provinces and for supervising field activities. The Provincial Supervisors were responsible for the safety of the verification teams, the completion of the process, and for ensuring data quality.

Figure 2. Afghanistan RBF Staff Structure of the Third-Party Verification Team

At the province level, each team was comprised of one Deputy Provincial Supervisor per province, and a pair of community monitors per health facility. These teams assisted with the verification of HMIS reports at the facility, but they were mainly responsible for community verification. The Deputy Provincial Supervisors were responsible for all aspects of community verification, including staffing, training, supervising, and managerial support to the community monitors. The Deputy Provincial Supervisors ensured data quality and collected and certified all forms submitted by community monitors. The community monitors occupied the central role in the community verification.
They were responsible for visiting the selected households, interviewing the eligible respondents, completing the verification forms, ensuring that no questions remained unanswered, and submitting the filled-in forms to the Deputy Provincial Supervisor in a timely manner. The community monitors were selected from a community-based structure such as the National Solidarity Program or from among school teachers. In the catchment area of each health facility, a pair of community monitors (one female and one male) was chosen who met the requirement of having basic literacy and communication skills. Community monitors were prepared through a five-day training activity. Different community monitors were recruited each quarter depending on the facilities that were selected and the households that were sampled from facility registers.

3.2 DATA SOURCES AND FLOWS

The reporting system for the RBF program in Afghanistan was built on the HMIS to monitor delivery of the BPHS and EPHS. Thus, parallel reporting structures were avoided. Minor adaptation was necessary to include five or six new indicators, and small changes were made to existing data forms, registers, and tally sheets. The reporting process that was in place also was used and strengthened for RBF reporting. Health facilities sent paper reports on services delivered to the managing NGOs at the provincial level. The NGOs aggregated the data and entered the information into an Access database program, and then sent electronic reports to the central level. The database calculated estimates of the payments for facilities. The initial version of the database was limited given the volume of data generated on a quarterly basis by the verification process but has since been upgraded. Reports on the verification of the HMIS data were sent by the third party firm to the MOPH, where discrepancies were analyzed, key data gaps were identified, and incentive payments were calculated. The MOPH shared the reports with the provincial level, where data were examined more closely indicator-by-indicator to calculate health facilities' payments and to identify areas where additional support to health facilities was needed.

3.3 VERIFICATION OF THE QUANTITY OF SERVICES DELIVERED AT THE FACILITY LEVEL

Quantity data were verified at the health facility level every three months to reconcile what was reported by health facilities through the HMIS with data recorded in facility registers. Verification of services received by the community is described in the next section.

3.3.1 Sampling for Health Facility-Level Verification

The third party firm selected a 25 percent sample of health facilities from the 11 RBF provinces for verification each quarter (about 115 facilities across all provinces), and all facilities were visited at least once in a year. A total of 460 health facilities were covered by the verification process each year. To select the sample each quarter, facilities were stratified by type – Sub-Center, Basic Health Center (BHC), Comprehensive Health Center (CHC), and District Hospital (DH) – and 25 percent of facilities in each stratum were selected randomly. The sampling frame of all health facilities is shown in Table 3. Ten percent of the health facilities that were verified in previous quarters during the year were also randomly selected for verification in the current quarter. The Kabul team specified the health facilities and cases for verifying the nine health center-level indicators and communicated the list to the Provincial Supervisors. The same list was used for the community verification. The team from the third party firm carried out the sampling procedure and data collection for each province without advance notification to the implementers or health facilities. Despite serious issues with security, all facilities remained in the sampling frame each quarter, and one attempt was made to visit all selected facilities. If a facility was found to be insecure, it was excluded for that quarter, and an attempt was made to include it in the next quarter. Using this method, data were collected successfully from some health facilities designated as insecure. A simple illustrative sketch of this sampling procedure is shown below.
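The quarterly facility sampling described in Section 3.3.1 can be sketched in a few lines of Python. This is a simplified reconstruction based only on the description above (a 25 percent random sample within each facility-type stratum, plus a 10 percent re-verification sample of facilities already verified earlier in the year); the data structures, rounding of sample sizes, and handling of overlaps are assumptions, and the third party firm's actual procedure may have differed in these details.

```python
# Simplified sketch of the quarterly facility sampling (Section 3.3.1).
# Stratum definitions follow the text; sample-size rounding is an assumption.
import math
import random

FACILITY_TYPES = ("Sub-Center", "BHC", "CHC", "DH")

def select_quarterly_sample(facilities, previously_verified):
    """facilities: list of dicts with 'id' and 'type'; previously_verified:
    facilities already verified in earlier quarters of the year."""
    sample = []
    # 1. Stratify by facility type and draw 25% of each stratum at random.
    for ftype in FACILITY_TYPES:
        stratum = [f for f in facilities if f["type"] == ftype]
        n = math.ceil(0.25 * len(stratum))
        sample.extend(random.sample(stratum, n))
    # 2. Add a 10% re-verification sample from facilities verified earlier in the year.
    n_recheck = min(math.ceil(0.10 * len(previously_verified)), len(previously_verified))
    sample.extend(random.sample(previously_verified, n_recheck))
    return sample
```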
Table 3. Sampling Frame of Health Facilities in World Bank-Supported Provinces

No | Province | Geographic location | Total # of BPHS facilities in province | # of District Hospitals | # of Sub-Centers | # of BHCs | # of CHCs
1 | Samangan | Northern | 30 | 2 | 11 | 12 | 5
2 | Sar-e-Pul | Northern | 45 | 2 | 20 | 15 | 8
3 | Balkh | Northern | 86 | 5 | 33 | 33 | 15
4 | Jawzjan | Northern | 22 | 2 | - | 14 | 6
5 | Bamiyan | Northern | 37 | 3 | 13 | 13 | 8
6 | Kunduz | North eastern | 45 | 1 | 5 | 27 | 12
7 | Badakhshan | North eastern | 55 (+1 PH) | 2 | 8 | 27 | 18
8 | Parwan | Central | 56 (1 PH) | 1 | 14 | 33 | 9
9 | Panjsher | Central | 18 | 1 | 6 | 9 | 3
10 | Kandahar | Southern | 9 | 1 | 2 | 5 | -
11 | Takhar | North eastern | 54 | 3 | 5 | 33 | 13
12 | Kapisa | Central | 1 PH | - | - | - | -
13 | Laghman | Eastern | 1 PH | - | - | - | -
14 | Paktya | Eastern | 1 PH | - | - | - | -

Note: PH = provincial hospital; - indicates no value reported.

3.3.2 Process for Carrying Out Health Facility-Level Verification

Verification of the HMIS data from the previous quarter took place with the permission and in the presence of the health facility head. The verification of HMIS data with facility records took two to four hours per facility over a period of 25-35 days and was carried out simultaneously with the community-level verification. The field team checked the validity and reliability of the HMIS data by examining logbooks, registers and other records. At the conclusion of the health facility visit, a written summary was left in the visit book of the health facility along with the date of the visit and the signature of team members. The collected data forms were acknowledged through signature and stamped by the person with authority at the health center. A summary of the activities for health facility-level verification is provided in Table 4.

Table 4. Summary of Health Facility-Level Verification of Quantity of Services in Afghanistan RBF

(1) Select sample of health facilities (one day; RBF Manager): Stratify all facilities by type (Sub-Center, BHC, CHC, DH) and randomly select a 25% sample of each stratum. Also select 10% of the health facilities that have already been verified in previous quarters for re-verification.
(2) Select sample of cases for each indicator for each selected health facility (one day; Deputy RBF Manager): Communicate the selected health facilities to Provincial Supervisors for selection of cases from the previous quarter in the HMIS reports that the third party firm receives from the MOPH each quarter.
(3) Conduct health facility visits (2 to 4 facilities/day; Provincial Supervisor): Visit each selected health facility and tally and calculate cases from the facility register for the previous quarter.
(4) Submit verification data to Kabul team (1 to 3 days; Provincial Supervisor): Send completed verification forms via the carrier service through local transport.
(5) Data entry and cleaning for RBF indicators (one week; RBF Manager and Deputy Manager): Enter and edit data for the verification of selected cases using Excel, and send the report to the MOPH.
(6) Data entry and cleaning from community-level verification (one week; Data Manager and team): Enter and edit data for all questions in CS-Pro. These data will be used to study quality of care and patient satisfaction later.
Quality control measures were also implemented to ensure the accuracy of the verification data collected. These measures included randomly selecting 10 percent of health facilities in each province from the completed verification forms to re-check, and reviewing the selected completed forms against the facility registers. If errors were identified in one or more of the forms, the entire sample of health facilities chosen for that quarter in that province was reviewed. The work of the field monitors was also checked by selecting either a convenience or a random sample for review.

3.3.3 Verification of the Quantity of Services through Patient Tracing

The provincial team randomly selected households from the registers of the 115 randomly-selected health facilities for community verification, or patient tracing. For each facility, six households were randomly selected for each of the nine indicators; five were scheduled for interviews, and one served as an alternate. A total of 45 households were interviewed per health facility each quarter, for a total of more than 5,000 household interviews per quarter, or over 20,000 household visits each year (a quick arithmetic check of these volumes follows Table 5). The interviews were typically 30-40 minutes in duration, with a maximum of one hour for each household. The entire community-level verification process took one week to ten days each quarter. A summary of the activities for community-level verification is provided in Table 5.

Information on services provided was collected through a combination of treatment cards held by patients and patient recall. Patient cards should have contained the date of the health facility visit, the registration number, and the services received, including which round of ANC or PNC. Verification of the services from the patient cards was the preferred approach for obtaining good quality data, given that clients' memories were subject to recall and other biases. However, patient cards were not always available or complete.

Table 5. Summary of Community-Level Verification for the Quantity of Services in Afghanistan RBF

(1) Selection of client cases for verification (2-4 facilities/day; Deputy Provincial Supervisor): The Provincial Supervisor provides the list of selected health facilities to the Deputy Provincial Supervisor. If there are more than 5 cases for each indicator, the Deputy Provincial Supervisor stratifies cases by indicator for the last two weeks. Randomly select 6 clients for each indicator; if there are fewer than 6 patients for any indicator, select clients from preceding weeks until the sample is reached. Record the client's name, address, registration numbers and dates of visits on the household selection sheet.
(2) Distribute household selection sheets to community monitors (1 day after completing the household selection sheet for each facility; Deputy Provincial Supervisor): Each pair of community monitors receives one household selection sheet and 6 community verification forms for each indicator. The forms are color-coded for each type of indicator.
(3) Conduct household interviews (2-4 houses/day per pair, 2-3 weeks per pair; Community Monitors): The male monitor finds and identifies the household, and the female monitor conducts the interview at the selected household, unless the patient is a male individual.
(4) Submit community verification forms (within 2-3 weeks of initiation of verification; Deputy Provincial Supervisor): Collect the filled community verification forms and submit them. Make photocopies of the filled forms, and send a copy via the carrier service through local transport to the Provincial Supervisor.
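The patient-tracing workload quoted in Section 3.3.3 can be checked with simple arithmetic, reproduced below using the counts given in the text (nine indicators, six households sampled and five interviewed per indicator, roughly 115 facilities per quarter). This is a back-of-the-envelope check under those stated figures, not program data.

```python
# Back-of-the-envelope check of the patient-tracing volumes (Section 3.3.3).
indicators = 9                # health-center indicators verified in the community
sampled_per_indicator = 6     # 5 scheduled for interview + 1 alternate
interviewed_per_indicator = 5

households_sampled_per_facility = indicators * sampled_per_indicator        # 54
interviews_per_facility = indicators * interviewed_per_indicator            # 45
facilities_per_quarter = 115                                                # ~25% sample

interviews_per_quarter = interviews_per_facility * facilities_per_quarter   # 5,175 (>5,000)
interviews_per_year = interviews_per_quarter * 4                            # 20,700 (>20,000)
print(households_sampled_per_facility, interviews_per_quarter, interviews_per_year)
```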
3.3.4 Verification of the Quality of Services

The process of verification of the quality of services served a dual role: to link quality to payment of bonuses and to strengthen the role of PHOs in supervision and monitoring and evaluation. The quality of services delivered was verified jointly by the PHO and NGO using the National Monitoring Checklist. The NMC was also part of the data collection and measurement strategy of Afghanistan's National Strategy for Improving Quality in Health Care.8 To implement the NMC, PHOs and NGOs were instructed to visit each health facility once per quarter. During the quality verification visit, the assessor answered questions about structural aspects of quality, such as the availability of equipment. Respondents confirmed completing one to three NMC checklists per health facility each quarter. Ease of access and the level of performance of the health facility were indicated among the factors that determined the frequency of visits to the health facilities. For instance, closer health facilities and those with low performance were visited more frequently. According to the interview respondents, the average time required to complete a checklist ranged from two to three hours.

PHOs received a bonus under the RBF program for each health facility that was visited once in a quarter. For health facilities that were difficult to access, the NGO completed the NMC alone, or the health facility itself completed the checklist. PHOs attempted to verify self-reported NMC data at least once per year, and results were accepted if there was not more than a ten percent discrepancy from the self-reported checklist. In practice, most health facilities were visited every six months.

The data gathered from the interview respondents about quality verification suggested that although all of the interview respondents confirmed using the NMC for checking the quality of services delivered in the health facilities, there was some discrepancy between the guidance provided in the RBF Operational Manual and actual practice. While the RBF manual recommended one completed NMC per health facility per quarter, the study respondents provided mixed responses. Some of the respondents indicated that the number of NMC checklists completed for their health facilities was in compliance with the RBF manual, while others stated that they did not follow the manual. According to those who did not follow the manual, the number of visits was decided based on the staff job descriptions and the NGO and PHO plan for monitoring and supervising health facilities. The respondents confirmed that all visits were concluded by leaving a summary of the NMC checklist in the health facility. They also indicated that their findings and points for improvement were recorded in the health facility visitor's book. In addition, one of the respondents indicated providing a copy of the supervisor's report to the health facilities for their use. The NMC has over 13 sections, each of which comprises several indicators.
However, what the respondents perceived as quality indicators was actually the "performance against target" section of the NMC. This section compares data from the HMIS forms with the original source data and notes areas of discrepancy (yes/no) between the two information sources. The RBF guidelines indicated that the quality score should be the average of all sections in the NMC checklist that were related to quality. The "performance against target" section was normally excluded when deriving the NMC/quality score. Since quality indicators were not specified in the RBF guidelines, the respondents might have mistaken the performance indicators specified in the NMC under the "performance against target" section for quality indicators.

The information from the completed NMC checklist was entered into a database at the PHO at the completion of the visit and a summary sheet was left with the health facility. The information was sent to the central MOPH on a quarterly basis. The results of the checklist were converted into a percentage score, which was applied to the bonus payment. The quality scores were counter-verified in several ways. First, central-level officials periodically visited health facilities in the provinces and used the opportunity to complete the NMC, which was then cross-checked with the facility's previous NMC scores. In addition, the MOPH cross-checked on an annual basis the similar fields between the NMC and BSC to identify any significant differences.

PART IV – FINDINGS FROM THE APPLICATION OF THE VERIFICATION METHOD:

4.1 DEFINITION OF ERROR

There was a different acceptable margin of error for discrepancies in facility- and community-based verification of the quantity of services delivered. At the facility level, a discrepancy greater than 10 percent between the number of services reported in the HMIS and the data reported in the facility register was considered to be an error for that indicator. At the community level, a discrepancy of up to 20 percent between data reported in facility registers and reports by selected patients was considered acceptable. If the discrepancy was greater than the acceptable margin of error and an acceptable explanation was not found, the health facility was not paid the performance incentive for the services related to that indicator ("all or nothing"). A simple illustrative sketch of how these thresholds fed into payment decisions follows Section 4.2.

4.2 RESULTS OF VERIFICATION FOR THE QUANTITY OF SERVICES

This verification process was put in place in mid-2010. In the first quarter for which results were available, 17 percent of the services reported in the HMIS could not be verified in health facility registers (Figure 3). At the community level, there was a discrepancy of 33 percent (Figure 4). Since that time, the level of discrepancy between the number of services reported through the HMIS and what was verified at the health facility and in the community reduced dramatically. Data from late 2012 suggested that there was only a five percent discrepancy between the HMIS and both health facility and community level verification results. There was widespread agreement that the RBF program and the verification process drove these improvements in data quality. The threat of not receiving incentive payments was effective in very quickly closing the data quality gap for program indicators.
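The error definitions in Section 4.1 can be illustrated with a minimal sketch, assuming a simple all-or-nothing rule per indicator: a 10 percent tolerance for the facility-level check (HMIS report versus facility register) and a 20 percent tolerance for the community-level check (facility register versus traced patients). The discrepancy formula and the treatment of the community sample below are simplifying assumptions; the program's operational manual, not this sketch, defined the exact calculation.

```python
# Illustrative application of the RBF error margins (Section 4.1).
# The discrepancy formula is an assumption; in practice the community-level check
# was based on a sample of traced patients rather than full counts.

FACILITY_TOLERANCE = 0.10   # HMIS report vs. facility register
COMMUNITY_TOLERANCE = 0.20  # facility register vs. patient tracing

def discrepancy(reported: int, verified: int) -> float:
    """Share of reported services that could not be substantiated."""
    if reported == 0:
        return 0.0
    return abs(reported - verified) / reported

def indicator_payable(hmis_count: int, register_count: int, traced_count: int) -> bool:
    """All-or-nothing rule: the indicator is paid only if both checks pass."""
    facility_ok = discrepancy(hmis_count, register_count) <= FACILITY_TOLERANCE
    community_ok = discrepancy(register_count, traced_count) <= COMMUNITY_TOLERANCE
    return facility_ok and community_ok

# Example: 100 services reported in the HMIS, 95 found in the register (5% discrepancy),
# but only 70 substantiated through tracing (~26% discrepancy) -> indicator not paid.
print(indicator_payable(100, 95, 70))  # False
```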
Figure 3. Trends in Level of Agreement between HMIS and Facility-Level Verification Data for the Quantity of Services Delivered

[Line graph of percent agreement by quarter, Q3-2010 through Q3-2012; agreement rises from roughly 83 percent at the start of the period to roughly 95 percent by Q3-2012.]

Source: Health Results Innovation Trust Fund Annual Report: Afghanistan. March 18, 2013.

Figure 4. Trends in Level of Agreement between HMIS and Community-Level Verification Data for the Quantity of Services Delivered

[Line graph of percent agreement by quarter, Q2-2010 through Q3-2012; agreement rises from roughly 67 percent at the start of the period to roughly 95 percent by Q3-2012.]

Source: Health Results Innovation Trust Fund Annual Report: Afghanistan. March 18, 2013.

4.3 RESULTS OF VERIFICATION FOR THE QUALITY OF SERVICES

All of the interview respondents provided consistent responses to the question concerning the use of the quality/NMC score and were able to explain why a quality score of less than 100 might occur. The respondents indicated that there was an adjustment to the RBF bonuses based on the quality score, which was interpreted as a larger bonus to the health facility when the NMC/quality score was high and a smaller bonus to the health facility and NGO for a lower quality score. For instance, one respondent noted the following:

"Suppose a clinic is eligible for payment of a bonus of 100,000 AFN based on the result of HMIS data and verification and the average NMC score is 90 percent. In this case the actual payment to the health facility will be 90 percent of the above payment, which is 90,000 AFN."

According to most of the respondents, for all weak points identified by the quality verification, a root cause analysis was conducted, an action plan was prepared, and all parties responsible for bringing the improvement were identified. For example, if performance was low on the delivery indicator, they worked with the health facility staff to identify the root cause(s) and then sought to solve the problem. The parties involved in solving the problem could include clinic staff, the NGO, and the community. One of the study respondents provided the following response when asked about how they concluded the health facility visits and their recommendations for how to improve a low score on the NMC:

"If the registration is incomplete, related staff are recommended to make sure the registers are complete and there is agreement between different data sources. In order to minimize the discrepancy between different sources of data, it is recommended that all registration and tallying should take place on [a] daily basis and [an] information coordination meeting between various departments of the health facility [should be held] in order to ensure data agreement. Another…recommendation was provided [for] the space between doses of vaccinations that should be properly observed. For instance, we recommended that proper timing for all doses of vaccines should be observed in order to ensure compliance with the guidelines and avoid any reduction in the incentive payment to the health facility staff."

PART V – USE OF VERIFICATION FINDINGS

When the RBF team at the MOPH received the verification data, they extracted the corresponding data from the HMIS. Data that had not been verified were highlighted. The MOPH was very careful to identify which discrepancies were due to actual over-reporting and which were due to data issues or misunderstandings.
PART V – USE OF VERIFICATION FINDINGS

When the RBF team at the MOPH received the verification data, they extracted the corresponding data from the HMIS. Data that could not be verified were highlighted. The MOPH was very careful to identify which discrepancies were due to actual over-reporting and which were due to data issues or misunderstandings. The MOPH then entered into a dialogue with those responsible for managing the delivery of health services at the provincial level (for example, an NGO or the MOPH). In general, however, there was not a significant problem with over-reporting. The MOPH estimated that within the very low rate of discrepancy, only about 20 percent of those cases could be attributed to over-reporting.

Based on the verification report of the third-party firm, the MOPH calculated the performance payment earned by each NGO based on the combined performance of the facilities it managed. The NGOs were then responsible for calculating performance payments for individual health facilities. The final decisions about whether to cut incentive payments to health facilities because of discrepancies that appeared to be due to over-reporting were made by the implementers (that is, NGOs in contracted-out provinces and PHOs in MOPH-SM provinces) in accordance with the stipulations of the contract. Any discrepancy greater than 20 percent for community-level verification or 10 percent for facility-level verification resulted in non-payment to the health facility for that indicator. The verified information from the sample of facilities was aggregated at the provincial level to compare province-wide reporting against what could be verified in the community. Performance payments to the NGOs were up to 10 percent of what was earned by health facilities as a performance incentive after verification. For example, if 85 percent of services reported through the HMIS in the province could be verified, only 85 percent of the performance payment would be paid to the NGO for its contracted health facilities. The MOPH lowered the threshold of the overall accuracy rate, as it was initially found to be too high to be attainable. Thus, the total performance payment to an NGO was equal to the total amount that all of its sub-contracted facilities earned after verification, plus an additional ten percent to be retained by the NGO, adjusted for the overall level of accuracy of the data reported by its facilities. A minimal sketch of this payment flow appears below.

Regarding the use of NMC findings, the interview respondents indicated that the data were used for the following: 1) calculating RBF payments; 2) analyzing the performance of the health facilities by the NGO management team and the PHO; 3) decision making by NGOs or PHOs at the province level about supervision of health facilities, supply, and planning of training for the staff of health facilities; 4) comparison between and within the “treatment” and “control” health facilities1 in terms of their level of performance; 5) checking the quality of services and data; 6) training needs assessment and planning of trainings; and 7) identification of the root causes of problems and health service improvement. The overall impression of respondents was that the NMC was used to improve the performance of health facilities and, after the introduction of RBF, it also became associated with the calculation and payment of RBF bonuses.
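The sketch below implements one reading of the NGO payment rule described above: facilities keep what they earned after verification, and the NGO's additional share of up to 10 percent is scaled by the province-wide accuracy rate. The function and variable names, the facility amounts, and the exact structure of the calculation are illustrative assumptions, not the program's actual payment software.

```python
# Illustrative sketch of the NGO payment rule described in Part V.
# One reading of the rule; names and figures are hypothetical.

NGO_RETENTION_SHARE = 0.10  # maximum NGO top-up on verified facility earnings

def ngo_performance_payment(facility_earnings_afn: list[float],
                            accuracy_rate: float) -> float:
    """Total payment to the NGO for its sub-contracted facilities."""
    verified_total = sum(facility_earnings_afn)
    # NGO top-up: up to 10 percent of verified facility earnings,
    # adjusted by the province-wide accuracy of reported data.
    ngo_top_up = NGO_RETENTION_SHARE * verified_total * accuracy_rate
    return verified_total + ngo_top_up

# Hypothetical example: three facilities earned 90,000, 120,000 and 75,000 AFN
# after verification, and 85 percent of reported services could be verified
# province-wide, so the NGO retains 85 percent of its 10 percent share.
print(ngo_performance_payment([90_000, 120_000, 75_000], 0.85))
# 285,000 + 0.10 * 285,000 * 0.85 = 309,225.0 AFN
```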
5.1 OTHER USES OF VERIFICATION DATA

A large focus of the supportive function of the verification process was on improving data quality. The verification data were used at the provincial level to highlight areas where the quality of data reporting needed to be improved. Additional HMIS training was provided to health facilities to help close these gaps. The verification results were also used as a tool to support performance improvement. Results were shared regularly through quarterly and annual reports with the MOPH and through coordination meetings with implementing NGOs. The results were used by both health care providers and policymakers to better understand current performance and what could be done to continue to improve. Respondents indicated that while the providers were clearly looking at the data, it appeared that discussions and debates about the quality of the data were the main focus. The MOPH was also using the verification data to do further analysis, and every two months a meeting was held to share the results of verification with all implementers and other stakeholders. There were presentations and discussions about how to improve performance and how to improve data quality in those provinces where there was a higher degree of discrepancy. Respondents suggested that a communication mechanism still needed to be established to follow up on the meetings and ensure that the discussions and decisions reached could be disseminated.

1 The Afghanistan RBF program includes an impact evaluation component, which assigned facilities either to a treatment group, which implements RBF, or to a control group, which does not.

PART VI – VERIFICATION COSTS

As is often the case in RBF programs, little information was available about the actual cost of carrying out verification in Afghanistan's program. The original budget for the RBF program developed by the World Bank project indicates that 7.8 percent of the funding for the program was allocated to third-party verification, or $863,382 out of a total budget of $11 million between 2010 and 2013. This does not, however, take into account the cost to the PHO of implementing the NMC to verify the quality of services, or other indirect costs to the MOPH, NGOs, and health facilities. In terms of quality verification, interview respondents' answers were mixed when they were asked whether a budget was required for NMC implementation under the RBF requirements. One of the respondents stated that certain resources were required for NMC implementation, such as vehicles, supervisors' salaries, and per diems. In contrast, another respondent stated that no specific budget was required for NMC implementation because NMC application was part of the NGOs' annual plan, and each health facility should have had at least one completed NMC per quarter. None of the respondents gave any specific figure for the cost of implementation.

PART VII – CHALLENGES

The verification process for Afghanistan's RBF program was an elaborate and labor-intensive process that involved hundreds of health facility visits and thousands of household visits each year. The process was carried out by a third-party international organization, initially a highly reputable university based in the United States and later one based in Europe. The process and the involvement of the respected third-party evaluator were perceived to have lent credibility to the RBF program in Afghanistan. The process consistently reached nearly all health facilities included in the RBF program and actively involved community members. It was also noted by one respondent that the participation of the international third-party evaluator may have contributed to building institutional capacity for verification within Afghanistan. A noted success of the verification process was that it contributed to strengthening the HMIS, both specifically for the indicators that were linked to incentive payments and for the system as a whole.
Verification reports were used not only for payment but also for actively improving the routine reporting system and the overall quality of the data. There was consensus among the interview respondents that credibility of and trust in the data generated by the HMIS had improved. The verification process also relied on, and may have strengthened, communication and dialogue between the national MOPH, Provincial Health Offices, NGOs, and health facilities.

There were a number of serious challenges in carrying out the verification process, particularly in the fragile and often volatile context of Afghanistan. At the forefront of the challenges were concerns about security, which deteriorated after the RBF program began. It was a challenge even to define security and to identify when health facilities were not secure. Some respondents indicated that there was a perceived lack of willingness by security authorities to support verbal statements of insecurity with written documentation. Without written confirmation of poor security in certain provinces, staff proceeded to make efforts to reach facilities in those provinces at risk to their personal safety. If verbal statements of insecurity were confirmed with written documentation, the verification team did not travel to that region's facilities. When this occurred, no verification of the quantity of services was done in these facilities. However, for quality verification, it was reported that some health facility staff “self-implemented” the NMC checklist to partially overcome the challenge posed by insecurity. It was perceived by some interview respondents that insecurity in some of the provinces was a serious challenge to the implementation of the NMC, that many people were put at risk to carry out the verification process, and that there was not enough added value to justify that risk.

The other major challenge highlighted by the Afghanistan case is the sustainability of the verification process, and the RBF program in general, given that the RBF program continues to be financed entirely by the World Bank. The cost and complexity of the verification process are not perceived as something that could be taken over by the government in its current form in the foreseeable future. However, this issue is not unique to RBF verification. A large part of Afghanistan's overall budget is funded by external sources, the entire budget for primary health care is donor-financed, and a large proportion of the budget for secondary health care is also donor-financed.

7.1 CHALLENGES RELATED TO PATIENT TRACING

Holding providers accountable at the community level was perceived as valuable by interview respondents, but community-based data collection and verification of services are difficult tasks in any context, especially in a conflict setting such as Afghanistan. The third-party firm faced a range of challenges related specifically to the community verification, including:

• Privacy of patients and confidentiality of health information. It was unclear whether the provisions to protect patient privacy and confidentiality were sufficient, particularly given that patients did not provide consent prior to being visited at their homes by community monitors. Visited patients were asked to sign a consent form about their clinic visit at the beginning of the household interview (see Annex 2).
However, prior to this provision of consent, patients did not provide consent (e.g., by signing a release form when they visited the clinic) granting permission for a data collector to come to the facility, review facility records or registries, and then visit their homes. As outlined in Table 5, these community monitors were aware of what services the community members had received prior to the visit to the household, and they then interviewed the patient to confirm receipt of services. Protecting patient privacy and maintaining the confidentiality of health information were further complicated by the fact that the community monitors were drawn from the same communities as the facility patients. As noted above, one male and one female community monitor with basic literacy and communication skills were selected from a community-based structure such as the National Solidarity Program, or from among school teachers. Respondents gave examples of privacy concerns such as reluctance on the part of some health facilities to give out patient contact information, difficulty for women to discuss health facility visits in the presence of male household members, and public identification of visited households.

• The cost and labor intensity. The number of households visited each quarter and the number of community monitors required to reach them represented an enormous cost, both in terms of financial costs and risk to field staff, which should be weighed against the added benefit of the community-level information obtained.

• Difficulty finding selected households. In some cases, the poor quality of registration and recording at the health facility level, including incomplete addresses and incorrect recording of client names, made the work of the community monitors more difficult and time consuming. It was also noted that in some cases, patients may have intentionally provided incomplete addresses, as they did not want their health service utilization to be known. This relates to the broader issue of patient privacy and confidentiality. Over time, this problem seemed to have dissipated, as evidenced by the increase in the percent agreement between the HMIS and the community-level verification data for the quantity of services delivered (Figure 4).

• Shortage of female community monitors. It was very difficult to find female members in some communities who met the qualifications for community monitors and were willing to participate, particularly in rural areas. This problem was likely related to insecurity, continued poor education of women, cultural barriers among conservative communities, or other discomfort with the role of community monitors. In some cases, trained community monitors decided not to participate once they returned to their villages after training. A number of sampled health facilities had to be dropped in some quarters due to the lack of availability of female community monitors.

• Biased nature of the information provided by households. Concerns were raised by several interview respondents about the reliability and validity of using patient information to verify the receipt of services. (Note: patient satisfaction is not measured.) In many cases the patient treatment cards were not available at the household level, or they contained incomplete information that was difficult to verify. Furthermore, although the recall period was intended to be two weeks, it was often longer when the number of cases in the period was too low.
The longer recall period introduced greater recall bias, which may have resulted in an underestimate of the number of services and may have caused a higher level of discrepancy with the HMIS. In order to reduce the recall bias, the sample for the community was limited to the past four weeks at most, but the validity of this recall period has not been tested.

7.2 CHALLENGES RELATED TO QUALITY VERIFICATION

The study respondents indicated that the time demand of implementing the NMC was a key challenge. They explained that there was an opportunity cost to NMC implementation, namely that it usually limited the opportunity for supportive supervision. In addition, some study respondents noted that staff capacity for NMC implementation might be a challenge. They further expressed concern over weak cooperation of PHOs in the planning and implementation of joint monitoring and identified it as a challenge to NMC implementation. One of the respondents noted that “the PHOs usually do not join the health facility visits”. Furthermore, the NMC only checked drug availability at the time of the visit and was not able to capture stock-outs during the past month.

PART VIII – LESSONS

The Afghanistan case provides several lessons that may be useful to inform RBF verification efforts in other countries. Verification approaches in any country are often driven by the accuracy of data in the health information systems and the status of the governance structures in place in the health sector. In the case of Afghanistan, both were relatively weak at the time the RBF program was initiated, and the MOPH was keen to focus on its stewardship function. Therefore, an international third-party verifier was used along with an intensive community-level verification effort.

The first lesson is that as any RBF program matures and the quality of data improves, the verification process may also need to evolve, particularly if the government eventually takes over responsibility for funding the effort. It should be noted, however, that in Afghanistan, government funding of the RBF program is unlikely to happen in the near term given the government's current level of dependence on external resources for basic programs, including health.

The second and related lesson is that the RBF program in general, and the verification process specifically, can and should be a lever to strengthen health information and governance structures. Findings from a large number of RBF programs worldwide suggest that this is one of the most valuable “spillover effects” of such programs.9

Finally, the main lesson of the Afghanistan case is that it is critical that the verification process align with the capacity and accuracy of the reporting system and the potential added value of more elaborate approaches. In the Afghanistan case, there has been an improvement in data quality over the course of the RBF program, and while we do not know the level of gaming that would have occurred in the absence of verification, it is not clear that the cost of the community-level verification (both financial and in terms of risk to field teams and patient privacy) was justified by the gaming that was potentially avoided. As Afghanistan's experience with RBF and verification has evolved over the past decade, it could be valuable to consider modifying the intensity and process of the community verification effort to align with the institutional capabilities for verification within Afghanistan and the potential value of the effort.
PART IX – CONCLUSION AND RECOMMENDATIONS

The evolution of RBF in general, and verification in particular, in Afghanistan has mirrored a number of changes both in the country's sociopolitical environment and, according to some respondents, in the priorities and interests of the international partners, including the World Bank. The early performance-based contracting approach focused on provincial-level performance and rewards, and the incentive mechanism was tied to quality. The balanced scorecard that was developed and used as a measurement tool for the performance contracts has been viewed as a useful and constructive process (Peters 2007). The tool has been an important monitoring and evaluation tool for the MOPH and has fostered dialogue and a focus on overall improvement of the system, rather than serving largely an inspection or audit function.

In 2009, the RBF program evolved, precipitated by a change in the political leadership in Afghanistan, declining security, growing concerns about transparency, and the availability of funding from the HRITF to pilot RBF approaches. Furthermore, although significant improvement in performance against the balanced scorecard had been achieved, coverage of key maternal and child health services continued to lag. The approach to performance-based contracting added a facility-level component, with incentives based on the quantity of services (conditional on their quality) delivered at the facility level, to drive further improvements in coverage of priority services. The financial incentive itself became more prominent, and verification of reported results therefore became more of a concern. The verification process consequently evolved and expanded to include verification of the quantity and quality of services delivered at individual health facilities. An elaborate mix of facility- and community-level verification was adopted that was resource-intensive and, given the security issues in Afghanistan, often risky. Concerns about institutional and financial sustainability were obvious and were mentioned by all of the respondents interviewed. There was also a sense among some interview respondents that real achievement had been made with the balanced scorecard and that, as the RBF evolved to focus more on the quantity of services and the financial incentive, some of the power and achievement of the balanced scorecard was lost. On the other hand, all respondents agreed that the RBF and verification process contributed to improvements in the HMIS and greater trust in the data it generates, which brings wider benefits to the health system.

The community-level verification, while intuitively attractive, has proven to be the most problematic aspect of the process in Afghanistan. Most respondents felt that community-level verification brought value through greater community involvement and better accountability. With 200 community monitors in the field and over 5,000 interviews conducted each quarter, however, serious concerns about cost, burden on the community, and security were raised. Issues of patient privacy and confidentiality do not appear to have been adequately addressed. The validity of information provided by patients was also questioned given recall bias and challenges with patient-held service records. According to several respondents interviewed, the risks of the community-level verification to patient privacy and field staff security did not appear to be worth the additional benefits that it may yield in terms of community involvement and accountability.
Some potential recommendations emerge from the Afghanistan case study:

Streamline the verification process to improve cost-effectiveness and sustainability. The MOPH may consider allowing the verification process to continue to evolve as experience has been gained and data quality gaps are closing. This is all the more critical when financial and institutional sustainability concerns are taken into consideration. A targeted, risk-based approach combined with the credible threat of a random check may be sufficient to keep gaming under control. In the U.K. Quality and Outcomes Framework (QOF) verification process, for example, annual visits to all general practices were found to be costly, and the program has evolved to using risk-based criteria for selecting providers and indicators for verification. Targeted, risk-based verification, combined with a paper-based review of automated data and the credible threat of random verification, has been seen as a more cost-effective approach now that data quality has improved and discrepancy rates are low. Afghanistan may consider moving in this direction to streamline and ensure the sustainability of its verification process without sacrificing its validity.10 (A minimal illustration of such risk-based selection is sketched at the end of this section.)

Change the role of communities in the RBF program from direct involvement in verification to monitoring and supervision. The involvement of communities has been seen as a strength of Afghanistan's RBF verification process, but there are also numerous concerns. As discussed above, drawing community monitors from the same catchment area as the patients visited during community verification poses confidentiality risks. Patient tracing may still be needed to verify the services delivered, but further exploration may be needed on how best to approach this in Afghanistan. One recommendation that emerged from the study was to retain community participation in monitoring health service delivery, given the important role communities play in accountability, but to scale back its role in the verification of the quantity of services delivered. Furthermore, it was suggested by an interview respondent that community-level monitoring should rely on existing community structures, such as community health assemblies.

Strengthen the strategic purchasing role of the MOPH and the PHOs, including oversight of monitoring and evaluation as well as RBF verification, carried out in a sustainable manner. The capacity of the MOPH and PHOs should be strengthened to carry out the function of purchasing health services, which includes overseeing the monitoring and evaluation and verification functions. The RBF program and verification process are already contributing to strengthening this capacity, and over time it may be appropriate to rely on a locally contracted third-party evaluator. At the start of RBF, the stakeholders in Afghanistan agreed to use the international third-party institution, which was already contracted for the monitoring and evaluation of the RBF program, for the verification component. This was also considered appropriate at the time given the status of governance structures and the focus of the MOPH on its stewardship function. An international third party, although expensive, lent much-needed credibility to the program and may have helped build institutional capacity for verification within Afghanistan. The new World Bank-supported project has already moved to a locally contracted third-party firm, which is considered to be more financially and institutionally sustainable.
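To make the first recommendation above concrete, the sketch below shows one possible way to select facilities for verification visits: prioritize those with high historical discrepancy rates and add a small random sample so that every facility faces a credible chance of being checked. This is an illustrative assumption about how such targeting could work, not a description of the QOF's or Afghanistan's actual selection rules; all thresholds, names, and data are hypothetical.

```python
import random

# Illustrative sketch of risk-based selection of facilities for verification,
# combining targeted visits (high historical discrepancy) with a random check.
# Thresholds and data structures are hypothetical.

def select_for_verification(discrepancy_by_facility: dict,
                            risk_threshold: float = 0.10,
                            random_share: float = 0.05,
                            seed=None) -> set:
    """Return the set of facility IDs to visit this quarter."""
    rng = random.Random(seed)
    # Targeted: facilities whose last-quarter discrepancy exceeded the threshold.
    targeted = {f for f, d in discrepancy_by_facility.items() if d > risk_threshold}
    # Random check: a small share of the remaining facilities, at least one.
    remaining = sorted(set(discrepancy_by_facility) - targeted)
    n_random = max(1, int(len(remaining) * random_share)) if remaining else 0
    randomly_checked = set(rng.sample(remaining, n_random)) if n_random else set()
    return targeted | randomly_checked

# Hypothetical example with last-quarter discrepancy rates per facility.
history = {"HF-001": 0.02, "HF-002": 0.15, "HF-003": 0.04, "HF-004": 0.22}
print(select_for_verification(history, seed=1))
```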
In any case, continued efforts should be made to strengthen the capacity of the MOPH and PHOs to be strategic health purchasers and to fully exploit the verification process to continue to improve the coverage and quality of priority health services in Afghanistan.

ANNEX 1. NATIONAL MONITORING CHECKLIST

ANNEX 2. CONSENT FORM

The RBF Community Monitoring Visits Verbal Consent Form for Clinic Patients

Instructions for the Interviewer: The following is to be read verbatim to the respondent prior to the interview. If the subject then agrees to participate, you must sign on the line marked “Witness to Consent Procedures” at the end of this form. Also mark the date on the appropriate line.

Introduction
Hello, my name is ……………. I am here to ask if you will answer questions about your most recent clinic visit for the Ministry of Public Health.

Purpose of the Study
The Ministry of Public Health is conducting interviews to find out about your most recent visit to the ……………… clinic for yourself or your children, or those you care for. This information will help the Government of Afghanistan improve health services. You are not required to answer any questions if you choose not to. Even if you start to answer questions you can stop at any time.

Procedures
To obtain the necessary information, you have been chosen at random to participate from the list of persons who recently attended …………..…. clinic. An example of the word “random” is when you select a single nut out of a bag of nuts with your eyes closed; you don't know which single nut you are going to select. If you consent, you will be asked a series of questions about your most recent visit to the clinic.

Risks/Discomforts
The questions should not take more than 15 minutes to finish. You may find some of the questions uncomfortable, or you may be embarrassed to answer some questions in front of others in your home. If there are any questions you do not want to answer, you may refuse to answer them without penalty. All interviews carry some risk that information about you may become known to others. Information that might be used to identify you or your family will be kept in a locked box that only the project staff in Kabul will be able to access.

Benefits
There is no direct benefit to you from being in this study. The information you tell me will be very useful to the Ministry of Public Health in planning better health services for Afghanistan.

Confidentiality
During the question period I will write down the information you tell me. Some information could be used to identify you or your household. To keep people outside the study from seeing this information, after the interview with you is completed, we will keep the forms in a locked box.

Voluntary Consent
It is your decision whether or not to be in this study. You can stop participating in the study at any time without anything happening to you. Even if you do not participate you can get the same level of health care as always.

Whom to Contact
If you have any questions now I will answer them, and if you have questions later you can contact the Provincial Health Director, or the Monitoring and Evaluation Unit at the Ministry of Health, Old Ministry of Health Road, Kabul. Telephone 07026 2961

Do you agree to answer questions about your clinic visit?
☐ Consented   ☐ Refused

Interviewer Name: ______________________ Interviewer Signature: _____________________ Date: _______________________

ENDNOTES

1 The six service areas are: maternal and newborn health; child health and immunization; nutrition, including prevention, assessment and treatment of malnutrition; control and treatment of communicable diseases, especially tuberculosis, malaria and HIV; mental health services; and disability services. The seventh element is regular supply of essential drugs.

REFERENCES

Arur, T., D. Peters, P. Hansen, M. Mashkoor, L. Steinhardt, and G. Burnham. 2010. “Contracting for Health and Curative Care Use in Afghanistan Between 2004 and 2005.” Health Policy and Planning 25: 135-144.

Belay, Tekabe. 2010. Building on Early Gains in Afghanistan’s Health, Nutrition, and Population Sector: Challenges and Options. Washington, D.C.: The World Bank.

Benderley, B. L. n.d. Getting Health Results in Afghanistan. The World Bank. Retrieved from http://www.rbfhealth.org/rbfhealth/library/doc/291/getting-health-results-afghanistan.

Cashin, C., Y. Chi, M. Borowitz, P. Smith, and S. Thompson (eds.). Forthcoming. Pay for Performance in Health Care: Implications for Health System Performance and Accountability. Open University Press.

Cashin, C., and P. Vergeer. 2013. Verification of Performance in Results-based Financing: The Case of the United Kingdom Quality and Outcomes Framework (QOF). Washington, D.C.: The World Bank.

Edward, A., et al. 2011. “Configuring Balanced Scorecards for Measuring Health System Performance: Evidence from 5 Years’ Evaluation in Afghanistan.” PLoS Medicine 8(7). Online. Accessed July 9, 2013.

MOPH (Ministry of Public Health, Islamic Republic of Afghanistan). 2010. Operations Manual: Results Based Financing Intervention in BPHS Facilities and Hospitals in Afghanistan. Kabul: Ministry of Public Health.

Peters, David, et al. 2007. “A Balanced Scorecard for Health Services in Afghanistan.” WHO Bulletin 85: 146-151.

Rahmiza, M., M. Amiri, N. Burhani, S. Leatherman, S. Hiltebeitel, and A. Rahmanzai. 2013. “Afghanistan’s National Strategy for Improving Quality in Health Care.” International Journal for Quality in Health Care 25(3): 270-276.

Sondorp, E., N. Palmer, L. Strong, and A. Wali. 2009. “Afghanistan: Paying NGOs for Performance in a Post-conflict Setting.” In R. Eichler and R. Levine (eds.), Performance Incentives for Global Health: Potential and Pitfalls. Washington, D.C.: Brookings Institution Press.