USING CLINICAL VIGNETTES TO MEASURE PROVIDER SKILLS AND STRENGTHEN PRIMARY HEALTH CARE IN CÔTE D'IVOIRE

Benjamin Chan, Awa Diallo, Gnamien Kouamé, Simplice Kouassi

November 2023

Health, Nutrition, and Population Discussion Paper

This series is produced by the Health, Nutrition, and Population (HNP) Global Practice of the World Bank. The papers in this series aim to provide a vehicle for publishing preliminary results on HNP topics to encourage discussion and debate. The findings, interpretations, and conclusions expressed in this paper are entirely those of the author(s) and should not be attributed in any manner to the World Bank, to its affiliated organizations, to members of its Board of Executive Directors, or to the countries they represent. Citation and the use of the material presented in this series should take into account this provisional character. The World Bank does not guarantee the accuracy of the data included in this work. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of the World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries.

For information regarding the HNP Discussion Paper Series, please contact the Editor, Jung-Hwan Choi at jchoi@worldbank.org or Erika Yanick at eyanick@worldbank.org.

RIGHTS AND PERMISSIONS

The material in this work is subject to copyright. Because the World Bank encourages the dissemination of its knowledge, this work may be reproduced, in whole or in part, for noncommercial purposes as long as full attribution to this work is given. Any queries on rights and licenses, including subsidiary rights, should be addressed to World Bank Publications, the World Bank Group, 1818 H Street, NW, Washington, DC 20433, USA; fax: 202-522-2625; e-mail: pubrights@worldbank.org.
© 2023 The International Bank for Reconstruction and Development / The World Bank
1818 H Street, NW, Washington, DC 20433
All rights reserved.

Health, Nutrition, and Population (HNP) Discussion Paper

Using Clinical Vignettes to Measure Provider Skills and Strengthen Primary Health Care in Côte d'Ivoire

Benjamin Chan,a Awa Diallo,a Gnamien Kouamé,b Simplice Kouassib

a Health, Nutrition, and Population Practice, The World Bank Group, Washington, DC
b Community Health Department, Ministry of Health, Public Hygiene and Universal Health Coverage, Abidjan, Côte d'Ivoire

Paper prepared for research by the Primary Health Care Performance Initiative (PHCPI), financed by the Bill and Melinda Gates Foundation

Abstract: This paper describes a program in Côte d'Ivoire designed to assess and enhance the competence of primary health care providers using clinical vignettes. The initiative trained district supervisors in 113 health districts to present patient scenarios to providers to assess their skills in history taking, physical exam, diagnosis, treatment, and provision of patient advice. The clinical vignettes covered common topics in maternal and child health and infectious diseases. Several technical improvements were applied, including improved organization and clarity of questions, flexibility in treatment choices, management options for rural areas, and clearer standards for what information should be communicated to patients. The program also aimed to enhance content validity by mapping vignette questions against national practice guidelines. A training manual with role-playing exercises was developed. Supervisors conducted an initial sample of vignettes, which revealed that only 36 percent of providers achieved a satisfactory score. The Project ECHO virtual learning platform was then used to address implementation challenges among district supervisors, who shared ideas for improvement.
Learning sessions with midwives focused on managing preeclampsia. An electronic tablet tool was also designed for assessments, allowing data transfer to the national health information system; key design features included offline assessment capability. The paper provides a comprehensive account of program design, challenges, and solutions, so that other countries interested in developing similar programs can learn from this experience.

Keywords: Primary health care, clinical vignettes, clinical competency, health providers, quality assurance

Disclaimer: The findings, interpretations, and conclusions expressed in the paper are entirely those of the authors, and do not represent the views of the World Bank, its executive directors, or the countries they represent.

Correspondence Details: Benjamin Chan, The World Bank Group, 1818 H Street, NW, Washington, DC; bchan2@worldbank.org; http://www.worldbank.org/.

Recommended citation: Chan, Benjamin; Diallo, Awa; Kouamé, Gnamien; Kouassi, Simplice. 2023. Using Clinical Vignettes to Measure Provider Skills and Strengthen Primary Health Care in Côte d'Ivoire. Washington, DC: World Bank.

Table of Contents

ACKNOWLEDGMENTS
PART I – EXECUTIVE SUMMARY
PART II – BACKGROUND
  HEALTH OUTCOMES, QUALITY OF CARE AND GAPS IN MEASURING CLINICAL COMPETENCY
  INVESTMENTS IN PHC AND CONCURRENT ACTIVITIES TO IMPROVE HUMAN RESOURCES
PART III – PROGRAM DESIGN AND METHODOLOGY
  OVERVIEW OF CLINICAL VIGNETTES
  VIGNETTE DESIGN
  CLINICAL VIGNETTE SCORING
  EVALUATING VALIDITY
PART IV – PROGRAM IMPLEMENTATION
  TARGET GROUP FOR EVALUATION AND PRIORITY TOPICS
  EXAMINERS AND TRAINING
  LOCATION OF EVALUATION AND USE OF DECISION SUPPORTS
  INITIAL RESULTS
  USE OF PROJECT ECHO TO TROUBLESHOOT IMPLEMENTATION ISSUES AND SUPPORT PROVIDER COMPETENCY
  DEVELOPMENT OF SOFTWARE TOOL FOR DATA ENTRY
PART V – ENSURING PROGRAM SUSTAINABILITY
  LONG-TERM PLANNING
  INTEGRATION WITH OTHER TRAINING PROGRAMS
  INCENTIVES AND ACCOUNTABILITY MECHANISMS
  INTEGRATING AND MAINTAINING HEALTH INFORMATION SYSTEMS
PART VI – RECOMMENDATIONS FOR COUNTRIES DESIGNING CLINICAL VIGNETTE PROGRAMS
REFERENCES
ANNEX 1: PRIMARY HEALTH CARE PERFORMANCE INITIATIVE VITAL SIGNS PROFILE, 2021

ACKNOWLEDGMENTS

This report was authored by Dr. Benjamin Chan, Awa Diallo, and Dr. Gnamien Kouamé, World Bank consultants, and Dr. Amani Simplice Kouassi, Ministry of Health and Public Hygiene of Côte d'Ivoire, who provided leadership and oversight of the project. The authors thank Manuela Villar Uribe, World Bank project co-lead for the Primary Health Care Performance Initiative, and members of the SDI team from the World Bank (Kathryn Gilman Andrews, Jigyasa Sharma, and Harish Ram Sai) for their extensive feedback on the report. The authors thank Professor Samba Mamadou, Director General of Health; the consultants from the University of New Mexico Project ECHO team, especially Dr. Bruce Struminger; and Konan Assouman Alphonse, Emmanuel Bado, Dr. Annick Tano Ve, and Professor William Yavo from the National Institute of Public Health, for implementation support of the Project ECHO component of this initiative and their team's review of related sections of the report. The authors also thank software consultants from Bluesquare (Martin De Wulf, Elie Khalil, Wilfried Oro Akre, and Moustapha Pouye) for development of the clinical vignette software tool and their review of related sections of the report. The authors are grateful to the World Bank for publishing this report as an HNP Discussion Paper.

PART I – EXECUTIVE SUMMARY

This paper describes the process employed by Côte d'Ivoire to design and implement a program using clinical vignettes to assess the competence of its primary health care providers. The purpose of the program is to identify providers with weaknesses in clinical competency and refer them to continuing education.
The methodology uses trained personnel to present scenarios of patient symptoms to frontline providers to test their ability to diagnose and treat key illnesses. Supervisors in each of Côte d'Ivoire's 113 health districts were trained to work through vignettes with physicians, nurses, and midwives in primary health care facilities. This project was done with the support of the Primary Health Care Performance Initiative (PHCPI), funded by the Bill and Melinda Gates Foundation. The World Bank, a partner in that initiative, provided technical assistance.

Clinical vignettes were adapted from previous versions used in other World Bank projects, covering topics such as malnutrition, diarrhea, pneumonia, tuberculosis, malaria, and preeclampsia. Technical improvements to the original vignettes were applied and field-tested. They included the following:

• Better organization and breadth of question responses, to recognize that physicians may wish to ask questions or examine all body systems to rule out alternate diagnoses
• Giving clearer direction to evaluators on how to handle multiple ways of asking a patient the same question
• Flexibility of treatment choices, where multiple therapies or dose ranges are possible
• Specifying alternative treatment approaches suited to rural areas that lack access to certain medications, labs, or other resources
• Setting clearer standards for what to communicate to patients regarding diagnosis, purpose of treatments, warning signs to watch for, and follow-up

The program also aimed to improve content validity by using detailed tables cross-referencing vignette questions to specific sections of national practice guidelines. A training manual was developed for examiners, which included role-playing exercises where they could practice conducting assessments in challenging situations.
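The cross-referencing tables used to establish content validity pair each scored vignette question with the guideline passage that justifies it. A minimal sketch of such a mapping, with a check that no scored question lacks a guideline anchor, is shown below; the question IDs and section labels are invented for illustration and are not taken from the actual Ivorian tables.

```python
# Hypothetical cross-reference of vignette questions to national
# practice guideline sections; all identifiers here are invented.
GUIDELINE_MAP = {
    "Q6":  "Pneumonia guideline, sec. 2.1 (fever as a danger sign)",
    "Q8":  "Pneumonia guideline, sec. 2.2 (respiratory distress)",
    "Q48": "Diarrhea guideline, sec. 3.0 (diagnostic criteria)",
}

def unmapped_questions(question_ids):
    """Content-validity check: list scored questions that cite no
    guideline passage and therefore need review."""
    return [q for q in question_ids if q not in GUIDELINE_MAP]

print(unmapped_questions(["Q6", "Q8", "Q48"]))  # all anchored: []
```

A check like this makes gaps visible in both directions: questions with no guideline basis can be cut, and guideline criteria with no corresponding question can be added.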
In a pilot program conducted after the training, a sample of 202 completed vignettes found that only 36 percent of providers assessed obtained a "satisfactory" score of 50 or higher out of 100. The next step was using the Project ECHO virtual learning platform to bring together district supervisors to discuss problems they had encountered administering the assessments. They shared ideas that might help providers be more at ease while being assessed, which included improving the wording of the questions and clarifying the process of administering the vignettes. Learning sessions were conducted with midwives to review principles for managing preeclampsia and discuss difficult cases.

An electronic tablet tool was developed to facilitate conducting the assessments and transferring the data to the national District Health Information System 2 (DHIS-2) for further analysis. One of its key design features was the ability to conduct an assessment without Internet access and upload the data to a server later, when Wi-Fi was available. Five resource persons from the Ministry of Health, Public Hygiene and Universal Health Coverage of Côte d'Ivoire (referred to as the Ministry of Health in this document) received comprehensive training on how to modify the tool or add new vignettes to the system.

This paper's detailed descriptions of program design, implementation challenges, and solutions are intended to benefit other jurisdictions interested in implementing similar programs.

PART II – BACKGROUND

HEALTH OUTCOMES, QUALITY OF CARE AND GAPS IN MEASURING CLINICAL COMPETENCY

Côte d'Ivoire—like other low- and middle-income countries around the world—has seen health outcomes and life expectancy improve in recent years. Its economy, too, has grown substantially over the past decade: the country attained lower-middle-income status in 2020 and has a gross national product (GNP) per capita 50 percent higher than the average for sub-Saharan Africa (US$2,549 vs.
US$1,633) (IMF 2017). Despite these promising signs, however, Côte d'Ivoire's health outcomes and quality indicators are at best similar to, and in many cases lag behind, those of other nations in sub-Saharan Africa. World Bank data show life expectancy at birth increased from 55 years in 2010 to 59 years in 2020, but that is five years less than in neighboring Ghana (64 years) and two years behind the sub-Saharan Africa average (World Bank 2021a and 2021b). According to the Demographic and Health Survey of Côte d'Ivoire 2021, the infant-juvenile (under-five) mortality rate is 74 per 1,000 live births, infant mortality is 52 per 1,000, neonatal mortality is 30 per 1,000, and maternal mortality is 385 deaths per 100,000 live births (INS and ICF 2023). Those outcomes are far worse than in other lower-middle-income countries, where average life expectancy is 69 years, the maternal mortality ratio is 255 per 100,000, and under-five mortality is 45 per 1,000 (World Bank 2021c and 2023a). These comparisons show there is considerable room for improvement in the health care system.

To improve knowledge about the quality of primary health care services, Côte d'Ivoire joined the Primary Health Care Performance Initiative (PHCPI) in 2017. This initiative was funded by the Bill and Melinda Gates Foundation and implemented in partnership with the World Bank, the World Health Organization (WHO), Results for Development, Ariadne Labs, the United Nations Children's Fund (UNICEF), and the Global Fund. PHCPI promoted the use of a common scorecard, the "vital signs profile" (VSP), which pooled data from multiple sources to provide a comprehensive view of the state of primary health care (PHC). Côte d'Ivoire was one of the first countries to release a VSP, in 2018, and published an updated version in 2021.

Côte d'Ivoire's latest VSP highlighted ongoing weaknesses in delivery of care for mothers and children and in treatment of major infectious diseases.
Only 17 percent of young children with diarrhea received appropriate oral rehydration therapy in 2019, and only 50 percent of tuberculosis cases were detected and successfully treated (PHCPI 2021). Some core services were not routinely available across the country. Across a basket of five maternal and child health services (management of the sick child, vaccination, family planning, antenatal care, and prevention of mother-to-child transmission of human immunodeficiency virus [HIV]) considered essential in all primary health care sites, 84 percent of items were being provided, suggesting gaps in some facilities. Only 54 percent of a basket of services for communicable conditions (sexually transmitted infections, tuberculosis, and HIV) were being provided. The vital signs profile also highlighted equity gaps in child mortality rates (78 per 1,000 in urban areas, 108 in rural) and in coverage of maternal and child services by maternal education (63 percent for mothers with secondary-level education or above versus 43 percent for those without) (PHCPI 2021; see Annex 1).

Ensuring professionals have the clinical skills necessary to diagnose and treat common conditions is essential, but Côte d'Ivoire had no data on this topic because it had previously used the Service Availability and Readiness Assessment (SARA) facility survey. SARA examines availability of staff, equipment, medications, supplies, and services but has no module for evaluating clinical skills. The country has since switched to the Service Delivery Indicators (SDI) facility survey tool, which uses clinical vignettes to assess skills. Publication of these data is anticipated in the near future. The vital signs profile identified elements in the country's capacity for managing health human resources that could be used in the future to monitor clinical skills and competency.
Countries participating in PHCPI deployed the "progression model" tool created for the initiative, which evaluated a country's underlying capacity for providing primary health care, such as its ability to plan and execute strategies, provide necessary infrastructure, manage human resources, and collect and use health information. Those evaluations were based on structured interviews, document reviews, and field visits. The analysis showed Côte d'Ivoire had a well-established supportive supervision program (Measure 33; see Annex 1), with district supervisors visiting PHC facilities quarterly to identify service gaps, support improvements, and monitor progress. The country also had mechanisms to ensure health care providers were licensed (Measure 19) and to define standards for what each profession should be able to do (Measure 20).

With support from PHCPI, Côte d'Ivoire embarked on activities to address these issues of suboptimal outcomes and quality of care and lack of information on clinical competency. In 2019, Côte d'Ivoire received seed funding from PHCPI's Opportunities Fund to import clinical vignettes from World Bank projects in other African countries and modify their content to align with local guidelines and practices. After a hiatus due largely to the COVID-19 pandemic, Côte d'Ivoire was selected to participate in the "Intensive Country Engagement" program of PHCPI, designed to provide additional technical support to countries addressing gaps in quality measurement shown in their VSPs. Section V of this report gives a detailed description of Côte d'Ivoire's experience for countries interested in replicating such a program.

Box 1.
Overview of Côte d'Ivoire's Primary Health Care System

Côte d'Ivoire provides primary health care (PHC) through facilities called établissements sanitaires de premier contact (ESPCs), which include the following:

• Rural health centers (centres de santé ruraux, CSRs)
• Rural dispensaries (dispensaires ruraux)
• Urban health centers (centres de santé urbains, CSUs)
• Urban health formations (formations de santé urbaines, FSUs)

Rural facilities are typically staffed by nurses, midwives, and health aides (aides-soignantes). Facilities may include examining rooms, a basic pharmacy, and a maternity ward. Urban sites also have physicians and may house other staff (e.g., pharmacists, dentists, social workers).

There are 3,411 ESPCs, of which 2,311 are public and 1,100 private. Public ESPCs include 1,730 rural and 581 urban facilities. Private ESPCs include 760 for-profit, 96 faith-based, 54 corporate, and 190 nongovernmental organization (NGO)-operated centers.

ESPCs refer complex cases first to the district general hospital (hôpital général du district), then to the regional hospital center (centre hospitalier régional, CHR) if more complex, and to university hospital centers (centres hospitaliers universitaires, CHUs) for the most complex cases.

Health system oversight is organized in tiers. Supervisors at the district level oversee quality of care in each facility. They report to regional directors, who in turn report to the Ministry of Health and Public Hygiene (Ministère de la Santé et de l'Hygiène Publique, MSHP) and the General Directorate of Health (Direction Générale de la Santé, DGS).

Source: MSHPCMU 2021.

INVESTMENTS IN PHC AND CONCURRENT ACTIVITIES TO IMPROVE HUMAN RESOURCES

Côte d'Ivoire's activities to measure and improve clinical competency are one component of a broader strategy to strengthen primary health care.
Priorities in the country's 2016–2020 National Health Development Plan (Plan national de développement sanitaire) include strengthening governance and leadership, health human resources, health information systems for decision making, and health care financing, as well as reducing maternal and neonatal mortality and improving other outcomes. In human resources, the emphasis was on reducing geographic disparities in the supply of health care professionals by increasing the supply of providers and through strategies for local retention (MPD 2020).

In 2017, Côte d'Ivoire officially joined the Global Financing Facility (GFF), an initiative that provides technical assistance to help countries develop and implement national health plans to improve access to quality care for women, children, and adolescents. Participating countries develop an investment case outlining priority interventions that will help them introduce universal health coverage and achieve the United Nations' (UN's) Sustainable Development Goals (SDGs). Côte d'Ivoire's investment case highlighted its priorities for strengthening its health care system, including human resources. GFF funding comes from a dedicated trust fund, which pools contributions from donors to avoid fragmentation due to differing funder priorities. Côte d'Ivoire's 2020 investment case highlighted system-wide weaknesses: insufficient human resources, weak use of services (particularly in rural areas), variable quality of health care, and poor quality of data stored in health information systems, rendering them insufficient for decision making. Among the seven priority interventions identified was a call to improve the availability of qualified personnel (World Bank 2023b).
The World Bank Strategic Purchasing and Alignment of Resources and Knowledge (SPARK) in Health Project, co-financed with the GFF, supported activities to strengthen primary care quality and human resources. It was launched in Côte d'Ivoire in 2019 with a credit of US$200 million from the International Development Association of the World Bank and a US$20 million contribution from the GFF Trust Fund (World Bank 2023b). Program activities included the following:

• Rehabilitating and equipping health facilities
• Strengthening human resources, including training and addressing knowledge gaps for instructors and in-service training for health professionals
• Improving health information systems
• Strengthening quality of primary care, with a special focus on reproductive, maternal, newborn, child, and adolescent health

There are opportunities to integrate a clinical competency evaluation program into a broader strategy for strengthening health human resources and primary health care. The poor distribution of health professionals is frequently cited as a factor in gaps between urban and rural areas in access to quality care. Measuring and improving the clinical skills of rural health care providers is one complementary strategy for ensuring good access to quality care. This is especially important because rural health center staff are typically nurses and midwives, who will need help strengthening their diagnosis and treatment skills, tasks that are handled by physicians in urban settings.

PART III – PROGRAM DESIGN AND METHODOLOGY

OVERVIEW OF CLINICAL VIGNETTES

Clinical vignettes are scenarios presented to health care providers to test their ability to diagnose and treat key illnesses. An examiner reads a standardized, hypothetical case scenario that briefly describes the presenting complaint. The provider then asks the examiner questions as if taking the history from the patient.
Responses to the questions the provider should ask are printed on the scoring sheet, and the examiner reads back the response to each question asked. Points are awarded for asking questions relevant to the case. After taking the history, the provider specifies what to look for during the physical examination, the lab tests to order, the diagnosis, the treatment, and the advice to give the patient.

Clinical vignettes are one of many approaches for measuring clinical competency; their main advantages are affordability and ease of scale-up. The gold standard is the objective structured clinical examination (OSCE), which uses trained live actors to simulate patients. OSCEs offer greater realism, as the provider must consider visual cues about the patient's condition and can be observed conducting an actual physical exam rather than simply stating what should be examined. The disadvantage is that the approach is very expensive. Peabody et al. (2000) found that quality of care scores using clinical vignettes varied by only 5 percent compared to OSCEs and were more accurate than chart reviews. Therefore, in many settings (particularly those with limited resources), clinical vignettes represent a cost-effective alternative to the gold standard.

Direct observation is another valid method of measuring clinical competency, which Côte d'Ivoire may continue in parallel with clinical vignettes in the future. In this method, examiners observe providers performing actual patient assessments in a PHC facility and assess their skill at taking histories, doing physical exams, diagnosis, and treatment. Leonard and Masatu (2005) found that providers typically implemented fewer desired actions when directly observed than they said they would in a vignette review, so direct observation likely gives a more accurate picture of actual practice. Its disadvantages are that cases are not standardized and there is no control over which cases arrive during the examiner's visit.
District supervisors in Côte d'Ivoire do some direct observation of general history-taking and examination skills, but not of how specific clinical conditions are managed. However, continuing to observe providers with patients could show whether the history and physical exam were conducted adequately in real-life circumstances.

Vignettes can be used for assessment or training, at different stages of a health care provider's career and in different settings. Individuals can practice working through vignettes to learn how cases should be managed and to acquire new medical knowledge, either during medical training or as part of continuing medical education. Vignettes can also be used to evaluate whether individuals have acquired the necessary level of knowledge when they complete training, or to identify gaps in knowledge among practicing providers. Health facility surveys such as the Service Delivery Indicators (SDI) survey use vignettes for this purpose, and the results can be used to set priorities for future investments in education.

VIGNETTE DESIGN

The vignettes our team created for Côte d'Ivoire were designed to simulate five stages of a clinical evaluation: history taking, physical examination and tests, diagnosis, treatment planning, and advice to the patient. Each vignette was designed to represent a case of a particular disease, and very limited information (such as the presenting complaint) was given to the provider at the outset. The provider is prompted to ask questions that reveal the diagnostic criteria for identifying a disease and its evidence-based treatments, as described in national clinical practice guidelines.

The first clinical vignettes in Côte d'Ivoire were introduced during the 2019 Opportunities Fund project. They were based on a template from the SDI facility survey and were adapted from health provider assessment programs in other sub-Saharan countries.
In internal discussions we identified several issues with them, including the following:

• Patient history questions that providers were expected to ask were focused on the disease being simulated and did not include other important questions that would rule out alternate diagnoses.
• If providers asked patient history questions in an unconventional order, the examiner might have difficulty finding them on the scoring sheet.
• Many vignettes did not include certain routine questions that should be asked in any patient assessment (e.g., allergies, current medications, past medical history).
• In many cases, providers asked the same question in different ways. For example, they might ask, "Do you have breathing problems?" rather than "Are you coughing or wheezing?" This could lead to ambiguities in interpretation.
• There was no allowance for a partially correct diagnosis. For example, gastroenteritis could be diagnosed, but the level of severity (acute) was not explicitly mentioned, even if the treatment given was consistent with the level of severity.
• The model did not allow for reasonable variations in treatment. For example, in one vignette the dose of rehydration fluid had to be exactly 700 milliliters (ml), and a response of 705 ml would have been marked as an error even though it would have been curative.

To promote standardization among the vignettes, we organized the questions in each of the five sections (history, physical examination and tests, diagnosis, treatment plan, and advice to the patient) into subcategories (shown in Table 1 and Figure 1).
For example, when providing advice to the patient at the end, the framework sets consistent expectations that providers will do the following:

• Explain the diagnosis
• Describe the purpose of treatments, how to administer them, and why they are important
• Mention warning signs of further problems the patient should watch for
• Make follow-up arrangements
• Offer the opportunity for patients to ask questions

These subsections help ensure all these items are addressed in patient encounters. The reorganization of questions should help examiners find the response matching the provider's question if a provider asks questions in an unconventional order.

Table 1. Organizing Framework for Vignette Design

Clinical vignette sections and their subcomponents:

History
(a) Presenting complaint (i.e., general quality or physical characteristics, duration, frequency, intensity, location, etc.)
(b) Review of body systems
(c) Past medical conditions, vaccination status, medications, and allergies

Physical examination
(a) Vital signs and diagnostic tests
(b) Special examinations as dictated by the presenting complaint
(c) Examination of all body systems (i.e., head and neck, chest, cardiovascular, abdominal, genitourinary; if appropriate, musculoskeletal, neurological, and skin)
(d) Order lab tests and other investigations

Diagnosis
(a) Diagnosis
(b) Classification of severity level (if required)

Treatment plan
(a) Medications
(b) Therapeutic procedures, nondrug interventions
(c) Organization of referrals where necessary

Advice to the patient
(a) Inform the patient of the diagnosis, the nature of the disease, and its importance
(b) Describe purpose of medication and treatments and their importance
(c) Explain how to take medications and treatments
(d) Describe warning signs to watch for that warrant immediate medical attention
(e) Make follow-up arrangements
(f) Offer to answer questions

Source: Framework developed by authors.

Figure 1.
Figure 1. Example of Organization of Questions on History and Physical Examination in Clinical Vignettes

Scenario: "My 22-month-old son is sick with a bad cough and I'm worried about him."

| # | Question asked | Examiner's response | Points | Check |
|---|---|---|---|---|
| | History [25 points] | | | |
| | Details of the complaint (13 points) | | | |
| Q2 | Duration of cough? | 3 days | 2 | ___ |
| Q3 | Wet or dry cough? | Wet cough | 2 | ___ |
| Q4 | Presence of blood in the sputum? | No | 1 | ___ |
| Q5 | Sputum color? | Yellow-green | 2 | ___ |
| Q6 | Fever? | Yes | 2 | ___ |
| Q7 | Chest pain? | No | 1 | ___ |
| Q8 | Difficulty breathing? | Yes | 3 | ___ |
| | Other symptoms (7 points) | | | |
| Q9 | General health (awake, lethargic, tired) | Not his normal self | 1.5 | ___ |
| Q10 | Appetite | A bit decreased | 0.5 | ___ |
| Q11 | Vomiting | No | 1 | ___ |
| Q12 | Difficulty swallowing / able to drink? | Drinking but not as much | 1 | ___ |
| Q13 | Red eyes | No | 0.5 | ___ |
| Q14 | Ear problems/discharge/pain? | No | 0.5 | ___ |
| Q15 | Diarrhea | No | 0.5 | ___ |
| Q16 | Seizures / convulsions | No | 1 | ___ |
| Q17 | Rash | No | 0.5 | ___ |
| | Medical history / medications / allergies (6 points) | | | |
| Q18 | Close contact with a possible case of COVID | Not that I know of | 1 | ___ |
| Q19 | Recent history of measles | No | 0.5 | ___ |
| Q20 | Other medical history | No | 1 | ___ |
| Q21 | Family history of asthma | No | 0.5 | ___ |
| Q22 | Medications / treatments | Paracetamol | 1 | ___ |
| Q23 | Vitamin A | Not received for 6 months | 1 | ___ |
| Q24 | Allergies | None | 1 | ___ |

Source: Authors.

To remove ambiguity in scoring where we anticipated that a provider might phrase the same probing question in different ways, the valid alternatives were listed within a single question, separated by slashes, as demonstrated in Figure 2.

Figure 2. Example of Clinical Vignette Questions with Multiple Options for Wording

| Question asked | Examiner's response | Points | Check |
|---|---|---|---|
| General activity level / alertness | Very poor energy | 2 | ___ |
| Breathing difficulties / cough | None | 3 | ___ |
| Abdominal / pelvic / labor pain? | Pain in the upper-right part of the belly. No contractions. | 1 | ___ |
| Do you feel any movements / kicks from the baby? | Yes, a lot. | 1 | ___ |
| Vaginal bleeding / vaginal discharge? | No | 1 | ___ |
| Seizures / convulsions? | No | 1 | ___ |

Source: Authors.

In the diagnosis section, providers were asked to name the diagnosis, with alternate names for the diagnosis listed for clarity. In Figure 3, for example, dysentery, gastroenteritis, and acute diarrhea are all acceptable answers. Often there are additional details regarding the diagnosis, such as the severity level. Such details were scored separately, to allow measurement of partial knowledge of the disease. In Figure 3, for example, the provider gets credit for diagnosing dysentery and receives additional points for correctly identifying the level of dehydration. The scoring sheet also tracks whether the provider failed to make a diagnosis or made an incorrect diagnosis, and records the names of any incorrect diagnoses proposed. Identifying such errors can help improve training programs, or expose flaws in vignette design that led a skilled individual to make a different diagnosis than intended.

Figure 3. Example of Structure of Clinical Vignette Questions regarding Diagnosis

Diagnosis: Check ALL diagnoses mentioned by the clinician. If the clinician does not propose a diagnosis, ask for one. [10 points]

| # | Item | Points | Check |
|---|---|---|---|
| Q48 | Dysentery (or gastroenteritis or acute diarrhea) | 4 | ___ |
| Q49 | With moderate dehydration or Category B dehydration of the WHO protocol | +6 (extra points added only if Q48 is selected) | ___ |
| Q50 | Dehydration without level, or severe dehydration, or mild dehydration | +0 | ___ |
| Q51 | Other diagnosis (if Q48 is not mentioned) | -10 | ___ |
| Q52 | Specify the alternative diagnosis: ____________________ | | |
| Q53 | No diagnosis was given, even AFTER asking the clinician | -10 | ___ |

Source: Authors.

Similarly for treatment, the scoring sheet was designed to measure partial knowledge of the correct management strategy. As Figure 4 shows, providers receive some points for identifying the correct treatment (oral rehydration solution) and additional points for the dose and frequency.
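The conditional logic in Figure 3 (a severity bonus that applies only when the main diagnosis is named, and a penalty when the diagnosis is wrong or missing) can be sketched as follows. This is an illustration using Figure 3's point values, not the program's actual scoring software; the function name and arguments are our own.

```python
def diagnosis_section_score(named_main_diagnosis: bool,
                            named_correct_severity: bool,
                            other_or_no_diagnosis: bool) -> int:
    """Score the diagnosis section of the dysentery vignette in Figure 3.

    Point values come from Figure 3: Q48 (dysentery / gastroenteritis /
    acute diarrhea) earns 4; Q49 (moderate dehydration) adds 6 only when
    Q48 was selected; Q51/Q53 (wrong or missing diagnosis) cost 10.
    """
    score = 0
    if named_main_diagnosis:
        score += 4                       # Q48: main diagnosis named
        if named_correct_severity:
            score += 6                   # Q49: bonus requires Q48
    elif other_or_no_diagnosis:
        score -= 10                      # Q51 or Q53 penalty
    return score

# Full credit, partial credit, and penalty cases:
assert diagnosis_section_score(True, True, False) == 10
assert diagnosis_section_score(True, False, False) == 4
assert diagnosis_section_score(False, False, True) == -10
```

The `elif` mirrors the sheet's structure: the penalty items (Q51, Q53) only apply when the main diagnosis (Q48) was not mentioned.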
Reasonable variations in treatment doses may exist where slightly different formulas are in common use or where providers round doses for practical reasons. For that reason, the authors defined a range of acceptable dosages, as illustrated in Figure 4.

Figure 4. Example of Clinical Vignette Questions on Treatment

| # | Question asked | Examiner's response | Points | Check |
|---|---|---|---|---|
| Q54 | Rehydration with oral rehydration solution (ORS) | | 3 | ___ |
| Q55 | ORS amount: for a child of 9 kg, in the first 4 hours give 9 x 75 ml, or 675 ml (a response between 540 ml and 750 ml is acceptable) | | 9 | ___ |
| Q56 | After the first 4 hours, provide as much ORS as possible until diarrhea stops. After each stool, give 50–100 ml | | 3 | ___ |
| Q57 | Vitamin A capsules | | 1 | ___ |
| Q58 | Dose: 10 or 20 mg per day for 10–14 days | | 2 | ___ |

Source: Authors. Notes: ml = milliliters; kg = kilograms; mg = milligrams.

We adapted some questions for rural areas where health facilities have limited access to drugs, equipment, or investigations. Where guidelines recommend alternative treatments for smaller centers lacking infrastructure (e.g., centers that cannot deliver intravenous therapy), we also listed the acceptable alternatives, such as an intramuscular medication (see Figure 5).

Figure 5. Example of Structure of Clinical Vignette Questions to Accommodate Differences in Resources in Different Facilities

| # | Question asked | Notes | Points | Check |
|---|---|---|---|---|
| Q53 | Start treatment with antibiotic by iv (preferred) or im | (If the provider intends to refer, treatment must begin before the patient is transferred) | 4 | ___ |
| Q54 | Choosing an appropriate antibiotic. Eligible responses: Ceftriaxone 50 mg/kg im or iv per day; or Ampicillin 50 mg/kg iv/im every 6 hours PLUS gentamicin iv/im 5 mg/kg/day; or Chloramphenicol 25 mg/kg im | | 10 | ___ |

Source: Authors. Notes: iv = intravenous; im = intramuscular; mg = milligrams; kg = kilograms.
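The dose check behind Q55 in Figure 4 is simple arithmetic, sketched below. The 75 ml/kg target and the 540–750 ml band for a 9 kg child are taken directly from Figure 4; because the paper does not state how acceptance bands are derived for other weights, the band is passed in explicitly rather than computed. Function names are illustrative.

```python
def ors_first_4h_target_ml(weight_kg: float) -> float:
    """Guideline target for ORS in the first 4 hours: 75 ml per kg (Figure 4)."""
    return 75 * weight_kg

def dose_acceptable(answer_ml: float, low_ml: float, high_ml: float) -> bool:
    """Accept any stated dose within the examiner's acceptance band."""
    return low_ml <= answer_ml <= high_ml

# The 9 kg child from Figure 4: 9 x 75 ml = 675 ml, acceptable band 540-750 ml.
assert ors_first_4h_target_ml(9) == 675
assert dose_acceptable(700, 540, 750)       # a rounded-up answer still earns the points
assert not dose_acceptable(500, 540, 750)   # outside the band
```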
CLINICAL VIGNETTE SCORING

A scoring rubric allows for the calculation of a summary measure describing a provider's overall level of competency in managing a case. We were not able to find any international standard in the literature on how to score a vignette. One possible reason for this lack of standardization is that the relative weight given to different parts of a vignette ultimately reflects what the user prioritizes as important for providers to know, and these priorities may vary from one user or country to another. Nonetheless, given the need for a scoring rubric, we chose a practical approach and adapted, with minor modifications, the rubric used in other World Bank projects described previously. This rubric was based on a previous version of the SDI instrument, where each section was assigned a number of points, as shown in Table 2. Penalties were given for an incorrect diagnosis and, in some cases, for prescribing an inappropriate treatment. In the hypothetical situation where a provider received a negative score, the final score was zero.

Table 2. Scoring Scheme for Clinical Vignettes

| Clinical vignette section | Points |
|---|---|
| History | 20–25 |
| Physical examination and diagnostic tests | 20–25 |
| Diagnosis | 10 (penalty of up to 10 points for incorrect diagnosis) |
| Treatment plan | 20–25 |
| Advice to the patient | 20–25 |
| TOTAL | 100 |

Source: Authors.

Each item in the vignette was weighted by its importance. At this level, we incorporated some modifications to account for the changes to vignette design described above. Core questions on the signs and symptoms that met criteria for the disease or differentiated its severity were given relatively high weights. Questions not related to the main diagnosis but useful for ruling out other diagnoses or identifying comorbidities were included but received relatively lower weights.
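The rubric in Table 2, combined with the penalty and floor-at-zero rules just described and the Ministry's score cutoffs reported later in this section, can be sketched as follows. This illustrates the arithmetic only; it is not the project's software, and the section labels and example numbers are ours.

```python
def vignette_total(section_scores: dict[str, float], penalties: float = 0.0) -> float:
    """Sum section subtotals (Table 2), subtract penalties, floor at zero."""
    return max(sum(section_scores.values()) - penalties, 0.0)

def classify(score: float) -> str:
    """Ministry of Health cutoffs: >=80 superior, 50-79 satisfactory, <50 unsatisfactory."""
    if score >= 80:
        return "superior"
    if score >= 50:
        return "satisfactory"
    return "unsatisfactory"

# A hypothetical provider: strong history, missed diagnosis with a 10-point penalty.
subtotals = {"history": 20, "exam": 15, "diagnosis": 0, "treatment": 18, "advice": 10}
total = vignette_total(subtotals, penalties=10)
assert total == 53
assert classify(total) == "satisfactory"
assert vignette_total({"history": 5}, penalties=10) == 0   # negative totals floor at zero
```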
For example, in the vignette in Figure 1 about a cough that turns out to be pneumonia, questions related to diagnosis and classification (e.g., presence of sputum or blood, breathing difficulties, fever, lethargy) were weighted higher than questions to rule out concurrent conditions (e.g., dehydration, gastroenteritis, febrile seizures). Similarly, questions describing the most important aspects of treatment received higher weighting: more points were given for naming the definitive guideline-based treatment, while measures to optimize how treatment is delivered received fewer points. For example, in diarrhea with moderate dehydration, the primary treatment is oral rehydration salts. Zinc supplementation may help reduce the duration of symptoms by 0.5 to 1.0 day (Lazzerini and Wanzira 2016) but would not address the urgent problem of dehydration, and is weighted lower. If the provider proposed inappropriate or contraindicated treatments, penalties were applied.

The diagnosis section (Figure 3) gives partial points if the provider could name the main diagnosis but not supplemental information such as severity or comorbidities. In general, approximately half the points were awarded for the main diagnosis and the remainder for the latter two. The exact division of points varied depending on the extent to which severity and comorbidities were essential to treatment decisions. A penalty of 10 points was applied if the diagnosis was completely inaccurate or missing.

Finally, overall scores were ranked to classify the health provider's clinical competency. The score cutoffs set by the Ministry of Health were: 80 or above, superior; 50 to 79, satisfactory; and less than 50, unsatisfactory. An unsatisfactory ranking reflects failure to ask key questions, leading to either misdiagnosis or inappropriate treatment. A satisfactory ranking suggests knowledge of enough questions to achieve correct diagnosis and treatment. A superior ranking suggests the provider asked a comprehensive list of questions to rule out the broadest range of alternate diagnoses.

EVALUATING VALIDITY

"Content validity" refers to the extent to which the assessment instrument is relevant and representative of what it aims to measure. This was handled by extensive cross-referencing of vignette questions to their corresponding sections in national practice guidelines, as shown in Figure 6.

We assessed the vignette's face validity through field tests and presentations to district supervisors. A vignette with good face validity should be a realistic representation of an actual case, including signs, symptoms, and treatments, as perceived by users of the tool. Validity was assessed during field tests with providers in Abidjan, training sessions with supervisors, and the Project ECHO feedback sessions with supervisors and providers after the first phase of initial data collection (described below). At each of these checkpoints, individuals were invited to flag any errors or concerns. This process allowed for timely responses to errors or misunderstandings; for example, midwives asked to use their delivery registry as a decision support tool during vignette review sessions, because this tool is how they manage cases in real life.

Criterion validity examines whether our questions and scoring accurately measure providers' competence. More data will have to be gathered to confirm that we calibrated the point values for questions appropriately. Results from the initial deployment of vignettes (detailed below) rated most providers "unsatisfactory" with a score of less than 50. This could indicate either that providers are not meeting the expectations for patient assessment described in the vignettes, or that the scoring system needs recalibration. To distinguish between these possibilities, the Ministry of Health could compare scores on vignettes against an independent expert review of the responses.
If experts judge that the responses would have resulted in satisfactory care in cases where the provider's vignette score was unsatisfactory, the allocation of points per question could be adjusted.

Figure 6. Excerpt from Reference Tables Mapping Clinical Vignette Questions to Clinical Practice Guidelines

Vignettes: 1 Diarrhea, moderate; 2 Diarrhea, severe
Guidelines: Integrated Management of Illness of Neonates and Infants, Côte d'Ivoire, Booklet of Tables, August 2018 edition

| Section | Section of guidelines |
|---|---|
| History | p. 67, section on diarrhea |
| Observations | p. 67, section on diarrhea |
| Diagnosis (diagnostic criteria) | p. 10, classification of diarrhea according to level of severity of dehydration, and treatment principles |
| Treatment | p. 10, treatment principles; p. 23, further instructions on treatment |

Source: Authors.

As well, a quality assurance program will be required to ensure inter-rater reliability in scoring vignettes. Although examiners received standardized training for this initial work, over time different interpretations of instructions could arise, or staff turnover could lead to variation in training and skill in administering the vignettes. For future reference, we developed a role-playing training module, in which examiners practice administering a vignette to a peer who plays the role of the provider. The peer asks a set of specially prepared mock questions deliberately designed to be challenging for the examiner to interpret, through the inclusion of vaguely worded or ambiguous questions, or questions asked in an illogical order (Figure 7a). The examiner must then find each mock question on the vignette scoring sheet, read back the appropriate response, and award the correct number of points. After conclusion of the vignette, the examiner compares his or her score sheet against a reference score sheet containing the correct scoring for the mock questions (Figure 7b).
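The comparison against reference sheets lends itself to a simple consistency check, sketched below under our own naming; the five-point average discrepancy used as the retraining trigger is the example threshold given in the text.

```python
def needs_retraining(examiner_scores: list[float],
                     reference_scores: list[float],
                     threshold: float = 5.0) -> bool:
    """Flag an examiner whose mock-vignette scoring drifts from the reference.

    Compares the examiner's totals against the reference sheets' totals
    and triggers a repeat of the exercise when the mean absolute
    discrepancy reaches the threshold (five points in the text's example).
    """
    gaps = [abs(e - r) for e, r in zip(examiner_scores, reference_scores)]
    return sum(gaps) / len(gaps) >= threshold

# Two practice rounds: gaps of 4 and 10 points average to 7, so repeat.
assert needs_retraining([62, 70], [58, 80]) is True
assert needs_retraining([62, 70], [60, 72]) is False
```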
Several examples of these mock questions and reference sheets have been included in the training manual, allowing examiners to be tested for consistency multiple times. If, on average, there is a major discrepancy (e.g., five points) between what the examiner scored and the score on the reference sheet, the examiner would need to repeat the exercise until he or she achieves scores consistent with the reference sheets.

Figure 7a. Excerpt of Dialogue with Mock Questions to Evaluate Examiners' Ability to Score Vignettes Correctly

Role-Play Script for Malaria

Before each section, wait for the examiner to invite you to ask questions.

History: How long has he had a fever? Does he get chills? Headaches? Does he have a sore throat? Sore ears? Is he coughing? Has he vomited? How is his cough? Thick or dry? Is there blood in the vomit? Does his stomach hurt? Does he have pimples on the skin? Does he have problems urinating? Does he have difficulty breathing? Does he have diarrhea? Are his eyes yellow? Do you use mosquito nets? Does the child sleep under the mosquito net? Do you know measles? Is your child vaccinated? What medications have you already given him before coming? Is he vaccinated against measles? What doses did you give the child for all medicines? How long have you been deworming him? Does he have any known allergies?

Source: Authors.

Figure 7b. Excerpt of Reference Sheet to Evaluate Examiners' Ability to Score Vignettes Correctly

| # | Question asked | Investigator's response | Points | Tick |
|---|---|---|---|---|
| | History [25 points] | | | |
| | Details of the complaint (7) | | | |
| Q2 | Duration of fever? | A week | 2 | x |
| Q3 | Frequency of fever? | Some days no fever, others it is very hot | 2 | |
| Q4 | Chills? | Yes | 1 | x |
| Q5 | Sweats? | Yes | 1 | |
| Q6 | Level of alertness / activity / general malaise? | Decreased, especially when he has a fever | 1 | |
| | Other symptoms (13) | | | |
| Q7 | Appetite? | He eats, not as much as usual | 1 | |
| Q8 | Convulsions? | No | 1 | |
| Q9 | Headaches? | No | 1 | x |
| Q10 | Ear pain or throat? | No | 0.5 | x |
| Q11 | Difficulty breathing? | No | 1 | x |
| Q12 | Cough? | Yes, a little | 0.5 | x |
| Q13 | Type of cough / wet or dry? | The cough is dry, not greasy | 0.5 | x |
| Q14 | Nausea / vomiting? | Yes, sometimes, 3 times this week | 1 | x |
| Q15 | Blood in vomiting? | No | 0.5 | x |
| Q16 | Diarrhea? | No | 1 | x |
| Q17 | Stomach ache / abdominal pain? | No | 1 | x |
| Q18 | Problems urinating? | No | 0.5 | x |
| Q19 | Bleeding / epistaxis? | No | 0.5 | |
| Q20 | Skin rashes? | No | 0.5 | x |
| Q21 | Jaundice? | No | 1 | x |
| Q22 | Muscle pain / aches? | No | 0.5 | |
| Q23 | Use of impregnated mosquito nets? | Not always | 1 | x |
| | Medical history / medications / allergies (5) | | | |
| Q24 | Does he have measles, or has he had it in the last 3 months? | No | 0.5 | x |
| Q25 | Any other medical history? | No | 1 | |
| Q26 | Vaccines? | His vaccinations are up-to-date | 1 | x |
| Q27 | Medications / treatments? | I gave him paracetamol | 1 | x |
| Q28 | Vitamin A? | Not received for 6 months | 0.5 | |
| Q29 | Deworming / albendazole / mebendazole? | Not received for 6 months | 0.5 | x |
| Q30 | Allergies? | No | 0.5 | x |

Source: Authors.

PART IV – PROGRAM IMPLEMENTATION

TARGET GROUP FOR EVALUATION AND PRIORITY TOPICS

The Ministry of Health focused its evaluation of clinical skills on primary health care providers (doctors, nurses, and midwives) and was especially interested in evaluating nurses and midwives in rural health centers that do not have physician coverage. These staff may not have had adequate training to diagnose and treat common conditions. The vignette topics were based on Côte d'Ivoire's disease burden priorities: maternal and child health conditions and infectious diseases. These conditions were also identified as areas of weakness in the VSP, and additional topics are expected to be added over time. Multiple versions of the same topic with different degrees of severity were developed to test the provider's ability to select the appropriate intensity of treatment (see Table 3).

Table 3. Vignettes Developed

| Vignette topic |
|---|
| Childhood dysentery with moderate dehydration |
| Childhood dysentery with severe dehydration |
| Childhood upper respiratory infection |
| Childhood pneumonia, normal |
| Childhood pneumonia, severe |
| Childhood malaria, simple |
| Childhood severe acute malnutrition without complications |
| Childhood severe acute malnutrition with complications |
| Adult tuberculosis |
| Pregnant woman, preeclampsia |

Source: Authors.

EXAMINERS AND TRAINING

The examiners chosen for the project were district supervisors, who are part of Côte d'Ivoire's well-established supportive supervision program. Supervisors visit every PHC facility in each of the country's 113 districts four times a year. District supervisors also offer educational activities to PHC providers and hence are ideally suited to administering the vignettes. Some vignette assessment models (such as the SDI survey) use two enumerators, one to record results and the other to play the role of the patient; for reasons of practicality, cost, and sustainability, Côte d'Ivoire used one. Examiners received a training manual and attended training sessions held in multiple regions around the country. Two individuals from each district participated in a two-day training session; a total of five sessions were held in three different cities to offer training in reviewing the vignettes, including their format, method of administration, and scoring.

LOCATION OF EVALUATION AND USE OF DECISION SUPPORTS

Supervisors evaluated providers using clinical vignettes at the health facilities where the providers worked. This took place during the supervisors' quarterly visits, during which supervisors also carried out regular activities such as direct observation of patient care. Each provider was evaluated separately from colleagues. To simulate real-life situations, the Ministry of Health allowed limited access to decision support tools, such as protocols or reminders posted on the wall of the health facility.
Managing clinical cases requires knowledge of details such as dosages of drugs, fluids, and other treatments that require the use of formulas; in real-life situations it is reasonable, even desirable, for health care providers to consult protocols or similar resources. However, participants were not allowed to read through an entire clinical guideline to find the answer.

INITIAL RESULTS

The Ministry of Health asked districts participating in training sessions to submit the results from a sample of vignettes, gathering data from 202 reviews from établissements sanitaires de premier contact (ESPCs) in 69 districts. This initial small sample of results from clinical vignette reviews showed that, across all conditions and districts, only 36 percent of health care providers received a satisfactory score of 50 or above, and only 2 percent had superior performance with a score of 80 or above. More than 60 percent had scores rated unsatisfactory. Low scores were noted in all types of primary health care facilities (Figure 8) and for different conditions (Figure 9).

Figure 8. Distribution of Vignette Scores by Type of Primary Health Care Facility

Source: Authors' analysis using vignette results. Notes: Each box plot shows the lowest value observed (lower marker), 25th percentile (bottom of box), median (middle line in box), 75th percentile (top of box), and highest value observed (upper marker). DR = Rural dispensary; CSU = Urban health center; CSR = Rural health center; DU = Urban dispensary.

Figure 9. Vignette Scores by Clinical Scenario

Source: Authors' analysis using vignette results. Note: SAM = Severe acute malnutrition.

The average time to administer an individual vignette was 20 minutes, plus an additional six minutes spent with providers discussing the process beforehand and reviewing results afterwards. This is consistent with the time it took to administer vignettes for the SDI health survey (verbal communication, SDI team).
Examiners expect administration time to decrease as examiners and providers gain experience.

USE OF PROJECT ECHO TO TROUBLESHOOT IMPLEMENTATION ISSUES AND SUPPORT PROVIDER COMPETENCY

Côte d'Ivoire has a well-established Project ECHO learning platform for virtual training and capacity building. Project ECHO links experts on a selected topic (working in a central location) with personnel in dispersed or remote locations. Participants receive a combination of didactic training and feedback on how to manage challenging cases. Côte d'Ivoire has used this platform for human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) training since 2018 and for COVID-19 since 2020. The program was established through a WHO grant and is administered by the Institut National de Santé Publique (National Institute of Public Health) with support from faculty at the University of New Mexico, where the initiative was founded. As of 2023, all district offices have videoconferencing equipment. Remote primary health care staff can log in over the Internet if available, or travel no more than one hour to their district office to participate.

Following initial training, learning sessions were established for district supervisors and for midwives to discuss implementation problems. The University of New Mexico was contracted to coordinate these sessions, to establish that the Project ECHO method could be applied to support project implementation. District representatives participated in three sessions and were divided into five groups to maintain a target of 25 participants per session, maximizing opportunities for discussion. Districts then invited midwives who participated in the preeclampsia vignette to attend a separate round of three sessions to suggest improvements to the program, review basic principles of preeclampsia, and discuss challenges in management. The most common issue identified by examiners was fear among providers of being evaluated and sanctioned.
Supervisors believed this fear might have hindered performance and made providers' scores appear lower than what they were capable of achieving. Supervisors discussed ideas on how to reassure providers that vignette evaluations are intended to support improvement, not to punish them.

Examiners identified errors or ambiguities in the design or administration of the vignettes and proposed corrections during learning sessions. Examples included the following:

• An incorrect z-score for weight in the vignette for malnutrition, which was subsequently corrected.
• Some supervisors misunderstood when to pause for responses or provide feedback, so these details were clarified.
• The process allowed only limited use of decision support tools, but midwives routinely use their delivery registry to guide patient management, and its use was subsequently permitted.

Examiners found nurses had the greatest difficulty with the history and physical examination sections of the vignettes. This observation is consistent with the Ministry of Health's concern that nurses in rural communities are expected to perform some responsibilities that lie with physicians in urban settings, but do not have the training to do so.

The first Project ECHO sessions produced several recommendations, including the need to do the following:

• Extend use of the vignettes to all ESPCs
• Scale up the initiative by training other members of the district management team to administer the vignettes
• Reassure providers that the evaluation has no punitive consequences; instead, a low score will require participation in extra training

DEVELOPMENT OF SOFTWARE TOOL FOR DATA ENTRY

The Ministry of Health contracted a software vendor, Bluesquare, to develop a tablet-based software tool for administering vignettes.
The software allowed supervisors to do the following:

• Retrieve the latest version of clinical vignettes on their tablet
• Administer vignettes to a provider in a health facility, even if Internet was unavailable
• Record the provider's responses on a scoring sheet within the tablet
• Obtain automatically calculated scores for each vignette, including subtotals for each section (e.g., history, physical examination)
• Upload the data to a repository at a later time when Internet access was available

Using an electronic format eliminates recurring printing costs, time-consuming manual calculations, and space for archiving forms, and maintains version control. Figures 10a and 10b give examples of windows for recording information from a provider and for retrieving scores.

Figure 10a. Original French Version of Windows for Data Entry and Results Retrieval in Clinical Vignette Tablet Tool

Source: Clinical vignette software tool developed for project. Notes: Left: Window for recording questions or actions named by the provider. Right: Summary of vignette scores by section, following administration of the vignette.

Figure 10b. English Translation of Windows for Data Entry and Results Retrieval in Clinical Vignette Tablet Tool

Left window (data entry), Vignette 5 Child Pneumonia:
1.2.7. Red eyes (0.5). Interviewer's reply: No
1.2.8. Ear problems/discharge/pain? Interviewer's reply: No
1.2.9. Diarrhea. Interviewer's reply: No
Medical history / medications / allergies:
1.3.1. Close contact with a possible case of COVID (1). Interviewer's reply: Not that I know of
1.3.2. Recent history of measles (0.5). Interviewer's reply: No
1.3.3. Other medical history (1). Interviewer's reply: No
1.3.4. Family history of asthma (0.5). Interviewer's reply: No
1.3.5. Medications / treatments (1). Interviewer's reply: No
1.3.6. Vitamin A (1). Interviewer's reply: Not received for 6 months
1.3.7. Allergies (0.5). Interviewer's reply: None

Right window (results summary), Vignette 5 Child Pneumonia:
Provider name: Provider Test
Evaluator's name: Evaluator Test
Experience: 13.5 / 25
Observation: 10 / 23
Laboratory tests: 2 / 5
Diagnosis: 0 / 10
Treatment: 13 / 18
Practical medical information: 8 / 19
Total points: 36.5 / 100
Score: 36.5%

Source: Clinical vignette software tool developed for project. Notes: Left: Window for recording questions or actions named by the provider. Right: Summary of vignette scores by section, following administration of the vignette.

We developed a detailed list of software requirements, focusing on user-friendliness, integration with health information systems, and the ability to update vignettes without vendor assistance. These requirements are listed in Table 4 and can be used by other countries as a template for the terms of reference for requests for proposals.

Table 4. Proposed Vendor Requirements for Tablet-Based Data Entry Tool for Clinical Vignettes

| # | Requirement |
|---|---|
| 1 | Android-compatible |
| 2 | Clarity of instructions or visual clues on how to use the software |
| 3 | Minimize number of page clicks to find information |
| 4 | Ability to calculate scores (both overall and for each section) and show them to the provider immediately after completion of the vignette |
| 5 | Minimize the need to scroll up and down to read all question responses for each section of the vignette |
| 6 | Ability to be used in situations where there is no immediate access to Internet |
| 7 | Ability to provide instant feedback to the evaluee on his or her score, subscores for each section, and questions answered correctly |
| 8 | Ability to transfer data to a server once Internet access via Wi-Fi is reestablished |
| 9 | Ease of updating vignettes on the tablet, and removing old ones, without disturbing data archived from previous administrations of vignettes |
| 10 | Interoperability with national health information systems, in this case DHIS-2, allowing for archiving of individual question scores from each vignette administered |
| 11 | Ability to generate reports comparing vignette scores in different facilities, districts, or regions (either within the application or following transfer of data to DHIS-2) |
| 12 | Ability of staff at the Ministry of Health to make updates to the vignettes or add new vignettes, without having to rely on consultants in the future |
| 13 | No additional maintenance charges from the vendor following completion of the tool, to the extent possible |

Source: Authors, adapted from terms of reference for consultants. Note: DHIS-2 = District Health Information Software-2.

One of the desired functions was to monitor the vignette scores of individual health providers over time to see whether their clinical skills are improving. Unfortunately, this proved difficult to implement because Côte d'Ivoire does not have a unique identifier for health care providers. As a result, provider names must be entered each time, and spelling variations could compromise the ability to follow a provider's progress over time. Countries that want this function should establish such an identifier.

The clinical vignette software allowed for transfer of data to the national DHIS-2 health information system for archiving and analysis. To accomplish this, data are first entered on the tablet and then copied to an interim server designed for use with the software. Additional programming code was written to transfer data from this interim server to DHIS-2, where a summary of responses for each vignette is archived. The interim server is the same as the one used for a performance-based financing initiative within the SPARK project, which allows for possibly integrating this program with performance-based funding activities in the future. Five technical staff at the Ministry of Health were trained over five days on how to update clinical vignettes in the system, following a training manual and templates developed by the vendor. Verifying that staff could make changes to vignettes, or add new ones, without assistance from the vendor was a key feature of the training.
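To illustrate how spelling variations break longitudinal tracking, and one partial mitigation, consider normalizing names before grouping records. This sketch is ours, not part of the deployed tool, and the provider name shown is invented; a true unique identifier remains the robust solution, as recommended above.

```python
import unicodedata

def name_key(name: str) -> str:
    """Collapse case, accents, and stray whitespace so repeated entries of
    one provider's name can be grouped despite typing variations."""
    # Decompose accented characters, then drop the combining marks.
    decomposed = unicodedata.normalize("NFKD", name)
    no_accents = "".join(c for c in decomposed if not unicodedata.combining(c))
    return " ".join(no_accents.lower().split())

# Three ways a supervisor might type one (invented) provider's name:
variants = ["Aya Koné", "aya kone", "  AYA  KONE "]
assert len({name_key(v) for v in variants}) == 1   # all collapse to one key
```

Normalization of this kind cannot fix genuinely different spellings (e.g., transposed letters), which is why the text recommends establishing a national identifier instead.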
PART V – ENSURING PROGRAM SUSTAINABILITY

Long-term sustainability of the program depends on multiple factors related to planning, execution, and integration into existing activities. This initiative was funded by a PHCPI grant, and as with all grant-funded initiatives, there is a risk that activities will not continue after funding ends. The following paragraphs describe in detail what actions have been or will be taken by the Côte d'Ivoire Ministry of Health to ensure sustainability.

LONG-TERM PLANNING

Having a long-term plan for evaluating providers' clinical skills is fundamental to sustainability. A clear plan outlines a common vision for all stakeholders for improving health human resources and allows for estimation of the resources required for implementation at scale. The Ministry of Health has indicated a desire to evaluate the clinical skills of all PHC health professionals through evaluation cycles lasting three to five years. Table 5 lists the number of PHC personnel in the country that would need to be evaluated: 12,500.

Table 5. Health Personnel in Côte d'Ivoire

| Health profession | Number | Estimated number working in PHC |
|---|---|---|
| Physician | 3,715 | 2,000 |
| Nurse | 12,994 | 7,000 |
| Midwife | 6,744 | 3,500 |
| TOTAL | 18,215 | 12,500 |

Sources: MSHPCMU 2021 and MSHP 2018. Note: PHC = Primary health care.

Table 6 presents calculations of the amount of supervisor time needed to evaluate all of these staff. Three planning scenarios are presented, under different assumptions about how long it takes to administer one vignette and the number of vignettes to be administered per provider. This analysis suggests that district supervisors should plan on finding approximately 15 to 30 minutes to administer vignettes within their existing supervision visits.

Table 6. Estimated Resource Requirements for a National Clinical Vignette Program

| Planning variable | Scenario A | Scenario B | Scenario C |
|---|---|---|---|
| Length of evaluation cycle (in years) | 3 | 4 | 5 |
| Number of ESPCs | 2,252 | 2,252 | 2,252 |
| PHC providers | 12,500 | 12,500 | 12,500 |
| Vignettes per provider required | 3 | 3.5 | 4 |
| Time per vignette (minutes) | 20 | 20 | 15 |
| Total minutes of examiner time required per evaluation cycle | 750,000 | 875,000 | 750,000 |
| Total ESPC visits during evaluation cycle (assuming 1 visit per quarter to each site) | 27,024 | 36,032 | 45,040 |
| Additional minutes required per ESPC supervision visit | 28 | 24 | 17 |

Source: Authors' estimation using MSHPCMU 2021. Notes: PHC = Primary health care; ESPC = Établissements sanitaires de premier contact.

Doing other supervisory tasks more efficiently, or eliminating some tasks, would help supervisors find time to administer vignettes. To that end, PHCPI also supported the creation of electronic templates within the tablet-based software that supervisors could use to document the availability of drugs, equipment, staff, and supplies, instead of recording and analyzing this information on paper.

INTEGRATION WITH OTHER TRAINING PROGRAMS

Integrating the clinical vignette program into a broader strategy for improving human resources should help ensure its sustainability. For example, the Ministry of Health has recently developed the e-Santé training program, in collaboration with the WHO bureau in Côte d'Ivoire and with support from the World Bank SPARK project. The new program uses videoconferencing to deliver lectures on clinical practice guidelines for common medical conditions. The clinical vignettes program can be used to measure the extent to which information from these lectures has been applied in medical decision making. In the future, the Project ECHO program can be used to provide mentorship on managing difficult cases, to further support practical use of knowledge.
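Returning to the resource estimates in Table 6, the planning arithmetic can be reproduced directly; the sketch below uses Scenario A's figures (12,500 providers, 3 vignettes each, 20 minutes per vignette, 2,252 ESPCs visited quarterly over a 3-year cycle). Function and parameter names are ours.

```python
def vignette_workload(providers: int, vignettes_per_provider: float,
                      minutes_per_vignette: float, espcs: int,
                      cycle_years: int, visits_per_year: int = 4):
    """Reproduce Table 6: total examiner minutes per cycle, total ESPC
    visits per cycle, and extra minutes needed per supervision visit."""
    total_minutes = providers * vignettes_per_provider * minutes_per_vignette
    total_visits = espcs * visits_per_year * cycle_years
    return total_minutes, total_visits, round(total_minutes / total_visits)

# Scenario A: 750,000 minutes over 27,024 visits, about 28 extra minutes per visit.
assert vignette_workload(12_500, 3, 20, 2_252, 3) == (750_000, 27_024, 28)
```

The same function reproduces Scenarios B and C (24 and 17 extra minutes per visit, respectively), matching the 15-to-30-minute planning range cited in the text.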
INCENTIVES AND ACCOUNTABILITY MECHANISMS

Another option for ensuring sustainability is to incorporate the measurement and maintenance of clinical competency into performance-based financing (PBF) mechanisms. World Bank projects in Cambodia, the Democratic Republic of Congo, the Republic of Congo, and Kyrgyzstan have employed this approach, in which clinical vignette scores are one component of an index measuring service delivery and quality of care. This index is in turn used to calculate incentive payments to health facilities (Fritsche and Peabody 2018). Linking vignettes to improving competency will also require setting targets for the number of evaluations to be completed annually per district, and then devising accountability mechanisms to ensure those targets are met. The health pyramid structure, with a chain of reporting from PHC facilities to districts to regions to the Ministry of Health, could facilitate this system of accountability.

INTEGRATING AND MAINTAINING HEALTH INFORMATION SYSTEMS

Integrating clinical vignette data collection into existing health information infrastructure is also critical to long-term success. The vendor specifications for the programming that enables linkage to DHIS-2 are aimed at minimizing the need to learn different data systems, standards, or platforms on an ongoing basis. Because vignette data will be stored in DHIS-2, existing tools can be used to create feedback reports showing variations in vignette scores by topic, facility, district, or region; changes or improvements over time; and differences between provider groups. Health information systems need to be regularly maintained to allow for changes in what information needs to be captured and to adapt to new technologies. The existence of core staff within the ministry to do this maintenance is essential. A key feature of this program was the transfer of knowledge on how to update clinical vignettes from software developers to ministry staff.
However, because all organizations undergo staff turnover, mechanisms to transfer this knowledge to new staff will be required in the future.

PART VI – RECOMMENDATIONS FOR COUNTRIES DESIGNING CLINICAL VIGNETTE PROGRAMS

This document describes in detail the steps, challenges, and solutions involved in designing and implementing a clinical competency measurement program in Côte d'Ivoire. This information can be of great benefit to other countries interested in designing similar programs. Tables 7a to 7e below summarize questions and suggestions to consider when setting objectives for the program, designing each individual vignette, creating a scoring rubric, selecting and training examiners, and conducting field implementation. The purpose of this section is not to suggest that each country follow Côte d'Ivoire's model, but rather to encourage countries to systematically consider the pros and cons of each design option identified during development of this program and choose what is best for them. The hope is that each country will then be able to create a program that suits the needs and local circumstances of its own health care providers and facilities.

Table 7a. Considerations for Setting Objectives

Consideration | Comments
Clarify purpose of vignette program | May be for evaluation, training, or both
Select target audience for evaluation | Identify professional group (doctors, specialists, nurses, midwives, community health workers, etc.). Identify setting (rural or urban primary care, hospital, etc.). Consider prioritizing provider groups at greatest risk of having low clinical competency scores
Select priorities for clinical topics to be evaluated | Options include reproductive, maternal, and child health; infectious diseases; noncommunicable diseases; etc. Consider prioritizing common conditions accounting for the largest share of mortality or morbidity
Consider national objectives for clinical vignette program | Example: ensure all health care providers are assessed every 3–5 years for clinical competency and direct low-scoring providers to training programs

Source: Authors.

Table 7b. Considerations in Designing Each Clinical Vignette

Consideration | Comments
Select diagnosis | Consider specifying severity level, complications, comorbidities
Select questions and management points according to a standardized structure | History: description of complaint; review of body systems; past medical conditions, drugs, allergies. Physical: vital signs, special observations related to the complaint, examination of each body system. Investigations: lab tests, diagnostic imaging, other investigations. Diagnosis: main diagnosis, level of severity, comorbidities. Treatment: medications, procedures, other interventions, referral or observation where indicated. Advice to patient: provide and explain the diagnosis; describe treatments, their importance, and how to administer them; warning signs to watch for; follow-up procedures; invite questions
Cross-reference vignette content to practice guidelines | Map each diagnostic criterion and essential treatment point to a specific question in the vignette, to ensure content validity
List alternative wordings of the same concept | Examples: Cough / breathing problems? Lethargy / energy level?
Separate components of treatment and diagnosis, to describe partial knowledge of concept | Award points for giving the name of the treatment; additional points for details such as dose, frequency, or administration. Award points for naming the main diagnosis; additional points for details such as level of severity, comorbidities, or complications
Allow for range of acceptable treatment dose | Example: instead of an oral rehydration dose of 500 ml, specify an acceptable range, e.g., 400–600 ml
Allow for alternate responses depending on resources available | Example: the ideal response is an intravenous antibiotic, but allow the option of an intramuscular or oral antibiotic in centers where an intravenous line is not available
Anticipate and include common inappropriate responses | Examples: unnecessary tests; wrong diagnosis; inappropriate, harmful, or outdated treatments. Consider penalties for such actions (see scoring below)
Conduct multiple pretests of each vignette | Ask examiners and examinees to identify ambiguities in wording, content errors, etc.

Source: Authors. Note: ml = milliliters.

Table 7c. Considerations in Designing a Scoring Rubric

Consideration | Comments
Consider desired weighting for each section (history, physical exam and investigations, treatment, patient advice) | The authors chose roughly equal weighting for history, physical exam and investigations, treatment, and advice (20–25 points each) and 10 points for diagnosis, but there is no universal standard for this allocation. The allocation of points is ultimately a value judgment about what constitutes good clinical competency
Set weights for different questions and responses | Give higher weights to questions related to diagnostic criteria and the main treatment; lower weights to questions to rule out other conditions, adjunct treatments, or tips to optimize the main treatment
Consider penalizing inappropriate treatments, incorrect diagnoses | Scale the penalty according to impact (e.g., a low penalty for inconvenience or waste, a higher penalty for treatment causing harm)
Set cutoffs for acceptable performance | The authors chose two cutoffs to define superior, acceptable, and poor performance
Validate the scoring rubric | The authors suggest that selected clinical experts review completed vignettes to see if the rating (e.g., acceptable) has face validity

Source: Authors.

Table 7d. Considerations in Selecting and Training Examiners

Consideration | Comments
Select pool of examiners | Identify individuals with sufficient clinical knowledge to be able to interpret responses from providers
Provide training and certification of examiners | Train and evaluate examiners using role-play, where a mock provider reads a standard script of questions asked in a confusing or disorderly manner, to test the examiner's ability to accurately score the vignette
Establish ongoing mechanism to support examiners | Meet with examiners regularly to identify implementation barriers, errors in vignettes, or opportunities to improve design or wording. The authors used the Project ECHO platform for this purpose
Establish accountability mechanism | Set expectations among supervisors of examiners to conduct a target number of vignette assessments
Ensure time allocation | Budget the necessary time to complete the desired number of vignettes per provider per evaluation period

Source: Authors.

Table 7e. Considerations for Field Implementation

Consideration | Comments
Select location of administration | Options include the workplace or an examination center
Decide if decision support tools will be allowed | For example, providers were allowed to consult protocols posted on the wall in the workplace or other tools in common practice
Mechanism for providing feedback to provider | Provide feedback on the score obtained, including scores for different sections or questions scored incorrectly.
Help the provider identify areas for improvement
Address anxiety of staff being evaluated | Describe use of the tool as being for supportive educational purposes
Establish protocol for maintaining privacy of examination | If multiple individuals in the same facility are to be evaluated with the same vignette, conduct the evaluation in a secluded area. Do not circulate copies of the vignette. Such restrictions are not required if vignettes are used for training purposes
Automate data collection if possible | Example: in this project, a tablet tool was developed allowing uploads of vignette questions, offline data collection, and synchronization of data to a central server and the national DHIS-2 database when Wi-Fi is available
Analyze and report data | Monitor competency scores by individual provider over time, and by provider type, diagnosis, individual institution, health district, or region
Develop plan for remedial education for providers with low scores | For example, online training modules exist in Côte d'Ivoire that any provider can access
Consider monitoring progress of individual health care providers over time | If this is desired, linking vignette data to a registry of all health care providers by a unique identifier is recommended
Establish accountability mechanisms to ensure providers and facilities maintain adequate scores | Example: there is potential to incorporate this clinical vignette program into performance-based payment mechanisms in the future

Source: Authors. Note: DHIS-2 = District Health Information Software-2.

REFERENCES

Fritsche, G., and J. Peabody. 2018. "Methods to Improve Quality Performance at Scale in Lower- and Middle-Income Countries." Journal of Global Health 8 (2): 021002. https://doi.org/10.7189/jogh.08.021002.

IMF (International Monetary Fund). 2017. "Second Reviews under an Arrangement under the Extended Credit Facility and the Extended Arrangement under the Extended Fund Facility." Press Release and Staff Report.
IMF Country Report No. 17/372.

INS (Institut National de la Statistique) and ICF. 2023. Enquête Démographique et de Santé de Côte d'Ivoire, 2021. Rockville, Maryland, USA: INS/Côte d'Ivoire and ICF.

Lazzerini, M., and H. Wanzira. 2016. "Oral Zinc for Treating Diarrhoea in Children." Cochrane Database of Systematic Reviews 12 (12): CD005436. https://doi.org/10.1002/14651858.CD005436.pub5.

Leonard, K. L., and M. C. Masatu. 2005. "The Use of Direct Clinician Observation and Vignettes for Health Services Quality Evaluation in Developing Countries." Social Science and Medicine 61 (9): 1944–51.

MPD (Ministère du Plan et du Développement, Côte d'Ivoire). 2020. "Résumé Plan National de Développement (PND) en Côte d'Ivoire (2016–2020)." https://www.caidp.ci/uploads/506b0bce6be504b64d5359c6407cd7df.pdf.

MSHP (Ministère de la Santé et de l'Hygiène Publique). 2018. "Rapport Annuel sur la Situation Sanitaire (RASS) 2017." https://www.snisdiis.com/wp-content/uploads/2022/03/Rapport-Annuel-sur-la-Situation-Sanitaire-RASS-2017-VF.pdf.

MSHPCMU (Ministère de la Santé, de l'Hygiène Publique et de la Couverture Maladie Universelle). 2021. "Rapport Annuel sur la situation sanitaire 2020." https://www.snisdiis.com/wp-content/uploads/2022/03/Rapport-Annuel-sur-la-Situation-Sanitaire-RASS-2020-VF.pdf.

Peabody, J. W., J. Luck, P. Glassman, T. R. Dresselhaus, and M. Lee. 2000. "Comparison of Vignettes, Standardized Patients, and Chart Abstraction: A Prospective Validation Study of 3 Methods for Measuring Quality." JAMA 283 (13): 1715–22.

PHCPI (Primary Health Care Performance Initiative). 2021. "Côte d'Ivoire Primary Health Care Vital Signs Profile." https://www.improvingphc.org/sites/default/files/2022-12/CotedIvoire_VSP_2021.pdf.

World Bank. 2021a.
"Life Expectancy at Birth, Total (Years) – Cote d'Ivoire." https://data.worldbank.org/indicator/SP.DYN.LE00.IN?locations=CI.

———. 2021b. "Life Expectancy at Birth, Total (Years) – Ghana." https://data.worldbank.org/indicator/SP.DYN.LE00.IN?locations=GH.

———. 2021c. "Mortality Rate, Under-5 (per 1,000 Live Births) – Cote d'Ivoire." https://data.worldbank.org/indicator/SH.DYN.MORT?locations=CI.

———. 2023a. "Cote d'Ivoire – Gender Data Portal." https://genderdata.worldbank.org/countries/cote-d-ivoire/#:~:text=480%20women%20die%20per%20100%2C000,lower%20than%20its%20regional%20average.

———. 2023b. Development Projects: Strategic Purchasing and Alignment of Resources and Knowledge in Health Project (SPARK-Health), P167959. https://projects.worldbank.org/en/projects-operations/project-detail/P167959?lang=en.

ANNEX 1: PRIMARY HEALTH CARE PERFORMANCE INITIATIVE VITAL SIGNS PROFILE, 2021

Source: PHCPI 2021.