Note No. 91    February 2004

Citizen Report Card Surveys - A Note on the Concept and Methodology

This note provides a short summary of the concept and key phases involved in implementing a citizen report card (CRC) survey. CRCs are client feedback surveys that provide a quantitative measure of user perceptions of the quality, efficiency and adequacy of different public services. They have been applied in numerous contexts in different regions. Beyond the process of executing a survey, CRCs involve efforts at dissemination and institutionalization that make them effective instruments to exact public accountability.

1. Introduction

Citizen Report Cards (CRCs) are participatory surveys that solicit user feedback on the performance of public services[i]. But they go beyond being just a data collection exercise to being an instrument to exact public accountability, through the extensive media coverage and civil society advocacy that accompanies the process.

CRCs originated in 1994 in Bangalore, India, through the work of an independent NGO, the Public Affairs Center. The idea was to mimic the private sector practice of collecting consumer feedback and apply it to the context of public goods and services. The surveys derive their name from the manner in which the data are presented. Just as a teacher scores a student's performance on different subjects in a school report card, CRC data aggregate the scores given by users for the quality of, and satisfaction with, different services such as health, education and police, or for different performance criteria of a given service, such as availability, access, quality and reliability. The findings thus present a collective, quantitative measure of overall satisfaction and quality of services over an array of indicators.

By systematically gathering and disseminating public feedback, CRCs serve as a "surrogate for competition" for state-owned monopolies that lack the incentive to be as responsive as private enterprises to their clients' needs. They are a useful medium through which citizens can credibly and collectively 'signal' to agencies about their performance and press for change.

2. Application Contexts

Citizen Report Cards are used in situations where demand-side data, such as user perceptions of quality and satisfaction with public services, are absent. Starting from their original context of evaluating urban services in Bangalore, CRCs have been applied in different geographic and sectoral contexts, the common theme being the use of a survey that captures consumer data in a comparative manner to demand responsiveness.

Some of the actual applications include (i) using CRCs as a basis for performance-based budget allocations to pro-poor services (Philippines), (ii) cross-state comparisons of access, use, reliability and satisfaction with public services (India), (iii) supplementing national service delivery surveys (Uganda), and (iv) governance reform projects (Ukraine and Bangladesh).

The success of these initiatives has varied, depending in large part on the ability to negotiate change, the degree of participation, and the presence (or absence) of a political champion.
In general, an effective CRC undertaking requires a skilled combination of four things: i) an understanding of the socio-political context of governance and the structure of public finance; ii) the technical competence to scientifically execute and analyze the survey; iii) a media and advocacy campaign to bring the findings into the public domain; and iv) steps aimed at institutionalizing the practice for iterative civic actions.

3. Key Phases

A CRC initiative is a process that goes beyond the execution of a survey. It is part 'science' - the technical aspect of running an efficient and credible survey - and part 'art' - the challenge of mobilizing an advocacy strategy that can foster debate and generate results (see Figure 1).

Figure 1: Running a CRC initiative - CRC Methodology
  The 'Science': define scope; select sample; questionnaire design; field testing; data collection; data analysis; report writing.
  The 'Art': media campaign; building awareness; keeping issues alive; public hearings; constructive criticism; negotiation; interface meetings.

Participation of different stakeholders occurs at various stages: (a) in the design of questionnaires, where the performance indicators and key issues are developed through focus group discussions with citizens; (b) during survey execution, where qualitative interviews are used to support questionnaire data; and (c) during dissemination, where a variety of NGOs are brought in to use the data for advocacy and reform.

Overall, a CRC initiative goes through six key stages[ii], described in more detail below.

3.1 Identification of Scope, Actors and Purpose

First, among the cluster of actors and stakeholders to be identified, the most important task is to be clear about the scope of the evaluation: a sector, industry, or unit of service provision. Criteria vary with the context: agencies receiving the largest amounts of public funds, agencies most directly related to the poor, agencies with sensitive mandates such as security and policing, agencies plagued with a high volume of anecdotal complaints from users, and so on.

Second, administration of a report card initiative is a technical exercise, so one needs to identify credible policy institutes or NGOs who can undertake the exercise[iii]. The ex-ante respectability of the intermediary organization directly affects the ex-post credibility of the findings. In some cases, external involvement such as that of the World Bank can add to the credibility; in other cases, it can be counterproductive.

3.2 Design of Questionnaires

First, following the identification of stakeholders, focus group interactions to provide inputs into questionnaire design are necessary with at least two constituencies: the providers of the service and its users. Providers can indicate not only what they have been mandated to provide, but also areas where feedback from clients can improve their services. Similarly, users can sound out their initial impressions of the service, so that areas that deserve extensive probing can be catered to. After the questionnaire is designed, it will be necessary to pre-test it with similar focus groups before a full-scale launch.

Second, the structure and size of the questionnaire need to be defined, keeping in mind that there is a trade-off between detail and time. Mechanisms to make the sessions mutually convenient for the enumerator and the respondent have to be worked out. A useful practice is to break the questionnaire into different modules[iv] that are answered by different members of the household. Demographic characteristics of the respondents (sex, age, family size, ethnicity, etc.) and income/expenditure patterns should also be included in a separate module.
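To make the idea of a modular questionnaire concrete, the short Python sketch below lays out one possible structure: a demographics module and an income/expenditure module asked of every household, plus per-service modules that are 'rotated' across households in the spirit of endnote [iv]. This is a minimal sketch only; the module names, services and question fields are illustrative assumptions, not taken from any actual CRC instrument.

# Illustrative sketch only: module names, services and field lists are
# hypothetical placeholders, not drawn from a real CRC questionnaire.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Module:
    """One self-contained block of questions, answerable by a single respondent."""
    name: str
    questions: List[str] = field(default_factory=list)

# Demographics and income/expenditure kept in their own modules, as the note suggests.
demographics = Module("demographics", ["sex", "age", "family_size", "ethnicity"])
expenditure = Module("income_expenditure", ["monthly_income_band", "main_expenses"])

# Service modules: one per public service covered by the report card.
services = ["water", "health", "education", "police", "transport", "electricity"]
service_modules = [
    Module(s, [f"{s}_availability", f"{s}_access", f"{s}_quality", f"{s}_overall_satisfaction"])
    for s in services
]

def rotate_service_modules(household_index: int, per_household: int = 3) -> List[Module]:
    """'Rotating interviews': ask the first few service modules of the first
    household, the next few of the second, and so on, to keep interviews short."""
    start = (household_index * per_household) % len(service_modules)
    picked = [service_modules[(start + i) % len(service_modules)] for i in range(per_household)]
    return [demographics, expenditure] + picked

if __name__ == "__main__":
    for hh in range(3):
        print(f"household {hh}: {[m.name for m in rotate_service_modules(hh)]}")

In practice, the assignment of individual modules to particular household members would follow the enumerator's judgment about who actually uses each service, as discussed under sampling below.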
3.3 Sampling

First, the sample size has to be determined. Usually, the larger the sample size the better, but this has to be weighed against budgetary, time and human resource constraints. The key is to aim for greater representativeness rather than a plain expansion of numbers[v].

Second, after an appropriate sample size has been determined, the sampling frame has to be decided. Allocations will have to be made for different geographic regions. The standard principle is to use multi-stage probability sampling with probability proportional to the size of the population. It is useful to ensure that at least one sample precinct is assigned to every geographical region covered. Sample households (the ultimate unit of analysis) are then chosen from each precinct.

Third, within sample households, sample respondents have to be chosen. Usually, the head of the family is approached for answers, but on the whole respondents should be of different genders and ages. If questionnaires are lengthy and broken into modules, the head of the family may assign other members to answer different modules. This is also important since different household members use different services.
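As a rough illustration of the multi-stage, probability-proportional-to-size principle described above, the following Python sketch draws precincts with probability proportional to their population and then takes a simple random sample of households within each selected precinct. The precinct names, population figures and the households_in() helper are hypothetical placeholders, and a real design would also stratify so that every geographic region receives at least one sample precinct, which this simplified with-replacement draw does not enforce.

# Illustrative two-stage sampling sketch under invented data.
import random

random.seed(2004)

# Stage 1 frame: geographic precincts with (hypothetical) populations.
precincts = {
    "north": 12000, "south": 8000, "east": 5000,
    "west": 15000, "central": 20000,
}

def sample_precincts_pps(frame: dict, draws: int) -> list:
    """Select precincts with probability proportional to population size (with replacement)."""
    names = list(frame)
    weights = [frame[n] for n in names]
    return random.choices(names, weights=weights, k=draws)

def households_in(precinct: str, count: int = 200) -> list:
    """Stand-in for the household listing of a precinct (placeholder IDs)."""
    return [f"{precinct}-hh{i:03d}" for i in range(count)]

def draw_sample(frame: dict, precinct_draws: int, households_per_precinct: int) -> list:
    """Stage 2: simple random sample of households within each selected precinct.
    A precinct drawn twice simply contributes another batch of households."""
    sample = []
    for p in sample_precincts_pps(frame, precinct_draws):
        sample.extend(random.sample(households_in(p), households_per_precinct))
    return sample

if __name__ == "__main__":
    # e.g. 10 precinct draws x 30 households each = 300 sample households
    households = draw_sample(precincts, precinct_draws=10, households_per_precinct=30)
    print(len(households), households[:3])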
3.4 Execution of Survey

First, one must select and train a cadre of survey personnel. Survey personnel, or enumerators, should be thoroughly informed about the purpose of the project and be skilled in questioning respondents with courtesy and patience. As with the questionnaires, the work of enumerators has to be pre-tested, with preliminary feedback used to modify questions or the tactics for questioning. If multiple languages are being used, instruments should be translated back into English (or the primary language) to check for consistency.

Second, to ensure that household information is being recorded accurately, spot monitoring of randomly chosen interviews should be undertaken in phases after a proportion of the interviews is complete. Then, after completing each interview, enumerators should go over the information collected and identify inconsistencies. Once the record is deemed satisfactory, it is entered into standardized data tables.

3.5 Data Analysis

This is the output stage, when all entered data are consolidated and analyzed. Typically, respondents rate or give information on aspects of government services on a scale, for example, -5 to +5, or 1 to 7. These ratings by representative users on the various questions are then aggregated, averaged, and expressed as a satisfaction score in percentage terms. This is what will read like a 'report card'. All data should be subjected to standard error analysis and tests of significance.
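The aggregation step can be illustrated with a small Python sketch: raw ratings on a -5 to +5 scale are rescaled to a 0-100 satisfaction score, averaged, and reported with a normal-approximation 95% confidence interval as a basic form of standard error analysis. The ratings and agency labels below are fabricated, and the rescaling convention is an assumption rather than a rule from the note.

# Illustrative sketch: fabricated ratings; the -5..+5 to 0-100 mapping is
# one possible convention, not prescribed by the note.
import math

def satisfaction_scores(ratings, low=-5, high=5):
    """Rescale raw ratings on a [low, high] scale to 0-100 satisfaction scores."""
    return [100.0 * (r - low) / (high - low) for r in ratings]

def mean_and_95ci(scores):
    """Mean score with a normal-approximation 95% confidence half-width."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)
    se = math.sqrt(var / n)
    return mean, 1.96 * se

if __name__ == "__main__":
    # Hypothetical ratings of one service by respondents served by two agencies.
    agency_a = satisfaction_scores([2, 3, -1, 4, 0, 1, 2, -2, 3, 1])
    agency_b = satisfaction_scores([-3, -1, 0, -2, 1, -4, 0, -1, -2, 0])

    for name, scores in [("agency A", agency_a), ("agency B", agency_b)]:
        mean, half_width = mean_and_95ci(scores)
        print(f"{name}: satisfaction {mean:.1f}% +/- {half_width:.1f}")

A formal test of significance between two providers, or between two survey rounds, would extend this with, for example, a two-sample t-test on the underlying scores.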
3.6 Dissemination

First, the findings of the report card should aim at being constructively critical. It may be unhelpful if the goal is solely to embarrass or laud a service provider's performance. This is why it is important to share the preliminary findings with the concerned service provider itself. Its authorities must be given an opportunity to respond to some of the serious criticisms, and genuine grievances on their part, such as staffing or budgetary constraints, should be fed back into the report to alter the tone of the recommendations.

Second, the post-survey publicity strategy has to be developed. Findings should be launched in a high-profile press conference with wide coverage. Other options are to prepare press kits with short printable stories and media-friendly press releases, and to translate the main report into local languages. Making the findings widely known and available makes it difficult for the concerned agency to ignore them.

Third, following the publication of the report cards, an interface between the users and the service providers, ideally in a town-hall type setting, is recommended. This not only allows the two parties to engage constructively in a dialogue based on evidence, but also puts pressure on service providers to improve their performance for the next round. If more than one agency is being evaluated, these settings can foster a sense of healthy competition among service providers. A direct interaction between the two concerned parties is also a way to ensure an operational link between information and action.

Fourth, new developments in information technology (IT) should increasingly be used to solve old problems of accountability. Through websites and discussion boards on the internet, the reach of the findings of report cards can not only be widened, but they can also solicit the engagement of literate and informed taxpayers in solving public problems.

3.7 Institutionalization

CRC initiatives, especially those that arrive as one-off experiments, will serve little long-term purpose unless implementation is followed by efforts at institutionalization on a sustained basis. How these efforts are to be institutionalized should thus be a concern warranting some thought right from the outset. Institutionalization is also important to exploit the usefulness of credible report cards in full by making them more than psychological pressure tools on service providers. Ideally, governments can use report cards for performance-based budgeting and link public opinion with public spending. This is what has been done by the Department of Budget in the Philippines, which is in fact contracting out the CRC exercise to independent CSOs.

Alternatively, CRCs can be adapted to create 'governance rating systems' in a decentralized setting - an experiment attempted in Bangladesh and in Ukraine's People's Voice Project[vi].

Service providers and ministries can in turn link CRC findings with their internal management and incentive systems. The second Bangalore CRC in 1999, for instance, catalyzed numerous responses from providers, such as the setting up of the Bangalore Agenda Task Force by the state government, which closely monitors the feedback from the CRC; the initiation of training programs on customer responsiveness by the Bangalore Development Authority and the Water Board; and the introduction of regular consumer satisfaction surveys by the Karnataka Electricity Board[vii].

Institutionalization efforts depend heavily on political commitment[viii]. That being said, CRCs often provide the needed impetus for reform-minded politicians to tackle bureaucratic inertia, and vice versa.

4. Concluding Remarks

CRCs are increasingly being used as tools for civic engagement to demand better governance. They are not without their limitations, though. For one, they depend on strong media support and external financing. The effort and time required to stimulate action both from public officials and from citizens can easily become daunting, given the unpredictability of different actors. And even methodologically, there are limits to comparing different services or regions on the basis of user perceptions, on account of varying expectations.

Yet, with an effective combination of citizen, political and bureaucratic action, CRCs could be the ideal catalyst for mobilizing demand for accountability and reform, and for moving ordinary people, including the poor, from 'coping to voice' and from 'shouting to counting'.

This note was prepared by Swarnim Waglé, Janmejay Singh and Parmesh Shah of the Social Development Department of the World Bank. It draws on Samuel Paul (2002): Holding the State to Account: Citizen Monitoring in Action, Books for Change, Bangalore, and the Filipino Report Card on Pro-Poor Services, The World Bank, 2001, as well as numerous discussions and presentations by the authors. For further references visit www.worldbank.org/participation.

[i] It is important to note that CRCs are not 'opinion polls': feedback is taken not from the general public, but only from actual users of public services.
[ii] These generally take 3 to 7 months to implement.
[iii] Usually, the survey execution is outsourced to a market research agency with adequate market research and statistical survey analysis skills, e.g. ORG-MARG (India) or the Social Weather Station (Philippines).
[iv] Another useful strategy is to use 'rotating interviews', i.e. ask about the first 3 services to the first household, the next 3 to the following one, and so on.
[v] On occasion, if the number of actual 'users' of a less regularly used service, like the police, is too low in the sample, 'booster' interviews can be undertaken to increase representativeness. These involve purposive sampling of users through 'exit interviews'.
[vi] For more information see the People's Voice website at http://www.icps.kiev.ua/eng/projects/pvp.
[vii] For more details of the impacts of the Bangalore CRCs see Ravindra, A.: An Assessment of the Impact of Bangalore Citizen Report Cards on the Performance of Public Agencies, Draft Paper, November 2003.
[viii] This was clear in the Bangalore context, where the Karnataka State Chief Minister's role in countering resistance from the bureaucracy to being held accountable was critical in leading to reforms.