Policy Research Working Paper 7447

Why Do Countries Participate in International Large-Scale Assessments? The Case of PISA

Marlaine E. Lockheed
Education Global Practice Group
October 2015

Abstract
The number of countries that regularly participate in international large-scale assessments has increased sharply over the past 15 years, with the share of countries participating in the Programme for International Student Assessment growing from one-fifth of countries in 2000 to over one-third of countries in 2015. What accounts for this increase? This paper explores the evidence for three broad explanations: globalization of assessments, increasing technical capacity for conducting assessments, and increased demand for the microeconomic and macroeconomic data from these assessments. Data were compiled from more than 200 countries for this analysis, for six time periods between 2000 and 2015, yielding more than 1,200 observations. The data cover each country's participation in each of six cycles of PISA as it relates to the country's level of economic development, region, prior experience with assessment, and OECD membership. The results indicate that the odds of participation in PISA are markedly higher for OECD member countries, countries in the Europe and Central Asia region, high- and upper-middle-income countries, and countries with previous national and international assessment experience; the paper also finds that regional assessment experience is unrelated to PISA participation.

This paper is a product of the Education Global Practice Group. It is part of a larger effort by the World Bank to provide open access to its research and make a contribution to development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://econ.worldbank.org. The author may be contacted directly at lockheed@princeton.edu or through Marguerite Clarke, at mclarke2@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Why Do Countries Participate in International Large-Scale Assessments? The Case of PISA

Marlaine E. Lockheed[1]

JEL Codes: I20, I25
Keywords: Learning Assessment, Education Indicators and Statistics, Secondary Education, Education Policy and Planning

[1] Marlaine Lockheed is a former World Bank official and a lecturer in international and public affairs at Princeton University.

INTRODUCTION
International large-scale assessments demand resources and technical capacity, yield comparisons that are often embarrassing to ministries of education, and typically describe a country's education system as it was one or more years in the past. Yet an increasing number of countries are choosing to participate in these exercises (Kamens 2013, Lockheed 2015). Why?
This paper seeks to answer this question empirically, using the experience of the OECD's Programme for International Student Assessment (PISA) as a case.

International large-scale assessments (ILSAs) provide high-quality comparative information regarding the outputs of education systems. These assessments are carried out systematically across a number of countries and yield internationally comparable information regarding students and their cognitive skills. The two ILSAs covering the largest number of education systems are: (a) the Programme for International Student Assessment (PISA), sponsored by the Organization for Economic Cooperation and Development (OECD), and (b) the Trends in International Mathematics and Science Study (TIMSS), sponsored by the International Association for the Evaluation of Educational Achievement (IEA). IEA has also sponsored the Progress in International Reading Literacy Study (PIRLS), and the OECD has sponsored the Programme for the International Assessment of Adult Competencies (PIAAC). All four of these assessments measure learning and cognitive skills in selected countries across all regions.

Regional assessments, another type of ILSA limited to a specific world region, have arisen in Africa and Latin America. Since the mid-1990s, ministers of education in Africa have sponsored a periodic regional assessment, the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ). Since about the same time, UNESCO, through the Laboratorio Latinoamericano de Evaluación de la Calidad de la Educación (LLECE), or Latin American Laboratory for the Evaluation of Education Quality, has sponsored a regional assessment in Latin America, the Estudio Regional Comparativo y Explicativo (ERCE), or Regional Comparative and Explanatory Study; ERCE is also known by the acronyms PERCE, SERCE and TERCE, for the first, second and third studies (Ferrer 2006, UNESCO Santiago 2015). In 2012, another assessment, the Programme d'Analyse des Systèmes Éducatifs de la CONFEMEN (PASEC), which had been less systematic than a typical ILSA, was standardized across countries and began partnering with SACMEQ (CONFEMEN 2015). Table 1 reports selected information about the various ILSAs.

Table 1: International large-scale assessments, 1995-2016

Assessment   Sponsor    Years       Frequency        Grades/Age                    Subjects
TIMSS        IEA        1995-2015   4-year cycle     Grades 4 and 8                Mathematics, Science
PIRLS        IEA        2001-2016   4-year cycle     Grade 4                       Reading
PISA         OECD       2000-2015   3-year cycle     15-year-olds                  Mathematics, Science, Reading
ERCE         LLECE      1997-2013   No fixed cycle   Grades 3 and 6                Mathematics, Reading, Writing, Science
SACMEQ       SACMEQ     1995-2014   No fixed cycle   Grade 6                       Mathematics, Reading
PASEC        CONFEMEN   2012-       4-year cycle     Grades 2 and 6, end of basic  Curriculum-based

The number of education systems[2] that have participated in one or more of these international or regional large-scale assessments has grown dramatically over the past decades, for example:
- TIMSS, from 42 in 1995 to 60 in 2015
- PISA, from 42 in 2000/2001 to 73 in 2015, with an additional 7 countries participating in PISA for Development
- SACMEQ, from 7 in 1995-1999 to 15 in 2012-14
- ERCE, from 13 in 1997 to 15 in 2013

[2] All ILSAs report results for "education systems", which may include cities, states or provinces, or other sub-national units. The numbers in the bulleted points refer to education systems, whereas the remainder of this paper focuses on economies as defined by the World Bank.

What accounts for this increase? Many explanations for the spread of large-scale assessments have been offered.
The number of countries participating in large-scale assessments could increase along with the globalization of education generally, and of assessment specifically (Kamens 2013; Meyer & Benavot 2013). The number of countries participating in ILSAs, in particular, could increase as the worldwide technical capacity for undertaking large-scale assessments increases, fueled in part by previous international, regional or national assessments (Lockheed 2010; Greaney and Kellaghan 2009). And more countries could decide to participate in ILSAs to benefit from the information that international comparative measures provide; this demand for information could be fueled both by the concerns of educators and by the analytic interests of macro-economists for valid and reliable cross-national data regarding economies' education outcomes (Chabbot & Elliott 2003; Hanushek & Woessmann 2012; Lockheed 2013).

The spread of PISA, in particular, provides an opportunity to investigate these explanations. First, we can examine the "globalness" of international large-scale assessments, using PISA as an example. If ILSAs are "global", then countries in all regions and at all levels of income should have an equal likelihood of participating in PISA, controlling for a country's economic status; if ILSAs are not "global", then differences in participation rates should be observed for countries in different regions and at different levels of economic development. Moreover, if ILSAs are "globalizing", then the rate at which countries join PISA should be comparable across regions and levels of income.

Second, we can examine the assessment capacity-building effects of participating in large-scale assessments. If participating in a large-scale assessment builds assessment capacity, then countries that have participated in another large-scale assessment (such as TIMSS, a regional assessment or a national assessment) should have greater assessment capacity and therefore should be more likely to participate in PISA than countries without this experience, controlling for the country's economic status. The assessment capacity that is built through participation could be administrative, technical or both.

Third, we can explore the role of economists in creating a demand for information by looking at the role the Organization for Economic Cooperation and Development (OECD) plays in PISA. If an economists' organization such as the OECD creates a demand for information, then OECD member countries should have a higher level of participation in PISA than other countries, controlling for the country's economic status.

METHOD
This paper tests some of these notions empirically, using a purpose-built data set covering over 200 countries and economies[3] as of 2015.

[3] Hereafter, "countries".

A. Data sources
The data used in this paper come from four main sources:
- Economic status and geographic location, from the World Bank.
- Participation in international and regional large-scale assessments, from published reports and websites of the sponsoring agencies.
- History of national assessments, from UNESCO's Global Monitoring Report.
- Membership in the OECD and date of membership, from the OECD.
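The paper does not publish the code used to assemble these sources into an analysis file. As a minimal illustration of the construction described in this section, the sketch below combines the four sources into a country-by-PISA-cycle panel; the file names and column names (wb_countries.csv, wb_income_1997_2013.csv, pisa_participation.csv, timss_participation.csv, in_pisa, income_group) are hypothetical assumptions, not artifacts of the study.

```python
# Illustrative sketch only (not the paper's code): assembling a country-by-PISA-cycle
# data set from the sources listed above. File and column names are hypothetical.
import pandas as pd

CYCLES = [2000, 2003, 2006, 2009, 2012, 2015]

countries = pd.read_csv("wb_countries.csv")        # country, region (World Bank groupings)
income = pd.read_csv("wb_income_1997_2013.csv")    # country, year, income_group
pisa = pd.read_csv("pisa_participation.csv")       # country, cycle_year, in_pisa (0/1)
timss = pd.read_csv("timss_participation.csv")     # country, year of each TIMSS cycle entered

# One observation per country and PISA cycle: roughly 210 countries x 6 cycles.
panel = countries.merge(pd.DataFrame({"cycle_year": CYCLES}), how="cross")
panel = panel.merge(pisa, on=["country", "cycle_year"], how="left").fillna({"in_pisa": 0})

# Economic status is lagged about two years relative to each PISA cycle.
panel["lag_year"] = panel["cycle_year"] - 2
panel = panel.merge(income.rename(columns={"year": "lag_year"}),
                    on=["country", "lag_year"], how="left")

# Indicator for TIMSS participation in the five years preceding the PISA cycle,
# e.g. TIMSS 1995 or 1999 before PISA 2000, TIMSS 2011 before PISA 2015.
def timss_within_5_years(row) -> int:
    years = timss.loc[timss["country"] == row["country"], "year"]
    return int((row["cycle_year"] - years).between(1, 5).any())

panel["in_timss_5yr"] = panel.apply(timss_within_5_years, axis=1)
```

The same panel can carry the remaining indicators (regional assessment, national assessment, OECD membership) constructed in the same way from their respective sources.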
B. Measures
Countries. Countries are those identified by the World Bank. All countries that have existed or have come into existence since 2000 are included (World Bank 2015). The countries used in this study are those with populations greater than 30,000.

Participation in PISA (the dependent variable). Participation in each of the five completed PISA cycles was coded from published reports (OECD 2001, OECD 2004, OECD 2007, OECD 2010, OECD 2014, Walker 2011); the identity of the countries participating in the sixth cycle, PISA 2015, which is ongoing, was provided by the OECD.

Globalization. Two indicators of globalization are used. The first relates to the level of economic development. Time-series data on each country's economic status, 1997-2013, were accessed from the World Bank (2015). The World Bank classifies countries into four categories based on their GNP per capita: low-income, lower-middle-income, upper-middle-income and high-income. A country's classification can change over time, in accordance with its GNP per capita, and these changes are reflected in the data. The country's economic status at the outset of each PISA cycle in which the country participated, approximately two years before the date of the cycle, is used in the analysis. For example, economic data for PISA 2015 reflect the country's level of economic development in 2013. Thus, the country's economic status is "lagged" relative to PISA. Although differences in the level of PISA participation according to a country's level of economic development could be expected, similarities in the growth of PISA participation for countries at all levels of economic development may indicate the globalization of assessment.

The other indicator related to globalization is the country's region. The World Bank groups countries into seven regions: East Asia and Pacific, Europe and Central Asia, South Asia, Sub-Saharan Africa, Latin America and the Caribbean, North America, and Middle East and North Africa. This paper uses the World Bank's country groupings as of 2015. Similarities in the level and growth of PISA participation across regions may also indicate the globalization of assessment.

Assessment capacity. A country's assessment capacity is inferred from the country's experience with three types of large-scale assessment: an international assessment, a regional assessment, and a national assessment. Each country's participation in the longest-operating international large-scale assessment, the Trends in International Mathematics and Science Study (TIMSS), was coded from published reports from each cycle of that assessment (Beaton, Martin et al. 1997; Beaton, Mullis et al. 1997; Martin et al. 1997; Martin et al. 2000; Martin et al. 2004; Martin, Mullis & Foy 2008; Martin, Mullis et al. 2012; Mullis et al. 1997; Mullis et al. 2000; Mullis et al. 2004; Mullis, Martin & Foy 2008; Mullis et al. 2012). Countries participating in TIMSS 2015 were identified through the IEA website. The variable used in the analyses was whether or not the country had participated in TIMSS during the five-year period preceding the specific PISA cycle, as follows: PISA 2000 (TIMSS 1995 or TIMSS 1999), PISA 2003 (TIMSS 1999), PISA 2006 (TIMSS 2003), PISA 2009 (TIMSS 2007), PISA 2012 (TIMSS 2007 or TIMSS 2011), PISA 2015 (TIMSS 2011).

Countries that had participated in any one of three regional large-scale assessments were identified from the assessments' websites. Countries participating in SACMEQ I, II and III were listed on the SACMEQ website (2015).
Countries participating in PERCE, SERCE and TERCE were listed on the LLECE website (UNESCO Santiago 2015). Countries participating in PASEC 2012 were listed on the CONFEMEN website (CONFEMEN 2015). The year(s) of participation in each assessment were noted. The variable used in the analyses was whether or not the country had participated in a regional assessment prior to a specific PISA cycle.

Countries also build their own assessment capacity through national assessments. The UNESCO Global Monitoring Report provided data on each country's history of national assessments (UNESCO 2015). The variable used in the analyses was whether or not the country had conducted a national assessment in the three years prior to a specific PISA cycle.

Role of economists. The indicator for the role that economists play in the growth of PISA is the country's membership in the OECD. Historical data on OECD membership were extracted from the OECD website. The variable used was whether the country was an OECD member at the time of a particular PISA cycle.

C. Issues with matching
All data were matched by country and year, and the unit of analysis used in this paper is the "country-PISA cycle." The assumption is that all countries could have participated in up to six cycles of PISA, depending upon whether or not the country existed at the time of the specific cycle. Some issues in matching data were encountered, since both PISA and TIMSS report information on "education systems" while the World Bank reports information on "economies." This means, for example, that PISA and TIMSS report results for various education systems in the United Kingdom (Scotland, England, Wales and Northern Ireland), whereas the World Bank reports information for the United Kingdom as a whole. This paper follows the World Bank's policy with respect to country identification.

D. Estimation strategy
This paper uses a logistic regression approach to estimate the effect on PISA participation of a country's level of economic development, region, prior participation in national, regional or international assessments, and OECD membership. The logistic regression analysis estimates the increased odds of a country's participation in PISA at any time, based on these country characteristics. Two models are estimated, one with controls for the six PISA cycles and one without these controls. In addition, a final model is estimated for only the PISA 2015 cycle.
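The paper reports Stata logit estimates with odds ratios; the sketch below is an equivalent, purely illustrative version in Python (statsmodels), assuming the hypothetical country-cycle data frame `panel` and indicator columns introduced above (in_pisa, income_group, eca_region, in_timss_5yr, in_regional, national_assessment, oecd_member, cycle_year). It is not the author's code.

```python
# Minimal sketch of the estimation strategy, not the paper's actual Stata commands.
# Logistic regression of PISA participation on lagged income level, ECA location,
# prior assessment experience and OECD membership, reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_pisa_logit(panel: pd.DataFrame, control_for_cycle: bool = False):
    """Fit the pooled logit and return odds ratios (exponentiated coefficients)."""
    formula = (
        "in_pisa ~ C(income_group, Treatment('low')) + eca_region"
        " + in_timss_5yr + in_regional + national_assessment + oecd_member"
    )
    if control_for_cycle:
        formula += " + C(cycle_year)"        # dummy variables for the six PISA cycles
    result = smf.logit(formula, data=panel).fit(disp=False)
    return np.exp(result.params), result     # odds ratios and the full fit

# Usage: odds, fit = fit_pisa_logit(panel)            # pooled model
#        odds, fit = fit_pisa_logit(panel, True)      # with cycle-year controls
# The non-OECD model drops oecd_member and restricts panel to non-member countries.
```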
RESULTS
A. Descriptive statistics and bivariate comparisons
PISA participation. Since 2000, country participation in PISA[4] has increased by about 70 percent, from 20 percent of countries participating in the first cycle to 34 percent of countries participating in the sixth cycle of the assessment. Thus, approximately one-third of all countries with populations over 30,000 are participating in the sixth cycle of 2015 (table 2).

[4] Including PISA Plus, which was carried out in 2001 and 2010 with a number of OECD "observer" or "partner" countries.

Table 2: Share of countries participating in PISA, by PISA cycle

                     I            II       III      IV           V        VI
                     (2000/01)*   (2003)   (2006)   (2009/10)*   (2012)   (2015)
PISA participants    20.3%        19.1%    27.01%   33.5%        29.3%    33.95%
All countries        207          209      211      212          215      215
Note: * includes PISA Plus countries.

Participation by level of economic development. A higher share of high-income and upper-middle-income countries have participated in PISA, as compared with lower-middle-income and low-income countries. Overall participation rates increase dramatically with the level of a country's income, with greater levels of participation among higher income countries in all cycles. The gap in participation rates between higher income and lower income countries has increased over time (figure 1). The most rapid growth in PISA participation has occurred for upper-middle-income countries, whose rate of participation approximately doubled between 2000 and 2015. Participation rates for lower-middle-income and low-income countries remained quite stable from 2000 to 2015. Differences between PISA cycle III and cycle IV in the participation rates for upper-middle-income countries may be partially explained by the effort to include more middle-income countries in PISA 2009 Plus, which added 10 middle-income countries for a PISA assessment in 2010.

Figure 1: Share of countries participating in PISA (percent), by PISA cycle and economic status; cycles I (2000/01) through VI (2015), with cycles I and IV including PISA Plus; series shown: OECD, high income (includes OECD), upper-middle-income, lower-middle-income, and low income.

Participation by region. The overall PISA participation rates of countries in Europe and Central Asia (58%) and in North America (61%) are higher than the participation rates of countries in other regions, with Sub-Saharan Africa recording the lowest participation rate (less than 0.5%). The inclusion of Central Asian countries in the ECA group hides the fact that the overall participation rate for European Union countries[5] is 100%, whereas the participation rate of non-EU ECA countries is substantially lower (64%).

[5] The EU countries are: Austria, Belgium, Bulgaria, Croatia, Republic of Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, Netherlands, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden and the UK.

Table 3: Share of countries participating in PISA, by cycle and region (percent)

                                I            II       III      IV           V        VI
                                (2000/01)*   (2003)   (2006)   (2009/10)*   (2012)   (2015)
East Asia and Pacific           20.0         21.6     24.3     32.4         35.1     35.1
Europe and Central Asia         50.0         48.2     64.3     66.7         64.9     68.4
Latin America and Caribbean     12.8         7.7      15.4     28.2         19.5     26.8
Middle East and North Africa    4.8          4.8      19.1     28.6         23.8     38.1
North America                   66.7         66.7     66.7     66.7         66.7     66.7
South Asia                      0            0        0        12.5         0        0
Sub-Saharan Africa              0            0        0        2.1          0        0
Note: * includes PISA Plus countries.

Assessment capacity. Prior assessment experience appears to increase PISA participation (table 4). Participating in an ILSA appears to build assessment capacity: the share of countries participating in PISA after having previously participated in IEA's TIMSS assessment is higher than that of countries that had not previously participated in TIMSS. In all PISA cycles, countries that had participated in a recent TIMSS assessment had higher rates of participation in PISA than countries that had not, and this advantage was relatively consistent over time: 50 percentage points in 2000 and 54 percentage points in 2015. The advantage from TIMSS participation may arise from the similarity between the two assessments in the administrative skills required, so that countries that participated in TIMSS would already understand the administrative demands of PISA participation.

Table 4: Share of countries participating in PISA, by cycle and previous assessment "capacity" (percent)

                               I            II       III      IV           V        VI
                               (2000/01)*   (2003)   (2006)   (2009/10)*   (2012)   (2015)
International large-scale
  In TIMSS                     60.5         51.2     59.6     62.7         64.3     73.3
  Not in TIMSS                 9.8          11.13    17.7     22.2         13.8     18.7
Regional
  In regional                  0            0        24.0     7.1          25.0     21.1
  Not in regional              21.0         20.4     27.42    35.4         31.0     36.7
National
  National                     30.9         30.6     46.9     49.5         47.2     56.0
  No national                  15.1         13.1     14.6     18.4         13.1     17.7
Note: * includes PISA Plus countries.
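The shares reported in tables 2 through 4 are simple cross-tabulations of the country-cycle file. As an illustrative sketch only (again assuming the hypothetical `panel` data frame and column names used earlier, not the paper's own code), such shares could be tabulated as follows:

```python
# Sketch of the bivariate comparisons: share of countries participating in PISA,
# by cycle and by a 0/1 grouping indicator (e.g. prior TIMSS, regional or national
# assessment experience), in the spirit of tables 2 through 4.
import pandas as pd

def participation_shares(panel: pd.DataFrame, by: str) -> pd.DataFrame:
    """Percent of countries in PISA, by cycle year and a grouping indicator."""
    return (
        panel.groupby(["cycle_year", by])["in_pisa"]
        .mean()                    # mean of a 0/1 indicator is the participation share
        .mul(100).round(1)         # express as a percentage
        .unstack(by)
    )

# e.g. participation_shares(panel, "in_timss_5yr") mirrors the TIMSS rows of table 4.
```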
Participation in a regional assessment does not appear to build capacity in the same way. In fact, a much smaller share of countries participating in a regional assessment subsequently participate in PISA, as compared with countries that do not participate in a regional assessment. Regional assessments may substitute for international assessments in Latin America and Sub-Saharan Africa. By comparison, countries with a tradition of national assessments also participate more in PISA. The advantage associated with having a national assessment ranges from 16 percentage points in 2000/2001 to 38 percentage points in 2015. This may be due to the technical (rather than only administrative) assessment expertise that is developed through designing and implementing an assessment specifically for the country.

Participation by OECD members. All OECD member countries participate in PISA, and all countries that were members of the OECD at the time participated in each PISA cycle (figure 1). While an increasing number of non-OECD member countries participate in PISA and are referred to as "partner" countries in OECD publications, every OECD member country has participated in PISA. As a result, the OECD member countries' participation rate is 100% for all PISA cycles.

B. Multivariate comparisons using logistic regressions with odds ratios
Assessment capacity, geographic location and economic status may be related, and bivariate comparisons do not take this into account. This section explores the likelihood of a country's participation in PISA when these three advantage factors are considered simultaneously. A logistic regression approach (Stata logit regression with odds ratios) is used to estimate the odds of participation in PISA, 2000-2015, based on the above "advantage" (non-risk) factors. Separate regressions are run for two groups of countries: (a) all countries and (b) non-OECD countries; in addition, models are run that control for the assessment cycle.

Table 5 presents the means and standard deviations of the variables used in the logistic regression analysis, separately for observations for OECD countries, for all countries (including OECD countries) and for non-OECD countries. The unit of analysis (observation) is the country at up to six times, once for each PISA cycle: 2000, 2003, 2006, 2009, 2012 or 2015. The value of the dependent variable (participation in a given PISA cycle) is highest for OECD countries, and higher for all countries than for non-OECD countries.
Average values for four predictor variables are highest for OECD countries and higher for all countries than for non-OECD countries; these are: TIMSS participation, national assessments, location in ECA, and high-income economy. Values for three other predictor variables are lowest for OECD countries and lower for all countries than for non-OECD countries; these are: location in Sub-Saharan Africa, low-income economy, and lower-middle-income economy.

Table 5: Descriptive statistics

                                  OECD countries        All countries         Non-OECD countries
                                  (obs = 188)           (obs = 1269)          (obs = 1081)
                                  Mean     Std. dev.    Mean     Std. dev.    Mean     Std. dev.
OECD member                       1.00     0            .15      .36          0        0
In PISA                           .99      .10          .27      .45          .15      .36
In TIMSS within 5 years           .61      .49          .25      .43          .19      .39
In regional assessment            .03      .16          .08      .28          .09      .29
National assessment               .06      .25          .41      .49          .35      .48
East Asia and Pacific             .13      .33          .17      .39          .18      .39
Europe and Central Asia           .76      .43          .26      .44          .18      .38
Latin America and Caribbean       .04      .20          .19      .39          .21      .41
Middle East and North Africa      .01      .10          .10      .30          .11      .32
North America                     .06      .25          .01      .12          .01      .07
South Asia                        0        0            .04      .19          .04      .21
Sub-Saharan Africa                0        0            .22      .42          .26      .44
Low-income (lagged)               0        0            .25      .43          .29      .46
Lower-middle-income (lagged)      .01      .07          .26      .44          .30      .46
Upper-middle-income (lagged)      .15      .36          .20      .40          .21      .40
High-income (lagged)              .84      .43          .29      .46          .20      .40

Basic model. First, the variables comprising each advantage factor are examined separately, for: (a) all countries and (b) non-OECD countries (table 6). The first three columns of this table present results for all countries, and the second three columns present the results for non-OECD countries. The odds ratios reported in this table show the increase in the odds of PISA participation for each of three groups of country characteristics: the country's economic status at the start of the PISA cycle, its regional location, and its capacity for assessment as indicated by prior experience; in addition, OECD membership is included for the "all countries" model.

The first column shows the strong effect of a country's level of economic development on PISA participation. High-income countries are nearly 80 times more likely to participate as compared with low-income countries, and upper-middle-income countries are 36 times more likely to participate. Among the non-OECD countries (column 4), these odds drop considerably, but remain highly significant.

The second column indicates that countries outside the ECA region are much less likely to participate in PISA as compared with those in the ECA region, for all regions other than North America, where the odds of participation are slightly higher than in the ECA region; North America, however, includes only three countries (Canada, Jamaica and the United States; Canada and the US participate in PISA). Non-OECD countries (column 5) in regions other than ECA are also less likely to participate than countries in ECA.
Table 6: Logistic regressions, all countries and non-OECD countries (odds ratios; standard errors in parentheses)

                                    All countries                       Non-OECD countries
Dependent variable: In PISA         (1)        (2)        (3)           (4)        (5)        (6)
                                    Income     Region     Capacity      Income     Region     Capacity
Low income (omitted)                ---                                 ---
Lower-middle income (lagged)        9.38***                             9.17***
                                    (4.50)                              (4.40)
Upper-middle income (lagged)        35.99***                            25.75***
                                    (16.89)                             (12.21)
High income (lagged)                77.94***                            18.73***
                                    (36.06)                             (8.97)
Europe and Central Asia (omitted)              ---                                 ---
East Asia and Pacific                          .255***                             .496**
                                               (.048)                              (.118)
Latin America and Caribbean                    .147***                             .383***
                                               (.030)                              (.091)
Middle East and North Africa                   .161***                             .470**
                                               (.04)                               (.130)
North America                                  1.30***                             ---
                                               (.666)
South Asia                                     .013***                             .044**
                                               (.014)                              (.045)
Sub-Saharan Africa                             .002***                             .007***
                                               (.002)                              (.007)
Previously in TIMSS                                       8.14***                             8.20***
                                                          (1.26)                              (1.58)
In regional assessment                                    0.66                                1.23
                                                          (.19)                               (.385)
National assessment                                       4.08***                             2.64***
                                                          (.61)                               (.504)
Constant                            0.02***    1.54***    0.013***      .016***    .485***    0.059***
                                    (0.01)     (0.17)     (.0008)       (.007)     (0.074)    (0.009)
Number of observations              1269       1269       1269          1081       1075       1081
LR chi2                             341.06     380.64     341.9         109.05     133.3      157.79
Degrees of freedom                  3          6          3             3          5          3
Pseudo R2                           .229       .255       .229          .119       .146       .173
Note: Standard errors in parentheses; *** p<.001, ** p<.01, * p<.05.

The third column indicates that the likelihood of participation in a PISA cycle is affected by a country's capacity for assessment, as indicated by its prior experience with TIMSS or national assessments. Participation in a regional assessment, however, is unrelated to participation in PISA. These odds are similar for all countries and for non-OECD countries.

Table 7 presents the results for income level, region and assessment capacity considered simultaneously. In general, the coefficients for most of the variables are smaller in the full models than in the models in which the variables are considered in separate groups. All odds are relative to countries with the following characteristics: low-income, not in the ECA region, and without prior assessment experience. The first column presents the model that includes an indicator for OECD membership; as noted previously, OECD membership perfectly predicts PISA participation. The level of economic development strongly increases the odds of PISA participation, with or without an OECD control: upper-middle-income and high-income countries are about 15 times more likely to participate in PISA as compared with low-income countries, and lower-middle-income countries are about five times more likely to participate. Countries in the ECA region are about three times more likely to participate than countries outside the region. Coefficients for region and level of economic development are higher in the model that does not include a control for OECD membership. Countries with prior experience of TIMSS are about five times more likely to participate in PISA, and countries with national assessments are about three times more likely to participate. Regional assessments are unrelated to PISA participation in all models. The strong positive effect of having built capacity through TIMSS participation and the non-significance of regional assessment capacity do not differ between the two groups of countries.
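As a rough aid to reading table 7 (a back-of-the-envelope illustration using the published estimates, not an additional result of the paper): in a logit model, a country's predicted odds equal the baseline odds given by the constant multiplied by the odds ratio of each characteristic the country has, and the implied probability is odds/(1 + odds).

```python
# Reading the odds ratios in table 7 (column 1): predicted odds for a hypothetical
# non-OECD, high-income country in the ECA region with recent TIMSS participation
# and a national assessment. Published estimates are rounded, so this is approximate.
baseline_odds = 0.006                              # constant: low-income, non-ECA, no prior assessment
odds = baseline_odds * 14.49 * 3.01 * 5.01 * 3.60  # high income x ECA x TIMSS x national
probability = odds / (1 + odds)
print(f"implied participation probability: {probability:.2f}")   # about 0.83
```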
Table 7: Odds of participating in PISA, all countries and non-OECD countries

In PISA                 All countries             All countries             Non-OECD countries
                        Odds ratio (std. error)   Odds ratio (std. error)   Odds ratio (std. error)
Lower-middle income     5.01 (2.47)**             4.91 (2.45)**             4.86 (2.48)**
Upper-middle income     15.04 (7.35)***           20.46 (10.03)***          15.60 (7.63)***
High income             14.49 (7.21)***           39.22 (19.01)***          13.19 (6.58)***
ECA region              3.01 (0.68)***            6.07 (1.20)***            3.07 (0.71)***
In TIMSS                5.01 (1.04)***            5.43 (1.03)***            5.36 (1.12)***
In regional             1.21 (0.41)               1.32 (0.3)                1.20 (0.41)
National                3.60 (0.76)***            6.03 (1.16)***            3.41 (0.73)***
OECD                    104.90 (77.37)***         ---                       ---
Constant                .006 (.003)***            .004 (.002)***            .006 (.003)***
Observations            1269                      1269                      1081
LR chi2                 809.07                    678.58                    256.04
Degrees of freedom      8                         7                         7
Pseudo R2               0.542                     0.455                     0.280
Note: ** p < .01, *** p < .001.

Controlling for year of assessment. The above analyses were repeated with dummy variables introduced to control for the PISA cycle year. Controlling for PISA cycle did not change the size or statistical significance of the coefficients in any of the models (table 8).

Table 8: Odds of participating in PISA, all countries and non-OECD countries, controlling for year of assessment

In PISA                 All countries             All countries             Non-OECD countries
                        Odds ratio (std. error)   Odds ratio (std. error)   Odds ratio (std. error)
Lower-middle income     5.12 (2.55)**             4.99 (2.50)**             4.95 (2.47)**
Upper-middle income     14.60 (7.24)***           21.08 (10.43)***          15.15 (7.51)***
High income             13.97 (7.05)***           40.65 (19.91)***          12.68 (6.41)***
ECA region              3.07 (0.71)***            6.24 (1.24)***            3.14 (0.74)***
In TIMSS                5.10 (1.08)***            5.55 (1.07)***            5.47 (1.17)***
In regional             1.22 (0.39)               1.33 (0.44)               1.12 (0.39)
National                3.58 (0.77)***            5.99 (1.16)***            3.40 (0.74)***
OECD                    145.84 (109.47)***        ---                       ---
Constant                .003 (.002)***            .004 (.002)***            .004 (.002)***
Observations            1269                      1269                      1081
LR chi2                 832.28                    689.57                    278.29
Degrees of freedom      13                        12                        12
Pseudo R2               0.558                     0.463                     0.305
Note: ** p < .01, *** p < .001; dummy variables for PISA year included as controls.

Single-year logistic regressions. Separate analyses were conducted for each PISA cycle, using fewer indicator variables, since the number of observations ranged from 207 to 215 countries per year (table 9). As previously noted, OECD membership perfectly predicted participation in PISA in all years, so that variable was not included in these regressions.
Table 9: Odds of participating in PISA, by year: PISA 2000, PISA 2006 and PISA 2015

In PISA                 PISA 2000                 PISA 2006                 PISA 2015
                        Odds ratio (std. error)   Odds ratio (std. error)   Odds ratio (std. error)
Upper-middle income     3.18 (2.17)*              5.09 (3.06)**             9.34 (5.78)***
High income             8.84 (4.89)***            9.02 (4.85)***            13.63 (19.91)***
ECA region              7.25 (3.75)***            12.37 (6.19)***           6.74 (3.40)***
In TIMSS                9.04 (4.46)***            7.71 (3.92)***            8.01 (4.20)***
National                3.90 (2.11)*              9.42 (4.76)***            9.77 (4.69)***
Constant                .012 (.007)***            .010 (.006)***            .008 (.005)***
Observations            207                       211                       215
LR chi2                 92.63                     117.29                    137.03
Degrees of freedom      5                         5                         5
Pseudo R2               0.444                     0.476                     0.497
Note: * p < .05, ** p < .01, *** p < .001.

Although the size of the coefficients varied somewhat from year to year, the odds of participation in PISA were consistently higher for: (a) countries with existing assessment capacity, as shown by their prior participation in TIMSS and their own national assessments, (b) upper-middle-income and high-income countries, and (c) countries in the ECA region. Moreover, the gap between higher income and lower income countries increased over time. All odds are relative to countries with the following characteristics: low-income or lower-middle-income, not in the ECA region, with no prior TIMSS participation and no national assessment. Table 9 presents the results for three cycles: PISA 2000, 2006 and 2015.

In short, countries are more likely to participate in PISA when their level of economic development is higher, when they are located in Europe or Central Asia, and when they have some existing assessment capacity, whether gained through participation in a prior international large-scale assessment or developed through their own national assessment.

DISCUSSION
This paper began with three questions related to the growth in the number of countries participating in international large-scale assessments: (a) Is the growth of international large-scale assessments indicative of the globalization of assessment? (b) Does participating in an assessment build a country's capacity for future assessment? (c) Is the growth in assessments driven by demands for information, particularly from economists? While this paper sheds some light on these questions, many questions remain.

Is PISA participation indicative of the "globalization" of assessment? That is, are countries in all regions and at all levels of development increasingly participating in PISA? Clearly, PISA participation is significantly higher for countries in the ECA region, which is closer to the European-based OECD, than for countries outside this region (with the exception of North American participation); the PISA participation of countries in South Asia and Sub-Saharan Africa is particularly low. At the same time, PISA 2015 is reaching more countries in more regions than were reached in PISA 2000, and a new OECD initiative, PISA for Development, is attempting to accommodate more lower-income countries (OECD 2015).

Globalization across levels of economic development is less clear. Growth in participation is observed for high-income and upper-middle-income countries, suggesting globalization. But little growth in participation is observed for low-income or lower-middle-income countries. Virtually no low-income country has participated in PISA (the average share across the six cycles is 1.6%), and the average share of participating lower-middle-income countries is about 13%.
The seven countries participating in PISA for Development[6] include two low-income and four lower-middle-income countries; if they all complete the assessment on schedule, the share of low-income country participations would increase to 5.5% and the share of lower-middle-income country participations would increase to 18.7%.

[6] The countries participating in PISA for Development are: Cambodia (low-income), Ecuador (upper-middle-income), Guatemala (lower-middle-income), Paraguay (lower-middle-income), Senegal (lower-middle-income), Tanzania (low-income) and Zambia (lower-middle-income).

The low rate of participation among low-income and middle-income countries may also reflect the deterrent effect of the actual costs of participation. At a minimum, participation in PISA requires payment of an international participation fee of 182,000 Euros, payable at 45,500 Euros annually for four years. This amount is a very small cost for upper-middle-income and high-income countries, but may appear greater for lower-middle-income and low-income countries, particularly when added to other direct costs (such as travel to required meetings) and the in-country costs of implementing the assessment. In the past, the participation of low- and middle-income countries in various international large-scale assessments has been facilitated by the economic support of donors such as the World Bank and the United Nations Development Programme (Liberman & Clarke 2012; Lockheed 2010).

Does participation in other types of assessment build the capacity for participating in an international large-scale assessment such as PISA? Here, the answer appears much clearer. A much higher share of countries that have participated in another international large-scale assessment (TIMSS) go on to participate in PISA. Moreover, countries that have an established national assessment system also go on to participate in PISA at higher rates than countries lacking national assessments. This higher rate of participation can be explained by the increased capacity for assessment that such participation develops, with respect to both the administrative demands of the assessment and its technical requirements.

Third, is the growth in international large-scale assessment driven by the data needs of economists? The results indicate that, at a minimum, country members of a significant organization dominated by economists, the OECD (which sponsors PISA), are far more likely to participate in PISA than non-members. The participation rate of OECD member countries is 100% in all years, whereas the participation rate of non-OECD countries averages around 15% across the six PISA cycles. OECD membership predicts PISA participation perfectly in all years. Research suggests that economists disproportionately use the results from ILSAs in general, and PISA in particular, in their research on human capital and economic development (Lockheed 2013); in particular, assessments have been heavily used by the economists Ludger Woessmann (2002) and Eric Hanushek (e.g., Hanushek & Woessmann 2012).

Other reasons for growth in international large-scale assessments. The growth in PISA may also be due to other concurrent changes at the country level. The context for assessment has changed in many countries over the past 15 years: near-universal basic education, better education information systems, better governance, greater ease of doing business, and more open information.
Universal basic education (often including secondary education) means that countries can turn their attention from measuring educational access to measuring educational quality. Virtually all countries participating in most international large-scale assessments report high rates of enrollment in primary and secondary education (World Bank 2015 April). Better education information systems mean that the information needed for scientific sampling of schools and students is available for all large-scale assessments. Better governance means that a stable and secure environment for providing education can be found (Kaufmann, Kraay & Mastruzzi 2010; World Bank 2015). Virtually no countries participating in international large-scale assessments have been identified as "fragile" or "conflict" states at the time the assessment was conducted, and some countries that had previously participated in an international large-scale assessment ceased to participate after the governance situation eroded. Greater ease of doing business means that ministries of education can more easily employ the temporary resources needed to complete an international or regional assessment (World Bank 2014). The average rating for "ease of doing business" is higher for countries participating in PISA, as compared with countries in the same economic group that do not participate in PISA (World Bank 2014; OECD forthcoming 2015). More open information means that the results of assessments can be shared more broadly within the country (Reporters Without Borders 2014). The increase in international large-scale assessments may indeed indicate globalization, but possibly just the globalization of development.

REFERENCES
Beaton, A., M. Martin, I. Mullis, E. Gonzales, T. Smith and D. Kelly (1996), Science Achievement in the Middle School Years: IEA's Third International Mathematics and Science Study, Center for the Study of Testing, Evaluation and Education Policy, Boston College, Chestnut Hill, MA.
Beaton, A., I. Mullis, M. Martin, E. Gonzales, D. Kelly and T. Smith (1996), Mathematics Achievement in the Middle School Years: IEA's Third International Mathematics and Science Study, Center for the Study of Testing, Evaluation and Education Policy, Boston College, Chestnut Hill, MA.
Chabbot, C. and E. Elliott (eds.) (2003), Understanding Others, Educating Ourselves: Getting More from International Comparative Studies in Education, National Academies Press, Washington, DC.
CONFEMEN (2015), website: http://www.confemen.org/le-pasec/
Ferrer, G. (2006), Educational Assessment Systems in Latin America: Current Practice and Future Challenges, Partnership for Educational Revitalization in the Americas (PREAL), Washington, DC.
Greaney, V. and T. Kellaghan (2008), National Assessments of Educational Achievement, Vol. 1: Assessing National Achievement Levels in Education, World Bank, Washington, DC.
Hanushek, E. and L. Woessmann (2012), "Do Better Schools Lead to More Growth? Cognitive Skills, Economic Outcomes, and Causation", Journal of Economic Growth, Vol. 17, No. 4, pp. 267-321.
Kamens, D. (2013), "Globalization and the Emergence of an Audit Culture: PISA and the Search for 'Best Practices' and Magic Bullets", in H.-D. Meyer and A. Benavot (eds.), PISA, Power and Policy: The Emergence of Global Educational Governance, pp. 117-140, Symposium Books, Oxford.
Kaufmann, D., A. Kraay and M. Mastruzzi (2010), The Worldwide Governance Indicators: Methodology and Analytical Issues, Brookings Institution, Washington, DC.
Kellaghan, T. and V. Greaney (2001), Using Assessment to Improve the Quality of Education, International Institute for Educational Planning, UNESCO, Paris.
Liberman, J. and M. Clarke (2012), A Review of World Bank Support for Student Assessment Activities in Client Countries, 1998-2009, World Bank, Washington, DC.
Lockheed, M. (2015), "PISA for Development? The Experiences of Middle-Income Countries Participating in PISA", paper presented at the World Bank, Washington, DC, June 4, 2015.
Lockheed, M. (2013), "Causes and Consequences of International Assessments in Developing Countries", in H.-D. Meyer and A. Benavot (eds.), PISA, Power and Policy: The Emergence of Global Educational Governance, pp. 163-185, Symposium Books, Oxford.
Lockheed, M. (2010), The Craft of Education Assessment: Does Participating in International and Regional Assessments Build Assessment Capacity in Developing Countries?, International Association for the Evaluation of Educational Achievement, Amsterdam.
Martin, M., I. Mullis, A. Beaton, E. Gonzales, T. Smith and D. Kelly (1997), Science Achievement in the Primary School Years: IEA's Third International Mathematics and Science Study, Center for the Study of Testing, Evaluation and Education Policy, Boston College, Boston.
Martin, M., I. Mullis and P. Foy (with J.F. Olson, E. Erberber, C. Preuschoff and J. Galia) (2008), TIMSS 2007 International Science Report: Findings from IEA's Trends in International Mathematics and Science Study at the Fourth and Eighth Grades, TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, MA.
Martin, M., I. Mullis, P. Foy and G. Stanco (2012), Trends in International Mathematics and Science Study 2011: International Results in Science, TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, MA.
Martin, M.O., I.V.S. Mullis, E.J. Gonzalez and S.J. Chrostowski (2004), TIMSS 2003 International Report: Findings from IEA's Trends in International Mathematics and Science Study at the Fourth and Eighth Grades, TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, MA.
Martin, M., I. Mullis, E. Gonzalez, K. Gregory, T. Smith, S. Chrostowski, R. Garden and K. O'Connor (2000), TIMSS 1999 International Science Report: Findings from IEA's Repeat of the Third International Mathematics and Science Study at the Eighth Grade, International Study Center, Boston College, Chestnut Hill, MA.
Meyer, H.-D. and A. Benavot (eds.) (2013), PISA, Power and Policy: The Emergence of Global Educational Governance, Symposium Books, Oxford.
Mullis, I., M. Martin, A. Beaton, E. Gonzales, D. Kelly and T. Smith (1997), Mathematics Achievement in the Primary School Years: IEA's Third International Mathematics and Science Study, Center for the Study of Testing, Evaluation and Education Policy, Boston College, Boston.
Mullis, I.V.S., M.O. Martin and P. Foy (with J.F. Olson, C. Preuschoff, E. Erberber, A. Arora and J. Galia) (2008), TIMSS 2007 International Mathematics Report: Findings from IEA's Trends in International Mathematics and Science Study at the Fourth and Eighth Grades, TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, MA.
Mullis, I., M. Martin, P. Foy and A. Arora (2012), Trends in International Mathematics and Science Study 2011: International Results in Mathematics, TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, MA.
Mullis, I., M. Martin, E. Gonzalez, K. Gregory, R. Garden, K. O'Connor, S. Chrostowski and T. Smith (2000), TIMSS 1999 International Mathematics Report: Findings from IEA's Repeat of the Third International Mathematics and Science Study at the Eighth Grade, International Study Center, Boston College, Chestnut Hill, MA.
OECD (forthcoming 2015), The Experience of Middle-Income Countries Participating in PISA, 2000-2015, OECD Publishing, Paris.
OECD (2014), PISA 2012 Results: What Students Know and Can Do – Student Performance in Mathematics, Reading and Science (Volume I, Revised edition, February 2014), OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264208780-en.
OECD (2010), PISA 2009 Results: What Students Know and Can Do – Student Performance in Reading, Mathematics and Science (Volume I), PISA, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264091450-en.
OECD (2007), PISA 2006: Science Competencies for Tomorrow's World, Volume 1: Analysis, PISA, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264040014-en.
OECD (2004), Learning for Tomorrow's World: First Results from PISA 2003, PISA, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264006416-en.
OECD (2001), Knowledge and Skills for Life: First Results from PISA 2000, PISA, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264195905-en.
Reporters Without Borders (2014), World Press Freedom Index 2014, Reporters Without Borders, Paris.
SACMEQ (2015), website: www.sacmeq.org.
UNESCO (2015), Education for All 2000-2015: Achievements and Challenges, UNESCO, Paris, http://en.unesco.org/gem-report/report/2015/education-all-2000-2015-achievements-and-challenges.
UNESCO Regional Office for Education in Latin America and the Caribbean (2014), "The Latin-American Laboratory for the Assessment of the Quality of Education (LLECE): Associated Factors", presentation in Washington, DC, 10 April 2014.
UNESCO Regional Office for Education in Latin America and the Caribbean (n.d.), Mapa de Evaluaciones Educativas, www.unesco.org/santiago.
Wagemaker, H. (2014), "International Large-Scale Assessments: From Research to Policy", in L. Rutkowski, M. von Davier and D. Rutkowski (eds.), Handbook of International Large-Scale Assessment: Background, Technical Issues and Methods of Data Analysis, Taylor and Francis, New York.
Walker, M. (2011), PISA 2009 Plus Results: Performance of 15-year-olds in Reading, Mathematics and Science for 10 Additional Participants, ACER Press, Victoria.
Woessmann, L. (2002), "Cross-country Evidence on Human Capital and the Level of Economic Development: The Role of Measurement Issues in Education", Historical Social Research, Vol. 27, No. 4, pp. 47-76.
World Bank (2015 April), World Bank EdStats, World Bank, Washington, DC.
World Bank (2015), Worldwide Governance Indicators, World Bank, Washington, DC.
World Bank (2015), Historical country classification, https://datahelpdesk.worldbank.org/knowledgebase/articles/378834-how-does-the-world-bank-classify-countries.
World Bank (2014), Doing Business 2015: Going Beyond Efficiency, World Bank, Washington, DC.