Plastic Pollution Assessment Methodologies Suitability Toolkit V1.0 User Manual

© 2023 International Bank for Reconstruction and Development / The World Bank
1818 H Street NW, Washington DC 20433
Telephone: 202-473-1000; Internet: www.worldbank.org

This work is a product of the staff of The World Bank with external contributions. The findings, interpretations, and conclusions expressed in this work do not necessarily reflect the views of The World Bank, its Board of Executive Directors, or the governments they represent. The World Bank does not guarantee the accuracy, completeness, or currency of the data included in this work and does not assume responsibility for any errors, omissions, or discrepancies in the information, or liability with respect to the use of or failure to use the information, methods, processes, or conclusions set forth. The boundaries, colors, denominations, and other information shown on any map in this work do not imply any judgment on the part of The World Bank concerning the legal status of any territory or the endorsement or acceptance of such boundaries. Nothing herein shall constitute or be construed or considered to be a limitation upon or waiver of the privileges and immunities of The World Bank, all of which are specifically reserved.

Citation: World Bank. 2023. Plastic Pollution Assessment Methodologies Suitability Toolkit (PLAST) – User Manual. Washington, DC.

Rights and Permissions
The material in this work is subject to copyright. Because The World Bank encourages dissemination of its knowledge, this work may be reproduced, in whole or in part, for non-commercial purposes as long as full attribution to this work is given. Any queries on rights and licenses, including subsidiary rights, should be addressed to World Bank Publications, The World Bank Group, 1818 H Street NW, Washington, DC 20433, USA; fax: 202-522-2625; e-mail: pubrights@worldbank.org.

Acknowledgements
The PLAST toolkit was developed by a consortium coordinated by the University of Leeds with input from Deltares, IUCN and David Newby Associates (DNA). It was led by Dr Costas Velis (University of Leeds), with main research and development provided by Dr Joshua W. Cottom (University of Leeds). The user interface was developed by DNA with graphical support from the Digital Education Service (University of Leeds). The wider development team included: University of Leeds: Ed Cook; Deltares: Joana Veiga, Robyn Gwee, Bastien Van Veen; IUCN: Dr Janaka de Silva, Maeve Nightingale, Hien Bui Thi Thu, Lynn Sorrentino, Joao Sousa. The work was managed by a World Bank team comprising Kate Philp, Anjali Acharya, Tao Wang, and Ikbal Alexander under the leadership and guidance of Mona Sur. Grzegorz Peszko and Suiko Yoshijima provided peer review.

The study team would like to thank the representatives from national government departments, NGOs, academia and development partners who participated in the stakeholder consultation workshop, held virtually in February 2021, for their valuable insights and input to the PLAST toolkit. The team is also grateful for the opportunity to present the updated PLAST toolkit to the ASEAN Working Group on Coastal and Marine Environment (AWGCME) in May 2023 and for the subsequent insights provided by the Focal Point of AWGCME, Thailand.

Funding for the PLAST toolkit was provided by PROBLUE, an umbrella multi-donor trust fund, administered by the World Bank, that supports the sustainable and integrated development of marine and coastal resources in healthy oceans.
Please direct technical support questions to Dr Costas Velis (University of Leeds).

Contents
Acknowledgements
1. Overview
   Aim
   Objectives
   Who should use PLAST?
2. How to open and run PLAST
   Microsoft Windows
   Other Microsoft Excel security features
   MacOS
3. Terminology
   What is meant by a methodological approach?
   What is meant by a plastic pollution assessment methodology?
   Other terminology
4. Structure of PLAST
5. How to use PLAST
   Inputting user needs and available resources
   Interpreting results
   Comparing methodologies
6. How does PLAST work?
   Scoring and ranking of methodological approaches
   Scoring and ranking of plastic pollution assessment methodologies
7. Frequently asked questions
8. Troubleshooting
9. Appendices
   Appendix 1
   Appendix 2
   Appendix 3

1. Overview

Aim
In response to the plastic pollution crisis, many methodologies for the quantification of plastic pollution have been developed. However, the breadth of these plastic pollution assessment methodologies makes it difficult for practitioners to assess which methods are best suited to their needs. The 'Plastic Pollution Assessment Methodologies Suitability Toolkit' (PLAST) has been designed to characterize and compare plastic pollution assessment methodologies and generalised methodological approaches, and to suggest the most suitable options based on a user's requirements. PLAST focuses on quantification assessments, whereby the amounts of plastic pollution are determined and insight into sources, pathways and fates is provided. PLAST does not aim to harmonise the methodologies; however, it allows their outputs, technical features and data requirements to be compared, thereby providing the first necessary stage in such harmonisation.

Objectives
PLAST has four objectives:
1. To collate methodologies available for the assessment of plastic pollution.
2. To characterise assessment methodologies according to an explanatory framework.
3. To suggest what broad methodological approaches may be best suited based on a user's overall objectives and generalised resources.
4. To suggest suitable plastic pollution assessment methodologies based on a user's specific technical objectives and data availability.

Who should use PLAST?
PLAST is designed to aid all users interested in applying plastic pollution assessment methodologies, for example those shown in Figure 1. These users can be split into those wanting to apply a methodology to assess plastic pollution (governments, NGOs, local authorities and businesses) and those looking to gain further information on how newly developed assessment methodologies compare with those that already exist (academia and developers).

Figure 1: Users that may want to use PLAST.

To apply PLAST, we recommend input from both high-level users focused on overall objectives, policies and resources, and technical users familiar with plastic pollution assessment methodology terminology, data availability and required outputs. The manner in which these users' knowledge and requirements are incorporated into PLAST is explained in the following sections.

2. How to open and run PLAST

The PLAST spreadsheet application contains a full MS Office VBA application and a suite of macros. To use the tool, Microsoft Excel must be configured to enable macros and access the VBA application; the steps are outlined below. Please follow all these steps before trying to run PLAST to ensure it operates correctly.
Refer to the Quick Start Guide for a further breakdown of each step.

Microsoft Windows

Step 1. Save the PLAST toolkit in 'My Documents'
Errors may be encountered if you try to run PLAST from a network drive such as OneDrive. To ensure PLAST works correctly, we recommend saving the downloaded file to 'My Documents' before running it. PLAST can still be saved to OneDrive folders for sharing after use; however, it should always be moved to a stable location (e.g., 'My Documents') prior to opening. Note: you must have permission to save files in the folder in which PLAST is saved and from which it is opened.

Step 2. Configure settings in the 'Trust Center'
The 'Trust Center' settings control what content you are able to open within Microsoft Excel, for example macros. We recommend users change their macro settings to 'Disable VBA macros except digitally signed macros' and check 'Trust access to the VBA project object model'. These settings ensure users are opening a version of the toolkit that is unmodified from the official release. To change these settings, follow the steps below:

2.1: Open Excel and click on File in the menu, then Options, and click Trust Center.
2.2: Click on Trust Center Settings… In the window that opens, choose the macro setting "Disable VBA macros except digitally signed macros" (on some versions of Microsoft Excel this may instead read "Disable all macros except digitally signed macros"). Tick the box "Trust access to the VBA project object model". Click the OK button to save the settings and exit back to Excel. Close Excel and re-open the workbook.

Step 3. Open PLAST and install the digital certificate
A digital certificate is a security feature added to VBA projects to verify that they are safe. It is recommended that users install the digital certificate prior to trying to run PLAST to show that they trust it. Although Step 2 can be completed by opening a new Excel workbook, the following steps should be done when opening the PLAST workbook. Please follow the steps below to install the certificate.

3.1: Open the digitally signed PLAST workbook; you should then see the SECURITY WARNING. Click on the Options… button.
3.2: The dialog box that opens shows the certificate details with which the spreadsheet model is digitally signed. Click on "Show Signature Details".
3.3: A further "Digital Signature Details" dialog opens. Click on the View Certificate… button.
3.4: Now click on Install Certificate… and the next dialog box opens.
3.5: Choose the store location "Local Machine" and click the Next button. If storing to "Local Machine" is disabled, the certificate can be stored under "Current User" instead; however, other users of the PC will also have to install the certificate if they wish to run PLAST.
3.6: Now choose where to save the certificate by clicking on the Browse… button.
3.7: Choose "Trusted Root Certification Authorities" and click on the OK button.
3.8: Having chosen the certificate store, click the Next button.
3.9: Then click the Finish button to complete the certificate installation.
3.10: You will get a confirmation. Click OK.
3.11: Select the option "Trust all documents from this publisher" and click OK.

Close Excel and reopen the PLAST spreadsheet application; the system should then be fully functional.
Other Microsoft Excel security features
Microsoft Excel's security features to prevent malicious use of macros are frequently updated, and keeping up with them can be problematic for developers of bona fide Excel applications that rely on Microsoft's Visual Basic for Applications (VBA). It is strongly recommended that the digital certificate is installed if possible. If it is not possible to install the digital certificate, then one of the following actions may enable the application to run; however, this is at the user's own risk.

Trusted locations
Excel has recently implemented another security feature, "trusted locations", which can prevent even digitally signed macros with an installed certificate from running if the file is opened from the "Downloads" folder or a shared network location (e.g. OneDrive). If this is the case, users are likely to see a standard Excel workbook with the PLAST 'About' sheet displayed. The best way to correct this is to download the spreadsheet, open it, save it to "My Documents", close it and then re-open it. This usually satisfies the "trusted location" security check. Alternatively, a folder can be set as a trusted location by going to the 'Trust Center' (see Step 2 above), clicking 'Trusted Locations', then "Add new location…" and selecting the folder in which PLAST is saved. We do not recommend that shared locations (e.g., OneDrive) are set as trusted locations.

Warning – Making a folder a "trusted location" can enable any Office application with macros to run, including files with malicious content. Do not copy Office documents to a "trusted location" unless you are sure they are safe.

Microsoft policy settings
Some users whose computers are provided and configured by their companies/organisations may have security policies pre-set and find that access to the Microsoft Excel Trust Center is disabled. Such users may attempt to run the macros by right-clicking the file, choosing Properties, and then selecting the Unblock checkbox on the General tab.

Warning – Selecting 'Unblock' bypasses the digital certification and should only be done if users are confident of the source of the file.

MacOS
Unfortunately, PLAST will not work correctly on MacOS. Please run PLAST on a Microsoft Windows PC.

3. Terminology

What is meant by a methodological approach?
Many plastic pollution assessment methodologies exist, each with its own unique method and results, but each can be grouped into one of four main approaches (Table 1), known hereafter as 'methodological approaches'.

Table 1: Four methodological approaches used in the quantification of plastic pollution. It should be noted that there is considerable overlap between these approaches, and as such many assessment methodologies incorporate more than one approach.

Transfer coefficient
The transfer coefficient approach is a top-down method in which flows are distributed according to coefficients – for example, the proportion of mismanaged waste that may enter the oceans. When applied as the primary method, the transfer coefficient approach typically requires few resources and gives gross estimates to guide policy. Transfer coefficient approaches tend to provide a simplified overview of the plastic flows in the solid waste management system.
Material flow analysis
Material flow analysis aims to model the flows and stocks of plastic waste within a solid waste management system in much greater detail than transfer coefficient based approaches. Although in its simplest form transfer coefficients are used to calculate the distribution of waste flows, more complex forms can be used, such as probabilistic material flow analysis, which incorporates the uncertainty of flows, or data validation and reconciliation, which aims to harmonise different measurements within the system. Material flow analysis approaches tend to be used when a detailed assessment of the solid waste management system is required.

Statistical / trend analysis
Statistical or trend analysis approaches are bottom-up approaches typically used to understand the amount of plastic pollution in different environmental compartments via measurements. Results give a snapshot of the plastic pollution in an area at a moment in time, but surveys can be conducted over longer periods to assess how the amount of plastic pollution changes with time. They are often utilised to develop baselines or monitor the impact of interventions.

Hydrological modelling
The hydrological and transport modelling approach aims to harness the considerable experience that has been amassed in hydrological modelling and transfer it to the problem of plastic pollution. Typically using geographic information system (GIS) analysis, this approach is primarily focused on understanding how plastic in the environment may move and transfer to the ocean by combining estimates of terrestrial/riverine plastic with information on rainfall and river characteristics.

The relative suitability of each methodological approach is scored in PLAST based on a user's high-level objectives (Part A questions). These results are useful in providing an indication of the type of methodological approach that may be best suited for a user, without suggesting specific methodologies.

What is meant by a plastic pollution assessment methodology?
The growing awareness of plastic pollution has seen it rise up the international agenda to become a leading priority for nations and the global community alike. With this, a wealth of data, methodologies and metrics have been developed to aid in the understanding of plastic pollution. One area receiving particular attention is the quantification of plastic pollution sources, along with its subsequent transport and accumulation in the environment. The plastic pollution assessment methodologies included in PLAST are focused solely on these quantification assessments. As such, assessment methodologies which focus on the ecological impacts of plastic pollution, for example, are omitted. Similarly, assessment methodologies that relate solely to policy, without quantification of the amount of plastic pollution, are deemed out of scope. Given that the quantification of plastic pollution is meant to provide knowledge and understanding of how to act effectively, any methodologies that are designed simply for the collection of data and lack any interpretive analysis are also out of scope. A full list of the system boundaries used to define whether plastic pollution assessment methodologies are in scope is shown in Table 2.

Table 2: System boundaries used for defining inclusion of plastic pollution assessment methodologies in PLAST.

System boundary | In scope | Out of scope
Types of assessment methodologies | Assessments quantifying plastic pollution sources, transportation pathways or accumulation in the environment | Assessments without quantification of plastic pollution sources, transportation pathways or accumulation in the environment (e.g. ecological impacts)
 | Assessments / models with explanatory outputs | Data / monitoring protocols with no explanatory outputs
 | Indicators if fundamental to assessment and standardised, e.g. plastic pollution related SDG indicators | Non-fundamental or non-standardised indicators
Geographical boundary | Global | NA
Scale | Local- to regional- (multi-country) level assessments | Solely global assessments
Life cycle | Life cycle assessments if covering plastic waste emissions into the environment | Life cycle assessments focused solely on plastic production and use
Macro / microplastics | Macroplastic assessments | Solely microplastic assessments
Implementable | Assessments that are transferable to other locations | Assessments that are not transferable to other locations

Accounting for the inclusion criteria shown in Table 2, plastic pollution assessment methodologies are defined here as: "An implementable methodology that quantifies macroplastic pollution, providing knowledge and understanding in order to effectively act."

Other terminology
Additional terminology used to describe the structural components of PLAST can be seen in Figure 2.

Figure 2: Terminology used in PLAST to describe the structural components on the main page.

4. Structure of PLAST

PLAST has been developed using Microsoft Excel and harnesses Visual Basic for Applications (VBA) to provide an intuitive and interactive graphical user interface. To use PLAST, Microsoft Excel must be configured to enable macros and access the VBA application; the steps are outlined in Section 2: How to open and run PLAST. PLAST comprises several distinct sections, described in Table 3.

Table 3: Description and functions of the sections of PLAST.

Splash screen: On loading of PLAST, a splash screen ('landing screen') is displayed signifying the version, funders and developers. To progress, users click the Start button.

Disclaimer page: After clicking 'Start' on the splash screen, users are shown a disclaimer page. Users can progress to the next page by clicking the 'Next' button.

About & training page: After clicking 'Next' on the disclaimer page, users are shown the about & training page. On the left is information about PLAST, including a summary of its aim, users, development and citation, as well as an email contact for technical support (Dr Costas Velis – University of Leeds). On the right, users are presented with training text and a link to a training video. It is strongly encouraged that users watch the brief training video prior to progressing to ensure all aspects of PLAST are understood and results are suggested as intended. Users can only progress to the questions by confirming with the checkbox that they have completed the training, at which point the 'Next' button can be clicked.

Main page: The main section of PLAST is where the questions can be answered by a user and a summary of the results is shown. It is structured around three selectable question sets (Part A, B, C) at the top left of the page, each of which has a series of questions displayed below. Users are required to go through the questions under each of the three parts and answer them based on their requirements (see the How to use PLAST section for details). On the right, users are provided with results on the relative suitability of methodological approaches, and a comparison of the top three suggested assessment methodologies with details of each.
Results comparison page: If users click the 'Compare' button on the question and results summary page, they are directed to a PDF document that displays both the results of the relative suitability of methodological approaches and a comparison of the top three suggested assessment methodologies. Depending on the user's PC, this PDF will open either in a separate Excel pop-up window or in a PDF viewer such as Adobe Acrobat. In addition, a radar diagram is displayed providing a visual comparison between the top three suggested methodologies. Users can save this results comparison page to a PDF or print the results using the bar at the top of the page. The page can be closed by clicking the cross in the top right corner of the window to return the user to the main page.

Assessment methodologies comparison page: If users click the 'View All' button on the question and results summary page, they are directed to a pop-up window that displays information on each of the plastic pollution assessment methodologies included within PLAST. This includes all methodologies, not just those deemed suitable. The page can be closed by clicking the cross in the top right corner of the window to return the user to the main page.

5. How to use PLAST

To use PLAST, Microsoft Excel must be configured to enable macros and access the VBA application; the steps are outlined in Section 2: How to open and run PLAST.

Inputting user needs and available resources
The basic premise of PLAST is that users answer a series of questions on their needs and the resources available for applying a plastic pollution assessment. PLAST then ranks the suitability of generic methodological approaches and specific plastic pollution assessment methodologies using multicriteria decision analysis and displays the result. The questions are broken into three parts, as summarised in Table 4, with full descriptions of each question provided in Appendix 1.

Table 4: Summary of the three question sets (Part A, B and C) in PLAST, including who they should be completed by, the focus of the questions and the number of questions in each part.

Question set | Completed by | Focus of questions | No. of questions
Part A | High-level users focused on overall objectives, policies and resources | To understand the motivation of the user in applying a plastic pollution assessment and ascertain the general scale, scope and available resources of the planned assessment. | 5
Part B | Technical users familiar with plastic pollution assessment methodology terminology, data availability and required outputs | To understand the user's technical requirements of any outputs, for example the level of detail (resolution) required or any specific functionalities. | 8
Part C | (as for Part B) | To understand the availability of existing data or the capability to collect new data. | 2

Each question set can be accessed using the buttons at the top of the main page (see Figure 2). Users are encouraged to start with the questions of Part A before proceeding to the more technical questions of Parts B and C. Help text for each question set can be viewed by hovering over the buttons. The individual questions associated with each question set are shown in the dark blue question ribbon (see Figure 2). PLAST has been designed to guide the user through the questions one at a time to avoid users potentially skipping relevant questions. To aid in this, only the next question is clickable, depicted in dark blue (for example, question A2 in Figure 2).
The remaining questions are disabled, as depicted by a grey colour (for example, questions A3 to A6 in Figure 2), and only become clickable once the preceding question has been viewed by the user. It is not mandatory to answer all the questions and available options; users are only required to complete those that are relevant to their needs.

The question text for the selected question is shown directly beneath the question ribbon. This is accompanied by a series of question options, each of which can have an answer selected for it (see Figure 2). Help text for each question can be viewed by hovering over the help icon, while additional help text for each of the question options can be viewed by hovering over the option text. Therefore, if users are unsure about the meaning of any terminology, these help texts should be consulted for definitions. For question A5, the definitions of each answer are shown in a larger textbox below to accommodate the more detailed descriptions given; hover over the question option text to show these definitions.

If the user feels that a question is repeated, we ask them to read the definitions carefully, as subtle differences do exist. In general, questions in Part A enquire about the overall ambitions of the project (e.g. scope and scale), whereas those in Part B refer specifically to the resolution of the outputs. For example, a user could answer in Part A that they want to apply a methodology at the country level, but in Part B signify that they want the outputs to inform at the municipality resolution.

"It is not mandatory to answer all the questions and available options; instead users are only required to complete those that are relevant for their needs"

The answers available for each question differ, but fall into one of three options:

Yes / No
The 'Yes/No' option is available for question A1, as this asks at what stage the user is in preventing plastic pollution. Users may be acting on multiple stages simultaneously, therefore multiple options can be set to yes. The 'Yes/No' option is also provided for question C1, relating to available data, as it is assumed users either have the data / are willing to collect it, or the data is not available.

Essential, Important, Preferred, Not essential (default)
The majority of question options have these terms as the possible answers to select from, particularly those in Part B – Technical objectives. These terms allow the user to specify how important it is that assessment methodologies can satisfy each option. For example, a user may wish to select that:
• It is essential that the methodology can operate at the national scale.
• It is important that it can assess the state of the environment and inform on coastal regions.
• It is preferred that the amount of plastic discharged to the oceans is assessed.

These terms are defined in the following order of importance (starting from the most important), as reflected in the multicriteria decision making:

Essential > Important > Preferred > Not essential

Question options which have had the answer set to 'essential' are therefore treated as more important than those set as important, preferred or not essential. Likewise, answers set as important are treated as more important than those set as preferred or not essential, whilst setting an answer to preferred means it is treated as more important than only those set as not essential. By default, answers are initially set to 'not essential'.
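To make this ordering concrete, the snippet below shows one way the four answer tiers could be represented, mirroring the worked bullet-point example above. This is a minimal Python sketch for illustration only: PLAST itself implements its logic in Excel VBA, and all names here are ours, not the toolkit's.

```python
from enum import IntEnum

class Importance(IntEnum):
    """Answer tiers, ordered so numeric comparison reflects
    Essential > Important > Preferred > Not essential."""
    NOT_ESSENTIAL = 0  # default: the option is not scored
    PREFERRED = 1
    IMPORTANT = 2
    ESSENTIAL = 3      # acts as a hard filter (explained below)

# Hypothetical answers matching the bullet-point example above.
answers = {
    "operates_at_national_scale": Importance.ESSENTIAL,
    "assesses_state_of_environment": Importance.IMPORTANT,
    "informs_on_coastal_regions": Importance.IMPORTANT,
    "discharge_to_oceans_assessed": Importance.PREFERRED,
}

# Any option the user does not touch keeps the default tier.
default = Importance.NOT_ESSENTIAL
assert Importance.ESSENTIAL > Importance.IMPORTANT > Importance.PREFERRED > default
```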
It is crucial to note that if an answer is set to 'essential', this is taken literally by the toolkit. As such, if a plastic pollution assessment methodology is characterized in the assessment framework as not meeting this option, it will be removed as a potentially suitable method and will not show in the results. Users should therefore only select essential when they wish all methodologies that do not satisfy the option to be excluded. In this sense, using essential as an answer acts as a hard filter to remove unsuitable results. If 'essential' answers are overused, there may be no methodologies that fully match the criteria, and the results will display 'No methodologies match criteria'. In such cases, it is suggested users set some of their essential options to important or preferred. A warning message is provided to remind them of this consideration.

"Users should only select essential when they wish that all methodologies that do not satisfy the option be excluded."

High, Medium, Low
The high, medium and low answers are provided for question A5 – What level of resources can you commit towards applying the assessment methodology? This question has three options to complete:
1. Resource availability for employing specialist expertise
2. Resource availability for purchasing specialist equipment
3. The time available for the project

Each of these can be scored high, medium or low according to the definitions shown in Table 5, also viewable in the text box at the bottom of the page when hovering over each option. As the resources required can vary greatly depending on the scope and scale of the project in question, we highly recommend users assess the suggested methodologies for all resource levels to avoid excluding methodologies that may be suitable for their particular project scope.

"We highly recommend users assess the suggested methodologies for all resource levels to avoid excluding methodologies that may be deemed suitable for their particular project scope"

In addition to the above three question options, another input named the 'overall resource availability' is used to assess the suitability of methodologies. This input is automatically calculated from the above three question options, with the answer of high, medium or low reflecting the average score.² It therefore assumes that the overall resources available (which can be thought of as a proxy for the available budget) are dictated by the ability to purchase specialist expertise and equipment, and the time available for the project. Automatically calculating the overall resource availability, rather than having it as a dedicated user input, is believed to be a fairer method of assessing how well matched a user's resources are to the resource requirements of different methodologies, because the budget required to run a methodology is likely to vary considerably, even for the same methodology, when applied at different scales and locations and with different scopes. As such, the automatic calculation of 'overall resource availability' does not require a definition with explicit monetary values; instead it uses the more easily quantifiable options of expertise, equipment and time to define the likely overall level of resources required. The definitions by which methodologies were scored high, medium and low for these three resource options are discussed in Appendix 2.

Table 5: Definition of 'high', 'medium' and 'low' answers for question A5 – Resources.
Resource type | High | Medium | Low
Specialist expertise | User provides project management only. Third parties (e.g. methodology developers) perform data collection and implementation of the methodology. | User performs data collection with support from third parties (e.g. methodology developers). Third parties support data collection and implement the methodology. | User performs data collection and implementation of the methodology with support from third parties (e.g. methodology developers).
Equipment | User has ability to purchase / hire specialized equipment (e.g. drones, specialist software etc.) | User has ability to purchase / hire semi-specialist equipment (e.g. nets / trawls / boats etc.) | User does not have ability to hire specialist equipment
Time¹ | User has over 6 months duration OR over 6 person-months effort | User has from 2–6 months duration OR 2 to 6 person-months effort | User has less than 2 months duration OR less than 2 person-months effort
Overall resource availability (proxy for budget) | The 'overall resource availability' can be considered as a proxy for the budget required to implement the methodology. It is automatically calculated by averaging the total score for each of the above categories, where high = 3, medium = 2, low = 1, before rounding to the nearest integer.

1. The shortest option should be chosen here. For example, if a user can commit four people full-time for a month (4 person-months of effort) but requires the results in less than two months, the low option should be selected.
2. User inputs of high are allocated a score of 3, medium a score of 2 and low a score of 1. The average score from the three resource categories is calculated and rounded to the nearest integer. This is then displayed as high, medium or low for the 'overall resource availability' input according to the same scoring criteria.
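The averaging rule in note 2 is simple enough to express directly. Below is a minimal Python sketch of the 'overall resource availability' calculation, assuming the high/medium/low inputs defined in Table 5; the function and variable names are ours, not the toolkit's.

```python
# Note 2's scoring: high = 3, medium = 2, low = 1.
LEVEL_TO_SCORE = {"high": 3, "medium": 2, "low": 1}
SCORE_TO_LEVEL = {3: "high", 2: "medium", 1: "low"}

def overall_resource_availability(expertise: str, equipment: str, time: str) -> str:
    """Average the three resource scores and round to the nearest integer
    to obtain the overall availability (a proxy for budget)."""
    scores = [LEVEL_TO_SCORE[level] for level in (expertise, equipment, time)]
    mean = sum(scores) / len(scores)
    # The mean of three whole scores never lands exactly on .5,
    # so ordinary rounding is unambiguous here.
    return SCORE_TO_LEVEL[round(mean)]

# Example: high expertise, medium equipment, low time
# -> (3 + 2 + 1) / 3 = 2.0 -> 'medium' overall availability.
print(overall_resource_availability("high", "medium", "low"))  # medium
```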
Interpreting results

After a sufficient number of inputs have been completed by the user, the results will automatically update. If the user has not assessed all questions, a popup warning message will advise the user that the results are preliminary and that they should continue answering all relevant questions. Results are shown on the right-hand side of the main page and can be viewed in more detail by clicking the 'Compare (PDF)' button. A similar popup message is displayed if the user clicks the 'Compare (PDF)' button before all questions have been assessed. Two results are provided:

1. A suggestion of what broad methodological approaches may be best suited based on a user's overall objectives, policies and generalised resources.
2. A suggestion of suitable plastic pollution assessment methodologies based on a user's specific needs.

Suitability of methodological approaches result
The top result relates to how suitable the different methodological approaches are, as informed by the high-level user inputs from Part A – policy-related objectives. In total, four types of methodological approaches were identified, as shown in Table 1. However, there is a large degree of overlap in these approaches, and as such many of the models incorporate more than one of them. In this sense, the results of this section aim only to give a broad overview of the type of approach that may be best suited. Each of the methodological approaches is scored according to the user inputs provided, as described further in Section 6 – 'How does PLAST work?'. The scoring is displayed in the form of a bar chart (Figure 3), where the most suitable approach has the largest bar. The relative suitability of the other approaches is then shown in comparison to this bar. If all the bars are similar in size, all methodological approaches are deemed equally suitable. A bar spanning the full width does not signify that this approach is perfectly suitable; instead, it signifies that it has the best suitability.

Figure 3: Example visualisation of the results for the most suitable methodological approach.

Suitability of plastic pollution assessment methodologies result
The second result relates to how well the user's needs and resources match each individual plastic pollution assessment methodology. The top three methodologies ranked by the multicriteria decision making algorithm are visualised as shown in Figure 4. The user can navigate between these top three results using the side arrows.

Figure 4: Example visualisation of the results for the most suitable plastic pollution assessment methodologies.

The top three results are shown along with important information for each, navigable by clicking the tabs of the results box. The information shown includes:
• Assessment name.
• Organization / authors (if a publication).
• 5-star ratings on how well the methodology matches the inputs the user selected as 'important' or 'preferred', and the associated match with data requirements. Note: the assessments shown are ordered by the 'important' star rating first, then by the 'preferred' rating, and finally by the 'data requirements' rating, as explained in the 'How does PLAST work?' section. As such, methodologies showing higher star ratings for the preferred or data requirements categories may appear lower overall because they rank lower on the important rating.
• A short description of the assessment methodology's objectives, methodology and key outputs.
• Contacts, with a hyperlink to the developer's webpage or publication (if the hyperlink does not work, the URL can be found in the 'Assessment methodologies comparison page').

In addition, the results section notes how many methodologies meet the essential criteria specified by the user. The ranking of the top three methodologies from 1 to 3 is shown. However, as it is possible for multiple methods to rank equally, particularly when only a few inputs have been specified by the user, the names of any methodologies that are equally ranked within the top three but not shown in the 'Suggested methodologies result' section are instead listed on the 'Results comparison' page.

Importantly, it should be stressed that the suitability of each assessment methodology is determined only using the multicriteria decision making algorithm, as explained in the 'How does PLAST work?' section. No indication is given of the methodologies' scientific rigour or the accuracy of their results. As such, the results presented here should be used as a guide only. Additionally, whilst the level of resources and data requirements are included as scoring criteria, it should be acknowledged that primary data collection is always encouraged and that the quality of the results will likely reflect the resources allocated.

"No indication is given of the methodologies' scientific rigour or the accuracy of their results. As such, the results presented here should be used as a guide only"

A printable PDF of the top three ranked results can be viewed by clicking the 'Compare' button. In addition to showing a summary of each of the suggested plastic pollution assessment methodologies, it also provides a visual comparison in the form of a radar diagram (Figure 5).
Figure 5: Radar diagram providing a visual comparison of the performance of the top 3 suitable assessment methodologies by each question set.

Comparing methodologies
A useful feature of PLAST is the ability for users to compare the different plastic pollution assessment methodologies at a high level. Examples of when this may be useful are as follows:
1. Users received the top 3 most suitable options from the toolkit but wish to understand what other assessment methodologies exist.
2. Developers may wish to compare methodologies and ascertain how their methods match up to others, or identify scope for harmonisation.
3. Users received the top 3 most suitable options, but PLAST may have indicated that equally suitable / alternative options exist, as explained above. The user may therefore want to understand the details of these methodologies.

For each of these cases, the user simply has to click the 'View all' button. This provides a database of all the available plastic pollution assessment methodologies and their key information. Users can navigate between methodologies by clicking the left and right arrow buttons (Figure 6). Assessment methodologies are ordered alphabetically.

Figure 6: Assessment methodologies comparison page.

6. How does PLAST work?

PLAST works by initially categorising each plastic pollution assessment methodology against a framework, as shown in Appendix 3. This framework is designed to categorise each methodology by its important features, such as scope, outputs and requirements. Users are then required to input their needs and available resources within the question sets of Parts A–C, with these directly linked to the assessment framework categorisations. As users fill out their needs and objectives, the toolkit automatically scores and ranks the assessment methodologies and generalised methodological approaches against these criteria. The manner in which this scoring and ranking takes place is outlined below for both the 'suitability of methodological approaches' result and the 'suitability of plastic pollution assessment methodologies' result.

Scoring and ranking of methodological approaches
The four methodological approaches are scored against only the high-level policy questions of Part A. The scores assigned for each question can be viewed in the scoring matrix of Figure 7.

Figure 7: Default scoring matrix linking each methodological approach against each question option. Green cells (score of 3) relate to those where the approach is well suited. Yellow cells (score of 2) relate to those where the approach is somewhat suited, and red cells (score of 1) relate to those where the approach is poorly suited. Blue cells represent ones which can be equally covered by all methodologies and are therefore not scored.

The scoring works by taking the value shown in Figure 7 for the option the user inputted and multiplying it by a weighting factor that depends on the level of importance the user specified. For example, if the user inputs that assessing the marine compartment is 'essential', the default scores for each methodological approach are multiplied by a weighting of 5. Alternatively, if the user sets this option as 'important', the weighting is a factor of 2, whilst setting it to 'preferred' keeps the default score the same. By default, question five on the resources available is treated as 'essential' and is therefore weighted by a factor of 5. For this question, only the automatically calculated 'overall resource availability' input is used to determine the approach scores, with the other resource options feeding into this calculation as explained previously. The ranking process simply involves summing the scores for each approach, with the highest value assigned as the most suitable in the bar chart of the results section.
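As an illustration of this weighted scoring, the sketch below applies the tier weightings (essential = 5, important = 2, preferred = 1) to a toy scoring matrix. The matrix values and option names are invented for the example and are not taken from Figure 7; the real matrix lives inside the toolkit.

```python
# Weighting factors by answer tier, per the text above;
# options left as 'not essential' are simply not scored.
WEIGHTS = {"essential": 5, "important": 2, "preferred": 1}

APPROACHES = ("transfer coefficient", "material flow analysis",
              "statistical / trend analysis", "hydrological modelling")

# Toy default scores (3 = well suited, 2 = somewhat suited, 1 = poorly suited).
MATRIX = {
    "marine_compartment": {"transfer coefficient": 2, "material flow analysis": 1,
                           "statistical / trend analysis": 3, "hydrological modelling": 3},
    "national_scale":     {"transfer coefficient": 3, "material flow analysis": 3,
                           "statistical / trend analysis": 2, "hydrological modelling": 2},
}

def approach_scores(user_answers):
    """Sum weighted matrix scores over every option the user answered."""
    totals = dict.fromkeys(APPROACHES, 0)
    for option, tier in user_answers.items():
        for approach in APPROACHES:
            totals[approach] += MATRIX[option][approach] * WEIGHTS[tier]
    return totals

# Marine compartment marked 'essential', national scale 'preferred':
print(approach_scores({"marine_compartment": "essential",
                       "national_scale": "preferred"}))
```

In this illustration, the approach with the highest total would be drawn as the longest bar in Figure 3, with the other approaches scaled relative to it.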
Scoring and ranking of plastic pollution assessment methodologies
The scoring and ranking of each plastic pollution assessment methodology against the user's needs and resources is conducted via multicriteria decision analysis. This involves matching the user input against the assessment framework categorisation, so that if the methodology satisfies the user's input, a score of 1 is allocated. The exception to this is the inputs of Part A which, being targeted at high-level decisions and objectives, are deemed more influential and are therefore allocated a weighting factor of 2. For resource availability questions with inputs of 'high', 'medium' and 'low', a match is defined as one whereby the methodology requires resources equal to or less than those available, as input by the user.

The ranking process acts in a stepwise manner, where methodologies are first filtered out if they do not satisfy all of the 'essential' components specified by the user inputs. All remaining methodologies are then ranked according to the number of 'important' components that they match. If two or more methodologies are ranked equally on the number of matching important components, the ranking process goes to the next level and distinguishes methodologies based on the number of 'preferred' components they match. Likewise, if two or more methodologies rank equally in terms of both important and preferred components, the ranking process then looks at the data requirements (question C1). Lastly, if any methodologies still cannot be separated, ranking is performed in alphabetical order. In this case, the results page signifies that other equally matching methodologies exist by showing an equals sign before the rank.

By default, the four resource inputs of question A5 are treated as 'important', with a score of 1 given if the user's resources match those of the assessment methodology exactly, whereas a score of 0.5 is given if the user has more resources available than required by the assessment methodology. However, as the resource requirements are generalised and can vary greatly depending on the scope and scale of the project in question, we highly recommend users assess the suggested methodologies for all resource levels to avoid excluding methodologies that may be suitable for their particular project scope.
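To summarise the stepwise ranking, here is a minimal Python sketch of the filter-then-tie-break order described above (essential filter, then important, preferred and data-requirement matches, then alphabetical order). The data structures are our own illustration of the logic, not the toolkit's internal implementation.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A methodology with match counts pre-computed against the user's inputs."""
    name: str
    meets_all_essentials: bool
    important_matches: float  # resource inputs can contribute partial 0.5 matches
    preferred_matches: float
    data_matches: float       # matches against question C1

def rank(candidates):
    # Step 1: hard filter - drop anything failing an 'essential' answer.
    survivors = [c for c in candidates if c.meets_all_essentials]
    # Steps 2-4: sort by important, preferred, then data matches (descending),
    # breaking any remaining ties alphabetically (flagged '=' in the toolkit).
    return sorted(survivors, key=lambda c: (-c.important_matches,
                                            -c.preferred_matches,
                                            -c.data_matches,
                                            c.name))

methods = [
    Candidate("Method B", True, 4, 2, 1),
    Candidate("Method A", True, 4, 2, 1),   # ties with B; listed first alphabetically
    Candidate("Method C", False, 9, 9, 9),  # excluded by an unmet 'essential'
]
print([c.name for c in rank(methods)])  # ['Method A', 'Method B']
```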
7. Frequently asked questions

What is the aim of PLAST?
See the 'Aim' section.

Who should use PLAST?
See the 'Who should use PLAST?' section.

What is meant by a methodological approach?
See the 'What is meant by a methodological approach?' section.

What is meant by a plastic pollution assessment methodology?
See the 'What is meant by a plastic pollution assessment methodology?' section.

How does the toolkit work?
See the 'How does PLAST work?' section.

What if a question isn't relevant to me?
Users may skip any questions or question options that are not relevant to them; skipped questions do not impact the scoring of methodologies. However, it is encouraged that users view each question in turn and decide whether it is relevant to them, rather than skipping to only the questions they consider relevant. This is to avoid users missing questions that may turn out to be relevant once they have considered the possible question options. To assist in this, PLAST is designed to encourage users to check each question sequentially. Once a question has been viewed, users may return to it at any point to change their answers.

What is meant by 'essential', 'important', 'preferred' and 'not essential'?
The majority of question options have these terms as the possible answers for the user to select, particularly those in Part B – Technical objectives. These terms allow the user to specify how important it is that the assessment methodology can satisfy each option, with the terms ranked in the following order of importance (starting from the most important) in the multicriteria decision making:

Essential > Important > Preferred > Not essential

If a user selects 'essential' as an answer, only assessment methodologies that include this feature will be suggested as possible suitable methodologies. 'Essential' thereby acts as a filter and should only be used when having that feature is vital. Answers of 'important' and 'preferred' do not act as a filter, but instead are used to rank how suitable the available assessment methodologies are. 'Important' has a higher weighting than 'preferred' and therefore allows the user to distinguish the relative importance of non-essential features. Lastly, if a user selects 'not essential' (the default option), assessment methodologies will not be scored on this question.

How do I save my answers and results?
Any changes to PLAST, such as viewing or answering questions, or the generation of results, can be saved by clicking the cross in the top-right corner of the main page. A message will appear asking users whether they wish to save their progress; click yes to save. Alternatively, the results can be saved by saving the PDF of the 'Results comparison' page.

8. Troubleshooting

PLAST does not open / work
Please follow the instructions in Section 2: How to open and run PLAST.

The results say 'No methodologies match criteria'
This typically occurs when either too few questions have been answered to populate the results, or too many 'essential' answers have been specified, so that no methodologies match them all. Try reducing the number of 'essential' answers by converting any that are not completely essential to 'important'.

I do not understand the question / option
Definitions of each question can be read by hovering over the question in the question ribbon, or by hovering over the help icon next to the question text. Definitions of each question option can be viewed by hovering over the relevant option text. If users do not understand a question, it is better to leave it as 'not essential' than to answer it incorrectly. High-level users are recommended to complete the Part A questions, whereas more technical users should complete Parts B and C.

Why do the same methodologies always appear?
This is typically for one of two reasons:
1) When the user has provided only a few inputs, the results will start to populate. However, as there are not many inputs to rank them by, many methodologies are likely to rank equally.
In such cases, the toolkit is forced to assign the top 3 ranked methodologies by alphabetical order, meaning that the same methodologies often show up. It is suggested that users complete more of the questions to allow the decision making algorithms to better distinguish between the methodologies.
2) If the user has selected an answer as 'essential' that only a small number of methodologies satisfy, by default only these methodologies will be included in the results. Try reducing the number of 'essential' answers to allow more methodologies to be suitable.

The toolkit does not include a methodology
A comprehensive literature review was performed to ascertain all the available plastic pollution assessment methodologies that fit within the scope of those allowed. However, as this is a rapidly evolving field, new methodologies may have been released subsequently, or previous methodologies excluded. In such cases, please contact the development team with a request to add a new methodology.

9. Appendices

Appendix 1

Table 6: Questions asked in PLAST to understand a user's needs and resources in applying a plastic pollution assessment. These are distributed between three question sets (Part A, B, C).

Question set | Question topic | Description
Part A | Objectives | The stage of the objective relates to how far progressed the user is in planning the mitigation of plastic pollution. Generally, users may aim to baseline their plastic pollution first, followed by identifying interventions (action plans) to address it, with subsequent monitoring allowing the impact of these interventions to be quantified and progress towards targets tracked. Lastly, users may also require benchmarking of their performance against others to reassess overall objectives. These stages may be conducted simultaneously, therefore users can specify more than one objective stage.
Part A | Assessment scale | The assessment scale relates to the geographic area across which the plastic pollution assessment methodology can be practically applied within a single project. Multiple assessment scales can be selected by a user.
Part A | Assessment scope | The assessment scope relates to the aspect of plastic pollution that the user is most interested in understanding. For example, users may be interested in understanding the generation of plastic waste and how it is managed prior to its release into the environment, or users may want to quantify the sources (the ways by which plastic is leaked to the environment). Alternatively, users may be interested in understanding where and when plastic waste enters the marine environment, or how polluted the environment is. Multiple scopes can be selected by a user.
Part A | Environmental compartments | The environmental compartment relates to the part of the environment where the user is most interested in understanding the sources, flows or concentrations of plastic pollution. This is typically the environmental compartment in which the method is applied. Users can select multiple environmental compartments of interest.
Part A | Resources | The level of resources relates to the resources available to both collect data and implement the methodology, categorized by the required expertise, equipment and time. The average combination of these dictates the overall budget requirements. Although it can be tempting to limit results to methodologies requiring the lowest resources, it should be noted that the quality of outputs is often related to the resources applied.
Similarly, the actual level of resource required often depends on the overall scope of the project, and the scorings applied in the framework are suggestions only. If in doubt, we recommend leaving this question blank.
Part B | Spatial resolution | The spatial resolution is the geographical scale at which the outputs are reported; this differs from the 'assessment scale', which is the geographical scale at which the method is applied. For example, if applying a methodology at the national scale, but where results are wanted to inform cities, the 'assessment scale' should be set as 'national' and the spatial resolution set to 'municipal'.
Part B | Temporal resolution | The temporal resolution relates to the time-scales over which the plastic pollution assessment methodology informs. If a methodology informs on a daily basis for a year, then both of these timescales and all in between are informed by the methodology.
Part B | Sector resolution | The economic / industrial sector resolution relates to whether outputs are reported by economic sectors (e.g. fishing, retail etc.) and companies.
Part B | Waste management resolution | The waste management activity output resolution relates to the waste management activities for which outputs are reported.
Part B | Material resolution | The material / item resolution relates to the granularity at which plastic is assessed. This may be at the overall plastic material level, by polymers, items, or brands.
Part B | Quantification unit | The quantification unit relates to whether the assessment methodology has outputs by count or by mass.
Part B | Desired functions | The format and functionality of outputs relates to the manner in which the outputs are presented or what functions they can perform.
Part B | Interventions | Prioritization of interventions relates to whether useful information is gained by the methodology that would allow users to rank the importance of interventions in mitigating plastic pollution based on their cost or expected impact.
Part C | Collecting data | The data availability relates to the general data that may be required to feed into plastic pollution assessment methodologies, broken down by common categories.
Part C | Proxy / default data | Proxy / default data relates to the ability of the methodology to substitute missing data with a generic value. Although accuracy may be compromised, this can assist in simplifying data collection if large data gaps exist.

Appendix 2

Table 7: Criteria for scoring methodologies' resource requirements.

Resource type | High | Medium | Low
Specialist expertise | Methodology typically requires data collection and implementation of the methodology to be done by the methodology developers. | Data collection can be undertaken by the user with guidance from methodology developers; however, the implementation of the methodology is typically undertaken by the methodology developers. | The user is able to perform data collection and implementation of the methodology with only limited support from the methodology developers.
Equipment | Specialised equipment essential (e.g. specialist software, drones, modelling code etc.) | Semi-specialist equipment required (e.g. nets / trawls / boats etc.) | No specialist equipment required
Time¹ | Typical assessment requires over 6 months duration OR over 6 person-months effort | Typical assessment requires from 2–6 months duration OR 2 to 6 person-months effort | Typical assessment requires less than 2 months duration OR less than 2 person-months effort
Overall resource availability (proxy for budget) | The 'overall resource availability' can be considered as a proxy for the budget required to implement the methodology.
Appendix 3

The framework shown in Table 8 is an outline of that used to categorize methodologies by objectives, functionality, outputs and requirements.

Table 8: Framework to categorise assessment methodologies. The primary and secondary categories (with their definitions) represent the framework categorisation, whilst the unit shows the available options that may be specified for each secondary category. A sketch of how one methodology can be encoded under this framework follows the table.

Framework category: Assessment details

Primary category: Assessment details — key information relating to assessment name and contact information.
- Assessment name: Name of the plastic pollution assessment methodology, or title of the paper/report if no official name is given. (Unit: Text)
- Organization(s) / Author(s): Name of organization(s), or of the authors for academic papers. (Unit: Text)
- URL (if available): Website address of the plastic pollution assessment methodology. (Unit: URL)
- Objectives. (Unit: Text)
- Methodology. (Unit: Text)
- Key outputs. (Unit: Text)
- Year released: Year of initial release. (Unit: Year)

Framework category: Policy objectives

Primary category: Stage of objective — the stage of the objective relates to how far the user has progressed in planning how to mitigate plastic pollution. The framework informs on whether the assessment methodology can help the user meet this stage of their objective.
- Can be used for baselining: Initial quantification of plastic pollution to identify areas of focus and establish a reference. (Unit: Y, N)
- Provides details of interventions necessary to implement action plans: Identification of interventions to apply within action plans to mitigate plastic pollution. (Unit: Y, N)
- Can be used for monitoring: Regular quantification of plastic pollution to assess the effectiveness of interventions and track progress towards goals and commitments. (Unit: Y, N)
- Can be used for benchmarking: Periodic quantification of plastic pollution to compare against other locations and reassess overall strategies. (Unit: Y, N)

Primary category: Assessment scale — the assessment scale relates to the geographic area across which the plastic assessment methodology can be practically applied within a single project. Multiple assessment scales can be selected by a user.
- Global: Can be applied worldwide. (Unit: Y, N)
- Regional (multiple countries): Can be applied across multiple countries or continents. (Unit: Y, N)
- National: Can be applied at a national (country) scale. (Unit: Y, N)
- Provincial: Can be applied to a province, county or state. (Unit: Y, N)
- Municipal: Can be applied to a municipality or local authority. (Unit: Y, N)
- Sub-municipal (local): Can be applied to a local area smaller than a municipality, e.g. a neighbourhood. (Unit: Y, N)

Primary category: Assessment scope — the assessment scope relates to the aspect of plastic pollution that the user is most interested in understanding.
- Pre-leakage / upstream: Plastic prior to its emission into the environment, e.g. production, imports / exports, waste generation, recycling, waste management. (Unit: Y, N)
- Point of uncontrolled release into environment (sources): Plastic at the point of its uncontrolled release (leakage) into the environment. This may be to all environmental compartments, not just marine, e.g. littering on land. (Unit: Y, N)
- Plastic discharge to oceans (marine litter): Flux (e.g. rates) of plastic entering the oceans and becoming marine litter, typically via rivers. (Unit: Y, N)
- State of the environment (accumulations): Stock (e.g. concentration) of plastic which has accumulated in the environment over time. (Unit: Y, N)

Primary category: Environmental compartment — the environmental compartment relates to the part of the environment where the user is most interested in understanding the sources, flows or concentrations of plastic pollution. This is typically the environmental compartment in which the method is applied. Users can select multiple environmental compartments of interest.
- Land: Terrestrial environment, including non-perennial drains (e.g. those not permanently filled with water). (Unit: Y, N)
- Riverine: Rivers, lakes and perennial drains (e.g. those permanently filled with water). (Unit: Y, N)
- Coastal: Interface between land and sea (e.g. beaches). (Unit: Y, N)
- Marine: Oceans and seas. (Unit: Y, N)

Primary category: Resource availability — resources both to collect data and to implement the methodology, categorised by the required expertise, equipment and time. The average combination of these dictates the overall budget requirements.
- Specialist expertise: Level of expertise required (e.g. level of support required from third parties). (Unit: High, Medium, Low)
  High = Methodology typically requires data collection and implementation of the methodology to be done by the methodology developers.
  Medium = Data collection can be undertaken by the user with guidance from the methodology developers; however, implementation of the methodology is typically undertaken by the methodology developers.
  Low = The user is able to perform data collection and implementation of the methodology with only limited support from the methodology developers.
- Logistics / equipment required: Level of logistics and equipment required (e.g. specialist software or equipment). (Unit: High, Medium, Low)
  High = Specialised equipment essential (e.g. specialist software, drones, modelling code).
  Medium = Semi-specialist equipment required (e.g. nets, trawls, boats).
  Low = No specialist equipment required.
- Time required: Estimated time required for both data collection and implementation of the methodology. (Unit: High, Medium, Low)
  High = Typical assessment requires over 6 months' duration OR over 6 person-months of effort.
  Medium = Typical assessment requires 2–6 months' duration OR 2–6 person-months of effort.
  Low = Typical assessment requires less than 2 months' duration OR less than 2 person-months of effort.
- Calculated overall resource availability (proxy for budget): The 'overall resource intensity' can be considered a proxy for the budget required to implement the methodology. It is calculated automatically by scoring each of the above categories (high = 3, medium = 2, low = 1), averaging the scores, and rounding to the nearest integer. (Unit: High, Medium, Low)

Framework category: Technical objectives

Primary category: Spatial resolution of outputs — the spatial resolution is the geographical scale at which the outputs are reported; this differs from the 'assessment scale', which is the geographical scale at which the method is applied. For example, if applying a methodology at the national scale, but where results are wanted to inform cities, the 'assessment scale' should be set to 'national' and the spatial resolution to 'municipal'.
- Global: Outputs reported at a global level. (Unit: Y, N)
- Regional (multiple countries): Outputs reported across multiple countries or continents. (Unit: Y, N)
- National (federal): Outputs reported at a national (country) level. (Unit: Y, N)
- Provincial (state): Outputs reported at a provincial, county or state level. (Unit: Y, N)
- Municipal: Outputs reported at the municipal or local authority level. (Unit: Y, N)
- Sub-municipal (local): Outputs reported at a local level smaller than that of the municipality. (Unit: Y, N)
- Urban: Outputs reported on areas with high population densities, such as towns and cities. (Unit: Y, N)
- Rural: Outputs reported on areas with low population densities, outside of towns and cities. (Unit: Y, N)
- Catchment / Basin: Outputs reported at a river basin level (i.e. the area where precipitation drains to a common outlet). (Unit: Y, N)
- River compartments: Outputs reported on specific parts of a river (e.g. banks, surface). (Unit: Y, N)
- Estuarine: Outputs reported on the area where freshwater meets the ocean. (Unit: Y, N)
- Beach: Outputs reported on the narrow strip of sand, pebbles or rocks that separates the land from the ocean. (Unit: Y, N)
- Coastline: Outputs reported on the area where land meets the sea. (Unit: Y, N)
- Sea / Ocean: Outputs reported on the oceans or seas. (Unit: Y, N)

Primary category: Temporal resolution of outputs — the temporal resolution relates to the timescales on which the plastic pollution assessment methodology informs. If a methodology informs on a daily basis for a year, then both of these timescales, and all those in between, are informed by the methodology.
- Annual: Outputs inform on a yearly timescale. (Unit: Y, N)
- Seasonal: Outputs inform on a seasonal timescale (e.g. spring, summer, autumn, winter; or wet season, dry season). (Unit: Y, N)
- Monthly: Outputs inform on a monthly timescale. (Unit: Y, N)
- Daily: Outputs inform on a daily timescale. (Unit: Y, N)
- Sub-daily: Outputs inform on a timescale of less than a day (e.g. hourly). (Unit: Y, N)

Primary category: Economic / industrial sector resolution — the economic / industrial sector resolution relates to whether outputs are reported in relation to economic sectors and companies.
- Economic sectors (e.g. tourism, fisheries, retail): Outputs inform on different economic activities (e.g. fishing, retail). See the International Standard Industrial Classification of All Economic Activities (ISIC), Rev. 4 for a full list of economic sectors. (Unit: Y, N)
- Companies: Outputs inform on a commercial business. (Unit: Y, N)

Primary category: Waste management activity output resolution — relates to the waste management activities for which outputs are reported.
- Waste generation: Outputs inform on waste generation. (Unit: Y, N)
- Waste collection (formal): Outputs inform on formal waste collection. (Unit: Y, N)
- Waste collection (informal): Outputs inform on informal waste collection. (Unit: Y, N)
- Sorting for reprocessing: Outputs inform on waste sorting for reprocessing. (Unit: Y, N)
- Reprocessing (recycling): Outputs inform on waste reprocessing. (Unit: Y, N)
- Disposal: Outputs inform on disposal. (Unit: Y, N)
- Littering / illegal dumping: Outputs inform on littering or illegal dumping. (Unit: Y, N)
- Open burning: Outputs inform on open burning of waste. (Unit: Y, N)

Primary category: Material / item resolution of outputs — the material / item resolution relates to the granularity at which plastic is assessed. This may be at overall plastic level, or by polymers, items or brands.
- Plastic material-level: Outputs relate to plastic as a whole (all plastic materials combined). (Unit: Y, N)
- Polymer-level: Outputs relate to plastic polymers (e.g. PET, PP). (Unit: Y, N)
- Item-level: Outputs relate to specific plastic objects (e.g. drink bottle, plastic bag). This differs from a plastic product in that it does not specify the brand / company. (Unit: Y, N)
- Brand-level: Outputs relate to specific company brands of plastic items. (Unit: Y, N)
- Microplastics: Outputs relate to microplastics as well as macroplastics. (Unit: Y, N)

Primary category: Unit of quantification — the quantification unit relates to whether the assessment methodology produces outputs by count or by mass.
- Quantifies by mass: Outputs are given by mass (e.g. kg, tonnes). (Unit: Y, N)
- Quantifies by count: Outputs are given by count (e.g. number of items). (Unit: Y, N)

Primary category: Format of outputs and model functionality — relates to the manner in which the outputs are presented and the functions they can perform.
- Includes uncertainty: Outputs are presented with a degree of uncertainty. (Unit: Y, N)
- GIS / maps: Outputs can be shown in GIS interfaces or as maps. (Unit: Y, N)
- Outputs aligned with SDG sub-indicators: Outputs are aligned to report on the Sustainable Development Goal (SDG) sub-indicators. (Unit: Y, N)
- Scenarios / forecasts: Ability to run scenarios to predict how interventions may impact plastic pollution, or to project outputs into the future. (Unit: Y, N)
- Wedges approach: Illustrates how interventions can be combined to meet targets. (Unit: Y, N)

Framework category: Outputs to prioritize interventions

Primary category: Prioritization of interventions — interventions to mitigate plastic pollution are prioritized in order of importance based on their cost or expected impact.
- Prioritises interventions based on estimated cost: Interventions are prioritized based on their estimated cost to achieve a desired impact. (Unit: Y, N)
- Prioritises policy interventions by impact: Policy interventions (e.g. bans) are prioritized based on their estimated impact. (Unit: Y, N)
- Prioritises engineering / service interventions by impact: Engineering and service interventions (e.g. improving infrastructure) are prioritized based on their estimated impact. (Unit: Y, N)

Framework category: Available resources

Primary category: Data requirements — relates to the general data that may be required to feed into plastic pollution assessment methodologies, broken down by common categories.
- Plastic production / consumption data: Data on the amounts of plastic produced or sold. (Unit: Y, N)
- Waste generation data: Data on the amounts of plastic which become waste. (Unit: Y, N)
- Waste composition: Data on the material fractions that make up the waste. (Unit: Y, N)
- Plastic waste composition (polymers): Data on the polymers that make up the plastic waste. (Unit: Y, N)
- Plastic waste composition (items): Data on the items that make up the plastic waste (e.g. bags, bottles). (Unit: Y, N)
- Plastic waste composition (brands): Data on the company-branded items that make up the plastic waste. (Unit: Y, N)
- Solid waste management data: Data on how solid waste is managed (e.g. collection, disposal). (Unit: Y, N)
- Survey / clean-up data: Data from clean-up campaigns and environmental surveys. (Unit: Y, N)
- Hydrological data: Data on hydrological aspects such as precipitation. (Unit: Y, N)
- Remote sensing data: Data gathered via satellite, aircraft, drones or cameras. (Unit: Y, N)
- Socioeconomic data: Data on the social and economic characteristics of the area. (Unit: Y, N)
- GIS data: Spatial data. (Unit: Y, N)

Primary category: Ability to use proxy / default data / secondary data sources — the ability to use default values or more readily accessible data to estimate required input data.
- NA. (Unit: Y, N, Some inputs or regions)