
CITIZEN SCIENCE AND NOAA SPILL RESPONSE

TEAM: Sam Haapaniemi, Myong Hwan Kim, Roberto Treviño (Evans School of Public Affairs, UW)

ADVISOR: Beth Bryant (School of Marine and Environmental Affairs, UW)

CLIENT: Doug Helton (Office of Response and Restoration, NOAA)

DATE: March 20, 2015


TABLE OF CONTENTS

EXECUTIVE SUMMARY
1.0 INTRODUCTION
1.1 BACKGROUND
1.2 SCOPE OF ANALYSIS/STRATEGIC FRAMEWORK
1.3 RESEARCH METHODS
2.0 CITIZEN SCIENCE IN GENERAL
2.1 WHAT IS CITIZEN SCIENCE?
2.2 CITIZEN SCIENCE IN PRACTICE
2.3 CHANGING TRENDS
3.0 CITIZEN SCIENCE FOR OIL SPILL RESPONSE
3.1 GUIDING CONCEPTS
3.2 BARRIERS TO CITIZEN SCIENCE
3.3 POTENTIAL BENEFITS OF ENGAGING CITIZENS
3.4 EXAMPLES OF CITIZEN SCIENCE IN OIL SPILL CONTEXT
3.5 POTENTIAL OPPORTUNITIES FOR ENGAGEMENT
4.0 BASELINE REQUIREMENTS FOR CITIZEN SCIENCE
4.1 CO-BENEFITS FOR BOTH THE PUBLIC AND NOAA
4.2 COMMUNICATION/FEEDBACK LOOP
4.3 COLLABORATIVE APPROACH
5.0 EVALUATION
5.1 DECISION FRAMEWORK
5.2 CRITERIA AND SCORING
5.3 POTENTIAL CITIZEN ACTIVITIES FOR NOAA & ANALYSIS OF OPTIONS
6.0 DISCUSSION AND EXAMPLE SCENARIOS
6.1 SUMMARY OF KEY FINDINGS AND TRADEOFFS
6.2 MOVING FORWARD
ACKNOWLEDGEMENTS
APPENDICES
APPENDIX 1 – SAMPLE INTERVIEW QUESTIONS
APPENDIX 2 – DECISION FRAMEWORK
APPENDIX 3 – REFERENCES
APPENDIX 4 – INTERVIEWEES


EXECUTIVE SUMMARY

This report and the corresponding research were completed by the Citizen Science Management team, a group of graduate students working through the University of Washington’s Program on the Environment. The report aims to inform future decisions on ways to use citizen science to improve the National Oceanic and Atmospheric Administration’s (NOAA) oil spill response efforts.

The initial impetus for the project was the increased public interest in participating in the 2007 M/V Cosco Busan and 2010 Deepwater Horizon oil spill responses. Unprepared for the unprecedented number of members of the public who volunteered, the professional responders in these incidents were unable to use them as an effective resource. We believe this research is a critical step in helping prepare response agencies for similar situations in the future.

The research is based on two core goals: (1) to identify and prioritize activities of NOAA’s Office of Response and Restoration that could benefit from citizen science, bearing in mind recent developments in crowdsourcing, and (2) to provide recommendations on effective citizen science management. To do this, we conducted a literature review as well as a number of expert interviews with responders, data managers, and citizen science specialists.

Based on our findings regarding the barriers and opportunities of citizen science in the oil spill context, we have identified a list of activities NOAA could consider addressing both before and during oil spill responses. Before a response, NOAA could benefit from establishing data collection protocols, partnering with volunteer organizations, and managing baseline studies with affiliated volunteers. During a response, NOAA could benefit from choosing a model for observation or field surveys for volunteers, choosing types of data management activities, and managing volunteer registration and coordination. We then evaluated various options for how each activity could be implemented using five criteria: four pertaining primarily to NOAA (manageability, cost, data value, liability) and the fifth to the public (participation value).

These analyses feed into a decision framework designed to help weigh the tradeoffs between different design options for a citizen science program. Various scenarios for NOAA are illustrated, highlighting how the features of a citizen science program can change depending on the quality and degree of participation desired. The paper concludes with the following recommendations:

● Acknowledge the potential benefits of citizen science

● Define goals clearly and recognize trade-offs

● Use the decision tool to move from concept to operation

● Build a program that meets the baseline requirements

● Start now – Pre-need actions pay off


1.0 INTRODUCTION

1.1 BACKGROUND

In 2007, the M/V Cosco Busan allided with the San Francisco Bay Bridge, spilling 53,000 gallons of oil into the bay.1 Due to the location and nature of this incident, there was a high level of public visibility and, in turn, a surge of public interest in the clean-up efforts. People came out to volunteer, but the National Oceanic and Atmospheric Administration (NOAA) and the unified command were unprepared to manage the number of volunteers available, resulting in the spread of misinformation and mounting public criticism of the response. The public hoped to receive continuous, real-time information through web-based services and was unhappy when it did not. This discontent is well illustrated by Bay Area residents turning to social media to communicate their frustration, with one blog alone receiving 13,000 posts and 2,628 unique visitors per day.2

The Cosco Busan spill, along with the 2010 Deepwater Horizon and other recent incidents, has highlighted the need for response agencies to incorporate and use public support. In 2012, the National Response Team (NRT) published a document titled Use of Volunteers Guidelines for Oil Spills, which stressed the need for better citizen involvement. However, outlining the need is only the first step in effectively leveraging community support.

With this in mind, NOAA’s Office of Response and Restoration reached out to the University of Washington’s Program on the Environment to research the possibility of using citizen science as a tool to meaningfully incorporate the public into spill response. This report is the result of that research.

1.2 SCOPE OF ANALYSIS/STRATEGIC FRAMEWORK

This research was undertaken with the end goal of producing usable analysis for NOAA’s Office of Response and Restoration (OR&R). OR&R is “a center of expertise in preparing for, evaluating, and responding to threats to coastal environments, including oil and chemical spills, releases from hazardous waste sites, and marine debris.”3 In light of this, OR&R is interested in investigating and analyzing emerging technologies and practices that may have potential to help with future response efforts.

1 NOAA OR&R. “$44 Million Natural Resource Damage Settlement to Restore San Francisco Bay After Cosco Busan Oil Spill.” Web. http://response.restoration.noaa.gov/about/media/44-million-natural-resource-damage-settlement-restore-san-francisco-bay-after-cosco-busa
2 NRT (2012).
3 NOAA OR&R website (http://response.restoration.noaa.gov/about).


The research in this report focuses on one specific emerging area of interest in this field – using citizen science to improve spill response efforts. Citizen science can take many forms, but at its core it is a practice through which the public voluntarily engages in the scientific process. This collaboration often takes place through observation, data collection, and interpretation. While citizen science has the potential to be useful and usable across a number of different situations, our research was limited to activities that could take place before a spill and during a spill response. These two periods are currently the least accessible to volunteers, making them the areas of greatest need and potential for improvement.

The overarching goals of this project were to (1) identify and prioritize activities of NOAA’s Office of Response and Restoration that could benefit from citizen science, bearing in mind recent developments in crowdsourcing, and (2) provide recommendations on effective citizen science management.

These goals were then broken up into two key objectives:

1. To provide the most current and relevant information on citizen science from the perspective of all involved parties; and
2. To compare and contrast different models of citizen science, including but not limited to observations, data collection, and interpretation.

Mindful of the above goals and key objectives, our research revolved around the following central questions:

● What has been the role of citizen science in recent environmental disasters?
● What does the growth of public interest in participation mean for response agencies in terms of opportunities and challenges?
● How can we leverage the public interest to benefit emergency response efforts?
● What are ways in which OR&R can manage the high flow of information that is inherent in a citizen science program?
○ Who owns the information once it is collected and submitted?
○ Who uses the information that citizen science provides, and how is it used?
○ How can we ensure the reliability of the data?
● What are the aspects of a successful citizen science program?


1.3 RESEARCH METHODS

The research for this report was undertaken in two phases, and included both primary and secondary source material. First, we conducted a review of the available literature. This began with material suggested by OR&R and grew as our research developed. The literature covered general citizen science topics, case studies of citizen science in practice, articles focusing on response techniques and data integration, and documents looking at existing NOAA policies.

To supplement the literature research, we conducted a number of interviews with relevant experts. These interviews focused on the interviewees’ areas of expertise, but generally asked for their ideas and thoughts on how citizen science could be integrated into spill response. We interviewed coastal volunteer management professionals4, citizen science specialists5, NOAA practitioners6, and data experts7 in an attempt to get a comprehensive view of how best to develop and implement a program. See Appendix 1 for a list of sample interview questions.

4 California Office of Spill Prevention and Response (OSPR) and Beach Watchers of Washington State University (WSU) Extensions in Snohomish and Island Counties
5 Coastal Observation And Seabird Survey Team (COASST) and Washington Sea Grant
6 NOAA Scientific Support Coordinators (SSCs)
7 NOAA Emergency Response Management Application (ERMA) and Marine Debris Program (MDP)


2.0 CITIZEN SCIENCE IN GENERAL

2.1 WHAT IS CITIZEN SCIENCE?

In its simplest form, citizen science8 describes projects in which nonprofessional volunteers participate in scientific research. One working definition explains it as “a form of collaboration where members of the public participate in scientific research to meet real world goals.”9

Contribution by amateurs to scientific observation and research is not new. Dating back to the 18th century, some of the earliest projects include amateur ornithologists monitoring the timing of bird migrations in Finland and citizen astronomers participating in the British government's Transit of Venus project to accurately measure the distance from the Earth to the sun.10 The Audubon Society’s Christmas Bird Count, which began in 1900, is the longest active citizen science project in the United States, and about 60,000 to 80,000 volunteers now participate in the survey.11

While citizen science itself is not new, the variety and extent of citizen science projects, the number of participants, and the type and complexity of data collected have changed dramatically in recent years.12 Technological advances (e.g. the Internet, smartphones, and social media) have transformed data collection and led to innovative ways to harness the potential of citizen science.13 There were more than 200 research projects in North America in 2008, and there are currently over 65 active NOAA projects supported by citizen scientists, signaling a growing trend for citizen science.

2.2 CITIZEN SCIENCE IN PRACTICE

There are a wide variety of citizen science models in practice. We studied these models to distill their core competencies and match them with NOAA’s needs. We have identified the following primary categories of existing citizen science programs: (1) environmental monitoring, (2) species monitoring and observation, and (3) collaborative research. Below we explain these in greater detail and present select examples of each type.14

8 Related terms include public participation in scientific research, volunteer monitoring, and crowdsourced science.
9 Bowser and Shanley (2013). Also similarly defined in Theobald et al. (2015).
10 Hines et al. (2012).
11 Audubon Society webpage (http://www.audubon.org/content/history-christmas-bird-count).
12 Cohn (2008).
13 Hines et al. (2012), Bonney et al. (2014).
14 See Bowser and Shanley (2013) and Theobald et al. (2015), among others, for extensive case studies.


2.2.A ENVIRONMENTAL MONITORING

Citizen science can provide non-wildlife environmental monitoring services in a number of capacities, including testing and observation of air and water quality, weather, and other factors. Through appropriate training and the use of specific instruments, the necessary information is collected and used by the coordinating agencies or organizations. Notable examples include volunteer water monitoring supported by the U.S. Environmental Protection Agency (EPA) and weather monitoring by the Community Collaborative Rain, Hail, and Snow (CoCoRaHS) Network.

● Volunteer Monitoring (EPA)15: Since 1998, the EPA has supported programs that train and equip volunteers nationwide to monitor the condition of their local water resources. In addition to empowering citizens to detect and resolve harmful pollution problems, the data collected have been used by decision makers to make regulatory changes, and in some cases led to criminal prosecutions.

● Weather Monitoring (CoCoRaHS)16: Founded in 1998 after an unprecedented storm the previous year, the CoCoRaHS network engages volunteers in monitoring precipitation using standardized instruments. This information is sent to the National Weather Service for better analysis and forecasting.

2.2.B SPECIES MONITORING AND OBSERVATION

Scientific observation of species can help identify trends or establish baselines for further research. Certain species can act as bio-indicators, the careful monitoring of which can signal larger ecosystem shifts. Notable examples include bird monitoring, Hudson River eel research, and observation of live and dead wildlife along beaches.

● Bird Monitoring (Audubon Society)17: Since 1900, the Audubon Society has engaged the public in monitoring birds all over the country. This project provides valuable historic baseline data and increases public awareness for conservation. The Audubon Society also conducts various regional programs.18

15 http://water.epa.gov/type/rsl/monitoring/
16 http://www.cocorahs.org/
17 http://www.audubon.org/
18 For example, the Puget Sound Seabird Survey, managed by Seattle Audubon Society, organizes citizen scientists to collect data on 50+ seabird species in Puget Sound from publicly accessible shorelines using a detailed protocol (http://seattleaudubon.org/seabirdsurvey/default.aspx).


● American Eel Monitoring (Hudson River Eel Project)19: Since 2008, volunteers have been monitoring eels coming through the Hudson River in spring to collect data that inform species management decisions. Annual survey data are reported to both state and coast-wide management councils through NERR and NOAA.

● Beach Watch (NOAA National Marine Sanctuary)20: Since 1993, volunteers have conducted bi-monthly surveys of coastal beaches along the Gulf of the Farallones, recording the bird and mammal resources present. The information is used as baseline data, and the network can also be activated to assist in potential oil spill situations.

2.2.C COLLABORATIVE RESEARCH

Citizen science can also take a collaborative form, relying on the public to sort through and analyze amounts of data that may otherwise be too immense to process. Notable examples include Zooniverse, curated by the Citizen Science Alliance, and EteRNA, developed by Carnegie Mellon University and Stanford University.

● Zooniverse (Citizen Science Alliance)21: Zooniverse began in 2007 as Galaxy Zoo, a collaborative astronomy project that used volunteers to classify the shapes of different galaxies. Since then, Zooniverse has expanded to become a platform of digital citizen-science projects.

● EteRNA (CMU and Stanford)22: Players in this game propose designs for synthetic RNA, the best of which are tested in research labs. Since 2010, it has contributed to science by creating a large group of volunteers willing to experiment with different scientific discovery games.

2.3 CHANGING TRENDS

We found that new digital technologies have revolutionized citizen science and spurred its growth dramatically. The transformation of citizen science has been beneficial to both traditional scientific research projects and emergency responses.23 The everyday connectedness of the public, enabled through the Internet, smartphones, and social media, has increased the functionality, accessibility, and visibility of citizen science projects.24 In line with these changes, federal agencies under the Obama administration have also leaned toward more open government and public engagement.

19 http://www.dec.ny.gov/lands/49580.html
20 http://farallones.noaa.gov/science/beachwatch.html
21 https://www.zooniverse.org/
22 http://eterna.cmu.edu/web/
23 Cohn (2008).
24 Bonney et al. (2014).

2.3.A IMPROVED FUNCTIONALITY

Technological advances have enabled more volunteers to perform more complex data collection, leading to increases in the number and variety of studies using citizen scientists. People can now locate a relevant project, train themselves more quickly with online resources, collect more accurate data, and submit it directly to an online platform. This has helped create “crowd-mapping,” a new way of crowdsourcing information by overlaying it onto a digital map.25

Examples of this increased functionality can be found in Did You Feel It?, run by the U.S. Geological Survey, and eBird, created by the Cornell Lab of Ornithology and the Audubon Society.

● Did You Feel It? (USGS)26: Since 2004, volunteers experiencing an earthquake can submit their data to USGS, which maps the data in real time. This can help USGS assess the magnitude of the seismic activity in remote areas.

● eBird (Cornell and the Audubon Society)27: Since 2002, volunteer bird observers can upload their data online, from which eBird creates an open-source, geographically mapped dataset of bird species. According to its website, in March 2012 alone there were around 3.1 million reports across the U.S. The website also provides real-time visualization of the reports as well as syntheses of the observations.

< Figure 1 > Changing Trend in Citizen Science: Did You Feel It? (left) and eBird (right)

25 McCormick (2012).
26 http://earthquake.usgs.gov/earthquakes/dyfi/
27 http://ebird.org/content/ebird/about/


2.3.B IMPROVED ACCESSIBILITY AND VISIBILITY

In addition to improving citizen science competence, technological advances have increased the accessibility and visibility of emergency events. As exposure to disaster events has increased, so too has the public’s perceived cognitive familiarity with emergency response decisions and operations. This increased public interest and awareness have motivated citizens toward active participation. Some participants view citizen science as an opportunity for their own education and empowerment on social and environmental issues, while others see it as a way to fill in the “gaps” that agencies or organizations cannot cover on their own.28

Additionally, our research revealed a growing trend in the use of crowdsourcing and social media to rapidly detect and broadcast events during disaster responses. One recent example is the Humanitarian OpenStreetMap Team (HOT),29 which used open data and crowdsourcing to accurately map the Ebola-affected regions in West Africa for field workers.

2.3.C OPEN GOVERNMENT INITIATIVE OF THE OBAMA ADMINISTRATION

Along with these changing trends, federal policies have also acknowledged the importance of citizen science and have begun to encourage open innovation and public participation. President Obama’s Memorandum on Transparency and Open Government (2009) as well as the Memorandum on Open Data Policy – Managing Information as an Asset (2013) specifically asked federal agencies to improve transparency and collaboration. The administration’s Second Open Government National Action Plan, released in December 2013, included provisions to broaden public participation and to make data more useful and accessible to the public.30

Under this broad encouragement toward public engagement, federal agencies also formed the Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS) in 2013, with Lea Shanley (Presidential Innovation Fellow at NASA) and the EPA as co-chairs.31 The White House as well as 22 other federal agencies, departments, and bureaus are members, and they meet regularly to discuss ways in which federal agencies can better incorporate citizen science into their work. Its mission statement clearly attests to this commitment:

As affiliates of federal agencies, we seek to expand and improve the U.S. Government’s use of crowdsourcing, citizen science and similar public participation techniques for the purpose of enhancing agency mission, scientific and societal outcomes.

28 Walker et al. (2014).
29 http://hot.openstreetmap.org/
30 Bowser and Shanley (2013).
31 FCPCCS Factsheet (2014).


3.0 CITIZEN SCIENCE FOR OIL SPILL RESPONSE

3.1 GUIDING CONCEPTS

3.1.A USE OF VOLUNTEERS GUIDELINES FOR OIL SPILLS (NATIONAL RESPONSE TEAM)

The National Response Team established guidance in 2012 on the use of unaffiliated and affiliated volunteers during an oil spill response. Citizen scientists conducting activities during an oil spill response may accordingly be defined as volunteers. As part of the National Response Team, NOAA will need to ensure that any decisions related to citizen science are consistent and aligned with this guidance.

3.1.B AFFILIATED / UNAFFILIATED VOLUNTEERS

The volunteers who have participated in recent oil spill responses are diverse. Some bring extensive experience conducting species and environmental observations and monitoring, and are trained in the science of oil spills.32 Other volunteers have very few relevant skills, but nonetheless have a strong desire to contribute. The National Response Team (2012) defines and categorizes volunteers into two groups as follows:

● Affiliated volunteer – An individual who comes forward following an incident or disaster to assist with response activities during the response or recovery phase without pay or other consideration and has a pre-existing formal or informal arrangement with either a governmental agency or non-governmental organization (NGO) or Community Based Organization (CBO) and who has been trained for a specific role or function in incident response or disaster relief during the preparedness phase. Affiliated volunteers may also have benefited from pre-deployment rostering, credentialing, and health screening. An affiliated volunteer’s organization may have established ties to the local response structure.

● Unaffiliated volunteer33 – An individual who comes forward following an incident or disaster to assist a governmental agency, NGO, or CBO with response activities without pay or other compensatory consideration. By definition, unaffiliated volunteers are not initially affiliated with a response or relief agency or pre-registered with an accredited disaster council. Unaffiliated volunteers may not have benefited from pre-deployment training, credentialing, and health screening.

32 Interviews with COASST and WSU Beach Watchers.
33 Unaffiliated volunteers are also sometimes referred to as “convergent,” “emergent,” or “spontaneous” volunteers within the emergency management community. For standardization purposes in this document, these volunteers will be referred to as “unaffiliated.”

Unless specifically noted, all discussions in the remainder of the document can be assumed to be applicable to both affiliated and unaffiliated volunteers.

Thoughtful consideration will also need to be given to defining the quality of the relationship under applicable federal policy and laws (e.g. “employee” or “volunteer” under the National Contingency Plan, “government employee” under the Federal Employees’ Compensation Act of 1916, “volunteer” under the Volunteer Protection Act of 1997, and “employee” under the Occupational Safety and Health Act of 1970, among others).34

3.2 BARRIERS TO CITIZEN SCIENCE

Challenges and concerns must be carefully weighed and considered by NOAA in determining the degree of public participation. There are practical concerns, such as liability for injury and information security, and scientific concerns, such as preserving the quality of data, that may present legitimate barriers to effectively employing citizen scientists. The following sections discuss the challenges NOAA must carefully consider to properly design a citizen science program that fits its needs, as well as best practices to overcome them.

3.2.A FEDERAL POLICY AND LAW

Depending on the type of data collection, how the data are collected, and who collects the data, NOAA is subject to an array of federal and state law and policy limitations. All citizen science programs should undergo review to ensure compliance with appropriate federal laws and policies. While not an exhaustive or all-encompassing list, below are the most common laws discussed in our research.

3.2.A.1 PERSONAL PRIVACY

Personally identifiable information falls under the Privacy Act and the Children’s Online Privacy Protection Act (COPPA).35 First, the Privacy Act governs the ways in which federal agencies collect and handle personally identifiable information (PII) from individuals. PII is categorized very broadly, including name, IP address, fingerprint, identifiable photographs, and a variety of other information. The Privacy Act provides limitations on how the data are stored, requiring that databases not have the capability to retrieve information by any of these personally identifiable criteria. This is an important liability question to keep in mind as NOAA considers implementing citizen science programs. Depending on how citizen science data are submitted, there is a possibility that there will be some embedded PII, which NOAA will have to handle in accordance with the Privacy Act.

34 NRT (2012).
35 Bowser and Shanley (2013).

Like the Privacy Act, COPPA restricts the way that information is collected and managed, but with a specific focus on children under 13 years of age. It requires that a privacy notice be provided for all data collected from children, as well as a statement of parental rights, a contact point, and an explanation of how the data will be used. There are ways to build these requirements into collection systems, and NOAA should be aware of these limitations before establishing a citizen science program.

3.2.A.2 PAPERWORK REDUCTION ACT

The Paperwork Reduction Act (PRA) is another barrier for federal agencies soliciting information from the public.36 The PRA stipulates that federal agencies cannot impose excess burden on the public when collecting information, which includes asking for answers to a specific set of questions (as opposed to open-ended, general comments). Otherwise, agencies must go through a burdensome OMB approval process, which takes around 90 days.

3.2.A.3 THE DATA QUALITY ACT

In 2000, Congress passed the Data Quality Act (DQA), implemented through Office of Management and Budget (OMB) guidelines, which requires that agencies set up guidelines to meet DQA standards, put in place pathways for people affected by incorrect data to have their problems addressed, and submit data accuracy reports to their directors. The DQA ensures that federal agencies use and disseminate accurate information.

3.2.A.4 DATA OWNERSHIP

We were unable to identify a universally adopted data ownership policy. Moreover, through our

interviews37 we learned that the desire and the necessity to relinquish or retain data ownership,

or to control data use, varied across the citizen science and emergency response communities. A

user agreement is a commonly used, legally binding agreement between the volunteer and the

project that establishes data ownership policies alongside other policies, such as a project’s

legal policies, privacy policies, and terms of use.38 Two themes were consistently mentioned and

36 Young et al. (2013).
37 In particular, interviews with COASST, WSU Beach Watchers, and NOAA ERMA.
38 http://www.birds.cornell.edu/citscitoolkit/toolkit/policy/user-agreement


resonated among the groups: appropriate attribution for data collected, and the benefit of

determining Terms of Use (TOU) and/or Terms of Service (TOS) in advance of an oil spill.

3.2.B HUMAN HEALTH RISKS

The health and safety of volunteers, including citizen scientists, is also very important. The

National Contingency Plan (40 CFR § 300.185(c)) states:

ACPs should provide for the direction of volunteers by the OSC/RPM or by other federal,

state, or local officials knowledgeable in contingency operations and capable of providing

leadership. ACPs also should identify specific areas in which volunteers can be used,

such as beach surveillance, logistical support, and bird and wildlife treatment. Unless

specifically requested by the OSC/RPM, volunteers generally should not be used for

physical removal or remedial activities. If, in the judgment of the OSC/RPM, dangerous

conditions exist, volunteers shall be restricted from on scene operations.

In some respects, this is very promising for citizen science – beach surveillance, both before and

during a spill, is an ideal opportunity for citizen science. However, if conditions are too

dangerous and public access is restricted, it may be very difficult to use citizen scientists to

provide useful information.

The National Response Team has provided comprehensive volunteer health and safety guidance

in its Use of Volunteers Guidelines for Oil Spills (2012). The Occupational Safety and

Health Act of 1970 does not cover volunteers or “employees” of state or local governments, and

health and safety training requirements vary from state to state. Some states have implemented

their own health and safety plans for their employees, and in some cases volunteers. The federal

and state Occupational Safety and Health Administration requirements for hazardous waste

operations (HAZWOPER) and emergency HAZWOPER training should also be noted.

The National Contingency Plan (40 CFR § 300.185(c)) requires that procedures be established to

allow for safe use of volunteers who participate in oil spill responses. Additionally, the National

Response Team recommends that public participants be trained and demonstrate competence in

accordance with the applicable sections of 29 CFR § 1910.120. The worker’s role and

responsibilities during response operations ultimately determine the amount of training

required. At a minimum, all workers must demonstrate competence in the tasks they will

conduct, the hazards associated with those tasks, and the precautions and protections needed to

complete the tasks safely before they begin working. The National Response Team has suggested

that volunteer participation be limited to low-risk activities and require training at the

“skilled support personnel” or “first responder awareness” level.39

39 NRT (2012).


3.2.C LIABILITIES OF USING VOLUNTEERS

Using citizen scientists, and the data they collect, introduces additional risk into an already

unpredictable event. The liability NOAA faces should not be understated. Failing to properly vet

or supervise citizen scientists could result in unsafe behavior with detrimental consequences. The

inherently hazardous oil spill environment presents legitimate concerns of personal injury to

participants and third-party bystanders. Even baseline activities conducted outside an oil spill

response may raise liability concerns for NOAA and pose a tremendous barrier to meaningful

use of citizen scientists. Additionally, false information, negligence, or hazardous conditions

created by digital volunteers or through citizen-science-collected data could expose NOAA to an

increased potential for litigation. As the number of participants increases, so does the level of

risk.

The risk of liability should not be overstated, either. For a number of reasons, NOAA can feel

encouraged to use citizen science. First, the legal system provides significant protection from

liability. It could benefit NOAA to consult legal counsel and improve understanding of these

safeguards before employing volunteers. Second, NOAA can make programmatic decisions such

as thoughtfully considering management practices, training, and technological innovation to

institute liability-reducing steps and improve citizen scientist competence.40 The following

sections discuss the federal laws addressing participant safety and data collection liability,

as well as ways to reduce the associated risk.

3.2.C.1 VOLUNTEER INJURY INSURANCE

The threat of personal injury to volunteers is real, especially during an inherently hazardous oil

spill response. When affiliated organizations do not provide liability coverage, or when

unaffiliated volunteers do not have liability coverage of their own, the risk increases. The

National Response Team has urged “considerable scrutiny” when deciding to use volunteers

without these policies.

In order to limit NOAA’s liability for volunteer injury, vetting during the volunteer registration

process should be the first step. This could include requiring personal health care insurance for

eligibility, or the use of legal instruments such as disclaimers, contracts of adhesion, or TOU to

limit liability from personal injury for volunteers.41

Another approach is to think about how liability may be covered in instances where the affiliated

organization or individual volunteer does not provide coverage. One option is to include them on

a government policy, where applicable. In determining liability coverage, volunteers may be

40 NRT (2012), Robson (2013), and Smith (2014).
41 NRT (2012) and Robson (2013).


considered government employees in certain instances, even when no legally binding contract

was previously negotiated. If volunteers can be defined as “government employees,”42 they are

afforded coverage under the Federal Employees’ Compensation Act (FECA) (5 U.S.C. §

8101(1)(B)) and the Federal Tort Claims Act (FTCA) (28 U.S.C. §§ 1346(b) and 2671-2680).

3.2.C.2 THIRD PARTY INJURY

When the acts of a volunteer harm a third party, the volunteer’s liability is limited under the

Volunteer Protection Act of 1997 (VPA) (42 U.S.C. §§ 14501-14505). However, this protection

exists only for volunteers who are part of a non-profit organization or governmental entity.

Additionally, this protection does not extend to NOAA or its employees.43

Third party injury claims against NOAA or its employees will be adjudicated under the FTCA.

NOAA is liable for injuries caused by volunteers if the volunteer is determined to be under the

direction and control of the agency. NOAA is not liable if another agency, organization, or the

responsible party is found to be in control of the volunteer’s work. Devolving volunteer

management responsibilities (including tasking, day-to-day supervision, and supplies) to other

entities, such as federal and state agencies, the responsible party, and affiliated organizations,

limits NOAA’s liability.44

3.2.C.3 DATA RELIABILITY

The protections under the VPA for volunteers are also afforded to digital volunteers. Moreover,

claims of a negligent or wrongful act or omission by any federal government employee relating to

data quality are adjudicated under the FTCA.

There are two caveats that protect NOAA against FTCA suits45: (1) if the employee is found to

have been operating under his or her own discretion; and (2) if the employee was acting within

the scope of employment46 and with due care.

These caveats underscore the importance of policies, protocols, and processes to preserve data

quality. The following sections provide amplifying discussion on this.

42 If the government provides tasking, day-to-day supervision, and supplies to UAV or AV under voluntary agreement, then these individuals, if injured, may be considered employees of the government.
43 Public Law 105-19, June 18, 1997.
44 NRT (2012).
45 Smith (2014).
46 The definition of “scope of employment” is determined by the federal government agency and the DOJ.


3.2.D CONCERNS SURROUNDING DATA QUALITY

Use of citizen-scientist-generated information is becoming increasingly common, and there is a

high level of variability among data models describing what the data are, their formats, and how

they are organized and processed. The format (e.g., email, text, paper, spreadsheet) and

survey type (e.g., open-ended questions or a structured pro forma) can influence data quality.

Through our interviews, we learned that some projects established robust monitoring protocols,

others used paper forms, some uploaded imagery online, and one project used unstructured email

reports.47

Additionally, increased public participation increases the risk that the data gathered might not

be as reliable as data collected solely by experts. Due to the wide variability in skills and

expertise among contributors, issues of data quality often rise to the forefront in considering the

validity of this research. A number of best practices can be implemented to improve reliability of

citizen science data. By keeping observation processes standardized and simple, and adopting

quality assurance and quality control processes, NOAA has the best chance of producing reliable

and useable information from citizens.

3.2.D.1 DATA MANAGEMENT PLAN

The intended use of the data ultimately drives the robustness of data collection and management

plans and protocols. An oft-cited starting point to improving citizen science collected data

reliability is developing a data management plan. The plan is a formal document that explains

policies and protocols for how the data will be handled during the project, as well as afterward.

Additionally, this is a vehicle to clearly define the volunteers’ “scope of employment” or

standards of care — what they are authorized and not authorized to do. In developing a data

management plan, program managers consider important aspects of data management, metadata

generation, data preservation, and analysis before the project starts.48

As a notable example, the Data Management Planning Tool from the University of California

Curation Center of the California Digital Library provides a checklist for creating a data

management plan.49
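To make the idea concrete, the core elements of a data management plan can be sketched as a structured record with a simple completeness check. This is an illustrative sketch only; the field names and contents below are our own hypothetical examples, not drawn from the DMPTool or any NOAA template:

```python
# Illustrative skeleton of data-management-plan elements.
# All field names and values are hypothetical examples.
data_management_plan = {
    "project": "Shoreline baseline survey (example)",
    "data_description": "Beach segment observations: oiling category, photos, GPS",
    "collection_protocol": "Structured mobile form with required fields",
    "volunteer_scope": "Authorized: observe and photograph from public access; "
                       "not authorized: sample or handle oiled material",
    "metadata": ["observer_id", "timestamp_utc", "gps_accuracy_m"],
    "quality_assurance": ["training module", "controlled vocabularies"],
    "quality_control": ["expert review of unusual reports", "repeated sampling"],
    "preservation": "Archive raw and cleaned data sets with version history",
    "sharing_and_ownership": "Terms of use agreed with volunteers in advance",
}

def missing_sections(plan, required=("data_description", "collection_protocol",
                                     "quality_assurance", "preservation")):
    """Checklist-style check: return required plan sections that are absent or empty."""
    return [section for section in required if not plan.get(section)]
```

A program manager could run such a check before launch to confirm that each required section has been addressed before the project starts.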

3.2.D.2 QUALITY ASSURANCE

In addition to simple data collection processes, there are other quality assurance measures that

can be taken to ensure that the best possible data will be collected. One way to assure high

47 Interviews with WSU Beach Watchers, COASST, ERMA, and Marine Debris.
48 Wiggins et al. (2013).
49 https://dmp.cdlib.org


quality data is to improve citizen scientist collection competence, and there are a number of ways

to do this. One option is to establish minimum required skills for citizen scientists to participate in

the program. Another option is to increase the baseline knowledge and skills of citizen scientists

by training and testing them on both data collection methods and interpretation. Other ways to

improve data quality include leveraging technology to create learning modules for citizen

scientists or to automate and calibrate the data collection process. A practice used by NOAA

during the Deepwater Horizon oil spill response was to filter amateur data from aggregate data

during the storage phase.50

Generally, more tightly controlled data submission processes improve the quality of the data.

Ways to improve data submission quality include controlled vocabularies, machine timestamps,

required fields, field formatting, and defined ranges of acceptable values for data

entry.51 The risk of increased data collection control is that NOAA only validates what it

knows. Useful information may go unreported because it does not fall within the data collection

parameters.52 While not scientifically rigorous, we felt it noteworthy that a survey of 280

citizen science projects found that many depended on personal knowledge of the participants in

order to feel comfortable with data quality. Familiarity with data collectors and their competence

is built over time, suggesting the need to engage in data quality practices in advance of oil spill

responses.53
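The submission controls described above can be sketched in code. The following is a minimal illustration, not a NOAA system: the field names, the oiling vocabulary, and the ranges are hypothetical, but the pattern of required fields, a controlled vocabulary, range checks, and a machine timestamp is the one described in the literature:

```python
from datetime import datetime, timezone

# Hypothetical controlled vocabulary and required fields for a shoreline report.
OILING_CATEGORIES = {"none", "trace", "light", "moderate", "heavy"}
REQUIRED_FIELDS = ("observer_id", "latitude", "longitude", "oiling")

def validate_report(report):
    """Return (cleaned_report, errors); errors is empty if the report passes."""
    errors = []
    for field in REQUIRED_FIELDS:
        if field not in report:
            errors.append(f"missing required field: {field}")
    if not errors:
        # Range checks: only accept coordinates within valid bounds.
        if not -90 <= report["latitude"] <= 90:
            errors.append("latitude out of range")
        if not -180 <= report["longitude"] <= 180:
            errors.append("longitude out of range")
        # Controlled vocabulary: reject free-text oiling descriptions.
        if report["oiling"] not in OILING_CATEGORIES:
            errors.append(f"oiling must be one of {sorted(OILING_CATEGORIES)}")
    if errors:
        return None, errors
    cleaned = dict(report)
    # Machine timestamp added at acceptance, not supplied by the volunteer.
    cleaned["received_at"] = datetime.now(timezone.utc).isoformat()
    return cleaned, []

ok, errs = validate_report({"observer_id": "v-102", "latitude": 47.6,
                            "longitude": -122.3, "oiling": "light"})
bad, bad_errs = validate_report({"observer_id": "v-103", "latitude": 47.6,
                                 "longitude": -122.3, "oiling": "tarry"})
```

Note how the second report is rejected: this is exactly the trade-off discussed above, since a genuinely useful observation outside the controlled vocabulary would also be turned away.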

3.2.D.3 QUALITY CONTROL

Preserving data quality also includes having quality control processes after the data are collected.

We found that there are a variety of quality control measures used for citizen science projects.

Some projects conduct repeated sampling, using data collected by multiple participants at one

site. Collecting multiple observations increases the sample size, which generally leads to

increased accuracy in the inferences made. Another technique is to follow up with participants to

better understand “unusual” reports. As with quality assurance measures, several projects used

technology to ensure quality after data were collected. Examples include automatic

recognition techniques, data triangulation, normalization, and mining.
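As a sketch of the repeated-sampling idea, observations of the same quantity made by multiple participants at one site can be pooled, with reports far from the pooled estimate flagged for follow-up. The site names, values, and flagging threshold below are hypothetical, chosen only to illustrate the pattern:

```python
from statistics import mean, stdev

def pool_site_observations(observations, flag_threshold=1.4):
    """Pool repeated measurements per site; flag values far from the site mean.

    observations: list of (site, value) pairs from multiple participants.
    Returns ({site: pooled_mean}, [(site, value) flagged as unusual]).
    The default threshold is illustrative; for n samples the largest possible
    z-score is (n - 1) / sqrt(n), so small samples need a low threshold.
    """
    by_site = {}
    for site, value in observations:
        by_site.setdefault(site, []).append(value)

    pooled, flagged = {}, []
    for site, values in by_site.items():
        site_mean = mean(values)
        pooled[site] = site_mean
        if len(values) >= 3:               # need replicates to judge spread
            spread = stdev(values)
            for v in values:
                if spread > 0 and abs(v - site_mean) / spread > flag_threshold:
                    flagged.append((site, v))   # follow up with the participant
    return pooled, flagged

# Four observers at beach_a, two at beach_b; one beach_a report is an outlier.
obs = [("beach_a", 10), ("beach_a", 11), ("beach_a", 10), ("beach_a", 30),
       ("beach_b", 5), ("beach_b", 6)]
pooled, flagged = pool_site_observations(obs)
```

In this example the pooled means are retained for both sites, while the lone high reading at the first site is flagged for the kind of follow-up conversation described above rather than silently discarded.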

Technology is not a data quality panacea, though. Alone, it fails to accommodate the full range

of data or details needed by researchers. As a result, many projects built in redundancy and

concurrently used multiple measures to improve data quality. In a survey of 280 citizen science

projects, researchers found that 33% required submission of paper data sheets in addition to data

50 Interview with NOAA ERMA.
51 Wiggins et al. (2011).
52 Interview with NOAA MDP (Peter Murphy).
53 Wiggins et al. (2011). They also include a framework of options for data quality mechanisms.


entry.54 This enabled participants to provide more detailed observations, and also served as a

means to verify data entries in cases where questions about accuracy arose. Another commonly

used measure was expert review, reinforcing the value of expertise in ensuring data

quality.

Collecting paper forms and expert review are labor intensive and do not scale well. As a result,

one suggestion was to use volunteer expert review networks. Well-known projects such as eBird

and Butterflies and Moths of North America (BAMONA) use these networks. The resources

needed for managing a reviewer network or developing a system with automated data checking

process are different, and in turn so are the costs. The high variability of oil spill response makes

it challenging to project expected volunteer growth, and distinguishing which approaches

are most appropriate can be difficult. Researchers cautioned that this challenge is present in all

citizen science projects. Consideration should therefore be given to planning different

quality management techniques based on the projected growth and resulting size of the data set.55

3.2.D.4 ADMISSIBILITY IN COURT

Generally speaking, the more a decision – whether it is related to safety, security, enforcement or

monetary damage assessments – is reliant upon any single datum, the higher the level of scrutiny

is placed on the quality of collection and analysis. The highest standards for data collection are

those established for the admission of expert testimony as evidence. Under Federal Rule of

Evidence 702, which encodes the Daubert56 factors, “a witness who is qualified as an expert by

knowledge, skill, experience, training, or education may testify in the form of an opinion or

otherwise if:

(a) The expert’s scientific, technical, or other specialized knowledge will help the trier of fact

to understand the evidence or to determine a fact in issue;

(b) The testimony is based on sufficient facts or data;

(c) The testimony is the product of reliable principles and methods; and

(d) The expert has reliably applied the principles and methods to the facts of the case.”

These factors are nonexclusive, and evidence does not have to meet each standard above for a

judge to deem it credible. Even with this caveat, it is uncommon for citizen

science organizations to bring their data up to court admissibility standards.57 Of the organizations

interviewed, COASST was the only one that explicitly targeted this standard.58 The simpler the

data collection process can be made, the more likely high quality data will be produced.

54 Ibid.
55 Ibid.
56 Daubert v. Merrell Dow Pharmaceuticals, 509 U.S. 579 (1993).
57 Interviews with COASST, WSU Beach Watchers, and NOAA MDP.
58 Interview with COASST.


3.2.E COSTS OF A CITIZEN SCIENCE PROGRAM

Citizen science programs vary considerably in cost and effectiveness. Researchers surveyed 280

small-to-medium sized citizen science projects and found that, on average, the projects had a

budget of $105,000 (for the 43 that provided this information).59 Factors such as the intended

use of the data, budget, labor intensity, pre-existing infrastructure, scalability and expected

project growth need to be carefully considered when calculating program costs.

We found that projects with increased budgets generally became less reliant on labor-intensive

administration processes, instead turning to more technologically advanced management

practices.60 Although there will be unavoidable resource and financial costs in developing and

administering a citizen science program, these upfront investments can provide the greatest

return on investment. It is easier, and more cost-effective in the long run, to prevent

problems than to repair them.61

3.2.F UNIQUE CHARACTERISTICS OF OIL SPILL RESPONSE

There are a number of challenges that are unique to implementing a citizen science program in

an oil spill response situation. Some of these situational challenges include the compressed

timeline associated with a spill, the unpredictability in scope, geography, and nature of a spill,

and the heightened risk and liability that come from having volunteers involved with hazardous

spill scenarios. These challenges make a standardized, one-size-fits-all approach to program

design less effective.

In particular, oil spills are events with a high level of uncertainty. Because of the operating

environments and conditions, and the chemical characteristics of oil, there are considerable

unknowns that create challenges in mounting an effective response. The uniqueness of each oil

spill undermines the effectiveness of a “cookie-cutter” approach to using citizen science.

In addition, as a result of human health and environmental risks involved, there is an increased

sense of urgency to act and the decision stakes are heightened. The potential negative outcomes

that may result from decisions made using bad data magnify the importance of having high

quality and reliable data. These inherent human health and safety risks of oil spills pose

challenges that must be considered when identifying ways to use citizen science meaningfully.

Ultimately, these challenges will need to be considered and balanced so that NOAA maximizes

the benefits of citizen science without jeopardizing its scientific integrity.

59 Wiggins et al. (2011).
60 Ibid.
61 Wiggins et al. (2013).


3.3 POTENTIAL BENEFITS OF ENGAGING CITIZENS

It is important that NOAA determine the quality of participation it wants from

volunteers moving forward and identify channels through which it can connect with those

volunteers in meaningful ways. Evidence suggests that there are short-term benefits and long-

term value of citizen science, both for the professionals and the volunteers involved. In addition

to supplementing scientific data, there is strong potential for social benefits from fostering public

participation. Citizen science programs can provide an organized means for the public to feel

connected to local problems and support the response. Moreover, proactively engaging citizens

can be an effective risk management practice in that it can improve risk communication and

reduce operational risk during oil spill responses.

3.3.A FOR THE PROFESSIONAL RESEARCHER

Representing a larger workforce, citizens can fill the gaps in data analysis by providing “broad

extent” and “fine grain” observations.62 They can also provide a cost-effective workforce. In one

project, the use of volunteers to monitor the abundance and distribution of songbirds living in the

upper elevations of New England mountains saved about $30,000 per year.63 By taking on

assignments that are labor intensive, time consuming, expensive, or unable to be automated,

citizens provide tremendous functional value. Most importantly, they can supplement insufficient

professional staff resources. Our research revealed that generating baseline data and conducting

ongoing monitoring are good examples of scenarios where citizen scientists can help.64 Citizen

scientists can also serve in a consultative role, especially in remote areas, where their local and

traditional knowledge fills information gaps due to the lack of available historical data.65

3.3.B FOR THE CITIZEN SCIENTIST

Although more difficult to quantify, the benefits that citizen scientists derive from participation

are equally valuable. Citizen science programs provide a constructive way for citizens to

contribute to solving local problems.66 As highlighted in the Cosco Busan oil spill response, in

the absence of governmental leadership and institutional processes to facilitate their

participation, citizens will self-organize and take action on their own. Haphazard management of

citizen engagement can divert or delay professional attention and resources from response

planning and operations. In the best of cases, these engagements may contribute to

62 Interview with COASST.
63 Cohn (2008).
64 Interviews with WSU Beach Watchers and COASST; Hines et al. (2012).
65 Interview with NOAA SSCs and Coonrad (2012).
66 Interview with WSU Beach Watchers.


misinformation. In the worst, they may increase human health and environmental risk and

threaten a timely and effective response.67

3.3.C FOR SOCIETY AT LARGE

Institutional and operational processes have traditionally reserved the scientific analysis of

emergency events for professional responders. The public’s unfamiliarity with oil spill response

protocol, the uncertainty inherent to science, and the general lack of knowledge around the

environmental and risk tradeoffs regarding dispersant use placed considerable pressure on

responders during the Cosco Busan and Deepwater Horizon oil spill responses. The public’s

knowledge deficit was so great in the Deepwater Horizon oil spill response that attempts fell

short of adequately addressing concerns about ecological and human health risks, despite the

unified command’s efforts to manage risk perception and communicate with the public.68

When thoughtfully designed and managed, citizen science can be an important stakeholder

engagement tool for advancing scientific literacy and reducing risk perception.69 Citizen science

programs can provide opportunities for NOAA to correct risk misconceptions and address

stakeholder concerns, share technical information, and establish constructive relationships and

dialogue about the science that informs oil spills and response options. These interactions and the

improved scientific literacy in some respects are the building blocks of community resilience,

better equipping citizens to cope with environmental changes caused by oil spills. Furthermore,

working alongside citizens affords NOAA the opportunity to build mutual trust and establish the

social capital that is integral to improving the credibility of government-disseminated information

and reducing risk perception.70

3.4 EXAMPLES OF CITIZEN SCIENCE IN OIL SPILL CONTEXT

Based on our research and interviews, we identified several examples of citizen

science being used in the context of oil spill response. These include conducting baseline

surveys, utilizing pre-existing networks, and mapping and data management.

3.4.A BASELINE SURVEYS

All the interviewees emphasized the importance of establishing a baseline for oil spill response

because understanding what is “normal” for the affected regions changes the type and intensity

67 NRT (2012).
68 Walker et al. (2014) and NRT (2012).
69 Cohn (2008), Hines et al. (2012), and Walker et al. (2014).
70 Walker et al. (2014).


of efforts. Broadly speaking, all the environmental monitoring activities identified above can be

categorized as baseline surveys, but for the purpose of spill response, the shoreline surveys are

particularly important. Some of the notable examples are:

● Coastal Observation And Seabird Survey Team (COASST)71: A project of the University of

Washington, COASST creates a network of citizens from coastal communities and

engages in rigorous data collection. These data reach a level that can be admissible in

court72 and can contribute to establishing “baselines against which any impact — from

human or natural origins — can be assessed.”73 Though regionally focused on the Pacific

Northwest, COASST’s highly skilled and robust data collection and management

make it the gold standard for NOAA in terms of data quality. In addition, a large number

of COASST staff and volunteers have HAZWOPER training, which prepares them

for initial assessments of the affected shore.74

● Washington State University (WSU) Beach Watchers75: Beach Watchers is part of the

WSU Extension program and has branches located in seven counties situated around

Puget Sound. They organize and manage a large network of volunteers and focus on

engaging communities and citizens for education and baseline research. Their research-

oriented volunteer management and training experience gives them an advantage in a

large spill situation, as demonstrated by previous coordination with the Washington

Department of Ecology on an “Oil Spill Assessor” program that aimed to provide early

shoreline assessment.76

● NOAA Mussel Watch Contaminant Monitoring77: The longest running program of this

kind, Mussel Watch has analyzed chemical and biological contaminant trends in

sediments and bivalve tissues collected at over 300 coastal sites since 1986. This program

provides water quality data on background levels and trends of fossil-fuel byproducts and

other chemicals, which are valuable for spill response. The program incorporates

volunteers as citizen scientists to conduct monitoring around the coastal regions. In

Washington State, for example, volunteers from WSU Beach Watchers and other

organizations have been active in this program.78

71 http://depts.washington.edu/coasst/
72 Interview with COASST.
73 http://depts.washington.edu/coasst/what/vision.html
74 COASST (2015).
75 http://www.beachwatchers.wsu.edu/regional/about/
76 http://ccma.nos.noaa.gov/about/coast/nsandt/musselwatch.aspx. For the Snohomish County Mussel Watch program, see http://www.snocomrc.org/projects/science/mussel-watch.aspx
77 http://www.beachwatchers.wsu.edu/regional/about/
78 NOAA OR&R. “Mussel Memory: How a Long-Term Marine Pollution Program Got New Life.” Web. https://usresponserestoration.wordpress.com/2012/06/07/mussel-memory-long-term-marine-pollution-program-new-life/


3.4.B PRE-ESTABLISHED NETWORK

For NOAA, it is important to form relationships or partnerships with organizations that already

have a pre-established network of other organizations and volunteers.79 The impact will be even

greater if these organizations have the ability to manage a larger group of organizations, bearing

in mind the huge number of unaffiliated volunteers pouring in during spill incidents. Some notable

examples of this are:

● California Oiled Wildlife Care Network (OWCN)80: Established in 1994 by California’s

Office of Spill Prevention and Response (OSPR) in response to the Exxon Valdez

incident in Alaska, the OWCN has more than 30 member organizations specializing in

providing support for oiled wildlife rescue and rehabilitation. The OWCN ensures that

volunteers are adequately equipped during incidents and its work provides much-needed

support.81

● NOAA Marine Debris Program (MDP)82: Since 2006, and particularly after the 2011

tsunami in Japan, NOAA MDP has been active in engaging with citizens to detect,

research, and remove the marine debris flowing onto U.S. shores. The strength of the

program comes from having regional coordinators all around the country that conduct

marine debris removal projects in coordination with local volunteer organizations. In

addition, MDP recently developed an open-ended reporting app called Marine Debris

Tracker83 with the University of Georgia to broaden public participation.

3.4.C MAPPING AND DATA MANAGEMENT

As NOAA considers incorporating citizen science into its work, it has become even more

important for NOAA to be able to adequately manage the large influx of information coming

from the public. There are notable examples of mapping citizen-generated GIS

data and improving data management capacity. Some of these are:

● Louisiana Bucket Brigade (LABB)84: LABB is a non-profit organization established in

2010 dedicated to grassroots action against pollution from the state’s oil refineries and

chemical plants. In the face of the 2010 Deepwater Horizon incident, LABB created an

online, open-source mapping platform called Oil Spill Crisis Map85 where people could

79 Interviews with NOAA SSCs, MDP, and ERMA. Also raised by NOAA staff during interim presentations.
80 http://www.vetmed.ucdavis.edu/owcn/about_us/index.cfm
81 Interview with CA OSPR.
82 http://marinedebris.noaa.gov/about-us
83 http://www.marinedebris.engr.uga.edu/
84 http://www.labucketbrigade.org/
85 After the incident, the website has since expanded to include general pollution reports under the name iWitness Pollution Map (http://map.labucketbrigade.org/).


submit eyewitness reports via website, Twitter, email, text, etc., which LABB then mapped in real time onto a publicly available satellite map. This was the first application of an online, open-source mapping system to a spill incident.86 Regarding the data collected, all citizen-submitted reports undergo validation by LABB before they are made public, and LABB regularly sends synthesis reports to federal and state enforcement officials regarding any discrepancies between its own reports and industry or state reports.87 However, it should be noted that the data were collected to influence policy and encourage action, rather than for science. So, while the map could be used to indicate certain local impacts that were not found in official reports, the data may lack scientific rigor.88

● NOAA Emergency Response Management Application (ERMA)89: Developed by NOAA

and the University of New Hampshire with the EPA, the Coast Guard, and the

Department of the Interior, ERMA is an online mapping tool that integrates key

environmental response information for decision makers. This one-stop map assisted responders with timely and comprehensive information during the 2010 Deepwater Horizon incident. ERMA even incorporated LABB's Oil Spill Crisis Map, described above, as a separate layer to ensure that responders were not missing any key information.90

Figure 2. Crowd-mapping in Spill Context: Oil Spill Crisis Map (left) and NOAA ERMA (right)

86 McCormick (2012).
87 http://map.labucketbrigade.org/page/index/8
88 McCormick (2012).
89 http://response.restoration.noaa.gov/maps-and-spatial-data/environmental-response-management-application-erma
90 Interview with NOAA ERMA.


3.5 POTENTIAL OPPORTUNITIES FOR ENGAGEMENT91

In addition to some of the potential benefits identified above, there are further opportunities that

NOAA can use for a more effective response. Some of these opportunities are: (1) having structured dialogues with scientists and technical experts; (2) collaborating with local knowledge sources; and (3) integrating citizen scientists into oil spill scenarios and drills.

First, structured dialogues with scientists and technical experts outside of the formal response

organizations could help the public better understand complex technical information and uncertainty during a spill situation, as well as avoid the spread of misinformation. In addition, during spills, such a dialogue (for example, an ad hoc science advisory board during the Deepwater Horizon incident) could help responders receive input from scientists with useful knowledge.

Second, collaborating with local knowledge sources during spills could help create a more effective response. In particular, NOAA could benefit from the traditional knowledge of indigenous peoples and the local knowledge of individuals who have information about specific conditions in the affected regions, even more so if the incident takes place in a remote area.92 In order to best

incorporate this knowledge into actual response, these people could be integrated into official

preparedness activities and exercises run by the Area Committees and Regional Response

Teams.

Third, more broadly, volunteer organizations and select citizen scientists could be integrated into

the Area Committee and Regional Response Team activities.93 This would help NOAA build

more engaging relationships with different stakeholders, facilitate learning from both responders

and non-responders regarding spill response, and identify ways to cultivate new opportunities for

collaborative efforts. This could be done in all preparedness cycles, including planning, training,

and drills.

91 The discussion in this section has been largely adapted from Walker et al. (2014).
92 Also from interviews with SSCs.
93 Ibid.


4.0 BASELINE REQUIREMENTS FOR CITIZEN SCIENCE

There are some fundamental components that must exist within the organizations and agencies

that plan to implement citizen science programs in order for the programs to be successful. These

core requirements are: the existence of meaningful and beneficial work for both the public and NOAA, a substantive feedback loop, and a collaborative approach.

4.1 CO-BENEFITS FOR BOTH THE PUBLIC AND NOAA

4.1.A MEANINGFUL WORK FOR THE PUBLIC

Volunteers offer their time because they feel they can be of some value in addressing a problem.

In order to feel that they are contributing, however, the work that they are assigned must be

meaningful, and they must understand the meaning. So, in developing a citizen science program,

it is important for NOAA to provide volunteers with tasks that clearly connect to the response.94 It is also very important that this connection is communicated. Many types of volunteer work, even if seemingly removed from frontline response, have the potential to be meaningful components so long as that value is relayed to the volunteers.

Providing meaningful work is important for all citizen science programs, but is particularly key

for long-term volunteers and affiliated organizations.95 For these types of volunteers, there is a

great deal of value derived from an ongoing relationship, and the understanding that their work

has meaning for NOAA will help that relationship continue.

4.1.B BENEFICIAL WORK FOR NOAA

In identifying meaningful work, NOAA should first look at its internal data needs. Creating a

program that fills existing gaps or provides additional capacity in a particular area will mean that

the volunteers’ work is beneficial to the response. In turn, this will ensure that it provides

meaningful work to the citizens and is beneficial to both parties.96

4.2 COMMUNICATION/FEEDBACK LOOP

Citizen science, like all volunteer activities, is a two-way process. For the public to get involved,

there must be some visible response to their efforts. Giving the public useful tasks will not retain

94 Interviews with WSU Beach Watchers and COASST.
95 Interview with Sea Grant.
96 Ibid.


their interest unless they understand that their efforts are useful. Therefore, it is imperative that NOAA communicate back the ways in which citizen-generated information is being used.

Giving people a venue in which to see the impacts of their efforts will help keep them engaged

and build trust between volunteers and NOAA. It is also important to maintain a communication channel to stay informed about community risk perceptions and to have an outlet for addressing these perceptions.97

4.3 COLLABORATIVE APPROACH

For citizen science to produce a tangible benefit to the public or to NOAA, there must be a belief on

both sides that there are positive gains to be realized from cooperation and collaboration. If either

party doubts the intentions or abilities of the other too greatly, the process will not succeed. In

addition, a back-and-forth between volunteer organizations and NOAA can tease out the best

possible solutions for both parties.98

97 Interview with COASST and Walker et al. (2014).
98 Interviews with NOAA SSCs and WSU Beach Watchers.


5.0 EVALUATION

This section presents an evaluation framework, introduces a set of criteria for analysis, and then

identifies and analyzes potential activities for NOAA.

5.1 DECISION FRAMEWORK

Based on the literature and interviews, we created a decision framework as a practical tool for

NOAA in evaluating different options related to citizen science. Appendix 2 shows the full

model used for this analysis. Broadly, the framework is divided into two main sections: (1)

Programmatic Decisions and (2) Citizen Science Model Decisions. The programmatic

decisions section addresses the management components of implementing a citizen science

program. The citizen science model decisions section breaks out the different types of citizen

science that we have identified through our research and highlights points for comparison.

Within each of these two sections (programmatic and citizen science models), we applied a flowchart hierarchy to reflect the actual decision-making sequence: When, Who, What, How.

● When refers to the timing of a given activity in relation to the spill. This is broken into

Pre-Spill and During a Spill, as these are the primary stages addressed in this report.

● Who refers to the parties that will be involved in any given activity. This could include

NOAA alone, NOAA partnered with affiliated volunteer groups, and NOAA working

with unaffiliated opportunistic volunteers.

● What refers to the activity being analyzed. For the programmatic decisions, What relates to the management action; for the citizen science models, it relates to the type of action taken by volunteers.

● How describes the different options by which the What activities can be approached and provides the core comparisons within this analytic framework.
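As an illustration only, a single entry of this framework could be represented as a small record keyed by the four questions. This is a hypothetical sketch: the field names simply mirror the When/Who/What/How labels and do not correspond to any actual NOAA tool.

```python
from dataclasses import dataclass

@dataclass
class FrameworkEntry:
    when: str         # "Pre-Spill" or "During a Spill"
    who: str          # the parties involved, e.g. NOAA with affiliated groups
    what: str         # the activity being analyzed
    how: list[str]    # the alternative options compared against the criteria

# The "Define relationship" activity from Section 5.3.A.2, as one entry:
entry = FrameworkEntry(
    when="Pre-Spill",
    who="NOAA with both affiliated and unaffiliated volunteers",
    what="Define relationship",
    how=["voluntary partnership", "formal agreement"],
)
```

Each entry's How list holds the options that are scored against the criteria introduced in the next section.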

The activities listed as What were identified by matching the citizen science activities from our

research with the three central questions that NOAA asks during a spill.99

1. What got spilled?

2. Where will it go and what will it hit?

3. What damage will it cause and how can the effects of the spill be reduced?

99 NOAA OR&R website (http://response.restoration.noaa.gov/oil-and-chemical-spills). Also from interviews.


All of the activities (What) are included in the framework, but they are not evaluated against one

another. This is because we found that all of these activities need to be considered and addressed

by NOAA in incorporating citizen science. Instead, as detailed in the following sections on

criteria and actual evaluation, different options (How) to address each activity are evaluated.

5.2 CRITERIA AND SCORING

5.2.A CRITERIA

Within each potential citizen science activity that we identified, different methods related to

How above will be weighed against a set of standardized criteria. In this way, the comparisons

are made among different methods and not between each What activity. The criteria used for

evaluation were developed based on the recurring concerns that arose in our literature review and

interviews. These were then categorized and condensed into the following five criteria.

5.2.A.1 MANAGEABILITY

This assesses how resource intensive an option is. It takes into account the levels of people,

infrastructure, and tools that would be required from NOAA to implement this option. Low-resource strategies will receive the highest scores. This criterion assesses the following subsets of

questions:

● People – What are the internal demands on NOAA in terms of staff capacity associated

with this strategy?

● Program Infrastructure100 – To what extent would a strategy build on existing infrastructure or require new infrastructure within NOAA?

● Scalability - How adaptable is this option to different-sized responses?

5.2.A.2 MINIMAL COST

This is a measure of the direct investment required from NOAA. It will not be measured in

precise dollar amounts, but instead is shown relative to other citizen science strategies. Low cost

options will receive the highest scores.

100 Also called “automaticity” in some policy literature (see Salamon (2002), The Tools of Government).


5.2.A.3 DATA VALUE

This looks at the value of the data provided by a given option in terms of usefulness of that type

of data for NOAA response as well as the reliability of the collection method. Strategies with

high values in both categories will receive the highest scores. This criterion assesses the

following subsets of questions:

● Reliability – This looks at the collection methods, and whether they provide sound data.

● Usefulness of this Data Type in Response – This assesses the format and content of the data. Is it useful information? Is it in a usable format?

5.2.A.4 MINIMAL LIABILITY

This analyzes the potential liability concerns that NOAA faces in implementing a given option. It

is broken into two types of liability – safety and data. Strategies with the least potential liability

will receive the highest scores. This criterion assesses the following subsets of questions:

● Participant Safety – What are the liability concerns of having the public participate?

● Data Collection Liability – What are the liability concerns around collecting data in this

way?

5.2.A.5 PARTICIPATION VALUE

This measures the value that a given option provides beyond data. It accounts for the benefits of

engaging the public and the educational capacity provided by a given citizen science option.

Strategies with high participation values will receive the highest scores. This criterion assesses

the following subsets of questions:

● Public Engagement – Is this strategy capable of incorporating large numbers of people?

● Public Education – Does this strategy help inform the public before or during a disaster?

5.2.B SCORING

Each option was assigned a score from 1 to 5 for each applicable criterion, with 5 being the most desirable. The scores are meant to be comparative ratings against the other options (How) within each activity (What), rather than absolute values in and of themselves.

Our team tried to reduce our personal biases by basing the scores on our research and interviews.


The scoring contains two different levels of weight assignments. First, the total score is computed by applying "individual weights" to these scores and converting the result to a 100-point scale. For this analysis, all the criteria have been given equal weights, but when using this tool in the future, weights should be assigned based on the importance of each criterion to NOAA.

Second, the weighted score takes into account that not all criteria matter equally to all parties

involved in citizen science and that collaborative value needs to be accounted for. Hence,

“collaborative weights” are used to give different weights to the criteria that provide the most

value for NOAA (manageability, data value, cost, and liability) and the criterion that matters

most to the public (participation value). For this analysis, the two different values were assigned

equal weights. Through this process, the weighted score will better reflect NOAA’s needs and

values as well as the dual value that is necessary in citizen science.
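To make the two-level computation concrete, here is a minimal sketch in Python, assuming the weighting scheme we inferred from the score tables in Section 5.3: equal individual weights, and collaborative weights that split value evenly between the NOAA-side criteria and participation value. The function names are ours for illustration, not part of any NOAA tool.

```python
def total_score(scores):
    """Equal individual weights: average of the 1-5 criterion scores, scaled to 100."""
    return sum(scores) * 20 / len(scores)

def weighted_score(noaa_scores, participation_value):
    """Collaborative weights (assumed): the NOAA-side criteria (manageability,
    cost, data value, liability) as a group and participation value each carry
    half the weight; the result is scaled to 100."""
    noaa_avg = sum(noaa_scores) / len(noaa_scores)
    return (noaa_avg + participation_value) / 2 * 20

# "Voluntary partnership" row from Section 5.3.A.2:
# manageability 4, cost 5, data value 4, liability 2, participation 4
total_score([4, 5, 4, 2, 4])       # 76.0
weighted_score([4, 5, 4, 2], 4)    # 77.5
```

Under these assumptions the formulas reproduce the total and weighted scores shown in the tables of Section 5.3.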

5.3 POTENTIAL CITIZEN ACTIVITIES FOR NOAA & ANALYSIS OF OPTIONS

This section first introduces the potential activities (What) through which NOAA could engage

volunteers in citizen science processes, as well as some of the pre-spill programmatic steps that NOAA can take.

Within each of these activities (What), we compare the alternative options (How) by scoring

them against the criteria. This analysis is not intended to provide perfect answers, as every decision is

contingent on a unique set of circumstances. Instead, these comparisons are meant to be a tool

for facilitating discussion and highlighting tradeoffs to streamline that discussion. Appendix 2

shows the full matrix used for our evaluation.

5.3.A PROGRAMMATIC DECISIONS – PRE-SPILL

5.3.A.1 ESTABLISH DATA COLLECTION PROTOCOLS

A fundamental component of any science program is the data collection process. The quality of

the end product is largely a result of the methodology used to gather the initial information –

good research is built on strong data. The first step in using citizen science to collect good data is

for NOAA to establish its desired data collection protocols.

Collection protocols, for the purposes of this report, fall along a data-structure spectrum.101 On

one end of the spectrum lies completely unstructured information. This type of data does not ask

any direct questions or require any specific collection techniques. On the other end of the

101 Interview with NOAA MDP (Peter Murphy).


spectrum is highly structured data, which requires very detailed collection techniques, asks

targeted and verifiable questions, and only allows certain data inputs. Most data collection

protocols will lie somewhere between these two extremes.

The collection protocol models we compare in this framework are open-ended (or unstructured) data collection and survey-based data collection. Examples of open-ended collection include free-form email reporting systems or phone calls. Survey-based collection protocols, in comparison, would provide collectors with a set of questions to answer and a method in which they should answer those questions. In practice, this could range from an app-based photo submission process with associated data fields to highly structured surveys in which volunteers answer detailed categorical questions and have specific directions on how data should be collected.

Comparisons between these protocol options are captured later in the citizen science models section, as the relative benefits are highly

related to the type of citizen science model that NOAA will pursue during a spill. For example, if

NOAA decides to use structured surveys for receiving information from unaffiliated volunteers

during a spill, then that protocol will need to have been created before an actual spill. Although

the evaluation of options takes place in a later section, we deemed it necessary to include this

activity as one of the pre-spill programmatic decisions by NOAA, because it is important to

establish the protocol before an actual spill situation.

It is important to note that creating a structured protocol will require NOAA to go through the

OMB approval process stipulated by the Paperwork Reduction Act. Therefore, it is important

that NOAA make a decision early on and design such a protocol prior to a spill scenario.

5.3.A.2 NOAA AND VOLUNTEER ORGANIZATIONS DEVELOP PARTNERSHIPS

Another pre-spill step for NOAA to take in instituting a citizen science program is to develop

partnerships with either affiliated or unaffiliated volunteer organizations. Partnering with

organizations that have the capacity to help with volunteers can occur through either a

Examples:

Unstructured protocol - The NOAA Marine Debris Program began collecting data by opening an email address where people who found debris could report their observations. This came in as unstructured data that was subject to whatever the observer thought was notable.

Structured protocols - The COASST program has a highly rigorous protocol that includes regular observations, beach walking patterns, and detailed datasheets outlining what observations need to be recorded. The mPING app, which was designed by NOAA/NSSL, the University of Oklahoma, and the Cooperative Institute for Mesoscale Meteorological Studies, asks observers for structured data in the form of standardized questions about weather.
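The two ends of the spectrum sketched in these examples can be illustrated as a pair of intake functions. This is a hypothetical sketch in Python; the field names are invented for illustration and are not an actual NOAA or MDP schema.

```python
def intake_unstructured(report):
    """Open-ended collection: accept whatever the observer thought was notable."""
    return {"raw_text": report}

# A survey-based protocol asks targeted questions and only allows certain inputs.
REQUIRED_FIELDS = {"location", "date", "observation_type", "photo_attached"}

def intake_survey(report):
    """Survey-based collection: reject reports missing any required field."""
    missing = REQUIRED_FIELDS - report.keys()
    if missing:
        raise ValueError("missing required fields: " + ", ".join(sorted(missing)))
    return report

intake_unstructured("Saw tar balls near the south jetty this morning.")
intake_survey({"location": "47.61N, 122.33W", "date": "2015-03-20",
               "observation_type": "tar ball", "photo_attached": True})
```

The unstructured intake never rejects a report but defers all interpretation to later staff review; the survey intake pushes that burden forward to the volunteer at collection time.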


voluntary partnership or a formal agreement. Voluntary partnerships are those in which there is

no formal agreement. Conversely, formal partnerships are those in which the relationship has

been established and defined in writing. Examples of agreements that could lead to a formal

partnership are contracts specifying activities and compensation, documents assigning liability to

one party or another, and data use agreements about the information being gathered by citizen

scientists. Establishing these partnerships can be a tool to ensure that citizen science

organizations are able to best contribute when a spill happens.

● Option 1: Voluntary Partnership

A voluntary partnership would be fairly manageable relative to a formal agreement, with the

only real demands being to incorporate the incoming data into the response and to maintain

communications with the organization. This also means that there likely would be less upfront

cost to NOAA. The data developed from a voluntary partnership could benefit from NOAA’s

protocols, but may be less reliable due to NOAA’s lack of control. NOAA would also have a

relatively higher liability risk, since no formal agreement would exist to clarify the liability of

volunteers collecting data. A voluntary partnership would allow more organizations to participate

because the barrier of creating an agreement would be removed. However, there would also be

less direction from NOAA in the messaging reaching volunteers.

● Option 2: Formal Agreement

The management requirements of implementing a formal agreement would be somewhat higher

than a voluntary partnership. Staff would need to create new contracts, the agreement process would hold back fast scaling to include other organizations, and the structure of the agreement would need to be developed. So, the relative cost of a formal agreement is also higher than that of a

voluntary partnership. There are benefits of a formal agreement, though. The data coming from a

formal agreement would be subject to the protocols that NOAA listed in the agreement, making

the data as strong as the rules it is built on. Assuming that a formal agreement would include a

liability clause, this is a lower-risk strategy for NOAA (if that were not the case, however, liability could actually increase, changing the scores significantly). Participation may be slightly

decreased, but the public education value provided by partnering with NOAA counters this,

giving it a similar participation value to a voluntary partnership.

When: Pre-Spill | Who: with both AV & UV | What: Define relationship

How                     Manageability  Minimal Cost  Data Value  Minimal Liability  Participation Value  Total Score  Weighted Score
voluntary partnership         4             5            4              2                   4               76.0          77.5
formal agreement              3             3            5              5                   4               80.0          80.0


5.3.B PROGRAMMATIC DECISIONS - DURING A SPILL

Some of the programmatic decisions in managing a citizen science program cannot be set up beforehand and will need to take place during the spill response. These relate to how affiliated volunteers are coordinated and to the method for incorporating new volunteers who become available.

5.3.B.1 NOAA AND AFFILIATED VOLUNTEER ORGANIZATIONS COORDINATE VOLUNTEERS

The two options available for NOAA to manage affiliated volunteers are to supervise the trained

volunteers directly, or to manage a roster and leave the volunteer-level management to the

affiliated organization in which the volunteers are housed.

● Option 1: Direct supervision by NOAA

NOAA does not currently have the capacity, resources, or time to supervise volunteers during

spills, making this a difficult management prospect and a higher-cost option. It would also invite liability concerns, as increased involvement increases the possibility that NOAA would be liable for volunteer injury or third-party injury. However, direct supervision would provide more

reliable data since NOAA could monitor the process. This would also provide a greater

educational opportunity by enabling NOAA to disseminate information directly to volunteers.

● Option 2: Affiliated organizations coordinate volunteers

Having affiliated organizations manage volunteers is a more realistic option for NOAA in terms

of manageability and cost. NOAA could connect with the organizations at a higher level, but would be free from the demands of on-the-ground volunteer management. The incoming data may be slightly less reliable, but with affiliated organizations, poor data quality is generally less of a concern. NOAA's liability would also go down by removing itself from day-to-day volunteer direction. Another benefit of this model is that more volunteers could likely

participate, as they would not be limited by NOAA’s internal management constraints.

When: During a Spill | Who: with AV | What: Volunteer coordination

How                  Manageability  Minimal Cost  Data Value  Minimal Liability  Participation Value  Total Score  Weighted Score
supervised by NOAA         2             3            5              3                   4               68.0          72.5
managed by AOs             5             5            4              5                   4               92.0          87.5


5.3.B.2 NOAA DEVELOPS A CHANNEL FOR INTEGRATING UNAFFILIATED VOLUNTEERS

When volunteers make themselves available during a spill, it will be important for NOAA to

have developed a channel for taking them on. This should be chosen and established beforehand

to ensure that volunteer integration is as seamless as possible and detracts as little as possible from spill

response. In approaching this, NOAA can either opt to manage these volunteers directly and

have established admittance procedures, or direct them to an affiliated organization that has the

capacity to take on and train the volunteers.

● Option 1: NOAA manages rosters directly

Directly managing an unaffiliated volunteer registration channel is a less ideal model for NOAA

to pursue than having affiliated volunteer organizations manage them. This would involve

fielding inquiries from potential volunteers, registering them, and assigning them to tasks, which

would be resource intensive and costly. In addition, holding personal information in a database

and being the agency to assign volunteers to activities would open NOAA to more liability

compared to relaying volunteers to affiliated organizations. More importantly, in the absence of

volunteer management training or increased staff to support volunteer registration, balancing

primary scientific support duties with managing registration would likely strain NOAA staff,

which would limit participation levels relative to Option 2.

● Option 2: NOAA relays volunteers to affiliated organizations

For NOAA, a better option for unaffiliated volunteer registration is to direct these volunteers to

pre-established affiliated organizations that can take them on. This could be as simple as listing

volunteer opportunities on the NOAA website. While there would be some cost to maintaining a

current list of potential channels, it would be minimal compared to the demands of developing an

internal NOAA volunteer process.

When: During a Spill | Who: with UV | What: Volunteer registration

How                                 Manageability  Minimal Cost  Data Value  Minimal Liability  Participation Value  Total Score  Weighted Score
managed by NOAA                           1             1            –              4                   3               36.0          45.0
relayed to AOs or State agencies          5             5            –              5                   5               80.0          87.5


5.3.C CITIZEN SCIENCE MODELS - PRE-SPILL

5.3.C.1 PRE-SPILL BASELINE STUDIES

Baseline studies provide the comparison data for NOAA to use when assessing whether or not a

shoreline has been affected by an oil spill. Baseline data can include wildlife counts, tar ball

levels, shoreline type, vegetation information, temperatures, and ecosystem details. To gather

this kind of data using citizen science, NOAA can either work directly with affiliated volunteers

to collect the data or develop some capacity for taking in citizen science data provided by other

organizations. NOAA’s ERMA system provides a good example of how this might work.

Trusted organizations could enter their data into the system and OR&R would be able to see the

data in its final format overlaid on the map to complement data from other sources.

● Option 1: NOAA conducts baseline studies and manages volunteers

Conducting a baseline study where NOAA directly managed affiliated volunteers would likely

provide higher-quality data, as the process would be very visible to NOAA and the agency would have the ability to act as quality control. It would also give NOAA the opportunity to

teach volunteers about spill response and about baseline shoreline characteristics, increasing the

public education element of this option. However, like all direct engagement, this is more costly

than if NOAA could be more removed. It would take staffing capacity, require program

management from within NOAA, and would only be as scalable as the resources NOAA could

dedicate to it. There would also be greater liability risk from directly managing volunteers.

● Option 2: Baseline studies done by affiliated organizations and the data are shared

Allowing the affiliated organizations to conduct the baseline studies and then pass their results to NOAA is generally a more desirable option. It requires less direct support from NOAA in

terms of human resources and capital and would likely reduce liability by removing NOAA from

volunteer management. The data reliability, or at least the assurance of it, may be slightly lower

than if NOAA were to manage these studies itself, but the agency could still have a say in

developing the protocols and establishing a standard. Although this option would allow for larger

numbers of participants, devolving management duties limits NOAA's ability to engage with them

on an educational level.

When: Pre-Spill | Who: with AV | What: Baseline study (e.g. geographic, shoreline assessment, various monitoring)

How                                  Manageability  Minimal Cost  Data Value  Minimal Liability  Participation Value  Total Score  Weighted Score
supervised by NOAA                         2             2            5              3                   5               68.0          80.0
conducted by AVs, then data sharing        5             4            4              4                   4               84.0          82.5


5.3.D CITIZEN SCIENCE MODELS - DURING A SPILL

5.3.D.1. AFFILIATED VOLUNTEER OBSERVATIONS/FIELD STUDIES

While baselines are useful, they are only as valuable as the spill-period data they can be compared against. During a spill, then, NOAA will need to collect up-to-date observations and field studies. There are three options for doing this: (1) incorporate unstructured data coming in from affiliated volunteers, (2) use more structured survey-based data, or (3) use citizen scientists to conduct full Shoreline Cleanup and Assessment Technique (SCAT) surveys.

● Option 1: NOAA incorporates AV data from unstructured channels

The primary benefit of this option is its low initial cost: developing unstructured channels takes less upfront commitment. That saving is countered by higher demands on later data management, as unstructured data would require a fair amount of staff capacity to interpret. The data are also generally of lower quality, with the value lying mostly in flagging anomalies rather than observing trends or creating baselines. The liability concern with this type of collection is that NOAA would not have established defined protocols while still maintaining an affiliation with the volunteers. With the volunteers under NOAA's umbrella, the agency carries some liability and would be better served by defining parameters around observation. Though an unstructured platform allows for the greatest participation rates (and almost no barrier to entry), it also has low educational benefits.

● Option 2: Structured surveys by affiliated volunteers

Structured observations from affiliated volunteers improve on unstructured data in every way except cost. They are more manageable because data-sorting demands are reduced; the data are more valuable because they are standardized; liability shrinks as direction gets clearer; participation numbers can still be high; and structured surveys involve kinesthetic learning, which provides value to the public. The uniform format of structured data is also easier to digest, making this a more scalable option than unstructured surveys. Cost would likely increase with the need to develop a data structure, intake channels, and QA/QC processes.
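The tradeoff above can be made concrete with a sketch of what a structured intake record and its automated QA/QC gate might look like. The field names and the controlled vocabulary below are illustrative assumptions, not NOAA's actual survey protocol:

```python
from dataclasses import dataclass

# A hypothetical structured-survey record; an operational program would use
# NOAA-approved protocol fields instead of these illustrative ones.
@dataclass
class ShorelineObservation:
    observer_id: str
    timestamp: str            # ISO 8601, e.g. "2015-03-20T09:30:00Z"
    latitude: float
    longitude: float
    oiling_extent: str        # constrained vocabulary keeps records combinable
    substrate: str
    photo_url: str = ""       # optional supporting photo

# Constrained answer set: this is what makes structured records aggregable.
VALID_EXTENTS = {"none", "trace", "light", "moderate", "heavy"}

def qa_check(obs):
    """Automated intake QA/QC: return a list of problems (empty = accepted)."""
    problems = []
    if obs.oiling_extent not in VALID_EXTENTS:
        problems.append("unknown oiling extent: %r" % obs.oiling_extent)
    if not (-90.0 <= obs.latitude <= 90.0 and -180.0 <= obs.longitude <= 180.0):
        problems.append("coordinates out of range")
    return problems
```

Because every record shares the same fields and vocabulary, checks like these can run at intake rather than consuming staff time later, which is the scalability advantage the option describes.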

● Option 3: SCAT surveys by affiliated volunteers

Affiliated volunteers could also be used to conduct Shoreline Cleanup and Assessment

Technique (SCAT) surveys in a spill environment. Executing a citizen science SCAT program

would build on the existing infrastructure at NOAA, with survey parameters that already exist.

The data collected would be more valuable because the level of expertise needed to conduct the


survey is relatively high, and the results are designed to integrate easily into response. Liability would be relatively low: trained volunteers would conduct the surveys, and they would be associated with an affiliated organization that would hold the liability. SCAT surveys also provide a greater channel for communicating educational material to volunteers, so while the demanding training requirements may limit the quantity of participants, the work would be very meaningful for those who take part.

When: During a Spill | Who: with AV | What: Observation / Field survey (e.g. geographic, shoreline, wind/weather/water monitoring)

How                      Manageability  Min. Cost  Data Value  Min. Liability  Part. Value  Total  Weighted
open-ended observation         2            4          2             2              3        52.0    55.0
structured observation         3            3          4             3              4        68.0    72.5
SCAT survey                    4            5          5             4              3        84.0    75.0

5.3.D.2. AFFILIATED VOLUNTEER DATA MANAGEMENT

Citizen scientists can also be used to manage data once it has been collected. Possible options include: (1) using established volunteers affiliated with partner organizations to enter and/or do some preliminary sorting of the data; (2) having these volunteers provide first-level data validation, such as making sure photos match descriptions; (3) having them provide higher-level QA/QC data validation; and (4) having them provide a preliminary synthesis of the data for assessment by NOAA. These options vary in the level of responsibility placed on the volunteers and affect the formatting and usability of the data that would reach NOAA.

● Option 1: NOAA uses established volunteers to enter/sort data

Having affiliated volunteers enter data and provide preliminary sorting within their organizations is a strong option for NOAA in terms of cost and agency requirements. It creates virtually no added demand, and the data would be one step closer to analysis by the time it reached NOAA. However, allowing external sorting leaves less control over QA/QC processes, decreasing data reliability. On the other hand, having affiliated volunteers enter data reduces liability concerns, lets large numbers of people participate, and offers educational value through exposure to the incoming data.

Example: Preliminary Data Validation. The Zooniverse Galaxy Zoo project asks volunteers to classify images of galaxies taken by the Sloan Digital Sky Survey, the Hubble Space Telescope, and the United Kingdom Infrared Telescope. Classification is simple: volunteers look at an image and answer a few basic questions about it, allowing large amounts of data to pass through a preliminary classification phase quickly. In the first year of the program, more than 50 million classifications were made.

● Option 2: Provide preliminary data validation

Using affiliated volunteers to conduct simple data validation is similar to having them enter and sort the data in terms of management, cost, and liability. It would, however, provide slightly higher data validity, since volunteers would have flagged glaring data errors. But because this practice requires more training than straightforward data entry, relative participation would likely be lower.
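As a sketch of the kind of first-level check involved (the report's example is confirming that photos match descriptions), the rules below are illustrative assumptions rather than an established NOAA checklist:

```python
# Hypothetical first-level validation rules a volunteer (or a small script
# supporting one) might apply before a record moves on to NOAA.
def preliminary_validation(record):
    """Flag glaring inconsistencies in a single observation record."""
    flags = []
    # An observation that reports oiling should carry supporting evidence.
    if record.get("oiling_reported") and not record.get("photo"):
        flags.append("oiling reported but no photo attached")
    # The photo should match the written description (a reviewer's yes/no).
    if record.get("photo") and record.get("photo_matches_description") is False:
        flags.append("photo does not match description")
    if not record.get("location"):
        flags.append("missing location")
    return flags
```

The point is that each rule is simple enough to teach quickly, which keeps the training burden close to that of plain data entry while still catching the most obvious errors.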

● Option 3: Provide more advanced QA/QC validation

Providing more advanced validation, beyond basic flagging of anomalies, is similar to the preliminary validation but brings higher cost and manageability demands, associated with purchasing and deploying more technologically advanced QA/QC mechanisms. That said,

with the more robust data management processes, this option produces the most usable data.

Despite these measures, NOAA would likely be exposed to greater liability because of the

increased reliance on volunteers to process data.

● Option 4: Provide a preliminary synthesis of the data

If affiliated organizations were able to provide a preliminary synthesis along with a dataset, this would increase the data's usefulness to response. While maintaining access to the raw data would still be important, an overview with preliminary findings could save time for the response team and point out initial takeaways. For the synthesis to be useful, however, NOAA would need to communicate its format and standard, which would take planning and staff capacity. Working with affiliated organizations on an initial synthesis could be relatively cost-effective for NOAA, as it would help flag useful and non-useful datasets early on without requiring internal analysis. The drawback is that NOAA's liability would increase the more it delegates data analysis to volunteers.


When: During a Spill | Who: with AV | What: Data Management

How                          Manageability  Min. Cost  Data Value  Min. Liability  Part. Value  Total  Weighted
data entry & sorting               5            2          2             4              4        68.0    72.5
preliminary data validation        5            2          3             4              3        68.0    65.0
advanced validation                4            3          4             3              3        68.0    65.0
preliminary synthesis              3            4          4             2              3        64.0    62.5

5.3.D.3 UNAFFILIATED VOLUNTEER OBSERVATIONS/FIELD STUDIES

Much like the affiliated volunteer observations/field studies outlined above, opportunistic volunteers could be used to provide observational or monitoring data from the field. The ways in which the data are collected mirror the affiliated volunteer model. However, since the management practices for the two groups differ, the opportunistic volunteers have been broken out for their own scoring and assessment.

● Option 1: NOAA incorporates UV data from open-ended channels

Collecting unstructured data from volunteers has a number of drawbacks. It could be expected to be a significant drain on staff capacity, as high volumes of unstructured data would require intensive sorting and interpretation, increasing the long-term costs of the model. The data would also be less valuable to a response because observations could not be easily aggregated or used in a timely way. These types of observations can help flag anomalies, but are much less useful for observing trends quickly. A strength of this option is that it limits NOAA's liability: by not soliciting any direct responses or posing specific questions, NOAA would be less liable for the people making observations. Moreover, large numbers of people could submit information through unstructured channels because there are no training, registration, or technical requirements to report in an open-ended way. But this lack of structure also means NOAA would be unable to provide any educational messaging prior to a volunteer reporting; the work would be less beneficial to the volunteers and hard for NOAA to give meaningful feedback on.

● Option 2: Structured surveys by unaffiliated volunteers

Collecting data in a more structured format, such as a paper or app-based survey with pre-established, combinable fields and clear instructions, can be significantly more valuable to spill response. More structured data can be analyzed more quickly, helping to highlight trends, and because structured data require less time to make interpretable, structured survey instruments are a more scalable solution. While there will be upfront costs to developing a structured dataset, the ongoing labor costs will be lower than with an unstructured collection method. Depending on how the information is submitted, liability could be a concern because of personal-information storage and PRA requirements. This is a good option for maximizing participation value because it is not particularly exclusive (anyone could be trained to complete a survey) and surveys have the potential to provide some educational value. Likewise, if NOAA has approved the observation methods and questions, this lends validity to the operation and conveys to volunteers that there may be some larger benefit to participation.

● Option 3: SCAT surveys by unaffiliated volunteers

Unaffiliated volunteers will, as a group, have less training in data-collection protocols than affiliated volunteers, so using them for a more complex task like SCAT surveys would require extensive up-front training, draining NOAA's capacity and increasing costs. If that level of training were provided, unaffiliated volunteers would likely be able to produce usable and valuable data thanks to the structured nature of the SCAT surveys. Because the volunteers would be housed within NOAA rather than an affiliated organization, NOAA's liability risk could increase by using unaffiliated volunteers in this capacity. Having the volunteers within the NOAA structure would provide great opportunities for education, but the high training requirements would limit the number of participants.

When: During a Spill | Who: with UV | What: Observation / Field survey (e.g. geographic, shoreline, wind/weather/water monitoring)

How                      Manageability  Min. Cost  Data Value  Min. Liability  Part. Value  Total  Weighted
open-ended observation         1            2          2             5              3        52.0    55.0
structured observation         3            3          4             3              4        68.0    72.5
SCAT survey                    1            1          4             2              3        44.0    50.0

5.3.D.4 UNAFFILIATED VOLUNTEER DATA MANAGEMENT

Finally, once data have been collected by citizen scientists or from other sources, there is a possibility for opportunistic citizen scientists to provide analysis of those data. They can do this either under the direct supervision of NOAA, in which case a NOAA staff member would be present to answer questions, or through a channel that NOAA establishes to connect unaffiliated opportunistic volunteers with affiliated organizations, where they become affiliated volunteers.

● Option 1: Under direct NOAA supervision

Taking on citizens to do data analysis may have its place in a citizen science program, but the management demands of handling this work internally at NOAA would be very high: unskilled volunteers would require training time from NOAA staff in order to accomplish a relatively straightforward task. Nor would the program be very scalable, limited as it would be by NOAA's capacity and by the data-entry demands of the spill information, which may warrant only a limited number of data-entry personnel. On the other hand, by managing data-entry participants directly, NOAA could ensure that the entry methods are sound and that the final dataset is as reliable as the incoming data. Some liability accompanies any type of direct management, and allowing citizens to enter potentially personal information could present additional liability concerns for NOAA as the manager. As volunteers became familiar with the datasets, this would be a valuable participatory activity for them to complete.

● Option 2: Relay unaffiliated volunteers to affiliated organizations or state agencies

A less management-intensive way for NOAA to incorporate unaffiliated volunteers into data analysis would be to have them volunteer within affiliated organizations. The organization would then take on the day-to-day operations of managing these untrained volunteers, lifting the human-resources and cost burdens from NOAA. This is also a more scalable solution, allowing larger numbers of people to participate in analysis, so it could work for a range of responses. It would likely produce slightly less reliable data than if the volunteers were managed within NOAA, but that is countered by the liability benefits of removing NOAA from volunteer management and by the large participation value of an activity that can absorb many people and provide educational value through data exposure.

When: During a Spill | Who: with UV | What: Data management

How                              Manageability  Min. Cost  Data Value  Min. Liability  Part. Value  Total  Weighted
directly supervised by NOAA            1            1          5             3              4        56.0    65.0
relay to AOs or state agencies         5            4          4             5              5        92.0    95.0


6.0 DISCUSSION AND EXAMPLE SCENARIOS

6.1 SUMMARY OF KEY FINDINGS AND TRADEOFFS

The following subsections identify the choices the decision tool points to when certain criteria are given all the weight. While it is not realistic to use the tool with such extreme weights, doing so demonstrates how prioritizing different criteria can produce different results. It highlights the core tradeoffs involved in selecting components of a citizen science model, while also providing a better sense of how the different criteria interact with the options.
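The ranking behavior under extreme weights can be sketched in a few lines. The criterion scores below are copied from the affiliated-volunteer observation rows of Appendix 2; the function name `rank_by` and the short option labels are our own shorthand, not part of the delivered Excel tool:

```python
# Criterion scores (1-5) from the affiliated-volunteer observation rows
# of the decision framework (Appendix 2).
ROWS = {
    "SCAT survey":            {"manageability": 4, "cost": 5, "data": 5, "liability": 4, "participation": 3},
    "structured observation": {"manageability": 3, "cost": 3, "data": 4, "liability": 3, "participation": 4},
    "open-ended observation": {"manageability": 2, "cost": 4, "data": 2, "liability": 2, "participation": 3},
}

def rank_by(rows, criteria):
    """Rank options by their combined score on the selected criteria only,
    mimicking a scenario that gives those criteria all the weight."""
    return sorted(rows, key=lambda name: -sum(rows[name][c] for c in criteria))

# Scenario 1 lumps manageability and cost together as "feasibility":
print(rank_by(ROWS, ["manageability", "cost"]))  # SCAT survey ranks first
```

Changing the criteria list changes the ordering, which is exactly the tradeoff behavior the scenarios below walk through in prose.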

6.1.A SCENARIO 1: MAXIMUM FEASIBILITY

The first scenario looks at the options with the highest combined scores for manageability and cost, lumped together here as "feasibility." These measures aim to reduce the demands on NOAA's infrastructure, minimize cost, and maximize scalability. In general, these criteria favor options where daily operations are managed outside of NOAA by another organization. Options that introduce fewer administrative tasks also scored well (establishing voluntary partnerships and using the pre-existing SCAT survey protocol). Finally, structured data collection matters here because of the scalability it provides.

< Figure 3 > Maximum Feasibility Scenario


6.1.B SCENARIO 2: MAXIMUM DATA VALUE

This scenario is focused only on the value of the data being produced by the citizen science

program. The two components of data value are reliability and usefulness to response. Both of

these are likely to benefit from NOAA’s direct involvement in the data collection process. The

increased visibility and instruction that NOAA can provide by managing citizen science

programs means that the data is more likely to be reliable. Additionally, formal partnerships and

existing protocols (like the SCAT survey) are a way to ensure that the data being collected will be

useful.

So, while a hands-off approach to coordinating a citizen science program may be the most feasible, it will not provide the strongest data. Any program will likely aim for a comfortable middle ground between the two, but the tradeoff will exist at every level.

< Figure 4 > Maximum Data Value Scenario


6.1.C SCENARIO 3: MINIMAL LIABILITY

Liability looks at participant safety and data collection liability, both of which should be taken

very seriously. Here, the options that scored the highest were those where NOAA had formal

partnerships with affiliated organizations to officially assign responsibility to the AO. When it comes to managing volunteers, however, the recommendations are split – NOAA should be more hands-on with affiliated volunteers that have an official relationship, and less hands-on with

the unaffiliated volunteers. This is because the unaffiliated volunteers have no formal tie to

another organization that can house their liability, so providing direct supervision puts NOAA at

risk. On the other hand, with the affiliated volunteers, NOAA has an established relationship and

so should manage the volunteers more closely to ensure their safety. Note that safety is equally important for all volunteers; the distinction here is who takes on the responsibility of providing it.

< Figure 5 > Minimal Liability Scenario


6.1.D SCENARIO 4: MAXIMUM PARTICIPATION VALUE

This scenario differs from the others in that it focuses on the criterion that primarily provides public value, as opposed to internal value to NOAA. Participation value, as defined in this report, comes from a program's ability to incorporate large numbers of volunteers and the educational value the program can provide to them. Here, the options lean toward direct involvement from NOAA, due to the educational value such an arrangement can provide, and toward structured protocols, which allow large amounts of data to be taken in for analysis. Data capacity matters because the ability to take in more information means that more volunteers can work toward creating that data, and NOAA will be able to communicate back about it more easily.

< Figure 6 > Maximum Participation Value Scenario


6.1.E SCENARIO 5: MAXIMUM OVERALL VALUE (USING 1:1 WEIGHTS)

The final scenario consists of the recommendations provided by the tool if all criteria are equally

weighted (even individual weights) and the NOAA-value-to-public-value relationship (collaborative weights) is given a 1:1 ratio. This is the most realistic of these scenarios, as it takes

into account all of the criteria as well as weighting for the collaborative value that comes from

high participation value and response value.

This scenario takes components from all of the extreme scenarios presented before it. It has the

formal partnerships that reduce liability and promote strong data value. It has the more removed

management options, which provide increased manageability. It also builds on existing protocols

(SCAT) for affiliated volunteers while introducing the highly scalable structured format

observations for unaffiliated volunteers, resulting in high manageability for affiliated volunteers,

and high participation value for unaffiliated volunteers.

< Figure 7 > Maximum Overall Value (even individual weights and 1:1 collaborative weights)


6.2 MOVING FORWARD

The stage has been set for incorporating citizen science into spill response. Public interest in participating in oil spill response is high, improvements in technology have made data more easily communicable and participation more widely available, and citizen science models are becoming more mainstream. The National Response Team has taken note of this and called for some level of volunteer engagement in its document, Use of Volunteers: Guidelines for Oil Spills (2012). NOAA operates in a world where the public wants to participate, and it needs to be prepared to meet this new challenge.

There is a great deal of potential in public engagement during emergency response, much of it currently untapped. But lost opportunity is only one of the risks of failing to engage the public. As recent disasters have highlighted, the public can lose trust in response efforts, leading to the spread of misinformation, social unrest, and poor media portrayal (which now extends to social media). On top of this, waiting for an emergency to develop volunteer management strategies will cost efficiency and require resources that may be better used elsewhere. All of these add risk to an already complex and challenging environment.

Despite these challenges, citizen science is a strong potential channel for NOAA to incorporate

the public in spill response. We have five recommendations that we suggest taking into account

when developing a new program:

● Acknowledge the potential benefits of citizen science

● Define goals clearly and recognize trade-offs

● Use the decision tool to move from concept to operation

● Build a program that meets the baseline requirements

● Start now – Pre-need actions pay off

6.2.A ACKNOWLEDGE THE POTENTIAL BENEFITS OF CITIZEN SCIENCE

Citizen science has benefits for both NOAA and the public, but these benefits must be

recognized in order to be realized. First, citizen science fills a resource gap in emergency

response by providing widespread, fine-grained data that are not as easily collected through

conventional means. Second, citizen science programs offer constructive and meaningful ways

for the public to engage in emergency response. Finally, because of this engagement, citizen

science can help improve scientific literacy and reduce risk perception, helping NOAA

communicate risk more clearly.


6.2.B DEFINE GOALS CLEARLY AND RECOGNIZE TRADEOFFS

After acknowledging the potential benefits of citizen science, NOAA needs to understand its

own priorities. This means clearly defining the goal of the citizen science programs that will be

developed and recognizing that there are inherent tradeoffs to any approach. Understanding the

desired outcomes (e.g. functional, short-term, long-term) will help guide management and

implementation decisions down the road, while defining the intended use of the citizen-generated

data will guide how data quality factors into a program.

6.2.C USE THE DECISION TOOL TO MOVE FROM CONCEPT TO OPERATION

Oil spill response is unpredictable, so a "one-size-fits-all" solution is not realistic. However, the decision tool provided in this paper will help define the types of program and implementation methods that are appropriate. By translating its citizen science goals into priorities,

NOAA can weight the criteria presented in our matrix and identify the paths that may be the best

fit for the situation.

6.2.D BUILD A CITIZEN SCIENCE PROGRAM THAT MEETS THE BASELINE REQUIREMENTS

No matter what activities and methods are decided upon, any citizen science program should

incorporate the three baseline requirements: Co-benefits, a communication and feedback loop,

and a collaborative approach. The degree of participation in citizen science depends on the quality of that participation, and participation will be much richer and more sustainable if these requirements are taken into account.

6.2.E START NOW – PRE-NEED ACTIONS PAY OFF

Our final recommendation is that NOAA begin developing a program sooner rather than later.

First, it is important to develop pre-need relationships. The familiarity and trust that come from long-standing relationships are invaluable to citizen science, but they take time to build.

Operationally, there is tacit knowledge to be gained by investing in long-term relationships,

which will improve efficiency in a response situation.

Outside of partnerships, there are other gains to be had by preempting necessity. The greatest

return on investment in citizen science programs comes from planning and preparedness. The

more established citizen science programs can become, the more they can be consistently

integrated into response, and the more efficient, sustainable, and effective they will be.


ACKNOWLEDGEMENTS

The Citizen Science Management team would like to express our gratitude to those who helped

us in conducting our research and assembling this report. Thanks to:

● Our advisor, Beth Bryant, for guiding us through the project;

● Our client, Doug Helton and the Office of Response and Restoration, for providing us

with the opportunity to work on such an interesting topic;

● The Program on the Environment at the University of Washington, for facilitating the

process; and

● All the interviewees who volunteered their time and provided a great deal of valuable

information for this research.


APPENDICES

APPENDIX 1 – SAMPLE INTERVIEW QUESTIONS

(Existing Citizen Science)

What has been the role of citizen science in recent (environmental) disasters? Were these efforts successful or not? In what ways?

What has changed, and what are some of the new trends in citizen science projects? Do you have any examples?

How would you define “citizen science” in your field of work? How has it been incorporated, if at all? Has that incorporation been effective?

(Engaging Citizens)

What has been your experience working with volunteers or volunteer groups? How has your experience differed between “pre-trained” and “spontaneous” volunteers?

o How do you mobilize, manage, coordinate, and retain volunteers? How do you maintain an ongoing relationship with them?

o How much and what kind of training is needed to use citizen volunteers?

o Overall, were these programs successful or not? In what ways?

Have you engaged in any work in which volunteers gathered information for your work?

o If so, did you own the information? How has the information been used? How have you ensured the reliability of the data?

(Responding to Oil Spill Incidents)

What has been your experience working with NOAA, in particular with Scientific Support Coordinators (SSCs)? What types of information are most relevant to your work?

What are the primary environmental, safety, and human health risks associated with volunteer activities in support of an emergency response?

What do you think are the aspects of a successful citizen science program in terms of emergency response? What are the underlying factors that can enable this success? What are the constraining factors? (These can include planning, infrastructure, technology, coordination, pre-need relationship building, training, costs, health risks, and legal issues.)


APPENDIX 2 – DECISION FRAMEWORK102

102 The 1-5 scores are meant to illustrate comparative differences among options (How) and serve as proxies for the more in-depth qualitative analysis in the report. The fully functional Microsoft Excel spreadsheet has been provided separately to NOAA.

Citizen Science Programmatic Decisions

How | Manageability | Minimal Cost | Data Value | Minimal Liability | Participation Value | Total Score | Weighted
voluntary partnership | 4 | 5 | 4 | 2 | 4 | 76.0 | 77.5
formal agreement | 3 | 3 | 5 | 5 | 4 | 80.0 | 80.0
supervised by NOAA | 2 | 3 | 5 | 3 | 4 | 68.0 | 72.5
managed by AOs | 5 | 5 | 4 | 5 | 4 | 92.0 | 87.5
managed by NOAA | 1 | 1 | 4 | 3 | - | 36.0 | 45.0
relayed to AOs or State agencies | 5 | 5 | 5 | 5 | - | 80.0 | 87.5

Programmatic decision rows span: Pre-Spill — Volunteer registration (unstructured / structured), Volunteer coordination, Set up NOAA protocol, Define relationship; During a Spill — with AV, with UV, or with both AV & UV. "NOAA alone" is a placeholder to indicate the time of decision; refer to the Citizen Science Model Decisions for scoring.

Citizen Science Model Decisions

How | Manageability | Minimal Cost | Data Value | Minimal Liability | Participation Value | Total Score | Weighted
supervised by NOAA | 2 | 2 | 5 | 3 | 5 | 68.0 | 80.0
conducted by AVs, then data sharing | 5 | 4 | 4 | 4 | 4 | 84.0 | 82.5
open-ended observation | 2 | 4 | 2 | 2 | 3 | 52.0 | 55.0
structured observation | 3 | 3 | 4 | 3 | 4 | 68.0 | 72.5
SCAT survey | 4 | 5 | 5 | 4 | 3 | 84.0 | 75.0
data entry & sorting | 5 | 2 | 2 | 4 | 4 | 68.0 | 72.5
preliminary data validation | 5 | 2 | 3 | 4 | 3 | 68.0 | 65.0
advanced validation | 4 | 3 | 4 | 3 | 3 | 68.0 | 65.0
preliminary synthesis | 3 | 4 | 4 | 2 | 3 | 64.0 | 62.5
open-ended observation | 1 | 2 | 2 | 5 | 3 | 52.0 | 55.0
structured observation | 3 | 3 | 4 | 3 | 4 | 68.0 | 72.5
SCAT survey | 1 | 1 | 4 | 2 | 3 | 44.0 | 50.0
directly supervised by NOAA | 1 | 1 | 5 | 3 | 4 | 56.0 | 65.0
relay to AOs or state agencies | 5 | 4 | 4 | 5 | 5 | 92.0 | 95.0

Model decision rows span: Pre-Spill — Baseline study (e.g., geographic, shoreline assessment, various monitoring) with AV; During a Spill — Observation / Field survey (e.g., geographic, shoreline, wind/weather/water monitoring) with AV or with UV, and Data management.

Weights: Manageability 1, Cost 1, Data Value 1, Liability 1, Participation 1 (individual weights); collaborative weight 1.


APPENDIX 3 – REFERENCES

LITERATURE

Bonney, R., C. B. Cooper, J. Dickinson, S. Kelling, T. Phillips, K. V. Rosenberg, and J. Shirk. 2009. Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy. BioScience 59(11): 977-984.

Bonney, R., J. L. Shirk, T. B. Phillips, A. Wiggins, H. L. Ballard, A. J. Miller-Rushing, and J. K. Parrish. 2014. Next Steps for Citizen Science. Science 343: 1436-1437.

Bowser, A., A. Wiggins, L. Shanley, J. Preece, and S. Henderson. 2014. Sharing Data While Protecting Privacy in Citizen Science. Interactions Jan/Feb 2014: 70-73.

Bowser, A., and L. Shanley. 2013. New Visions in Citizen Science. Commons Lab, The Woodrow Wilson Center: Washington, DC.

Coastal Observation And Seabird Survey Team (COASST). 2015. Final Report: Preparing COASST Post Spill. Submitted to WA Department of Fish & Wildlife. Contract No. 12-1938.

Cohn, J. P. 2008. Citizen Science: Can Volunteers Do Real Research? BioScience 58(3): 192-197.

Coonrod, L. 2012. Volunteers, Citizen Science, and Interpretation. Legacy Jul/Aug 2012: 34-35.

Enders, A., and Z. Brandt. 2007. Using Geographic Information System Technology to Improve Emergency Management and Disaster Response for People with Disabilities. Journal of Disability Policy Studies 17(4): 223-229.

Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS) Factsheet. EPA.

Gommerman, L., and M. C. Monroe. 2012. Lessons Learned from Evaluations of Citizen Science Programs. University of Florida IFAS Extension EDIS Publications. Web. http://edis.ifas.ufl.edu/fr359

Hines, M., A. Benson, D. Govoni, D. Masaki, B. Poore, A. Simpson, and S. Tessler. 2012. Partnering for Science: Proceedings of the U.S. Geological Survey Workshop on Citizen Science. Open-File Report 2013-1234.

Ibrahim, N. H., and D. Allen. Information Sharing and Trust during Major Incidents: Findings from the Oil Industry. Journal of the American Society for Information Science and Technology: 1916-1928.


Klenow, D. J., and J. L. Reibestein. 2014. Eyes to the Sky: Situating the Role of Storm Spotters in the Warning and Response Network. Homeland Security & Emergency Management 11(3): 437-458.

McCormick, S. 2012. After the Cap: Risk Assessment, Citizen Science, and Disaster Recovery. Ecology and Society 17(4): 31.

Murchison, S. B. 2010. Uses of GIS for Homeland Security and Emergency Management for Higher Education Institutions. New Directions for Institutional Research 146: 75-86.

National Oil and Hazardous Substances Pollution Contingency Plan. 1994. Code of Federal Regulations. Title 40, Part 300.

Ridge, M. 2013. From Tagging to Theorizing: Deepening Engagement with Cultural Heritage through Crowdsourcing. Curator 56(4): 435-450.

Robson, E. S. Responding to Liability: Evaluating and Reducing Tort Liability for Digital Volunteers. Policy Series v. 1. Commons Lab, The Woodrow Wilson Center: Washington, DC.

Smith, B. 2014. Agency Liability Stemming from Citizen-Generated Data (Working Paper). Policy Memo Series v. 3. Commons Lab, The Woodrow Wilson Center: Washington, DC.

Starbird, K., D. Dailey, A. H. Walker, T. M. Leschine, R. Pavia, and A. Bostrom. 2014. Social Media, Public Participation, and the 2010 BP Deepwater Horizon Oil Spill. Human and Ecological Risk Assessment: An International Journal. Accepted for publication.

Theobald, E. J., A. K. Ettinger, H. K. Burgess, L. B. DeBey, N. R. Schmidt, H. E. Froehlich, C. Wagner, J. HilleRisLambers, J. Tewksbury, M. A. Harsch, and J. K. Parrish. 2015. Global Change and Local Solutions: Tapping the Unrealized Potential of Citizen Science for Biodiversity Research. Biological Conservation 181: 236-244.

Walker, A. H., R. Pavia, A. Bostrom, T. M. Leschine, and K. Starbird. 2014. Communication Practices for Oil Spills: Stakeholder Engagement during Preparedness and Response. Human and Ecological Risk Assessment: An International Journal. Accepted for publication.

Wiggins, A., G. Newman, R. D. Stevenson, and K. Crowston. 2011. Mechanisms for Data Quality Validation in Citizen Science. Computing for Citizen Science Workshop at the IEEE eScience Conference, Stockholm, Sweden.

Wiggins, A., R. Bonney, E. Graham, S. Henderson, S. Kelling, G. LeBuhn, R. Littauer, K. Lotts, W. Michener, G. Newman, E. Russell, R. Stevenson, and J. Weltzin. 2013. Data Management Guide for Public Participation in Scientific Research. DataONE Public Participation in Scientific Research Working Group. DataONE: Albuquerque, NM.


Wilson Center. 2014. Barriers and Accelerators to Crowdsourcing and Citizen Science in Federal Agencies: An Exploratory Study. Draft Summary for Discussion. Commons Lab, The Woodrow Wilson Center: Washington, DC. Web. http://wilsoncommonslab.org/2014/09/07/an-exploratory-study-on-barriers

Young, J. C., D. J. Wald, P. S. Earle, and L. A. Shanley. 2013. Transforming Earthquake Detection and Science Through Citizen Seismology. Commons Lab, The Woodrow Wilson Center: Washington, DC.

BLOG POSTS

Branch, M. 2013. Citizen Scientists Study Oil Spill Response. Orcas Issues News & Views. http://orcasissues.com/citizen-scientists-study-oil-spill-response

Brown, M. 2013. Gulf Coast Research Lab Seeks ‘Citizen Scientists’ to Assist with Oil Spill Research. The University of Southern Mississippi Blog. http://www.usm.edu/news/article/gulf-coast-research-lab-seeks-‘citizen-scientists’-assist-oil-spill-research

Gustetic, J., L. Shanley, J. Benforado, and A. Miller. 2014. Designing a Citizen Science and Crowdsourcing Toolkit for the Federal Government. Open Government Initiative, White House Blog. https://www.whitehouse.gov/blog/2014/12/02/designing-citizen-science-and-crowdsourcing-toolkit-federal-government

Koerth-Baker, M. 2011. Citizen Science in the Gulf of Mexico. Boing Boing. http://boingboing.net/2011/04/22/citizen-science-in-t.html

Larson, S. 2014. Oh, Snapchat: Your Smartphone Just Became a Climate Scientist. Grist. http://grist.org/business-technology/oh-snapchat-your-smartphone-just-became-a-climate-scientist/

McCormick, S. Citizen Science More Than a Century Later: Ordinary People Go Online to Track Gulf Oil Spill. George Washington University. http://publichealth.gwu.edu/content/citizen-science-more-century-later-ordinary-people-go-online-track-gulf-oil-spill

Moore, S. 2014. FEMA Will ‘Crowd-Source’ Future Hurricanes. Beaumont Enterprise. http://www.beaumontenterprise.com/news/article/FEMA-will-crowd-source-future-hurricanes-5493151.php

Richardson, P. 2012. Citizen Science 4: Where Next? JISC Regional Support Centres Blog. http://jiscrsc.jiscinvolve.org/wp/2012/12/citizen-science-4-where-next

Scientific American. Gulf Oil Spill Tracker. http://www.scientificamerican.com/citizen-science/gulf-oil-spill-tracker


Toomey, D. How Rise of Citizen Science Is Democratizing Research: Interview with Caren Cooper of the Cornell Lab of Ornithology. Yale Environment 360. http://e360.yale.edu/feature/interview_caren_cooper_how_rise_of_citizen_science_is_democratizing_research/2733/

Woods Hole Oceanographic Institution. 2014. Scientists Train the Next Generation on Oil Spill Research. http://www.whoi.edu/news-release/Gulf-Oil-Observers

ONLINE RESOURCES

California Department of Fish & Wildlife, Office of Spill Prevention and Response (OSPR). Cal Spill Watch. https://calspillwatch.dfg.ca.gov/

Cornell Lab of Ornithology. Citizen Science Central. http://www.birds.cornell.edu/citscitoolkit/

CrisisCommons. CrisisCongress. http://wiki.crisiscommons.eu/wiki/CrisisCongress

Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS), EPA. http://www2.epa.gov/innovation/federal-community-practice-crowdsourcing-and-citizen-science

National Oceanic and Atmospheric Administration (NOAA), Emergency Response Division. 2010. An FOSC’s Guide to NOAA Scientific Support. 2nd ed. http://response.restoration.noaa.gov/sites/default/files/FOSC_Guide.pdf

National Oceanic and Atmospheric Administration (NOAA), Emergency Response Division. 2013. Shoreline Assessment Manual. 4th ed. http://response.restoration.noaa.gov/sites/default/files/manual_shore_assess_aug2013.pdf

National Response Team (NRT). 2012. Use of Volunteers Guidelines for Oil Spills. http://www.nrt.org/production/NRT/NRTWeb.nsf/AllAttachmentsByTitle/SA-1080NRT_Use_of_Volunteers_Guidelines_for_Oil_Spills_FINAL_signatures_inserted_Version_28-Sept-2012.pdf/

National Wildlife Federation. Oil Spill Volunteer Opportunities. https://www.nwf.org/What-We-Do/Protect-Habitat/Gulf-Restoration/Oil-Spill/Surveillance-Network.aspx

University of California Curation Center. California Digital Library Data Management Planning Tool. https://dmp.cdlib.org


Washington Department of Ecology. OilSpills101.wa.gov (Oil Spill Volunteers). http://www.oilspills101.wa.gov/go/doc/5779/1793639/Oil-Spill-Volunteers

White House. 2013. Second Open Government National Action Plan for the United States of America. http://www.whitehouse.gov/sites/default/files/docs/us_national_action_plan_6p.pdf


APPENDIX 4 – INTERVIEWEES

NOAA PRACTITIONERS (SCIENTIFIC SUPPORT COORDINATORS)

John Tarpley, Emergency Response Division

Ruth Yender, Emergency Response Division

NOAA DATA MANAGERS

Amy Merten, Spatial Data Branch, Assessment and Restoration Division

Peter Murphy, Alaska Region, Marine Debris Program

Sherry Lippiatt, California Region, Marine Debris Program

CITIZEN SCIENCE SPECIALISTS

Julia Parrish, Coastal Observation And Seabird Survey Team (COASST)

Kate Litle, Washington Sea Grant

VOLUNTEER COORDINATORS

Chrys Bertolotto, WSU Snohomish County Extension Beach Watchers

Barbara Bennett, WSU Island County Extension Beach Watchers

Randy Imai, California Office of Spill Prevention and Response (OSPR)

Kathleen Jennings, California Office of Spill Prevention and Response (OSPR)

ADDITIONAL NOAA CONTACTS

Doug Helton, Emergency Response Division

Alan Mearns, Emergency Response Division

Jordan Stout, Emergency Response Division

Ashley Braun, Office of Response and Restoration