
EM Feature

Traditionally, air quality analysis was a slow, deliberate investigative process occurring months or years after the monitoring data had been collected. Satellites, real-time pollution detection, and the World Wide Web have changed all that. Analysts can now observe air pollution events as they unfold. They can "congregate" through the Internet in ad hoc virtual workgroups to share their observations and collectively create the insights needed to elucidate the observed phenomena. As a result, air quality analysis has become much more agile and responsive to the needs of air quality managers, the public, and the scientific community.

The era of near-real-time air quality analysis began in the late 1990s with the availability of real-time satellite data over the Internet. High-resolution color satellite images were uniquely suited for early detection and tracking of extreme natural or anthropogenic aerosol events. In April 1998, for example, a group of analysts keenly followed and documented on the Web, in real time, the transcontinental transport and impact of Asian dust from the Gobi desert on the air quality over the western United States.1 Soon after, in May 1998, another well-documented incursion of Central American forest fire smoke caused record fine particulate matter (PM2.5) concentrations over much of the eastern United States.2

The high value of providing qualitative real-time air quality information to the public has been well demonstrated through the U.S. Environmental Protection Agency's (EPA) successful AIRNow program. However, during extreme events like the above-mentioned dust and smoke episodes, air quality managers need more extensive "just-in-time analysis." In the 1998 Asian dust event, for example, local air quality managers in Oregon and Washington used real-time analysis to issue health advisories. And soon after the Central American smoke event, EPA granted some states "exceptional event" exemptions from ozone (O3) standard violations. These responsive air quality management actions were largely facilitated by the agile event analyses provided by the ad hoc community of scientists and managers collaborating via the Internet.

In more recent years, air quality management has also changed. The old "command-and-control" style is giving way to a more participatory approach that includes all the key stakeholders from multiple jurisdictions and the application of scientific and "weight-of-evidence" approaches. Air quality regulations now emphasize short-term monitoring, while at the same time, long-term air quality goals are set to glide toward natural background levels over the coming decades. In response to these and other developments, EPA has undertaken a major redesign of the monitoring system that provides the data input for air quality management. The new National Ambient Air Monitoring Strategy (NAAMS), through its multitier integrated monitoring system, is geared to providing more relevant and timely data for these complex management needs. All these changes in management style place a considerable burden on the information system that supports them.

Fortunately, both air quality monitoring and data dissemination technologies have also advanced considerably since the 1990s. Recent developments offer outstanding opportunities to fulfill the information needs of the new agile air quality management approach. The data from surface-based air pollution monitoring networks now routinely provide high-grade spatio-temporal and chemical patterns throughout the United States for PM2.5 and O3. Satellite sensors with global coverage and kilometer-scale spatial resolution now provide real-time snapshots that depict the pattern of haze, smoke, and dust in stunning detail. The "terabytes" of data from these surface and remote sensors can now be stored, processed, and delivered in near real time.

Rudolf B. Husar is a professor of mechanical engineering and director of the Center for Air Pollution Impact and Trend Analysis at Washington University, St. Louis, MO. Richard L. Poirot is an air quality analyst with the State of Vermont Department of Natural Resources. E-mail: [email protected].




The instantaneous "horizontal" diffusion of information via the Internet now permits the delivery of the right information to the right people at the right place and time. Standardized computer-to-computer communication languages and service-oriented architectures now facilitate the flexible processing of raw data into high-grade "actionable" knowledge. Last but not least, the Internet has opened the way to generous sharing of data and tools, leading to faster knowledge creation through collaborative analysis and virtual workgroups. Nevertheless, air quality analysts face significant hurdles.

The new developments have introduced a new set of problems. The "data deluge" problem is especially acute for analysts interested in aerosol pollution, since aerosols are so inherently complex and there are so many different kinds of relevant data: from extensive new surface-based monitoring networks to meteorological and aerosol forecast models to satellite imagery and associated data products.

In this article, we describe DataFed, an infrastructure for real-time integration and Web-based delivery of distributed monitoring data, and FASTNET, a recent application project built on the DataFed infrastructure, which uses real-time and historical monitoring data for the collaborative study of major aerosol events.

DATAFED: DATA FEDERATION
The federated data system, DataFed (http://datafed.net), aims to support air quality management and science by more effective use of relevant data. Building on the emerging pattern of the Internet itself, DataFed assumes that datasets and new data processing services will continue to emerge spontaneously and autonomously on the Internet, as shown schematically in Figure 1. Example data providers include the AIRNow project, modeling centers, and the NASA Distributed Active Archive Centers. DataFed is not a centrally planned and maintained data system, but rather a facility to harness these emerging resources through powerful dynamic data integration technologies and a collaborative federation philosophy.

The key roles of the federation infrastructure are to (1) facilitate registration of the distributed data in a user-accessible catalog; (2) ensure data interoperability based on the physical dimensions of space and time; and (3) provide a set of basic tools for data exploration and analysis. The federated datasets can be queried by simply specifying a latitude–longitude window for spatial views, a time range for time views, and so on. This universal access is accomplished by "wrapping" the heterogeneous data, a process that turns data access into a standardized Internet service, callable through well-defined Internet protocols.
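As a rough sketch of what such "wrapping" might look like in code (this is not DataFed's actual interface), the hypothetical Python classes below turn a provider-specific record feed into a dataset that answers the same space-time query as every other federated source; all class and field names are invented for illustration.

from dataclasses import dataclass
from datetime import datetime
from typing import List, Protocol


@dataclass
class Observation:
    """A single space- and time-referenced measurement."""
    lat: float
    lon: float
    time: datetime
    parameter: str   # e.g., "PM2.5" or "O3"
    value: float
    units: str


class FederatedDataset(Protocol):
    """The uniform query every wrapped dataset is expected to answer."""
    def query(self, bbox: tuple, t_start: datetime, t_end: datetime,
              parameter: str) -> List[Observation]: ...


class CsvNetworkWrapper:
    """Wraps a provider-specific record feed (hypothetical field names) so it
    answers the same space-time query as every other federated dataset."""

    def __init__(self, rows: List[dict]):
        self.rows = rows   # raw records in the provider's native schema

    def query(self, bbox, t_start, t_end, parameter):
        lon_min, lat_min, lon_max, lat_max = bbox
        selected = []
        for row in self.rows:
            t = datetime.fromisoformat(row["obs_time"])
            if (lat_min <= float(row["site_lat"]) <= lat_max
                    and lon_min <= float(row["site_lon"]) <= lon_max
                    and t_start <= t <= t_end
                    and row["param"] == parameter):
                selected.append(Observation(float(row["site_lat"]),
                                            float(row["site_lon"]),
                                            t, parameter,
                                            float(row["val"]), row["units"]))
        return selected

Once every dataset answers the same query, catalogs, browsers, and overlay tools can treat satellite, surface, and model data interchangeably.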

The result of this wrapping process is an array of homogeneous, virtual datasets that can be queried by spatial and temporal attributes and processed into higher-grade data products. The service-oriented architecture of DataFed is used to build Web applications by connecting the Web service components (e.g., services for data access, transformation, fusion, and rendering) in a Lego-like assembly. The generic Web-based tools created in this fashion include catalogs for data discovery, browsers for spatial-temporal exploration, multiview consoles, animators, and multilayer overlays (see Figure 2).
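To picture the Lego-like assembly, the sketch below (continuing the hypothetical wrapper example above) treats each Web service component as a small function that consumes and returns plain data structures, so that access, aggregation, and rendering steps can be chained into a view; the function names and the daily-averaging transformation are illustrative, not DataFed's actual services.

from collections import defaultdict
from statistics import mean

# Assumes observation objects with .lat, .lon, .time, and .value attributes,
# such as the hypothetical Observation records in the previous sketch.

def access_service(dataset, bbox, t_start, t_end, parameter):
    """Data-access component: delegate the space-time query to a wrapped dataset."""
    return dataset.query(bbox, t_start, t_end, parameter)

def daily_average_service(observations):
    """Transformation component: aggregate hourly values to daily means per site."""
    buckets = defaultdict(list)
    for ob in observations:
        buckets[(ob.lat, ob.lon, ob.time.date())].append(ob.value)
    return {key: mean(values) for key, values in buckets.items()}

def render_table_service(daily_means):
    """Rendering component: produce a simple text view of the aggregated data."""
    lines = ["lat, lon, date, daily_mean"]
    for (lat, lon, day), value in sorted(daily_means.items()):
        lines.append(f"{lat:.2f}, {lon:.2f}, {day}, {value:.1f}")
    return "\n".join(lines)

def pm25_daily_view(dataset, bbox, t_start, t_end):
    """A "view" is just a pipeline of such components wired together."""
    observations = access_service(dataset, bbox, t_start, t_end, "PM2.5")
    return render_table_service(daily_average_service(observations))

Swapping the rendering step for a map or overlay component would yield a different view of the same data without touching the access or aggregation code.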

A good illustration of the federated approach is the real-time AIRNow dataset described in a companion paper in this issue of EM (see Weyland and Dye on page 19). The AIRNow data are collected from the states, aggregated by EPA, and used for informing the public through the AIRNow Web site (www.epa.gov/airnow). In addition, the hourly real-time O3 and PM2.5 data are made accessible to DataFed, where they are translated into a uniform format. Through the DataFed Web-based interface, any user with access to the Internet can display the AIRNow data as time series and spatial maps, perform spatial-temporal filtering and aggregation, generate spatial and temporal overlays with other data layers, and incorporate these user-generated data views into their own Web pages. As of early 2005, more than 100 distributed air quality-relevant datasets had been "wrapped" into the federated virtual database. Using this technology, approximately one dozen satellite and surface datasets are delivered within a day of the observations, and two model outputs provide PM forecasts.
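From the user's side, querying by space and time can be as simple as composing a parameterized request. The snippet below is only a schematic of that idea, with a made-up endpoint and parameter names; it does not reproduce DataFed's or AIRNow's actual query syntax.

from urllib.parse import urlencode

# Hypothetical endpoint and parameter names; only the idea of requesting a
# space-time slice of a named, federated dataset is illustrated here.
BASE_URL = "http://example.org/datafed/query"

def build_query(dataset, parameter, bbox, t_start, t_end, fmt="csv"):
    """Compose a space-time query for a federated dataset as a URL string."""
    params = {
        "dataset": dataset,                       # e.g., "AIRNOW"
        "parameter": parameter,                   # e.g., "PM2.5"
        "bbox": ",".join(str(v) for v in bbox),   # lon_min,lat_min,lon_max,lat_max
        "time_start": t_start,                    # ISO 8601 time strings
        "time_end": t_end,
        "format": fmt,
    }
    return BASE_URL + "?" + urlencode(params)

# Example: hourly PM2.5 over the eastern United States for a single day.
url = build_query("AIRNOW", "PM2.5",
                  bbox=(-95.0, 30.0, -70.0, 45.0),
                  t_start="2003-04-12T00:00:00",
                  t_end="2003-04-12T23:00:00")
print(url)   # urllib.request.urlopen(url) would then fetch the data slice

The same bounding box and time range, pointed at a different dataset name, would return a satellite or model slice in the same uniform format.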

Figure 1. Schematic of air quality information system: providers, users, and data federation infrastructure.

Figure 2. Main software programs of DataFed–FASTNET: data catalog for finding and selecting data, viewer for data exploration, and an array of consoles for data analysis and presentation.

THE FASTNET PROJECT
FASTNET (Fast Aerosol Sensing and Tools for Natural Event Tracking)3 is a data acquisition and analysis facility for improving the efficiency of air quality analysis, with particular emphasis on detailed real-time and post-analysis of major aerosol events. Natural aerosol events from forest fire smoke and windblown dust are particularly interesting, due to their large emission rates over short periods of time, continental and global-scale impacts, and unpredictable, sporadic occurrence.

For the FASTNET project, 14 specific datasets are highlighted that include various surface-based aerosol data (e.g., EPA fine mass and speciated aerosol composition from the EPA and IMPROVE networks), hourly surface meteorology and visibility data, aerosol forecast model results, and various satellite data and images. Many of these datasets are available in near real time, while others (e.g., the IMPROVE filter-based aerosol chemistry data and associated back-trajectories) are available with a time lag of approximately one year. Analysts access the desired data through the DataFed Data Catalog (see Figure 2). The selected data are automatically loaded into a Web-based data browser designed for easy exploration of spatio-temporal patterns. Semantic data homogenization assures that all datasets can be properly overlaid in space and time views.
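One concrete piece of such space-time homogenization is putting point measurements and gridded satellite fields onto a common grid. The function below is a simplified, hypothetical stand-in for that step, written with NumPy; the coordinates and concentration values in the example are illustrative only.

import numpy as np

def grid_observations(lats, lons, values, lat_edges, lon_edges):
    """Average point observations onto a regular latitude-longitude grid so that
    surface-network data can be compared, cell by cell, with gridded satellite
    fields. A simplified stand-in for the homogenization step described above."""
    grid_sum = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))
    grid_cnt = np.zeros_like(grid_sum)
    lat_idx = np.digitize(lats, lat_edges) - 1
    lon_idx = np.digitize(lons, lon_edges) - 1
    for i, j, v in zip(lat_idx, lon_idx, values):
        if 0 <= i < grid_sum.shape[0] and 0 <= j < grid_sum.shape[1]:
            grid_sum[i, j] += v
            grid_cnt[i, j] += 1
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(grid_cnt > 0, grid_sum / grid_cnt, np.nan)

# Example: a few illustrative organic-carbon values (ug/m3) binned to a
# 1-degree grid that could match a gridded aerosol optical thickness product.
lat_edges = np.arange(30.0, 51.0, 1.0)
lon_edges = np.arange(-105.0, -84.0, 1.0)
surface_grid = grid_observations(
    lats=np.array([39.1, 37.7, 38.5]),
    lons=np.array([-96.6, -97.4, -94.9]),
    values=np.array([8.2, 11.5, 6.9]),
    lat_edges=lat_edges, lon_edges=lon_edges)

Time alignment (e.g., matching hourly surface data to a daily satellite overpass) would be handled in the same spirit, by averaging onto a shared set of time bins.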

For illustration of the FASTNET analysis concept, tools, and application methods, consider the April 2003 Kansas smoke event. For several days every spring, the rangeland grass in the Kansas–Oklahoma region is burned, resulting in major smoke plumes that cover multistate areas of the Midwest. Figure 3 shows the location of fires over Kansas derived from the MODIS satellite sensor, the noon aerosol optical thickness derived from the SeaWiFS satellite sensor, and the spatial pattern of organic fine particle mass concentration derived from speciated aerosol data collected through several surface monitoring networks. The spatio-temporal sparseness of routine surface monitoring networks prevents the full characterization of aerosol events, but complementary satellite observations can fill in many of the missing pieces. Auxiliary data from the 1200-station NOAA surface visibility network (not shown) are particularly useful for the study of fine-scale aerosol patterns at the surface. During the event and in the days and years that followed, the data from these real-time monitoring systems were analyzed by several groups using FASTNET tools and methods. Based on these exploratory analyses, EPA is considering exceedance waivers for remote sites that were impacted by the agricultural smoke. Local air quality managers are also using the dynamic analysis tools to evaluate alternative grass burning schedules and approaches. Collectively, these types of analyses are used in the new, more flexible weight-of-evidence approach to air quality compliance analysis.

Figure 3. Multi-sensory characterization of agricultural smoke over the Midwest on April 12, 2003: (a) fire pixels over Kansas from the MODIS sensor; (b) vertical aerosol optical thickness derived from the SeaWiFS satellite; and (c) concentration pattern of organics from speciated chemical samplers. (Source: EPA)

SUMMARY
Recent developments in surface and satellite sensing, along with new information technologies, now allow real-time data analysis for the characterization and partial explanation of major air pollution events. By making many spatio-temporal data sources available through a single Web-based interface and in a consistent format, the DataFed tools allow anyone, anywhere to view, process, overlay, and display many types of data to gain insight into atmospheric physical and chemical processes. A key goal of the current effort is to encourage the use of these tools by a broad community of air pollution researchers and analysts, so that a growing group of analysts may soon enhance the rate at which our collective knowledge of air pollution evolves. In recent years, such agile analyses have provided occasional real-time support to air quality managers and to the public, but much more could be done. The current challenge is to incorporate such support into the air quality management process in a more regular and robust way.

ACKNOWLEDGMENTS
DataFed is a community-supported effort. While the Web-based data integration infrastructure was initially supported by specific information technology grants from the National Science Foundation (NSF, under grant number 0113868) and through a NASA REASoN grant, the data resources are contributed by the autonomous providers. The application of the federated data and tools is in the hands of users as part of specific projects. Just as data quality improves by passing through many hands, the analysis tools will also improve with use and feedback from data analysts. A partial list of projects is available at http://datafed.net/projects. The FASTNET project was supported by the Regional Planning Organizations (RPOs) for regional haze management. At this time, the DataFed-FASTNET user community is small, but substantial efforts are under way to encourage and facilitate broader participation through larger organizations, such as the Earth Science Information Partners (ESIP) Federation (whose main member agencies include NASA, NOAA, and EPA) and the RPOs.

REFERENCES
1. Husar, R.B. et al. The Asian Dust Events of April 1998; J. Geophys. Res. Atmos. 2001, 106, 18317-18330. See also The Asian Dust Events of April 1998 at http://capita.wustl.edu/Asia-FarEast/.
2. Peppler, R.A. et al. ARM Southern Great Plains Site Observations for the Smoke Pall Associated with the 1998 Central American Fires; Bull. Am. Meteor. Soc. 2000, 81, 2563-2591. See also A Resource Catalog and Discussion Forum on Smoke from Central American Fires at http://capita.wustl.edu/Central-America/.
3. Poirot, R.L. et al. Aerosol and Haze Observations during 2004 with the FASTNET Distributed Monitoring System. Presented at the A&WMA Specialty Conference on Regional and Global Perspectives on Haze: Causes, Consequences, and Controversies, Asheville, NC, 2004; Paper # 93.


Copyright 2005 Air & Waste Management Association