Application Integration and Better Remediation Decision-Making


WEAVER, SCOT D., Vice-President, EarthSoft, Inc., 10820 South 800 East, Paradise, Utah, 84328, USA. Email: [email protected]

GRAY, ARNOLD L., EarthSoft, Inc., 118 Bentwood Drive, Cherry Hill, New Jersey, 08034, USA. Email: [email protected]

Abstract

Electronic management of environmental data continues to become increasingly standard. An integrated data management system provides much more than the ability to archive data electronically. Companies and government agencies worldwide are reaping the benefits of managing data in a relational database integrated with improved analytical tools and enhanced visualization applications. Benefits include the ability to manage data more quickly and accurately, which, in turn, contributes to better decision-making and response. ESRI’s ArcGIS and EarthSoft’s EQuIS for ArcGIS represent tools in the GIS realm that support scientists in these efforts and are an excellent example of software integration that leverages the value inherent in the data. Case studies illustrate key benefits gained and efficiencies realized through application integration.

Introduction

For as long as site characterization data have been collected, the management and maintenance of those data have been challenging issues. In recent years, as laboratory analysis and field investigation techniques have evolved, the proliferation of such data has exacerbated the management problem. One data manager, formerly a regulator with the New Jersey Department of Environmental Protection (NJDEP), stated “Some project datasets filled thirty or more boxes.…When the case was closed, as there was no practical way to archive the data, it was all thrown away. Without the ability to manage data electronically, the quality of work suffered and hundreds of millions of data were lost forever.” (Gray, 2003).

Electronic Data Management

While it is not difficult to find a data “warehouse” in the historic sense (Figure 1), the state-of-the-art and, arguably, the standard method of operation today is to implement an electronic data warehouse. An electronic data warehouse serves to assimilate, protect, analyze, and share data from the many disparate sources that may be involved in an environmental or subsurface investigation. Electronic data management for some facets of investigation, such as cone penetrometer tests and laboratory analyses, is not new (even though some analytical laboratories—albeit a decreasing number—still use archaic, outdated systems requiring manual notation or retyping of data). The first step towards application integration is the implementation of an electronic data management system.

Figure 1. A traditional data “warehouse”.

Why Application Integration?

The Colorado Department of Public Health and Environment’s Hazardous Materials and Waste Management Division has developed a Site Analysis Management System (SAMS) “to integrate environmental monitoring data into a standard format, with seamless interfaces to other applications for data evaluation”. Built upon EarthSoft’s Environmental Quality Information Systems (EQuIS) as the data warehouse and using analysis software from ESRI, RockWare, Golden Software, and EMS-I, SAMS was designed “to enable a more comprehensive and effective evaluation of environmental impacts, appropriate remediation methods, effects of remediation, and compliance” (CDPHE, 2003).

On March 23, 2001, New Jersey’s Private Well Testing Act (PWTA) was signed into law. The Act requires that parties in certain real estate transactions in the state of New Jersey involving private wells and some public wells test the water supply for specific parameters, and that the laboratory submit the testing results to the NJDEP in an electronic format. The electronic data deliverable format is enforced to “ensure that laboratories send in test results that comply with the requirements of the law and regulations” so that “accurate data can be entered into our data management system that will be accessible through the NJDEP's GIS System for data sharing and evaluation.” Ultimately, the NJDEP expects that this Act and the electronic data management system implemented will help regulators “make more informed decisions about regional and statewide water quality issues, respond more accurately to questions, and improve [the] ability to review … data more quickly and accurately” (NJDEP, 2003).

From these examples it is clear that the proper integration of data management with analytical and visualization applications can result in a turnkey solution that increases efficiency, reduces cost, and facilitates better decision-making.

Application Integration through GIS

Geographic information system (GIS) technology integrates common database operations such as querying and statistical analysis with the unique visualization and geographic analysis benefits offered by maps. These abilities distinguish GIS from other information systems and make it valuable to a wide range of public and private enterprises for documenting events, predicting outcomes, and planning strategies.

EQuIS for ArcGIS, an extension for the ArcView, ArcEditor, and ArcInfo 8.x desktop applications, allows users to query, report, and map subsurface data. EQuIS for ArcGIS is ideal for displaying and effectively communicating project information and supports the ArcMap Style gallery by providing a customizable style gallery that is used when creating any of its automated map features, such as sample locations, color ramps, and scale bars. The EQuIS for ArcGIS extension integrates many leading environmental software applications for specialized tasks. For example, contours created using EQuIS for ArcGIS and Surfer may be automatically rendered in the GIS, where they can be used in further analyses with output from other non-GIS applications, such as RockWorks.

Analytical data may be queried and presented in several ways, including a crosstab summary format. Using the Chemical Layer Builder, the selection is defined for a specific medium (e.g. water, soil, or soil gas). When dealing with a single event (or sample points sampled only once), the data may be shown for each sample location using a symbol indicating relative concentration, e.g. red concentric circles with white outlines at the midpoint of each sample. Using the standard ArcMap info tool, all available data (or the desired subset) for individual sample points can be browsed. With EQuIS EZView integrated into the GIS, reporting and time-series plots are offered for a variety of scenarios, such as plotting concentration vs. time. Integration with both LogPlot and gINT software allows for the display of monitoring well completion diagrams and creation of boring logs on the fly, all from within the GIS.
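As a minimal sketch of the crosstab idea described above, the following Python example filters analytical results by medium and then pivots locations against analytes. The table layout (a results table with location, analyte, matrix, value, and unit columns) is a simplified, hypothetical stand-in for illustration; it is not the EQuIS schema or the Chemical Layer Builder query itself.

    # Sketch: crosstab of analytical results by location and analyte for one medium.
    # The schema below is an illustrative stand-in, not the EQuIS data model.
    import sqlite3
    from collections import defaultdict

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE results (location TEXT, analyte TEXT, matrix TEXT, value REAL, unit TEXT);
    INSERT INTO results VALUES
      ('MW-01', 'TCE', 'water', 5.2, 'ug/L'),
      ('MW-01', 'Benzene', 'water', 1.1, 'ug/L'),
      ('MW-02', 'TCE', 'water', 48.0, 'ug/L');
    """)

    # Filter on the medium of interest, then pivot locations against analytes.
    rows = conn.execute(
        "SELECT location, analyte, value FROM results WHERE matrix = ?", ("water",)
    ).fetchall()

    crosstab = defaultdict(dict)
    for location, analyte, value in rows:
        crosstab[location][analyte] = value

    analytes = sorted({a for row in crosstab.values() for a in row})
    print("location\t" + "\t".join(analytes))
    for location in sorted(crosstab):
        print(location + "\t" + "\t".join(str(crosstab[location].get(a, "")) for a in analytes))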


Integration for 1-Dimensional Visualization and Analysis

To produce a report-quality borehole log with either RockWare’s LogPlot or gINT from gINT Software, a user can simply right-click an EQuIS sample location point. Selecting ‘LogPlot’ or ‘gINT Boring Log’ from the custom context menu that appears creates a borehole log on the fly, in seconds, using a highly customizable log template. The EQuIS for ArcGIS borehole log modules do not simply open a pre-built log or image file when they are executed; the output is created from the latest information in the project database each time a log is requested.

Figure 2. Borehole log in LogPlot 2003.
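To make the on-the-fly generation concrete, the sketch below pulls the current lithology intervals for one location from a database at the moment the log is requested and writes a simple text log, so the output always reflects the latest stored data. The intervals table and plain-text layout are hypothetical; LogPlot and gINT use their own data formats and log templates.

    # Sketch: query the current lithology intervals for one location and emit a
    # simple text log. The table and output layout are illustrative only;
    # LogPlot and gINT use their own template and data formats.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE lithology (location TEXT, top_depth REAL, bottom_depth REAL, material TEXT);
    INSERT INTO lithology VALUES
      ('SB-07', 0.0, 4.5, 'Silty sand'),
      ('SB-07', 4.5, 12.0, 'Clay'),
      ('SB-07', 12.0, 18.0, 'Fine sand');
    """)

    def boring_log(location):
        """Query the database at call time so the log reflects the latest data."""
        rows = conn.execute(
            "SELECT top_depth, bottom_depth, material FROM lithology "
            "WHERE location = ? ORDER BY top_depth", (location,)
        ).fetchall()
        lines = [f"Boring log: {location}"]
        for top, bottom, material in rows:
            lines.append(f"{top:6.1f} - {bottom:6.1f} ft  {material}")
        return "\n".join(lines)

    print(boring_log("SB-07"))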


Integration for 2-Dimensional Visualization and Analysis

To view geologic data in two dimensions, cross-sections can be produced from selected map locations within the GIS using the EQuIS for ArcGIS modules for RockWare’s RockWorks. EQuIS for ArcGIS includes a Geologic Profile Toolbar containing various utilities for drawing the cross-section profile lines and identifying the locations with lithology data that are to be used when creating that cross-section.

Figure 3. Drawing a profile line and using EQuIS for ArcGIS’ buffer selection tool.

After the desired locations along the drawn profile line are selected, a cross-section is created from the EQuIS for ArcGIS Geologic Profiles Toolbar by choosing either the 2D Cross-Section or 2D Fence option from the dropdown of available diagram types. The EQuIS for ArcGIS 3D Fence modules also support RockWare’s RockWorks 2002, which provides more than the adjacent-borehole approach for creating its fence diagrams. A projected 3D fence can easily be drawn in ArcGIS with one or more discontinuous panels. RockWorks 2002 interpolates the lithology data to create the top and bottom of each unit using data from neighboring boreholes, not just the adjacent borehole along the profile line.

Where a more complex layering scheme involves lenses and pinchouts, a linear interpolation may not be adequate. In this case, it is possible simply to represent a ‘strip log’ of the layers found at each location; interpolation—or interpretation—may then be left to the geologist. Additionally, ancillary data such as downhole cone penetrometer (CPT) or geophysical data, interpreted geologic units, and well screens may be represented on the cross-section diagram.
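The geometric step behind assembling such a section, placing each selected borehole at its proper distance along the drawn profile line, can be sketched as a simple projection. The coordinates, the straight two-point profile line, and the helper function below are illustrative assumptions; the EQuIS buffer selection tool and RockWorks handle the actual selection and stationing internally.

    # Sketch: station selected boreholes along a straight profile line by
    # projecting each (x, y) location onto the line segment A-B. Coordinates
    # are illustrative; a real section line may have multiple vertices.
    import math

    def station_along_line(a, b, p):
        """Return (distance along A-B, offset from the line) for point p."""
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        length = math.hypot(dx, dy)
        # Scalar projection of A->P onto the unit vector of A->B.
        station = ((px - ax) * dx + (py - ay) * dy) / length
        offset = abs((px - ax) * dy - (py - ay) * dx) / length
        return station, offset

    profile_a, profile_b = (0.0, 0.0), (400.0, 100.0)
    boreholes = {"MW-01": (50.0, 20.0), "MW-02": (210.0, 40.0), "MW-03": (380.0, 110.0)}

    for name, xy in sorted(boreholes.items(),
                           key=lambda kv: station_along_line(profile_a, profile_b, kv[1])[0]):
        station, offset = station_along_line(profile_a, profile_b, xy)
        print(f"{name}: station {station:.1f} m, offset {offset:.1f} m from the line")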

Integration for 3-Dimensional Visualization and Analysis

While ArcGIS 3D Analyst is one option for three-dimensional data visualization, there are other applications that are equally well suited for such operations. The Department of Defense Groundwater Modeling System (GMS) provides the capability to construct solid models from borehole data. Only one click—selecting Create GMS Boreholes from the menu—is required to bring borehole data into GMS. Once the data are brought into GMS, surfaces can be created from the layer contacts of the boreholes. These surfaces can then be extruded and manipulated to create a solid model that is representative of the site geology. At this point, cross-sections may be cut arbitrarily through the model.

Figure 4. Cross-sections created in GMS.

The complete cycle of integration is apparent when the cross-sections created in GMS are exported as a 3D DXF and then brought back into ArcScene or the EQuIS for ArcGIS 3D Preview Window for ArcMap.

RockWare’s RockWorks 2002 includes RockPlot 3D, a standalone OpenGL-based application for visualizing and interacting with 3D boreholes, fences, contaminant plumes, and a wide variety of other 3D shapes, such as underground storage tanks or colored spheres showing the results of analytical samples.


Figure 5. RockWorks 2002’s RockPlot 3D visualization of subsurface geology.

As another option for advanced three-dimensional visualization and modeling, EQuIS for ArcGIS can export chemistry and geology data into CTech’s Environmental Visualization System (EVS). EVS provides many visualization techniques that are not available in ArcGIS 3D Analyst or RockPlot 3D; these include volume rendering, quad-buffered stereo support for virtual-reality-compatible hardware, and finite difference or finite element modeling grid generation, in addition to enhanced techniques for communicating 3D contaminant plumes and fence diagrams, with arbitrary slicing and cutting to show subsurface features.


Figure 6. EVS Application Preview and Launch Menu.

Geology data can be exported in the uninterpreted pre-geology format (*.PGF) or the interpreted *.GEO format. The *.PGF format is compatible with the Geologic Indicator Kriging (GIK) found in the Krige3D module in EVS Pro/MVS. The recommended approach is to export in the PGF format and use the interactive tools or GIK to define the geologic layers. The *.GEO format takes significantly longer to create than the PGF option, since subjective logic is used to interpret the geologic layers. Chemistry data can be exported for layers generated using the EQuIS EZView LayerBuilder to query analytical concentrations or geochemistry from project databases. These may be either single-analyte queries or multiple-analyte queries (chemical crosstab layers).


Figure 7. 3D plume shapefile exported from EVS, displayed in the EQuIS for ArcGIS 3D Preview Window.

EVS is then launched with the proper data file and, after only the few seconds required to perform the necessary mathematical operations (kriging, etc.), the model is presented to the user. As with the visualization options discussed previously, from the user’s perspective EQuIS for ArcGIS greatly simplifies modeling. The integration of a data component (EQuIS), a modeling component (EVS, in this case), and an interface component (ArcGIS) allows much of the process to be automated.

Better Decision-Making

Better decision-making essentially boils down to the ability to avoid both false positives and false negatives, which reduces uncertainty and makes it possible to reach correct decisions more rapidly and with less waste. This requires high-quality data from the field and laboratory to be integrated so that placement in 4D space (3D plus time) reflects reality adequately for accurate modeling.


False negatives and false positives are a normal part of sampling and analysis. Laboratory chemistry results, regardless of how carefully they are run, are subject to a wide range of errors due to differences in sampling technique, sample handling, equipment handling, and other variables. Within a laboratory, other factors intervene, causing error. To a large extent, these can be identified by means of routine QA/QC procedures during the data validation process. Even here, however, errors do occur, and an alternate means of detection is required.

False Positives

False positives—which indicate the presence of contamination when there is none—can dramatically drive up the cost of a remediation project, because such errors expand the scope and cost of the site remediation activity. The main tasks in reducing false positives include reducing transcription errors, running appropriate levels of validation, and applying appropriate statistical analysis to detect not only clear outliers but also equally damaging errors that lie at the margin of the normal range. While false positives are impossible to avoid entirely, they can be detected with the use of Monte Carlo simulation statistics, which are available in statistical analysis packages such as CARStat. The ability to detect false positives and deal only with true positive results can dramatically reduce the scope and time involved in an investigation and subsequent remediation.
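As a minimal, generic illustration of the statistical idea (not the specific Monte Carlo procedure implemented in CARStat), the sketch below bootstraps a screening threshold from assumed background data. Reported detections that fall within the simulated background range become candidates for extra validation as possible false positives, while clear exceedances stand out. All values are invented for illustration.

    # Sketch: simulate an upper background bound by bootstrap resampling and
    # screen reported detections against it. Generic illustration only; not
    # the Monte Carlo procedure implemented in CARStat.
    import random
    import statistics

    random.seed(42)
    background = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.4, 0.7, 1.2]   # ug/L, illustrative
    reported_detections = {"MW-01": 1.3, "MW-02": 4.8, "MW-03": 1.6}  # ug/L, illustrative

    # Bootstrap the ~95th percentile of background many times; use the median
    # of the simulated percentiles as a screening threshold.
    simulated_p95 = []
    for _ in range(5000):
        resample = sorted(random.choice(background) for _ in background)
        simulated_p95.append(resample[int(0.95 * (len(resample) - 1))])
    threshold = statistics.median(simulated_p95)

    for well, value in reported_detections.items():
        if value > threshold:
            print(f"{well}: {value} ug/L exceeds background threshold {threshold:.2f} ug/L")
        else:
            print(f"{well}: {value} ug/L within background range -> validate as possible false positive")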

False Negatives

Whereas false positives are a serious concern for those conducting a site remediation, false negatives (indicating the absence of contamination where it is present) represent a primary concern of those regulating site remediation activities. For regulators, data indicating the presence of contamination require far less validation than data stating that a medium is clean. (One regulator reports that he used to tell responsible parties that if an area was contaminated, data were unnecessary—a signed confession would do; it was when they argued that an area was clean that solid data were required.) The reason for this is that it is the regulator’s primary responsibility to see that no contamination is left behind to have a negative impact on either human health or the environment. Cleanup cost is not a driving concern; regulators tend to maximize for remedial outcome.

This is not to say that contaminated-site owners are unconcerned with false negatives. Undetected false negatives result in potentially hazardous levels of contamination being left untreated, thereby increasing corporate liability and exposure risk to the public, and it is to the owner’s benefit to avoid this situation. There is, however, a tendency to want to optimize between remedial outcome and cleanup costs (with a bias toward lower cost). This difference in objectives is the prime source of friction between regulators and the regulated community. It adds significantly to the amount of time required to effect remediation, increasing remediation costs and, often, leading to litigation. The effective identification of false negatives can significantly reduce these differences and lead to a better outcome for both parties. False negatives can be identified in the same manner as false positives.


Uncertainty

Uncertainty comes from fear of error. While false positives and false negatives associated with sampling and analysis lead to uncertainty, they are not the only contributors. Uncertainty is derived from a number of sources: some scientific, some human, and others institutional. It usually leads to similar results—higher costs in site investigation and remediation.

The primary source of uncertainty in environmental investigations is transcription error. Despite the availability of computerized systems such as spreadsheets and databases, manual transcription remains a predominant way to move data from one place or one form into another. Even where data have been made available in electronic format, they are often manually typed into spreadsheets from the output of a laboratory information management system (LIMS), or cut and pasted from a cell of one spreadsheet into that of another. In each of these activities, typing errors or simple datum misplacement are common occurrences.

The best means with today's technology to reduce uncertainty is to eliminate manual data transcription wherever possible. Personal digital assistants (PDAs) and tablet PCs are now used in the field for direct entry of field data. With these devices, the data entry process can be greatly simplified: options are selected from pulldown menus, current dates and times are loaded automatically, checkmarks are placed beside observed items, and so forth, leaving only a few places where the actual writing of data needs to occur. Even where data entries are made manually in an electronic device, they do not need to be transcribed into another system upon return from the field; the data can be ported directly into a central data management system. LIMS output can also be taken directly as it is produced, passed through checking tools, and loaded into data warehouse repositories without transcription. Whenever a transcription task is eliminated, the opportunity for errors is reduced along with uncertainty.

When data from the warehouse repository are exported into a variety of analytic applications without transcription, a higher level of confidence can be expected from the output of those systems as well. This reduces error and uncertainty in two ways. First, application integration eliminates the possibility of transcription error, so the output of the system is more reliable. Second, because it is then possible to place data rapidly into a variety of analytic tools, the researcher can compare the output from a number of similar tools or use a number of different algorithms to crosscheck findings. Where consistency is high, uncertainty is low. Where consistency is low, the researcher is directed to account for the discrepancies; when these are accounted for, the same result is achieved: lower uncertainty.
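A minimal sketch of the transcription-free path described above: read a laboratory deliverable directly and apply simple completeness and format checks before loading, rather than retyping results. The column names and rules are illustrative assumptions, not a published EDD specification or the checks that EQuIS performs.

    # Sketch: read a laboratory deliverable directly and run simple checks
    # before loading, instead of retyping results. Column names and rules
    # are illustrative, not a specific EDD specification.
    import csv
    import io
    from datetime import datetime

    edd_text = """location,sample_date,analyte,result,unit
    MW-01,2003-05-12,TCE,5.2,ug/L
    MW-02,2003-05-12,TCE,not detected,ug/L
    MW-03,,TCE,3.1,ug/L
    """

    errors, clean_rows = [], []
    # start=2 because the header row occupies line 1 of the deliverable.
    for i, row in enumerate(csv.DictReader(io.StringIO(edd_text)), start=2):
        problems = []
        if not row["location"].strip():
            problems.append("missing location")
        try:
            datetime.strptime(row["sample_date"].strip(), "%Y-%m-%d")
        except ValueError:
            problems.append("bad or missing sample_date")
        try:
            float(row["result"])
        except ValueError:
            problems.append("non-numeric result (needs a detect/non-detect flag)")
        if problems:
            errors.append(f"line {i}: " + "; ".join(problems))
        else:
            clean_rows.append(row)

    print(f"{len(clean_rows)} rows ready to load, {len(errors)} rejected")
    for e in errors:
        print(" ", e)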

Rapid Decision-Making

The direct placement of data from the data management system into analytical applications results in a significant decrease in analysis and assessment time. In addition to crosschecking results in a number of applications, the researcher is also able to evaluate “what if…?” scenarios, changing variables and assumptions in plans and models to ascertain alternate pathways and potential outcomes.


In one pilot study of shallow groundwater, cone penetrometer sampling was conducted with real-time gas chromatograph/mass spectrometer (GC/MS) data phoned in to an online EQuIS database. EQuIS then transferred the data directly into CTech’s EVS application, which computed the uncertainty surface for groundwater at the facility. EVS then identified the best place for the next penetrometer reading, which was called back to the waiting mobile penetrometer lab. The entire data cycle took under 10 minutes.

The normal method of investigating a contaminated shallow groundwater aquifer is to submit an investigation plan to the regulatory agency, wait for a response from the agency, mobilize and install a number of wells, gather and analyze results, and submit a report of the findings accompanied by proposals for further investigation. The cycle can take six months to complete and is normally repeated several times before the shallow groundwater investigation is complete. Using the “rapid-feedback” methodology, it is possible to take samples from dozens of locations in a single day and complete the analysis, with sample locations selected on the basis of mathematical probability to reduce statistical uncertainty more quickly than could otherwise be achieved. Under this new methodology, monitoring wells are installed to provide points of formal laboratory analysis against which to correlate the field screening data.

The end result was to reduce an otherwise six-year investigation to one requiring only a few months. Beyond the obvious cost savings from fewer wells and fewer sampling rounds, additional benefits were derived from being able to mobilize an active remediation almost immediately, curtailing the spread of contamination and resulting in a shortened remediation activity with reduced public exposure to hazard. Variations of this process can be used with other field screening methodologies as well.
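The heart of the rapid-feedback loop is to recompute uncertainty as each result arrives and to sample next where the estimate is least constrained. In the sketch below, distance to the nearest existing sample stands in for the kriging-based uncertainty surface that EVS actually computes, and the coordinates and results are invented for illustration.

    # Sketch of the rapid-feedback loop: after each result is posted, pick the
    # next sampling point where the estimate is least constrained. Distance to
    # the nearest existing sample is a stand-in for true kriging variance.
    import math

    samples = {(10.0, 10.0): 120.0, (60.0, 15.0): 35.0, (30.0, 70.0): 5.0}  # (x, y): ug/L

    def next_location(samples, xmin=0, xmax=100, ymin=0, ymax=100, step=5):
        best_point, best_score = None, -1.0
        for xi in range(xmin, xmax + 1, step):
            for yi in range(ymin, ymax + 1, step):
                score = min(math.hypot(xi - sx, yi - sy) for sx, sy in samples)
                if score > best_score:
                    best_point, best_score = (xi, yi), score
        return best_point, best_score

    for round_no in range(3):
        point, score = next_location(samples)
        print(f"Round {round_no + 1}: sample next at {point} (nearest data {score:.1f} m away)")
        samples[point] = 0.0  # placeholder for the field GC/MS result phoned in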

Waste Reduction

Reduced time means reduced cost, a smaller resource mobilization with less infrastructure, and reduced health risks associated with hazard exposure. In addition, the ability to store data for long-term monitoring allows regulatory agencies to pull back from having to actively remediate every identified contaminated site. Where no imminent risk to human health or the environment is identified, a wait-and-see approach may serve well: it can be observed whether natural attenuation is effectively eliminating the hazard. While delaying the cleanup carries the risk of a more widespread problem, new technologies may emerge in the interim that greatly reduce the cost of site remediation. These approaches are risky where data warehousing and data management capabilities are weak. Where solid data management protocols do exist, however, facilities can be closely monitored with very low levels of risk to the public.

The opportunities put forward by effective data management and application integration are substantial and accrue benefits to all involved. A recent study by the California Policy Research Center determined that it costs the state of California $10 billion annually to combat nine environmentally related diseases for which economic data were available. If tracking the occurrence of these diseases led to no more than a 1% reduction in disease, the state would save $100 million annually (Lashof et al.).


The New Jersey Department of Environmental Protection is initiating a study in conjunction with the Centers for Disease Control and Prevention to track occurrences of cancer and identify clusters in order to determine if and how environmental factors contribute to its etiology. The study will be the most far-reaching and compelling cancer cluster study yet undertaken and could not be initiated were it not for New Jersey's vast environmental data warehouses. Results from the study will contribute to the ability of governments everywhere to identify risk areas and act proactively to reduce the occurrence of cancer in their societies.

Application Integration

Once data quality issues have been resolved, field and laboratory findings have to be coordinated in space and time. It is here that the benefits of an advanced electronic data warehouse become apparent; the EQuIS data management system provides a case in point. Once the data quality checks have been run, data from the system can be directed to a number of applications for analysis.

The goal for proper decision-making is to lay out geophysical features as they occur in nature in a way that can be visualized in 3D space. Provided X, Y, and Z coordinates have been collected and stored in the database, these representations are easily produced. Time, the fourth dimension, is also a critical variable. While geologic features are relatively stable in time, groundwater and surface water features are transient and change significantly over time. Establishing good time-series data is critical to understanding the migration and spread of contaminated groundwater. The behavior of contaminant plumes can even provide new information about underground geophysical features: breaks in the continuity of confining zones can be inferred from the spread of contamination from one aquifer into another. These breaks may be naturally occurring or the result of human activity (such as improper casing of deep wells), but would otherwise go undetected were it not for a change in the contamination pattern.

Figure 8. USEPA Region 5 FIELDS Application. (FIELDS)


Static data models are useful in demonstrating past and current geology and contamination features. A viewer can, at a glance, achieve levels of understanding that would otherwise be possible only after hours or days of reviewing analytical results. It is important, however, that the viewer have a basic understanding of how computer algorithms display the data. Different algorithms produce different visualizations and can lead to different interpretations of the results. The differences are most pronounced in cases of sparse data; as the data set becomes more richly populated, variations in the output are reduced. It must be remembered that the images represent an interpretation of the data based on varying assumptions and mathematical formulations.

Predictive data models examine past changes in data related to geophysical features to predict future scenarios. While the same caveats apply as for static data models, predictive modeling can be used not only to determine whether a remedial intervention is required, but also where it should be implemented and when it should be initiated to achieve the greatest effect at the least cost. If there are no affected receptors, it is possible to let natural attenuation take place as a contaminant plume migrates and to begin remediation only when contaminant levels reach a location where they may begin to pose a threat at a future point in time. Predictive data models can help reduce the amount of monitoring and analysis that would otherwise be necessary, thereby reducing costs. At the same time, modeling can be used to help protect against hazards that may arise in the future if no action is taken. Modeling principles can also be applied to the breakdown of contaminants in the natural attenuation process to detect the potential generation of daughter compounds whose risk factors may exceed those of the original contaminant. In the case provided below, observation of the TCE plume by itself would not have shown how decreasing levels of TCE were being replaced by concentrations of daughter compounds. Had a decision been made based solely on TCE monitoring over time, serious hazard exposures might have been left open.
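To illustrate the algorithm dependence noted above, the sketch below applies two simple, generic interpolators (nearest neighbor and inverse-distance weighting) to the same three sparse data points. Where the two estimates diverge most is where the rendered picture depends most on the chosen algorithm; neither scheme represents the specific algorithms used by the packages discussed here, and the data values are invented.

    # Sketch: two generic interpolators applied to the same sparse data to show
    # how the chosen algorithm changes the estimated surface. Illustrative only.
    import math

    data = {(0.0, 0.0): 100.0, (100.0, 0.0): 10.0, (50.0, 80.0): 60.0}  # (x, y): concentration

    def nearest(p):
        closest = min(data, key=lambda d: math.dist(p, d))
        return data[closest]

    def idw(p, power=2):
        num = den = 0.0
        for d, value in data.items():
            dist = math.dist(p, d)
            if dist == 0:
                return value
            weight = 1.0 / dist ** power
            num += weight * value
            den += weight
        return num / den

    for p in [(25.0, 10.0), (50.0, 40.0), (90.0, 60.0)]:
        print(f"{p}: nearest={nearest(p):6.1f}  IDW={idw(p):6.1f}  "
              f"difference={abs(nearest(p) - idw(p)):5.1f}")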


Figure 9. Representation of TCE degradation (including daughter products) over time in GMS.

Predictive modeling

This rendition of future data scenarios for a TCE plume generated by GMS demonstrates not only future TCE contamination, but also the daughter products generated as the contaminant plume breaks down. Examination of the TCE plume alone might indicate that further action is unnecessary. The model suggests two interesting features that might otherwise have gone unnoticed. First, chloride concentrations build to potentially dangerous levels by the time the facility boundary line is reached. Second, vinyl chloride, which is more toxic than TCE, is generated in the breakdown process and rises to levels on site that may require an institutional control.

The key to better decision-making is to maintain high-quality data that can be used in a variety of ways. A data warehouse with powerful data quality checking tools provides the foundation for better decision-making; the ability to interface with a wide variety of analytic tools appropriate to a user's task provides the capstone.
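The daughter-product behavior illustrated in Figure 9 can be approximated, in a very reduced form, by a sequential first-order decay chain (TCE to cis-DCE to vinyl chloride to ethene). The sketch below is a batch calculation with no transport and with invented rate constants; GMS, through reactive transport codes such as RT3D, solves the full advection-dispersion-reaction problem with site-calibrated parameters.

    # Sketch: batch (no-transport) sequential first-order decay chain,
    # TCE -> cis-DCE -> vinyl chloride -> ethene, showing daughter products
    # building while the parent declines. Rate constants are illustrative
    # placeholders, not site-calibrated values.
    k = {"TCE": 0.004, "cDCE": 0.002, "VC": 0.001}               # 1/day, illustrative
    c = {"TCE": 1000.0, "cDCE": 0.0, "VC": 0.0, "ethene": 0.0}   # ug/L, illustrative

    dt, years = 1.0, 10
    for day in range(int(years * 365)):
        d_tce = k["TCE"] * c["TCE"] * dt
        d_dce = k["cDCE"] * c["cDCE"] * dt
        d_vc = k["VC"] * c["VC"] * dt
        c["TCE"] -= d_tce
        c["cDCE"] += d_tce - d_dce
        c["VC"] += d_dce - d_vc
        c["ethene"] += d_vc
        if (day + 1) % 730 == 0:  # print every two years
            print(f"year {(day + 1) / 365:4.1f}: " +
                  "  ".join(f"{name}={val:7.1f}" for name, val in c.items()))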

Conclusion

Environmental characterization and modeling is a science that, while still complex, is improving as the tools used by scientists improve and applications become more tightly integrated. Data collection provides discrete point descriptions of the subsurface with almost no uncertainty; the uncertainty is introduced during the interpolation and statistical correlation between known points. The interpolation is needed to build a structure that relates these distinct points within the investigation space. As software and procedural checks are implemented to identify and correct false positives and false negatives, uncertainty is reduced and decisions can be made more rapidly and with greater confidence. Using the appropriate software to manage, analyze, and visualize the data, the scientist is able not only to apply more advanced analyses but also to investigate various scenarios.

Regulatory agencies, consultants, and industrial companies worldwide are reaping the benefits of an environmental data management system with integrated visualization and analysis applications. These benefits include not only much more efficient management and visualization of multimedia data but also more accurate and cost-effective decision-making and response.


References

CDPHE (2003). http://www.cdphe.state.co.us/hm/samsmain.asp

FIELDS (FIeld EnvironmentaL Decision Support). http://www.epa.gov/region5fields/

Gray, A. L. (2003). Telephone conversation, May 2003.

Lashof, J. C., McKone, T., Grass, R., Horton, M., Lyou, J., Needham, L., Pease, W., Rice, D., Ritz, B., Satariano, W., Schulte, P., and Solomon, G. “Strategies for Establishing an Environmental Health Surveillance System in California: A Report of the SB 702 Expert Working Group,” California Policy Research Center, University of California, forthcoming.

NJDEP, Bureau of Safe Drinking Water (2003). Private Well Testing Act Program Electronic Data Deliverable Manual, p. 5.

Biographies

Scot D. Weaver directs the development of GIS and modeling interfaces for EarthSoft’s Environmental Quality Information System (EQuIS). His management responsibilities include the development of training courses and product development. Mr. Weaver received a B.S. in Civil Engineering from Brigham Young University and an M.S.E. in Civil Engineering from the University of Texas at Austin. For nearly a decade, he has been involved in programming for engineering applications and has taught environmental data management courses in North America, South America, England, Europe, and Asia.

Arnold L. Gray received a Ph.D. in Geography from Clark University, specializing in human interactions with the environment. He has conducted research and training activities across Africa addressing issues of food supply, water supply, and sanitation, and currently serves as a consultant to a number of African nations. In the U.S., Dr. Gray specializes in institutional issues of human environmental response, focused on the interaction of regulators and responsible parties in site investigation and remedial activities. He works closely with numerous states and EPA regions addressing data management issues and serves as director of government accounts for EarthSoft, Inc.
