
The Internet and Environmental and Geotechnical Data

Locus 300
Geo-Strata

1 January 2003 — “Data, data everywhere and not a drop to use.” Samuel Taylor Coleridge’s original verse was actually about water, but the result is the same for today’s environmental and geotechnical engineers and site owners as it was for the poet’s ancient mariner: drowning in a sea of information that is as unusable as salt water is for drinking.

Investigations, cleanups, and post-closure monitoring and maintenance of contaminated waste sites can generate enormous amounts of data. At large, complex sites, it is not uncommon to drill hundreds of boreholes and wells, collect tens of thousands of samples, and then analyze each of these for several hundred contaminants to ascertain the nature and extent of contamination and geotechnical properties. The information from these various phases, which may eventually include a million or more sampling and analytical records, is typically entered into a database or, worse, into a spreadsheet. With so much data to manage, precious resources are squandered on unproductive administrative tasks.

 

What’s usually done?
Most companies with environmental problems do not store their own environmental data. Instead, they rely on their consultants for this service. Larger companies with particularly troublesome or multiple sites are often wary of “putting all their eggs in one basket” and opt instead to apportion their environmental work among multiple consultants.

Rarely do all consultants use the same environmental database management system. And equally rare is the customer who insists on this. The end result is that the company’s environmental data are stored in various stand-alone or client-server systems at different locales.

If another consultant is hired to do some specialized work, such as risk assessment, the data must usually be exported to files, transferred, and, after much “massaging,” loaded into the new consultant’s system. Often the data in these systems are not readily accessible to the consultant’s engineers and geologists, or to the companies that actually “own” them. Instead, information requests must go through specialists who know how to extract data from the system.

As for the various documents and reports, these are often stored in a variety of locations and formats. Considerable time can be lost tracking them down and delivering them to the appropriate personnel. When tasks must be approved by multiple individuals, the necessary documents are sometimes passed sequentially from one person to another, resulting in significant and unnecessary delays at high cost to the client.

All in all, it is not uncommon for environmental and related project information to be handled and processed by dozens of people, in different ways, with few standards or quality control practices governing the various steps in the process, and with no central repository.

With so much information to deal with, it should not come as a surprise that many companies find themselves drowning in data but starving for knowledge.

 

What’s out there?
There is no lack of computerized tools in the marketplace to help companies manage and process this information. These tools, however, typically exist and function as islands of technology rather than as parts of an integrated system.

Complicating matters, some of these tools are stand-alone applications that must be installed on each user’s computer, while others are client-server systems that must be accessed over a dedicated network.

Much rarer is an Internet-based solution. Yet many of the problems and inefficiencies described here can be reduced, if not eliminated, by turning to Internet technologies.

 

What about the Internet?
An easy-to-query, Internet-based environmental database management system into which all consultants on a project upload their field and analytical data eliminates these incompatibility and accessibility problems. There is no need to transfer data from one party to another, because all interested parties can query and, as needed, download information from the same database using their web browsers. Further inefficiencies can be wrung out of the data acquisition and reporting process by using hand-held devices and remote control and automation systems to upload field and sampling data more quickly and reliably.

The Internet need not be used only to store data on site conditions. It can also serve as the primary repository for the various permits, drawings, reports, and other documents generated during the course of a site investigation or cleanup. Having all this information stored in a single place facilitates communication among all interested parties, improves project coordination, and decreases the overall costs of environmental remediation.
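
As a rough illustration of what “easy-to-query” means in practice, the sketch below shows the kind of shared store of sampling and analytical records, and the kind of ad hoc question, that such a system makes possible. It is only a schematic, not a description of any particular product: the table layout, column names, sample values, and benzene threshold are all hypothetical.

    import sqlite3

    # Hypothetical, simplified layout for a shared store of sampling and
    # analytical records; a real system would carry many more fields
    # (units, detection limits, lab qualifiers, QA/QC flags, and so on).
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE analytical_results (
            site_id     TEXT,
            location_id TEXT,   -- borehole, well, or other sampling point
            sample_date TEXT,
            analyte     TEXT,
            result_mg_l REAL
        )
    """)
    conn.executemany(
        "INSERT INTO analytical_results VALUES (?, ?, ?, ?, ?)",
        [
            ("SITE-01", "MW-03", "2002-11-14", "benzene", 0.012),
            ("SITE-01", "MW-07", "2002-11-14", "benzene", 0.0006),
            ("SITE-01", "MW-03", "2002-11-14", "toluene", 0.004),
        ],
    )

    # The kind of ad hoc question an engineer, geologist, regulator, or owner
    # could ask directly, without routing the request through a specialist:
    # "Which wells exceeded 0.005 mg/L benzene in the latest sampling round?"
    rows = conn.execute(
        """
        SELECT location_id, sample_date, result_mg_l
        FROM analytical_results
        WHERE analyte = 'benzene' AND result_mg_l > 0.005
        ORDER BY result_mg_l DESC
        """
    ).fetchall()

    for location, date, value in rows:
        print(f"{location} ({date}): {value} mg/L benzene")

In a web-based system, the same question would be posed through a browser form and answered by a central server, so every consultant, regulator, and owner works from the same records rather than from copies scattered across stand-alone systems.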

 

What are the obstacles?
Why have most consulting firms made little, if any, effort to make site-related documents and data accessible over the Web? Explanations are many, but foremost among them may be an unwillingness to do anything that would reduce their revenues or their clients’ dependence on them.

Because clients are far removed from the processes of loading data, running queries, and generating reports, they are in no position to pass judgment on, or recommend improvements in, their consultants’ data management practices. Only infrequently will a client of a consulting firm (1) encounter or hear about another environmental information management system, and (2) be sufficiently motivated to look into its pros and cons.

This motivation, however, does not translate into expertise in the area. So in the end, the client will typically turn to its consultant(s) for advice and assistance. I need not spell out the inevitable outcome of this process.

 

What about the future?
In the years ahead, the short shrift given to information management practices and techniques will change, particularly as more and more contaminated waste sites, after being cleaned up, enter the operation and maintenance (O&M) phase, or what in some circles has come to be called the long-term stewardship (LTS) phase.

Information management costs, together with those associated with sample collection and analysis and with data evaluation and reporting, are expected to consume over half of the annual LTS budget for sites in this phase. Considering that the LTS phase often lasts for decades and that an estimated 300,000 to 400,000 contaminated sites exist in the United States alone, it is clear that both industry and government face substantial “stewardship” costs in the years ahead.

Because most of these charges will be related to information management, activities and expenses in this area will come under increasing scrutiny from those footing the bill. As a result, firms involved in data collection, storage, and reporting at these sites will be forced to evaluate their practices. In so doing they will come to realize, reluctantly or not, the benefits of adopting Internet-based tools and systems.

For the past three years I have been in charge of the development and implementation of the environmental industry’s first integrated, web-based system for managing and storing sampling and analytical data and project documents. The system includes:

  • An environmental information (analytical database) management system
  • Two hand-held applications to record water level readings and compliance data
  • An alternative to traditional GIS based on Scalable Vector Graphics (SVG), a new XML-based Web graphics format (see the sketch after this list)
  • Project management tools
  • Automatic emailing and calendar reminders
  • Document storage and retrieval, on-line collaboration opportunities
  • Remote control, automation, and diagnostics of process and treatment systems for water, groundwater, wastewater, air, and soil
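
To give a flavor of the SVG item above, the short sketch below writes a trivial “site map”, a few monitoring wells drawn as labeled circles, to an SVG file that any SVG-capable web browser can display without GIS software. The well names and coordinates are hypothetical; a real system would generate this markup on the fly from the database.

    # Minimal sketch: render hypothetical monitoring-well locations as a
    # simple SVG "site map" viewable in a web browser, with no GIS software.
    wells = [
        ("MW-01", 120, 200),
        ("MW-02", 260, 140),
        ("MW-03", 340, 310),
    ]

    shapes = []
    for name, x, y in wells:
        shapes.append(f'<circle cx="{x}" cy="{y}" r="6" fill="navy" />')
        shapes.append(f'<text x="{x + 10}" y="{y + 4}" font-size="12">{name}</text>')

    svg = (
        '<svg xmlns="http://www.w3.org/2000/svg" width="400" height="400">\n'
        + "\n".join(shapes)
        + "\n</svg>\n"
    )

    with open("site_map.svg", "w") as f:
        f.write(svg)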

I have seen the implementation of remote control and automation technologies and document storage and retrieval tools reduce the monthly cost of monitoring and maintenance at the site of a diesel spill in a remote mountainous area from $10,000 to $1,000, for an investment of only $30,000. I have also seen data acquisition and reporting costs at a large site in the O&M phase decline by over 20% after the system was implemented.

The only individuals unhappy with this decline are those who were previously “forced” to either snowmobile or ski into the site during the winter months when the roads to it were impassable.

By adopting such new monitoring, database, and web technologies, a typical Fortune 100 company with a portfolio of 50 sites, whose long-term (30-year) monitoring costs have a net present value in the $100 million range, could lower these expenditures by $30 million or more.
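
The arithmetic behind these figures is simple, and the sketch below merely restates it. The payback calculation uses the diesel-spill numbers quoted above; the 30 percent savings fraction applied to the portfolio is an assumption chosen to be consistent with the $30 million figure, not a measured result.

    # Diesel-spill example from the text: monthly O&M cost fell from $10,000
    # to $1,000 after a one-time investment of $30,000.
    monthly_savings = 10_000 - 1_000              # $9,000 per month
    payback_months = 30_000 / monthly_savings     # roughly 3.3 months

    # Portfolio projection from the text: 50 sites with a combined net present
    # value of 30-year monitoring costs of about $100 million. A 30% reduction
    # (an assumption consistent with the $30 million figure) gives:
    portfolio_npv = 100_000_000
    assumed_savings_fraction = 0.30
    portfolio_savings = portfolio_npv * assumed_savings_fraction

    print(f"Payback on the $30,000 investment: about {payback_months:.1f} months")
    print(f"Projected portfolio savings: ${portfolio_savings:,.0f}")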

If these numbers and predictions are correct, industry and government stand to benefit immensely in the years ahead from increased usage of the Internet as the primary repository and vehicle for the storage and delivery of environmental information and documents.