Locus Technologies introduces new calculation engine for GHG emissions inventories

The Locus GHG calculation engine is fully integrated with the dynamic Locus Platform and will automate emissions calculations for large enterprises.

MOUNTAIN VIEW, Calif., 20 November 2015 — Locus Technologies (Locus), the leader in cloud-based environmental compliance and sustainability management software, introduces an all-new calculation engine to its newest platform to redefine how companies organize, manage, and calculate their greenhouse gas (GHG) inventories. The Locus Platform offers a highly configurable, user-friendly interface to fully meet individual organizations’ environmental management needs.

With an increased focus on the role that GHG emissions play in climate change, ensuring that companies’ emissions are reported accurately is more important than ever. GHG emissions reports are coming under increased scrutiny from regulators, stakeholders, verifiers, and financial auditors. Choosing the right calculation engine plays a critical part in remaining compliant with these rapidly evolving requirements and regulations.

Locus GHG calculation engine eases compliance burdens for GHG tracking

GHG inventories may be the result of mandatory state, regional, or national reporting programs, such as the California Air Resources Board (AB 32), the U.S. EPA Mandatory Reporting Rule, or the European Union Emissions Trading Scheme (EU ETS). Organizations need a GHG calculation engine that can calculate GHGs automatically and accurately from all emission-producing activities at all of their facilities anywhere in the world. The new Locus calculation engine supports simultaneous calculations using multiple methods so that users can input data once and report to federal, state, and voluntary reporting programs according to each program's protocol.
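At its core, this kind of engine multiplies activity data by program-specific emission factors and conversions. The sketch below is a minimal illustration of reporting one activity record under multiple programs at once; the factor values, program keys, and function are hypothetical placeholders, not Locus code or official protocol data.

```python
# Minimal sketch of multi-protocol GHG accounting: one activity record is
# converted to CO2-equivalent under several reporting programs at once.
# Factor values and program keys are illustrative placeholders only.

# Emission factors in kg CO2e per unit of activity, keyed by (program, activity).
FACTORS = {
    ("EPA_MRR", "natural_gas_mmbtu"): 53.06,
    ("EU_ETS", "natural_gas_mmbtu"): 56.10,
}

def co2e_tonnes(program: str, activity: str, quantity: float) -> float:
    """Return metric tonnes CO2e for a quantity of activity under one program."""
    kilograms = FACTORS[(program, activity)] * quantity
    return kilograms / 1000.0  # kg -> metric tonnes

# Enter the activity data once, then report it under every program.
usage = 1200.0  # MMBtu of natural gas burned at one facility in one period
reports = {p: co2e_tonnes(p, "natural_gas_mmbtu", usage)
           for p in ("EPA_MRR", "EU_ETS")}
```

Because each program sees the same input data but applies its own factors, a single data-entry step can feed several reports, which is the point of the "input data once" design.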

“The requirements and procedures for GHG reporting are varied, complex, and rapidly evolving. To ensure compliance, companies need a calculation engine that can handle complex equations using appropriate emission factors, conversion factors, and calculation methodologies for each reporting program. The right calculation engine can reduce the stress, time, and potential inaccuracies found in home-grown accounting methods,” said Neno Duplan, President and CEO of Locus.

New GHG calculation engine removes reporting inaccuracies

As a leading accredited GHG verification company in California, Locus observed challenges that many companies experience with GHG inventory calculation, coupled with the gross inadequacy of tools previously available in the market. Informed by the verification of hundreds of inventories, Locus developed the new calculation engine.

“Besides spreadsheets, many calculation engines are proprietary to software vendors and are not transparent. For GHG calculations to pass audits and meet cap & trade requirements, transparency is absolutely required. Some of these ‘black box’ calculation tools have not been sufficiently stress-tested in the market and are generating errors that cause enterprises to fail their GHG verifications. Locus’ calculation engine addresses these deficiencies and capitalizes on the architecture of the highly scalable Locus Platform. All calculations are viewable and traceable through the tool to the original data inputs,” said J. Wesley Hawthorne, Locus’ Senior Vice President of Operations and an accredited GHG verifier.

When evaluating carbon management software with built-in calculation engines, companies must ensure that users are able to define both the calculation rules and display of calculated data for the purpose of reporting to various regulators. By giving end users the power to view, analyze, and make changes to analytic model data, Locus helps companies emphasize the transparency of the process and ensure that calculations are correct and that the company meets all verification requirements.

“We listened to industry users and created a configurable calculation engine that is easy to use, dynamically driven, transparent, provides reproducible calculations, and is easy to verify. This calculation engine, along with the Locus Platform, will improve companies’ data collection, analysis, and most importantly, reporting capabilities,” added Duplan.

Locus will conduct live demonstrations of the Locus Platform and calculation engine at the Locus booth at the National Association for Environmental Management (NAEM) 2016 Sustainability Software and Data Management Conference from March 15-16, 2016 in Tampa, FL.

EU introduces more efficient monitoring of drinking water quality

New EU rules to improve the monitoring of drinking water across Europe have come into force, improving access to wholesome and clean drinking water. As a first step following the European Citizens’ Initiative Right2Water, new rules adopted by the European Commission today provide flexibility to Member States as to how drinking water quality is monitored in around 100,000 water supply zones in Europe. This will allow for more focused, risk-based monitoring while ensuring full protection of public health.

This new monitoring and control system will allow Member States to reduce unnecessary analyses and concentrate on the controls that really matter. This amendment of the Drinking Water Directive is a response to calls by citizens and the European Parliament to adopt legislation ensuring a better, fairer, and more comprehensive water supply. It allows for improved implementation of EU rules by Member States, as it removes unnecessary burdens.

Member States can now decide, on the basis of a risk assessment, which parameters to monitor, given that some drinking water supply zones do not pose any risk of hazardous substances being found. They can also choose to increase or reduce the frequency of sampling in water supply zones, as well as extend the list of substances to monitor in case of public health concerns. Flexibility in the monitoring of parameters and the frequency of sampling is framed by a number of conditions that must be met to ensure protection of citizens’ health.

The new rules follow the principle of ‘hazard analysis and critical control points’ (HACCP), already used in food hygiene legislation, and the water safety plan approach laid down in the World Health Organization’s (WHO) Guidelines for Drinking-water Quality. Member States have two years to apply the provisions of the new legislation.

In order to effectively manage sampling and monitoring data at over 100,000 water supply zones, water utilities and other stakeholders will need access to software like Locus EIM Water to organize complex water quality management information in real time and automate laboratory management programs and reporting. Locus EIM is already in use by numerous water utilities in the United States.

New Environmental Monitoring Technology Keeping the Air We Breathe Under an Unprecedented Level of Scrutiny

A recent article in the Los Angeles Times discussed advances in environmental monitoring technologies. Rising calls to create cleaner air and limit climate change are driving a surge in new technology for measuring air emissions and other pollutants — a data revolution that is opening new windows into the micro-mechanics of environmental damage. Data stemming from these new monitoring technologies, coupled with advances in data management (Big Data) and the Internet of Things (IoT), as discussed in my article “Keeping the Pulse of the Planet: Using Big Data to Monitor Our Environment” published last year, is creating an all-new industry and bringing much-needed transparency to environmental degradation. Real-time monitoring of radioactive emissions, or of water quality, at any point around the globe is slowly becoming a reality.

According to the article author William Yardley, “the momentum for new monitoring tools is rooted in increasingly stringent regulations, including California’s cap-and-trade program for greenhouse gas emissions, and newly tightened federal standards and programs to monitor drought and soil contamination. A variety of clean-tech companies have arisen to help industries meet the new requirements, but the new tools and data are also being created by academics, tinkerers and concerned citizens — just ask Volkswagen, whose deceptive efforts to skirt emissions-testing standards were discovered with the help of a small university lab in West Virginia.”

“Taking it all into account, the Earth is coming under an unprecedented new level of scrutiny.”

“There are a lot of companies picking up on this, but who is interested in the data — to me, that’s also fascinating,” said Colette Heald, an atmospheric chemist at the Massachusetts Institute of Technology. “We’re in this moment of a huge growth in curiosity — of people trying to understand their environment. That coincides with the technology to do something more.”

The push is not limited to measuring air and emissions. Tools to sample soil, air emissions, and produced water; manage waste; monitor water quality; test ocean acidity; and improve weather forecasting are all on the rise. Drought has prompted new efforts to map groundwater and stream flows, and their water quality, across the West.

Two key issues that need to be addressed are the validity of data stemming from new instruments and sensors for enforcement purposes, and where all that (big) data will be stored and how accessible it will be. The first question will be answered as new hand-held data collection instrumentation, sensors, and devices undergo testing and accreditation by governmental agencies. The second issue, big data storage, has already been solved by companies like Locus Technologies, which has been aggregating massive amounts of environmental monitoring data in its cloud-based EIM (Environmental Information Management) software.

As the article put it: “When the technology is out there and everyone starts using it, the question is, how good is the data? If the data’s not high enough quality, then we’re not going to make regulatory decisions based on that. Where is this data going to reside in 10 years, when all these sensors are out there, and who’s going to [manage] that information? Right now it’s kind of organic so there’s no centralized place where all of this information is going.”

However, private industry and some government organizations, like the Department of Energy (DOE), are already preparing for these new avalanches of data hitting their networks, and are using the Locus cloud to organize and report the increased volume of monitoring information stemming from their facilities and other monitoring networks.

California Lawmakers Approve Ban on Plastic Microbeads to Protect Water

California approves AB888, an important bill to prohibit the use of plastic microbeads in personal care products for sale in California by 2020. When someone uses a product that contains microbeads (a face wash, for example), several things happen. First, they get a unique kind of cleanse that beauty companies suggest they can’t get any other way. Second, the microbeads (tiny pieces of plastic) are washed down the drain with the water. These microbeads do not get recycled. They do not get caught in filters before they reach the sea. They pollute.

With two just-released studies showing overwhelming levels of plastic pollution in San Francisco Bay and in Half Moon Bay’s marine life, it’s not an exaggeration to say that this bill will have a huge impact on the health of California’s waterways — and its people. Alaska, Hawaii, Iowa, Minnesota, New York, Vermont, and Washington also tried and failed this year to enact bans on manufacture and sale, while Oregon’s legislature is considering similar bans.

Studies found that San Francisco Bay is contaminated with tiny pieces of plastic in greater concentrations than other U.S. bodies of water — at least 3.9 million pieces every day. Many of those plastic particles are tiny microbeads, less than one millimeter in diameter, which can be found in personal care products like shower gels, facial scrubs, and toothpaste.

AB888 will ban the beads by 2020. Product manufacturers can use other exfoliants that aren’t as environmentally destructive, and increasingly, states are demanding that they do so. Six other states have already passed legislation that bans or restricts their use.

In addition to polluting our waterways — there are 471 million microbeads released into the bay every day from wastewater treatment facilities, Gordon said — the beads also contaminate the fish that we eat. A recent study in the journal Scientific Reports found “anthropogenic debris” in 25 percent of the fish sampled at markets in California.

EPA Imposes New Limits for Toxic Pollutants Released into Water

The Environmental Protection Agency (EPA) has imposed new standards for mercury, lead, and other toxic pollutants discharged into water bodies (rivers and streams) from steam-powered electric power plants.

EPA Administrator Gina McCarthy said the rules, the first national limits on pollutants from steam electricity plants, will provide significant protections for children and communities across the country from exposure to pollutants that can cause serious health problems.

The rule will remove 1.4 billion pounds a year of toxic discharge nationwide. More than 23,000 miles of rivers and streams across the US are polluted by steam electric discharges, which occur close to 100 public drinking water intakes and nearly 2,000 public wells across the nation, the EPA said.

Toxic metals do not break down in the environment and can contaminate sediment in waterways and harm aquatic life and wildlife, including killing large numbers of fish. Steam electric power plants account for about 30 percent of all toxic pollutants discharged into streams, rivers and lakes from U.S. industrial facilities. The pollutants can cause neurological damage in children, lead to cancer and damage the circulatory system, kidneys and livers.

The EPA said most of the nation’s 1,080 steam electric power plants already meet the requirements. About 12 percent, or 134 plants, will have to make new investments to do so. Water quality management software like Locus EIM can help utilities automate their compliance with these new rules and manage water quality across their portfolio of plants.

Colorado Mine Spill Highlights Superfund Challenges

The Colorado mine spill that sent three million gallons of toxic sludge into a river last month highlighted the struggles of the federal Superfund program to clean up contaminated mining sites across the American West, The Wall Street Journal reported on 12 September 2015.

The program, administered by the Environmental Protection Agency, was set up in the 1980s to remediate the nation’s most polluted places, from old factories to landfills. But it has been especially strained by legacy mining sites, which are often impossible to permanently clean up and instead require water-treatment plants or other expensive measures to contain widespread pollution, experts say.

The result is that some old mining sites widely acknowledged to be severely contaminated—such as the Gold King mine that led to last month’s spill, and others dotting the Upper Animas River Basin near Silverton, Colo.—haven’t been contained or cleaned, as the EPA and other stakeholders squabble about the best solution.

Currently, dozens of mining sites around the U.S. are on the EPA’s “National Priorities List” for Superfund cleanups or proposed to be added to the tally. But the taxes designed to fund cleanup costs when responsible parties can’t be found expired in 1995, and the multibillion-dollar fund dwindled to zero in the 2003 fiscal year, according to EPA data. Congressional appropriations have since helped support the program, but they decreased to nearly $1.1 billion this fiscal year from $1.3 billion in 2010.

Locus makes ENR Top 200 Environmental Firms list as the only EHS software company

EPA Issues a Draft Report on Assessment of the Potential Impacts of Hydraulic Fracturing for Oil and Gas on Drinking Water Resources

This assessment provides a review and synthesis of available scientific literature and data to assess the potential for hydraulic fracturing for oil and gas to impact the quality or quantity of drinking water resources, and identifies factors affecting the frequency or severity of any potential impacts. The scope of this assessment is defined by the hydraulic fracturing water cycle which includes five main activities:

  1. Water acquisition – the withdrawal of ground or surface water needed for hydraulic fracturing fluids;
  2. Chemical mixing – the mixing of water, chemicals, and proppant on the well pad to create the hydraulic fracturing fluid;
  3. Well injection – the injection of hydraulic fracturing fluids into the well to fracture the geologic formation;
  4. Flowback and produced water – the return of injected fluid and water produced from the formation to the surface, and subsequent transport for reuse, treatment, or disposal; and
  5. Wastewater treatment and waste disposal – the reuse, treatment and release, or disposal of wastewater generated at the well pad, including produced water.

This report can be used by federal, tribal, state, and local officials; industry; and the public to better understand and address vulnerabilities of drinking water resources to hydraulic fracturing activities. The report provides a comprehensive analysis of published literature and highlights environmental data management challenges facing the hydraulic fracturing industry. Find out more about our solutions for the oil & gas industry.

For more information and to download the report, please visit the EPA site: http://cfpub.epa.gov/ncea/hfstudy/recordisplay.cfm?deid=244651

A Better Way to Organize and Manage Environmental Compliance Data

Current Practice

How do companies currently handle and store their environmental information?

Managing an environmental project (a contaminated site, an emission source, or a GHG inventory) is similar to making a Hollywood movie, with one difference: duration. A movie is usually made in a few months, whereas an environmental project typically spans years or decades.

The work involved in investigating, remediating, or monitoring contaminated or emission sites is almost universally performed by outside consulting firms. Large companies rarely “put all their eggs in one basket,” choosing instead to apportion their environmental work among anywhere from a handful of consulting firms to 10, 20, or even more. The actual work at a particular site is generally managed and performed by the nearest local office of the firm that has been assigned to the site.

At larger production facilities, such as a refinery or a Superfund site, the environmental work is likely to span 10, 20, or 30 years, while monitoring may continue even longer. Over this period of time, investigations are planned, samples collected, reports written, and remedial designs created, and following agency approval, one or more remedies may be implemented. Not only is turnover in personnel commonplace, but owing to the rebidding of national contracts, the firm assigned to do the work typically changes multiple times over the life span of a remedial project.

The investigation of a single large, potentially contaminated site often requires the collection of hundreds or even thousands of samples. A typical sample may be tested for the presence of several hundred chemicals, and many locations may be sampled multiple times per year over the course of many years. The end result is an extraordinary amount of information.

Keep in mind that this is just for one site. Large companies with manufacturing and/or production facilities often have anywhere from a few to several hundred sites. Those that also have a retail component to their operations (e.g., oil companies) can have thousands of sites. Add to this list compliance and reporting data, engineering studies, and real-time emission monitoring, and the amount of data becomes staggering and unmanageable by conventional databases and spreadsheets.

Given the magnitude and importance of this information, one would expect environmental data management to be a high-priority item in the overall strategy of any company subject to environmental laws and regulations. But this is not so; instead, our surveys of the industry reveal that a large portion of this information sits in spreadsheets and home-built databases. In short, you have an entire industry with billions in liability making decisions using tools that are not up to the task. Robust databases are standard tools in other industries, but for whatever reason, the environmental business has failed to fully embrace them.
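To put that volume in perspective, here is a back-of-envelope calculation; every count below is an illustrative assumption, not a figure from any particular site.

```python
# Rough, illustrative estimate of the analytical results one large site can
# generate over a monitoring program; every count here is an assumption.
locations = 500        # sampling locations at the site
events_per_year = 4    # quarterly sampling events
analytes = 200         # chemicals tested per sample
years = 20             # life of the monitoring program

records = locations * events_per_year * analytes * years
print(f"{records:,} analytical results for a single site")  # 8,000,000
```

Multiply that by hundreds or thousands of sites, plus compliance and real-time monitoring feeds, and it becomes clear why spreadsheets stop being a credible tool.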

As a result, many organizations and governmental agencies are simply “flying blind” when it comes to managing their environmental information.

The lack of standards and the inconsistencies in information management practices among the firms performing environmental work for a company impose a significant cost on the company’s overall environmental budget. The fact that some firms use spreadsheets, others their own databases, and still others various commercial applications may appear on the surface to be a benign practice, as each firm’s office uses the tools it is most comfortable with. But the overall cost to the customer is in fact enormous.

A Better Way

Is there a better approach that companies (both consultants and owners of environmental liability) can adopt to manage their environmental data? The solution seems obvious: get all the information about sites out of paper files, spreadsheets, and stand-alone or inaccessible databases, and into an electronic repository in a structured and formatted form that (and this is the crucial point) any project participant can access, preferably from the web, at any time and any place. In other words, the solution is not merely to use computers, but to use the web to link the parties involved in emission management or site cleanup, including not only site owners and their consultants but also regulators, laboratories, and insurers, thus making them, in current jargon, “interoperable.” This may be obvious, but today it is also a very distant goal.

What would the ideal IT architecture of the environmental industry look like in the future? It would start with wireless data entry by technicians in the field using mobile devices, and with wireless sensors where feasible. Labs would upload the results of analytical testing directly from their instrumentation and LIMS systems into the web-based database. During the upload process, any necessary error checking and data validation would take place automatically. Consultants would review these uploads and put their stamp of approval on the data before it becomes part of the permanent database. Air monitoring devices and sensors would automatically upload their measurements into the same system. Ditto for any water or air treatment systems installed at facilities, metering devices for consumption of energy, water, or fuel, and so on. Anything with an IP address and an internet connection that produces data relevant to environmental or sustainability monitoring should feed data into the same system. (In today’s world there is a term for it: the Internet of Things, or IoT.)
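The automatic error checking applied to lab uploads could take a form like the sketch below. The field names, accepted units, and rules here are hypothetical illustrations, not any agency's actual deliverable specification or any vendor's implementation.

```python
# Sketch of automated validation for one uploaded lab result row.
# Field names, units list, and rules are invented for illustration.
VALID_UNITS = {"ug/L", "mg/L", "mg/kg"}

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors found in one uploaded result row."""
    errors = []
    if not row.get("sample_id"):
        errors.append("missing sample_id")
    if row.get("units") not in VALID_UNITS:
        errors.append(f"unrecognized units: {row.get('units')}")
    try:
        if float(row.get("result", "")) < 0:
            errors.append("negative result value")
    except (TypeError, ValueError):
        errors.append("result is not numeric")
    return errors

row = {"sample_id": "MW-01-2015Q3", "analyte": "benzene",
       "result": "1.2", "units": "ug/L"}
assert validate_row(row) == []  # a clean row passes every check
```

Rows that fail any check would be flagged for the consultant's review before the data becomes part of the permanent database.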

Behind the scenes, all data would be formatted and stored according to recognized, standard protocols. Contrary to widespread concerns, this does not require a single central repository for all data or any particular hardware architecture. Instead, it relies on common software protocols and formats so that individual computer applications can find and talk to one another across the Internet. The good news is that most of these standards, such as XML, SOAP, AJAX, REST, and WSDL, already exist and are used by many industries. Others, such as DMR, SEDD, GRI, CDP, EDF, CROMERR, or EDD (spelling them out makes them sound no less obscure), are unique to the environmental industry and govern data interchange between laboratories, consultants, clients, and regulatory agencies. On top of these, there need to be hacker-proof layers of authentication and password protection so that only the right people can access critical or sensitive information.
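As a toy illustration of what format-based interchange buys: any two systems that agree on a schema can exchange records without a custom converter. The tag names below are invented for the example, not taken from any published deliverable standard.

```python
# Parse one hypothetical XML result record shared between two systems.
import xml.etree.ElementTree as ET

edd_record = """
<result>
  <sample_id>MW-01-2015Q3</sample_id>
  <analyte>benzene</analyte>
  <value units="ug/L">1.2</value>
</result>
"""

root = ET.fromstring(edd_record)
record = {
    "sample_id": root.findtext("sample_id"),
    "analyte": root.findtext("analyte"),
    "value": float(root.find("value").text),
    "units": root.find("value").get("units"),
}
```

Because the structure is self-describing, a lab, a consultant, and a regulator can each parse the same record with their own tools, which is what "interoperable" means in practice.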

There is still some work to do to refine these technologies, but the basic building blocks are already available and have been implemented by a few progressive companies and regulatory agencies. The problems that this changed approach would address are many. First, data would be entered or uploaded just once, preferably electronically. Second, data transfer costs would drop and data quality would improve: no longer would the need exist to transfer data whenever one consulting firm is replaced by another, or to maintain multiple databases that must be kept in sync. Third, the significant amounts of time that engineers, managers, and scientists now spend determining whether a particular report is correct, or looking up information on a site, would dramatically decline. Fourth, by having their data in a consistent electronic format, companies would be in a better position to comply with the emerging demand to upload information on their sites to state or federal agencies and organizations. Several progressive states have already imposed electronic deliverable standards (e.g., California and New Jersey), and the US EPA is working on its own standards based on XML technology. Last, and most significantly, site owners would assume possession of their data and, as such, finally gain ready access to information about their own sites. This would seem particularly beneficial to public companies attempting to comply with the Sarbanes-Oxley Act (SOX).

The good news is that a system like the one described above already exists.

We would love to discuss your environmental data situation with you. Contact us or call (650) 960-1640.