There are two promising technologies about to change how we aggregate and manage EHS+S data: artificial intelligence (AI) and blockchain. History has consistently shown that the cost of a technology decreases over time while its impact increases. We still lack access to enough global information for AI to make a significant dent in global greenhouse gas (GHG) emissions merely by providing better tools for emissions management. For example, a large share of global energy consumption goes to treating and moving water, and much of it is wasted; AI can help optimize both. Along the way, water quality management becomes an add-on app.
AI is a collective term for technologies that can sense their environment, think, learn, and act in response to what they’re detecting and their objectives. Possible applications include (1) automation of routine tasks like sampling and analysis of water samples, (2) segregation of waste disposal streams based on the waste container’s contents, (3) augmentation of human decision-making, and (4) automation of water treatment systems. AI systems can greatly aid the process of discovery, processing and analyzing vast amounts of data to spot and act on patterns, skills that are difficult for humans to match. AI can be harnessed in a wide range of EHS compliance activities and situations to help manage environmental impacts and climate change. Examples include permit interpretation and response to regulatory agencies, precision sampling, predicting natural attenuation of chemicals in water or air, managing sustainable supply chains, automating environmental monitoring and enforcement, and enhanced sampling and analysis based on real-time weather forecasts. Applying AI to water resource prediction, management, and monitoring can help ameliorate the global water crisis by reducing or eliminating waste while lowering costs and lessening environmental impacts. A similar argument holds for air emissions management.
The onset of blockchain technology will have an even bigger impact. It will first liberate data and, second, decentralize monitoring while simultaneously centralizing emissions management. It may sound contradictory, but we need to decentralize in order to centralize management and aggregate relevant data across corporations and governmental organizations without jeopardizing anyone’s privacy. That is the power of blockchain technology. Blockchain will eliminate the need for costly synchronization among stakeholders: corporations, regulators, consultants, labs, and the public. What we need is secure and easy access to any data with infinite scalability. It is inevitable that blockchain technology will become more accessible with reduced infrastructure over the next few decades. By reduced infrastructure, I mean replacing massive centralized databases controlled by one of the big four internet companies under the hub-and-spoke model with device-to-device communication with no intermediaries.
This post was originally published in Environmental Business Journal in June of 2020.
Neno Duplan, April 13, 2022: Artificial Intelligence & Blockchain Applied to Water & Energy
Locus Founder and CEO Neno Duplan recently sat down with Grant Ferrier of Environmental Business International to discuss a range of topics relating to technology in the environmental industry, such as artificial intelligence, blockchain, multi-tenancy, IoT, and much more.
Locus Product Team, June 25, 2021: Emerging Technology and the Environmental Industry
In August 2014, we wrote on the potential use of wearables for EHS professionals. Less than a year later, the Apple Watch was introduced, revolutionizing the market. Now, wearables in the EHS space are no longer hypothetical. Roughly a fifth to a quarter of Americans wear a smartwatch daily. Wearables are undoubtedly one of the biggest trends in EHS, with a seemingly endless number of uses to promote a more efficient and safer workplace.
Despite recent growth, wearables are still in their infancy when it comes to EHS. Verdantix anticipates that company spending on connected worker devices will grow by 800% over the next twenty years, an explosion in utilization. This year alone, over 20% of surveyed companies report an increased budget for wearables for EHS purposes. While demand from organizations is growing, most EHS software has yet to adapt to market needs, with few products offering wearable support.
Locus is prepared to meet the needs of the market by integrating wearable support with our mobile application. Here are a few ways to best utilize your smartwatch with Locus Mobile:
Custom and priority notifications can be tailored to fit the needs of professionals in your organization, increasing engagement and improving response time.
The recent year of lockdowns pushed many daily activities into the virtual world. Work, school, commerce, the arts, and even medicine have moved online and into the cloud. As a result, considerably more resources and information are now available from an internet browser or from an application on a handheld device. To navigate through all this content and make sense of it, you need the ability to quickly search and get results that are most relevant to your needs.
You can think of the web as a big database in the cloud. Traditionally, database searches were done using a precise syntax with a standard set of keywords and rules, and it can be hard for non-specialists to perform such searches without learning a programming language. Instead, you want to search in as natural a manner as possible. For example, if you want to find pizza shops within 15 miles of your house that offer delivery, you don’t want to write some fancy statement like “return pizza_shop_name where (distance to pizza shop from my house < 15 miles) and (offers_delivery is true)”. You just want to type “what pizza shops within 15 miles of my house offer delivery?” How can this be done?
Enter the search engine. While online search engines appeared as early as 1990, it wasn’t until Yahoo! Search appeared in 1995 that their usage became widespread. Other engines such as Magellan, Lycos, Infoseek, Ask Jeeves, and Excite soon followed, though not all of them survived. In 1998, Google hit the internet, and it is now the most dominant engine in use. Other popular engines today are Bing, Baidu, and DuckDuckGo.
Current search engines compare your search terms to proprietary indexes of web pages and their content. Algorithms determine the most relevant parts of the search terms and how the results are ranked on the page. Your search success depends on what search terms you enter (and what terms you don’t enter). For example, it is better to search on ‘pizza nearby delivery’ than ‘what pizza shops that deliver are near my house’, as the first search uses fewer terms and thus more effectively narrows the results.
Search engines also support the use of symbols (such as hyphens, colons, quote marks) and commands (such as ‘related’, ‘site’, or ‘link’) that enable advanced searches for finding exact word matches, excluding certain results, or limiting your search to certain sites. To expand on the pizza example, suppose you wanted to search for nearby pizza shops, but you don’t want to include Nogud Pizza Joints because they always put pineapple on your pizza. You would need to enter ‘pizza nearby delivery -nogud’. In some ways, with the need to know special syntax, searching is back where it was in the old database days!
Search engines are also a key part of ‘digital personal assistants’, or programs that not only perform searches but also perform simple tasks. An assistant on your phone might call the closest pizza shop so you can place an order, or perhaps even log in to your loyalty app and place the order for you. There is a dizzying array of such assistants used within various devices and applications, and they all seem to have soothing names such as Siri, Alexa, Erica, and Bixby. Many of these assistants support voice activation, which just reinforces the need for natural searches. You don’t want to have to say “pizza nearby delivery minus nogud”! You just want to say “call the nearest pizza shop that does delivery, but don’t call Nogud Pizza”.
Search engine and digital personal assistant developers are working towards supporting such “natural” requests by implementing “natural language processing”. Using natural language processing, you can use full sentences with common words instead of having to remember keywords or symbols. It’s like having a conversation as opposed to doing programming. Natural language is more intuitive and can help users with poor search strategies to have more successful searches.
Furthermore, some engines and assistants have artificial intelligence (AI) built in to help guide the user if the search is not clear or if the results need further refinement. What if the closest pizza shop that does delivery is closed? Or what if a slightly farther pizza place is running a two-for-one special on your favorite pizza? The built-in AI could suggest choices to you based on your search parameters combined with your past pizza purchasing history, which would be available based on your phone call or credit charge history.
Searching in Locus EIM
The Locus team recently expanded the functionality of the EIM (Environmental Information Management) search bar to support different types of data searches. If a search term fits several search types, all are returned for the user to review.
Functionality searches: entering a word that appears in a menu or function name will return any matching menu items and functions. For example, searching for ‘regulatory exports’ returns several menu items for creating, managing, and exporting regulatory datasets.
Help searches: entering a word or phrase that appears in the EIM help files will return any matching help pages. For example, ‘print a COC’ returns help pages with that exact phrase.
Data searches: entering a location, parameter, field parameter, or field sample will return any matching data records linked with that entity. For example, searching for the parameter ‘tritium’ returns linked pages showing parameter information and all field sample results for that parameter. Searching for the location ‘MW-1’ returns linked pages showing all field samples, groundwater levels, field measurements, and field sample results at the location.
EIM lets the user perform successful searches through various methods. In all searches, the user does not need to specify if the search term is a menu item, help page, or data entity such as parameter or location. Rather, the search bar determines the most relevant results based on the data currently in EIM. Furthermore, the search bar remembers what users searched for before, and then ranks the results based on that history. If a user always goes to a page of groundwater levels when searching for location ‘MW-1’, then that page will be returned first in the list of results. Also, the EIM search bar supports common synonyms. For example, searches for ‘plot’, ‘chart’, and ‘graph’ all return results for EIM’s charting package.
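As a rough sketch of how synonym support and history-based ranking like this can work (the function and data names below are illustrative, not actual EIM code):

```python
# Hypothetical sketch of a search bar with synonym expansion and
# per-user history weighting. SYNONYMS, rank_results, and the sample
# data are illustrative stand-ins, not Locus EIM internals.

SYNONYMS = {"plot": "chart", "graph": "chart"}  # all map to the charting package

def normalize(term):
    """Lowercase a search term and map known synonyms to a canonical form."""
    return SYNONYMS.get(term.lower(), term.lower())

def rank_results(term, candidates, history):
    """Order candidate result pages by how often this user chose them
    for this search term in the past."""
    past = history.get(normalize(term), {})
    return sorted(candidates, key=lambda page: past.get(page, 0), reverse=True)

# A user who always opens groundwater levels for 'MW-1' sees that page first:
history = {"mw-1": {"groundwater_levels": 12, "field_samples": 2}}
pages = ["field_samples", "field_results", "groundwater_levels"]
print(rank_results("MW-1", pages, history)[0])  # groundwater_levels
```

The key design point is that ranking is driven by data the system already has (usage history), so no extra input is required from the user.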
By implementing the assistance methods described above, Locus is working to make searching as easy as possible. As part of that effort, Locus is adding natural language processing to EIM searches. The goal is to let users conduct searches such as ‘what wells at my site have benzene exceedances’ or perform tasks such as ‘make a chart of benzene results’ without having to know special commands or query languages.
How would this be done? Let’s set aside for now the issues of speech recognition – sadly, you won’t be talking to EIM soon! Assume your search query is ‘what is the maximum lead result for well 1A?’
First, EIM extracts key terms and modifiers (this is called entity recognition). EIM would extract ‘maximum’, ‘lead’, ‘result’, ‘well’, and ‘1A’, while ignoring connecting words such as ‘the’ or ‘for’.
Then, EIM categorizes these terms. EIM would be ‘trained’ via AI to know ‘lead’ is mostly used in environmental data as a noun for the chemical parameter, and not a verb. ‘Result’ refers to a lab result, and ‘well’ is a standard sampling location type.
EIM then runs a simple query and gets the maximum lead result for location 1A.
Finally, EIM puts the answer into a sentence (‘The maximum lead result at location 1A is 300 mg/L on 1/1/2020’) with any other information deemed useful, such as the units and the date.
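The four steps above can be sketched in simplified form; the vocabularies, sample data, and function names here are toy stand-ins rather than the actual EIM implementation:

```python
# Toy sketch of the pipeline: extract key terms, categorize them,
# query the data, and phrase the answer as a sentence.

STOPWORDS = {"what", "is", "the", "for", "a", "an", "of"}
CATEGORIES = {                      # a "trained" vocabulary, hypothetical
    "maximum": "aggregate", "lead": "parameter",
    "result": "value_type", "well": "location_type",
}

def extract_terms(question):
    """Step 1: entity recognition - keep key terms, drop connecting words."""
    return [w for w in question.lower().strip("?").split() if w not in STOPWORDS]

def categorize(terms):
    """Step 2: anything not in the vocabulary is assumed to be an
    identifier, e.g. the well name '1A'."""
    return {t: CATEGORIES.get(t, "identifier") for t in terms}

def answer(question, results):
    """Steps 3 and 4: run a simple query and phrase the answer."""
    terms = categorize(extract_terms(question))
    location = next(t for t, c in terms.items() if c == "identifier")
    best = max(results[location], key=lambda r: r["value"])
    return (f"The maximum lead result at location {location.upper()} is "
            f"{best['value']} {best['units']} on {best['date']}")

results = {"1a": [
    {"value": 120, "units": "mg/L", "date": "6/1/2019"},
    {"value": 300, "units": "mg/L", "date": "1/1/2020"},
]}
print(answer("What is the maximum lead result for well 1A?", results))
```

A real implementation would use a trained language model rather than fixed word lists, but the flow through the four steps is the same.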
A similar process could be done for tasks such as ‘make a chart of xylene results’. In this case, however, there is too much ambiguity to proceed, so EIM would need to ask follow-up questions to guide the user to the desired result. Should the chart show all dates, or just a certain date range? How are non-detects handled? Which locations should be shown on the chart? What if the database stores separate results for o-Xylene, m,p-Xylene, plus Xylene (total)? Once all questions were answered, EIM could generate a chart and return it to the user.
Natural language is the key to helping users construct effective searches for data, whether in EIM, on a phone, or in the internet. Locus continues to improve EIM by bringing natural language processing to the EIM search engine.
About the Author—Dr. Todd Pierce, Locus Technologies
Dr. Pierce manages a team of programmers tasked with development and implementation of Locus’ EIM application, which lets users manage their environmental data in the cloud using Software-as-a-Service technology. Dr. Pierce is also directly responsible for research and development of Locus’ GIS (geographic information systems) and visualization tools for mapping analytical and subsurface data. Dr. Pierce earned his GIS Professional (GISP) certification in 2010.
Dr. Todd Pierce, May 11, 2021: Quicker Data Searching with Natural Language Processing
Regardless of the size of your organization or the industry you’re in, chances are that right now artificial intelligence can benefit your EHS&S initiatives in one way or another. And whether you are ready for it or not, the age of artificial intelligence is coming. Forward-thinking and adaptive businesses are already using artificial intelligence in EHS&S as a competitive advantage in the marketplace to great success.
With modern EHS&S software, immense amounts of computing power, and seemingly endless cloud storage, you now have the tools to achieve fully-realized AI for your EHS&S program. And while you may not be ready to take the plunge into AI just yet, there are some steps you can take to implement artificial intelligence into your EHS&S program in the future.
Perhaps the best aspect of preparing for AI implementation is that all of the steps you take to properly bring about an AI system will benefit your program even before the deployment phase. Accurate sources, validated data, and one system of record are all important factors for any EHS&S team.
Accurate Sources
Used alongside big data, AI can quickly draw inferences and conclusions about many aspects of life more efficiently than human analysis can, but only if your sources pull accurate data. Accurate source data will help your organization regardless of your current AI usage level. That’s why the first step toward implementing artificial intelligence is auditing your data sources.
Sources pulling accurate data can be achieved with some common best practices. First, separate your data repository from the process that analyzes the data. This allows you to repeat the same analysis on different sets of data without fear of being unable to replicate the analysis process. AI requires stepping away from Excel-based or in-house software and moving to modern EHS&S software, like Locus Platform, that audits your data as it is entered. This means that anything from SCADA to historical outputs, samples, and calculations can be entered and vetted. Further, consider checking your data against other sources and doing exploratory analysis to further validate your data.
Validated Data
AI requires data, and a lot of it—aggregated from multiple sources. But no amount of predictive analysis or machine learning is going to be worth anything without proper data validation processes.
Collected data must be relevant to the problem you are trying to solve. Therefore, you need validated data, which is a truly difficult ask with Excel, in-house platforms, and other EHS&S software. Appropriate inputs, appropriate ranges, data consistency, and range checks (to name a few) are all aspects of data validation in modern EHS&S software like Locus Platform. Without these checks built into a platform, you cannot be sure that your data or your analyses are producing useful or accurate results.
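To illustrate, here is a minimal sketch of such validation rules applied at data entry; the record format, field names, and ranges are hypothetical:

```python
# Minimal sketch of entry-time validation checks (range checks,
# required fields). VALID_RANGES and the record layout are made up
# for illustration; a real platform would configure these per site.

VALID_RANGES = {"pH": (0.0, 14.0), "Temperature": (-10.0, 60.0)}  # pH units, deg C

def validate(record):
    """Return a list of validation errors for one measurement record."""
    errors = []
    param, value = record["parameter"], record["value"]
    if param not in VALID_RANGES:
        errors.append(f"unknown parameter: {param}")
    else:
        lo, hi = VALID_RANGES[param]
        if not lo <= value <= hi:
            errors.append(f"{param} value {value} outside range [{lo}, {hi}]")
    if not record.get("sample_id"):
        errors.append("missing sample_id")
    return errors

# An impossible pH is flagged before it ever reaches an analysis:
print(validate({"parameter": "pH", "value": 15.2, "sample_id": "FS-001"}))
```

Running every incoming record through checks like these is what makes the downstream analyses, and eventually the AI built on them, trustworthy.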
Possibly the best reason to get started with AI is the waterfall effect: as your models uncover hidden insights and learn on their own, your new data becomes more accurate and your predictions get better.
One System of Record
A unified system of record and a central repository for all data means that you see an immediate increase in data quality. Starting with AI means the end of disconnected EHS&S systems. No more transferring data from one platform to another or from pen and paper, just fully-digitized and mobile-enabled data in one platform backed up in the cloud. You also gain the added benefit of being able to access your data in real-time, incorporate compliance/reporting on the fly, and save time and resources using a scalable solution instead of a web of spreadsheets and ad-hoc databases.
Whether you are ready for AI or not, investing in these otherwise useful steps is necessary for any program looking to harness the power of artificial intelligence. When you are ready to take that next step, you will be well on the path to AI implementation, with a solid data infrastructure in place for your efforts.
Locus Product Team, May 15, 2020: AI for EHS&S: Three Essential Steps to Get Started
At Locus Technologies, we’re always looking for innovative ways to help water users get more from their data. One way we can do that is with powerful technologies such as machine learning. Machine learning is a powerful tool for analyzing environmental data, including water quality, and can form a backbone for competent AI systems that help manage and monitor water. Done correctly, it can even predict the future quality of a water system. Such a versatile method is a huge asset when analyzing water quality data.
To explore machine learning with water data, we will use some groundwater data collected from Locus EIM, which can be loaded into Locus Platform with our API. Using this data, which includes various measurements of water quality such as turbidity, we will build a model that estimates the pH of the water source from the other parameters, to an error of about 1 pH point. For this post, we will build the model in Python in a Jupyter Notebook environment.
When building a machine learning model, the first thing you need to do is get to know your data a bit. In this case, our EIM water data has 16,114 separate measurements. Plus, each of these measurements has a lot of info, including the Site ID, Location ID, the Field Parameter measured, the Measurement Date and Time, the Field Measurement itself, the Measurement Units, Field Sample ID and Comments, and the Latitude and Longitude. So, we need to do some janitorial work on our data. We can get rid of some columns we don’t need and separate the field measurements based on which specific parameter they measure and the time they were taken. Now, we have a datasheet with the columns Location ID, Year, Measurement Date, Measurement Time, Casing Volume, Dissolved Oxygen, Flow, Oxidation-Reduction Potential, pH, Specific Conductance, Temperature, and Turbidity, where the last eight are the parameters which had been measured. A small section of it is below.
Alright, now our data is better organized, and we can move over to Jupyter Notebook. But we still need to do a bit more maintenance. By looking at the specifics of our data set, we can see one major problem immediately. As shown in the picture below, the Casing Volume parameter has only 6 values. Since so much is missing, this parameter is useless for prediction, and we’ll remove it from the set.
We can check the set and see that some of our measurements have missing data. In fact, 261 of them have no data for pH. To train a model, we need data with a result for our target, so these rows must be thrown out. Our dataset will then have a value for pH in every row but might still have missing values in the other columns. We can deal with these missing values in a number of ways, and it might be worth dropping columns that are missing too much, as we did with Casing Volume. Luckily, none of our other parameters are missing that much data, so for this example I filled in empty spaces in the other columns with the average of the other measurements. If you do this, however, you must first eliminate any major outliers that might skew the average.
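With pandas, the cleanup steps described above might look like the following sketch; the column names mirror those in our dataset, but the values are made up:

```python
import pandas as pd

# Toy frame standing in for the EIM export: drop a mostly-empty column,
# drop rows missing the target (pH), and mean-fill the remaining gaps.
df = pd.DataFrame({
    "Casing Volume": [None, None, 2.1, None],
    "pH":            [7.1, None, 6.8, 7.4],
    "Turbidity":     [1.2, 0.9, None, 1.1],
})

df = df.drop(columns=["Casing Volume"])  # too sparse to be predictive
df = df.dropna(subset=["pH"])            # the target must be present to train
df = df.fillna(df.mean())                # mean-fill gaps in the predictors
print(df)
```

Remember the outlier caveat: `df.mean()` here is only a safe fill value after any major outliers have been removed from the columns being averaged.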
Once your data is usable, then it is time to start building a model! You can start off by creating some helpful graphs, such as a correlation matrix, which can show the relationships between parameters.
For this example, we will build our model with the library Keras. Once the features and targets have been chosen, we can construct a model with code such as this:
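A sketch of such a model using the tensorflow.keras API follows; the input shape (six predictor parameters remaining after dropping Casing Volume and pH) and the optimizer and loss choices are assumptions, since only the layer structure is described:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Four layers as described: three hidden Dense layers of 64 nodes
# (ReLU, ReLU, sigmoid) and a single-node output for the predicted pH.
model = keras.Sequential([
    keras.Input(shape=(6,)),            # six predictor field parameters (assumed)
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="sigmoid"),
    layers.Dense(1),                    # output node: predicted pH
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])  # choices assumed
model.summary()
```

Mean squared error is a natural loss here because pH prediction is a regression problem, but any of these choices could be tuned.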
This code will create a sequential deep learning model with 4 layers. The first three all have 64 nodes, and of them, the initial two use a rectified linear unit activation function, while the third uses a sigmoid activation function. The fourth layer has a single node and serves as the output.
Our model must be trained on the data, which is usually split into training and test sets. In this case, we will put 80% of the data into the training set and 20% into the test set; from the training set, 20% will be held out as a validation subset. Our model then examines the data points and the corresponding pH values and fits a relationship. With Keras, you can save a history of the reduction in error throughout the fit for plotting, which is useful when analyzing results. For our model, the training error gradually decreases as it learns a relationship between the parameters.
The end result is a trained model which has been tested on the test set and resulted in a certain error. When we ran the code, the test set error value was 1.11. As we are predicting pH, a full point of error could be fairly large, but the precision required of any model will depend on the situation. This error could be improved through modifying the model itself, for example by adjusting the learning rate or restructuring layers.
You can also graph the true target values with the model’s predictions, which can help when analyzing where the model can be improved. In our case, pH values in the middle of the range seem fairly accurate, but towards the higher values they become more unreliable.
So what do we do now that we have this model? In a sense, what is the point of machine learning? One of its major strengths is prediction. Say that we later acquire some data on a water source without information on the pH value. As long as the rest of the data is intact, we can predict what that value should be. Machine learning can also be incorporated into analyses such as time series to forecast a trend of predictions. Overall, machine learning is a very important part of data analytics and the development of powerful AI systems, and its importance will only increase in the future.
What’s next?
As the technology around machine learning and artificial intelligence evolves, Locus will be working to integrate these tools into our EHS software. More accurate predictions will lead to more insightful data, empowering our customers to make better business decisions.
Contact us today to learn how machine learning and AI can help your EHS program thrive.
Locus Product Team, September 11, 2019: Predicting Water Quality with Machine Learning
AI and Big Data to Drive EHS Decisions via Multi-tenant SaaS
With data and information streaming from devices like fire hydrants, there is little benefit in raw data unless the company that owns it has a way to integrate it into its system of record and pair it with regulatory databases and GIS. That is where advances in SaaS tools and data-source mashups have helped set the stage for AI as a growing need.
Humans are not very good at analyzing large datasets. This is particularly true of the planetary-scale data, now growing exponentially, needed to understand the causes of climate change and fight it. Faced with a proliferation of new regulations and pressure to make their companies “sustainable,” EHS departments keep adding more compliance officers, managers, and outside consultants instead of investing in technology that can help them. Soon, they will be turning to AI to stay on top of the ever-changing regulatory landscape.
AI, in addition to being faster and more accurate, should make compliance easier. Companies spend too much time and effort on comprehensive quarterly or annual reporting, only to have to duplicate the work for the next reporting period. An integrated approach, aided by AI, will automate these repetitive tasks, which beats running separate analyses on every silo of information before each conversation with regulators.
In summary, whether it is being used to help with GHG emissions monitoring and reporting, water quality management, waste management, incident management, or other general compliance functions, AI can improve efficiency, weed out false-positive results, cut costs and make better use of managers’ time and company resources.
Another advantage of AI, assuming it is deployed properly, is its inherent neutrality in data evaluation and decision-making. Time and again we read about psychological studies and surveys showing that people on opposite sides of a question or topic cannot even agree on the “facts.” It should not be surprising, then, that EHS managers and engineers are often limited by their biases. As Nobel Memorial Prize in Economics laureate Daniel Kahneman noted in his best-selling book “Thinking, Fast and Slow,” when making decisions, people frequently see what they want, ignore probabilities, and minimize risks that threaten their hopes. Even worse, they are often confident even when they are wrong. Algorithms with built-in AI are more likely to detect our errors than we are. AI-driven intelligent databases are now becoming powerful enough to help us reduce human biases in our decision-making. For that reason, large datasets, applied analytics, and advanced charting and data visualization tools will soon be driving daily EHS decisions.
In the past, companies relied almost exclusively on on-premise software (or single-tenant cloud software, which is not much different). Barriers were strewn everywhere. Legacy systems did not talk to one another, as few of them shared interfaces. Getting data into third-party apps usually required exporting the information in a prescribed format, then importing it into the third-party app for further processing and analysis. Sometimes data was duplicated across multiple systems and apps to avoid the headache of moving it from one to another. As the world moves to the multi-tenant SaaS cloud, all this is now changing. Customers are being given the opportunity to analyze not just their own company’s data, but data from other companies and from different but potentially related categories via mashups. As they do so, interesting patterns are beginning to emerge.
The explosion of content—especially unstructured content—is an opportunity and an obstacle for every business today.
The emergence of artificial intelligence is a game-changer for enterprise EHS and content management because it can deliver business insights at scale and make EHS compliance more productive. There are numerous advantages when you combine the leading multi-tenant EHS software with AI:
Ability to handle the explosion of unstructured content where legacy on-premise EHS solutions can’t.
AI can organize, illuminate, and extract valuable business insights if all your content is managed in one secure location in the cloud.
Locus helps you take advantage of best-of-breed AI technologies from industry leaders and apply them to all your content.
The most recent NAEM white paper, Why Companies Replace Their EHS&S Software Systems, shows that the ability to integrate with other systems is a top priority for buyers. Once the ability to share and consolidate data is available, AI is not far behind in the next generation of EHS/water quality software.
This concludes the four-part blog series on Big Data, IoT, AI, and multi-tenancy. We look forward to feedback on our ideas and are interested in hearing where others see the future of AI in EHS software – contact us for more discussion or ideas! Read the full Series: Part One, Part Two, Part Three.
Contact us to learn more about how Locus uses IoT and AI.
Locus Product Team, August 2, 2019: Artificial Intelligence and Environmental Compliance–Revisited–Part 4: AI, Big Data + Multi-Tenancy = The Perfect System
Multi-tenancy offers distinct benefits over traditional, single-tenant software hosting. A multi-tenant SaaS provider’s resources are focused on maintaining a single, current version of the application, rather than being diluted in an attempt to support multiple software versions for its customers. A provider not using multi-tenancy may be hosting or supporting thousands of single-tenant customer implementations. In that case, the provider cannot aggregate information across customers and extract knowledge from large datasets, as every customer may be housed on a different server and possibly a different version of the software. For these reasons, it is almost impossible, and prohibitively expensive, to deliver modern AI tools via single-tenancy.
Multi-tenancy has other advantages as well. Because every customer is on the same version of the software and the same instance, machine learning (a prerequisite for building an AI system) can happen more quickly, as large datasets are constantly fed into a single system. A multi-tenant SaaS vendor can integrate and deploy new AI features more quickly, more frequently, and to all customers at once. Lastly, a single software version creates more of a sense of community among users and makes it easier for customers to share their lessons learned with one another (if they choose to do so). Most of today’s vendors in the EH&S software space cannot offer AI, sustain their businesses, and grow unless they are true multi-tenant SaaS providers. Very few are.
AI
Almost 30 years after the publication of our paper on the hazardous data explosion, SaaS technologies combined with other advancements in big data processing are rising to the challenge of successfully processing, analyzing, and interpreting large quantities of environmental and sustainability data. It is finally time to stop saying that AI is a promising technology of the future. A recent Gartner study indicates that about 20 percent of data will be created or gathered by computers by 2018. Six billion devices will acquire the ability to connect and share data with each other. This alone will fuel AI growth, as we humans cannot interpret such massive amounts of data.
Gone are the days when EHS software was just a database. Two factors are fueling the adoption of AI technologies for water quality management and EHS compliance. First, there is the vast increase, mentioned above, in data that needs sorting and understanding (big data). Second, there is the move to true multi-tenant SaaS solutions, which enables the intake and dissection of data from multiple digital sources (streaming data) from multiple customers, all in real time.
AI has entered the mainstream with the backing and advocacy of companies like IBM, Google, and Salesforce, who are heavily investing in the technology and generating a lot of buzz (and we are seeing the consequent talent war happening industry-wide). It is remarkable to observe how quickly AI is proliferating in so many verticals, as CBS’s 60 Minutes segment showed us.
For our purposes, let’s look at where AI is likely to be applied in the EHS space. The mission-critical problem for EHS enterprise software companies is finding solutions that both enhance compliance and reduce manual labor and costs. This is where AI will play a major role. So far, companies have largely focused on aggregating their data in a record system(s); they have done little to interpret that data without human interaction. To address the ever-changing growth in environmental regulations, companies have been throwing people at the problem, but that is not sustainable.
AI and natural language processing (NLP) systems have matured enough to read through the legalese of regulations, couple them with a company’s monitoring and emissions data, and generate suggestions for actions based on the relevant regulations and data. Take, for example, a CEMS (continuous emission monitoring system) installed at many plants to monitor air emissions in real time, or a drinking water supply system monitoring for water quality. In each of these systems, too many transactions are taking place to manually ascertain which ones are compliant and which are not. I see no reason why algorithms similar to those used for computerized trading (as described in the recent best-seller “Flash Boys”) to trade stocks in fractions of a second cannot be used for monitoring exceedances and automatically shutting down discharges when an emission exceedance is approaching. It is an onerous task to figure out every exceedance on a case-by-case basis. Intelligent databases with a built-in AI layer can interpret data on arrival and signal when emissions exceed prescribed limits or when other things go wrong. The main drivers behind applying AI to EHS compliance are to lower costs, to increase the quality of EHS compliance, data management, and interpretation, and ultimately, to avoid all fines for exceedances.
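The rule such an AI layer might enforce on arriving readings can be sketched very simply. The pollutant names, permit limits, and approach margin below are hypothetical examples, not regulatory values:

```python
# Illustrative sketch: screening incoming CEMS readings against permit limits
# and signaling a discharge shutdown when an exceedance is approaching.
# All pollutant names, limits, and the margin are hypothetical examples.

PERMIT_LIMITS = {"NOx": 50.0, "SO2": 75.0}  # mg/m3, illustrative values
APPROACH_MARGIN = 0.9  # act when a reading reaches 90% of its limit

def evaluate_reading(pollutant: str, value: float) -> str:
    """Classify a single emissions reading against its permit limit."""
    limit = PERMIT_LIMITS[pollutant]
    if value > limit:
        return "EXCEEDANCE"          # already out of compliance; log and report
    if value >= APPROACH_MARGIN * limit:
        return "SHUTDOWN_DISCHARGE"  # approaching the limit; curtail the source
    return "COMPLIANT"

for pollutant, value in [("NOx", 32.1), ("NOx", 47.0), ("SO2", 80.2)]:
    print(pollutant, value, evaluate_reading(pollutant, value))
```

A production system would of course learn limits and margins from permit text and historical data rather than hard-code them, but the real-time classify-and-act loop is the same.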
For example, a large water utility company has to wade through thousands of analytical results to look for outliers among the few dozen chemicals it is required to monitor to stay compliant. Some of these may be false positives, but that still leaves some results to be investigated as outliers. Each of those investigations can take time. However, if a software algorithm has access to the analytical results and can determine that the problem rests with a test in the lab, that problem can be resolved quickly, almost without human interaction. That is powerful.
Combing through this data by hand or via spreadsheet could take days, creating a colossal waste of time and uncertainty. Hundreds of billable hours can be wasted with no guaranteed result. Using AI-driven SaaS software to determine which outliers need investigation allows compliance managers, engineers, and chemists to focus their expertise on just those cases and avoid wasting their time on the remaining ones that the AI engine indicates need no further examination.
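The triage step can be as simple as a robust statistical screen over each analyte’s results. This is a minimal sketch, assuming a robust z-score (median/MAD) test; the 3.5 threshold is a common rule of thumb, not a regulatory value, and the concentrations are invented:

```python
# Illustrative sketch: flag only the analytical results that warrant human
# investigation, using a robust z-score (median / median absolute deviation).
from statistics import median

def flag_outliers(results: list[float], threshold: float = 3.5) -> list[int]:
    """Return indices of results that stand out from the rest of the batch."""
    med = median(results)
    mad = median(abs(x - med) for x in results) or 1e-9  # avoid divide-by-zero
    return [i for i, x in enumerate(results)
            if abs(0.6745 * (x - med) / mad) > threshold]

# e.g. benzene concentrations (ug/L) from routine monitoring, one suspect value:
concentrations = [1.1, 1.3, 1.2, 1.0, 9.8, 1.2, 1.1]
print(flag_outliers(concentrations))  # → [4], the 9.8 result
```

Reviewers then investigate only the flagged indices; the rest of the batch passes without human interaction.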
Predictive analytics based on big data and AI will also make customer data (legacy and new) work harder for customers than any team of consultants can. A good analogy that came to me after watching 60 Minutes: just as the clinical center in North Carolina used AI to improve cancer treatment for its patients, engineers and geologists can improve the selection of a site remedy that is optimized for the given site conditions, leading to a faster, less expensive cleanup with minimal long-term monitoring requirements.
A final example of where AI will play a role is enterprise carbon management. SaaS software is capable of integrating data from multiple sources and analyzing and aggregating it. This aggregated information can then be distributed to a company’s divisions or regulatory agencies for final reporting and validation/verification, all in real time. This approach can save companies a great deal of time and resources. Companies will be able to access information from thousands of emission sources across the states, provinces, and even countries where their plants are located. Because each plant is likely to have its own set of regulatory drivers and reporting requirements, these would have to be incorporated into the calculation and reporting engine. After data from each plant is uploaded to a central processing facility, the information would be translated into a “common language,” the correct calculation formulae and reporting requirements applied, and the results then returned to each division in a format suitable for reporting internally and externally.
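The “common language” step amounts to normalizing each plant’s units and applying its region’s factors. A minimal sketch, in which the plant names, unit conversions, and emission factors are all hypothetical placeholders rather than published regulatory values:

```python
# Illustrative sketch: normalize per-plant fuel-use reports to a common unit,
# then apply a region-specific emission factor before enterprise roll-up.
# Conversions and factors below are invented for illustration only.

MMBTU_PER_UNIT = {"therm": 0.1, "mmbtu": 1.0}            # unit normalization
FACTOR_KGCO2_PER_MMBTU = {"US-CA": 53.1, "EU-DE": 55.0}  # per-region factors

def plant_emissions_kg(fuel_qty: float, unit: str, region: str) -> float:
    """Convert one plant's fuel use to kg CO2 using its region's factor."""
    mmbtu = fuel_qty * MMBTU_PER_UNIT[unit]
    return mmbtu * FACTOR_KGCO2_PER_MMBTU[region]

reports = [
    {"plant": "Plant A", "fuel_qty": 1200.0, "unit": "therm", "region": "US-CA"},
    {"plant": "Plant B", "fuel_qty": 95.0, "unit": "mmbtu", "region": "EU-DE"},
]
total = sum(plant_emissions_kg(r["fuel_qty"], r["unit"], r["region"])
            for r in reports)
print(f"Enterprise total: {total:.0f} kg CO2")
```

In a real engine, the factor tables would be driven by each plant’s regulatory program, but the normalize-calculate-return pattern is the same.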
Blockchain for EHS—Looking ahead
And finally, another emerging technology, blockchain, will further augment the power of AI for EHS monitoring and compliance. While blockchain is in its infancy, its decentralized approach, coupled with AI, will bring another revolution to EHS compliance and water monitoring.
Parts one, two, and four of this blog series complete the overview of Big Data, IoT, AI, and multi-tenancy. We look forward to feedback on our ideas and are interested in hearing where others see the future of AI in EHS software – contact us for more discussion or ideas!
Locus Product Team | Published 2019-07-19, updated 2020-03-06 | Artificial Intelligence and Environmental Compliance–Revisited–Part 3: Multi-Tenancy and AI
More recently, big data has become more closely tied to IoT-generated streaming datasets such as continuous emission monitoring systems (CEMS), real-time remote control and monitoring of treatment systems, water quality monitoring instrumentation, wireless sensors, and other types of wearable mobile devices. Add digitized historical records to this data streaming, and you end up with a deluge of data. (To learn more about big data and IoT trends in the EHS industry, please read this article: Keeping the Pulse on the Planet using Big Data.)
In the 1989 “Hazardous Data Explosion” article that I mentioned earlier, we first identified the limitations of relational database technology in interpreting data and the important role that IoT (automation, as it was called at the time) and AI were going to play in the EHS industry. We wrote:
“It seems unavoidable that new or improved automated data processing techniques will be needed as the hazardous waste industry evolves. Automation (read IoT) can provide tools that help shorten the time it takes to obtain specific test results, extract the most significant findings, produce reports and display information graphically.”
We also claimed that “expert systems” (software built with artificial intelligence (AI) techniques that draws on databases of expert knowledge to offer advice or make decisions) and AI could be possible solutions—technologies that have been a long time coming but still have a promising future in the context of big data.
“Currently used in other technical fields, expert systems employ methods of artificial intelligence for interpreting and processing large bodies of information.”
Although “expert systems” as a backbone for AI did not materialize as originally envisioned by researchers, they were a necessary step toward using big data to fulfill the purpose of an “expert.”
AI can be harnessed in a wide range of EHS compliance activities and situations to contribute to managing environmental impacts and climate change. Some examples of applications include AI-infused permit management, AI-based permit interpretation and response to regulatory agencies, precision sampling, predicting natural attenuation of chemicals in water, managing sustainable supply chains, automating environmental monitoring and enforcement, and enhanced sampling and analysis based on real-time weather forecasts.
Parts one, three, and four of this blog series complete the overview of Big Data, IoT, AI and multi-tenancy. We look forward to feedback on our ideas and are interested in hearing where others see the future of AI in EHS software – contact us for more discussion or ideas!
On 12 April 2019, Locus’ Founder and CEO, Neno Duplan, received the prestigious Carnegie Mellon 2019 CEE (Civil and Environmental Engineering) Distinguished Alumni Award for outstanding accomplishments at Locus Technologies. In light of this recognition, Locus decided to dig into our blog vault and share a series of visionary blogs crafted by our Founder in 2016. These ideas are as timely and relevant today as they were three years ago, and they hearken back to his formative years at Carnegie Mellon, which laid the foundation for the current success of Locus Technologies as a top innovator in the water and EHS compliance space.
Artificial Intelligence (AI) for Better EHS Compliance (original blog from 2016)
It is funny how a single acronym can take you back in time. A few weeks ago, when I watched 60 Minutes’ segment on AI (Artificial Intelligence) research conducted at Carnegie Mellon University, I was taken back to the time when I was a graduate student at CMU and a member of the AI research team for geotechnical engineering. Readers who missed this program on October 9, 2016, can access it online.
Fast forward thirty-plus years, and AI is finally ready for prime-time television and a prominent place among the disruptive technologies that have so shaken our businesses and society. This 60 Minutes story prompted me to review the progress that has occurred in the field of AI technology, why it took so long to come to fruition, and the likely impact it will have on my field of environmental and sustainability management. I discuss these topics below. I also describe the steps that we at Locus have taken to put our customers in the position to capitalize on this exciting (but not that new) technology.
What I could not have predicted when I was at Carnegie Mellon is that AI was going to take a long time to mature–almost the full span of one’s professional career. The reasons for this are multiple, the main one being that several other technologies were absent or needed to mature before the promises of AI could be realized. These are now in place. Before I dive into AI and its potential impact on the EHS space, let me touch on these “other” major (disruptive) technologies without which AI would not be possible today: SaaS, Big Data, and IoT (Internet of Things).
As standalone technologies, each of these has brought about profound changes in both the corporate and consumer worlds. However, these impacts are small when compared to the impact all three of these will have when combined and interwoven with AI in the years to come. We are only in the earliest stages of the AI computing revolution that has been so long in the coming.
I have written extensively about SaaS, Big Data, and IoT over the last several decades. All these technologies have been an integral part of Locus’ SaaS offering for many years now, and they have proven their usefulness by rewarding Locus with contracts from major Fortune 500 companies and the US government. Let me quickly review these before I dive into AI (as AI without them is not a commercially viable technology).
Big Data
Massive quantities of new information from monitoring devices, sensors, treatment-system controls, and customer legacy databases are now pouring into companies’ EHS departments, with few tools to analyze them on arrival. Some of the data is old information that is newly digitized, such as analytical chemistry records, but other information, like streaming data from wireless and wired monitoring sensors, is entirely new. At this point, most of these data streams are highly balkanized, as most companies lack a single system of record to accommodate them. However, that is all about to change.
As a graduate student at Carnegie Mellon in the early eighties, I was involved with the exciting R&D project of architecting and building the first AI-based expert system for subsurface site characterization, not an easy task even by today’s standards and technology. AI technology at the time was in its infancy, but we were able to build a prototype system for geotechnical site characterization to provide advice on data interpretation and on inferring depositional geometry and engineering properties of subsurface geology from a limited number of data points. The other components of the research included a relational database to store the site data, graphics to produce “alternative stratigraphic images,” and networked workstations to carry out the numerical and algorithmic processing. All of this transpired before the onset of the internet revolution and before acronyms like SaaS, AI, or IoT had entered our vocabulary. This early research led to the development of a set of commercial tools and technological improvements, and ultimately to the formation of Locus Technologies in 1997.
Part of this early research included management of big data, which is necessary for any AI undertaking. As a continuation of this work at Carnegie Mellon, Dr. Greg Buckle and I published an article in 1989 about the challenges of managing massive amounts of data generated from testing and long-term monitoring of environmental projects. This was at a time when spreadsheets and paper documents were king, and relational databases were little used for storing environmental data.
The article, “Hazardous Data Explosion,“ published in the December 1989 issue of the ASCE Civil Engineering Magazine, was among the first of its kind to discuss the upcoming Big Data boom within the environmental space and placed us securely at the forefront of the big data craze. This article was followed by a sequel article in the same magazine in 1992, titled “Taming Environmental Data,“ that described the first prototype solution to managing environmental data using relational database technology. In the intervening years, this prototype eventually became the basis of the industry’s first multi-tenant SaaS system for environmental information management.
Today, the term big data has become a staple across various industries to describe the enormity and complexity of datasets that need to be captured, stored, analyzed, visualized, and reported. Although the concept may have gained public popularity relatively recently, big data has been a formidable fixture in the EHS industry for decades. Initially, big data in EHS space was almost entirely associated with the results of analytical, geotechnical, and field testing of water, groundwater, soil, and air samples in the field and laboratory. Locus’ launch of its Internet-based Environmental Information Management (EIM) system in 1999 was intended to provide companies not only with a repository to store such data, but also with the means to upload such data into the cloud and the tools to analyze, organize, and report on these data.
In the future, companies that wish to remain competitive will have no choice but to bring together their streams of (seemingly) unrelated and often siloed big data into systems such as EIM that allow them to evaluate and assess their environmental data with advanced analytics capabilities. Big data coupled with intelligent databases can offer real-time feedback for EHS compliance managers, who can better track and offset company risks. Without the big data revolution, there would be no coming AI revolution.
AI and Water Management – Looking Ahead
There has been much talk about how artificial intelligence (AI) will affect various aspects of our lives, but little has been said to date about how the technology can help to make water quality management better. The recent growth in AI spells a big opportunity for water quality management. There is enormous potential for AI to be an essential tool for water management and decoupling water and climate change issues.
Two disruptive megatrends, digital transformation and decarbonization of the economy, could come together in the future. AI could make a significant dent in global greenhouse gas (GHG) emissions by merely providing better tools to manage water. The vast majority of energy consumption is wasted on water treatment and movement. AI can help optimize both.
AI is a collective term for technologies that can sense their environment, think, learn, and take action in response to what they’re detecting and their objectives. Applications range from automating routine tasks, like the sampling and analysis of water samples, to augmenting human decision-making, to automating water treatment systems, and on to discovery—processing and analyzing vast amounts of data to spot and act on patterns, a capability that is beyond what humans can match.
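The treatment-automation case can be illustrated with the simplest possible control loop: a proportional controller nudging a coagulant dose toward a turbidity setpoint. This is a minimal sketch, and the setpoint, gain, and dose bounds are hypothetical values chosen for illustration:

```python
# Illustrative sketch: automating a routine water-treatment adjustment with a
# proportional feedback loop. Setpoint, gain, and bounds are hypothetical.

SETPOINT_NTU = 1.0   # target turbidity (NTU)
GAIN = 0.5           # mg/L of dose change per NTU of error
MAX_DOSE = 30.0      # safe upper bound on coagulant dose (mg/L)

def next_dose(current_dose: float, measured_ntu: float) -> float:
    """Proportional correction toward the setpoint, clamped to a safe range."""
    error = measured_ntu - SETPOINT_NTU
    return min(max(current_dose + GAIN * error, 0.0), MAX_DOSE)

dose = 10.0
for ntu in [4.2, 2.5, 1.4, 1.0]:   # sensor readings trending toward setpoint
    dose = next_dose(dose, ntu)
    print(f"turbidity={ntu:.1f} NTU -> dose={dose:.2f} mg/L")
```

An AI-driven system would replace the fixed gain with a model learned from plant history, but the sense-decide-act cycle it automates looks exactly like this loop.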
Applying AI in water resource prediction, management, and monitoring can help to ameliorate the global water crisis by reducing or eliminating waste, as well as lowering costs and lessening environmental impacts.
Parts two, three, and four of this blog series complete the overview of Big Data, IoT, AI, and multi-tenancy. We look forward to feedback on our ideas and are interested in hearing where others see the future of AI in EHS software – contact us for more discussion or ideas!
Neno Duplan is founder and CEO of Locus Technologies, a Silicon Valley-based environmental software company founded in 1997. Locus evolved from his work as a research associate at Carnegie Mellon in the 1980s, where he developed the first prototype system for environmental information management. This early work led to the development of numerous databases at some of the nation’s largest environmental sites, and ultimately, to the formation of Locus in 1997.
Mr. Duplan recently sat down with Environmental Business Journal to discuss a myriad of topics relating to technology in the environmental industry such as Artificial Intelligence, Blockchain, Multi-tenancy, IoT, and much more.
Click here to learn more and purchase the full EBJ Vol XXXIII No 5&6: Environmental Industry Outlook 2020-2021
Brenda Mahedy | Published 2020-07-09, updated 2024-03-07 | Technology Outlook for the Environmental Industry