Create value from data: Using HMIs as intelligent edge devices

Human-machine interfaces (HMIs) are evolving beyond basic visualization devices to become edge-located data-handling and analytical powerhouses. See 5 modern HMI requirements.

By Marcia Gadbois and Chuck Kelley June 10, 2020

 

Learning Objectives

  • Human-machine interfaces (HMIs) can send and retrieve high volumes of information from edge devices and act as machine-learning tools.
  • Modern HMIs can identify items, events or observations that do not conform to an expected pattern.

Human-machine interface (HMI) software can be used to create value from the rapidly increasing amount of industrial data. According to the International Data Corp. (IDC), 463 exabytes (each exabyte is 1 quintillion bytes) of data will be created every day in 2025 (Reference 1). Most of this data will be created by:

  • Smart sensors and other industrial internet of things (IIoT) devices
  • Programmable logic controllers (PLCs)
  • Other specialized controllers
  • Industrial data collection systems.

Devices of all types continue to add sensors, and more sensors naturally lead to more data collection. For industrial systems alone, this exponential growth is outpacing available network bandwidth. Much of the data from machines and processes remains untapped, yet gaining access to it is crucial for obtaining valuable business insights.

An emerging possibility is to mine this data using human-machine interface (HMI) software deployed near the source. This calls for the HMI to evolve from a basic visualization tool into an intelligent edge-located data collector and machine-learning processor.

New roles for HMI software

HMI software hosted on edge devices needs to keep pace with the ever-growing requirements for how all types of data are acquired, parsed, mined and refined. The sheer volume of data being collected means advanced analytics and machine learning at the operational edge must play an important role in the overall digital transformation plan for smarter operations.

HMIs have often worked with data sources such as PLCs and sensors. Traditionally, HMIs were used as visualization tools and sometimes as data collectors, viewed on a dedicated control panel, mobile device or web browser. Modern HMIs still need to perform these roles, but they also must collect data in real time, store it locally for further analysis and use the data to find patterns and inferences to make predictions (Figure 1).
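As a minimal sketch of this collect-and-store role, the following Python loop polls a few tags and appends the samples to a local SQLite store. The tag names and the read_tag() function are hypothetical placeholders; a real HMI would read from its driver layer and write to its own historian.

    import random
    import sqlite3
    import time

    # Local time-series store; a real HMI would use its built-in historian
    conn = sqlite3.connect("hmi_history.db")
    conn.execute("CREATE TABLE IF NOT EXISTS samples (ts REAL, tag TEXT, value REAL)")

    TAGS = ["motor_temp", "vibration", "line_speed"]  # hypothetical tag names

    def read_tag(tag):
        # Stand-in for a real driver read (e.g., a PLC register or OPC UA node)
        return random.gauss(70.0, 5.0)

    for _ in range(10):  # a real HMI runs this loop continuously
        now = time.time()
        rows = [(now, tag, read_tag(tag)) for tag in TAGS]
        conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
        conn.commit()
        time.sleep(1.0)  # 1 Hz sampling; tune to the process dynamics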

HMIs are evolving as a source of data for real-time machine-learning execution. Time-series process data must be correlated with alarm and event record data to train machine-learning models. This allows these models to assess the quality of a product in manufacturing or predict the health of a critical piece of equipment. Machine-learning models perform best when supplied with large quantities of high-fidelity data. Predictive maintenance is possible when these models detect deviation from a common behavior and indicate a possible impending failure, which could result in downtime.
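One way to correlate the two data types is to label each process sample with the next alarm or event that follows it within some window. The sketch below uses the pandas merge_asof function for this; the column names, the one-hour window and the synthetic data are illustrative assumptions, not a prescribed method.

    import pandas as pd

    # Illustrative stand-ins: 1-minute process samples and a failure event log
    process = pd.DataFrame({
        "ts": pd.date_range("2020-01-01", periods=240, freq="1min"),
        "motor_temp": 70.0,
        "vibration": 2.0,
    })
    events = pd.DataFrame({
        "ts": pd.to_datetime(["2020-01-01 02:30"]),
        "event": ["failure"],
    })

    # For each process sample, find the next event within the following hour;
    # samples with no nearby event are treated as normal operation
    labeled = pd.merge_asof(process.sort_values("ts"), events.sort_values("ts"),
                            on="ts", direction="forward",
                            tolerance=pd.Timedelta("1h"))
    labeled["label"] = (labeled["event"] == "failure").astype(int)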

4 ways to verify industrial data quality

Getting from raw data to insights requires the HMI to parse incoming data from the physical assets and industrial controls and store it in an organized manner. Combining historical with real-time data plays a critical role in laying the groundwork needed for predictive modeling.

Machine-learning models are developed by examining a training set that is large and diverse enough to cover as many examples of success and failure as possible, perhaps in a 70/30 success-to-failure ratio.

It takes time to find the right data set with verified quality, meaning it has been cleansed to:

  • Remove NULL values
  • Ensure the data is correct for the desired signal and scaling
  • Ensure sufficient sampling rates
  • Capture conditions of interest.

A lot of data may be thrown out during exploratory analysis, and new data often must be obtained. Users also must ensure the training set does not have any built-in bias. Once good data is confirmed, machine-learning models can be created and applied. A minimal sketch of these verification steps follows.
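The sketch below walks through the four cleansing checks listed above plus a simple class-balance check for bias, using pandas. The signal names, units and thresholds are assumptions for illustration, and the synthetic data merely stands in for collected process history.

    import numpy as np
    import pandas as pd

    # Synthetic stand-in for collected process history (illustrative only)
    rng = np.random.default_rng(0)
    idx = pd.date_range("2020-01-01", periods=600, freq="500ms")
    df = pd.DataFrame({"motor_temp": rng.normal(70, 5, 600),
                       "machine_running": 1.0,
                       "label": rng.binomial(1, 0.3, 600)}, index=idx)
    df.iloc[::50, 0] = np.nan  # inject some NULLs for the demonstration

    df = df.dropna()                                  # 1. remove NULL values
    assert df["motor_temp"].between(-40, 200).all()   # 2. correct signal and scaling (assumed degC range)
    df = df.resample("1s").mean().dropna()            # 3. enforce a uniform, sufficient sampling rate
    df = df[df["machine_running"] == 1]               # 4. keep only conditions of interest
    print((df["label"] > 0.5).value_counts(normalize=True))  # check class balance for bias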

How to build a machine-learning model

Collected real-time data supports building and refining the machine-learning model. Once the model is created, there is debate over whether each edge device should be able to change its model autonomously or whether humans should supervise such updates.

The benefit of enabling each edge device to change its model is that the device may adapt in a beneficial way. However, a consequence of dynamic model changes is that the models on different edge devices may diverge. Models updated consistently across devices are more likely to minimize support issues.

Most users begin the model development process with supervised learning, using algorithms such as linear regression, logistic regression and neural networks. Most of today’s practical value comes from supervised learning.
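As a minimal supervised-learning sketch, the following trains a logistic regression (one of the algorithm families named above) with scikit-learn. The feature names and the toy labeling rule are assumptions standing in for a real cleansed, labeled training set.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # Toy stand-in for the cleansed, labeled training set built earlier
    rng = np.random.default_rng(1)
    data = pd.DataFrame({"motor_temp": rng.normal(70, 5, 500),
                         "vibration": rng.normal(2.0, 0.5, 500)})
    data["label"] = ((data["motor_temp"] > 75) & (data["vibration"] > 2.0)).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        data[["motor_temp", "vibration"]], data["label"],
        test_size=0.3, stratify=data["label"], random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))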

Next, one can start applying deep learning techniques to the data.

Once the machine-learning model is in a reasonable state, the model can be deployed on an edge device where it can operate on real-time data and look for anomalies. When it encounters an anomaly, the HMI can notify the user by sending an alarm.
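Deployment can be as simple as scoring each live sample and raising an alarm on a positive prediction. The sketch below reuses the model from the previous sketch; read_tag() and send_alarm() are hypothetical stand-ins for the HMI's driver and alarm mechanisms.

    import random
    import pandas as pd

    def read_tag(tag):  # hypothetical driver read, simulated here
        return {"motor_temp": random.gauss(70, 5),
                "vibration": random.gauss(2.0, 0.5)}[tag]

    def send_alarm(message):  # stand-in for the HMI's alarm mechanism
        print("ALARM:", message)

    # Score one live sample with the model trained in the sketch above
    sample = pd.DataFrame([{"motor_temp": read_tag("motor_temp"),
                            "vibration": read_tag("vibration")}])
    if model.predict(sample)[0] == 1:
        send_alarm("Model predicts impending failure; inspect the motor")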

Modern HMI architecture, edge analytics

The modern HMI’s role is not only to collect data, but also to identify items, events or observations that do not conform to an expected pattern. The HMI is well positioned to make inferences from the data, detect these anomalies and send alarms via the screen, texts or emails to alert the operator to actual or potential problems.

By performing detection closer to the data source, HMIs allow early alarming without the latency inherent in sending data to the cloud. HMIs can detect known patterns that precede the failure of a critical piece of equipment, with inference and corrective action handled locally (Figure 2).

HMIs also can perform data pre-processing. Located at the edge and creating real-time inferences, the HMI can classify, detect and segment the data before sending it to the cloud, providing efficiencies for upstream processing and networking.
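One common form of such pre-processing is aggregation: summarizing high-rate samples into coarser statistics before transmission. A minimal pandas sketch, with synthetic data standing in for locally stored HMI history:

    import numpy as np
    import pandas as pd

    # Synthetic 1 s history standing in for locally stored HMI data
    idx = pd.date_range("2020-01-01", periods=300, freq="1s")
    df = pd.DataFrame({"motor_temp": np.random.normal(70, 5, 300)}, index=idx)

    # Aggregate to 1 min summaries so far less data crosses the network
    summary = df.resample("1min").agg(["mean", "min", "max"])
    summary.columns = ["_".join(col) for col in summary.columns]  # flatten headers
    payload = summary.tail(1).to_json()  # latest aggregate, ready for the cloud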

However, the concept of whether the edge should transmit raw data, aggregated data, or predictive data to the cloud for further analytics is somewhat contentious. Machine-learning models in higher-level or cloud-based systems will need the highest fidelity raw data possible in order to be optimized. Unfortunately, this can create a technical and commercial burden for the user.

The need to filter industrial data

Based on the IDC’s estimate of 463 exabytes created every day by 2025, users must make careful data transmission decisions based on the amount of storage and network bandwidth required to keep the data refreshed at the location where it is needed for analysis (Figure 3).

The raw data approach supplies the best base data for machine-learning models. However, with the vast amount of data being produced, it may be difficult to transmit the data in real time. When aggregated or predictive real-time data is preprocessed at the edge and transmitted, the volume will be lower. Users must be aware this approach can filter or obscure information in a way that allows bias to creep into a machine-learning model.

Another option is sending aggregated or predictive data in real time while setting up a separate channel to send raw data at a slower speed. A potential drawback is that the communication queue could fill up.
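A sketch of this two-channel idea: aggregates are published immediately, while raw samples drain through a bounded queue at a slower cadence. The publish() stub, topic names and queue size are illustrative assumptions; a bounded queue makes the drawback concrete, since a full queue drops the oldest raw data.

    from collections import deque

    raw_queue = deque(maxlen=10000)  # bounded: when full, oldest raw data is lost

    def publish(topic, data):  # stand-in for a real transport such as MQTT
        print(topic, data)

    def on_new_sample(sample, aggregate):
        publish("plant/line1/agg", aggregate)  # fast channel: aggregates in real time
        raw_queue.append(sample)               # slow channel: raw data is queued

    def drain_raw(batch_size=100):
        # Called at a slower cadence; ships queued raw data for model training
        batch = [raw_queue.popleft() for _ in range(min(batch_size, len(raw_queue)))]
        if batch:
            publish("plant/line1/raw", batch)

    on_new_sample({"motor_temp": 70.4}, {"motor_temp_mean": 70.1})
    drain_raw()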

The economic costs of procuring network bandwidth and storage devices must be balanced with the available technical practicalities, such as determining if the network exhibits sufficient stability. These overriding factors will impact where machine learning can be deployed: at the edge, in the cloud, or a combination of both.

Advanced HMI analytics and anomaly detection

Advanced analytics are possible with modern HMIs. Much of the world’s data is streaming and time-series data where anomalies provide significant information indicating critical situations. There are numerous use cases for anomaly detection, including preventative and predictive maintenance, fault detection and monitoring.

An anomaly is defined as a point in time where the behavior of the system is unusual and quite different from past behavior. Anomalies can be spatial, meaning the value is outside the typical range, or temporal, where the value is within the typical range but the sequence in which it occurs is unusual. State labels can be associated with anomalies to classify them as temporal or spatial. The alarm system also can assign weighted values to predict failure based on priority, importance and frequency.
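The distinction is easy to see in code. In the sketch below, the assumed typical range and step threshold are illustrative; the first array contains a value outside the range (spatial), while the second stays in range but makes an unusually large jump between samples (temporal).

    import numpy as np

    # Spatial anomaly: a value outside the typical range (assumed 60-90 degC)
    temp = np.array([70.1, 70.3, 69.8, 95.2, 70.0])
    spatial = (temp < 60.0) | (temp > 90.0)   # flags the 95.2 reading

    # Temporal anomaly: every value is in range, but the jump between
    # consecutive samples is unusually large for this signal
    seq = np.array([70.0, 71.0, 70.5, 78.0, 70.8])
    step = np.abs(np.diff(seq))
    temporal = step > 5.0                     # flags the jumps to and from 78.0
    print("spatial:", np.where(spatial)[0], "temporal at step:", np.where(temporal)[0])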

Any modern HMI also must natively support mechanisms for sending and receiving messages in a stateful way and ensure remote device data is current and valid. Stateful communications can be ensured by using protocols such as MQTT and Kafka, while the Sparkplug B specification handles state management.
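A sketch of this pattern over MQTT, using the Sparkplug B topic namespace (spBv1.0/group/message_type/edge_node): a last-will "death certificate" registered before connecting lets subscribers know when the edge node's data has gone stale. The broker address is a placeholder, the sketch uses the paho-mqtt 1.x client API, and JSON payloads are used only for readability; the actual Sparkplug B specification mandates protobuf-encoded metric payloads.

    import json
    import paho.mqtt.client as mqtt  # paho-mqtt 1.x client API

    client = mqtt.Client(client_id="hmi-edge-01")

    # Death certificate: the broker publishes this if the HMI drops offline,
    # so subscribers know the edge node's last-reported data is stale
    client.will_set("spBv1.0/plant1/NDEATH/hmi-edge-01", json.dumps({"online": False}))

    client.connect("broker.example.com", 1883)  # placeholder broker address
    client.publish("spBv1.0/plant1/NBIRTH/hmi-edge-01", json.dumps({"online": True}))
    client.publish("spBv1.0/plant1/NDATA/hmi-edge-01",
                   json.dumps({"motor_temp": 70.4, "vibration": 2.1}))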

Once in the cloud, the data can be aggregated and joined with data from multiple data sources. The value here is users can consider multiple operations or an entire fleet of equipment together, regardless of their physical location. Cloud-based filtering and analytic models can be used to refine the data for deep analysis to predict behavior and trends, such as the mean time between failures (MTBF) or end of life for machines. This information can then be deployed back to edge-located machine-learning models running in HMIs to improve their operation.
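MTBF itself is a simple calculation once failure events are aggregated: the average elapsed time between consecutive failures. A minimal sketch with illustrative timestamps:

    import pandas as pd

    # Failure timestamps aggregated in the cloud (illustrative values)
    failures = pd.Series(pd.to_datetime([
        "2020-01-03 08:00", "2020-01-17 14:30", "2020-02-02 09:15"]))

    uptimes = failures.diff().dropna()  # running time between consecutive failures
    print("MTBF:", uptimes.mean())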

5 modern HMI requirements

If the IDC prediction is anywhere close to correct, the HMI’s role will need to evolve to accommodate massive quantities of data. Modern HMIs can connect to a wealth of machine data to:

  • Monitor and analyze this data in real time
  • Visualize it in a coherent and user-friendly manner
  • Help users make intelligent decisions
  • Store data in a useful way so it can be mined at will
  • Overcome compromises and constraints.

This is the HMI’s new role as more sensors are being deployed and reliance on machines continues to increase. The critical nature and functions of these machines will expand, and the HMI will be the brain of the intelligent edge.

Marcia Gadbois, president and general manager, ADISRA; Chuck Kelley, chief data officer, ADISRA. Edited by Chris Vavra, associate editor, Control Engineering, CFE Media and Technology, cvavra@cfemedia.com.

MORE ANSWERS

Keywords: human-machine interface, HMI, edge analytics

CONSIDER THIS

What benefits could your plant derive from the modern HMI?

ONLINE extra

Reference 1: World Economic Forum, 4/17/2019, https://www.weforum.org/agenda/2019/04/how-much-data-is-generated-each-day-cf4bddf29f/


Author Bio: Marcia Gadbois is the president and general manager of ADISRA. Marcia is an entrepreneur who has grown a start-up from inception to a successful liquidity event. Prior to joining ADISRA, Marcia was the president of InduSoft, which was acquired by Invensys. She holds a BS in Management Information Systems and Computer Science from Bowling Green State University and an Executive MBA from the University of New Hampshire. Chuck Kelley is chief data officer for ADISRA. Chuck is an internationally known expert in data technology. He has over 40 years of experience in the design and implementation of operational/production systems, operational data stores, and data warehouses/data lakes. Mr. Kelley has co-authored or contributed to four books on data warehouses.