Process sensors and advanced analytics efficacy

Sophisticated sensors rely on predictive analytics for process and equipment monitoring. See examples of present and emerging process sensor applications.

By Mike Brooks November 29, 2022


Learning Objectives

  • Process sensors are evolving to provide predictive analytics for process and equipment monitoring.
  • Two examples show how process sensors helped users predict failures before they occurred.
  • Artificial intelligence (AI) and machine learning (ML) help users gather more sophisticated data from process sensors for better insights.

Process sensors insights

  • Process sensors were first developed in 1883 to measure temperature and have evolved to measure many other variables, including pressure, level, flow, viscosity and more.
  • Modern process sensors can be enhanced with artificial intelligence (AI) and machine learning (ML) to glean more sophisticated insights.

Data is everywhere, but where does it come from? The first modern sensors appeared in 1860, when Wilhelm von Siemens used a copper resistor [1] to measure temperature. That idea gave rise to the first thermostat in 1883, generally recognized as the first human-made sensor.

Since then, we have seen a proliferation of such devices especially for the measurement and control of manufacturing processes. The sophistication of sensors and their pricing has changed dramatically as their deployment entered transportation and consumer markets. So, too, has the nature of their deployment to measure and control processes, which used to be quite simple combinations of sensor signals in human-made calculations or logic such as in proportional, integral and derivative (PID) control.

These simple control schemes were followed by computer applications supporting simulation and process optimization. In the last few years, new techniques driven by artificial intelligence (AI) and machine learning (ML) have emerged. The result is a far more complex amalgamation of data streams, combined across multiple dimensions and over time, that learns patterns of behavior instead of estimating it with digital models built on engineering first principles and statistics. In effect, the machines are writing the programs instead of humans.

Process manufacturing, sensors are intersecting

The advent of the Internet of Things (IoT) has spawned huge growth in connected things. According to the National Science Foundation (NSF), the IoT is on track to connect 50 billion smart things and 1 trillion sensors. Perhaps the most well-known deployment of such sensors and AI/ML technologies is in motor vehicles, especially electric vehicles. For example, the autopilot system in a Tesla Model 3 uses eight cameras, 12 ultrasonic sensors and forward radar to read lane lines and detect nearby cars.

Process manufacturing industries have been developing new uses of such sensors, steeped in AI and ML, to measure the condition of assets and processes, including learning their explicit behavior from data streams gathered by sensors on and around machines. Such implementations can supersede older techniques, bringing easier and faster deployment along with greater accuracy and better outcomes, without requiring deep engineering expertise. The best applications abstract away the data science, allowing regular plant personnel to implement complex strategies without specialized training.

What’s changed is a shift from single-point measurements and simple logical expressions to automatically combined streams of data gathered every few minutes. The data are processed using AI/ML across multiple dimensions and over time to develop patterns of behavior in many more dimensions than humans can perceive. The technology sees those patterns with great clarity, recognizing what is normal, what is abnormal, and which advancing patterns of degradation in equipment and processes will, if left unattended, result in poor outcomes and possibly asset failures.
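The multivariate pattern recognition described above can be illustrated with a minimal sketch. This is not any vendor's actual method; it simply fits a mean and covariance to historical "normal" multi-sensor readings and scores new readings by Mahalanobis distance, so a reading that breaks the learned correlation between channels stands out as anomalous even when no single channel is out of range. The four channel names are illustrative.

```python
import numpy as np

def fit_normal_model(X):
    """Learn the mean and inverse covariance of normal multi-sensor behavior."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    return mu, cov_inv

def anomaly_score(x, mu, cov_inv):
    """Mahalanobis distance: how far a reading sits from learned normal behavior."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Simulated "normal" history: four strongly correlated sensor channels
# (think pressure in, pressure out, fluid temperature, motor amps).
rng = np.random.default_rng(0)
base = rng.normal(0.0, 1.0, size=(500, 1))
X = np.hstack([base + rng.normal(0.0, 0.1, size=(500, 1)) for _ in range(4)])

mu, cov_inv = fit_normal_model(X)

normal_reading = X[0]
# One channel drifts away from the others, breaking the learned correlation.
degraded_reading = normal_reading + np.array([0.0, 0.0, 0.0, 3.0])

print("normal:  ", anomaly_score(normal_reading, mu, cov_inv))
print("degraded:", anomaly_score(degraded_reading, mu, cov_inv))
```

The degraded reading scores far higher than the normal one even though a 3-unit shift is well within the overall range of each channel; it is the broken relationship between channels, not any single value, that the model flags.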

Application examples: criticality of data for manufacturing

The reality is these deployments are data-driven solutions. No data means no solution. This leaves users wondering how much data, and what data, is required.

Consider the example of a large slurry pump at a mine, which was instrumented with only four sensors: pressure in, pressure out, fluid temperature and the amps drawn by the motor. In this case, the predictive maintenance (PdM) solution provided two weeks’ notice that the motor would fail, which it did.

However, a large charge pump at a refinery had 50 sensors, including process measurements upstream and downstream and mechanical sensors on the machine itself, among them many vibration sensors. In this case, the PdM solution provided instant notice of degradation and 16 weeks’ notice of an impending failure.

As a result, ML-based sensor analysis can work effectively with only a few sensors, but it delivers greater accuracy and much earlier warning with more, higher-quality sensors. Adding new sensors is rarely necessary, since the most critical assets are usually already well-instrumented.

The streaming sensor data also reveals patterns of imminent failure. Each pattern has a one-to-one relationship with a root cause and a precise failure mode, but those patterns can only be developed by selecting the group of sensors capable of expressing them.

In such cases, organized sensor selection kits indicate the precise collections of sensors needed to detect a specific failure, such as a bearing failure on a pump or compressor. Continuously monitoring the selected sensors provides notification, with substantial advance warning, if the explicit degradation pattern occurs.
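As a rough illustration of such a selection kit (the failure modes, tag names and groupings below are invented for the example, not taken from any real product), a simple lookup can map each failure mode to the sensor group needed to detect it and flag any instrumentation gaps on a given asset:

```python
# Hypothetical "sensor selection kit": which sensor tags are needed to
# detect a specific failure mode. Names and groupings are illustrative.
SENSOR_KITS = {
    "pump_bearing_failure": ["vibration_de", "vibration_nde", "bearing_temp"],
    "pump_cavitation": ["suction_pressure", "discharge_pressure", "flow"],
}

def select_sensors(available, failure_mode):
    """Return the kit sensors present on this asset, and any that are missing."""
    kit = SENSOR_KITS[failure_mode]
    present = [tag for tag in kit if tag in available]
    missing = [tag for tag in kit if tag not in available]
    return present, missing

# Tags actually installed on one asset.
asset_tags = {"vibration_de", "bearing_temp", "suction_pressure",
              "discharge_pressure", "flow", "motor_amps"}

present, missing = select_sensors(asset_tags, "pump_bearing_failure")
print("monitor with:", present)
print("instrumentation gap:", missing)
```

A check like this, run before deployment, tells the user whether an asset can support detection of a given failure mode with its existing instrumentation or whether a sensor must be added first.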

The future of process sensors, new applications

Sensor types and availability continue to change over the years. It started with simple temperature, flow, pressure and level sensors. Now, there are sophisticated analyzer sensors for product quality such as viscosity, water content, solids, color, weight and even inline mass spectrometers to give a full product breakdown. All are useful in predictive analytics for process and equipment monitoring.

However, emerging fields of sensor measurement may provide additional assistance. Think of the experienced operator who developed a deep understanding of equipment behavior by listening to the sounds it makes. New acoustic sensors, operating inside and outside the range of human hearing, may capture the same cues. Microphones are quite inexpensive to install and do not need to sit on the machines themselves. There also have been developments in hyperspectral imaging, using video and high-powered graphics computing to interpret gas content and even smells.

Such devices can replace older analyzers, for example in furnaces and stacks, to see and measure temperature and exhaust gas composition. Another emerging technique interprets vibrations on fiber cables, like the ones carrying phone calls, to detect temperature, proximity and intruders. The installation is non-intrusive (the cable needs only to be close to the asset) and is valuable for detecting leaks and theft; run along a perimeter fence, it can detect anything coming close.

These are only a few examples of the ways in which sophisticated sensors are relying on predictive analytics for process and equipment monitoring. The future uses of such sensors are exciting and endless.

Mike Brooks is global director of APM solutions for AspenTech. Edited by Chris Vavra, web content manager, Control Engineering, CFE Media and Technology.


Keywords: process sensors, process instrumentation




How have process sensors evolved at your facility and what have the results been like?


[1] Sensor Technologies: What are they and how do they work?; Stevens Institute of Technology, Fundamentals of Sensing Technologies STEM Educator Workshop, Maritime Security Center

Author Bio: Mike Brooks is senior director of asset performance at AspenTech.