Use process analytics effectively
How can process data be turned quickly and affordably into useful information that improves performance? Predicting process events requires process historian or time-series search tools and the ability to apply meaning to the patterns identified within the process data. Historians serve as a repository for process data from many systems and are a data source for advanced analytics. However, process historian tools are not ideal for automating data analysis or search queries: they are "write" optimized, not "read/analytics" optimized. Finding the relevant historical event and building the process context is usually a laborious task.
While many process analytics solutions are available, these historian-based software tools often require a great deal of interpretation and manipulation. Far from automated, they unearth past trends or export raw data to Microsoft Excel. The tools used to visualize and interpret process data are typically trending applications, reports, and dashboards: helpful, perhaps, but not particularly good at predicting outcomes.
Predictive analytics, a relatively new dimension of analytics tooling, delivers insights about future performance based on historical data, both structured and unstructured. Many predictive analytics tools begin with an enterprise approach and require sophisticated distributed computing platforms such as Hadoop or SAP HANA.
While powerful, this is a more complex approach to managing both plant and enterprise data. Companies using an enterprise data management approach must often employ specialized data scientists to help organize and clean the data. Moreover, data scientists likely won’t be familiar with the process of generating the data, which limits their ability to achieve the best results.
Furthermore, as can be seen from Chart 1, data modeling software is often an engineering-intensive "black box" in which the user knows only the inputs and the expected outcome, with no insight into how the result was determined. Understandably, for many operational and asset-related issues, this approach is too expensive and time-consuming and, as mentioned, may require a highly skilled data scientist. This explains why many suppliers target only 1% of critical assets, ignoring other opportunities. However, there is a way to manage these issues without a data scientist's help.
No data scientist required
An understanding of operational intelligence and process data is required to improve plant performance and overall efficiency. Process engineers and other personnel must be able to search time-series data over a specific timeframe and visualize related plant events. Time-series data is generated by process control and automation systems, testing laboratories, and other plant systems, and includes the annotations and observations made by operators and engineers.
Therefore, today's industrial process data analytics solutions take a different approach, leveraging multidimensional search capabilities that combine visualizing process historian time-series data, overlaying similar, matched historical patterns, and contextualizing data captured by engineers and operators. The ideal pattern recognition solution integrates quickly and easily with the plant historian's database archives and offers a scalable architecture that communicates with available enterprise distributed computing platforms.
This solution uses "pattern-based discovery and predictive-process analytics" appropriate for the average user. It is easily deployed and delivers immediate value, with no data-modeling solution or data scientist required. Capabilities are targeted, perhaps without all the bells and whistles of larger systems that require in-depth education and training. By combining search of structured time-series process data with the context captured by operators and other subject matter experts, users can predict more precisely what is happening, or what is likely to occur, within their continuous and batch industrial processes.
This "self-service analytics" software puts power into the hands of the process experts, engineers, and operators who can best identify and annotate areas for improvement.
In the words of Peter Reynolds, a senior consultant at ARC Advisory Group, "The new platform is built to make operator shift logs searchable in the context of historian data and process information. In a time when the process industries may face as much as a 30% decline in the skilled workforce through retiring workers, knowledge capture is a key imperative for many industrial organizations."
A range of capabilities
Lab analysis quantifies quality before product is released to an internal customer, but it represents an additional cost and delay. Using predictive analytics software, on the other hand, a process expert can create and deploy a predictor for an impurity concentration that gives an early indication of quality. As a result, truck loading can start earlier, lead time is shorter, and the probability of rejection due to off-spec conditions decreases significantly. On-the-fly detection of early indicators spots process variations early.
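As a minimal sketch of the idea (an illustration, not the vendor's actual implementation), such a predictor can be a simple least-squares model that estimates the lab-measured impurity from online process tags, so an estimate is available long before the lab result:

```python
import numpy as np

def fit_soft_sensor(X, y):
    """Fit a least-squares predictor mapping online process tags
    (rows of X) to lab-measured impurity concentrations y."""
    A = np.column_stack([X, np.ones(len(X))])  # append intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_impurity(coef, X):
    """Estimate impurity from current tag values, ahead of the lab."""
    A = np.column_stack([X, np.ones(len(X))])
    return A @ coef
```

In practice the model would be validated against held-out lab samples before it is trusted to trigger early truck loading.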
Unlike traditional historian desktop tools, pattern recognition and machine learning allow users to search process trends for events or to detect process anomalies. Much like the music app Shazam, self-service analytics identifies significant patterns, or "high energy content," in the data and matches them to similar patterns in its database, rather than seeking to match each note of a song.
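The shape-matching idea can be sketched in a few lines: z-normalize a query pattern and every candidate window so the comparison is based on shape rather than absolute amplitude, then rank windows by distance. This is a generic illustration of pattern search, not any particular product's algorithm:

```python
import numpy as np

def znorm(x):
    """Z-normalize so the match is shape-based, not amplitude-based."""
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def similarity_search(series, query, top_k=3):
    """Slide the query over the series and rank window start indices
    by z-normalized Euclidean distance (smaller = more similar)."""
    series = np.asarray(series, dtype=float)
    q = znorm(np.asarray(query, dtype=float))
    m = len(q)
    dists = np.array([
        np.linalg.norm(znorm(series[i:i + m]) - q)
        for i in range(len(series) - m + 1)
    ])
    return np.argsort(dists)[:top_k]
```

Production tools index the historian data so such searches run over years of tag history instead of scanning every window.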
These technologies are the basis for a new systems technology stack that makes use of existing historian databases and creates a data layer with a column store to index the time-series data. These systems work well with leading process historians such as those from OSIsoft, AspenTech, and Yokogawa. They are simple to install and deploy via a virtual machine, without impacting the existing historian infrastructure. However, the tool is only as good as the data it receives from the historian.
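To see why a column store is read-optimized, consider a toy version (purely hypothetical, not any vendor's storage format): each tag is held as one contiguous array sharing a sorted timestamp index, so a time-range scan on a single tag touches only that column and finds its bounds by binary search:

```python
from bisect import bisect_left, bisect_right
import numpy as np

class ColumnStore:
    """Toy column store: one contiguous array per tag plus a shared,
    sorted timestamp index for fast time-range scans."""
    def __init__(self, timestamps, tags):
        self.ts = list(timestamps)  # assumed sorted ascending
        self.cols = {name: np.asarray(vals, dtype=float)
                     for name, vals in tags.items()}

    def range_query(self, tag, t0, t1):
        """Values of `tag` with t0 <= timestamp <= t1, via binary search."""
        lo = bisect_left(self.ts, t0)
        hi = bisect_right(self.ts, t1)
        return self.cols[tag][lo:hi]
```

A write-optimized historian, by contrast, appends interleaved records from many tags, which is why analytics queries over it are slow without such a layer.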
Case in point
A polymer plant introduced a new product grade four years ago. From the start, the company experienced problematic behavior in 90% of the runs. During the dilution phase, an unknown factor caused a bottleneck in the conveyor system, ultimately resulting in an unacceptable pressure rise.
Product quality tests indicated no significant differences between runs. All possible physical factors were checked, up to dismantling the conveyor. Failing to find a cause, the engineers "solved" the problem by alternating every batch of the problem grade with a run of a different grade.
An analysis of the data revealed a repeated pattern of behavior, leading to a comparison of tag data during the time periods identified. The engineers found that one reactor upstream invariably showed a short pressure drop prior to a conveyor system pressure rise.
By better monitoring and controlling pressure in the reactor, the plant now runs several batches of the same grade without the workaround. Even though the grades were similar, alternating them imposed scheduling overhead plus changeover time and labor, including changing the additives; eliminating that overhead increased productivity.
By removing the conveyor system downstream bottleneck, a 16% efficiency increase in the production of this grade was achieved. The estimated total gain is $100,000 annually.
Why are tools needed?
Companies need analytics tools to uncover areas for efficiency improvements.
"There is an immediate need to search time-series data and analyze that data in context with the annotations made by both engineers and operators to be able to make faster, higher quality process decisions," said Reynolds. "If users want to predict process degradation or an asset or equipment failure, they need to look beyond time-series and historian data tools and be able to search, learn by experimentation, and detect patterns in the vast pool of data that already exists in their plant."
Fortunately, with a low-cost investment this new process analytics model can support the necessary re-tooling of traditional process historian visualization tools.
Early detection leaves time for action
Due to heat exchanger fouling, a manufacturer was experiencing batch times longer than the normal 12 to 13 hours. In the past, after several days a switch was made to a spare heat exchanger, which could foul as well, so solving a fouling problem took about two weeks, and sometimes longer. Because fouling leads to longer cooling times, a contextualized cooling-time pattern was monitored to generate alerts when maintenance needed to be planned. With predictive analytics, the alerts now come two to three weeks in advance and are confirmed by the operator. Early detection leaves time for preemptive action, avoiding 10 hours of production loss every two to three months, as well as the associated extensive turnaround times.
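The monitoring logic can be sketched as a rolling trend check on batch cooling times: fit a line to the most recent batches and alert when the extrapolation crosses an alarm limit within the planning horizon. The function name, window, and thresholds here are illustrative assumptions, not the plant's actual monitoring rule:

```python
import numpy as np

def fouling_alert(cooling_times, limit, horizon, window=10):
    """Fit a linear trend to the last `window` batch cooling times and
    alert if the extrapolated value reaches `limit` within `horizon`
    future batches, giving warning well before the limit is hit."""
    y = np.asarray(cooling_times[-window:], dtype=float)
    slope, intercept = np.polyfit(np.arange(len(y)), y, 1)
    projected = intercept + slope * (len(y) - 1 + horizon)
    return bool(slope > 0 and projected >= limit)
```

Tuning `horizon` to the maintenance lead time is what converts the alert from a diagnosis into usable planning time.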
Search reveals flow-rate peak precedes temperature peak
A chemical plant occasionally suffered from quality problems due to unclean product. A predictive analytics search revealed a flow-rate peak preceding a temperature peak. These temperature peaks stress the heat exchanger seals; if a seal breaks, there is leakage between sterilized and unsterilized product, causing the quality problems. The analysis allowed process experts to hypothesize a causal relation explaining how leaking heat exchanger seals led to the quality issues, and the relationship between flow rate and temperature was confirmed with a similarity search. A monitoring pattern was then configured to alert the team whenever a similar pattern occurred, and staff was directed to retune the process to diminish seal degradation.
Bert Baeck is CEO of TrendMiner.
This article appears in the IIoT for Engineers supplement for Control Engineering and Plant Engineering.