Five advanced process control, data analytics connections
Industrial Internet of Things (IIoT), artificial intelligence (AI), augmented reality/virtual reality (AR/VR), Industry 4.0 and the digitalization of manufacturing are among recent developments in the information technology (IT) world. Veteran advanced process control (APC) engineers may wonder if there is any connection between their world and this new world of digitalization and, especially, “data analytics.” A great deal of real-time and historical data is available to the manufacturing world. Does having more data improve operating efficiency?
The purpose of APC (multivariable control) is to adjust the setpoints of single-loop controllers to maintain key operating variables close to targets and to push operation closer to constraints, a form of "optimization" at the local level. Experienced board operators could do what APC does based on their knowledge of the process, current operating conditions and inputs from production planning. However, human beings cannot match the speed or tirelessness of computers.
APC emerged and evolved over the last 40-plus years, relying on step-response models (instead of process knowledge and experience) to decide which loops to move, and by how much, to keep the process at optimum conditions.
APC has been a costly and only marginally successful adventure. Controllers are tough to maintain, in need of constant remodeling, often too big, and driven by static economics. (How much is a pound more of product worth today, not last week?) A few manufacturers do it well, while many others don't or can't. Can data analytics help?
Connecting data analytics
Data analytics refers to the processing of raw data with advanced tools (analytics and algorithms) to generate useful information. Specifically, for APC, what would be "useful information"? Here are five ideas about APC and data analytics.
1. Real-time updates of model sensitivities
Raw instrument and distributed control system (DCS) historical and real-time data (for example, process variables [PVs], setpoints, valve positions), along with associated quality (lab) data, are now "connected" via data historians to the organization's site and enterprise levels. Why not use this data to continually update, in real time, the key sensitivities of the models that drive the APCs? This doesn't have to be as complex as it sounds. Algorithmic techniques exist to accomplish this, as does the technology to transfer those sensitivities back down to the models, with appropriate security built in.
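The idea of continually re-estimating a model sensitivity from streaming data can be sketched with a recursive least squares (RLS) update with a forgetting factor. This is an illustrative, single-gain sketch, not any vendor's implementation; the class name, forgetting factor and sample data are all assumptions.

```python
# Minimal sketch: recursive least squares (RLS) with a forgetting factor,
# tracking one APC model gain (dCV/dMV) from streaming historian data.
# All names and numbers here are illustrative, not from any APC package.

class GainTracker:
    def __init__(self, initial_gain=1.0, forgetting=0.99):
        self.gain = initial_gain      # current estimate of dCV/dMV
        self.p = 1000.0               # estimate covariance (large = uncertain)
        self.forgetting = forgetting  # <1.0 discounts older data

    def update(self, delta_mv, delta_cv):
        """Update the gain estimate from one MV move and the CV response."""
        if abs(delta_mv) < 1e-9:      # no excitation, nothing to learn
            return self.gain
        # Standard RLS equations for a one-parameter model: dCV = gain * dMV
        k = self.p * delta_mv / (self.forgetting + self.p * delta_mv ** 2)
        self.gain += k * (delta_cv - self.gain * delta_mv)
        self.p = (self.p - k * delta_mv * self.p) / self.forgetting
        return self.gain

# Feed in observed (MV move, CV response) pairs; the true gain here is about 2
tracker = GainTracker(initial_gain=0.5)
for dmv, dcv in [(1.0, 2.1), (0.5, 0.9), (-1.0, -2.0), (0.8, 1.7)]:
    est = tracker.update(dmv, dcv)
```

The forgetting factor is what keeps the sensitivity current: data from last month is weighted less than data from this shift, so the estimate drifts with the process instead of averaging over its whole history.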
2. Model accuracy
How about the models that drive the site and enterprise production planning tools (plant and enterprise-level optimizers) that download the key production targets to the APCs? The main issue with these tools today is that they don't execute in real time. What value comes from running the plant optimizer to set targets today if it relies on yesterday's data? That's what one of the original pioneers of APC was referring to with the comment: "Data only has value when it can lead to better decision-making." What is the value of decisions made today based on yesterday's data? It's zero, or even negative, because old data can lead to the wrong decisions.
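One simple defense against decisions built on stale data is a freshness check before the optimizer runs. A minimal sketch, with hypothetical tag names and an assumed one-hour tolerance:

```python
# Minimal sketch: refuse to run the plant optimizer on stale inputs.
# Tag names, timestamps and the one-hour tolerance are illustrative.
from datetime import datetime, timedelta

def inputs_fresh(timestamps, now, max_age=timedelta(hours=1)):
    """Return True only if every input tag was updated within max_age."""
    return all(now - t <= max_age for t in timestamps.values())

now = datetime(2024, 1, 1, 12, 0)
tags = {
    "feed_rate":    datetime(2024, 1, 1, 11, 50),  # 10 minutes old: fine
    "reactor_temp": datetime(2024, 1, 1, 9, 0),    # 3 hours old: stale
}
ok = inputs_fresh(tags, now)  # False, so hold the optimizer run
```

A gate like this doesn't make the optimizer real-time, but it at least prevents it from confidently setting today's targets from yesterday's plant.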
3. Real-time data analysis
However sophisticated the data analytics, they need to run in real time if the decision-making is to happen in real time. Again, this doesn't have to be as complex as it sounds. Only a limited number of variables need updating in real time to ensure the production planning tools are downloading the optimum targets. The data analytics should be focused on those few key variables.
4. APC design and scope improvements
What about the design and scope of the APC controllers that currently run in a plant? Could they be improved? One of the biggest problems with many controllers in plants today is that they are just too big. Following standardized training programs, the designer runs the plant test, exercises every possible manipulated variable (MV), looks at the impact on every possible controlled/constraint variable (CV), and then includes every one of these variables in the model and controller design.
It shouldn’t be that way. Experienced board operators can tell you which variables are important. Why not have a data monitoring tool, based on data analytics and including key productivity measures, that can process plant and lab data? It could identify the important MV/CV relationships, assist the APC designer in finding the right controller size, and filter out the variables and variable relationships that have little, if any, impact on controllability and optimization of unit and plant operation.
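Such a screening tool could start as simply as ranking MV/CV pairs by correlation in the historian data. A minimal sketch; the tag names and the 0.3 threshold are illustrative assumptions, and a real tool would also have to account for process time delays and correlated MV moves:

```python
# Minimal sketch: screen historian data for the MV/CV relationships that
# matter, so the controller design includes only influential variable pairs.
# Tag names and the 0.3 correlation threshold are illustrative assumptions.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def screen_pairs(history, threshold=0.3):
    """history: dict of tag name -> list of samples. Returns the MV/CV
    pairs whose absolute correlation meets the threshold."""
    mvs = [k for k in history if k.startswith("MV")]
    cvs = [k for k in history if k.startswith("CV")]
    return [(mv, cv, round(pearson(history[mv], history[cv]), 2))
            for mv in mvs for cv in cvs
            if abs(pearson(history[mv], history[cv])) >= threshold]

history = {
    "MV1_reflux": [10, 11, 12, 13, 14, 15],            # tracks purity closely
    "MV2_steam":  [5, 7, 4, 6, 5, 7],                  # mostly noise here
    "CV1_purity": [90.1, 90.9, 92.2, 92.8, 94.1, 94.9],
}
important = screen_pairs(history)  # keeps only (MV1_reflux, CV1_purity)
```

Weak pairs are filtered out before they ever reach the controller model, which is exactly the size-control the section argues for: the designer starts from the pairs the data (and the board operators) say matter.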
5. APC monitoring and support
How about an APC monitoring tool that measures and reports on how well the APC controllers are performing? It’s not whether the controller is running, or how many MVs are active, but, rather, what do the economics look like? Data analytics could use plant and economic real-time data (product values, energy costs, and so on) to calculate the “profitability” of the APCs, based on enterprise-wide business criteria and accepted methodology.
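As a starting point, such a profitability calculation might look like the sketch below. The products, prices, energy figures and pre-APC baseline are illustrative assumptions, not an accepted industry methodology:

```python
# Minimal sketch: score an APC controller by economics, not just uptime.
# All prices, flows and the baseline figure are illustrative assumptions.

def apc_profit_per_hour(product_rates, product_values, energy_use,
                        energy_cost, baseline_profit):
    """Compare current operating profit against a pre-APC baseline.
    product_rates: dict name -> units/h; product_values: dict name -> $/unit;
    energy_use in MMBtu/h; energy_cost in $/MMBtu; baseline_profit in $/h."""
    revenue = sum(product_rates[p] * product_values[p] for p in product_rates)
    profit = revenue - energy_use * energy_cost
    return profit - baseline_profit   # positive = APC is adding value

benefit = apc_profit_per_hour(
    product_rates={"propylene": 40.0, "gasoline": 120.0},   # units/h
    product_values={"propylene": 55.0, "gasoline": 30.0},   # $/unit
    energy_use=250.0, energy_cost=4.0,                      # MMBtu/h, $/MMBtu
    baseline_profit=4500.0,                                 # pre-APC $/h
)
# benefit = 300.0, i.e., $300/h above the pre-APC baseline
```

Fed with live prices and flows from the historian, a number like this answers the question the section poses: not "is the controller on?" but "is it paying for itself right now?"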
These are just a few ways data analytics could help. All that's needed now is management vision and commitment to make it happen, plus somebody to sell it and validate it. It's challenging, but doable.
Jim Ford, Ph.D., is a senior consultant at Maverick Technologies, a Control Engineering content partner. Edited by Mark T. Hoske, content manager, Control Engineering, CFE Media and Technology, email@example.com.
Keywords: Advanced process control, data analytics
Advanced process control (APC) relies on good data for optimal decisions.
Some data updates should be in real time.
APC design and scope influence performance.
If APC isn’t meeting expectations, reexamine data connections, models and monitoring.