Optimize manufacturing value in real time

If you have the engineering resources and the business commitment to improve your value-add to the business, then real-time process optimization (RPO) may be right for your organization.
By Dennis Brandl, BR&L Consulting November 22, 2016

Figure 1: This diagram shows how real-time process optimization (RPO) interacts with other aspects of process control and how it fits into the commonly used International Society of Automation (ISA) 95 hierarchy of control. Courtesy: BR&L Consulting

Real-time process optimization (RPO) defines functions that optimize the economic value of manufacturing production processes, such as minimizing the cost of production, the energy used, or the time of production. RPO is commonly confused with advanced process control (APC), but the two are complementary functions, not equivalent. APC is a technique designed to provide control strategies that minimize the difference between process setpoints and actual values; for example, minimizing overshoot in a process change or minimizing the time to return to a steady state after a process upset.

RPO is used to define the target process values for APC, based on optimizing one or more business goals. RPO is used to set target values for many different forms of process control, from basic on/off control, to proportional, integral, and derivative (PID) control, up to the APC techniques of model predictive control (MPC), model-based control (MBC), and dynamic matrix control (DMC). Figure 1 shows how RPO interacts with other aspects of process control and how it fits into the commonly used International Society of Automation (ISA) 95 hierarchy of control. RPO sits above real-time control loops and APC and uses information from many sources to determine the optimal targets to meet economic business goals.

Basic control handles single control loops or sometimes cascaded control loops. APC handles multiple control loops that have interdependencies—a change in one will have an impact on others. RPO usually spans several units, entire production lines, and sometimes entire sites, setting targets for APC or basic control that meet a global optimum. The broader the scope of RPO, the greater the opportunity to reach a globally optimal solution, but RPO also has been effectively applied to collections of units where economic value can be obtained using just the bottleneck or production-limiting units. The book The Goal: A Process of Ongoing Improvement by Eli Goldratt is a great reference to help understand where and when to apply optimization and should be a required part of every control engineer's library.

RPO algorithms require multiple sources of information to be effective:

  • Business-defined key performance indicators (KPIs)—these define the economic goals to be maximized or minimized. In most systems, RPO algorithms may need to find solutions that simultaneously optimize several KPIs. For example, there may be a KPI that defines the minimum use of energy to make product, balanced against the amount of material used for production, balanced against the safe operating limits of the equipment. Production KPIs typically involve minimal use of time, energy, or material, or they involve maximizing quality or throughput. The optimization KPIs are set by higher level business functions and can be influenced by economic and market factors, such as customer commitments, spot market prices, sales campaigns, and inventory capabilities.
  • Process model—a process model describes, in formal mathematical terms, how the system will react to different operating conditions. In the refining industry, a process model may define how well the fractionation columns will split different grades of crude oil. In discrete manufacturing, the model may define the relationship between quality and production line speed. RPO process models are often similar to the process models used in APC, but they include inter-unit dependencies.
  • Capability model—a capability model describes the capability of the system to handle different raw materials and different products. For example, in processed food production, the difference between different lots of corn in moisture and sweetness can influence processing time, product quality, and energy used. The capability model can be a simple relationship in many manufacturing facilities, but where many different products can be manufactured on the same lines, with varying throughput, quality, material, equipment, and personnel use, then a capability model can be a complex association of materials and resources.
  • Constraints—these define the various limits within which an optimal solution must be found. The limits may include safety limits for equipment, storage limitations for intermediates or final products, transportation limits for final products, raw material availability and price, or final product price/quantity sensitivity. Each constraint bounds one dimension of the multidimensional optimization solution space.
  • Current conditions—the final element that a real-time process optimization algorithm needs is the current operating conditions. The current condition is used to calculate the cost of changing to a different set of targets. Sometimes it’s better to run below optimal conditions because the cost of switching is too high. Historical values also are used to see the current rate of change of the system because changing direction can be costly and take significant time. 
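To make the list above concrete, here is a minimal Python sketch that bundles the five inputs into one object an optimizer could evaluate. Every name, structure, and weight below is an illustrative assumption, not part of any particular RPO product:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class RPOProblem:
    """Bundles the five inputs an RPO algorithm needs (illustrative)."""
    kpi_weights: dict        # business KPIs and their relative weights
    process_model: Callable  # maps candidate targets -> predicted KPI values
    capability: dict         # what the equipment can do with current materials
    constraints: list        # feasibility checks; each returns True or False
    current_state: dict      # current operating conditions

    def cost(self, targets: dict) -> float:
        # Weighted KPI cost from the process model, plus a small penalty
        # for moving away from current conditions (switching is not free).
        predicted = self.process_model(targets)
        kpi_cost = sum(w * predicted[k] for k, w in self.kpi_weights.items())
        switch = sum(abs(targets[v] - self.current_state.get(v, targets[v]))
                     for v in targets)
        return kpi_cost + 0.1 * switch   # 0.1 is an assumed switching weight

    def feasible(self, targets: dict) -> bool:
        return all(check(targets) for check in self.constraints)
```

An optimizer would then search for the feasible set of targets with the lowest cost; the switching penalty captures the point above that running below optimal conditions can be preferable when the cost of changing is too high.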

Optimization methods

Figure 2: This graph shows an example with two variables, five constraints represented as linear equations, and one KPI relationship. The shape in yellow indicates possible solutions that fit into the constraints. The solutions are at the corners of the shape.

Optimization is a difficult process when there are dozens of KPIs, hundreds to thousands of variables, and hundreds to thousands of constraints. Multiple techniques have been developed to solve these kinds of problems, and optimization has been a serious field of study for mathematicians for centuries. There are two ways to look at optimization problems: as linear or as nonlinear problems.

Solving linear problems with sets of linear equations is a branch of mathematics called linear programming (LP). If all of the relationships can be represented through a set of linear equations, then LP techniques can be used to find an optimal solution. Figure 2 illustrates a simple example with two variables, five constraints represented as linear equations, and one KPI relationship. The shape in yellow indicates possible solutions that fit into the constraints. The solutions are at the corners of the shape, and the optimal solution is the one that either maximizes or minimizes the optimization KPIs. LPs that have tens of thousands of variables and thousands of constraints have been built for refinery and chemical plant optimization, while a typical discrete manufacturing facility will often have hundreds of variables and dozens of constraints.
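A small problem in the shape of Figure 2 can be solved in pure Python by enumerating the corners of the feasible region, since the optimum of an LP lies at a corner. The five constraints and the KPI below are assumed numbers for illustration; they are not the values from the figure:

```python
from itertools import combinations

# Constraints in the form a*x + b*y <= c, written as tuples (a, b, c).
# These values are illustrative assumptions.
constraints = [
    (-1, 0, 0),   # x >= 0
    (0, -1, 0),   # y >= 0
    (1, 0, 4),    # x <= 4
    (0, 1, 3),    # y <= 3
    (1, 1, 5),    # x + y <= 5
]

def profit(x, y):
    """The single KPI relationship to maximize (assumed)."""
    return 3 * x + 2 * y

def corners(cons, tol=1e-9):
    """Feasible corner points: intersections of constraint boundary lines
    that also satisfy every constraint."""
    pts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < tol:
            continue                      # parallel boundaries, no corner
        x = (c1 * b2 - c2 * b1) / det     # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + tol for a, b, c in cons):
            pts.append((x, y))
    return pts

# The optimal solution is the corner that maximizes the KPI.
best = max(corners(constraints), key=lambda p: profit(*p))
```

Production LP solvers use the simplex method or interior-point methods rather than brute-force corner enumeration, which would be far too slow with thousands of variables, but the principle is the same.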

The key to using LPs to discover optimal solutions is the discovery of the linear relationships. Unfortunately, real life is often not linear. For nonlinear situations, there are two approaches. One is to assume that the relationships are linear within the expected solution set and accept that the answer is approximate, but close. Often this is combined with checks to see whether the solution is near a nonlinear region, rerunning the algorithm with a different set of relationships when it is.

The second approach is to run hundreds to thousands of different scenarios, using a strategy called "peak hunting," in which scenarios are used to "walk" to a local optimum in the multidimensional solution space. Many refineries and chemical plants use the peak-hunting method to determine optimal solutions because of the complexity and nonlinearity of the process models. Often, this is combined with checks to ensure that global optima are found and not just local peaks. When problems are nonlinear, the most you can hope for is "almost optimal," but usually that can be within tenths of a percent of the true optimum.
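The peak-hunting idea can be sketched as a simple hill climb with random restarts; the restarts are the check against settling on a merely local peak. The one-dimensional "process model" below is an assumed function chosen to have both a local and a global peak:

```python
import math
import random

def peak_hunt(objective, lo, hi, step=0.05, restarts=20, seed=0):
    """Hill-climb ("peak hunt") from several random starting points and
    keep the best peak found; restarts guard against stopping on a
    merely local peak."""
    rng = random.Random(seed)
    best_x, best_y = None, float("-inf")
    for _ in range(restarts):
        x = rng.uniform(lo, hi)
        while True:
            # Probe one step each way and move uphill while possible.
            neighbors = [n for n in (x - step, x + step) if lo <= n <= hi]
            nxt = max(neighbors, key=objective)
            if objective(nxt) <= objective(x):
                break                     # no uphill neighbor: a local peak
            x = nxt
        if objective(x) > best_y:
            best_x, best_y = x, objective(x)
    return best_x, best_y

# An assumed nonlinear "process model": a local peak near x = 1 and the
# global peak near x = 4.
def model(x):
    return math.exp(-(x - 1) ** 2) + 1.5 * math.exp(-(x - 4) ** 2)

best_x, best_y = peak_hunt(model, 0.0, 5.0)
```

A real RPO solution space has hundreds of dimensions rather than one, but the scenario-walking logic and the need for restarts are the same.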

Model verification

Figure 3: This diagram shows a typical lifecycle of a real-time process optimization (RPO) project, where regular model validation and tuning are used to ensure valid targets are generated. Courtesy: BR&L Consulting

Development of the RPO models can be very time consuming and requires a great deal of engineering and manufacturing knowledge. Effective RPO can bring significant economic benefits to companies, often adding percentage points of profit to the bottom line. However, even the best models won't provide the benefits unless they are valid. One of the worst things that can happen on an RPO project is generating targets that clearly won't work, such as targets that are unsafe or that ignore actual current capacities and capabilities.

When targets are wrong, the operational staff will quickly start to ignore them, because they will no longer trust the RPO system. It is important to develop a process that regularly validates the models by comparing expected results against measured results. Figure 3 illustrates a typical lifecycle of an RPO project, where regular model validation and tuning are used to ensure valid targets are generated. Most successful RPO projects use data historians to collect data and regularly run analysis programs to determine actual performance, and they have dedicated resources to validate the models and correct them when needed. 
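The comparison of expected against measured results can be as simple as an error metric computed over historian data. The sketch below uses mean absolute percentage error with an assumed 5% tolerance; the field names and threshold are illustrative, not from any specific validation tool:

```python
def validate_model(predicted, measured, tolerance=0.05):
    """Compare model predictions with measured (historian) values and
    flag the model for re-tuning when mean absolute percentage error
    exceeds the tolerance. The 5% tolerance is an assumed value."""
    errors = [abs(p - m) / abs(m)
              for p, m in zip(predicted, measured) if m != 0]
    mape = sum(errors) / len(errors)
    return {"mape": mape, "valid": mape <= tolerance}

# Example: model-predicted throughput vs. what the historian recorded.
report = validate_model(predicted=[100, 102, 98, 101],
                        measured=[99, 104, 97, 103])
```

Running a check like this on a schedule, and re-tuning the model whenever it fails, is the feedback loop shown in Figure 3.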

How often to run RPO and validation

Real-time process optimization is real time, but real time usually is measured in hours and days rather than seconds and milliseconds. There is little use in running RPO faster than the system can respond to changes in the targets. Often, because the targets are defined by business economics, they will change on a daily, weekly, or even monthly basis. Some industries, such as the electric power industry, have faster changes, but most run slower because of the lag in their supply chains. There also is little need to run faster than changes to the demand, constraints, or capabilities. If the plant is running with constant demand, constant prices, and no changes in capabilities, then RPO should find that no changes to process targets are needed.
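That rule of thumb—re-run the optimizer only when one of its inputs has changed—can be captured in a small guard function. The field names below are assumptions for illustration:

```python
def should_rerun_rpo(previous, current):
    """Trigger a new optimization run only when an input that RPO depends
    on has changed: demand, prices, constraints, or capabilities."""
    watched = ("demand", "prices", "constraints", "capabilities")
    return any(previous.get(key) != current.get(key) for key in watched)

# Snapshot of the inputs from the last RPO run (illustrative values).
state = {"demand": 500, "prices": {"product_a": 12.0},
         "constraints": {"max_rate": 100}, "capabilities": {"line_1": "ok"}}
```

With constant demand, prices, constraints, and capabilities, the guard returns False and the existing targets stand, matching the point above that RPO should then find no changes are needed.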

Validation should be run on a regular schedule or when there are major changes to process capabilities or constraints. For example, if a bottleneck machine is replaced with a faster machine, then the model should be updated with the new capability. Equipment also breaks or performs less well over time, so validation every quarter, six months, or year is generally a reasonable schedule. Remember, an unvalidated model will quickly become untrusted and unused, so don't forget regular validations.

Looking to the future

A question that often comes up is, how does RPO fit into the new Industrial Internet of Things (IIoT) world? IIoT is envisioned to bring extensive sensing and computing capability to manufacturing sites. A manufacturing system will be made up of hundreds or thousands of smart IIoT devices, each communicating and coordinating work with other devices. In this world, RPO still has a large benefit. RPO sets the targets for local actions, eventually directing the actions of thousands of devices to reach the optimal process state. RPO algorithms will undoubtedly be developed to take advantage of the immense computing capability of an IIoT cloud. Peak hunting for optimal solutions can be easily distributed, allowing all devices to participate in discovering the optimal business state.

RPO can be used in companies that have a good understanding of their production processes, their dependencies, capabilities, and sensitivities. RPO is best applied to processes where you are continually discovering better production schedules and setpoints, and you know that you are not at the best business use of the facility. If you have the engineering resources and the business commitment to improve your value-add to the business, then RPO may be right for your organization. 

Dennis Brandl is the founder and president of BR&L Consulting in Cary, N.C. He is an active member of the ISA 95 Enterprise/Control System Integration committee, a co-author of the MESA B2MML standards, a member of the ISA 99 Industrial Cyber Security Standards Committee, former chairman of the ISA 88 Batch System Control Committee, and has participated in the development of OPC and other industrial standards. Brandl writes a monthly column on Manufacturing IT in Control Engineering magazine.

This article appears in the Applied Automation supplement for Control Engineering and Plant Engineering.
