Understanding optimizers

Given the number of process variables interacting at most plants, finding the optimum mix to maximize profitability is likely beyond manual capabilities.

By Tom Kinney December 3, 2013

Optimization can mean different things to different people, so for purposes of this discussion we will concentrate on applying linear programming (LP) techniques, either online or offline, to determine the minimum-cost, and therefore most profitable, set of targets or setpoints to drive a process. For the LP to evaluate the process conditions a potential solution would produce, it needs a process model. In all cases the LP and model must respect all the constraints that exist in the process, including safety-related limits such as pressure, product quality specifications, and equipment constraints such as fluid handling or metallurgical limits.

Why use optimization?

In today’s operating environments with extensive heat integration, numerous process quality targets, and hundreds of constraints, the decision to drive a process in one direction or another or to push a constraint is anything but straightforward. Gasoline, for example, is a blend of as many as 12 or even more separate streams coming from refinery process units. These streams combine in a ratio control system to create a given gasoline grade. Each component has a cost, and the individual properties for that component will affect the properties of the final product.

For any given gasoline grade there is a specification, often with both a minimum and a maximum, for each product property. Moreover, for each component used in the blend, there are also constraints related to availability, feedstock tank inventory, product tank inventory, rundown from the unit, and pumping capacity, to name only a few. Finding the amount of each component that delivers the final specification at the lowest cost while honoring every constraint requires evaluation of thousands of possible combinations. An optimizer can deliver this operating point faster, and at significant savings, compared to the one that traditional best practice might suggest. This can save a refinery several million dollars per year while reducing product quality giveaway, eliminating rework, and improving overall production.
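
To make this concrete, the following is a minimal sketch of such a blending LP using scipy.optimize.linprog. The three components, their costs, octane values, and availability limits are invented purely for illustration (a real blend would involve many more streams and properties), and simple linear octane blending is assumed:

```python
# Minimal blending LP sketch (illustrative numbers only): choose component
# volumes for a 10,000-bbl gasoline blend at minimum cost while meeting a
# minimum octane spec and respecting per-component availability limits.
import numpy as np
from scipy.optimize import linprog

cost = np.array([92.0, 88.0, 105.0])     # $/bbl for three components (assumed)
octane = np.array([97.0, 82.0, 94.0])    # blending octane numbers (assumed linear blending)
demand = 10_000.0                        # bbl of finished gasoline required
min_octane = 91.0                        # product spec (minimum)

# Equality: component volumes must sum to the demanded blend volume.
A_eq = [[1.0, 1.0, 1.0]]
b_eq = [demand]

# Inequality (<=): volume-weighted octane must be at least min_octane * demand,
# written as -(octane . x) <= -min_octane * demand.
A_ub = [(-octane).tolist()]
b_ub = [-min_octane * demand]

# Availability limits per component (bbl), e.g. tank inventory or rundown rate.
bounds = [(0, 6_000), (0, 8_000), (0, 4_000)]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
print(res.x, res.fun)   # optimal volume of each component and total blend cost
```

Even in this toy version, the solver is weighing cost against the octane spec and the availability limits simultaneously; a real blend problem multiplies the components, properties, and constraints far beyond what could be evaluated by hand.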

Open vs. closed loop

Closed-loop optimizers can be found in offsite product blending operations, process units, MPC (model predictive control) advanced controllers, planning and scheduling platforms, and raw material purchasing tools. These typically run on a minute-to-minute basis, but rigorous optimizers can run less frequently, from every few hours to once per day, with planning tools running on a weekly or monthly basis. In all these applications, optimizers are an integral part of the real-time system, and therefore must account for process dynamics and incorporate the automation best practices that make any online closed-loop control system robust and reliable.

On the other end of the spectrum, the planning department looks at a horizon measured in weeks or even months. In this case the optimizer is run in an offline mode to try out hundreds or even thousands of potential scenarios and develop the best set of operating plans for the plant given the raw materials available for purchase. LP is the main workhorse here, and hundreds of product qualities are considered along with a multitude of operating constraints. Optimizers are thus adapted to run in different scenarios, with one of the most fundamental demarcation points being online closed-loop service versus the open-loop planning case. In the closed-loop case, such as in MPC, the LP is almost always paired with a multivariable dynamic controller that decouples the interactions in the process and drives the LP targets toward the optimum solution quickly and in a stable manner. In these cases, the dynamic controller working with the LP uses an empirical model of the process dynamics, which is simply a fancy term for models regressed from process data that capture the process's dynamic behavior.

Process models

Earlier we mentioned that a model of the process is necessary for the optimizer to evaluate process conditions at a new operating point. For example, let’s ask, “If we increase the feed to the unit by 5%, what will the exit temperature from the reactor be, and does this violate any known constraints?” There are two broad classes of models, empirical and rigorous. Empirical models are derived from data, usually processed through a tool that performs some sort of regression using techniques such as least squares or partial least squares. They are relatively quick to develop, are based on actual process conditions, and can capture the dynamic behavior of the process. However, they generally don’t predict well outside the range of data on which they are based.
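
As a rough illustration of the empirical approach, the sketch below fits a steady-state model relating reactor exit temperature to feed rate and inlet temperature by ordinary least squares. The data are synthetic and the variable names hypothetical; a real application would pull cleaned data from the plant historian:

```python
# Sketch of an empirical (regressed) steady-state model: fit exit temperature
# as a linear function of feed rate and inlet temperature from historical data.
# Data here are synthetic stand-ins for historian data.
import numpy as np

rng = np.random.default_rng(0)
feed = rng.uniform(80, 120, size=200)            # feed rate, t/h
t_in = rng.uniform(180, 200, size=200)           # inlet temperature, degC
t_exit = 50.0 + 0.8 * feed + 0.9 * t_in + rng.normal(0, 1.0, size=200)

X = np.column_stack([np.ones_like(feed), feed, t_in])
coef, *_ = np.linalg.lstsq(X, t_exit, rcond=None)

# Use the fitted model to answer "what happens if feed increases 5%?"
def predict_exit_temp(feed_rate, inlet_temp, c=coef):
    return c[0] + c[1] * feed_rate + c[2] * inlet_temp

print(predict_exit_temp(100.0, 190.0), predict_exit_temp(105.0, 190.0))
```

The model answers the question quickly, but only within the range of conditions represented in the data used to fit it.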

Rigorous models are built from first-principles relationships and do not depend on process data, although almost all rigorous models are validated against data from the actual process. They can predict process values over a much wider range and are better at predicting such things as nonlinear steady-state behavior.
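
By way of contrast with the regressed model above, here is a deliberately tiny first-principles sketch: a steady-state energy balance around a cooled exothermic reactor. The heat of reaction, heat capacity, and cooling duty are invented values, not data from any real unit, but the structure comes from conservation laws rather than regression:

```python
# Sketch of a (very small) rigorous model: a steady-state energy balance around
# a cooled exothermic reactor, derived from first principles rather than data.
# Exit T = inlet T + (heat released - cooling duty) / (mass flow * heat capacity).
def exit_temperature(feed_kg_s, t_in_c, conversion=0.85,
                     dh_rxn_j_per_kg=-2.0e5, cp_j_per_kg_k=2.1e3,
                     cooling_duty_w=3.0e6):
    """All parameter values are illustrative, not taken from any real unit."""
    heat_released_w = -dh_rxn_j_per_kg * conversion * feed_kg_s
    return t_in_c + (heat_released_w - cooling_duty_w) / (feed_kg_s * cp_j_per_kg_k)

# Unlike the regressed model, this can extrapolate to feed rates the unit has
# never actually run, because the relationship is physical, not statistical.
print(exit_temperature(25.0, 190.0))     # current feed
print(exit_temperature(26.25, 190.0))    # feed increased 5%
```

The price of that extrapolation ability is development effort: building and validating rigorous models of real units takes far longer than regressing plant data.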

Although we have been discussing optimization tools in the LP context, many commercial offerings are capable of nonlinear programming (NLP) solutions. These are useful for certain types of applications, such as energy management, where major pieces of process equipment, such as a turbine, might not always be in service when the solution is determined. Such on/off decisions call for another solver variant, the mixed-integer nonlinear program (MINLP). The common denominator for all these cases is that the tool determines the best point or mode of operation, as already discussed.
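
The integer-decision aspect can be illustrated with a simplified example. The sketch below is kept linear (a mixed-integer linear program solved with scipy.optimize.milp); a real energy-management problem with nonlinear equipment curves would become the MINLP described above. All costs, capacities, and the demand figure are invented for illustration:

```python
# Sketch of a mixed-integer decision: should a turbine run at all, and if so at
# what load, to meet a site power demand at minimum cost?  Linear version only;
# nonlinear equipment curves would turn this into an MINLP.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Variables: x = [turbine_MW, grid_import_MW, turbine_on (0/1)]
c = np.array([35.0, 60.0, 100.0])   # $/MWh turbine, $/MWh grid, $/h fixed cost if turbine runs

demand = LinearConstraint([[1, 1, 0]], lb=12.0, ub=12.0)    # meet 12 MW demand exactly
capacity = LinearConstraint([[1, 0, -10.0]], ub=0.0)        # turbine_MW <= 10 * turbine_on

res = milp(c,
           constraints=[demand, capacity],
           integrality=[0, 0, 1],                            # last variable is binary
           bounds=Bounds([0, 0, 0], [np.inf, np.inf, 1]))
print(res.x, res.fun)   # expected: run the turbine at full load, import the remainder
```

The binary variable is what forces the solver to compare whole operating modes (turbine on versus off) rather than simply shading a continuous setpoint.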

Data validation and reconciliation

The old computer programming axiom of “garbage in, garbage out” certainly applies to optimization, especially online optimization. As mentioned earlier, where online closed-loop optimization is applied, the solutions generally do not address process dynamics in any comprehensive manner. Consequently they are executed only when the process has reached steady state, which requires that the tools be able to detect that the process has settled following the last set of moves requested by the optimizer. These steady-state detection tools are used in conjunction with data reconciliation tools to spot, correct, remove, or even replace suspect process values before the optimizer begins its calculation sequence to arrive at the next solution.
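
There are many ways to test for steady state; the sketch below shows one simple heuristic of the kind such tools might apply, declaring a measurement steady when its drift over a recent window is small compared with its sample-to-sample noise. The threshold and window are arbitrary choices for illustration:

```python
# Simple steady-state check (one of many possible tests): a variable is
# considered steady when its trend over a recent window is small relative
# to its measurement noise.  Threshold is arbitrary for illustration.
import numpy as np

def is_steady(values, rel_slope_limit=0.5):
    """values: recent samples of one process variable, oldest first."""
    values = np.asarray(values, dtype=float)
    t = np.arange(values.size)
    slope, _ = np.polyfit(t, values, 1)            # linear drift per sample
    noise = np.std(np.diff(values)) + 1e-12        # sample-to-sample noise
    return abs(slope) < rel_slope_limit * noise

# The optimizer would only be released to compute a new solution once all key
# variables pass such a test following its previous set of moves.
window = [301.2, 301.0, 301.3, 300.9, 301.1, 301.2, 300.8, 301.0]
print(is_steady(window))
```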

Constraints

Constraints, or limits, are present in every optimization context, whether MPC optimization or online/offline optimization. The process presents many constraints, which must be respected and are therefore built directly into the software. Constraints may relate to product qualities, safety, or equipment limits; in every case they must not be violated. Sometimes the optimizer cannot find a feasible solution within the existing constraints, and a constraint must be relaxed in order to reach one. In these cases the constraints that may be relaxed, or given up, are identified ahead of time and ranked so that the least important constraints are sacrificed first.
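
One common way to implement this kind of ranked relaxation is to add a slack variable to each constraint that may be given up and penalize that slack heavily in the objective, with the heaviest penalties on the most important constraints. The numbers and penalty weight below are invented for illustration:

```python
# Sketch of constraint relaxation via penalized slack: a quality constraint that
# cannot quite be met is allowed to be violated by slack s, and s carries a large
# cost so it is used only when no feasible solution exists otherwise.
import numpy as np
from scipy.optimize import linprog

# Variables: x1, x2 (component amounts) and s (quality-constraint slack).
penalty = 1_000.0                      # reflects the rank/importance of the soft constraint
c = [5.0, 4.0, penalty]

A_eq = [[1, 1, 0]]                     # hard: total amount must equal 10
b_eq = [10.0]

A_ub = [[-3, -1, -1],                  # soft: 3*x1 + x2 + s >= 20  (quality spec)
        [ 1,  0,  0]]                  # hard: x1 <= 4              (availability)
b_ub = [-20.0, 4.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3, method="highs")
print(res.x)   # roughly [4, 6, 2]: the quality spec must be relaxed by 2 units
```

Without the slack variable this small problem is infeasible; with it, the solver both finds an operating point and reports exactly how far the lowest-ranked constraint had to be given up.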

Benefits, costs, and maintenance

A return on investment within six to nine months is typical for optimization applications, though there is wide variation across industries. Some of the greatest benefits have been achieved when optimizers are implemented as part of an MPC program, and when the regulatory and advanced regulatory control layers are well maintained and operating at peak performance before commissioning.

Although optimization projects may compete for resources, it is worth noting that the optimization itself does not change the process or the existing automation hardware, beyond the addition of another server-class machine connected to the process or business network.

Maintenance must also be factored into any discussion of ROI on optimizers. Some assume that once commissioned, optimizers require no ongoing maintenance. However, operational constraints, models, and objectives change over time and the optimizer must be changed accordingly. Ongoing software and application support can average 10% of the original project cost.

Future requirements

Despite their benefits, rigorous first-principles models can be time-consuming to develop. Combining empirical and first-principles models is an emerging trend that promises to reduce development time and ease future model maintenance. Integration with historians, feed property databases, and various planning tools is another trend, one that can reduce the staff time needed to update information manually while also reducing errors. Lastly, the new generation of users entering the workplace demands a level of ease of use not found in the previous generation of tools. Drag-and-drop, copy-and-paste, and touchscreen support are all features that will simplify the use and adoption of these software tools, and the inclusion of features consistent with the mainstream desktop- and iPad-enabled generation will be ongoing.

Also, the technical skills needed to conceive, develop, commission, and maintain an optimization system are always in short supply. Securing staff with the right skills, along with identifying and enlisting a site champion, is a prerequisite for a successful project. The right skills might exist within your company or within the supplier organization but not in your geographic location, so there is an increasing trend for optimization tool sets to accommodate multiple locations and time zones, allowing people in different locations to work collaboratively in real time. Two or more users should be able to view the same workspace at the same time, each making changes, with the underlying database synchronized in near real time. The successful tools of the future will also allow steady-state design models to be translated seamlessly into online optimization for both steady-state and dynamic applications, giving much greater leverage and reuse than has been possible with previous generations of tools.

Tom Kinney is vice president, optimization, for Invensys.

Key concepts:

  • In most process manufacturing applications, the number of process variables requires sophisticated analysis to determine optimum operating conditions.
  • Optimizers operate hand-in-hand with advanced process control techniques to drive profitability.
  • Accurate process models are necessary to ensure that all relevant operating constraints and safety measures are included in the calculations.

ONLINE

https://iom.invensys.com


Subscribe to Process & Advanced Control eNewsletter at www.controleng.com/newsletters