Understanding optimizers

Given the number of process variables interacting at most plants, finding the optimum mix to maximize profitability is likely beyond manual capabilities.

12/03/2013


Tobias Scheele, vice president of design, simulation, and optimization for Invensys, explains how optimizers extend the ability to improve plant performance.



This diagram gives a highly simplified view of the elements related to producing gasoline. Keeping every one of them optimized requires more analysis than is practical with manual approaches. Courtesy: Invensys

Optimization can mean different things to different people, so for purposes of this discussion we will concentrate on applying linear programming (LP) techniques, either online or offline, to determine the minimum-cost solution, and therefore the highest profitability, for a set of targets or setpoints used to drive a process. For the LP to evaluate process conditions at a potential solution, it needs a process model. In all cases the LP and model must respect all the constraints that exist in the process, including safety-related limits such as pressure, product quality specifications, and equipment constraints such as fluid-handling or metallurgical limits.
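In its most general form (the symbols here are generic notation used for illustration, not taken from any particular vendor's tool), such an LP can be sketched as:

\[
\begin{aligned}
\min_{x}\quad & c^{\top}x && \text{operating cost of the chosen targets } x \\
\text{s.t.}\quad & Ax \le b && \text{quality, safety, and equipment constraints} \\
& x_{\min} \le x \le x_{\max} && \text{limits on individual setpoints}
\end{aligned}
\]

where the process model supplies the coefficients in A and b that relate setpoint moves to the constrained process variables.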

Why use optimization?

In today’s operating environments, with extensive heat integration, numerous process quality targets, and hundreds of constraints, the decision to drive a process in one direction or another, or to push a given constraint, is anything but straightforward. Gasoline, for example, is a blend of a dozen or more separate streams coming from refinery process units. These streams combine in a ratio control system to create a given gasoline grade. Each component has a cost, and the individual properties of that component affect the properties of the final product.

For any given gasoline grade there are at least as many product specifications as there are properties, since many properties carry both a minimum and a maximum limit. Moreover, for each component used in the blend there are also constraints related to availability, feedstock tank inventory, product tank inventory, rundown from the unit, and pumping limits, to name only a few. Finding the combination of component volumes that meets the final specifications while honoring every constraint at the lowest cost requires evaluating thousands of possible combinations. An optimizer can deliver this operating point faster, and at a significantly lower cost, than one that traditional best practice might suggest. This can save a refinery several million dollars per year while reducing product quality giveaway, eliminating rework, and improving overall production.
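As a minimal sketch of how such a blending problem can be posed as an LP (the component list, costs, octane and vapor-pressure numbers, specifications, and tank limits below are illustrative assumptions, not refinery data, and linear blending of properties is assumed):

```python
# Minimal gasoline-blending LP sketch using scipy.optimize.linprog.
from scipy.optimize import linprog

# Blend components: reformate, FCC gasoline, alkylate, butane (illustrative)
cost   = [2.90, 2.40, 3.10, 1.80]   # $/gal of each component
octane = [97.0, 92.0, 95.0, 93.0]   # research octane number
rvp    = [3.0, 6.5, 5.0, 52.0]      # Reid vapor pressure, psi

blend_volume = 10000.0  # gallons of finished gasoline required
min_octane   = 93.0     # product spec: minimum octane
max_rvp      = 9.0      # product spec: maximum vapor pressure

# Decision variables x[i]: gallons of each component in the blend.
# Objective: minimize total component cost.
c = cost

# Inequality constraints A_ub @ x <= b_ub:
#   blend octane at least the spec, blend vapor pressure at most the spec
A_ub = [[-o for o in octane], rvp]
b_ub = [-min_octane * blend_volume, max_rvp * blend_volume]

# Equality constraint: component volumes add up to the required blend volume.
A_eq = [[1.0, 1.0, 1.0, 1.0]]
b_eq = [blend_volume]

# Availability constraints: each component limited by tank inventory.
bounds = [(0, 6000), (0, 8000), (0, 4000), (0, 1000)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)  # optimal recipe (gal of each component) and its cost
```

A real blending optimizer handles many more properties and nonlinear blending indices, but the structure of costs, specifications, and constraints is the same.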

Open vs. closed loop

Closed-loop optimizers can be found in offsite product blending operations, process units, MPC (model predictive control) advanced controllers, planning and scheduling platforms, and raw material purchasing tools. These typically run on a minute-to-minute basis, but rigorous optimizers can run less frequently, from every few hours to once per day, and planning tools run on a weekly or monthly basis. In all these applications, optimizers are an integral part of the real-time system, so they must account for the dynamic behavior of the process and must incorporate the automation best practices that make any online closed-loop control system robust and reliable.

At the other end of the spectrum, the planning department looks at a horizon measured in weeks and even months. In this case the optimizer is run in an offline mode to try out hundreds or even thousands of potential scenarios and develop the best set of operating plans for the plant given the raw materials available for purchase. LP is the main workhorse here, and hundreds of product qualities are considered along with a multitude of operating constraints.

Optimizers are thus adapted to different roles, and one of the most fundamental demarcation points is between online closed-loop service and the open-loop planning case. In the closed-loop case, such as in MPC, the LP is almost always used with a multivariable dynamic controller to decouple the interactions in the process and help drive the LP targets toward the optimum solution quickly and in a stable manner. In these cases, the dynamic controller under the LP uses an empirical model of the process dynamics, which is just a fancy term for models regressed from process data that capture the process's dynamic behavior.
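A minimal sketch of that closed-loop arrangement, assuming a DMC-style structure in which an LP computes steady-state targets from the empirical model's gain matrix and hands them to the dynamic controller (the gains, limits, and cost coefficients below are illustrative assumptions):

```python
# Sketch of the steady-state LP target calculation that sits on top of a
# multivariable dynamic controller. All numbers are illustrative.
import numpy as np
from scipy.optimize import linprog

# Steady-state gain matrix of the empirical model: delta_CV = G @ delta_MV
G = np.array([[0.8, -0.3],    # CV1: reactor exit temperature
              [0.1,  0.6]])   # CV2: product purity

cv_now = np.array([345.0, 0.95])   # currently predicted steady-state CVs
cv_lo  = np.array([330.0, 0.95])   # lower CV limits
cv_hi  = np.array([350.0, 1.00])   # upper CV limits

# Economics: cost per unit move of each MV (negative = profitable to raise)
c = np.array([-1.0, 0.5])

# How far each MV may move from its current value at steady state
mv_bounds = [(-5.0, 5.0), (-2.0, 2.0)]

# CV limits expressed as inequalities on the MV moves:
#   cv_lo <= cv_now + G @ delta_MV <= cv_hi
A_ub = np.vstack([G, -G])
b_ub = np.concatenate([cv_hi - cv_now, cv_now - cv_lo])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=mv_bounds)
mv_targets = res.x  # steady-state moves handed to the dynamic controller
print(mv_targets)
```

The dynamic controller then carries out these moves over time while respecting the process dynamics captured in the empirical model.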

Process models

Earlier we mentioned that a model of the process is necessary for the optimizer to evaluate process conditions at the new operating point. For example, let’s ask, “If we increase the feed to the unit by 5%, what will the exit temperature from the reactor be, and does this violate any known constraints?” There are two broad classes of models, empirical and rigorous. Empirical models are derived from process data, usually run through a tool that performs some form of regression using techniques such as least squares or partial least squares. They are relatively quick to develop, are based on actual process conditions, and capture the dynamic features of the process. However, they generally do not predict well outside the range of data on which they are based.
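A minimal sketch of building such an empirical model by least-squares regression (the historian data below is synthetic, generated only for illustration, and the model is a simple static fit rather than a full dynamic identification):

```python
# Regress an empirical feed-rate-to-exit-temperature model from plant data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic historian data: feed rate (t/h) and reactor exit temperature (degC)
feed = rng.uniform(90, 110, size=200)
temp = 300.0 + 0.45 * feed + rng.normal(0.0, 0.5, size=200)

# Ordinary least squares fit of temp = a + b * feed
X = np.column_stack([np.ones_like(feed), feed])
(a, b), *_ = np.linalg.lstsq(X, temp, rcond=None)

# Use the empirical model to answer: what if feed rises 5%?
feed_now = 100.0
predicted = a + b * (feed_now * 1.05)
print(f"predicted exit temperature at +5% feed: {predicted:.1f} degC")
# Caution: the fit is only trustworthy inside the 90-110 t/h range it saw.
```

The same regression idea extends to dynamic (e.g., step-response or ARX) models, which add time behavior to the fit.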

Rigorous models result from first-principles relations that do not depend on process data, although almost all rigorous models are validated with data from an actual process. They can predict process values over a much wider range and are better at predicting such things as nonlinear behavior at steady state.
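For contrast, a rigorous counterpart to the same feed-increase question might be a steady-state energy balance rather than a regression (a highly simplified sketch; the flow, duty, and heat-capacity values are illustrative assumptions):

```python
# First-principles sketch: steady-state energy balance Q = m * cp * (Tout - Tin)
def exit_temperature(feed_kg_s, inlet_T_K, duty_kW, cp_kJ_per_kgK=2.3):
    """Exit temperature from an energy balance around a heater/reactor feed."""
    return inlet_T_K + duty_kW / (feed_kg_s * cp_kJ_per_kgK)

base  = exit_temperature(feed_kg_s=28.0,        inlet_T_K=560.0, duty_kW=3200.0)
plus5 = exit_temperature(feed_kg_s=28.0 * 1.05, inlet_T_K=560.0, duty_kW=3200.0)
print(base, plus5)  # at fixed duty, more feed means a smaller temperature rise
```

Because it rests on conservation laws and physical properties rather than on a particular data set, this kind of model can be trusted further from the conditions it was validated against.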

Although we have been discussing optimization tools in the LP context, many commercial offerings are capable of nonlinear programming (NLP) solutions. These are useful for certain types of applications, such as energy management, where major pieces of process equipment, such as a turbine, might not always be in service when the solution is determined. These situations call for another solver variant, the mixed-integer nonlinear program (MINLP), which treats the on/off decision as a discrete variable. The common denominator for all these cases is that the tool determines the best point or mode of operation, as already discussed.
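One simple way to picture a MINLP of this sort is to enumerate the discrete choice (turbine on or off) and solve a small nonlinear problem for each case; a real MINLP solver is far more sophisticated, and every number below is an illustrative assumption:

```python
# MINLP-flavored energy-management sketch: enumerate the turbine on/off
# decision and solve a small NLP for each case with scipy.optimize.minimize.
from scipy.optimize import minimize

steam_demand = 120.0   # t/h of process steam that must be delivered
power_demand = 18.0    # MW of electrical demand
power_price  = 65.0    # $/MWh purchased from the grid
turbine_gain = 0.20    # MW generated per t/h of steam through the turbine

def fuel_cost(boiler_steam):
    # Nonlinear boiler fuel curve: efficiency falls off at high load.
    return 14.0 * boiler_steam + 0.02 * boiler_steam**2

best = None
for turbine_on in (0, 1):
    # x = [boiler_steam (t/h), purchased_power (MW)]
    def cost(x):
        return fuel_cost(x[0]) + power_price * x[1]

    cons = [
        # Steam balance: boiler steam covers the process steam demand
        {"type": "ineq", "fun": lambda x: x[0] - steam_demand},
        # Power balance: turbine generation plus purchases cover demand
        {"type": "ineq",
         "fun": lambda x: turbine_on * turbine_gain * x[0] + x[1] - power_demand},
    ]
    bounds = [(steam_demand, 200.0), (0.0, 30.0)]
    res = minimize(cost, x0=[150.0, 20.0], bounds=bounds, constraints=cons)
    if res.success and (best is None or res.fun < best[0]):
        best = (res.fun, turbine_on, res.x)

print(best)  # cheapest hourly cost, turbine mode, and operating point
```

Enumerating every discrete combination does not scale, which is exactly why dedicated MINLP solvers exist, but the sketch shows why the on/off decision cannot be handled by an LP or NLP alone.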



Anonymous, 12/28/13 04:25 AM:

I will talk about refinery optimisation.
Planning and scheduling are like headquarters that define the strategy and tactics to optimize the refinery. Process optimisers are then the soldiers that apply the strategy on the battlefield, i.e., deciding every minute how best to operate the various units to achieve the overall goals. Hence the complexity threshold for process optimisation is reduced, and so is the need for an extremely detailed model.
Decisions like producing more naphtha versus kerosene/gasoil are totally out of the hands of process optimisation. They are decided by planning and scheduling. Process optimisation enforces them by pushing production against constraints. When designing a process optimiser, I define how planning and scheduling will set their “orders”.
There is also “local” optimisation, i.e., cases where the global strategy does not depend upon the choice of the local optimum. Energy optimisation is often local. For instance, consider a 2-cut distillation with 3 manipulated variables (condenser, reboiler, and reflux). Once the cut is defined to achieve the global strategy (2 variables fixed), one variable remains for energy optimisation.
Because process optimisation “manipulates” variables to enforce targets, it should follow basic control principles like the Shannon sampling theorem. Hence a process optimiser should run at a “fast” frequency, which means you must reconcile your process model against the plant while taking process dynamics into account. Tools like SMOC, which run every minute, are well tailored to enforce targets, push against constraints, and solve the local optimum.
Tools with only steady-state models do not fit, because their models are updated only at steady state and do not achieve the minimum frequency. A good example is outside temperature in summer. Running optimisation every 6 hours does not work. Often the process constraint is cooling capacity. Early in the morning, the optimiser foresees extra cooling capacity because of the cool night, but you cannot achieve it because the sun comes back. In the evening it is the other way around: you do not use the extra capacity because the optimiser ran against the afternoon heat. Steady-state optimisers are always late.