Data validation and reconciliation
The old computer programming axiom of “garbage in, garbage out” certainly applies to optimization, especially online optimization. As mentioned earlier, where online closed-loop optimization is applied, these solutions generally do not address process dynamics in any comprehensive manner. Consequently, they execute only after the process has settled from the last set of moves requested by the optimizer, which requires that the tools have reliable facilities for detecting that the process has reached steady state. These steady-state detection tools work in conjunction with data reconciliation tools to spot, correct, remove, or even replace suspect process values before the optimizer initiates its calculation sequence to arrive at the next solution.
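At its simplest, a steady-state check compares the drift across a window of recent measurements against that window's noise band. The sketch below is an illustrative Python example, not any vendor's algorithm; the function name and the `drift_frac` and `noise_mult` thresholds are hypothetical stand-ins for the per-signal tuning a commercial tool would provide.

```python
from statistics import mean, pstdev

def is_steady_state(window, drift_frac=0.01, noise_mult=0.5):
    """Crude steady-state test on a window of measurements.

    Splits the window in half and flags steady state when the change
    in mean between halves is small relative to the signal level and
    its noise band. drift_frac and noise_mult are hypothetical tuning
    parameters, not values from any particular product.
    """
    half = len(window) // 2
    drift = abs(mean(window[half:]) - mean(window[:half]))
    tolerance = (drift_frac * max(abs(mean(window)), 1e-9)
                 + noise_mult * pstdev(window))
    return drift <= tolerance
```

A production tool would typically apply a formal statistical test (such as a variance-ratio test) per measurement tag and require every key tag to pass before allowing the optimizer to run.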
Constraints or limits are present in every optimization context, whether MPC optimization or online/offline optimization. The process presents many constraints that must be respected and are therefore integrated programmatically into the software. Constraints may relate to product quality, safety, or equipment limits; in any case they must be respected and not violated. Sometimes the optimizer cannot find a feasible solution within the existing constraints, and a constraint must be relaxed to restore feasibility. In these cases, the constraints eligible to be relaxed or given up are determined ahead of time and ranked from least to most important.
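The pre-ranked relaxation logic can be illustrated for the simplest possible case: interval constraints on a single decision variable. The `find_feasible` helper and the constraint names below are hypothetical examples; real optimizers usually relax constraints through weighted slack variables inside the solver rather than dropping them outright, but the priority ordering works the same way.

```python
def find_feasible(constraints):
    """Illustrative constraint-relaxation sketch.

    constraints: list of (name, lo, hi) intervals, ranked from most-
    to least-important. Intersects the intervals; if the intersection
    is empty, relaxes (drops) the least-important remaining constraint
    and retries, returning a feasible point plus the names relaxed.
    """
    active = list(constraints)
    dropped = []
    while active:
        lo = max(c[1] for c in active)
        hi = min(c[2] for c in active)
        if lo <= hi:
            return (lo + hi) / 2.0, dropped
        # Relax the lowest-ranked constraint, as decided ahead of time.
        dropped.append(active.pop()[0])
    raise ValueError("infeasible even after relaxing every constraint")
```

For example, if a hypothetical equipment limit of 12–15 conflicts with a quality limit of 0–10, the equipment constraint (ranked least important here) is the one given up, and a point satisfying the safety and quality limits is returned.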
Benefits, costs, and maintenance
A return on investment of six to nine months is typical for optimization applications, though there is wide variation across industries. Some of the greatest benefits have been achieved when optimizers are implemented as part of an MPC program, and when the regulatory and advanced regulatory control layers are well maintained and operating at peak performance prior to commissioning.
Although optimization projects may compete for resources, it is worth noting that the optimization itself does not change the process or the existing automation hardware, except for the addition of another server-class machine connected to the process or business network.
Maintenance must also be factored into any discussion of ROI on optimizers. Some assume that once commissioned, optimizers require no ongoing maintenance. However, operational constraints, models, and objectives change over time and the optimizer must be changed accordingly. Ongoing software and application support can average 10% of the original project cost.
Despite their benefits, developing rigorous first-principles models can be time-consuming. Combining empirical and first-principles models is an emerging trend that promises to reduce development time and ease future model maintenance. Integration of historians, feed property databases, and various planning tools is another trend, which can reduce the staff time needed to update information manually while also reducing errors. Lastly, the new generation of users coming into the workplace demands a level of ease of use not found in the previous generation of tools. Drag-and-drop, copy-and-paste, and touchscreen-enabled interfaces are all trends that will simplify the use and adoption of software tools. The inclusion of features consistent with the mainstream desktop- and iPad-enabled generation will be ongoing.
Also, the technical capabilities to conceive, develop, commission, and maintain an optimization system are always in short supply. Getting staff with the right skills is a prerequisite to a successful project, as is identifying and enlisting a site champion. The right skills might exist within your company or within the supplier organization, but not in your geographic location. Accordingly, an increasing number of optimization tool sets accommodate multiple locations and time zones, allowing people in different locations to work collaboratively in real time: two or more users should be able to view the same workspace simultaneously, both making changes, with the underlying database managed in near real time. The successful tools of the future will allow models developed for steady-state design to be seamlessly translated into online optimization for both steady-state and dynamic applications, giving much greater leverage and reuse than has been possible with previous generations of tools.
Tom Kinney is vice president, optimization, for Invensys.
- In most process manufacturing applications, the number of process variables requires sophisticated analysis to determine optimum operating conditions.
- Optimizers operate hand-in-hand with advanced process techniques to drive profitability.
- Accurate process models are necessary to ensure that all relevant operating constraints and safety measures are included in the calculations.