In search of the ultimate batch system


By Dave Harrold August 1, 2002

KEY WORDS

Process & advanced control

System analysis & design

Quality assurance

Batch control

Modeling

Information systems

Data acquisition

Sidebar: Ignoring quality problems is expensive

When statistical process control (SPC) is applied, it’s usually because the producer wants to optimize the process, using existing equipment to produce as much product as efficiently as possible.

At first blush, that would be justification enough for any production operation, continuous or batch, to apply SPC methods; but believe it or not, cost and quantity aren’t the key business drivers among all chemical industry producers. (See ‘Chemical industry diversity’ diagram.)

Petrochemical processing is a well-defined technology generally performed as a continuous process. Volumes are high, margins are low, and production operations make extensive use of plant-wide modeling and optimization methods to drive unit costs down, thus maximizing asset utilization. It’s a classic example of maximizing the return on assets (ROA).

At the other end are R&D-driven pharmaceuticals. These are the companies that spend several years investigating hundreds of new drug possibilities in hopes of bringing a handful to the marketplace.

Generally, R&D pharmaceutical processing requires a manufacturing design that provides the flexibility to produce a variety of high-value products in small volumes, while documenting and signing off every step along the way, activities best performed using batch processing.

R&D pharmaceutical production operations focus on modeling work processes and on yield-enhancing optimization activities in such areas as crystallization, distillation, and filtration. An R&D pharmaceutical producer’s ‘brass rings’ are time to market and maximizing return on investment (ROI). ROI on products that reach the market supports research on products that didn’t (or haven’t yet).

Other chemical producers apply varying degrees of continuous and batch production processes, creating requirements for a variety of tools to ensure timely delivery of quality related information to maximize returns.

Many choices

Deming, Juran, Shewhart, SPC, theory of constraints, design of experiments, control charts, statistical quality control (SQC), Motorola, Six Sigma: these and many other names and terms are part of the everyday language of quality control specialists. Depending on a person’s understanding of the entire quality control discipline, and on the opinions of the teachers, some of these names and terms are revered and some considered blasphemy.

Among those of us with a more ‘casual’ understanding of the discipline called quality control, these names and terms remain mostly just that: names and terms.

Search the Internet to broaden your knowledge of quality control, and you can become mired in pro and con discussions about lean manufacturing, Kaizen, balanced scorecards, dashboards, and more.

It’s no wonder that processes continue to experience variability and that achieving four-sigma quality is hailed as a success. Unless you spend most of your day in the quality control arena, the terms and options can be overwhelming. And what’s really scary: if you apply the wrong methodology, interpret the wrong variables, and/or use the wrong control charts and analysis methods, you can do more harm than good.
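For readers who live outside that arena, a worked example helps show what one of these tools actually computes. The Python sketch below, using made-up assay values, calculates the limits for a Shewhart individuals and moving-range (I-MR) chart; a different chart type (X-bar/R, p-chart, EWMA) would use different data groupings and different constants, which is precisely where choosing the wrong chart causes trouble.

# Hedged sketch: Shewhart individuals (I-MR) control chart limits for one
# quality variable, e.g. a batch assay. The assay values are hypothetical;
# 2.66 and 3.267 are the standard I-MR chart factors for subgroup size 2.
assays = [98.2, 99.1, 97.8, 98.6, 99.4, 98.0, 98.9, 97.5, 98.7, 99.0]
x_bar = sum(assays) / len(assays)
moving_ranges = [abs(b - a) for a, b in zip(assays, assays[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)
ucl_x = x_bar + 2.66 * mr_bar   # individuals chart upper control limit
lcl_x = x_bar - 2.66 * mr_bar   # individuals chart lower control limit
ucl_mr = 3.267 * mr_bar         # moving-range chart upper control limit
signals = [x for x in assays if x > ucl_x or x < lcl_x]
print(f"X-bar={x_bar:.2f}  UCL={ucl_x:.2f}  LCL={lcl_x:.2f}  signals={signals}")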

Apply system thinking

Every effort to improve quality requires knowledge of the tools, expertise of the process under study, educated guesses, and luck, hopefully not in equal portions.

Under the heading ‘expertise of the process under study,’ it’s important to understand the interactions among surrounding processes. Ignore these interactions and one process is improved while others degrade.

Ensuring each process improvement is ‘compatible’ with surrounding processes requires stepping back and considering the overall system and its many subsystems.

Dr. Bernard McGarvey, Eli Lilly’s (Indianapolis, Ind.) manufacturing process improvement implementation leader, believes an effective quality improvement process makes use of every available tool, not just a favorite few. To support Eli Lilly’s continuous improvement activities, Dr. McGarvey promotes a process known as ‘System Thinking.’

System thinking has been around for at least 15 years, providing a methodology that leads to the appropriate application of other quality improvement tools, including knowledge management systems. If expert systems, Six Sigma, Kaizen, design of experiments, SPC, etc., represent some of the available tools, then system thinking represents the toolbox.

‘Simply stated, system thinking defines a system, identifies the input and output variables, considers the relationships of the variables on one another and on other systems in order to identify the system’s behavior. As more is learned about the system’s behavior, the more is learned about its variable relationships and influence on surrounding systems, thus the more is learned about system behavior, and on-and-on,’ explains Dr. McGarvey.

Applying system thinking to improve process knowledge and quality requires:

Creation of a model of the systems; and

Recognition that models change over time.

According to Dr. McGarvey, ‘A model of every process exists somewhere. Those models may be in the heads of engineers, operators, and technicians, or they may be among original process design documentation. The point is, process models, whether they are correct or not, exist and are being applied every day. The key to continuous improvement is to document, collect data about, analyze, adjust, and validate those models. That means you may have to start with a very high-level model, rely on small amounts of data, and work towards increasing levels of detail, creating numerous sub-models, collecting more data, and adding additional analysis ‘tools’ along the way.’
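What that iterative model-building might look like in code is sketched below; the SystemModel class, the crystallizer example, and every variable name are hypothetical illustrations, not anything Eli Lilly actually runs.

# Hedged sketch: a minimal bookkeeping object for 'system thinking'. It records
# a system's input and output variables, the assumed cause-and-effect links
# between them, and the observations collected so far. All names are made up.
from dataclasses import dataclass, field

@dataclass
class SystemModel:
    name: str
    inputs: list[str]
    outputs: list[str]
    relationships: list[tuple[str, str]] = field(default_factory=list)
    observations: list[dict] = field(default_factory=list)

    def record(self, sample: dict) -> None:
        # Collect data against the model; analysis tools work on this later.
        self.observations.append(sample)

# Start with a very high-level model and small amounts of data ...
crystallizer = SystemModel(
    name="crystallizer",
    inputs=["feed_temp", "cooling_rate", "seed_mass"],
    outputs=["crystal_size", "yield"],
    relationships=[("cooling_rate", "crystal_size"), ("seed_mass", "yield")],
)
crystallizer.record({"feed_temp": 62.0, "cooling_rate": 0.5, "seed_mass": 1.2,
                     "crystal_size": 180, "yield": 0.91})
# ... then add sub-models, more data, and more analysis tools as detail grows.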

Tom Pearson, vice president of Praedictus (Indianapolis, Ind.), adds, ‘You also must understand that processes are dynamic and today’s key input and output variables may not be tomorrow’s key variables. One problem I repeatedly see is, once a process is improved, people stop measuring. What really needs to happen is you must focus on what’s important and constantly analyze to ensure you understand what’s becoming important. That means you need to collect and analyze data all the time.’

‘Consider a process with an overhead vacuum system, or a shell and tube heat exchanger. Over time, the efficiency of these systems deteriorates due to fouling. Control strategies developed to manage these systems require periodic re-tuning because process dynamics changed. Ignore the impact of efficiency deterioration, and batch process cycle-times increase, possibly influencing product quality and/or yield,’ Mr. Pearson says.
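To make the fouling example concrete, the sketch below estimates an exchanger’s overall heat-transfer conductance (UA) from routine flow and temperature measurements; a downward trend in UA is one simple way to catch the efficiency deterioration Mr. Pearson describes. The numbers are hypothetical, and the calculation assumes counter-current service.

# Hedged sketch: UA (kW/K) estimated from hot-side duty and log-mean
# temperature difference. Flow in kg/s, cp in kJ/kg-K, temperatures in deg C;
# all values below are illustrative, not from any real exchanger.
import math

def lmtd(hot_in, hot_out, cold_in, cold_out):
    dt1, dt2 = hot_in - cold_out, hot_out - cold_in
    return dt1 if dt1 == dt2 else (dt1 - dt2) / math.log(dt1 / dt2)

def ua_estimate(m_dot, cp, hot_in, hot_out, cold_in, cold_out):
    duty = m_dot * cp * (hot_in - hot_out)            # hot-side duty, kW
    return duty / lmtd(hot_in, hot_out, cold_in, cold_out)

print(ua_estimate(2.0, 4.18, 95, 60, 25, 55))  # early in the run: UA ~7.8
print(ua_estimate(2.0, 4.18, 95, 70, 25, 48))  # weeks later, fouled: UA ~4.5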

Recognizing and acknowledging that process dynamics change over time also implies that the associated models of that process (e.g., controller tuning constants) change over time. Accepting that systems are in a constant state of flux makes quality improvement an ongoing activity. Anyone approaching quality improvement as a project (with defined start and end dates) is bound to be disappointed with the long-term results.

Product quality evaluations

Products regulated by the U.S. Food and Drug Administration (FDA, Rockville, Md.) must undergo periodic product quality evaluations (PPQE) to ensure quality remains consistent throughout the product’s lifecycle.

While the FDA mandates PPQE for regulated products, other producers conduct similar product quality audits, so the need to conduct in-depth PPQE touches every industry.

As long as product quality checks out, a PPQE is no big deal. It becomes a big deal, especially in batch processes, when a product’s quality isn’t up to specification and the problem isn’t discovered until well into the production cycle.

In a report titled ‘The gap between batch systems and user requirements,’ John Blanchard, an ARC Advisory Group (Dedham, Mass.) analyst, says, ‘From a plant operations perspective, most products are not ‘product’ until they are processed through batch, continuous, and packaging operations with quality checks in various stages of the process.’

It’s also true that these finished, out-the-door products are often produced from ‘intermediate’ products made weeks or months earlier.

This means performing a PPQE requires analysis of the product’s genealogy, including such things as raw materials, processing activities, and warehousing duration (shelf life) of intermediates used to create the product.
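At its core, such a genealogy analysis is a tree walk from the finished lot back through its intermediates to the raw-material lots. The sketch below, with entirely hypothetical lot numbers, dates, and shelf lives, shows the idea and flags any intermediate that sat in the warehouse longer than its shelf life.

# Hedged sketch: trace a finished lot's genealogy. Each entry maps a lot to the
# lots consumed to make it, its production date, the date it was consumed, and
# its shelf life in days. All identifiers and dates are made up.
from datetime import date

genealogy = {
    "FG-1001": (["INT-204", "INT-207"], date(2002, 6, 3), None, None),
    "INT-204": (["RM-88"], date(2002, 3, 1), date(2002, 6, 2), 60),
    "INT-207": (["RM-91"], date(2002, 5, 20), date(2002, 6, 2), 60),
    "RM-88":   ([], date(2002, 2, 10), date(2002, 2, 28), 365),
    "RM-91":   ([], date(2002, 5, 1), date(2002, 5, 18), 365),
}

def trace(lot, depth=0):
    parents, produced, consumed, shelf_life = genealogy[lot]
    note = ""
    if consumed and shelf_life and (consumed - produced).days > shelf_life:
        note = "  <-- exceeded shelf life"
    print("  " * depth + f"{lot} (produced {produced}){note}")
    for parent in parents:
        trace(parent, depth + 1)

trace("FG-1001")  # flags INT-204: 93 days in storage against a 60-day limit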

Interestingly, statisticians conducting PPQE argue they never have enough data. Besides the traditional key process indicators of pressures, temperatures, lab assays, etc., statisticians often seek to determine if a product’s quality problem can be traced to weather patterns, time-in-storage, shifts and workers on a shift, equipment, routing, and numerous other possible quality influences.

I need more data

Here is the problem often facing far-reaching statisticians: Quality-related product data aren’t being captured, aren’t readily available, and/or aren’t in a usable format, thus creating gaps between current capabilities and user requirements.

Over the past 15 to 20 years, plant control systems have been installed alongside a variety of plant information management systems (PIMS), including Aspen Technology’s (Cambridge, Mass.) InfoPlus21, Intellution’s (Foxborough, Mass.) iHistorian, Invensys’ (Foxboro, Mass.) AIMStar, OSI Software’s (San Leandro, Calif.) PI, and Wonderware’s (Lake Forest, Calif.) IndustrialSQL, as well as a host of PIMS products developed by control system suppliers.

According to AMR Research (Boston, Mass.) analyst, Leif Eriksen, ‘Information available from traditional PIMS products is being encompassed into a new class of applications called Enterprise Manufacturing Intelligence (EMI).’

According to Mr. Eriksen, EMI solutions can be custom built on top of existing PIMS products, or purchased from companies such as Executive Manufacturing Technology (London, Ontario, Canada), IndX Software (Aliso Viejo, Calif.), Lighthammer Software (Exton, Pa.), Mountain Systems (Green Bay, Wis.), Real World Technology (Chicago, Ill.), or Verticore (Salt Lake City, Ut.).

The underlying goal of EMI solutions is to gather data in numerous formats from numerous sources on numerous platforms, make the data appear to flow from a single source, organize the data so as to be associated with specific processing events, and use Web browsers to deliver data as information where needed. (See, ‘Typical plant control and information system after 10+ years’ diagram.)
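Stripped of vendor specifics, that goal amounts to merging per-batch records from several systems into one event-centric view. The sketch below is a hypothetical illustration only; the source names, field names, and batch IDs do not correspond to any product mentioned in this article.

# Hedged sketch: merge records from a historian, a batch event journal, and a
# LIMS export into one dictionary per batch, the kind of unified view an EMI
# layer would hand to a Web front end. All data and field names are made up.
historian = [
    {"batch": "B-4712", "tag": "R101.TEMP_MAX", "value": 82.4},
    {"batch": "B-4713", "tag": "R101.TEMP_MAX", "value": 91.0},
]
event_journal = [
    {"batch": "B-4712", "phase": "REACT", "duration_min": 142},
    {"batch": "B-4713", "phase": "REACT", "duration_min": 188},
]
lims = [
    {"batch": "B-4712", "assay_pct": 99.2},
    {"batch": "B-4713", "assay_pct": 96.1},
]

def unified_view(*sources):
    merged = {}
    for source in sources:
        for rec in source:
            merged.setdefault(rec["batch"], {}).update(
                {k: v for k, v in rec.items() if k != "batch"})
    return merged

for batch, data in unified_view(historian, event_journal, lims).items():
    print(batch, data)  # a browser-based display would render this, not print it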

A recent demonstration of a batch industry EMI solution made use of Intellution’s infoAgent and a few of Mountain Systems’ Proficy products.

The demo used infoAgent to gather data from several traditional PIMS sources and Proficy’s event historian. Using data from Proficy’s genealogy and quality modules, infoAgent organized the information and displayed it on a Web browser to help clarify questions such as how:

Did that last batch of raw material contribute to making a product that wasn’t up to specification?

Can two teams that appear to be equal produce products that are so different? or

Does product produced in two seemingly identical dryers end up so different?

Unfortunately, in most plants one question remains: are all the data available to make these sorts of informed decisions?

All the data, all the time

In many industries, especially among pharmaceutical drug producers, data collection must begin during R&D and extend through clinical trials and pilot plant production, eventually scaling up and reaching manufacturing, possibly years later.

Anyone involved in early product development activities recognizes how sparse data can be and how significant correlations often don’t become apparent until later in the product’s development cycle. That means every bit of data must be collected and treated as precious. It also means computerized data analysis, such as that provided by Aegis’ (Lafayette, Colo.) Discoverant and Praedictus’ Process Management System (PMS), can be quite useful in early product development.

Discoverant software provides direct, read-only access and conditioning of discrete, continuous, and replicate data from a variety of data sources, placing the data in process-centric views best suited for analysis of the process under study.

Praedictus’ PMS software is designed to collect and digest large volumes of data, automatically analyze the data, and provide the earliest possible prediction of where a process is headed.

When even small amounts of data become information, Dr. McGarvey’s earlier statement begins to ring true: ‘As more is learned about the system’s behavior, the more is learned about its variable relationships and influence on surrounding systems, thus the more is learned about system behavior, and on-and-on.’

Of course, the ultimate goal is to minimize the time between a product’s discovery and when the product reaches the revenue-generating phase, where the keys to success are efficiently and repeatedly producing quality product.

Throughout the product development cycle, system thinking applies and Dr. McGarvey’s statement about a need to continuously refine process models is repeatedly reinforced.

Bring in the models

Continuous process producers recognize computers are often better at keeping a process in an optimized steady-state condition than human operators. In fact, many continuous processes close the loop between optimization models and the control system, allowing the software to ‘tweak’ controller setpoints.

By design, batch processes rarely operate in a ‘classical’ steady state for long periods. The result is that few batch processes have successfully closed the loop between models and controller setpoints.
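For readers unfamiliar with what ‘closing the loop’ means here, the sketch below shows the general shape of such a supervisory routine: read current values, ask the model for a better setpoint, clamp it, and write it back. The read/write functions, tag names, and limits are placeholders, not any real control-system API.

# Hedged sketch of a supervisory optimization step. Every function below stands
# in for a real interface (historian read, optimizer call, setpoint write); the
# tags, values, and limits are hypothetical.
def read_pv(tag):
    return 162.7                         # placeholder for a historian/OPC read

def optimizer_suggests(pv):
    return pv + 5.0                      # placeholder for the optimization model

def write_sp(tag, value):
    print(f"write {tag} = {value:.1f}")  # placeholder for the setpoint write

SP_LOW, SP_HIGH, MAX_STEP = 150.0, 180.0, 2.0  # hypothetical clamp limits

def supervisory_step(pv_tag="COL1.TRAY_TEMP", sp_tag="COL1.TRAY_TEMP.SP",
                     current_sp=165.0):
    target = optimizer_suggests(read_pv(pv_tag))
    target = max(SP_LOW, min(SP_HIGH, target))                               # absolute limits
    target = max(current_sp - MAX_STEP, min(current_sp + MAX_STEP, target))  # small moves only
    write_sp(sp_tag, target)             # operators can still see and override this
    return target

supervisory_step()  # writes COL1.TRAY_TEMP.SP = 167.0 in this toy example

In a batch setting, the same pattern is complicated by the fact that the ‘right’ setpoint changes with the phase of the recipe, which is largely why such loops remain rare.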

That’s not to say models can’t be effective or useful in batch processes; it’s just that batch tools from companies such as Aspen Technology, Hyprotech (Calgary, Alberta, Canada, now part of Aspen Technology), Batch Process Technology (West Lafayette, Ind.), BioComp Systems (Redmond, Wash.), and Intelligen (Scotch Plains, N.J.) have only recently reached the level of sophistication batch users require.

Companies developing batch tools have designed their offerings around themes of product life-cycle management and/or production optimization.

Product life-cycle management tools, such as Hyprotech’s Batch Design Kit (BDK), are designed with the understanding that product development can occur over long time periods and often won’t be completed by the same people who began the project.

Tools designed for product life-cycle management support modeling, simulation, scale-up, and scale-down. They also provide levels of rigor to enable researchers, designers, and engineers to iterate to a higher confidence level with minimal data. For example, a reactor modeling tool, such as Hyprotech’s BatchCAD, can be used with minimal data relatively early in the lifecycle with more rigorous simulation modes available as more data become available.

Beginning to emerge among tools designed around the product life-cycle management theme is the ability to systematically capture and share knowledge. Besides providing appropriate modeling tools, these offerings also capture project team knowledge for the benefit of future team members.

Production optimization tools, such as Hyprotech’s Process Tools and Process Manuals, are designed for specific process activities, such as crystallization, filtration, drying, and extraction, and include rigorous models built on the combined knowledge of recognized domain experts.

As the term implies, production optimization tools become most useful when a product reaches the pilot plant and manufacturing stages.

Similar to their continuous process cousins, batch production optimization tools are suited for closing the loop with batch unit control loops, but only if operators feel comfortable they know what’s going on and can ‘jump in’ anytime they want.

Getting operators comfortable

A significant benefit of today’s ‘open’ human-machine interface (HMI) products, such as Iconics’ (Foxborough, Mass.) Genesis32, Intellution’s iFix, GE Fanuc’s (Charlottesville, Va.) Cimplicity, USDATA’s (Richardson, Tex.) FactoryLink, or Wonderware’s InTouch, is their ability to make operators feel comfortable and more informed at the same time.

Almost without exception, open HMIs make extensive use of OPC (OLE for process control) and Microsoft’s DNA architecture, which includes COM, DCOM, VBA, and Web-enabling ActiveX technology, to mix control system and performance/business information on the same graphic display.

For example, the simulated results of a crystallization model and appropriate quality control charts can be displayed on the same HMI graphic displays as the control interface elements used to operate the crystallizer.

In his book ‘Bottom-Line Automation,’ Invensys’s vice president of strategic initiatives, Dr. Peter Martin, explains the need to converge technology, quality, and accounting information to produce dynamic performance monitoring measures (DPM) on operator graphic displays.

In a separate interview, Dr. Martin explained that DPM development for batch processes is tricky and requires operator training and patience for effective use. However, Dr. Martin also said that developing and placing DPM indicators on batch processing HMI graphics can prove very beneficial because batch products tend to be high value and/or business drivers necessitate reducing batch cycle-times.

From a product’s life-cycle perspective, the ‘ultimate’ batch system solution doesn’t exist because there are too many unknowns.

What does exist is the truism that applying the wrong methodology, interpreting the wrong variables, and/or using the wrong control charts and analysis methods, can do more harm than good.

That’s why it’s important for those in the chemical industry, and especially those with batch processes, to understand how different quality improvement tools, including modeling, SPC/SQC, and the Six Sigma process, can be assembled under the ‘system thinking’ banner to help avoid surprises and capitalize on opportunities.

Comments? E-mail dharrold@reedbusiness.com.

Ignoring quality problems is expensive

1999: Abbott Laboratories (Chicago, Ill.) agrees to pay $100 million in fines for repeated product quality-related violations.

2002: After spending millions of additional dollars to improve quality measurements and controls, Abbott Laboratories learns it hasn’t achieved the product quality assurance levels required to obtain FDA approval for a new product.

2002: Schering-Plough (Kenilworth, N.J.) agrees to pay $500 million in fines for quality-related problems identified by the FDA.

It doesn’t end there.

When companies pay FDA-imposed fines, they also agree to a schedule to improve quality measurements and controls. Failure to meet the schedule can bring additional per-day fines; in one case, $15,000 per day.

If product quality problems persist after paying initial and per-day fines, the courts can begin taking a percentage of the gross profits generated by the sale of such products. In one recent case the percentage was 16%.

Won’t happen to us!

Perhaps your company doesn’t compete in the tightly regulated world of the FDA. However, there is always someone looking over your shoulder and evaluating your product’s quality in one way or another.

We need look no further than the Ford/Firestone tire problems to find convincing evidence that product quality problems have a negative impact on resources and profits.

Ask yourself, ‘At this very moment, is product quality adding to or subtracting from my company’s profits?’

If you don’t know the answer to that question, you need to adjust quality-related priorities within your company.