Project: Biopharmaceutical filtration automation (December 12, 2005)

By Control Engineering Staff December 13, 2005


Project Status Summary:

1. Control software development code complete 100%
2. HMI software code complete 100%
3. HMI external application configuration 100%
4. MF beta testing 90%
5. UF beta testing 90%

Historical Data
Current automation systems are capable of generating large volumes of data that can be captured, stored, and analyzed. Even a small system like the one used for this project can generate many gigabytes of data over a few months of operation. Collecting and storing this data has become increasingly inexpensive as hard drive capacity increases and prices decrease.

For this project, our system will tie into the plant's existing continuous historian, GE Proficy Historian. Event-based data such as alarms, system messages, and operator actions will be collected in a Microsoft SQL Server database. Continuous data will be presented in standard iFix HMI trend displays. Presentation of event-based data is a bit more challenging because of the depth and richness of the data compared to continuous data. Consider that each value stored in a continuous database includes only a tag, parameter, value, and timestamp. An alarm event, by contrast, may include tag, parameter, alarm type, alarm priority, plant area, active time, cleared time, acknowledge time, value when the alarm became active, alarm setpoint when active, time in alarm, min/max/avg value while in alarm, acknowledging username, and the workstation where the alarm was acknowledged.
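To make the difference in record shape concrete, the sketch below models the two record types as Python dataclasses. The field names follow the lists above; the class and attribute names themselves are illustrative, not taken from the actual historian or database schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ContinuousSample:
    """One value in the continuous historian: tag, parameter, value, timestamp."""
    tag: str
    parameter: str
    value: float
    timestamp: datetime

@dataclass
class AlarmEvent:
    """One consolidated alarm event; far richer than a continuous sample."""
    tag: str
    parameter: str
    alarm_type: str
    alarm_priority: str
    plant_area: str
    active_time: datetime
    cleared_time: Optional[datetime]
    acknowledge_time: Optional[datetime]
    value_at_active: float
    setpoint_at_active: float
    time_in_alarm_s: Optional[float]
    min_in_alarm: Optional[float]
    max_in_alarm: Optional[float]
    avg_in_alarm: Optional[float]
    acknowledged_by: Optional[str]
    acknowledged_at_workstation: Optional[str]
```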

Alarm History
iFix includes an alarm ODBC service that logs alarm events to an ODBC-compliant database, such as SQL Server. For each event that occurs in the alarming subsystem, a record is created in the database. For example, when an alarm becomes active, a new record is added. When an operator acknowledges the alarm, a separate record is added to the database, unrelated to the active event. The same is true when the alarm clears: yet another unrelated record is created. We have developed SQL stored procedures that combine these three separate events into a single record in a table of our own design. The result is that alarm data are significantly compressed (three events combined into one) and each record contains all pertinent information about an alarm. On batch systems, the alarm is also related to a batch or phase, simplifying batch analysis and reporting.
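The consolidation logic can be sketched roughly as follows. This is a simplified Python/pyodbc illustration rather than the actual T-SQL stored procedures; the table and column names (AlarmHistory, ActiveTime, and so on) are hypothetical, and a production version would live in the database itself.

```python
import pyodbc

def consolidate_alarm_event(cnxn: pyodbc.Connection, event: dict) -> None:
    """Fold a raw ODBC alarm record into a single row per alarm occurrence.

    An 'active' event starts a new row; 'ack' and 'clear' update the open row
    for the same tag, so three raw events end up as one consolidated record.
    (Simplified: a real version would also key on plant area, batch, etc.)
    """
    cur = cnxn.cursor()
    if event["status"] == "active":
        cur.execute(
            "INSERT INTO AlarmHistory (Tag, Parameter, AlarmType, Priority, ActiveTime, ValueAtActive) "
            "VALUES (?, ?, ?, ?, ?, ?)",
            event["tag"], event["parameter"], event["alarm_type"],
            event["priority"], event["timestamp"], event["value"],
        )
    elif event["status"] == "ack":
        cur.execute(
            "UPDATE AlarmHistory SET AckTime = ?, AckBy = ?, AckWorkstation = ? "
            "WHERE Tag = ? AND AckTime IS NULL",
            event["timestamp"], event["username"], event["workstation"], event["tag"],
        )
    elif event["status"] == "clear":
        cur.execute(
            "UPDATE AlarmHistory SET ClearedTime = ? "
            "WHERE Tag = ? AND ClearedTime IS NULL",
            event["timestamp"], event["tag"],
        )
    cnxn.commit()
```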

Operator Logs
Our custom user interface applications for control modules, equipment modules, and operator prompts include functionality for logging actions performed by an operator. For example, when an operator opens a valve from a control module or equipment module faceplate, an event is logged that includes the timestamp, module tag, module description, parameter, username, workstation, old value, new value, and a user-configurable custom message associated with the parameter. When operators are prompted to perform an action or enter data as part of an automated sequence, the same type of data is logged.
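A minimal sketch of how one such operator-action record might be built and handed to a message queue is shown below. It uses Python with the pywin32 COM bindings for MSMQ; the queue path, field names, and function name are assumptions for illustration, not the project's actual configuration.

```python
import json
from datetime import datetime
import win32com.client  # pywin32; MSMQ is accessed through its COM objects

# MSMQ access-mode constants defined by the MSMQ COM API
MQ_SEND_ACCESS = 2
MQ_DENY_NONE = 0

def log_operator_action(tag, description, parameter, old_value, new_value,
                        username, workstation, message):
    """Build an operator-action event and post it to a local MSMQ queue."""
    event = {
        "timestamp": datetime.now().isoformat(),
        "tag": tag,
        "description": description,
        "parameter": parameter,
        "old_value": old_value,
        "new_value": new_value,
        "username": username,
        "workstation": workstation,
        "message": message,
    }

    queue_info = win32com.client.Dispatch("MSMQ.MSMQQueueInfo")
    queue_info.PathName = r".\private$\operator_events"   # hypothetical queue name
    queue = queue_info.Open(MQ_SEND_ACCESS, MQ_DENY_NONE)

    msg = win32com.client.Dispatch("MSMQ.MSMQMessage")
    msg.Label = "OperatorAction"
    msg.Body = json.dumps(event)
    msg.Send(queue)
    queue.Close()
```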

The data can be logged either to a text file or to a message queue using Microsoft Message Queuing (MSMQ). We use MSMQ for asynchronous, queued data transport across the network; programming against it is simple, and it moves data across the network reliably. MSMQ is a service provided with the Windows XP operating system. Message queuing automatically queues messages for later delivery if a destination computer is not reachable, and forwards them automatically once the target computer becomes available. The target computer for our application is the SQL Server used for event-based historical logging. We have developed an application that runs as a Windows service, removing messages from a message queue and moving the information into a SQL Server database. Together, these applications provide reliable, asynchronous, and disconnected transfer of data across the network into a historical relational database without requiring the source application to hold a direct database connection. Logging data with this design is extremely robust: plant operations are never interrupted simply because a database server is temporarily unreachable.
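The receiving side could be sketched along these lines, again in Python with pywin32 and pyodbc rather than the actual Windows-service implementation. The queue path, table, and column names are assumed for illustration, and a real service would wrap this loop in Windows-service plumbing and error handling.

```python
import json
import pyodbc
import win32com.client

# MSMQ constants from the MSMQ COM API
MQ_RECEIVE_ACCESS = 1
MQ_DENY_NONE = 0
MQ_NO_TRANSACTION = 0
RECEIVE_TIMEOUT_MS = 5000

def drain_queue_to_sql(queue_path: str, odbc_connection_string: str) -> None:
    """Remove operator-event messages from an MSMQ queue and insert them into SQL Server."""
    queue_info = win32com.client.Dispatch("MSMQ.MSMQQueueInfo")
    queue_info.PathName = queue_path
    queue = queue_info.Open(MQ_RECEIVE_ACCESS, MQ_DENY_NONE)

    cnxn = pyodbc.connect(odbc_connection_string)
    cur = cnxn.cursor()

    while True:
        # Receive(Transaction, WantDestinationQueue, WantBody, ReceiveTimeout)
        msg = queue.Receive(MQ_NO_TRANSACTION, False, True, RECEIVE_TIMEOUT_MS)
        if msg is None:          # timeout: queue is empty for now
            break
        event = json.loads(msg.Body)
        cur.execute(
            "INSERT INTO OperatorEvents (EventTime, Tag, Parameter, OldValue, NewValue, "
            "Username, Workstation, Message) VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
            event["timestamp"], event["tag"], event["parameter"],
            str(event["old_value"]), str(event["new_value"]),
            event["username"], event["workstation"], event["message"],
        )
        cnxn.commit()            # commit per message so a failure loses at most one record

    queue.Close()
    cnxn.close()
```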

What’s Next?
The customer will visit our San Francisco office to review the automation application. A few testing activities will also take place during the customer review. As always, there may be a few more customer-requested modifications once they have had a chance to view and run the application against the simulated process.