Analyze data quality for process control systems

Data obtained from process analyzers needs to be trusted and verified to help the user make an informed decision. Considerations include communications reliability, analyzer status, and the logic platform.

By Norm Kincade, Maverick Technologies November 2, 2018

With new advances in analytical instrumentation, process plants increasingly are using analyzers to help improve operational efficiencies for process control systems. An important part of using analyzer results in a process control system is understanding and trusting the data received from the analyzer.

The information a process analyzer provides can be viewed as a data hierarchy. Each analytical device may contain one or more sub-controllers; each sub-controller manages one or more streams; each sample stream is analyzed to generate one or more measured component values.

The hierarchy explicitly defines the association between data items. If stream 1 is disabled, it is immediately apparent which components will be affected. Similarly, if a sub-controller is turned off, the data hierarchy defines the streams and components that will be affected.
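The hierarchy can be sketched as a simple data model. This is a hypothetical illustration (the class and field names are assumptions, not from any particular analyzer interface), showing how disabling a stream or sub-controller immediately identifies the affected component values:

```python
from dataclasses import dataclass, field

# Hypothetical model of the analyzer data hierarchy described above:
# device -> sub-controllers -> streams -> measured components.

@dataclass
class Component:
    name: str

@dataclass
class Stream:
    name: str
    online: bool = True
    components: list = field(default_factory=list)

@dataclass
class SubController:
    name: str
    enabled: bool = True
    streams: list = field(default_factory=list)

def affected_components(sub: SubController) -> list:
    """List components whose values become suspect when the
    sub-controller is turned off or one of its streams is disabled."""
    if not sub.enabled:
        # Sub-controller off: every downstream component is affected.
        return [c.name for s in sub.streams for c in s.components]
    # Sub-controller on: only components of offline streams are affected.
    return [c.name for s in sub.streams if not s.online for c in s.components]
```

For example, if stream 1 is disabled, `affected_components` returns only stream 1's components; turning the sub-controller off returns every component it manages.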

Understanding the data quality starts with the communication between the analyzer and the process control system. If there are no major communication faults, check the analyzer status: is the analyzer running, and does it have a fault? For stream 1 component 1 to be "valid," the analyzer must be running with no faults and stream 1 must be online.

Other facets of data quality include asking:

  • What should happen to the component values during a maintenance cycle?
  • Should the process control system hold the last value while an analyzer validation cycle is performed?
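The validity rule described above can be expressed as a single Boolean combination. This is a minimal sketch; the signal names are assumptions, and a real implementation would map them to the specific status registers the analyzer exposes:

```python
def component_valid(comm_ok: bool, running: bool, fault: bool,
                    stream_online: bool) -> bool:
    """A component value is 'valid' only when communications are healthy,
    the analyzer is running without faults, and its stream is online."""
    return comm_ok and running and not fault and stream_online
```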

Calibration cycle

A calibration cycle could introduce a step change in the analytical results from a gas chromatograph. One option is to set the values to not-a-number (NaN) during calibration and re-initialize the control algorithms once the analyzer returns to normal operation. The data quality logic should also check for a timeout.
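Both ideas can be combined in one quality filter. The function below is a sketch under stated assumptions (the argument names and the 120-second timeout are illustrative, not from the article): it returns NaN while the analyzer is calibrating or when no fresh result has arrived within the timeout, so downstream control re-initializes rather than acting on a step change or stale data:

```python
import math

def quality_filter(value: float, calibrating: bool,
                   last_update: float, now: float,
                   timeout_s: float = 120.0) -> float:
    """Return NaN during a calibration cycle or after a communications
    timeout; otherwise pass the measured value through unchanged."""
    if calibrating or (now - last_update) > timeout_s:
        return float("nan")
    return value
```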

An end-of-cycle event is used to notify external systems that the analysis is complete, and a consistent set of data has been stored in the registers. This event can be used to trigger processing of online control applications, such as an asynchronous proportional-integral-derivative (PID) block.
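The end-of-cycle notification can follow a simple publish/subscribe pattern. This is a minimal, hypothetical sketch (class and method names are assumptions): subscribers such as an asynchronous PID execution run only after a consistent result set has been latched:

```python
class EndOfCycleEvent:
    """Notify external consumers that an analysis cycle is complete
    and a consistent set of results is available."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a callable invoked with the latched results."""
        self._subscribers.append(callback)

    def fire(self, results: dict):
        """Signal end of cycle; deliver results to every subscriber."""
        for callback in self._subscribers:
            callback(results)
```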

Checking analyzer data quality can be performed in two steps. The first step verifies that communications and all the analytical equipment required to generate the results are operating normally. A clear understanding of the analyzer modes, fault signals, and error conditions is essential.

The next step verifies that the data meets reasonable criteria. The instrument may be able to detect concentrations outside the process limits, and the logic should confirm that each value falls within a range that makes sense for the process. There may also be physical limits on how fast a value can change, in which case a rate-of-change alarm can be implemented.
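The range and rate-of-change checks can be sketched as two small functions. The limits here are illustrative assumptions; real values would come from the process design:

```python
def within_range(value: float, lo: float, hi: float) -> bool:
    """Range check: the value must make sense for the process."""
    return lo <= value <= hi

def rate_ok(value: float, previous: float,
            dt_s: float, max_rate_per_s: float) -> bool:
    """Rate-of-change check: flag values that move faster than the
    process can physically change."""
    return abs(value - previous) / dt_s <= max_rate_per_s
```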

Adequate data quality checks can be performed in many ways. The implementation depends on the communications reliability, analyzer status information, and the platform available to implement the logic.

Norm Kincade is a senior engineer at Maverick Technologies, a CFE Media content partner. Edited by Chris Vavra, production editor, Control Engineering, CFE Media.


Keywords: Process control, process analyzer, data acquisition

  • Data obtained from a process analyzer needs to be understood and trusted.
  • Check the communication between the analyzer and the process control system.
  • Data should meet reasonable criteria during verification.

Consider this

What additional criteria should be considered for analyzing data from a process control system?

ONLINE extra

About the author

Norm has more than 35 years of experience working in process control systems and applications, with a special focus on subsystem interfaces and connecting analyzers, PLCs, and other third-party devices.