Coordinate calibration, verification to improve measurement accuracy
By leveraging smart instrumentation and coordinating calibration and verification, plant personnel can automate maintenance tasks and increase productivity.
Learning Objectives
- An organization’s competitive edge depends on accurate measurement and efficient production.
- Calibration and verification tests are often performed to ensure accuracy and reduce potential downtime.
- Modern smart sensors are often set up to store calibration records and some can even perform self-calibration.
The magnitude of measurement error differs depending on an instrument's type, function, and condition, and error often increases over time. As a result, these errors must be quantified to determine whether measurements remain reliable enough to serve their intended purpose.
Though calibration is critical for accurate, efficient and safe production, it can be time-consuming and expensive, result in downtime and cause incidents when handled improperly. To determine the best balance for measurement reliability, plant efficiency and process safety, it is critical to define each instrument’s accuracy requirements and how often the instrument must be calibrated.
Calibrating more frequently than necessary wastes resources and creates excess planned downtime, while calibrating too infrequently can adversely impact product quality, regulatory compliance and safety, and increase the likelihood of unplanned downtime.
Calibration versus verification
Calibration is a procedure for establishing a relationship between a quantitative measurement and a known reference. Verification is the provision of objective evidence, often expressed qualitatively, that a given measurement fulfills specified requirements.
The goal of a calibration test is to determine the measurement errors of a given device, referred to as the unit under test (UUT). Once the measurement errors become known, a UUT can be used as a reference to calibrate another instrument, and master meter proving is a common example of this practice. This chain of calibrations must always lead back to national or international primary references to create a guarantee of metrological traceability (Figure 1).
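To make the idea concrete, the short Python sketch below shows one way a totalizing flowmeter (the UUT) might be characterized against a traceable master meter, as in master meter proving. The proving-run values and the proving_results helper are hypothetical and purely illustrative.

```python
# Illustrative sketch (hypothetical values): characterizing a unit under test
# (UUT) against a traceable master meter, as in master meter proving.

def proving_results(master_totals, uut_totals):
    """Average meter factor (master volume / UUT volume) and average UUT error in percent."""
    factors = [m / u for m, u in zip(master_totals, uut_totals)]
    errors = [(u - m) / m * 100 for m, u in zip(master_totals, uut_totals)]
    return sum(factors) / len(factors), sum(errors) / len(errors)

# Totalized volumes (liters) from three proving runs.
master_runs = [1000.4, 999.8, 1000.1]
uut_runs = [1002.0, 1001.3, 1001.9]

meter_factor, avg_error = proving_results(master_runs, uut_runs)
print(f"Meter factor: {meter_factor:.5f}")        # applied to correct future UUT totals
print(f"Average UUT error: {avg_error:+.3f} %")   # becomes the documented calibration result
```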
The concept of verification is broader, but it most often refers to comparing calibration results against a specified requirement to determine compliance. The most common requirement used for verification is maximum permissible error (MPE), defined by a manufacturer, a metrology organization, an end user, or a regulatory authority. If a device's measurement errors determined through calibration are less than the MPE, the instrument fulfills the specified requirement and passes its verification test (Figure 2).
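A minimal sketch of that pass/fail comparison follows, assuming calibration errors have already been determined at several test points; the error values and the MPE shown are hypothetical.

```python
# Minimal sketch (hypothetical values): verifying a device by comparing its
# calibration errors at each test point against the maximum permissible error (MPE).

def verify(errors_pct, mpe_pct):
    """Return (passed, worst_error) for calibration errors expressed in percent of reading."""
    worst = max(errors_pct, key=abs)
    return abs(worst) <= mpe_pct, worst

calibration_errors = [-0.12, 0.08, 0.15, 0.21]  # errors at 25/50/75/100 % of span
mpe = 0.25                                      # +/- % of reading

passed, worst = verify(calibration_errors, mpe)
print(f"Worst error {worst:+.2f} % vs MPE +/-{mpe} % -> {'PASS' if passed else 'FAIL'}")
```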
MPE is often determined through a criticality assessment that considers the consequences of measurement error for each instrument. The MPE, combined with regulatory requirements, is often used to determine calibration frequency and procedures.
Instruments with a tight MPE must often be lab-tested under stringent conditions, but evaluators can make the case for less disruptive calibration methods, or even inline verification (Figure 3), for instrumentation with less strict MPEs.
For instruments that support it, inline verification improves plant availability by reducing calibration frequency, and it avoids the risk of transit damage and reinstallation mistakes that arises when instruments must be removed from the process. Additionally, this more frequent and easily accessible method of verification empowers operators to track an instrument's performance over time, as sketched below, providing early notice of measurement drift and reducing unscheduled downtime.
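As a simple illustration of that kind of trend monitoring, the sketch below flags an instrument whose inline-verification errors are approaching the MPE. The error history, alert threshold and function name are assumptions for illustration only.

```python
# Hypothetical sketch: trend successive inline-verification results and flag
# measurement drift before the instrument would fail a verification test.

def drift_alert(error_history_pct, mpe_pct, alert_fraction=0.8):
    """Alert when the latest verification error exceeds a fraction of the MPE."""
    return abs(error_history_pct[-1]) >= alert_fraction * mpe_pct

history = [0.02, 0.05, 0.09, 0.14, 0.21]  # errors (% of reading) from successive verifications
if drift_alert(history, mpe_pct=0.25):
    print("Drift approaching MPE -- schedule a calibration before the next run")
```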
Instrumentation calibration intervals
A calibration interval is defined as the time between two calibrations. Calibration intervals should be determined in alignment with instrument criticality and the risk of a measurement error drifting outside an acceptable range (Figure 4), but this is often not the case.
In practice, most calibration intervals are determined using reactive methods, in which intervals are initially set at a convenient frequency and then adjusted in response to data from previous calibrations. The problem with this approach is it makes no attempt to model or predict measurement reliability over time.
Reactive methods are typically less effective than statistical methods for establishing intervals that meet reliability objectives, and they typically take years to reach a steady state.
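One minimal example of a statistical approach, assuming roughly linear drift, is to fit a drift rate to past calibration errors and project when the error will reach a guard band below the MPE. The sketch below does this with hypothetical values; it is an illustration, not a substitute for a full measurement reliability model.

```python
# Minimal sketch (hypothetical values, linear-drift assumption): set a
# calibration interval by fitting a drift rate to past calibration errors and
# projecting when the error reaches a guard band below the MPE.

def projected_interval(days, errors_pct, mpe_pct, guard_band=0.8):
    """Least-squares drift rate (% per day) and the day the error reaches guard_band * MPE."""
    n = len(days)
    mean_t, mean_e = sum(days) / n, sum(errors_pct) / n
    slope = (sum((t - mean_t) * (e - mean_e) for t, e in zip(days, errors_pct))
             / sum((t - mean_t) ** 2 for t in days))
    intercept = mean_e - slope * mean_t
    return slope, (guard_band * mpe_pct - intercept) / slope

t = [0, 180, 365, 545]           # days since initial calibration
e = [0.02, 0.07, 0.12, 0.18]     # error (% of reading) observed at each calibration
rate, limit_day = projected_interval(t, e, mpe_pct=0.25)
print(f"Drift rate {rate:.5f} %/day; error projected to reach 80% of MPE near day {limit_day:.0f}")
```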
Flowmeters and mobile calibration
Legal requirements for regular checks on analytical sensors are commonly fulfilled with wet calibration, which entails immersing a sensor, such as a pH or conductivity probe, in a reference solution. In the case of a flowmeter, a known flow rate must be generated using process media.
This type of wet calibration is easily performed in a laboratory setting, where the environment is controlled, instruments are accessible, and the necessary equipment is at hand. But these conditions are usually not found in production systems, so the same calibration procedures executed in a lab often do not work in production environments.
Wet calibration has another downside in process plants because sensors must typically be removed from the process and sent to a laboratory for verification, then shipped back to the plant and reinstalled following calibration. During this process, damage from transport or mishandling may occur and remain undetected, leading to situations where recently calibrated instruments do not perform according to lab-verified results.
For minimal process impact, many facilities choose to calibrate temperature, pressure, and analytical instruments on site, while sending flowmeters to a lab for calibration due to the specialized equipment required. But there is another option for plant personnel: using accredited flow calibrators on site with a mobile calibration rig (Figure 5).
This type of onsite instrumentation calibration:
- Improves plant availability.
- Eliminates the need to disassemble and ship devices.
- Provides greater context for error sources compared to errors detected in a lab.
- Minimizes requirements for additional inventory and spare parts because calibrated instruments are out of and back in service very quickly.
Convenient and cost-effective, this mobile approach eliminates the need to ship instruments offsite. But technicians must still take precautions when opening primary process loops and deactivating electronics during instrument removal from service, and during rewiring and restoration of service.
Smart instrumentation, calibration certificates
Modern smart sensors are often set up to store calibration records, and some can even perform self-calibration. For example, modern pharmaceutical-grade steam-in-place temperature sensors incorporate a high-precision built-in reference based on the "Curie point": at a material-specific "Curie temperature," the reference undergoes an abrupt change in its ferromagnetic properties during steam-in-place sterilization cycles. This change can be detected electronically, triggering an automated process value checkpoint when the Curie temperature is reached (Figure 6).
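A hedged sketch of how such a checkpoint might be evaluated is shown below: when the built-in reference signals its Curie temperature, the sensor reading captured at that instant is compared against the known reference value. The temperature, acceptance limit and function name are illustrative assumptions, not a description of any specific product's firmware.

```python
# Hypothetical sketch of a Curie-point self-calibration checkpoint: compare the
# sensor reading captured when the built-in reference reaches its Curie
# temperature against the known reference value. Values are illustrative.

CURIE_TEMPERATURE_C = 118.0   # example reference temperature of the built-in material
LIMIT_C = 0.35                # example acceptance limit for the deviation

def self_calibration_check(reading_at_curie_event_c):
    deviation = reading_at_curie_event_c - CURIE_TEMPERATURE_C
    return abs(deviation) <= LIMIT_C, deviation

ok, dev = self_calibration_check(reading_at_curie_event_c=118.21)
print(f"Self-calibration deviation {dev:+.2f} degC -> {'PASS' if ok else 'FAIL'}")
```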
Although self-calibration techniques are not possible for flowmeters, numerous other advances are streamlining calibration and verification procedures and increasing measurement integrity. These advances include:
- Instruments with removable sensors so wiring remains intact, decreasing electrical rewiring errors.
- Sensor-stored certificates of calibration, synchronized with handheld maintenance devices for review and audit trails.
- Pre-calibrated sensors that can be swapped quickly and easily, reducing downtime.
- In-situ testing methods, enabling inline verification.
- Built-in sensor self-diagnostics, providing advance failure warning.
- Smart transmitters that continuously scan for problems, alerting personnel when sensors require cleaning or calibration.
Calibration, verification advice
Key points on calibration and verification include:
- Calibration has a quantitative outcome.
- Typically, a verification report is issued in qualitative terms, such as a pass/fail criterion.
- Verification performed after a calibration provides the highest assurance of a device's measurement quality.
- Verification without calibration can be performed between two calibrations, often extending the required calibration interval.
- Traceability is required in certified calibration and verification procedures.
Many of today’s instruments possess built-in capabilities to diagnose problems, perform verification, and generate auditable reports, easing calibration- and verification-related tasks, and reducing manual intervention and plant downtime requirements. When online calibration is not feasible, fewer and quicker offline calibrations lead to better overall equipment effectiveness.
These smart instruments empower plant personnel to achieve maximum uptime, while maintaining regulatory compliance and upholding the highest degree of safety. An organization’s competitive edge depends on accurate measurement and efficient production, and by optimizing calibration procedures and using inline verification wherever possible, plant personnel can ensure profitable, compliant, and safe operation.
Adam Booth is the flow product marketing manager and Nathan Hedrick is a national product manager, both at Endress+Hauser. Edited by Chris Vavra, web content manager, Control Engineering, CFE Media and Technology, cvavra@cfemedia.com.
MORE ANSWERS
Keywords: calibration, process sensors, verification
CONSIDER THIS
How often do you perform tests on process instruments and what preventive steps do you take to reduce downtime?
ONLINE extra
Learn more at https://www.us.endress.com