Temperature sensor calibration

Sensor failure risks are significantly reduced if the temperature sensor can be calibrated in situ.

By Keith Riley February 15, 2019

The most common calibration standard operating procedure (SOP) for temperature sensors in the hygienic process industries is a two- or three-point calibration. Calibration can be performed in the facility's lab or near the insertion point. This is the state-of-the-art method for obtaining the clearest possible temperature curve for the sensor and detecting deviation, and it is in line with the expectations of auditors and regulatory authorities.

With the process temperatures seen in the hygienic industries, a standard resistance temperature detector (RTD) will last for many years. Even so, the biggest risk for a thermometer in a hygienic system is the calibration process. Opening the device, removing the insert, connecting and disconnecting power cables, introducing the thermometer into the oil bath or block calibrator, or transporting the thermometer to the laboratory increases the likelihood of mechanical damage.

Removing the sensor from the process or thermowell is the biggest reason for RTD failure. A related question is: “What is the best way to return the sensor to the exact same measuring position in the process after removing it for calibration?”


Temperature calibration timing

A major limitation of the traditional two- or three-point calibration is the length of the calibration cycle. How does the RTD perform between calibrations? Most facilities set cycle times based on an analysis of risk management and cost: more calibrations (shorter cycles) reduce the risk, but increase the expense.

Fewer calibrations (longer cycles) do the opposite. A sensor with self-calibration or self-verification capability provides continuous health monitoring of the RTD. Each time sterilization in place (SIP) occurs, the sensor performs a calibration without:

  • The need to halt the production process
  • Removing the RTD from the process
  • Any effort needed from the maintenance or metrology staff.

In this way, calibration immediately identifies RTD non-conformities, or eliminates the risk of them, before the scheduled calibration cycle is complete, ensuring the highest product quality.
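The cycle-time trade-off described above can be illustrated with a toy model. Every name and figure below is hypothetical, not from the article: assume each calibration has a fixed labor/downtime cost, while a drift that begins between calibrations goes undetected for half a cycle on average.

```python
# Toy model of calibration-interval economics; all figures are
# hypothetical and chosen only to illustrate the risk/cost trade-off.
def annual_cost(cycle_days, cal_cost=500.0,
                p_drift_per_year=0.2, loss_per_day=100.0):
    """Expected yearly cost for a given calibration interval.

    cal_cost:         labor/downtime cost of one calibration
    p_drift_per_year: chance the sensor drifts out of tolerance in a year
    loss_per_day:     daily cost of running on an out-of-tolerance sensor
    """
    calibrations_per_year = 365.0 / cycle_days
    # A drift that occurs mid-cycle goes unnoticed for cycle_days / 2
    # on average, so longer cycles raise the expected exposure.
    expected_drift_loss = p_drift_per_year * (cycle_days / 2.0) * loss_per_day
    return calibrations_per_year * cal_cost + expected_drift_loss
```

With these placeholder numbers the curve is U-shaped: very short cycles are dominated by calibration labor, very long ones by undetected drift, which is exactly the balance most facilities weigh when setting cycle times.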

The minimum temperature threshold necessary to initiate self-calibration is 118°C, according to one sensor’s design specifications. This most commonly occurs during an SIP cycle.

However, since not all hygienic applications employ sterilization in place, there are other ways to take advantage of self-calibration. One is to remove the sensor from the process and place it in a ceramic dry-block heat source heated past 118°C. Once this temperature is reached, the dry block can be turned off and allowed to cool with the sensor remaining inside. As the temperature falls below 118°C, self-calibration takes place, ensuring the RTD remains within tolerance while significantly reducing process downtime.
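The self-calibration principle described above can be sketched in code: a built-in reference fixes the true temperature at the moment the sensor cools through 118°C, and the RTD reading captured at that instant is compared against it. The function name, tolerance, and data layout below are hypothetical illustrations; the 118°C reference point is the only figure taken from the article.

```python
REFERENCE_POINT_C = 118.0  # self-calibration temperature cited in the article

def deviation_at_reference(rtd_readings, ref_fired, tolerance_c=0.3):
    """Compare the RTD reading captured when the built-in reference
    fires (as the sensor cools through 118 degC) against the known
    reference temperature.

    rtd_readings: RTD temperatures sampled during cool-down
    ref_fired:    parallel booleans; True at the sample where the
                  reference triggers
    Returns (deviation_c, within_tolerance), or None if the reference
    never fired (the process never exceeded 118 degC).
    """
    for rtd, fired in zip(rtd_readings, ref_fired):
        if fired:
            deviation = rtd - REFERENCE_POINT_C
            return deviation, abs(deviation) <= tolerance_c
    return None
```

For example, a sensor reading 117.9°C at the trigger instant shows a -0.1°C deviation, well inside a hypothetical 0.3°C tolerance, so no intervention would be needed before the next scheduled calibration.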

Keith Riley is national product manager for pressure/temperature, Endress+Hauser Inc. Edited by Mark T. Hoske, content manager, Control Engineering, CFE Media, mhoske@cfemedia.com.

KEYWORDS: Hygienic temperature sensor, sensor calibration

Sensor elements can fail when removed for calibration.

In-situ calibration reduces risk of sensor failure.

Product quality is ensured when in-place calibration is performed with sterilization cycles.


Would in-place calibration save time and resources in your processes?


Endress+Hauser provides more information about sensor products.

See related New Products for Engineers at www.controleng.com/NP4E
