Too Hot to Handle
Measuring temperatures accurately in manufacturing operations has become increasingly important in recent years. Excessive temperature changes can indicate process problems (and probably defective products) or some kind of existing or imminent equipment failure. At the same time, using thermocouples and other contact measurement techniques is often impractical. Targets have shrunk, increasing the error introduced by contact methods, where sensors drain heat from the objects they are examining and thereby disturb (or distort) the very systems they are trying to measure. (Heisenberg* was right.)
In addition, contact techniques touch their targets at particular locations. Non-contact alternatives can scan entire surfaces to determine which areas require further scrutiny, then examine those areas in greater detail. Since, in many cases, the location of hotspots and other potential defects varies from product to product, even the best thermocouple approach may simply not examine the correct spot. And, of course, many applications prohibit contact altogether.
Daniela Carillo, industry research analyst at Frost & Sullivan, notes that although contact sensors still offer a price advantage, non-contact sensors are more durable because they are not subject to contact wear.
With such factors accumulating in favor of non-contact temperature sensing, the transition toward these tools has accelerated since Control Engineering last addressed the topic in a June 2001 feature article on the emergence of non-contact temperature-sensing technology as an alternative to contact techniques (www.controleng.com/issues).
Better and better
In addition to the increasing demand for non-contact alternatives, the methods themselves have advanced. Sensors have shrunk, analysis hardware and software have improved dramatically, and prices for comparable performance have dropped. As a result, more and more manufacturers are choosing non-invasive temperature measurement. By all indications, this trend will continue.
All objects above absolute zero (0 K) emit infrared radiation (invisible light just beyond the visible red, with wavelengths between 0.7 and 1,000 microns) whose intensity corresponds directly to temperature. Measuring temperature accurately without touching an object requires detecting this emitted energy and compensating for any stray energy reflected from the surroundings. The amount of reflected energy depends on the object's emissivity, a measure of how efficiently it radiates compared with a perfect blackbody; the lower the emissivity, the more reflective the surface.
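The emitted-energy relationship described above can be sketched with the Stefan-Boltzmann law. The article itself gives no formulas; the constant and function below are standard radiation physics, not any particular instrument's calibration model.

```python
# Sketch of the Stefan-Boltzmann law: total power radiated per unit area
# grows with the fourth power of absolute temperature. Standard physics,
# not a specific instrument's calibration model.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiant_exitance(temp_k: float, emissivity: float = 1.0) -> float:
    """Power radiated per unit area (W/m^2) by a surface at temp_k."""
    return emissivity * SIGMA * temp_k ** 4

# Doubling the absolute temperature multiplies the radiated power by 16,
# which is why detectors resolve temperature differences so readily.
ratio = radiant_exitance(600.0) / radiant_exitance(300.0)  # -> 16.0
```

The fourth-power dependence is what makes radiometric detection practical: small temperature changes produce disproportionately large changes in emitted energy.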
The infrared temperature measurement graphic (page 34), with information from Omega Engineering, shows a basic instrument design. It includes a lens to collect the energy emitted by the target, a detector to convert that energy into an electrical signal, an emissivity adjustment, and an ambient-temperature compensator to reduce measurement error.
Selective filtering of the incoming IR signal has dramatically improved instrument performance. More sensitive detectors and more stable signal amplifiers make it possible to eliminate interference from extraneous environmental conditions. Using only light in the 8 to 14 micron range, for example, avoids interference from atmospheric moisture over long measurement paths. A wavelength of 7.9 microns is used to measure certain thin-film plastics, while 3.86 microns avoids interference from carbon dioxide and water vapor in flames and combustion gases. Accurate measurement with selective filtering requires choosing carefully between shorter and longer wavelength response, which in turn depends on the temperature range of the target object or process: peak energy shifts toward shorter wavelengths as temperatures rise.
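The wavelength shift mentioned above follows Wien's displacement law. The sketch below uses that textbook relation (again, the article states no formula) to show why band choice tracks target temperature.

```python
# Wien's displacement law: the wavelength of peak emission shifts shorter
# as temperature rises. Textbook relation, used here only to illustrate
# why detector band choice depends on target temperature.
WIEN_B_UM_K = 2897.77  # Wien's displacement constant, micron-kelvin

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength (microns) of peak blackbody emission at temp_k."""
    return WIEN_B_UM_K / temp_k

room = peak_wavelength_um(300.0)      # ~9.7 microns: inside the 8-14 window
furnace = peak_wavelength_um(1500.0)  # ~1.9 microns: favors short-wave sensors
```

A room-temperature target peaks squarely in the 8 to 14 micron atmospheric window, while a hot furnace peaks near 2 microns, which is why high-temperature instruments use short-wavelength detectors.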
Infrared temperature measurement enjoys wide use in process control. In steel mills and other operations where high temperatures prohibit contact techniques, and in applications involving high-voltage equipment or caustic solutions, the non-contact advantages are obvious. But even in relatively well-behaved processes, temperature mapping (as the accompanying photographs vividly illustrate) can pinpoint hotspots and other anomalies that may indicate processes not in equilibrium, and therefore signal the need for intervention or corrective action.
One increasingly popular application of non-contact temperature measurement is in predictive/preventive maintenance (PPM) programs. PPM uses the actual operation history of plant equipment and systems to schedule maintenance and thereby optimize overall plant operation.
Keith Mobley, CEO of Integrated Systems, Knoxville, TN, a predictive-maintenance consulting firm, describes a comprehensive PPM program as using the most cost-effective tools—including thermography and vibration screening—to determine actual operating conditions of critical plant systems and equipment. Basing maintenance schedules on the results of this analysis optimizes the availability of process machinery and greatly reduces maintenance costs.
According to Mobley, maintenance represents between 15% and 60% of the cost of manufacturing goods at individual companies, totaling a staggering $200 billion in the United States alone. He also estimates that fully a third of maintenance money is wasted on unnecessary or improperly conducted activities. Yet relatively simple proactive techniques, such as PPM, remain a rarity in most factories. Idtect, Paris, France, a predictive maintenance company, estimates that only 0.5% of the total maintenance expenditure goes into predictive maintenance, but they expect that number to increase dramatically over the next few years.
What's out there?
Non-contact measuring products fall into two broad categories.
Simple sensors and thermometers offer general information at a reasonable cost, but do not provide either high resolution or high measurement precision. An inexpensive thermal imager costing perhaps $1,500 might offer 256 thermal elements in its focal-plane array, sufficient for coarse profiling, but not for detailed work. A simple infrared thermometer costing a hundred or a few hundred dollars may measure temperatures and store values, but may not interface with a PC for longer-term data storage or for further analysis.
According to Jim Shields, product manager for Wavetek Meterman, many infrared measurement applications require scanning the surface of the target, but they may not require careful calibration or hard traceable values. In these situations, measurements are mostly comparative, examining the same place at regular intervals over some period of time. The instrument may monitor process conditions, looking for excessive temperature changes to indicate that the process is out of control.
Consider, for example, an application to check the bearings on a set of motors. If the bearings are wearing excessively or lubrication is insufficient, temperature will rise dramatically and be detectable even without precise measurements. In each case, actual value is less important than variation, which triggers some kind of action such as adjusting process parameters or shutting down the manufacturing line to re-lubricate bearings or replace them.
Similarly, in a large factory, examining the temperature of the air emerging from heating and cooling ducts determines the efficiency of environmental-control systems. Temperature stability can be critical in many applications—pharmaceuticals being a typical example. A collection of measurements from a scanning tool, even if not subjected to comprehensive software analysis, will nevertheless reveal problem areas.
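Comparative monitoring of the kind described above can be sketched as a simple drift check. The function, thresholds, and readings below are hypothetical illustrations, not any vendor's API.

```python
# Hypothetical drift check for comparative monitoring: absolute accuracy
# matters less than deviation from a baseline established at the same spot.
def exceeds_baseline(readings: list[float], baseline: float, band: float) -> bool:
    """True if the latest reading at a location drifts more than `band`
    degrees from its historical baseline."""
    return abs(readings[-1] - baseline) > band

# Made-up bearing-housing readings (degrees F) taken at regular intervals.
bearing_log = [68.2, 68.9, 69.1, 84.5]
needs_service = exceeds_baseline(bearing_log, baseline=68.0, band=10.0)  # True
```

Because only the deviation matters, even an uncalibrated scanner can trigger useful alarms, provided each location is always measured the same way.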
At the other end of the product spectrum are elaborate thermal imagers that can cost more than $35,000. These products typically offer a plethora of features, including high resolution—perhaps 20,000 thermal elements in the focal plane—and the necessary software to help manufacturers monitor temperatures precisely in critical processes or determine the likelihood of defects in equipment or in a product under inspection.
Between these two extremes, a new imager class is emerging that offers the measurement performance of the higher-priced products, but with fewer bells and whistles. Vendors of this middle class want to provide the performance of high-end systems, but by limiting features that affect after-measurement activities more than the measurements themselves they hope to keep prices down—often under the $10,000 "radar level" of many companies' capital-acquisition procedures.
According to Fernando Lisboa, director of marketing at Raytek Inc., a successful thermal imager for manufacturing applications must include the following features:
Temperature measurements (radiometric capability). Some instruments provide only the thermal image, sufficient for applications, such as firefighting and law enforcement, where the instrument need only show hot spots. Manufacturing operations, however, generally demand actual temperatures.
Fast-target scanning. Some models selling in the mid-range (around $13,000 to $15,000) measure temperature and provide associated thermal images, but lack fast-scanning capability. The average facility includes more than 2,000 locations to be inspected with varying degrees of criticality. Fast imaging allows a scan of one target and quick movement to the next location.
A full-featured thermal-analysis software package. Users need a way to organize and analyze thermal images and other instrument results for process improvement, troubleshooting, and equipment monitoring.
Training. Although thermal imaging is not "rocket science," getting the best possible performance requires knowing how to use the imager to its best advantage. In this context, training is not necessarily about how to use the instrument, but how to apply it in a target application. Such training programs need to be proactive rather than reactive, teaching users about solutions before encountering the problems.
The most common mistakes people make in measuring temperatures are to misunderstand "cone angle" and "emissivity," warns Shields. One common misconception is that measurement occurs at a single point, like the point of a laser beam. In fact, the measuring location is the base of a cone whose diameter is determined by the distance from the source and the so-called cone angle. If the cone angle is specified as 8:1, for example, then from a distance of 8 inches, the spot under scrutiny at the surface measures 1 inch in diameter. That same instrument from a distance of 16 inches would measure a 2-inch spot.
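The cone-angle arithmetic above reduces to a one-line calculation. The distance-to-spot ratio is how spot size is conventionally specified; the function name itself is illustrative.

```python
# Distance-to-spot (D:S) ratio: spot diameter grows linearly with distance.
def spot_diameter(distance: float, d_to_s: float = 8.0) -> float:
    """Diameter of the measured spot at `distance` (same units as distance),
    for an instrument with the given D:S ratio (8:1 in the article's example)."""
    return distance / d_to_s

spot_diameter(8.0)   # 1.0 inch spot from 8 inches away
spot_diameter(16.0)  # 2.0 inch spot from 16 inches away
```

The practical consequence: backing away from a small target means the spot soon grows larger than the target, and the reading averages in the surroundings.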
Low emissivity can prevent even the most precise measurement from representing the temperature of the object alone. A perfect blackbody (emissivity = 1.0) emits only its own heat and reflects nothing, while a surface with an emissivity of 0 would be a perfect reflector. An aluminum surface, for example, might have an emissivity of only 0.2 or 0.3. Objects that reflect energy from their environment make the measurement inherently less accurate and less reliable; liquid surfaces, too, can reflect infrared energy. Low emissivity may even compromise relative measurements, since the energy being reflected may vary from one measurement time to the next. Most critical measurement devices have emissivity fixed at around 0.95.
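The emissivity problem can be sketched with a standard gray-body correction: the detector sees the object's own emission plus reflected ambient energy, and the instrument must subtract the reflected term. This is the textbook model, not a particular vendor's compensation algorithm.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def object_temp_k(measured: float, emissivity: float, ambient_k: float) -> float:
    """Recover the object's temperature (K) from total measured exitance,
    assuming an opaque gray surface:
        measured = eps*sigma*T_obj**4 + (1 - eps)*sigma*T_amb**4
    """
    own = measured - (1.0 - emissivity) * SIGMA * ambient_k ** 4
    return (own / (emissivity * SIGMA)) ** 0.25

# Round trip: a 400 K aluminum part (emissivity 0.25) in a 295 K room.
eps, t_obj, t_amb = 0.25, 400.0, 295.0
measured = eps * SIGMA * t_obj ** 4 + (1 - eps) * SIGMA * t_amb ** 4
recovered = object_temp_k(measured, eps, t_amb)  # ~400.0 K
```

Note that at an emissivity of 0.25, three quarters of the detected energy is reflected ambient radiation, so any error in the assumed emissivity or ambient temperature is strongly amplified, which is why low-emissivity targets are so troublesome.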
It's in the software
As important as the measurement instruments themselves is the accompanying software. With thermometers and other simple systems, the "software" is an operator recording the readings on a clipboard for comparison with past and future results. More elaborate, vendor-supplied PC-based tools include a wide range of features.
Direct software control of the imaging system, for example, allows the capture and analysis of numerous areas of interest at high speeds. Ircon's image-analysis software lets users set emissivities in individual areas of interest and detects any loss of signal, increasing measurement accuracy and the quality of related analysis.
A playback control tool allows image sequence files captured from prior tests to be reviewed. By reviewing them at different speeds, users can determine the point of product failure or the time when a process exceeded specification limits.
Precise control of process temperatures is particularly important in certain applications. Production of medical equipment bags and IV bags, for example, requires careful control of sealing temperatures. Processes at temperatures that are either too high or too low can produce bags that leak or fail prematurely. Interactive software also permits setting up real-time alarms that can stop production lines under predetermined conditions.
Mickey Anderson, quality coordinator at Delphi Packard Electric in Clinton, MS, knows the importance of thermal imaging in a process environment first-hand. Delphi maintains over 200 injection-molding setups in its Clinton factory.
Heated or cooled water is sent to a thermolator in each setup to control the temperature of the plastic in the mold. The mix of hot and cold water determines the temperature of the mold. The heated and cooled water are delivered through four pipes. The correct temperature in each case depends on the plastic in the mold and the stage of the manufacturing process.
A situation arose recently where the thermolator on one of the molds did not maintain the right temperature. Under normal circumstances, the bottom pipe should be the coldest. A Flir Systems P-60 produced the "Four Pipes" thermograph (above), which shows that the plumber had routed the pipes incorrectly, so that the coldest pipe (shown as black) was second from the top. A simple replumbing of the lines solved the problem, improving product quality and freeing up a number of people monitoring the process.
In another case, Anderson discovered two 12-horsepower DC drive motors running at temperatures of more than 250 °F, near the failure point for such devices. Simply cleaning the top-mounted blower motor, fan, and filter cooled the drive motor down to 143 °F, its normal operating temperature. Failure of those drive motors would have cost Delphi $5,000 each, immediately demonstrating the financial benefits of this type of inspection.
Clearly, if a process consists of a well-mixed liquid or slurry and equipment runs well below failure points, contact temperature measurements, or no measurements at all, will suffice. In practice, however, equipment and processes are rarely so well behaved. As a result, the move to infrared non-contact techniques will continue to accelerate, and as more vendors offer such products, the hotter market will push performance up and prices down.
* Heisenberg's uncertainty principle refresher: the more precisely the position is determined, the less precisely momentum is known in this instant, and vice versa, according to Werner Heisenberg's 1927 paper on quantum mechanics ( www.aip.org ).
Keith Mobley's An Introduction to Predictive Maintenance, 2nd Edition, Butterworth-Heinemann, an imprint of Elsevier Science, Woburn, MA, 2002, is available at www.controleng.com/bookstore.
Stephen Scheiber is a consulting editor with Control Engineering.
Stationary vs. hand-held non-contact temperature measurement
Infrared temperature-measuring instruments come in hand-held and stationary forms.
For monitoring processes in real time, mounted sensors provide stability in that they always see the process under the same conditions.
For troubleshooting, predictive maintenance, and other activities that occur at irregular or unpredictable intervals, a hand-held instrument makes more sense.
Ultimately, the controlling factors are some combination of application, convenience, staff availability, and simple economics.