The Clean Air Act was passed in 1970 in an attempt to clean up the air in the United States. Though considered progressive at the time, the Clean Air Act proved insufficient and was amended in 1990 with sweeping revisions intended to curb three major threats to the nation's environment and to the health of millions of Americans. These three threats are acid rain, urban air pollution, and toxic air emissions.
The acid rain program of the Clean Air Act Amendments of 1990 (CAAA) has fostered the growth of Continuous Emissions Monitoring Systems (CEMS) by requiring more than 250 coal-burning power plants to install CEMS during Phase I of the CAAA and an additional 2,000 plants to install CEMS in Phase II. Phase I became effective on Jan. 1, 1995, and Phase II on Jan. 1, 2000. These CEMS are in addition to CEMS already required by earlier New Source Performance Standards (NSPS) and other existing regulations. These regulatory pressures have driven CEMS and continuous analyzers from infancy to maturity. During this period, analytical equipment has progressed from analog instruments born in the laboratory to digital analytical equipment designed for specific applications.
Thermomagnetic analyzers are very robust and have an extended service life. They are suitable for use with highly toxic gas mixtures.
Technology basics
CEMS can be broken into three broad methods: extractive, in-situ, and parameter. Extractive methods can be further broken into two techniques: direct source-level and dilution. In-situ methods can be either path or point types. Parameter methods rely on physical measurements such as temperature, pressure, or other process measurements to predict emissions, and so do not incorporate on-line analytical techniques.
Dilution extractive techniques have dominated in the acid rain program because of the desirability of a wet-basis measurement. These systems have been successful to this point, but they have severe limitations as regulated pollutant concentrations continue to fall. Variables such as stack temperature, stack pressure, and molecular weight change the effective dilution ratio and thus the reported concentration. A dilution sample system is also not appropriate where an analyzer with the requisite sensitivity for the diluted sample is unavailable.
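As a rough illustration of why those stack variables matter, the sketch below models the sample flow through a critical (sonic) dilution orifice as proportional to P / sqrt(M * T) and scales a diluted reading back to stack conditions. This is a simplified, purely illustrative model, not any vendor's or EPA's correction procedure, and all numbers are hypothetical.

```python
"""Illustrative first-order correction of a dilution-extractive reading for
stack temperature, pressure, and molecular weight. Assumes a sonic orifice
whose molar sample flow varies as P / sqrt(M * T), with the specific-heat
ratio treated as constant. Not a vendor or regulatory procedure."""

import math


def corrected_concentration(c_measured_ppm,
                            t_stack_k, p_stack_kpa, mw_stack,
                            t_cal_k=293.15, p_cal_kpa=101.325, mw_cal=29.0):
    """Scale the diluted reading back to stack conditions.

    If the stack runs hotter, lower in pressure, or heavier than the
    calibration gas, less sample is drawn through the orifice, the effective
    dilution ratio rises, and the raw reading must be scaled up.
    """
    flow_cal = p_cal_kpa / math.sqrt(mw_cal * t_cal_k)
    flow_stack = p_stack_kpa / math.sqrt(mw_stack * t_stack_k)
    return c_measured_ppm * (flow_cal / flow_stack)


if __name__ == "__main__":
    # Hypothetical numbers: hot, slightly sub-atmospheric flue gas.
    print(round(corrected_concentration(48.0,
                                        t_stack_k=450.0,
                                        p_stack_kpa=99.0,
                                        mw_stack=29.8), 1), "ppm")
```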
In general, reactive and condensable gases such as HCl, NH3, and formaldehyde present the greatest measurement challenges. Such gases can react with other components of the stack gas stream; they may condense or be absorbed by liquid condensate within a cold extractive sampling system; they may adsorb onto surfaces; or they may polymerize before reaching the analyzer. Reliable results may be obtained, depending upon the components making up the flue gas stream, with special sampling equipment and special operation and maintenance procedures. Both extractive methods rely on similar analytical techniques. Current techniques include nondispersive infrared gas analyzers, ultraviolet gas analyzers, chemiluminescent gas analyzers, paramagnetic oxygen analyzers, flame ionization detection gas analyzers, gas chromatography, mass spectrometry, and Fourier transform infrared analyzers.
The flame ionization detector measures the total organic carbon content of the sample gas. During the combustion of organic substances in a hydrogen flame, electrically charged particles are produced. The resulting current of these ions is proportional to the organic carbon content.
Analytical techniques and advances
Nondispersive infrared (NDIR) gas analyzers employ several different detection techniques. Optopneumatic detectors, commonly known as Luft detectors (after their inventor, Karl Luft), interference filter correlation (IFC) photometers, and gas filter correlation (GFC) are the most common types. The theory of operation of these infrared methods is similar and is based upon absorption of infrared energy in the 2 to 11 micron wavelength range. Simple molecules with fewer than five or six atoms have infrared absorption spectra with fine structure. Gases fitting this description are ideal for Luft and GFC analyzers, which correlate sample-gas spectra with that of the pure component of interest. Interference filter correlation can measure gases with either fine structure in the spectra or broadband absorption.
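All of these detection schemes ultimately rest on the Beer-Lambert absorption relationship. The minimal sketch below shows that relationship and its inverse; the absorptivity and path length are hypothetical, and real instruments linearize against calibration gases rather than a single coefficient.

```python
"""Minimal sketch of the Beer-Lambert relationship underlying NDIR absorption
measurements. Values are hypothetical for illustration only."""

import math


def transmittance(concentration_ppm, path_length_cm, absorptivity):
    """Fraction of IR energy transmitted: T = exp(-a * c * l)."""
    return math.exp(-absorptivity * concentration_ppm * path_length_cm)


def concentration_from_transmittance(t, path_length_cm, absorptivity):
    """Invert Beer-Lambert: c = -ln(T) / (a * l)."""
    return -math.log(t) / (absorptivity * path_length_cm)


if __name__ == "__main__":
    a = 2.0e-6      # hypothetical absorptivity, 1/(ppm*cm)
    cell = 20.0     # hypothetical optical path length, cm
    t = transmittance(400.0, cell, a)                     # e.g., 400 ppm CO
    print(round(concentration_from_transmittance(t, cell, a), 1))  # ~400.0
```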
Recent developments in NDIR analyzers include multiple optical benches in one analyzer and stacked detectors that allow measurement of as many as four gases in a single instrument. Microprocessors have added further capabilities to modern NDIR analyzers, including the 'platform' approach, in which a single system controller provides the user interface for as many as three analytical modules. The system controller provides industry-wide interfaces to control systems and common computer networks. In the ABB Hartmann & Braun offering (Lewisburg, W.Va.), for example, Modbus is included for control systems and Ethernet for interfacing to personal computers. Interference filter correlation photometric analyzers have been enhanced through microprocessor technology to measure multiple components and to operate over analyzer networks. These analyzers have also been successfully applied to hot/wet measurement demands by heating to temperatures as high as 200 °C.
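To give a flavor of that kind of connectivity, the sketch below issues a plain Modbus/TCP holding-register read, the sort of poll a control system or PC might use against a networked analyzer. The IP address, unit ID, register addresses, and scaling are hypothetical, not ABB's register map; any real device documents its own map.

```python
"""Sketch of a Modbus/TCP holding-register read (function code 0x03).
Addresses and scaling are hypothetical; a production client would also loop
on recv() and check for Modbus exception responses."""

import socket
import struct


def read_holding_registers(ip, start_addr, count, unit_id=1, port=502):
    """Return a list of 16-bit holding-register values from a Modbus/TCP device."""
    pdu = struct.pack(">BHH", 0x03, start_addr, count)          # func, start, qty
    mbap = struct.pack(">HHHB", 1, 0, len(pdu) + 1, unit_id)    # MBAP header
    with socket.create_connection((ip, port), timeout=2.0) as sock:
        sock.sendall(mbap + pdu)
        header = sock.recv(9)          # MBAP (7) + function code + byte count
        byte_count = header[8]
        data = sock.recv(byte_count)
    return list(struct.unpack(">" + "H" * count, data))


if __name__ == "__main__":
    # Hypothetical: two registers holding SO2 and NOx readings, scaled x10.
    regs = read_holding_registers("192.168.1.50", start_addr=0, count=2)
    print("SO2:", regs[0] / 10.0, "ppm  NOx:", regs[1] / 10.0, "ppm")
```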
Improvements have also been made in the calibration of NDIR gas analyzers. Calibration cells, small sealed optical cells with actual test gases encapsulated within, allow analyzer calibration to be checked without bottles of expensive test gas.
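Whether the reference is a gas bottle or a sealed calibration cell, the arithmetic of the routine check is simple. The sketch below computes calibration error as a percentage of span; the 2.5% limit shown is typical of acid rain program gas monitors, but the figure that applies to a given monitor should be confirmed against the governing rule.

```python
"""Simple sketch of a daily calibration-error check. The 2.5%-of-span limit is
an assumption typical of U.S. acid rain program gas monitors; confirm the
applicable limit for any real installation."""


def calibration_error_pct(response, reference, span):
    """Calibration error expressed as a percentage of analyzer span."""
    return abs(response - reference) / span * 100.0


def passes_daily_check(response, reference, span, limit_pct=2.5):
    return calibration_error_pct(response, reference, span) <= limit_pct


if __name__ == "__main__":
    # Hypothetical SO2 monitor: 0-1000 ppm span, 500 ppm reference gas or cell.
    print(calibration_error_pct(512.0, 500.0, 1000.0))   # 1.2 (% of span)
    print(passes_daily_check(512.0, 500.0, 1000.0))      # True
```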
CEMS of a different color
Ultraviolet (UV) analyzers are generally more sensitive than NDIR analyzers. Another advantage of UV analyzers is that water (H2O) is transparent to UV. SO2 and NOx are frequently measured with this technology. UV analyzers are also useful in measuring NO2, hydrogen sulfide, chlorine, and ammonia. State-of-the-art UV analyzers use a four-beam optical approach, which is said to result in a more robust and stable analyzer by removing variability caused by aging components and by sensors with 'dirty windows.'
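The sketch below only illustrates the general ratioing principle behind such multi-beam designs: dividing a measurement signal by a reference signal on both the sample and comparison paths cancels common-mode effects such as lamp aging and window fouling. The optical arrangement of any particular commercial analyzer may differ, and the numbers are invented.

```python
"""Illustration of ratiometric drift compensation in a multi-beam photometer.
This is a generic sketch, not the optics of any specific product."""

import math


def ratioed_absorbance(sample_meas, sample_ref, compare_meas, compare_ref):
    """Double ratio: drift that scales all four detector signals cancels out."""
    return -math.log10((sample_meas / sample_ref) / (compare_meas / compare_ref))


if __name__ == "__main__":
    clean = ratioed_absorbance(0.80, 1.00, 1.00, 1.00)
    # Same gas, but every signal halved by a dirty window or a weak lamp:
    dirty = ratioed_absorbance(0.40, 0.50, 0.50, 0.50)
    print(round(clean, 4) == round(dirty, 4))   # True: the ratio is unchanged
```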
Paramagnetic analyzers are commonly used in CEMS to measure oxygen as the diluent gas. Newer analyzers of this type are more resistant to corrosion and are smaller than their predecessors, which results in a faster, more stable response. Of the various types, the most widespread is the magnetodynamic or 'dumbbell' type, which uses a small dumbbell-shaped optical component to physically determine oxygen concentration. The downside includes moving parts and optical components that are exposed to corrosive sample gases.
Another type of paramagnetic analyzer is the thermomagnetic or 'magnetic wind' type. It has no moving parts and can be constructed of highly corrosion-resistant materials. These devices are robust and have an extended service life. Microprocessor technology has also advanced paramagnetic analyzers by allowing surrogate calibration with substitute gases when the actual mixture is difficult, impossible, or too toxic to prepare.
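One common use of the measured O2 diluent is normalizing a pollutant reading to a reference oxygen level so that dilution with excess combustion air cannot mask emissions. The sketch below shows the simple dry-basis correction found in many permits; U.S. federal rules often use F-factor calculations instead, and the readings are hypothetical.

```python
"""Sketch of a diluent (O2) correction of a pollutant reading to a reference
oxygen level. One common permit calculation; not the only regulatory method."""


def correct_to_reference_o2(c_measured, o2_measured_pct, o2_reference_pct):
    """C_corr = C_meas * (20.9 - O2_ref) / (20.9 - O2_meas); 20.9% = ambient O2."""
    return c_measured * (20.9 - o2_reference_pct) / (20.9 - o2_measured_pct)


if __name__ == "__main__":
    # Hypothetical: 120 ppm NOx measured at 9% O2, permit referenced to 3% O2.
    print(round(correct_to_reference_o2(120.0, 9.0, 3.0), 1))  # ~180.5 ppm
```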
Shining on
Chemiluminescence technology is used to measure NO levels. Widely used for CEMS installations that see low NOx levels, chemiluminescent analyzers provide good sensitivity and selectivity to NO.
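Because the chemiluminescent reaction responds to NO, total NOx is typically obtained by cycling the sample through a converter that reduces NO2 to NO, with NO2 reported by difference. The sketch below shows that arithmetic; the converter efficiency and readings are hypothetical.

```python
"""Sketch of the NO / NOx-mode arithmetic typically used with chemiluminescent
analyzers. Readings and converter efficiency below are hypothetical."""


def nox_by_difference(no_ppm, nox_mode_ppm, converter_efficiency=0.98):
    """Return (NO, NO2, NOx) given the NO-mode and converter-mode readings."""
    no2_ppm = (nox_mode_ppm - no_ppm) / converter_efficiency
    return no_ppm, no2_ppm, no_ppm + no2_ppm


if __name__ == "__main__":
    no, no2, nox = nox_by_difference(no_ppm=42.0, nox_mode_ppm=47.9)
    print(round(no, 1), round(no2, 1), round(nox, 1))   # 42.0 6.0 48.0
```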
Flame ionization detection (FID) techniques are widely used to measure hydrocarbons. During the combustion of organic substances in a hydrogen flame, electrically charged particles are produced, and the resulting ion current is proportional to the organic carbon content. FID offers a very large dynamic range, from 10 to 100,000 mg of organic carbon per cubic meter. The latest analyzers use a fully heated sampling and analytical system that prevents absorption of heavier hydrocarbons, which can produce erroneous results. Self-monitoring, automatic fault recognition, and logging functions are usually part of the package.
Unlike FID technology, gas chromatography (GC) separates a sample stream into its individual components for measurement. In current practice, GC designs integrate a basic analyzer (chromatograph) and a controller/microprocessor package. Systems can perform as stand-alone units or be interfaced with multiple chromatographs, distributed control systems, or a host computer. Significant changes have been made in valves, columns, column systems, and detectors. Polyimide-coated fused silica columns with stabilized stationary phases have reached a high level of reliability and are routinely used in process chromatographs. The most popular detectors for online chromatographs are still the thermal conductivity and flame ionization detectors, but other component-selective detectors are being used, including electron capture, flame photometry, photoionization, and chemiluminescence. Process chromatographs are used in CEMS when the pollutant of interest is more exotic than the criteria pollutants.
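Once the chromatograph has separated the components, a common way to quantify them is external-standard calibration: each peak area is scaled by a response factor taken from a calibration run. The sketch below shows only that step; peak integration is omitted and the component names, areas, and concentrations are hypothetical.

```python
"""Sketch of external-standard quantitation of GC peaks. Areas, components,
and concentrations are hypothetical; peak detection/integration is omitted."""


def response_factors(cal_areas, cal_concs):
    """Response factor per component: concentration per unit of peak area."""
    return {name: cal_concs[name] / cal_areas[name] for name in cal_areas}


def quantify(sample_areas, factors):
    """Convert measured peak areas to concentrations using the factors."""
    return {name: area * factors[name] for name, area in sample_areas.items()}


if __name__ == "__main__":
    rf = response_factors(cal_areas={"benzene": 1.50e5, "toluene": 1.20e5},
                          cal_concs={"benzene": 10.0, "toluene": 10.0})  # ppm
    print(quantify({"benzene": 3.0e4, "toluene": 6.0e4}, rf))
    # {'benzene': 2.0, 'toluene': 5.0}  (ppm, under the stated assumptions)
```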
Heavier stuff
Mass spectrometers have been a basic laboratory tool for many years. The mass spectrum of a compound provides a positive 'fingerprint' identification of that compound. Mass spectrometers also have inherently broad chemical applicability because the ionization process is fairly uniform for all compounds. One instrument can measure many compounds, providing cost-effective monitoring. They are inherently fast, sensitive, and capable of wide dynamic range (parts per billion to 100%). The mass spectrometer is quantitative and linear, since its output is directly proportional to the concentration of the species in the sample. It is a reliable, stable, and low-maintenance instrument because it is electronically based rather than chemically based, as many single sensors are. Finally, the mass spectrometer offers maximum flexibility because its broad capabilities can be selectively used under programmable microprocessor control.
Recent technological advances provide the elements necessary for continuous online applications such as CEMS and air monitoring. These developments include reliable, low-maintenance turbomolecular vacuum pumps that allow near-immediate operation; quadrupole filters that provide measurements from 0 to 400 amu; low-cost computing power; and packaging for continuous-operation environments. The mass spectrometer is often used to measure hazardous air contaminants, which require fast analysis. In situations requiring many sample points, the speed of new-generation devices allows a multistream sampling approach to be used.
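How one instrument reports several compounds at once can be sketched as a small linear-algebra problem: the observed peak intensities are modeled as a mix of each compound's fragmentation pattern and solved by least squares. The patterns and intensities below are rough, invented illustrations, not library data.

```python
"""Sketch of multi-component quantitation from overlapping mass-spectral
fragmentation patterns by least squares. Patterns and intensities are
illustrative only (N2 and CO2 both contribute at m/z 28)."""

import numpy as np

# Columns = compounds (N2, CO2); rows = selected m/z channels (28, 14, 44).
patterns = np.array([
    [1.00, 0.11],   # m/z 28: N2 parent overlaps with CO2's CO+ fragment
    [0.07, 0.00],   # m/z 14: N+ fragment of N2
    [0.00, 1.00],   # m/z 44: CO2 parent
])

observed = np.array([0.8165, 0.056, 0.150])   # hypothetical peak intensities

# Solve observed ~= patterns @ concentrations in the least-squares sense.
concs, *_ = np.linalg.lstsq(patterns, observed, rcond=None)
print(dict(zip(["N2", "CO2"], np.round(concs, 3))))   # ~{'N2': 0.8, 'CO2': 0.15}
```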
Doing the math
Fourier transform infrared (FTIR) analyzers are another technology that has moved from the laboratory to online use in CEMS, made possible by the advent of inexpensive and powerful microprocessors. One advantage of the method is that several gases can be monitored at the same time by the same instrument. Heated versions are available that make it possible to monitor hard-to-handle gases such as HCl and NH3 down to 10 ppm. These units find application in the monitoring of combustion sources, toxic-waste incinerators, and industrial processes, as well as in ambient monitoring of hazardous air pollutants.
FTIRs can produce detailed infrared spectra over wavelengths of 2.5 to 25 microns. Spectra are calculated from an interference pattern and matched mathematically against reference data held in memory. A variety of algorithms derive the concentration values from these data. Digital spectra as well as calculated values can be stored, and stored spectra can later be reprocessed to determine concentrations of other gas components that may have been deemed unimportant earlier.
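The Fourier step itself can be shown with a toy example: a synthetic interferogram built from two cosine components is transformed back into a "spectrum" whose two peaks are recovered. Real instruments also apply apodization, zero filling, and phase correction, all omitted here, and the signal is invented.

```python
"""Toy illustration of the Fourier step in FTIR: a spectrum recovered from an
interference pattern by FFT. Synthetic data; instrument corrections omitted."""

import numpy as np

n_points = 4096
x = np.arange(n_points)                 # optical path difference samples

# Synthetic interferogram: two spectral lines appear as two cosine frequencies.
interferogram = (1.0 * np.cos(2 * np.pi * 50 * x / n_points) +
                 0.5 * np.cos(2 * np.pi * 120 * x / n_points))

spectrum = np.abs(np.fft.rfft(interferogram))
peaks = np.argsort(spectrum)[-2:]       # indices of the two strongest lines
print(sorted(peaks.tolist()))           # [50, 120]
```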
Software-configured FTIR systems display all results and status messages, allow manual interactive operation during commissioning and servicing, feature self-diagnosis or remote diagnosis by modem, and can archive measured and diagnostic data.
The demand created by the environmental regulations of the 1990s, together with the advent of economical, powerful microprocessors, has moved CEMS further along the development curve by focusing on features that save time and money. Devices now use a common microprocessor platform that standardizes operation of a number of satellite analytical modules. This improves the efficiency of the CEMS and its operating personnel by applying a consistent approach to all project phases, from planning through design, operation, maintenance, training, and spares stocking.
Satellite modules are currently available for NDIR, paramagnetic, electrochemical, thermal conductivity, flame ionization, and ultraviolet technologies. Using these packages, standard modules can be quickly configured to build application-specific multicomponent analyzers for CEMS duty. These often include industry-standard computer interfaces, visualization and remote-control software modules, and custom drivers for creating custom applications under Microsoft Windows.
Sometimes the best predictor of the future is the past. History has shown that CEMS can satisfy regulatory requirements, boost production efficiencies, and decrease operating costs (see 'PEMS' sidebar). Now that practical technologies exist for making most measurements, a successful monitoring project requires combining the most cost-effective approach with a workable and readily available solution.
For more information on ABB Analytical (www.abb.com), visit www.controleng.com/freeinfo.
Author Information
Bill Worthington is the product line manager for Continuous Gas Analyzers at ABB Automation, Process Analytics Div., in Lewisburg, W.Va. A 26-year veteran of the analyzer field, he holds a degree in chemistry from the University of Georgia and a degree in chemical engineering from the Georgia Institute of Technology.
Manual stack sampling is dangerous (and costly) business
Use of continuous emission monitoring devices or predictive emission monitoring strategies also eliminates serious safety hazards. Collecting stack samples subjects personnel to the danger of falling from sampling platforms, many of which are located 50 to 150 ft off the ground. Even when safety harnesses are used as part of mandatory procedures, falling tools and equipment can be a problem. Uninsulated stacks, stack gases, and sample probes hot enough to burn personnel require insulated gloves and clothing (not the most comfortable working attire).
Sampling platforms can become enveloped by a toxic plume from an upwind stack, or engulfed in their own plume under positive-pressure conditions. The potential for explosive gas mixtures can require explosion-proof or intrinsically safe instrumentation. Condensation, rain, and snow in these outdoor installations also increase the electrical shock hazard, resulting in ground-fault and higher NEMA-rating requirements for all instrumentation involved. Additionally, more on-site gas monitors and respiratory equipment may be required on sampling platforms, especially if the equipment is housed in enclosed shelters.
If CEMS are Continuous Emission Monitors, what the heck are PEMs?
Measuring and documenting stack emissions is the method by which regulatory agencies like the U.S. Environmental Protection Agency and state EPAs keep potential industrial polluters in check. Because this method has drawbacks in many areas (long-range accuracy and equipment maintenance requirements, to name two), agencies have long sought another method for controlling emissions.
Enter PEMs. Predictive Emission Monitors have evolved to meet the requirements of EPA's Title V in a different way. PEMs use modeling software and data from sensors at the combustion end (combustion and stack temperatures, combustion product flow rates, oxygen and carbon dioxide levels, etc.) to predict emission levels without the need to measure them. If predicted emission levels remain under control, it is assumed that actual emissions do too. The advantages of these systems can be enormous. Instruments at the combustion end usually work in much less hostile process media than the CEMS analyzers up the stack. Loss of accuracy from poorly maintained sample collection/preparation equipment and corrosion of wetted parts is at least partially avoided. Sensors and related electronics are usually in handier locations for calibration and maintenance.
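A minimal sketch of the modeling idea: during a test period, emissions measured by a reference method are regressed against process readings, and afterwards emissions are predicted from those readings alone. The linear model and synthetic data below are for illustration only; real PEMs use validated, often more elaborate, models.

```python
"""Minimal sketch of a PEM-style predictive model: ordinary least-squares fit
of NOx versus process readings, then prediction from process data alone.
All data are synthetic and the linear form is an assumption."""

import numpy as np

rng = np.random.default_rng(0)

# Synthetic "test period": combustion temperature (K), fuel flow, excess O2 (%).
n = 200
temp = rng.uniform(1400, 1700, n)
fuel = rng.uniform(0.8, 1.2, n)
o2 = rng.uniform(2.0, 6.0, n)
nox = 0.08 * temp + 25.0 * fuel - 4.0 * o2 + rng.normal(0, 2.0, n)   # ppm

# Fit NOx = b0 + b1*temp + b2*fuel + b3*o2 by least squares.
X = np.column_stack([np.ones(n), temp, fuel, o2])
coef, *_ = np.linalg.lstsq(X, nox, rcond=None)


def predict_nox(temp_k, fuel_rate, o2_pct):
    """Predicted NOx (ppm) from process readings, per the fitted model."""
    return coef @ np.array([1.0, temp_k, fuel_rate, o2_pct])


print(round(float(predict_nox(1550.0, 1.0, 4.0)), 1))   # ~133 ppm
```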
The Achilles' heel of this technology may be the modeling software itself. For software to accurately predict exhaust conditions, it must handle changes in every relevant process variable, which can be a tall order in the real world. Even with these potential problems, however, PEMs are now at work in industry in applications as ubiquitous as small boiler installations.