Replacing analog meters in the nuclear industry

Cybersecurity concerns surrounding digital replacements for analog meters are influencing nuclear industry automation and are pushing an industry resistant to change to modernize.

By Otto Fest, Otek Corp. December 6, 2017

More than 120 years ago, Sir Edward Weston invented the analog panel meter (APM). The meter’s simplicity, along with its low manufacturing cost, is unfortunately offset by its unreliability, inaccuracy, and fragility. Moreover, its readings are subject to misinterpretation. The meter’s propensities toward stuck needles, inaccuracies, and erroneous readings have led to a few tragedies.

This inherent inaccuracy caused the military and nuclear power industry to adopt a precautionary ~5-10% safety factor on all readings. Most APMs have an accuracy of only +/- 5% (+/- 2% when viewed straight-on from about 10 in. away), plus parallax error when interpreting the reading. As a result of this seemingly minor discrepancy, a typical nuclear power plant could lose more than $100,000 of saleable power daily.
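A rough back-of-the-envelope sketch of that loss, treating the safety margin as power withheld from sale and assuming the roughly $1 million/day output value cited later in this article (both numbers are illustrative, not plant data):

```python
# Rough estimate of saleable power lost to a precautionary derating margin.
# DAILY_OUTPUT_VALUE is an assumption based on the ~$1 million/day figure
# cited later in the article; the margins are the ~5-10% described above.

DAILY_OUTPUT_VALUE = 1_000_000  # USD of saleable electric power per day

def daily_loss(margin_fraction: float) -> float:
    """Revenue withheld per day if output is derated by the given margin."""
    return DAILY_OUTPUT_VALUE * margin_fraction

for margin in (0.05, 0.10):
    print(f"{margin:.0%} margin -> ${daily_loss(margin):,.0f}/day, "
          f"${daily_loss(margin) * 365:,.0f}/year")
# 5% -> $50,000/day; 10% -> $100,000/day, in line with the figure above.
```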

The industry cannot remain dependent on late 19th-century metering technology. We were intrigued when challenged by a nuclear plant manager to develop an electronic bar-digital meter identical to our New Technology Meter (NTM), but without a microprocessor. As a signal-powered meter, the NTM is powered by the current loop (4 to 20 or 10 to 50 mA) or by the ac/dc signal it measures.

The resulting solid-state analog meter (SSAM) has no microprocessor or control outputs. The SSAM is display-only, while the NTM is for display or control. Either series can be externally powered, and both feature the company's signal-failure alarm, which sounds for about 30 seconds after a failure, eliminating the dangerous "stuck needle" syndrome.

Cybersecurity mandates

Nuclear power plant instrument-and-control (I&C) rooms must comply with Nuclear Regulatory Commission (NRC) cybersecurity requirements and related industry guidance (including NEI 08-09) if they go digital. Doing so costs millions of dollars, and implementation can take years. Otherwise, if industry managers keep their +/- 5% analog meters, they must operate at below-optimum efficiency. In an extreme case, they'll lose 8 to 10% of their optimum electric-power output.

We asked ourselves, "How in the world can we deliver the needed performance without a microprocessor?" More than a decade ago, we experimented with a hardware-only bar-graph digital meter. Within two weeks, after a frantic search for a 2004 notebook and the prototype breadboard, we had a multicolor bar-digital meter solely powered by the 4 to 20 mA (~5 to 60 mW) current loop without a microprocessor.
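For context, the ~5 to 60 mW figure follows from the small voltage a loop-powered device drops across its terminals. A minimal power-budget sketch, assuming an illustrative burden voltage of roughly 1.25 to 3 V (the actual drop is device-specific and not stated here):

```python
# Power available to a loop-powered meter: P = I * V_burden (mA x V = mW).
# The burden voltages are illustrative assumptions chosen to bracket the
# ~5-60 mW range quoted above; a real device specifies its own voltage drop.

def loop_power_mw(current_ma: float, burden_v: float) -> float:
    """Power in milliwatts dissipated in the meter at a given loop current."""
    return current_ma * burden_v

print(loop_power_mw(4.0, 1.25))   # ~5 mW near the bottom of the 4-20 mA range
print(loop_power_mw(20.0, 3.0))   # ~60 mW at full scale
```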

Although we hadn't replaced Sir Edward's invention, it met the plant manager's challenge and now is being evaluated for Class 1E classification. Further work engendered a new breed of LED electronic meter that measures, displays, controls, and transmits ac and dc signals without a microprocessor, delivering 0.1% digital accuracy versus the 5% typical of analogs.

The microprocessor-less and powerless SSAM meter series can be used as a basic bar graph (like the analog meter), as a bar-digital (like obsolete digitals), or as a multi-function meter that is field-configurable. If the housing is the same, all that’s needed is to change the scale plate and pluggable modules and perform any required calibrations. 

Addressing a legacy

During a demonstration of the powerless technology at a nuclear-industry corporate "lunch & learn" session, another manager asked: "How should the I&C manager decide whether to digitize the I&C room?"

The points to consider are as follows:

Use of the current loop is the simplest, most reliable means of transmitting analog signals (a brief scaling sketch follows this list). Therefore, the options include:

1. Replace only the analog meters;

2. Add a data acquisition (DAQ) system; or

3. Add supervisory control with triple redundancy.

However, options 2 and 3 require cybersecurity compliance. Option 1 does not, because there are no microcontrollers or serial input/output (I/O) to hack. Specific to option 1, the choices include:

a. Change all signals and wiring to 4 to 20 mA;

b. Install the best 4 to 20 mA transmitters; or

c. Install an SSAM or its equivalent. With wire-for-wire, panel-compatible drop-in replacement, existing wiring and panels can be reused.
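Part of what makes the 4 to 20 mA loop so dependable is its live zero: a healthy signal never drops below 4 mA, so a broken wire or failed transmitter is unmistakable. A minimal scaling sketch, with an illustrative 0-100% span and failure threshold (both are assumptions, not values from this article):

```python
# Map a 4-20 mA loop current onto an engineering range using standard live-zero scaling.
# The 0-100 span and the 3.6 mA "loop open" threshold are illustrative choices.

def scale_loop(current_ma: float, lo: float = 0.0, hi: float = 100.0) -> float:
    """Scale 4-20 mA linearly onto [lo, hi]; flag a current well below the live zero."""
    if current_ma < 3.6:  # far below 4 mA: broken wire or failed transmitter, not a real zero
        raise ValueError("loop failure: current below live zero")
    return lo + (current_ma - 4.0) * (hi - lo) / 16.0

print(scale_loop(4.0))    # 0.0   (bottom of range)
print(scale_loop(12.0))   # 50.0  (midscale)
print(scale_loop(20.0))   # 100.0 (full scale)
```

The live zero is the same property the signal-failure alarm mentioned earlier exploits: a dead loop looks nothing like a legitimate minimum reading.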

A typical reactor produces about $1 million daily of electric power, so every 1% increase in productivity is about $10,000 per day of profit or about $3.65 million per year. 

Nuclear industry snapshot

About 95% of U.S. nuclear power plants were built and commissioned before 1990. For many years, these plants operated safely and at about 90% efficiency. Lighter government regulation allowed profitable operation with about one-third of the present personnel and expenses.

The cumulative impact of the 1986 Chernobyl meltdown and the 2011 Fukushima disaster on the regulatory environment, along with recent cybersecurity mandates, means most non-regulated plants operate at a loss. Regulated plants, guaranteed a profit via subsidies and rate increases, have no incentive to improve efficiency. The result: six U.S. nuclear power plants have recently announced closures, with more to come.

The NRC released a cybersecurity directive to protect plants from terrorist attacks. The directive requires the nuclear industry to spend more than $20 million per control room if any critical microprocessor-based digital assets are implemented.

Six options for digitizing

The nuclear industry has six identifiable options when it comes to digitizing I&C rooms (a brief payback sketch follows the list):

1. Do nothing, keep the current obsolete analog or digital instruments, and lose roughly $10,000/day, or $3.65 million/year, for every 1% loss in efficiency in an approximately $1 million/day plant.

2. Go for broke with flat screens and triple-redundancy supervisory control, achieving compliance with cybersecurity regulations at an estimated cost of more than $200 million, meaning a seven-year wait for a return to profit.

3. Change to flat screens and a data-acquisition system, which will include cybersecurity compliance, as well as a new I&C room, engineering designs, wiring, operator training, and NRC approvals. Costs of about $75 million will be incurred, as well as the loss of about 1% of daily income. Return on investment (ROI) will take more than three years.

4. Replace obsolete meters with digital meters, which are SCADA/DAS/DCS Class 1E-compatible, instead of flat screens. This means no new wiring, except serial I/O, no power supply, and no new building costs. This saves about $140 million, but adds cybersecurity compliance costs. ROI will take only about 18 months, assuming a 10% increase in efficiency, or about $36.5 million/year.

5. Implement a data-acquisition system with digital meters. This is similar to option 3, but at a lower cost of about $35 million, and without the associated productivity loss or a new building. ROI is about one year for a typical 5-8% increase in efficiency, including cybersecurity compliance costs.

6. Replace analogs with SSAM, or equivalent, and operate the plants as usual (manually) until decommissioned, but at higher efficiency and profitability.
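A minimal payback sketch for the options above, using only figures quoted in this article. Option 4's cost is not stated directly; the ~$60 million used below is implied by the claim that it "saves about $140 million" relative to option 2's ~$200 million, and is therefore an assumption:

```python
# Back-of-the-envelope payback periods for the digitizing options, based on the
# article's figure of $3.65 million/year of value for each 1% efficiency gain.

ANNUAL_VALUE_PER_PERCENT = 3.65e6  # USD/year per 1% efficiency gain (stated above)

def payback_years(cost_usd: float, efficiency_gain_pct: float) -> float:
    """Years needed to recover cost_usd from the value of the efficiency gain."""
    return cost_usd / (efficiency_gain_pct * ANNUAL_VALUE_PER_PERCENT)

# Option 4: implied ~$60M cost at a 10% gain -> ~1.6 years, i.e. the ~18 months cited.
print(f"Option 4: {payback_years(60e6, 10):.1f} years")

# Option 5: ~$35M cost at the stated 5-8% gain -> roughly one to two years.
print(f"Option 5: {payback_years(35e6, 8):.1f} to {payback_years(35e6, 5):.1f} years")
```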

Importance of digitizing control rooms

The need to digitize control rooms for greater safety and profitability is crucial for the nuclear industry, as well as the U.S. Navy, the oil & gas industry, the electric power industry, and any other industry using analog meters. For a non-regulated utility dealing with a reactor having a limited operating life expectancy, or operating in a very competitive market, a good choice is to replace obsolete analog and digital meters all at once or as needed. Total cost would be about one-third that of a conventional digital-meter replacement, and the investment would pay for itself within the first week, assuming a 1% increase in efficiency.

The analog meter has had a long life and has served its purpose. All good things come to an end, and the analog meter is nearing extinction.

Dr. Otto P. Fest is president and founder of Otek Corp.