Five Years Down the Road
It's a risky business, trying to predict what power-quality technologies will look like in the second half of this decade. In fact, I'm still filled with wonder that I can use a device that easily fits in my shirt pocket and talk with crystal clarity to a friend half a world away while driving in my car. My children accept these marvels with indifference. After all, these tools have existed for as long as they can remember. But for the 40-something crowd, they still defy imagination.
But not so very long ago, who could have imagined the Internet? Business at the speed of light? Palm computers with the power of early room-sized mainframes? Certainly there were those who did. But for most of us, each passing year brings new technological surprises.
So, what can one expect in the area of power quality? Some indication of future trends comes, of course, from where we have been. In a nutshell, there will be continuing development of technology for drives, programmable logic controllers (PLCs), robotics and networking.
While the number of power-quality technologies grows, the size of these tools will probably continue to shrink. We are already seeing, for example, smaller, more portable monitoring instruments—such as those for infrared thermography—that help plant personnel identify and target power problem areas. Moreover, not only the ability to monitor, but also to control power quality will expand, such as with the use of active filtering and soft-start/transient-free contactors.
Twenty-five years ago, power quality received almost no attention. After all, it took a voltage transient the size of a metro bus to have an impact on the manufacturing production environment. Today, however, voltage imperfections as brief as a single cycle and as small as 5% of nominal can disrupt entire production lines.
Prior to the mid-1970s, production equipment consisted of induction motors, electric heaters, welders, coal boilers and gas ovens. Then came the large-scale application of solid-state electronics to these manufacturing processes. The use of intelligent yet inexpensive devices grew exponentially, boosting industrial productivity. According to the U.S. Bureau of Labor Statistics, today's production associate can out-produce a worker from the '70s by a 2-to-1 margin.
However, there has always been a downside to the advent of variable-speed drives, PLCs and computerized numerical control (CNC) machines. While such technological advances produced vastly improved production speeds and accuracy, they also brought with them certain power-related realities. In essence, this equipment is connected to a power-supply system that has remained largely unchanged since the '50s.
In those days, harmonics were almost never a concern. Engineers installed capacitance with no regard to resonance issues. Lightning flashovers were accommodated with spark-gap arresters and circuit reclosers. As long as the voltage was available 99.999% of the time, the job was considered done right. Momentary outages were a necessary annoyance, and wave shape was left behind in the engineer's college curriculum.
In the '80s and '90s, however, non-linear loads were connected to those power systems, and to everyone's surprise, the resonance points created in the '60s and '70s were discovered when motors and transformers started to overheat. This discovery revealed how switching transients—amplified by tuning—could disrupt the operation of computer-controlled production equipment.
The result was a gulf between electrical engineers and those attempting to advance the technological revolution. A new specialty arose to bridge that gap: the power-quality specialist.
Beyond Today's Standards
Today, demands are being placed on both the suppliers and users of power, and an evolution of awareness and knowledge must proceed on both sides. Education is the key. Power-quality specialists need to understand the technology and the direction it is going, and they must be aware of supply equipment—both present and emerging—that will adequately supply the power needs of that technology.
Current power-quality measures—such as the system average interruption frequency index (SAIFI), average service availability index (ASAI) and customer average interruption duration index (CAIDI)—must give way to indices that measure the actual impact on production equipment. These current standards begin their measurement at zero volts. A transient that drops the plant utilization voltage to 5 volts on a 480-volt system is not an event for these standards, but it is certainly an event for today's technology-based production system.
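As a rough numerical illustration of that gap (all figures below are hypothetical), the interruption-based indices can be computed in a few lines. Note that a deep sag that never reaches zero volts contributes nothing to either index:

```python
# Hypothetical illustration of the gap between interruption-based
# indices and equipment impact. SAIFI = total customer interruptions
# divided by customers served; CAIDI = total customer interruption
# minutes divided by total customer interruptions.

customers_served = 10_000
# (customers affected, minutes) for each sustained interruption
interruptions = [(1_200, 90), (300, 45)]

saifi = sum(n for n, _ in interruptions) / customers_served
caidi = (sum(n * m for n, m in interruptions)
         / sum(n for n, _ in interruptions))

print(f"SAIFI = {saifi:.2f} interruptions per customer")
print(f"CAIDI = {caidi:.1f} minutes per interruption")

# A one-cycle sag to 5 V on a 480 V system never reaches zero volts,
# so it adds nothing to either index -- yet it can trip a whole line.
sag_depth = 1 - 5 / 480   # ~99% voltage depression, still "no outage"
print(f"Sag depth: {sag_depth:.1%}")
```

The point of the sketch is structural: both indices are driven entirely by sustained zero-voltage events, so the 480-volt sag in the text is invisible to them by construction.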
By focusing on a more practical standard, engineers can concentrate on providing real improvement for power users. A rational standard that will likely develop is the frequency of excursions below the lower limit of some type of tolerance envelope, such as measured by the Information Technology Industry Council's ITIC—or CBEMA—curve (see March 2002, p.15).
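A minimal sketch of such an envelope check follows, using approximate breakpoints for the lower limit of the ITIC curve (the published curve should be consulted for the authoritative values):

```python
# Sketch of an event check against a simplified lower ITIC envelope.
# Breakpoints are approximations of the published ITIC (CBEMA) curve.

def below_itic_lower_limit(voltage_pct: float, duration_s: float) -> bool:
    """Return True if a sag falls below the (approximate) ITIC lower bound."""
    if duration_s < 0.02:        # under ~1 cycle: any depth is tolerated
        return False
    if duration_s < 0.5:
        return voltage_pct < 70  # 70% floor from 20 ms to 0.5 s
    if duration_s < 10:
        return voltage_pct < 80  # 80% floor from 0.5 s to 10 s
    return voltage_pct < 90      # 90% floor in steady state

# A 5 V sag on a 480 V base (~1% of nominal) lasting three cycles:
print(below_itic_lower_limit(5 / 480 * 100, 3 / 60))  # True: an event
```

Counting excursions below this envelope, rather than counting only outright interruptions, is exactly the kind of equipment-centered index the article argues for.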
Another area of concern is electricity deregulation—a double-edged sword for the power-quality field. In order to enhance shareholder value, the unbundled power-distribution companies being created will find themselves pressured to limit maintenance, which will have a negative impact on power quality. But necessity is the mother of invention, and this same deregulation is likely to lead to the development of a broad range of technologies that will address these very concerns.
Deregulation will also usher in a new era of distributed generation. When applied at the utility level, the net effect will be an improvement in power quality at the plant level. Distributing voltage and volt-ampere-reactive (VAR) support out on the grid will have the effect of providing better voltage stability. When applied at the plant level, an increased level of sophistication will be required of plant facility personnel, particularly if the generation is "islanded," where the facility is not physically connected to the grid, but has its own internal generation. In these cases, design and maintenance will become critical for adequate power quality.
There will be other challenges specifically faced by end users. The utility company distributes power to the facility, but the user distributes the power to the actual equipment. This local supply system is often more critical than the systems upstream. One example is power factor correction (see "Power Factor—Past and Present," p.13). Electrical engineers are under increasing pressure to evolve their approach to power factor correction. Simple fixed and switched capacitors create resonance "land mines" that will certainly create problems in the years ahead.
Five Years From Now
To jump ahead, the next five years will certainly witness the continuing explosion of technology-based solutions. Power-quality specialists will have learned to focus on the individual tolerance curves of equipment—and how to supply those needs.
Automation will become more ubiquitous in the facility, for both facility operation—comfort control, lighting, energy management—and for production operations. Robotics will become more prevalent and sophisticated. The sensitivity of these devices will be no less than the sensitivity seen today, and cost considerations may drive this equipment toward greater sensitivity.
For example, variable-speed motor drives will increase their penetration into the plant. As the technology comes down in price, it will become easier to justify its application to increasingly mundane tasks. The standard six-pulse drive is a very low-cost device, and cost will continue to be a driving force in industry five years from now. Control of harmonics at the input to each of these devices is not likely for this reason, and as a consequence, active filtering will become more prevalent to deal with these harmonics.
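For reference, an ideal six-pulse rectifier draws characteristic harmonic currents at orders h = 6k ± 1 (the 5th, 7th, 11th, 13th and so on), with magnitudes falling off roughly as 1/h in the idealized case. A short sketch of that spectrum:

```python
# Characteristic harmonics of an ideal six-pulse rectifier front end:
# orders h = 6k +/- 1, with idealized magnitudes of about 1/h of the
# fundamental (real drives deviate from this textbook profile).

def six_pulse_harmonics(k_max: int = 4):
    orders = []
    for k in range(1, k_max + 1):
        orders += [6 * k - 1, 6 * k + 1]
    return orders

for h in six_pulse_harmonics():
    print(f"h = {h:2d}  ({h * 60} Hz)  ~{1 / h:.1%} of fundamental")
```

The 5th and 7th dominate, which is why they are the usual targets for the filtering discussed above.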
Historically, voltage variation was dealt with as a supply issue with devices such as a saturable core reactor. These allowed the supply voltage to vary, while controlling the voltage to the load to within very tight tolerances. Use of capacitors within these devices controlled the harmonics that were inherently generated. Variations that resulted from heavy loads on a "soft" system were not accommodated.
Moving forward in time, several emerging technologies and methodologies will become more widely used to deal with voltage variation and harmonics:
Contactor-based motor starters will begin to be replaced with less expensive solid-state switching devices. These devices will have greater durability due to the lack of mechanical contactors. In addition, the inclusion of sufficient control circuits will allow for low-cost soft-starts. These soft-starts will allow plant engineers to reduce the voltage transients from motor starting.
Load-related issues. Today, and in the future, the voltage variation as well as harmonics will be a load-related issue. New active filters will sense voltage magnitude, or both voltage magnitude and waveform, and effectively alter supply impedance on a sub-cycle basis to compensate. The result is a responsive supply that will continue to provide quality power.
Wireless technology will also continue to expand. As factories and buildings expand their use of technology, there will be a drive to incorporate data communications with facility devices. As its depth of application increases, wireless communication will be preferred for its cost and flexibility. Still, there will be "dead" areas where wireless will not operate effectively. While wireless may not be a field attributed to the power-quality specialist, solving its limitations will be. The power-quality engineer's experience in exploring and diagnosing problems will be an important skill in resolving radio interference.
And more. As technology is expanded to provide communications with production equipment, this communication will include the ability to observe the performance of the power system via power-monitoring equipment. More information than ever will be available. It would be best for manufacturers to agree on a standard for data protocol, but the desire to limit competition will minimize cooperation toward any standard.
Preventive maintenance tools such as infrared camera technology will become cheaper and more prevalent. Cameras that fit on palm-sized computers may well be heading to market five years from now. While they will initially lack the sophistication of today's offerings, they will make up for it in cost and accessibility.
The payoff will be there for the country that is able to deploy the increasing level of technology successfully. Certainly, success of this deployment will depend on maintaining adequate power quality in spite of the economic pressures. It is precisely these pressures that may drive both the technology, and the power-quality solution.
From Pure Power, Summer 2002.
Power Factor—Past and Present
In the past, it seemed as if every plant engineer had his or her own "rule of thumb" for power-factor correction. In general, conventional wisdom was to install individual capacitors on larger motors and provide a modest fixed bank at the service point that did not exceed some percentage of the supply transformer capacity. The installed capacitor, in parallel with the supply impedance—mostly inductive—created a parallel "resonance" point, or high impedance to a specific frequency. These methods worked fine in a world full of linear loads that used current only at 60 cycles.
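The harmonic order of that parallel resonance can be estimated from the short-circuit capacity at the bus and the installed capacitor rating; the plant figures below are hypothetical:

```python
import math

# Estimated parallel-resonance harmonic order for a capacitor bank on
# an inductive supply: h_r ~= sqrt(short-circuit kVA / capacitor kvar).

def resonant_order(short_circuit_kva: float, capacitor_kvar: float) -> float:
    return math.sqrt(short_circuit_kva / capacitor_kvar)

# Hypothetical plant: 25,000 kVA available fault capacity, 1,000 kvar bank.
h = resonant_order(25_000, 1_000)
print(f"Resonance near the {h:.1f}th harmonic ({h * 60:.0f} Hz)")
```

In this example the resonance lands squarely on the 5th harmonic, a frequency at which six-pulse drives and other non-linear loads draw substantial current: precisely the "land mine" described next.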
What was created, however, was a plant with harmonic "land mines" that moved around the frequency spectrum with the starting and stopping of the capacitance-corrected motors. If exactly the right number of capacitors happened to be energized when a voltage transient containing that resonant frequency occurred, the transient would step on the "land mine" and be amplified into a much larger, possibly damaging transient.
These resonance points also lie in the path of today's switch-mode power supplies commonly found in lighting ballasts, drives and computers. These devices draw sustained harmonic currents that, when crossed by these roving resonance points, cause a sudden increase in harmonic distortion that can disrupt the operation of sensitive equipment, as well as prematurely age electric-system components.
A Capacitor/Filter Revolution
Elspec, an Israeli company, has started a quiet revolution in the way traditional capacitor/filter banks are applied and designed. Used by Square D in their premium line equipment, Elspec's high-speed controllers along with solid-state contactors provide "real-time," transient-free capacitor bank operation. This technology allows standard three-phase capacitors to be switched at zero crossings—zero voltage differential crossings—to allow for transient-free insertion of traditional three-phase delta-configured capacitors. The use of silicon contactors, in place of the traditional type, can also provide the first glimpse of the contactors of the future. Given time and manufacturing improvements, such solid-state devices can give us contact-free contactors, as well as "soft-start" contactors with a very small footprint.
Drive technology has also been finding its way into the low-voltage power-factor correction/harmonic-filtering field with the advent of the Siemens Sipcon P/S and Schneider's Accusine. Three-phase power inverters are being applied to power systems to supply harmonic and reactive current in real time, in some cases on a sub-cycle basis. Traditional capacitor banks have necessarily had a delay built into the control system to minimize wear on the contactors, and they could reduce harmonic distortion only through simple filtering. These new systems rely on solid-state switching technology, which means they can operate very quickly, even on a sub-cycle basis. This high-speed operation allows the units to supply not only reactive current but also harmonic currents precise enough to virtually eliminate harmonic distortion from the utility supply. Coupled with newer control algorithms, the Accusine can even filter sub-cycle welding surges, giving manufacturers more control over weld-cycle voltage and quality.
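The underlying principle of such shunt active filters can be sketched as follows. The signal processing here is purely illustrative; the vendors' actual control algorithms are proprietary:

```python
import math

# Sketch of the shunt active-filter principle: measure the distorted
# load current, extract its 60 Hz fundamental, and inject the
# difference so the utility supplies only the clean fundamental.

N = 256  # samples per 60 Hz cycle

def fundamental(samples):
    """Fit a 60 Hz sine/cosine pair to one cycle (a one-bin Fourier fit)."""
    a = 2 / N * sum(s * math.cos(2 * math.pi * k / N) for k, s in enumerate(samples))
    b = 2 / N * sum(s * math.sin(2 * math.pi * k / N) for k, s in enumerate(samples))
    return [a * math.cos(2 * math.pi * k / N) + b * math.sin(2 * math.pi * k / N)
            for k in range(N)]

# Distorted load current: unit fundamental plus a 20% fifth harmonic.
load = [math.sin(2 * math.pi * k / N) + 0.2 * math.sin(2 * math.pi * 5 * k / N)
        for k in range(N)]

inject = [ld - fu for ld, fu in zip(load, fundamental(load))]   # filter output
supply = [ld - inj for ld, inj in zip(load, inject)]            # utility current
```

After injection, the utility-side current is the clean fundamental alone; the filter inverter carries the fifth-harmonic content. Doing this within a fraction of a cycle is what distinguishes these inverter-based systems from switched capacitor banks.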