Taking the complexity out of designing Smart-Grid devices
Last August, National Instruments (NI) unveiled a product that it expects to quicken the pace at which renewable energy sources such as solar and wind power make their way onto the Smart Grid. The NI Single-Board RIO General Purpose Inverter Controller (GPIC) includes a hardware chassis that harnesses field programmable gate-array (FPGA) technology and the well-known NI LabVIEW System Design Software suite. Control Engineering Contributing Editor Sidney Hill recently spoke with Brian MacCleery, National Instruments’ principal product manager for clean energy technology, about how this new development platform should ease the task of building Smart Grid-ready products.
Hill: Let’s start with what I believe will be the biggest question for most Control Engineering readers. Exactly how can the NI Single-Board RIO GPIC make it easier to build products for the Smart Grid?
MacCleery: Our primary goal in developing the GPIC was to offer an embedded systems platform that is optimized for building grid-tied power control systems.
This has been a collaborative effort with the National Renewable Energy Laboratory (NREL) and the California Energy Commission. In addition, more than 20 companies around the world—all of whom are in NI’s customer base—had input into the design of the GPIC, because we really wanted to make sure it matched their requirements.
When we started this effort, we did question whether it was possible to develop an off-the-shelf system that could, in fact, be used by any company that does grid-tied power conversion. The release of the GPIC is our proof that the answer is yes.
Hill: How were you ultimately able to come up with a platform that could be successfully mass marketed?
MacCleery: That’s an interesting question. As we started talking with companies in different industries about this problem, we quickly noticed that their requirements actually converged. We found that there was, indeed, a common set of requirements that spanned industries and applications.
Because of the broad interest in solving this problem, virtually every company we contacted agreed to have its engineering team spend at least a half day with us, going over every single specification for the GPIC to make sure it would meet their specific requirements.
We took that as a good sign. If companies were willing to invest that amount of time and resources to help us design it, there must be a real need for the type of solution we were developing.
Hill: What were some of the common requirements?
MacCleery: We’ve created a fact sheet that lists the GPIC’s specifications. They were carefully developed in collaboration with NREL, the California Energy Commission and approximately 20 companies that are heavily involved in designing power inverters. So, we know the platform can be used to develop inverters for a broad spectrum of applications, including:
- Grid-tied solar inverters
- Wind turbine power converters
- Utility-scale energy storage systems and flexible AC transmission systems (FACTS)
- Medium-voltage motor drives and pumps with unidirectional or regenerative inverters
- Electric and hybrid-electric vehicles, including automobiles, trains, and agricultural equipment
Somewhat surprisingly, we found that with the I/O on the GPIC and the price point we are able to deliver, we can meet the application requirements of any grid-tied power converter of 50 kW or larger, at volumes up into the tens of thousands of units.
That was NREL’s vision when we started working with them on this project. A gentleman named Bill Kramer, who manages R&D for energy systems integration technology at NREL, had just this vision—what he calls modular power electronics.
The idea is that everyone in industry can save a lot of money by using standard designs and system level development tools that enable Smart Grid and power electronics domain experts to take the lead in designing and implementing the embedded system. If everyone can build their power converters based on the same architecture, it will drive down the cost and make it possible for NI to continuously invest in providing improved development tools.
Hill: From a technology perspective, what specific characteristics of the GPIC ease the process of designing Smart-Grid-capable devices?
MacCleery: Let’s start with some background on the origins of the GPIC. It’s based on a technology we launched in 2004 called NI CompactRIO, which has grown to be one of our largest product lines. The idea behind CompactRIO was to place FPGA technology, or field-programmable gate arrays, at the core of a commercial off-the-shelf system, so teams no longer had to design their own circuit boards. We wanted to give embedded design teams the same flexibility they have when developing fully custom designs, but on a pre-validated, low-risk platform that provides a fully integrated software development experience. In other words, it’s embedded design without the development cost, risk, and traditionally large specialized design team.
The FPGA technology, in essence, makes it possible for every user to conduct custom circuit board design. The difference is that they’re now working with an off-the-shelf platform and high-level graphical programming tools designed for advanced digital signal processing and embedded control.
The FPGA is like having a mini custom-designed, application-specific chipset inside each system. It makes it possible for users to customize hardware at the silicon logic level.
Traditionally, if you wanted to tailor an FPGA to your specific application, you needed to be an expert in specialized programming languages like Verilog or VHDL. There’s a relatively small population of developers with those skills. Also, embedded system chips are becoming more complex every day, making it even more difficult to program them with text-based languages, even for people with those skills.
For example, in recent years the traditional DSP chip and the FPGA chip have actually merged into a hybridized chipset in which the FPGA has a massively parallel array of DSP cores distributed within a mesh of reconfigurable logic and communication infrastructure. This is a real game changer because it brings the best of both worlds together: the signal processing efficiency of the DSP and the silicon-level flexibility of the FPGA. What’s more, you gain a 40-fold increase in processing performance per dollar by integrating everything into a single chipset.
As we continued our R&D efforts with FPGA technology over the past 15 years, we developed several platforms that combined an FPGA-based hardware chassis with the NI LabVIEW Graphical System Design Software. Today, the LabVIEW FPGA graphical programming tools have reached the point where they can often exceed the efficiency and performance of the traditional text-based FPGA programming languages.
Hill: How did a platform for developing Smart-Grid-ready devices emerge from this effort?
MacCleery: It evolved from another one of our products, the Single-Board RIO. That product has an FPGA and a PowerPC processor running the VxWorks real-time operating system.
The GPIC has the same components as the Single-Board RIO, plus 134 channels of I/O. Those I/O channels are designed specifically to meet the requirements of a broad spectrum of power converter applications.
Incorporating the channels necessary to create the GPIC took roughly a year of development time. The economic impact of integrating these inputs and outputs in LabVIEW FPGA—where you can just drop down an I/O node and are up and running—is tremendous. It cuts 70% to 80% of the software development cost off the conventional embedded design process for an FPGA board.
When you develop your own fully custom FPGA board, roughly 70% of every dollar spent on software development goes toward work devoted strictly to interfacing to the I/O. Our goal is to bring that expense down to less than 10%. That would allow developers to focus all of their software engineering dollars on developing the complex algorithms needed for their Smart-Grid products.
Hill: With 134 I/O channels on the GPIC, developers have the flexibility to use some of those I/O channels and not others, depending on their application. Is that correct?
MacCleery: Exactly. The fact sheet I mentioned previously includes a chart that outlines how the platform maps to various applications. It shows the I/O that a user needs to bring in from their power converter system. For instance, a wind turbine will often have a position sensor on the motor/generator and auxiliary motors. If a digital sensor is used, it would typically connect to the GPIC through the LVTTL lines, which connect directly to the digital I/O buffers on the FPGA.
Many companies developing medium-voltage drives use fiber-optic connections rather than copper. In that case, they link to the LVTTL lines with a fiber transceiver and use the FPGA to communicate data, clocks, and pulse-width modulation command signals. This is common in large, megawatt-size inverters. The FPGA is able to time the firing of the power electronics so precisely that you can actually cancel out the harmonic distortion produced by the switching events and produce nearly perfect three-phase power.
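The harmonic cancellation MacCleery describes is, in generic terms, the classic phase-shifted-carrier (interleaved) PWM technique. The sketch below is not NI’s implementation; the frequencies and modulation depth are illustrative assumptions. It shows the carrier-frequency harmonic of two interleaved inverter legs cancelling while the fundamental survives:

```python
import cmath
import math

# Illustrative figures only -- not from the interview.
F_GRID = 50.0        # fundamental frequency (Hz)
F_CARRIER = 2000.0   # PWM carrier frequency (Hz)
FS = 1_000_000.0     # simulation sample rate (Hz)
N = int(FS / F_GRID) # samples in one fundamental cycle

def triangle(t, f):
    """Unit triangle carrier in [-1, 1]."""
    x = (t * f) % 1.0
    return 4.0 * abs(x - 0.5) - 1.0

def pwm(t, shift=0.0):
    """Two-level PWM: sine reference vs. a carrier shifted by `shift` periods."""
    ref = 0.8 * math.sin(2.0 * math.pi * F_GRID * t)
    return 1.0 if ref >= triangle(t + shift / F_CARRIER, F_CARRIER) else -1.0

def dft_mag(samples, f):
    """Magnitude of the single DFT bin at frequency f (amplitude/2 scale)."""
    return abs(sum(s * cmath.exp(-2j * math.pi * f * n / FS)
                   for n, s in enumerate(samples))) / len(samples)

ts = [n / FS for n in range(N)]
single = [pwm(t) for t in ts]
# Two interleaved legs whose carriers are shifted by half a carrier period:
# their carrier-frequency harmonics arrive in antiphase and cancel in the sum.
interleaved = [(pwm(t) + pwm(t, 0.5)) / 2.0 for t in ts]

print("carrier harmonic, single leg: %.3f" % dft_mag(single, F_CARRIER))
print("carrier harmonic, interleaved: %.3f" % dft_mag(interleaved, F_CARRIER))
```

Under these assumptions the carrier component of the interleaved sum drops well below the single-leg value while the 50 Hz fundamental is unchanged, which is the mechanism behind precisely timed firing producing nearly perfect three-phase power.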
The customer basically builds a custom two-layer or four-layer interface board for their GPIC, which provides the custom interface connectors and signal conditioning to match the specific application. These interface boards are relatively simple designs that can be turned around in a few days, but they give the mechanical and cabling flexibility associated with a full custom design.
Through all of the half-day meetings with R&D teams around the world, we found that this is the sweet spot that companies large and small really seem to want.
They want to outsource the processor/FPGA and I/O system to a company like NI and insource the mechanicals, sensor connectivity, power electronics and the embedded software algorithm development. The latter parts are what most companies that specialize in Smart-Grid devices consider to be their core competency. It’s how they differentiate themselves in the marketplace.
It’s clear that engineering managers or boards of directors would prefer to spend a higher percentage of their engineering dollars on work more tightly associated with their core competency and the domain expertise that separates them from their competition.
Hill: The GPIC debuted last August. At this point, are you able to cite any examples of how customers are using the platform?
MacCleery: Yes. There’s a GPIC FAQ page on our website. One of the questions relates to customer feedback, and there are comments from a number of early adopters. For example, Yakov Familiant, the lead engineer for power systems and architectures at Eaton Corporation, commented on how it has improved their development process significantly on a range of projects. He explains, “We can now have commercial hardware out for field testing in three months by developing our prototype validation on deployable hardware that is pre-validated and cost effective for production.” In essence, this platform accelerates the pace at which Eaton, a $16-billion company with 100 years of experience in electric power, is able to deploy its Smart-Grid power converters in the field at high volumes.
Hill: The literature on the GPIC refers to a “tool chain” for power system development. What does that mean?
MacCleery: That term refers to all of the components necessary to optimize the design of grid-tied power systems. Ultimately, NI wants to offer a comprehensive tool chain for developing Smart Grid power electronic systems. Perfecting the GPIC hardware is an important part of this, but we also want to give developers the best software tools. That’s where the LabVIEW graphical programming suite comes into play.
Companies using this platform are developing complex applications. When you’re talking about Smart-Grid-ready inverters, you’re managing sophisticated control systems as well as the networking that’s necessary to make that system fit into the Smart Grid, which makes for a very complex system-level design.
We wanted to create a tool chain that enables people to manage this complexity and design their embedded control systems at a system level, what we refer to as graphical system design. With the complexity of embedded systems today, companies really can’t afford to design at the gate level using hardware description languages. The hybridized FPGA inside has 58 DSP cores, and text-based languages become confusing when you try to program more than three parallel tasks simultaneously.
Companies also must deal with a lot of tough engineering tradeoffs. For example, if you increase the energy efficiency of a 5-megawatt wind turbine power inverter by 1 percent, that means millions of dollars of increased revenue over the lifetime of the inverter. But how does that affect the lifetime of the converter and the maintenance costs? How can we find the sweet spot?
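That revenue claim is easy to sanity-check with back-of-the-envelope arithmetic. The capacity factor, electricity price, and fleet size below are illustrative assumptions, not figures from the interview:

```python
# Rough value of a 1% inverter efficiency gain on a 5 MW wind turbine.
# All figures below are illustrative assumptions.
RATED_MW = 5.0
CAPACITY_FACTOR = 0.40     # assumed average output / rated output
HOURS_PER_YEAR = 8760
LIFETIME_YEARS = 20
EFFICIENCY_GAIN = 0.01     # the 1 percent improvement discussed above
PRICE_PER_MWH = 60.0       # assumed wholesale electricity price (USD)

lifetime_energy_mwh = (RATED_MW * CAPACITY_FACTOR
                       * HOURS_PER_YEAR * LIFETIME_YEARS)
extra_revenue = lifetime_energy_mwh * EFFICIENCY_GAIN * PRICE_PER_MWH
print(f"per-turbine lifetime gain: ${extra_revenue:,.0f}")

FLEET_SIZE = 50            # a mid-size wind farm
print(f"fleet-wide gain: ${extra_revenue * FLEET_SIZE:,.0f}")
```

Under these assumptions a single turbine yields a few hundred thousand dollars over its life; across a wind farm the gain does indeed run into the millions.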
Management wants engineering teams using higher-level tools that make it possible for designers to analyze these types of system-level tradeoffs.
In the solar realm, solar panels can last 20 to 30 years and still generate 80 percent of the power they generated at the time of their initial deployment. Inverters, on the other hand, tend to fail much sooner. So a major challenge for the solar industry is to develop inverters that not only are more energy efficient, but also have a longer lifespan.
System-level design tools make it easier for developers to analyze these complex tradeoffs. If they pull one string they can see how that impacts the rest of the system. For example, I may be willing to take a small efficiency hit in order to keep the temperature of the power electronics more constant and thereby extend its life. Every time you heat and cool the power transistors, this takes a tick off their lifetime.
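The “tick off their lifetime” effect MacCleery mentions is commonly modeled with a Coffin-Manson-style power law, in which cycles-to-failure fall steeply with the depth of each thermal swing. The constant and exponent below are illustrative placeholders, not measured device data:

```python
# Coffin-Manson-style estimate: N_f = A * (dT)^-n cycles to failure.
# A and N_EXP are hypothetical placeholders, not device parameters.
A = 1.0e12     # fitted constant (cycles), illustrative
N_EXP = 5.0    # exponent in the range commonly cited for solder fatigue

def cycles_to_failure(delta_t_kelvin):
    """Estimated thermal cycles to failure for a junction swing of delta_t K."""
    return A * delta_t_kelvin ** -N_EXP

aggressive = cycles_to_failure(40.0)  # control tuned purely for efficiency
smoothed = cycles_to_failure(30.0)    # small efficiency hit, gentler cycling
print(f"lifetime extension factor: {smoothed / aggressive:.2f}x")
```

In this model, shaving just 10 K off each thermal swing roughly quadruples predicted cycle life, which is why a designer might accept a small efficiency hit to hold the power-stage temperature more constant.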
Hill: Is that also what you meant when you spoke of being able to prototype on this platform?
MacCleery: Yes. The typical design flow on the applications we’re talking about starts with simulation. No one tests their software for the first time on a fully functioning inverter. You have high voltage, high current, and expensive equipment so it’s just too dangerous and complex. Much of the validation and verification work must first take place in a safe simulation environment.
Historically, the process of moving from the simulation environment to the actual embedded system was very long and expensive, especially for FPGAs. In many cases, it involved printing the simulation block diagrams to use as design specifications, and then writing the corresponding code by hand in low-level, text-based programming languages. In that approach, the simulation really only helps you develop design requirements and specifications. The actual process of moving to the embedded software is done by hand.
With the improvements we’ve recently made to our tool chain, customers can now develop their graphical FPGA code in a high-fidelity simulation environment that captures all of the complexity of the switching power electronics and the grid. They can see things like how the transistors will heat and cool and how the grid responds to the software they’ve written. Additionally, they can write actual live code in the simulation environment and move it to the FPGA on the GPIC in minutes. Developers tell me they want the ability to move the same control software back and forth between the simulation world and the deployment world very easily.
We manage that transition in the LabVIEW project environment, which makes it possible to use the same embedded software code in both a simulation world and the physical world. Say, for example, I’m changing my algorithms to control my grid storage system. If I change it in the deployment environment, that same LabVIEW code for the FPGA is linked into the simulation world. As a result, it updates all of my simulations. That means every time I make a change I can validate it through simulation before running it on the actual embedded system. I can even automate that testing to save more time and ensure every software block in my embedded code is formally validated to comply with the design specifications.
This also enables a more thorough and disciplined development process, because I can have engineers and engineering management reviewing and testing my system simultaneously in both the simulation world and the real world.
Hill: I guess you see this platform helping to accelerate the full rollout of the Smart Grid?
MacCleery: Yes, that’s our goal. There are multiple levels of complexity related to the Smart Grid. There’s the complexity of designing the individual system, like the inverter. There’s also the system-level complexity of managing the overall grid—determining when nodes should switch on or off. Offering a tool chain that makes it possible to simulate all of that interaction brings the design of these systems into the information technology age. So, it really seems that this type of graphical system design tool chain is necessary to speed up the full rollout of the Smart Grid.
This article appeared in the February 2013 Industrial Energy Management special section to Control Engineering, CFE Media.
For more information on the NI Single-Board RIO General Purpose Inverter Controller visit the following resources: