Distributed Control Goes Discrete
Process control has been distributed for quite some time. In fact, the concept is immortalized in the TLA (three-letter acronym) ‘DCS.’ Discrete control, until recently, depended on an architecture featuring one large PLC with large racks filled with I/O modules. The view across a discrete manufacturing plant is usually blocked by large centralized control enclosures packed with PLC I/O racks, solenoids, motor control devices, terminal blocks, and lots of wire. The wireway from enclosure to machine contains large bundles of 12 – 16 AWG wire, each from an input or output point to a field device, sometimes located many meters away.
Only a few years ago, engineers and plant managers realized a need to reduce the cost of running so many wires and the subsequent cost of troubleshooting incorrectly terminated or mislabeled wires. Further, large PLCs had all the required functions, like advanced math with floating point, networking, real-time clocks, PID, and data handling, but were expensive in both purchase price and panel real estate. Small PLCs were convenient and economical, but quite limited in function. Industrial networks were being developed amid uncertainty about which standard would ‘win.’
What a difference three years makes! There are still different ‘fieldbuses’ in the market, but engineers are gaining field experience through many installations. Replacing a large bundle of wires with a single cable ranks as the prime benefit for a networked architecture.
Designers of small and micro PLCs have leveraged advances in microprocessor design and software to reduce costs. Today’s smaller controllers generally contain all the software and networking power of yesterday’s large PLCs, and often add more. For example, where PLC-based motion control was once a challenge for even the most advanced engineer, today motion is much more easily integrated into a control scheme.
Modularize machine design
Dave Gee, vp technical marketing at Steeplechase Software (Ann Arbor, Mich.), sees benefits to designing distributed control. He says, ‘Perhaps the most important benefit to distributed control is the issue of logical partitioning. By distributing the overall control of a machine among independent PCs, the machine developer is able to construct a modular machine. Different operations can be easily inserted and/or removed depending on the desired function. Operations can be automatically bypassed or activated depending on a particular work order, or they can be physically inserted/removed if a different machine configuration is desired.’
Mike Rothwell, manager of Advantech Automation’s Cincinnati Product Division (Cincinnati, O.), notes two architectures and several more benefits of distributed control. ‘Distributed architecture has two possibilities: a centralized controller with remotely distributed I/O modules, or intelligent I/O devices and controllers at each location. The first has cost advantages, while the second has performance and redundancy advantages. Some benefits include reduced wiring costs, possible redundancy with multiple controllers, and ease of maintenance with the controller located at the point of use.’
Chuck Ridgeway, engineering manager at Horner Electric (Indianapolis, Ind.), a conveyor controller OEM and integrator, notes that network latency can be a problem. With distributed I/O points, all I/O devices are on the network, creating potentially high traffic. With a distributed control scheme, I/O traffic stays between each controller and its local I/O, while the network handles only controller-to-controller traffic. He notes that Horner conveyor projects have revealed benefits of scalability, both of conveyor sections and of logic. It is relatively easy to add new chunks of code to the existing modular design.
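Mr. Ridgeway's point about traffic can be seen in a back-of-envelope comparison. The figures below (point counts, update rates, controller counts) are assumptions chosen purely for illustration, not measurements from any Horner project.

```python
# Back-of-envelope network-load comparison; all figures are illustrative assumptions.
io_points = 200      # sensors/actuators across the line
scan_hz = 50         # required I/O update rate
controllers = 5      # local controllers in the distributed scheme
summary_tags = 8     # values each controller shares with its peers

# Centralized controller with remote I/O: every point crosses the network each scan.
centralized_msgs = io_points * scan_hz

# Distributed control: I/O stays local; only controller-to-controller data is networked.
distributed_msgs = controllers * summary_tags * scan_hz

print(centralized_msgs)   # 10000 messages/s
print(distributed_msgs)   # 2000 messages/s
```

Even with generous assumptions for peer-to-peer traffic, keeping I/O scanning local cuts network load by a factor of five in this sketch.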
Gene Mucklin, Cutler-Hammer/Eaton’s (Westerville, O.) product marketing manager, sees communications as a key to distributed control. ‘The incorporation of communications processors into control devices at economical levels is driving distributed control. Distributing control eliminates a single point of failure. As communications are incorporated in devices like motor control centers, process data like the load on a motor can be communicated.’
Industrial networks a key
Distributed control takes several forms, but a key element for all is networking. It is only with development of high-speed industrial networks that significant strides toward distributed architecture have been made. ControlNet, DeviceNet, FOUNDATION Fieldbus, Interbus, and Profibus are examples of open networks designed for industrial use that now have broad application experience. These networks provide physical layer specifications, as well as component identification, diagnostics, and easy setup. (See article in this issue about how to build an industrial network.)
Growth of network-enabled I/O modules, sensors, and actuators has been a valuable byproduct of this rise in networking. Engineers no longer need to feel locked into an unprofitable partnership with one control vendor. It is possible to find IP67-rated I/O modules from several vendors to simply plug into a network, for expansion or replacement.
This architecture still may leave a few, somewhat centralized, controllers in the system, but wiring is greatly reduced. One cable connects the controller with distant I/O modules near the sensors and actuators it controls.
These networks also provide extensive diagnostic tools that enable a technician to quickly find and fix a problem. Cost savings in installation and maintenance usually exceed a sometimes-higher initial material cost.
Egon Hillerman, global marketing manager for Siemens Energy & Automation (Alpharetta, Ga.), points out that an intelligent head connected to I/O nodes on a network, such as Profibus, is more easily programmed than traditional architectures. Another benefit is built-in diagnostics to aid installation and maintenance troubleshooting.
For example, Mr. Hillerman refers to a material handling/sortation system in a distribution warehouse. An intelligent sensor, bar-code scanner in this case, reads product numbers coming along the conveyor. Decisions are made on the fly to divert to various conveyors building orders for each store. Because the architecture is standard for each diverter, OEMs can build and test one, then duplicate as many times as needed for the system.
Kevin Colloton, of Rockwell Automation’s (Mayfield Heights, O.) micro-PLC business, notes the emergence of small controllers as an integrated part of a networked architecture. ‘In terms of hardware, plant-floor architectures with a solid networking foundation can allow any controller to report the latest status or control data. Controllers that are DeviceNet or ControlNet compatible, for example, are designed to operate either as islands of control or as integrated components of a much larger system. A series of small or micro-controllers, connected via a network and used with a centralized HMI package, provides end-users with a low-cost solution that maximizes production and minimizes the risk of plant-wide failure.
‘The best applications for distributed control in discrete manufacturing are larger packaging machines, conveyor lines, or any application in which distance is an issue,’ he adds. ‘Distributed control allows you to place the controller close to the devices with the network significantly reducing wiring costs and providing greater flexibility.’
‘Utopia,’ states Mark Knebusch, Interbus group director for Phoenix Contact (Harrisburg, Pa.), ‘would be a set-up where the programming environment handles distribution of code to the target processors. The user writes one program for the system. In other words, the user or programmer does not need to program individual controllers. Today, the closest to this are control suppliers with a common editing and configuration package for a wide range of controller sizes. The user must still be aware of individual processors, but online programming allows all the programming to be handled from a single physical location.’
Shawn Liu, National Instruments (Austin, Tex.) product manager, also notes the power of software in a distributed control environment. He says, ‘Consider a series of storage tanks in the field, each controlled remotely by a single computer. These computers use control routines to maintain the level in the tanks and are networked to a local computer monitoring the condition of each tank. Our software enhances information sharing on the network. Users can change setpoints on individual tanks or monitor actual levels from the local computer. Users can also design a system where an application resides on the local computer with sections of control code executing on remote computers. Redesigning the control routines for the system is done on the local computer, requiring less time and traveling.’
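The tank scenario Mr. Liu describes can be sketched as a simple control routine running on each remote computer, with only the setpoint coming from the local computer. This is a minimal proportional-control sketch with assumed gains and values, not National Instruments' actual implementation.

```python
# Minimal sketch of a remote level-control routine; kp and all values are assumptions.
def control_step(level: float, setpoint: float, kp: float = 0.5) -> float:
    """Return a fill-valve command (0.0 = closed, 1.0 = fully open)
    from a proportional control law on the level error."""
    error = setpoint - level
    return max(0.0, min(1.0, kp * error))

# The operator changes the setpoint from the local computer; the remote
# routine responds on its next cycle without any code being re-deployed.
print(control_step(level=3.0, setpoint=4.0))   # 0.5  (valve half open)
print(control_step(level=4.0, setpoint=4.0))   # 0.0  (at setpoint, valve closed)
```

The point of the architecture is that this routine executes at the tank, so a network outage affects monitoring and setpoint changes, not the control loop itself.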
Ethernet builds momentum
Ethernet has had supporters for several years as an industrial fieldbus. It is the standard backbone for commercial computing and the necessary foundation of the Internet. Since its market is so large, more development money is poured into new products and technical advances than would be possible in the much smaller industrial market. Schneider Electric’s Automation business (North Andover, Mass.) has promoted a ‘Transparent Factory’ built on an Ethernet foundation for several years. The idea is that with a common networking infrastructure from factory floor to enterprise systems, all production information can be seen easily and quickly.
Detractors have voiced concern about the supposed indeterminacy of this network given the possibility of information packet ‘collisions’ over the wire, yielding the chance for information loss or corruption. Either would be disastrous for control applications.
Stan Schneider, president of Real-Time Innovations (RTI, Sunnyvale, Calif.), replies to these critics, ‘We wrote a white paper [available at www.rti.com] examining the effect of collisions on low-level network latencies. This paper, along with many others, shows that concern over collisions is largely unfounded. With the advent of fast, inexpensive switching hubs, even this unfounded worry is eliminated.
‘However, hardware is not the real story. TCP protocol underlies much networking today. TCP cannot be relied upon to be deterministic for many reasons. Attention is now focusing on UDP,’ Mr. Schneider suggests. ‘UDP is part of the standard ‘TCP/IP’ stack providing fast delivery of datagram packets. It also uses one-to-many communications, such as multicast. In addition, we have implemented a Real-time Publish Subscribe protocol to make it simple to send the data packet exactly where it was intended.’
A group of companies that had been working with RTI independently has founded Interface for Distributed Automation (IDA). Announced at ISA/Expo 2000 in New Orleans, these companies have joined marketing and technical efforts on Ethernet and web technologies to develop distributed intelligence architectures. Founding members include RTI, Schneider Electric, AG-E (Lemgo, Germany), Jetter (Ludwigsburg, Germany and Cleveland, O.), Phoenix Contact (Blomberg, Germany and Harrisburg, Pa.), Sick AG (Waldkirch, Germany and Bloomington, Minn.), Kuka Robotics (Augsburg, Germany and Sterling Heights, Mich.), and Lenze (Hameln, Germany and Fairfield, N.J.).
Richard Galera, Schneider Electric’s Transparent Factory Team Leader, notes, ‘Schneider Electric shares the strategy to define a set of services on top of Ethernet and Internet standards to enable distribution of control in PLCs and intelligent devices.’
RTI’s Mr. Schneider shares an insight into implications of this ‘Third Generation’ of control (the first two being centralized control and networks). ‘The third generation, distributed intelligence, enables a component-based object model. With this model, each hardware device is represented as a single, well-defined software object. The object includes the resources and properties that define it as well as methods to control, access, and use it on the network. Control programming in this environment does not require knowledge of where each resource is located. For instance, instead of having to remember that memory location ‘N:1200’ is really ‘speed,’ resources can be referenced by intuitive names like ‘current speed’ for an analog value or ‘axis position reached’ for a digital point.’
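The object model Mr. Schneider describes can be sketched in a few lines. Everything here, the class name, register addresses, and resource names, is hypothetical and exists only to show the idea: application code reads ‘current speed’ by name and never touches a raw memory location like ‘N:1200.’

```python
# Hypothetical sketch of a component-based device object; names and register
# addresses are illustrative, not any vendor's actual API.
class DriveObject:
    """Represents one networked drive. Callers reference resources by
    intuitive name; the raw register map stays hidden inside the object."""

    def __init__(self):
        self._registers = {0x1200: 0.0, 0x1201: False}   # raw device memory
        self._names = {
            "current speed": 0x1200,          # analog value
            "axis position reached": 0x1201,  # digital point
        }

    def read(self, resource: str):
        return self._registers[self._names[resource]]

    def write(self, resource: str, value) -> None:
        self._registers[self._names[resource]] = value

drive = DriveObject()
drive.write("current speed", 1450.0)
print(drive.read("current speed"))           # 1450.0
print(drive.read("axis position reached"))   # False
```

Because control code depends only on names, the drive can move to a different node, or a different register map, without touching the programs that use it.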
Putting it together
Jim Berlin, senior vp technology at GE Fanuc Automation (Charlottesville, Va.), advises looking at the entire system when constructing a distributed system. ‘It takes a full set of small to mid-range controllers matched to the needs of applications throughout the system,’ he adds. ‘These controllers must be tied together by a reliable, high-speed LAN so control information can be shared in real time. Most importantly, you need software that allows you to easily set up the system and provides visualization tools for monitoring and characterizing what is going on. The entire system must be accessible from anywhere within the plant or around the world.’
Dave Quebbemann, Omron Electronics’ (Schaumburg, Ill.) industrial automation marketing manager, suggests that users, ‘Look at overall system layout and requirements. Determine the amount of processing power required and the total number of I/O points. Also look at the system to determine ways in which you could aggregate or consolidate I/O stations or points in a logical manner. See what type of system update time is required so you know what type of scan time you need or how fast you need a system update. Ensure compatibility of the fieldbus network, between different input and output devices, and the main processing unit.’
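Mr. Quebbemann's advice to match network capacity to the required update time amounts to a quick sizing calculation. The figures below, station count, bytes per station, baud rate, and the protocol-overhead factor, are all assumptions for illustration; real fieldbus cycle times depend on the specific protocol's framing and arbitration.

```python
# Rough sizing check with assumed figures: can one network cycle finish
# within the system update time the application requires?
io_stations = 12
bytes_per_station = 16
baud = 500_000          # e.g., a 500 kbit/s fieldbus segment (assumption)
overhead = 1.5          # framing/turnaround factor (assumption)

bits_per_cycle = io_stations * bytes_per_station * 8 * overhead
cycle_time_ms = bits_per_cycle / baud * 1000
print(round(cycle_time_ms, 2))   # 4.61 ms per full network update
```

If the machine needs a 10 ms system update, this layout has headroom; if it needs 2 ms, the engineer must consolidate stations, raise the baud rate, or partition the network, exactly the aggregation decisions described above.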
Another factor to consider is integrating all this with the entire plant, or even the business. Paul Ruland, PLC & I/O product manager at Automationdirect.com (Cumming, Ga.), notes, ‘Distributed control now often needs to include the ability to instantly connect to other Microsoft-based systems in the enterprise, enabling implementation of an ‘e-manufacturing’ structure. This eliminates the automation islands that often exist between process manufacturing and material handling and packaging lines.’
Udo Lutze, president of Lutze (Charlotte, N.C.), notes that the flow of information from sensor to IT department must not only be simple, it must also not affect control performance. If control is handled in a decentralized manner, peer-to-peer information traffic does not interfere with machine control.
Machine control will likely be much different in just a few years from what it is today, perhaps with an interactive sensor/effector/computer design (as Paul Saffo discusses in the sidebar). Whether or not you agree that will happen soon, or at all, you can re-visit overall machine and process line designs in light of this new ability to decentralize control.
The Zen of Technological Change
I think the industry is poised on the edge of fundamental and nonlinear change. The whole world is about to get very interested in the areas of deploying sensors into devices, not just for test and instrumentation and control, but for lots of other things besides.
Let me step all the way back to the fundamentals. Think about the whole technology sector. What does success look like? It looks like an S-curve. For example, virtually every single consumer electronic device in the last century followed the S-curve: a long flat line, then a sudden inflection point screaming upward to massive adoption.
Ask anyone to look at the curve and say what matters. They will focus instantly on that inflection point, the point where the curve suddenly starts to curve upwards. They will ignore that thing off to the left, the flat spot. It turns out the secret to understanding what is happening ahead is that flat spot.
That flat spot is the period of interesting failures. The road to ultimate success is paved by successive failures. That’s exactly what is going to happen to your business, because at this moment, we are at the cusp of several curves that are all converging around sensor and effector technology. It is a moment of what economist Joseph Schumpeter would describe as a moment of creative destruction.
It turns out, if you look back at the recent history of the information revolution, about every 10 years a new foundational technology arrives that fundamentally sets the landscape for the entrepreneurship to follow.
The advent of cheap microprocessors ushered in the processing decade of the 1980s. It was a decade where we were utterly preoccupied with trying to process everything we could get our hands on: words, images, figures, graphs. The poster child of the decade was the personal computer.
The next decade was shaped by a fundamentally different technology: cheap lasers. This decade was defined not by processing, but by access. In the 1990s, the devices that mattered were increasingly defined not by what they processed for us, but by what they connected us to. The symbol of the decade was the World Wide Web.
This decade is being shaped by yet another technology: sensors. Cheap sensing is setting us off in a new direction. I think it will be a decade increasingly of interaction, because it’s not just sensors, it’s also effectors.
As sensors and effectors begin to deploy, watch the interface of human and machine disappear. The world of interface is going to be left behind us very quickly because cyberspace and the physical world are going to come together. It’s already revolutionizing process control. I think that the missing piece in mass customization is fine grain sensor/effector technology. And it’s not just going to change the ‘how’ of manufacturing, it’s going to change the scale of manufacturing. We are going to see some manufacturing processes, say in the paper business, where paper plants that take up blocks and blocks get shrunk down to the size of a small office building.
So revolutions beget revolutions. There is a storm of S-curves arising. Everything is up for grabs, everything is in flux; there are huge opportunities, there are huge risks. It is truly an age of creative destruction, and the challenge in an age of creative destruction, spotting the next wave, is going to be very hard indeed.
Paul Saffo is director of The Institute for the Future (San Francisco, Calif.), a research firm that provides strategic planning and forecasting. This text is adapted from his talk at National Instruments’ NIWeek 2000.