Collecting Data Reports from Controllers

A tutorial on how to create a simple solution using common off-the-shelf software to solve a variety of manufacturing data collection needs.

By William Love, Kredit Automation & Controls, March 1, 2010

Process and quality engineers, production managers, and others in the discrete and process industries are increasingly asking for snapshot and composite reports of control and production data. Their motivations for these requests are wide ranging. An engineer at one site wants machine cycle data in a spreadsheet to ponder process improvements. Another engineer wants a complex report created for each machine cycle but does not plan to look at it often; therefore, he just wants it stored in a format that any computer system can import. The list of such divergent requirements goes on and on.

 

Aluminum extrusion plant’s press, with collected cycle data in Microsoft Access.

Despite the diversity of their data needs, one thing these users have in common is difficulty finding a simple solution. An online search for data acquisition resources returns a multitude of confusing results. To address this information gap, this article clarifies some rather vague terms (see “Terminology” box), and helps readers understand why the solution to their own requirement is often different from what those terms typically refer to. To help clarify things further, a method is described that uses existing control system software to extract data in a form and location that can be opened in a spreadsheet, database, or any other computer system application, including MES or ERP systems. This solution leverages Microsoft technologies commonly installed on operator interface computers.

Types of data acquisition

Because SCADA has “data acquisition” in its name, a person interested in plant data reports who searches online often gets immersed in articles about control systems. The term SCADA had a somewhat different meaning a few years ago. When mines, oil and gas fields, and other industries had scattered controllers and needed to acquire data and adjust setpoints or parameters, SCADA referred to the art of collecting this far-flung data (data acquisition) and making adjustments to the controls (supervisory control) in response.

Despite the roots of the name, many current references to SCADA often refer to a PLC/HMI-based control system. This shift occurred when operator interface computers (HMIs) became a common part of just about every control system.

Another common data acquisition (DAQ) activity, which is much different from SCADA, is all about extremely fast and accurate measurement. Such measurements are not about extruder cycle times or totalized flows; they’re about measuring pressure or current variations at kilohertz or megahertz rates. Scientists studying a flux field or a reaction rate are the folks using these systems.

Some of the DAQ solutions used for these measurements are standalone systems that look like the early portable suitcase computers. Other solutions use expensive cards installed in a PC. An extreme example is a digital oscilloscope. This type of data acquisition is much different from SCADA. Yet some product offerings seem to be promoted for general-purpose use in both scientific and industrial settings.

This dual nature is exemplified by a recent statement in a control systems article: “We required multi-channel data acquisition at 100k samples/sec synchronized with closed-loop hydraulic servo control on two axes.” In that application, the line between motion control and DAQ appears to blur. But a process technician or operations manager knows that whatever that advanced application is all about, it is not what is needed to get some specified data into a spreadsheet for crunching and pondering.

CSV file pipe test data viewed in Microsoft Excel.

Getting the data you need

With all the changes that have occurred around control system functionality, capability, and terminology, the simple art of getting selected data out of the control system has apparently become too broad a topic to have a clearly agreed-upon name. So, to arrive at some common ground on the subject of data collection, let’s look at some real-life engineering examples.

A process technician at an aluminum extrusion plant is trying to solve quality and process problems. He is also running optimization tests by raising the extrusion speed and observing the effect on runout temperature, quench rate, and product quality. Data is being collected with eyeballs and clipboards. He knows that some averaging and other data analysis is needed before saving the record for a cycle. For example, the quench rate is not just a simple number sitting in a data register. The engineer tried an expensive third-party software package, and it just did not work out. So he is seeking a more tailored solution, with more targeted results, that would let him view the data in Microsoft Access. His question is: How can I get data in a spreadsheet at exactly the right times in the cycle and in the right format?

A plant manager at a facility producing a number of formula-based, blended stucco materials in a hybrid batch/continuous process wants to track material usage in a very specific way. The manager knows this will require some programming in the control system (PLC/HMI) to create and capture events. He also knows that the PLC, transaction manager, or historian will have to perform some calculations and buffer and group data. Ultimately, the data collected has to be readable by an IBM mainframe (currently in use) and an Oracle database (a near-term planned purchase).

The quality control department at a plastic PVC pipe extrusion manufacturing facility wants to eliminate the chart recorders used to prove that pressure tests have been conducted for each pipe. The engineers in the department want the trend pens to appear on the HMI. But they also want the crucial test data—which is pressure measured at 0.2 sec intervals for five seconds—to be viewable in a database or spreadsheet. They want additional information from the cycle logged at exactly the same time, and with very specific and precise synchronization to other events in the machine cycle. In the end, they are looking for all this data to reside in a flat file that anything from Oracle to Notepad to Microsoft Excel could access.

What do these examples have in common? No matter what system or solution is used, they all involve some programming to correlate and capture specified events and store data in registers. They also all involve creating a new data record after some specified event(s) occur. In some cases, this is a newly defined event created as a composite or a derived event based on several conditions and individual events within the machine or process cycle. Two of the cases noted above involve getting the new data record in a format that any tool—a spreadsheet, database, MES/ERP system or even a text editor—could open to view the record.
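
Whether such a composite event is latched in the controller or derived in the HMI, the logic amounts to watching a handful of conditions and acting on the transition. As an illustration only, here is how that pattern might be sketched in HMI VBA; the tag names and the ReadTagBool and CaptureCycleRecord routines are hypothetical placeholders for whatever your particular system exposes, not any vendor’s actual API.

' Illustrative only: derive a composite "cycle complete" event and make sure
' each machine cycle produces exactly one data record. ReadTagBool and
' CaptureCycleRecord are hypothetical placeholders for the tag-read call and
' record-capture routine in your own system.
Private lastCycleComplete As Boolean

Public Sub CheckForNewRecord()
    Dim cycleComplete As Boolean

    ' Composite event: several individual conditions must be true at once
    cycleComplete = ReadTagBool("Press.Retracted") And _
                    ReadTagBool("Quench.Done") And _
                    Not ReadTagBool("Press.Faulted")

    ' Rising-edge detection: act only on the transition, not while it stays true
    If cycleComplete And Not lastCycleComplete Then
        CaptureCycleRecord   ' read and buffer the registers of interest
    End If

    lastCycleComplete = cycleComplete
End Sub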

How much custom programming?

The engineers at these plants knew their specifications required custom programming. The question was: How much? Did the whole solution have to be homegrown? Should they buy the powerful SCADA add-ons and have solutions engineered with those? Or could their existing control system provide some of the tools to allow a full but straightforward solution?

The answer to this last question is yes in all cases; the existing control system provided many of the tools needed to build a solution:

The composite and derived events could be created in the controller (PLC);

The data could be buffered in the PLC or the HMI (operator interface software system);

The HMI and its software drivers for Windows PCs could act as the data servers;

The HMI had VBA (Microsoft Visual Basic for Applications) embedded, which made it well suited to writing the DAQ code that puts the data into the required form and stores it in a universally readable format (see the sketch after this list); and

Microsoft Office could be used to access the data for human cognitive analysis.
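
As a rough sketch of how those pieces connect, the DAQ procedure that runs when the controller flags a completed cycle might look like the following. ReadTagFloat stands in for whatever read call the HMI’s data server or driver actually provides, and the tag names, units, and file path are invented for the example.

' Sketch of the record-capture step. ReadTagFloat is a hypothetical stand-in
' for the tag-read call exposed by the HMI or its driver; the tag names and
' log file path are examples only.
Public Sub CaptureCycleRecord()
    Dim rec(0 To 4) As Variant

    rec(0) = Format(Now, "yyyy-mm-dd hh:nn:ss")      ' time stamp
    rec(1) = ReadTagFloat("Press.CycleTime")         ' sec
    rec(2) = ReadTagFloat("Extrusion.Speed")         ' ft/min
    rec(3) = ReadTagFloat("Runout.Temperature")      ' deg F
    rec(4) = ReadTagFloat("Quench.Rate")             ' computed in the PLC

    WriteCsvRecord "C:\PlantData\CycleLog.csv", rec  ' defined in a later sketch
End Sub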

The only software that had to be purchased was another copy of Office. In all cases mentioned, the HMI had VBA. On other projects I have worked on, the HMI did not have VBA embedded. In those cases, the VBA code was put into a regular Visual Basic program, and a driver was purchased to allow that Visual Basic program to extract data from the PLC. (Note: VBA is a language; Visual Basic is an application development tool whose programming language is VBA.)

In two of the three cases cited above (plastic and stucco), the data was stored in text files in comma-separated values (CSV) format. It is important to emphasize the portability of this data: this kind of text file is universally readable. XML files are another format this solution method can create. The point is that the data should be easily and universally accessible.
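
Writing the CSV record itself needs nothing beyond VBA’s built-in file statements. A minimal version of the WriteCsvRecord routine used in the earlier sketch, with no real error handling, might be:

' Appends one record as a comma-separated line to a plain text file that
' Excel, Access, Oracle, or a mainframe job can read. Minimal sketch; real
' code would add error handling and quoting of any text fields.
Public Sub WriteCsvRecord(ByVal filePath As String, ByRef rec() As Variant)
    Dim f As Integer
    Dim i As Long
    Dim csvLine As String

    For i = LBound(rec) To UBound(rec)
        If i > LBound(rec) Then csvLine = csvLine & ","
        csvLine = csvLine & CStr(rec(i))
    Next i

    f = FreeFile
    Open filePath For Append As #f   ' creates the file if it does not exist
    Print #f, csvLine
    Close #f
End Sub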

At the stucco production site, an IBM mainframe had to be able to read the data, as did future, unspecified tools such as Oracle or SQL Server. The IBM programmers were quite pleased to learn that all they had to do to get the data was open a text file and read a line. Though this technology may be decades old, it is simple and works flawlessly.

At the PVC pipe manufacturing site, each test on a pipe creates a new CSV file named with a date/time stamp. For the customer, this was a perfect compliance tool. These files are very small and very portable, and thus met the requirement that most any conceivable tool could use them. A master Excel workbook was designed with a button the user presses to import a specific test CSV file and plot the data, displaying the pass/fail curve of the test. All the other key performance indicators (KPIs) are also presented in a useful format that allows the quality control department to learn a lot about the test at a glance.

It is important to note that making this spreadsheet tool was not required; it was just an example of using the tools in Microsoft Office to view the data. The raw CSV files can be opened on any computer without buying anything, and they can be imported into any program. Having said that, the spreadsheet only took a short time to create, and the Chart Wizard did a lot of the work. The hardest thing was automating the import of the CSV file data.
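
For readers curious what that automation can look like, the sketch below is ordinary Excel VBA that prompts for a test file and parses it onto a worksheet with a QueryTable; the sheet name and layout are assumptions, not the plant’s actual workbook.

' Prompts for a test CSV file and imports it onto a "TestData" worksheet,
' where an existing chart can plot the pressure curve. Sheet name and target
' range are assumptions for this sketch.
Public Sub ImportTestFile()
    Dim csvPath As Variant

    csvPath = Application.GetOpenFilename("CSV files (*.csv), *.csv")
    If VarType(csvPath) = vbBoolean Then Exit Sub    ' user pressed Cancel

    Worksheets("TestData").Cells.Clear

    With Worksheets("TestData").QueryTables.Add( _
            Connection:="TEXT;" & csvPath, _
            Destination:=Worksheets("TestData").Range("A1"))
        .TextFileParseType = xlDelimited
        .TextFileCommaDelimiter = True
        .Refresh BackgroundQuery:=False
    End With
End Sub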

The aluminum plant elected to log the data directly into an Access database. The reason: Process technicians just wanted a friendly tool to allow them to view machine cycle data. An added benefit is that the plant’s MES and ERP system can act as clients to the Access database if that data is needed in the future.
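
Logging straight to Access from HMI VBA is similarly plain. A minimal sketch using late-bound ADO and the Jet provider follows; the database path, table, and column names are invented for the example, and the values would come from the same kind of tag reads shown earlier.

' Inserts one cycle record into an Access database via ADO. Database path,
' table, and column names are assumptions for this sketch.
Public Sub LogCycleToAccess(ByVal cycleTime As Double, _
                            ByVal runoutTemp As Double, _
                            ByVal quenchRate As Double)
    Dim cn As Object
    Dim sql As String

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\PlantData\CycleData.mdb"

    sql = "INSERT INTO CycleLog (LogTime, CycleTime, RunoutTemp, QuenchRate) " & _
          "VALUES (Now(), " & cycleTime & ", " & runoutTemp & ", " & quenchRate & ")"
    cn.Execute sql

    cn.Close
    Set cn = Nothing
End Sub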

Tools are at hand

A key point to remember is that your HMI consoles will likely have two of the most important tools you need to perform these analyses already installed. The first is the data server, or driver—this is a means of extracting data from plant floor controllers without reinventing the wheel. The second is embedded VBA. When the HMI contains the data server and the VBA, it is easy to do DAQ because the VBA code can read values from the PLC. Because this code lives in the HMI, it really is part of the control system and is not a foreign or extra component that gets lost in the shuffle when the HMI is upgraded or re-installed on a new computer.

A data collection solution designed using these common tools can be tailored to the exact needs and specifications of the end-user. These solutions can also often be expanded to include a formula or recipe system using the same tools.

At the stucco production site, for example, a formula processing system was added. Both the IBM system and the Windows SCADA system use the same material and formula CSV files. The VBA program in the HMI was expanded to include a procedure to perform extensive formula integrity checking.
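
What such a check looks like depends entirely on the formula layout, but as an illustration only, here is a sketch that validates an assumed three-field formula CSV (material code, name, percentage) and confirms the percentages total 100.

' Illustrative integrity check on a formula CSV file. The three-field layout
' (code, name, percentage) and the 100% rule are assumptions for this sketch.
Public Function FormulaFileIsValid(ByVal filePath As String) As Boolean
    Dim f As Integer
    Dim textLine As String
    Dim fields() As String
    Dim totalPct As Double

    FormulaFileIsValid = False
    f = FreeFile
    Open filePath For Input As #f

    Do While Not EOF(f)
        Line Input #f, textLine
        fields = Split(textLine, ",")
        If UBound(fields) <> 2 Then GoTo CleanUp       ' wrong number of fields
        If Trim$(fields(0)) = "" Then GoTo CleanUp     ' missing material code
        If Not IsNumeric(fields(2)) Then GoTo CleanUp  ' bad percentage
        totalPct = totalPct + CDbl(fields(2))
    Loop

    FormulaFileIsValid = (Abs(totalPct - 100) < 0.01)  ' percentages must total 100
CleanUp:
    Close #f
End Function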

The fact that all these components (the data server, the formula checker, the DAQ system, and the operator interface) are in one place is important. It means they all get ported or upgraded together when the HMI application is ported or upgraded. That is of great value when others (such as IT) have to work on the system: they can do so with relative ease, and with low risk of neglecting some obscure step that would otherwise hinder or defeat the system.

Terminology

Data acquisition (DAQ) – For this article, we define this as anything that obtains data from a controller or process so that it can be used by humans or computers. Online searches on this phrase give a sea of results for scientific measurement systems and SCADA systems.

Data logging – For many, this term is interchangeable with DAQ. However, some information resources put more emphasis on storing data and less on actually using the data.

Data historian – This is a high-end software program used to compress and store huge amounts of raw process data, and provide tools to retrieve and search the data. When modest or small data sets are needed, this can be a Cadillac solution.

Supervisory control and data acquisition (SCADA) – This once meant collecting key pieces of data from multiple remote places and making adjustments to the controls. Now, this term often is used to refer to a control system. This ambiguity makes search results harder to sort out. Furthermore, control system vendors offer data management tools that can be used for some of the applications described in this article. These are expensive, general-purpose tools that should always be considered for developing a solution, but they can be overkill.

Real time enterprise (RTE) and manufacturing execution system (MES) – These involve getting data in near real time to computers that adjust production parameters and other enterprise systems like inventory, asset management, and scheduling. Astute readers will note this is a logical evolution of the original concept of SCADA as described earlier: getting data and making adjustments.

 

Author Information

William Love is senior engineer at Kredit Automation & Controls, Phoenix, AZ. He can be reached at 602-315-2561.