Six ways data acquisition strategies can support advanced analytics
Cloud-connected data acquisition and advanced analytics are often seen as inseparable. According to this viewpoint, it is impossible to use sophisticated algorithms without an internet connection. This school of thought posits that leveraging process data for machine optimization, troubleshooting, predictive maintenance, and other areas requires a Big Data operation heavily dependent on cloud or edge computing strategies. However, this all-or-nothing mindset is incorrect.
All applications, even those without an internet connection that rely on old-school data acquisition (DAQ) strategies, can use advanced analytics tools to reap the same benefits as cloud-connected operations. Although most applications can connect to the cloud today without overwhelming obstacles, others may require greater caution. Corporate-level restrictions that safeguard proprietary information or protect against external threats could prevent engineers from implementing edge or cloud computing strategies. This changes how data acquisition and advanced analytics tools are selected and implemented, but using both is still possible.
Flexible automation software running on modern control systems can enable analytics on process data no matter how the data is acquired. Traditional control system data collection architectures are necessary in some circumstances, while edge computing can be leveraged in others. To understand the varying levels of data acquisition and analysis used today, it is worth reconsidering each strategy, starting with the most basic options.
1. File-based data acquisition
At the most fundamental level, modern control systems can collect data offline and store it in binary or comma-separated values (CSV) files on local machines or servers. This basic capability provides a direct connection between control systems and OS file systems, and the files can ultimately be imported into databases.
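As a minimal sketch of this pattern, the Python snippet below appends timestamped process samples as CSV rows. The tag names and file handling are illustrative assumptions, not a specific control vendor's API; an in-memory buffer stands in for a log file on the local machine or server.

```python
import csv
import io
from datetime import datetime, timezone

def log_samples(fileobj, samples):
    """Append timestamped (tag, value) process samples as CSV rows."""
    writer = csv.writer(fileobj)
    stamp = datetime.now(timezone.utc).isoformat()
    for tag, value in samples:
        writer.writerow([stamp, tag, value])

# io.StringIO stands in for a CSV log file on the local machine or server.
log_file = io.StringIO()
log_file.write("timestamp,tag,value\n")  # header row, written once per file
log_samples(log_file, [("motor1.speed_rpm", 1480.2), ("motor1.temp_c", 61.7)])
```

A file produced this way can be opened in any spreadsheet tool or bulk-loaded into a database later, which is exactly what makes the format useful as a lowest common denominator.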
2. Direct database connection
A direct connection to databases can be accomplished using common function blocks as available in all IEC 61131 programmable logic controller (PLC) languages. When triggered, the PLC can use structured query language (SQL) to insert or query data directly from the database.
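To illustrate the pattern, the Python sketch below mirrors what a triggered PLC function block does: a single parameterized SQL INSERT against a plant database. An in-memory SQLite database stands in for the real database server, and the table and tag names are assumptions for the example.

```python
import sqlite3

# In-memory SQLite stands in for the plant database a PLC would reach
# through its SQL function blocks; schema and names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE process_log (ts TEXT, tag TEXT, value REAL)")

def on_trigger(ts, tag, value):
    """Mirror of a triggered PLC insert: one parameterized SQL statement."""
    conn.execute(
        "INSERT INTO process_log (ts, tag, value) VALUES (?, ?, ?)",
        (ts, tag, value),
    )
    conn.commit()

on_trigger("2024-01-01T00:00:00Z", "line1.pressure_bar", 4.82)
rows = conn.execute("SELECT tag, value FROM process_log").fetchall()
```

The same trigger-then-insert shape applies whether the statement is issued from a PLC function block or from gateway software sitting next to the controller.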
Modern control systems permit connectivity to many commonly used database options, including traditional relational databases using SQL as well as more contemporary, distributed "not only SQL" (NoSQL) databases.
Widely used SQL databases require data tables to be created and configured in advance. NoSQL options support dynamic collection of process data and often handle large data sets effectively. Even though the traditional approaches are less sophisticated and less flexible than recent database innovations, they still provide engineers with data sets for troubleshooting or general documentation.
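The schema contrast can be shown in a few lines of Python. Below, SQLite models the relational, schema-first style, while plain JSON documents stand in for the schemaless NoSQL style; the field names are invented for the example.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")

# Relational style: the table schema must exist before any data arrives.
conn.execute("CREATE TABLE readings (ts TEXT, temp_c REAL)")
conn.execute("INSERT INTO readings VALUES ('2024-01-01T00:00:00Z', 61.7)")

# Document style (NoSQL-like): each record carries its own structure,
# so new fields can appear without a schema migration. Modeled as JSON.
docs = []
docs.append(json.dumps({"ts": "2024-01-01T00:00:00Z", "temp_c": 61.7}))
docs.append(json.dumps({"ts": "2024-01-01T00:00:10Z", "temp_c": 61.9,
                        "vibration_mm_s": 0.4}))  # new field, no migration
```

The second document adds a vibration field without touching any schema, which is the flexibility the dynamic approach buys at the cost of less up-front structure.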
3. Local machine and process optimization
The next level involves using analytics tools locally for process and machine optimization. This requires an "analytics workbench," whether it is built into the control system or supplied by a third party. With such a workbench, controls engineers can leverage the information stored in databases (or local binary files) to improve machine and process efficiencies. These tools can run on the same controller that automates the machine, perhaps at the end of a line or cell, or even at the enterprise level.
Cloud connectivity may seem crucial to these processes, but the approach can be as simple as logging data to local binary files that are then available via local networks. It also is possible to transfer files manually in scenarios where no network connectivity to the equipment is permitted. Either way, the data acquired on controllers becomes available for advanced algorithms running on separate PCs. This is beneficial when a machine needs to remain isolated for safety or security reasons.
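As a sketch of what such offline analysis might look like, the Python example below reads a cycle-time log (the kind of file that could be pulled manually from an isolated machine) and flags abnormal cycles using only standard-library statistics. The data and thresholds are invented for illustration.

```python
import csv
import io
import statistics

# Hypothetical cycle-time log, e.g. copied by hand from an isolated machine.
raw = io.StringIO(
    "timestamp,cycle_time_s\n"
    "t1,12.1\nt2,12.3\nt3,11.9\nt4,12.2\nt5,18.7\nt6,12.0\n"
)

times = [float(row["cycle_time_s"]) for row in csv.DictReader(raw)]
baseline = statistics.median(times)   # robust center of normal cycles
spread = statistics.stdev(times)      # sample standard deviation

# Flag cycles far from the baseline as candidates for troubleshooting.
outliers = [t for t in times if abs(t - baseline) > 2 * spread]
```

Here the 18.7-second cycle stands out against a roughly 12-second baseline, which is the kind of quick insight an analytics workbench can deliver without any network connection to the machine.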
4. Enterprise-level analytics
Engineers also can implement advanced analytics at the enterprise level without cloud connectivity. This approach employs modern analytics workbenches to generate code that performs 24/7 analytics for production lines. The strategy can incorporate much larger data sets from entire lines or plants, with sensors and fieldbuses providing key information.
As the scope of the project grows, the strategy might need to change. Retrieving data manually from isolated machines does not present an optimal methodology. In addition, the first three strategies mentioned are constrained by geolocation. They might not support advanced capabilities required to solve the application challenge, such as machine learning and artificial intelligence (AI). This creates limitations for multi-site operations or machinery original equipment manufacturers (OEMs) that desire advanced troubleshooting, efficiency improvements, and machine health monitoring on a large scale.
5. Cloud computing
Cloud computing has gained prominence for solving the previously mentioned issues. Because it imposes no geolocation constraints and requires no fixed central point for data, it makes it possible to compare machines, lines, and plants to determine overall equipment effectiveness (OEE) and machine health, especially when machine builder OEMs provide web portals as part of maintenance and support services.
The end users can even see which operators manage machinery most efficiently. With the data collected in a central location, engineers can use AI tools, implement machine learning, and run advanced algorithms that may not be available locally.
Plants must have cloud-capable controllers or add-on Internet of Things (IoT) gateway devices that use standard communication protocols. These protocols enable connectivity to all major public cloud service providers, which provide storage and data analysis. However, this requires substantial bandwidth to transmit such massive data quantities. The larger issue is these data acquisition and analytics strategies can drift into the information technology (IT) realm. Without IoT-ready automation software, which incorporates all of these tools into an engineering environment, it is difficult for controls engineers to leverage data analytics without additional training to learn how to navigate additional software platforms.
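One common standard protocol for such gateways is MQTT, which carries small JSON messages on hierarchical topics. The Python sketch below only builds a telemetry message of that general shape; the topic layout and field names are assumptions, not any vendor's schema, and actual publishing to a broker is omitted.

```python
import json
from datetime import datetime, timezone

def build_telemetry(machine_id, readings):
    """Build a JSON telemetry message of the general shape an IoT gateway
    might publish over MQTT or a similar standard protocol. Topic layout
    and field names are illustrative assumptions."""
    return {
        "topic": f"plant/line1/{machine_id}/telemetry",
        "payload": json.dumps({
            "ts": datetime.now(timezone.utc).isoformat(),
            "machine": machine_id,
            "readings": readings,
        }),
    }

msg = build_telemetry("press_04", {"oee_pct": 83.5, "spindle_temp_c": 58.2})
```

Because every message of this kind crosses the plant network boundary, the bandwidth and IT-ownership concerns mentioned above apply to each field that is transmitted rather than processed locally.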
6. Edge computing
The final level used today is edge computing, which supports data filtering and advanced analytics as run on the local controller before possible transmission to the cloud for further analysis. This is a very similar implementation to local machine and process optimization, as explained above. By running these tools on the machine, some edge computing platforms become a hybrid between local and cloud-based data management. Hybrid solutions allow engineers to monitor devices and build algorithms in the cloud. However, they implement and run those algorithms at the machine level.
These hybrid edge computing tools provide the advanced benefits of cloud computing while enabling faster reactions and avoiding strain on the plant's bandwidth or networking costs. The approach is optimal for both large end users and OEMs with machines spread across the globe. Machines with edge computing functionality also can process the information and make corrections to minimize downtime, and because these tools run in containers rather than relying on the traditional hypervisor approach to virtualization, they boost processor efficiency. Containers also are more flexible and secure for edge computing applications.
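The bandwidth saving comes from reducing raw data at the edge before anything leaves the machine. The Python sketch below shows the idea in its simplest form: a window of raw sensor readings collapses into a small summary record, which is all the cloud would receive. The window size and field names are illustrative.

```python
import statistics

def summarize_window(samples):
    """Reduce a raw sample window to the summary record an edge node
    would forward, trading bandwidth for a small loss of raw detail."""
    return {
        "count": len(samples),
        "mean": round(statistics.fmean(samples), 3),
        "min": min(samples),
        "max": max(samples),
    }

window = [4.81, 4.79, 4.83, 4.80, 4.82]  # raw pressure readings (bar)
summary = summarize_window(window)  # five raw values shrink to four fields
```

In a real deployment the window would hold thousands of samples per transmission interval, so the reduction ratio (and the bandwidth saving) is far larger than in this toy example.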
Reasons to avoid cloud connectivity
Some end users have legitimate reasons for not embracing cloud connectivity. For example, plants that handle hazardous materials should be careful when considering cloud-connected systems. Pharmaceutical manufacturers place quality control, consumer health, and safety above the benefits of cloud connectivity, even when they’re trying to improve traceability. These plants stick to the first three implementation examples mentioned so machines remain isolated or connected through local networks or on-premises local clouds.
It is easy to overlook the advantages contemporary analytics software provides to offline equipment. Advanced analytics tools remain useful for troubleshooting, process optimizations, and real-time machine health monitoring. If an analytics workbench is built into machine control software, engineers can run advanced algorithms to interpret machine data with a minimal learning curve.
Machine builders will find this helpful when providing remote support for machines that are offline. For example, OEMs can set up ring buffers to log the last three hours’ worth of data. If a system experiences faults or other abnormal behavior, end users can download the files from the network or machine and send them to the OEM. The machine builder can use the same analytics workbench to find potential anomalies. It is possible to troubleshoot machines and improve performance. However, it is not as ideal as having a live connection to the machine or data.
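The ring-buffer idea maps naturally onto a fixed-capacity queue that discards the oldest entry as each new one arrives. The Python sketch below models it with a bounded deque; the one-sample-per-second rate and the record layout are assumptions for the example.

```python
from collections import deque

# Ring buffer sized for the retention window: at one sample per second,
# three hours is 10,800 entries (rates and sizes are illustrative).
SAMPLES_PER_HOUR = 3600
buffer = deque(maxlen=3 * SAMPLES_PER_HOUR)

# Simulate four hours of logging; the deque silently drops the oldest
# samples, so only the most recent three hours survive.
for second in range(4 * SAMPLES_PER_HOUR):
    buffer.append({"t": second, "value": second * 0.1})

# On a fault, the buffer contents are the snapshot an end user would
# export from the machine and send to the OEM for analysis.
```

Because the buffer never grows past its limit, logging can run indefinitely on a controller with fixed storage, which is what makes the approach practical for always-on offline machines.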
While there are some legitimate concerns, there are many reasons to connect applications to the cloud. Not implementing cloud or edge strategies because a factory has managed without one ignores the potential benefits and walks away from real-world competitive advantages.
Engineers must consider all factors when deciding whether to make the leap into IoT or remain off the grid. Either way, advanced analytics technologies provide key improvements for every type of data acquisition strategy and comfort level.
Daymon Thompson, automation product manager, North America, Beckhoff Automation. Edited by Chris Vavra, production editor, Control Engineering, CFE Media, email@example.com.
Flexible automation software running on modern control systems can enable analytics on process data.
Data acquisition and advanced analytics methods include local machine and process optimization and file-based data acquisition.
More recent data acquisition and advanced analytics methods include cloud and edge computing.
What other ways can data acquisition and advanced analytics support manufacturers?