Advanced algorithms for analytics on the edge

Substantial computing power in modern industrial PCs, combined with cloud bandwidth considerations, makes a strong case for analyzing machine performance directly on controllers before sending data to the cloud.

By Daymon Thompson November 7, 2018

The debate between cloud and edge computing strategies remains a point of contention for many control engineers. However, most agree that smart factories in an Industrie 4.0 context must efficiently collect, visualize, and analyze data from machines and production lines to enhance equipment performance and production processes. Advanced analytics algorithms allow companies to sift through this mass of information, or Big Data, to identify areas for improvement.

To some, edge computing devices seem to create an unnecessary step when all data can be managed in the cloud with virtually limitless storage. Message Queuing Telemetry Transport (MQTT) encryption and the data security built into the OPC Unified Architecture (OPC UA) specification ensure data remains secure while it is being transferred. When it comes to analytics and data management, however, edge computing presents important advantages for monitoring equipment health and maximizing production uptime.

Because of the massive amount of data that modern machines can produce, bandwidth can limit cloud computing or push costs outside of a set budget. New analytics software strategies for PC-based controllers allow control engineers to leverage advanced algorithms locally in addition to data pre-processing and compression. As a result, a key advance is the concept of processing data on the edge first, which enables individual machines and lines to identify inefficiencies on their own and make improvements before using the cloud for further analysis across the enterprise.

The cloud and data bandwidth challenges

Running all analytics in the cloud can be expensive, depending on the service plan and how much storage space is needed, but the more difficult proposition is transferring the data there in the first place. Managing bandwidth can be a serious issue for factories, since the average Ethernet connection speed across the globe is 7.2 Mbps, according to the most recent connectivity report from Akamai.

When even one machine sends data to the cloud, let alone several, little to no bandwidth remains for the rest of the operation. Kloepfer, Koch, Bartel and Friedmann published two use cases that illustrate this point.

In the first use case, monitoring the structural dynamics of wind turbines with 50 sensors at a 100 hertz sampling rate required 2.8 Mbps of bandwidth to stream all data to the cloud in standard JavaScript Object Notation (JSON). The second case, asset condition monitoring in intralogistics, used 20 sensors at a 1,000 hertz sampling rate and required 11.4 Mbps in JSON. This is a relevant test because JSON is a common format for sending data to the cloud or across the web.
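A back-of-the-envelope calculation shows how quickly these bandwidth figures add up. The per-sample JSON payload size below (70 bytes) is an illustrative assumption, not a figure from the study; actual sizes depend on field names, timestamps, and numeric precision.

```python
# Rough estimate of streaming bandwidth for JSON-encoded sensor data.
# bytes_per_sample = 70 is an assumed average JSON payload size per reading.

def json_stream_mbps(sensors: int, rate_hz: int, bytes_per_sample: int = 70) -> float:
    """Approximate bandwidth in Mbps to stream every sample to the cloud as JSON."""
    bits_per_second = sensors * rate_hz * bytes_per_sample * 8
    return bits_per_second / 1_000_000

# Wind-turbine case: 50 sensors at 100 Hz
print(f"{json_stream_mbps(50, 100):.1f} Mbps")   # prints "2.8 Mbps"
# Intralogistics case: 20 sensors at 1,000 Hz
print(f"{json_stream_mbps(20, 1000):.1f} Mbps")  # prints "11.2 Mbps", close to the cited 11.4
```

With the assumed payload size, one intralogistics installation alone already exceeds the 7.2 Mbps average connection discussed below.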

Without compression or pre-processing mechanisms, an average 7.2 Mbps internet connection cannot stream data from three or more large machines, or from a full logistics operation that requires advanced measurement, condition monitoring and production traceability. A factory must either use a much larger connection or multiple connections, or it can leverage advanced analytics on the edge.

Edge devices and advanced algorithms

In the past, most programmable logic controllers (PLCs) were capable of controlling repetitive tasks in machines but possessed the computing prowess of a smart toaster. Modern industrial PCs (IPCs) feature ample storage and powerful processors with four, or as many as 36, cores. The automation software packages for these IPCs run alongside Windows, support third-party applications and can be accessed remotely. PC-based control software can provide advanced algorithms to manage data, such as pre-processing, compression, measurement, and condition monitoring, without requiring a separate, standalone software platform.

Condition monitoring performs many operations locally, such as converting raw accelerometer data into the frequency domain. This can be done on an edge device or within the machine controller's PLC program. For example, when analyzing vibration, the information is often collected as a 0 to 10 V or 4 to 20 mA signal. It can be converted to a more usable format on the controller through a Fast Fourier Transform (FFT) algorithm.
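In a PLC this step would typically use a vendor function block, but the transform itself can be sketched in a few lines of Python. The 10 kHz sampling rate and the synthetic 120 Hz vibration component are illustrative assumptions, standing in for a scaled accelerometer signal.

```python
import numpy as np

# Sketch: converting a raw vibration signal into the frequency domain with an FFT.
fs = 10_000                                  # assumed sampling rate in Hz
t = np.arange(0, 1.0, 1 / fs)                # one second of samples
signal = np.sin(2 * np.pi * 120 * t)         # stand-in for scaled accelerometer data

spectrum = np.abs(np.fft.rfft(signal)) / len(signal)  # single-sided magnitude
freqs = np.fft.rfftfreq(len(signal), 1 / fs)          # frequency bins in Hz

peak_hz = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak_hz:.0f} Hz")        # prints "dominant frequency: 120 Hz"
```

The same result computed on the controller can trigger an alarm locally instead of streaming every raw sample to the cloud.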

More extensive machine vibration evaluations are possible using DIN ISO 10816-3: Mechanical Vibration-Evaluation of machine vibration by measurements on non-rotating parts. To monitor bearing life and other specific components, algorithms are available to add to a PLC program to calculate the envelope spectrum first and then the power spectrum. Many common machine conditions and predictive maintenance algorithms can be evaluated within the machine control, or on an edge device.
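The envelope-spectrum step for bearing monitoring can be illustrated with a short sketch: a high-frequency structural resonance modulated at the bearing fault repetition rate, demodulated with a Hilbert transform. The 3 kHz resonance and 90 Hz fault rate are illustrative assumptions, not values from the standard.

```python
import numpy as np
from scipy.signal import hilbert

# Sketch of envelope analysis: recover a bearing fault repetition rate
# hidden inside a high-frequency carrier (assumed values for illustration).
fs = 20_000
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 3000 * t)               # structural resonance
modulation = 1 + 0.5 * np.sin(2 * np.pi * 90 * t)    # bearing fault repetition
vibration = modulation * carrier

envelope = np.abs(hilbert(vibration))                # demodulate the signal
env_spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)

fault_hz = freqs[np.argmax(env_spectrum)]
print(f"fault frequency: {fault_hz:.0f} Hz")         # expect roughly 90 Hz
```

A power spectrum of the envelope then makes the fault frequency and its harmonics directly comparable against alarm thresholds in the PLC program.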

Automation software should offer built-in algorithms to process deterministic and stochastic data. If the data is deterministic, controllers using pre-processing algorithms could send certain values only upon a change. The recipient knows the mathematical correlation and can reconstruct the original signal if desired. For stochastic data, the controller can send statistical information such as the average value. Although the original signal is unknown, the recipient can still use the compressed, statistical information.
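Both pre-processing strategies can be sketched in plain Python; the deadband threshold and the choice of statistics are illustrative assumptions.

```python
# Two pre-processing strategies for reducing cloud traffic (illustrative sketch).

def send_on_change(samples, deadband=0.5):
    """Deterministic data: forward a value only when it leaves a deadband."""
    sent, last = [], None
    for value in samples:
        if last is None or abs(value - last) >= deadband:
            sent.append(value)
            last = value
    return sent

def summarize(samples):
    """Stochastic data: forward compressed statistics instead of every sample."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / n
    return {"count": n, "mean": mean, "min": min(samples),
            "max": max(samples), "variance": variance}

print(send_on_change([1.0, 1.1, 1.2, 2.0, 2.1]))  # prints [1.0, 2.0]
print(summarize([1.0, 2.0, 3.0]))                 # mean 2.0, variance ~0.67
```

In the first case five samples shrink to two transmitted values; in the second, an arbitrarily long window shrinks to a fixed-size summary record.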

It also is possible to implement algorithms on the IPC to monitor process data over a set sequence. This includes periodically writing input data, according to a configured number of learned points, to a file or database. After storing standard values, such as torque for a motion operation, algorithms compare each cycle's values against them. Ensuring data stays within a configured tolerance band creates a type of process window monitoring, which can readjust immediately due to the real-time capability of a local controller.
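A minimal sketch of that process-window check, assuming an averaged reference curve and a fixed tolerance (both illustrative, not values from the article):

```python
# Process-window monitoring sketch: learn a reference curve from good cycles,
# then flag points in a new cycle that leave the reference +/- tolerance band.

def learn_reference(cycles):
    """Average several learned cycles point-by-point into a reference curve."""
    return [sum(points) / len(points) for points in zip(*cycles)]

def check_cycle(cycle, reference, tolerance=0.2):
    """Return the indices where the cycle leaves the tolerance band."""
    return [i for i, (value, ref) in enumerate(zip(cycle, reference))
            if abs(value - ref) > tolerance]

# e.g. torque samples from three good motion cycles
good_cycles = [[1.0, 2.0, 3.0], [1.1, 2.1, 2.9], [0.9, 1.9, 3.1]]
reference = learn_reference(good_cycles)        # approximately [1.0, 2.0, 3.0]

print(check_cycle([1.05, 2.0, 3.6], reference)) # prints [2]: third point out of band
```

Because the check runs in the real-time controller, an out-of-band point can stop or adjust the cycle immediately rather than after a cloud round trip.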

Implementing a cloud and edge strategy

Running advanced algorithms on a local edge device reduces cloud bandwidth requirements and offers an efficient strategy for process optimization. However, that does not mean an operation can or should disconnect from the cloud. In the age of the Industrial Internet of Things (IIoT), it is essential to gather and access data across an operation, even if many analysis and decision-making tasks can be completed on local hardware first.

To decide what needs to be sent to the cloud and what can be processed or pre-processed locally, make sure to ask the following questions:

  • What are the goals your operation wants to achieve through data acquisition in this instance?
  • Which data sets from which machines need to be analyzed in order to achieve these goals?
  • What types of data insights does the operation need to improve efficiency and profitability?

Local monitoring with edge computing often works most efficiently to improve the operation of individual machines. However, the cloud provides the best platform to compare separate machines, production lines or manufacturing sites against each other. Implementing both allows an operation to maximize its capabilities.

Daymon Thompson is automation product manager, Beckhoff Automation, North America. Edited by Emily Guenther, associate content manager, Control Engineering, CFE Media, eguenther@cfemedia.com.

MORE ANSWERS

KEYWORDS: Cloud computing, edge computing

  • Comparing edge and cloud computing to optimize production
  • When to use local monitoring versus the cloud
  • How to best access data across facility operations

Consider this

Is your facility analyzing whether data can be processed locally or through the cloud to maximize efficiency?