Implementing cloud connectivity for IoT and Industrie 4.0
Beyond the scope of conventional control tasks, applications such as Big Data, data mining, and condition or power monitoring enable advanced automation solutions. New hardware and software products for Industrie 4.0 and the Industrial Internet of Things (IIoT) make these solutions simple to implement.
Industrie 4.0 and IIoT strategies place strict requirements on the networking and communication capabilities of devices and services. In the traditional communication pyramid in Figure 1, these implementations require large quantities of data to be exchanged between field-level sensors and higher-level layers. However, horizontal communication between programmable logic controller (PLC) systems also plays a critical role in modern production facilities. PC-based control technologies provide universal capabilities for horizontal communication and have become an essential part of present-day automation projects for exactly this reason. Engineering and control software in PC-based control architectures provides the ideal foundational technology for Industrie 4.0 concepts and IoT communication. Moreover, new IIoT-compatible I/O components enable easily configured, seamless integration into public and private cloud applications.
Creating a competitive edge with Industrie 4.0 and IoT
Industrie 4.0 and IoT applications do not start with just the underlying technology. In reality, the work begins much earlier than this. When implementing IoT projects, it is critical to examine the corporate business objectives first and establish the benefits that will be gained from these projects. From an automation provider’s perspective, there are two distinct categories of customers that can be defined: machine manufacturers and the end customers of the automated machines.
In the manufacturing sector in particular, there is an interest in reducing in-house production costs through efficient and reliable production control and by reducing the number of rejects produced. The traditional machine manufacturer pursues very similar objectives and is interested in reducing the cost of the machine while maintaining or increasing production quality. Optimizing the machine’s energy consumption and production cycles, as well as enabling predictive maintenance and fault diagnostics, are rewarding goals.
Specifically, predictive maintenance and fault diagnostics offer the machine manufacturer a solid basis to establish services that can be offered to end customers as an additional revenue stream. What both customer categories ultimately want is for the machine or product to be designed more efficiently and to increase competitiveness in the marketplace.
Collecting and analyzing process data
The process data that is used during production provides a foundation for creating added value and for achieving business objectives. This includes the machine values that are recorded by a sensor and transmitted via a fieldbus to the PLC. This data can be analyzed directly on the controller to monitor system status, using condition monitoring libraries integrated into leading automation software platforms, thereby reducing downtime and maintenance costs.
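The idea of evaluating sensor values directly on the controller can be illustrated with a minimal rolling-statistics check. This is a language-neutral sketch in Python, not the vendor's condition monitoring library; the window size and alarm limit are illustrative values, not defaults from any product.

```python
from statistics import stdev

def vibration_alarm(samples, window=10, limit=2.5):
    """Flag sample windows whose spread exceeds a fixed limit.

    A minimal stand-in for the kind of rolling statistic a
    condition-monitoring library computes on the controller.
    Window size and limit are illustrative, not vendor defaults.
    """
    alarms = []
    for i in range(len(samples) - window + 1):
        if stdev(samples[i:i + window]) > limit:
            alarms.append(i)  # index where the suspicious window starts
    return alarms

# A steady signal raises no alarms; a sudden spike does.
steady = [10.0, 10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0]
spiky = steady[:5] + [25.0] + steady[5:]
```

In a real deployment the same logic would run cyclically in the PLC task, with the alarm indices mapped back to timestamps for maintenance staff.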
When several distributed controllers serve production areas, analyzing data from a single controller may be insufficient. Data collected from multiple or even all controllers in a production system, or from a specific machine type, is often needed to properly analyze the overall system. A corresponding information technology (IT) infrastructure is required for this purpose. Previous implementations focused on a central server system within the machine or corporate network equipped with data storage, often in the form of a database system. This allowed analysis software to access the collected data directly in the database to perform the corresponding evaluations.
Although this approach to collecting and analyzing data in production facilities certainly worked well, it also presented a number of problems, since the required IT infrastructure had to be made available first. High hardware and software costs for the corresponding server systems were an issue, as was the cost of the skilled personnel needed to implement complex, networked production systems. To complicate matters, scalability was very limited: at some point the physical limits of the server system are reached, whether in memory capacity, CPU power, or the performance required for analyses. This often resulted in extensive manual conversion work when systems had to be supplemented with new machines or controllers. At the end of the day, the central server system had to grow alongside the installation to handle and process the additional data volume.
Cloud-based communication and data services now avoid these issues by providing the user with an abstract view of the underlying hardware and software systems. "Abstract" here means that a user does not have to give any thought to the respective server system when using a service; only the use of the respective services has to be considered. All maintenance and update work on the IT infrastructure is performed by the provider of the cloud system. Such cloud systems can be divided into public and private clouds.
Public cloud service providers offer users a range of services from their own data centers. This starts with virtual machines, where the user has control of the operating system and the applications installed on it, and extends to abstracted communication and data services, which can be integrated by the user into an application. These can include access to machine learning algorithms, which can make predictions and perform classifications on the basis of specific machine and production information. The algorithms obtain the necessary content with the aid of the communication services.
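The kind of classification such a service performs can be sketched with a tiny nearest-centroid classifier. This is a conceptual stand-in, not any cloud provider's API; the machine states, feature names, and training values below are invented for illustration.

```python
from math import dist  # Euclidean distance, Python 3.8+

def fit_centroids(samples):
    """samples: {label: [feature tuples]} -> {label: mean feature vector}."""
    centroids = {}
    for label, rows in samples.items():
        centroids[label] = tuple(sum(col) / len(rows) for col in zip(*rows))
    return centroids

def classify(centroids, x):
    """Return the label whose class centroid is closest to vector x."""
    return min(centroids, key=lambda label: dist(centroids[label], x))

# Hypothetical machine states described by (temperature, vibration) readings.
training = {
    "ok":   [(40.0, 0.2), (42.0, 0.3), (41.0, 0.25)],
    "worn": [(55.0, 1.1), (57.0, 1.3), (56.0, 1.2)],
}
model = fit_centroids(training)
```

A cloud ML service would train far richer models, but the interface idea is the same: historical machine data in, a predicted state for new readings out.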
Such communication services are usually based on communication protocols that follow the publish/subscribe principle. This offers advantages from the resulting decoupling of all applications that communicate with one another. On one hand, the time-consuming exchange of address information is reduced: all applications communicate through the central cloud service. On the other hand, data communication with a cloud service via the message broker involves an outgoing communication connection from the perspective of the terminal device, regardless of whether data is sent (publish) or received (subscribe). The advantage for configuring the IT infrastructure is that no incoming communication connections have to be configured, for example in firewalls or other network devices.
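The decoupling described above can be shown with a minimal in-process publish/subscribe dispatcher. This is a teaching sketch, not a real broker such as those used for MQTT or AMQP; the topic string is an invented example.

```python
from collections import defaultdict

class Broker:
    """Minimal in-process message broker illustrating publish/subscribe.

    Publishers and subscribers only know the broker and a topic name,
    never each other's addresses -- the decoupling described above.
    """
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self._subs[topic]:
            callback(payload)

broker = Broker()
received = []
broker.subscribe("plant/line1/temp", received.append)  # hypothetical topic
broker.publish("plant/line1/temp", 72.5)
```

A real message broker adds network transport, sessions, and quality-of-service guarantees, but the routing principle is the same: the publisher never addresses the subscriber directly.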
This significantly reduces IT infrastructure set-up time and maintenance costs. The transport protocols used for data communication are lean and standardized, such as message queuing telemetry transport (MQTT) and advanced message queuing protocol (AMQP). In addition, various security mechanisms can also be anchored here; for example, encryption of data communication and authentication with respect to the message broker. The standardized communication protocol OPC Unified Architecture (OPC UA) has likewise recognized the added value of a publish/subscribe-based communication scenario and taken steps to integrate this communication principle into the specification. This means an additional standard besides MQTT and AMQP is available as a transport mechanism to the cloud.
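What travels over such a lean transport, and how the connection is secured, can be sketched with standard-library tools. The payload field names below are illustrative, not a standardized schema, and the TLS context stands in for the encrypted channel a client would establish to the broker before publishing.

```python
import json
import ssl
from datetime import datetime, timezone

def make_telemetry(machine_id, values):
    """Assemble a JSON payload as it might be published via MQTT or AMQP.

    Field names are illustrative; real services define their own schemas.
    """
    return json.dumps({
        "machineId": machine_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "values": values,
    })

# TLS is layered under the transport protocol: by default the client
# verifies the broker's certificate before any message is exchanged.
context = ssl.create_default_context()

msg = make_telemetry("press-01", {"temperature": 71.4})  # hypothetical machine
```

Authentication with respect to the broker (client certificates, tokens, or username/password) would be configured on top of this encrypted channel.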
Publish/subscribe mechanisms can also be used in a private cloud for company or machine networks. For MQTT and AMQP, the required infrastructure can be installed and made available easily on any PC in the form of a message broker. This means machine-to-machine (M2M) scenarios can be implemented and terminal devices, such as smartphones, can be connected to the controller. Moreover, access to these devices is further secured by means of firewall systems. The extensions of the OPC UA specification with regard to publish/subscribe will also simplify the configuration and use of 1:N communication scenarios within a machine network in the future.
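The 1:N routing inside such a machine network relies on topic filters. MQTT, for instance, defines a `+` wildcard for one topic level and `#` for all remaining levels. The simplified matcher below illustrates the rule (it ignores edge cases such as `$`-prefixed system topics); the topic names are invented examples.

```python
def topic_matches(filter_, topic):
    """Simplified MQTT-style topic filter matching.

    '+' matches exactly one topic level; '#' matches the remainder.
    Edge cases such as '$'-prefixed system topics are not handled.
    """
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, part in enumerate(f_parts):
        if part == "#":
            return True               # matches everything from here down
        if i >= len(t_parts):
            return False              # topic is shorter than the filter
        if part != "+" and part != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)
```

With filters like these, one dashboard subscription such as `plant/#` can receive data from every controller in the network without any per-device configuration.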
Data analytics, machine learning services
Once the data has been sent to a public or private cloud service, the next question is how the data can now continue to be processed. Many public cloud providers offer various analytics and machine learning services that can be used for further examination of process data. These platforms provide relevant mechanisms for data analysis and the ability to record all machine processes.
Depending on requirements, this data can either be stored for evaluation locally on the machine processor or within a public or private cloud. This provides the power to create new business ideas and models for the machine manufacturer and benefits for the customer as well.
Industrie 4.0 and IoT are key concepts within the industry, and they become vital when innovative business models place new requirements on the underlying infrastructure. This also drives the increased convergence of IT and automation technologies. Cloud-based data services can help implement such automation projects, as they save the machine manufacturer or end customer from having to provide the corresponding IT expertise.
Products for Industrie 4.0 and IoT
Beckhoff provides users with a wide variety of components for simple, standardized integration with cloud-based communication and data services. The Internet of Things (IoT) products within the TwinCAT 3 automation software platform offer varied functionalities for exchanging process data via standardized publish/subscribe-based communication protocols and for accessing special data and communication services of public cloud service providers. Corresponding services can be hosted in public cloud systems, such as Microsoft Azure™ or Amazon Web Services, but can be used just as effectively in private cloud systems.
These IoT functions can be accessed either via special function modules directly from the control program or configured outside of the control program via an application called the "TwinCAT IoT data agent." The data to be transmitted can be selected easily via a graphical configurator and configured for transfer to a specific service. An advantage is that the data agent also allows integration of cloud-based services into older, existing TwinCAT systems. The process data can also be exported using the standardized communication protocol OPC Unified Architecture (OPC UA), so that data can likewise be sent from non-Beckhoff systems. A smartphone app is available that enables mobile display of a machine’s alarm and status messages.
- Understanding cloud-based communication services
- The advantages of private and public cloud services for data analysis
- How to optimize data processing and analysis
What are the cybersecurity implications for cloud-based data services?
Control Engineering is hosting a webcast on October 20 at 1 PM CDT on cybersecurity and the IIoT.