Three things to consider when deploying edge computing
Edge computing is changing manufacturing, and companies can take advantage of the benefits it offers. Three aspects to consider when deploying edge computing are highlighted.
Learning Objectives
- Learn how edge computing is changing manufacturing applications and improving artificial intelligence, machine learning (AI/ML) and analytics.
- Learn about the three aspects manufacturers need to consider when deploying edge computing in their facilities.
Edge computing insights
- Edge computing, integrated with AI/ML, offers low-latency communication and tailored applications, revolutionizing controls engineering. Proper infrastructure investment ensures analytic readiness and network scalability.
- Prioritizing security, utilizing orchestration platforms and investing in infrastructure are vital for successful AI/ML implementation at the edge, optimizing plant operations and ensuring managed and protected endpoints.
Edge computing is changing analytics, artificial intelligence (AI) and machine learning (ML) applications. Substantial investments in libraries and frameworks have provided a new frontier for engineers and operations specialists to integrate these capabilities alongside process control. While the possibilities for valued applications are endless, security, management and scalability must be carefully considered during design.
Companies looking to deploy edge solutions need to consider three primary pieces: the applications running at the edge, the infrastructure supporting the edge and the security and orchestration of edge appliances.
1. Edge computing applications
Edge computing refers to the execution of applications near controls processes or machinery. An edge device generally lives on the same network as operational technology (OT) devices and exhibits low-latency communication in data collection and response. An edge device may be a physical appliance; however, a virtual machine (VM) with low network latency to process equipment also can be considered at the edge.
One trade-off between physical devices and VMs is the balance between processing power and flexibility. Virtual machines can be spun up or down as needed, providing flexibility in resource allocation. However, virtualization adds a layer of abstraction that impacts processing efficiency and introduces latency. Physical devices deliver low latency and substantial processing power, but they lose flexibility in resource allocation and are often procured for a single application.
Edge computing provides a spectrum of advantages: quick response times (low latency), efficient use of internet resources (bandwidth efficiency), real-time analytics and efficient application scalability. Localizing applications removes the need to send data to a central server, reducing the logical distance between data generation and data processing. When properly applied, this approach can result in near real-time solutions; applications gain the capability to make quick decisions and deliver rapid responses. Scaling edge infrastructure is crucial for meeting increasing demands while maintaining performance and responsiveness. Scaling is done by adding edge devices to the infrastructure; the distributed workload limits bottlenecks, simplifies node management and improves load balancing.
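The scale-out pattern described above can be sketched in a few lines. This is a minimal, illustrative round-robin dispatcher, assuming hypothetical node hostnames; a real deployment would rely on an orchestration platform for node discovery and health checks.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Distribute incoming work evenly across a pool of edge nodes."""

    def __init__(self, nodes):
        self.nodes = list(nodes)
        self._cycle = cycle(self.nodes)

    def add_node(self, node):
        # Scaling out: adding a node grows capacity without changing callers.
        self.nodes.append(node)
        self._cycle = cycle(self.nodes)

    def next_node(self):
        # Return the next node in rotation for the incoming workload.
        return next(self._cycle)

# Hypothetical edge node addresses, for illustration only.
balancer = RoundRobinBalancer(["edge-01.plant.local", "edge-02.plant.local"])
assignments = [balancer.next_node() for _ in range(4)]
# Work alternates between the two nodes; adding a third node via
# add_node() would spread the same workload across three endpoints.
```

The point of the sketch is that callers never need to know how many nodes exist, which is what makes adding devices a low-friction way to scale.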
Applications running at the edge are use-case specific and often purpose-built. Some examples include analytics or models to optimize process control such as reducing scrap, improving yield, reducing utility consumption, predictive maintenance and more. Data is brought locally to the edge device to be processed. Raw or aggregate data may be pushed up to enterprise servers or the cloud for further analysis; aggregating the data reduces data transfer and storage costs in the cloud.
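The aggregate-before-upload idea above can be shown with a short sketch. The sensor values here are invented for illustration; the pattern is simply that the edge device summarizes raw samples locally and only the small summary payload travels to the cloud.

```python
import json
import statistics

# Hypothetical raw sensor readings collected locally at the edge
# (e.g., temperature samples from a process line, one per second).
raw_readings = [71.8, 72.1, 71.9, 72.4, 72.0, 71.7]

# Aggregate locally: six samples collapse into one summary record.
summary = {
    "count": len(raw_readings),
    "mean": round(statistics.mean(raw_readings), 2),
    "min": min(raw_readings),
    "max": max(raw_readings),
}

# Only this compact payload would be pushed to enterprise servers
# or the cloud, reducing transfer and storage costs.
payload = json.dumps(summary)
```

At a one-per-second sample rate, sending a one-minute summary instead of 60 raw samples cuts upstream traffic by roughly an order of magnitude while preserving the statistics most analyses need.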
2. Edge computing infrastructure
The supporting infrastructure for edge applications is often one of the most overlooked aspects of integrating them. A detailed scope of the edge application is required to understand which process parameters need to be collected, what the application’s output is and what defines success in the process.
Additional considerations, such as the readiness of the process for data collection and advanced analytics, can be derived from an analytics maturity model. Advanced process analytics require investment in data collection, cleansing, contextualization and storage, all supported through complementary infrastructure. Not all organizations are ready for advanced process analytics; work with analytics experts to ensure these solutions can be supported.
As devices are deployed and applications begin to scale, the network load also increases. Understanding where data is being transferred over the network is crucial to minimizing load issues with the current infrastructure; upgrades of network switches to support larger data sizes may be required to minimize network interruptions from bandwidth contention. Consultation with plant floor network integrators during the planning phases can identify and provide solutions to mitigate network contention.
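A back-of-the-envelope estimate helps make the network-load conversation concrete. The device counts, tag counts and sample sizes below are illustrative assumptions, not figures from any particular plant.

```python
def estimated_mbps(devices, tags_per_device, samples_per_sec, bytes_per_sample):
    """Rough aggregate network throughput in megabits per second."""
    bytes_per_sec = devices * tags_per_device * samples_per_sec * bytes_per_sample
    return bytes_per_sec * 8 / 1_000_000  # bytes -> bits -> megabits

# Assumed example: 10 edge devices, each polling 500 tags once per
# second at roughly 64 bytes per sample.
load = estimated_mbps(devices=10, tags_per_device=500,
                      samples_per_sec=1, bytes_per_sample=64)
# 10 * 500 * 1 * 64 = 320,000 B/s, or 2.56 Mb/s of sustained traffic
```

Even this rough arithmetic shows how quickly sustained load grows as devices and tag counts scale, which is exactly the conversation to have with plant floor network integrators before switches become the bottleneck.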
3. Edge computing security and orchestration
The security of edge devices should be considered first in the design process. The functional requirements of an edge application are used to specify the security requirements for processing data. Common examples include data encryption in transit and/or at rest, transport layer security (TLS) communications and system patching. Orchestration platforms can also secure edge devices, manage their lifecycle and update devices and their respective applications. An orchestration platform can operate at scale, providing these services to a large fleet of deployed edge devices. Management actions can be performed across the fleet, such as updating application versions, pushing new analytic models or applying security patches.
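The encryption-in-transit requirement above can be sketched with Python's standard `ssl` module. This is a minimal client-side configuration an edge device might use for outbound connections; the commented CA bundle path is a placeholder, not a real file.

```python
import ssl

# Sketch of a hardened client-side TLS configuration for an edge
# device's outbound connections to enterprise or cloud endpoints.
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)

# Reject legacy protocol versions and require certificate validation.
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.check_hostname = True
context.verify_mode = ssl.CERT_REQUIRED

# An orchestration platform could push an updated CA bundle to the
# whole fleet; the path below is a hypothetical example.
# context.load_verify_locations(cafile="/etc/edge/ca-bundle.pem")
```

Centralizing this configuration in an orchestration platform means a protocol deprecation or certificate rotation is one fleet-wide push rather than a device-by-device visit.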
Building solutions on an orchestration platform accelerates the development and scalability of an application. Small proof-of-concept or pilot solutions can be evaluated and then deployed to the fleet of edge devices working alongside the process. The speed of innovation on an orchestration platform, alongside its built-in security, gives developers a powerful tool to focus on delivering solutions rather than managing deployments and architectures.
Edge computing, embedded with AI/ML, opens doors for new possibilities and challenges in controls engineering. Applications at the edge allow for low-latency communications and can be tailored to target various business goals. Understanding and investing in the supporting infrastructure is necessary for analytic and network readiness at scale. Security and orchestration, with a security-first mindset and robust orchestration platform, ensure edge devices are managed and protected endpoints.
Investing in security, orchestration and infrastructure can pave the way for AI/ML solutions at the edge to optimize plant operations.
Jackson Cates, business analyst; Nick Malott, technology analyst; Tiati Thelen, business analyst II; all with Interstates, a CFE Media and Technology content partner. Edited by Chris Vavra, web content manager, CFE Media and Technology, cvavra@cfemedia.com.
MORE ANSWERS
Keywords: edge computing, virtual machines
ONLINE
See additional edge computing stories at https://www.controleng.com/edge-cloud-computing/
CONSIDER THIS
How can edge computing improve your facility?