Edge computing offers 4-step pathway to digital transformation
Special Report: Living on the edge: Putting computing power close to the process reduces control system latency, creates a distributed architecture, can integrate machine learning (ML) and artificial intelligence (AI) capabilities for faster, more flexible optimization, and can expand and improve the use of cloud services. See four steps toward edge computing.
Edge computing architectures have been advanced by cloud services, which have long helped companies simplify and secure data aggregation and processing functions. As time progressed, technological advancements such as mobility and differentiation in user interface and user experience (UI/UX) approaches resulted in more connected and distributed processes, devices and machines. Companies realized there was a need for more immediate intelligence from data closer to the source. This increased demand for edge computing.
Edge computing: Four-step progression
Four steps toward greater adoption of edge computing follow (see graphic).
- Cloud computing revolutionizes business.
- Edge computing architectures resolve emerging challenges.
- Concerns arise over costs of digital transformation.
- Optimization and asset utilization benefits make edge computing more feasible.
Edge computing can be crucial for some industries and manufacturing applications to resolve a range of challenges, including equipment breakdown and unplanned downtime. Intelligent temperature monitoring sensors, for example, can be automated to record temperature changes in their immediate surroundings. In an emergency, these devices can activate sprinklers, alert the fire department and shut down all power systems in a factory.
This scenario requires machine-to-machine (M2M) edge computing to decrease network latency and deliver real-time control and monitoring. Previously, devices at the edge were programmed only to collect data locally and transmit it to a remote server (cloud). With the assistance of AI, edge devices can now be embedded with machine learning (ML) capabilities to self-learn and execute actions without waiting for a response from a central computer.
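The emergency scenario above can be sketched in a few lines: the decision is made on the device itself, so no network round trip sits inside the safety loop. This is a minimal illustration only; the threshold value and action names are assumptions, not taken from any particular sensor platform.

```python
# Hypothetical sketch of local (edge) decision-making: the device acts on a
# temperature reading immediately instead of waiting for a cloud round trip.
# The threshold and action names below are illustrative assumptions.

TEMP_CRITICAL_C = 70.0  # illustrative fire-risk threshold, in degrees Celsius

def edge_decision(temp_c: float) -> list[str]:
    """Return the local actions to take for one temperature reading."""
    actions = []
    if temp_c >= TEMP_CRITICAL_C:
        # Act locally, with no network latency in the control loop.
        actions += ["activate_sprinklers", "alert_fire_department", "shut_down_power"]
    return actions

print(edge_decision(25.0))   # []
print(edge_decision(82.5))
```

A real device would also report the event upstream for logging and analytics, but the point of the edge pattern is that the safety action does not wait for that upstream reply.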
Programmable logic controllers (PLCs) can be taught to detect an issue, analyze it and execute counteractive procedures, according to some PLC manufacturers. PLCs can act as edge nodes with ML capabilities to activate prescriptive and predictive maintenance without intervention from information technology systems or humans.
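One simple way an edge node can flag conditions that warrant predictive maintenance is to compare each new reading against a rolling window of recent history. The sketch below uses a rolling z-score; the window size, threshold and vibration-signal framing are assumptions for illustration, not any specific PLC vendor's algorithm.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative sketch: an edge node flags anomalous readings (e.g., vibration)
# against a rolling baseline. Window size and z-score threshold are assumed
# values, not a specific PLC manufacturer's method.

class EdgeAnomalyDetector:
    def __init__(self, window: int = 20, z_threshold: float = 3.0):
        self.readings = deque(maxlen=window)  # rolling history of recent values
        self.z_threshold = z_threshold

    def update(self, value: float) -> bool:
        """Return True if the new reading looks anomalous vs. recent history."""
        anomalous = False
        if len(self.readings) >= 2:
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                anomalous = True  # trigger local maintenance action here
        self.readings.append(value)
        return anomalous

detector = EdgeAnomalyDetector()
for v in [10.0, 10.1, 9.9, 10.0, 10.2, 9.8]:  # stable baseline readings
    detector.update(v)
print(detector.update(50.0))  # True: a spike well outside recent history
```

Because the detection runs on the node itself, a maintenance work order or shutdown can be initiated without routing the decision through a central IT system.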
Every manufacturing plant wants what this cutting-edge innovation can deliver. The problem: at what cost?
Edge computing justification
Organizations balk at large investments in new technology without a tangible sense of the return on investment (ROI). Technologies are evolving rapidly. Just as we embraced cloud computing and are now rethinking that strategy around high availability and instant compute, the same question can be posed for edge computing.
The immediate benefits may outweigh the costs, but rapidly evolving technologies pose an investment risk. Intelligent sensors with ML and AI capabilities are far from cost-effective. If an organization already has implemented a cloud strategy, does it make sense to move workloads to an edge computing strategy immediately? That depends on the requirement. The additional cost of sensors, local processing power and other features adds overhead, and a strategy that raises costs works against the very costs it was meant to mitigate.
Optimize and utilize
Recently, a fault-tolerant computer server and software manufacturer launched a virtualized, self-protecting edge-computing platform designed specifically for industrial control system environments, as previously reported. The platform comes embedded with zero-touch computing properties and is expected to simplify a range of remote management activities, such as cloud-based health monitoring and automated site and data recovery.
Such platforms can offer a middle path to optimize costs by weighing capital expenditures (capex) against operational expenditures (opex). Such scenarios should consider economies of scale. The question to ask: How close to the edge is close enough? Should computing sit entirely on the equipment, or can it be on premises? Processing data on the devices instead of using co-location data centers and existing cloud infrastructure requires research and time. Depending on the answer, infrastructure and cost optimization and utilization indicators could go up or down.
Cloud influence on edge computing
Technology trends can be ambiguous, with interminable discussions on the pros and cons of each. Most organizations err on the side of caution and deliberate before making a conscious decision to adopt or pass on an upcoming technology. Empirical evidence suggests enterprises that failed to adopt new technologies lagged behind those that did. Back in 2008, many industry champions dismissed the cloud computing trend as a passing one, and yet, almost a decade later, cloud computing is moving to the edge.
Cloud computing saw rapid adoption because it gave organizations easy access to vast storage with near-zero application latency, and pay-as-you-go models sweetened the deal. Organizations had every reason to embrace it.
However, after a decade, as applications are distributed across geographies and major challenges remain with cloud providers around latency, consistency of experience and security, organizations are rethinking cloud strategies.
Having an edge in 2020, beyond
Gartner estimates 91% of today’s data is created and processed in centralized data centers. By 2022 about 75% of all data will need analysis and action at the edge. Edge computing will become the principal method by which enterprises implement digital transformation.
Edge computing offers many benefits over legacy network architecture systems. As it continues to evolve and make more inroads for organizations, it also will raise questions that will require extensive research before investment. Organizations need to look at transformational opportunities with a thoughtful, deliberate approach to validate this investment.
Dr. Keshab Panda is CEO and managing director, L&T Technology Services, a Control Engineering content partner. Edited by Mark T. Hoske, content manager, Control Engineering, CFE Media, firstname.lastname@example.org.
KEYWORDS Edge computing, cloud services
- Edge computing distributes control architectures and reduces latency.
- Machine learning and artificial intelligence can be integrated into edge computers and PLCs.
- Edge computing can bring cloud services to distributed architectures.
How could an edge-computing distributed architecture help your processes?