Does AI have you on edge? Save up to 15% on control system updates

Edge, cloud and hybrid architectures will all be used to deploy artificial intelligence solutions, so forward-thinking organizations are laying the groundwork now to handle upcoming needs. AI tools for control system modernization projects can automatically convert up to 70% of legacy configuration, cutting initial capital costs by up to 15%.

LEARNING OBJECTIVES

  • Explore many artificial intelligence (AI) opportunities for process manufacturing.
  • Understand how new workloads bring new needs for AI in process manufacturing.
  • Examine AI advantages at the edge and how benefits are available today while preparing for tomorrow’s process manufacturing.


Artificial intelligence (AI) applications are widely expanding and process manufacturers have taken notice. Organizations around the globe are already planning the ways they can perform powerful AI-based analytics, improving their operations and unlocking adaptive advanced process control to drive more autonomous operations and competitive advantage.

Greater use of AI means securely freeing contextualized data from the silos built into the different automation layers and disciplines and moving it up into the cloud where AI tools can put it to good use. Such a strategy, in turn, requires teams to design architectures with a boundless automation vision of seamless data mobility. Efficient movement of data to feed data lakes, data scientists, analytics applications, enterprise resource planning systems and other areas becomes increasingly critical with each emerging AI innovation.

A boundless automation vision becomes even more important as AI tools consuming critical plant data will not be exclusively cloud or edge tools, nor will they be limited to one plant or the corporate office. As a result, AI will redefine the way organizations look at their operational technology (OT) and information technology (IT) architectures.

New applications of AI will need to straddle on-premises and cloud technology while also spanning the OT/IT convergence. This will require edge environment solutions to drive seamless integration and data mobility and to bridge the IT/OT and cloud/edge gaps, while managing data, computational resources and security as data flows across a new pipeline.

Many AI opportunities for process manufacturing

While there is still much experimentation happening in AI, people involved in process manufacturing are seeing a significant increase in potential real-world use cases. In many ways it feels as though AI has moved beyond the experimentation phase and into development, where much of the work happening today is in augmenting large language models (LLMs) — and, in particular, augmenting private LLMs using proprietary data.

These new, private LLMs inspire many exciting use cases. Perhaps most importantly, companies are developing tools to help modern operators, many of whom are part of a new generation of digital natives, make better decisions. Chatbots and embedded advisers, driven by LLMs, support operators and technicians of every experience level with real-time guidance in response to natural language questions.
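To make the pattern concrete, below is a minimal sketch, in Python, of how an operator assistant might be grounded in proprietary documents before a question ever reaches a language model. The document snippets, the keyword-overlap scoring and the tag references are illustrative placeholders rather than any vendor's implementation; a production system would typically use a vector store and the organization's private LLM endpoint.

# Minimal sketch: ground an operator question in proprietary documents.
# DOCUMENTS, the scoring function and the tag names are hypothetical.

from collections import Counter

DOCUMENTS = {
    "sop-104": "If reactor R-101 pressure exceeds 12 bar, open relief valve PV-2201.",
    "alarm-77": "High-high level on tank T-310 requires stopping feed pump P-12.",
}

def overlap_score(question: str, text: str) -> int:
    """Rough keyword-overlap score between the question and a document."""
    q_words = Counter(question.lower().split())
    d_words = Counter(text.lower().split())
    return sum((q_words & d_words).values())

def build_prompt(question: str) -> str:
    """Retrieve the best-matching snippet and assemble a grounded prompt."""
    doc_id, text = max(DOCUMENTS.items(), key=lambda kv: overlap_score(question, kv[1]))
    return (
        f"Context ({doc_id}): {text}\n"
        f"Operator question: {question}\n"
        "Answer using only the context above."
    )

if __name__ == "__main__":
    print(build_prompt("What should I do if R-101 pressure is too high?"))

The assembled prompt would then be sent to whichever private LLM the organization has deployed, keeping the model's answers anchored to site-specific procedures.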

In addition, AI tools are speeding project design and execution. Teams are using AI to create 3D plant designs faster and more easily, with software that can respond to natural language commands to design and edit plant layouts on the fly, while providing multiple design examples in minutes rather than days or weeks.

Cloud-based modernization software is using AI to better understand legacy code and support engineers as they convert their legacy control systems into new systems. The advantages are significant: AI modernization tools can automatically convert up to 70% of configuration, reducing the initial capital costs of modernization projects by up to 15% (Figure 1).

Figure 1: AI tools in cloud-based modernization software can dramatically reduce the time and cost of projects to update control systems. Courtesy: Emerson

New workloads bring new needs for AI in process manufacturing

One of the primary complexities facing organizations implementing AI solutions will be variable infrastructure needs. Some AI tools will need to run on the edge, while others will require cloud capabilities. For example, applications that require extremely low latency, such as closed-loop and advanced control solutions, will need to operate on-premises, close to the associated equipment to ensure optimal performance.

In addition, many organizations will need to maintain on-premises control of AI tools for better data governance. For example, a pharmaceutical company needing to combine unstructured data from documents with real-time data might not want to put that data in the cloud due to regulatory or intellectual property concerns. In these cases, a smaller, localized language model can bring many of the same benefits, while avoiding the primary risks.
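As a sketch of that localized approach, the snippet below sends a batch record excerpt and a live reading to a language model hosted inside the plant network, so nothing proprietary leaves the site. It assumes a local runtime (such as vLLM, Ollama or a llama.cpp server) exposing an OpenAI-compatible endpoint at the address shown; the endpoint, model name, document excerpt and tag values are all hypothetical.

# Sketch: query a locally hosted model so proprietary data stays on-premises.
# The endpoint URL, model name and data below are illustrative placeholders.

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

batch_record_excerpt = "Granulation step 3: target bed temperature 42-45 C."
live_reading = {"tag": "TI-4512", "value": 47.2, "units": "C"}

response = client.chat.completions.create(
    model="local-model",  # whatever model the on-premises runtime serves
    messages=[
        {"role": "system",
         "content": "You advise operators. Use only the supplied data."},
        {"role": "user",
         "content": (
             f"Batch record: {batch_record_excerpt}\n"
             f"Live value: {live_reading['tag']} = "
             f"{live_reading['value']} {live_reading['units']}\n"
             "Is the process within specification?"
         )},
    ],
)
print(response.choices[0].message.content)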

In contrast, some AI tools have requirements that are simply too computing resource intensive for on-premises implementation. Any AI tool that requires continual training to generate probabilistic answers will demand this level of computational power. In most cases, such tools will be implemented in the cloud, where computational power scales quickly and dramatically to handle the fluctuating needs of AI processing.

AI advantages at the edge

Because of this wide variation in potential needs, the different applications of AI (reliability, process control, decision making, operational autonomy, operator support and more) will likely require the ability to run both on the edge and in the cloud. To span those environments, many organizations will employ a fit-for-purpose edge environment solution.

An edge environment provides the infrastructure teams need to safely connect OT assets to the IT environment on-premises and in the cloud. The resulting seamless data mobility forms the foundation for a secure data pipeline designed to combine real-time, historical and unstructured data to feed AI tools wherever they are in the enterprise.

On the simplest level, an edge environment provides teams with a way to deliver operational data out to AI applications on the edge or in the cloud. Using a simplified, secure connection architecture, an edge environment reduces network complexity by providing a single, unidirectional data flow from the control layer outward. Data is no longer trapped in silos but is available for use by tools at any point in the enterprise, without the risk of access into the control layer (Figure 2).

Figure 2: Edge environment solutions help teams navigate the complexity of OT infrastructure, bringing data from the control layer up to cloud systems. Courtesy: Emerson
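A minimal sketch of that unidirectional flow might look like the Python below: the gateway reads tags locally and pushes them to a cloud ingestion endpoint, never accepting commands back toward control. The ingestion URL and the read_control_tags() helper are hypothetical placeholders; a real gateway would typically read via OPC UA or a native driver and publish over MQTT or HTTPS.

# Sketch of a publish-only edge gateway: data flows outward, nothing flows back.
# INGEST_URL and read_control_tags() are illustrative placeholders.

import time

import requests

INGEST_URL = "https://cloud-ingest.example.com/api/telemetry"

def read_control_tags() -> dict:
    """Placeholder for a read-only query against the control layer."""
    return {"FIC-101.PV": 84.2, "TIC-205.PV": 141.7, "timestamp": time.time()}

while True:
    payload = read_control_tags()
    # Outbound only: the gateway exposes no inbound path into the control layer.
    requests.post(INGEST_URL, json=payload, timeout=10)
    time.sleep(5)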

However, the value of a modern edge environment goes beyond data access: it empowers operations teams to run AI tools on optimized hardware that may not be part of their OT infrastructure. Today, most applications built for OT have fixed resource consumption. As AI tools advance and take advantage of the ability to learn and adapt organically, this will no longer be the case. For the most popular AI applications, the more they are used, the more computing resources they are likely to consume.

Already, the most advanced edge environments provide a data sandbox with containerized application tools to encourage varied execution and experimentation—the perfect incubator for the AI tools that will capture competitive advantage in coming years. Moreover, as edge environments advance, they will leverage this containerization to provide the dedicated, optimized compute power—as well as the secure connectivity to data—to orchestrate AI workloads, ensuring they are assigned to the appropriate platform.
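As a rough illustration of that containerized approach, the sketch below launches an analytics workload on an edge node with explicit resource limits, using the Docker SDK for Python; the image name, limits and environment settings are assumptions for illustration, not a description of any particular edge product.

# Sketch: run a containerized AI workload on an edge node with resource caps
# so it cannot starve the OT applications sharing the hardware.
# The image, limits and environment values are hypothetical.

import docker

client = docker.from_env()

container = client.containers.run(
    image="registry.example.com/anomaly-detector:latest",
    detach=True,
    mem_limit="2g",            # cap memory available to the workload
    nano_cpus=2_000_000_000,   # roughly two CPU cores
    environment={"DATA_SOURCE": "mqtt://edge-broker:1883"},
    restart_policy={"Name": "on-failure"},
)
print(f"Started workload {container.short_id}")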

Ultimately, edge environments will help teams provide the right resources at the right time to drive better business outcomes. Not only do they bring data securely to AI systems in the cloud and at the edge, but they are also likely to one day have the ability to observe and dynamically allocate compute power to AI as necessary.

AI benefits today and in tomorrow's process manufacturing

Industry has reached the point where AI in process manufacturing is beyond experimentation. One day, AI tools are likely to be as ubiquitous as spreadsheets and dashboards, and, much like those tools, early adopters will gain a significant leap in competitive advantage. As a result, organizations are already beginning to build the seamless infrastructure they will need to support AI technologies in their unique environments.

A modern edge environment solution designed around a boundless automation vision for data mobility provides critical operational benefits today while preparing process manufacturers to take advantage of the powerful AI tools they will harness in the future. That combination makes it a compelling investment for future-proofing operations as AI rises in manufacturing.

Claudio Fayad serves as vice president of technology of Emerson’s Process Systems and Solutions business. Edited by Mark T. Hoske, editor-in-chief, Control Engineering, WTWH Media, [email protected].

CONSIDER THIS

Are you seeing advantages in AI process manufacturing applications and preparing for more tomorrow?

ONLINE

Learn more about upgrades and modernization.

Why it’s critical to integrate distributed control systems
