Big Data: Getting tangible results

New methods can extract not just oil and gas, but also actionable insights.

By Michael Risse, Seeq Corporation August 5, 2016

Figure 1: Operators are increasingly applying big data analytics to obtain actionable insight regarding the performance of their oilfield assets. Courtesy: Seeq/Yokogawa

Through significant advancements in sensors and other measurement instrumentation, along with improved methods of networking and data storage, upstream and midstream providers have the capability to collect mountains of data from drilling and other operations (see Figure 1).

Taking advantage of that data to improve process outcomes, however, has been another matter. In many cases, the data resides in process historians, which operators use to store time-series data. This form of software infrastructure is not particularly well-suited to performing complex analysis or identifying underlying indicators of equipment failure.

In fact, the most common method for extracting value from historian data is to use rudimentary analytic tools such as spreadsheets. With this method, historian data is exported to spreadsheets and then analyzed manually to create static reports, which are sent up the chain of command to decision-making personnel.

In addition to relying heavily on human eyes to identify meaningful trends, using spreadsheets to glean actionable information from large volumes of data is a time- and labor-intensive process. It also makes it extremely difficult for operators to make real-time decisions about how to manage their operations, leading to reactive maintenance, higher costs, and increased downtime. In recent years, in an effort to solve this data-rich, information-poor problem (see Figure 2), an increasing number of operators have applied big data analytics to their historians.

By aggregating, contextualizing, and visualizing information, big data analytic platforms give companies the power to quickly assess the performance and health of their assets at the speed of thought of the engineers and analysts conducting the research. They also provide the capability to identify meaningful trends and determine where and when equipment failures are likely to occur, both of which are critical to optimizing maintenance schedules and improving operational efficiency. Read on to find out how oil and gas companies are successfully applying advanced analytics to derive value from their existing process historian data.

ESP failure prediction

One example of how big data analytics is being applied in the oilfield can be seen with electric submersible pumps (ESPs).

When oilfield reservoir pressure gets too low, artificial-lift systems are used to transport fluids to the surface, increasing the production rate. ESPs are a reliable and fast-growing artificial-lift technology, used today in approximately 15% to 20% of oil wells worldwide. The impact of ESP failure can be significant, as it often results in unplanned downtime, lost production, and high replacement costs.

Leading oilfield operators utilize real-time surveillance of key ESP performance parameters. Downhole monitoring units transmit large streams of data related to wellhead and flowline pressures, intake and discharge pressures, wellhead temperature, motor current, and motor vibration. Surveillance software can then look for problems such as corrosion, increasing water cut, free gas intake, and sand production.

Real-time measurement data from downhole monitoring is typically collected in a time-series data historian such as OSIsoft PI or Honeywell PHD. This data is used for analysis and is also displayed on control screens monitored by operations specialists. In the past, applying predictive analytic software to ESPs was an expensive proposition requiring IT, control systems, and technical engineering skill sets. Data from the historian was often replicated in an aggregate historian or enterprise data warehouse.

Furthermore, data scientists and event-processing experts were needed to create ad hoc rules, statistical models, control charts, and other analysis methods to predict production problems and ESP failure. As changes were made in the field, these analysis methods had to be constantly adjusted, tuned, and updated by software developers. Failing to do so caused nuisance alerts that undermined the confidence of operators trying to use the system, or caused significant problems to be missed entirely. This has changed in recent years as user-friendly analytic software solutions have made their way to the marketplace. This approach focuses on the development of intuitive tools for use by the engineers who have the best understanding of the equipment, sensors, and oilfield operating history.

Software tools allow engineers to quickly apply their deep understanding of abnormal situation management, equipment reliability, and industrial asset performance management to create actionable information from process historian data.

In the software model, data remains in the historian. Tag metadata is indexed, and statistical models are created so operators can predict when a failure is likely to occur. Incorrect historian data, whether caused by intermittent telemetry signals or by changes in operating states that temporarily destabilize sensor measurements, can be automatically adjusted using the software’s data cleansing features. This enables engineers to monitor ESP performance and identify precursor signatures that may indicate failure (see Figure 3).
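To make the data cleansing step concrete, the sketch below uses pandas to mask telemetry spikes and bridge short dropouts in ESP sensor data pulled from a historian. It is a generic, simplified illustration rather than any vendor's actual implementation; the assumption of a timestamp-indexed DataFrame, the column names (for example, "intake_pressure" and "motor_current"), and the window sizes and thresholds are all hypothetical.

```python
import pandas as pd

def cleanse(raw: pd.DataFrame, max_gap_minutes: int = 5, spike_sigma: float = 4.0) -> pd.DataFrame:
    """Mask telemetry spikes and bridge short dropouts in historian data.

    Assumes `raw` is indexed by timestamp, one column per sensor tag.
    """
    df = raw.sort_index()

    # Mask spikes: samples far outside a rolling band are treated as bad telemetry.
    rolling_mean = df.rolling("30min").mean()
    rolling_std = df.rolling("30min").std()
    spikes = (df - rolling_mean).abs() > spike_sigma * rolling_std
    df = df.mask(spikes)

    # Resample to a regular 1-minute grid, then bridge short dropouts by time
    # interpolation; longer outages are left as gaps rather than invented data.
    df = df.resample("1min").mean()
    return df.interpolate(method="time", limit=max_gap_minutes, limit_area="inside")
```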

When parameters deviate from normal operating limits, the platform alerts control personnel so proactive steps can be taken to minimize the impact of equipment failure. The result is increased pump uptime, extended operational life, and optimized production.
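A minimal sketch of what that limit checking might look like follows, continuing from the cleansed data above. The operating limits, units, and violation count are illustrative placeholders, not values from any real deployment.

```python
# Hypothetical normal operating bands per tag (units and values are illustrative only).
OPERATING_LIMITS = {
    "intake_pressure": (150.0, 400.0),   # psi
    "motor_current":   (20.0, 60.0),     # amps
}

def check_limits(df, limits=OPERATING_LIMITS, min_violations=3):
    """Return tags whose recent samples repeatedly leave their normal band."""
    recent = df.iloc[-30:]   # last 30 samples of the cleansed 1-minute data
    alerts = []
    for tag, (low, high) in limits.items():
        if tag not in recent.columns:
            continue
        out_of_band = int(((recent[tag] < low) | (recent[tag] > high)).sum())
        # Require several violations in the window to suppress nuisance alerts
        # triggered by a single noisy sample.
        if out_of_band >= min_violations:
            alerts.append((tag, out_of_band))
    return alerts
```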

In addition, by being able to more effectively maintain ESPs, mean time between failures can be increased. This subsequently leads to a reduction in labor associated with repairs, better resource utilization, optimized spare parts inventory, and more efficient use of working capital.

Figure 2: Many oilfield companies are data rich but information poor, leading to a search for better ways to analyze their process historian data. Courtesy: Seeq

Monitoring downhole drilling

Big data analytics can also be applied to the physical environment surrounding oilfield assets. An example of this can be seen in drilling operations, where subsurface sensors monitor downhole conditions and transmit data back to the surface for decision making.

By leveraging historical well data related to sediment build-up, rate of penetration, water opacity, thermal gradients, and differential pressure, analytic software can identify key indicators of common drilling issues such as stuck pipe, blowouts, or lost circulation.

Streaming data transmitted from sensors is monitored in real time to ensure downhole conditions remain within the bounds of normal and/or expected values. Data analytic software is able to detect when the drilling environment enters an abnormal state and to alert operators so action can be taken to prevent potential problems.
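One simple form such streaming detection can take is sketched below: an exponentially weighted running mean and variance flag readings that fall far outside the expected band. This is a toy illustration with assumed parameters and simulated values, not the algorithm used by any particular drilling analytics product.

```python
class StreamingDetector:
    """Flags samples that fall far outside an exponentially weighted band."""

    def __init__(self, alpha=0.05, threshold=4.0, warmup=4):
        self.alpha = alpha          # smoothing factor for the running estimates
        self.threshold = threshold  # alert when a sample is > threshold sigmas away
        self.warmup = warmup        # samples to observe before alerting
        self.mean = None
        self.var = 0.0
        self.count = 0

    def update(self, value):
        """Feed one new sample; return True if it looks abnormal."""
        self.count += 1
        if self.mean is None:       # first sample initializes the estimates
            self.mean = float(value)
            return False
        diff = value - self.mean
        abnormal = (
            self.count > self.warmup
            and self.var > 0
            and abs(diff) > self.threshold * self.var ** 0.5
        )
        # EWMA updates for the running mean and variance.
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return abnormal

# Example usage with a hypothetical rate-of-penetration feed (last value simulates a spike):
rop_detector = StreamingDetector()
for sample in [12.1, 12.3, 11.9, 12.2, 30.5]:
    if rop_detector.update(sample):
        print("Abnormal rate-of-penetration reading:", sample)
```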

This method of early detection offers a number of operational advantages because it enables decision makers to continuously optimize their drilling strategies. Perhaps even more importantly, it improves well site safety and minimizes the likelihood of damaging the formation and the environment surrounding the rig.

Ensuring accuracy of custody transfer

Big data solutions have also made their way into the midstream segment of the oil and gas market. Midstream companies gather hydrocarbons from upstream producers and operate processing facilities to "sweeten" gas, remove impurities, and distribute product from storage facilities and pipelines. A critical challenge for these companies is keeping track of product flowing to their customers, often through the use of custody transfer systems.

Figure 3: Seeq’s data analytics platform features “Capsules,” data views empowering engineers to monitor leading indicators through the use of visual analytics to identify precursor failure signatures and other subtle changes. Courtesy: Seeq

Pipeline networks are instrumented with a wide variety of flowmeters measuring product flow and gas chromatographs measuring parameters related to product quality. Supervisory control and data acquisition (SCADA) systems collect and transmit data from these instruments to time-series data historians. Operators then use this data to monitor the process in real time, ensuring pipeline integrity and compressor performance.

Equally important to pipeline operators is analyzing measurement data from electronic flow meters. In midstream operations, with high volumes of product flowing through lines, even small errors in measurement accuracy can have a large impact on profitability due to lost and unaccounted product. Ensuring that instrumentation is properly calibrated and does not drift is critical to avoiding these types of measurement errors. This is particularly true at points of custody transfer, where product flows from one company to another.
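One simple way to watch for that kind of drift is sketched below: compare the daily totals reported by the meters on each side of a custody-transfer point and flag a sustained imbalance or trend. The meter roles, tolerance, and simulated data are hypothetical, and real loss-and-unaccounted-for analysis involves considerably more detail.

```python
import numpy as np

def drift_check(shipper_volumes, receiver_volumes, tolerance_pct=0.25):
    """Return (drift suspected?, mean imbalance %) for paired daily volumes."""
    shipper = np.asarray(shipper_volumes, dtype=float)
    receiver = np.asarray(receiver_volumes, dtype=float)
    imbalance_pct = 100.0 * (shipper - receiver) / shipper

    # A persistent bias or steady trend beyond tolerance, rather than random
    # scatter around zero, points to calibration drift at one of the meters.
    mean_bias = imbalance_pct.mean()
    trend_per_day = np.polyfit(np.arange(len(imbalance_pct)), imbalance_pct, 1)[0]
    drifting = abs(mean_bias) > tolerance_pct or abs(trend_per_day) > tolerance_pct / 30
    return drifting, mean_bias

# Example with 30 days of hypothetical daily totals (thousand barrels):
days = np.arange(30)
shipper = 100 + np.random.default_rng(0).normal(0, 0.1, 30)
receiver = shipper * (1 - 0.0002 * days)   # simulated slow drift at the receiving meter
print(drift_check(shipper, receiver))
```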

Ensuring ease of use

In recent years, the number of providers offering big data analytic solutions in the oilfield has increased significantly. However, one of the keys to taking advantage of the benefits those solutions offer is ease of use. Many platforms today possess powerful analytic capabilities, but those capabilities can only be leveraged if personnel can quickly and efficiently interface with the platform to obtain actionable information.

This is becoming increasingly important as many experienced professionals are retiring and being replaced by younger individuals with a much lower level of expertise as part of what is often referred to as the "Great Crew Change."

Platforms offering the flexibility to tailor the way data is presented to various disciplines will play a critical role in helping operators manage the industrywide skills shortage in the coming years. So too will solutions facilitating data sharing across departments, allowing companies to break down information silos and make better strategic decisions regarding their operations.

In today’s digital oilfield, most companies already possess, within their historians, all of the data they need to improve operations. Many of these companies, however, continue to rely on traditional analytic tools such as spreadsheets to create actionable insight, which is becoming an increasingly expensive and time-consuming endeavor. As a result, much of the process historian data is never mined to create value.

Michael Risse is a vice president at Seeq Corporation, a company building innovative productivity applications for engineers and analysts that accelerate insights into industrial process data.