Road to real-time performance improvement

Big data and industrial Internet technologies are converging to help companies monitor operations and solve problems on the fly.

By Sidney Hill June 3, 2015

Data analysis is a required core competency for oil and gas companies. It starts with interpreting geographic, geologic, and seismic data to find the right spots to drill, and progresses to monitoring changes in variables, such as temperature and vibration, which affect the performance of drilling, pipelines, or refinery equipment.

It’s common industry practice to deposit this information in a data repository from which it’s retrieved at regular intervals and used in operational performance reviews. However, market dynamics, particularly those related to the price of oil, are causing this practice to change. 

Producers no longer have the luxury of reviewing their performance at regular intervals. Key performance indicators must be reviewed on the fly, which is why companies increasingly are looking for ways to access and analyze operational data in real time.

When oil is priced at more than $100 per barrel, the main objective is to extract as much of it as possible. At that price, associated costs matter far less: it is cheaper to replace a machine outright than to absorb the downtime, and the major loss in revenue, that comes with it.

The recent digital oilfield movement has provided companies a solid foundation for real-time data analysis.

Producers are now measuring things that were never measured before, so there are more metrics available to analysts. But for the digital oilfield to work, all of the machinery and equipment must be able to communicate with each other; if they can't communicate, operators can't analyze data in real time.

Solving that problem starts with a process known as data rationalization, which simply means getting all operators to agree to use the same terms for specific pieces of data.

The Columbia Pipeline Group is working with GE and Accenture on an advanced analytics application that’s expected to provide real-time data on the condition of pipelines in the Marcellus and Utica shale plays in southwestern Pennsylvania and Ohio.

Get rational with your data

Once data has been rationalized, a platform can be installed to amalgamate it into metrics and report cards.

The payback for this can be substantial. Consulting firm Bain & Company said its 2014 survey of 400 executives across a spectrum of industries revealed that companies with advanced analytic capabilities are five times more likely than their peers to make decisions faster, and twice as likely to be in the top quartile of financial performance in their industry.

Bain considers companies engaged in advanced analytics to be part of the Big Data revolution, and in 2014 it published a report titled "Big Data Analytics in Oil and Gas: Converting the Promise into Value," explaining why it believes oil and gas companies should join this movement.

The report states, among other things, that oil and gas companies with advanced analytic capabilities are more efficient at managing a range of data, such as seismic information, drilling logs, drill bit RPMs, fracking-performance measurements, and production rates. However, it also notes that only a few oil and gas companies have true advanced analytics programs in place.

A major challenge is getting executives to understand that building infrastructure for real-time data analysis requires more than installing new technology. It also requires some changes in organizational thinking and behavior.

Companies wishing to exploit advanced analytics are advised to appoint two individuals to lead the effort: a business analyst who understands the drilling and production process, and an enterprise data architect from the IT department.

This collaboration should start with the business analyst pointing out which pieces of data must be collected, and where they can be found, to generate information that allows for meaningful analysis. While that’s happening, the data architect should be identifying anomalies in the data, such as a single well being called by different names in separate databases, that could result in faulty analysis.

This is the data rationalization process, also sometimes known as master data management. Some oil and gas companies, particularly larger ones, have found this process so daunting that they’ve turned it over to outside consulting firms.
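As an illustration of what rationalization involves, the sketch below maps the different names a well carries in separate databases onto a single canonical identifier before any analysis runs. The well names, aliases, and field names are invented for illustration; a real master data management effort would maintain this mapping in a governed system, not a hard-coded table.

```python
# Minimal sketch of data rationalization (master data management):
# map the various names a well carries in different databases onto
# one canonical identifier. All names and records are hypothetical.

# Alias table, typically built jointly by the business analyst
# (who knows the wells) and the data architect (who knows the systems).
WELL_ALIASES = {
    "smith 1-h": "WELL-0001",
    "smith #1h": "WELL-0001",
    "smith no. 1 horizontal": "WELL-0001",
    "jones 2": "WELL-0002",
    "jones #2": "WELL-0002",
}

def canonical_well_id(raw_name: str) -> str:
    """Return the canonical ID for a raw well name; fail loudly if unknown."""
    key = raw_name.strip().lower()
    if key not in WELL_ALIASES:
        raise KeyError(f"Unrationalized well name: {raw_name!r}")
    return WELL_ALIASES[key]

# Records pulled from two databases that disagree on naming.
drilling_db = [{"well": "Smith #1H", "depth_ft": 9200}]
production_db = [{"well": "SMITH 1-H", "bbl_day": 410}]

for record in drilling_db + production_db:
    record["well_id"] = canonical_well_id(record["well"])

# Both records now share WELL-0001, so they can be joined for analysis.
```

Failing loudly on an unknown name, rather than guessing, is the point: an unmapped alias is exactly the kind of anomaly that produces faulty analysis if it slips through silently.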

A blueprint for analysis

Whether it’s done in-house or with assistance from outside experts, the process should be conducted in similar fashion. It should start with a blueprint for the type of analysis the company wants to conduct, and that blueprint should dictate which pieces of data are rationalized first.

With oil prices lower, knowing which wells are the top producers and which are the underperformers is vital to protecting profit margins. Ascertaining that requires a platform that extracts production and accounting information from each well, rationalizes it, and then runs it through the analytic software.
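One way to picture that platform's output: once production and accounting figures have been rationalized per well, separating top producers from underperformers is a straightforward sort on margin. The price, well IDs, and cost figures below are invented for illustration.

```python
# Hedged sketch: rank wells by daily margin once production (bbl/day)
# and operating-cost figures have been rationalized per well.
# All numbers and identifiers are invented.

OIL_PRICE = 55.0  # $/bbl, illustrative low-price scenario

wells = [
    {"well_id": "WELL-0001", "bbl_day": 410, "opex_day": 12_000},
    {"well_id": "WELL-0002", "bbl_day": 150, "opex_day": 9_500},
    {"well_id": "WELL-0003", "bbl_day": 620, "opex_day": 21_000},
]

# Margin per day = revenue from production minus operating cost.
for w in wells:
    w["margin_day"] = w["bbl_day"] * OIL_PRICE - w["opex_day"]

ranked = sorted(wells, key=lambda w: w["margin_day"], reverse=True)
top, bottom = ranked[0], ranked[-1]
```

Note that at $55 per barrel a well can be a steady producer and still lose money daily, which is exactly why this ranking matters more when prices fall.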

In addition to the major IT services and consulting firms, some technology vendors offer solutions that promise to smooth the way for oil and gas companies to conduct real-time data analysis. Among them is TIBCO.

"When we work with customers, we try to start by setting up a platform for analyzing the mission critical parts of the business," said Steve Farr, TIBCO’s product marketing manager for the oil and gas industry.

Marathon Oil is using a TIBCO platform to monitor the performance of drilling rigs at various sites around the world and send the results to both field engineers and managers at the company’s Houston headquarters. The platform, known as Spotfire, extracts data from two primary sources: sensors on individual wells and Marathon’s geological database. Marathon engineers use these data to compare the actual progress of the wells against their projections at the outset of drilling.

"This tells them if things are on track from a production standpoint," Farr said. "They also get information about actual drilling costs."

Faster analytic engines

Farr said Spotfire gives Marathon the option of monitoring some processes in real time while analyzing others at regular intervals. For real-time analysis, the system is programmed to keep and process all data in the memory of the server on which the platform is running. For longer-term analysis, data flow to a back-end database that serves up reports in response to users’ queries.

A vendor called SQLstream claims to cater to companies that only want real-time analytics. "The traditional approach to data analysis has been to collect data and load it into a historian or database using what’s known as an extract, transform, and load (ETL) process," said Ronnie Beggs, vice president of marketing for SQLstream. "Now, there’s a trend toward in-memory analytics."

The idea behind in-memory analytics is to get meaningful data to users faster by skipping the stage of sending them to a database. However, Beggs argues that in-memory systems still have to perform some database-type functions, such as indexing data, so it can readily respond to queries. He said stream processing eliminates those steps as well, making it even faster than in-memory analytics, and ideal for real-time performance monitoring.

Operators for the Columbia Pipeline Group expect to be able to get current data on pipeline conditions anytime, on any device, thanks to an advanced analytics application. Courtesy: GE

An oilfield services company used SQLstream to examine the efficiency of its rigs. "It provided better visibility of what was happening on each rig, allowing them to reduce the number of times they had to send out maintenance crews," Beggs said. "Because of that, they were able to boost the number of rigs they managed from 300 to more than 1,000 in a short period of time."
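The stream-processing idea Beggs describes can be pictured with a toy operator: metrics are computed over a sliding window as readings arrive, and readings are discarded once they age out, with no database or index in the path. The sketch below is plain Python, not SQLstream's actual API, and the sensor values are invented.

```python
# Toy stream-processing operator: rolling statistics over the last
# `window_s` seconds of sensor readings. No store, no index; readings
# are dropped as soon as they fall out of the window.
# (Illustrative only; this is not SQLstream's actual API.)
from collections import deque
import statistics

class SlidingWindow:
    def __init__(self, window_s: float = 60.0):
        self.window_s = window_s
        self.readings = deque()  # (timestamp, value) pairs, oldest first

    def push(self, timestamp: float, value: float) -> float:
        """Ingest one reading and emit the current rolling mean."""
        self.readings.append((timestamp, value))
        # Evict readings that have aged out of the window.
        while self.readings and timestamp - self.readings[0][0] > self.window_s:
            self.readings.popleft()
        return statistics.fmean(v for _, v in self.readings)

window = SlidingWindow(window_s=60.0)
window.push(0.0, 100.0)   # rolling mean: 100.0
window.push(30.0, 110.0)  # rolling mean: 105.0
window.push(90.0, 120.0)  # reading at t=0 evicted; rolling mean: 115.0
```

The contrast with ETL is that nothing here is ever written down for later querying; the answer is produced in the same pass that ingests the data, which is what makes the approach suit continuous monitoring of rigs or pipelines.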

General Electric (GE) also is developing technology to help oil and gas companies join the advanced analytics movement. As a supplier of sensors and other hardware deployed by oil and gas companies, GE views the spread of advanced analytics software as a logical step toward the industrial Internet of Things.

The road to operational excellence

"We’ve spent three years and more than $1 billion on this industrial Internet journey," said Ashley Haynes-Gaspar, general manager of software and services for GE’s oil and gas business unit. "Software has always been associated with GE equipment. The difference with our industrial Internet offerings is that now the software works with equipment from other suppliers as well."

GE’s industrial Internet centers around a software platform called Predix, which Haynes-Gaspar describes as "an industrial operating system" for building and managing analytic applications. The platform is designed to support application developers across all industries, but Haynes-Gaspar said, "We have multiple applications for oil and gas."

Haynes-Gaspar said the Houston-based Columbia Pipeline Group asked GE to develop an application that will improve the company’s ability to manage pipeline assets. This application, called the Intelligent Pipeline Solution, is set to begin monitoring portions of Columbia’s pipelines in the Marcellus and Utica shale plays in southwestern Pennsylvania and Ohio in the summer of 2015.

"We need an agile, comprehensive solution to allow for a better real-time view of pipeline integrity across our interstate natural gas pipelines," Shawn Patterson, president, operations and project delivery, Columbia Pipeline Group, said in announcing the decision to adopt the GE application.

The application is expected to pull data from sensors installed in the pipeline to record signs of corrosion or wear. "That data will be combined with information about weather, seismic activity, and topography," Haynes-Gaspar explained. "All those analytics will generate charts offering color-coded views of threat levels along the pipeline. These analytics will allow operators to prioritize maintenance schedules, forecast potential problems, and just generally become smarter about how hard to push their pipelines."
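A loose sketch of the color-coded output Haynes-Gaspar describes: combine a few normalized risk inputs per pipeline segment into a score and bucket the score into threat levels. The weights, thresholds, factor names, and segment data below are all invented; this is not GE's actual Intelligent Pipeline Solution model.

```python
# Hedged sketch: score pipeline segments by blending normalized risk
# factors, then bucket each score into a color-coded threat level.
# Weights, thresholds, and segment data are invented for illustration.

WEIGHTS = {"corrosion": 0.5, "seismic": 0.3, "weather": 0.2}

def threat_level(factors: dict) -> str:
    """Each factor is normalized to 0..1; returns a color bucket."""
    score = sum(WEIGHTS[name] * value for name, value in factors.items())
    if score >= 0.7:
        return "red"
    if score >= 0.4:
        return "yellow"
    return "green"

# Hypothetical milepost segments with per-factor risk estimates.
segments = {
    "MP-101": {"corrosion": 0.9, "seismic": 0.2, "weather": 0.5},
    "MP-102": {"corrosion": 0.1, "seismic": 0.1, "weather": 0.2},
}
levels = {seg: threat_level(f) for seg, f in segments.items()}
```

Even a crude scheme like this shows how operators could triage: maintenance crews go to the red and yellow segments first, which is the prioritization Haynes-Gaspar describes.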

GE listened to Columbia Pipeline’s engineers to determine how to design this application, but it brought in consulting firm Accenture to help integrate the application with non-GE equipment and to help Columbia with organizational change issues.

"For Columbia to realize the full value of this application, operators will have to behave somewhat differently," Haynes-Gaspar said. "Ultimately, the application should help operators make better decisions, but some change management must occur before that happens."

– Sidney Hill is a graduate of the Medill School of Journalism at Northwestern University. He has been writing about the convergence of business and technology for more than 20 years.
