Analytics is all around us. From social media and advertising to shopping and transportation, all these industries and more use various levels of analytics to enhance or streamline their operations. The oil and gas industry is no different, with ambitions for these technologies centering on a step change in thinking for informed and effective drilling operations.

But a word of caution: There are tangible barriers that require rethinking collaboration within the shared drilling environment. Several companies are already taking proactive steps in how they approach analytics and in the value it brings to interpretation, drilling and production.

The ambitions of a digital rig with advanced analytics have been as varied as the approaches, but they tend to center on four promises:
1. Reduced cost through improved operational efficiency;
2. Improved hole quality;
3. Increased understanding of the risks to safety and the environment; and
4. Reduced cost over the life of the well during production, maintenance and workovers.

The concept envisioned is that a drilling rig can act as a data platform. All sensors associated with the drilling operation, as well as those that monitor mechanical systems, are connected and exposed through an information platform. This platform has enough security to provide corporate protection while still allowing interoperability with the various services and technical solutions needed to plan, design, refine and monitor operations in a timely manner.
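As an illustration only, a minimal sketch of what a shared sensor record on such a platform might look like; the channel mnemonic, fields and source name here are hypothetical rather than any specific vendor or industry schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One time-stamped measurement published to the shared rig data platform."""
    channel: str        # e.g. "HKLD" (hookload) -- hypothetical mnemonic
    value: float
    unit: str
    timestamp: datetime
    source: str         # which contractor or system produced the reading

# A reading from a hypothetical service-company feed
reading = SensorReading(
    channel="HKLD",
    value=182.4,
    unit="klbf",
    timestamp=datetime.now(timezone.utc),
    source="mud_logging_unit",
)
```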

Using this information platform, Petrolink then applies a variety of processes to address problems. However, there are many barriers to delivering on this promise. With the exception of a class of drilling activities serviced by a single source, most drilling operations are a mixture of solutions from the operating company and multiple service companies. This blend means that hundreds of technical software systems are used in the planning, design and execution of drilling a well. Rarely are these systems connected enough to allow advanced analytical processes to operate effectively without significant preprocessing and data preparation.

Analytical systems operate more effectively when there is a level of predictability in the naming and structures of data. Standards for transfer, access and organization of basic data using industry formats help build a framework of understanding for the unstructured data where answers live.
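To make the preparation burden concrete, a minimal sketch of the kind of normalization step an analytics pipeline typically needs when feeds from different systems name the same channel differently; the mnemonics and mapping below are illustrative assumptions, not an industry standard:

```python
# Hypothetical mapping from vendor-specific channel names to a common vocabulary.
CHANNEL_ALIASES = {
    "HKLD": "hookload",
    "HKLA": "hookload",
    "HOOK_LOAD": "hookload",
    "ROP_AVG": "rate_of_penetration",
    "ROPA": "rate_of_penetration",
}

def normalize_channel(raw_name: str) -> str:
    """Map a raw channel mnemonic to the platform's canonical name, if known."""
    key = raw_name.strip().upper()
    return CHANNEL_ALIASES.get(key, key)  # pass unknown channels through unchanged

print(normalize_channel("hkla"))      # -> "hookload"
print(normalize_channel("ROP_AVG"))   # -> "rate_of_penetration"
```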

Real-time data are the lifeblood of analytics; therefore, nothing is riskier to a process than missing vital data during critical phases of operations. Recently there have been published reports from major operators where the standard for real-time data delivery from the sensor to the analytics engine is measured in minutes, not seconds.
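A simple way to see why that gap matters is a latency check like the hedged sketch below, which flags readings whose sensor-to-engine delay exceeds a threshold; the 10-second target and field names are assumptions for illustration, not a published requirement:

```python
from datetime import datetime, timedelta, timezone

MAX_LATENCY = timedelta(seconds=10)  # assumed target; not an industry-mandated figure

def is_stale(sensor_time: datetime, received_time: datetime) -> bool:
    """Return True if the reading arrived too late to feed a real-time analysis."""
    return (received_time - sensor_time) > MAX_LATENCY

now = datetime.now(timezone.utc)
print(is_stale(now - timedelta(minutes=3), now))   # True: minutes-old data
print(is_stale(now - timedelta(seconds=2), now))   # False: seconds-old data
```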

As important as the data themselves is confidence in how they were collected and how they correlate (in time) with data gathered through other processes. Imagine trying to drive from Houston to Dallas using a map of the U.K.; it would be completely useless.
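As a sketch of what correlating in time can look like in practice, the example below aligns two independently sampled channels onto a common timeline; the channel names, values and five-second tolerance are illustrative assumptions:

```python
import pandas as pd

# Two hypothetical feeds sampled on different clocks and intervals.
surface = pd.DataFrame({
    "time": pd.to_datetime(["2024-01-01 00:00:01", "2024-01-01 00:00:06",
                            "2024-01-01 00:00:11"]),
    "hookload_klbf": [180.2, 181.0, 179.5],
})
downhole = pd.DataFrame({
    "time": pd.to_datetime(["2024-01-01 00:00:03", "2024-01-01 00:00:08"]),
    "downhole_wob_klbf": [22.1, 23.4],
})

# Match each surface sample with the most recent downhole sample within 5 seconds.
aligned = pd.merge_asof(surface, downhole, on="time",
                        tolerance=pd.Timedelta("5s"), direction="backward")
print(aligned)
```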

How do we tackle these problems?

One way is through the recent trend of operators internalizing their data platforms and using control of the data as the means of ensuring that analytics and data science can flourish.

A logical change is to rethink how we value and leverage technical services. Historically, we evaluated technologies on the data they delivered. Now we also must measure them on how well they interoperate within the larger information landscape.

Finally, revisit the goals of the analytical platform and focus on tangible answers, moving from lofty ambitions toward results measured in time saved or risk reduced.