When disciplines collide in the oil patch and the siloed approach to working creates operational roadblocks, advanced analytics brings it all together, and the results can yield valuable insight.

But given the relative newness of this emerging process and related technology, earning the trust of the people who rely on the information is just as important as devising accurate models and processes that can be explained and understood. Just ask Beau Rollins, a data scientist for Devon Energy who, during Texas A&M’s SAS Day, explained how the company used what is known as the Q-SEMMA analytics workflow to identify key geologic and engineering parameters associated with economic well performance.

As part of the process, a question is posed, followed by sampling, exploring, modifying, modeling and assessing data to uncover patterns in an otherwise overwhelming sea of varied information, such as reservoir, geoscience, operations and artificial lift data. Focusing on an early oil metric, 90-day cumulative oil, and working with subject-matter experts, the team integrated data from different disciplines.
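To make that loop concrete, a minimal Python sketch of the question-sample-explore-modify-model-assess cycle might look like the following. The file name, column names such as cum_oil_90d and the pandas/scikit-learn stack are illustrative assumptions, not Devon Energy’s actual implementation.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical input: one row per well with reservoir, geoscience, operations and
# artificial lift attributes plus the early-oil metric (90-day cumulative oil).
wells = pd.read_csv("well_data.csv")

# Sample: hold out wells so the final assessment is honest.
train, test = train_test_split(wells, test_size=0.25, random_state=0)
train, test = train.copy(), test.copy()

# Explore: quick summary statistics to spot gaps and outliers.
print(train.describe())

# Modify: turn the question into a target -- is a well above or below average?
# (The median is used as the cutoff here purely for illustration.)
cutoff = train["cum_oil_90d"].median()
train["above_avg"] = (train["cum_oil_90d"] > cutoff).astype(int)
test["above_avg"] = (test["cum_oil_90d"] > cutoff).astype(int)

numeric_cols = train.select_dtypes("number").columns
features = [c for c in numeric_cols if c not in ("cum_oil_90d", "above_avg")]

# Model: fit a simple classifier on the integrated feature set.
model = GradientBoostingClassifier(random_state=0)
model.fit(train[features], train["above_avg"])

# Assess: check accuracy on the held-out wells.
print("holdout accuracy:", accuracy_score(test["above_avg"], model.predict(test[features])))
```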

“Having a standardized process allows your work to be repeatable, so when you move from one project to the next, you’re able to have consistent results at the outset,” Rollins said. He later described an integration piece that “can hook up to core engineering systems and geoscience systems using SAS Enterprise Guide. We can develop a workflow with almost 600 tasks inside it that will capture production stream data, frack design and landing calculations and merge those together alongside regional geoscience properties from multiple formations—from the target formation, from properties above you and from properties below you and integrate them all every week, restructure the data in a variety of ways” and deploy a visualization tool that pulls all of the technical data into one spot for viewing.
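The merge logic Rollins describes can be pictured with a short pandas sketch. The table names and the api_number key below are assumptions made for illustration; the actual workflow runs as roughly 600 tasks in SAS Enterprise Guide.

```python
import pandas as pd

# Hypothetical per-source extracts, each keyed by a well identifier ("api_number"
# is an assumed key, not necessarily what the Devon workflow uses).
production = pd.read_csv("production_stream.csv")    # production stream data
frack = pd.read_csv("frack_design.csv")              # frack design and landing calculations
geoscience = pd.read_csv("regional_geoscience.csv")  # target, overlying and underlying formation properties

# Merge the engineering and geoscience views into one analysis-ready table.
integrated = (
    production
    .merge(frack, on="api_number", how="left")
    .merge(geoscience, on="api_number", how="left")
)

# Restructure and persist; in practice this would be rerun on a weekly refresh.
integrated.to_csv("integrated_well_view.csv", index=False)
```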

“We can integrate reservoir production data with regional geoscience data with artificial lift settings from PI [system technology] with frack design data from WellView and have it all interact with each other visually,” he explained. “You can click through almost 60 geologic properties and see their relationship with where they are spatially, where they are in the distribution and where they are in the scatter plot to start visually identifying trends. You can also look for operational differences between good wells and bad wells. … You can analyze this all in one spot, and it’s refreshed dynamically.”

In this instance, the team was able to develop a proof of concept showing that predictive modeling in geologically complex areas can be done. As part of the data modification process, the team used a scatter plot matrix to zero in on collinearity, pinpointing highly correlated variables, before conducting some basic statistical tests and testing the hypothesis that the two formations being studied were interfacing with each other. Models were created and compared, a “champion model” was chosen and deployed, and the model’s accuracy was monitored over time.
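A collinearity screen of that kind might look something like the sketch below, which pairs a scatter plot matrix with a simple correlation check. The input file, the 0.9 cutoff and the sampling are assumptions for illustration, not details reported by Devon.

```python
import pandas as pd
import seaborn as sns

# Hypothetical integrated table from the earlier sketch.
wells = pd.read_csv("integrated_well_view.csv")
numeric = wells.select_dtypes("number")

# Visual check: scatter plot matrix of candidate predictors (sampled to keep it readable).
sns.pairplot(numeric.sample(min(len(numeric), 500), random_state=0))

# Numeric check: flag variable pairs whose absolute correlation exceeds the cutoff.
corr = numeric.corr().abs()
high_corr = [
    (a, b, round(corr.loc[a, b], 2))
    for i, a in enumerate(corr.columns)
    for b in corr.columns[i + 1:]
    if corr.loc[a, b] > 0.9
]
print(high_corr)  # drop or combine one variable from each flagged pair before modeling
```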

In the end, the team was able to identify five key variables that were important to predicting well performance. “With average formation properties over a section with very low-resolution data, we could build a model that was correct 74% of the time on its actual future predictions of whether a well would be above average or below average,” Rollins said before pointing to a heat map with hotter colors representing higher probabilities and cooler colors representing lower probabilities.
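A map of that sort can be sketched by coloring well locations with the model’s predicted probability of an above-average well, hotter colors for higher scores. Everything in the snippet below, including the coordinates and probabilities, is synthetic and labeled as such; it only illustrates the plotting idea, not Devon Energy’s data.

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
lon = rng.uniform(-99.0, -97.0, 200)           # assumed well surface locations
lat = rng.uniform(31.0, 33.0, 200)
prob_above_avg = rng.uniform(0.0, 1.0, 200)    # stand-in for model.predict_proba output

# Hotter colors = higher predicted probability of an above-average well.
plt.scatter(lon, lat, c=prob_above_avg, cmap="coolwarm", s=30)
plt.colorbar(label="P(above-average 90-day cumulative oil)")
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.title("Illustrative probability heat map (synthetic data)")
plt.show()
```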

Rollins, however, warned that the data aren’t always going to be perfect, and data scientists must work with the data that are available.

But data are a hot commodity when it comes to analytics.

“Data are valuable,” said Anton Gordon, predictive analytics lead for Baker Hughes. He said that in the past large swaths of data were lost when migrating from one system to another. “You can’t do that anymore, especially when it comes to customer data. Why? Because those data are useful. They allow you to understand how things change over time [and how] trends change. [Data] become this valuable commodity that everybody is trying to get.”

Capturing data is what is really fueling the market, Gordon added, noting that a company with large datasets has an opportunity to turn those data into viable, value-creating solutions.

What sets advanced analytics apart from analysis or other past analytical practices is the speed at which a customer, group or event can be characterized, Gordon said.

“One of the things we’ve seen in the last five to 10 years is the ability to think about what the next purchase decision will be for the customer,” he said, using Netflix’s recommended movie listings as an example. “The system is automatically taking your information, characterizing you, putting you in a particular bucket and saying you’re probably going to behave this way. As the industry moves more and more toward algorithms of that type, this type of analysis will go into other, different sectors.”

However, Rollins acknowledged the skeptics, saying some people either don’t believe that predictive analytics is possible or consider it a fad, which makes developing a strategy and establishing trust all the more important.

“You can build a model that is very sophisticated and probably very accurate, but if you can’t tell anybody how to use it they won’t trust it,” Rollins said. “You have to work and manage that trust and develop that trust so professionals feel comfortable using it. … Buy-in is the most critical component; it is also the most difficult to achieve.”

Contact the author, Velda Addison, at vaddison@hartenergy.com.