It would not be far-fetched to say that players in the oil and gas industry have been slow to adopt new technologies.
Moray Laing, executive lead of oil and gas for SAS, believes that this is partially due to the industry being quite fragmented. There is no site where one company runs the entire operation. Instead, there are contractors working with service companies that have different standards from the operator that owns the asset, he said during the company’s Energy Analytics Summit in Houston.
“That leads to the problem … with managing the data,” Laing said, pointing out the amount of effort that goes into aggregating and integrating data. However, when the oil and gas industry moves into something similar to the Internet of Things, “I think you end up with a fully, contextually aware oil and gas industry. What will happen with innovation in our industry? It won’t rely on the traditional data fragments, and to develop it, you will start to see the emergence of garage industries and garage solutions.”
Simon Sheather, a professor and academic director at Texas A&M University, sees a shift from dashboards to predictive models.
“Organizations have spent a lot of time, money and energy in organizing the capture and use of data. Dashboards are great. They tell you what happened five minutes ago or yesterday. It’s like driving your car down the freeway and looking in the rearview mirror,” Sheather said. “That’s great if the road is straight. But [with] predictive models … in the energy sphere you can predict when there is going to be an outage ahead of time or predict when there is going to be a situation on an oil rig up to two days in advance.”
Their comments came in response to a question about what they believed would be the most likely disruptive force in the oil and gas industry about 10 years from now. The two were among the panelists—others included Paul Barnes, a consultant and former technology manager of innovations for Aera Energy, and Ian Wilson, team lead and co-architect of ConocoPhillips’ Evergreen Global Earth Model project—speaking on people and analytics during the summit, which highlighted data trends in the era of big data, cybersecurity and the generational divide in energy.
In addition to the paradigm shift that greater demand for gas could create, driven, for example, by a surge in natural gas-fueled vehicles, Barnes said a disruptive force could involve staffing plans. At present, everyone drives to work in the oil field to get information, but technology is changing that. “People don’t need to be present to accomplish results. As the industry changes that paradigm, we will start to discover totally different staffing models than what we’ve been used to,” he said, pointing out how he was able to snap a photo of operations at a California plant from a hotel in Puerto Vallarta.
Wilson added that disruption is more of an opportunity. “We’re going to see a change in organization. … There will be a focus on analytics,” he said. “The companies that will reap the benefits of that will be the ones focusing on this issue now and looking to see how this new technology, this new approach, can be integrated with the real business problems we are facing right now.”
Among the challenges is handling massive amounts of data.
“The thing that is scary about big data is not the volume. It’s the breadth,” said Evan Levy, vice president of business consulting for SAS. About 15 years ago, the challenge of the data warehouse was how to add new sources every three or four months. However, today, “as business people, we need access to data as it comes into existence.” New data sources could number into the hundreds in a year. “How many IT [information technology] organizations could practically handle a hundred new data sources next month, a thousand [data sources] next year?”
While speaking on data trends, Levy mentioned obstacles being faced in the oil and gas industry. One company, he said, doesn’t have any internal systems but receives lease, asset and equipment information from different suppliers, partners and consultants. As such, it is building a data center. “Their big challenge is they don’t have the background of knowing how to manage all of these inbound data.”
He described other industries that have figured out how to design and navigate systems with large amounts of data, citing as examples Amazon.com and libraries using the Dewey Decimal classification system. “But none of our data are set up that way,” he said. “In reality we know that on average when a piece of data is created there are 12 downstream systems that rely upon that content,” including production, billing and finance systems.
“The thing that we have come to learn when it comes to data is it is assumed that it is accurate, it was just created, and we spent all the time and energy necessary [to ensure] that what shows up is what someone expects. I don’t want to dispel that myth, but in reality it’s not so good,” Levy said. “The data might be on one system, may be in multiple systems or may be in flight. … Everything that we do is challenging the paradigm because technology evolves so dramatically.”
During his talk, Levy identified data trends and gave action plans:
- Data sharing is a defined responsibility. Create a cross-functional business and IT team to establish data-sharing standards; adjust development plans to ensure the accessibility of data; and budget for IT staff to support data sharing.
- Data are a controlled substance. Create systems of record for operational and analytic access; outlaw data dealing and black market data provisioning; define guidelines for accessing, sharing and using data internally; and centralize and manage data provisioning as a production process and activity.
- Data consumerization. All application data should be packaged for sharing and delivery; establish responsibilities for packaging and publishing data; and develop a plan to support data self-service for commodity data assets.
Pointing out the abundance of sources for consumer and business content and the need for individuals to have access to any data asset, Levy added the challenge of big data is variety. In this regard, the action plan includes establishing a card catalog for data content; managing and tracking data assets as well as removing or replacing unused, low-value assets; and leveraging the IT team’s data knowledge.
“The definition of big data is data bigger than what you can get at. One of the things that I think we’re all finding out is it’s not really how much data we have,” Levy said. “It’s how diverse [these data are], and really how can I get my hands on [them].”
Contact the author, Velda Addison, at firstname.lastname@example.org.