HOUSTON—Data is pouring in from high-performance computing (HPC), and oil and gas leaders say this signals the technology's longevity in the industry.

Speakers from Shell and Schlumberger expanded on the growing opportunities HPC presents for the industry at Rice University’s Oil & Gas High Performance Computing Conference held March 4-6. But while they have seen their companies’ data science applications improve operations across the board, the speakers agreed that they are simultaneously juggling the difficulties those applications bring.

Detlef Hohl, chief scientist of computation and data science at Shell International E&P Inc., said that data presents both a dominant challenge and a dominant opportunity.

“A high-performance computing perspective changed the way we characterize and run our subsurface and well operations. This [HPC] has many mediums for success,” Hohl added.

For Shell, he said, this includes superior-quality interpretations of its shallow section that rival thorough human work; an 80% solution for deeper operations in a fraction of the time; an interpretation stage that has shifted from months to hours; and no false positives.

However, Hohl noted that the challenges are growing at the same pace as the improvements. Specifically, he said, moving the data consumes most of the energy and time in computing, and interpreters, although more efficient, are struggling to keep pace with the speed at which seismic data arrives.

Ken Sheldon, a representative from Schlumberger’s Unconventional Software Technology Center, said the company’s cloud-based HPC services have also faced issues.

“The challenges that we have [are that] the new workstations that people are using these days are constrained,” Sheldon said. “These days a high-end workstation with a much larger problem set—about six wells that are a mile long [with] clusters 15 feet apart—just the fracture simulation [for that] may be 72 or 96 hours of time. You spend a little bit of time creating the grid from that and spend a similar amount of time in the reservoir simulator.

“So, the amount of workflows that you can get through in a week or a month is one, and that’s not really acceptable given the type of analysis that our engineers would like to perform, because they need to do it again and again,” Sheldon added.
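Taken at face value, Sheldon’s figures make the bottleneck easy to quantify. Below is a minimal back-of-envelope tally in Python; the gridding and reservoir-simulation durations are illustrative assumptions read from his quote, not published numbers.

```python
# Back-of-envelope tally of one end-to-end workflow, using the rough
# figures from Sheldon's quote. The gridding and reservoir-simulation
# hours are illustrative assumptions ("a little bit of time" and
# "a similar amount of time"), not published Schlumberger numbers.

fracture_sim_h  = (72 + 96) / 2  # "may be 72 or 96 hours"
gridding_h      = 12             # assumed: "a little bit of time"
reservoir_sim_h = 12             # assumed: "a similar amount of time"

total_h = fracture_sim_h + gridding_h + reservoir_sim_h
print(f"~{total_h:.0f} h, i.e. ~{total_h / 24:.1f} days per workflow")
# ~108 h (~4.5 days): add queueing and rework, and an engineer gets
# through roughly one workflow per week, as Sheldon describes.
```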

Sheldon insisted, however, that this issue presents an “obvious opportunity for throughput improvement by increasing the number of concurrent scenarios that one engineer can orchestrate—that’s where we see the improvement.”
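The improvement Sheldon describes is essentially scenario-level parallelism: if a single run takes days regardless, the gains come from how many runs one engineer can keep in flight at once. A minimal sketch of that orchestration pattern in Python follows; run_scenario is a hypothetical stand-in for submitting a job to a cloud HPC service, not a Schlumberger API.

```python
# Illustrative only: orchestrating several long-running simulation
# scenarios concurrently so one engineer's weekly throughput is set
# by the number of parallel slots, not by the length of a single run.
from concurrent.futures import ThreadPoolExecutor, as_completed
import time

def run_scenario(name: str) -> str:
    """Hypothetical stand-in: submit one scenario and wait for it."""
    time.sleep(1)  # placeholder for a multi-day cloud job
    return f"{name}: done"

scenarios = [f"scenario-{i}" for i in range(6)]

# Six concurrent scenarios instead of one: wall-clock time per batch
# stays roughly that of the slowest run, so throughput scales with
# the number of scenarios the engineer can orchestrate.
with ThreadPoolExecutor(max_workers=6) as pool:
    futures = {pool.submit(run_scenario, s): s for s in scenarios}
    for fut in as_completed(futures):
        print(fut.result())
```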

In agreement, Hohl said his company needs to develop new algorithms and interpretation paradigms to address big data, a fix that he said will continue to drive the business.

As for HPC’s future, Hohl said its importance will grow significantly as it addresses a transforming energy industry.

According to Shell’s Sky Scenario, by 2070 solar, bioenergy and wind will dominate renewables while oil will remain the largest fossil energy source. As a result, demand for high-performance computing is expected to grow substantially to serve these different energy sectors.

“The energy landscape is changing; therefore, we expect our high-performance computing portfolio to change,” Hohl said.