In collaboration with Halliburton and AWS, Explor has achieved a key milestone in cloud-based seismic data processing, the company said on Sept. 9.

Explor successfully ran Seismic Processing, a DecisionSpace 365 cloud application powered by iEnergy, on AWS, leveraging a range of elastic compute instances to optimize key seismic processing workflows.

In the first phase of a proof of concept, multiple benchmarking tests demonstrated:

- An 85% decrease in CDP sort-order times, tested by sorting 308 million traces comprising 1.72 TB from the shot domain to the CDP domain and completing the flow in an hour.
- An 88% decrease in CDP FK filtering times (see the sketch following this list), tested on a 57 million-trace subset of the data comprising 318 GB and completing the flow in less than six minutes.
- An 82% decrease in pre-stack time migration times, tested on the full 165 million-trace dataset comprising 922 GB and completing the flow in 54 minutes.
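FK filtering rejects coherent linear noise, such as ground roll, by muting a fan of slow apparent velocities in the frequency-wavenumber domain. The sketch below is a minimal, illustrative FK dip filter written in NumPy; it is not the DecisionSpace 365 implementation, and the function name, velocity cutoff and synthetic input are assumptions made purely for the example.

```python
# Minimal FK (frequency-wavenumber) dip filter -- illustrative sketch only,
# not the Landmark/DecisionSpace 365 workflow.
import numpy as np

def fk_dip_filter(gather, dt, dx, max_velocity=1500.0):
    """Attenuate linear events with apparent velocity below max_velocity.

    gather: 2D array of shape (n_samples, n_traces), time along axis 0.
    dt: sample interval in seconds; dx: trace spacing in metres.
    """
    spectrum = np.fft.fft2(gather)                         # t-x  ->  f-k domain
    f = np.fft.fftfreq(gather.shape[0], d=dt)[:, None]     # temporal frequencies (Hz)
    k = np.fft.fftfreq(gather.shape[1], d=dx)[None, :]     # spatial wavenumbers (1/m)

    # Apparent velocity of each (f, k) cell; slow cells form the reject fan.
    with np.errstate(divide="ignore", invalid="ignore"):
        reject = (np.abs(f) / np.abs(k)) < max_velocity    # inf/nan compare as False

    spectrum[reject] = 0.0                                 # mute the fan
    return np.real(np.fft.ifft2(spectrum))                 # back to the t-x domain

# Example: filter a synthetic 1,000-sample by 96-trace shot gather.
gather = np.random.randn(1000, 96)
filtered = fk_dip_filter(gather, dt=0.002, dx=12.5)
```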

For this project, Explor provided the 3D seismic dataset along with data science and geophysical expertise. Landmark, a Halliburton business line, provided Seismic Processing, a DecisionSpace® 365 cloud application powered by iEnergy, as well as technical expertise, and Amazon Web Services (AWS) provided the cloud computing resources and a team of technical experts and Solutions Architects.

Working together on the project, Explor, Halliburton and AWS were able to optimize the cloud solution to reduce total processing timelines by 90%.

“This outcome will drive a step-change in seismic data processing and is a key stepping-stone in the digital transformation of the global seismic industry.  By reducing both timelines and the CapEx burden on seismic data processors, the industry will be able to scale on demand as we acquire ever-increasing seismic trace densities,” Allan Châtenay, president of Explor, said.

The challenge driving this project was the need to process ever-increasing volumes of seismic data while delivering higher-quality results at lower cost. In recent years, the seismic industry has dramatically increased 3D seismic trace densities, with several companies (including Explor) breaking the 100 million traces/km2 threshold.

Surveys exceeding a billion traces/km2 are now being planned. This exponential growth in acquired seismic trace densities poses new challenges for seismic data processing, as processors must deal with data volumes of several hundred terabytes for small- and medium-sized surveys, while large high-density surveys produce petabytes of data each month. Difficult and volatile market conditions make capital investments in high-performance computing infrastructure challenging and risky for processing companies.
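To put those volumes in perspective, a rough back-of-envelope sketch is shown below. The ~5.6 KB-per-trace figure is implied by the 165 million-trace, 922 GB dataset quoted above; the survey areas are hypothetical examples chosen for illustration, not figures from the article.

```python
# Back-of-envelope survey data volumes -- illustrative assumptions only.
TRACE_BYTES = 5_600   # ~5.6 KB/trace, implied by 922 GB / 165 million traces above

def survey_size_tb(trace_density_per_km2, area_km2, trace_bytes=TRACE_BYTES):
    """Approximate raw data volume of a 3D survey in terabytes."""
    return trace_density_per_km2 * area_km2 * trace_bytes / 1e12

# A hypothetical 500 km2 survey at 100 million traces/km2 already reaches the
# hundreds-of-terabytes range cited above (~280 TB) ...
print(f"{survey_size_tb(100e6, 500):,.0f} TB")

# ... while a hypothetical 1,000 km2 survey at a billion traces/km2 runs to
# roughly 5.6 PB of raw data.
print(f"{survey_size_tb(1e9, 1000) / 1000:,.1f} PB")
```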