It has often been noted that periods of slow activity lend themselves to technological breakthroughs, and conferences like the recent Society of Exploration Geophysicists (SEG) annual meeting indicate that nobody was sleepwalking through the latest downturn. Let us take a look at some of the industry milestones that marked 2010.

Hardware
Acquisition technology, both land and marine, has seen huge strides in the past few years in everything from cableless land acquisition systems to wide-azimuth (WAZ) marine acquisition geometries.

In recent years, large channel-count land surveys have been in the news, and in March, WesternGeco announced that it had set a record with its UniQ integrated point-receiver land seismic system by reaching an 80,000-channel-count milestone. During February, the system acquired and quality-checked one terabyte of data per hour – the equivalent of five days' production for a typical 3,000-channel conventional crew.

A command-and-control system helps this seismic vessel stay aligned to the ice breaker. (Photo courtesy of ION Geophysical)

A new cableless company, iSeis, introduced Sigma, a system that is helping to overcome some of the industry’s concerns about cableless technology. Among its benefits are a multistep approach to time-stamping rather than relying solely on GPS, a mesh radio network that reduces data loss, and the ability to harvest data without bringing the boxes into a central station for downloading.

Sercel also made a serious entry into the cableless arena this year by marrying the technology of its 428XL system with Unite to provide a “seamless” approach to infill and detour management requirements for seismic acquisition programs in complex environments. The new system integrates data on a single medium and collects cableless data by real-time radio or infield wireless harvesting.

In the source realm, Geokinetics introduced onSEIS, a lightweight, agile, and eco-friendly vibrator system that offers the benefits of traditional impulsive surface sources with the added advantage of synchronization to improve operational efficiency. The new system offers a solution for urban areas, difficult terrain, and limited-access areas.

Another technology that has taken off this year is MicroSeismic Inc.’s buried arrays, which provide permanent microseismic monitoring. A near-surface array of permanently installed geophones is buried a few hundred feet below the earth’s surface and wirelessly linked together to monitor an area up to 500 sq miles (1,295 sq km). The cost to monitor an entire field with buried arrays is the same as the cost to equip a single well with downhole microseismic, according to Mike Mueller, chief geophysicist.

Non-seismic methodologies also saw increased use this year, both onshore and offshore. The cost-effectiveness of techniques like gravity and magnetic surveys allows operators to view large swaths of prospective acreage before homing in with 2-D and 3-D seismic surveys, and a newer technique known as gravity gradiometry imaging provides high-resolution images of density variations in the subsurface. ARKeX already has conducted such surveys in the Northern Oman Mountains of the United Arab Emirates, the transition zone in the Gulf of Mexico, and a multiclient project offshore Gabon that combined high-resolution marine gravity gradiometry data with seismic data. Additionally, Bell Geospace applied its full-tensor gravity gradiometry technique to image structural fault closures, carbonate buildups, and subsalt targets.

Traditional seismic in the marine sector has focused increasingly on newer techniques like WAZ surveying, and CGGVeritas announced in August that it had been awarded a 6,950-sq-mile (18,000-sq-km) extension to a 3-D WAZ survey it is shooting for Pemex in the Mexican waters of the Gulf of Mexico, making it the first WAZ survey of this scale in Latin America and one of the largest in the world.

But streamer technology is pushing the boundaries technologically as well. ION has introduced a system that can manage arctic environments by using streamer steering technology to maneuver equipment under and around ice while also employing ice-resistant GPS buoys for accurate positioning and a command-and-control module that provides synchronized navigation for the vessel to follow in the wake of the ice breaker. The company already has captured seismic images for phases 1 and 2 of its Northeast GreenlandSPAN survey.

WesternGeco, meanwhile, began the industry’s first dual coil-shooting survey in the western Gulf of Mexico. This is a new method of acquiring long-offset marine seismic data using four vessels following a circular path. It enables better azimuthal coverage and a higher signal-to-noise ratio, the company said.

Ocean-bottom cables (OBC), a technology that many thought nodes would replace, are strengthening their position. WGP Group currently is conducting the world’s largest redeployable 4-D/4-C OBC project over the Azeri-Chirag-Gueneshli (ACG) field in the Caspian Sea. The ACG is one of the world’s largest producing oil fields, so WGP designed a system that could be laid down and picked up as often as necessary to do repeat surveys.

Compressional impedance estimated with probabilistic neural networks (a) delivers better well-log correlation than (b) impedance obtained from poststack inversion. (Image courtesy of Arcis)

PGS also likes the OBC approach to reservoir monitoring. In early 2010 the company announced that its OptoSeis fiber-optic OBC system had been certified for operations in water depths to 984 ft (300 m). More recently, Det Norske Veritas qualified the system for deepwater operations to 6,562 ft (2,000 m).

The electromagnetics (EM) segment of the industry was full of news this year, with OHMRSI deciding to sell its marine acquisition business to focus on processing and interpretation.

Meanwhile, a presentation by Statoil at the European Association of Geoscientists and Engineers annual meeting attempted to place a quantitative value on the information provided by controlled-source EM (CSEM) surveys.

“Based on performance tracking and review of the prediction strength, the economical value of CSEM data can be more than 10 times above the typical costs for a CSEM survey and analysis,” the authors noted. “This value includes only the value related to the actual drill-or-drop decision, neglecting the potential value of the data in other settings, for example, mapping the outline of a prospect, decisions on drill location, ranking of prospects, and potential later use as calibration for other cases. The real value of the CSEM data may, therefore, be considerably higher.”
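The drill-or-drop value the authors describe can be framed as a standard Bayesian value-of-information calculation. The sketch below is purely illustrative and is not drawn from the Statoil study; the prior success probability, dollar figures, and survey reliability numbers are all hypothetical assumptions.

```python
def expected_value(p, value_success, dry_hole_cost):
    """Expected value of drilling given probability of success p (values in $M)."""
    return p * value_success - (1 - p) * dry_hole_cost

def voi(prior, value_success, dry_hole_cost, sensitivity, specificity):
    """Value of information of a binary survey result (e.g. a CSEM anomaly)."""
    # Probability the survey comes back positive (anomaly detected).
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    # Posterior probability of success after each outcome (Bayes' rule).
    p_given_pos = sensitivity * prior / p_pos
    p_given_neg = (1 - sensitivity) * prior / (1 - p_pos)
    # Best decision (drill or drop) after each outcome: drill only if EV > 0.
    ev_with = (p_pos * max(expected_value(p_given_pos, value_success, dry_hole_cost), 0.0)
               + (1 - p_pos) * max(expected_value(p_given_neg, value_success, dry_hole_cost), 0.0))
    ev_without = max(expected_value(prior, value_success, dry_hole_cost), 0.0)
    return ev_with - ev_without

# Hypothetical inputs: 30% prior chance of success, $500M success case,
# $80M dry-hole cost, survey sensitivity 0.9 and specificity 0.8.
print(round(voi(0.30, 500.0, 80.0, 0.9, 0.8), 1))  # prints 29.8
```

With these made-up inputs the survey adds roughly $30 million of expected value per prospect, which is how a survey can be worth many multiples of its cost; in practice the reliability inputs would come from performance tracking of past CSEM predictions, as the authors describe.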

The value could increase even more if an experiment by PGS leads to a new acquisition system. The company tested the concept of “towed EM,” where EM and seismic data were acquired together in one line. The prototype system was deployed to collect data at similar production rates to 2-D seismic vessels for further processing and inversion development. Onboard quality control indicated that the EM acquisition was unaffected by the seismic source, and vice versa.

In the formation evaluation segment, Schlumberger introduced a multifrequency dielectric dispersion tool capable of accurately quantifying residual hydrocarbon volume, Archie’s exponents, and formation cation exchange capacity.

Baker Hughes successfully deployed its Nautilus suite of HP/HT formation evaluation technology in an ultra-deep gas discovery in the shallow-water Gulf of Mexico. It was used for logging operations to 28,134 ft (8,575 m), and the tools and sensors withstood the lengthy pipe-conveyed operation and the extreme HP/HT conditions.

Software, workflows
Processing jargon starts to sound like alphabet soup after a while, but it is all geared toward using algorithms and mathematical computations to make images as rich and accurate as possible. Reverse time migration is almost a household term already, and other methodologies are gaining acceptance as they prove their worth in imaging complex environments.

TGS, for instance, threw a host of acronyms at the deltaic slope offshore Liberia, including surface-related multiple elimination, true-azimuth multiple elimination, tomographic velocity model-building, and Kirchhoff prestack depth migration. The result was an improvement in continuity and more geologically reasonable structures.

Neural networks also are getting another look. Calgary-based Arcis is using a probabilistic neural network approach to estimate the compressional impedance section, which, in turn, is helpful in delineating reservoir lithological properties. The company’s research indicates that this methodology delivers better well-log correlation than that obtained with a post-stack inversion.
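For readers unfamiliar with the approach, the regression flavor of the probabilistic neural network (often called a general regression neural network) amounts to Gaussian-kernel-weighted averaging of training targets. The sketch below is a minimal illustration with made-up attribute and impedance values, not Arcis's actual workflow.

```python
import numpy as np

def grnn_predict(train_attrs, train_target, query_attrs, sigma=1.0):
    """Gaussian-kernel regression (GRNN), the regression cousin of the PNN:
    each prediction is a distance-weighted average of the training targets."""
    # Squared distances between every query point and every training point.
    d2 = ((query_attrs[:, None, :] - train_attrs[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))            # Gaussian kernel weights
    return (w @ train_target) / w.sum(axis=1)        # normalized weighted average

# Toy demo: one seismic attribute vs. a synthetic "impedance" trend
# (both entirely hypothetical numbers).
attrs = np.linspace(0.0, 1.0, 20)[:, None]
imped = 6000.0 + 2000.0 * attrs[:, 0]
pred = grnn_predict(attrs, imped, np.array([[0.5]]), sigma=0.05)
print(round(float(pred[0])))  # prints 7000
```

In a real well-tie workflow the training targets would be impedance values from well logs and the attributes would come from the seismic volume, with sigma tuned by cross-validation.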

Additionally, SMT founder Tom Smith has launched Geophysical Insights and is researching the use of neural network technology for attribute classification. His goal is to create a completely numerical, unbiased methodology that evaluates possible combinations of selected attributes.

Geomage has introduced “multifocusing,” a new method of moveout correction and seismic data stacking that does not require accurate velocity modeling. It uses time imaging of low-fold common midpoint data to improve signal-to-noise ratio, improve coherence, and increase resolution.
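The signal-to-noise benefit of stacking rests on a general principle: averaging N traces whose noise is independent reduces the noise level by roughly √N. A minimal demonstration with synthetic traces (this illustrates stacking in general, not the multifocusing algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_samples = 36, 500

# A common signal buried in independent noise on each of 36 traces.
signal = np.sin(np.linspace(0.0, 20.0, n_samples))
traces = signal + rng.normal(0.0, 1.0, size=(n_traces, n_samples))

stack = traces.mean(axis=0)                 # stack = average over traces
noise_single = (traces[0] - signal).std()   # noise level on one raw trace
noise_stack = (stack - signal).std()        # noise level after stacking

# Noise suppression factor: expect roughly sqrt(36) = 6.
print(round(noise_single / noise_stack, 1))
```

The same arithmetic is why improving the coherence of low-fold gathers before stacking pays off: the closer the traces are to sharing one aligned signal, the closer the stack gets to the ideal √N noise reduction.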

When it comes to interpretation, two themes have been apparent this year. One is the continued push to integrate the workflow so that the same data are being massaged throughout the process. Landmark introduced its DecisionSpace Desktop solution with this goal in mind, while Paradigm has been tweaking its Rock and Fluid Canvas.

The second theme is unconventional resource plays. Several conferences and webcasts this year focused on the need for seismic data in developing unconventional fields, and most of the major players have made this a key part of their toolkit.

ION, for instance, is promoting this through its GX Technology subsidiary. The company recently acquired a 200-sq-mile (518-sq-km) multicomponent, multiclient survey in Clearfield County, Pa., that encompassed survey design, permitting, processing, and attribute analysis and inversion. These types of projects are becoming increasingly attractive to operators in shale plays, the company said.

Global Geophysical also applies cutting-edge technology across the entire seismic workflow, acquiring what it calls “reservoir-grade” seismic – long-offset, full-azimuth 3-D acquisition followed by deriving anisotropic parameters in the shale. This, in turn, allows the interpreter to extract rock property information through seismic inversion and ultimately map impedance to aid in well placement.

Transform Software and Services was one of the first companies to develop software specifically targeted at shale plays, and a recent presentation focused on the Barnett shale, which still presents operators with challenges after 30 years of development.

Transform software automates the interpretation of stimulated reservoir volumes from microseismic data and calibrates these estimates with wellbore and fracture-stage production using new statistical techniques.

This is the tip of the iceberg in terms of 2010 milestones and developments, but it reflects the innovative spirit of the exploration industry as it continues to triumph over the challenges posed by the search for oil and gas. 2011 should be an interesting year indeed.