It’s taken more than 15 years of “research,” aka false starts and frustration, but the news is finally here – simultaneous source acquisition is now possible in a marine environment.

The technology is an established land-acquisition technique with a proven record of improved data quality and increased acquisition efficiency. But developing it for marine acquisition has been slower, primarily because of the extra constraints that marine sources place on the acquisition.

Marine sources lack the ability to shape the source wavelet the way that land vibrators do, and adding extra sources on extra vessels is more expensive than adding sources on land.

But WesternGeco and Apache, under the aegis of a technology partnership aimed at collaborative technology development and fast-tracking promising technologies, took the plunge. The companies took the concept from small-scale tests in the North Sea through simulation exercises on existing and synthetic data to a full, field-development-scale survey offshore Australia.

The SimSource technology was applied to a conventional, narrow-azimuth survey called Cambozola, operated in partnership with Finder Exploration, Santos, OMV Australia, JX Nippon Oil & Gas Exploration Corp., Tap Oil, and Japan Australia LNG (MIMI).

Data in the region were previously acquired using a “flip-flop” technique in which the two sources were fired alternately every 18.75 m (62 ft), creating a 37.5-m (123-ft) shot interval for each source line. Firing the sources simultaneously halves the per-source shot interval and doubles the fold, resulting in better-sampled data that improve random and coherent noise attenuation and imaging. The acquisition technique requires no extra time or vessels.
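
To see the arithmetic, the Python sketch below compares the per-source shot interval and nominal fold of the two schemes, using the shot spacing quoted above and assuming an otherwise identical receiver spread.

    # Shot-interval arithmetic for the two acquisition schemes described above.
    # The 18.75-m spacing is from the article; the fold comparison assumes the
    # receiver spread is unchanged between the two schemes.
    vessel_shot_spacing_m = 18.75   # distance between consecutive firings

    # Flip-flop: the two sources alternate, so each source line is sampled
    # only at every other shot point.
    flip_flop_interval_m = 2 * vessel_shot_spacing_m         # 37.5 m per source

    # Simultaneous shooting: both sources fire at every shot point.
    simsource_interval_m = vessel_shot_spacing_m              # 18.75 m per source

    # Nominal fold scales inversely with the per-source shot interval.
    fold_ratio = flip_flop_interval_m / simsource_interval_m  # = 2.0

    print(f"Flip-flop shot interval per source:    {flip_flop_interval_m} m")
    print(f"Simultaneous shot interval per source: {simsource_interval_m} m")
    print(f"Nominal fold increase: x{fold_ratio:.0f}")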

Conventional acquisition (left) vs. simultaneous shooting (right) offshore Australia. (Photo courtesy of Apache Corp.)

At its heart, the simultaneous source technique is simple. The sources are fired with activation times dithered relative to one another, and a sparse inversion algorithm exploits those dithers to separate the interfering sources. Once separated, the data are processed by conventional means. The critical factor is the quality of the source separation. Previous studies indicated that such separation is possible at the proposed shot intervals.
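
The separation principle can be illustrated with a toy sketch. Aligned to the master source's clock, the master's energy repeats coherently from shot to shot while the dithered source's energy lands at a different time on every shot, and a sparsity- or coherence-based inversion exploits exactly that contrast. In the sketch below a simple median over shots stands in for the actual sparse inversion, and the record length, dither window and spike "wavelets" are illustrative assumptions rather than survey parameters.

    import numpy as np

    rng = np.random.default_rng(0)
    n_shots, n_samples, dt = 50, 500, 0.004        # 2-s records at 4 ms (assumed)
    t = np.arange(n_samples) * dt

    def toy_record(arrival_s):
        """Single-trace record with one spike arrival (toy wavelet)."""
        rec = np.zeros(n_samples)
        rec[int(round(arrival_s / dt))] = 1.0
        return rec

    # Master source A produces the same arrival on every shot; source B is
    # repeatable too, but fires with a random dither on each shot.
    dithers = rng.uniform(0.0, 0.5, n_shots)       # assumed 0-500 ms window
    blended = np.empty((n_shots, n_samples))
    for i, d in enumerate(dithers):
        blended[i] = toy_record(0.8) + toy_record(1.2 + d)

    # Aligned to A's clock, A is coherent shot-to-shot while B's crosstalk is
    # randomly shifted. A median over shots (a stand-in for the sparse
    # inversion) therefore recovers A and rejects B.
    estimate_a = np.median(blended, axis=0)
    crosstalk = blended - estimate_a               # B's energy plus leakage

    print("Recovered master arrival at", round(float(t[int(np.argmax(estimate_a))]), 3), "s")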

During the acquisition, the two sources were fired simultaneously every 18.75 m. One source, termed the “master,” shot on position; the other was fired with prescribed time differences, or dithers, relative to the master shot. These dithers were randomly distributed over a small time window.
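
A minimal sketch of such a firing schedule appears below. The vessel speed, shot count and dither window are assumed values chosen only to illustrate the master/dithered timing relationship; they are not parameters of the Cambozola survey.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative sail-line parameters (assumed, not from the article).
    shot_spacing_m = 18.75          # shot-point spacing along the sail line
    vessel_speed_ms = 2.5           # roughly 5 knots
    n_shots = 10
    dither_window_s = (0.0, 0.5)    # the "small time window" (assumed width)

    # The master source fires on position, i.e. at regular times along the line.
    master_times = np.arange(n_shots) * shot_spacing_m / vessel_speed_ms

    # The second source fires at the master time plus a prescribed random dither.
    dithers = rng.uniform(*dither_window_s, n_shots)
    dithered_times = master_times + dithers

    for i, (tm, td) in enumerate(zip(master_times, dithered_times)):
        print(f"shot {i:2d}: master {tm:7.3f} s, dithered source {td:7.3f} s "
              f"(dither {td - tm:+.3f} s)")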

In addition to standard quality control (QC) products, dedicated QC products were designed to verify that the sources fired at the agreed-upon times. The survey was acquired without incident.
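
One such check can be as simple as comparing the recorded firing times against the prescribed dithered schedule, as in the hypothetical sketch below; the firing_time_qc helper and the 1-ms tolerance are illustrative assumptions, not the QC products actually deployed on the survey.

    import numpy as np

    def firing_time_qc(prescribed_s, recorded_s, tolerance_s=0.001):
        """Flag shots whose recorded firing time deviates from the prescribed
        (dithered) time by more than an assumed 1-ms tolerance."""
        error = np.asarray(recorded_s) - np.asarray(prescribed_s)
        flagged = np.flatnonzero(np.abs(error) > tolerance_s)
        return error, flagged

    # Toy usage: three shots, the second fired 3 ms late.
    err, flagged = firing_time_qc([10.000, 17.523, 25.081],
                                  [10.0002, 17.526, 25.0809])
    print("timing errors (s):", err)
    print("shots outside tolerance:", flagged)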

Once acquired, the data needed to be handled so that any processing applied before separation preserved the signal from both sources. In this type of survey, noise that is not source-generated should also be removed prior to separation.

After separation, the data volume is doubled, and each trace is associated with a single source, at which point conventional processing can be used.
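
The bookkeeping consequence is easy to picture: every blended record yields two single-source traces, each tagged with the source that generated it, which is why the trace count doubles. The sketch below uses a hypothetical Trace container to show that step; it does not reflect the actual trace-header conventions of the project.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Trace:
        shot_index: int
        source_id: str          # "A" (master) or "B" (dithered source)
        samples: list

    def tag_separated_traces(separated_pairs) -> List[Trace]:
        """Turn each blended record's pair of separated outputs into two
        single-source traces, doubling the trace count."""
        out = []
        for shot_index, (trace_a, trace_b) in enumerate(separated_pairs):
            out.append(Trace(shot_index, "A", list(trace_a)))
            out.append(Trace(shot_index, "B", list(trace_b)))
        return out

    # Toy usage: three blended shots become six single-source traces.
    pairs = [([0.0, 1.0], [0.0, 0.5])] * 3
    print(len(tag_separated_traces(pairs)), "traces")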

Compared with the vintage data, the final data quality was judged superior in terms of signal-to-noise ratio and amplitude variation with offset (AVO) analysis, a prime goal for this survey.

Contact the author, Rhonda Duey, at rduey@hartenergy.com.