In today’s world of eye-popping technology, it’s often hard to remember that we once lived without cell phones, BlackBerrys, computers, the Internet, DVD players, etc.

What’s even harder to imagine is geologists and geophysicists working without workstations and modeling software.

But these technologies were not available to those entering the industry in the ’70s and ’80s. They, along with many more exploration technologies, have been developed in the relatively recent past. Rocky Roden, a geophysical consultant, recently outlined some of these game changers.

Roden spoke at the recent Houston Geological Society “TechnoConference,” the theme of which was “Ten years of technology.” His talk, titled “Geoscience Technology in the Search for Hydrocarbons this Past Decade,” was a smorgasbord of technologies that truly have altered the way we look for oil and gas.

Until 1985, he said, technology was really not a big factor in exploration. But the introduction of workstations enabled explorationists to work with 3-D data, and the advent of horizontal drilling increased success rates.

“From the mid-’90s until today, technology drives what we do,” he said. “This is predominantly driven by the increases in compute power.”

He showed a laundry list of significant discoveries in the past decade, mostly driven by these improvements. After a trough in 2004, the industry has rebounded in discoveries, with deepwater discoveries accounting for 42 to 54% of the total from 2006 to 2009. Petrobras’ success in its presalt finds is partially responsible for this increase.

So what technologies have been driving these successes? First, there are advances in seismic acquisition and processing: wide-azimuth surveys, longer offsets, and prestack depth migration, for instance, have all helped improve imaging beneath salt. On land, cableless recording systems allow for very dense receiver arrays, providing a better subsurface image.

Another advance that is still not widely used is multicomponent seismic. These include 3-D/3-C surveys, which record a three-dimensional survey with three-component geophones; 9-C surveys, which record three components of motion from each of three sources generating orthogonal motions; 4-C surveys, used offshore, which pair a three-component ocean-bottom seismometer with a hydrophone; and 4-D surveys, time-lapse 3-D surveys repeated over time, usually to measure changes in acoustic response as a reservoir is produced. Time-lapse surveys are not always acquired with multicomponent receivers, but the added shear wave information offers useful insights about lithology and fluid content.

Roden added that 4-D surveys are now so common that 75% of the seismic surveys being shot in the North Sea are 4-D assessments of existing fields.

Seismic processing has taken advantage of increased compute power, and companies use multiple CPU clusters to speed up processing time. Roden said that processing contractors have some of the largest computer processing facilities of any industry or government in the world.

Unconventional gas operators are taking advantage of advances in seismic technology for fracture characterization. This can be done in the wellbore with logging tools and vertical seismic profiling tools, but it can also be done from surface seismic: by studying geometric attributes, especially similarity and curvature; by analyzing compressional waves through azimuthal variations in amplitude, amplitude vs. offset, normal-moveout velocity, impedance, and attenuation; by measuring shear wave splitting and anisotropy; through passive seismic monitoring; and through remote sensing, which identifies fracture lineaments.
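
One of the compressional-wave approaches above, azimuthal amplitude variation, can be sketched with a small least-squares fit. This is a generic illustration rather than anyone's actual workflow: amplitudes measured at different source-receiver azimuths are fit to a two-term cosine model, and the phase of the fitted ellipse indicates the dominant fracture orientation. The function name and synthetic data here are my own.

```python
import numpy as np

def fit_fracture_azimuth(azimuths_deg, amplitudes):
    """Fit A(phi) = c0 + c1*cos(2*phi) + c2*sin(2*phi) by least squares.

    The 180-degree periodicity (hence the factor of 2) reflects that a
    fracture set looks the same from opposite azimuths. Returns the
    azimuth of maximum amplitude in degrees, in [0, 180).
    """
    phi = np.radians(np.asarray(azimuths_deg, dtype=float))
    # Design matrix for the linear model in (c0, c1, c2)
    G = np.column_stack([np.ones_like(phi), np.cos(2 * phi), np.sin(2 * phi)])
    c0, c1, c2 = np.linalg.lstsq(G, np.asarray(amplitudes, dtype=float),
                                 rcond=None)[0]
    # Phase of the cos(2*phi) ellipse gives the dominant orientation
    return (0.5 * np.degrees(np.arctan2(c2, c1))) % 180.0

# Synthetic test: amplitudes strongest along a 60-degree fracture trend
az = np.arange(0, 180, 10)
amps = 1.0 + 0.3 * np.cos(2 * np.radians(az - 60.0))
est = fit_fracture_azimuth(az, amps)
```

In practice the fit would be run gather by gather on real prestack amplitudes, with noise making the estimate statistical rather than exact.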

Controlled-source electromagnetics is another significant development. Roden said that although the technology received tremendous hype when it was first introduced, it remains a good complement to seismic.

The use of attributes has grown significantly since complex trace attributes were introduced in the 1970s. Roden said that there are now hundreds of attributes, each effectively a measurement derived from the data that helps guide an interpretation.
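
The original complex trace attributes are still among the most used, and they fall out of the Hilbert transform in a few lines. The sketch below is my own illustration (the function name and synthetic trace are not from the talk): it forms the analytic signal of a trace and extracts the envelope (reflection strength), instantaneous phase, and instantaneous frequency.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_attributes(trace, dt):
    """Classic complex-trace attributes for a single seismic trace.

    trace: 1-D array of amplitudes; dt: sample interval in seconds.
    Returns envelope, instantaneous phase (radians, unwrapped), and
    instantaneous frequency (Hz).
    """
    analytic = hilbert(trace)              # trace + i * Hilbert(trace)
    envelope = np.abs(analytic)            # reflection strength
    phase = np.unwrap(np.angle(analytic))  # instantaneous phase
    # Instantaneous frequency = (1 / 2*pi) * d(phase)/dt
    inst_freq = np.gradient(phase, dt) / (2.0 * np.pi)
    return envelope, phase, inst_freq

# Synthetic example: a 30 Hz wavelet with a Gaussian envelope centered at 0.5 s
dt = 0.004                                 # 4 ms sampling
t = np.arange(0.0, 1.0, dt)
trace = np.exp(-((t - 0.5) ** 2) / 0.01) * np.cos(2 * np.pi * 30 * t)
env, ph, freq = instantaneous_attributes(trace, dt)
```

On the synthetic trace, the envelope peaks at the center of the wavelet and the instantaneous frequency there sits near 30 Hz, which is the kind of behavior interpreters exploit on real data.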

Spectral decomposition breaks the seismic data down into discrete frequency bands, he said. It can help identify layering, stratigraphy, and sometimes even direct hydrocarbon indicators. Pore pressure prediction and visualization have also been significant advances.
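
Spectral decomposition can be implemented several ways (short-time Fourier transform, wavelet transform, matching pursuit); a short-time Fourier sketch is the simplest to show. The function and synthetic trace below are my own illustration, assuming an STFT-based approach: the trace is windowed through time and its amplitude is summed within each chosen frequency band, so events tuned to different thicknesses light up in different bands.

```python
import numpy as np
from scipy.signal import stft

def spectral_decompose(trace, dt, bands):
    """Split a trace into amplitude-vs-time curves for discrete frequency bands.

    trace: 1-D array; dt: sample interval in seconds;
    bands: list of (f_lo, f_hi) tuples in Hz.
    Returns a dict mapping each band to the STFT amplitude summed over
    that band at every output time step.
    """
    freqs, times, Z = stft(trace, fs=1.0 / dt, nperseg=64)
    amp = np.abs(Z)
    out = {}
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        out[(lo, hi)] = amp[mask].sum(axis=0)
    return out

# Example: a 20 Hz interval followed by a 60 Hz interval
dt = 0.002
t = np.arange(0.0, 2.0, dt)
trace = np.where(t < 1.0,
                 np.sin(2 * np.pi * 20 * t),
                 np.sin(2 * np.pi * 60 * t))
bands = spectral_decompose(trace, dt, [(10, 30), (50, 70)])
```

Each band curve behaves like one frequency slice of a spectral decomposition volume: the low band is energetic where the 20 Hz interval is, the high band where the 60 Hz interval is.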

What will the next decade bring? Roden offered a few ideas:

1. Moore’s law will end in 2020.

2. Human knowledge will double. More scientific knowledge was created in the first decade of this century than in all of prior history.

3. Parallel processing and the integration of GPUs with CPUs will push the capacity of desktop workstations to not only handle ever-increasing volumes of data but also produce high-end graphics.

4. Nanotechnology and miniaturization of microprocessors will truly enable the intelligent oilfield.

5. Broadband data communications will continue to improve.

Looks like I’ll have plenty to write about in the next 10 years!