One of the goals of today’s geoscientist is to provide the reservoir engineer with a description of the reservoir from which he or she can work, optimizing reservoir production over the life of the field. This requires log, core, seismic and production data, as well as geotechnical flair, thought, experience and local knowledge. The current work environment reflects this, with multidisciplinary teams now the norm in many oil companies. Such teams started appearing in the early 1990s and, after a decade of experience, are more cross-disciplinary than ever before. The geologist, geophysicist, petrophysicist and reservoir engineer are now very much integrated and have a healthy knowledge of each other’s disciplines.

In the past, reservoir characterization was mostly deterministic, whereas today probabilistic techniques are widely used, and many of the reservoir parameters are assigned risk factors. Reservoir descriptions have therefore moved from being primarily qualitative to largely quantitative. At the same time, we have progressed from using extrapolated 2-D seismic and sparse well data to using 3-D seismic and 3-D models that rely heavily on seismic attributes calibrated to well data. This involves integrating and iterating increasingly large data volumes to provide reservoir simulation models that yield better predictions of field performance.

Figure 1. In Hampson-Russell’s AVO software, offset modeling and analysis of prestack seismic data are critical AVO tools for fluid modeling and amplitude mapping. In addition, Bayes’ theorem and Monte Carlo simulation can be used to build fluid probability maps. (All images courtesy of Hampson-Russell)
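To make the probabilistic approach concrete, the short sketch below shows how Bayes’ theorem and Monte Carlo simulation can be combined to turn an inverted attribute into a fluid probability, the principle behind the fluid probability maps mentioned in Figure 1. It is a minimal, stand-alone Python illustration; the distributions, prior and impedance values are assumed for the example and do not come from any real calibration or from Hampson-Russell’s implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000

# Monte Carlo step: simulate acoustic impedance for two fluid scenarios from
# hypothetical (assumed, not calibrated) means and standard deviations.
ai_brine = rng.normal(loc=6.8e6, scale=0.4e6, size=n)
ai_gas = rng.normal(loc=6.1e6, scale=0.5e6, size=n)

def likelihood(samples, observed, tol):
    """Fraction of simulated samples falling within +/- tol of the observed value."""
    return np.mean(np.abs(samples - observed) < tol)

def fluid_probability(observed_ai, prior_gas=0.3, tol=0.1e6):
    """Bayes' theorem: P(gas | observed impedance) from simulated likelihoods and a prior."""
    l_gas = likelihood(ai_gas, observed_ai, tol)
    l_brine = likelihood(ai_brine, observed_ai, tol)
    numerator = l_gas * prior_gas
    denominator = numerator + l_brine * (1.0 - prior_gas)
    return numerator / denominator if denominator > 0 else prior_gas

# Applied cell by cell to an inverted impedance map, this yields a fluid probability map.
print(fluid_probability(6.2e6))
```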

Within the context of an exploration environment, reservoir characterization can be defined as describing the anticipated distribution and properties of the reservoir: the 3-D mapping of porosity, net-to-gross, reservoir thickness, pore fluid, permeability and so on. We no longer work in the isolation of our own disciplines since, following reservoir discovery, accurate reservoir characterization (an ultimate goal) draws on all of the key geophysical techniques of today: seismic inversion and attributes; amplitude versus offset (AVO) and rock physics; and 4-D, 3-C and 4-C seismic volume interpretation.

Technology has always played a key role in the oil and gas industry. As computing power has grown, exploration and production software has expanded with it. Our ability to handle large volumes of data means that geophysical techniques such as rock-property cubes; elastic impedance; and simultaneous inversion for compressional impedance, shear impedance, density and lambda-mu-rho have, at least for some companies, become standard techniques applied to all prospects. We are seeing the boundary between AVO and inversion — or prestack and poststack studies — starting to blur now that software is readily available for performing full prestack inversions.
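As a concrete footnote to the inversion attributes listed above, the lambda-mu-rho pair follows directly from inverted P- and S-impedance through the standard elastic relations lambda-rho = Ip^2 - 2 Is^2 and mu-rho = Is^2. The short Python sketch below applies that transform; the array names and values are placeholders rather than output from any particular inversion package.

```python
import numpy as np

def lambda_mu_rho(ip, is_):
    """Lambda-rho and mu-rho from P-impedance and S-impedance."""
    mu_rho = is_ ** 2                      # mu*rho     = Is^2
    lambda_rho = ip ** 2 - 2.0 * is_ ** 2  # lambda*rho = Ip^2 - 2*Is^2
    return lambda_rho, mu_rho

# Tiny placeholder "volumes" (impedance in (m/s)*(g/cc)); real inputs would be
# the simultaneous-inversion outputs described above.
ip = np.full((2, 2, 3), 7000.0)
is_ = np.full((2, 2, 3), 3500.0)
lambda_rho, mu_rho = lambda_mu_rho(ip, is_)
print(lambda_rho[0, 0, 0], mu_rho[0, 0, 0])
```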

Neural networks, which originated in artificial-intelligence research, have gained momentum within the industry. We can now use them to model missing log data or to create 3-D volumes of petrophysical properties, a key input to reservoir simulator models. Mathematical analysis of such predicted volumes can be used to estimate the corresponding error, further benefiting any risk analysis.
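By way of illustration, the sketch below trains a small feed-forward neural network to predict a density log from gamma-ray and sonic curves, the same idea as modeling missing log data described above. It uses scikit-learn and synthetic inputs purely for demonstration and is not the algorithm of any particular commercial package.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 2000

# Synthetic training curves standing in for intervals where all logs were recorded.
gr = rng.uniform(20.0, 150.0, n)      # gamma ray, API
dt = rng.uniform(60.0, 140.0, n)      # sonic, us/ft
rhob = 3.0 - 0.004 * dt + 0.001 * gr + rng.normal(0.0, 0.02, n)  # density, g/cc

X = np.column_stack([gr, dt])
scaler = StandardScaler().fit(X)

# A small feed-forward network; the architecture and settings are arbitrary for the example.
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(scaler.transform(X), rhob)

# Predict density over an interval where the log was never run (here, reused samples).
rhob_predicted = model.predict(scaler.transform(X[:10]))
print(rhob_predicted)
```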

Figure 2. Hampson-Russell’s STRATA software enables traditional post-stack inversion, which is ideal for identifying the lateral and vertical distribution of reservoir sands. Newer AVO inversion algorithms can be used to gain fluid-related information.

One area of steady growth for reservoir characterization is the use of multicomponent seismic data: the recording of converted shear-wave, or PS, data in addition to standard compressional-wave, or PP, data. This opens up a new range of reservoir characterization tools, such as velocity (Vp/Vs) ratios, fluid content, lithological changes, fracturing and anisotropy. The latter two are key reservoir characterization tools for carbonate reservoirs. Furthermore, there are additional benefits inherent in PS data that help reduce the uncertainty of structural interpretations. The classic example is the ability to image through gas clouds, allowing the visualization of previously unseen reservoir events. Numerous published examples also demonstrate that high-resolution shallow PS seismic data can image faults precisely.
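One of the simplest of these new tools, the interval Vp/Vs ratio, falls straight out of registered PP and PS horizon times. The sketch below shows the standard relation; the isochron values are placeholders.

```python
def vp_vs_ratio(dt_pp, dt_ps):
    """Interval Vp/Vs from PP and PS isochrons over the same depth interval.

    For a layer of thickness h: dt_pp = 2h/Vp and dt_ps = h/Vp + h/Vs,
    so Vp/Vs = 2 * dt_ps / dt_pp - 1.
    """
    return 2.0 * dt_ps / dt_pp - 1.0

# Placeholder isochrons in seconds for one interval on registered PP and PS sections.
print(vp_vs_ratio(dt_pp=0.200, dt_ps=0.320))  # -> 2.2
```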

Multicomponent data does not come cheaply or easily; in marine environments, ocean-bottom sensors are required. However, as service companies gain experience in acquiring and processing this specialized data, techniques are improving, and both costs and turnaround times are coming down. Demand for multicomponent data continues to grow steadily. However, most geoscientists are not experienced in interpreting PS data, and only a few interpretation packages on the market specifically target this non-trivial process.

Today, geoscientists in the production environment of oil companies are heavily involved in reservoir characterization, in particular characterizing how the reservoir has been affected by ongoing production processes; seismic data carries the all-important inter-well information. In other words, geophysicists are looking at seismic data for the insight it can bring to production-related changes such as water saturation and reservoir temperature and pressure, as well as for locating sub-seismic-resolution faults and other production barriers. Seismic data used in reservoir characterization can dramatically improve reservoir production over the life of the field.

Globally, the size of hydrocarbon finds is going down at a time when demand for oil and gas is continuing to grow. In many countries, the rate of production exceeds the rate of reserves replacement. In many cases, the reserves are there; they just need to be located or extracted more efficiently. Reservoir geophysics combined with the latest software and reservoir characterization techniques is bringing significant improvements to production in existing fields. Some oil companies are now moving toward time-lapse (4-D) techniques that can be used to indicate infill drilling opportunities, predict flood fronts and help avoid early water breakthroughs.

In the early days of time-lapse seismology, geophysicists spent their energy trying to justify the cost and applicability of the method. However, our understanding of, and experience with, the impact of fluid changes on seismic data have grown since then. Geophysicists are now concentrating their efforts on demonstrating the value of time-lapse data and incorporating the results into reservoir management workflows. Nevertheless, there are still areas for continued growth and improvement. For example, our ability to accurately model the impact of reservoir pressure changes on seismic data is still open to question.

Figure 3. Time-lapse (4-D) studies incorporate key elements of fluid modeling (temperature, pressure and water saturation variations, etc.) and synthetic generation. In Hampson-Russell’s Pro4D software, high-tech processing tools are used to match 3-D seismic volumes; difference volumes can be matched to production history.
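A common way to quantify how well two matched volumes repeat, and therefore how much of a difference volume is signal rather than noise, is the NRMS metric computed trace by trace. The sketch below is a generic illustration of that calculation on synthetic traces; it is not Pro4D’s internal measure.

```python
import numpy as np

def nrms(base, monitor):
    """NRMS difference (percent) between matched baseline and monitor traces."""
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 200.0 * rms(monitor - base) / (rms(base) + rms(monitor))

# Synthetic example: identical traces give 0%; uncorrelated noise of equal power ~141%.
t = np.linspace(0.0, 1.0, 500)
base = np.sin(2.0 * np.pi * 30.0 * t)
monitor = base + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(nrms(base, monitor))
```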
Some areas of reservoir characterization are, to a degree, still sitting on the sidelines. Anisotropy (the fact that seismic properties vary with the direction of wave propagation) is not yet incorporated as standard into our processing and modeling, even though its importance is well documented. Its use for fracture detection in carbonate reservoirs is critical, but carbonate reservoirs pose other problems: fluid modeling is notoriously difficult here because of the range of porosity types and geometries. Perhaps, as with time-lapse and multicomponent data, wider adoption is only a matter of time.
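For readers curious what azimuth-driven analysis can look like in practice, the sketch below fits a cos(2*phi)/sin(2*phi) model to amplitudes picked at different acquisition azimuths and reads off an anisotropic gradient and a candidate fracture-related azimuth. The data are synthetic, and the workflow is a simplified stand-in for full azimuthal AVO analysis rather than any vendor’s implementation.

```python
import numpy as np

# Amplitudes picked at a fixed offset for a range of acquisition azimuths (synthetic).
az = np.radians(np.arange(0, 180, 15))
true_amp = 0.10 + 0.03 * np.cos(2.0 * (az - np.radians(40.0)))
amp = true_amp + np.random.default_rng(1).normal(0.0, 0.002, az.size)

# Linear least-squares fit of R(phi) = a0 + a1*cos(2*phi) + a2*sin(2*phi).
G = np.column_stack([np.ones_like(az), np.cos(2.0 * az), np.sin(2.0 * az)])
a0, a1, a2 = np.linalg.lstsq(G, amp, rcond=None)[0]

aniso_gradient = np.hypot(a1, a2)                    # strength of azimuthal variation
sym_azimuth = 0.5 * np.degrees(np.arctan2(a2, a1))   # candidate fracture-related azimuth
# Note: without further constraints the fitted azimuth is ambiguous by 90 degrees
# (fracture strike versus fracture normal).
print(aniso_gradient, sym_azimuth)
```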

In summary, reservoir characterization — defining the reservoir in terms of its porosity, permeability, fluid content/water saturation, lateral and vertical heterogeneity, net-to-gross, etc. — is one of the pre-drilling goals of exploration and production teams.

Characterization of virgin reservoirs relies on AVO, inversion, seismic attributes, statistical modeling and simulation and, in some cases, multicomponent data. For reservoir characterization in the production environment we rely on the link between seismic data and reservoir fluid, pressure and temperature. Time-lapse studies are now successfully adding value in several of the world’s basins; “life of field seismic” is proving its worth in the North Sea.

Given the sparseness of well data, seismic has huge potential for defining the inter-well region and the lateral extent of the reservoir. As geophysical companies continue to image and deliver data of ever-improving quality, we can expect oil companies to keep pushing the resulting data to ever more detailed and effective use in their quest to accurately characterize reservoirs and optimize production.