Every oil or gas company that has ever existed has put some kind of number on reservoir performance. That number might have been in barrels, cubic feet, flow rate, pressure, metric tons, BTUs, water, a ratio, or some other measure, all tracked over time. Time was then remeasured in money, and that money translated into an economic metric such as net present value or booked reserves. Such metrics became the decision focus for future project investments. The amount of risk investors were willing to take was often handed back to a machine, typically the popular computerized Monte Carlo statistical method, to give the risk-takers comfort.

While we can calculate a risk-money number to several significant digits, how far that number drifts from hard reality is too frequently measured in dry holes and premature production failures. This part of science is more of an art, and art is in the eye of the beholder. Science and engineering are therefore turning to a machine process once again to see whether a more deterministic method is possible.

This architecture is typical of a turbidite system found in the Gulf of Mexico. (Images courtesy of RPSEA)

New research

The University of Texas at Austin (UT) was awarded a research contract by the Research Partnership to Secure Energy for America (RPSEA) to examine the possibility of deriving a repeatable statistical method that describes reservoir performance parameters ahead of major development using 3-D seismic, petrophysical well logs, and standard engineering performance data. The overall objective is to develop data interpretation and computer statistical techniques to assess the flow connectivity characteristics of deepwater sediments.

This is a very different approach from traditional practice, which starts with regional geology collection, seismic acquisition and interpretation, seismic attribute estimates, volumetric derivations (usually based on elevation and porosity), a test well or two with flow tests over a small vertical interval, and petrophysical logs. The goal here is to progressively update a model for an ultra-deepwater turbidite reservoir, starting with very limited appraisal data such as seismic data and limited well test data and ultimately calibrating extensively against production-engineering flow data. Systematic assessment of the information content of each type of data then becomes possible.

Turbidite lobes were extracted using seismic volumes (left) and stochastic modeling of permeability variations within a lobe (right) using well data and production information.

One major objective of the study is to prove that a geologic model can be calibrated to actual dynamic reservoir performance using a statistical model selection approach and that the calibrated models can then be used to estimate key reservoir flow connectivity characteristics discernible from early appraisal data such as well tests. The key phrase is early appraisal, the decision point for just how big the development footprint should be.

Marathon Oil provided production performance and well-test data for this project. The existing first-generation software for building a reservoir model, performing model selection, and subsequently assessing flow connectivity was created using seismic, regional well data, and a few local well examples. The model was tested for mechanical defects against preliminary versions of samples within Marathon’s Lobster dataset.

UT is continuing its work on the RPSEA contract in 2013. The latest effort will test and verify the software using history matches against the full Lobster dataset and will confirm that the model is robust in predicting reservoir conditions. This segment of the work will concentrate on the probabilistic assessment of reservoir flow connectivity conditioned on geologic parameters using a reservoir database. The plan is to assess complex characterization of turbidite reservoirs through matching production performance. Finally, the work should validate the proxy functions developed for well test analysis, using pressure data and early well history to match results.

Probable modeling

Researchers at the UT Bureau of Economic Geology have spent the past decade mining previously published and unpublished research for information on the characteristics of deepwater deposits. These data are housed in a proprietary database, which now contains thousands of measurements of sand- and shale-bed lengths and thicknesses, spatial geometries, porosities, permeabilities, lateral connectivity relationships, lateral overlap, and vertical connectivity. In addition, outcrop-study programs have yielded decades of knowledge on subseismic heterogeneity in deepwater deposits.

Subsequently, these statistics, together with the well-specific information provided by Marathon, will be used to construct a suite of plausible, equiprobable models.

A fast proxy depicting the relationship between reservoir connectivity and flow performance can be used for rapid uncertainty assessment and sensitivity analysis. Using algorithms such as Growthsim and the random walker, researchers build a suite of geologic models and corresponding dynamic simulation models based on appraisal well data and on appraisal plus development well data, coupled with seismic data. These models are tested and screened against production history from the Lobster field using existing automated history-matching or model selection techniques.
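The screening step above can be sketched as misfit-based model selection: simulate each candidate model's production history and retain the models that best reproduce the observed data. Everything below is an illustrative stand-in, not the Growthsim or random-walker algorithms named in the text; the "simulator" is a one-parameter exponential decline, and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed production history (rate vs. time); values are stand-ins.
t = np.linspace(0.0, 10.0, 50)
observed = 1000.0 * np.exp(-0.15 * t)

def simulate(decline_rate):
    """Stand-in for a full dynamic simulation of one geologic model."""
    return 1000.0 * np.exp(-decline_rate * t)

# Suite of candidate models, each summarized here by a single decline parameter.
candidates = rng.uniform(0.05, 0.30, size=200)

# Misfit-based screening: rank models by least-squares mismatch with the
# observed history and retain the best matches for further analysis.
misfits = np.array([np.sum((simulate(d) - observed) ** 2) for d in candidates])
selected = candidates[np.argsort(misfits)[:20]]  # keep the 20 best matches

print(f"best-fit decline rate: {candidates[np.argmin(misfits)]:.3f}")
```

In the real workflow each candidate is a full geologic model and the misfit is computed from a reservoir simulation run, but the selection logic is the same: rank by mismatch, keep the plausible survivors.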

Once again, science and engineering decision-makers turn over part of the decision outcome to the Monte Carlo predictor. The multiple point product terms in the proxy expression are representations of the higher order connectivity of the reservoir petrophysical model. Due to uncertainty in the detailed characteristics of the reservoir, the multiple point product terms are themselves uncertain. The probability distributions describing the uncertainty in specific higher order terms can be calibrated on the basis of the suite of history-matched reservoir models for the Lobster field. Monte Carlo uncertainty estimates for flow responses such as oil saturation can be constructed by sampling from the uncertainty distribution for specific reservoir connectivity terms (keeping the other terms constant). The results will be presented in the form of connectivity-response surfaces that provide a visual representation of the relationship between different measures of reservoir connectivity and the corresponding response.
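A minimal sketch of the Monte Carlo step described above: sample one uncertain connectivity term from its distribution, hold the other terms constant, and push the samples through the fast proxy to obtain a distribution of flow responses. The polynomial proxy, its coefficients, and the normal uncertainty distribution are all assumptions for illustration; in the real workflow the distributions would be calibrated from the suite of history-matched Lobster models.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical proxy: flow response as a function of two connectivity
# terms c1 and c2 (stand-ins for the multiple point product terms in the
# text). The coefficients are illustrative only.
def proxy(c1, c2):
    return 0.6 + 0.3 * c1 - 0.2 * c2 + 0.1 * c1 * c2

# Assumed uncertainty distribution for c1; in practice this would be
# calibrated from the history-matched reservoir models.
c1_samples = rng.normal(loc=0.5, scale=0.1, size=10_000)
c2_fixed = 0.4  # hold the other connectivity term constant

# Monte Carlo estimate of the flow-response distribution.
responses = proxy(c1_samples, c2_fixed)
print(f"mean response: {responses.mean():.3f}")
print(f"P10-P90 range: {np.percentile(responses, 10):.3f} "
      f"to {np.percentile(responses, 90):.3f}")
```

Repeating this for each connectivity term in turn, and plotting response against the sampled term, yields the connectivity-response surfaces described in the text.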

Multiple representations of reservoir compartmentalization also are possible considering the uncertainty in the relationship between seismic two-way time and actual horizon depths. The initial suite of reservoir models used during history-matching also can take into account the prior structural uncertainty. The final history-matched models can be reanalyzed for reservoir compartmentalization by performing spill point analysis.

The preceding analysis is predicated on the ability to synthesize reservoir models with and without a diagenetic overprint. Diagenetic cementation in deepwater deposits is known to occur in deeply buried strata. Several scenarios of total and effective porosity distribution will be designed and used to condition models examining the effect of diagenetic petrophysical alteration of flow in these deepwater deposits. Unconditional models of the diagenetic indicator can then be simulated. The emphasis will be on designing some scenarios for diagenetic distribution of cements as examples of how diagenesis plays out in other deepwater occurrences.

The focus of the current phase of work is to develop and history-match models for the Lobster area that reflect the main architectural elements of the Lobster field. Turbidite-producing intervals occur in the form of thousands of small lobate gravity flows that are amalgamated into what appears as a sheet. This suggests the presence of flow baffles and complex connectivity even within a single depositional sequence. These internal heterogeneities could impact flow connectivity and the long-term production performance of the reservoir. The development of a technique to represent heterogeneity in flow connectivity consistently across a range of scales is proposed. These multiscale reservoir models will then be evaluated for the impact on past flow performance as well as the future productivity of wells.

The final technique and software will be available to the entire industry. Of course, those who use it should question the accuracy of the results. Risk, size of a deepwater footprint, and total investment expense will always involve judgment. This program will add a significant tool for those who need to look where they cannot see and predict what they cannot directly measure. The less we guess, the safer we all will be.