Computer modeling and simulation is common in many industries today. In the early days of computing, simulation replaced physical experimentation only in simple cases where the simulations ran faster than the associated physical experiments. Since the late 1970s, however, advances in numerical modeling and the use of finite element methods (FEM) to model all types of systems have largely displaced the older, inflexible finite difference (FD) techniques in many industries. FEM tools have the advantage of automatically concentrating computational effort on the parts of the model where the physics demands it, handling both short-duration transient behavior and very sharp contrasts in model variables, and they have finally made interactive modeling possible. These tools have reduced design-cycle time immensely and improved the quality of designs, because designers can reproduce, with extraordinary fidelity, exactly the design they have in mind and can trust the models' predictions.

Working in isolation

The petroleum industry doesn't have the luxury of designing reservoirs. Nature has provided the design but prevents direct examination of it by burying it under miles of rock. The industry has impressive technologies for imaging deep, complex subsurface structures and divining some of their properties, but uncertainties about hydrocarbon presence, rock properties and other parameters remain extremely challenging. The tremendous depth of expertise required in the geoscience and engineering disciplines, and the tendency of those specialties to isolate themselves in "silos," have made the problem of understanding hydrocarbon reservoirs much like the situation of the blind men exploring the elephant in the ancient fable from India: each specialty makes an essential contribution, but when those contributions are made serially and in isolation, the reservoir can be mischaracterized, with potentially serious consequences.

This concern has always been an argument for reservoir modeling and simulation, but we have only recently surmounted the lengthy hypothesis/test cycle-time problems that other industries have conquered. Why? Primarily because the physics of hydrocarbon flow in reservoir rock is considerably more complex than the physics most other industries have to deal with, and our legacy FD tools have simply not been able to answer many important reservoir-development questions, or to answer others in time to make a difference. As a consequence, our industry has largely had to experiment with the drill bit. It is often said that more than 95% of the world's reservoirs and wells are not rigorously modeled.

Understanding what the reservoir is saying

Figure 1. Stewardship of a reservoir implies that information gathered from related assets in all stages from exploration to production is continually applied to refine the reservoir model. As the predictions become reliable, the stewards begin optimizing the exploitation of the reserves. (Image courtesy of PointCross Inc.)

While many attempts have been made to apply finite element methods to reservoir-flow modeling and simulation, only recently have practical, commercial codes made their appearance. A good thing, too, because most modern well designs are more complex than ever, and the detailed, dynamic interaction between these wells and the reservoir is crucial to understanding how a reservoir behaves under development and production. Think of it this way: when a well starts to flow (or stops flowing), it hits the reservoir with a powerful stimulus that propagates far beyond the near-well region, and the reservoir's response should tell us a great deal about the reservoir's character. Production flow rates and pressures are the most definitive data we have about a reservoir's extended properties, yet until recently even simple well designs in full-field simulations had to be grossly approximated with just one or a handful of FD grid blocks. In effect, our reservoirs have been talking to us, but we haven't had the technology to understand what they are saying.
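As a back-of-the-envelope illustration of why that response is so informative (this is standard single-phase pressure-transient theory, not any particular simulator's formulation), the radial diffusivity equation and its line-source solution are

\[
\frac{1}{r}\frac{\partial}{\partial r}\!\left(r\,\frac{\partial p}{\partial r}\right)
= \frac{\phi\,\mu\,c_t}{k}\,\frac{\partial p}{\partial t},
\qquad
p_i - p(r,t) = -\frac{q\,\mu}{4\pi k h}\,
\mathrm{Ei}\!\left(-\frac{\phi\,\mu\,c_t\,r^2}{4 k t}\right),
\]

where \(\phi\) is porosity, \(\mu\) viscosity, \(c_t\) total compressibility, \(k\) permeability, \(h\) thickness and \(q\) the sandface rate. The pressure disturbance reaches a radius \(r\) on a timescale of roughly \(\phi\,\mu\,c_t\,r^2/(4k)\), so a well that flows for weeks or months is interrogating rock far beyond the near-well region, and the measured rates and pressures encode those distant properties.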

The waterfall workflow problem

A common workflow today is essentially a waterfall process: the geoscientists do the best they can to build initial static models, which then pass, usually irretrievably, into the domain of the simulation engineers. Only if the engineers cannot successfully coerce the model into agreeing with the reservoir's production data (i.e., get a "history match") are the seismic interpreters and geologists pulled back into the process. Many petroleum engineers dismiss the process altogether. For today's smaller discoveries, many will say that the reservoir will be depleted before modeling could give any guidance for their day-to-day decisions, and that they wouldn't believe the answers anyway.

Typically, a stage-gated process with four or five gates is used to mature the exploration and exploitation of a field, and the gateposts often define an operating company's organization. For various reasons, industry practice is to model the reservoir during or after the appraisal phase or even the development phase or, in many cases, not at all. If modeling and simulation were a quick-turnaround, credible process, it would be prominent in every one of these lifecycle phases. That it is not demonstrates the inadequacies of legacy modeling tools.

FEM becomes commercial

The use of FEM for reservoir models is more than a promise: the mathematics and equation solvers for multiphase flow through complex structures are commercially available today. FEM tools also feature fast automatic mesh generators, highly detailed near-well meshes for complex wells, direct modeling of the interior of hydraulic fracs, and meshes that can adaptively "follow" flood fronts. However, overcoming the skepticism of the 95% of the industry whose doubts are rooted in decades of non-application (and misapplication) of reservoir modeling will be a challenge.
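To make the adaptive-meshing idea concrete, here is a minimal, generic sketch in Python. It is not taken from any commercial code; it simply flags the cells where a 1-D water-saturation profile changes sharply, which is where an adaptive mesher would add elements as a flood front passes.

```python
import numpy as np

# A minimal, generic sketch (no particular simulator's API) of the idea behind
# adaptive meshing that "follows" a flood front: flag cells where the water
# saturation changes sharply between neighbors, since that is where the mesh
# needs extra resolution.

def refinement_flags(saturation, threshold=0.05):
    """Mark cells whose saturation jump to a neighboring cell exceeds threshold."""
    jumps = np.abs(np.diff(saturation))      # jump across each interior cell face
    flags = np.zeros(saturation.shape, dtype=bool)
    flags[:-1] |= jumps > threshold          # flag the cell on each side of a sharp face
    flags[1:] |= jumps > threshold
    return flags

# Example: a 1-D saturation profile with a sharp flood front near x = 0.4
x = np.linspace(0.0, 1.0, 50)
sw = 0.2 + 0.6 / (1.0 + np.exp((x - 0.4) / 0.01))   # smeared step representing the front
print(np.where(refinement_flags(sw))[0])             # indices of cells to refine
```

A production FEM simulator typically applies the same kind of gradient or error indicator in three dimensions, refining elements where the solution changes fastest and coarsening them where it is smooth again.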

Figure 2. Pressure distribution is shown in a 16-well finite element model with hydraulic fracs, with a close-up view of the mesh in and around the frac. (Image courtesy of Object Reservoir)
Expanding the coverage of modeling means that geologists can begin building a number of plausible models early in the exploration phase. Because the questions to be answered in this phase are very high level, the models do not have to be highly detailed, so the process can be very quick. Although the industry is not accustomed to the idea of geophysicists or geologists "doing simulation," there is no steep learning curve like the one FD simulation grids impose.

As new information about the reservoir is acquired during the reservoir lifecycle (e.g., well logs, core and fluid analyses, flow data), the model set can be adjusted (Figure 1).

Thus, the complexity grows only as necessary to answer the questions at hand at the various decision points, and the modeling work done in one phase gives a big head start to the modeling necessary for subsequent phases in the lifecycle. This progression, where the reservoir “comes into focus” as the data to define it is acquired, is at the heart of the important notion of top-down reservoir modeling.

Figure 2 illustrates an increasingly important exploration and production concern in which the capabilities of FEM technology are urgently needed: understanding and efficiently exploiting tight-gas reservoirs. These plays require complex completion designs and frac strategies, well beyond the modeling capabilities of yesterday’s tools. This application requires FEM’s ability to deal with near-well flow dynamics with high precision in the same model that correctly describes large-scale issues such as drainage patterns and optimal well spacing.

Because of the high degree of automation in FEM, models can be built and tested quickly. When needed, the user controls the fineness of the mesh by setting the values of a few parameters. In our experience, models have been built "on the fly" in meetings where the participants brought the necessary data, for example structure and property maps and pressure, volume and temperature (PVT) analyses.
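As an illustration of what "a few parameters" can mean in practice, the sketch below is hypothetical (the parameter names are ours, not any product's): three numbers (a wellbore radius, an outer drainage radius and a growth factor) determine a geometrically graded set of radial node positions, concentrating resolution where near-well pressure gradients are steepest.

```python
import numpy as np

# A minimal sketch with hypothetical parameter names (not any vendor's
# interface): three numbers fix a geometrically graded set of radial node
# positions, so the mesh is finest where near-well pressure gradients are
# steepest and coarsens smoothly toward the drainage boundary.

def near_well_radii(r_well=0.1, r_outer=500.0, growth=1.4):
    """Node radii (m) growing geometrically from the wellbore to the boundary."""
    radii = [r_well]
    while radii[-1] < r_outer:
        radii.append(radii[-1] * growth)
    radii[-1] = r_outer                      # snap the outermost node to the boundary
    return np.array(radii)

r = near_well_radii()
print(len(r), "radial node positions")
print(np.round(r[:5], 3))                    # sub-meter spacing right at the wellbore
```

With these example values the spacing is a few centimeters at the wellbore and several orders of magnitude coarser near the outer boundary, which is the kind of grading a near-well mesh needs without the user placing nodes by hand.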

In interactive sessions like this, some fundamental assumptions (such as the presence of nearby flow barriers) usually turn out to be incorrect, and the "what if" experimentation begins that leads to better, and often unexpected, learning about the reservoir. Clearly, this is a collaborative exercise for an entire asset team, not just the engineers. Working together interactively, the team can quickly arrive at a more reasonable picture of the reservoir.