Reservoir models are run more frequently, and the day could be approaching when they are continuously updated with real-time production well data. One thing everyone agrees on is that the industry is not there yet. There is debate about whether it will actually get there, and about whether getting all the way there would be useful enough to justify the effort. Even so, the prospect appears relatively close to fulfillment, and even getting close has value for the industry.

Industry consultant Bill Bartling recalls reservoir characterization as practiced in the 1980s and 1990s.

"It was often called a field study and was done every 10 years," he said. "Inevitably you discovered that you had been sub-optimized for 8 years. If only 8 years ago you had done something, the result would have been a lot better. So this analysis is like an autopsy. Why did the patient die? That's interesting, but wouldn't it have been more interesting to keep the patient in good health?"

Geoscientists and engineers have wondered that for years, but change is making headway.

"Full physics modeling software continues to evolve, providing ever greater levels of visualization and understanding," said Doug Meikle, Halliburton Energy Services Group vice president for the Landmark and Project Management product service line. "Simulation of the respective models, including earth, reservoir, wells, networks and facilities, is driven by the needs of respective expert users, and these simulations broadly reflect their 'discipline' needs. Adopting a lifecycle view necessitates a more holistic or integrative framework, and this has led to development of an integrated asset model approach, whereby full physics models are related to each other and provide a balanced understanding of the asset.

"However, dealing with full physics models has inherent resource and time limitations that restrict their application to an offline mode, providing a good but time-consuming capability for testing and understanding multiple scenarios. This is fine for exploration, field development and other offsite engineering purposes, but not sufficient for production operations, where the asset is exploited and inherent uncertainty arises."

Recent industry pressure to meet long-term production demands has led to a shift in focus. Previously, modeling was primarily the domain of exploration and development, but increasingly operators recognize the need for a model-based operating system to support the asset throughout its production phase, Meikle explained.

"The development of a real-time model-based optimization solution to provide an integrated asset model has been a major breakthrough, making it possible to leverage existing simulation software across the asset value chain and throughout the asset's production lifecycle," he said. "This is a central requirement for intelligent operations."

Satish Pai, vice president of technologies for Schlumberger Oilfield Services, said, "You get a lot of data from intelligent wells and permanent monitoring in real time, but we don't use it to update the reservoir model in real time. We get the data, look at it, massage it about every month and use it. But intelligent modeling, which is really simulating the reservoir in real time with data coming in, is still technically not possible. A reservoir simulator is run every couple of months, and it normally takes 12 to 24 hours, if it's a big field simulation. So getting production data in real time and using that to history-match and update your simulation in real time is a couple of years away."
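Pai's "history-match and update" step can be sketched in miniature. Assuming a toy exponential-decline model in place of a full-field simulator (a deliberate simplification; this is not InterSect or any Schlumberger code, and all names and numbers are invented), the loop below adjusts a single model parameter until simulated rates track observed production:

```python
import math

def simulate_rate(q_initial, decline, t):
    """Toy 'simulator': exponential decline, q(t) = q_i * exp(-D*t).
    A real reservoir simulator solves flow equations over a 3-D grid."""
    return q_initial * math.exp(-decline * t)

def history_match(observations, q_initial, decline, step=1e-8, iterations=200):
    """Nudge the decline parameter D toward the observed rates by gradient
    descent on the squared mismatch -- a crude stand-in for history matching."""
    def misfit(d):
        return sum((simulate_rate(q_initial, d, t) - q) ** 2
                   for t, q in observations)
    for _ in range(iterations):
        # Central finite-difference estimate of d(misfit)/dD.
        grad = (misfit(decline + 1e-6) - misfit(decline - 1e-6)) / 2e-6
        decline -= step * grad
    return decline

# Fabricated observations: (time in months, oil rate in bbl/d).
observed = [(0, 1000.0), (1, 950.0), (2, 905.0), (3, 860.0)]
print(f"matched decline: {history_match(observed, 1000.0, 0.01):.4f}/month")
```

A real match adjusts thousands of grid-cell properties against years of multiphase history, which is why each run takes the 12 to 24 hours Pai cites, and why doing it continuously remains out of reach for now.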

Will it come?

"I think so, absolutely," Pai said. "With Chevron, we are close to releasing our next-generation reservoir simulator, which is InterSect. It was built for the primary purpose of handling large, much more complex models and simulating much faster than conventional simulators so far have been able to do. When real-time simulation comes, the industry will then have to figure out the process for using it because people are used to simulating a reservoir once every couple of months. Once they start to see what the reservoir is doing all the time, it will lead to a bunch of new reservoir management processes and practices."

One of the biggest challenges is developing systems that can organize the collected data into a form that is useful and intelligible to those who must analyze it as it flows in, in real time.

"Spreadsheets and 2-D charts are the staple of the production environment, and typically information has come into many different computer applications that do not share databases or communicate with each other," Bartling said. "Compounding that, the analysts work in separate quarters so their collaboration is limited. With operations personnel typically in the field rather than the office, you have lots of isolated workers. All have their own data displayed on their own computers, using their own applications in their own rooms. That's the way it has been.

"In contrast, we are today seeing a new generation of analytical centers emerging - a command and control center concept adapted from a military operations setting - that brings people together into a common working environment. The pioneers in this setup were ConocoPhillips, Statoil and Norsk Hydro, but many other companies are now following suit. Parts of today's analytical software have been integrated and adapted for these rooms, while other parts so far have not been. Managing real-time data and integrating it into decision systems as it becomes available is an emerging science."

Some people question whether full real-time reservoir modeling should be an objective.

"We have made a distinction between what we call real-time processes that require real-time modeling and right-time modeling," said Pieter Kapteijn, smart fields program manager for Shell Exploration Production. "We believe that in the subsurface domain - in the geology and geophysics domain - right-time processes are probably the way to go. I don't think a reservoir engineer is interested in looking at the flood front moving a few millimeters every day through the reservoir because you can't really act on that. So what you want to do is take this enormous mass of real-time data that you can now acquire, whether it's the temperature or pressure - all the control settings of your wells - and try to aggregate and filter the data so that you can update the higher-level processes like subsurface modeling.

"I don't think we'll ever get into real-time modeling of the subsurface except for possibly applications in the real-time drilling domain, where you want to update your static and possibly your dynamic reservoir model based on information gathered at the bit while drilling. I believe that currently is the most challenging application of closing the loop around a process. The process involves taking the real-time logging-while-drilling data from the drilling bit and bringing it immediately into your subsurface model, upgrading your integrated reservoir model and then making operational decisions based on the insights and new options emerging through this process. That is the closest thing to real-time modeling that I can see at the moment."

Weatherford is building automatic low-level data analysis functionality into its downhole data acquisition systems, said Tor Kragas, product line manager of production applications.

"These kinds of tools run in the background and continually look at the data as it is being acquired," he said. "They are not trying to do a highly sophisticated analysis, but instead simplify the handling and management of the large quantities of data generated and highlight only important features for further analysis by an engineer. It may involve something as simple as looking at data from a pressure gauge, detecting when the rate of change exceeds a threshold value and triggering some other process when it does. For instance, I may specify that the gauge collect data once every minute normally, but if the pressure starts changing rapidly due to a choke change or some other event, then I tell it to shift into a fast-acquisition mode.

"Beyond that we're looking at automating some basic engineering analyses at a very low level to use data as it is being acquired. These tools will run in the background and continually operate on data from one or more sensors. Systems that automatically perform engineering calculations and perhaps notify an engineer when a parameter goes outside of a tolerance are very workable today. They're something we'll see much sooner than a full reservoir simulation model being continuously updated in real time."

However far the industry goes with real-time updating of reservoir simulations, the effort is helping to drive new, sharply more efficient work processes in a transformed, highly collaborative work environment.