



Reservoir Simulation: Keeping Pace with Oilfield Complexity

Geologic complexity and the high cost of resource development continue to push reservoir simulation technology. Next-generation simulators employ multimillion cell models with unstructured grids to handle geologies with high-permeability contrasts. Through the use of more-realistic models, these new simulators will aid in increasing ultimate recovery from both new and existing fields.

David A. Edwards, Dayal Gunasekera, Jonathan Morris, Gareth Shaw, Kevin Shaw and Dominic Walsh, Abingdon, England

Paul A. Fjerstad and Jitendra Kikani, Chevron Energy Technology Company, Houston, Texas, USA

Jessica Franco, Total SA, Luanda, Angola

Viet Hoang, Chevron Energy Technology Company, San Ramon, California, USA

Lisette Quettier, Total SA, Pau, France

Oilfield Review Winter 2011/2012: 23, no. 4. Copyright © 2012 Schlumberger. ECLIPSE and INTERSECT are marks of Schlumberger. Intel®, Intel386™, Intel486™, Itanium® and Pentium® are registered trademarks of Intel Corporation. Linux® is a registered trademark of Linus Torvalds. Windows® is a registered trademark of Microsoft Corporation.

Interest in simulators is not new. People have long used simulators to model complex activities. Simulation can be categorized into three periods—precomputer, formative and expansion.1 The Buffon needle experiment in 1777 was the first recorded simulation in the precomputer era (1777 to 1945). In this experiment, needles were thrown onto a flat surface to estimate the value of π.2 In the formative simulation period (1945 to 1970), people used the first electronic computers to solve problems for military applications. These ranged from artillery firing solutions to the development of the hydrogen bomb. The expansion simulation period (1970 to the present) is distinguished by a profusion of simulation applications. These applications range from



games to disaster preparedness and simulation of artificial life forms.3 Industry and government interest in computer simulation is increasing in areas that are computationally difficult, potentially dangerous or expensive. Oilfield simulations fit all of these criteria.
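As an aside, the Buffon needle experiment described above translates naturally into a few lines of modern Monte Carlo code. The following is a minimal Python sketch (function name and parameter values are illustrative, not from the article); it drops random needles on ruled lines and recovers an estimate of π from the crossing probability noted in reference 2.

```python
import math
import random

def estimate_pi(n_needles=1_000_000, needle_len=1.0, line_spacing=1.0):
    """Buffon's needle: for needle length L <= line spacing t, P(cross) = 2L/(pi*t)."""
    crossings = 0
    for _ in range(n_needles):
        center = random.uniform(0.0, line_spacing / 2.0)  # distance from needle center to nearest line
        angle = random.uniform(0.0, math.pi / 2.0)        # needle orientation
        if center <= (needle_len / 2.0) * math.sin(angle):
            crossings += 1
    return 2.0 * needle_len * n_needles / (line_spacing * crossings)

print(estimate_pi())  # typically prints a value close to 3.14
```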

Oil and gas simulations model activities that extend from deep within the reservoir to process plants on the surface and ultimately include final economic evaluation (above). Numerous factors are driving current production simulation planning to produce accurate results in the shortest possible time. These include remote locations, geologic complexity, complex well trajectories, enhanced recovery schemes, heavy-oil recovery and unconventional gas. Today, operators must make investment decisions quickly and can no longer base field development decisions solely on data from early well performance. Operators now want accurate simulation of the field from formation discovery through secondary recovery and final abandonment. Nowhere do these factors come into sharper focus than in the reservoir.

This article describes the tools and processes involved in reservoir simulation and discusses how a next-generation simulator is helping operators in Australia, Canada and Kazakhstan.

Visualizing the Reservoir

The earliest reservoir simulators date from the 1930s and were physical models; the interaction of sand, oil and water could be directly viewed—often in vessels with clear sides.4 Early physical simulators were employed when reservoir behavior during waterfloods surprised operators. In addition to physical simulators, scientists used electrical simulators that relied on the analogy between flow of electrical current and flow of reservoir fluids.

In the early 1950s, although physical simulators were still in use, researchers were starting to think about how a reservoir might be described analytically. Understanding what happens in a reservoir during production is similar in some respects to diagnosing a disease. Data from various laboratory tests are available but the complete disease process cannot be viewed directly. Physicians must deduce what is happening from laboratory results. Reservoir engineers are in a similar position—they cannot actually view the subject of their interest, but must rely on data to

1. Goldsman D, Nance RE and Wilson JR: “A Brief History of Simulation,” in Rossetti MD, Hill RR, Johansson B, Dunkin A and Ingalls RG (eds): Proceedings of the 2009 Winter Simulation Conference. Austin, Texas, USA (December 13–16, 2009): 310–313.

2. Buffon’s needle experiment is one of the oldest known problems in geometric probability. Needles are dropped on a sheet of paper with grid lines, and the probability of the needle crossing one of the lines is calculated. This probability is related directly to the value of π. For more information: Weisstein EW: “Buffon’s Needle Problem,” Wolfram MathWorld, http://mathworld.wolfram.com/BuffonsNeedleProblem.html (accessed July 25, 2011).

> Production simulation. A reservoir engineer takes static and dynamic data (bottom right ) and develops input for a reservoir simulator (bottom left ). The reservoir simulator, whose primary task is to analyze flow through porous media, calculates production profiles as a function of time for the wells in the reservoir. These data are passed to a production engineer to develop well models and a surface network simulator (top left ). A facilities engineer uses the production and composition data to build a process plant model with the help of a process simulator (top right ). Finally, data from all the simulators are passed to an economic simulator (right ).



3. Freddolino PL, Arkhipov AS, Larson SB, McPherson A and Schulten K: “Molecular Dynamics of the Complete Satellite Tobacco Mosaic Virus,” Structure 14, no. 3 (March 2006): 437–449.

4. Peaceman DW: “A Personal Retrospection of Reservoir Simulation,” in Proceedings of the First and Second International Forum on Reservoir Simulation. Alpbach, Austria (September 12–16, 1988 and September 4–8, 1989): 12–23.

Adamson G, Crick M, Gane B, Gurpinar O, Hardiman J and Ponting D: “Simulation Throughout the Life of a Reservoir,” Oilfield Review 8, no. 2 (Summer 1996): 16–27.


tell them what is happening deep below the Earth’s surface. Production and other data are used to build analytical models to describe flow and other reservoir characteristics. In a reservoir model, the equations that describe fluid behavior arise from fundamental principles that have been understood for more than a hundred years. These principles are the conservation of mass, fluid dynamics and thermodynamic equilibrium between phases.5

When these principles are applied to a reservoir, the resulting partial differential equations are complex, numerous and nonlinear. Early analytical derivations for describing flow behavior in the reservoir were constrained to simple models, whereas current formulations show a more complex picture (below).6 While formulation of the equations for the reservoir has always been straightforward, they cannot be solved exactly and must be approximated by finite-difference methods.7 In reservoir simulation, there is a trade-off between model complexity and ability to converge to a solution. Advances in computing capability have helped enhance reservoir simulator capability—especially when complex models and large numbers of cells are involved (next page).8

Computing hardware advances over the past decades have led to a steady progression in simulation capabilities.9 Between the early 1950s and 1970, reservoir simulators progressed from two dimensions and simple geometry to three dimensions, realistic geometry and a black oil fluid model.10 In the 1970s, researchers introduced compositional models and placed a heavy emphasis on enhanced oil recovery. In the 1980s, simulator development emphasized complex well management and fractured reservoirs; during the 1990s, graphical user interfaces brought enhanced ease of use. Nearing the end of the 20th century, reservoir simulators added features such as local grid refinement and the ability to handle complex geometry as well as integration with surface facilities. Now, simulators can handle complex reservoirs while offering integrated full-field management. These models—known as next-generation simulators—have taken advantage of several recently developed technologies, including parallel computing.

Parallel Computing—Divide and Conquer

One of the hallmarks of current reservoir simulators is the use of parallel computing systems. Parallel computing operates on the principle that large problems like reservoir simulation can be broken down into smaller ones that are then solved concurrently—or in parallel. The shift from serial processing to parallel systems is a direct result of the drive for improved computational performance.

In the 1980s and 1990s, computer hardware designers relied on increases in microprocessor speed to improve computational performance. This technique, called frequency scaling, became the dominant force in processor performance for personal computers until about 2004.11 Frequency scaling came to an end because of the increasing power consumption necessary to achieve higher frequencies. Hardware designers for personal computers then turned to multicore processors—one form of parallel computing.

The kind of thinking that would eventually lead to parallel processing for reservoir simulators, however, began around 1990. In an early experiment, oilfield researchers demonstrated that an Intel computer with 16 processors could efficiently handle an oil-water simulation model.12 Since then, the use of parallel computer systems for reservoir simulation has become more commonplace.

As prices for computing equipment have decreased, it has become standard practice to operate parallel computing systems as clusters of single machines connected by a network. These multiple machines, operating in parallel, act as a single entity. The goal in parallel computing has always been to solve large problems more quickly by going n times faster on n processors.13 For a host of reasons, this ideal performance is rarely achieved.

To understand the limitations in parallel networks, it is instructive to visualize a typical system used by a modern reservoir simulator. This system might have several stand-alone computers networked through a hub and a switch to a

> Reservoir simulation evolution. One of the first attempts to analytically describe reservoir flow occurred in the early 1950s. Researchers developed a partial differential equation to describe 1D flow of a compressible fluid in a reservoir (top). This equation is derived from Darcy’s law for flow in porous media plus the law of conservation of mass; it describes pressure as a function of time and position. (For details: McCarty DG and Peaceman DW: “Application of Large Computers to Reservoir Engineering Problems,” paper SPE 844, presented at a Joint Meeting of University of Texas and Texas A&M Student Chapters of AIME, Austin, Texas, February 14–15, 1957.) Recent models developed for reservoir simulation consider the flow of multiple components in a reservoir that is divided into a large number of 3D components known as grid cells (bottom). Darcy’s law and conservation of mass, plus thermodynamic equilibrium of components between phases, govern equations that describe flow in and out of these cells. In addition to flow rates, the models describe other variables including pressure, temperature and phase saturation. (For details: Cao et al, reference 6.)

1951, 1D flow of a compressible fluid:

∂²p/∂x² = (φμc/k) ∂p/∂t

2011, 3D flow of n components in a complex reservoir: a general component mass-balance equation, written for each grid cell over its faces and phases, combining accumulation, interblock flow and source terms (see Cao et al, reference 6).
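To make footnote 7 concrete, the 1951 equation above can be discretized and marched forward in time in a few lines of code. Below is a minimal explicit finite-difference sketch in Python; the grid size and the rock and fluid values are illustrative assumptions, not data from the article.

```python
import numpy as np

def solve_pressure_1d(nx=50, nt=500, length=100.0, k=1e-13, phi=0.2,
                      mu=1e-3, c=1e-9, p_init=2.0e7, p_face=1.0e7):
    """Explicit finite-difference solution of d2p/dx2 = (phi*mu*c/k) dp/dt on a 1D grid."""
    dx = length / (nx - 1)
    eta = k / (phi * mu * c)               # hydraulic diffusivity, m2/s
    dt = 0.25 * dx * dx / eta              # time step kept below the explicit stability limit
    p = np.full(nx, p_init)                # initial reservoir pressure, Pa
    for _ in range(nt):
        p[1:-1] += eta * dt / dx**2 * (p[2:] - 2.0 * p[1:-1] + p[:-2])
        p[0], p[-1] = p_face, p_init       # producing face drawn down; far boundary undisturbed
    return p

profile = solve_pressure_1d()              # pressure as a function of position after nt steps
```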


controller computer and a network server.14 As each of the individual computers works on its portion of the reservoir, messages are passed between them to the controller computer and over the network to other systems. In parallel terminology, the individual processors are the parallel portions of the system, while the work of the controllers is the serial part.15 The overall effect of communications is the primary reason why ideal performance in parallel systems can be only approached but not realized. All computing systems, even parallel systems, have limitations.

The maximum expected improvement that a parallel system can deliver is embodied in Amdahl’s law.16 Consider a simulator that requires 10 hours on a single processor. The 10-hour total time can be broken down into a 9-hour part that is amenable to parallel processing and a 1-hour part that is serial in nature. For this example, Amdahl’s law states that no matter how many processors are assigned to the parallel part of the calculation, the minimum execution time cannot be less than one hour.
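The 10-hour example works out as follows; here is a minimal Python sketch of the same arithmetic (the function name and the processor counts are illustrative assumptions).

```python
def amdahl_speedup(serial_fraction, n_processors):
    """Amdahl's law: maximum speedup when only part of the work can be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

total_hours = 10.0
serial_fraction = 1.0 / 10.0   # 1 of the 10 hours is inherently serial
for n in (2, 8, 64, 1_000_000):
    hours = total_hours / amdahl_speedup(serial_fraction, n)
    print(f"{n:>9} processors: {hours:.2f} hours")
# Even with an essentially unlimited processor count, the run time approaches,
# but never drops below, the 1-hour serial part.
```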

Because of the effect of serial communications, in reservoir simulation there is often an optimal number of processors for a given problem. Although the data management and housekeeping parts of the system are the primary reasons for a departure from the ideal state, there are others. These include load balancing between processors, bandwidth and issues related to congestion and delays within various parts of the system. Reservoir simulation problems destined for parallel solution must use software and hardware that are designed specifically for parallel operation.

The Next Generation

Since 2000, a petroleum engineer has been able to choose from a number of reservoir simulators. Simulators were numerous enough that the SPE supported frequent projects to compare them.17 Although the simulators differ from one another, their structures have common roots, which lie in serial computing and reliance on simple grids. An example of this type of reservoir simulator is the ECLIPSE simulator.18 The ECLIPSE simulator has been a benchmark for 25 years and has been continually updated to handle a variety of reservoir features. Like microprocessors, however, reservoir simulators have reached a point at which the familiar tools of the past may not be appropriate for some current field development challenges.

Scientists have developed new reservoir tools—the next-generation simulators—to broaden the

5. Brown G: “Darcy’s Law Basics and More,” http://bioen.okstate.edu/Darcy/LaLoi/basics.htm (accessed August 23, 2011).

Smith JM and Van Ness HC: Introduction to Chemical Engineering Thermodynamics, 7th Edition. New York City: McGraw Hill Company, 2005.

6. Cao H, Crumpton PI and Schrader ML: “Efficient General Formulation Approach for Modeling Complex Physics,” paper SPE 119165, presented at the SPE Reservoir Simulation Symposium, The Woodlands, Texas, February 2–4, 2009.

7. Finite-difference equations are used to approximate solutions for differential equations. This method obtains an approximation of a derivative by using small, incremental steps from a base value.

8. Intel Corporation: “Moore’s Law: Raising the Bar,” Santa Clara, California, USA: Intel Corporation (2005), ftp://download.intel.com/museum/Moores_law/Printed_Materials/Moores_Law_Backgrounder.pdf (accessed October 17, 2011).

Fjerstad PA, Sikandar AS, Cao H, Liu J and Da Sie W: “Next Generation Parallel Computing for Large-Scale Reservoir Simulation,” paper SPE 97358, presented at the SPE International Improved Oil Recovery Conference in Asia Pacific, Kuala Lumpur, December 5–6, 2005.

9. Watts JW: “Reservoir Simulation: Past, Present, and Future,” paper SPE 38441, presented at the SPE Reservoir Simulation Symposium, Dallas, June 8–11, 1997.

10. In the black oil fluid model, composition does not change as fluids are produced. For more information see: Fevang Ø, Singh K and Whitson CH: “Guidelines for Choosing Compositional and Black-Oil Models for Volatile Oil and Gas-Condensate Reservoirs,” paper SPE 63087, presented at the SPE Annual Technical Conference and Exhibition, Dallas, October 1–4, 2000.

11. Scaling, or scalability, is the characteristic of a system or process to handle greater or growing amounts of work without difficulty. For more information: Shalom N: “The Scalability Revolution: From Dead End to Open Road,” GigaSpaces (February 2007), http://www.gigaspaces.com/files/main/Presentations/ByCustomers/white_papers/FromDeadEndToOpenRoad.pdf (accessed September 13, 2011).

> Computing capability and reservoir simulation. During the past four decades, computing capability and reservoir simulation evolved along similar paths. From the 1970s until 2004, computer microprocessors followed Moore’s law, which states that transistor density on a microprocessor (red circles) doubles about every two years. Reservoir simulation paralleled this growth in computing capability with the growth in number of grid cells (blue bars) that could be accommodated. In the last decade, computing architecture has focused on parallel processing rather than simple increases to transistor count or frequency. Similarly, reservoir simulation has moved toward parallel solution of the reservoir equations.

(Graph: number of transistors on Intel microprocessors, from the 8086 through the Itanium 2, and number of reservoir cells employed, plotted against year from 1970 to 2010.)


Flynn LJ: “Intel Halts Development of 2 New Microprocessors,” The New York Times (May 8, 2004), http://www.nytimes.com/2004/05/08/business/intel-halts-development-of-2-new-microprocessors.html (accessed September 13, 2011).

12. Wheeler JA and Smith RA: “Reservoir Simulation on a Hypercube,” SPE Reservoir Engineering 5, no. 4 (November 1, 1990): 544–548.

13. Speedup, a common measure of parallel computing effectiveness, is defined as the time taken on one processor divided by the time taken on n processors. Parallel effectiveness can also be stated in terms of efficiency—the speedup divided by the number of processors.

14. Baker M: “Cluster Computer White Paper,” Portsmouth, England: University of Portsmouth (December 28, 2000), http://arxiv.org/ftp/cs/00040004014.pdf (accessed July 16, 2011).

Each of these computers in the parallel configuration may have either a single core or multiple core microprocessors. Each individual core is termed a parallel processor and can act as an independent part of the system.

15. The serial portion is often called data management and housekeeping.

16. Barney B: “Introduction to Parallel Computing,” https://computing.llnl.gov/tutorials/parallel_comp/ (accessed September 13, 2011).

17. Christie MA and Blunt MJ: “Tenth SPE Comparative Solution Project: A Comparison of Upscaling Techniques,” paper SPE 66599, presented at the SPE Reservoir Simulation Symposium, Houston, February 11–14, 2001.

18. Pettersen Ø: “Basics of Reservoir Simulation with the Eclipse Reservoir Simulator,” Bergen, Norway: University of Bergen, Department of Mathematics, Lecture Notes (2006), http://www.scribd.com/doc/36455888/Basics-of-Reservoir-Simulation (accessed September 13, 2011).


technology to handle the greater complexity now present in the oil field. These simulators take advantage of several new technologies that include parallel computing, advanced gridding techniques, modern software engineering and high-performance computing hardware. The choice between the next-generation simulators and the older versions is determined by field complexity and business needs. Next-generation tools should be considered if the reservoir needs a high cell count to capture complex geologic features, has extensive local grid refinements or has a high permeability contrast.

In addition to handling fields of greater complexity, the next-generation simulator gives the operator an important advantage—speed. Many reservoir simulations involve difficult calculations that can take hours or days to reach completion using older tools. The next-generation simulators can reduce calculation times on complex reservoirs by an order of magnitude or greater. This allows operators to make field development decisions in time and with confidence, thus maximizing value and reducing risk. Shorter runs lead to more runs, which in turn leads to operators having a better understanding of the reservoir and the impact of geologic uncertainties. Shorter run times also allow the simulator to be used more dynamically—it can evaluate development scenarios and optimize designs as new data and information become available.

One of the next-generation tools available now, the INTERSECT reservoir simulator, is the result of a collaborative effort between Schlumberger and Chevron that was initiated in late 2000.19 Total, which also collaborated on the project from 2004 to 2011, assisted researchers in developing the thermal capabilities of the software. Following the research phase and a subsequent development phase, Schlumberger released the INTERSECT simulator in late 2009. This system integrates several new technologies in one package. These include a new well model, advanced gridding, a scalable parallel computing foundation, an efficient linear solver and effective field management. To fully understand this simulator, it is instructive to examine each of these parts, starting with the new model for wells.

Multisegment Well Model

The INTERSECT simulator uses a new multisegment well model to describe fluid flow in the wellbore.20 Wells have become more complex through the years, and models that describe them must reflect their actual design and be able to handle a variety of different situations and equipment. These include multilateral wells, inflow control devices, horizontal sections, deviated wells and annular flow. Older, conventional well models treated the well as a mixing tank that had a uniform fluid composition, and the models thus reflected total inflow to the well. The new multisegment model overcomes this method of approximation, allowing each branch to produce a different mix of fluids.

This well model provides a detailed description of wellbore fluid conditions by discretizing the well into a number of 1D segments. Each segment consists of a segment node and a segment pipe and may have zero, one or more connections with the reservoir grid cells (left). A segment’s node is positioned at the end farthest away from the wellhead, and its pipe represents the flow path from the segment’s node to the node of the next segment toward the wellhead. The number of segment pipes and nodes defined for a given well is limited only by the complexity of the particular well being modeled. It is possible to position segment nodes at intermediate points along the wellbore where tubing geometry or inclination angle changes. Additional segments can be defined to represent valves or inflow control devices. The optimal number of segments for a given well depends on a compromise between speed and accuracy in the numerical simulation.
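A rough data-structure sketch helps illustrate this discretization. The Python classes below are purely illustrative assumptions; the names, fields and example well are invented for this sketch and are not the INTERSECT implementation, but they capture the node-and-pipe topology and the segment-to-grid-cell connections described above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Segment:
    name: str
    node_depth_m: float                           # node sits at the end farthest from the wellhead
    pipe_length_m: float                          # flow path from this node toward the next node
    parent: Optional["Segment"] = None            # next segment toward the wellhead
    grid_cell_connections: List[int] = field(default_factory=list)  # zero, one or more cells

@dataclass
class MultisegmentWell:
    name: str
    segments: List[Segment] = field(default_factory=list)

# A small two-branch producer: tubing, an inflow-control-device segment and one lateral.
tubing = Segment("tubing", node_depth_m=2500.0, pipe_length_m=2500.0)
icd = Segment("lateral-ICD", node_depth_m=2510.0, pipe_length_m=10.0, parent=tubing)
lateral = Segment("lateral-1", node_depth_m=2600.0, pipe_length_m=90.0, parent=icd,
                  grid_cell_connections=[10452, 10453])
well = MultisegmentWell("Producer-1", segments=[tubing, icd, lateral])
```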

An advantage of the multisegment model is its flexibility in handling a variety of well configurations, including laterals and extended-reach wells. The model also handles different types of inflow control devices, packers and annular flow. The new multisegment well model is, however, only the beginning of the story on the INTERSECT simulator and others like it. The next step splits the reservoir into smaller areas, called domains.

19. DeBaun D, Byer T, Childs P, Chen J, Saaf F, Wells M, Liu J, Cao H, Pianelo L, Tilakraj V, Crumpton P, Walsh D, Yardumian H, Zorzynski R, Lim K-T, Schrader M, Zapata V, Nolen J and Tchelepi H: “An Extensible Architecture for Next Generation Scalable Parallel Reservoir Simulation,” paper SPE 93274, presented at the SPE Reservoir Simulation Symposium, Houston, January 31–February 2, 2005.

For another example of a next-generation simulator: Dogru AH, Fung LSK, Middya U, Al-Shaalan TM, Pita JA, HemanthKumar K, Su HJ, Tan JCT, Hoy H, Dreiman WT, Hahn WA, Al-Harbi R, Al-Youbi A, Al-Zamel NM, Mezghani M and Al-Mani T: “A Next-Generation Parallel Reservoir Simulator for Giant Reservoirs,” paper SPE 119272, presented at the SPE Reservoir Simulation Symposium, The Woodlands, Texas, February 2–4, 2009.

20. Youngs B, Neylon K and Holmes J: “Multisegment Well Modeling Optimizes Inflow Control Devices,” World Oil 231, no. 5 (May 1, 2010): 37–42.

Holmes JA, Byer T, Edwards DA, Stone TW, Pallister I, Shaw G and Walsh D: “A Unified Wellbore Model for Reservoir Simulation,” paper SPE 134928, presented at the SPE Annual Technical Conference and Exhibition, Florence, Italy, September 19–22, 2010.

21. DeBaun et al, reference 19.

22. Weisstein EW: “Traveling Salesman Problem,” Wolfram MathWorld, http://mathworld.wolfram.com/TravelingSalesmanProblem.html (accessed October 12, 2011).

23. Karypis G, Schloegel K and Kumar V: “ParMETIS—Parallel Graph Partitioning and Sparse Matrix Ordering Library,” http://mpc.uci.edu/ParMetis/manual.pdf (accessed July 7, 2011).

Karypis G and Kumar V: “Parallel Multilevel k-way Partitioning Scheme for Irregular Graphs,” SIAM Review 41, no. 2 (June 1999): 278–300.

24. Fjerstad et al, reference 8.

25. Hesjedal A: “Introduction to the Gullfaks Field,” http://www.ipt.ntnu.no/~tpg5200/intro/gullfaks_introduksjon.html (accessed September 26, 2011).

>Multisegment well model. For each segment node in a wellbore, the new well model calculates the total flow in (ΣFIN) and total flow out (ΣFOUT), including any flow between the wellbore and the connected grid cell in the reservoir. Assuming a three-phase black oil simulation, there are three mass conservation equations and a pressure drop equation associated with each well segment. During the simulation, the well equations are solved, along with the other reservoir equations, to give pressure, flow rates and composition in each segment.



Domains and a Parallel, Scalable Solver

The calculation of flow within the reservoir is the most difficult part of the simulation—even for simulators using parallel computing hardware. The number of potential reservoir cells is many times larger than the number of processors available. It is natural to parallelize this calculation by dividing the reservoir grid into areas called domains and assigning each one to a separate processor. Partitioning a structured Cartesian grid into segments containing equal numbers of cells while minimizing their surface area may be a straightforward process; partitioning realistic unstructured grids, however, is more difficult (right). Realistic grids must be used to model the heterogeneous nature of a reservoir that has complex faults and horizons. The grids must also have sufficient detail to delineate irregularities such as water fronts, gas breakthroughs, thermal fronts and coning near wells. These irregularities are usually captured by the use of local grid refinements. Partitioning unstructured grids with these complex features and numerous local refinements is challenging; to address this, next-generation simulators typically use partitioning algorithms.21

The objective of partitioning the unstructured grid is to divide the grid into a number of segments, or domains, that represent equal computational loads on each of the parallel processors. Calculating the optimal partitioning for unstructured grids is difficult, and the solution is far from intuitive. Reservoir partitioning is similar to the “traveling salesman problem” in combinatorial mathematics that seeks to determine the shortest route that permits only one visit to each of a set of cities.22 Unlike the traveling salesman who is concerned only about minimizing his time in transit, partitioning of the reservoir must be guided by the physics of the problem. To this end, the INTERSECT simulator employs the ParMETIS reservoir partitioning algorithm.23 The advantages of partitioning a complex reservoir grid to balance the parallel workload become obvious by considering simulation of the Gullfaks field in the Norwegian sector of the North Sea.24
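The trade-off the partitioner must balance, equal cell counts per processor versus few cut connections that require message passing, can be seen even in a deliberately naive sketch. The Python example below is a toy stripe partition of a structured grid; the function and parameters are illustrative assumptions, not ParMETIS. It counts the interface faces that would become interprocessor communication.

```python
import numpy as np

def stripe_partition(nx, ny, n_domains):
    """Naive partition of an nx-by-ny structured grid into n_domains of nearly equal size.
    Real simulators use graph partitioners such as ParMETIS on unstructured grids."""
    owner = np.empty(nx * ny, dtype=int)
    for d, cell_ids in enumerate(np.array_split(np.arange(nx * ny), n_domains)):
        owner[cell_ids] = d                      # equal computational load per processor
    owner = owner.reshape(ny, nx)
    # Cut faces: neighboring cells assigned to different domains need message passing.
    cut = np.count_nonzero(owner[:, 1:] != owner[:, :-1]) + \
          np.count_nonzero(owner[1:, :] != owner[:-1, :])
    return owner, cut

owner_map, cut_faces = stripe_partition(nx=100, ny=100, n_domains=8)
print(f"{cut_faces} cell faces cross domain boundaries")
```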

Gullfaks, discovered in 1979 and operated by Statoil, is a complex offshore field that has 106 wells producing about 30,000 m3/d [189,000 bbl/d] of oil.25 This field is highly faulted with deviated and horizontal wells crossing the faults. An INTERSECT simulation of this field developed several domain splits so that different numbers of parallel processors could be evaluated for load balancing (right). When compared

> Reservoir grids. Reservoir simulators may lay out the grid in either a structured pattern (upper left) or an unstructured pattern (lower right). Structured grids have hexahedral (cubic) cells laid out in a uniform, repeatable order. Unstructured grids consist of polyhedral cells having any number of faces and may have no discernible ordering. Both grid types partition the reservoir space without gaps or overlaps. Structured grids with many local grid refinements around wells are usually treated as unstructured. Similarly, when a large number of faults are present in a structured grid, it becomes unstructured as a result of the nonneighbor connections created.


> Gullfaks domain decomposition. The highly faulted nature of the Gullfaks field and the number of wells and their complexity result in complicated reservoir communication and drainage patterns. The simulator takes these factors into account and develops a complex, unstructured grid in preparation for partitioning (left ). Fine black lines define individual cell boundaries; vertical lines (magenta) represent wells. Different colors denote varying levels of oil saturation from high (red) to low (blue). This unstructured grid is split into eight domains using a partitioning algorithm for an eight-processor simulation (right ). In the partitioned reservoir, different colors denote the individual domains. Only seven colors appear in the figure—one domain is on the underside of the reservoir and cannot be viewed from this angle. The primary criterion for the domain partition is an equal computational load on each of the parallel processors.



with an ECLIPSE simulation on Gullfaks using eight processors, the INTERSECT approach decreased computational time by more than a factor of five. Runs with higher numbers of processors showed similar improvements and confirmed the scalability of the simulation.

Proper domain partitioning is only part of the next-generation simulation story. Once the reservoir cells are split to balance the workload on the parallel processors, the model must numerically solve a large set of reservoir and well equations. These equations for the reservoir and wells form a large, sparsely populated matrix (left).

Although the equations generated in the simulator are amenable to parallel computation, they are often difficult to solve. Several factors contribute to this difficulty, including large system sizes, discontinuous anisotropic coefficients, nonsymmetry, coupled wells and unstructured grids. The resultant simulation equations exhibit mixed characteristics. The pressure field equations have long-range coupling and tend to be elliptic, while the saturation or mass balance equations tend to have more local dependency and are hyperbolic. The INTERSECT simulator uses a computationally efficient solver to achieve scalable solution of these equation systems.26 It is based on preconditioning the equations to make them easier to solve numerically. Preconditioning algebraically decomposes the system into subsystems that are then manipulated based on their particular characteristics to facilitate solution. The resulting reservoir equations are solved numerically by iterative techniques until convergence is reached for the entire system including wells and surface facilities (left).27
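The flavor of that approach (precondition a large sparse system, then iterate to convergence) can be illustrated generically with standard scientific-Python tools. The sketch below builds a banded sparse matrix as a stand-in for a reservoir Jacobian and solves it with an incomplete-LU preconditioner and GMRES; it is a generic illustration using SciPy, not the INTERSECT CPR-type solver, and the matrix and drop tolerance are invented for the example.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 10_000
# Banded sparse matrix standing in for a structured-grid Jacobian: main diagonal plus
# nearest-neighbor couplings and "next-layer" couplings 100 cells away.
A = sp.diags(
    [4.0 * np.ones(n), -np.ones(n - 1), -np.ones(n - 1), -np.ones(n - 100), -np.ones(n - 100)],
    [0, -1, 1, -100, 100], format="csc")
b = np.random.default_rng(0).standard_normal(n)

ilu = spla.spilu(A, drop_tol=1e-4)                  # incomplete LU preconditioner
M = spla.LinearOperator((n, n), matvec=ilu.solve)
x, info = spla.gmres(A, b, M=M)                     # preconditioned iterative solve
print("converged" if info == 0 else f"not converged (info={info})")
```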

The solver provides significant improvements in scalability and performance when compared with current simulators. A major advantage of this highly scalable solver is its ability to handle both structured and unstructured grids in a general framework for a variety of field situations (next page, top).

26. Cao H, Tchelepi HA, Wallis J and Yardumian H: “Parallel Scalable Unstructured CPR-Type Linear Solver for Reservoir Simulation,” paper SPE 96809, presented at the SPE Annual Technical Conference and Exhibition, Dallas, October 9–12, 2005.

27. The linear solver consumes a significant share of system resources. In a typical INTERSECT case, the solver may use 60% of the central processing unit (CPU) time.

28. Güyagüler B, Zapata VJ, Cao H, Stamati HF and Holmes JA: “Near-Well Subdomain Simulations for Accurate Inflow Performance Relationship Calculation to Improve Stability of Reservoir-Network Coupling,” paper SPE 141207, presented at the SPE Reservoir Simulation Symposium, The Woodlands, Texas, February 21–23, 2011.

>Matrix structure. A matrix of the linearized reservoir simulation equations is typically sparse and asymmetrical (left ). The unmarked spaces represent matrix positions with no equation, while each dot represents the derivative of one equation with respect to one variable (right ). The nine points inside the red square (right ) represent mass conservation equations for gas, water and oil phases. The points on the off-diagonal (left ) represent equations for connections between cells and their neighboring cells in adjacent layers. Points near the vertical and horizontal axes (left ) represent the well equations.


> Numerical solution. The complete set of fundamental reservoir equations can be written in finite-difference form (top). These equations describe how the values of the dependent variables in each grid cell—pressure, temperature, saturation and mole fractions—change with time. The equations also include a number of property-related terms including porosity, pore volume, viscosity, density and permeability (see DeBaun et al, reference 19). Numerical solution of this large set of equations is carried out by the Newton-Raphson method illustrated on the graph. A residual function R(x) that is some function of the dependent variables is calculated at x0 (dashed black line marks coordinate position) and at x0 plus a small increment (not shown). This allows a derivative, or tangent line (black), to be calculated that, when extrapolated, predicts the residual going to zero at x1. Another derivative is calculated at x1 that predicts the residual going to zero at x2. This procedure is carried out iteratively until successive calculated values of R(x) agree within some specified tolerance. The locus of points at the intersection of the derivative line and its corresponding value of x describes the path of the residual as it changes with each successive iteration (red).


Δxyz[T xi ρo (kro/μo)(Δp − ΔPcgo − γo ΔZ)]
+ Δxyz[T yi ρg (krg/μg)(Δp − γg ΔZ)]
+ Δxyz[T wi ρw (krw/μw)(Δp − ΔPcgo − ΔPcwo − γw ΔZ)]
− (qo + qg + qw) = (V/Δt) Δt[φ(So xi ρo + Sg yi ρg + Sw wi ρw)]
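The Newton-Raphson procedure described in the caption is easy to demonstrate on a single equation. The Python sketch below (the function names, the finite-difference derivative step and the sample cubic residual are illustrative assumptions) follows the same tangent-extrapolation loop; a reservoir simulator applies the identical idea to the full coupled system, with the Jacobian matrix playing the role of the derivative.

```python
def newton_raphson(residual, x0, tol=1e-10, h=1e-7, max_iter=50):
    """Iterate x_{k+1} = x_k - R(x_k)/R'(x_k); R' is estimated with a small perturbation h."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:                   # successive residuals within tolerance
            break
        slope = (residual(x + h) - r) / h  # tangent at the current iterate
        x -= r / slope                     # extrapolate the tangent to R = 0
    return x

root = newton_raphson(lambda x: x**3 - 2.0 * x - 5.0, x0=2.0)
print(root)  # approximately 2.0946
```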



>Multiple reservoir coupling. The field management module can link independent Reservoirs A and B and surface facilities (center ) via network links. In this example, Reservoir A (lower left ) is using the ECLIPSE simulator on a Microsoft Windows desktop computer while Reservoir B (lower right ) is using the INTERSECT system on a Linux parallel cluster. The surface network simulator, running on a Microsoft Windows desktop, is handling surface facilities for this network (upper right ). The FM module (top) that controls all of these simulators may be a desktop or local mainframe computer.

Field Management Workflow

An improved field management workflow is one of the components of the INTERSECT simulator package. Field management tasks include design of and modifications to surface facilities, sensitivity analyses and economic evaluations. Traditionally, field management tasks have been distributed among the various simulators—including a reservoir simulator, process facilities simulator and economic simulator. The isolation of the simulators in the traditional workflow tends to produce suboptimal field management plans.

The field management (FM) module in the INTERSECT simulator addresses the weaknesses of traditional methods with a collection of tools, algorithms, logic and workflows that allow all of the different simulators to be coupled and run in concert. This provides a great deal of flexibility; for example, the module would allow two isolated, offshore gas reservoirs to be linked to a single surface processing facility for modeling and evaluation.28

At the top level, the module executes one or more strategies that are the focal point of the whole framework. Strategies, which consist of a list of instructions and an optional balancing action, can encompass a wide variety of scenarios that might affect production. These strategies may include factors that affect subsurface deliverability such as reservoir performance, well performance and recovery methods. Other strategies affecting production may include surface capability and economic viability. After the strategy is selected, the FM module employs tools to create a complete topological representation of the field including wells, completions and inflow control devices. Once the strategy has been set and the field topology is defined, the module uses operating targets and limits to set well balancing actions and potential field topology changes. An important feature of the FM workflow is the ability to control multiple simulators running on different machines and operating systems and in different locations (right).

Chevron and its partners used the INTERSECT simulator in their field development of a major gas project off the coast of Australia. The large capital outlays envisioned for this project required a next-generation simulator that could run cases quickly on large, unstructured grids characterized by highly heterogeneous geology.

> INTERSECT simulation scalability. This simulation system has been used in a variety of offshore and onshore field scenarios including large gas condensate fields and fields with significant faulting. Scalability—measured as the run time on 16 processors divided by the run time on n processors, or speedup—is calculated as a function of the number of processors. The diagonal straight line (dashed) represents ideal scaling.

(Graph: speedup, the run time on 16 processors divided by the run time on n processors, versus number of processors from 16 to about 300, for a fractured carbonate oil field, a highly faulted supergiant field, an offshore supergiant field, a massive gas condensate field, a large onshore oil field and a highly faulted oil field, compared with the ideal scaling line.)


> Gorgon project, offshore Australia. The Gorgon project includes the Gorgon and IO/Jansz subsea gas fields that lie 150 to 220 km [93 to 137 mi] off the mainland. Gas is moved from the fields by deep underwater pipelines (black) to Barrow Island, about 50 km off the coast. There, the raw gas is stripped to remove CO2 and then either liquefied to LNG for export by tanker or moved to the mainland by pipeline for domestic use. On the mainland, gas from Barrow Island is transported through an existing pipeline (blue) that gathers gas from other producing areas nearby.


Better Decisions—Reduced Uncertainty

The Gorgon project—a joint venture of Chevron, Royal Dutch Shell and ExxonMobil—will produce LNG for export from large fields off the coast of northwest Australia.29 This project will take subsea gas from the Gorgon and IO/Jansz fields and move it by underwater pipeline to Barrow Island about 50 km [31 mi] off the coast (above). Chevron—the operator—is building a 15 million–tonUK [15.2 million–metric ton] LNG plant on Barrow Island to prepare the gas for export to customers in Japan and Korea. Engineers at Chevron knew that one of the challenges would be to dispose of the high levels of CO2 separated from the raw gas.30 Chevron will meet this challenge by removing the CO2 at the LNG plant and burying it deep beneath the surface of Barrow Island (below). Gorgon will be capable of injecting 6.2 million m3/d [220 MMcf/d] of CO2 using nine injection wells spread over three drill centers on Barrow Island.

With billions of dollars of capital and LNG revenues at stake, Chevron and its partners understood from the start that the engineers developing the business case would need to know how much the project would yield and for how long. Extensive reservoir modeling and simulation were the solutions to this challenge. Some of Chevron’s simulations on an internal serial simulator with fine grid models of individual Gorgon formations were taking 13 to 17 hours per run. Early in the project, Chevron decided that migration to the INTERSECT simulator would be required for timely project development.

Although some computer models require a minimal amount of input data, that cannot be said for reservoir simulators. These simulators employ large datasets and typically use purpose-built migrators to move the data from one simulator to another. For Gorgon, Chevron used internal migrating software to transform input from their internal simulator to the corresponding INTERSECT input dataset. These data were used to develop history-matching cases at data centers in Houston and in San Ramon, California, USA.31 The results from these cases showed that both simulators were producing equivalent results—although taking very different amounts of CPU time to do it. This process was repeated on high-performance, parallel computing clusters at the Chevron operations center in Perth, Western Australia, Australia.

As Chevron project teams in Australia began advanced project planning, the INTERSECT simulator reduced simulation times by more than an order of magnitude. In one Gorgon gas field simulation with 15 wells and 287,000 grid cells, serial run times with the internal simulator were six to eight hours, while the INTERSECT system reduced run times to under 10 minutes with

> CO2 disposal. As natural gas is produced from the various reservoirs in the Gorgon project (left ), it is fed to a CO2 stripping facility located near the LNG plant (middle). Stripped natural gas (orange) flows to liquefaction and an associated domestic gas plant (not shown), while the extracted CO2 (blue) is compressed and injected into an unused saline aquifer 2.5 km [1.6 mi] beneath the surface for disposal. Conditions in the CO2 storage formation are monitored by seismic surveys and surveillance wells (right ).



excellent scalability. In addition to this simulation, Chevron has used the INTERSECT simulator on other fields in the Gorgon area including Wheatstone, IO/Jansz and West Tryal Rocks. Both black oil and compositional models have been used with grids ranging from 45,000 to 1.4 million cells. INTERSECT simulation times on these cases using the Perth cluster ranged from 2 minutes to 20 minutes depending on the case.

Next-generation reservoir simulation on geologic scale models with fast run times has enhanced decision analysis and uncertainty management at Gorgon.

Reducing Simulation Time

Reduction of reservoir simulation execution time was also a key factor for Total at their Surmont oil sands project in Canada. Surmont, located in the Athabasca oil sands area about 60 km [37 mi] southeast of Fort McMurray, Alberta, Canada, is a joint venture between ConocoPhillips Canada and Total E&P Canada (right).32 The project was initiated in 2007 with a production of 4,293 m3/d [27,000 bbl/d] of heavy oil; it is expected to reach full capacity of 16,536 m3/d [104,000 bbl/d] in 2012.

At Surmont, the highly viscous bitumen in the unconsolidated reservoir is produced by steam-assisted gravity drainage (SAGD). In this process, steam is injected through a horizontal well, and heated bitumen is produced by gravity from a parallel, horizontal producing well below the injector. Typically, one steam chamber is associated with each injector and producing well, and a SAGD development consists of several adjacent well pairs.

From a simulations point of view, at the start of SAGD operations, the individual steam chambers are independent of each other and simulations can be performed on individual SAGD pairs. As the heating and drainage proceed, this independence between well pairs ceases because of pressure communications, gas channeling and aquifer interactions. Including all well pairs in a typical SAGD development quickly leads to multimillion-cell models that could not be run in a reasonable time frame with commercial thermal simulators.33 Total turned to the INTERSECT new-generation simulator to model the full-field, nine-pair SAGD operation at Surmont.

The model describes an oil sands reservoir with an oil viscosity of 1.5 million mPa.s [1.5 million cp] and 1.7 million grid blocks with heterogeneous cell properties.34 The model includes external heat sources and sinks to describe the interaction with over- and underburden material. The producers are controlled by maximum steam rate, maximum liquid rate and minimum bottomhole pressure (BHP). The injectors are controlled by maximum injection rate and maximum BHP.

The INTERSECT system was used to model the first three years of SAGD operations at Surmont on a parallel computer cluster. To test its speed and scalability, the software was used on different parallel hardware configurations ranging from 1 to 32 processors. These tests proved the ability of this application to handle this large heterogeneous model quickly enough to support operational decisions. For example, using 16 processors,

29. Flett M, Beacher G, Brantjes J, Burt A, Dauth C, Koelmeyer F, Lawrence R, Leigh S, McKenna J, Gurton R, Robinson WF and Tankersley T: “Gorgon Project: Subsurface Evaluation of Carbon Dioxide Disposal Under Barrow Island,” paper SPE 116372, presented at the SPE Asia Pacific Oil and Gas Conference and Exhibition, Perth, Western Australia, Australia, October 20–22, 2008.

30. Raw gas from the Gorgon fields has about 14% CO2.

31. To ensure that two reservoir simulators are producing equivalent results, a user may employ a technique called history-matching. Each simulator will run the same case, and the oil or gas production rates, as a function of time, will be compared. If they match, the two cases are deemed equivalent. This technique can also be used to calibrate a simulator to a field where long-term production data are available.

> Surmont project. The Surmont oil sands project is located southeast of Fort McMurray, Alberta, Canada, within the greater Athabasca oil sands area. Depending on the topography and the depth of the overburden, oil sands at Athabasca may be produced by surface mining or steam-assisted processes such as SAGD.


32. Handfield TC, Nations T and Noonan SG: “SAGD Gas Lift Completions and Optimization: A Field Case Study at Surmont,” paper SPE/PS/CHOA 117489, presented at the SPE International Thermal Operations and Heavy Oil Symposium, Calgary, October 20–23, 2008.

33. Total initially tried a commercial, thermal simulator to model operations at Surmont. Case run times were very long—45 hours—making this approach impractical.

34. The oil is modeled using two pseudocomponents— one light and one heavy.


the INTERSECT simulator executed the Surmont case in 2.6 hours.35 Parallel scalability is also good—10 times faster on 16 processors compared with a serial run. In addition to predicting flow performance from SAGD operations, the system can also give information on profiles of important variables such as pressure and temperature in the steam chambers (below). In preparation for fully deploying the INTERSECT technology at Surmont, Total is confirming these results on the most current version of the simulator.

Conserving Resources

As operators continue to push into remote areas in search of resources, next-generation simulators will be there to aid in planning and development. A case in point is the new Chevron Tengiz field in the Republic of Kazakhstan, at the shore of the Caspian Sea. Tengiz is a deep, supergiant, naturally fractured carbonate oil and gas field with an oil column of about 1,600 m [5,250 ft] and a production rate of 79,500 m3/d [500,000 bbl/d].36 The Tengiz field is expansive, covering an area of about 440 km2 [170 mi2], and contains an estimated 4.1 billion m3 [26 billion bbl] in place.

The challenge for Chevron in modeling Tengiz was the field’s geologic complexity coupled with the need to reinject large quantities of H2S recovered from the production stream. This required combining detailed geologic information with information on the distinctly different flow behaviors between fractured and nonfractured areas of the field. To assist in current field management and support future growth, Chevron developed an INTERSECT case that encompassed the 116 producing wells. The model contained 3.7 million grid blocks in an unstructured grid that included more than 12,000 fractures.37 Chevron has experienced improved efficiency using the new simulator at Tengiz; simulations that once took eight days now take eight hours.38 More-realistic geologic input leads to more-accurate production forecasts that allow engineers to make better field development decisions. In addition to their use in the development of new fields, next-generation simulators may also aid recovery of additional oil and gas from older fields.

The world energy markets rely heavily on the giant reservoirs of the Middle East. The largest of these reservoirs—Ghawar—was discovered in 1948 and has been producing for 60 years.39 Ghawar is a large field, measuring 250 km [155 mi] long by 30 km [19 mi] wide. Simulation of a reservoir the size of Ghawar is challenging because of the fine grid size that must be employed to capture the heterogeneities seen in high-resolution seismic data. Using fine grid sizes can reduce errors in upscaling (next page).

To handle reservoirs the size of Ghawar and its other giant fields, Saudi Aramco has developed a next-generation reservoir simulator.40 In one Ghawar black oil simulation, the model used more than a billion cells with a 42-m [138-ft] grid and 51 layers with 1.5-m [5-ft] spacing.41 Using a large parallel computing system,

35. This case ran on four multicore processors, each with four cores, for a total of 16 processing cores.

36. Tankersley T, Narr W, King G, Camerlo R, Zhumagulova A, Skalinski M and Pan Y: “Reservoir Modeling to Characterize Dual Porosity, Tengiz Field, Republic of Kazakhstan,” paper SPE 139836, presented at the SPE Caspian Carbonates Technology Conference, Atyrau, Kazakhstan, November 8–10, 2010.

37. The Tengiz simulation also couples the reservoir and well models to surface separation facilities to maximize plant capacities as part of development planning.

38. Chevron Corporation: “Envisioning Perfect Oil Fields, Growing Future Energy Streams,” Next*, no. 4 (November 2010): 2–3. Chevron is also using the INTERSECT system to reduce run time in field scale models for thermal recovery processes. For more information, see: Lim K-T and Hoang V: “A Next-Generation Reservoir Simulator as an Enabling Tool for Routine Analyses of Heavy Oil and Thermal Recovery Process,” WHOC paper 2009-403, presented at the World Heavy Oil Congress, Puerto La Cruz, Venezuela, November 3–5, 2009.

> Steam chambers. The nine steam chambers at Surmont are located at a depth of 300 m [984 ft] near the bottom of the oil sands reservoir. These chambers have a lateral spacing of about 100 m [328 ft] and a length of nearly 1,000 m [3,281 ft]. Each chamber has a pair of wells—one steam injector (magenta) and a parallel producing well (not shown). INTERSECT simulation of a thermal process such as SAGD also yields information on temperature profiles in the steam chambers. At Surmont, the temperature varies from more than 230°C [446°F] (red areas) at the core of the chamber to ambient temperature at the periphery (blue areas). Gaps along the length of the chambers reflect permeability differences in the oil sands. The operator monitors temperature in the steam chambers during production. While the steam chambers are relatively small, the SAGD process is efficient. Once the chamber growth reaches the rock at the top of the reservoir, thermal efficiency drops because of heat transfer to the overburden.


(Figure color scale: temperature, 10°C to 234°C.)


39. Afifi AM: “Ghawar: The Anatomy of the World’s Largest Oil Field,” Search and Discovery (January 25, 2005), http://searchanddiscovery.com/documents/2004/afifi01/ (accessed September 29, 2011).

40. Dogru et al, reference 19.

41. Dogru AH, Fung LS, Middya U, Al-Shaalan TM, Byer T, Hoy H, Hahn WA, Al-Zamel N, Pita J, Hemanthkumar K, Mezghani M, Al-Mana A, Tan J, Dreiman W, Fugl A and Al-Baiz A: “New Frontiers in Large Scale Reservoir Simulation,” paper SPE 142297, presented at the SPE Reservoir Simulation Symposium, The Woodlands, Texas, February 21–23, 2011.

42. Dogru AH: “Giga-Cell Simulation,” The Saudi Aramco Journal of Technology (Spring 2011): 2–7.

43. Farber D: “Microsoft’s Mundie Outlines the Future of Computing,” CNET News (September 25, 2008), http://news.cnet.com/8301-13953_3-10050826-80.html (accessed August 4, 2011).

44. Dogru et al, reference 41.

45. Bridger T: “Cloud Computing Can Be Applied for Reservoir Modeling,” Hart Energy E&P (March 1, 2011), http://www.epmag.com/Production-Drilling/Cloud-Computing-Be-Applied-Reservoir-Modeling_78380 (accessed August 11, 2011).


this model simulated 60 years of production history in 21 hours.42 The results were compared with an older simulation run using a 250-m [820-ft] grid and a given production plan. The older simulator predicted no oil left behind after secondary recovery; the new simulator revealed oil pockets that could be produced using infill drilling or other methods. This example shows how next-generation simulators may facilitate additional resource recovery.
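To see why grid size matters so much at this scale, the following minimal Python sketch shows how areal cell size drives cell count. The 250 km by 30 km footprint and 51 layers are the figures quoted above; treating the model as a simple rectangle of exactly that size is our assumption, so the totals are indicative only and are not a reconstruction of the Saudi Aramco model, which exceeds a billion cells.

def cell_count(length_m, width_m, cell_m, layers):
    """Approximate number of grid blocks in a regular areal grid."""
    nx = length_m // cell_m   # cells along the field length
    ny = width_m // cell_m    # cells across the field width
    return int(nx * ny * layers)

# Assumed rectangular footprint, roughly the quoted Ghawar dimensions
length_m, width_m, layers = 250_000, 30_000, 51

coarse = cell_count(length_m, width_m, 250, layers)  # legacy 250-m grid
fine = cell_count(length_m, width_m, 42, layers)     # next-generation 42-m grid

print(f"250-m grid: {coarse:,} cells")
print(f" 42-m grid: {fine:,} cells (about {fine / coarse:.0f} times more)")

The point is the ratio rather than the absolute numbers: refining the areal grid from 250 m to 42 m multiplies the cell count by roughly (250/42)^2, about 35 times, which is why billion-cell models and large parallel computing systems go hand in hand.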

Although a primary goal of next-generation simulators has been to describe reservoirs more completely through finer grids and less upscaling, scientists are also pursuing other technology innovations. Improved user interfaces and new hardware options for reservoir simulation are imminent.

These improved user interfaces embody a concept known as spatial computing. Spatial computing relies on multicore processors, parallel programming and cloud services to produce a virtual world controlled by speech and gestures.43 This concept is being tested for controlling large reservoir simulations with hand gestures and verbal commands rather than with a computer mouse.44 To test the concept, a room is equipped with cameras and sensors connected to large screens on the walls and a visual display on a table. Using hand gestures and speech, engineers manage the simulator input and output. If needed, the system can be used in a collaborative manner via a network with engineers and scientists at other locations. This kind of system has vast possibilities—it tends to mask computing system complexity and allows engineers and scientists to interact freely with the reservoir simulation.

Just as ideas such as spatial computing will enhance the user interface, new hardware utilization concepts that go beyond onsite parallel computing clusters will add to reservoir simulation capability. Clusters of parallel computers are expensive, and the associated infrastructure is complex and difficult to maintain. Some operators are discovering that cloud computing can be used to communicate with multiple clusters in many locations.45 Using this approach, the operator can add system capacity as the situation dictates rather than depending on a fixed set of hardware. This approach allows the user to communicate with the cloud system via a “thin client” such as a laptop or a tablet. Reservoir modeling tools using this technology have already been developed, and more will follow.

New technology for reservoir simulation is emerging on several fronts. Foremost are next-generation reservoir simulators that produce more-accurate simulations on complex fields with reduced execution time. Other technologies such as spatial and cloud computing are on the near horizon and will allow scientists and engineers to interact more naturally with the simulations and potentially add hardware capability at will. These developments will give operators more-accurate forecasts, and those improved forecasts will lead to better field development decisions. —DA

> Grid resolution. Areal grid size plays an important role in capturing reservoir heterogeneity and eliminating errors caused by upscaling. Overhead photos of the Colosseum in Rome illustrate the concept. If the area of interest is the Colosseum floor (dashed box, upper left), then a 50 m × 50 m [164 ft × 164 ft] grid is fine enough to capture the required detail. Choice of a larger 250 m × 250 m [820 ft × 820 ft] grid (dashed box, right) includes driveways, streets, landscaping and other features not associated with the focal point of interest. In the case of the Colosseum, use of the larger grid to capture properties associated with the floor would introduce errors.


(Figure scale markers: 250 m and 50 m. Imagery © 2011 Google; imagery © 2011 DigitalGlobe, GeoEye.)
