
Validating Reservoir Models to Improve Recovery

Jack Bouska, BP Amoco plc, Sunbury, England

Mike Cooper and Andy O'Donovan, BP Amoco plc, Aberdeen, Scotland

Chip Corbett, Houston, Texas, USA

Alberto Malinverno and Michael Prange, Ridgefield, Connecticut, USA

Sarah Ryan, Cambridge, England

For help in preparation of this article, thanks to Ian Bryant, Schlumberger-Doll Research, Ridgefield, Connecticut, USA; Henry Edmundson, Schlumberger Oilfield Services, Paris, France; Omer Gurpinar, Holditch-Reservoir Technologies, Denver, Colorado, USA; Steve McHugo, Geco-Prakla, Gatwick, England; Claude Signer and Lars Sønneland, Geco-Prakla, Stavanger, Norway; and James Wang, GeoQuest, Houston, Texas. We thank the Bureau of Economic Geology, The University of Texas at Austin, for permission to use the Stratton field 3D Seismic and Well Log Data Set, and BP Amoco and Shell UK E&P for permission to use the Foinaven area data.
ARI (Azimuthal Resistivity Imager), FMI (Fullbore Formation MicroImager), MDT (Modular Formation Dynamics Tester), OFA (Optical Fluid Analyzer), RST (Reservoir Saturation Tool) and TDT (Thermal Decay Time) are marks of Schlumberger.

Valid and worthwhile reservoir simulation hinges on careful preparation of a reservoir model and a clear understanding of model uncertainty. Creative integration and comparison of seemingly disparate data and interpretations elevate reservoir models to new heights of reliability. Through this process, all data incorporated into a model and ultimately fed into a simulator become more valuable, resulting in increases on the order of 10% in predicted ultimate recovery.


Making and testing predictions are part of our everyday existence and basic to most industries. Safety equipment, medical treatments, weather forecasts and even interior designs are evaluated by simulating situations and predicting the results. Similarly, the oil and gas industry makes predictions about hydrocarbon reservoirs to decide how to improve operations.

Reservoir optimization requires carefully constructing a reservoir model and performing simulations. Interpreting and integrating quality-controlled data from a variety of sources and vintages, and at different scales, are prerequisites for preparing a comprehensive reservoir model. Most computer simulators take the reservoir model and represent it as three-dimensional blocks through which fluids flow (previous page). The data, models and simulations provide a more complete understanding of reservoir behavior.

Working together, skilled interpreters use a simulator to predict reservoir behavior over time and optimize field development strategies accordingly. For instance, the effectiveness of infill drilling locations and trajectories can be determined through simulations of multiple scenarios or assessment of the impact of the uncertainty of specific parameters. Reservoir simulation is also useful in evaluating different completion techniques as well as deciding whether to maximize production rate or ultimate recovery. In this article, we consider how the integration of all available data to validate and constrain reservoir models leads to more realistic reservoir simulation (next page).

Reservoir simulation is a tool for reservoir management and risk reduction.1 Although the first simulations were performed during the 1950s, for a long time limited computer availability and slow speed confined their use to only the most significant projects.2

At present, reservoir simulation is performed most commonly in high-risk, high-profile situations, but could improve virtually any project. The list of typical applications is varied and extensive (next page):

New discoveries, to determine the number of wells and the type and specification of facilities needed for production. Particular attention is paid to the reservoir's drive mechanism and the development of potential oil, gas and water profiles. All assessments are subject to the risk of limited data, sometimes from only a single well.

Deepwater exploration and other areas where initial test wells are expensive. Estimates draw on restricted data, such as seismic data and results from a single well.

Fields in which production surprises occur and development expenditures have already been incurred. New measurements or production strategies might be advisable.

Secondary recovery implementation. Appropriate decisions are essential because of the expense of enhanced production startup.

Divestment and abandonment decisions. Simulation can help determine whether a field has reached the end of its economic life or how remaining reserves might be exploited.

These applications of simulation are made possible by new programs and computers that are faster and easier to use. (A full review of the advances in simulation software that have occurred in the last few years is beyond the scope of this article, but will be covered in a future article in Oilfield Review.) The new simulators run on less expensive computers and allow rapid studies to rank opportunities. Along with these capabilities, however, arises the possibility that simulations might be performed indiscriminately or before a validated reservoir model has been built, potentially prompting misleading or erroneous results and poor decision-making. There is also the risk of performing simulations based on limited data.

Developing a first-rate reservoir model from limited data at a variety of scales is difficult. In its most basic form, model validation is achieved through integrating different types of data. Researchers are investigating the best way to integrate some new types of data, such as multicomponent seismic data, into reservoir models. A more sophisticated approach involves uncertainty analysis (see "Validating Models Using Diverse Data," page 24).

In some cases, it is best to begin with the simplest model that fits the data and the objectives of the project and reproduces reservoir behavior. In all cases, the starting point should be an evaluation of what answers are required from reservoir simulation, the accuracy needed and the level of confidence or the acceptable range of quantitative predictions. The model complexity might be increased as more data become available. The reward for increasing model complexity can be evaluated after each simulation run to decide whether more complex simulation is justified.

Estimates of well flow rates and predictions of reservoir performance from simulations affect design of production facilities and should be believed, even if they seem unlikely. For example, a deepwater Gulf of Mexico field required expansion and de-bottlenecking of facilities soon after initial production because the initial reservoir model was compromised by a pessimistic view of the interpreted reservoir continuity and flow rates. Better predictions allow operators to size facilities correctly the first time rather than having to re-engineer them.

The quality of predevelopment reserve estimates, field appraisals and development strategies relates closely to reservoir architecture and structural complexity; reserves tend to be underestimated in large, less complex fields, whereas reserves in smaller, more complex fields are commonly overstated. Poor reservoir models and resultant incorrect calculations of reserves, whether too high or too low, have negative economic consequences. In the North Sea, deficient reservoir models have led to improper facilities sizing and suboptimal well placement, even in fields where simulation studies were carried out.3

Better validation of models, particularly using 3D seismic data, might have averted over- or undersizing production facilities or drilling unnecessary wells in some cases. In other cases, reservoir simulation has allowed identification of the key drivers of reservoir performance so that data-gathering efforts can be targeted to reduce uncertainty in those areas. Alternatively, facilities can be designed to be flexible within a given reservoir uncertainty.


Traditional Approach | Leading-Edge Approach
Distributed disciplines | Multidisciplinary teamwork
Use of data in isolation, obscuring relationships between data (for example, seismic and core data) | Integration of data and interpretations to confirm reservoir models
Inconsistent or poorly documented interpretation techniques | Archiving of interpretations and consistent methods
Overdependence on simple reservoir maps | Seismic-guided reservoir property mapping
Simulation dependent on computer availability and capability | Simulations run on personal computers or using massively parallel processing
Unlimited modification of simulation input values to achieve match with production history | Reservoir models constrained by integrated data and interpretations and prudent adjustment of inputs
Limited use of simulation to guide data acquisition | Modeling and simulation to determine optimal timing for data gathering, such as 4D seismic surveys

> Simulation approaches. In the past, single-discipline interpretation and lack of computing capability limited the use of reservoir simulation. Now, a more sophisticated approach to simulation makes the most of multidisciplinary teams and nearly ubiquitous computers.


Increasing the Value of Data

Operating companies spend considerable time and money acquiring data, from multimillion-dollar seismic surveys and cores from costly exploratory wells to sophisticated well logs and production tests during and after drilling. Data acquisition presents potential risks to both project economics and the well itself, such as logging or testing tools becoming stuck, a core barrel malfunctioning or having to shut in or kill a producing well. One would expect, then, that data would be analyzed and incorporated into models as fully as possible, or not collected in the first place. Most reservoir simulations rely heavily on production data from wells and only four types of geological or geophysical reservoir maps: structure of the top of the reservoir, reservoir thickness, porosity and the ratio of net pay to gross pay. These maps are often constructed from seismic and well log data alone. Incorporating all available data, such as core analyses, seismic-guided reservoir property distributions and fluid analyses, is a cost-effective way to strengthen and validate reservoir models across disciplines.

A reservoir model usually combines production rates and volumes with geological and geophysical maps of subsurface strata derived from well logs and seismic data. Aquifers are often included in the model, and sealing rocks are typically treated as zero-permeability layers. The subsurface maps take into account well locations and trajectories. The reservoir model is strengthened if a geological map of permeability values is created by applying a porosity-to-permeability transform to the porosity map according to permeability values interpreted from well tests, well logs or cores. Even more rigorous results are obtained when, in addition to well rates and produced or producible hydrocarbon volumes, all available production data are input into the model. These include pressure, gas/oil ratios, and fluid densities, saturations, viscosities and compressibilities.
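A porosity-to-permeability transform of this kind is usually fitted to core or well test data and then applied to the mapped porosity. The sketch below is a minimal illustration only, assuming a simple log-linear relationship fitted by least squares; the function names and the core values are hypothetical, not taken from the article.

```python
import numpy as np

def fit_poro_perm_transform(core_phi, core_k):
    """Fit log10(k) = a*phi + b to core porosity/permeability pairs."""
    a, b = np.polyfit(core_phi, np.log10(core_k), 1)
    return a, b

def apply_transform(phi_map, a, b):
    """Convert a porosity map (fraction) to a permeability map (md)."""
    return 10.0 ** (a * phi_map + b)

# Hypothetical core measurements: porosity (fraction) and permeability (md)
core_phi = np.array([0.08, 0.12, 0.16, 0.20, 0.24])
core_k = np.array([0.5, 5.0, 40.0, 200.0, 900.0])

a, b = fit_poro_perm_transform(core_phi, core_k)
phi_map = np.array([[0.10, 0.18], [0.22, 0.14]])  # porosity map from logs and seismic data
k_map = apply_transform(phi_map, a, b)            # permeability map for the reservoir model
print(np.round(k_map, 1))
```

The fitted coefficients should, of course, be checked against permeabilities interpreted from well tests before the map is accepted.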

In many instances, though, reservoir models fail to encompass the full diversity of reservoir data because only a few basic geological and geophysical maps, constructed from a subset of the data available, are used to describe variations in the data. Additional data and interpretations are needed to make reservoir models more robust. For example, core data can serve as calibrators for geological, petrophysical and engineering data and interpretations, but are often used only as guides to permeability. Core analysis refines model values of porosity, permeability, capillary pressure and fluid saturation. Whole cores, while not necessarily representative of the entire reservoir, offer tangible evidence of grain size and composition, sorting, depositional environment and postdepositional reservoir history, such as bioturbation, cementation or diagenesis.

Seismic and well test data enable mapping of permeability barriers, but are rarely used in tandem. For example, horizon dip, azimuth, coherency or other seismic attributes might indicate fault patterns.4 Such information is especially useful when contemplating the addition of directionally drilled or multilateral wells. These types of interpretations are just the beginning; all other data types should be similarly scrutinized.

As mentioned earlier, the reliance of simulators on four simple subsurface maps has impaired simulation effectiveness. Simulation becomes more realistic as additional data are incorporated into the reservoir model; reconciling all available data tends to rule out some interpretations. For example, permeability values can be inferred from well logs and confirmed by core and well test data, and possibly related to seismic attributes, rather than merely computed from an empirical transform of a porosity map and well test data. Reconciling conflicting data requires acceptance of a hierarchy of data confidence. This hierarchy might be developed on the basis of probable measurement errors.

Situation | Desired Results | Pitfalls or Other Considerations
New discoveries | Determine optimal number of infill wells; size and type of production facilities; decide whether to maximize production rate or ultimate recovery | Limited data, sometimes from only a single well; drive mechanism; terms of operating license or lease
Deepwater exploration | Prospect evaluation; scenario planning | Limited data, no wells available
Mature fields | Answers to sudden production problems | Relatively inexpensive way to extract maximum value from development costs
Implementation of secondary recovery | Determine appropriate recovery method |
Divestment or abandonment | Determine future production volumes | Unanticipated future production problems might reduce property value

    > Simulation uses. Reservoir simulation is useful during all phases of the life of a reservoir and in both high- and low-risk projects.

1. For a general introduction to reservoir simulation: Adamson G, Crick M, Gane B, Gurpinar O, Hardiman J and Ponting D: "Simulation Throughout the Life of a Reservoir," Oilfield Review 8, no. 2 (Summer 1996): 16-27.

2. Watts JW: "Reservoir Simulation: Past, Present, and Future," paper SPE 38441, presented at the SPE Reservoir Simulation Symposium, Dallas, Texas, USA, June 8-11, 1997.

3. Dromgoole P and Speers R: "Geoscore: A Method for Quantifying Uncertainty in Field Reserve Estimates," Petroleum Geoscience 3, no. 1 (February 1997): 1-12.

4. Key SC, Nielsen HH, Signer C, Sønneland L, Waagbø K and Veire HH: "Fault and Fracture Classification Using Artificial Neural Networks: Case Study from the Ekofisk Field," Expanded Abstracts, 67th SEG Annual International Meeting and Exposition, Dallas, Texas, USA, November 2-7, 1997: 623-626.


Validating Models Using Diverse Data

A shared earth model is a model of the geometry and properties of the reservoir constrained by a variety of measurements. To be predictive, the model should approximate the key features of the actual reservoir as closely as possible (right). In a valid reservoir model, predictions from the model agree with the measured data. A good fit between predictions and measurements is not sufficient, though. Several models might agree equally well with the data. The best model is the one that agrees best with the data and with prior information on the model parameters. The uncertainty of the model is defined as the range of model parameters consistent with the measurements.

Consider a thin bed imaged by seismic data (below right). The uncertainty of the shared earth model in this case is described by the range of thickness and impedance values that satisfy the data. This range defines a probability density function (PDF).
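To illustrate how such a PDF might be evaluated in practice, the sketch below grids over candidate thin-bed thickness and impedance values, compares a toy forward model with one measured amplitude, and combines the misfit with a prior. The forward model, noise level and prior ranges are hypothetical placeholders, not values from the article.

```python
import numpy as np

def forward_amplitude(h, z):
    """Toy forward model: reflection amplitude of a thin bed (hypothetical)."""
    return 0.002 * h * (z - 6000.0) / 6000.0

# Grid of candidate thickness (m) and impedance values
h_grid = np.linspace(1.0, 20.0, 200)
z_grid = np.linspace(4000.0, 9000.0, 200)
H, Z = np.meshgrid(h_grid, z_grid)

d_obs, sigma = 0.015, 0.003          # measured amplitude and noise level (hypothetical)
likelihood = np.exp(-0.5 * ((forward_amplitude(H, Z) - d_obs) / sigma) ** 2)
prior = np.ones_like(H)              # uniform prior over the grid
posterior = likelihood * prior
posterior /= posterior.sum()         # normalized posterior PDF over (h, z)

# The spread of this PDF plays the role of the uncertainty ellipse described in the text
i, j = np.unravel_index(posterior.argmax(), posterior.shape)
print("most likely thickness:", h_grid[j], "impedance:", z_grid[i])
```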



> Model inputs. A shared earth model begins with seismic, well log and dynamic data from the actual reservoir. In this example, the reservoir is represented by a model (top left). Measured seismic data (top right) are compared with data predicted by the model (bottom right), which can be adjusted to improve the fit. The final three-dimensional shared earth model (bottom left) incorporates all available data.


> Quantifying uncertainty. The thin bed shown as a red layer (left) has thickness h and acoustic impedance VP. The plot to the right displays the posterior probability density function. Thickness and impedance values within the red uncertainty ellipse satisfy the data, and within that ellipse are red circles denoting a good fit and the best fit. The red circle outside the uncertainty ellipse does not satisfy the model.



Researchers at Schlumberger-Doll Research, Ridgefield, Connecticut, USA, are using a Bayesian approach to quantify the uncertainty of reservoir models (right). A prior PDF represents initial information on model parameters, the vector m. This prior PDF can be combined with a likelihood PDF, which quantifies information provided by additional data, to obtain a posterior PDF. When new data become available, the posterior PDF is used as the initial or prior PDF, and the model is again refined. As additional measurements are incorporated in the model, the uncertainty decreases and a better reservoir description follows.
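Written out, the sequential updating illustrated in the figure takes the familiar Bayesian form; the notation below simply restates the panel labels (prior p(m), likelihood L, posterior) and is a sketch of the general rule, not a formula quoted from the article.

```latex
p(\mathbf{m}\mid d_1) \propto L(\mathbf{m}\mid d_1)\, p(\mathbf{m}),
\qquad
p(\mathbf{m}\mid d_1, d_2) \propto L(\mathbf{m}\mid d_2)\, p(\mathbf{m}\mid d_1)
```

The posterior obtained from the first data set becomes the prior for the second, and its spread, represented by the uncertainty ellipse, shrinks as measurements accumulate.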

Effective model validation using a Bayesian approach requires three modes of operation: interactive, optimization and uncertainty. In the interactive mode of the prototype application in development, the user modifies the reservoir model and observes the consequences of interpretation decisions on the data (below right). In the example shown, predicted seismic data are compared with measured data. In the optimization mode, the user selects the model parameters to optimize. The software finds the best local fit of the model to the data. Finally, in the uncertainty mode, the uncertainty ellipse of selected reservoir properties is computed and displayed. The uncertainty ellipse represents the range of acceptable models.

The prototype application has been used to test a Bayesian validation approach against diverse data types, including seismic data, well logs, while-drilling data and production information. By validating a reservoir model against all available data before beginning the history-matching phase, the range of admissible models can be reduced substantially. The result is a more predictive reservoir model.

[Figure panels: prior PDF p(m); likelihood L(m | d1); posterior PDF p(m | d1); then prior PDF p(m | d1); likelihood L(m | d2); posterior PDF p(m | d1, d2); axes m1 and m2; best model; uncertainty ellipse]

> Reducing uncertainty. In a Bayesian approach, a prior PDF quantifies the initial information on model parameters, expressed as vector m. The prior PDF (top left) is refined by the inclusion of new data (top center) to create the posterior PDF (top right). The uncertainty of the model is shown in the red uncertainty ellipse. The blue circle represents the best model. The posterior PDF then becomes the prior PDF (bottom left) when more data become available (bottom center). The next posterior PDF (bottom right) has a smaller uncertainty ellipse and a slightly different optimal model.

> Validation modes. Prototype software developed by Schlumberger includes an interactive mode in which the user assesses the effects of interpretation decisions on reservoir models. In this case, the center of the upper panel shows predicted seismic data as dotted lines and measured data as solid lines after the upper horizon, shown in green to the left, has been moved. The lower panel shows a better fit between the predicted and measured data (center) and the model uncertainty in the ellipse to the right.

Limitations of Reservoir Models

Generating and fine-tuning the model entail close collaboration by the reservoir team. As in other phases of exploration and production, such as geological and geophysical interpretation or drilling preparations, handing off results from one team member to the next along a chain is less effective than working together from the outset.5 Reservoir teams analyze data and perform simulations more rapidly as their experience increases. Working as a team also ensures that no one gets bogged down in endless tinkering with input parameters to try to obtain a match with production history.

In addition to working interactively, the team must employ consistent methods to ensure that normalization and interpretation are performed properly. If data are not normalized and interpreted consistently, relationships between data might be obscured, such as that between porosity and seismic attributes (see "Model Validation," next page).6

Any uncertainty in the data limits confidence in reservoir models and reservoir simulations. Permeability barriers, pinchouts, faults and other geological features are not always apparent from well, seismic and production data. Their exact locations might be off by tens or hundreds of meters, and their effectiveness as flow barriers or conduits might be miscalculated. Formation thicknesses are usually defined by integrating seismic and averaged well log data, although the resolution of seismic data is on the order of tens to hundreds of feet, whereas well logs show variations at the scale of inches. Under- or overestimating pay thicknesses directly impacts simulation reliability.

Averaging techniques also affect simulation results, especially when reservoir properties are highly variable. Also, problems may occur when averaging fine detail, such as interpretations from well logs, to integrate with data of lower resolution, such as seismic data. For example, a reservoir that consists of several distinct layers with different properties might not behave like a single layer of the same overall thickness and average properties. The uncertainty of many measurements increases dramatically with distance from the wellbore. Even though there is a different level of uncertainty with each data type, proper model validation forces comparisons of independent data and interpretations.

Upscaling, or representing the data at a common scale, coarsens the fine-scale reservoir description in the shared earth model to the degree that a computer can cope with it (below). This step usually reduces the number of cells, or subdivisions, of the reservoir model. Horizontal upscaling in the absence of horizontal wellbores is typically simpler because there is generally less fine detail in seismic data, whereas vertical upscaling is complicated by the greater amount of detail available at the wellbores. Thickness and porosity, whose variations typically follow simple, linear averaging laws, are less prone to upscaling problems than permeability. A two-layer system in which one layer has zero permeability, for instance, does not behave like a single layer with the average permeability of the two layers. The reservoir model must be built around such impermeable layers.
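The contrast between flow along and across layers shows why permeability does not follow a simple linear averaging law. The sketch below is a minimal illustration of the standard arithmetic average (flow parallel to layers) and harmonic average (flow across layers) for a layered stack; the layer values are hypothetical, and a small number stands in for zero permeability to avoid division by zero.

```python
import numpy as np

def arithmetic_average(k, h):
    """Effective permeability for flow parallel to layers (thickness-weighted)."""
    return np.sum(k * h) / np.sum(h)

def harmonic_average(k, h):
    """Effective permeability for flow perpendicular to layers."""
    return np.sum(h) / np.sum(h / k)

# Hypothetical two-layer system: one permeable layer, one essentially impermeable layer
k = np.array([500.0, 1e-6])   # permeability, md
h = np.array([10.0, 10.0])    # thickness, ft

print("parallel to layers:      %.1f md" % arithmetic_average(k, h))    # about 250 md
print("perpendicular to layers: %.6f md" % harmonic_average(k, h))      # effectively zero
```

The near-zero vertical average shows why such a layer acts as a barrier and must be represented explicitly rather than averaged away.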

No matter how carefully a model is prepared and simulation performed, the dynamics of production might affect the reservoir in ways that reservoir simulation might not predict. History matching, or comparing actual production volumes and measured pressures with predictions from simulations, is the most common method for judging the quality of the reservoir model. The assumption is made that if the model yields a simulation that matches past production, then the model is more likely to be a useful tool for forecasting.7 Certainly, a model that does not match past production history or reservoir response to past production is unlikely to correctly predict future production.
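History-match quality is often summarized with a weighted least-squares misfit between observed and simulated quantities; the expression below is a generic sketch of such an objective function, not a formula from the article.

```latex
O(\mathbf{m}) = \sum_{i=1}^{N}
\left( \frac{d_i^{\,\mathrm{obs}} - d_i^{\,\mathrm{sim}}(\mathbf{m})}{\sigma_i} \right)^{2}
```

Here the d_i are observed rates or pressures, d_i^sim(m) are the simulator predictions for model m, and the sigma_i are assumed measurement uncertainties; a lower O(m) means a better match, but, as the text notes, a good match alone does not guarantee a predictive model.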


[Figure elements: classification system; drilling data; well logs; 3D and 4D seismic data; geological modeling; petrophysical modeling; seismic modeling; reservoir simulations; upscaling; simulation model]

> Shared earth model. A numerical representation of the subsurface, housed in a database shared by multidisciplinary team members, allows constant access to and updating of the reservoir model used for simulation. As databases and software improve, the simulation model and the shared earth model, which now must be upscaled before being used as a simulation model, will be the same.


Obtaining a good match between the production history and predictions from simulations is inexpensive in some cases, but can become time consuming when the model is continuously refined and simulated. In certain situations, such as waterfloods, tracers in the form of chlorides, isotopes or brines are introduced into injected water to reveal patterns in the reservoir. Comparisons of these patterns with expected patterns can be used to reevaluate input values, for example porosity, permeability and transmissibility (the ease with which fluid flows from one model cell to another), to improve the history match. Whenever a new well is drilled, it offers an opportunity to check the quality of a reservoir simulation, principally by comparison of observed pressure with the pressure predicted by the model at the drilling location.

The difficulty of simulating a reservoir underscores the need to constrain the reservoir model with all available data. A reservoir model constrained and validated by geological, geophysical and reservoir data before initiating simulation extracts as much information as possible from the data and provides a better result. Also, understanding the range and impact of reservoir uncertainty allows a quantitative and qualitative judgment of the accuracy or range of model predictions.

Model Validation

A data set from the Stratton field of south Texas, USA, demonstrates the value of cross-disciplinary interpretation and model validation in calculating in-situ gas reserves in the Frio formation (above).8

The data include 3D seismic data, logs from nine wells, correlations of geological markers and a vertical seismic profile (VSP). Resistivity, neutron porosity, bulk density, and spontaneous potential, gamma ray or both curves were available for the nine wells. Preliminary examination of the well logs and VSP data guided selection of horizons in the Frio formation for seismic horizon tracking. The VSP provided a good tie between the well and the seismic data along with good understanding of the phase of the seismic data.9

A thin, clean Frio sand that is easy to correlate and ties to a mappable seismic event was selected for both well-by-well analysis and multiwell petrophysical interpretation that ensured consistent analysis of all the logs. The interpreters observed that porous zones seemed to correspond to high seismic amplitude.

5. Galas C: "The Future of Reservoir Simulation," Journal of Canadian Petroleum Technology 36, no. 1 (January 1997): 5, 23.

6. Corbett C: "Improved Reservoir Characterization Through Cross-Discipline Multiwell Petrophysical Interpretation," presented at the SPWLA Houston Chapter Symposium, Houston, Texas, USA, May 18, 1999.

7. This assumption does not always hold. For example, a reservoir model might match the production history even when there is bypassed oil. Additional seismic data might reveal undrained reservoir compartments in this case.

8. Corbett C, Plato JS, Chalupsky GF and Finley RJ: "Improved Reservoir Characterization Through Cross-Discipline Multiwell Petrophysical Interpretation," Transactions of the SPWLA 37th Annual Logging Symposium, New Orleans, Louisiana, USA, June 16-19, 1996, paper WW.

9. Phase refers to the motion of, or means of comparison of, periodic waves such as seismic waves. Waves that have the same shape, symmetry and frequency and that reach maximum and minimum values simultaneously are in phase. Waves that are not in phase are typically described by the angular difference between them, such as 180 degrees out of phase. Zero-phase wavelets are symmetrical in shape about zero time, whereas non-zero-phase wavelets are asymmetrical. Non-zero-phase wavelets are converted to zero-phase wavelets to achieve the best resolution of the seismic data. Known (zero) phase well synthetics and vertical seismic profiles (VSPs) can be compared with local surface seismic data to determine the relative phase of the surface seismic wavelets. Such knowledge allows the surface seismic data to be corrected to zero phase.
For more on combining vertical seismic profiles with other geophysical data: Hope R, Ireson D, Leaney S, Meyer J, Tittle W and Willis M: "Seismic Integration to Reduce Risk," Oilfield Review 10, no. 3 (Autumn 1998): 2-15.

[Workflow elements: data loading; VSP well tie; time horizon interpretation; geologic correlation; petrophysical interpretation; depth grids; attribute extraction; weighted average; reservoir property distribution; model construction]

> Model construction workflow. Once data from the Stratton field were loaded, the team worked together from the outset to correlate well logs, vertical seismic profile (VSP) data and seismic data. Interpreted seismic horizons, depth conversion results and extracted attributes were compared with normalized well log porosities and geologic log correlations. The consistent relationship between the weighted average porosity and seismic amplitude prompted generation of a reservoir property distribution map, a seismic-guided map of porosity distribution in this case, to complete the reservoir model.


To confirm this observation, crossplots of effective porosity and amplitude were prepared. The crossplots of the well-by-well petrophysical analysis showed significant scatter, whereas the multiwell analysis demonstrated a clear relationship between seismic amplitude and effective porosity (left).

Next, an equation that related effective porosity to amplitude was used to generate a map of effective porosity. The mathematical relationship between the weighted average porosity values at the wellheads and the seismic amplitudes at those locations guided the mapping. Combining carefully integrated core porosity, log-derived porosity and seismic attributes (in this case, amplitude) produced a single, validated porosity map constrained by several independent sources of porosity information (next page). Using each type of data in isolation in the Stratton field example obscured relationships between data and probably would have resulted in a set of incompatible subsurface maps that were not physically realistic.
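The article does not give the calibration equation itself, so the sketch below simply assumes a linear fit between multiwell effective porosity at the well locations and the seismic amplitude at the same locations, then applies that fit to the full amplitude grid; the numbers are hypothetical, not the Stratton field values.

```python
import numpy as np

# Hypothetical calibration points: amplitude and multiwell effective porosity at the wells
amp_at_wells = np.array([34.0, 38.0, 43.0, 47.0, 52.0, 58.0, 61.0])
phi_at_wells = np.array([0.10, 0.11, 0.12, 0.13, 0.14, 0.155, 0.165])

# Least-squares linear calibration: phi = c1 * amplitude + c0
c1, c0 = np.polyfit(amp_at_wells, phi_at_wells, 1)

def porosity_from_amplitude(amp_grid):
    """Apply the calibration to a seismic amplitude grid to get a porosity map."""
    return c1 * amp_grid + c0

amp_grid = np.array([[36.0, 50.0], [44.0, 60.0]])   # extracted seismic amplitude map
print(np.round(porosity_from_amplitude(amp_grid), 3))
```

The resulting map is only as good as the consistency of the underlying log analysis, which is the point of the multiwell interpretation described above.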

The difference between the single-well analytical approach and the consistent, normalized petrophysical analysis in the Stratton field affects the economic evaluation of the reservoir. The single-well approach precluded integrating the well logs with the seismic data to generate a seismic-guided porosity map because the crossplot of effective porosity and amplitude indicated no consistent relationship between the well logs and seismic data. The in-situ gas volume calculated by single-well petrophysical analysis is 12% greater than that calculated from the validated, seismic-guided porosity distribution. An overstated gas volume might lead to unnecessary infill drilling.

In another case, offshore Malaysia, 3D seismic data, well logs, wellbore image logs and core data enabled generation of time-depth relationships and synthetic seismograms to tie logs to seismic data.10 The relationship between effective porosity, seismic amplitude and acoustic impedance, expressed as a calibration function, allowed prediction of effective porosity throughout the 3D seismic data, similar to the previous Stratton field example. Additional data, such as pressure measurements from wireline tools or well tests, make the reservoir model more robust and improve confidence in the predictions from simulation.11

> Single-well versus multiwell interpretation. Well-by-well petrophysical analysis (top) obscures the relationship between porosity and seismic amplitude. In this example from the Stratton field, the plot of effective porosity versus seismic amplitude shows considerable scatter around the line of best fit because the well logs were not analyzed consistently. The relationship between seismic amplitude and porosity is clear when the logs are normalized and consistent analytical methods are used (bottom). The observed relationship between seismic amplitude and effective porosity allowed interpreters to use the seismic data to generate a map of effective porosity.


Model Manipulation

Because simulation inputs are subject to revision by the project team to improve the match between the simulation and production history, it is important to restrict the input model as much as the data permit and avoid unnecessary adjustments of input values. Simulation software typically allows interpreters to change not only the geological and geophysical maps used to build a reservoir model, but also variables such as pressure, temperature, fluid composition and saturation, permeability, transmissibility, skin, productivity index and rock compressibility. Seasoned interpreters have different opinions about what changes to simulation inputs are acceptable, but prudently adjusting simulation input parameters often improves the history match.

Simulation experts use a three-stage approach to fine-tune a reservoir model, beginning with the energy balance, then an adjustment for multiple fluid phases, and finally the well productivity. The energy balance stage accounts for the reservoir pressure. The relative permeabilities of different fluid phases are adjusted in the second stage. The final step uses recent productivity test data, such as bottomhole flowing pressure, tubing surface pressure and total fluid production rate, to further improve the history match.

Reservoir thickness values typically are constrained by seismic data and well logs, but are wrong if the interpreter tracks seismic horizons incorrectly, if logs and seismic data are not tied properly, or if well logs are off-depth or miscorrelated. Poor-quality seismic data, a common problem in structurally complex areas, can hamper horizon tracking. Reprocessing can improve seismic data quality.

The depth to the reservoir should also be well constrained if log and seismic data are interpreted diligently. Comparing well logs or synthetic seismograms generated from logs with seismic data improves depth conversion. Additional data, such as VSPs, also tend to improve depth conversion. In structurally complex areas, however, depth-based processing from the outset is preferable to depth conversion. The enhanced integrity that validation brings to a depth-converted structure map (that is, a map displayed in units of depth rather than the seismic unit of time) is demonstrated by integrating dipmeter data with the structure map or by comparing depth-converted seismic sections to dip interpretations from wellbore images, such as FMI Fullbore Formation MicroImager or ARI Azimuthal Resistivity Imager logs (left).

10. Corbett C, Solomon GJ, Sonrexa K, Ujang S and Ariffin T: "Application of Seismic-Guided Reservoir Property Mapping to the Dulang West Field, Offshore Peninsular Malaysia," paper SPE 30568, presented at the SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 22-25, 1995.

11. For another example of seismic-guided property mapping: Hart BS: "Predicting Reservoir Properties from 3-D Seismic Attributes With Little Well Control: Jurassic Smackover Formation," AAPG Explorer 20, no. 4 (April 1999): 50-51.


> Seismic-guided porosity distribution. In the Stratton field of south Texas, USA, a clear relationship between effective porosity and seismic amplitude permitted seismic-guided mapping of effective porosity. This map could not have been created without consistent, multiwell petrophysical analysis. Yellow and orange represent areas of high seismic amplitude; blue represents low amplitude.

By interpreting seismic data, well logs and wellbore images together rather than independently, the interpreter ensures that the final structure map has been rigorously checked.
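One common way to tie logs to seismic data for the depth check described above is to build a synthetic seismogram from log-derived impedance and compare it with the seismic trace at the well. The sketch below assumes a simple normal-incidence convolutional model with a Ricker wavelet; the blocked log values and the wavelet frequency are hypothetical.

```python
import numpy as np

def ricker(f, dt, length=0.128):
    """Zero-phase Ricker wavelet of peak frequency f (Hz), sampled at dt (s)."""
    t = np.arange(-length / 2, length / 2, dt)
    return (1 - 2 * (np.pi * f * t) ** 2) * np.exp(-(np.pi * f * t) ** 2)

def synthetic_seismogram(velocity, density, f=30.0, dt=0.002):
    """Normal-incidence synthetic trace from log-derived velocity and density."""
    impedance = velocity * density
    # Reflection coefficients at layer boundaries
    rc = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])
    return np.convolve(rc, ricker(f, dt), mode="same")

# Hypothetical blocked log: velocity (ft/s) and density (g/cm3) per time sample
velocity = np.array([9000.0] * 50 + [11000.0] * 30 + [9500.0] * 50)
density = np.array([2.30] * 50 + [2.45] * 30 + [2.35] * 50)

trace = synthetic_seismogram(velocity, density)
print(trace.shape, trace.max())   # compare this trace with the seismic data at the well
```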

Though typically not introduced in the large-scale model-construction phase, information about reservoir fluids can offer important insight. Formation tester data, such as MDT Modular Formation Dynamics Tester results, indicate the location of a fluid contact. This information, combined with well log and seismic data, yields a more constrained starting model for simulation.

Other fluid information may be used to constrain the fine-scale model in the vicinity of the wellbore. Perforation locations are considered known, but the effectiveness of the perforations may be evaluated with production logs, and changes in fluid saturations monitored with RST Reservoir Saturation Tool or TDT Thermal Decay Time data.

The ratio of net pay to gross pay can vary widely across a reservoir, but like other simulation input values, should not be altered during the history-matching stage without good reason. The net-to-gross ratio might be adjusted if supported by drilling results, such as well logs and cores, or production logs.

Permeability values are obtained in several ways, including core and log analysis and well tests, so comparisons of the values from each of these approaches can limit the range of input values, at least at well locations. Effective assimilation of wellbore image logs, probe permeability data and core data allows characterization of horizontal permeability near the wellbore and prediction of vertical permeability.12 Saturation values, established through well log analysis, are verified by capillary pressure data from special core analysis, wireline formation tester results or RST measurements. All of these input parameters, within reason, are considered adjustable by simulation experts.



> Confirming depth conversion. Dipmeter data reduce interpretive contouring options for structure maps if the mapper honors the data (top). Dipmeter data from the depth of interest, plotted at each well, show reasonable conformity with structure contours in the upper right and lower sections of the map, but refute the contouring of the upper left area in this fictitious example. Dip interpretation from an image log, tied to an actual depth-converted seismic section, confirms dip direction and magnitude at horizons of interest (bottom). The color variation in the seismic section represents acoustic impedance.


12. Thomas S, Corbett P and Jensen J: "Permeability and Permeability Anisotropy Characterisation in the Near Wellbore: A Numerical Model Using the Probe Permeameter and Micro-Resistivity Image Data," Transactions of the SPWLA 37th Annual Logging Symposium, New Orleans, Louisiana, USA, June 16-19, 1996, paper JJJ.

13. Crombie A, Halford F, Hashem M, McNeil R, Thomas EC, Melbourne G and Mullins OC: "Innovations in Wireline Fluid Sampling," Oilfield Review 10, no. 3 (Autumn 1998): 26-41.

14. For more on skin: Hegeman P and Pelissier-Combescure J: "Production Logging for Reservoir Testing," Oilfield Review 9, no. 2 (Summer 1997): 16-20.


Multidisciplinary validation of reservoir models increases the value of data beyond the cost of data-gathering activities alone. In 1998, for example, Geco-Prakla acquired 3D multicomponent seismic data for Chevron in the Alba field in the North Sea. The objectives of the survey were to better image the sandstone reservoir, identify intrareservoir shales that affect movement of injected water and map waterflood progress. After integration of the new shear-wave data to improve the reservoir model, two additional wells were drilled in the field. The first well is producing up to 20,000 B/D [3200 m3/d]; the second well is being completed and has resulted in the discovery of Alba's highest net sand. Both wells have confirmed some of the features observed on the converted-wave data. Because the first well was drilled less than a year after seismic acquisition started, Chevron felt the new data arrived in time to make a significant commercial impact on the field's development.17

Some reservoir engineers minimize adjustments to PVT samples, which indicate reservoir fluid composition and behavior at the pressure, volume and temperature conditions of the reservoir. In cases of surface recombination of the sample or sample collection long after initial production, however, the engineer might decide to adjust PVT values. At the other extreme, production rates and volumes, and pressure data from wells are considered inalterable by some experts, although exceptions are made at times, such as when production measurement equipment fails. Many experts choose to honor the most accurate representation of production data.

Placing restrictions on the alteration of input values makes a good history match from simulation more elusive, but many input values may be adjusted during simulation. Transmissibility is computed by the simulator using the input porosity and permeability. A high computed transmissibility value can be overridden if well tests, formation tester data or seismic data provide evidence of separate sand bodies, stratigraphic changes, faults or other types of reservoir compartmentalization. Differences in fluid chemistry or pressure from one well to another also suggest reservoir compartmentalization. In-situ fluid samples obtained from the OFA Optical Fluid Analyzer component of the MDT tool are uncontaminated and can be brought to surface without changing phase for chemical analysis.13
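For reference, the geometric part of the transmissibility between two neighboring grid cells is commonly written as a series combination of the half-cell conductances; the form below is a generic sketch of a two-point flux approximation, not a formula quoted from the article.

```latex
T_{i+1/2} = \frac{A_{i+1/2}}{\dfrac{\Delta x_i}{2k_i} + \dfrac{\Delta x_{i+1}}{2k_{i+1}}}
```

Here A is the shared face area, the Delta x are the cell sizes and the k are the cell permeabilities; a sealing feature at the interface is represented by setting the transmissibility to zero, or by a multiplier, rather than by averaging the cell properties.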

Production logs, well tests and pressure transient analyses indicate skin, which is a dimensionless measure of the formation damage frequently caused by invasion of drilling fluids or perforation residue.14 When the location, penetration and effectiveness of perforations are of concern, production logs provide information that may affect the model input for skin. If a field is located in a geological trend of similar accumulations, skin values in the trend might be a useful starting assumption if data within the field are initially scarce.

15. For more on the shared earth model and integrated interpretation: Beardsell M, Vernay P, Buscher H, Denver L, Gras R and Tushingham K: "Streamlining Interpretation Workflow," Oilfield Review 10, no. 1 (Spring 1998): 22-39.

16. Major MJ: "3-D Gets Heavy (Oil) Duty Workout," AAPG Explorer 20, no. 6 (June 1999): 26-27.
O'Rourke ST and Ikwumonu A: "The Benefits of Enhanced Integration Capabilities in 3-D Reservoir Modeling and Simulation," paper SPE 36539, presented at the SPE Annual Technical Conference and Exhibition, Denver, Colorado, USA, October 6-9, 1996.
Sibley MJ, Bent JV and Davis DW: "Reservoir Modeling and Simulation of a Middle Eastern Carbonate Reservoir," SPE Reservoir Engineering 12, no. 2 (May 1997): 75-81.

17. For more on the Alba field survey and shear-wave seismic data: MacLeod MK, Hanson RA, Bell CR and McHugo S: "The Alba Field Ocean Bottom Cable Seismic Survey: Impact on Development," paper SPE 56977, prepared for presentation at the 1999 Offshore European Conference, Aberdeen, Scotland, September 7-9, 1999.
Caldwell J, Christie P, Engelmark F, McHugo S, Özdemir H, Kristiansen P and MacLeod M: "Shear Waves Shine Brightly," Oilfield Review 11, no. 1 (Spring 1999): 2-15.

Adjustment of the productivity index, another input parameter, affects the quality of a history match. The productivity index, often expressed in units of B/D/psi or Mcf/D/psi, is a measure of how much a well is likely to produce. If the skin value is known, the productivity index, usually computed from model inputs that include the skin, can be computed more accurately. When differing stimulation or completion techniques among field wells are used, productivity index values often vary from well to well. For example, hydraulic fracturing of a single well in a field might enhance permeability and, therefore, productivity of that well alone.
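As a point of reference, one standard textbook expression for the productivity index of an oil well in pseudosteady-state radial flow (oilfield units: k in md, h in ft, mu in cp, B in RB/STB) shows how skin s enters the calculation; this is not a formula given in the article.

```latex
J = \frac{q}{\bar{p} - p_{wf}}
  = \frac{k\,h}{141.2\,B\,\mu\left(\ln\dfrac{r_e}{r_w} - \dfrac{3}{4} + s\right)}
```

A positive skin lowers J, so a measured skin value constrains the productivity index used in the model rather than leaving it as a free history-matching knob.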

The options available to change a reservoir model to improve the match between one simulation run and a field's production history might appear endless. At some point, practical limits on data collection, computational power and time for modifying input parameters curtail simulation iterations. Independent analyses that support the interpretations of other team members increase confidence in reservoir simulation. A proper simulation workflow helps accomplish this goal. Working as a team ensures that all data are used to validate the reservoir model.

Validating interpretations and models across disciplines addresses complex problems that are difficult to solve within the confines of a single discipline. A multidisciplinary team sharing a database and iteratively validating and updating shared earth models, or geomodels, achieves this goal.15 Operating companies report increases on the order of 10% in predicted ultimate recovery through proper data integration, simulation and reservoir development.16 Cycle time also decreases in many cases, probably because of ready access to data and interpretations for everyone involved in the project.


Modeling for Data Acquisition

In addition to the general question of determining how best to produce a reservoir, simulation can demonstrate the best time to acquire additional data. Time-lapse (4D) seismic surveys, which are repeated 3D surveys, can be acquired at optimal times predicted by careful model construction and simulation.18 As oil and gas are produced from a reservoir, the traveltime, amplitude and other attributes of reflected seismic waves change; simulation can demonstrate when these changes become visible.19 Because 4D seismic data acquisition, processing and interpretation can cost millions of dollars, it is critical to determine from modeling studies whether reservoir variations will be discernible in the new survey. This type of prediction differs from routine simulation techniques.

Traditionally, reservoir engineers have been interested in matching simulated well production with actual production data. These data come from one point in space, the wellhead, but are nearly continuous in time. There are other types of history matching, though. Seismic data represent a single point in time but offer almost complete spatial coverage of the reservoir. Also, the modeled and observed parameters are different; seismic amplitudes are of concern rather than fluid pressures, for example.

To perform seismic history matching, first, the seismic response to a saturated reservoir is modeled. After some period of production, the seismic response to the depleted reservoir is calculated. The seismic response might be complex and include a combination of changes in amplitude, phase, attenuation and traveltime. The initial and depleted reservoir responses differ because the composition of the pore fluids and the reservoir pressure change during production, both of which affect the seismic velocity of the reservoir. The synthetic responses are compared with recorded seismic data: the initial 3D survey and the subsequent repeated survey. The difference in seismic character of the reservoir from the initial survey to the later survey is a function of the compressional (P) and shear (S) seismic velocities and is interpreted as a change in fluid content and pressure.
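A quick feasibility check of this kind is often made with Gassmann fluid substitution, which predicts how the saturated bulk modulus, and hence the P-wave velocity, changes when one pore fluid replaces another. The sketch below is a minimal illustration; the dry-rock, mineral and fluid moduli are hypothetical inputs, not values from the article, and pressure effects on the rock frame are ignored.

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fluid, phi):
    """Saturated bulk modulus from Gassmann's equation (all moduli in GPa)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fluid + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def vp(k_sat, mu, rho):
    """P-wave velocity (m/s) from bulk and shear moduli (GPa) and density (g/cm3)."""
    return np.sqrt((k_sat + 4.0 * mu / 3.0) * 1e9 / (rho * 1000.0))

# Hypothetical sand: dry-rock and mineral bulk moduli, shear modulus, porosity
k_dry, k_min, mu, phi = 6.0, 36.0, 5.0, 0.28

# Oil-saturated versus gas-saturated pore fluid (hypothetical fluid properties)
k_oil, rho_oil = 1.0, 2.15    # fluid bulk modulus (GPa), saturated rock density (g/cm3)
k_gas, rho_gas = 0.05, 2.05

vp_oil = vp(gassmann_ksat(k_dry, k_min, k_oil, phi), mu, rho_oil)
vp_gas = vp(gassmann_ksat(k_dry, k_min, k_gas, phi), mu, rho_gas)
print("Vp oil-saturated: %.0f m/s, gas-saturated: %.0f m/s" % (vp_oil, vp_gas))
```

If the predicted velocity change is too small to alter traveltimes or amplitudes measurably, a repeat survey at that time is unlikely to be worthwhile.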

Multicomponent seismic data and amplitude variation with offset (AVO) analysis of compressional-wave data both reduce the ambiguity of distinguishing the effects of pressure changes from the effects of pore fluids. Without AVO processing, ordinary marine 3D seismic data are fundamentally ambiguous because they respond to compressional waves only, but compressional waves respond to both pressure and saturation. Multicomponent seismic data separate compressional and shear components, allowing the interpreter to separate saturation effects and pressure effects that influence porosity, because shear waves do not respond to pore fluids.20

Interpretation of a seismic response change from an initial seismic survey to a repeated survey enables detection and spatial calibration of additional faults, movement of oil-water contacts and gas coming out of solution. Small faults in the reservoir section are often visible as linear features of decreased amplitude in a seismic amplitude or coherency plot. Sealing faults also appear as patches of undrained hydrocarbons whose well-defined edges represent the fault. Movements of oil-water contacts are visible as changes in amplitude and possibly as flat spots.


18. Gawith DE and Gutteridge PA: "Seismic Validation of Reservoir Simulation Using a Shared Earth Model," Petroleum Geoscience 2, no. 2 (1996): 97-103.

19. Pedersen L, Ryan S, Sayers C, Sønneland L and Veire HH: "Seismic Snapshots for Reservoir Monitoring," Oilfield Review 8, no. 4 (Winter 1996): 32-43.

20. In the case of the Magnus field, located on the UK continental shelf, the effect of pressure changes on time-lapse seismic data is greater than the effect of changes in pore fluids. For more information: Watts GFT, Jizba D, Gawith DE and Gutteridge P: "Reservoir Monitoring of the Magnus Field through 4D Time-Lapse Seismic Analysis," Petroleum Geoscience 2, no. 4 (November 1996): 361-372.

    21. Pedersen et al, reference 19: 42.


> Forward modeling to optimize data acquisition. Predicted properties of seismic data at time t2 (top left) are used to predict the appearance of seismic data (middle left). These predictions are revisited after acquisition of actual seismic data at time t2 (bottom left). Seismic properties at time t3 (top right) are predicted next from actual t2 seismic data. By considering fluid changes in the reservoir and their effects on seismic waves, and then modeling the seismic data that would result from surveying at time t3 (middle right), additional seismic surveys for reservoir monitoring will be acquired at the optimal time t3 (bottom right).


When the reservoir pressure drops below the bubblepoint and gas comes out of solution, P-wave seismic data typically show a significant brightening of seismic amplitude. Multicomponent seismic data, which include shear-wave data, show no brightening because the shear waves do not respond to pore fluids. Such a response confirms the presence of gas.

Seismic history matching has benefited reservoir management decisions in the Gullfaks field, where interpretation of 4D seismic data indicated the existence of previously unseen sealing faults and the presence of bypassed oil. The faults themselves were not seen on the new data, but were interpreted where fluid content changed abruptly in geophysical maps showing the time-lapse results. After fluid transmissibility across faults was reevaluated, the potential for bypassed pay supported drilling an additional well.21 That well, drilled by Statoil, initially produced 12,000 B/D [1900 m3/d] from a formerly undrained compartment and confirmed the presence of a gas cap predicted by seismic data. Statoil also reentered and sidetracked an abandoned well and produced 6300 B/D [1000 m3/d].

Currently, scientists at Schlumberger Cambridge Research, Cambridge, England, are combining information on the distribution of fluids, pressure and other properties from the reservoir simulator with rock properties from the geomodel, or earth model, to generate a forward model of seismic response (previous page). In particular, the porosity, bulk modulus and shear modulus from the geomodel are combined with saturation and pressure information from the simulator. The forward model provides information about the elastic moduli and density of the reservoir, from which P-wave velocity, density and acoustic impedance, or other properties, can be derived.
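As an illustration of that chain of calculations, the sketch below converts bulk modulus, shear modulus and density into P-wave velocity and acoustic impedance, computes normal-incidence reflection coefficients, and convolves them with a wavelet to predict a seismic trace. The layer properties, boundary times and wavelet frequency are assumed values for illustration, not taken from the Foinaven or Gullfaks models.

```python
# Minimal sketch (assumed layer properties): from elastic moduli and density
# to P-wave velocity, acoustic impedance and a normal-incidence synthetic trace.
import numpy as np

# Assumed per-layer bulk modulus K (GPa), shear modulus mu (GPa), density rho (g/cm3)
layers = np.array([
    # K,   mu,   rho
    [9.0,  4.0,  2.25],   # shale
    [5.7,  2.0,  2.10],   # oil-filled sand
    [3.1,  2.0,  1.98],   # same sand with free gas (from a Gassmann-style substitution)
    [9.0,  4.0,  2.25],   # shale
])
K, mu, rho = layers.T
vp = np.sqrt((K + 4.0 * mu / 3.0) / rho)          # km/s (GPa over g/cm3)
ai = rho * vp                                     # acoustic impedance

# Normal-incidence reflection coefficients at each layer boundary
rc = (ai[1:] - ai[:-1]) / (ai[1:] + ai[:-1])

# Place the reflectivity in two-way time (assumed boundary times, 2-ms sampling)
dt = 0.002
trace = np.zeros(250)
trace[[60, 90, 120]] = rc

# Convolve with a 30-Hz Ricker wavelet to predict the seismic response
t = np.arange(-0.05, 0.05, dt)
f0 = 30.0
ricker = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)
synthetic = np.convolve(trace, ricker, mode="same")

print("Acoustic impedances:", np.round(ai, 2))
print("Reflection coefficients:", np.round(rc, 3))
```

Running the same calculation with property sets for successive times yields the predicted sections that are then compared with the acquired surveys.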

Next, the synthetic seismic data are compared with actual seismic data. Numerous comparisons can be made between vintages of seismic data, such as maps of seismic attributes, prestack gathers for AVO studies and so on, but each approach has the common goal of determining the area of mismatch between predicted and recorded seismic data and analyzing the reasons for the differences.
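One simple form of such a comparison is sketched below, using synthetic arrays in place of real predicted and recorded amplitude cubes: for every trace, the root-mean-square difference over a reservoir time window is mapped and normalized, and traces exceeding a threshold flag areas where the model needs revisiting. The survey geometry, window and threshold are assumptions for illustration only.

```python
# Minimal sketch (synthetic stand-in data, assumed geometry): map where
# predicted and recorded seismic volumes disagree over a reservoir window.
import numpy as np

n_inline, n_xline, n_samples = 100, 80, 251        # assumed survey geometry
rng = np.random.default_rng(0)

# Stand-ins for the model-predicted and field-recorded amplitude cubes
predicted = rng.normal(size=(n_inline, n_xline, n_samples))
recorded = predicted + 0.1 * rng.normal(size=predicted.shape)
recorded[40:60, 30:50, 120:140] += 0.8             # a patch the model failed to predict

window = slice(110, 150)                           # reservoir interval, in samples
diff = recorded[..., window] - predicted[..., window]
rms_mismatch = np.sqrt(np.mean(diff ** 2, axis=-1))

# Normalize by the energy of the recorded data so maps from different vintages compare
rms_recorded = np.sqrt(np.mean(recorded[..., window] ** 2, axis=-1))
mismatch_map = rms_mismatch / rms_recorded

# Traces exceeding the threshold flag areas where the reservoir model
# (fluids, pressure, faults) should be revisited
flagged = np.argwhere(mismatch_map > 0.5)
print(f"{len(flagged)} traces exceed the mismatch threshold")
```

In practice the two volumes are usually cross-equalized in bandwidth, amplitude and timing first, so that the residual differences reflect the reservoir rather than acquisition or processing effects.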

One major challenge in interpreting an observed time-lapse seismic response is its non-uniqueness. For example, an observed change in amplitude might represent a change in saturation of oil or free gas, a change in the amount of gas dissolved in the oil, a change in pressure, or, most likely, a combination of these. Clearly, it is important to know which of these factors are significant in the reservoir, and seismic modeling can help determine this.

In the Foinaven study area, West of Shetlands, UK, normal-faulted, layered turbidite reservoirs form separate reservoir compartments (below).

> Foinaven field. Located West of Shetlands (left), the Foinaven field produces from four main turbidite reservoirs. The reservoir map (top right) shows gas caps in red and the strike of the normal faults as black lines. The platforms and well locations are shown in black. The Foinaven study area is indicated by the blue box. The cross section (bottom right), which extends from south of platform DC1 to north of platform DC2, shows the layered reservoirs, compartmentalized by normal faulting, that must be drained by carefully constructed directional wells.

A preproduction baseline 3D survey was acquired in 1995 and a repeat survey was shot in 1998 after 10 months of oil production. The amplitude brightening in the synthetic seismic data indicated that development of gas caps would be visible as the reservoir pressure dropped and gas came out of solution during production (above). The repeat survey verified the existence of the expanded gas caps as high-amplitude events (left). Because seismic property modeling predicted a good match between the baseline survey and the repeat survey, with synthetic seismic sections from the simulator model matching real seismic data at the appropriate times, the reservoir team had more confidence in the reservoir simulator model, both in this area and, by extrapolation, elsewhere in the field. As a result, additional 4D seismic data will be acquired across Foinaven and the neighboring Schiehallion field to validate the reservoir models and ultimately support the placement of a number of infill production and injection wells.

Correct timing of data acquisition maximizes the value of the data. Careful study before repeat 3D survey acquisition can ensure optimal acquisition timing, which is field-dependent. In the Foinaven case, timely acquisition of 4D seismic data helped determine where the reservoir was connected or segmented and where flow barriers existed so the team could select optimal infill and water-injection well placement.


> Visible changes in repeated surveys. Cross-sectional synthetic seismic displays for the 1995 baseline survey and the 1998 repeated survey (left) show the development of a gas cap. The actual seismic sections confirm the predictions from seismic modeling (right).

> Amplitude changes. Map views of the 1995 baseline survey and the 1998 repeat survey clearly display the changes in seismic amplitude that result from gas cap development. The small gas caps in the original survey and the synthetic data shown above it (top) enlarged significantly after 10 months of production (bottom). The oil-water contact (OWC) remains consistent between the two surveys.


Future Possibilities

Reservoir simulation has already helped oil and gas producers increase predicted ultimate recovery, and further improvement is likely. In addition to ongoing software and shared earth model enhancements, reservoir monitoring with downhole sensors, 4D seismic surveys or other methods is becoming increasingly cost-effective, particularly when new data are acquired at optimal times (above).22

A prudent, efficient team that works together to develop a field reduces cycle time and expense. Sharing data and interpretations allows the team to maximize the value of its achievements for more realistic reservoir simulation and improved understanding of the reservoir. These in turn advance reservoir management, reserve recovery and project economics. Currently, this method relies heavily on the team's motivation to work together. Software provides strong support, but is not yet fully integrated to handle the complete spectrum of oilfield data simultaneously.

In the future, new software that validates the shared earth model will incorporate measured uncertainties of data and interpretations. As the model is refined with the capture of new data, any change in uncertainty will be addressed automatically. Forward modeling will further reduce uncertainty and risk, and maximize the value of additional data. If the shared earth model is consistently updated and new data and interpretations are incorporated, project team members will have another tool to better cope with increases in both the volume of data and the productivity expectations of their companies.

    GMG

22. Watts et al, reference 20.

> Future reservoir management. (Diagram labels: reservoir characterization, interpretive data, reservoir modeling, flow simulation, history matching, uncertainty analysis and risk management, development and production planning, development scenario, production and reserves forecasting, field implementation, reservoir development, production, reservoir performance, reservoir monitoring and control.) Reservoir optimization is an iterative process that normally begins with reservoir characterization of a new discovery, but can be implemented at any stage in an existing field. Reservoir management will rely increasingly on monitoring and modeling reservoir performance to optimize oil and gas production. The key additional element will be ongoing collection of data at the reservoir scale, including seismic data and wellbore measurements, so that the development plan can be assessed and, where necessary, modified. Monitoring the reservoir closely will overcome the current problems of history matching using only the loose constraints of production data.