DISTINGUISHED AUTHOR SERIES

Probabilistic Subsurface Forecasting: What Do We Really Know?

Martin Wolff, SPE, Hess Corporation

Martin Wolff, SPE, is a senior reservoir engineering advisor for Hess Corporation. He earned a BS degree in electrical engineering and computer science and an MS degree in electrical engineering from the University of Illinois, and a PhD degree in petroleum engineering from the University of Texas at Austin. Previously, Wolff worked for Schlumberger, Chevron, Fina, Total, and Newfield. He has served as a Technical Editor and Review Chairperson for SPE Reservoir Evaluation & Engineering and on steering committees for several SPE Forums and Advanced Technology Workshops.

Copyright 2010 Society of Petroleum Engineers. This is paper SPE 118550. Distinguished Author Series articles are general, descriptive representations that summarize the state of the art in an area of technology by describing recent developments for readers who are not specialists in the topics discussed. Written by individuals recognized as experts in the area, these articles provide key references to more definitive work and present specific details only to illustrate the technology. Purpose: to inform the general readership of recent advances in various areas of petroleum engineering.

Abstract

The use of single-valued assessments of company portfolios and projects continues to decline as the industry accepts that strong subsurface uncertainties dictate an ongoing consideration of ranges of outcomes. Exploration has pioneered the use of probabilistic prospect assessments as the norm, in both majors and independents. Production has lagged, in part because of the need to comply with US Securities and Exchange Commission (SEC) reserves-reporting requirements that drive a conservative deterministic approach.

Look-backs continue to show the difficulty of achieving a forecast within an uncertainty band, as well as the difficulty of establishing what that band should be. Ongoing challenges include identifying relevant static and dynamic uncertainties, efficiently and reliably determining ranges and dependencies for those uncertainties, incorporating production history (brownfield assessments), and coupling subsurface with operational and economic uncertainties.

Despite these challenges, none of which are fully resolved, a systematic approach based on probabilistic principles [often including design-of-experiment (DoE) techniques] provides the best auditable and justifiable means of forecasting projects and presenting decision makers with a suitable range of outcomes to consider.

Introduction

Probabilistic subsurface assessments are the norm within the exploration side of the oil and gas industry, both in majors and independents (Rose 2007). However, in many companies, the production side is still in transition from single-valued deterministic assessments, sometimes carried out with ad-hoc sensitivity studies, to more-rigorous probabilistic assessments with an auditable trail of assumptions and a statistical underpinning. Reflecting these changes in practices and technology, recently revised SEC rules for reserves reporting (effective 1 January 2010) allow the use of both probabilistic and deterministic methods, in addition to allowing the reporting of reserves categories other than proved. This paper presents some of the challenges facing probabilistic assessments, along with practical considerations for carrying out the assessments effectively.

Look-Backs

Calibrating Assessments

Look-backs continue to show the difficulty of achieving a forecast within an uncertainty band, along with the difficulty of establishing what that band should be. Demirmen (2007) reviewed reserves estimates in various regions over time and observed that estimates are poor and that uncertainty does not decrease over time.

Otis and Schneidermann (1997) describe a comprehensive exploration-prospect-evaluation system that, starting in 1989, included consistent methods of assessing risk and estimating hydrocarbon volumes, including post-drilling feedback to calibrate those assessments. Although detailed look-backs for probabilistic forecasting methodologies have been recommended for some time (Murtha 1997) and are beginning to take place within companies, open publications on actual fields with details are still rare, possibly because of the newness of the methodologies or because of data sensitivity.

Identifying Subsurface Uncertainties

A systematic process of identifying relevant subsurface uncertainties and then categorizing them can help by breaking down a complex forecast into simple uncontrollable static or dynamic components that can be assessed and calibrated individually (Williams 2006). Nonsubsurface, controllable, and operational uncertainties also must be considered, but the analysis usually is kept tractable by including them later with decision analysis or additional rounds of uncertainty analysis.

Grouping parameters also can reduce the dimensionality of the problem. When parameters are strongly correlated (or anticorrelated), grouping them is justifiable. In fact, not grouping ("Balkanizing") a set of such parameters could cause them to be dropped in standard screening methods such as Pareto charts. For example, decomposing a set of relative permeability curves into constituent parameters such as saturation endpoints, critical saturations, relative permeability endpoints, and Corey exponents can cause them all to become insignificant individually. Used together, relative permeability often remains a dominant uncertainty.
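As a minimal numerical illustration of this screening effect (not from the paper: the sub-parameter correlations, noise levels, and use of t-statistics as the screening measure are all invented assumptions), consider the following Python sketch:

```python
# Sketch: splitting one strongly correlated parameter group into
# sub-parameters dilutes each one's apparent significance in a linear
# screening fit. All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 60

# One underlying relative-permeability uncertainty drives three strongly
# correlated sub-parameters (e.g., endpoints and Corey exponents).
driver = rng.normal(size=n)
subs = np.column_stack([driver + 0.1 * rng.normal(size=n) for _ in range(3)])
porosity = rng.normal(size=n)
response = 2.0 * driver + 1.0 * porosity + 0.2 * rng.normal(size=n)

def t_stats(X, y):
    """Ordinary-least-squares coefficients divided by their standard errors."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]
    resid = y - X1 @ beta
    sigma2 = resid @ resid / (len(y) - X1.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X1.T @ X1)))
    return beta[1:] / se[1:]

# Split: each correlated sub-parameter looks individually weak (low |t|).
print("split t:", np.round(t_stats(np.column_stack([subs, porosity]), response), 1))
# Grouped: the combined parameter is clearly significant.
print("grouped t:", np.round(t_stats(np.column_stack([subs.mean(axis=1), porosity]), response), 1))
```

The same dilution appears as short bars on a Pareto chart, which is how the individually split parameters can be screened out by mistake.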

Assigning Ranges to Uncertainties

Once the uncertainties have been identified, ranges for each must be quantified, which may appear straightforward but contains subtle challenges. Breaking down individual uncertainties into components (e.g., measurement, model, or statistical error) and carefully considering portfolio and sample-bias effects can help create reasonable and justifiable ranges.

Some uncertainties, especially geological ones, are not handled easily as continuous variables. In many studies, several discrete geological models are constructed to represent the spectrum of possibilities. To integrate these models with continuous parameters and generate outcome distributions, likelihoods must be assigned to each model. Although assigning statistical meaning to a set of discrete models may be a challenge if those models are not based on any underlying statistics, the models do have the advantage of more readily being fully consistent scenarios rather than a combination of independent geological-parameter values that may not make any sense (Bentley and Woodhead 1998).
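One common way to honor such likelihoods is to sample the discrete scenario and the continuous parameters together in the Monte Carlo loop. A minimal sketch (the scenario multipliers, probabilities, and distributions below are invented for illustration):

```python
# Sketch: discrete geological scenarios with assigned likelihoods sampled
# alongside continuous uncertainties in one Monte Carlo loop.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

scenario_mult = np.array([0.8, 1.0, 1.3])     # three discrete models' effect on volume
scenario_prob = np.array([0.25, 0.50, 0.25])  # assigned likelihoods (sum to 1)
picks = rng.choice(3, size=n, p=scenario_prob)

base_ooip = rng.lognormal(np.log(100.0), 0.2, n)       # MMstb, continuous
recovery_factor = rng.triangular(0.15, 0.30, 0.45, n)  # continuous

reserves = base_ooip * scenario_mult[picks] * recovery_factor
# Petroleum convention: P90 is the low outcome (90% chance of exceeding).
p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
print(f"P90/P50/P10 = {p90:.1f}/{p50:.1f}/{p10:.1f} MMstb")
```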

As noted previously, validation with analog data sets and look-backs should be carried out when possible because many studies and publications have shown that people have a tendency to anchor on what they think they know and to underestimate the true uncertainties involved. Therefore, any quantitative data that can help establish and validate uncertainty ranges are highly valuable.

Assigning Distributions to Uncertainties

In addition to ranges, distributions must be specified for each uncertainty. There are advocates for different approaches. Naturalists strongly prefer the use of realistic distributions that often are observed in nature (e.g., log-normal), while pragmatists prefer distributions that are well-behaved (e.g., bounded) and simple to specify (e.g., uniform or triangular). In most cases, specifying ranges has a stronger influence on forecasts than the specific distribution shape, which may have little effect (Wolff 2010). Statistical correlations between uncertainties also should be considered, although these too are often secondary effects.
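The following sketch illustrates the range-vs.-shape point: three volumetric inputs are given either uniform or log-normal shapes matched to the same 10th/90th percentiles, and the percentiles of their product often move comparatively little (all ranges and the volumetric product itself are invented):

```python
# Sketch: matching P90/P10 across distribution shapes, then comparing the
# propagated forecast percentiles.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
Z90 = 1.2816  # standard-normal 90th percentile

def uniform_p(lo, hi, size):
    """Uniform samples whose 10th/90th percentiles are lo/hi."""
    w = (hi - lo) / 0.8
    return rng.uniform(lo - 0.1 * w, hi + 0.1 * w, size)

def lognormal_p(lo, hi, size):
    """Log-normal samples whose 10th/90th percentiles are lo/hi."""
    mu = 0.5 * (np.log(lo) + np.log(hi))
    return rng.lognormal(mu, (np.log(hi) - np.log(lo)) / (2 * Z90), size)

ranges = [(40.0, 160.0), (0.15, 0.25), (0.4, 0.7)]  # gross volume, phi, N/G
for name, sampler in [("uniform", uniform_p), ("lognormal", lognormal_p)]:
    product = np.ones(n)
    for lo, hi in ranges:
        product *= sampler(lo, hi, n)
    print(name, np.round(np.percentile(product, [10, 50, 90]), 2))
```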

Uncertainty-to-Forecast Relationships

Once the uncertainties have been identified and quantified, relationships must be established between uncertainties and forecasts. These relationships sometimes can be established from analytical and empirical equations but also may be derived from models ranging from simple material balance through full 3D reservoir simulation. When complex models are used to define relationships, it often is useful to use DoE methods to investigate the uncertainty space efficiently. These methods involve modeling defined combinations of uncertainties to fit simple equations that can act as efficient surrogates or proxies for the complex models. Monte Carlo methods then can be used to investigate the distribution of forecast outcomes, taking into account correlations between uncertainties.
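A minimal end-to-end sketch of this loop follows (the three-factor "simulator," its coefficients, and the uncertainty names are all invented stand-ins):

```python
# Sketch of the DoE -> proxy -> Monte Carlo loop described above.
import itertools
import numpy as np

rng = np.random.default_rng(3)

def simulator(x):
    """Toy stand-in for an expensive reservoir model, evaluated at coded
    levels x (-1..+1) of three uncertainties (say kv/kh, aquifer, skin)."""
    return 100 + 15 * x[0] + 8 * x[1] - 5 * x[2] + 3 * x[0] * x[1]

# 1) Run a two-level design (here a full 2^3 factorial: 8 runs).
design = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))
y = np.array([simulator(x) for x in design])

# 2) Fit a linear proxy y ~ b0 + b1*x1 + b2*x2 + b3*x3 by least squares.
X = np.column_stack([np.ones(len(design)), design])
coef = np.linalg.lstsq(X, y, rcond=None)[0]

# 3) Monte Carlo through the cheap proxy instead of the simulator.
samples = rng.uniform(-1, 1, size=(100_000, 3))
pred = np.column_stack([np.ones(len(samples)), samples]) @ coef
# Petroleum convention: P90 is the low outcome.
print("P90/P50/P10:", np.round(np.percentile(pred, [10, 50, 90]), 1))
```

A real study would replace simulator() with reservoir-model runs and the uniform sampling with the ranges, distributions, and correlations discussed above.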

DoE methods have been used for many years in the petroleum industry. The earliest SPE reference found was from a perforating-gun study by Vogel (1956), and the earliest reservoir work was a wet-combustion-drive study by Sawyer et al. (1974), while early references to 3D reservoir models include Chu (1990) and Damsleth et al. (1992). These early papers all highlight the main advantage of DoE over traditional one-variable-at-a-time (OVAT) methods: efficiency. Damsleth et al. (1992) report a 30 to 40% advantage for D-optimal designs compared with OVAT sensitivities.

For an extensive bibliography of papers showing the pros and cons of different types of DoE and applications of DoE to specific reservoir problems, see the expanded version of this paper, Wolff (2010).

Model Complexity

Given that computational power has increased vastly from the 1950s and 1970s to ever-more-powerful multicore processors and cluster computing, an argument can be made that computational power should not be regarded as a significant constraint for reservoir studies. However, Williams et al. (2004) observe that gains in computational power generally are used to increase the complexity of the models rather than to reduce model run times.

Most would agree with the concept of making things no more complex than needed, but different disciplines have different perceptions regarding that level of complexity. This problem can be made worse by corporate peer reviews, especially in larger companies, in which excessively complex models are carried forward to ensure buy-in by all stakeholders. Highly complex models also may require complex logic to form reasonable and consistent development scenarios for each run.

Finally, the challenge of quality control (QC) of highly complex models cannot be ignored: "garbage in, garbage out" applies more strongly than ever. Launching directly into tens to hundreds of DoE runs without ensuring that a base-case model makes physical sense and runs reasonably well often will lead to many frustrating cycles of debugging and rework. A single model can readily be quality controlled in detail, while manual QC of tens of models becomes increasingly difficult. With hundreds or thousands of models, automatic-QC tools become necessary to complement statistical methods by highlighting anomalies.
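A sketch of what such an automatic-QC pass might look like (the metric, threshold, and corrupted-run indices are assumptions): flag runs whose material-balance error is a robust statistical outlier, and send only those for manual review.

```python
# Sketch: flag anomalous runs among hundreds using a robust outlier test.
import numpy as np

rng = np.random.default_rng(4)
mb_error = rng.normal(0.001, 0.0005, size=200)  # per-run material-balance error
mb_error[[17, 93]] = [0.04, -0.03]              # two deliberately corrupted runs

def flag_outliers(x, k=4.0):
    """Indices more than k robust standard deviations from the median."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) * 1.4826   # robust sigma estimate
    return np.nonzero(np.abs(x - med) > k * mad)[0]

print("runs needing manual QC:", flag_outliers(mb_error))
```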


Proxy Equations

[Fig. 1: Proxy surfaces of varying complexity. Four panels: linear terms only; linear and interaction terms; linear, interaction, and lumped second-order terms; full second-order polynomial.]

Fig. 1 shows proxy surfaces of varying complexity that can be obtained with different designs. A Plackett-Burman (PB) design, for example, is suitable only for linear proxies. A folded Plackett-Burman (FPB) design (an experimental design with double the number of runs of the PB design, formed by adding runs that reverse the plus and minus 1s in the matrix) can provide interaction terms and lumped second-order terms (all second-order coefficients equal). A D-optimal design can provide full second-order polynomials. More-sophisticated proxies can account for greater response complexities, but at the cost of additional refinement simulation runs. These more-sophisticated proxies may be of particular use in brownfield studies, in which a more-quantitative proxy could


be desirable, but may not always add much value to greenfield studies where the basic subsurface uncertainties remain poorly constrained.
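For concreteness, folding a PB design is just a matter of appending the sign-reversed run matrix; a sketch using the standard 12-run PB generator:

```python
# Sketch: build a 12-run Plackett-Burman design, then fold it as described
# above by stacking the sign-reversed runs underneath.
import numpy as np

def plackett_burman_12():
    """12-run PB design for up to 11 two-level factors (cyclic construction)."""
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])  # standard generator
    rows = [np.roll(gen, i) for i in range(11)]
    return np.vstack([rows, -np.ones(11)])                  # plus the all-minus run

pb = plackett_burman_12()
fpb = np.vstack([pb, -pb])                       # folded PB: double the runs
print(pb.shape, fpb.shape)                       # (12, 11) (24, 11)
# PB columns are orthogonal, and the fold preserves that:
print(np.allclose(fpb.T @ fpb, 24 * np.eye(11)))  # True
```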

A recognized problem with polynomial proxies is that they tend to yield normal distributions because the terms are added (a consequence of the central limit theorem). For many types of subsurface forecasts, the prevalence of actual skewed distributions, such as log-normal, has been documented widely. Therefore, physical proxies, especially in simple cases such as the original-oil-in-place (OOIP) equation, have some advantages in achieving more-realistic distributions. However, errors from the use of nonphysical proxies are not necessarily significant, depending on the particular problem studied (Wolff 2010).
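A quick numerical sketch of this central-limit effect (inputs and combination rules are invented): the same uniform inputs give a symmetric distribution when added, as in a linear polynomial proxy, but a positively skewed one when multiplied, as in the OOIP equation.

```python
# Sketch: additive vs. multiplicative combination of identical inputs.
import numpy as np

rng = np.random.default_rng(5)
n = 200_000
a, h, phi = (rng.uniform(0.5, 1.5, n) for _ in range(3))

additive = a + h + phi        # how a linear polynomial proxy combines terms
multiplicative = a * h * phi  # how the OOIP equation combines them

for name, x in [("additive", additive), ("multiplicative", multiplicative)]:
    p10, p50, p90 = np.percentile(x, [10, 50, 90])
    # Comparing (p90 - p50) with (p50 - p10) reveals the asymmetry.
    print(f"{name}: upside {p90 - p50:.3f} vs downside {p50 - p10:.3f}")
```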

A question raised about computing polynomial proxies for relatively simple designs such as FPB is that often there are, apparently, too few equations to solve for all the coefficients of the polynomial. The explanation is that not all parameters are equally significant and that some parameters may be highly correlated or anticorrelated. Both factors reduce the dimensionality of the problem, allowing reasonable solutions to be obtained even with an apparently insufficient number of equations.
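In practice this is why a minimum-norm least-squares solve still returns usable coefficients when the basis has more terms than there are runs. A sketch (the design size, basis, and active terms are invented):

```python
# Sketch: fit a 28-term quadratic basis to only 12 runs; the system is
# formally underdetermined but the effective dimensionality is low.
import numpy as np

rng = np.random.default_rng(6)
runs, k = 12, 6
X = rng.choice([-1.0, 1.0], size=(runs, k))

def quad_basis(X):
    """Intercept, linear, and all second-order products (28 columns for k=6)."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i, k)]
    return np.column_stack(cols)

y = 3 * X[:, 0] - 2 * X[:, 1] + X[:, 0] * X[:, 1]        # only 3 active terms
coef = np.linalg.lstsq(quad_basis(X), y, rcond=None)[0]  # minimum-norm solution

print("runs:", runs, "basis terms:", quad_basis(X).shape[1])
print("max residual on the runs:", np.abs(quad_basis(X) @ coef - y).max())
# The fit is not unique with so few runs, yet the strongest fitted terms
# typically echo the true drivers because few parameters really matter.
top = np.argsort(-np.abs(coef))[:3]
print("largest coefficients at columns:", top, np.round(coef[top], 2))
```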

Proxy Validation

At a minimum, a set of blind-test runs, which are not used in building the proxy, should be compared with proxy predictions. A simple crossplot of proxy-predicted vs. experimental results for points used to build the proxy can confirm only that the proxy equation was adequate to match the data used in the analysis; it does not prove that the proxy is also predictive. In general, volumetrics are more reasonably predicted with simpler proxies than are dynamic results such as recoveries, rates, and breakthrough times.
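A minimal sketch of the blind-test idea (the toy response, which has curvature a linear proxy cannot capture, and the split sizes are arbitrary assumptions):

```python
# Sketch: hold runs out of the proxy fit and compare prediction error on
# the fitting runs vs. the blind runs.
import numpy as np

rng = np.random.default_rng(8)

def simulator(X):
    """Toy response with curvature that the linear proxy lacks."""
    return 50 + 10 * X[:, 0] + 6 * X[:, 1] + 4 * X[:, 0] ** 2

X = rng.uniform(-1, 1, size=(20, 2))
y = simulator(X)
fit, blind = slice(0, 14), slice(14, 20)  # 14 fitting runs, 6 blind runs

A = np.column_stack([np.ones(14), X[fit]])
coef = np.linalg.lstsq(A, y[fit], rcond=None)[0]

def rmse(idx):
    pred = np.column_stack([np.ones(len(X[idx])), X[idx]]) @ coef
    return np.sqrt(np.mean((pred - y[idx]) ** 2))

print(f"fit RMSE {rmse(fit):.2f} vs blind RMSE {rmse(blind):.2f}")
```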

Moving Toward a Common Process

Some standardization of designs would help these methods become even more accepted and widespread in companies. The reader of this paper likely is a reservoir engineer or earth scientist who, by necessity, dabbles in statistics but prefers not to make each study a research effort on mathematical methods. Another benefit of somewhat standardized processes is that management and technical reviewers can become familiar and comfortable with certain designs and will not require re-education with each project they need to approve. However, because these methodologies are still relatively new, a period of testing and exploring different techniques is still very much under way.

The literature shows the use of a wide range of methodologies. Approaches to explore uncertainty space range from use of only the simplest PB screening designs for the entire analysis, through multistage experimental designs of increasing accuracy, to bypassing proxy methods altogether in favor of space-filling designs and advanced interpolative methods.

A basic methodology extracted from multiple papers listed in Wolff (2010) can be stated as follows:

(1) Define subsurface uncertainties and their ranges.
(2) Perform a screening analysis (e.g., two-level PB or FPB), and analyze to identify the most-influential uncertainties.
(3) If necessary, perform a more-detailed analysis (e.g., three-level D-optimal or central-composite).
(4) Create a proxy model (response surface) by use of linear or polynomial proxies, and validate with blind tests.
(5) Perform Monte Carlo simulations to assess uncertainty and define distributions of outcomes.
(6) Build deterministic (scenario) low/mid/high models tied to the distributions.
(7) Use the deterministic models to assess development alternatives.

However, variations and subtleties abound. Most studies

split the analysis of the static and dynamic parameters into two stages with at least two separate experimental designs. The first stage seeks to create a number of discrete geological or static models (3, 5, 9, 27, or more are found in the literature) representing a broad range of hydrocarbons in place and connectivity (often determined by rapid analyses such as streamline simulation). The second stage then takes these static models and adds the dynamic parameters in a second experimental design. This method is particularly advantageous if the project team prefers to use higher-level designs such as D-optimal to reduce the number of uncertainties in each stage. However, this method cannot account for the full range of individual interactions between all static and dynamic parameters because many of the static parameters are grouped and fixed into discrete models before the final dynamic simulations are run. This limitation becomes less significant when more discrete geological models are built that reflect more major-uncertainty combinations.

Steps 2 and 3 in the base methodology sometimes coincide with the static/dynamic parameter split. In many cases, however, parameter screening is performed as an additional DoE step after a set of discrete geological models has already been determined. This culling to the most-influential uncertainties again makes running a higher-level design more feasible, especially with full-field full-physics simulation models. The risk is that some of the parameters screened from the analysis as insignificant in the development scenario that was used may become significant under other scenarios. For example, if the base scenario was a peripheral waterflood, parameters related to aquifer size and strength may drop out. If a no-injector scenario is later examined, the P10/50/90 deterministic models may not include any aquifer variation. Ideally, each scenario would have its own screening DoE performed to retain all relevant influential uncertainties.

An alternative is running a single-stage DoE including all static and dynamic parameters. This method can lead to a large number of parameters. Analysis is made more tractable by use of intermediate-accuracy designs such as FPB. Such compromise designs do require careful blind testing to ensure accuracy, although proxies with mediocre blind-test results often can yield very similar statistics (P10/50/90 values) after Monte Carlo simulation when compared with higher-level designs. As a general observation, the quality of proxy required for quantitative predictive use such as optimization or history matching usually is higher than that required only for generating a distribution through Monte Carlo methods.


Determining which development options (i.e., unconstrained or realistic developments, including controllable variables) to choose for building the proxy equations and running Monte Carlo simulations also has challenges. One approach is to use unconstrained scenarios that effectively attempt to isolate subsurface uncertainties from the effects of these choices (Williams 2006). Another approach is to use a realistic base-case development scenario, if such a scenario already exists, or to make an initial pass through the process to establish one. Studies that use DoE for optimization often include key controllable variables in the proxy equation despite knowing that this may present difficulties such as more-irregular proxy surfaces requiring higher-level designs.

Integrated models consider all uncertainties together (including surface and subsurface), which eliminates picking a development option. These models may be vital for the problem being analyzed; however, they present additional difficulties. Either computational costs will increase or compromises to subsurface physics must be made, such as eliminating reservoir simulation in favor of simplified dimensionless rate-vs.-cumulative-production tables or proxy equations. That reopens the questions: What development options were used to build those proxies? And how valid are those options in other scenarios?

Deterministic Models

Short of using integrated models, there still remains the challenge of applying and optimizing different development scenarios to a probabilistic range of forecasts. Normal practice is to select a limited number of deterministic models that capture a range of outcomes, often three (e.g., P10/50/90) but sometimes more if testing particular uncertainty combinations is desired. It is also normal practice to match probability levels of two outcomes at once (e.g., pick a P90 model that has both P90 OOIP and P90 oil recovery). Some studies attempt to match P90 levels of other outcomes at the same time, such as discounted oil recovery (which ties better to simplified economics because it puts a time value on production), recovery factor, or initial production rate. The more outcome matches that are attempted, the more difficult it is to find a suitable model.
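One simple way to implement the two-outcome match is to search the Monte Carlo samples for the case closest to both targets at once; a sketch with invented distributions (in practice the sampled arrays would come from the proxy Monte Carlo step):

```python
# Sketch: pick the sampled case whose OOIP and recovery both sit near the
# P90 (low) level of their distributions.
import numpy as np

rng = np.random.default_rng(9)
n = 50_000
ooip = rng.lognormal(np.log(100), 0.3, n)
recovery = ooip * rng.triangular(0.15, 0.30, 0.45, n)

# Petroleum convention: P90 is the low case, i.e., the 10th percentile.
targets = np.percentile(ooip, 10), np.percentile(recovery, 10)

# Normalized distance of every sample to the joint P90 target.
d = np.hypot((ooip - targets[0]) / ooip.std(), (recovery - targets[1]) / recovery.std())
best = np.argmin(d)
print(f"pick case {best}: OOIP {ooip[best]:.1f} (target {targets[0]:.1f}), "
      f"recovery {recovery[best]:.1f} (target {targets[1]:.1f})")
```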

The subsurface uncertainties selected to create a deterministic model, and how much to vary each of them, form a subjective exercise because there are an infinite number of combinations. Williams (2006) gives guidelines for building such models, including trying to ensure a logical progression of key uncertainties from low to high models.

If a proxy is quantitatively sound, it can be used to test particular combinations of uncertainties that differ from those of the DoE before building and running time-consuming simulation models. The proxy also can be used to estimate response behavior for uncertainty levels between the two or three levels (-1/0/+1) typically defined in the DoE. This can be useful for tuning a particular combination to achieve a desired response, and it allows moderate combinations of uncertainties. Such moderate combinations, rather than the extremes used in many designs, will be perceived as more realistic. This choice also solves the problem of not being able to set all key variables to -1 or +1 levels while still following a logical progression of values to achieve P90 and P10 outcomes. However, interpolation of uncertainties (see the sketch after the following list) can sometimes be:

• Challenging (some uncertainties, such as permeability, may not vary linearly compared with others, such as porosity)
• Challenging and time-consuming (e.g., interpolating discrete geological models)
• Impossible [uncertainties with only a discrete number of physical states, such as many decision variables (e.g., 1.5 wells is not possible)]
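The sketch below illustrates the tractable case: evaluating a trusted linear proxy at moderate coded levels along a logical low-case direction until a P90 target is reached (the proxy coefficients, direction, and target are invented):

```python
# Sketch: tune a deterministic low case by interpolating proxy levels
# between -1 and +1, with no new simulation runs.
import numpy as np

coef = np.array([100.0, 15.0, 8.0, -5.0])  # fitted b0, b1, b2, b3

def proxy(x):
    """Linear proxy at coded levels x (each in [-1, +1])."""
    return coef[0] + coef[1:] @ x

target_p90 = 88.0                          # from the Monte Carlo step
direction = np.array([-1.0, -1.0, 1.0])    # a logical "low" progression
# Scale the moderate combination along that direction until it hits target.
for t in np.linspace(0, 1, 101):
    if proxy(t * direction) <= target_p90:
        print(f"moderate levels {np.round(t * direction, 2)} "
              f"give {proxy(t * direction):.1f}")
        break
```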

Finally, selecting the deterministic models to use usually is a whole-team activity because each discipline may have its own ideas about which uncertainties need to be tested and which combinations are realistic. This selection process achieves buy-in by the entire team before heading into technical and management reviews.

Probabilistic brownfield forecasting has the additional challenge of needing to match dynamic performance data. Although forecasts should become more-tightly constrained with actual field data, data quality and the murky issue of what constitutes an acceptable history match must be considered. History-match data can be incorporated into probabilistic forecasts through several methods. The traditional and simplest method is to tighten the individual uncertainty ranges until nearly all outcomes are reasonably history matched. This approach is efficient and straightforward but may eliminate some more-extreme combinations of parameters from consideration.

Filter-proxy methods that use quality-of-history-match indicators (Landa and Güyagüler 2003) will accept these more-extreme uncertainty combinations. The filter-proxy method also has the virtue of transparency: explanation and justification of the distribution of the matched models is straightforward, as long as the proxies (especially those of the quality of the history match) are sufficiently accurate. More-complex history-matching approaches such as genetic algorithms, evolutionary strategies, and the ensemble Kalman filter are a very active area for research and commercial activities, but going into any detail on these methods is beyond the scope of this paper.
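A minimal sketch of the filter-proxy idea (both proxies and the acceptance cutoff are invented): sample the uncertainty space, keep only samples whose proxied history-match error is acceptable, and compare prior vs. posterior forecast percentiles.

```python
# Sketch: filter Monte Carlo samples by a proxied history-match error.
import numpy as np

rng = np.random.default_rng(10)
n = 100_000
x = rng.uniform(-1, 1, size=(n, 2))               # two coded uncertainties

forecast = 100 + 20 * x[:, 0] + 10 * x[:, 1]      # forecast proxy
match_err = (x[:, 0] - 0.3) ** 2 + 0.5 * x[:, 1] ** 2  # history-match-quality proxy

keep = match_err < 0.1                            # acceptable-match filter
print(f"{keep.mean():.1%} of samples accepted")
print("prior     P90/P50/P10:", np.round(np.percentile(forecast, [10, 50, 90]), 1))
print("posterior P90/P50/P10:", np.round(np.percentile(forecast[keep], [10, 50, 90]), 1))
```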

Conclusion: What Do We Really Know?

In realistic subsurface-forecasting situations, enough uncertainty exists about the basic ranges of parameters that absolute numerical errors of less than 5 to 10% usually are considered relatively minor, although it is difficult to give a single value that applies for all situations. For example, when tuning discrete models to P10/50/90 values, a typical practice is to stop tuning when the result is within 5% of the desired result. Spending a lot of time to obtain a more precise result is largely wasted effort, as look-backs have shown consistently.

Brownfield forecasts are believed to be more accurate than greenfield forecasts that lack calibration, but look-backs also suggest that it may be misleading to think the result is the correct one just because a great history match was obtained. Carrying forward a reasonable and defensible set of working models that span a range of outcomes makes much more sense than hoping to identify a single true answer. As


George Box (an eminent statistician) once said: "All models are wrong, but some are useful."

In all these studies, there is a continuing series of tradeoffs to be made between the effort applied and its effect on the outcome. Many studies have carried simple screening designs all the way through to detailed forecasts, with well-accepted results that are based on very few simulation runs. These studies tend to examine the input uncertainty distributions in great depth, often carefully considering partial correlations between the uncertainties. Although the quality of the proxies used in these studies may not be adequate for quantitative predictive use, it still may be adequate for generating reasonable statistics.

Other studies use complex designs to obtain highly accurate proxies that can be used quantitatively for optimization and history matching. However, many of these studies have used standardized uncertainty distributions (often discrete) with less consideration of correlations and dependencies. Higher-speed computers and automated tools are making such workflows less time-consuming, so that accurate proxies and a thorough consideration of the basic uncertainties should both be possible. Whichever emphasis is made, the models that are used should be sufficiently complex to capture the reservoir physics that influences the outcome significantly. At the same time, they should be simple enough that time and energy are not wasted on refining something that either has little influence or remains fundamentally uncertain.

In the end, probabilistic forecasts can provide answers with names like P10/50/90 that have specific statistical meaning. However, it is a meaning that must consider the assumptions made about the statistics of the basic uncertainties, most of which lack a rigorous statistical underpinning. The advantage of a rigorous process to combine these uncertainties through DoE, proxies, Monte Carlo methods, scenario modeling, and other techniques is that the process is clean and auditable, not that the probability levels are necessarily quantitatively correct. They are, however, only as correct as the selection and description of the basic uncertainties.

Having broken a complex forecast into simple assumptions, we should make it part of a standard process to refine those assumptions as more data become available. Ultimately, like the example from exploration mentioned at the beginning, we hope to calibrate ourselves through detailed look-backs for continuous improvement of our forecast quality.

Acknowledgments

The author would like to thank Kaveh Dehghani, Mark Williams, and John Spokes for their support and stimulating discussions. Thanks also to Hao Cheng for our work together on the subject and for supplying the graphics for Fig. 1.

References

Bentley, M.R. and Woodhead, T.J. 1998. Uncertainty Handling Through Scenario-Based Reservoir Modeling. Paper SPE 39717 presented at the SPE Asia Pacific Conference on Integrated Modelling for Asset Management, Kuala Lumpur, 23–24 March. doi: 10.2118/39717-MS.

Chu, C. 1990. Prediction of Steamflood Performance in Heavy Oil Reservoirs Using Correlations Developed by Factorial Design Method. Paper SPE 20020 presented at the SPE California Regional Meeting, Ventura, California, USA, 4–6 April. doi: 10.2118/20020-MS.

Damsleth, E., Hage, A., and Volden, R. 1992. Maximum Information at Minimum Cost: A North Sea Field Development Study With an Experimental Design. J Pet Technol 44 (12): 1350–1356. SPE-23139-PA. doi: 10.2118/23139-PA.

Demirmen, F. 2007. Reserves Estimation: The Challenge for the Industry. Distinguished Author Series, J Pet Technol 59 (5): 80–89. SPE-103434-PA.

Landa, J.L. and Güyagüler, B. 2003. A Methodology for History Matching and the Assessment of Uncertainties Associated With Flow Prediction. Paper SPE 84465 presented at the SPE Annual Technical Conference and Exhibition, Denver, 5–8 October. doi: 10.2118/84465-MS.

Murtha, J. 1997. Monte Carlo Simulation: Its Status and Future. Distinguished Author Series, J Pet Technol 49 (4): 361–370. SPE-37932-MS. doi: 10.2118/37932-MS.

Otis, R.M. and Schneidermann, N. 1997. A Process for Evaluating Exploration Prospects. AAPG Bulletin 81 (7): 1087–1109.

Rose, P.R. 2007. Measuring What We Think We Have Found: Advantages of Probabilistic Over Deterministic Methods for Estimating Oil and Gas Reserves and Resources in Exploration and Production. AAPG Bulletin 91 (1): 21–29. doi: 10.1306/08030606016.

Sawyer, D.N., Cobb, W.M., Stalkup, F.I., and Braun, P.H. 1974. Factorial Design Analysis of Wet-Combustion Drive. SPE J. 14 (1): 25–34. SPE-4140-PA. doi: 10.2118/4140-PA.

Vogel, L.C. 1956. A Method for Analyzing Multiple Factor Experiments: Its Application to a Study of Gun Perforating Methods. Paper SPE 727-G presented at the Fall Meeting of the Petroleum Branch of AIME, Los Angeles, 14–17 October. doi: 10.2118/727-G.

Williams, G.J.J., Mansfield, M., MacDonald, D.G., and Bush, M.D. 2004. Top-Down Reservoir Modelling. Paper SPE 89974 presented at the SPE Annual Technical Conference and Exhibition, Houston, 26–29 September. doi: 10.2118/89974-MS.

Williams, M.A. 2006. Assessing Dynamic Reservoir Uncertainty: Integrating Experimental Design With Field Development Planning. SPE Distinguished Lecturer Series presentation given for Gulf Coast Section SPE, Houston, 23 March. http://www.spegcs.org/attachments/studygroups/11/SPE%20Mark%20Williams%20Mar_06.ppt.

Wolff, M. 2010. Probabilistic Subsurface Forecasting. Paper SPE 132957 available from SPE, Richardson, Texas.