
DISTINGUISHED AUTHOR SERIES

Probabilistic Subsurface Forecasting—What Do We Really Know?
Martin Wolff, SPE, Hess Corporation

Abstract
The use of single-valued assessments of company portfolios and projects continues to decline as the industry accepts that strong subsurface uncertainties dictate an ongoing consideration of ranges of outcomes. Exploration has pioneered the use of probabilistic prospect assessments as the norm, in both majors and independents. Production has lagged, in part because of the need to comply with US Securities and Exchange Commission (SEC) reserves-reporting requirements that drive a conservative deterministic approach.

Look-backs continue to show the difficulty of achieving a forecast within an uncertainty band as well as the difficulty of establishing what that band should be. Ongoing challenges include identifying relevant static and dynamic uncertainties, efficiently and reliably determining ranges and dependencies for those uncertainties, incorporating production history (brownfield assessments), and coupling subsurface with operational and economic uncertainties.

Despite these challenges, none of which is fully resolved, a systematic approach based on probabilistic principles [often including design-of-experiments (DoE) techniques] provides the best auditable and justifiable means of forecasting projects and presenting decision makers with a suitable range of outcomes to consider.

Introduction
Probabilistic subsurface assessments are the norm within the exploration side of the oil and gas industry, in both majors and independents (Rose 2007). However, in many companies, the production side is still in transition from single-valued deterministic assessments, sometimes carried out with ad hoc sensitivity studies, to more-rigorous probabilistic assessments with an auditable trail of assumptions and a statistical underpinning. Reflecting these changes in practices and technology, recently revised SEC rules for reserves reporting (effective 1 January 2010) allow the use of both probabilistic and deterministic methods, in addition to allowing the reporting of reserves categories other than "proved." This paper presents some of the challenges facing probabilistic assessments and some practical considerations for carrying out the assessments effectively.

Martin Wolff, SPE, is a Senior Reservoir Engineering Advisor for Hess Corporation. He earned a BS degree in electrical engineering and computer science and an MS degree in electrical engineering from the University of Illinois and a PhD degree in petroleum engineering from the University of Texas at Austin. Previously, Wolff worked for Schlumberger, Chevron, Fina, Total, and Newfield. He has served as a Technical Editor and Review Chairperson for SPE Reservoir Evaluation & Engineering and has served on steering committees for several SPE Forums and Advanced Technology Workshops.

Copyright 2010 Society of Petroleum Engineers. This is paper SPE 118550. Distinguished Author Series articles are general, descriptive representations that summarize the state of the art in an area of technology by describing recent developments for readers who are not specialists in the topics discussed. Written by individuals recognized as experts in the area, these articles provide key references to more definitive work and present specific details only to illustrate the technology. Purpose: to inform the general readership of recent advances in various areas of petroleum engineering.

Look-Backs—Calibrating Assessments
Look-backs continue to show the difficulty of achieving a forecast within an uncertainty band along with the difficulty of establishing what that band should be. Demirmen (2007) reviewed reserves estimates in various regions over time and observed that estimates are poor and that uncertainty does not decrease over time.

Otis and Schneidermann (1997) describe a comprehensive exploration-prospect-evaluation system that, starting in 1989, included consistent methods of assessing risk and estimating hydrocarbon volumes, including post-drilling feedback to calibrate those assessments. Although detailed look-backs for probabilistic forecasting methodologies have been recommended for some time (Murtha 1997) and are beginning to take place within companies, open publications on actual fields with details still are rare, possibly because of the newness of the methodologies or because of data sensitivity.
Identifying Subsurface Uncertainties
A systematic process of identifying relevant subsurface uncertainties and then categorizing them can help by breaking down a complex forecast into simple uncontrollable static or dynamic components that can be assessed and calibrated individually (Williams 2006). Nonsubsurface, controllable, and operational uncertainties also must be considered, but the analysis usually is kept tractable by including them later with decision analysis or additional rounds of uncertainty analysis.



Grouping parameters also can reduce the dimensionality of the problem. When parameters are strongly correlated (or anticorrelated), grouping them is justifiable. In fact, not grouping, or "Balkanizing," a group of such parameters could cause them to be dropped in standard screening methods such as Pareto charts. For example, decomposing a set of relative permeability curves into constituent parameters such as saturation endpoints, critical saturations, relative permeability endpoints, and Corey exponents can cause them all to become insignificant individually. Used together, relative permeability often remains a dominant uncertainty.
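
As a minimal sketch of this grouping idea (not the author's specific workflow), the Python fragment below ties an entire Corey-type water relative permeability curve to a single coded factor, so a screening design sees one grouped uncertainty instead of four individually insignificant parameters. The endpoint values and the linear low-to-high blending are invented for illustration.

```python
import numpy as np

def corey_krw(sw, swc, sorw, krw_max, nw):
    """Water relative permeability from a Corey model."""
    swn = np.clip((sw - swc) / (1.0 - swc - sorw), 0.0, 1.0)
    return krw_max * swn**nw

def grouped_krw(sw, level):
    """One grouped factor (level in [-1, +1]) moves all Corey
    parameters together between pessimistic and optimistic sets,
    instead of screening each parameter separately (illustrative
    values only)."""
    t = (level + 1.0) / 2.0  # map [-1, 1] -> [0, 1]
    lo = dict(swc=0.25, sorw=0.30, krw_max=0.3, nw=3.5)  # low case
    hi = dict(swc=0.15, sorw=0.20, krw_max=0.6, nw=2.0)  # high case
    p = {k: (1 - t) * lo[k] + t * hi[k] for k in lo}
    return corey_krw(sw, **p)

sw = np.linspace(0.2, 0.8, 7)
for level in (-1.0, 0.0, 1.0):
    print(level, np.round(grouped_krw(sw, level), 3))
```

Screening then ranks the grouped factor as a whole; if it survives, it can later be split back into its constituents for more-detailed study.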
Assigning Ranges to Uncertainties
Once identified, the ranges for each uncertainty must be quantified, which may appear straightforward but contains subtle challenges. Breaking down individual uncertainties into components (e.g., measurement, model, or statistical error) and carefully considering portfolio and sample-bias effects can help create reasonable and justifiable ranges.

Some uncertainties, especially geological ones, are not handled easily as continuous variables. In many studies, several discrete geological models are constructed to represent the spectrum of possibilities. To integrate these models with continuous parameters and generate outcome distributions, likelihoods must be assigned to each model. Although assigning statistical meaning to a set of discrete models may be a challenge if those models are not based on any underlying statistics, such models do have the advantage of more readily being fully consistent scenarios rather than a combination of independent geological-parameter values that may not make any sense (Bentley and Woodhead 1998).

As noted previously, validation with analog data sets and look-backs should be carried out when possible because many studies and publications have shown that people have a tendency to anchor on what they think they know and to underestimate the true uncertainties involved. Therefore, any quantitative data that can help establish and validate uncertainty ranges are highly valuable.
Assigning Distributions to Uncertainties
In addition to ranges, distributions must be specified for each uncertainty. There are advocates for different approaches. Naturalists strongly prefer the use of realistic distributions that often are observed in nature (e.g., log normal), while pragmatists prefer distributions that are well-behaved (e.g., bounded) and simple to specify (e.g., uniform or triangular). In most cases, specifying ranges has a stronger influence on forecasts than the specific distribution shape, which may have little effect (Wolff 2010). Statistical correlations between uncertainties also should be considered, although these too are often secondary effects.
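
The following sketch illustrates the range-versus-shape point with invented porosity numbers: a "naturalist" log-normal calibrated so that 0.18 and 0.28 are its P90 and P10 values is compared with a "pragmatist" triangular bounded by the same values, with both pushed through the same toy volumetric product.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Porosity specified two ways over the same 0.18-0.28 range:
# a log-normal whose P90/P10 (exceedance convention) are exactly
# those values, and a triangular simply bounded by them.
p90, p10 = 0.18, 0.28
mu = (np.log(p90) + np.log(p10)) / 2.0
sigma = (np.log(p10) - np.log(p90)) / (2 * 1.2816)  # z(0.90) = 1.2816
phi_logn = rng.lognormal(mu, sigma, n)
phi_tri = rng.triangular(p90, (p90 + p10) / 2, p10, n)

# Push both through the same toy forecast (volume ~ phi * h).
h = rng.uniform(40, 60, n)
for name, phi in [("log-normal", phi_logn), ("triangular", phi_tri)]:
    v = phi * h
    print(name, np.percentile(v, [10, 50, 90]).round(2))
```

The medians essentially coincide and the spreads are of the same order, although the bounded triangular is somewhat narrower in the tails; the specified range drives the result far more than the shape label.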
Uncertainty-to-Forecast Relationships
Once the uncertainties have been identified and quantified, relationships must be established between them and the forecasts. These relationships sometimes can be established from analytical and empirical equations but also may be derived from models ranging from simple material balance through full 3D reservoir simulation. When complex models are used to define relationships, it often is useful to apply DoE methods to investigate the uncertainty space efficiently. These methods involve modeling defined combinations of uncertainties to fit simple equations that can act as efficient surrogates, or proxies, for the complex models. Monte Carlo methods then can be used to investigate the distribution of forecast outcomes, taking into account correlations between uncertainties.

DoE methods have been used for many years in the petroleum industry. The earliest SPE reference found was a perforating-gun study by Vogel (1956), and the earliest reservoir work was a wet-combustion-drive study by Sawyer et al. (1974), while early references involving 3D reservoir models include Chu (1990) and Damsleth et al. (1992). These early papers all highlight the main advantage of DoE over traditional one-variable-at-a-time (OVAT) methods—efficiency. Damsleth et al. (1992) give a 30 to 40% advantage for D-optimal designs compared with OVAT sensitivities.

For an extensive bibliography of papers showing the pros and cons of different types of DoE and applications of DoE to specific reservoir problems, see the expanded version of this paper, Wolff (2010).
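
As a concrete sketch of the screening designs discussed here and used later in this paper, the fragment below builds a two-level design from a Hadamard matrix and folds it by appending sign-reversed runs. For run counts that are powers of two, this construction coincides with the Plackett-Burman design; general PB designs (12, 20, 24 runs, and so on) use cyclic generators not shown here.

```python
import numpy as np
from scipy.linalg import hadamard

def pb_design(n_factors):
    """Two-level screening design for up to n-1 factors in n runs,
    taken from the columns of a Hadamard matrix (for n a power of
    two this coincides with the Plackett-Burman construction)."""
    n_runs = 2 ** int(np.ceil(np.log2(n_factors + 1)))
    H = hadamard(n_runs)           # entries are +1/-1
    return H[:, 1:n_factors + 1]   # drop the all-ones column

def fold(design):
    """Folded design: append the sign-reversed runs, doubling the
    run count and separating main effects from the two-factor
    interactions they are aliased with in the unfolded design."""
    return np.vstack([design, -design])

pb = pb_design(7)    # 7 factors in 8 runs
fpb = fold(pb)       # folded version: 16 runs
print(pb.shape, fpb.shape)  # (8, 7) (16, 7)
```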

Model Complexity
Given that computational power has increased vastly since the 1950s and 1970s, with ever-more-powerful multicore processors and cluster computing, an argument can be made that computational power should not be regarded as a significant constraint for reservoir studies. However, Williams et al. (2004) observe that gains in computational power "are generally used to increase the complexity of the models rather than to reduce model run times."

Most would agree with the concept of making things no more complex than needed, but different disciplines have different perceptions of what that level of complexity is. This problem can be made worse by corporate peer reviews, especially in larger companies, in which excessively complex models are carried forward to ensure "buy in" by all stakeholders. Highly complex models also may require complex logic to form reasonable and consistent development scenarios for each run.

Finally, the challenge of quality control (QC) of highly complex models cannot be ignored—"garbage in, garbage out" applies more strongly than ever. Launching directly into tens to hundreds of DoE runs without ensuring that a base-case model makes physical sense and runs reasonably well often leads to many frustrating cycles of debugging and rework. A single model can readily be quality controlled in detail, while manual QC of tens of models becomes increasingly difficult. With hundreds or thousands of models, automatic QC tools become necessary to complement statistical methods by highlighting anomalies.
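
A minimal example of such an automatic QC check follows; it flags runs whose key outputs are robust-statistical outliers. The choice of outputs and the 3.5 threshold are illustrative assumptions, not a standard.

```python
import numpy as np

def flag_anomalous_runs(results, threshold=3.5):
    """Flag DoE runs whose key outputs are statistical outliers,
    using a robust z-score (median and median absolute deviation).
    `results` is an (n_runs, n_outputs) array, e.g. columns for
    material-balance error, final recovery, and run time."""
    med = np.median(results, axis=0)
    mad = np.median(np.abs(results - med), axis=0) + 1e-12
    robust_z = 0.6745 * (results - med) / mad
    return np.where(np.any(np.abs(robust_z) > threshold, axis=1))[0]

# Example: 50 runs x 3 outputs, with run 17 corrupted.
rng = np.random.default_rng(1)
res = rng.normal([0.01, 0.35, 2.0], [0.002, 0.05, 0.3], (50, 3))
res[17] = [0.20, 0.05, 30.0]  # nonphysical material balance, etc.
print(flag_anomalous_runs(res))  # run 17 should be flagged
```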

[Fig. 1—Proxy surfaces of varying complexity. The four panels show response surfaces built from linear terms only; linear and interaction terms; linear, interaction, and lumped second-order terms; and a full second-order polynomial.]

Proxy Equations
Fig. 1 shows proxy surfaces of varying complexity that can be obtained with different designs. A Plackett-Burman (PB) design, for example, is suitable only for linear proxies. A folded Plackett-Burman (FPB) design (an experimental design with double the number of runs of the PB design, formed by adding runs that reverse the plus and minus 1s in the matrix) can provide interaction terms and lumped second-order terms (all second-order coefficients equal). A D-optimal design can provide full second-order polynomials. More-sophisticated proxies can account for greater response complexities, but at the cost of additional simulation runs. These more-sophisticated proxies may be of particular use in brownfield studies, in which a more-quantitative proxy could be desirable, but may not add much value in greenfield studies, where the basic subsurface uncertainties remain poorly constrained.
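
A minimal sketch of fitting such a proxy by least squares follows. It uses a two-level full factorial in three factors and a stand-in response with assumed coefficients in place of real simulator output; an FPB design matrix would be used in exactly the same way.

```python
import numpy as np
from itertools import combinations, product

# Coded two-level design: a 2^3 full factorial for clarity.
X = np.array(list(product([-1.0, 1.0], repeat=3)))

# Stand-in simulator responses with known effects plus noise.
rng = np.random.default_rng(3)
y = 30 + 5*X[:, 0] - 3*X[:, 1] + 2*X[:, 0]*X[:, 1] \
    + rng.normal(0, 0.3, len(X))

def proxy_matrix(X):
    """Columns: intercept, main effects, two-factor interactions."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(proxy_matrix(X), y, rcond=None)
print(coef.round(2))  # approximately [30, 5, -3, 0, 2, 0, 0]
```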




A recognized problem with polynomial proxies is that they tend to yield normal distributions because the terms are added (a consequence of the central-limit theorem). For many types of subsurface forecasts, the prevalence of actual skewed distributions, such as log-normal, has been documented widely. Therefore, physical proxies, especially in simple cases such as the original-oil-in-place (OOIP) equation, have some advantages in achieving more-realistic distributions. However, errors from the use of nonphysical proxies are not necessarily significant, depending on the particular problem studied (Wolff 2010).
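
To illustrate why a physical proxy yields a skewed outcome distribution, the sketch below runs a Monte Carlo directly through the standard field-units OOIP formula, OOIP = 7,758 A h phi (1 − Sw)/Bo; all input ranges are invented.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

# Physical proxy: OOIP (STB) = 7,758 * A * h * phi * (1 - Sw) / Bo,
# with A in acres and h in ft (7,758 bbl = 1 acre-ft).
# Input ranges are illustrative only.
A   = rng.lognormal(np.log(2000), 0.25, n)    # area, acres
h   = rng.triangular(50, 80, 130, n)          # net pay, ft
phi = rng.normal(0.22, 0.02, n).clip(0.05, 0.35)
sw  = rng.uniform(0.20, 0.35, n)              # water saturation
bo  = rng.uniform(1.1, 1.3, n)                # RB/STB

ooip = 7758 * A * h * phi * (1 - sw) / bo

# Exceedance convention: P90 = low case, P10 = high case.
p90, p50, p10 = np.percentile(ooip, [10, 50, 90]) / 1e6
print(f"P90={p90:.0f}  P50={p50:.0f}  P10={p10:.0f} MMSTB")
print("right-skewed (P10-P50 > P50-P90):", (p10 - p50) > (p50 - p90))
```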
A question raised about computing polynomial proxies for relatively simple designs such as FPB is that often there are, apparently, too few equations to solve for all the coefficients of the polynomial. The explanation is that not all parameters are equally significant and that some parameters may be highly correlated or anticorrelated. Both factors reduce the dimensionality of the problem, allowing reasonable solutions to be obtained even with an apparently insufficient number of equations.
Proxy Validation
At a minimum, a set of blind-test runs, which are not used in building the proxy, should be compared with the proxy predictions. A simple crossplot of proxy-predicted vs. experimental results for points used to build the proxy can confirm only that the proxy equation was adequate to match the data used in the analysis; it does not prove that the proxy is also predictive. In general, volumetrics are more reasonably predicted with simpler proxies than are dynamic results such as recoveries, rates, and breakthrough times.
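
A minimal blind-test sketch follows. The "simulator" is a stand-in function with curvature that a purely linear proxy misses, so the blind-test statistics expose a deficiency that an in-sample crossplot would hide.

```python
import numpy as np

def blind_test(proxy, X_blind, y_blind):
    """Compare proxy predictions against blind simulation runs
    that were not used in the fit; report RMSE and R^2."""
    resid = y_blind - proxy(X_blind)
    rmse = float(np.sqrt(np.mean(resid**2)))
    r2 = 1.0 - np.sum(resid**2) / np.sum((y_blind - y_blind.mean())**2)
    return rmse, r2

# Toy example: a linear proxy fitted elsewhere; blind points come
# from a stand-in "simulator" with curvature the proxy misses.
rng = np.random.default_rng(5)
proxy = lambda X: 30 + 5*X[:, 0] - 3*X[:, 1]
X_blind = rng.uniform(-1, 1, (10, 2))
y_blind = 30 + 5*X_blind[:, 0] - 3*X_blind[:, 1] + 4*X_blind[:, 0]**2

rmse, r2 = blind_test(proxy, X_blind, y_blind)
print(f"RMSE={rmse:.2f}  R^2={r2:.2f}")  # imperfect R^2 flags the proxy
```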
Moving Toward a Common Process
Some standardization of designs would help these methods become even more accepted and widespread in companies. The reader of this paper likely is a reservoir engineer or earth scientist who, by necessity, dabbles in statistics but prefers not to make each study a research effort on mathematical methods. Another benefit of somewhat-standardized processes is that management and technical reviewers can become familiar and comfortable with certain designs and will not require re-education with each project they need to approve. However, because these methodologies are still relatively new, a period of testing and exploring different techniques is still very much under way.

The literature shows the use of a wide range of methodologies. Approaches to exploring uncertainty space range from use of only the simplest PB screening designs for the entire analysis, through multistage experimental designs of increasing accuracy, to bypassing proxy methods altogether in favor of space-filling designs and advanced interpolative methods. A basic methodology extracted from multiple papers listed in Wolff (2010) can be stated as follows (Steps 4 and 5 are sketched in code after the list):
(1) Define subsurface uncertainties and their ranges.
(2) Perform a screening analysis (e.g., two-level PB or FPB), and analyze it to identify the most-influential uncertainties.
(3) If necessary, perform a more-detailed analysis (e.g., three-level D-optimal or central-composite).
(4) Create a proxy model (response surface) by use of linear or polynomial proxies, and validate it with blind tests.
(5) Perform Monte Carlo simulations to assess uncertainty and define distributions of outcomes.
(6) Build deterministic (scenario) low/mid/high models tied to the distributions.
(7) Use the deterministic models to assess development alternatives.
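
A compressed sketch of Steps 4 and 5 (evaluating an already-fitted proxy under Monte Carlo sampling and reading off percentiles in the exceedance convention, where P90 is the low case) is shown below; the proxy coefficients and uncertainty names are invented.

```python
import numpy as np

# Step 4 output: a proxy fitted on the DoE runs, in coded units.
# Coefficients here are made up for illustration.
coef = {"const": 30.0, "phi": 5.0, "kv_kh": -3.0, "phi*kv_kh": 2.0}
def proxy(phi_c, kv_c):
    return (coef["const"] + coef["phi"] * phi_c
            + coef["kv_kh"] * kv_c + coef["phi*kv_kh"] * phi_c * kv_c)

# Step 5: Monte Carlo over the uncertainties in coded space.
rng = np.random.default_rng(2)
n = 100_000
phi_c = rng.triangular(-1, 0, 1, n)   # porosity level
kv_c = rng.uniform(-1, 1, n)          # kv/kh level
rec = proxy(phi_c, kv_c)              # recovery, %OOIP (toy units)

p90, p50, p10 = np.percentile(rec, [10, 50, 90])
print(f"P90={p90:.1f}  P50={p50:.1f}  P10={p10:.1f} %OOIP")
```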

However, variations and subtleties abound. Most studies split the analysis of the static and dynamic parameters into two stages with at least two separate experimental designs. The first stage seeks to create a number of discrete geological, or static, models (3, 5, 9, 27, or more are found in the literature) representing a broad range of hydrocarbons in place and connectivity (often determined by rapid analyses such as streamline simulation). Then, the second stage takes these static models and adds the dynamic parameters in a second experimental design. This method is particularly advantageous if the project team prefers to use higher-level designs such as D-optimal, because it reduces the number of uncertainties in each stage. However, this method cannot account for the full range of individual interactions between all static and dynamic parameters because many of the static parameters are grouped and fixed into discrete models before the final dynamic simulations are run. This limitation becomes less significant when more discrete geological models are built that reflect more major-uncertainty combinations.

Steps 2 and 3 in the base methodology sometimes coincide with the static/dynamic parameter split. In many cases, however, parameter screening is performed as an additional DoE step after a set of discrete geological models has already been determined. This culling to the most-influential uncertainties again makes running a higher-level design more feasible, especially with full-field, full-physics simulation models. The risk is that some of the parameters screened from the analysis as insignificant in the development scenario that was used may become significant under other scenarios. For example, if the base scenario was a peripheral waterflood, parameters related to aquifer size and strength may drop out. If a no-injector scenario is later examined, the P10/50/90 deterministic models may not include any aquifer variation. Ideally, each scenario would have its own screening DoE performed to retain all relevant influential uncertainties.

An alternative is running a single-stage DoE including all static and dynamic parameters. This method can lead to a large number of parameters. Analysis is made more tractable by use of intermediate-accuracy designs such as FPB. Such compromise designs do require careful blind testing to ensure accuracy, although proxies with mediocre blind-test results often can yield very similar statistics (P10/50/90 values) after Monte Carlo simulation when compared with higher-level designs. As a general observation, the quality of proxy required for quantitative predictive use such as optimization or history matching usually is higher than that required only for generating a distribution through Monte Carlo methods.


Determining which development options (i.e., unconstrained or realistic developments, including controllable variables) to choose for building the proxy equations and running Monte Carlo simulations also has challenges. One approach is to use unconstrained scenarios that effectively attempt to isolate subsurface uncertainties from the effects of these choices (Williams 2006). Another approach is to use a realistic base-case development scenario if such a scenario already exists, or to make an initial pass through the process to establish one. Studies that use DoE for optimization often include key controllable variables in the proxy equation despite knowing that this may present difficulties such as more-irregular proxy surfaces requiring higher-level designs.

Integrated models consider all uncertainties together (including surface and subsurface), which eliminates picking a development option. These models may be vital for the problem being analyzed; however, they present additional difficulties. Either computational costs will increase, or compromises to subsurface physics must be made, such as eliminating reservoir simulation in favor of simplified dimensionless rate-vs.-cumulative-production tables or proxy equations. That reopens the questions "What development options were used to build those proxies?" and "How valid are those options in other scenarios?"
Deterministic Models
Short of using integrated models, there still remains the challenge of applying and optimizing different development scenarios across a probabilistic range of forecasts. Normal practice is to select a limited number of deterministic models that capture a range of outcomes, often three (e.g., P10/50/90) but sometimes more if testing particular uncertainty combinations is desired. Normal practice is also to match the probability levels of two outcomes at once (e.g., pick a P90 model that has both P90 OOIP and P90 oil recovery). Some studies attempt to match P90 levels of other outcomes at the same time, such as discounted oil recovery (which ties better to simplified economics because it puts a time value on production), recovery factor, or initial production rate. The more outcome matches that are attempted, the more difficult it is to find a suitable model.

The subsurface uncertainties selected to create a deterministic model, and how much to vary each of them, form a subjective exercise because there are an infinite number of combinations. Williams (2006) gives guidelines for building such models, including trying to ensure a logical progression of key uncertainties from low to high models.
If a proxy is quantitatively sound, it can be used to test particular combinations of uncertainties that differ from those of the DoE before building and running time-consuming simulation models. The proxy also can be used to estimate response behavior for uncertainty levels between the two or three levels (−1/0/+1) typically defined in the DoE. This can be useful for tuning a particular combination to achieve a desired response, and it allows moderate combinations of uncertainties (a sketch of such tuning follows the list below). Such moderate combinations, rather than the extremes used in many designs, will be perceived as more realistic. This choice also solves the problem of not being able to set all key variables to −1 or +1 levels and still follow a logical progression of values to achieve P90 and P10 outcomes. However, interpolation of uncertainties can sometimes be:
• Challenging (some uncertainties, such as permeability, may not vary linearly compared with others, such as porosity)
• Challenging and time-consuming (e.g., interpolating discrete geological models)
• Impossible [uncertainties with only a discrete number of physical states, such as many decision variables (e.g., 1.5 wells is not possible)]
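
The tuning sketch referenced above: given a fitted proxy with invented coefficients in coded units, solve for the moderate level t along a low-case direction that reproduces a target P90 response, instead of forcing every factor to its −1 extreme.

```python
import numpy as np
from scipy.optimize import brentq

# Fitted proxy (coded units, made-up coefficients) and a target
# P90 recovery taken from the Monte Carlo step.
proxy = lambda x: 30 + 5*x[0] - 3*x[1] + 2*x[2] + 2*x[0]*x[2]
target_p90 = 25.0

# Scale all factors together along a low-case direction and solve
# for the moderate level t in [0, 1] that hits the target exactly.
direction = np.array([-1.0, +1.0, -1.0])  # signs that lower recovery
f = lambda t: proxy(t * direction) - target_p90
t_star = brentq(f, 0.0, 1.0)
print(f"t*={t_star:.2f}, levels={np.round(t_star * direction, 2)}")
```

The resulting levels (about ±0.56 here rather than ±1) give a deterministic model that hits the target while staying away from the implausible all-extremes corner.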

Finally, selecting the deterministic models to use is usually a whole-team activity because each discipline may have its own ideas about which uncertainties need to be tested and which combinations are realistic. This selection process achieves buy-in by the entire team before heading into technical and management reviews.

Probabilistic brownfield forecasting has the additional challenge of needing to match dynamic performance data. Although forecasts should become more-tightly constrained with actual field data, data quality and the murky issue of what constitutes an acceptable history match must be considered. History-match data can be incorporated into probabilistic forecasts through several methods. The traditional and simplest method is to tighten the individual uncertainty ranges until nearly all outcomes are reasonably history matched. This approach is efficient and straightforward but may eliminate some more-extreme combinations of parameters from consideration.

Filter-proxy methods that use quality-of-history-match indicators (Landa and Güyagüler 2003) will accept these more-extreme uncertainty combinations. The filter-proxy method also has the virtue of transparency—explanation and justification of the distribution of the matched models is straightforward, as long as the proxies (especially those of the quality of the history match) are sufficiently accurate. More-complex history-matching approaches such as genetic algorithms, evolutionary strategies, and the ensemble Kalman filter are a very active area of research and commercial activity, but going into any detail on these methods is beyond the scope of this paper.
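
The following sketch conveys the filter-proxy idea in miniature: a Monte Carlo sample is screened with a second proxy that predicts history-match quality, and forecast percentiles are compared before and after filtering. Both proxies and the misfit threshold are invented stand-ins rather than the specific formulation of Landa and Güyagüler (2003).

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200_000

# Prior Monte Carlo sample of two uncertainties (coded units).
x = rng.uniform(-1, 1, (n, 2))

# Two made-up proxies from the same DoE: one for the forecast and
# one for a history-match quality indicator (e.g., a normalized
# misfit to observed water cut and pressure).
forecast = 30 + 5*x[:, 0] - 3*x[:, 1]
misfit = (x[:, 0] - 0.3)**2 + 0.5*(x[:, 1] + 0.2)**2

# Filter: keep only combinations whose predicted misfit is
# acceptable; extreme-but-matching combinations survive.
keep = misfit < 0.15
for name, v in [("prior", forecast), ("posterior", forecast[keep])]:
    p90, p50, p10 = np.percentile(v, [10, 50, 90])
    print(f"{name:9s} P90={p90:.1f} P50={p50:.1f} P10={p10:.1f}")
```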



Conclusion—What Do We Really Know?
In realistic subsurface-forecasting situations, enough uncertainty exists about the basic ranges of parameters that absolute numerical errors of less than 5 to 10% usually are considered relatively minor, although it is difficult to give a single value that applies to all situations. For example, when tuning discrete models to P10/50/90 values, a typical practice is to stop tuning when the result is within 5% of the desired value. Spending a lot of time to obtain a more-precise result is largely wasted effort, as look-backs have shown consistently.

Brownfield forecasts are believed to be more accurate than greenfield forecasts that lack calibration, but look-backs also suggest that it may be misleading to think the result is the correct one just because a great history match was obtained. Carrying forward a reasonable and defensible set of working models that span a range of outcomes makes much more sense than hoping to identify a single "true" answer. As George Box (an eminent statistician) once said, "All models are wrong, but some are useful."

In all these studies, there is a continuing series of tradeoffs to be made between the effort applied and its effect on the outcome. Many studies have carried simple screening designs all the way through to detailed forecasts, with well-accepted results based on very few simulation runs. These studies tend to examine the input uncertainty distributions in great depth, often carefully considering partial correlations between the uncertainties. Although the quality of the proxies used in these studies may not be adequate for quantitative predictive use, it still may be adequate for generating reasonable statistics.

Other studies use complex designs to obtain highly accurate proxies that can be used quantitatively for optimization and history matching. However, many of these studies have used standardized uncertainty distributions (often discrete) with less consideration of correlations and dependencies. Higher-speed computers and automated tools are making such workflows less time-consuming, so that accurate proxies and a thorough consideration of the basic uncertainties should both be possible. Whichever emphasis is chosen, the models used should be sufficiently complex to capture the reservoir physics that significantly influences the outcome. At the same time, they should be simple enough that time and energy are not wasted refining something that either has little influence or remains fundamentally uncertain.

In the end, probabilistic forecasts can provide answers with names like P10/50/90 that have specific statistical meaning. However, that meaning must consider the assumptions made about the statistics of the basic uncertainties, most of which lack a rigorous statistical underpinning. The advantage of a rigorous process for combining these uncertainties through DoE, proxies, Monte Carlo methods, scenario modeling, and other techniques is that the process is clean and auditable, not that the probability levels are necessarily quantitatively correct. They are, however, as correct as the selection and description of the basic uncertainties.

Having broken a complex forecast into simple assumptions, it should become part of a standard process to refine those assumptions as more data become available. Ultimately, like the example from exploration mentioned at the beginning, we hope to calibrate ourselves through detailed look-backs for continuous improvement of our forecast quality.

Acknowledgments
The author would like to thank Kaveh Dehghani, Mark Williams, and John Spokes for their support and stimulating discussions. Thanks also to Hao Cheng for our work together on the subject and for supplying the graphics for Fig. 1.

References
Bentley, M.R. and Woodhead, T.J. 1998. Uncertainty Handling Through Scenario-Based Reservoir Modeling. Paper SPE 39717 presented at the SPE Asia Pacific Conference on Integrated Modeling for Asset Management, Kuala Lumpur, 23–24 March. doi: 10.2118/39717-MS.
Chu, C. 1990. Prediction of Steamflood Performance in Heavy Oil Reservoirs Using Correlations Developed by Factorial Design Method. Paper SPE 20020 presented at the SPE California Regional Meeting, Ventura, California, USA, 4–6 April. doi: 10.2118/20020-MS.
Damsleth, E., Hage, A., and Volden, R. 1992. Maximum Information at Minimum Cost: A North Sea Field Development Study With an Experimental Design. J Pet Technol 44 (12): 1350–1356. SPE-23139-PA. doi: 10.2118/23139-PA.
Demirmen, F. 2007. Reserves Estimation: The Challenge for the Industry. Distinguished Author Series, J Pet Technol 59 (5): 80–89. SPE-103434-PA.
Landa, J.L. and Güyagüler, B. 2003. A Methodology for History Matching and the Assessment of Uncertainties Associated With Flow Prediction. Paper SPE 84465 presented at the SPE Annual Technical Conference and Exhibition, Denver, 5–8 October. doi: 10.2118/84465-MS.
Murtha, J. 1997. Monte Carlo Simulation: Its Status and Future. Distinguished Author Series, J Pet Technol 49 (4): 361–370. SPE-37932-MS. doi: 10.2118/37932-MS.
Otis, R.M. and Schneidermann, N. 1997. A process for evaluating exploration prospects. AAPG Bulletin 81 (7): 1087–1109.
Rose, P.R. 2007. Measuring what we think we have found: Advantages of probabilistic over deterministic methods for estimating oil and gas reserves and resources in exploration and production. AAPG Bulletin 91 (1): 21–29. doi: 10.1306/08030606016.
Sawyer, D.N., Cobb, W.M., Stalkup, F.I., and Braun, P.H. 1974. Factorial Design Analysis of Wet-Combustion Drive. SPE J. 14 (1): 25–34. SPE-4140-PA. doi: 10.2118/4140-PA.
Vogel, L.C. 1956. A Method for Analyzing Multiple Factor Experiments—Its Application to a Study of Gun Perforating Methods. Paper SPE 727-G presented at the Fall Meeting of the Petroleum Branch of AIME, Los Angeles, 14–17 October. doi: 10.2118/727-G.
Williams, G.J.J., Mansfield, M., MacDonald, D.G., and Bush, M.D. 2004. Top-Down Reservoir Modeling. Paper SPE 89974 presented at the SPE Annual Technical Conference and Exhibition, Houston, 26–29 September. doi: 10.2118/89974-MS.
Williams, M.A. 2006. Assessing Dynamic Reservoir Uncertainty: Integrating Experimental Design with Field Development Planning. SPE Distinguished Lecturer Series presentation given for Gulf Coast Section SPE, Houston, 23 March. http://www.spegcs.org/attachments/studygroups/11/SPE%20Mark%20Williams%20Mar_06.ppt.
Wolff, M. 2010. Probabilistic Subsurface Forecasting. Paper SPE 132957 available from SPE, Richardson, Texas.

