07/01/2015

Sensitivity Analysis, Assisted History Matching, Optimization, and Uncertainty Analysis Using CMOST

What this course is


An overview of CMOST functionality
Details and examples of how to use CMOST for:
Sensitivity Analysis
History Matching
Optimization
Uncertainty Assessment


What this course is not


A treatise on optimization theory & methods
A course on the math behind CMOST
An in-depth discourse on applied history
matching and optimization
We can provide some guidance based on
our experience but we do not claim to be
experts in the application area

Agenda
CMOST overview
CMOST functionality and Tutorials
Sensitivity Analysis
History Matching
CMOST functionality and Tutorials
Optimization
Uncertainty Assessment


Overview

What is CMOST?
CMOST is CMG software that works in conjunction
with CMG reservoir simulators to perform the
following tasks:
Sensitivity Analysis
Better understanding of a simulation model
Identify important parameters

History Matching
Calibrate simulation model with field data
Obtain multiple history-matched models

Optimization
Improve NPV and recovery
Reduce cost

Uncertainty Analysis
Quantify uncertainty
Understand and reduce risk

Typical Workflow for a Brown Field

[Workflow diagram] Reservoir model + field history file → sensitivity analysis (parameter sensitivities) → history matching (matched model, parameter histograms) → optimization (optimal operating conditions, optimal model) → forecast model → uncertainty assessment (uncertainty quantification)

CMOST View of Simulation Models

Parameters: x1, x2, …, xn

Simulation model:
y1 = f1(x1, x2, …, xn)
y2 = f2(x1, x2, …, xn)
…
ym = fm(x1, x2, …, xn)

Objective functions: y1, y2, …, ym


CMOST Process

[Workflow loop] Parameterization → experimental design & optimization algorithms select a combination of parameter values → parameter values are substituted into the simulation dataset → run simulation → analyze results (objective functions & proxy analysis) → repeat

CMOST User Interface


How Input Data is Organized

Define what data is to be extracted from the simulation (Plots and Formulas)
Define how the simulation model is parameterized (Inputs)
Define the objective functions to be calculated (Outputs)

General Settings


General Properties
All input files are entered on the general
properties page:
Base Dataset (required)
Master Dataset (required)
Results template files
Measured Data
Field History Files
Log Files

Fundamental Data
Fundamental data defines which 2D curves of simulation results need to be viewed
Original Time Series
  2D plots similar to Results Graph
  X-axis: time
User-defined Time Series
  Custom formula-based 2D plots
  X-axis: time
Property vs. Distance Series
  X-axis: distance
Fluid Contact Depth Series
  Depth of fluid contact (based on fluid saturation) vs. time


Parameterization

Parameterization
Parameters are variables in the simulation
model that will be adjusted when creating new
datasets
- E.g. Porosity, permeability, etc.
To determine the location in the dataset to
substitute values, a master dataset must be
created (.cmm)
A master dataset is almost identical to a normal
simulation dataset except CMOST keywords
have been added to identify where a parameter
should be added
- Acts as a template for creating new datasets


Master Dataset

Master Dataset
A master dataset can be created in multiple
ways:
CMOST Editor
Builder
Text editor (Notepad, Textpad, etc.)


Parameterization of Simulation Model (Builder)

[Builder dialog] Select a dataset section → select a parameter from that section → parameter is added to the list of parameters → export the master dataset (template file)

Parameterization of Simulation Model (Text)

Dataset Template editor, complementary to Builder:
Create CMOST parameters
Better syntax highlighting
  Highlight CMOST parameters
  Fold no-need-to-see sections
Easy navigation
  Different sections of the dataset
  Navigate CMOST parameters
Handle include files
  Create/extract include files
  View include files
  Parameterize include files


Master Dataset Syntax

Original dataset:
POR CON 0.20

Master dataset:
POR CON <cmost>this[0.20]=Porosity</cmost>

The simulator keywords are followed by the CMOST portion, which runs from the <cmost> start tag to the </cmost> end tag and must contain no spaces. this[0.20] holds the original (default) value from the dataset, and Porosity is the variable name. Variable names are case sensitive.

Master Dataset Syntax

Formulas can also be used with one or more variables:
POR CON <cmost>0.20*PorosityMultiplier</cmost>

Here the CMOST portion (between the <cmost> and </cmost> tags) contains a formula; a default value is optional.

Master Dataset Syntax

Values in regions of the reservoir can be modified using MOD simulation keywords:
POR CON 0.20
MOD
1:5 2:8 1:10 * <cmost>this[1]=PorosityMultiplier1</cmost>
6:10 2:8 1:10 * <cmost>this[1]=PorosityMultiplier2</cmost>

Block ranges are given as I1:I2 J1:J2 K1:K2.

Parameter Definition
Continuous parameter
  Lower and upper limits define the sampling range used by the study engines
Discrete parameter
  Real, Integer, and Text
  Each discrete text value also requires a numerical value
Formula
  Value based on the values of other parameters


Dependent Parameters Using Formula


Syntax highlighting
Shows what variables are available to be
used to create formulas
Test and check the formula anytime

Hard Constraints
Criteria that must be satisfied for a dataset
to be created
Used to eliminate unrealistic datasets
E.g. Horizontal permeability should be
greater than vertical permeability


Pre-Simulation Commands
Passes dataset to separate application
before submitting to simulator
Run Builder Silently
Run GOCAD Command Silently
Run User Defined
Can be used to create new geostatistical
realizations, recalculate formulas in Builder,
recalculate rel. perm. curves, etc.

Coupling with Geological Software

[Diagram] Geological model coupled with the simulation model: pre-simulation commands let the geological software regenerate the simulation model before each run.

Objective Functions

Objective Functions
An Objective Function (OF) is something (an
expression or a single quantity) for which
you wish to achieve some goal
Usually this goal is to achieve a minimum or
maximum value
In the case of History Matching, one
usually wishes to minimize an error
between field data and simulation
In the case of Optimization, one usually
wishes to maximize something like NPV


Objective Functions

Basic Simulation Results


Values directly taken from simulation results
with no modification
History match error
Percentage relative error
Perfect match: 0%
Net Present Value
Simplified NPV calculation
Can be used to construct user-defined
objective functions which utilize simulation
results discounted by time as variables

Objective Functions

Characteristic Date Times


Specific Dates
Date where the maximum or minimum value is found
Date when a value surpasses a specified criterion
Advanced Objective Functions
User-defined objective function based on a formula or code (JScript or Python)
Soft Constraints
Re-evaluates objective functions based on simulation results


What are Dynamic Date Times

Characteristic date times, within a certain time series range, that meet certain criteria
Examples (a minimal sketch of such a search follows below):
First time the oil rate is higher than 100 bbl/d
Last time the SOR is greater than or equal to 6
First time the oil rate is higher than 100 bbl/d and the SOR is smaller than 4
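
A minimal Python sketch of such a search over a toy time series; the dates, rates, and field layout here are invented for illustration and are not CMOST output names:

from datetime import date

# Toy time series standing in for simulation output: (date, oil rate bbl/d, SOR)
series = [
    (date(2014, 1, 1),  40.0, 7.5),
    (date(2014, 4, 1),  95.0, 5.8),
    (date(2014, 7, 1), 130.0, 4.6),
    (date(2014, 10, 1), 160.0, 3.7),
]

def first_time(series, condition):
    """Return the first date at which `condition` holds, or None."""
    for t, oil_rate, sor in series:
        if condition(oil_rate, sor):
            return t
    return None

# First time the oil rate is higher than 100 bbl/d AND the SOR is smaller than 4
print(first_time(series, lambda q, sor: q > 100.0 and sor < 4.0))  # -> 2014-10-01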

Why Dynamic Date Times are Needed

A fixed date time (e.g. the simulation stop time, or 2014-8-15) is the same for all jobs, and objective functions are calculated at those fixed date times for every job.
In some cases we want information at the date time when some specified condition is met (e.g. oil rate > 100). These times are not the same for all simulation jobs.
Thus, they are dynamic.


Dynamic Date Times: Maximize Peak NPV

[Plot: NPV vs. time, with the peak NPV date identified as a dynamic date time]

Dynamic Date Times: Plateau


Optimization
Plateau period
Oil produced at plateau
Average oil rate at plateau


Characteristic Date Times Page

[UI callouts] Start & stop time; a name for the characteristic date time; date times that meet certain criteria.

Define Dynamic Date Time

[UI callouts] Criteria, and the time series (including user-defined time series) on which the criteria are evaluated.


Characteristic Time Durations


Defined on the Basic Simulation Results page; they can then be used as objective functions.

User-defined Objective Functions

Use an Excel spreadsheet
  Map CMOST parameter values to cells
  Map simulation results to cells
Use JScript or Python code (a minimal sketch follows below)
Use an executable provided by the user (e.g. MATLAB)
Preview the calculation result using the base case
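
A minimal Python sketch of a user-defined objective function. The way values are passed in (plain dictionaries) and all names below are assumptions made so the sketch runs on its own; they are not CMOST's actual scripting interface:

def objective_function(parameters, results):
    """Return a single scalar to be used as the objective function value."""
    # Example: revenue from cumulative oil minus an injection cost term
    cum_oil = results["CumulativeOil"][-1]              # bbl, last entry of the series
    injected = results["CumulativeWaterInjected"][-1]   # bbl
    oil_price = 60.0        # $/bbl (assumed)
    injection_cost = 2.0    # $/bbl (assumed)
    return oil_price * cum_oil - injection_cost * injected

if __name__ == "__main__":
    # Stand-in data so the sketch is self-contained
    params = {"Porosity": 0.25, "PermeabilityMultiplier": 1.2}
    sim_results = {
        "CumulativeOil": [0.0, 12_000.0, 33_004.0, 40_416.0],
        "CumulativeWaterInjected": [0.0, 20_000.0, 55_000.0, 70_000.0],
    }
    print(f"Objective function value: {objective_function(params, sim_results):,.0f} $")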


Use Excel Spreadsheet

After each simulation is done:
CMOST writes the parameter values and simulation results to Excel cells;
Excel calculates the objective function using a formula or VBA code;
CMOST then reads the value back and uses it as the objective function value.

Sensitivity Analysis


Sensitivity Analysis Goals


Determine which parameters have an effect
on results
E.g. I expect that rock compressibility is
between values A and B. Does this
uncertainty impact my results?
Determine how much of an effect parameters have on results
E.g. if permeability is increased by 50 mD, how much will cumulative oil increase?

Sensitivity Analysis Process


Select parameters to analyze
E.g. porosity
Select range of values to analyze
E.g. between 20-30% porosity
Select results (Objective Functions) to
analyze
E.g. Cumulative Oil


Sensitivity Analysis Methodology


One Parameter at a Time (OPAAT)
Each parameter is analyzed independently
while remaining parameters are set to their
reference value
Response Surface Methodology
Multiple parameters are adjusted together
then results are analyzed by fitting a
response surface (Polynomial equation) to
results

One Parameter at a Time (OPAAT)


This method analyzes each parameter independently.
While analyzing one parameter, the method freezes the other parameters at their reference values (median or default).
This measures the effect of each parameter on the objective function while removing the effects of the other parameters (a minimal sketch follows below).
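
A minimal Python sketch of the OPAAT idea, where simulate() is an invented analytic stand-in rather than a real simulator call:

# Pseudo "cumulative oil" as a function of two parameters (assumed for illustration)
def simulate(porosity, perm_mult):
    return 160_000.0 * porosity + 4_000.0 * perm_mult

reference = {"porosity": 0.25, "perm_mult": 1.0}
ranges = {"porosity": (0.20, 0.30), "perm_mult": (0.5, 1.5)}

base = simulate(**reference)
for name, (low, high) in ranges.items():
    effects = []
    for value in (low, high):
        case = dict(reference)      # all other parameters stay at their reference value
        case[name] = value
        effects.append(simulate(**case) - base)
    print(f"{name}: effect range {min(effects):+.0f} to {max(effects):+.0f}")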


One Parameter at a Time (OPAAT)


Benefits
Simple to use
Results easy to understand
Results not complicated by effects of other
parameters
Drawbacks
Results are focused around the reference
values
Results can change dramatically if
reference values change

Configure OPAAT Engine

[Engine settings] Select the reference values to be used, the objective function, and the parameters; non-monotonic variables require many levels of parameter values to be tested.


One Parameter at a Time (OPAAT)

Example (porosity varied, other parameters at reference):
Porosity = 0.20 → CumOil = 33,004 bbl
Porosity = 0.25 (reference value) → CumOil = 40,416 bbl
Porosity = 0.30 → CumOil = 44,176 bbl

Note: the minimum and maximum objective function values do not always correspond to the minimum and maximum parameter values; check the cross plots to verify.


One Parameter at a Time (OPAAT)

[Tornado plot] The length of each bar gives the maximum change in the objective function over the parameter range.

Response Surface Method (RSM)

Correlation between the response and the parameters: NPV = f(x1, x2, …, xn)
The response surface is a proxy for the reservoir simulator that allows fast estimation of the response.


Response Surface Method (RSM)

Multiple parameter values are changed simultaneously.
The combinations of parameter values are chosen based on an experimental design.
A response surface (polynomial equation) is fit to the simulation results:
Linear
Linear + Quadratic
Linear + Quadratic + Interaction terms

Response Surface Proxy Models

Linear model:
y = a_0 + a_1 x_1 + a_2 x_2 + \dots + a_k x_k

Linear + quadratic model:
y = a_0 + \sum_{j=1}^{k} a_j x_j + \sum_{j=1}^{k} a_{jj} x_j^2

Linear + quadratic + interaction:
y = a_0 + \sum_{j=1}^{k} a_j x_j + \sum_{j=1}^{k} a_{jj} x_j^2 + \sum_{i<j} a_{ij} x_i x_j

Statistically insignificant terms are automatically removed.
The model type is automatically chosen but can be changed if necessary.
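
A small Python/NumPy sketch of fitting a linear + quadratic + interaction response surface by least squares to invented experiments; CMOST's own fitting and automatic term screening are more involved:

import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 2))     # normalized parameters in [-1, 1]
y = 5 + 2*X[:, 0] - 1.5*X[:, 1] + 0.8*X[:, 0]*X[:, 1] + rng.normal(0, 0.05, 20)

# Design matrix: intercept, linear, quadratic, and cross terms
A = np.column_stack([
    np.ones(len(X)), X[:, 0], X[:, 1],
    X[:, 0]**2, X[:, 1]**2, X[:, 0]*X[:, 1],
])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coeffs

r2 = 1.0 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("coefficients:", np.round(coeffs, 3))
print("R^2 =", round(r2, 4))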


SA Using RSM Engine

[Engine settings] Handles problematic simulation runs; specify the desired accuracy and the engine will create and run experiments as needed.

Response Surface Method (RSM)


Response Surface Method (RSM)

To compare effects between different parameters, parameter ranges are normalized between -1 and 1.
In the resulting tornado plot, 2 × the coefficient of the normalized polynomial relation is given.
This represents the average change when going from the minimum to the maximum parameter value when looking at linear effects.
It is similar to the bar length from the OPAAT method.

Response Surface Method (RSM)


Response Surface Method (RSM)

Example: increasing PERMH_L1 (permeability) from 2625 mD to 4375 mD results in an increase in cumulative oil of 12,461 STB on average.

Response Surface Method (RSM)

If a parameter has a non-linear relation with the objective function, a quadratic term may also be given (x²).
If modifying two parameters at the same time has an effect stronger than the sum of their individual linear or quadratic effects, a cross term may be given (x·y).


Quadratic and Linear Effect Table


X-axis: Parameter
Y-axis: Objective Function

Now Hands on Work!!!

Any Questions?


Control Centre

Engine Settings
Defines task type
Task type can be modified from what
was originally selected when creating
the study
Any other options related to the engine
can be modified from this page


Engine Settings

[Engine settings page] Study type and engine selection.

Simulator Settings
Simulation related settings:
Schedulers
Simulator version
Number of CPUs per job
Maximum simulation run time
Job record and file management
Data I/O Cleanup


Submit Simulations to Compute Cluster

CMG Scheduler
Microsoft HPC
IBM Platform LSF
Oracle Grid Engine
Portable Batch System (PBS/TORQUE)

Optimize Dataset I/O Section


Switches to reduce output file size

Disable restart records writing


Disable grid records writing in OUT
Disable grid records writing in SR2


Optimize Dataset I/O Section

What does CMOST do?

Benefits
Remove I/O bottleneck
Reduce occurrence of strange problems
Reduce support troubleshooting time

Experiments Table
List of Experiments
Parameter values used
Objective function results
Able to sort and filter results
Open in Builder and Results
Add additional experiments
User defined
Predefined experimental designs (fractional factorial, Latin hypercube, etc.)


Simulation Jobs
List of Simulations
Scheduler information
Start Time
End Time
Scheduler Name
Status
File information
Name and location
Normal/Abnormal termination

Now Hands on Work!!!

Any Questions?


Results & Analyses

Results & Analyses


All result objects are dynamically created on the fly using the data stored in the experiments table.
All types of result objects are available for any study type.
HM & OP studies will automatically have sensitivity and proxy results if there are enough experiments.


Results & Analyses


Parameters
Run progress
Parameter value vs. experiment number
Histogram
Frequency that range of parameter values
were chosen
Cross plot
Plot parameter values vs. other data
Current parameter always on y-axis

Results & Analyses


Time Series and Property vs. Distance
Plots of simulation results as defined in the Fundamental Data section


Results & Analyses


Objective Functions
Run progress
Objective function value vs. experiment number
Histogram
Frequency that range of objective function values occurred
Cross plot
Plot objective function values vs. other data
Current objective function always on y-axis
OPAAT Analysis
Results from One Parameter At A Time sensitivity analysis
Proxy Analysis
Proxy verification
Response Surface sensitivity analysis results
Monte Carlo Simulation uncertainty assessment results

Sensitivity Analysis: Validating Results


Proxy Analysis for Objective Functions


Statistics

Polynomial proxy
Linear/Quadratic/Interaction

RBF neural network

Response Surface Verification


When using the response surface
methodology, one should verify that the
response surface provides a valid match to
the simulation data
This can be verified through
Response Surface Verification Plot
Summary of Fit Table
Analysis of Variance Table
Effect Screening


Response Surface Verification Plot


Gives a visual overview of how the proxy model
fits the actual simulation results
Distance from each point to the 45 degree line
is the error/residual for that point
Points that fall on the 45 degree line are those
that are perfectly predicted

Response Surface Verification Plot


Summary of Fit Table

R² is a measure of the amount of reduction in the variability of the response obtained by using the regressor variables in the model: R² = 1 − SS_Error / SS_Total.
An R² of 1 occurs when there is a perfect fit (the errors are all zero).
An R² of 0 means that the model predicts the response no better than the overall response mean.

Summary of Fit Table

R² can be adjusted to make it comparable over models with different numbers of regressors by using the degrees of freedom in its computation:
R²_adjusted = 1 − (1 − R²)(n − 1)/(n − p)
Here n is the number of observations (training jobs) and p is the number of terms in the response model (including the intercept).
When R² and R²_adjusted differ dramatically, there is a good chance that non-significant terms have been included in the model.


R²_adjusted (Example)

4 samples, n = 4

Linear model (p = 2): R² = 0.9683
R²_adjusted = 1 − (4 − 1)/(4 − 2) × (1 − 0.9683) ≈ 0.95

Quadratic model (p = 3): R² = 0.9715
R²_adjusted = 1 − (4 − 1)/(4 − 3) × (1 − 0.9715) ≈ 0.91

Summary of Fit Table

PRESS is the prediction error sum of squares.
R²_prediction = 1 − PRESS / SS_Total gives some indication of the predictive capability of the regression model.
For example, we could expect a model with R²_prediction = 0.95 to explain about 95% of the variability in predicting new observations.


Summary of Fit Table

To calculate PRESS:
Select an observation i.
Fit the regression model to the remaining n−1 observations and use this equation to predict the withheld observation yi.
Denoting this predicted value by y(i), we can find the prediction error for point i as e(i) = yi − y(i).
This prediction error is often called the i-th PRESS residual. The procedure is repeated for each observation i = 1, 2, 3, …, n, producing a set of n PRESS residuals e(1), e(2), e(3), …, e(n).
The PRESS statistic is then defined as the sum of squares of the n PRESS residuals:
PRESS = \sum_{i=1}^{n} e_{(i)}^2

R²_prediction Example

x        y        y(i)      e(i)      e(i)²
1.000    1.900    1.657     0.243     0.059
2.000    2.450    2.484    -0.034     0.001
3.000    2.950    3.194    -0.244     0.060
4.000    3.890    3.483     0.407     0.165

PRESS:          0.285
SS (Total):     2.143
R²_prediction:  0.867
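
A short Python/NumPy sketch that reproduces the leave-one-out PRESS calculation for the four points above; it recovers PRESS ≈ 0.285, SS(Total) ≈ 2.143, and R²_prediction ≈ 0.867:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.90, 2.45, 2.95, 3.89])

press = 0.0
for i in range(len(x)):
    mask = np.arange(len(x)) != i                  # withhold observation i
    slope, intercept = np.polyfit(x[mask], y[mask], deg=1)
    y_pred = slope * x[i] + intercept              # predict the withheld point
    press += (y[i] - y_pred) ** 2                  # squared i-th PRESS residual

ss_total = np.sum((y - y.mean()) ** 2)
r2_prediction = 1.0 - press / ss_total
print(f"PRESS = {press:.3f}, SS_total = {ss_total:.3f}, R2_prediction = {r2_prediction:.3f}")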


Summary of Fit Table

Mean of Response
The overall mean of the response values. It is important as a base model for prediction because all other models are compared to it.
Standard Error
Estimates the standard deviation of the random error. It is the square root of the Mean Square for Error in the corresponding Analysis of Variance table. The standard error is commonly denoted σ.

Analysis of Variance Table

Degrees of Freedom
Total = Number of Samples − 1
Model = Number of coefficients for the response surface (not including the intercept)
Error = Total − Model

Sum of Squares
Total: sum of squared distances of each response from the sample mean
Error: sum of squared differences between the fitted (RS) values and the actual simulated values
Model = Total − Error

Mean Square
(Sum of Squares)/(Degrees of Freedom)
Converts a sum of squares to an average


Analysis of Variance Table


F Ratio
Model mean square divided by the Error mean square
Tests the hypothesis that all the regression parameters
(except the intercept) are zero (have no effect on the
objective function)

Prob > F
The probability of obtaining a greater F-value by chance
alone if the specified model fits no better than the overall
response mean.
Significance probabilities of 0.05 or less are often
considered evidence that there is at least one significant
regression factor in the model

Summary of Fit and Analysis of Variance Table


Response Surface Model Checklist


Check Predicted vs. Actual
Is there a good fit?
Any outliers?

Check Summary of Fit


Is R-Square adjusted and R-Square prediction
large enough (>0.5)?

Check Analysis of Variance


For a decent model, Prob>F should be very
small (<0.0001).

Effect Screening using Normalized Parameters (−1, +1)

Coefficient
Coefficients of the response surface model found by least squares
Standard Error
Estimate of the standard deviation of the distribution of the parameter estimate (coefficient)
t Ratio
Statistic that tests whether the true parameter (coefficient) is zero
Prob > |t|
Probability of getting an even greater t-statistic (in absolute value), given the hypothesis that the parameter (coefficient) is zero.
Probabilities less than 0.1 are often considered significant evidence that the parameter (coefficient) is not zero.
Used to filter statistically insignificant terms


Effect Screening using Normalized Parameters (−1, +1)

VIF (Variance Inflation Factor)
Measure of multi-collinearity problem
Multi-collinearity refers to one or more near-linear
dependences among the regressor variables due to poor
sampling of the design space
Multi-collinearity can have serious effects on the estimates
of the model coefficients and on the general applicability of
the final model
The larger the variance inflation factor, the more severe the
multi-collinearity.
It is suggested that the variance inflation factors should not
exceed 4 or 5.
If the design matrix is perfectly orthogonal, the variance
inflation factor for all terms will be equal to 1.

Effect Screening using Normalized Parameters (−1, +1)

[Table of normalized parameter effects]


Response Surface Method (RSM)

Example (recap): increasing PERMH_L1 (permeability) from 2625 mD to 4375 mD results in an increase in cumulative oil of 12,461 STB on average.

Proxy Dashboard
See quick estimation of effects of
parameters on results at all simulation times
See results immediately without needing to
wait for additional simulation
Can assist in manual history matching or
optimization
CMOST can create dataset and run
simulation to verify proxy results


Proxy Dashboard

[Dashboard steps] Build the proxy model → select a comparison case → what-if scenario: choose the parameter input → estimate the objective functions → visualize the estimated curve.

Proxy Dashboard Method


Curve divided into 100 points
Response surface model created for each
point based on simulation results
Polynomial
RBF neural network
When parameter values are adjusted, each
point along the curve is recalculated using
the response surface


Now Hands on Work!!!

Any Questions?

History Matching and Optimization


History Matching Goals


In history matching, we are trying to reduce the
error between the simulation results and field
measured data
By matching the simulation model to the historical
behaviour, we have more confidence that the model
will be able to predict future behavior
When creating a simulation model, there may be
uncertainty in the input parameters. These will be
the parameters that should be adjusted when
history matching

History Matching Process


Select parameters to analyze
E.g. porosity, permeability
Select range of values to analyze
E.g. between 20-30% porosity
Select results (Objective Functions) to
match
E.g. Cumulative Oil
CMOST will search for the best combination
of parameter values that will give the lowest
history match error


Objective Function Hierarchy

A hierarchy is used when optimizing the objective function.
Upper terms are calculated as a weighted average of the lower terms.

[Diagram] Global objective function → local objective functions 1, 2, 3 → each local objective function is built from its own terms (Term 1, Term 2, Term 3, …).

Objective Function Hierarchy

Typically, common items are grouped together.
E.g. the local objective functions might represent the error for a well and the terms might represent the measured data for that well.

[Diagram] Total error → Well 1 error, Well 2 error, Well 3 error; each well error is built from term errors such as oil production error, water production error, gas production error, and bottom-hole pressure error.


Calculating History Match Error

Error = \sqrt{ \frac{ \sum_{t=1}^{N_t} \left( Y_t^{simulated} - Y_t^{measured} \right)^2 }{ N_t } }

where N_t is the number of measurements.

For each measured data point, calculate the difference between the simulated and measured result.
Square the terms to make them positive.
Sum up all of the points at all times.
Divide by the number of measurements to get the average square.
Take the square root to get the average error.

Calculating History Match Error

TermError_j = \frac{ \sqrt{ \sum_{t=1}^{N_t(j)} \left( Y_{j,t}^{s} - Y_{j,t}^{m} \right)^2 / N_t(j) } }{ \Delta Y_j^{m} + 4\,Merr_j }

where \Delta Y_j^{m} is the maximum difference in the measured values and Merr_j is the measurement error.

To compare the error of terms with different units, the error must be normalized.
This is done by dividing by the maximum difference in measured values.
Measurement error can also be included.
Merr represents 1 standard deviation from the mean; the value of 4 is used to include 2 standard deviations on each side of the mean (95% confidence).


Calculating History Match Error

Q_i = \frac{ \sum_{j=1}^{N(i)} tw_{i,j} \cdot TermError_{i,j} \cdot 100\% }{ \sum_{j=1}^{N(i)} tw_{i,j} }

Each local objective function is a weighted arithmetic average of its terms.

Calculating History Match Error

Combining the two expressions:

Q_i = \frac{ \sum_{j=1}^{N(i)} tw_{i,j} \cdot \dfrac{ \sqrt{ \sum_{t=1}^{N_t(i,j)} \left( Y_{i,j,t}^{s} - Y_{i,j,t}^{m} \right)^2 / N_t(i,j) } }{ \Delta Y_{i,j}^{m} + 4\,Merr_{i,j} } \cdot 100\% }{ \sum_{j=1}^{N(i)} tw_{i,j} }


Calculating Global History Match Error

Q_{global} = \frac{ \sum_{i=1}^{N_w} w_i Q_i }{ \sum_{i=1}^{N_w} w_i }

The global objective function is a weighted arithmetic average of the local objective functions (a minimal sketch of the whole calculation follows below).
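
A minimal Python sketch of the term / local / global error calculation described on the preceding slides; all data, weights, and names are invented for illustration:

import numpy as np

def term_error(y_sim, y_meas, meas_err=0.0):
    """Normalized RMS error (%) between simulated and measured series."""
    rms = np.sqrt(np.mean((np.asarray(y_sim) - np.asarray(y_meas)) ** 2))
    scale = (np.max(y_meas) - np.min(y_meas)) + 4.0 * meas_err
    return 100.0 * rms / scale

def weighted_average(errors, weights):
    """Weighted arithmetic average used for local and global objective functions."""
    return float(np.dot(errors, weights) / np.sum(weights))

# One well with two measured series (oil rate, water rate)
oil_err = term_error([100, 140, 180], [110, 150, 170], meas_err=2.0)
wat_err = term_error([20, 35, 60], [25, 30, 55], meas_err=1.0)
well1_error = weighted_average([oil_err, wat_err], weights=[1.0, 1.0])

# Global error over two wells (the second well's error is simply assumed here)
global_error = weighted_average([well1_error, 12.0], weights=[1.0, 0.5])
print(f"Well 1 error = {well1_error:.1f} %, global error = {global_error:.1f} %")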

Field Data Weighting


Able to weight each measured data point
individually
Remove or reduce weight of outliers


Optimization Methods
CMG DECE (Designed Evolution, Controlled
Exploration)
Particle Swarm Optimization
Latin Hypercube plus Proxy Optimization
Random Brute Force Search

Optimization Philosophy
Mathematical optimization
Mathematicians are particularly interested in finding the true absolute optimum.
An optimum of 0.000001 is much better than 0.01, even though it may take 20 extra days to achieve the former.

Engineering optimization
Engineers are more interested in quickly finding optima that are close to the true optimum.
An optimum of 0.01 is much better than 0.000001 if it takes 20 fewer days to achieve the former.

CMOST Optimization Philosophy

Engineering optimization
Not intended to solve pure mathematical problems

CMG DECE Optimization Algorithm

[Workflow] Generate an initial Latin hypercube design → run simulations using the design → get an initial set of training data → exploitation (find the optimum) → if successful, add the new solutions to the training data; if not, switch to exploration (get more information) → run simulations → repeat until the stop criteria are satisfied → stop.

CMG DECE Optimization Algorithm


DECE Characteristics

Handles continuous & discrete parameters


Handles hard constraints
Asynchronous complete utilization of distributed computing
power
Fast and stable convergence

PSO Optimization Algorithm

A population-based stochastic optimization technique developed in 1995 by James Kennedy and Russell Eberhart.
Particles move towards the best position in the search space, remembering each particle's best known position and the swarm's best known position (a bare-bones sketch follows below).
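
A bare-bones Python sketch of the standard PSO velocity/position update on a made-up 2D test function; this is not CMOST's implementation:

import numpy as np

def f(x):                                  # objective to minimize (invented bowl)
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

rng = np.random.default_rng(1)
n, dim, w, c1, c2 = 10, 2, 0.7, 1.5, 1.5
x = rng.uniform(-5, 5, (n, dim))           # particle positions
v = np.zeros((n, dim))                     # particle velocities
pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
gbest = pbest[np.argmin(pbest_f)]

for _ in range(50):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    fx = np.apply_along_axis(f, 1, x)
    improved = fx < pbest_f                           # update personal bests
    pbest[improved], pbest_f[improved] = x[improved], fx[improved]
    gbest = pbest[np.argmin(pbest_f)]                 # update swarm best

print("best position:", np.round(gbest, 3))           # should be near [1, -2]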


PSO Optimization Algorithm

[Figure: particles and their local area in the search space]

PSO Optimization Algorithm

[Figure: selection of the next case from the particle movement]


Latin Hypercube plus Proxy Optimization Algorithm

[Workflow] Generate an initial Latin hypercube design → run simulations using the design → get an initial set of training data → build a proxy model (polynomial or RBF neural network) using the training data → find possible optimum solutions using the proxy → run simulations using these possible solutions → add the validated solutions to the training data → repeat until the stop criteria are satisfied → stop.


Proxy Optimization

[Plot: optimization progress using the proxy built from the Latin hypercube design]

Random Search
Parameter value combinations are chosen randomly until the maximum number of simulator calls has been reached.
There is no trend to the results (scatter).
Only use this if the search space is small.


Now Hands on Work!!!

Any Questions?

Optimization Goals

History matching and optimization are very similar, in that in each case one would like to find the maximum or minimum of an objective function.
In history matching, we are trying to reduce the error between the simulation results and field measured data.
With optimization, we are trying to improve an objective function:
Find maximum NPV
Find maximum recovery
Etc.
Typically with optimization, the parameters that will be adjusted are operational parameters, as opposed to reservoir parameters when history matching.


Optimization Process

Select parameters to analyze
E.g. injection rate, well spacing
Select the range of values to analyze
E.g. between 200 and 500 bbl/day injection rate
Select results (objective functions) to improve
E.g. NPV, recovery factor
CMOST will search for the best combination of parameter values that will maximize your objective function.
In some cases we may want to minimize an objective function, such as when looking at run times during numerical tuning.

Calculating Net Present Value (NPV)


Net Present Value (NPV) is often used
as an economic indicator to evaluate the
value of a project
A discount rate (I) is used to incorporate
the time value of money
Money now is worth more than money later


Calculating Net Present Value (NPV)

In CMOST, the cash flow is always calculated based on the period that has been selected:
Daily
Monthly
Quarterly
Yearly
The yearly discount rate is converted to the period of interest.
E.g. daily: i_daily = (1 + i_yearly)^(1/365) − 1 (a small sketch follows below).
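
A small Python sketch matching this description: the yearly discount rate is converted to the chosen period and each period's cash flow is discounted; the cash flows are invented for illustration:

def npv(cash_flows, yearly_rate, periods_per_year):
    """Discount per-period cash flows using a yearly rate converted to the period."""
    period_rate = (1.0 + yearly_rate) ** (1.0 / periods_per_year) - 1.0
    return sum(cf / (1.0 + period_rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Quarterly cash flows in $: initial cost followed by four quarters of revenue
quarterly_cash_flows = [-500_000, 120_000, 150_000, 180_000, 200_000]
print(f"NPV = {npv(quarterly_cash_flows, yearly_rate=0.10, periods_per_year=4):,.0f} $")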

Common OP Engine Settings

Engines: DECE, LHD plus Proxy OP, PSO, Random Brute Force.
All methods use the same stop criterion.
Select which objective function to optimize, and whether to maximize or minimize it.


Multi-objective PSO (SPE 170024)


Multiple-objective optimization: Optimizing
multiple, maybe conflicting, objective functions
simultaneously
Two approaches:
1. Optimize an aggregated global objective function
2. Pareto Optimization, e.g. Multiple-objective PSO

Multi-Objective PSO

[Engine settings] Select the engine and the number of simulations; choose the first objective function (maximize or minimize?) and the second objective function (maximize or minimize?).


How MO-PSO Works

Domination: a solution dominates another if it is better in every objective function.
Leader: a non-dominated solution.
Pareto front: the ensemble of leaders.

[Figure: f1 vs. f2 objective space showing dominated solutions, leaders, the Pareto front, and a dominating b]

Leader Selection

A leader is randomly selected for each particle.
The least crowded leaders are given higher priority, to try to find an adequately spread Pareto front.

[Figure: f1 vs. f2 Pareto front with leaders a–f; the least crowded leaders (a, f) have the highest priority]


Case Study: Numerical Tuning

Numerical tuning of a SAGD model for a real field
Full model takes >42 days to run
After cleanup & tuning: 7 days to finish (SPE 165511)
Sub model: 4 slices (4x600x50) takes ~1 h to run the first 6 months

Colin Card et al., "A New and Practical Workflow of Large Multi-Pad SAGD Simulation – A Corner Oil Sands Case Study," SPE HOCC, SPE 165511, June 2013.

Single Objective PSO Optimization

Conflicting objective functions:
1. Run time
2. Material balance error
3. Solver failure percent

GlobalObj = (50,000 × MaterialBalanceError + 1 × RunTime + 1,000 × SolverFailurePercent) / 51,001

Use PSO to minimize GlobalObj.


MO-PSO Optimization
Same set of parameters
Minimize two conflicting objective functions:
MaterialBalanceError & RunTime
GlobalObj & SolverFailurePercent are also
calculated, just for comparison

Pareto Front @ 500 MOPSO & 1000 SPSO Experiments

[Plot: run time (s) vs. material balance error (%); MOPSO points trace the Pareto front, with SPSO points shown for comparison; run times span roughly 2500–3500 s over material balance errors of 0–0.04%]


Now Hands on Work!!!

Any Questions?

Uncertainty Assessment


Uncertainty Assessment (UA)

Once an optimum operating strategy has been developed for one or more history-matched models, the question remaining is:
Given residual uncertainties in the HM (or other) variables, what impact will those uncertainties have on the NPV of the optimum case(s)?
How does this work?
Even with simple geological models, we are still likely to have more than one set of geological parameter values that gives an acceptable HM, thus indicating some uncertainty in these values.

Uncertainty Assessment (UA)


How does this work?
If we have more than one geological realization that
gives an acceptable HM we have by definition an
uncertainty as to which realization best reflects reality
Thus we need to see how the NPV of our optimum
cases is impacted by the uncertainty in these
realizations
It is important to recognize that the HM process
develops alternative realizations and that parameters
which are part of these realizations cannot have their
values changed independently and arbitrarily


Uncertainty Assessment

Uncertainty Assessment


Monte Carlo Simulation

Input probability distributions are derived from experience: probability distributions for the HM parameters (e.g. DWOC, SORG, SORW).
Pick random values (that follow the input distributions) and calculate NPV = F(DWOC, SORG, SORW, …).
Repeat for thousands of iterations.

[Histogram: resulting distribution of net present value (M$)]
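
A Python sketch in the spirit of this workflow: sample the uncertain parameters from assumed prior distributions, evaluate a stand-in proxy NPV = F(DWOC, SORG, SORW), and summarize the resulting distribution. The distributions and the proxy expression below are invented for illustration; a real study would use the fitted response surface:

import numpy as np

rng = np.random.default_rng(42)
n = 10_000

dwoc = rng.normal(loc=1500.0, scale=10.0, size=n)       # depth of WOC, m (assumed prior)
sorg = rng.triangular(0.10, 0.15, 0.25, size=n)          # residual oil to gas (assumed)
sorw = rng.uniform(0.25, 0.35, size=n)                    # residual oil to water (assumed)

# Stand-in proxy for NPV in M$ (invented linear response surface)
npv = 18.0 + 0.02 * (1500.0 - dwoc) - 25.0 * (sorg - 0.15) - 30.0 * (sorw - 0.30)

p10, p50, p90 = np.percentile(npv, [10, 50, 90])
print(f"NPV  P10 = {p10:.1f}  P50 = {p50:.1f}  P90 = {p90:.1f}  (M$)")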

Prior Probability Density Functions


Defines likelihood of a parameter value being
selected in Monte Carlo Simulation for continuous
parameters
Uniform
Triangle
Normal
Log Normal
Custom
Probability for each value defined for discrete
parameter types


Probability Distributions in CMOST

Normal, Lognormal, Triangle, Uniform, Custom, Discrete

Parameter Correlations
Some parameters may be related to each
other
E.g. porosity and permeability may
correlate with each other
Correlation coefficient defines how closely
related parameters are to each other
Value of 1 means parameters are directly
related
Value of 0 means that parameters have no
relation with each other


Sampling of Correlated Parameters

With no correlation specified, the correlation matrix is the identity:

Parameter   POR    PERMH   PERMV
POR         1      0       0
PERMH       0      1       0
PERMV       0      0       1

[Cross plots of POR, PERMH, and PERMV samples show no relation between the parameters]

Sampling of Correlated Parameters

Desired values:

Parameter   POR     PERMH   PERMV
POR         1       0.6     0.4
PERMH       0.6     1       0.8
PERMV       0.4     0.8     1

Calculated values (from the generated samples):

Parameter   POR     PERMH   PERMV
POR         1       0.582   0.385
PERMH       0.582   1       0.79
PERMV       0.385   0.79    1

[Cross plots of POR, PERMH, and PERMV samples show the imposed correlations]
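
A Python/NumPy sketch of one standard way to impose such a correlation (a Cholesky factor of the desired correlation matrix applied to independent normal samples), then checking the correlation actually achieved, mirroring the desired vs. calculated comparison above; CMOST's own sampling scheme may differ:

import numpy as np

desired = np.array([[1.0, 0.6, 0.4],    # POR, PERMH, PERMV
                    [0.6, 1.0, 0.8],
                    [0.4, 0.8, 1.0]])

rng = np.random.default_rng(7)
z = rng.standard_normal((5_000, 3))                 # independent standard normal samples
samples = z @ np.linalg.cholesky(desired).T         # impose the desired correlation

achieved = np.corrcoef(samples, rowvar=False)
print("achieved correlation matrix:")
print(np.round(achieved, 3))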


Parameter Correlations

[Cross plots illustrating correlation coefficients of 0.0, 0.25, 0.50, and 0.75]

Uncertainty Assessment (UA)

Objective functions can be evaluated using two methods for Monte Carlo simulation:
A response surface as a proxy to the reservoir simulation
Running the reservoir simulation
Running the reservoir simulation is often too slow to evaluate the response, so the proxy method is often preferred.
Situations where the reservoir simulation should be run for Monte Carlo simulation:
You want to validate the MCS-proxy result
Building a proxy is not feasible, for example when multiple geostatistical realizations or multiple HM models are used


Uncertainty Assessment (UA)

How do we do this?
The RS is generated by creating an experimental design and providing a statistical distribution for each uncertain parameter.
The process is the same as polynomial response surface modelling for sensitivity analysis.
RBF neural network proxy models are also available for uncertainty assessment.
A Monte Carlo simulation is then run to determine the NPV distribution.

UA Using MCS-Proxy Engine

[Engine settings] Handles problematic simulation runs; specify the desired accuracy and the engine will create and run experiments as needed.


User-defined Study Type

Manual Engine
All experiments are created by the user explicitly through:
Classical experimental design
Latin hypercube design
Manual entry

External Engine
Allows the use of the user's own optimization algorithm.


When to Use Manual Engine

Use classical experimental design for SA and UA
Precise control over the number of Latin hypercube experiments
Run additional experiments after a SA/UA/HM/OP run is complete
Create new experiments using the user's own optimization algorithm

Working With CMOST


Main CMOST Components


Base Files
To begin a CMOST project, a completed
simulation dataset (.dat) along with its
Simulation Results (SR2: .mrf, .irf) files are
required
CMOST Project
A CMOST Project is the main CMOST file that
can contain multiple related studies

CMOST 2013 File System

Project name: SAGD_2D_UA
Project file: SAGD_2D_UA.cmp
Project folder: SAGD_2D_UA.cmpd
Best practice: all files related to the project should be stored in the project folder.

Main CMOST Components


CMOST Study
A CMOST study contains all of the input
information for CMOST to run a particular
type of task
Information can be copied between studies
Study types can be easily switched
The new study type will use as much
information from the previous study type as
possible

CMOST 2013 File System (cont'd 1)

Study name: BoxBen
Study file: BoxBen.cms
Study file auto backup: BoxBen.bak
Study folder: BoxBen.cmsd
Don't modify/delete files in the study folder unless you know what you're doing.


CMOST 2013 File System (cont'd 2)

Vector data repository file: *.vdr
The VDR stores compressed simulation data required for objective function calculations (a subset of the SR2 results).
Never modify or delete .vdr files manually.

Reusing and Restarting a Study

Auto synchronization; auto/manual reprocessing.

After you run an engine, you can go back and change the input data.
Experiments inside a study will be automatically reused.
If new parameters are added, you need to resolve the reuse-pending experiments.
After you finish the changes, click Start Engine to restart.


Reusing Data from Another Study

Licensing Multiplier
CMOST uses only partial licenses when running simulations.
E.g. run 2 STARS simulations while using only 1 STARS license.
Applies to other license types (Parallel, Dynagrid, etc.)

Simulator   Jobs per license
IMEX        4:1 (x4)
GEM         2:1 (x2)
STARS       2:1 (x2)


CMOST Input Data Features

[Diagram] Quality data in → quality results out

Further Assistance
Email: support@cmgl.ca
Zip an entire project or selected studies
Email or ftp the zip file to CMG


Diagnostic Zip for Support Request

Diagnostic Zip for Support Request


Now Hands on Work!!!

Any Questions?
