Презентация ПО CMOST 2014

© All Rights Reserved




History Matching, Optimization, and Uncertainty Analysis Using CMOST

An overview of CMOST functionality

Details and examples of how to use CMOST

for:

Sensitivity Analysis

History Matching

Optimization

Uncertainty Assessment

07/01/2015

This presentation is not:

A treatise on optimization theory and methods

A course on the math behind CMOST

An in-depth discourse on applied history matching and optimization

We can provide some guidance based on our experience, but we do not claim to be experts in the application area.

Agenda

CMOST overview

CMOST functionality and Tutorials

Sensitivity Analysis

History Matching

Optimization

Uncertainty Assessment


Overview

What is CMOST?

CMOST is CMG software that works in conjunction with CMG reservoir simulators to perform the following tasks:

Sensitivity Analysis

Better understanding of a simulation model

Identify important parameters

History Matching

Calibrate simulation model with field data

Obtain multiple history-matched models

Optimization

Improve NPV and recovery; reduce cost

Uncertainty Analysis

Quantify uncertainty

Understand and reduce risk


Workflow: reservoir model → sensitivity analysis (parameter sensitivities) → history matching (parameter histograms) → matched model → optimization (optimal operating conditions) → optimal model / forecast model → uncertainty assessment (uncertainty quantification).


Parameters (inputs): x1, x2, …, xn

Simulation model: y1 = f1(x1, x2, …, xn), y2 = f2(x1, x2, …, xn), …

Objective functions (outputs): y1, y2, …, yn


CMOST Process (iterative loop):

1. Parameterization — the simulation dataset is parameterized (inputs)
2. Experimental design & optimization algorithms select a combination of parameter values
3. The parameter values are substituted into the simulation dataset
4. Run simulation
5. Analyze results (outputs) via objective functions & proxy analysis (plots and formulas)
6. Repeat

General Settings


General Properties

All input files are entered on the General Properties page:

Base Dataset (required)

Master Dataset (required)

Results template files

Measured Data

Field History Files

Log Files

Fundamental Data

Fundamental data defines which 2D curves of simulation results need to be viewed:

Original time series (X-axis: time)

Property vs. distance (X-axis: distance)


Parameterization

Parameterization

Parameters are variables in the simulation model that will be adjusted when creating new datasets, e.g. porosity, permeability, etc.

To determine the location in the dataset where values should be substituted, a master dataset (.cmm) must be created. A master dataset is almost identical to a normal simulation dataset, except that CMOST keywords have been added to identify where a parameter should be substituted; it acts as a template for creating new datasets.


Master Dataset

Master Dataset

A master dataset can be created in multiple ways:

CMOST Editor

Builder

Text editor (Notepad, Textpad, etc.)


(Screenshots: list of parameters; select a dataset section; select a parameter from that section; export the master dataset/template file.)

Dataset Template editor — complementary to Builder:

Create CMOST parameters

Better syntax highlighting

Fold sections you don't need to see

Easy navigation

Navigate CMOST parameters

View include files

Parameterize include files


Original Dataset:

POR CON 0.20

Master Dataset:

POR CON <cmost>this[0.20]=Porosity</cmost>

The simulator keywords come first, followed by the CMOST portion between the <cmost> start and </cmost> end tags. There must be no spaces inside the CMOST portion. this[0.20] records the original (default) value in the dataset, and Porosity is the variable name. Variable names are case sensitive.

Formulas can also be used, with one or more variables:

POR CON <cmost>0.20*PorosityMultiplier</cmost>

Here the CMOST portion contains a formula; the default value is optional.
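To make the template idea concrete, here is a minimal sketch of how such a tag substitution could be performed. This is illustrative only — CMOST's own substitution engine is not public; the regular expression below covers just the `this[default]=Name` tag form shown on the slide.

```python
import re

def substitute(template: str, values: dict) -> str:
    """Replace <cmost>this[default]=Name</cmost> tags with parameter values.

    Illustrative sketch, not CMOST's implementation: if a parameter name is
    missing from `values`, the tag's recorded default value is used instead.
    """
    def repl(m):
        default, name = m.group(1), m.group(2)
        return str(values.get(name, float(default)))
    return re.sub(r"<cmost>this\[([^\]]+)\]=(\w+)</cmost>", repl, template)

master = "POR CON <cmost>this[0.20]=Porosity</cmost>"
print(substitute(master, {"Porosity": 0.27}))  # POR CON 0.27
print(substitute(master, {}))                  # falls back to default: POR CON 0.2
```

Each new experiment would then just be a different `values` dictionary applied to the same master dataset.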


Values in regions of the reservoir can be modified using the MOD simulation keyword:

POR CON 0.20
MOD
1:5 2:8 1:10 * <cmost>this[1]=PorosityMultiplier1</cmost>
6:10 2:8 1:10 * <cmost>this[1]=PorosityMultiplier2</cmost>

Block ranges are specified as I:I J:J K:K.

Parameter Definition

Continuous parameter: lower and upper limits define the sampling range used by the study engines

Discrete parameter: real, integer, or text; each discrete text value also requires a numerical value

Formula: value based on other parameters' values


Formula editing: syntax highlighting; shows which variables are available for building formulas; the formula can be tested and checked at any time.

Hard Constraints

Criteria that must be satisfied for a dataset to be created; used to eliminate unrealistic datasets. E.g. horizontal permeability should be greater than vertical permeability.


Pre-Simulation Commands

Passes the dataset to a separate application before submitting it to the simulator:

Run Builder silently

Run GOCAD command silently

Run user-defined command

Can be used to create new geostatistical realizations, recalculate formulas in Builder, recalculate rel. perm. curves, etc.



Objective Functions

Objective Functions

An Objective Function (OF) is something (an expression or a single quantity) for which you wish to achieve some goal — usually a minimum or maximum value. In history matching, one usually wishes to minimize the error between field data and simulation; in optimization, one usually wishes to maximize something like NPV.


Objective Functions

Basic simulation results: values taken directly from simulation results with no modification.

History match error: percentage relative error; a perfect match is 0%.

Net Present Value: a simplified NPV calculation; can be used to construct user-defined objective functions that use time-discounted simulation results as variables.

Objective Functions

Specific Dates

Date where the maximum or minimum value is found

Date when a value surpasses a specified criterion

Advanced Objective Functions

User-defined objective function based on a formula or code (JScript or Python)

Soft Constraints

Re-evaluate objective functions based on simulation results


Characteristic date times are date times, within a certain time-series range, that meet certain criteria. Examples:

First time the oil rate is higher than 100 bbl/d

Last time the SOR is greater than or equal to 6

First time the oil rate is higher than 100 bbl/d and the SOR is smaller than 4

A fixed date time (e.g. the simulation stop time, or 2014-8-15) is the same for all jobs, and objective functions are calculated at that fixed date time for every job. In some cases, however, we want information at the date time when some specified condition is met (e.g. oil rate > 100). These times are not the same for all simulation jobs — they are dynamic.


Peak NPV

Optimization

Plateau period

Oil produced at plateau

Average oil rate at plateau


(Screenshot: defining a characteristic date time — the start & stop time of the time-series range, the criteria, and the time series to which they apply.)


Characteristic date times are defined on the Basic Simulation Results page and can then be used as objective functions.

Excel-based objective functions: use an Excel spreadsheet; map CMOST parameter values to cells; map simulation results to cells. Alternatively, use an executable provided by the user (e.g. MATLAB). The calculation result can be previewed using the base case.


After each simulation is done:

1. CMOST writes the parameter values and simulation results to Excel cells;
2. Excel calculates the objective function using formulas or VBA code;
3. CMOST then reads the value back and uses it as the objective function value.

Sensitivity Analysis


Sensitivity analysis determines which parameters have an effect on results (e.g. I expect rock compressibility between values A and B — does this uncertainty impact my results?) and how much of an effect they have (e.g. if permeability is increased by 50 mD, how much will cumulative oil increase?).

Process:

Select parameters to analyze (e.g. porosity)

Select the range of values to analyze (e.g. between 20-30% porosity)

Select the results (objective functions) to analyze (e.g. cumulative oil)


One Parameter at a Time (OPAAT): each parameter is analyzed independently while the remaining parameters are set to their reference values.

Response Surface Methodology: multiple parameters are adjusted together, then the results are analyzed by fitting a response surface (polynomial equation) to them.

The OPAAT method analyzes each parameter independently: while analyzing one parameter, it freezes the other parameters at their reference values (median or default). This measures the effect of each parameter on the objective function while removing the effects of the other parameters.
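The OPAAT idea can be sketched in a few lines. This is a toy illustration, not CMOST code: `proxy_cum_oil` is a hypothetical cheap stand-in for the reservoir simulator, and the parameter names and levels are made up.

```python
# OPAAT sketch: vary one parameter across its levels while freezing the
# others at their reference values.
def proxy_cum_oil(params):  # hypothetical proxy, NOT a CMOST or CMG model
    return 1000 * params["porosity"] + 2 * params["perm"] - 50 * params["sorw"]

reference = {"porosity": 0.25, "perm": 100.0, "sorw": 0.15}
levels = {
    "porosity": [0.20, 0.25, 0.30],
    "perm": [50.0, 100.0, 150.0],
    "sorw": [0.10, 0.15, 0.20],
}

for name, vals in levels.items():
    responses = []
    for v in vals:
        trial = dict(reference, **{name: v})  # freeze others at reference
        responses.append(proxy_cum_oil(trial))
    effect = max(responses) - min(responses)  # bar length in a tornado plot
    print(f"{name}: effect = {effect:.1f}")
```

The `effect` value per parameter is what an OPAAT tornado plot ranks.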


Benefits

Simple to use

Results easy to understand

Results not complicated by the effects of other parameters

Drawbacks

Results are focused around the reference values

Results can change dramatically if the reference values change

Non-monotonic variables require many levels of parameter values to be tested

(Screenshot: selecting the reference values; objective function vs. parameter plot.)


Example: Porosity = 0.2 → CumOil = 33,004 bbl; Porosity = 0.25 (reference value) → CumOil = 40,416 bbl; Porosity = 0.3 → CumOil = 44,176 bbl.

Minimum and maximum objective-function values do not always correspond with the minimum and maximum parameter values — check the cross plots to verify.


Tornado bar length: the maximum change in the objective function over the parameter range.

Response Surface Methodology looks at the correlation between the response and the parameters, e.g. NPV = f(x1, x2, …, xn). The response surface is a proxy for the reservoir simulator that allows fast estimation of the response.


Multiple parameters are adjusted simultaneously. The combinations of parameter values are chosen based on an experimental design, and a response surface (polynomial equation) is fit to the simulation results:

Linear

Linear + Quadratic

Linear + Quadratic + Interaction terms

Linear Model:

y = a0 + a1·x1 + a2·x2 + … + ak·xk

Linear + Quadratic Model:

y = a0 + Σ_{j=1..k} aj·xj + Σ_{j=1..k} ajj·xj²

Linear + Quadratic + Interaction:

y = a0 + Σ_{j=1..k} aj·xj + Σ_{j=1..k} ajj·xj² + Σ_{i<j} aij·xi·xj

Model type is automatically chosen but can be changed if necessary
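Fitting such a response surface is ordinary least squares. Below is a small pure-Python sketch for the 1-D linear + quadratic case using normal equations and Gauss–Jordan elimination — CMOST's fitting internals are not public, so this only illustrates the principle.

```python
# Sketch: fit y = a0 + a1*x + a2*x^2 to data by least squares.
def fit_quadratic(x, y):
    n = len(x)
    # Design matrix columns: [1, x, x^2]
    cols = [[1.0] * n, list(x), [v * v for v in x]]
    # Normal equations: (C^T C) a = C^T y
    a = [[sum(ci[k] * cj[k] for k in range(n)) for cj in cols] for ci in cols]
    b = [sum(c[k] * y[k] for k in range(n)) for c in cols]
    # Gauss-Jordan elimination (no pivoting; fine for this well-posed demo)
    for i in range(3):
        piv = a[i][i]
        a[i] = [v / piv for v in a[i]]
        b[i] /= piv
        for r in range(3):
            if r != i:
                f = a[r][i]
                a[r] = [a[r][j] - f * a[i][j] for j in range(3)]
                b[r] -= f * b[i]
    return b  # [a0, a1, a2]

coeffs = fit_quadratic([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 9.0, 19.0])
print([round(c, 6) + 0.0 for c in coeffs])  # data from y = 1 + 2x^2 -> [1.0, 0.0, 2.0]
```

With more parameters the design matrix simply gains columns for each linear, quadratic, and interaction term.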


The engine handles problematic simulation runs. Specify the desired accuracy, and the engine will create and run experiments as needed.


To compare effects between different parameters, parameter ranges are normalized between -1 and 1. In the resulting tornado plot, 2× the coefficient of the normalized polynomial relation is shown. For linear effects this represents the average change in the objective function going from the minimum to the maximum parameter value, similar to the bar length from the OPAAT method.


Example: increasing PERMH_L1 (permeability) from 2625 mD to 4375 mD results in an increase in Cumulative Oil of 12,461 STB on average.

If a parameter has a non-linear relation with the objective function, a quadratic term may also be given (x²). If modifying two parameters at the same time has an effect stronger than the sum of their individual linear or quadratic effects, a cross term may be given (x·y).


X-axis: Parameter

Y-axis: Objective Function

Any Questions?


Control Centre

Engine Settings

Defines the task type. The task type can be modified from what was originally selected when creating the study, and any other options related to the engine can be modified from this page.


Engine Settings

Study Type

Engine

Simulator Settings

Simulation related settings:

Schedulers

Simulator version

Number of CPUs per job

Maximum simulation run time

Job record and file management

Data I/O Cleanup


CMG Scheduler

Microsoft HPC

IBM Platform LSF

Oracle Grid Engine

Portable Batch System (PBS/TORQUE)

Switches to reduce output file size

Disable grid records writing in OUT

Disable grid records writing in SR2


Benefits

Remove I/O bottleneck

Reduce occurrence of strange problems

Reduce support troubleshooting time

Experiments Table

List of Experiments

Parameter values used

Objective function results

Able to sort and filter results

Open in Builder and Results

Add additional experiments

User defined

Predefined experimental designs

(fractional factorial, latin hypercube, etc.)


Simulation Jobs

List of Simulations

Scheduler information

Start Time

End Time

Scheduler Name

Status

File information

Name and location

Normal/Abnormal termination

Any Questions?


All result objects are dynamically created on the fly using the data stored in the experiments table, and all types of result objects are available for any study type. HM & OP studies will automatically have sensitivity and proxy results if there are enough experiments.


Parameters

Run progress

Parameter value vs. experiment number

Histogram

Frequency that range of parameter values

were chosen

Cross plot

Plot parameter values vs. other data

Current parameter always on y-axis

Time Series and Property vs. Distance

Plots of simulation results as defined in the Fundamental Data section


Objective Functions

Run progress

Objective function value vs. experiment number

Histogram

Frequency that range of objective function values occurred

Cross plot

Plot objective function values vs. other data

Current objective function always on y-axis

OPAAT Analysis

Results from One Parameter At A Time sensitivity analysis

Proxy Analysis

Proxy verification

Response Surface sensitivity analysis results

Monte Carlo Simulation uncertainty assessment results


Statistics

Polynomial proxy

Linear/Quadratic/Interaction

When using the response surface methodology, one should verify that the response surface provides a valid match to the simulation data. This can be verified through:

Response Surface Verification Plot

Summary of Fit Table

Analysis of Variance Table

Effect Screening


Gives a visual overview of how the proxy model fits the actual simulation results. The distance from each point to the 45-degree line is the error/residual for that point; points that fall on the line are perfectly predicted.


R² measures the proportion of the variability of the response explained by the regressor variables in the model.

R² of 1 occurs when there is a perfect fit (the errors are all zero).

R² of 0 means that the model predicts the response no better than the overall response mean.

R²adjusted allows comparing models with different numbers of regressors by using the degrees of freedom in its computation:

R²adjusted = 1 − (1 − R²)(n − 1)/(n − p)

Here n is the number of observations (training jobs) and p is the number of terms in the response model (including the intercept). When R² and R²adjusted differ dramatically, there is a good chance that non-significant terms have been included in the model.


R²adjusted (Example)

4 samples (n = 4):

Linear (p = 2): R² = 0.9683 → R²adjusted = 1 − (1 − 0.9683)(4 − 1)/(4 − 2) ≈ 0.952

Quadratic (p = 3): R² = 0.9715 → R²adjusted = 1 − (1 − 0.9715)(4 − 1)/(4 − 3) ≈ 0.915

The quadratic model has the higher R² but the lower R²adjusted, so the extra term is not justified.
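The adjusted R² formula above is a one-liner; this small sketch reproduces the slide's 4-sample numbers.

```python
def r2_adjusted(r2, n, p):
    """Adjusted R-squared: penalizes extra model terms.
    n = number of observations, p = number of terms incl. intercept."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p)

# The slide's example: the quadratic model has the higher R2 but the
# lower adjusted R2, so the extra term is not worth keeping.
print(r2_adjusted(0.9683, 4, 2))  # linear model
print(r2_adjusted(0.9715, 4, 3))  # quadratic model
```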

R²prediction gives some indication of the predictive capability of the regression model. For example, we could expect a model with R²prediction = 0.95 to explain about 95% of the variability in predicting new observations.


To calculate PRESS:

Select an observation i.

Fit the regression model to the remaining n−1 observations and use this equation to predict the withheld observation yi.

Denoting this predicted value by y(i), the prediction error for point i is e(i) = yi − y(i); it is often called the ith PRESS residual.

Repeat for each observation i = 1, 2, …, n, producing a set of n PRESS residuals e(1), e(2), …, e(n).

The PRESS statistic is then the sum of squares of the n PRESS residuals, PRESS = Σ e(i)², and R²prediction = 1 − PRESS/SS(Total).

R²prediction Example

  x      y      y(i)     e(i)     e(i)²
  1.000  1.900  1.657    0.243    0.059
  2.000  2.450  2.484   -0.034    0.001
  3.000  2.950  3.194   -0.244    0.060
  4.000  3.890  3.483    0.407    0.165

PRESS = Σ e(i)² = 0.285
SS (Total) = 2.143
R²prediction = 1 − 0.285/2.143 = 0.867
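The leave-one-out procedure behind this table can be sketched directly for a simple linear fit; the code below reproduces the example's PRESS and R²prediction values.

```python
def press_r2_prediction(x, y):
    """Leave-one-out PRESS and R2_prediction for a 1-D linear fit.

    For each point i, refit the least-squares line to the other n-1
    points, predict the withheld point, and accumulate the squared
    prediction errors.
    """
    n = len(x)
    press = 0.0
    for i in range(n):
        xs = [x[j] for j in range(n) if j != i]
        ys = [y[j] for j in range(n) if j != i]
        mx, my = sum(xs) / (n - 1), sum(ys) / (n - 1)
        b = (sum((a - mx) * (c - my) for a, c in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
        a0 = my - b * mx
        press += (y[i] - (a0 + b * x[i])) ** 2
    mean_y = sum(y) / n
    ss_total = sum((v - mean_y) ** 2 for v in y)
    return press, 1.0 - press / ss_total

x = [1.0, 2.0, 3.0, 4.0]
y = [1.90, 2.45, 2.95, 3.89]
press, r2p = press_r2_prediction(x, y)
print(round(press, 3), round(r2p, 3))  # 0.285 0.867, matching the slide
```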


Mean of Response

The overall mean of the response values. It is important as a base model for prediction because all other models are compared to it.

Standard Error

Estimates the standard deviation of the random error. It is the square root of the Mean Square for Error in the corresponding Analysis of Variance table. Standard error is commonly denoted as σ.

Degrees of Freedom

Total = Number of Samples − 1

Model = Number of coefficients for the response surface (not including the intercept)

Error = Total − Model

Sum of Squares

Total: sum of squared distances of each response from the sample mean

Error: sum of squared differences between the fitted (RS) values and the actual simulated values

Model = Total − Error

Mean Square

(Sum of Squares)/(Degrees of Freedom): converts a sum of squares to an average


F Ratio

The Model mean square divided by the Error mean square. Tests the hypothesis that all the regression parameters (except the intercept) are zero, i.e. have no effect on the objective function.

Prob > F

The probability of obtaining a greater F-value by chance alone if the specified model fits no better than the overall response mean. Significance probabilities of 0.05 or less are often considered evidence that there is at least one significant regression factor in the model.


Check Predicted vs. Actual: is there a good fit? Any outliers? Are R²adjusted and R²prediction large enough (>0.5)? For a decent model, Prob > F should be very small (<0.0001).

Parameters (-1, +1)

Coefficient

Coefficients of the response surface model found by least squares

Standard Error

Estimate of the standard deviation of the distribution of the

parameter estimate (coefficient)

t Ratio

Statistic that tests whether the true parameter (coefficient) is zero.

Prob > |t|

Probability of getting an even greater t-statistic (in absolute value), given the hypothesis that the parameter (coefficient) is zero. Probabilities less than 0.1 are often considered significant evidence that the parameter (coefficient) is not zero. Used to filter statistically insignificant terms.


Parameters (-1, +1)

VIF (Variance Inflation Factor)

A measure of multi-collinearity. Multi-collinearity refers to one or more near-linear dependences among the regressor variables due to poor sampling of the design space; it can have serious effects on the estimates of the model coefficients and on the general applicability of the final model. The larger the variance inflation factor, the more severe the multi-collinearity. It is suggested that variance inflation factors should not exceed 4 or 5. If the design matrix is perfectly orthogonal, the variance inflation factor for all terms will be equal to 1.



Proxy Dashboard

See a quick estimation of the effects of parameters on results at all simulation times; see results immediately, without waiting for additional simulations. The dashboard can assist in manual history matching or optimization, and CMOST can create a dataset and run a simulation to verify the proxy results.


Proxy Dashboard workflow: build the proxy model → select a comparison case → what-if scenario: choose the parameter input, estimate the objective functions, and visualize the estimated curve.

The curve is divided into 100 points, and a response surface model (polynomial or RBF neural network) is created for each point based on the simulation results. When parameter values are adjusted, each point along the curve is recalculated using the response surface.


Any Questions?

Optimization


In history matching, we are trying to reduce the error between the simulation results and field measured data. By matching the simulation model to the historical behaviour, we have more confidence that the model will be able to predict future behaviour. When creating a simulation model there may be uncertainty in the input parameters; these are the parameters that should be adjusted when history matching.

Select parameters to analyze (e.g. porosity, permeability)

Select the range of values to analyze (e.g. between 20-30% porosity)

Select the results (objective functions) to match (e.g. cumulative oil)

CMOST will search for the combination of parameter values that gives the lowest history match error.


A hierarchy is used when optimizing the objective function; upper terms are calculated as a weighted average of the lower terms.

(Hierarchy diagram: a Global Objective Function is a weighted average of Local Objective Functions 1-3, each of which is in turn a weighted average of its Terms.)

Typically, common items are grouped together. E.g. the local objective functions might represent the error for a well, and the terms might represent the measured data for that well.

Example: Total Error = weighted average of Well 1 Error (oil production error, water production error), Well 2 Error (gas production error, bottom-hole pressure error, oil production error), and Well 3 Error (oil, water, and gas production errors).


The term error compares simulated and measured values:

TermError = sqrt( Σ_{t=1..Nt} (Yt^simulated − Yt^measured)² / Nt )

Take the difference between the simulated and measured result at each point; square the terms to make them positive; sum over all points at all times; divide by the number of measurements Nt to get the average square; take the square root to get the average error.

To compare different types of measured data, the term error must be normalized. This is done by dividing by the maximum difference in measured values; measurement error can also be included:

TermError_j = sqrt( Σ_{t=1..Nt(j)} (Y_{j,t}^s − Y_{j,t}^m)² / Nt(j) ) / ( ΔY_j^m + 4·Merr_j )

where ΔY_j^m is the maximum difference in measured values and Merr_j is the measurement error. The factor of 4 is used to include 2 standard deviations on each side of the mean (95% confidence).
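The normalized term error can be sketched in a few lines; this is a reading of the slide's formula, not CMOST code, and the sample rate data are made up.

```python
import math

def term_error(sim, meas, merr=0.0):
    """Normalized history-match term error (sketch of the slide's formula).

    RMS of (simulated - measured), normalized by the spread of the
    measured data plus 4x the measurement error (~2 std devs each side).
    """
    n = len(meas)
    rms = math.sqrt(sum((s - m) ** 2 for s, m in zip(sim, meas)) / n)
    spread = max(meas) - min(meas)
    return rms / (spread + 4.0 * merr)

# Hypothetical simulated vs. measured rates at three report times:
sim = [10.0, 12.0, 15.0]
meas = [11.0, 12.0, 14.0]
print(term_error(sim, meas))
```

A perfect match gives 0; larger measurement error (`merr`) makes the same misfit count for less.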


The Local Objective Function Qi is a weighted arithmetic average of the term errors:

Q_i = 100% × Σ_{j=1..N(i)} tw_{i,j}·TermError_{i,j} / Σ_{j=1..N(i)} tw_{i,j}

where each term error is the normalized error defined above:

TermError_{i,j} = sqrt( Σ_{t=1..Nt(i,j)} (Y_{i,j,t}^s − Y_{i,j,t}^m)² / Nt(i,j) ) / ( ΔY_{i,j}^m + 4·Merr_{i,j} )

55

07/01/2015

The Global Objective Function is a weighted arithmetic average of the Local Objective Functions:

Q_global = Σ_{i=1..Nw} w_i·Q_i / Σ_{i=1..Nw} w_i

Each measured data point can also be weighted individually, to remove or reduce the weight of outliers.
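Both aggregation levels are the same weighted-average operation; a minimal sketch (the per-well error values and weights below are made up):

```python
def weighted_average(values, weights):
    """Weighted arithmetic average, as used for both the local and the
    global objective functions in the hierarchy above (sketch)."""
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Hypothetical local objective functions (per-well errors, %) and weights:
q = [5.0, 10.0, 20.0]
w = [1.0, 2.0, 1.0]
print(weighted_average(q, w))  # (5 + 20 + 20) / 4 = 11.25
```

Setting a weight to zero drops that well (or data point) from the match entirely.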


Optimization Methods

CMG DECE (Designed Evolution, Controlled

Exploration)

Particle Swarm Optimization

Latin Hypercube plus Proxy Optimization

Random Brute Force Search

Optimization Philosophy

Mathematical optimization

Mathematicians are particularly interested in finding the true absolute optimum: an optimum of 0.000001 is much better than 0.01, even though it may take 20 extra days to achieve the former.

Engineering optimization

Engineers are more interested in quickly finding optima that are close to the true optimum: an optimum of 0.01 is much better than 0.000001 if it takes 20 fewer days to achieve the former. Engineering optimization is not intended to solve pure mathematical problems.


(Flattened flowchart:) generate an initial Latin hypercube design → run simulations using the design → get an initial set of training data → loop: if not successful, switch to exploration (get more information); run simulations; add new solutions to the training data; repeat until the stop criterion is met → stop.


DECE Characteristics

Handles hard constraints

Asynchronous complete utilization of distributed computing

power

Fast and stable convergence

Particle Swarm Optimization: a population-based stochastic optimization technique developed in 1995 by James Kennedy and Russell Eberhart. Particles move towards the best positions in the search space, remembering each particle's best known position and the swarm's best known position.
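The canonical PSO update can be sketched briefly. This is the textbook algorithm, not CMG's engine; the test function and coefficients are illustrative choices.

```python
import random

random.seed(42)

# Minimal canonical PSO: minimize f(x, y) = x^2 + y^2 over [-5, 5]^2.
def f(p):
    return p[0] ** 2 + p[1] ** 2

n, iters, w, c1, c2 = 20, 60, 0.7, 1.5, 1.5  # swarm size, inertia, pulls
pos = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(n)]
vel = [[0.0, 0.0] for _ in range(n)]
pbest = [p[:] for p in pos]          # each particle's best known position
gbest = min(pbest, key=f)[:]         # the swarm's best known position

for _ in range(iters):
    for i in range(n):
        for d in range(2):
            # Velocity: inertia + pull toward pbest + pull toward gbest
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i][:]
            if f(pos[i]) < f(gbest):
                gbest = pos[i][:]

print(f(gbest))  # close to the true optimum of 0
```

In the CMOST setting, `f` would be a simulation run and each particle a combination of parameter values.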



Algorithm (flattened flowchart): generate an initial Latin hypercube design → build a proxy (polynomial or RBF neural network) → optimize on the proxy → run simulations to validate proposed solutions → add validated solutions to the training data → repeat until the stop criterion is met → stop.


Proxy Optimization

Random (Brute Force) Search: parameter value combinations are chosen randomly until the maximum number of simulator calls has been reached. The results show no trend (scatter); only use this if the search space is small.


Any Questions?

Optimization Goals

In both history matching and optimization, one would like to find the maximum or minimum of an objective function. In history matching, the goal is to minimize the error between the simulation results and field measured data. In optimization, typical goals are to find the maximum NPV, the maximum recovery, etc. The parameters adjusted are operational parameters, as opposed to the reservoir parameters adjusted when history matching.


Optimization Process

Select parameters to adjust (e.g. injection rate, well spacing)

Select the range of values to analyze (e.g. between 200-500 bbl/day injection rate)

Select the results (objective functions) to improve (e.g. NPV, recovery factor)

CMOST will search for the combination of parameter values that maximizes your objective function. In some cases we may instead want to minimize an objective function, such as when looking at run times during numerical tuning.

Net Present Value (NPV) is often used as an economic indicator to evaluate the value of a project. A discount rate (I) is used to incorporate the time value of money: money now is worth more than money later.


Cash flows are discounted based on the period that has been selected: daily, monthly, quarterly, or yearly. The yearly discount rate is converted to the period of interest — e.g. daily, using the standard compounding conversion I_daily = (1 + I_yearly)^(1/365) − 1.
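The rate conversion and the discounting itself can be sketched together. This is a standard textbook NPV calculation under compounding, not necessarily CMOST's exact formula; the cash-flow numbers are made up.

```python
def periodic_rate(yearly_rate, periods_per_year):
    """Convert a yearly discount rate to the equivalent compounded
    per-period rate (standard conversion; CMOST's exact formula may differ)."""
    return (1.0 + yearly_rate) ** (1.0 / periods_per_year) - 1.0

def npv(cash_flows, yearly_rate, periods_per_year):
    """Discount per-period cash flows back to present value."""
    i = periodic_rate(yearly_rate, periods_per_year)
    return sum(cf / (1.0 + i) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical quarterly cash flows of $100 for 2 years at 10%/yr:
print(round(npv([100.0] * 8, 0.10, 4), 2))
```

Because money later is worth less, the result is below the undiscounted total of $800.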

Available engines — DECE, LHD Plus Proxy OP, PSO, and Random Brute Force — all use the same stop criterion. Select which objective function to optimize, and whether to maximize or minimize it.


Multiple-objective optimization: optimizing multiple, possibly conflicting, objective functions simultaneously. Two approaches:

1. Optimize an aggregated global objective function
2. Pareto optimization, e.g. multiple-objective PSO

Multi-Objective PSO setup: select the engine, how many simulations to run, and for each objective function whether to maximize or minimize it.


Domination: better in every objective function

Leader: a non-dominated solution

Pareto front: the ensemble of leaders

(Plot: f1 vs. f2 showing dominated solutions, leaders, and the Pareto front; a dominates b.)

Leader Selection

A leader is randomly selected for each particle. The least crowded leaders are given high priority, trying to find an adequately spread Pareto front.

(Plot: f1 vs. f2 with leaders a-f; the least crowded leaders, a and f, get priority.)


Numerical tuning of a SAGD model for a real field: the full model takes >42 days to run; after cleanup & tuning it finishes in 7 days (SPE 165511). A sub model of 4 slices (4x600x50) takes ~1 hour to run the first 6 months.

Colin Card et al., "A New and Practical Workflow of Large Multi-Pad SAGD Simulation - A Corner Oil Sands Case Study", SPE HOCC, SPE 165511, June 2013.

Conflicting objective functions:

1. Run time
2. Material Balance Error
3. Solver Failure Percent

GlobalObj = (50,000*MaterialBalanceError + 1*RunTime + 1,000*SolverFailurePercent)/51,001

Use PSO to minimize GlobalObj.


MO-PSO Optimization

Same set of parameters; minimize two conflicting objective functions, MaterialBalanceError & RunTime. GlobalObj & SolverFailurePercent are also calculated, just for comparison.

(Plot: RunTime, approx. 2500-3500, vs. MaterialBalanceError, approx. 0-0.04, comparing MOPSO and SPSO experiments.)


Any Questions?

Uncertainty Assessment


Once an optimum operating strategy has been developed for one or more history-matched models, the question remaining is: given residual uncertainties in the HM (or other) variables, what impact will those uncertainties have on the NPV of the optimum case(s)?

How does this work?

Even with simple geological models we are still likely to have more than one set of geological parameter values that give an acceptable HM, thus indicating some uncertainty in these values. If we have more than one geological realization that gives an acceptable HM, we have by definition an uncertainty as to which realization best reflects reality; thus we need to see how the NPV of our optimum cases is impacted by the uncertainty in these realizations. It is important to recognize that the HM process develops alternative realizations, and that parameters which are part of these realizations cannot have their values changed independently and arbitrarily.


Uncertainty Assessment



(Monte Carlo illustration: input probability distributions, derived from experience, are defined for parameters such as DWOC, SORG, and SORW; parameter values are sampled so that they follow the input distributions, NPV is calculated for each sample, and the process is repeated over many iterations.)

The probability distribution defines the likelihood of a parameter value being selected in Monte Carlo Simulation. For continuous parameters the available types are:

Uniform

Triangle

Normal

Log Normal

Custom

For discrete parameter types, a probability is defined for each value.
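Sampling from these distributions and pushing each sample through a fast proxy is the heart of the Monte Carlo step. A minimal sketch, using Python's standard library distributions; `proxy_npv` and all parameter ranges are hypothetical, not a CMOST model.

```python
import random

random.seed(0)

def proxy_npv(por, sorw):  # hypothetical cheap proxy, NOT a reservoir model
    return 100.0 * por - 50.0 * sorw

samples = []
for _ in range(10000):
    por = random.triangular(0.20, 0.30, 0.25)  # triangle distribution
    sorw = random.normalvariate(0.15, 0.02)    # normal distribution
    samples.append(proxy_npv(por, sorw))

# Empirical percentiles of the resulting NPV distribution:
samples.sort()
p10, p50, p90 = (samples[int(len(samples) * q)] for q in (0.1, 0.5, 0.9))
print(f"P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f}")
```

The spread between P10 and P90 is the uncertainty quantification the slides refer to.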


Normal

Lognormal

Triangle

Uniform

Custom

Discrete

Parameter Correlations

Some parameters may be related to each other; e.g. porosity and permeability may correlate. The correlation coefficient defines how closely related parameters are: a value of 1 means the parameters are directly related, while a value of 0 means they have no relation with each other.


Correlation matrices (POR, PERMH, PERMV):

No correlation (identity):
          POR    PERMH  PERMV
  POR     1      0      0
  PERMH   0      1      0
  PERMV   0      0      1

Desired Values:
          POR    PERMH  PERMV
  POR     1      0.6    0.4
  PERMH   0.6    1      0.8
  PERMV   0.4    0.8    1

Calculated Values:
          POR    PERMH  PERMV
  POR     1      0.582  0.385
  PERMH   0.582  1      0.79
  PERMV   0.385  0.79   1


Parameter Correlations

(Cross plots illustrating correlation coefficients of 0.0, 0.25, 0.50, and 0.75.)

Two ways to evaluate the response for Monte Carlo Simulation:

Response surface as a proxy to reservoir simulation

Running reservoir simulation

Running reservoir simulation is often too slow to evaluate the response, so the proxy method is often preferred. Situations where reservoir simulation is run for Monte Carlo simulation:

You want to validate the MCS-proxy result

Building a proxy is not feasible — for example, when multiple geostatistical realizations or multiple HM models are used


How do we do this?

The RS is generated by creating an experimental design and providing a statistical distribution for each uncertain parameter. The process is the same as polynomial response surface modelling for sensitivity analysis; RBF neural network proxy models are also available for Uncertainty Assessment. A Monte Carlo simulation is then run to determine the NPV distribution.

The engine handles problematic simulation runs. Specify the desired accuracy, and the engine will create and run experiments as needed.


Manual Engine (User Defined)

Experiments are created by the user explicitly through:

Classical experimental design

Latin hypercube design

Manual entry

External Engine

Allows the use of the user's own optimization algorithm.


Use classical experimental design for SA and UA

Precise control over the number of Latin hypercube experiments

Run additional experiments after a SA/UA/HM/OP run is complete

Create new experiments using the user's own optimization algorithm


Base Files

To begin a CMOST project, a completed simulation dataset (.dat) along with its Simulation Results (SR2: .mrf, .irf) files are required.

CMOST Project

A CMOST Project is the main CMOST file and can contain multiple related studies.

Project Name: SAGD_2D_UA
Project File: SAGD_2D_UA.cmp
Project Folder: SAGD_2D_UA.cmpd

Best practice: all files related to the project should be stored in the project folder.


CMOST Study

A CMOST study contains all of the input information for CMOST to run
a particular type of task.

- Information can be copied between studies
- Study types can be easily switched
- The new study type will use as much information from the previous
  study type as possible

Study Name: BoxBen
Study File: BoxBen.cms
Study File Auto Backup: BoxBen.bak
Study Folder: BoxBen.cmsd

Don't modify or delete files in the study folder unless you know
what you're doing.


Vector data repository file: *.vdr

- The VDR stores compressed simulation data required for objective
  function calculations
- It is a subset of the SR2 results
- Never modify or delete .vdr files manually

Auto Synchronization and Auto/Manual Reprocessing

After you run an engine, you can go back and change the input data.
Experiments inside a study will be automatically reused. If new
parameters are added, you need to resolve the reuse of pending
experiments. After you finish the changes, click Start Engine to
restart.


Licensing Multiplier

CMOST uses only partial licenses when running simulations, e.g. it
can run 2 STARS simulations while using only 1 STARS license. This
also applies to other license types (Parallel, Dynagrid, etc.).

Simulator   Multiplier   Concurrent runs per license
IMEX        4:1          x4
GEM         2:1          x2
STARS       2:1          x2


Quality Data

Quality Result

Further Assistance

- Email: support@cmgl.ca
- Zip an entire project or selected studies
- Email or ftp the zip file to CMG


Any Questions?

