Summary
History matching and uncertainty quantification are currently two
important research topics in reservoir simulation. In the Bayesian
approach, we start with prior information about a reservoir (e.g.,
from analog outcrop data) and update our reservoir models with
observations (e.g., from production data or time-lapse seismic).
The goal of this activity is often to generate multiple models that
match the history and use the models to quantify uncertainties in
predictions of reservoir performance. A critical aspect of generating multiple history-matched models is the sampling algorithm
used to generate the models. Algorithms that have been studied
include gradient methods, genetic algorithms, and the ensemble
Kalman filter (EnKF).
This paper investigates the efficiency of three stochastic sampling
algorithms: the Hamiltonian Monte Carlo (HMC) algorithm, the Particle
Swarm Optimization (PSO) algorithm, and the Neighbourhood
Algorithm (NA). HMC is a Markov chain Monte Carlo (MCMC)
technique that uses Hamiltonian dynamics to achieve larger jumps
than are possible with other MCMC techniques. PSO is a swarm
intelligence algorithm that uses similar dynamics to HMC to guide
the search but incorporates acceleration and damping parameters
to provide rapid convergence to possible multiple minima. NA is a
sampling technique that uses the properties of Voronoi cells in high
dimensions to achieve multiple history-matched models.
The algorithms are compared by generating multiple history-matched reservoir models and comparing the Bayesian credible
intervals (P10/P50/P90) produced by each algorithm. We show
that all the algorithms are able to find equivalent match qualities
for this example but that some algorithms are able to find good
fitting models quickly, whereas others are able to find a more
diverse set of models in parameter space. The effects of the different sampling of model parameter space are compared in terms of
the P10/P50/P90 uncertainty envelopes in forecast oil rate.
These results show that algorithms based on Hamiltonian
dynamics and swarm intelligence concepts have the potential to be
effective tools in uncertainty quantification in the oil industry.
Introduction
History matching and uncertainty quantification are currently two
important research topics in reservoir simulation. Automated and
assisted history-matching concepts have been developed over
many years [e.g., Oliver et al. (2008)]. In recent years, research
has focused on quantifying uncertainty by generating multiple history-matched reservoir models rather than seeking only the best
history-matched model. A practical reason for using multiple
history-matched models is that a single model, even if it is the
best history-matched model, may not provide a good prediction
(Tavassoli et al. 2004).
Most uncertainty quantification studies use a Bayesian approach,
where we start with prior information about a reservoir expressed as
probabilities of unknown input parameters to our model. These prior
probabilities can come from a number of sources (e.g., from analog
outcrop data). We then update our models with observations O, using
Bayes' rule to obtain the posterior probability of a model m:

p(m | O) = p(O | m) p(m) / p(O). . . . . . . . . . . . . . . . . . . . . . . . . . . (1)
The quality of the match to the observed data is measured by a
least-squares misfit,

M = Σ_{t=1}^{T} (q_t^obs − q_t^sim)² / (2σ²), . . . . . . . . . . . . . . . . . . . . . . (2)

where q_t^obs and q_t^sim are the observed and simulated rates at time t
and σ is the measurement-error standard deviation.
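As a concrete illustration, the least-squares misfit of Eq. 2 can be evaluated in a few lines; the function name and the synthetic rate data below are hypothetical, not from the paper:

```python
import numpy as np

def misfit(q_obs, q_sim, sigma):
    """Least-squares misfit M = sum_t (q_t_obs - q_t_sim)^2 / (2 sigma^2), Eq. 2."""
    return float(np.sum((np.asarray(q_obs) - np.asarray(q_sim)) ** 2)
                 / (2.0 * sigma ** 2))

# Hypothetical monthly oil rates (STB/D): observed vs. simulated
q_obs = [1200.0, 1100.0, 950.0, 900.0]
q_sim = [1150.0, 1120.0, 980.0, 880.0]
M = misfit(q_obs, q_sim, sigma=50.0)   # -> 0.84
```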
The PSO velocity update for particle i at iteration k is

v_i^{k+1} = w v_i^k + c1 rand1 (pbest_i − x_i^k) + c2 rand2 (gbest^k − x_i^k), . . . . . (3)
where:
v_i^k is the velocity of particle i at iteration k;
x_i^k is the position of particle i at iteration k;
c1 is a weighting factor, termed the cognition component, which represents the acceleration constant that changes the velocity of the particle towards pbest_i;
c2 is a weighting factor, termed the social component, which represents the acceleration constant that changes the velocity of the particle towards gbest^k;
rand1 and rand2 are two random vectors with each component corresponding to a uniform random number between 0 and 1;
pbest_i is the personal best position found by particle i;
gbest^k is the global best of the entire swarm at iteration k;
and w is an inertial weight that influences the convergence of
the algorithm.
Initially, the velocity vectors are randomly generated with v_i^0 ∈ [−v_max, v_max]. If any component of a particle's velocity violates
the limit v_max, it is set back to the limit.
Step 6: Update the position for one particle by using
x_i^{k+1} = x_i^k + v_i^{k+1}. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . (4)
Step 7: Repeat Steps 2 to 6 until the maximum number of
iterations is reached or another stopping criterion is attained.
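The loop above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the swarm size, coefficient values, bounds, and the sphere-function objective are illustrative assumptions, and a real run would call the reservoir simulator inside `f`:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(f, lo, hi, n_particles=30, n_iter=100, w=0.72, c1=1.49, c2=1.49):
    """One possible PSO loop implementing the velocity update (Eq. 3),
    velocity clamping, and the position update (Eq. 4)."""
    dim = len(lo)
    v_max = 0.1 * (hi - lo)
    x = rng.uniform(lo, hi, (n_particles, dim))          # random initial swarm
    v = rng.uniform(-v_max, v_max, (n_particles, dim))   # random initial velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # Eq. 3
        v = np.clip(v, -v_max, v_max)        # reset components violating v_max
        x = np.clip(x + v, lo, hi)           # Eq. 4, kept inside the prior box
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, float(pbest_f.min())

# Usage on a stand-in objective
best, best_f = pso_minimize(lambda p: float(np.sum(p**2)),
                            np.array([-5.0] * 3), np.array([5.0] * 3))
```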
Because PSO was developed as an optimization tool, we
have to extend the algorithm to quantify uncertainty in reservoir
modeling. We choose to extend the algorithm using the same
concepts as the NA by running the NAB code, which computes
March 2010 SPE Journal
The Hamiltonian dynamics follow

u̇ = −∂H(x, u)/∂x . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . (6)

and

ẋ = ∂H(x, u)/∂u. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . (7)
x(t + Δt) = x(t) + Δt u(t) + (Δt²/2) ẍ(t). . . . . . . . . . . . . . . . . . . . . . . (8)
Although the leapfrog algorithm is time-reversible and
volume-preserving, it preserves energy only to second order in the
step size, so the proposed state (x^{k+1}, u^{k+1}) is subjected to a
Metropolis accept/reject test at each iteration.
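A minimal leapfrog integrator for the dynamics of Eqs. 6 and 7 might look like this. The quadratic "misfit" below is a toy stand-in, assumed purely for illustration (the paper's potential is a reservoir-simulation misfit):

```python
import numpy as np

def leapfrog(x, u, grad_M, dt, n_steps):
    """Leapfrog integration of dx/dt = u, du/dt = -dM/dx.
    Time-reversible and volume-preserving; energy error is O(dt^2)."""
    u = u - 0.5 * dt * grad_M(x)      # initial half-step in momentum
    for _ in range(n_steps - 1):
        x = x + dt * u                # full step in position
        u = u - dt * grad_M(x)        # full step in momentum
    x = x + dt * u
    u = u - 0.5 * dt * grad_M(x)      # final half-step in momentum
    return x, u

# Toy quadratic "misfit" M(x) = x^2/2, so the dynamics are a harmonic
# oscillator: x(t) = cos(t), u(t) = -sin(t) from x(0)=1, u(0)=0.
grad_M = lambda x: x
x, u = leapfrog(np.array([1.0]), np.array([0.0]), grad_M, dt=0.01, n_steps=100)
```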
Computing the Gradients. Implementing HMC requires us to
compute the gradient of the negative logarithm of the posterior
distribution, which is the sum of the misfit and the negative log of
the prior. There are two approaches that can be followed to compute this term: If gradients of the solution with respect to uncertain
variables are available (e.g., from an adjoint code), then this term
can be computed directly; alternatively, it can be computed less
accurately from a proxy.
In the application presented here, we use a general regression
neural network (GRNN) (Specht 1991; Nadaraya 1964; Watson
1964) with Gaussian kernels to approximate the misfit surface
and then compute the gradient from the approximated surface. A
critical step in the regression is that of computing the kernel-width
parameter. In our application, we used a fraction of the average
distance between points in each dimension. More sophisticated
methods are available [e.g., Kanevski and Maignan (2004)]. It is
important to note that there are some restrictions on updating the
surface to ensure consistency with the MCMC assumptions.
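A Nadaraya–Watson/GRNN proxy of the misfit surface with Gaussian kernels, and its analytic gradient, could be sketched as below. The function name, kernel width, and sample points are illustrative assumptions, not the authors' code:

```python
import numpy as np

def grnn_and_grad(x, X, M, h):
    """Nadaraya-Watson / GRNN estimate of the misfit at x from sampled
    points X (n x dim) with misfits M (n,), using Gaussian kernels of
    width h, together with the analytic gradient of the estimate."""
    d = X - x                                     # displacements X_i - x
    w = np.exp(-np.sum(d**2, axis=1) / (2 * h**2))
    S, T = w.sum(), (w * M).sum()
    m_hat = T / S                                 # kernel-weighted average
    dw = (w[:, None] * d) / h**2                  # dw_i/dx = w_i (X_i - x)/h^2
    grad = (dw.T @ M) / S - T * dw.sum(axis=0) / S**2   # quotient rule
    return m_hat, grad

# Toy 1D misfit surface M(x) = x^2 sampled on a grid (illustrative only)
X = np.linspace(-2.0, 2.0, 41).reshape(-1, 1)
M = X[:, 0] ** 2
m_hat, grad = grnn_and_grad(np.array([1.0]), X, M, h=0.3)
```

Note that the gradient comes from differentiating the smooth proxy, which is why its accuracy depends strongly on the chosen kernel width.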
Field Application: The Teal South Reservoir
Teal South is a reservoir located in the Gulf of Mexico, approximately 144 km southwest of Morgan City, Louisiana, as shown in
Fig. 1. The 4,500-ft-deep sand, shown in Fig. 2, is bounded on three
sides by faults and closed by dip to the north. Fluids are
produced from a single horizontal well through solution-gas drive,
aquifer drive, and compaction drive. Production started in late
1996, and data are available in the form of monthly oil, water, and
gas measurements, as well as two pressure measurements of 3,096
psi initially and 2,485 psi after 570 days of production (Christie
et al. 2002).
TABLE 1: PARAMETERIZATION FOR THE TEAL SOUTH MODEL

Parameter                          Units         Prior range
Horizontal permeability (kh1–kh5)  md            10^{1...3}
kv/kh                              –             10^{−2...−1}
Aquifer strength                   million STB   10^{7...9}
Rock compressibility               psi^{−1}      10^{−4.096...−3.699}
Fig. 3: Production history (oil rate and water rate, STB/D, vs. time in days) for the Teal South reservoir.
Fig. 4: Two different 2D projections of the initial population of 30 randomly generated models in 3D parameter space (scaled P1, scaled P2, and scaled aquifer strength).
Fig. 5: Comparison of the best history matches (a, field oil production rate vs. time for NA1 and PSO1) and the corresponding permeability estimates (b, kh1–kh5) obtained from NA and PSO.
Fig. 6: Two alternative sets of parameter values (kh1–kh5) providing almost equivalent match qualities to the maximum-likelihood model: (a) NA2 vs. PSO2; (b) NA3 vs. PSO3.
sampling evolves. Note that the white (light) points have misfit 10 or
above and include many models that do not match at all well.
The parameter values for the good history-matched models can
be seen by looking at the range of the black (dark) points. Both
algorithms appear to concentrate sampling for the permeability
Fig. 7: Evolution of the mean generational minimum misfit for NA and PSO.
Fig. 10: Sampling history of HMC for each of the eight unknown parameters (P1–P5, log(kv/kh), rock compressibility, and aquifer strength) vs. misfit-evaluation number.
[Forecast oil-rate plots showing the P90/P50/P10 envelopes, the truth case, and the end-of-history marker for each algorithm.]

Fig. 14: Cumulative distributions from NA, PSO, and HMC at (a) and after (b) the end of history matching.
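The cumulative distributions and P10/P50/P90 values compared here amount to weighted percentiles over the model ensemble. A sketch, with hypothetical forecasts and hypothetical posterior weights (e.g., of the kind an NAB-style appraisal would produce):

```python
import numpy as np

def weighted_percentiles(values, weights, probs):
    """Percentiles of a discrete weighted ensemble: sort the values,
    accumulate normalized weights into a CDF, and read off each level."""
    order = np.argsort(values)
    v = np.asarray(values, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cdf = np.cumsum(w) / w.sum()
    return tuple(v[np.searchsorted(cdf, p)] for p in probs)

# Hypothetical forecast cumulative oil (MSTB) from five models,
# with hypothetical posterior weights
forecasts = [1100.0, 1150.0, 1200.0, 1300.0, 1450.0]
weights = [0.05, 0.20, 0.40, 0.20, 0.15]
p10, p50, p90 = weighted_percentiles(forecasts, weights, [0.10, 0.50, 0.90])
```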
References
Bishop, C.M. 2006. Pattern Recognition and Machine Learning. New York: Springer.
Bissel, R., Sharma, Y., and Killough, J.E. 1994. History Matching Using the Method of Gradients: Two Case Studies. Paper SPE 28590 presented at the SPE Annual Technical Conference and Exhibition, New Orleans, 25–28 September. doi: 10.2118/28590-MS.
Christie, M.A., Glimm, J., Grove, J.W., Higdon, D.M., Sharp, D.H., and Wood-Schultz, M.M. 2005. Error Analysis and Simulations of Complex Phenomena. Los Alamos Science 29: 6–25.
Christie, M.A., MacBeth, C., and Subbey, S. 2002. Multiple history-matched models for Teal South. The Leading Edge 21 (3): 286–289. doi: 10.1190/1.1463779.
Duane, S., Kennedy, A., Pendleton, B., and Roweth, D. 1987. Hybrid Monte Carlo. Physics Letters B 195 (2): 216–222. doi: 10.1016/0370-2693(87)91197-X.
Eberhart, R. and Shi, Y. 2007. Computational Intelligence: Concepts to Implementations. Burlington, Massachusetts: Morgan Kaufmann Publishers (Elsevier).
Eberhart, R.C. and Shi, Y. 2001. Particle swarm optimization: developments, applications and resources. In Proceedings of IEEE Congress on Evolutionary Computation 2001, Vol. 1, 81–86. Piscataway, New Jersey: IEEE Service Center.
ECLIPSE SimOpt User Guide 2005A. 2005. Houston: Schlumberger/GeoQuest.
Erbas, D. 2006. Sampling strategies for uncertainty quantification in oil recovery prediction. PhD thesis, Heriot-Watt University, Edinburgh, UK.
Erbas, D. and Christie, M.A. 2007. Effect of Sampling Strategies on Prediction Uncertainty Estimation. Paper SPE 106229 presented at the SPE Reservoir Simulation Symposium, Houston, 26–28 February. doi: 10.2118/106229-MS.
Evensen, G. 2007. Data Assimilation: The Ensemble Kalman Filter. Berlin: Springer-Verlag.
Jaynes, E.T. 2003. Probability Theory: The Logic of Science, ed. G.L. Bretthorst. Cambridge, UK: Cambridge University Press.
Kanevski, M. and Maignan, M. 2004. Analysis and Modelling of Spatial Environmental Data. Lausanne, Switzerland: EPFL Press.
Kathrada, M. 2009. Uncertainty evaluation of reservoir simulation models using particle swarms and hierarchical clustering. PhD thesis, Heriot-Watt University, Edinburgh, UK.
Kennedy, J. and Eberhart, R. 1995. Particle swarm optimisation. In Proceedings of the IEEE International Conference on Neural Networks, Vol. 4, 1942–1948. Piscataway, New Jersey: IEEE Service Center.
Lepine, O.J., Bissell, R.C., Aanonsen, S.I., Pallister, I.C., and Barker, J.W. 1999. Uncertainty Analysis in Predictive Reservoir Simulation Using Gradient Information. SPE J. 4 (3): 251–259. SPE-57594-PA. doi: 10.2118/57594-PA.
Liu, N. and Oliver, D.S. 2005. Critical Evaluation of the Ensemble Kalman Filter on History Matching of Geologic Facies. SPE Res Eval & Eng 8 (6): 470–477. SPE-92867-PA. doi: 10.2118/92867-PA.
Metropolis, N., Rosenbluth, A.W., Rosenbluth, M.N., Teller, A.H., and Teller, E. 1953. Equation of State Calculations by Fast Computing Machines. J. Chem. Phys. 21 (6): 1087–1092. doi: 10.1063/1.1699114.
Nadaraya, E.A. 1964. On Estimating Regression. Theory Probab. Appl. 9 (1): 141–142. doi: 10.1137/1109020.
Oliver, D.S., Reynolds, A.C., and Liu, N. 2008. Inverse Theory for Petroleum Reservoir Characterization and History Matching. Cambridge, UK: Cambridge University Press.
Romero, C.E., Carter, J.N., Gringarten, A.C., and Zimmerman, R.W. 2000. A Modified Genetic Algorithm for Reservoir Characterisation. Paper SPE 64765 presented at the International Oil and Gas Conference and Exhibition in China, Beijing, 7–10 November. doi: 10.2118/64765-MS.
Rotondi, M., Nicotra, G., Godi, A., Contento, F.M., Blunt, M.J., and Christie, M.A. 2006. Hydrocarbon Production Forecast and Uncertainty Quantification: A Field Application. Paper SPE 102135 presented at the SPE Annual Technical Conference and Exhibition, San Antonio, Texas, USA, 24–27 September. doi: 10.2118/102135-MS.
Sambridge, M.S. 1999a. Geophysical inversion with a neighbourhood algorithm–I. Searching a parameter space. Geophysical Journal International 138 (2): 479–494. doi: 10.1046/j.1365-246X.1999.00876.x.
Sambridge, M.S. 1999b. Geophysical inversion with a neighbourhood algorithm–II. Appraising the ensemble. Geophysical Journal International 138 (3): 727–746. doi: 10.1046/j.1365-246x.1999.00900.x.
Specht, D. 1991. A general regression neural network. IEEE Transactions on Neural Networks 2 (6): 568–576. doi: 10.1109/72.97934.
Subbey, S., Christie, M.A., and Sambridge, M. 2003. A Strategy for Rapid Quantification of Uncertainty in Reservoir Performance Prediction. Paper SPE 79678 presented at the SPE Reservoir Simulation Symposium, Houston, 3–5 February. doi: 10.2118/79678-MS.
Subbey, S., Christie, M.A., and Sambridge, M. 2004. Prediction under uncertainty in reservoir modeling. J. Pet. Sci. Eng. 44 (1–2): 143–153. doi: 10.1016/j.petrol.2004.02.011.
Tavassoli, Z., Carter, J.N., and King, R.P. 2004. Errors in History Matching. SPE J. 9 (3): 352–361. SPE-86883-PA. doi: 10.2118/86883-PA.
Watson, G.S. 1964. Smooth Regression Analysis. Sankhya: The Indian Journal of Statistics 26 (Series A, Pt. 4): 359–372.
Yu, T., Wilkinson, D., and Castellini, A. 2008. Constructing Reservoir Flow Simulator Proxies Using Genetic Programming for History Matching and Production Forecast Uncertainty Analysis. Journal of Artificial Evolution and Applications 2008: 1–13. Article ID 263108. doi: 10.1155/2008/263108.
Linah Mohamed is a PhD student in petroleum engineering at
Heriot-Watt University, UK. She holds an MS degree in computer
science from the University of Khartoum, Sudan. Mohamed's
principal research interests are history matching and uncertainty quantification using stochastic sampling techniques
based on Hamiltonian dynamics, swarm intelligence concepts,
and machine learning. Mike Christie is professor of reservoir engineering at Heriot-Watt University in Edinburgh, where he runs a
research group on uncertainty quantification in reservoir simulation. Before joining Heriot-Watt, he spent 18 years working for BP,
holding positions in research and technology development in
the UK and USA. He holds a bachelor's degree in mathematics
and a PhD degree in plasma physics, both from the University
of London. Christie has been an SPE Distinguished Lecturer and
was awarded the SPE Ferguson medal in 1990. Vasily Demyanov
carries out research in spatial statistics and machine learning applications to model uncertainty of subsurface reservoir predictions. He holds a PhD in physics and mathematics
from the Russian Academy of Sciences and has worked with
a wide range of spatial statistics and artificial neural networks
applications in environmental and geosciences fields. He has
published several papers in leading geosciences, environmental, statistics, and computer science journals, including an
award-winning paper in Pedometrics. Demyanov is a coauthor
of the chapters on geostatistics and artificial neural networks
in Advanced Mapping of Environmental Data: Geostatistics,
Machine Learning and Bayesian Maximum Entropy. Advanced
kernel learning methods, stochastic optimisation algorithms, and
Bayesian statistics are among his current research interests.