
Society of Petroleum Engineers

SPE 26106

Complexities of Using Neural Network in Well Test Analysis of Faulted Reservoirs
I.A. Juniardi and Iraj Ershaghi, U. of Southern California
SPE Members

Copyright 1993, Society of Petroleum Engineers, Inc.

This paper was prepared for presentation at the Western Regional Meeting held in Anchorage, Alaska, U.S.A., 26-28 May 1993.

This paper was selected for presentation by an SPE Program Committee following review of information contained in an abstract submitted by the author(s). Contents of the paper, as presented, have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author(s). The material, as presented, does not necessarily reflect any position of the Society of Petroleum Engineers, its officers, or members. Papers presented at SPE meetings are subject to publication review by Editorial Committees of the Society of Petroleum Engineers. Permission to copy is restricted to an abstract of not more than 300 words. Illustrations may not be copied. The abstract should contain conspicuous acknowledgment of where and by whom the paper is presented. Write Librarian, SPE, P.O. Box 833836, Richardson, TX 75083-3836, U.S.A. Telex, 163245 SPEUT.

Abstract

Neural network models bring a significant dimension to the capability of computer-aided approaches for well test analysis. This study shows, however, that when dealing with complex reservoirs, these models need augmentation from an expert system containing two types of information: independent field data and tables containing the frequency of occurrence of non-related models. A Monte Carlo simulation method is proposed to provide the augmented statistical information required for the expert system.

Introduction

Industry-wide application of pressure transient test data has been limited in terms of number, data quality, and integrity of interpretation. Considerable advances in analytical and numerical modeling of flow problems in reservoirs have allowed analysts to study the response of an idealized reservoir geometry or flow regime under conditions of specified wellbore flow. Because of the many possibilities that may exist in real situations, continued efforts are underway to formulate and to obtain the pressure response in reservoirs of increasing complexity.

The use of computer-aided interpretation techniques has enhanced the capability of the well-test analyst in terms of accuracy and speed in processing and analyzing well-test data. Many possible analytical reservoir models can be investigated to find the best interpretation model. Once the proper reservoir model has been determined, various non-linear parameter estimation methods can be used to obtain a quantitative description of reservoir properties.

However, as documented by Ershaghi and Woodbury1, the problem of non-uniqueness always creates confusion about model selection. In particular, the same type of response may exist among the performances of completely different systems. The correct estimation of the reservoir parameters, consequently, depends on the integrity of the selected well test model. Thus, the selection of the reservoir interpretation model and parameter estimation are the two major issues in well-test analysis.

Ramey2 and Horne3 recently reviewed various diagnostic methods commonly used in the model selection process. In traditional analysis, this task was carried out by presenting the observed pressure response in log-log or semilog plots, i.e., the Horner plot. As indicated by Horne3, this approach has limitations with increasing complexity of the models and number of controlling parameters. Application of pressure derivative curves in well-test analysis, proposed by Bourdet et al.4, enhances the capability of the analyst to recognize the reservoir model. Ershaghi et al.5 indicated the importance of derivative plots as ISDP (Industry Standard Diagnostic Plots) for pattern recognition. The pressure derivative method requires high-precision data to filter noise associated with the tool or the differentiation process. Application of expert systems in well-test analysis has been investigated by Allain and Horne6 and Al-Kaabi et al.7 In this approach, imitating the reasoning process performed by a human expert, symbolic representation is used to diagnose the image of the reservoir pressure response.

Segmentation and a knowledge-based system are the main parts of the diagnosis process used to identify the well-test interpretation model. Al-Kaabi and Lee11 studied the application of back-propagation neural networks to identify the well-test interpretation model. The main advantages of this approach are the elimination of the rule-based system and its tolerance to noise.

Once the proper model is selected, estimation of reservoir parameters, such as permeability, skin, etc., is obtained by matching the model response to the observed response. In traditional approaches, the Horner plot and appropriate type curves are commonly used in the matching process. Rosa and Horne used non-linear regression in Laplace space to match the field data and provide confidence intervals for the results. As reviewed by Horne3, the advantages of using non-linear regression are the ability to handle variable rates and the ability to match complex models with more unknown reservoir parameters. The objective of this method is to minimize the sum of squares of the differences between the measured data and the calculated model response by adjusting the unknown reservoir parameters. Newton's method, optimal control theory, and direct search methods are some of the approaches used to achieve this goal.
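As a minimal illustration of this least-squares matching step, the sketch below fits a placeholder pressure model to noisy synthetic data with SciPy. The function model_pwd and its parameters are hypothetical stand-ins for illustration only, not any of the analytical models discussed in this paper.

# Minimal sketch of non-linear least-squares matching of a well test model
# to measured data. `model_pwd` is a hypothetical placeholder; in practice it
# would be the (numerically inverted) Laplace-space solution of the selected
# interpretation model.
import numpy as np
from scipy.optimize import least_squares

def model_pwd(t, a, skin, c):
    """Hypothetical wellbore pressure-drop model (illustrative only)."""
    return a * np.log(t) + skin + c / t

def residuals(params, t, p_obs):
    # Differences between measured data and the calculated model response.
    return p_obs - model_pwd(t, *params)

t = np.logspace(0, 4, 60)                                   # elapsed time
p_obs = model_pwd(t, 2.0, 5.0, 10.0) + np.random.normal(0.0, 0.05, t.size)

fit = least_squares(residuals, x0=[1.0, 0.0, 1.0], args=(t, p_obs))
print("estimated parameters:", fit.x)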
The purpose of this paper is to focus on the model selection process and to extend previous studies. Specifically, the strengths and weaknesses of neural network models in well-test interpretation are examined, and a new approach is proposed to safeguard against the selection of inappropriate models.

Neural Networks

In recent years, the application of neural networks to pattern recognition problems in petroleum engineering has been addressed by several investigators. Baldwin and Otte8 studied the application of neural networks to well log interpretation. Other studies include Arehart's9 analysis of drill bits and the beam pumping application proposed by Rogers et al.10 For well test analysis, Al-Kaabi et al.11 indicated that, through the back-propagation method, a neural network can help in the identification of well test models.

Formalized first by Werbos and later by Parker, as discussed by Rumelhart and McClelland12 and Freeman and Skapura13, the back-propagation neural network has primary applications in problems requiring recognition of complex patterns, fault tolerance or coping with noise, and nontrivial mapping functions.

In this study, a back-propagation neural network model was developed to handle the complex pattern recognition problem associated with well test responses in faulted reservoirs.

The process consists of training the computer with characteristic patterns of known well test models and generating the appropriate weight factors for perfect recognition of field data. For well test studies, the ISDP used was the derivative plot, normalized and digitized to a set of (x, y) pairs. The training model consists of three layers of nodes, as shown in Figure 1.

The i-level nodes receive the input signal, and the k-level nodes generate the activation numbers. The two levels are connected via an intermediate level designated here as the j level. The training consists of an iterative process in which optimal weight factors among the three levels of nodes are generated such that the activation numbers correspond to the appropriate models.

As an example, consider Tables 1 and 2, where the objective is to train the computer to recognize any of the ten indicated patterns. A portion of the weight factors generated is shown in Figure 2 and Table 3. If a pattern similar to model 3 is tested for recognition, the program will generate activation numbers similar to column 3, where the 3rd entry is the most significant number.

Before the training process, the dimensionless derivative type curves are normalized on the x-axis and y-axis, respectively:

0.1 ≤ (tD)N ≤ 0.9
0.1 ≤ (p'D tD)N ≤ 0.9

The entire pattern is then represented as a model with 30 pairs of x and y. The normalization process is shown in Figure 3.
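A minimal sketch of this normalization and digitization step is given below. The use of base-10 logarithms of the type-curve coordinates and linear interpolation onto 30 evenly spaced abscissas is an assumption about the implementation, not a detail stated in the paper.

# Sketch: map a dimensionless derivative type curve onto 30 (x, y) pairs with
# both axes rescaled into [0.1, 0.9], as required for the network input.
import numpy as np

def normalize_pattern(t_d, dpd, n_points=30, lo=0.1, hi=0.9):
    x, y = np.log10(t_d), np.log10(dpd)          # log-log type-curve coordinates
    x_n = lo + (hi - lo) * (x - x.min()) / (x.max() - x.min())
    y_n = lo + (hi - lo) * (y - y.min()) / (y.max() - y.min())
    x_out = np.linspace(lo, hi, n_points)        # 30 evenly spaced abscissas
    y_out = np.interp(x_out, x_n, y_n)           # resample the curve
    return np.column_stack([x_out, y_out])       # shape (30, 2)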
The iterative training process distributes the error using the back-propagation algorithm. A sigmoid function, described below, is used for the calculation in a single node:

Output_j = 1 / (1 + exp(-sum_j))

where:

Output_j = the output of a node in the j level
sum_j = the summation of the incoming signals to node j from all nodes in the i level

A similar process is used for the k nodes receiving signals from the j nodes.

The output of the sigmoid function approaches 1 for a very large positive argument and 0 for a very large negative argument. These values represent the activation level of the node based on the input received.

In our studies we experimented with 130 and 200 nodes at the j level. The number of k-level nodes corresponds to the number of models under study.
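The forward pass of this three-layer network can be sketched as follows. The layer sizes (60 inputs from 30 (x, y) pairs, 130 hidden j nodes, 10 output k nodes) follow the description above; the absence of bias terms and the random initial weights are assumptions made for this illustration.

# Sketch of the i -> j -> k forward pass with sigmoid activations.
import numpy as np

def sigmoid(x):
    # Approaches 1 for large positive arguments and 0 for large negative ones.
    return 1.0 / (1.0 + np.exp(-x))

def forward(pattern, w_ij, w_jk):
    i_out = pattern.ravel()            # 30 (x, y) pairs -> 60 input signals
    j_out = sigmoid(w_ij @ i_out)      # hidden (j-level) activations
    k_out = sigmoid(w_jk @ j_out)      # one activation number per model
    return k_out

rng = np.random.default_rng(0)
w_ij = rng.normal(scale=0.1, size=(130, 60))     # untrained, illustrative weights
w_jk = rng.normal(scale=0.1, size=(10, 130))
activation = forward(rng.random((30, 2)), w_ij, w_jk)
print("most significant node:", int(np.argmax(activation)) + 1)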

To examine the application of neural networks as a pattern recognition tool in well test analysis, we focused on faulted reservoirs. These reservoirs, as indicated by Ershaghi et al.14, exhibit some of the most complicated and confusing shapes on ISDP. Figure 4 shows the various fault configurations studied for both homogeneous and dual porosity reservoirs. Type curves generated for 10 models covering 20 patterns within a limited range of reservoir and wellbore parameters are shown in Figures 5a-5j. Table 4 shows the equations representing the 10 selected models. In spite of significant differences among the equations, similar patterns are obtained and can be attributed to more than one well test model.

Experimentation

The neural network model was applied to the 20 patterns described above. Weight factors were generated for various numbers of j nodes. In all experiments a total error of 0.01 was imposed as the tolerance level. Figures 6-9 show synthetic patterns with known models that were tested against the network model. The activation codes obtained, shown in Table 5, indicate perfect recognition for Figures 6-7 (Cases 1 and 2), multiple answers for Figures 8-9, and a wrong answer for Figure 9.

Because of the limited number of training patterns, if the range of parameters used for the generation of test cases was outside the range corresponding to the input models, at times we faced non-unique solutions. For example, an input pattern similar to Figure 10 was recognized by the network as either model 3 or model 8. This obviously casts uncertainty on the ability of the network to perfectly identify the correct model.

Statement of the Problem

From the above experiments, it is clear that a reliable neural network for well test applications must include considerably more patterns in its training set. The patterns must include all possible combinations of reservoir parameters. As such, the required number of patterns will make the training difficult if not impossible.

Proposed Solution

To alleviate the requirement of generating a wide range of type curves, a Monte Carlo simulation approach was employed to test the recognition capability of the network. With this random parameter selection approach, a significant number of patterns for a particular well test model can be generated within expected minimum and maximum ranges of the various parameters. Consider well test model 7. Figures 11a-11b show eight typical normalized type curves generated for the model using different input parameters. Obviously, the network model will have a different, and at times wrong, interpretation for these cases.

Suppose that the response of the same model is now simulated via a Monte Carlo approach, with parameters selected within the ranges shown in Table 6. In one experiment with 50 iterations, we generated 50 patterns that were recognized by the network according to the frequency plot shown in Figure 12. Note that model 7 is the pick in only 28% of the cases. Other potential models, such as models 2 and 3, are selected with high frequencies. This is because of unusual shapes caused by certain combinations of parameters. This type of frequency table must become part of the data base assisting the neural network.
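A minimal sketch of such a Monte Carlo experiment is given below. The parameter names and ranges are placeholders standing in for the Table 6 values, generate_pattern and classify stand for the type-curve generator and the trained network, and uniform sampling is an assumption of this illustration.

# Sketch: draw random parameter sets for a given well test model, classify the
# resulting normalized type curves, and tabulate how often each model is picked.
import numpy as np
from collections import Counter

PARAM_RANGES = {            # placeholder ranges standing in for Table 6
    "CD": (1.0, 1.0e4),
    "S": (0.0, 10.0),
    "dD": (50.0, 5000.0),
    "omega": (0.01, 0.5),
    "lam": (1.0e-9, 1.0e-5),
}

def monte_carlo_frequencies(n_iter, generate_pattern, classify, seed=None):
    rng = np.random.default_rng(seed)
    picks = Counter()
    for _ in range(n_iter):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}
        pattern = generate_pattern(params)      # normalized 30-point curve
        picks[classify(pattern)] += 1           # model index chosen by the network
    return {model: count / n_iter for model, count in picks.items()}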
From the above studies we have observed that neural network models need augmentation from an expert system. Such an expert system is required to assist in the selection of the correct model in two different ways. First, frequency data similar to those determined above can provide a data base of potential models that can fit a particular shape. For example, for the base patterns studied here, if a new pattern is recognized by the neural network with an activation number similar to model 2, the expert system must have adequate statistical data to also identify model 7 as a potential answer because of its high frequency of occurrence in the model 2 category. Experiments such as the one proposed above can be repeated for other models over an even wider range, and the information for constructing the frequency data base can be generated for the expert system. Second, information in the expert system can include checks and balances against other field data, i.e., independent evidence of fracturing, mud losses, etc.

Thus, the neural network approach for pattern recognition has a limited degree of reliability unless either the system is trained on an almost infinite number of patterns or the system is augmented with a database that includes the statistical frequencies of non-related well test models resembling a given pattern, together with other field-related information.
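The frequency-table lookup could take a form such as the sketch below. The table entries are illustrative only (the 0.28 echoes the model 7 experiment above; the remaining values are invented for the example), and the 0.15 threshold is an arbitrary choice.

# Sketch: augment the network's pick with alternative candidates whose Monte
# Carlo frequency of producing that pick exceeds a threshold.
FREQUENCY_TABLE = {
    # true model -> observed distribution of network picks (illustrative values)
    7: {7: 0.28, 2: 0.30, 3: 0.22, 8: 0.10},
}

def candidate_models(network_pick, freq_table, threshold=0.15):
    candidates = {network_pick}
    for true_model, picks in freq_table.items():
        if picks.get(network_pick, 0.0) >= threshold:
            candidates.add(true_model)          # this model often looks like the pick
    return sorted(candidates)

print(candidate_models(2, FREQUENCY_TABLE))     # [2, 7]: model 7 flagged as well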
Conclusions

The complexities of the neural network pattern recognition approach for well test models have been pointed out through numerous case studies. A hybrid method is proposed in which a significant number of patterns is generated by a Monte Carlo simulation approach and tested against a network model trained on the essential well test patterns. Statistical data generated from such comparisons can, in the form of an expert system, augment the neural network's ability in the model selection process.

Nomenclature
CD = wellbore storage, dimensionless
d = distance to fault, ft
dD = distance to fault, dimensionless, d/rw
f(s) = defined by the equation of Model 7
K0(x) = modified Bessel function, 2nd kind, zero order
K1(x) = modified Bessel function, 2nd kind, first order
pwD = wellbore pressure drop, dimensionless
r = radial distance, ft
rD = radial distance, dimensionless, r/rw
rw = wellbore radius, ft
s = Laplace parameter
S = dimensionless skin factor
tD = time, dimensionless
λ = matrix-fracture interporosity parameter, dimensionless
ω = oil zone fracture storativity

Subscripts
D = dimensionless
f = fracture
w = wellbore

Acknowledgements

This research was supported by the Gas Research Institute. We acknowledge the cooperation of Mr. Yusuf Shikari during the course of this study and thank GRI for permission to publish the paper. We also acknowledge Pertamina for supporting Mr. Juniardi's scholarship.

References

1. Ershaghi, I. and Woodbury, J.J.: "Examples of Pitfalls in Well Test Analysis," J. Pet. Tech. (February 1985) 335-341.
2. Ramey, H.J. Jr.: "Advances in Practical Well Test Analysis," J. Pet. Tech. (June 1992) 650-659.
3. Horne, R.N.: "Advances in Computer-Aided Well Test Interpretation," paper SPE 24731 presented at the 1992 Annual Meeting, Washington, DC, October 4-7.
4. Bourdet, D., Ayoub, J.A., and Pirard, Y.M.: "Use of the Pressure Derivative in Well Test Interpretation," SPE Formation Evaluation (June 1989) 293-302.
5. Ershaghi, I., Chang, J., and Khachatoorian, R.A.: "The Importance of Pattern Recognition for the Analysis of ISDP for Gas Storage Reservoirs," paper presented at the 1992 International Gas Research Conference, Orlando, November 16-19.
6. Allain, O.F. and Horne, R.N.: "Use of Artificial Intelligence in Well Test Analysis," J. Pet. Tech. (March 1990) 342-349.
7. Al-Kaabi, A.U., McVay, D.A., and Lee, W.J.: "Using an Expert System to Identify a Well Test Interpretation Model," J. Pet. Tech. (May 1990).
8. Baldwin, J.L. and Otte, D.N.: "Computer Emulation of Human Mental Process: Application of Neural Network Simulators to Problems in Well Log Interpretation," paper SPE 19619 presented at the 1989 Annual Conference, San Antonio, October 8-11.
9. Arehart, R.A.: "Drill Bit Diagnosis Using Neural Networks," paper SPE 19558 presented at the 1989 Annual Meeting, San Antonio, October 8-11.
10. Rogers, J.D., Guffey, C.G., and Oldham, W.J.B.: "Artificial Neural Networks for Identification of Beam Pump Dynamometer Load Cards," paper SPE 20651 presented at the 1990 Annual Conference, New Orleans, September 23-26.
11. Al-Kaabi, A.U. and Lee, W.J.: "Using Artificial Neural Networks to Identify the Well Test Interpretation Model," paper SPE 20332 presented at the 1990 Petroleum Computer Conference, Denver, June 25-28.
12. Rumelhart, D.E. and McClelland, J.L.: "Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1," The MIT Press, Cambridge, 1986, 318-362.
13. Freeman, J.A. and Skapura, D.M.: "Neural Networks: Algorithms, Applications, and Programming Techniques," Addison-Wesley Publishing Co. Inc., October 1991.
14. Ershaghi, I., Khachatoorian, R.A., and Shikari, Y.: "Complexities in the Analysis of Pressure Transient Response in Faulted Naturally Fractured Reservoirs," paper SPE 23424 presented at the 1991 Eastern Regional Meeting, Lexington, October 22-25.

Table 1 Training models (Model 1: Short Term Test; Model 2: Infinite Acting; Model 3: Infinite Acting + Angular Fault; ...)

Fig. 1 Input of a derivative plot into the neural network (after Al-Kaabi11)

Table 2 Target activation numbers for the ten training models (nodes 1-10); each node carries a value of 0.1 except the node corresponding to its own model

Table 3 Input-to-hidden (i-j) and hidden-to-output (j-k) weight factors

Fig. 2 Set of weights obtained after training the neural network
Fig. 3 Normalization of a derivative plot (log-log type curve mapped onto axes scaled from 0.1 to 0.9)

Fig. 4 Fault configurations in the reservoirs: linear, perpendicular, angular, and parallel (well and fault positions shown)
Figs. 5a-5h Patterns of the models used for training the network: Model II: Homogeneous Infinite Acting; Model III: Homogeneous + Linear Fault; Model IV: Homogeneous + Perpendicular Fault; Model V: Homogeneous + Angular Fault; Model VI: Homogeneous + Parallel Fault; Model VII: NFR + Linear Fault; Model VIII: NFR + Perpendicular Fault (curves shown for close and far faults)

( continued )
Figs. 5i-5j Model IX: NFR + Angular Fault; Model X: NFR + Parallel Fault

Fig. 5a-5j Patterns of the models used for training the network
Table 5 Activation numbers obtained for the synthetic test cases (all other node activations negligible):
Case I: NFR + Angular Fault (Model 9), recognized as Model 9
Case II: NFR + Linear Fault (Model 7), recognized as Model 7
Case III: NFR + Perpendicular Fault (Model 8), recognized as Model 7 / Model 8
Case IV: NFR + Perpendicular Fault (Model 8), recognized as Model 7
Fig. 6 Pattern of synthetic data case 1 tested against the neural network model
Fig. 7 Pattern of synthetic data case 2 tested against the neural network model
Fig. 8 Pattern of synthetic data case 3 tested against the neural network model
Fig. 9 Pattern of synthetic data case 4 tested against the neural network model
Table 4 Equations representing the 10 selected models

Model 1: Short Term Test
\bar{p}_{wD} = \frac{s\bar{p}_D + S}{s[1 + sC_D(s\bar{p}_D + S)]}, \qquad \bar{p}_D = \frac{K_0(r_D\sqrt{s})}{s\sqrt{s}\,K_1(\sqrt{s})}

Model 2: Homogeneous Reservoir, Infinite Acting. The same equation as Model 1.

Model 3: Homogeneous Reservoir + Linear Fault
\bar{p}_{wD} = \frac{s\bar{p}_D + S}{s[1 + sC_D(s\bar{p}_D + S)]}, \qquad \bar{p}_D = \frac{K_0(r_D\sqrt{s}) + K_0(d_D\sqrt{s})}{s\sqrt{s}\,K_1(\sqrt{s})}

Model 4: Homogeneous Reservoir + Perpendicular Fault
\bar{p}_{wD} = \frac{K_0(r_D\sqrt{s}) + S\sqrt{s}\,K_1(\sqrt{s}) + A}{s\{\sqrt{s}\,K_1(\sqrt{s}) + C_D s[K_0(r_D\sqrt{s}) + S\sqrt{s}\,K_1(\sqrt{s}) + A]\}}, \qquad A = \sum_{i=1}^{3} K_0(d_{Di}\sqrt{s}) \quad \text{(3 images)}

Model 5: Homogeneous Reservoir + Angular Fault. The same equation as Model 4, but with 7 images instead of 3.

Model 6: Homogeneous Reservoir + Parallel Fault. The same equation as Model 4, but with an infinite number of images instead of 3.

Model 7: Naturally Fractured Reservoir + Linear Fault. The same equation as Model 1, where
\bar{p}_D = \frac{K_0\big(r_D\sqrt{s f(s)}\big) + K_0\big(d_D\sqrt{s f(s)}\big)}{s\sqrt{s f(s)}\,K_1\big(\sqrt{s f(s)}\big)}, \qquad f(s) = \frac{\omega(1-\omega)s + \lambda}{(1-\omega)s + \lambda}

Model 8: Naturally Fractured Reservoir + Perpendicular Fault. The same equation as Model 4, but with \sqrt{s} replaced by \sqrt{s f(s)}.

Model 9: Naturally Fractured Reservoir + Angular Fault. The same equation as Model 4, but with \sqrt{s} replaced by \sqrt{s f(s)} and 7 images instead of 3.

Model 10: Naturally Fractured Reservoir + Parallel Fault. The same equation as Model 4, but with \sqrt{s} replaced by \sqrt{s f(s)} and an infinite number of images instead of 3.
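As an illustration of how these Laplace-space solutions can be evaluated, the sketch below computes the Model 3 wellbore response with SciPy's modified Bessel functions and a standard Stehfest numerical inversion. The parameter values and the choice of a 12-term Stehfest scheme are arbitrary examples, not values used in the paper.

# Sketch: evaluate the Model 3 (homogeneous reservoir + linear fault) solution
# of Table 4 in Laplace space and invert it numerically with the Stehfest method.
import numpy as np
from math import factorial, log
from scipy.special import k0, k1

def pwd_laplace_model3(s, rD=1.0, dD=200.0, S=0.0, CD=100.0):
    pD = (k0(rD * np.sqrt(s)) + k0(dD * np.sqrt(s))) / (s * np.sqrt(s) * k1(np.sqrt(s)))
    return (s * pD + S) / (s * (1.0 + s * CD * (s * pD + S)))

def stehfest_weights(n=12):
    half = n // 2
    v = np.zeros(n)
    for i in range(1, n + 1):
        total = 0.0
        for k in range((i + 1) // 2, min(i, half) + 1):
            total += (k ** half * factorial(2 * k)
                      / (factorial(half - k) * factorial(k) * factorial(k - 1)
                         * factorial(i - k) * factorial(2 * k - i)))
        v[i - 1] = (-1) ** (half + i) * total
    return v

def stehfest_invert(f_bar, t, n=12):
    v = stehfest_weights(n)
    a = log(2.0) / t
    return a * sum(v[i] * f_bar((i + 1) * a) for i in range(n))

tD = np.logspace(3, 8, 30)                                  # dimensionless time
pwD = np.array([stehfest_invert(pwd_laplace_model3, t) for t in tD])
print(pwD[:3])

The inverted response could then be differentiated and normalized with the routine sketched earlier to produce an ISDP pattern for training or testing the network.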
