

Modelling and Simulation of Temperature Variations of


Bearings in a Hydropower Generation Unit

A dissertation submitted to the
Department of Energy Technology, Royal Institute of Technology,
Sweden for the partial fulfilment of the requirement for the
Degree of Master of Science in Engineering

By

CGS Gunasekara

















Department of Energy Technology
Royal Institute of Technology,
Stockholm, Sweden









Modelling and Simulation of Temperature Variations of Bearings in a Hydropower Generation Unit


by

CGS Gunasekara




Supervised by
Dr. Primal Fernando,
Dr. Joachim Claesson
















Declaration

The work submitted in this thesis is the result of my own investigation, except where otherwise
stated.

It has not already been accepted for any other degree and is also not being concurrently submitted
for any other degree.


CGS Gunasekara


Date


We/ I endorse declaration by the candidate.




Dr. Primal Fernando






Modelling and Simulation of Temperature Variations
of Bearings in a Hydropower Generation Unit

Abstract
Hydropower contributes around 20% of the world electricity supply and is considered the most important clean, emissions-free and economical renewable energy source. The total installed capacity of hydropower generation in the world is approximately 777 GW (2998 TWh/year). Furthermore, the estimated technically feasible hydropower potential in the world is 14000 TWh/year. Hydropower is the major renewable energy source in many countries and runs at a high plant factor. Bearing overheating is one of the major problems for the continuous operation of hydropower plants.
The objective of this work is to model and simulate the dynamic variation of the temperatures of the bearings (generator guide bearings, turbine guide bearing, thrust bearing) of a hydropower generating unit. The temperature of a bearing depends on multiple variables such as the ambient air temperature, cooling water temperature and cooling water flow rate, initial bearing temperatures, duration of operation and electrical load. The aim of this study is to minimize failures of hydropower plants due to bearing temperature variations and to improve the plant factor. The bearing heat exchange system of a hydropower plant is a multi-input (MI), multi-output (MO) system with complex nonlinear characteristics. The heat transfer pattern is complex in nature and involves a large number of variables. Therefore, it is difficult to use conventional modelling methods to model a system of this nature. The Neural Network (NN) method was therefore selected as the most suitable, since past input and output data are available and the input characteristics can be mapped in order to develop a model. In this report a neural network model of the hydropower plant is developed, using the Matlab neural network toolbox and Matlab as the implementation language.















Acknowledgments
Thanks are first due to my supervisors, Dr Primal Fernando and Dr Joachim Claesson, for their great insight, guidance and sense of humour. My sincere thanks go to the Post Graduate Office, Royal Institute of Technology, Stockholm, Sweden, for helping in various ways to clarify matters related to my academic work in time, with excellent cooperation and guidance. Next, I would like to thank the staff of the Post Graduate Section of ICBT, Sri Lanka, who facilitated my studies throughout the course.
Lastly, I thank the many friends and colleagues who have not been mentioned here personally for making this educational process a success. I could not have made it without your support.

























List of Abbreviations

a       Output
A       Surface area
dxw     Wall thickness of heat exchanger
h       Enthalpy
H       Normalized enthalpy
HE      Heat exchanger
K       Thermal conductivity
L       Load
LGB     Lower guide bearing
M       Number of trials
MIMO    Multiple-input, multiple-output
n       Output of neuron
N       Number of layers
NN      Neural network
Q       Heat
R       Number of input nodes
s       Entropy
S       Normalized entropy
T       Temperature
TGB     Turbine guide bearing
THB     Thrust bearing
U       Heat transfer coefficient
UGB     Upper guide bearing
w       Weights
x       Steam quality

Subscripts

Amb Ambient
Ca Circulating air
cw      Cooling water
cwin    Cooling water inlet
cwout   Cooling water outlet
dotcw   Cooling water flow rate
dotoil  Cooling oil flow rate
e Electrical
EL Electrical Load
LGB Lower guide bearing
LGBoin LGB oil in
LGBoout LGB oil out
m Mass
O Oil
TGB Turbine guide bearing
TGBm TGB metal
TGBoin TGB oil in
TGBoout TGB oil out
THB Thrust bearing
UGB Upper guide bearing
UGBm UGB metal
UGBo UGB oil
UGBoin UGB oil in
UGBoout UGB oil out



Greek symbols

α       Individual heat transfer coefficient
Δ       Weight adjusting scalar
η       Efficiency of heat exchanger





















Modelling and Simulation of Temperature Variations
of Bearings in a Hydropower Generation Unit
1 Introduction..............................................................................................................10
1.1 General overview....................................................................................................10
1.2 The hydropower generating unit...........................................................................11
1.3 Bearing arrangement of hydropower unit.............................................................12
1.4 Purpose and contribution of the thesis..................................................................12
1.5 Organization of dissertation...................................................................................13
2 Overview of the Modelling.......................................................................................14
2.1 The research problem.............................................................................................14
2.1.1 Aim and scope......................................................................................................14
2.1.2 The research question..........................................................................................14
2.2 Approach..................................................................................................................14
3 Neural Networks........................................................................................................24
3.1 Introduction.............................................................................................................24
3.2 Formal definition.....................................................................................................24
3.3 Biological Neuron....................................................................................................24
3.4 Mathematical Model of a neuron...........................................................................25
3.4.1 Neuron with multi-inputs....................................................................................27
3.4.2 Layer of Neurons..................................................................................................28
3.4.3 Multi-layer neurons..............................................................................................29
3.4.4 General Structure of NN.....................................................................................29
3.4.5 Training a neural network...................................................................................30
3.4.6 Training process...................................................................................................30
3.5 Demonstration of developing a NN by example..................................................31
3.5.1 Problem.................................................................................................................31
3.5.2 System as a NN model.........................................................................................32
3.5.3 Data used..............................................................................................................33
3.5.4 Training................................................................................................................33
3.5.5 Simulation.............................................................................................................36
3.5.6 Results...................................................................................................................36
4 Developing the model................................................................................................40
4.1 Selection of input variables.....................................................................................40
4.2 Selection of data.......................................................................................................41
4.3 Approach of developing a dynamic model............................................................41
4.3.1 Developing a static NN model............................................................................41
4.3.2 Training the network and training results..........................................................43
4.3.3 Static model simulation results............................................................................45
4.3.4 Developing the dynamic model...........................................................................45
5 Results.........................................................................................................................47
5.1 Static model simulation results...............................................................................47
5.1.1 Static model simulation results for bearing metal temperature.........................47
5.1.2 Correlation coefficient of the static simulation results.......................................48
5.1.3 Static model simulation results for bearing oil temperature..............................49
5.1.4 Correlation coefficients of simulation on bearing oil temperature....................50
5.1.5 Summary results of static model..........................................................................51
5.2 Dynamic simulation results.....................................................................................52
5.2.1 Dynamic simulation results for bearing metal temperature...............................52
5.2.2 Dynamic simulation results for bearing oil temperature....................................53
5.3 Dynamic simulation results for reduced flow rate.................................................53
5.3.1 Bearing metal temperature variation...................................................................53
5.3.2 Bearing oil temperature variation........................................................................54
6 Discussion...................................................................................................................56
7 Conclusions.................................................................................................................57
8 References...................................................................................................................58
Appendix A: NN initial weight and bias values (NN example)..................................59
Appendix B: Training record (NN example)................................................................62
Appendix C: Sample data used for training the model................................................76
Appendix D: Training Matlab script for model............................................................80
Appendix E: Initial values of trained model.................................................................84



1 Introduction
1.1 General overview
Hydro power contributes around 20% of the world electricity generation [1]. As a renewable energy source it has become a more important and economical resource compared with other renewable sources. Hydro power produces no direct waste and contributes little to CO2 and other greenhouse gases compared with fossil fuel plants. The global installed capacity of hydropower generation (electrical) is approximately 777 GW (2998 TWh/year) [1], which is around 88% of the renewable energy sources [2].
In Sri Lanka about 40% of electricity is generated by hydropower. At present, almost all of the hydro potential available in the country has been utilized for electricity generation, and the few remaining sites are under construction.
[Pie chart, total power generation (GWh): Private Power 37%, Thermal Complex 22%, Mahaweli Hydro Complex 17%, Laxapana Hydro Complex 15%, Other Hydro 8%, Hired Power 1%, Wind 0%]
Fig.1.1 Hydro electricity contribution in 2009
(Source: Ceylon Electricity Board, statistics 2009)

The electricity generation by different sources in the year 2009 is shown in Fig. 1.1. Electricity generated in the three major hydropower complexes in Sri Lanka (Mahaweli Hydro Complex, Laxapana Hydro Complex and the Other Hydro Complexes) [3] contributes 40% of the national energy supply, while the rest comes from thermal power, mainly diesel. Hence, obtaining the maximum possible share from hydropower would be a great saving to the national economy.
Around 95% of the existing hydro power plants in Sri Lanka have passed the 25-year limit of their life span. Sri Lanka is not in a position to replace old hydro power plants within a short period, and its energy production depends mainly on hydropower. An age analysis of the hydropower plants in Sri Lanka is shown in Table 1.





Table 1: Age analysis of hydropower stations in Sri Lanka
(Source: Ceylon Electricity Board, Generation data)
Name of the Station   Installed Capacity (MW)   Commissioned Year   Age (years)
Inginiyagala          11.25                     1950                65
Norton                50                        1950                65
Udawalawe             6                         1955                60
Old Laxapana          50                        1955                60
Polpitiya             75                        1960                50
Ukuwela               40                        1976                34
Bowatenna             40                        1981                29
New Laxapana          100                       1984                26
Canyon                60                        1984                26
Kotmale               201                       1985                25
Victoria              210                       1985                25
Samanalawela          120                       1985                25
Randenigala           122                       1986                24
Nilambe               3.2                       1988                22
Rantambe              50                        1990                20
Kukule                70                        2002                08

Therefore, it is essential to obtain the maximum capacity from the existing plants by minimizing the downtime through proper operations. In that context, predicting the availability of hydroelectric generating units for fault-free operation is one of the crucial factors.
Bearing oil temperature plays a vital role in the continuous operation of hydropower plants. Stable bearing temperatures in the turbine and generator are essential for their successful continuous operation. All hydraulic and lubricating fluids have operating temperature limits. A machine can lose its stability and experience conditional failures whenever the system's fluid temperature exceeds these limits. Increases in temperature in a machine may occur due to insufficient heat removal, higher ambient temperatures and long operation at higher mechanical loads. The power plant staff should closely monitor the bearing oil and metal temperatures in order to ensure safe operation [4]. Typical acceptable bearing temperatures of a vertical-shaft hydropower turbine are shown in Table 2.
Table 2: Bearing temperature limits (refer Fig. 1.3)
Bearing Type                    Metal alarm (°C)   Oil alarm (°C)
Upper Guide Bearing (UGB)       85                 70
Lower Guide Bearing (LGB)       85                 70
Thrust Bearing (THB)            85                 70
Turbine Guide Bearing (TGB)     75                 70

In this project, from the measured temperature variations of the bearings (generator upper guide bearing UGB, lower guide bearing LGB, turbine guide bearing TGB, thrust bearing THB), a model is created to predict the bearing temperatures at various operating conditions.
1.2 The hydropower generating unit
Hydro electricity is generated by converting the potential energy of water into kinetic energy in the turbines. A typical arrangement of a vertical-shaft turbine-generator unit is shown in Fig. 1.2 [5].


Fig.1.2 Overview of a hydropower generating unit
1.3 Bearing arrangement of Hydropower unit
A typical arrangement of the bearings in a vertical shaft generator-turbine unit of a hydropower plant
is shown in Fig.1.3.

Fig.1.3 Turbine-Generator bearing arrangement
1.4 Purpose and contribution of the thesis
The purpose of this thesis is to develop a model to predict the temperatures of the bearings for different operating conditions. The model is developed using previously measured temperature, load and cooling water flow data. To achieve this, the following principal steps are stated.
- Choose the inputs and outputs.
- Determine the appropriate method for this system considering the nature of the problem. A neural network model is suggested for this problem, as justified in the next section.
- Decompose the system into sub-models to identify the heat transfer characteristics of the system.
- In this work, the Matlab neural network toolbox and Matlab scripts are used.
1.5 Organization of dissertation
The rest of the chapters of this dissertation are organized as follows:
- Chapter 2 describes the overview of the modelling strategy and the approach to the modelling method, including the selection of the modelling method and the selection of input variables.
- Chapter 3 covers the application of neural networks and the theory behind them.
- Chapter 4 describes the approach to developing the model by considering the heat transfer pattern and the interactions of the system variables, and the implementation of the model.
- Chapter 5 presents the results obtained by simulating the model, compared with the past actual characteristics of the system.
- Chapters 6 and 7 discuss the performance of the model and conclude the work carried out in this study.
2 Overview of the Modelling
This section is devoted to describing the problem under investigation, its importance to the energy sector, the aims of the research, its scope and limitations, the formulation of the research problem and the approach.
2.1 The research problem
Monitoring the temperature of a bearing is an important task for ensuring the continuous running of hydropower generating units. Old hydropower plants frequently fail due to bearing temperature rise, or stop when they reach the recommended temperature levels. This may cause frequent power failures or damage to the turbine-generator system.
2.1.1 Aim and scope
The aim is to model and simulate the dynamic variation of the temperatures of the bearings (generator guide bearings, turbine guide bearing, thrust bearing) of an in-service hydropower unit.
2.1.2 The research question
One research question has been formulated to focus the work:
How should the multi-physical interactions in a hydropower bearing-heat exchanger system be modelled and simulated in order to predict the bearing temperature variation?
2.2 Approach

Fig. 2.0 Bearings-heat exchanger system (HE1: THB and UGB oil cooler, HE2: stator cooler, HE3, HE4: LGB and TGB oil coolers)

A simplified diagram that illustrates the physical arrangement of the different types of heat exchangers, bearings, generator stator and cooling fluid flow directions of a hydropower plant is shown in Fig. 2.0. The bearings (UGB, LGB, THB and TGB) and the generator stator are considered as heat sources, and the cooling water as well as the ambient air act as heat sinks. Pictures of the TGB oil cooler and the THB oil cooler are shown in Fig. 2.1 and Fig. 2.2, respectively.

Fig.2.1 A picture of TGB-heat exchanger arrangement


Fig.2.2 A picture of THB & UGB heat exchanger arrangement

The THB and UGB oil cooler consists of two parallel shell-and-tube heat exchangers. Heat from the oil is transferred to the circulating cooling water. The interactions of the system variables with each other are shown in the heat transfer diagrams in Fig. 2.3 and Fig. 2.4.


Fig.2.3 Simplified heat transfer diagram


Fig.2.4 Detailed Heat transfer diagram
The temperature variations in bearing metal, bearing oil, cooling water, circulating air and the load
with time are shown in Fig. 2.5 during a typical running time. The temperatures were measured con-
tinuously during the running period as well as during the stopping period.





[Diagram content of Figs. 2.3 and 2.4: the heat sources (UGB, THB, LGB and TGB metal and the generator stator and rotor) transfer heat to the bearing oil and circulating air, which in turn reject it to the cooling water and ambient air (heat sinks).]



[Plot: load variation, Load MW and Load MVar against time in hours, recorded alongside the temperature traces.]

Fig 2.5 Bearing metal, oil, cooling water and circulating air temperatures and electrical load variation
When the plant is started from standstill, the bearing temperatures rise rapidly and stabilize at a certain level for the given generator load profile, as shown in Fig. 2.6. The temperature values are sampled at 10-minute intervals (sample data records 1365 to 1465, Appendix D).





Fig. 2.6 Variation of bearing metal and oil temperatures

When external parameters such as the cooling water flow rate and the cooling water or ambient air temperature vary, the heat absorption rate of the bearing oil coolers varies. Data relevant to these different operating conditions are given in Table 2.1. According to these data, when the cooling water temperature is high (29 °C) the bearing temperatures are also at higher values.
















Table 2.1 Temperature variation of the bearings with load and cooling water temperature.
(bearing temperatures are given as measured value / alarm limit, °C)
MW   MVar   Cwin (°C)   UGBm    THBm    LGBm    TGBm    UGBo    LGBo    TGBo
77   9      24.8        50/85   71/85   62/85   56/75   54/70   54/70   54/70
78   10     24.8        50/85   71/85   62/85   56/75   54/70   55/70   54/70
76   12     24.8        50/85   72/85   63/85   57/75   55/70   55/70   55/70
76   13     24.7        50/85   72/85   63/85   57/75   55/70   55/70   55/70
76   14     24.7        50/85   72/85   63/85   57/75   55/70   55/70   55/70
74   16     24.7        50/85   72/85   63/85   57/75   55/70   55/70   55/70
75   20     24.7        50/85   72/85   63/85   57/75   55/70   55/70   55/70
76   25     24.7        50/85   72/85   63/85   57/75   55/70   55/70   55/65
73   35     24.8        53/85   81/85   71/85   63/75   60/70   60/70   60/70
73   35     24.8        53/85   82/85   72/85   63/75   60/70   61/70   60/70
72   35     24.8        53/85   82/85   75/85   64/75   61/70   61/70   61/70
76   32     24.8        54/85   82/85   75/85   64/75   61/70   61/70   61/70
76   34     24.8        57/85   83/85   75/85   65/75   62/70   61/70   62/70
79   35     24.8        57/85   83/85   76/85   65/75   61/70   62/70   62/70
30   42     29          65/85   78/85   77/85   72/75   60/70   65/70   64/70
30   44     29          65/85   78/85   77/85   72/75   60/70   65/70   64/70
30   44     29          65/85   78/85   77/85   72/75   60/70   65/70   64/70
30   45     29          65/85   78/85   77/85   72/75   60/70   65/70   65/70
10   45     29          65/85   78/85   77/85   72/75   60/70   65/70   65/70
10   44     29          65/85   78/85   77/85   72/75   60/70   65/70   65/70
10   40     29          65/85   78/85   77/85   72/75   60/70   65/70   64/70

When one of the bearing temperatures reaches the alarm level of the machine, the plant has to be stopped, or an automatic shutdown takes place. A failure that occurred due to bearing overheating is shown in Fig. 2.7. It was observed that the THB temperature reached 83 °C with an increasing trend while the machine was running at a load of 77 MW, 37 MVar, and the machine was then manually stopped.

Fig.2.7 Failure due to bearing temperature rise
The bearing temperature variations show a clear relation to the electrical load (both MW and MVar) and the cooling water flow rates. The bearing metal temperatures depend on the initial conditions of the bearing, on external conditions such as the cooling water flow rate, cooling water temperature (ambient temperature), and on the electrical load of the generator. The parameters involved in the system are shown in Fig. 2.8.


Fig. 2.8 Representation of the system: HE1, HE3, HE4 - bearing oil coolers, HE2 stator air cooler.

From the first principles of thermodynamics, considering the heat transfer from the bearing metal to the oil,

$$Q_1 = U_1 A_{UGB}\,(T_{UGBm} - T_{UGBo})$$   (1)

Considering the heat transfer in heat exchanger 1 (HE1),

$$\eta_1\, C_{Oil}\,\dot{m}_{oil}\,(T_{UGBoin} - T_{UGBoout}) = C_{w}\,\dot{m}_{cw}\,(T_{cwout} - T_{cwin})$$   (2)

For HE2,

$$\eta_2\, Q_{Air} = C_{W}\,\dot{m}_{cw2}\,(T_{cwout} - T_{cwin})$$   (3)

where $Q_{Air}$ is the heat absorbed from the circulating air.

For HE3,

$$\eta_3\, C_{Oil}\,\dot{m}_{oil3}\,(T_{LGBoin} - T_{LGBoout}) = C_{w}\,\dot{m}_{cw3}\,(T_{cwout} - T_{cwin})$$   (4)

For HE4,

$$\eta_4\, C_{Oil}\,\dot{m}_{oil4}\,(T_{TGBoin} - T_{TGBoout}) = C_{w}\,\dot{m}_{cw4}\,(T_{cwout} - T_{cwin})$$   (5)

Again, the heat absorbed by the circulating air can be written as,

$$Q_{Air} = f(Q_{UGB},\, Q_{LGB},\, Q_{EL})$$   (6)

where $Q_{UGB}$, $Q_{LGB}$ and $Q_{EL}$ are the heat generated by the upper guide bearing, the lower guide bearing and the electrical load of the generator, respectively. Also $C_{Oil}$, $C_{Air}$ and $C_{CW}$ are the specific heat capacities of the bearing oil, air and cooling water, respectively, and $\eta_1$, $\eta_2$, $\eta_3$, $\eta_4$ are the efficiencies of the heat exchangers.

$$Q_{UGB} = U_{UGB} A_{UGB}\,(T_{UGB} - T_{UGBOil})$$   (7)

$$Q_{LGB} = U_{LGB} A_{LGB}\,(T_{LGB} - T_{LGBOil})$$   (8)

Also, the heat generated at the TGB can be expressed as,

$$Q_{TGB} = U_{TGB} A_{TGB}\,(T_{TGB} - T_{TGBOil})$$   (9)

where $U_{UGB}$, $U_{LGB}$, $U_{TGB}$ and $A_{UGB}$, $A_{LGB}$, $A_{TGB}$ are the heat transfer coefficients and surface areas of the upper guide bearing, lower guide bearing and turbine guide bearing, respectively.

The heat generated due to the electrical load can be written as,

$$Q_{EL} = f(L_e)$$   (10)

where $L_e$ is the electrical load.

Again, the overall heat transfer coefficient $U$ is itself a complex nonlinear function of the temperatures, the cooling water flow rates, the thermal conductivity of the material, the individual convection heat transfer coefficient of each fluid and the wall thickness, as given in equation (11) [6].

$$\frac{1}{UA} = \frac{1}{\alpha_1 A_1} + \frac{dx_{w}}{k\,A_{wall}} + \frac{1}{\alpha_2 A_2}$$   (11)
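As a numerical illustration of equation (11), the short Matlab sketch below evaluates the overall heat transfer conductance UA of one oil cooler and the corresponding heat flow; all numerical values are assumed placeholders, not measured plant data.

% Overall heat transfer conductance of an oil cooler, equation (11).
% All numerical values are illustrative assumptions, not plant data.
alpha_oil   = 400;     % oil-side film coefficient, W/(m^2 K)
alpha_water = 2500;    % water-side film coefficient, W/(m^2 K)
A_oil   = 6.0;         % oil-side surface area, m^2
A_water = 5.5;         % water-side surface area, m^2
A_wall  = 5.7;         % mean wall area, m^2
dx_w    = 0.002;       % tube wall thickness, m
k       = 45;          % wall thermal conductivity, W/(m K)

% 1/(UA) = 1/(alpha1*A1) + dx_w/(k*A_wall) + 1/(alpha2*A2)
UA = 1/( 1/(alpha_oil*A_oil) + dx_w/(k*A_wall) + 1/(alpha_water*A_water) );

dT = 15;               % assumed oil-to-water temperature difference, K
Q  = UA*dT;            % resulting heat flow, W
fprintf('UA = %.0f W/K, Q = %.1f kW\n', UA, Q/1000);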
Therefore, the system under investigation has multiple time-dependent inputs and multiple outputs (MIMO), and the interactions within the system are complex and nonlinear. All the inputs have to be processed in parallel to obtain the outputs. This type of computation cannot easily be implemented using conventional modelling techniques based on sequential computer programs and the first principles of thermodynamics. This topic is discussed in detail in section 4 under developing the model.
Hence, the neural network (NN) approach is well suited to modelling systems which exhibit the following characteristics. Due to these capabilities of NNs, many models developed in the past using other techniques are now being converted to NN models [7][8].

- Inputs and outputs have a cyclic, repetitive pattern of variation over time.
- Past input/output data which describe the characteristics of the system are available.
- The NN has the capability to identify the patterns that exist in a given data set.
- NNs can map the input data to the output data in a nonlinear system.
- NNs can process data in parallel, so they can be applied to MIMO systems easily.
- Dynamic systems can be modelled using time-delayed inputs to the network to represent previous time-series values.
- A NN needs to know little about the theory behind the process of the system.
The approach is described with the following steps:
1) As the system consists of several heat exchangers with different inputs and output bearing temperature variables, first the inputs (which characterize the behaviour of the system) and the outputs of the model are clearly identified.
2) Then historical data over a period are collected from past operation records.
3) Then an artificial NN is formed to model the system by mapping the inputs to the known outputs. The system is modelled using the MATLAB neural network toolbox.
4) The simulated results are compared with past actual outputs and the necessary adjustments are made to reach the required accuracy.
5) The model and results are discussed from an objective perspective.

3 Neural Networks
Neural networks (NN) play a vital role in the field of modelling and identifying the characteristics of nonlinear systems. Hence, this section describes the capabilities of NNs and the mathematical theory behind them. In section 3.5.1 it is shown, by example, how NN technology can be used to solve a nonlinear problem.
3.1 Introduction
Neural networks are capable of modelling complex MIMO systems with nonlinear characteristics. This makes NNs a powerful tool in the system modelling and identification field compared with conventional modelling techniques. NNs imitate the function of the human brain, or biological nervous system, which is made up of small units called neurons. The network is formed by connecting the neurons with each other through adjustable weights between neurons. A neural network can be trained, or adjusted, to produce a desired output (target) for a given input. Hence, when the input/output characteristics of a system are available as historical data, a NN can be trained to model the system. NNs have the capability of identifying the patterns that exist in the input/output data, if a pattern exists. Section 3.2 gives a formal definition of a NN.
3.2 Formal definition
The following formal definition, which describes the functionality of a neural network, was proposed by Hecht-Nielsen [9]:
An artificial neural network is a parallel distributed information processing structure consisting of processing units (which can possess a local memory and can carry out localized information processing operations) interconnected via unidirectional signal channels called connections. Each processing unit has a single output connection that branches (fans out) into as many collateral connections as desired; each carries the same signal, the processing unit output signal. The processing unit output signal can be of any mathematical type desired. The information processing that goes on within each processing unit can be defined arbitrarily with the restriction that it must be completely local: that is, it must depend only on the current values of the input signals arriving at the processing element via impinging connections and on values stored in the processing unit's local memory.
3.3 Biological Neuron
The human nervous system consists of about 1.3 × 10^10 neurons [10], distributed among the human brain and the other parts of the body. About 1 × 10^10 of these neurons are contained in the human brain itself [10]. The basic building block of the nervous system is the neuron, which contains four main parts and normally has a spherical shape. The cell body is called the soma and is surrounded by tree-like branches called dendrites, which receive signals from other neurons as shown in Fig. 3.1. The output of the neuron passes through the axon, whose length varies from a fraction of a millimetre to about 1 m in the human body [10][11].



Fig 3.1 Biological Neuron
(source: Artificial Neural Networks, ch. 1, EE543 Lecture Notes, METU EEE, Ankara, by Ugur Halici)

At the end of the axon it divides into branches called synapses, which transmit the signals to other neurons. There are about 10^3 to 10^4 synapses at each axon end. The incoming signals to the cell body, or soma, create an electrical potential due to the chemical changes that take place in the cell body. When this potential, called the action potential, exceeds a certain threshold, the neuron fires and transmits a pulse through the axon [10].
These neurons form a parallel distributed network in the nervous system which transmits the information gathered in the system to the brain, maintaining a communication link. Signal transmission is carried by electric pulses. The pulses passing through the axon have approximately constant amplitude but different time spacing, decided by the statistics associated with the incoming signals from the synaptic junctions of other neurons [10][11].
3.4 Mathematical Model of a neuron
The characteristics of a biological neuron can be represented in mathematical form as shown in Fig. 3.2. The main aspects of the biological neuron that need to be represented are the synapses and the actual activity taking place inside the neuron. The weight w models the synapse; the value of the weight determines the strength of the connection. An adder then sums all the inputs.

Fig. 3.2 Neuron as a model

$$a = f(wp + b)$$   (12)

The typical characteristics of a neuron can be expressed as in equation (12), where a, p and n are the output of the neuron, the input of the neuron and the input to the activation function of the neuron, respectively. The output of the neuron, a, is the outcome of a function f called the activation function. The activation function acts as a transforming function such that the output of a neuron lies between two defined values. Normally the lower and upper limits of the output are 0 to 1 or -1 to 1. Activation functions used in neural networks can take several forms [12].

Generally, three types of activation functions are commonly used. The first type acts as a threshold function: if the summed value exceeds a certain threshold the output is taken as 1, otherwise 0. This is called the step function, and in mathematical form it can be written as in equation (13).

$$f(n) = \begin{cases} 1 & n \ge 0 \\ 0 & n < 0 \end{cases}$$   (13)

The characteristic of the first type of activation function is shown in Fig. 3.3.

Fig. 3.3 Step function

The second type is the piecewise linear function. The output of this function lies in a linear region depending on the amplification factor, and can be expressed as in equation (14). A graphical form is shown in Fig. 3.4.

$$f(n) = \begin{cases} 1 & n \ge \tfrac{1}{2} \\ n & \tfrac{1}{2} > n > -\tfrac{1}{2} \\ 0 & n \le -\tfrac{1}{2} \end{cases}$$   (14)

Fig. 3.4 Piecewise linear function
The third type is the sigmoid function, which can take two forms: logistic sigmoid or tangential sigmoid. The logistic sigmoid function is also called logsig; its characteristics are shown in Fig. 3.3.

Fig. 3.3 Logsig function

The logsig function f can be expressed as in equation (15),

$$f(n) = \frac{1}{1 + e^{-n}}$$   (15)

The tangential sigmoid function is also called tansig; its characteristics are shown in Fig. 3.4.

Fig. 3.4 Tansig function

The tansig function f can be expressed as in equation (16),

$$f(n) = \frac{e^{n} - e^{-n}}{e^{n} + e^{-n}}$$   (16)

The tangential sigmoid has the advantage of handling negative numbers, as it maps its output to between -1 and +1, while the logsig function normalizes the output to between 0 and 1. In our case the logsig function is used as the activation function, as we do not deal with negative numbers.
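As a quick illustration (not part of the original thesis code), the following Matlab snippet evaluates the two sigmoid forms of equations (15) and (16) directly, without the Neural Network Toolbox, and plots their output ranges.

% Logistic and tangential sigmoid activation functions, eqs. (15) and (16).
n = linspace(-5, 5, 201);           % net inputs to the activation function

logsig_n = 1 ./ (1 + exp(-n));                          % eq. (15): output in (0, 1)
tansig_n = (exp(n) - exp(-n)) ./ (exp(n) + exp(-n));    % eq. (16): output in (-1, 1)

plot(n, logsig_n, 'b-', n, tansig_n, 'r--');
legend('logsig', 'tansig'); xlabel('n'); ylabel('f(n)');
title('Sigmoid activation functions');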
3.4.1 Neuron with multi-inputs
When there are several inputs to the same neuron, the model can be represented as shown in Fig. 3.5, where p1, p2, ..., pR are the input values and w11, w12, ..., w1R represent the corresponding weights.

Fig. 3.5 neuron with multi inputs
(source: neural network toolbox user guide)

Mathematically this can be represented in vector form as,

$$n = \begin{bmatrix} w_{11} & w_{12} & \cdots & w_{1R} \end{bmatrix} \begin{bmatrix} p_1 \\ p_2 \\ \vdots \\ p_R \end{bmatrix} + b$$   (17)

where the weight vector is 1 × R, the input vector p is R × 1, and the bias b and the net input n are 1 × 1 scalars.
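A minimal Matlab sketch of equation (17) is given below; the weights, bias and inputs are arbitrary assumed values, and the net input is passed through the logsig activation of equation (15).

% Single neuron with R inputs, eq. (17): n = W*p + b, a = f(n).
% Weights, bias and inputs are arbitrary illustrative values.
W = [0.4 -0.2 0.7];        % 1 x R weight row vector
p = [0.5; 0.1; 0.9];       % R x 1 input column vector
b = 0.1;                   % scalar bias

n = W*p + b;               % net input to the activation function
a = 1/(1 + exp(-n));       % logsig activation, eq. (15)
fprintf('n = %.4f, a = %.4f\n', n, a);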
3.4.2 Layer of Neurons
Similarly, a layer of neurons with multiple inputs can be modelled as shown in Fig. 3.6.

Fig. 3.6 Layer of neurons with multi inputs
(source: neural network toolbox user guide)



3.4.3 Multi-layer neurons

Fig. 3.7 Multi layer neurons (source: neural network toolbox user guide)
In a multi-layer neural network the relationship between the inputs and the outputs can be expressed, for the first layer, as

$$a^{1} = f^{1}\!\left(w^{1}_{11}\,p + b^{1}\right)$$   (18)

for layer two as

$$a^{2} = f^{2}\!\left(w^{2}_{21}\,a^{1} + b^{2}\right)$$   (19)

and similarly for the third layer,

$$a^{3} = f^{3}\!\left(w^{3}_{32}\,a^{2} + b^{3}\right)$$   (20)

From equations (18), (19) and (20),

$$a^{3} = f^{3}\!\left(w^{3}_{32}\,f^{2}\!\left(w^{2}_{21}\,f^{1}\!\left(w^{1}_{11}\,p + b^{1}\right) + b^{2}\right) + b^{3}\right)$$   (21)
3.4.4 General Structure of NN
The following are the typical major aspects common to any NN model:
- A set of input processing units
- A state of activation for each unit
- An output function for each unit
- A topology of the network that describes the pattern of connectivity among the processing units
- A rule defined to propagate or combine the activities of the processing units throughout the network
- A defined rule to activate and update the values received from the input neurons
- External data input that provides information about the environment
- A rule to modify the connectivity pattern based on the data which describes the environment

3.4.5 Training a neural network
Generally, neural networks are trained by adjusting the weights. At the beginning of the training process the bias values (b) and the weights are initialized randomly (random values are selected).
Training methods can be classified into several categories based on the method used by the network to learn the behaviour of the actual system by adjusting the weights of the network so as to obtain the desired output. These methods fall into two main types, called supervised learning and unsupervised learning.

Fig.3.8 Training process (source: Matlab user guide)
3.4.5.1 Supervised learning
In supervised learning, the network adjusts its weights by comparing the inputs and the corresponding outputs. The inputs are fed to the network input nodes batch by batch and the actual output is compared with the target. The error is used as feedback to adjust the bias values and weights. This process is repeated until an accepted preset value is reached. The process is depicted in Fig. 3.8; it is called supervised learning because the network is supervised by the target data during the training process. This method requires historical data which represent the behaviour of the system.
3.4.5.2 Unsupervised learning
In this method, when the inputs are fed into the network it creates its own outputs to represent those inputs. When the same inputs are fed to the network again it produces the same outputs as before. In this way the network classifies the inputs into several categories, or identifies the inputs. Compared to the previous method, this training does not need any external supervision.
The problem under investigation falls into the first category, where historical data are used to train the network.
3.4.6 Training process
The steps of the training process are as follows:
- Feed the first training sample to the NN. Initialize the thresholds and weights of the input, hidden and output nodes of the network with small random values.
- For each hidden unit calculate,

$$n_j = \sum_{i=1}^{R} w_{ji}\, p_i$$   (22)

$$a_j = \frac{1}{1 + e^{-n_j}} \qquad \text{for } j = 1, 2, \ldots, N$$   (23)

- For each output calculate,

$$O_k = \sum_{j=1}^{N} w_{kj}\, a_j$$   (24)

- For each neuron calculate a scaling factor in order to adjust the difference between the network's actual output and the desired output, in other words between the actual and the target.
- Adjust the weights of each neuron to reduce the local error,

$$w_{kj} = w_{kj} + \Delta_{kj}$$   (25)

- Move to the next training sample, and repeat the procedure for all training samples. At the end, compare the actual output with the target for each output neuron and calculate the mean squared error.
- The mean squared error is calculated by taking the difference between the target and the actual output, squaring it and summing over all the trials, then dividing by the number of trials M to obtain the mean value, as given in equation (26).

$$\text{MeanSquaredError} = \frac{1}{M} \sum_{j=1}^{M} \left( a_j - t_j \right)^2$$   (26)

When the mean squared error reaches the smallest achievable value, the corresponding trained network is saved. This is then used to test the performance of the network for new data.
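The sketch below (illustrative Matlab, not the toolbox routine used later in the thesis) runs a few epochs of the procedure above for a single logsig neuron trained by simple gradient descent and reports the mean squared error of equation (26); the data, learning rate and epoch count are all assumed for demonstration.

% Toy supervised training loop for a single logsig neuron (illustrative only).
rng(1);
P = rand(3, 20);                 % 3 inputs x 20 training samples (assumed data)
t = 1./(1 + exp(-(0.5*P(1,:) - 0.3*P(2,:) + 0.8*P(3,:))));   % synthetic targets

w   = 0.1*randn(1, 3);           % initial weights (small random values)
b   = 0;                         % initial bias
eta = 0.5;                       % learning rate (assumed)

for epoch = 1:200
    n = w*P + b;                 % eq. (22), all samples at once
    a = 1./(1 + exp(-n));        % eq. (23)
    e = a - t;                   % output error
    grad = e .* a .* (1 - a);    % logsig derivative times error
    w = w - eta*(grad*P')/size(P,2);   % eq. (25): adjust weights
    b = b - eta*mean(grad);
end
a   = 1./(1 + exp(-(w*P + b)));  % outputs with the final weights
mse = mean((a - t).^2);          % eq. (26)
fprintf('Final MSE = %.6f\n', mse);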
3.5 Demonstration of developing a NN by example
In this section a demonstration is given to explain how a problem is solved using NN technology. The problem is related to the modelling of nonlinear thermodynamic characteristics, and shows the capability of NNs.
3.5.1 Problem
The problem selected relates to the representation of the thermodynamic properties of steam. The enthalpy-entropy characteristic of steam is nonlinear. The characteristic curves for different values of x (steam quality) can be represented as shown in Fig. 3.9, for x = 0.8, 0.85, 0.9 and 0.95. These characteristics cannot easily be modelled using conventional mathematical models due to their complexity and nonlinear nature. Hence, some other method has to be used to model these characteristics.
[Chart: enthalpy h (kJ/kg), 2000-2900, versus entropy S, 5.0-9.0, for steam qualities x = 1.0, 0.95, 0.90, 0.85 and 0.80]
Fig 3.9 Enthalpy vs Entropy for Steam
The capability of NNs to model the nonlinear characteristics of a system is useful for modelling a system of this nature. The system can be considered as a model with two inputs, steam quality x and entropy s, and enthalpy h as the output, as shown in Fig. 3.10.
3.5.2 System as a NN model


Fig 3.10 NN model (inputs/ outputs)








3.5.3 Data used
The entropy and enthalpy data used to model the system are shown in Table 3.
Table 3: Entropy and enthalpy data for different values of steam quality x
Entropy S   Enthalpy (kJ/kg) for
            x = 1.0   x = 0.95   x = 0.90   x = 0.85   x = 0.80
5.0 2461 2461 2446 2446 2423
5.2 2561 2561 2534 2515 2450
5.4 2653 2630 2600 2538 2438
5.6 2723 2684 2630 2538 2400
5.8 2769 2715 2623 2500 2330
6.0 2800 2715 2600 2446 2265
6.2 2808 2693 2561 2400 2215
6.4 2793 2661 2523 2346 2165
6.6 2769 2638 2475 2300 2123
6.8 2746 2600 2446 2261 2076
7.0 2719 2569 2400 2230 2038
7.2 2692 2542 2369 2200 2015
7.4 2669 2507 2338 2176
7.6 2650 2476 2318 2146
7.8 2623 2453 2293 2130
8.0 2600 2438 2276
8.2 2576 2423 2261
8.4 2553 2400
8.6 2538 2384
8.8 2523
9.0 2515

This data has to be normalized to bring the values into the range 0-1. We take normalized entropy s = S/10 and normalized enthalpy h = H/10000 to feed into the NN model as inputs and targets in the training process.
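A small Matlab illustration of this normalization step is given below (the numerical values are taken from the first rows of Table 3); dividing by 10 and 10000 keeps both inputs and targets within the 0-1 range expected by the sigmoid units.

% Normalizing the steam data before training (illustrative values from Table 3).
S = [5.0 5.2 5.4];          % entropy
H = [2461 2561 2653];       % enthalpy at x = 1.0, kJ/kg
x = [1.0 1.0 1.0];          % steam quality

s = S/10;                   % normalized entropy, now 0.50-0.54
h = H/10000;                % normalized enthalpy, now about 0.25-0.27

p = [x; s];                 % 2 x N input matrix fed to the network
t = h;                      % 1 x N target vector
disp([p; t])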
3.5.4 Training
The training data set consists of 63 input/output data pairs (s, x), where s and x are the entropy and steam quality, respectively. Entropy s ranges from 5 to 9 for different values of x ranging from 0.8 to 0.95. As the system has two inputs (x, s), the input layer consists of two neurons, and the output layer has one neuron to represent the output (h). In this case two hidden layers are selected, comprising 3 neurons and 2 neurons for hidden layer 1 and hidden layer 2, respectively, as shown in Fig. 3.11. Initially, the number of hidden layers and the number of neurons in each layer are selected randomly. Later, they are changed so as to obtain the optimized performance of the model by minimizing the error.

Fig. 3.11 NN model
In the implementation of this network in Matlab, it can be represented as,
nnet = newff(pr, [2 3 2 1], {'tansig' 'tansig' 'tansig' 'tansig'}, 'trainbr');
where
newff creates a feed-forward back-propagation network,
pr represents the range of the input data,
[2 3 2 1] gives the number of neurons in each layer (input, hidden layer 1, hidden layer 2 and output layer),
tansig is a transfer function; transfer functions calculate a layer's output from its net input, and
trainbr is a network training function that updates the weight and bias values according to Levenberg-Marquardt optimization. It minimizes a combination of squared errors and weights and then determines the correct combination so as to produce a network which generalizes well. This process is called Bayesian regularization [13].
Full code listing of the Model training program is given below.
% develops a model to steam entropy enthalpy Characteristics (10 jun 2010)
% Trains ,validates and tests new data
close all;
clear all;
tic;
file=xlsread('steam','a23:c86'); % loads xl data file
toc;
tic;
B=file(:,1:2); % loads inputs x, s
C=file(:,3:3); % loads outputs h
p=B'; % inputs
t=C'; % targets
Q=6;
n=63;
dtst=14:Q:n; % divides data for training validation
dval= [ 13:Q:n ]; % and testing
dtrn=[1:Q:n 2:Q:n 3:Q:n 4:Q:n 5:Q:n 6:Q:n 7:Q:n 8:Q:n 9:Q:n 10:Q:n 11:Q:n 12:Q:n ];
val.P=p( : , dval); % validation data
val.T=t( : , dval);
test.P=p( : , dtst); % test data
test.T=t( : , dtst);
ptr=p( : , dtrn); % training data
ttr=t( : , dtrn);
nnet=network; % creates network
pr=minmax(p);
nnet=newff(pr,[2 3 2 1],{'tansig' 'tansig' 'tansig' 'tansig'},'trainbr');
nnet.trainParam.epochs = 1000;
nnet.trainParam.show = 1;
nnet.trainParam.lr = 0.01 % SETS ETA learning rate
[nnet,tr]=train(nnet,ptr,ttr,[],[],val,test);
plot(tr.epoch,tr.perf,tr.epoch,tr.vperf,tr.epoch,tr.tperf)
legend('Training' , 'Validation' , 'Test', -2);
ylabel('Squared Error');
xlabel('Epoch ');
title(' Model Performance');
a = sim(nnet,p); % simulates
figure(1)
t1=t(1:1,1:63); % target
a1=a(1:1,1:63); % simulated output
plot(2:64,a1,'.',2:64,t1,'r-')
ylabel('Output');
xlabel('Entropy S ');
title(' Predicted vs Actual');
% writes data into xl file
SUCESS=XLSWRITE('steam_op.xls',a1','b2:b64')
SUCESS=XLSWRITE('steam_op.xls',t1','c2:c64')
toc;
% regression analysis
figure(2) %
[m(1),b(1),r(1)]=postreg(a1,t1);
% end

In the training process, training is done iteratively for a number of epochs (iterations) until the error (in this case SSE, the sum of squared errors) reaches a predefined value. The variation of the performance, training, validation and testing errors during the training process is shown in Fig. 3.12. After around 260 epochs (iterations), the training, validation and testing errors have reached 1.59084e-005, 1.40373e-006 and 6.53338e-007, respectively.

Fig.3.12 Training error of the model
3.5.5 Simulation
Simulation is done in order to test the trained model to see whether it performs well for the new data
(unseen data) fed to the model. In this case data relevant to x=1.0 was selected as the new data to test
the model.
Model simulation (testing) matlab code listing
% Tests the model for new data
% of steam x= 1.0 (13 Jun 2010)
close all;
clear all;
tic;
A=XLSREAD('steam'); % loads xl data file
load model; % trained network model
net1= model;
toc;
% column 1 and 2 is input data
rec_start=2;
rec_end=22;
n=rec_end-rec_start; % no of records
tic;
B=A(2:22,3:3); % loads data output (entropy,h)
C=A(2:22,1:2); % loads inputs ( x, s)
p=C';
t=B';
toc;
pr=minmax(p);
tic;
a = sim(net1,p); % simulate the model
figure(1) % graph 1
t1=t(1:1,1:n); % expected target
a1=a(1:1,1:n); % simulated target
plot(1:n,a1,'bx',1:n,t1,'r-')
ylabel('Enthalpy / KJ/ Kg Pu')
xlabel('Entropy, S / KJ/ Kg')
legend('simulated ' , 'actual ');
title('Entropy vs Enthalpy for steam x =1.0')
% end
3.5.6 Results
The simulation results of the model shown in Fig. 3.13 illustrate the characteristic curves relevant to the different steam qualities. The characteristics generated by the NN model are almost identical to the actual values. This result shows how well the model has captured the characteristics of the data set used for training.
The NN has satisfactorily modelled the characteristics of steam. The regression analysis for the simulation, shown in Fig. 3.14, confirms the performance of the model.


Fig.3.13 Simulated vs Actual characteristics


Fig. 3.14 Regression analysis result for the model

The simulated results for data unseen by the model, corresponding to x = 1.0, are shown in Fig. 3.15, which compares the actual characteristics with the simulated result. The model accurately simulates the steam characteristics; the corresponding regression analysis results are shown in Fig. 3.16.
Hence, using this model, characteristic curves corresponding to intermediate values of x such as x = 0.775, 0.825, 0.875, 0.925 and 0.975 can also be obtained, as shown in Fig. 3.17.

Fig.3.15 Simulated (predicted) characteristics for new data for x=1.0


Fig.3.16 regression analysis results for x=1.0



Fig.3.17 Predicted characteristics for new data x=0.775, x=0.825, x=0.875, x=0.925, x=0.975 and x=1.0


4 Developing the model

This section describes the approach and steps followed to develop a dynamic model to simulate the bearing temperature variation of a hydro-electric power generating unit with time, electrical load, duration of operation and other environmental factors.
4.1 Selection of input variables
The input variables which affect the characteristics of the system under investigation are listed below.
TLGBm          Lower guide bearing metal temperature
TUGBm          Upper guide bearing metal temperature
TTGBm          Turbine guide bearing metal temperature
TTHBm          Thrust bearing metal temperature
TLGBoil        Lower guide bearing oil temperature
TUGBoil        Upper guide bearing oil temperature
TTGBoil        Turbine guide bearing oil temperature
TTHBoil        Thrust bearing oil temperature
Tcoolingwater  Cooling water temperature
Tair           Circulating air temperature
mdotCW         Cooling water flow rate
mBCW           Bearing cooler water flow rate
Le             Electrical load (MW)
Lvars          Electrical load (MVar)

The main concern is to simulate the temperature variation pattern of TLGB, TUGB, TTHB, TTGB, TLGBoil, TUGBoil, TTGBoil and TTHBoil. However, the values of these variables depend not only on their current values but also on their previous values. This can be illustrated in a more general form as shown in Fig. 4.1, where Xi are the temperature-related inputs, Mi the flow-rate-related inputs and Li the load-related inputs.


Fig.4.1 System as a static model

where,
Xi = { TUGBm(0), TTHBm(0), TLGBm(0), TUGBo(0), TLGBo(0), TTGBo(0), TUGBm(t-2T), TUGBm(t-T), TUGBm(t), TTHBm(t-2T), TTHBm(t-T), TTHBm(t), TLGBm(t-2T), TLGBm(t-T), TLGBm(t), TTGBm(t-2T), TTGBm(t-T), TTGBm(t), TUGBo(t-2T), TUGBo(t-T), TUGBo(t), TLGBo(t-2T), TLGBo(t-T), TLGBo(t), TTGBo(t-2T), TTGBo(t-T), TTGBo(t), TCW(t-2T), TCW(t-T), TCW(t), TCA(t-2T), TCA(t-T), TCA(t) }
Mi = { mdot1(t-2T), mdot1(t-T), mdot1(t), mdot2(t-2T), mdot2(t-T), mdot2(t) }
Li = { Lmw(t-2T), Lmw(t-T), Lmw(t), Lmv(t-2T), Lmv(t-T), Lmv(t) }
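To clarify how these delayed inputs are assembled in practice, a short Matlab sketch is given below. It builds the delayed-input matrix for a single temperature channel from a sampled series; the example series and its values are assumed for illustration, while the 10-minute sampling interval follows this chapter.

% Building time-delayed inputs x(t-2T), x(t-T), x(t) for one temperature channel.
% T_UGBm is a made-up example series; in the thesis the data come from the plant
% data recorder at 10-minute intervals.
T_UGBm = [42 44 46 48 50 51 52 52 53 53];   % sampled bearing metal temperature, C

N = numel(T_UGBm);
X = zeros(3, N-2);                 % each column = [x(t-2T); x(t-T); x(t)]
for k = 3:N
    X(:, k-2) = [T_UGBm(k-2); T_UGBm(k-1); T_UGBm(k)];
end
% The target for training the static model is the next sample x(t+T):
target = T_UGBm(4:end);            % one column fewer than X
X = X(:, 1:end-1);
disp(X); disp(target)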
4.2 Selection of data
As a case study, a set of data records was obtained from the Victoria hydro power station, Sri Lanka. It is a vertical-shaft turbine-generator unit with an electrical generating capacity of 71 MW. The data set was extracted from eight channels of the DAQSTANDARD R8.11 data recorder and contains bearing metal temperatures, oil temperatures, cooling water flow rates and the generator electrical load. The data set consists of 3623 records, as given in Appendix D. The sampling period of the data was 10 minutes.
4.3 Approach of developing a dynamic model
Sections 4.3.1 to 4.3.4 describe, step by step, how the model is developed, starting from the selection of variables and leading to a model that characterizes the system.
4.3.1 Developing a static NN model
As discussed in section 4.1 and shown in Fig. 4.1, there are two types of input variables to the model. The first type is the temperature-dependent variables (bearing metal temperatures, bearing oil temperatures, cooling water temperature and circulating air temperature), denoted by Xi. The second type comprises the bearing cooling water flow rates, which do not change due to the performance of the system, and the electrical load, which directly affects the bearing metal and bearing oil temperatures.

The variables that interact with the system can also be classified into two categories: external variables and internal variables. The electrical load, cooling water temperature and circulating air temperature act as external factors, while the initial bearing metal temperature and bearing oil temperature act as internal variables of the system. In a system of this nature, the output values depend on the present state as well as the previous state of the system.
In mathematical form, the general behaviour of the system can be defined as,
State equation,

$$S(t+T) = f(S(t), X(t), w)$$   (27)

Output equation,

$$y(t) = h(S(t), w)$$   (28)

where S represents the state vector, X the external input vector and w the neural parameter vector (synaptic connection weights and operational parameters); f(.) is the function that represents the structure of the neural network, and h(.) is a function that represents the relationship between the state vector S(t) and the output vector y(t) [13].
The output of the system depends not only on the current inputs but also on previous values. Therefore, previous time-series values also have to be considered; sometimes, in order to obtain reasonable accuracy, several previous states have to be considered. Therefore, some sort of memory capability has to be introduced to the model. The temperature variations are continuously varying functions, but as we consider sampled inputs at a chosen time interval the model becomes a discrete system. Hence, the memory capability can be incorporated by giving a series of time-delayed inputs to represent previous states [14].

Fig 4.2 Representation of a NN for prediction

The selection of time-delayed inputs to represent the previous states in a NN structure, and the prediction of the output, are shown in Fig. 4.2 for one time-dependent variable. Equations (27) and (28) describe the behaviour of a first-order system, which takes into account the previous state (with a one-step time delay) of the variables. In general, an nth-order system can be described as,
State equation,

$$S(t+T) = f\big(S(t), S(t-T), S(t-2T), \ldots, S(t-[n-1]T), X(t), w\big)$$   (29)

Output equation,

$$y(t) = h(S(t), w)$$   (30)
Two models were developed, of second order and third order. By comparing their performance, or output error, the better model with the lowest error can be selected. However, the higher the number of time-series values (the order of the network), the number of hidden layers and the number of neurons in each layer, the higher the computing power (memory and processor speed) required. Hence, a compromise between accuracy and computing power is needed.
In a second-order system the two previous states have to be considered. Therefore, in order to predict the bearing temperature value at time t, the bearing temperatures at times (t-T) and (t-2T) also have to be considered. Together with the bearing metal temperature, bearing oil temperature, cooling water temperature, circulating air temperature and electrical load (MW, MVar), this makes 32 inputs to the model. The intention is to predict the four bearing metal temperatures, but since the bearing oil temperatures, cooling water temperature and circulating air temperature also affect them, the number of outputs becomes 9 (TUGBm, TTHBm, TLGBm, TTGBm, TUGBOil, TLGBOil, TTGBOil, Tcw, TCA). The initial architecture of the NN therefore takes the shape of 32 input nodes and 9 output nodes, as shown in Fig. 4.2. Two hidden layers are arbitrarily selected; this can be changed if necessary during the process of training the network [15][16][17].
The initial architecture then becomes (32, 24, 15, 9), where the number of outputs is fixed and the number of inputs can be changed according to the number of previous states considered, at intervals such as t-T, t-2T, etc., depending on the accuracy or the training error. The training, validation and testing errors indicate to what extent the model fits the actual system behaviour.

Fig 4.2 General NN architecture
4.3.2 Training the network and training results
For training the network, 500 data records consisting of past input and output data were used. Initially, networks considering the time intervals (t, t-T) and (t, t-T, t-2T) were formed, with 23 inputs and 9 outputs and 32 inputs and 9 outputs, respectively. Four different architectures were then selected by varying the number of previous states considered, the number of hidden layers and the number of nodes in the hidden layers, as shown in Table 4.1.
Table 4.1: Model Architecture
Model no   Architecture
1          23, 15, 12, 9    second order, 4 layers
2          32, 21, 9        third order, 3 layers
3          32, 28, 16, 9    third order, 4 layers
4          32, 25, 15, 9    third order

Training, testing and validation results for the models are shown in Table 4.2. The model with the (32, 25, 15, 9) architecture gives the best performance, with the lowest training error of 0.0689 SSE.

Table 4.2: Model Performance
Model no Architecture Training Error (SSE)
1 23, 15, 12, 9 second order 1.4634
2 32, 21, 9 third order 0.2478
3 32, 28, 16, 9 third order 0.1189
4 32, 30, 16, 9 third order 0.0689
5 32, 40, 26, 9 third order 0.0011

In order to improve the training performance of the model, the whole system was divided into two sub-units and two separate models were developed for the individual sub-units. The new approach is shown in Fig. 4.3.


Fig. 4.3 Sub-models to represent the system
The decomposed model with two sub-models shows better training performance than the single model. The architectures selected for the sub-models are (19, 50, 25, 5) and (17, 50, 30, 6), respectively. UGBm, THBm, UGBo, CW and CA, with their delayed time-series inputs, together with MW, MVar and the bearing cooling water flow rate, act as inputs to model 1. Similarly, LGBm, TGBm, LGBo and TGBo, with their delayed time-series inputs, together with MW, MVar and the bearing cooling water flow rate, act as inputs to model 2. The training performance is shown in Table 4.3 for model 1 and model 2, respectively.
Table 4.3: Sub Model Performance

Model no.   Architecture     Training Error (MSE)
1           19, 50, 25, 5    3.04 × 10^-7
2           17, 50, 30, 6    3.80 × 10^-7
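Note that the single-model results in Table 4.2 are quoted as a sum of squared errors (SSE), whereas the sub-model results in Table 4.3 are quoted as a mean squared error (MSE). For N training samples and q outputs the two measures are related by

    MSE = SSE / (N · q)

so the error values in the two tables are not directly comparable without accounting for the number of samples and outputs.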

A graphical representation of the convergence of the errors to a minimum value during the training process is shown in Fig. 4.4.



Fig. 4.4 Training Performance of Model 1
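Curves of this kind are produced directly from the training record tr returned by the toolbox train function, as in the plotting commands of the training script in Appendix D; a minimal sketch, assuming the network nnet and the training, validation and test sets (ptr, ttr, val, test) have been prepared as in that script:

% Minimal sketch: plot training, validation and test error convergence
% (variable names follow the training script of Appendix D).
[nnet, tr] = train(nnet, ptr, ttr, [], [], val, test);
plot(tr.epoch, tr.perf, tr.epoch, tr.vperf, tr.epoch, tr.tperf)
legend('Training', 'Validation', 'Test');
ylabel('Squared Error'); xlabel('Epoch');
title('Model Performance');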
4.3.3 Static model simulation results
Our approach is first to develop (train) a static model to simulate the behavior of the real system and then to convert it into a dynamic model by arranging a feedback of internal variables as inputs to the model. The simulated outputs are compared with the actual outputs to evaluate the performance of the static model.
4.3.4 Developing the dynamic model
As described in the previous section, in order to model the temporal nature of the system as well as the effect of the internal variables, the general architecture of the model should be as shown in Fig. 4.5, where Xi(0) denotes the initial conditions.

Fig 4.5 NN Dynamic model




Algorithm of the simulation:

read Xi(0), the initial conditions (bearing metal and oil temperatures)
read Xi(t), Xi(t-T), Xi(t-2T), the bearing metal and oil temperatures
read Mi(t), Li(t), the cooling water flow rates, circulating air temperature and electrical load
make the input matrix
load the trained neural network
decide the time duration n
loop up to n records
    simulate and get the output Xi(t+T)
    update the inputs
    record the output
end
plot graph of Xi(t), simulated and actual
For the corresponding Matlab implementation (Matlab code), see appendix C.
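A minimal Matlab sketch of this simulation loop is given below. It is illustrative only: the network file name (vic_100B) is taken from the training script in Appendix D, while the index layout of the input vector (i0, i1, i2, ie), the fed-back output rows (oIdx), the initial conditions and the exogenous load and cooling water profile are assumptions made for the example rather than the exact implementation used for the results in section 5.

% Minimal sketch of the dynamic simulation loop (illustrative only).
% i0, i1, i2 : rows of the input vector x holding the temperatures at t, t-T, t-2T
% ie         : rows of x holding the exogenous inputs (MW, MVAr, flow rate, air temp.)
% oIdx       : rows of the network output fed back as the new values at time t
load vic_100B;                          % trained network saved by the training script
nnet = vic_100B;
n    = 144;                             % simulate 144 ten-minute steps (24 h)
i0 = 1:9; i1 = 10:18; i2 = 19:27;       % assumed layout of the 31-row input vector
ie = 28:31; oIdx = 1:9;
x    = zeros(31,1);                     % initial conditions Xi(0), filled from measurements
exog = repmat([75; 10; 1; 25], 1, n);   % assumed constant load/flow/air profile
ySim = zeros(numel(oIdx), n);
for k = 1:n
    y         = sim(nnet, x);           % one-step-ahead prediction at t+T
    ySim(:,k) = y(oIdx);
    x(i2) = x(i1);                      % age the history: t-T -> t-2T
    x(i1) = x(i0);                      %                  t   -> t-T
    x(i0) = y(oIdx);                    % feed the prediction back as the new time-t values
    x(ie) = exog(:,k);                  % exogenous inputs for the next step
end
plot(1:n, ySim)                         % simulated temperature trajectories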
Next section presents the dynamic simulation results obtained from the model.



5 Results
5.1 Static model simulation results
5.1.1 Static model simulation results for bearing metal temperature

Fig. 5.1. Simulation results of the static model

Output results obtained from the static model are shown in Fig. 5.1. They represent the UGB, LGB, THB and TGB metal temperature variation with time for a given generator load profile.
5.1.2 Correlation coefficients of the static simulation results
The corresponding correlation coefficient results for the UGB and LGB, and the THB and TGB, bearing metal temperatures are shown in Fig. 5.2 and Fig. 5.3, respectively.
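These correlation coefficients are obtained with the post-training regression analysis used in the training script (Appendix D): for each output channel the simulated series is regressed against the recorded series, for example:

% Post-training regression for one output channel (cf. Appendix D):
% a1 - simulated UGB metal temperature, t1 - recorded values
[m, b, r] = postreg(a1, t1);   % slope m, intercept b, correlation coefficient r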

Fig. 5.2. Correlation results for UGB and LGB metal temperatures


Fig. 5.3. Correlation results for THB and TGB metal temperatures







5.1.3 Static model simulation results for bearing oil temperature

Fig. 5.4. Simulation results of the static model for bearing oil temperature
Simulation results of the static model for the UGB, THB and TGB bearing oil temperatures and the corresponding correlation coefficient graphs are shown in Fig. 5.4 and Fig. 5.5, respectively.











5.1.4 Correlation coefficients of the simulation for bearing oil temperature

Fig.5.5. Correlation coefficients of simulation on bearing oil temperature
A summary of the static model simulation results is shown in Fig. 5.6 and Fig. 5.7 for the bearing metal and bearing oil temperatures, respectively.
5.1.5 Summary results of static model

Fig. 5.6. Simulation results of static model for all four bearing metal temperatures


Fig. 5.7 Simulation results of static model for bearing oil temperatures
5.2 Dynamic simulation results
5.2.1 Dynamic simulation results for bearing metal temperature

Fig. 5.8 Dynamic simulation results for bearing metal temperature

5.2.2 Dynamic simulation results for bearing oil temperature

Fig 5.9 Dynamic simulation results for bearing oil temperature

Dynamic model simulation results for the bearing metal temperature variation and the bearing oil temperature variation, for new data unseen by the model, are shown in Fig. 5.8 and Fig. 5.9, respectively.
5.3 Dynamic simulation results for reduced flow rate
5.3.1 Bearing metal temperature variation


Fig 5.10 Dynamic simulation results of bearing metal temperature rise for reduced cooling water flow
rate

A new input data set, representing a different operating environment (i.e. a reduced cooling water flow rate), was presented to the trained model. The simulated output given by the model is shown in Fig. 5.10 for the bearing metal temperature variation and in Fig. 5.11 for the bearing oil temperature variation.
Both the bearing metal and oil temperatures rise above their normal operating values due to the reduced cooling effect of the heat exchangers caused by the 10% reduction in cooling water flow rate.
5.3.2 Bearing oil temperature variation

Fig. 5.11 Dynamic simulation results of bearing oil temperature rise for reduced cooling water flow rate












6 Discussion
In this research work, a Neural Network modelling approach was used to model the bearing heat exchanger system of a hydroelectric power generating unit. The results shown in section 5 include the performance obtained from the static model for the bearing metal temperatures. The static simulation was carried out in order to test the accuracy of the static model. The correlation coefficient results shown in Fig. 5.2 and Fig. 5.3 indicate the agreement between the actual and simulated results.
Then, as discussed in section 4.3.4, the dynamic simulation model was developed using the above results. The results obtained from the dynamic simulation for data not used in training or testing are shown in Fig. 5.8 and Fig. 5.9 for the bearing metal temperatures and the bearing oil temperatures, respectively. These results show an accuracy of about 1.0 °C for the bearing metal temperatures and about 2.0 °C for the bearing oil temperatures when compared with the actual past performance of the system.
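These accuracy figures can be checked by comparing the simulated and recorded series directly; a simple sketch (variable names assumed, one row per bearing temperature):

% Deviation in deg C between simulated (ySim) and recorded (yAct) series
% (hypothetical variable names, one row per bearing, one column per sample)
err     = abs(ySim - yAct);
maxErr  = max(err, [], 2);     % worst-case deviation for each bearing
meanErr = mean(err, 2);        % average deviation for each bearing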
To improve the accuracy, more past data need to be fed to the model to cover all possible input combinations, together with several previous values of the inputs. With a higher number of inputs to the NN model, the numbers of input layer neurons and intermediate layer neurons increase. Therefore, higher computing capacity, in terms of memory and processor speed, is needed to train the network.
In section 5.3, the behavior of the heat exchanger system was tested for a reduced cooling water flow rate with the same load profile, as commonly happens in the operation of power plants. As shown in Fig. 5.10 and Fig. 5.11, both the bearing metal temperatures and the bearing oil temperatures rise above the normal stabilized temperature level due to the reduced heat absorption rate caused by the 10% reduction in cooling water flow rate.









7 Conclusions
Continuous operation of old hydropower plants has been constrained by failures due to bearing overheating. The objective of this work was the modelling and simulation of the dynamic variation of the temperatures of the bearings (generator guide bearing, turbine guide bearing, thrust bearing) of a hydroelectric generating unit. The temperature of a bearing depends on multiple variables such as ambient air temperature, cooling water temperature, cooling water flow-rate, initial bearing temperatures, duration of operation and the generating unit electrical load.
As the problem under investigation was a multi-input (MI) and multi-output (MO) system, a conventional first-principles modelling approach and sequential computer programs could not be applied. Therefore, the neural network (NN) method was selected as the most suitable approach, since past input and output data are available and the input characteristics can be mapped in order to develop a model. The NN capability of parallel processing was used to develop a model of the system, and this was implemented in the MATLAB environment. According to the simulation results, the model demonstrates a reasonable accuracy (about 2 °C) in predicting the temperature variation for a given generator load profile.
Hence, this model can be used to predict the temperature variation characteristics of the system. A temperature increase in the ambient air or the cooling water (for example due to a reduced cooling water flow rate) would increase the temperature level of the bearing metal and oil. Using this model, it is possible to predict the temperature increase for a given generator load profile over a given period. This will help to determine the maximum safe load level, maximizing the plant factor while minimizing sudden failures due to bearing overheating.












8 References
[1] Energy Information Administration international statistics database, www.eia.doe.gov, accessed 04-03-2010
[2] Renewables Global Status Report 2006 Update, REN21, published 2007, accessed 04-03-2010; see Table 4, p. 20
[3] Statistical Digest 2009, Ceylon Electricity Board, Sri Lanka
[4] http://www.machinerylubrication.com/Read/367/temperature-stability, Machinery Lubrication, accessed 2010-03-02
[5] R. K. Sharma, T. K. Sharma, A Text Book of Water Power Engineering, S. Chand & Company Ltd, pp. 450-455, 2000
[6] http://www.engineeringtoolbox.com/overall-heat-transfer-coefficient-d_434.html, The Engineering ToolBox, accessed 22-10-2010
[7] Girish Kumar Jha, Artificial Neural Networks and its Applications, IARI, New Delhi, pp. 41-42
[8] Raul Rojas, Neural Networks: A Systematic Introduction, Springer-Verlag, Berlin, New York, pp. 5-27, March 1996
[9] R. Hecht-Nielsen, Neurocomputing, Addison-Wesley, Reading, Mass., p. 18, 1990
[10] Ugur Halici, Artificial Neural Networks, EE543 Lecture Notes, METU EEE, Ankara
[11] http://www.learnartificialneuralnetworks.com/, Artificial Neural Network tutorial, accessed 24-06-2010
[12] Matlab Neural Network Toolbox User's Guide, The MathWorks Inc., 1992-2010
[13] Stuart Russell, Peter Norvig, Artificial Intelligence: A Modern Approach, Pearson Inc., pp. 24, 693-695, 727-736, 1995
[14] Stable nonlinear system identification using neural network models, in Neural Networks in Robotics, G. Bekey and K. Goldberg, Eds., Norwell, MA: Kluwer, pp. 147-164, 1992
[15] http://www.obitko.com/tutorials/neural-network-prediction/prediction-using-neural-networks.html, Prediction using Neural Networks, accessed 17-11-2010
[16] Madan M. Gupta, Liang Jin and Noriyasu Homma, Static and Dynamic Neural Networks: From Fundamentals to Advanced Theory, John Wiley & Sons Inc., Hoboken, New Jersey, pp. 27-31, 297-387, 2003
[17] James A. Freeman, David M. Skapura, Neural Networks: Algorithms, Applications, and Programming Techniques, Pearson Inc., pp. 12-30, 89-93, 1999



Appendix A: NN initial weight and bias values (NN example)

nnet =
    Neural Network object:
    architecture:
         numInputs: 1
         numLayers: 4
       biasConnect: [1; 1; 1; 1]
      inputConnect: [1; 0; 0; 0]
      layerConnect: [4x4 boolean]
     outputConnect: [0 0 0 1]
     targetConnect: [0 0 0 1]

        numOutputs: 1  (read-only)
        numTargets: 1  (read-only)
    numInputDelays: 0  (read-only)
    numLayerDelays: 0  (read-only)

    subobject structures:
            inputs: {1x1 cell} of inputs
            layers: {4x1 cell} of layers
           outputs: {1x4 cell} containing 1 output
           targets: {1x4 cell} containing 1 target
            biases: {4x1 cell} containing 4 biases
      inputWeights: {4x1 cell} containing 1 input weight
      layerWeights: {4x4 cell} containing 3 layer weights

    functions:
          adaptFcn: 'trains'
           initFcn: 'initlay'
        performFcn: 'mse'
          trainFcn: 'trainbr'

    parameters:
        adaptParam: .passes
         initParam: (none)
      performParam: (none)
        trainParam: .epochs, .show, .goal, .time,
                    .min_grad, .max_fail, .mem_reduc, .mu,
                    .mu_dec, .mu_inc, .mu_max, .lr

    weight and bias values:
                IW: {4x1 cell} containing 1 input weight matrix
                LW: {4x4 cell} containing 3 layer weight matrices
                 b: {4x1 cell} containing 4 bias vectors

    other:
          userdata: (user stuff)
Appendix B: Training record (NN example)
TRAI NBR, Epoch 0/ 1000, SSE 18. 3133/ 0, SSW 1279. 74, Gr ad
4. 87e+001/ 1. 00e- 010, #Par 2. 60e+001/ 26
TRAI NBR, Epoch 1/ 1000, SSE 1. 88459/ 0, SSW 139. 986, Gr ad
2. 35e+001/ 1. 00e- 010, #Par 2. 08e+000/ 26
TRAI NBR, Epoch 2/ 1000, SSE 0. 173437/ 0, SSW 136. 477, Gr ad
6. 31e+000/ 1. 00e- 010, #Par 3. 42e+000/ 26
TRAI NBR, Epoch 3/ 1000, SSE 0. 0269173/ 0, SSW 135. 249, Gr ad 8. 41e-
001/ 1. 00e- 010, #Par 3. 66e+000/ 26
TRAI NBR, Epoch 4/ 1000, SSE 0. 0227162/ 0, SSW 135. 158, Gr ad 6. 38e-
002/ 1. 00e- 010, #Par 5. 74e+000/ 26
TRAI NBR, Epoch 5/ 1000, SSE 0. 019955/ 0, SSW 135. 043, Gr ad 1. 20e-
001/ 1. 00e- 010, #Par 5. 82e+000/ 26
TRAI NBR, Epoch 6/ 1000, SSE 0. 015607/ 0, SSW 134. 917, Gr ad 2. 41e-
001/ 1. 00e- 010, #Par 6. 22e+000/ 26
TRAI NBR, Epoch 7/ 1000, SSE 0. 0102373/ 0, SSW 134. 792, Gr ad 4. 52e-
001/ 1. 00e- 010, #Par 6. 69e+000/ 26
TRAI NBR, Epoch 8/ 1000, SSE 0. 0065884/ 0, SSW 134. 151, Gr ad 4. 35e-
001/ 1. 00e- 010, #Par 7. 39e+000/ 26
TRAI NBR, Epoch 9/ 1000, SSE 0. 00592687/ 0, SSW 133. 443, Gr ad 5. 50e-
001/ 1. 00e- 010, #Par 8. 17e+000/ 26
TRAI NBR, Epoch 10/ 1000, SSE 0. 00455401/ 0, SSW 133, Gr ad 3. 08e-
001/ 1. 00e- 010, #Par 8. 08e+000/ 26
TRAI NBR, Epoch 11/ 1000, SSE 0. 00406591/ 0, SSW132. 633, Gr ad 2. 49e-
001/ 1. 00e- 010, #Par 8. 40e+000/ 26
TRAI NBR, Epoch 12/ 1000, SSE 0. 00361545/ 0, SSW132. 268, Gr ad 1. 09e-
001/ 1. 00e- 010, #Par 8. 55e+000/ 26
TRAI NBR, Epoch 13/ 1000, SSE 0. 00333151/ 0, SSW131. 889, Gr ad 2. 55e-
002/ 1. 00e- 010, #Par 8. 70e+000/ 26
TRAI NBR, Epoch 14/ 1000, SSE 0. 00309213/ 0, SSW131. 553, Gr ad 2. 50e-
002/ 1. 00e- 010, #Par 8. 82e+000/ 26
TRAI NBR, Epoch 15/ 1000, SSE 0. 0028808/ 0, SSW 131. 326, Gr ad 1. 73e-
002/ 1. 00e- 010, #Par 8. 92e+000/ 26
TRAI NBR, Epoch 16/ 1000, SSE 0. 00274218/ 0, SSW131. 236, Gr ad 8. 92e-
003/ 1. 00e- 010, #Par 8. 99e+000/ 26
TRAI NBR, Epoch 17/ 1000, SSE 0. 0026621/ 0, SSW 131. 254, Gr ad 1. 19e-
002/ 1. 00e- 010, #Par 9. 05e+000/ 26
TRAI NBR, Epoch 18/ 1000, SSE 0. 00260062/ 0, SSW131. 324, Gr ad 7. 59e-
004/ 1. 00e- 010, #Par 9. 14e+000/ 26
TRAI NBR, Epoch 19/ 1000, SSE 0. 00254052/ 0, SSW131. 418, Gr ad 2. 42e-
002/ 1. 00e- 010, #Par 9. 29e+000/ 26
TRAI NBR, Epoch 20/ 1000, SSE 0. 00247796/ 0, SSW131. 532, Gr ad 6. 83e-
002/ 1. 00e- 010, #Par 9. 57e+000/ 26
TRAI NBR, Epoch 21/ 1000, SSE 0. 00243158/ 0, SSW131. 661, Gr ad 1. 43e-
001/ 1. 00e- 010, #Par 1. 01e+001/ 26
TRAI NBR, Epoch 22/ 1000, SSE 0. 00236767/ 0, SSW131. 762, Gr ad 2. 10e-
001/ 1. 00e- 010, #Par 1. 09e+001/ 26
TRAI NBR, Epoch 23/ 1000, SSE 0. 00225754/ 0, SSW 131. 75, Gr ad 2. 84e-
001/ 1. 00e- 010, #Par 1. 17e+001/ 26
TRAI NBR, Epoch 24/ 1000, SSE 0. 00183234/ 0, SSW 131. 76, Gr ad 1. 64e-
002/ 1. 00e- 010, #Par 1. 25e+001/ 26
TRAI NBR, Epoch 25/ 1000, SSE 0. 00173812/ 0, SSW131. 778, Gr ad 3. 77e-
002/ 1. 00e- 010, #Par 1. 28e+001/ 26
TRAI NBR, Epoch 26/ 1000, SSE 0. 00162116/ 0, SSW131. 793, Gr ad 5. 36e-
002/ 1. 00e- 010, #Par 1. 30e+001/ 26
TRAI NBR, Epoch 27/ 1000, SSE 0. 00148385/ 0, SSW131. 802, Gr ad 7. 28e-
002/ 1. 00e- 010, #Par 1. 32e+001/ 26
TRAI NBR, Epoch 28/ 1000, SSE 0. 00133044/ 0, SSW131. 805, Gr ad 9. 17e-
002/ 1. 00e- 010, #Par 1. 35e+001/ 26
TRAI NBR, Epoch 29/ 1000, SSE 0. 00116453/ 0, SSW131. 799, Gr ad 1. 03e-
001/ 1. 00e- 010, #Par 1. 38e+001/ 26
TRAI NBR, Epoch 30/ 1000, SSE 0. 000989142/ 0, SSW 131. 784, Gr ad
9. 98e- 002/ 1. 00e- 010, #Par 1. 41e+001/ 26
TRAI NBR, Epoch 31/ 1000, SSE 0. 000811578/ 0, SSW 131. 764, Gr ad
8. 63e- 002/ 1. 00e- 010, #Par 1. 44e+001/ 26
TRAI NBR, Epoch 32/ 1000, SSE 0. 000639233/ 0, SSW 131. 747, Gr ad
7. 16e- 002/ 1. 00e- 010, #Par 1. 48e+001/ 26
TRAI NBR, Epoch 33/ 1000, SSE 0. 00047769/ 0, SSW131. 748, Gr ad 6. 69e-
002/ 1. 00e- 010, #Par 1. 51e+001/ 26
TRAI NBR, Epoch 34/ 1000, SSE 0. 000344347/ 0, SSW 131. 787, Gr ad
8. 24e- 002/ 1. 00e- 010, #Par 1. 54e+001/ 26
TRAI NBR, Epoch 35/ 1000, SSE 0. 000265495/ 0, SSW 131. 864, Gr ad
1. 13e- 001/ 1. 00e- 010, #Par 1. 57e+001/ 26
TRAI NBR, Epoch 36/ 1000, SSE 0. 000203495/ 0, SSW 131. 937, Gr ad
1. 13e- 001/ 1. 00e- 010, #Par 1. 60e+001/ 26
TRAI NBR, Epoch 37/ 1000, SSE 0. 000149666/ 0, SSW131. 96, Gr ad 8. 72e-
002/ 1. 00e- 010, #Par 1. 62e+001/ 26
TRAI NBR, Epoch 38/ 1000, SSE 0. 0001153/ 0, SSW 131. 909, Gr ad 5. 99e-
002/ 1. 00e- 010, #Par 1. 65e+001/ 26
TRAI NBR, Epoch 39/ 1000, SSE 9. 4329e- 005/ 0, SSW 131. 778, Gr ad
3. 31e- 002/ 1. 00e- 010, #Par 1. 67e+001/ 26
TRAI NBR, Epoch 40/ 1000, SSE 8. 36614e- 005/ 0, SSW 131. 578, Gr ad
1. 58e- 002/ 1. 00e- 010, #Par 1. 69e+001/ 26
TRAI NBR, Epoch 41/ 1000, SSE 7. 75849e- 005/ 0, SSW 131. 337, Gr ad
7. 80e- 003/ 1. 00e- 010, #Par 1. 70e+001/ 26
TRAI NBR, Epoch 42/ 1000, SSE 7. 33396e- 005/ 0, SSW 131. 078, Gr ad
4. 94e- 003/ 1. 00e- 010, #Par 1. 71e+001/ 26
TRAI NBR, Epoch 43/ 1000, SSE 6. 99426e- 005/ 0, SSW 130. 814, Gr ad
3. 85e- 003/ 1. 00e- 010, #Par 1. 71e+001/ 26
TRAI NBR, Epoch 44/ 1000, SSE 6. 70402e- 005/ 0, SSW 130. 553, Gr ad
3. 27e- 003/ 1. 00e- 010, #Par 1. 72e+001/ 26
TRAI NBR, Epoch 45/ 1000, SSE 6. 4468e- 005/ 0, SSW 130. 301, Gr ad
2. 86e- 003/ 1. 00e- 010, #Par 1. 72e+001/ 26
TRAI NBR, Epoch 46/ 1000, SSE 6. 21337e- 005/ 0, SSW 130. 061, Gr ad
2. 54e- 003/ 1. 00e- 010, #Par 1. 73e+001/ 26
TRAI NBR, Epoch 47/ 1000, SSE 5. 99799e- 005/ 0, SSW 129. 833, Gr ad
2. 31e- 003/ 1. 00e- 010, #Par 1. 73e+001/ 26
TRAI NBR, Epoch 48/ 1000, SSE 5. 79687e- 005/ 0, SSW 129. 621, Gr ad
2. 16e- 003/ 1. 00e- 010, #Par 1. 74e+001/ 26
TRAI NBR, Epoch 49/ 1000, SSE 5. 60743e- 005/ 0, SSW 129. 426, Gr ad
2. 11e- 003/ 1. 00e- 010, #Par 1. 74e+001/ 26
TRAI NBR, Epoch 50/ 1000, SSE 5. 42785e- 005/ 0, SSW 129. 246, Gr ad
2. 14e- 003/ 1. 00e- 010, #Par 1. 75e+001/ 26
TRAI NBR, Epoch 51/ 1000, SSE 5. 25691e- 005/ 0, SSW 129. 084, Gr ad
2. 24e- 003/ 1. 00e- 010, #Par 1. 76e+001/ 26
TRAI NBR, Epoch 52/ 1000, SSE 5. 09386e- 005/ 0, SSW 128. 938, Gr ad
2. 38e- 003/ 1. 00e- 010, #Par 1. 76e+001/ 26
TRAI NBR, Epoch 53/ 1000, SSE 4. 93837e- 005/ 0, SSW 128. 81, Gr ad
2. 54e- 003/ 1. 00e- 010, #Par 1. 77e+001/ 26
TRAI NBR, Epoch 54/ 1000, SSE 4. 79052e- 005/ 0, SSW 128. 697, Gr ad
2. 68e- 003/ 1. 00e- 010, #Par 1. 78e+001/ 26
TRAI NBR, Epoch 55/ 1000, SSE 4. 65072e- 005/ 0, SSW 128. 599, Gr ad
2. 77e- 003/ 1. 00e- 010, #Par 1. 79e+001/ 26
TRAI NBR, Epoch 56/ 1000, SSE 4. 51966e- 005/ 0, SSW 128. 515, Gr ad
2. 78e- 003/ 1. 00e- 010, #Par 1. 79e+001/ 26
TRAI NBR, Epoch 57/ 1000, SSE 4. 39818e- 005/ 0, SSW 128. 443, Gr ad
2. 71e- 003/ 1. 00e- 010, #Par 1. 80e+001/ 26
TRAI NBR, Epoch 58/ 1000, SSE 4. 28702e- 005/ 0, SSW 128. 382, Gr ad
2. 55e- 003/ 1. 00e- 010, #Par 1. 81e+001/ 26
TRAI NBR, Epoch 59/ 1000, SSE 4. 18668e- 005/ 0, SSW 128. 329, Gr ad
2. 32e- 003/ 1. 00e- 010, #Par 1. 82e+001/ 26
TRAI NBR, Epoch 60/ 1000, SSE 4. 09716e- 005/ 0, SSW 128. 283, Gr ad
2. 04e- 003/ 1. 00e- 010, #Par 1. 83e+001/ 26
TRAI NBR, Epoch 61/ 1000, SSE 4. 01794e- 005/ 0, SSW 128. 242, Gr ad
1. 74e- 003/ 1. 00e- 010, #Par 1. 83e+001/ 26
TRAI NBR, Epoch 62/ 1000, SSE 3. 94798e- 005/ 0, SSW 128. 205, Gr ad
1. 44e- 003/ 1. 00e- 010, #Par 1. 84e+001/ 26
TRAI NBR, Epoch 63/ 1000, SSE 3. 88597e- 005/ 0, SSW 128. 17, Gr ad
1. 16e- 003/ 1. 00e- 010, #Par 1. 85e+001/ 26
TRAI NBR, Epoch 64/ 1000, SSE 3. 83047e- 005/ 0, SSW 128. 136, Gr ad
9. 12e- 004/ 1. 00e- 010, #Par 1. 85e+001/ 26
TRAI NBR, Epoch 65/ 1000, SSE 3. 73302e- 005/ 0, SSW 127. 737, Gr ad
1. 47e- 002/ 1. 00e- 010, #Par 1. 86e+001/ 26
TRAI NBR, Epoch 66/ 1000, SSE 3. 27248e- 005/ 0, SSW 127. 472, Gr ad
2. 79e- 003/ 1. 00e- 010, #Par 1. 87e+001/ 26
TRAI NBR, Epoch 67/ 1000, SSE 3. 05198e- 005/ 0, SSW 127. 313, Gr ad
1. 66e- 003/ 1. 00e- 010, #Par 1. 88e+001/ 26
TRAI NBR, Epoch 68/ 1000, SSE 2. 89203e- 005/ 0, SSW 127. 213, Gr ad
2. 48e- 003/ 1. 00e- 010, #Par 1. 89e+001/ 26
TRAI NBR, Epoch 69/ 1000, SSE 2. 77935e- 005/ 0, SSW 127. 126, Gr ad
5. 93e- 003/ 1. 00e- 010, #Par 1. 90e+001/ 26
TRAI NBR, Epoch 70/ 1000, SSE 2. 69213e- 005/ 0, SSW 127. 017, Gr ad
8. 18e- 003/ 1. 00e- 010, #Par 1. 90e+001/ 26
TRAI NBR, Epoch 71/ 1000, SSE 2. 6198e- 005/ 0, SSW 126. 864, Gr ad
9. 14e- 003/ 1. 00e- 010, #Par 1. 90e+001/ 26
TRAI NBR, Epoch 72/ 1000, SSE 2. 55866e- 005/ 0, SSW 126. 658, Gr ad
9. 08e- 003/ 1. 00e- 010, #Par 1. 91e+001/ 26
TRAI NBR, Epoch 73/ 1000, SSE 2. 50737e- 005/ 0, SSW 126. 397, Gr ad
8. 36e- 003/ 1. 00e- 010, #Par 1. 91e+001/ 26
TRAI NBR, Epoch 74/ 1000, SSE 2. 46526e- 005/ 0, SSW 126. 084, Gr ad
7. 29e- 003/ 1. 00e- 010, #Par 1. 91e+001/ 26
TRAI NBR, Epoch 75/ 1000, SSE 2. 43143e- 005/ 0, SSW 125. 726, Gr ad
6. 13e- 003/ 1. 00e- 010, #Par 1. 92e+001/ 26
TRAI NBR, Epoch 76/ 1000, SSE 2. 40461e- 005/ 0, SSW 125. 333, Gr ad
5. 02e- 003/ 1. 00e- 010, #Par 1. 92e+001/ 26
TRAI NBR, Epoch 77/ 1000, SSE 2. 38327e- 005/ 0, SSW 124. 911, Gr ad
4. 07e- 003/ 1. 00e- 010, #Par 1. 92e+001/ 26
TRAI NBR, Epoch 78/ 1000, SSE 2. 36604e- 005/ 0, SSW 124. 469, Gr ad
3. 28e- 003/ 1. 00e- 010, #Par 1. 92e+001/ 26
TRAI NBR, Epoch 79/ 1000, SSE 2. 35178e- 005/ 0, SSW 124. 012, Gr ad
2. 64e- 003/ 1. 00e- 010, #Par 1. 92e+001/ 26
TRAI NBR, Epoch 80/ 1000, SSE 2. 33965e- 005/ 0, SSW 123. 546, Gr ad
2. 14e- 003/ 1. 00e- 010, #Par 1. 92e+001/ 26
TRAI NBR, Epoch 81/ 1000, SSE 2. 32908e- 005/ 0, SSW 123. 074, Gr ad
1. 74e- 003/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 82/ 1000, SSE 2. 31966e- 005/ 0, SSW 122. 601, Gr ad
1. 43e- 003/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 83/ 1000, SSE 2. 31111e- 005/ 0, SSW 122. 128, Gr ad
1. 18e- 003/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 84/ 1000, SSE 2. 30324e- 005/ 0, SSW 121. 658, Gr ad
9. 82e- 004/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 85/ 1000, SSE 2. 29592e- 005/ 0, SSW 121. 192, Gr ad
8. 21e- 004/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 86/ 1000, SSE 2. 28904e- 005/ 0, SSW 120. 73, Gr ad
6. 92e- 004/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 87/ 1000, SSE 2. 28253e- 005/ 0, SSW 120. 275, Gr ad
5. 86e- 004/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 88/ 1000, SSE 2. 27633e- 005/ 0, SSW 119. 826, Gr ad
5. 00e- 004/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 89/ 1000, SSE 2. 27041e- 005/ 0, SSW 119. 384, Gr ad
4. 30e- 004/ 1. 00e- 010, #Par 1. 93e+001/ 26
TRAI NBR, Epoch 90/ 1000, SSE 2. 26472e- 005/ 0, SSW 118. 95, Gr ad
3. 73e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 91/ 1000, SSE 2. 25925e- 005/ 0, SSW 118. 524, Gr ad
3. 27e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 92/ 1000, SSE 2. 25397e- 005/ 0, SSW 118. 106, Gr ad
2. 90e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 93/ 1000, SSE 2. 24886e- 005/ 0, SSW 117. 695, Gr ad
2. 61e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 94/ 1000, SSE 2. 24391e- 005/ 0, SSW 117. 293, Gr ad
2. 37e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 95/ 1000, SSE 2. 23912e- 005/ 0, SSW 116. 898, Gr ad
2. 19e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 96/ 1000, SSE 2. 23446e- 005/ 0, SSW 116. 512, Gr ad
2. 04e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 97/ 1000, SSE 2. 22993e- 005/ 0, SSW 116. 134, Gr ad
1. 93e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 98/ 1000, SSE 2. 22553e- 005/ 0, SSW 115. 763, Gr ad
1. 85e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 99/ 1000, SSE 2. 22124e- 005/ 0, SSW 115. 399, Gr ad
1. 80e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 100/ 1000, SSE 2. 21707e- 005/ 0, SSW 115. 044, Gr ad
1. 76e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 101/ 1000, SSE 2. 21301e- 005/ 0, SSW 114. 695, Gr ad
1. 74e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 102/ 1000, SSE 2. 20905e- 005/ 0, SSW 114. 353, Gr ad
1. 74e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 103/ 1000, SSE 2. 2052e- 005/ 0, SSW 114. 018, Gr ad
1. 75e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 104/ 1000, SSE 2. 20143e- 005/ 0, SSW 113. 69, Gr ad
1. 78e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 105/ 1000, SSE 2. 19776e- 005/ 0, SSW 113. 368, Gr ad
1. 83e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 106/ 1000, SSE 2. 19418e- 005/ 0, SSW 113. 052, Gr ad
1. 89e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 107/ 1000, SSE 2. 19067e- 005/ 0, SSW 112. 742, Gr ad
1. 97e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 108/ 1000, SSE 2. 18724e- 005/ 0, SSW 112. 438, Gr ad
2. 06e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 109/ 1000, SSE 2. 18389e- 005/ 0, SSW 112. 14, Gr ad
2. 17e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 110/ 1000, SSE 2. 1806e- 005/ 0, SSW 111. 847, Gr ad
2. 28e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 111/ 1000, SSE 2. 17737e- 005/ 0, SSW 111. 56, Gr ad
2. 42e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 112/ 1000, SSE 2. 1742e- 005/ 0, SSW 111. 278, Gr ad
2. 56e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 113/ 1000, SSE 2. 17108e- 005/ 0, SSW 111. 002, Gr ad
2. 71e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 114/ 1000, SSE 2. 168e- 005/ 0, SSW 110. 731, Gr ad
2. 87e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 115/ 1000, SSE 2. 16497e- 005/ 0, SSW 110. 466, Gr ad
3. 04e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 116/ 1000, SSE 2. 16196e- 005/ 0, SSW 110. 206, Gr ad
3. 20e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 117/ 1000, SSE 2. 15899e- 005/ 0, SSW 109. 951, Gr ad
3. 36e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 118/ 1000, SSE 2. 15605e- 005/ 0, SSW 109. 702, Gr ad
3. 52e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 119/ 1000, SSE 2. 15312e- 005/ 0, SSW 109. 459, Gr ad
3. 67e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 120/ 1000, SSE 2. 15021e- 005/ 0, SSW 109. 221, Gr ad
3. 80e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 121/ 1000, SSE 2. 14731e- 005/ 0, SSW 108. 989, Gr ad
3. 91e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 122/ 1000, SSE 2. 14442e- 005/ 0, SSW 108. 762, Gr ad
4. 00e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 123/ 1000, SSE 2. 14154e- 005/ 0, SSW 108. 542, Gr ad
4. 07e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 124/ 1000, SSE 2. 13865e- 005/ 0, SSW 108. 327, Gr ad
4. 10e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 125/ 1000, SSE 2. 13576e- 005/ 0, SSW 108. 118, Gr ad
4. 11e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 126/ 1000, SSE 2. 13286e- 005/ 0, SSW 107. 915, Gr ad
4. 08e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 127/ 1000, SSE 2. 12995e- 005/ 0, SSW 107. 719, Gr ad
4. 02e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 128/ 1000, SSE 2. 12704e- 005/ 0, SSW 107. 528, Gr ad
3. 93e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 129/ 1000, SSE 2. 1241e- 005/ 0, SSW 107. 344, Gr ad
3. 81e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 130/ 1000, SSE 2. 12116e- 005/ 0, SSW 107. 166, Gr ad
3. 66e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 131/ 1000, SSE 2. 11819e- 005/ 0, SSW 106. 994, Gr ad
3. 48e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 132/ 1000, SSE 2. 11521e- 005/ 0, SSW 106. 828, Gr ad
3. 28e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 133/ 1000, SSE 2. 1122e- 005/ 0, SSW 106. 669, Gr ad
3. 06e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 134/ 1000, SSE 2. 10917e- 005/ 0, SSW 106. 516, Gr ad
2. 83e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 135/ 1000, SSE 2. 10611e- 005/ 0, SSW 106. 37, Gr ad
2. 60e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 136/ 1000, SSE 2. 10302e- 005/ 0, SSW 106. 229, Gr ad
2. 35e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 137/ 1000, SSE 2. 09989e- 005/ 0, SSW 106. 096, Gr ad
2. 11e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 138/ 1000, SSE 2. 09673e- 005/ 0, SSW 105. 968, Gr ad
1. 88e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 139/ 1000, SSE 2. 09353e- 005/ 0, SSW 105. 847, Gr ad
1. 65e- 004/ 1. 00e- 010, #Par 1. 94e+001/ 26
TRAI NBR, Epoch 140/ 1000, SSE 2. 09028e- 005/ 0, SSW 105. 733, Gr ad
1. 43e- 004/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 141/ 1000, SSE 2. 08699e- 005/ 0, SSW 105. 625, Gr ad
1. 22e- 004/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 142/ 1000, SSE 2. 08364e- 005/ 0, SSW 105. 523, Gr ad
1. 03e- 004/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 143/ 1000, SSE 2. 08023e- 005/ 0, SSW 105. 429, Gr ad
8. 62e- 005/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 144/ 1000, SSE 2. 07676e- 005/ 0, SSW 105. 341, Gr ad
7. 10e- 005/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 145/ 1000, SSE 2. 07322e- 005/ 0, SSW 105. 26, Gr ad
5. 83e- 005/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 146/ 1000, SSE 2. 06961e- 005/ 0, SSW 105. 186, Gr ad
4. 84e- 005/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 147/ 1000, SSE 2. 06591e- 005/ 0, SSW 105. 119, Gr ad
4. 20e- 005/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 148/ 1000, SSE 2. 06213e- 005/ 0, SSW 105. 06, Gr ad
3. 98e- 005/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 149/ 1000, SSE 2. 05825e- 005/ 0, SSW 105. 008, Gr ad
4. 16e- 005/ 1. 00e- 010, #Par 1. 95e+001/ 26
TRAI NBR, Epoch 150/ 1000, SSE 2. 05427e- 005/ 0, SSW 104. 963, Gr ad
4. 66e- 005/ 1. 00e- 010, #Par 1. 96e+001/ 26
TRAI NBR, Epoch 151/ 1000, SSE 2. 05017e- 005/ 0, SSW 104. 927, Gr ad
5. 43e- 005/ 1. 00e- 010, #Par 1. 96e+001/ 26
TRAI NBR, Epoch 152/ 1000, SSE 2. 04596e- 005/ 0, SSW 104. 898, Gr ad
6. 41e- 005/ 1. 00e- 010, #Par 1. 96e+001/ 26
TRAI NBR, Epoch 153/ 1000, SSE 2. 04162e- 005/ 0, SSW 104. 877, Gr ad
7. 62e- 005/ 1. 00e- 010, #Par 1. 96e+001/ 26
TRAI NBR, Epoch 154/ 1000, SSE 2. 03713e- 005/ 0, SSW 104. 865, Gr ad
9. 06e- 005/ 1. 00e- 010, #Par 1. 96e+001/ 26
TRAI NBR, Epoch 155/ 1000, SSE 2. 0325e- 005/ 0, SSW 104. 862, Gr ad
1. 08e- 004/ 1. 00e- 010, #Par 1. 96e+001/ 26
TRAI NBR, Epoch 156/ 1000, SSE 2. 0277e- 005/ 0, SSW 104. 868, Gr ad
1. 28e- 004/ 1. 00e- 010, #Par 1. 97e+001/ 26
TRAI NBR, Epoch 157/ 1000, SSE 2. 02273e- 005/ 0, SSW 104. 883, Gr ad
1. 52e- 004/ 1. 00e- 010, #Par 1. 97e+001/ 26
TRAI NBR, Epoch 158/ 1000, SSE 2. 01757e- 005/ 0, SSW 104. 907, Gr ad
1. 79e- 004/ 1. 00e- 010, #Par 1. 97e+001/ 26
TRAI NBR, Epoch 159/ 1000, SSE 2. 01221e- 005/ 0, SSW 104. 942, Gr ad
2. 10e- 004/ 1. 00e- 010, #Par 1. 97e+001/ 26
TRAI NBR, Epoch 160/ 1000, SSE 2. 00663e- 005/ 0, SSW 104. 986, Gr ad
2. 45e- 004/ 1. 00e- 010, #Par 1. 98e+001/ 26
TRAI NBR, Epoch 161/ 1000, SSE 2. 00081e- 005/ 0, SSW 105. 041, Gr ad
2. 85e- 004/ 1. 00e- 010, #Par 1. 98e+001/ 26
TRAI NBR, Epoch 162/ 1000, SSE 1. 99474e- 005/ 0, SSW 105. 108, Gr ad
3. 28e- 004/ 1. 00e- 010, #Par 1. 98e+001/ 26
TRAI NBR, Epoch 163/ 1000, SSE 1. 98839e- 005/ 0, SSW 105. 185, Gr ad
3. 76e- 004/ 1. 00e- 010, #Par 1. 98e+001/ 26
TRAI NBR, Epoch 164/ 1000, SSE 1. 98174e- 005/ 0, SSW 105. 274, Gr ad
4. 27e- 004/ 1. 00e- 010, #Par 1. 99e+001/ 26
TRAI NBR, Epoch 165/ 1000, SSE 1. 97476e- 005/ 0, SSW 105. 375, Gr ad
4. 81e- 004/ 1. 00e- 010, #Par 1. 99e+001/ 26
TRAI NBR, Epoch 166/ 1000, SSE 1. 96743e- 005/ 0, SSW 105. 489, Gr ad
5. 38e- 004/ 1. 00e- 010, #Par 1. 99e+001/ 26
TRAI NBR, Epoch 167/ 1000, SSE 1. 95971e- 005/ 0, SSW 105. 615, Gr ad
5. 95e- 004/ 1. 00e- 010, #Par 2. 00e+001/ 26
TRAI NBR, Epoch 168/ 1000, SSE 1. 95157e- 005/ 0, SSW 105. 754, Gr ad
6. 53e- 004/ 1. 00e- 010, #Par 2. 00e+001/ 26
TRAI NBR, Epoch 169/ 1000, SSE 1. 94295e- 005/ 0, SSW 105. 907, Gr ad
7. 08e- 004/ 1. 00e- 010, #Par 2. 00e+001/ 26
TRAI NBR, Epoch 170/ 1000, SSE 1. 93382e- 005/ 0, SSW 106. 073, Gr ad
7. 57e- 004/ 1. 00e- 010, #Par 2. 00e+001/ 26
TRAI NBR, Epoch 171/ 1000, SSE 1. 92412e- 005/ 0, SSW 106. 252, Gr ad
7. 99e- 004/ 1. 00e- 010, #Par 2. 01e+001/ 26
TRAI NBR, Epoch 172/ 1000, SSE 1. 91379e- 005/ 0, SSW 106. 445, Gr ad
8. 27e- 004/ 1. 00e- 010, #Par 2. 01e+001/ 26
TRAI NBR, Epoch 173/ 1000, SSE 1. 90277e- 005/ 0, SSW 106. 651, Gr ad
8. 39e- 004/ 1. 00e- 010, #Par 2. 01e+001/ 26
TRAI NBR, Epoch 174/ 1000, SSE 1. 89097e- 005/ 0, SSW 106. 87, Gr ad
8. 30e- 004/ 1. 00e- 010, #Par 2. 02e+001/ 26
TRAI NBR, Epoch 175/ 1000, SSE 1. 87835e- 005/ 0, SSW 107. 1, Gr ad
7. 98e- 004/ 1. 00e- 010, #Par 2. 02e+001/ 26
TRAI NBR, Epoch 176/ 1000, SSE 1. 86485e- 005/ 0, SSW 107. 34, Gr ad
7. 43e- 004/ 1. 00e- 010, #Par 2. 02e+001/ 26
TRAI NBR, Epoch 177/ 1000, SSE 1. 85046e- 005/ 0, SSW 107. 589, Gr ad
6. 74e- 004/ 1. 00e- 010, #Par 2. 02e+001/ 26
TRAI NBR, Epoch 178/ 1000, SSE 1. 83519e- 005/ 0, SSW 107. 845, Gr ad
6. 15e- 004/ 1. 00e- 010, #Par 2. 03e+001/ 26
TRAI NBR, Epoch 179/ 1000, SSE 1. 81915e- 005/ 0, SSW 108. 105, Gr ad
6. 08e- 004/ 1. 00e- 010, #Par 2. 03e+001/ 26
TRAI NBR, Epoch 180/ 1000, SSE 1. 80249e- 005/ 0, SSW 108. 366, Gr ad
6. 94e- 004/ 1. 00e- 010, #Par 2. 03e+001/ 26
TRAI NBR, Epoch 181/ 1000, SSE 1. 78548e- 005/ 0, SSW 108. 625, Gr ad
8. 73e- 004/ 1. 00e- 010, #Par 2. 03e+001/ 26
TRAI NBR, Epoch 182/ 1000, SSE 1. 76845e- 005/ 0, SSW 108. 879, Gr ad
1. 12e- 003/ 1. 00e- 010, #Par 2. 04e+001/ 26
TRAI NBR, Epoch 183/ 1000, SSE 1. 7518e- 005/ 0, SSW 109. 124, Gr ad
1. 42e- 003/ 1. 00e- 010, #Par 2. 04e+001/ 26
TRAI NBR, Epoch 184/ 1000, SSE 1. 73598e- 005/ 0, SSW 109. 357, Gr ad
1. 76e- 003/ 1. 00e- 010, #Par 2. 04e+001/ 26
TRAI NBR, Epoch 185/ 1000, SSE 1. 72138e- 005/ 0, SSW 109. 575, Gr ad
2. 12e- 003/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 186/ 1000, SSE 1. 7083e- 005/ 0, SSW 109. 772, Gr ad
2. 46e- 003/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 187/ 1000, SSE 1. 69684e- 005/ 0, SSW 109. 946, Gr ad
2. 74e- 003/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 188/ 1000, SSE 1. 68689e- 005/ 0, SSW 110. 093, Gr ad
2. 89e- 003/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 189/ 1000, SSE 1. 67824e- 005/ 0, SSW 110. 213, Gr ad
2. 92e- 003/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 190/ 1000, SSE 1. 67071e- 005/ 0, SSW 110. 306, Gr ad
2. 82e- 003/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 191/ 1000, SSE 1. 66416e- 005/ 0, SSW 110. 374, Gr ad
2. 65e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 192/ 1000, SSE 1. 65847e- 005/ 0, SSW 110. 418, Gr ad
2. 43e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 193/ 1000, SSE 1. 65352e- 005/ 0, SSW 110. 441, Gr ad
2. 20e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 194/ 1000, SSE 1. 64922e- 005/ 0, SSW 110. 446, Gr ad
1. 97e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 195/ 1000, SSE 1. 64545e- 005/ 0, SSW 110. 435, Gr ad
1. 77e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 196/ 1000, SSE 1. 64214e- 005/ 0, SSW 110. 409, Gr ad
1. 58e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 197/ 1000, SSE 1. 63919e- 005/ 0, SSW 110. 371, Gr ad
1. 42e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 198/ 1000, SSE 1. 63655e- 005/ 0, SSW 110. 322, Gr ad
1. 28e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 199/ 1000, SSE 1. 63416e- 005/ 0, SSW 110. 264, Gr ad
1. 16e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 200/ 1000, SSE 1. 63199e- 005/ 0, SSW 110. 198, Gr ad
1. 06e- 003/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 201/ 1000, SSE 1. 63e- 005/ 0, SSW110. 124, Gr ad 9. 69e-
004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 202/ 1000, SSE 1. 62816e- 005/ 0, SSW 110. 045, Gr ad
8. 93e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 203/ 1000, SSE 1. 62646e- 005/ 0, SSW 109. 961, Gr ad
8. 27e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 204/ 1000, SSE 1. 62487e- 005/ 0, SSW 109. 873, Gr ad
7. 69e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 205/ 1000, SSE 1. 62339e- 005/ 0, SSW 109. 782, Gr ad
7. 19e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 206/ 1000, SSE 1. 62199e- 005/ 0, SSW 109. 687, Gr ad
6. 75e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 207/ 1000, SSE 1. 62068e- 005/ 0, SSW 109. 59, Gr ad
6. 36e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 208/ 1000, SSE 1. 61943e- 005/ 0, SSW 109. 491, Gr ad
6. 02e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 209/ 1000, SSE 1. 61826e- 005/ 0, SSW 109. 391, Gr ad
5. 71e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 210/ 1000, SSE 1. 61714e- 005/ 0, SSW 109. 289, Gr ad
5. 43e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 211/ 1000, SSE 1. 61608e- 005/ 0, SSW 109. 186, Gr ad
5. 17e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 212/ 1000, SSE 1. 61506e- 005/ 0, SSW 109. 083, Gr ad
4. 94e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 213/ 1000, SSE 1. 61409e- 005/ 0, SSW 108. 978, Gr ad
4. 73e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 214/ 1000, SSE 1. 61317e- 005/ 0, SSW 108. 874, Gr ad
4. 53e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 215/ 1000, SSE 1. 61228e- 005/ 0, SSW 108. 769, Gr ad
4. 34e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 216/ 1000, SSE 1. 61143e- 005/ 0, SSW 108. 664, Gr ad
4. 17e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 217/ 1000, SSE 1. 61062e- 005/ 0, SSW 108. 559, Gr ad
4. 01e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 218/ 1000, SSE 1. 60983e- 005/ 0, SSW 108. 453, Gr ad
3. 86e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 219/ 1000, SSE 1. 60908e- 005/ 0, SSW 108. 348, Gr ad
3. 72e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 220/ 1000, SSE 1. 60835e- 005/ 0, SSW 108. 243, Gr ad
3. 58e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 221/ 1000, SSE 1. 60765e- 005/ 0, SSW 108. 138, Gr ad
3. 45e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 222/ 1000, SSE 1. 60697e- 005/ 0, SSW 108. 033, Gr ad
3. 33e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 223/ 1000, SSE 1. 60632e- 005/ 0, SSW 107. 929, Gr ad
3. 21e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 224/ 1000, SSE 1. 60569e- 005/ 0, SSW 107. 825, Gr ad
3. 10e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 225/ 1000, SSE 1. 60508e- 005/ 0, SSW 107. 721, Gr ad
2. 99e- 004/ 1. 00e- 010, #Par 2. 07e+001/ 26
TRAI NBR, Epoch 226/ 1000, SSE 1. 60449e- 005/ 0, SSW 107. 617, Gr ad
2. 89e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 227/ 1000, SSE 1. 60392e- 005/ 0, SSW 107. 514, Gr ad
2. 79e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 228/ 1000, SSE 1. 60336e- 005/ 0, SSW 107. 411, Gr ad
2. 70e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 229/ 1000, SSE 1. 60282e- 005/ 0, SSW 107. 309, Gr ad
2. 61e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 230/ 1000, SSE 1. 6023e- 005/ 0, SSW 107. 206, Gr ad
2. 52e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 231/ 1000, SSE 1. 60179e- 005/ 0, SSW 107. 105, Gr ad
2. 43e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 232/ 1000, SSE 1. 6013e- 005/ 0, SSW 107. 004, Gr ad
2. 35e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 233/ 1000, SSE 1. 60082e- 005/ 0, SSW 106. 903, Gr ad
2. 27e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 234/ 1000, SSE 1. 60036e- 005/ 0, SSW 106. 802, Gr ad
2. 20e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 235/ 1000, SSE 1. 59991e- 005/ 0, SSW 106. 702, Gr ad
2. 13e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 236/ 1000, SSE 1. 59947e- 005/ 0, SSW 106. 603, Gr ad
2. 05e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 237/ 1000, SSE 1. 59904e- 005/ 0, SSW 106. 504, Gr ad
1. 99e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 238/ 1000, SSE 1. 59862e- 005/ 0, SSW 106. 406, Gr ad
1. 92e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 239/ 1000, SSE 1. 59821e- 005/ 0, SSW 106. 308, Gr ad
1. 86e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 240/ 1000, SSE 1. 59782e- 005/ 0, SSW 106. 21, Gr ad
1. 80e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 241/ 1000, SSE 1. 59743e- 005/ 0, SSW 106. 113, Gr ad
1. 74e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 242/ 1000, SSE 1. 59705e- 005/ 0, SSW 106. 017, Gr ad
1. 68e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 243/ 1000, SSE 1. 59668e- 005/ 0, SSW 105. 921, Gr ad
1. 62e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 244/ 1000, SSE 1. 59632e- 005/ 0, SSW 105. 825, Gr ad
1. 57e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 245/ 1000, SSE 1. 59597e- 005/ 0, SSW 105. 73, Gr ad
1. 51e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 246/ 1000, SSE 1. 59563e- 005/ 0, SSW 105. 636, Gr ad
1. 46e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 247/ 1000, SSE 1. 5953e- 005/ 0, SSW 105. 542, Gr ad
1. 41e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 248/ 1000, SSE 1. 59497e- 005/ 0, SSW 105. 449, Gr ad
1. 37e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 249/ 1000, SSE 1. 59465e- 005/ 0, SSW 105. 356, Gr ad
1. 32e- 004/ 1. 00e- 010, #Par 2. 06e+001/ 26
TRAI NBR, Epoch 250/ 1000, SSE 1. 59434e- 005/ 0, SSW 105. 263, Gr ad
1. 27e- 004/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 251/ 1000, SSE 1. 59403e- 005/ 0, SSW 105. 172, Gr ad
1. 23e- 004/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 252/ 1000, SSE 1. 59373e- 005/ 0, SSW 105. 08, Gr ad
1. 19e- 004/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 253/ 1000, SSE 1. 59344e- 005/ 0, SSW 104. 99, Gr ad
1. 15e- 004/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 254/ 1000, SSE 1. 59315e- 005/ 0, SSW 104. 899, Gr ad
1. 11e- 004/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 255/ 1000, SSE 1. 59287e- 005/ 0, SSW 104. 81, Gr ad
1. 07e- 004/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 256/ 1000, SSE 1. 5926e- 005/ 0, SSW 104. 721, Gr ad
1. 03e- 004/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 257/ 1000, SSE 1. 59233e- 005/ 0, SSW 104. 632, Gr ad
9. 95e- 005/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 258/ 1000, SSE 1. 59207e- 005/ 0, SSW 104. 544, Gr ad
9. 59e- 005/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 259/ 1000, SSE 1. 59181e- 005/ 0, SSW 104. 457, Gr ad
9. 25e- 005/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 260/ 1000, SSE 1. 59156e- 005/ 0, SSW 104. 37, Gr ad
8. 92e- 005/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 261/ 1000, SSE 1. 59131e- 005/ 0, SSW 104. 283, Gr ad
8. 59e- 005/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 262/ 1000, SSE 1. 59107e- 005/ 0, SSW 104. 198, Gr ad
8. 27e- 005/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Epoch 263/ 1000, SSE 1. 59084e- 005/ 0, SSW 104. 112, Gr ad
7. 97e- 005/ 1. 00e- 010, #Par 2. 05e+001/ 26
TRAI NBR, Val i dat i on st op.

El apsed t i me i s 74. 438000 seconds.




Appendix C: Sample data used for
training the model
DAQSTANDARD R8.11
Data Viewer R8.11
CEYLON ELECTRICITY BOARD

Device Type DX2000
Serial No. S5J603679
File Message U1-TOP DATA
Time Correction None
Starting Condition Auto
Dividing Condition Auto
Meas Ch. 0
Math Ch. 0
Ext Ch. 0
Data Count 3623
Sampling Interval 10 minutes
Start Time 10/ 25/ 2010 10:20:00
Stop Time 2010/ 11/ 19 14:20:00
Trigger Time 2010/ 11/ 19 14:20:00
Trigger No.
Started by Admin1
Stopped by [ Running ]


Num. Of Converted Data 3623

CH015        CH018        CH023                CH024    CH026              CH027        CH028
UGB.PAD-5    LGB.PAD-4    THRUST BRG.PAD-11    TGB      THRUST/LGB. OIL    C.W. INLET   LOAD

Date / Time      deg C    deg C    deg C    deg C    deg C    deg C    MW    MVar
2010/ 11/ 03 16:30:00 40.3 43.1 39.6 25.0 38.6 24.6 52 0
2010/ 11/ 03 16:40:00 40.2 42.9 39.4 24.8 38.4 24.7 61 21
2010/ 11/ 03 16:50:00 40.2 42.8 39.2 24.6 38.2 24.7 74 22
2010/ 11/ 03 17:00:00 40.2 42.6 39.0 24.4 38.0 24.8 76 21
2010/ 11/ 03 17:10:00 40.1 42.5 38.9 24.3 37.8 24.8 74 21
2010/ 11/ 03 17:20:00 40.1 42.3 38.7 24.1 31.0 24.8 75 20
2010/ 11/ 03 17:30:00 47.6 51.2 58.9 44.3 42.2 24.6 77 19
2010/ 11/ 03 17:40:00 48.5 55.2 64.6 50.0 46.9 24.6 75 19
2010/ 11/ 03 17:50:00 48.5 57.5 67.1 52.5 49.3 24.6 77 20
2010/ 11/ 03 18:00:00 48.8 58.8 68.2 53.6 50.8 24.6 75 21
2010/ 11/ 03 18:10:00 48.9 59.6 68.9 54.3 51.7 24.7 76 21
2010/ 11/ 03 18:20:00 49.0 60.2 69.5 54.9 52.3 24.7 74 24
2010/ 11/ 03 18:30:00 49.1 60.6 69.9 55.3 52.7 24.7 76 21
2010/ 11/ 03 18:40:00 49.2 60.8 70.2 55.6 53.0 24.7 77 19
2010/ 11/ 03 18:50:00 49.3 61.0 70.4 55.8 53.2 24.7 77 19
2010/ 11/ 03 19:00:00 49.3 61.2 70.5 55.9 53.4 24.7 76 18
2010/ 11/ 03 19:10:00 49.5 61.3 70.7 56.1 53.5 24.7 76 17
2010/ 11/ 03 19:20:00 49.5 61.4 70.7 56.1 53.6 24.7 76 17
2010/ 11/ 03 19:30:00 49.7 61.4 70.7 56.1 53.6 24.7 76 17
2010/ 11/ 03 19:40:00 49.8 61.5 70.9 56.3 53.7 24.8 78 17
2010/ 11/ 03 19:50:00 49.9 61.6 70.9 56.3 53.7 24.8 76 16
2010/ 11/ 03 20:00:00 50.0 61.6 71.0 56.4 53.8 24.8 77 16
2010/ 11/ 03 20:10:00 50.0 61.7 71.1 56.5 53.9 24.7 75 15
2010/ 11/ 03 20:20:00 50.1 61.8 71.1 56.5 53.9 24.8 77 15
2010/ 11/ 03 20:30:00 50.1 61.8 71.1 56.5 54.0 24.8 73 16
2010/ 11/ 03 20:40:00 50.1 61.9 71.1 56.5 54.0 24.7 77 14
2010/ 11/ 03 20:50:00 50.2 61.9 71.2 56.6 54.0 24.8 76 13
2010/ 11/ 03 21:00:00 50.2 61.9 71.2 56.6 54.1 24.8 76 13
2010/ 11/ 03 21:10:00 50.3 62.0 71.3 56.7 54.1 24.8 78 10
2010/ 11/ 03 21:20:00 50.4 62.0 71.2 56.6 54.1 24.8 77 9
2010/ 11/ 03 21:30:00 50.4 62.0 71.4 56.8 54.2 24.7 75 6
2010/ 11/ 03 21:40:00 50.5 62.1 71.3 56.7 54.2 24.8 74 7
2010/ 11/ 03 21:50:00 50.5 62.1 71.3 56.7 54.2 24.8 75 8
2010/ 11/ 03 22:00:00 50.5 62.1 71.3 56.7 54.2 24.8 76 10
2010/ 11/ 03 22:10:00 50.5 62.1 71.3 56.7 54.2 24.7 74 9
2010/ 11/ 03 22:20:00 50.4 62.2 71.4 56.8 54.2 24.7 74 7
2010/ 11/ 03 22:30:00 50.5 62.2 71.5 56.9 54.2 24.8 78 7
2010/ 11/ 03 22:40:00 50.5 62.2 71.4 56.8 54.2 24.8 76 8
2010/ 11/ 03 22:50:00 50.5 62.3 71.4 56.8 54.2 24.8 73 15
2010/ 11/ 03 23:00:00 50.5 62.3 71.4 56.8 54.3 24.7 76 12
2010/ 11/ 03 23:10:00 50.6 62.3 71.4 56.8 54.3 24.8 74 8
2010/ 11/ 03 23:20:00 50.6 62.3 71.5 56.9 54.3 24.8 74 9
2010/ 11/ 03 23:30:00 50.6 62.3 71.5 56.9 54.3 24.8 77 7
2010/ 11/ 03 23:40:00 50.6 62.3 71.5 56.9 54.3 24.8 77 5
2010/ 11/ 03 23:50:00 50.6 62.3 71.5 56.9 54.3 24.8 75 4
2010/ 11/ 04 00:00:00 50.6 62.3 71.4 56.8 54.3 24.8 75 3
2010/ 11/ 04 00:10:00 50.6 62.3 71.5 56.9 54.3 24.8 77 5
2010/ 11/ 04 00:20:00 50.5 62.3 71.4 56.8 54.3 24.8 75 5
2010/ 11/ 04 00:30:00 50.4 62.4 71.4 56.8 54.3 24.8 76 5
2010/ 11/ 04 00:40:00 50.6 62.4 71.4 56.8 54.3 24.7 74 6
2010/ 11/ 04 00:50:00 50.6 62.4 71.5 56.9 54.3 24.7 75 6
2010/ 11/ 04 01:00:00 50.6 62.4 71.5 56.9 54.3 24.7 77 5
2010/ 11/ 04 01:10:00 50.5 62.4 71.5 56.9 54.3 24.7 77 5
2010/ 11/ 04 01:20:00 50.5 62.4 71.5 56.9 54.3 24.7 78 5
2010/ 11/ 04 01:30:00 50.6 62.4 71.5 56.9 54.3 24.8 74 5
2010/ 11/ 04 01:40:00 50.4 62.4 71.5 56.9 54.4 24.8 75 5
2010/ 11/ 04 01:50:00 50.5 62.4 71.5 56.9 54.4 24.7 74 5
2010/ 11/ 04 02:00:00 50.5 62.4 71.5 56.9 54.4 24.7 74 5
2010/ 11/ 04 02:10:00 50.5 62.5 71.5 56.9 54.4 24.7 77 4
2010/ 11/ 04 02:20:00 50.4 62.5 71.5 56.9 54.4 24.8 76 3
2010/ 11/ 04 02:30:00 50.4 62.4 71.5 56.9 54.4 24.7 75 3
2010/ 11/ 04 02:40:00 50.4 62.4 71.5 56.9 54.4 24.7 75 3
2010/ 11/ 04 02:50:00 50.4 62.4 71.5 56.9 54.4 24.7 76 1
2010/ 11/ 04 03:00:00 50.5 62.4 71.5 56.9 54.4 24.8 75 1
2010/ 11/ 04 03:10:00 50.5 62.5 71.5 56.9 54.4 24.7 78 1
2010/ 11/ 04 03:20:00 50.5 62.5 71.5 56.9 54.4 24.8 78 0
2010/ 11/ 04 03:30:00 50.5 62.4 71.5 56.9 54.4 24.7 74 1
2010/ 11/ 04 03:40:00 50.5 62.5 71.5 56.9 54.4 24.8 72 2
2010/ 11/ 04 03:50:00 50.4 62.5 71.5 56.9 54.4 24.8 76 1
2010/ 11/ 04 04:00:00 50.5 62.5 71.5 56.9 54.4 24.7 73 0
2010/ 11/ 04 04:10:00 50.5 62.5 71.5 56.9 54.4 24.7 76 0
2010/ 11/ 04 04:20:00 50.5 62.4 71.5 56.9 54.4 24.8 77 1
2010/ 11/ 04 04:30:00 50.4 62.5 71.5 56.9 54.4 24.7 74 0
2010/ 11/ 04 04:40:00 50.4 62.5 71.5 56.9 54.4 24.7 77 1
2010/ 11/ 04 04:50:00 50.5 62.5 71.5 56.9 54.3 24.7 74 2
2010/ 11/ 04 05:00:00 50.4 62.5 71.5 56.9 54.4 24.8 76 2
2010/ 11/ 04 05:10:00 50.5 62.5 71.5 56.9 54.4 24.7 77 4
2010/ 11/ 04 05:20:00 50.4 62.5 71.5 56.9 54.4 24.7 77 7
2010/ 11/ 04 05:30:00 50.4 62.5 71.5 56.9 54.4 24.8 76 6
2010/ 11/ 04 05:40:00 50.4 62.5 71.5 56.9 54.4 24.8 75 6
2010/ 11/ 04 05:50:00 50.4 62.5 71.5 56.9 54.4 24.8 74 6
2010/ 11/ 04 06:00:00 50.4 62.5 71.5 56.9 54.4 24.7 76 7
2010/ 11/ 04 06:10:00 50.4 62.5 71.5 56.9 54.4 24.7 79 7
2010/ 11/ 04 06:20:00 50.4 62.6 71.5 56.9 54.4 24.7 73 5
2010/ 11/ 04 06:30:00 50.4 62.5 71.5 56.9 54.4 24.8 74 4
2010/ 11/ 04 06:40:00 50.4 62.5 71.5 56.9 54.4 24.8 74 2
2010/ 11/ 04 06:50:00 50.4 62.5 71.5 56.9 54.4 24.8 76 3
2010/ 11/ 04 07:00:00 50.5 62.6 71.5 56.9 54.4 24.7 75 12
2010/ 11/ 04 07:10:00 50.4 62.6 71.5 56.9 54.5 24.7 60 10
2010/ 11/ 04 07:20:00 50.4 62.5 71.6 57.0 54.5 24.7 74 10
2010/ 11/ 04 07:30:00 50.5 62.6 71.6 57.0 54.5 24.8 73 7
2010/ 11/ 04 07:40:00 50.4 62.6 71.7 57.1 54.5 24.7 71 8
2010/ 11/ 04 07:50:00 50.4 62.6 71.6 57.0 54.5 24.8 69 10
2010/ 11/ 04 08:00:00 50.4 62.7 71.6 57.0 54.5 24.7 74 12
2010/ 11/ 04 08:10:00 50.4 62.5 71.6 57.0 54.5 24.8 76 12
2010/ 11/ 04 08:20:00 50.3 62.6 71.7 57.1 54.5 24.7 75 13

2010/ 11/ 04 08:30:00 50.4 62.6 71.8 57.2 54.5 24.7 76 14
2010/ 11/ 04 08:40:00 50.4 62.6 71.7 57.1 54.5 24.7 74 16
2010/ 11/ 04 08:50:00 50.4 62.6 71.7 57.1 54.5 24.7 75 20
2010/ 11/ 04 09:00:00 50.4 62.7 71.7 57.1 54.5 24.7 74 20
2010/ 11/ 04 09:10:00 50.5 62.7 71.6 57.0 54.5 24.7 76 25




Appendix D: Training Matlab script for the model
% develops a model for temperature data
% Trains, validates and tests on new data
% written by CGS Gunasekara, 20 Nov 2010

close all;
clear all;
tic;
file=xlsread('VICDATA','Q1364:BE1464'); % loads xl data from file
toc;
tic;
B=file(:,1:31); % loads inputs ok

% output vector

C=file(:,33:41); % loads outputs
p=B'; % inputs
t=C'; % targets
Q=6;
n=100;
dtst=14:Q:n; % divides data for training validation
dval= [ 13:Q:n ]; % and testing
dtrn=[1:Q:n 2:Q:n 3:Q:n 4:Q:n 5:Q:n 6:Q:n 7:Q:n 8:Q:n 9:Q:n 10:Q:n 11:Q:n 12:Q:n ];
val.P=p( : , dval); % validation data
val.T=t( : , dval);
test.P=p( : , dtst); % test data
test.T=t( : , dtst);
ptr=p( : , dtrn); % training data
ttr=t( : , dtrn);

nnet=network; % creates network

pr=minmax(p);

nnet=newff(pr,[31 20 16 9 ],{ 'tansig' 'tansig' 'tansig' 'tansig' },'trainlm');
nnet.trainParam.epochs = 25000;
nnet.trainParam.show = 1;

load vic_100B; % continue training from a previously saved, partly trained network
nnet=vic_100B;

nnet.trainParam.lr = 0.35; % sets the learning rate (eta)
[nnet,tr]=train(nnet,ptr,ttr,[],[],val,test);
figure(1)
plot(tr.epoch,tr.perf,tr.epoch,tr.vperf,tr.epoch,tr.tperf)
legend('Training' , 'Validation' , 'Test', -2);
ylabel('Squared Error');
xlabel('Epoch ');
title(' Model Performance');

a = sim(nnet,p); % simulates

t=t*100; a=a*100; % rescale normalized values back to temperatures (deg C)
t=t+7; a=a+7;
figure(2)
t1=t(1:1,1:100); % target
a1=a(1:1,1:100); % simulated output

plot(1:n,a1,'r-',1:n,t1,'bo')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title(' UGB metal temperature');
legend('simulated ' , 'actual');
figure(3)
[m,b,r]=postreg(a1,t1);
figure(4)
t2=t(2:2,1:100); % target
a2=a(2:2,1:100); % simulated output

plot(1:n,a2,'r-',1:n,t2,'bo')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title(' LGB metal temperature');
legend('simulated ' , 'actual');
figure(5)
[m,b,r]=postreg(a2,t2);
figure(6)
t3=t(3:3,1:100); % target
a3=a(3:3,1:100); % simulated output

plot(1:n,a3,'r-',1:n,t3,'bo')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title(' THB metal temperature');
legend('simulated ' , 'actual');
figure(7)
[m,b,r]=postreg(a3,t3);
figure(8)
t4=t(4:4,1:100); % target
a4=a(4:4,1:100); % simulated output

plot(1:n,a4,'r-',1:n,t4,'bo')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title(' TGB metal temperature');
legend('simulated ' , 'actual');
figure(9)
[m,b,r]=postreg(a4,t4);
figure(10)
t5=t(5:5,1:100); % target
a5=a(5:5,1:100); % simulated output

plot(1:n,a5,'r-',1:n,t5,'bo')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title('UGB Oil temperature');
legend('simulated ' , 'actual');
figure(11)
[m,b,r]=postreg(a5,t5);
figure(12)
t6=t(6:6,1:100); % target (THB oil)
a6=a(6:6,1:100); % simulated output

plot(1:n,a6,'r-',1:n,t6,'bo')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title(' THB Oil temperature');
legend('simulated ' , 'actual');
figure(13)
[m,b,r]=postreg(a6,t6);
figure(14)
t7=t(7:7,1:100); % target
a7=a(7:7,1:100); % simulated output

plot(1:n,a7,'r-',1:n,t7,'bo')
ylabel('Temperature/ degC');
xlabel('Time / (10 minutes samples) ');
title(' TGB Oil temperature');
legend('simulated ' , 'actual');
figure(15)
[m,b,r]=postreg(a7,t7);
figure(16)

% all graphs in one diagram
plot(1:n,a1,'r-',1:n,t1,'bo',1:n,a2,'r-',1:n,t2,'bX',1:n,a3,'r-',1:n,t3,'b*',1:n,a4,'r-',1:n,t4,'b+')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title(' Bearing Metal Temperatures');
legend('UGB Simulated','UGB Actual','LGB Simulated','LGB Actual','THB Simulated','THB Actual','TGB Simulated','TGB Actual');
figure(17)
plot(1:n,a5,'r-',1:n,t5,'bO',1:n,a6,'r-',1:n,t6,'bX',1:n,a7,'r-',1:n,t7,'b+')
ylabel('Temperature/ deg C');
xlabel('Time / (10 minutes samples) ');
title(' Bearing Oil Temperatures');
legend('UGB Simulated','UGB Actual','THB/LGB Simulated','THB/LGB Actual','TGB Simulated','TGB Actual');
% writes data into xl file
%SUCESS=XLSWRITE('_op.xls',a1','b2:b64')
%SUCESS=XLSWRITE('_op.xls',t1','c2:c64')
toc;
% end







Appendix E: Initial values of
trained model
Initial values of network
nnet =
Neural Network object:
architecture:
numInputs: 1
numLayers: 4
biasConnect: [1; 1; 1; 1]
inputConnect: [1; 0; 0; 0]
layerConnect: [4x4 boolean]
outputConnect: [0 0 0 1]
targetConnect: [0 0 0 1]
numOutputs: 1 (read-only)
numTargets: 1 (read-only)
numInputDelays: 0 (read-only)
numLayerDelays: 0 (read-only)
subobject structures:
inputs: {1x1 cell} of inputs
layers: {4x1 cell} of layers
outputs: {1x4 cell} containing 1 output
targets: {1x4 cell} containing 1 target
biases: {4x1 cell} containing 4 biases
inputWeights: {4x1 cell} containing 1 input weight
layerWeights: {4x4 cell} containing 3 layer weights
functions:
adaptFcn: 'trains'
initFcn: 'initlay'
performFcn: 'mse'
trainFcn: 'trainlm'
parameters:
adaptParam: .passes
initParam: (none)
performParam: (none)
trainParam: .epochs, .goal, .max_fail, .mem_reduc,
.min_grad, .mu, .mu_dec, .mu_inc,
.mu_max, .show, .time, .lr
weight and bias values:
IW: {4x1 cell} containing 1 input weight matrix
LW: {4x4 cell} containing 3 layer weight matrices
b: {4x1 cell} containing 4 bias vectors
other:
userdata: (user stuff)
