e-mail: avk-vrn@mail.ru
I. INTRODUCTION
Numerical solutions of fluid dynamics problems can be obtained using the method of weighted residuals (MWR), which originates from the assumption that the flow equations admit an analytical representation of the solution. The type of representative trial (shape) functions determines the specific version of MWR, such as the subdomain, collocation, least-squares and Galerkin methods [1]. As a rule, the realization of an MWR algorithm is reduced to a variational problem whose solution minimizes the total residual of the hydrodynamic equations through the selection of the trial-solution parameters. The accuracy of an MWR solution is determined by the approximating properties of the trial functions and by their degree of equivalence to the original partial differential equations of the continuous fluid flow.
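As a minimal illustration of the collocation version of MWR (the ODE, trial function and point count below are invented for this sketch, not taken from the paper), consider solving y' = y, y(0) = 1 on [0, 1] with a polynomial trial solution whose coefficients minimize the squared residual at collocation points:

```python
import numpy as np

# Solve y' = y, y(0) = 1 on [0, 1] by collocation (one version of MWR).
# Trial solution y(x) = 1 + sum_k a_k x^k automatically satisfies y(0) = 1.
K = 5                                  # number of trial (shape) functions
xs = np.linspace(0.0, 1.0, 20)         # collocation points

# The residual R(x) = y'(x) - y(x) is linear in the coefficients a_k:
# R(x) = sum_k a_k (k x^(k-1) - x^k) - 1
A = np.stack([k * xs**(k - 1) - xs**k for k in range(1, K + 1)], axis=1)
b = np.ones_like(xs)

# Choose a_k to minimize the total squared residual at the collocation points.
a, *_ = np.linalg.lstsq(A, b, rcond=None)

y = lambda x: 1.0 + sum(a[k - 1] * x**k for k in range(1, K + 1))
print(abs(y(1.0) - np.e))              # small: y approximates exp(x)
```

Because the residual is linear in the parameters, the minimization here collapses to a linear least-squares problem; with nonlinear trial functions (as in the neural network case below) the same criterion is minimized iteratively.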
Neural network training is the adjustment of the network's internal parameters so that its output gradually approaches the required values. Training is a motion along the error surface, whose argument is the internal structure of the network. The network output is a surface defined continuously over the entire real space of input sets. Artificial neural networks (ANN) are a powerful means of approximation. The idea of applying the ANN methodology to the solution of mathematical physics equations is not new and is presented in a number of publications reviewed in the reference book [2]. It is also noted there that the application of ANN to the simulation of hydrodynamic problems remains very limited: among the 2342 publications mentioned in [2], only one [3] deals with hydrodynamics. The computational algorithm described in this article uses a neural network structure to obtain a continuous solution over the computational domain.
Manuscript received August 30, 2008
$$y(\mathbf{w}, \mathbf{x}) = \sum_{i=1}^{q} v_i\, f\Big(b_i + \sum_{j=1}^{n} w_{ij} x_j\Big) + b_0 , \qquad (1)$$
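Equation (1) is a single-hidden-layer perceptron. A minimal sketch of its evaluation (the sizes and random parameter values are invented; the paper does not fix the activation f, so a common choice such as tanh is assumed):

```python
import numpy as np

def perceptron(x, v, W, b, b0, f=np.tanh):
    """y(w, x) = sum_i v_i * f(b_i + sum_j w_ij x_j) + b0 -- the form (1)."""
    return v @ f(b + W @ x) + b0

rng = np.random.default_rng(0)
q, n = 4, 2                        # hidden units, input dimension (placeholders)
x  = rng.normal(size=n)            # input vector x_j
v  = rng.normal(size=q)            # output weights v_i
W  = rng.normal(size=(q, n))       # input weights w_ij
b  = rng.normal(size=q)            # hidden biases b_i
b0 = 0.1                           # output bias
print(perceptron(x, v, W, b, b0))  # a single scalar output
```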
$$J(\mathbf{w}) = \sum_{i=1}^{S} Q\big( f(\mathbf{w}, i) \big) , \qquad (2)$$

(3)

$$E = \frac{1}{2S} \sum_{s=1}^{S} \big( y^{s} - f(x^{s}) \big)^{2} . \qquad (4)$$
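Minimizing the error functional (4) can be sketched with plain gradient descent (a generic illustration, not the authors' training code; the target data, learning rate, network size and parameter packing are all invented, and a numerical gradient is used for brevity):

```python
import numpy as np

# E(w) = 1/(2S) * sum_s (y^s - f(w, x^s))^2  -- the functional (4),
# minimized here by gradient descent with a central-difference gradient.
rng = np.random.default_rng(1)
xs = np.linspace(-1.0, 1.0, 16)
ys = np.sin(np.pi * xs)                  # invented sample target data

q = 6                                    # hidden units
w = 0.5 * rng.normal(size=3 * q + 1)     # packed [v, W, b] plus output bias

def net(w, x):
    v, W, b, b0 = w[:q], w[q:2*q], w[2*q:3*q], w[-1]
    return np.tanh(np.outer(x, W) + b) @ v + b0

def E(w):
    r = ys - net(w, xs)
    return 0.5 * np.mean(r * r)          # exactly (4)

def num_grad(w, h=1e-6):
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w); d[i] = h
        g[i] = (E(w + d) - E(w - d)) / (2 * h)
    return g

e0 = E(w)
for _ in range(300):                     # a few hundred descent steps
    w -= 0.05 * num_grad(w)
print(E(w) < e0)                         # the error functional decreased
```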
$$u \frac{\partial u}{\partial x} + v \frac{\partial u}{\partial y} + \frac{\partial p}{\partial x} - \frac{1}{\mathrm{Re}} \left( \frac{\partial^{2} u}{\partial x^{2}} + \frac{\partial^{2} u}{\partial y^{2}} \right) = 0 ; \qquad (5)$$

$$u \frac{\partial v}{\partial x} + v \frac{\partial v}{\partial y} + \frac{\partial p}{\partial y} - \frac{1}{\mathrm{Re}} \left( \frac{\partial^{2} v}{\partial x^{2}} + \frac{\partial^{2} v}{\partial y^{2}} \right) = 0 . \qquad (6)$$

$$\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} = 0 . \qquad (7)$$
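The residuals of (5)-(7) can be evaluated numerically for any candidate field. A finite-difference sketch on a uniform grid (the sample field below is invented: a Poiseuille-type profile u = y(1 - y), v = 0 with a matching linear pressure drop, which satisfies (5)-(7) exactly, so the residuals vanish up to discretization error near the boundaries):

```python
import numpy as np

# Finite-difference residuals of the dimensionless system (5)-(7).
Re = 100.0
n = 41
x = np.linspace(0.0, 1.0, n)
y = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, y, indexing="ij")

u = Y * (1.0 - Y)            # Poiseuille-like profile: u = u(y), v = 0
v = np.zeros_like(u)
p = -2.0 / Re * X            # linear pressure drop balancing the viscous term

dx, dy = x[1] - x[0], y[1] - y[0]
ux, uy = np.gradient(u, dx, dy)
vx, vy = np.gradient(v, dx, dy)
px, py = np.gradient(p, dx, dy)
uxx = np.gradient(ux, dx, axis=0); uyy = np.gradient(uy, dy, axis=1)
vxx = np.gradient(vx, dx, axis=0); vyy = np.gradient(vy, dy, axis=1)

r5 = u * ux + v * uy + px - (uxx + uyy) / Re     # x-momentum residual (5)
r6 = u * vx + v * vy + py - (vxx + vyy) / Re     # y-momentum residual (6)
r7 = ux + vy                                     # continuity residual (7)
print(np.abs(r5).max(), np.abs(r6).max(), np.abs(r7).max())
```

In the neural network method the same residuals are evaluated at the grid nodes with derivatives of the trial functions and their total square is minimized.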
$$u^{*} = \frac{u}{\bar{u}} , \qquad r^{*} = \frac{r}{D} , \qquad p^{*} = \frac{p}{\rho \bar{u}^{2}} .$$
As for $\bar{u}$ and $D$, one can choose for them any characteristic velocity and linear dimension of the flow region, for example the fluid velocity at the channel inlet and the channel width $h$.
Let there be a rectangular region $[a,b]\times[c,d]$ in the $XY$ plane, and on it a rectangular analytical grid specified by the Cartesian product of two one-dimensional grids $\{x_k\}$, $k=1,\dots,n$ and $\{y_l\}$, $l=1,\dots,m$.
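The node set of such a grid is the Cartesian product of the two one-dimensional grids; a sketch (the bounds and grid sizes are placeholders):

```python
import numpy as np

# Rectangular grid on [a, b] x [c, d] as a Cartesian product of
# one-dimensional grids {x_k}, k = 1..n and {y_l}, l = 1..m.
a, b, c, d = 0.0, 2.0, 0.0, 1.0       # placeholder bounds
n, m = 5, 3                            # placeholder grid sizes
xk = np.linspace(a, b, n)
yl = np.linspace(c, d, m)
X, Y = np.meshgrid(xk, yl, indexing="ij")
nodes = np.column_stack([X.ravel(), Y.ravel()])   # n*m node coordinates
print(nodes.shape)                                 # (15, 2)
```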
We will understand neural network functions $u, v, p = f_{NET}(\mathbf{w}, x, y)$ as the solution of system (5)-(7) giving the minimum of the total squared residual over the node set of the computational grid. We present the trial solution of system (5)-(7) for $u, v, p$ in the form (1):
$$u(\mathbf{w}, x, y) = \sum_{i=1}^{q} v_i f(b_i + w_{i1} x + w_{i2} y) + b_u ; \qquad (8)$$

$$v(\mathbf{w}, x, y) = \sum_{i=q+1}^{2q} v_i f(b_i + w_{i1} x + w_{i2} y) + b_v ; \qquad (9)$$

$$p(\mathbf{w}, x, y) = \sum_{i=2q+1}^{3q} v_i f(b_i + w_{i1} x + w_{i2} y) + b_p . \qquad (10)$$
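A sketch of the trial solutions (8)-(10), with the three fields drawing on disjoint index ranges of one pool of 3q hidden units (the parameter packing and random values are invented; tanh is assumed for f):

```python
import numpy as np

q = 8                                   # hidden units per field
rng = np.random.default_rng(2)
v  = rng.normal(size=3 * q)             # output weights v_i, i = 1..3q
W  = rng.normal(size=(3 * q, 2))        # input weights w_i1, w_i2
b  = rng.normal(size=3 * q)             # biases b_i
bu, bv, bp = rng.normal(size=3)         # output biases b_u, b_v, b_p

def fields(x, y, f=np.tanh):
    """Trial solutions (8)-(10): three sums over disjoint index ranges."""
    h = v * f(b + W[:, 0] * x + W[:, 1] * y)   # v_i f(b_i + w_i1 x + w_i2 y)
    u  = h[:q].sum() + bu                       # i = 1 .. q          (8)
    vv = h[q:2*q].sum() + bv                    # i = q+1 .. 2q       (9)
    p  = h[2*q:].sum() + bp                     # i = 2q+1 .. 3q      (10)
    return u, vv, p

print(fields(0.5, 0.5))                 # (u, v, p) at one grid node
```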
The vortex-free component of the sought velocity field is obtained from the Laplace equation for the stream function $\psi$,

$$\frac{\partial^{2} \psi}{\partial x^{2}} + \frac{\partial^{2} \psi}{\partial y^{2}} = 0 ,$$

solved with the implementation of neural net trial functions under the boundary conditions $\psi = 0$, $\psi = 1$, $\partial\psi/\partial x = 0$ and $\partial\psi/\partial y = 0$ on the corresponding boundaries of the region, the velocity components being

$$u = \frac{\partial \psi}{\partial y} \quad \text{and} \quad v = -\frac{\partial \psi}{\partial x} .$$

As a result of the Laplace equation solution, a velocity distribution is generated which can be called the vortex-free component of the sought quantity. The final velocity and pressure distributions are generated as a result of the momentum equation solution in accordance with the following algorithm. On each time layer the intermediate field $F^{n} = f_{NET}(x, y)$ is constructed with the implementation of neural net trial functions, and the pressure on the next time layer is found from the Poisson equation

$$\nabla^{2} p^{\,n+1} = 2 \left( \frac{\partial u}{\partial x} \frac{\partial v}{\partial y} - \frac{\partial v}{\partial x} \frac{\partial u}{\partial y} \right) .$$

The velocity distribution on the next time layer is generated according to the formula

$$\mathbf{u}^{\,n+1} = \mathbf{F}^{\,n} - \Delta t \, \nabla p^{\,n+1} .$$
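The velocity update of this projection-style step can be sketched on a grid with central differences (a generic illustration; the intermediate field, pressure and time step below are placeholders, not the paper's neural net implementation):

```python
import numpy as np

# One projection-style update u^{n+1} = F^n - dt * grad(p^{n+1}),
# with placeholder fields; gradients by central differences.
n = 33
x = np.linspace(0.0, 1.0, n)
y = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, y, indexing="ij")
dx = dy = x[1] - x[0]
dt = 1e-2                          # placeholder time step

Fu = Y * (1.0 - Y)                 # intermediate field F^n (placeholder)
Fv = np.zeros_like(Fu)
p  = np.cos(np.pi * X)             # pressure on the next layer (placeholder)

px, py = np.gradient(p, dx, dy)
u_next = Fu - dt * px              # u^{n+1} = F^n_u - dt * dp/dx
v_next = Fv - dt * py              # v^{n+1} = F^n_v - dt * dp/dy
print(u_next.shape, v_next.shape)
```

Since the placeholder pressure depends only on x, its y-gradient vanishes and v is left unchanged by the correction, which is a convenient sanity check for the gradient orientation.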
IV. CONCLUSION
The presented results reveal one possible application of the apparatus of artificial neural networks: the numerical solution of equations, including differential ones, in a way similar to the method of weighted residuals but with the implementation of a specific network approximation. A minor modification of the standard neural network training algorithm makes it possible to use computational points placed arbitrarily in the computational domain and to obtain the solution in continuous form.
REFERENCES
[1]
[2]
[3]
[4]
[5]