
International Journal of Engineering and Technical Research (IJETR)

ISSN: 2321-0869, Volume-02, Issue-02, February 2014

Two Methods of Obtaining a Minimal Upper Estimate for the Error Probability of the Restoring Formal Neuron

A. I. Prangishvili, O. M. Namicheishvili, M. A. Gogiashvili

Abstract— It is shown that a minimal upper estimate for the error probability of the formal neuron, when the latter is used as a restoring (decision) element, can be obtained by the Laplace transform of the convolution of functions as well as by means of the generating function of the factorial moments of the sum of independent random variables. It is proved that in both cases the obtained minimal upper estimates are absolutely identical.

Index Terms— generating function, probability of signal restoration error, restoring neuron, upper estimate.

Manuscript received February 12, 2014.

A. I. Prangishvili, Faculty of Informatics and Control Systems, Georgian Technical University, 0171 Tbilisi, Georgia, +995 591191700 (e-mail: a_prangi@gtu.ge).

O. M. Namicheishvili, Faculty of Informatics and Control Systems, Georgian Technical University, 0171 Tbilisi, Georgia, +995 593573139 (e-mail: o.namicheishvili@gtu.ge).

M. A. Gogiashvili, School (Faculty) of Informatics, Mathematics and Natural Sciences, St. Andrew the First-Called Georgian University of the Patriarchate of Georgia, 0162 Tbilisi, Georgia, +995 599305303 (e-mail: maia.gogiashvili@yahoo.com).

I. INTRODUCTION

Let us consider the formal neuron, to the inputs of which different versions $X_1, X_2, \ldots, X_n, X_{n+1}$ of one and the same random binary signal $X$ arrive via the binary channels $B_1, B_2, \ldots, B_n, B_{n+1}$ with different error probabilities $q_i$ $(i = \overline{1, n+1})$, and the neuron must restore the correct input signal $X$ or, in other words, make a decision $Y$ using the versions $X_1, X_2, \ldots, X_n, X_{n+1}$. When the binary signal $X$ arrives at the inputs of the restoring element via channels of equal reliability, decision-making in which the value prevailing among the signal versions wins, i.e. decision-making by the majority principle, was first described by J. von Neumann [1], and later V. I. Varshavski [2] generalized this principle to redundant analog systems.

In the case of input channels with different reliabilities, adaptation of the formal neuron is needed in order to restore the correct signal. Adaptation is interpreted as the process of controlling the weights $a_i$ $(i = \overline{1, n+1})$ of the neuron inputs so as to make these weights match the current error probabilities $q_i$ $(i = \overline{1, n+1})$ of the input channels. The purpose of this control is to make inputs of high reliability exert more influence on decision-making (i.e. on the restoration of the correct signal) than inputs of low reliability.

Restoration is carried out by weighted voting according to the relation

$$Y = \operatorname{sgn}\Bigl(\sum_{i=1}^{n+1} a_i X_i\Bigr) = \operatorname{sgn} Z, \qquad (1)$$

where

$$Z = \sum_{i=1}^{n+1} a_i X_i. \qquad (2)$$

Both the input signal $X$ and its versions $X_i$ $(i = \overline{1, n})$ are considered as binary random variables coded by the logical values $(+1)$ and $(-1)$. It is formally assumed that the threshold $\Theta$ of the restoring neuron is introduced into consideration by means of the identity $\Theta \equiv a_{n+1}$, together with the signal $X_{n+1} \equiv -1$. The main point of this formalism is that the signal $X_{n+1} = -1$ is fed from some imaginary binary input $B_{n+1}$ for any value of the input signal $X$, whereas the value $q_{n+1}$ is the a priori probability of occurrence of the signal $X = +1$ or, which is the same, the error probability of the channel $B_{n+1}$. Quite a vast literature [3]-[7] is dedicated to threshold logic which takes into consideration the varying reliability of channels, but in this paper we express our viewpoint in the spirit of the ideas of W. Pierce [8].

Let us further assume that

$$\operatorname{sgn} Z = \begin{cases} +1 & \text{if } Z \ge 0, \\ -1 & \text{if } Z < 0. \end{cases} \qquad (3)$$

When $Z \ge 0$, the decision $Y$ at the output of the restoring formal neuron has the value $+1$ according to (3). The probability that the restored value $Y$ of the signal $X$ is not correct is expressed by the formula

$$Q = \operatorname{Prob}\{Y \ne X\} = \operatorname{Prob}\{\Gamma < 0\}. \qquad (4)$$

Here $\Gamma = XZ$ is a discrete random variable with probability distribution density $f(v)$. This variable is the sum of the independent discrete variables $\lambda_i = a_i X X_i$, and the function $f_i(v_i)$ describes the probability distribution density of the individual summand $\lambda_i$. For the realizations of the random variables $\Gamma$ and $\lambda_i$ we introduce the symbols $v$ and $v_i$, respectively.

It is easy to observe that the variable $\lambda_i$ takes the values $+a_i$ and $-a_i$ with probabilities $1 - q_i$ and $q_i$, respectively.
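As a numerical illustration of the voting model (1)-(4), the sketch below (Python; the channel error probabilities and sample size are illustrative assumptions, not values from the paper) simulates the binary channels and estimates $Q = \operatorname{Prob}\{Y \ne X\}$ by Monte Carlo, with the weights matched to the error probabilities in the way derived later in the paper.

```python
import math
import random

def restore(x, a, q, rng):
    """One restoration: channel B_i flips x with probability q[i];
    the neuron outputs Y = sgn(sum_i a_i X_i), with sgn(0) = +1 as in (3)."""
    z = 0.0
    for ai, qi in zip(a, q):
        xi = -x if rng.random() < qi else x   # version X_i of the signal X
        z += ai * xi
    return 1 if z >= 0 else -1

def estimate_Q(a, q, trials=100_000, seed=1):
    """Monte-Carlo estimate of Q = Prob{Y != X} for equiprobable X = +/-1."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        x = rng.choice((-1, 1))
        if restore(x, a, q, rng) != x:
            errors += 1
    return errors / trials

# Hypothetical channel error probabilities; more reliable inputs get larger weights.
q = [0.1, 0.2, 0.3]
a = [math.log((1 - qi) / qi) for qi in q]
print(estimate_Q(a, q))
```

For these illustrative values the estimate comes out close to 0.1, noticeably better than the worst single channel, which is exactly the effect weighted voting is meant to produce.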


Therefore, if we use the Dirac delta function $\delta(t)$, then the probability density $f_i(v_i)$ can be represented as follows:

$$f_i(v_i) = (1 - q_i)\,\delta(v_i - a_i) + q_i\,\delta(v_i + a_i), \quad v_i \in \{-a_i, +a_i\}, \quad i = \overline{1, n+1}. \qquad (5)$$

Such a formalism is completely justified and frequently used due to the following two properties of the delta function:

$$\delta(t) = 0 \quad (t \ne 0), \qquad \int_{-\infty}^{+\infty} \delta(t)\,dt = 1.$$

However, $f_i(v_i)$ can also be represented as

$$f_i(v_i) = q_i^{(a_i - v_i)/2a_i}\,(1 - q_i)^{(a_i + v_i)/2a_i}, \quad v_i \in \{-a_i, +a_i\}, \quad i = \overline{1, n+1}. \qquad (6)$$

The random variable $\Gamma$ is the sum of the independent discrete random variables $\lambda_i$. Its distribution density $f(v)$ can therefore be defined as the convolution of the probability distribution densities of the summands $f_i(v_i)$:

$$f(v) = \mathop{\circledast}\limits_{i=1}^{n+1} f_i(v_i), \qquad (7)$$

where $\circledast$ (a superposition of the addition and multiplication signs) is the convolution symbol.

It is obvious that, in view of formula (7), the error probability at the output of the decision element can be written in two equivalent forms:

$$Q = \operatorname{Prob}(v < 0) = \int_{-\infty}^{0} f(v)\,dv = \int_{-\infty}^{0} \mathop{\circledast}\limits_{i=1}^{n+1} f_i(v_i)\,dv = \int_{-\infty}^{0} \mathop{\circledast}\limits_{i=1}^{n+1} \bigl[(1 - q_i)\,\delta(v_i - a_i) + q_i\,\delta(v_i + a_i)\bigr]\,dv \qquad (8)$$

and

$$Q = \sum_{v < 0} f(v) = \sum_{v < 0} \prod_{i=1}^{n+1} q_i^{(a_i - v_i)/2a_i}\,(1 - q_i)^{(a_i + v_i)/2a_i}, \qquad (9)$$

where, for each discrete value $v$, the product is evaluated at the realizations $(v_1, \ldots, v_{n+1})$ with $v = \sum_i v_i$, and the probability distribution density $f_i(v_i)$ is defined by (5) in the first case and by (6) in the second case. Integration or summation in both cases is carried out, continuously or discretely, over all negative values of the variable $v$.

Formulas (8) and (9) give the exact value of the error probability of restoration of a binary signal by the formal neuron.

Note that for practical calculations formula (9) can be written in a more convenient form. Indeed, the complete number of discrete values of the variable $v$ is $2^{n+1}$, since

$$v = \pm a_1 \pm a_2 \pm \cdots \pm a_n \pm a_{n+1},$$

where each term is equal either to $+a_i$ or to $(-a_i)$, the proper sign of the weight $a_i$ being meant to be within the round brackets. By formula (9), to each discrete value of the sum $v$ there corresponds a term $Q_j$ $(j = \overline{1, 2^{n+1}})$ which is the product of $(n+1)$ co-factors of the form $q_k$ or $(1 - q_k)$. In particular,

$$Q_j = f(v) = \prod_{i=1}^{n+1} f_i(v_i) = \tilde{q}_1 \tilde{q}_2 \cdots \tilde{q}_n \tilde{q}_{n+1}, \qquad j = \overline{1, 2^{n+1}},$$

where

$$\tilde{q}_k = \begin{cases} q_k & \text{if } v_k = (-a_k), \\ 1 - q_k & \text{if } v_k = (+a_k) \end{cases}$$

for any $k = \overline{1, n+1}$. Thus formula (9) can also be written in the form

$$Q = \sum_{v < 0} Q_j = \sum_{v < 0} \tilde{q}_1 \tilde{q}_2 \cdots \tilde{q}_n \tilde{q}_{n+1}, \qquad (10)$$

which is more adequate for cognitive perception and practical realization.

II. FINDING A MINIMAL UPPER ESTIMATE BY THE FIRST METHOD

From the expression

$$Q = \operatorname{Prob}(\Gamma < 0) = \int_{-\infty}^{0} f(v)\,dv$$

it follows that for a real positive number $s$ $(s > 0)$

$$Q \le \int_{-\infty}^{0} e^{-sv} f(v)\,dv \le \int_{-\infty}^{+\infty} e^{-sv} f(v)\,dv,$$

since $e^{-sv} \ge 1$ for $v \le 0$ and $e^{-sv} f(v)$ is non-negative everywhere. But the right-hand part of this inequality is the (two-sided) Laplace transform of the function $f(v)$:

$$L[f(v)] = \int_{-\infty}^{+\infty} e^{-sv} f(v)\,dv,$$

where $L$ is the Laplace transform operator. Therefore

$$Q \le L[f(v)]. \qquad (11)$$

The random variable $\Gamma$ with realizations $v$ is the sum of the independent random variables $\lambda_i$ having realizations $v_i$. In that case, as is known, the Laplace transform of the convolution $f(v)$ of the functions $f_i(v_i)$ is equal to the product of the Laplace transforms of the convolved functions:

$$L[f(v)] = \prod_{i=1}^{n+1} L[f_i(v_i)].$$

The latter implies that

$$Q \le \prod_{i=1}^{n+1} L[f_i(v_i)]. \qquad (12)$$

By expression (5) for the functions $f_i(v_i)$ and the definition of the Laplace transform, we obtain

$$L[f_i(v_i)] = \int_{-\infty}^{+\infty} e^{-s v_i} f_i(v_i)\,dv_i = \int_{-\infty}^{+\infty} e^{-s v_i}\bigl[(1 - q_i)\,\delta(v_i - a_i) + q_i\,\delta(v_i + a_i)\bigr]\,dv_i.$$

Using this expression in formula (12), we have

$$Q \le \prod_{i=1}^{n+1} \int_{-\infty}^{+\infty} e^{-s v_i}\bigl[(1 - q_i)\,\delta(v_i - a_i) + q_i\,\delta(v_i + a_i)\bigr]\,dv_i. \qquad (13)$$

Here we should make use of one more property of the Dirac delta function:

$$\int_{-\infty}^{+\infty} g(t)\,\delta(t - t_0)\,dt = g(t_0).$$


With this property taken into account, from formula (13) we obtain

$$Q \le \prod_{i=1}^{n+1} \bigl[(1 - q_i)e^{-a_i s} + q_i e^{+a_i s}\bigr]. \qquad (14)$$

Here $s$, as mentioned above, is an arbitrary real positive number. Before we continue simplifying the right-hand part of inequality (14), we have to define the value of $s$ for which expression (14) gives a minimal upper estimate.

Passing to the natural logarithm of inequality (14), we come to the expression

$$\ln Q \le \sum_{i=1}^{n+1} \ln\bigl[(1 - q_i)e^{-a_i s} + q_i e^{+a_i s}\bigr].$$

Let us define here the partial derivatives with respect to the arguments $a_i$ by using the elementary fact that

$$\frac{dy}{dx} = f'(x)\,e^{f(x)}$$

if $y = e^{f(x)}$, and also the fact that $\dfrac{d}{dx}\ln f(x) = \dfrac{f'(x)}{f(x)}$. Hence we obtain

$$\frac{\partial}{\partial a_i} \sum_{k=1}^{n+1} \ln\bigl[(1 - q_k)e^{-a_k s} + q_k e^{+a_k s}\bigr] = \frac{s q_i e^{+a_i s} - s(1 - q_i)e^{-a_i s}}{(1 - q_i)e^{-a_i s} + q_i e^{+a_i s}}.$$

For the right-hand part of this equality to be equal to zero, it suffices that the following condition be fulfilled:

$$s q_i e^{+a_i s} - s(1 - q_i)e^{-a_i s} = 0,$$

whence it follows that

$$e^{+2 a_i s} = \frac{1 - q_i}{q_i},$$

or, which is the same,

$$a_i s = \frac{1}{2}\ln\frac{1 - q_i}{q_i}.$$

If the weights $a_i$ of the neuron inputs are put into correspondence with the error probabilities $q_i$ of these inputs by the relations

$$a_i = \ln\frac{1 - q_i}{q_i}, \qquad (15)$$

then the sought value of $s$ will be

$$s = \frac{1}{2}. \qquad (16)$$

Using equality (16) in formula (14), we obtain a minimal upper estimate for the error probability $Q$ of the restoring neuron. Indeed, for the right-hand part of expression (14) the following chain of identical transformations is valid:

$$q_i e^{\frac{a_i}{2}} + (1 - q_i)e^{-\frac{a_i}{2}} = 2\sqrt{q_i(1 - q_i)}\;\frac{q_i e^{\frac{a_i}{2}} + (1 - q_i)e^{-\frac{a_i}{2}}}{2\sqrt{q_i(1 - q_i)}} = 2\sqrt{q_i(1 - q_i)}\;\frac{\sqrt{\frac{q_i}{1 - q_i}}\,e^{\frac{a_i}{2}} + \sqrt{\frac{1 - q_i}{q_i}}\,e^{-\frac{a_i}{2}}}{2}.$$

Let us take into account here that

$$\sqrt{\frac{q_i}{1 - q_i}} = \exp\Bigl(-\frac{1}{2}\ln\frac{1 - q_i}{q_i}\Bigr), \qquad \sqrt{\frac{1 - q_i}{q_i}} = \exp\Bigl(+\frac{1}{2}\ln\frac{1 - q_i}{q_i}\Bigr).$$

Besides, we denote

$$\frac{a_i - \ln\frac{1 - q_i}{q_i}}{2} = \beta_i.$$

Then we have

$$q_i e^{\frac{a_i}{2}} + (1 - q_i)e^{-\frac{a_i}{2}} = 2\sqrt{q_i(1 - q_i)}\;\frac{e^{\beta_i} + e^{-\beta_i}}{2}.$$

The second co-factor in the right-hand part of this expression is the hyperbolic cosine of the argument $\beta_i$:

$$\frac{e^{\beta_i} + e^{-\beta_i}}{2} = \operatorname{ch}\beta_i.$$

Therefore

$$q_i e^{\frac{a_i}{2}} + (1 - q_i)e^{-\frac{a_i}{2}} = 2\sqrt{q_i(1 - q_i)}\,\operatorname{ch}\beta_i = 2\sqrt{q_i(1 - q_i)}\,\operatorname{ch}\frac{a_i - \ln\frac{1 - q_i}{q_i}}{2}.$$

Finally, for estimate (14) we can write

$$Q \le \prod_{i=1}^{n+1} 2\sqrt{q_i(1 - q_i)}\,\operatorname{ch}\frac{a_i - \ln\frac{1 - q_i}{q_i}}{2}.$$

For the error probability $Q$, the right-hand part of the above inequality is the upper estimate $Q^+$:

$$Q^+ = \prod_{i=1}^{n+1} 2\sqrt{q_i(1 - q_i)}\,\operatorname{ch}\frac{a_i - \ln\frac{1 - q_i}{q_i}}{2}.$$

The minimum $Q^+_{\min}$ of this upper estimate $Q^+$ is equal to

$$Q^+_{\min} = \prod_{i=1}^{n+1} 2\sqrt{q_i(1 - q_i)} = 2^{n+1}\prod_{i=1}^{n+1} \sqrt{q_i(1 - q_i)}. \qquad (17)$$

It is attained when, for the zero argument, the hyperbolic cosine attains its minimum equal to 1.

This estimate confirms in a certain sense the advantage of choosing the weights of the restoring neuron in compliance with the error probabilities of the input signals according to the relations

$$a_i = \ln\frac{1 - q_i}{q_i}, \qquad i = \overline{1, n+1}.$$
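Both the exact error probability (10) and the bound (17) are easy to compute numerically. The sketch below (Python; the channel error probabilities are illustrative assumptions) enumerates all $2^{n+1}$ realizations $v = \pm a_1 \pm \cdots \pm a_{n+1}$ with weights chosen per (15), sums the products $\tilde{q}_1 \cdots \tilde{q}_{n+1}$ over negative $v$ as in (10), and checks the result against the minimal upper estimate (17).

```python
import math
from itertools import product

def exact_Q(q):
    """Exact error probability by formula (10), weights a_i = ln((1-q_i)/q_i) per (15)."""
    a = [math.log((1 - qi) / qi) for qi in q]
    total = 0.0
    for signs in product((+1, -1), repeat=len(q)):     # all 2^(n+1) realizations
        v = sum(s * ai for s, ai in zip(signs, a))      # v = +/-a_1 +/- ... +/- a_(n+1)
        if v < 0:
            p = 1.0
            for s, qk in zip(signs, q):
                p *= qk if s < 0 else (1 - qk)          # co-factor q_k or (1 - q_k)
            total += p
    return total

def bound_17(q):
    """Minimal upper estimate (17): product of the factors 2*sqrt(q_i(1-q_i))."""
    r = 1.0
    for qi in q:
        r *= 2 * math.sqrt(qi * (1 - qi))
    return r

q = [0.05, 0.1, 0.2, 0.3]              # hypothetical error probabilities
print(exact_Q(q), bound_17(q))          # the exact value never exceeds the bound
```

The enumeration is exponential in the number of inputs, which is precisely why the closed-form estimate (17) is valuable for larger configurations.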

III. OBTAINING A MINIMAL UPPER ESTIMATE BY THE SECOND METHOD

Simultaneously, for the probability $Q$ it is useful to obtain a minimal upper estimate in closed analytic form by one more, new approach.

As is known [9], the generating function $\gamma_v(S)$ of the factorial moments of the sum of independent random variables $\lambda_i$ is equal to the product of the generating functions $\gamma_{v_i}(S)$ of the factorial moments of the individual summands, i.e.

$$\gamma_v(S) = \prod_{i=1}^{n+1} \gamma_{v_i}(S), \qquad (18)$$

where

$$\gamma_v(S) = \mathrm{E}\bigl[S^v\bigr] = \sum_v S^v f(v), \qquad (19)$$

$$\gamma_{v_i}(S) = \mathrm{E}\bigl[S^{v_i}\bigr] = \sum_{v_i} S^{v_i} f_i(v_i), \qquad i = \overline{1, n+1}. \qquad (20)$$

Here $\mathrm{E}$ is the mathematical expectation symbol and $S$ is an arbitrary complex number for which series (19) and (20) exist on some segment of the real axis containing the point $S = 1$.

Since in relation (20) summation is carried out on the set of the two possible values $+a_i$ and $-a_i$ of the variable $v_i$, using (6) we have

$$\gamma_{v_i}(S) = (1 - q_i)S^{a_i} + q_i S^{-a_i}, \qquad i = \overline{1, n+1}. \qquad (21)$$

The substitution of (21) into relation (18) gives

$$\gamma_v(S) = \prod_{i=1}^{n+1} \bigl[(1 - q_i)S^{a_i} + q_i S^{-a_i}\bigr].$$

When $v < 0$, the value $S^v$ satisfies the condition

$$S^v = \frac{1}{S^{|v|}} > 1,$$

if of course

$$0 < S < 1. \qquad (22)$$

Let us assume that inequality (22) is fulfilled. Then the following relation is valid:

$$Q = \sum_{v < 0} f(v) < \sum_{v < 0} S^v f(v).$$

Since every summand $S^v f(v)$ is non-negative, we have the inequality

$$\sum_{v < 0} S^v f(v) \le \sum_v S^v f(v).$$

Therefore

$$Q < \gamma_v(S). \qquad (23)$$

The right-hand part of this expression can be taken as the upper estimate $Q^+$ of the error probability $Q$ of the restoring neuron:

$$Q^+ = \prod_{i=1}^{n+1} \bigl[(1 - q_i)S^{a_i} + q_i S^{-a_i}\bigr].$$

The latter relation is easily rewritten in the equivalent form

$$Q^+ = \prod_{i=1}^{n+1} Q_i^+ = \prod_{i=1}^{n+1} \Bigl[(1 - q_i)w_i + \frac{q_i}{w_i}\Bigr], \qquad (24)$$

where

$$Q_i^+ = (1 - q_i)w_i + \frac{q_i}{w_i}$$

and, along with this,

$$w_i = S^{a_i}, \qquad 0 < w_i < \infty, \qquad i = \overline{1, n+1}. \qquad (25)$$

Now we can find the minimum $Q^+_{\min}$ of expression (24) and the value $w_{0i}$ of $w_i$ which attaches a minimum to the upper estimate $Q^+$ of the error probability of the restoring neuron. For this, we use the conditions

$$\frac{\partial Q^+}{\partial w_i} = 0, \qquad i = \overline{1, n+1}.$$

Hence

$$w_{0i} = \sqrt{\frac{q_i}{1 - q_i}}, \qquad i = \overline{1, n+1}. \qquad (26)$$

If (26) is substituted into expression (24), then by the second method, for a minimal upper estimate of the error probability of the restoring neuron, we obtain the relation

$$Q^+_{\min} = 2^{n+1}\prod_{i=1}^{n+1} \sqrt{q_i(1 - q_i)}, \qquad (27)$$

which coincides with result (17) obtained by the first method.

The weights $a_i$ $(i = \overline{1, n+1})$ which match the error probabilities $q_i$ $(i = \overline{1, n+1})$ are defined from relations (26) with notation (25) taken into account:

$$a_i = \frac{\ln w_{0i}}{\ln S} = -\,\frac{\ln\frac{1 - q_i}{q_i}}{2\ln S}, \qquad i = \overline{1, n+1}.$$

Since the value $S$ satisfies condition (22), we have $\ln S < 0$ and therefore

$$a_i = K\ln\frac{1 - q_i}{q_i}, \qquad i = \overline{1, n+1}, \qquad (28)$$

where

$$K = -\frac{1}{2\ln S}, \qquad 0 < K < \infty. \qquad (29)$$

Thus the weights $a_i$ $(i = \overline{1, n+1})$, which are consistent with the error probabilities $q_i$ $(i = \overline{1, n+1})$ and attach a minimum to the upper estimate of the error probability of the restoring neuron, are defined to within a general positive factor $K$.

IV. CONCLUSION

A minimal upper estimate of the error probability of the restoring formal neuron is defined by formula (17) or, which is the same, by formula (27). In both cases the result can be written in the form

$$Q^+_{\min} = \prod_{i=1}^{n+1} \exp A(q_i), \qquad (30)$$

where

$$A(q_i) = \ln\Bigl(2\sqrt{q_i(1 - q_i)}\Bigr). \qquad (31)$$
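A quick numerical check of these results (Python; the probability value and search grid are illustrative assumptions): a brute-force search over $w$ confirms that the per-input factor (24) is minimized near $w_0 = \sqrt{q/(1-q)}$ with minimum $2\sqrt{q(1-q)}$, as in (26)-(27), and the product (30) then shows the exponential decrease of $Q^+_{\min}$ as the number of inputs grows.

```python
import math

def Qi_plus(q, w):
    """Per-input factor from (24): (1 - q) w + q / w."""
    return (1 - q) * w + q / w

def grid_min(q, lo=1e-3, hi=10.0, steps=200_000):
    """Brute-force minimizer of Qi_plus over a uniform grid of w values."""
    best_w, best_v = lo, Qi_plus(q, lo)
    for k in range(1, steps + 1):
        w = lo + (hi - lo) * k / steps
        v = Qi_plus(q, w)
        if v < best_v:
            best_w, best_v = w, v
    return best_w, best_v

q = 0.2                                   # hypothetical channel error probability
w0 = math.sqrt(q / (1 - q))               # closed-form minimizer (26)
w_num, v_num = grid_min(q)
print(w0, w_num, v_num)                   # numeric minimum matches 2*sqrt(q(1-q))

# Exponential decrease (30)-(31): A(q) = ln(2 sqrt(q(1-q))) < 0 for q != 1/2,
# so the bound over n+1 identical inputs decays geometrically with n.
for n_plus_1 in (1, 5, 10):
    print(n_plus_1, math.exp(n_plus_1 * math.log(2 * math.sqrt(q * (1 - q)))))
```

At $q = \frac{1}{2}$ each factor equals 1 and the bound stops decreasing, matching the degenerate case noted below.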
In view of relations (31), which confirm the non-positivity of the values $A(q_i)$, formula (30) implies that an increase of the number $n$ of inputs of the formal decision neuron brings about a monotone decrease of the minimal upper estimate $Q^+_{\min}$ of the error probability of restoration of the binary signal by the exponential law if, certainly, the error probabilities $q_i$ $(i = \overline{1, n+1})$ at these inputs are not equal to $\frac{1}{2}$ (when they are, the minimal upper estimate $Q^+_{\min}$ is equal to 1).

This result demonstrates an essential inner connection with Shannon's theorem [10]. According to this theorem, the number of messages of length $n$ (duration $\tau$) composed of individual symbols, both in the absence and in the presence of fixed and probabilistic constraints (in the latter case it is assumed that the source is ergodic), grows by the asymptotically exponential law as $n$ (or $\tau$) increases. In particular, we understand this connection as follows: as the number $n$ of inputs of the restoring formal neuron increases, the initial information to be used in making the decision $Y$ increases by the exponential law if there are a number of possible versions of the input signal, while the minimal upper estimate $Q^+_{\min}$ of the probability $Q$ that the made decision is erroneous decreases by the same exponential law.

REFERENCES

[1] J. von Neumann, "Probabilistic logics and the synthesis of reliable organisms from unreliable components", in Automata Studies: Annals of Mathematics Studies, no. 34 (in Russian, translated from English), Moscow: Publishing House for Foreign Literature (IIL), 1956, pp. 68-139.
[2] V. I. Varshavsky, "Functional Possibilities and Synthesis of Threshold Elements" (in Russian), Dokl. AN SSSR, vol. 139, no. 5, pp. 1071-1074, 1961.
[3] M. Dertouzos, Threshold Logic (in Russian, translated from English), Moscow: Mir, 1966.
[4] C. R. Lageweg, S. D. Cotofana, S. Vassiliadis, "A full adder implementation using set based linear threshold gates", in Proc. 9th IEEE International Conference on Electronics, Circuits and Systems, New York, 2002, pp. 665-669.
[5] V. Beiu, J. M. Quintana, M. J. Avedillo, "VLSI Implementation of Threshold Logic: A Comprehensive Survey", IEEE Trans. Neural Networks, vol. 14, pp. 1217-1243, May 2003.
[6] R. Zhang, P. Gupta, L. Zhong, N. K. Jha, "Threshold Network Synthesis and Optimization and Its Application to Nanotechnologies", IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 24, pp. 107-118, January 2005.
[7] T. Gowda, S. Vrudhula, N. Kulkarni, K. Berezowski, "Identification of Threshold Functions and Synthesis of Threshold Networks", IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 30, pp. 665-677, May 2011.
[8] W. Pierce, Failure-Tolerant Computer Design (in Russian, translated from English), Moscow: Mir, 1968.
[9] G. A. Korn, T. M. Korn, Mathematical Handbook for Scientists and Engineers (in Russian, translated from English), Moscow: Nauka, 1977, p. 832.
[10] S. Goldman, Information Theory (in Russian, translated from English), Moscow: Publishing House for Foreign Literature (IIL), 1957.

Archil I. Prangishvili was born April 13, 1961, in Tbilisi (Georgia). During 1978-1983 he was a student of the Faculty of Automatics and Computer Engineering at Georgian Polytechnic Institute, and during 1983-1987 a postgraduate student there. Currently he is Doctor of Technical Sciences, Full Professor at the Faculty of Informatics and Control Systems of Georgian Technical University, full member (academician) of the Georgian National Academy of Sciences, President of the Georgian Engineering Academy, Member of the International Engineering Academy and the International Academy of Informatization of the UN (United Nations), and Rector of Georgian Technical University. Archil I. Prangishvili is an expert in the field of Computer Science, Applied Expert Systems, Artificial Intelligence and Control Systems, head of the Editorial Board of the journal "Automated Control Systems" and member of the Editorial Board of the Georgian Electronic Scientific Journal (GESJ): Computer Science and Telecommunications. He has published more than 120 works, including 7 monographs and 5 textbooks.

Oleg M. Namicheishvili received the Master of Science degree (Radio Physics) from Ivane Javakhishvili Tbilisi State University in 1965. During 1965-1968 he studied at the Graduate School of Tbilisi State University and was a trainee at the Department of Probability Theory of Moscow State University. In 1968 he received the degree of PhD (Engineering Cybernetics and Information Theory), and in 1989 the degree of D.Sc. (Tech.) from Georgian Technical University. Over a long period of time Oleg Namicheishvili was an intern at CNET (Centre National d'Études des Télécommunications, Lannion, France), a French research laboratory in telecommunications. He is now working at the Faculty of Informatics and Control Systems of Georgian Technical University as Full Professor and ALT (Accelerated Life Tests) Project Director. He teaches, and his research concerns the reliability of information systems. He has published more than 150 research papers in various journals. He is a full member (academician) of the Georgian Engineering Academy and Deputy Editor-in-Chief of the Georgian Electronic Scientific Journal (GESJ): Computer Science and Telecommunications.

Maia A. Gogiashvili was born September 23, 1971, in Tbilisi (Georgia). In 1994 she graduated from Ivane Javakhishvili Tbilisi State University, Faculty of Physics. Since 1994 she has been a teacher of physics and mathematics at Grigol Robakidze University. Since 2005 she has been an expert at the National Examinations Center of the Ministry of Education and Science of Georgia. Currently she is a doctoral candidate of St. Andrew the First-Called Georgian University of the Patriarchate of Georgia at the Faculty of Informatics, Mathematics and Natural Sciences. She has published more than 3 research papers in various scientific journals.
