
1. A radial basis function (RBF) network is shown in the figure. All the parameters of this network are to be trained using backpropagation. Derive the update rule for the hidden-layer parameters, and use this result to find all the parameters of the network after one epoch. The initial output-layer weights (shown in the figure) are w2 = 3 and w3 = 2. The initial value of σ is 0.5 and of c is 0.2. Assume a learning rate η = 0.1, input x = 1, and targets t2 = 0.8, t3 = 0.6.

Since the output layer of the RBF network is linear, the update rule for the output-layer weights is

    w_k(new) = w_k(old) + η(t_k − o_k)a

For the hidden neuron, the center and width are updated by gradient descent:

    c(new) = c(old) + Δc  and  σ(new) = σ(old) + Δσ,

where

    Δc = −η ∂E/∂c  and  Δσ = −η ∂E/∂σ.

By the chain rule,

    ∂E/∂c = (∂E/∂a)(∂a/∂c)  and  ∂E/∂σ = (∂E/∂a)(∂a/∂σ).

With the Gaussian activation a = exp(−(x − c)²/(2σ²)),

    ∂a/∂c = a(x − c)/σ²,
    ∂a/∂σ = a(x − c)²/σ³.

With E = ½ Σ_k (t_k − o_k)², net input x_k = w_k a and linear output o_k = x_k,

    ∂E/∂o_k = −(t_k − o_k),  ∂o_k/∂x_k = 1,  ∂x_k/∂a = w_k,

so

    ∂E/∂a = −Σ_k (t_k − o_k) w_k.

Therefore

    ∂E/∂c = −Σ_k (t_k − o_k) w_k · a(x − c)/σ²,
    ∂E/∂σ = −Σ_k (t_k − o_k) w_k · a(x − c)²/σ³.
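Before substituting numbers, the derived gradients can be sanity-checked against finite differences. The following is a minimal sketch (plain Python; the values x = 1, c = 0.2, σ = 0.5, w = (3, 2), t = (0.8, 0.6) are the problem's):

```python
import math

def rbf(x, c, sigma):
    # Gaussian hidden activation: a = exp(-(x - c)^2 / (2*sigma^2))
    return math.exp(-(x - c) ** 2 / (2 * sigma ** 2))

def loss(x, c, sigma, w, t):
    # E = 1/2 * sum_k (t_k - o_k)^2, with linear outputs o_k = w_k * a
    a = rbf(x, c, sigma)
    return 0.5 * sum((tk - wk * a) ** 2 for wk, tk in zip(w, t))

def grads(x, c, sigma, w, t):
    # Analytic gradients from the chain rule:
    # dE/dc = dE/da * a*(x - c)/sigma^2,  dE/dsigma = dE/da * a*(x - c)^2/sigma^3
    a = rbf(x, c, sigma)
    dE_da = -sum((tk - wk * a) * wk for wk, tk in zip(w, t))
    return dE_da * a * (x - c) / sigma ** 2, dE_da * a * (x - c) ** 2 / sigma ** 3

x, c, sigma = 1.0, 0.2, 0.5
w, t = (3.0, 2.0), (0.8, 0.6)
h = 1e-6
dc, ds = grads(x, c, sigma, w, t)
# Central finite differences of E with respect to c and sigma
num_dc = (loss(x, c + h, sigma, w, t) - loss(x, c - h, sigma, w, t)) / (2 * h)
num_ds = (loss(x, c, sigma + h, w, t) - loss(x, c, sigma - h, w, t)) / (2 * h)
print(abs(dc - num_dc) < 1e-8, abs(ds - num_ds) < 1e-8)  # True True
```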
Substituting σ = 0.5, c = 0.2, x = 1 and η = 0.1 into these results:

    a = exp(−(1 − 0.2)²/(2 × 0.5²)) = e^(−1.28) ≈ 0.278

    o2 = 3a ≈ 0.834,  t2 = 0.8
    o3 = 2a ≈ 0.556,  t3 = 0.6

    w2 = 3 + 0.1(0.8 − 0.834)(0.278) = 3 − 0.00095 ≈ 2.999
    w3 = 2 + 0.1(0.6 − 0.556)(0.278) = 2 + 0.0012 ≈ 2.0012
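The arithmetic above can be verified with a short script; tiny differences in the last digit come from rounding a to 0.278 in the hand calculation:

```python
import math

x, c, sigma, eta = 1.0, 0.2, 0.5, 0.1   # input, center, width, learning rate
w = [3.0, 2.0]                           # initial output weights w2, w3
t = [0.8, 0.6]                           # targets t2, t3

a = math.exp(-(x - c) ** 2 / (2 * sigma ** 2))   # hidden activation
o = [wk * a for wk in w]                         # linear outputs o2, o3

# Output-layer rule: w_k <- w_k + eta * (t_k - o_k) * a
w_new = [wk + eta * (tk - ok) * a for wk, ok, tk in zip(w, o, t)]

print(round(a, 3))                        # 0.278
print([round(ok, 3) for ok in o])         # [0.834, 0.556]
print([round(wk, 4) for wk in w_new])     # [2.9991, 2.0012]
```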

For the hidden-layer parameters, ∂o_k/∂x_k = 1 and ∂x_k/∂a = w_k, so ∂E/∂a is the same as before. With

    Σ_k (t_k − o_k) w_k = (0.8 − 0.834)(3) + (0.6 − 0.556)(2) ≈ −0.014,

the center update is

    Δc = η Σ_k (t_k − o_k) w_k · a(x − c)/σ² = 0.1(−0.014)(0.278)(0.8)/(0.25) ≈ −0.00125
    c = 0.2 − 0.00125 ≈ 0.1988

and the width update is

    Δσ = η Σ_k (t_k − o_k) w_k · a(x − c)²/σ³ = 0.1(−0.014)(0.278)(0.64)/(0.125) ≈ −0.00199
    σ = 0.5 − 0.00199 ≈ 0.4980

(Carrying full precision in a rather than rounding to 0.278 gives c ≈ 0.1987 and σ ≈ 0.4979.)
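The hidden-parameter updates can be checked the same way; carrying full precision (rather than a ≈ 0.278 and a sum of −0.014) shifts the last digit slightly:

```python
import math

x, c, sigma, eta = 1.0, 0.2, 0.5, 0.1
w = [3.0, 2.0]          # initial output weights w2, w3
t = [0.8, 0.6]          # targets t2, t3

a = math.exp(-(x - c) ** 2 / (2 * sigma ** 2))
s = sum((tk - wk * a) * wk for wk, tk in zip(w, t))   # sum_k (t_k - o_k) w_k

# Gradient-descent updates: c <- c - eta*dE/dc, sigma <- sigma - eta*dE/dsigma,
# with dE/dc = -s*a*(x - c)/sigma^2 and dE/dsigma = -s*a*(x - c)^2/sigma^3
c_new = c + eta * s * a * (x - c) / sigma ** 2
sigma_new = sigma + eta * s * a * (x - c) ** 2 / sigma ** 3

print(round(c_new, 4), round(sigma_new, 4))   # 0.1987 0.4979
```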
