S. Vivekanandan
Cabin: TT 319A
E-Mail: svivekanandan@vit.ac.in
Mobile: 8124274447
BAM (Bidirectional Associative Memory)
• Kosko developed BAM in 1988.
• It is a hetero-associative recurrent neural network consisting of
two layers.
• It iterates by sending signals back and forth between the two
layers until each neuron's activation remains constant.
• Here the layers are referred to as the X-layer and Y-layer instead of
the input and output layers, and the weights are bidirectional.
• There are three forms of BAM, but the architecture remains the same:
1. Binary
2. Bipolar
3. Continuous
• The hetero-associative BAM has 'n' units in the X-layer and 'm' units in the Y-layer.
• Connections between the layers are bidirectional, i.e. if the
weight matrix for signals sent from the X-layer to the Y-layer is W,
then the matrix for signals from Y to X is W^T (the transpose of W).
• It resembles a single-layer feed-forward network.
• The training process is based on the Hebb rule.
• It is a fully interconnected network in which the inputs and the
outputs are different.
• There exist two types of BAM. They are
1. Discrete BAM
2. Continuous BAM
• For bipolar input and target vectors s(p) : t(p), the weights are determined by
      W_ij = Σ_p s_i(p) t_j(p),   p = 1, …, P
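The bipolar Hebb rule above is a sum of outer products over the training pairs. A minimal NumPy sketch (the two training pairs are illustrative, not from the source):

```python
import numpy as np

# Two illustrative bipolar training pairs s(p) -> t(p).
S = np.array([[ 1, -1,  1],
              [-1,  1, -1]])   # X-layer patterns, n = 3 units
T = np.array([[ 1, -1],
              [-1,  1]])       # Y-layer patterns, m = 2 units

# Hebbian outer-product rule: W_ij = sum_p s_i(p) t_j(p), giving an n x m matrix.
W = S.T @ T
print(W)
```

Recall of a stored pair is then a matrix product followed by a sign threshold: `np.sign(S[0] @ W)` recovers `T[0]`.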
• A continuous BAM has the capacity to transfer the input smoothly and
continuously into the respective output in the range [0, 1].
• It uses the logistic sigmoid function as the activation function for all the
inputs.
• For binary input and target vectors s(p) : t(p), the weights are determined by
      W_ij = Σ_p (2 s_i(p) - 1)(2 t_j(p) - 1)
• The logistic sigmoid activation function is given by
      f(y_inj) = 1 / (1 + e^(-y_inj))
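The logistic sigmoid can be written directly from the formula above:

```python
import math

def logistic(y_in):
    """Logistic sigmoid f(y_in) = 1 / (1 + e^(-y_in)), mapping any real net input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-y_in))
```

For example, a net input of 0 gives an activation of exactly 0.5, and large positive or negative net inputs saturate toward 1 or 0.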
• If bias is included in calculating the net input, then
      y_inj = b_j + Σ_i x_i w_ij
      x_ini = b_i + Σ_j y_j w_ij
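The biased net inputs above, combined with the back-and-forth iteration described earlier, give the full discrete recall procedure. A minimal sketch, assuming zero biases, a bipolar step activation, and a weight matrix of the kind produced by the Hebb rule (all values illustrative):

```python
import numpy as np

# Illustrative weight matrix for n = 3 X-units and m = 2 Y-units.
W = np.array([[ 2, -2],
              [-2,  2],
              [ 2, -2]])
b_y = np.zeros(2)   # Y-layer biases (zero here for simplicity)
b_x = np.zeros(3)   # X-layer biases

act = lambda v: np.where(v >= 0, 1, -1)   # bipolar step activation

def bam_recall(x, max_iters=10):
    """Send signals back and forth until both layers stop changing."""
    y = act(b_y + x @ W)             # y_inj = b_j + sum_i x_i w_ij
    for _ in range(max_iters):
        x_new = act(b_x + y @ W.T)   # x_ini = b_i + sum_j y_j w_ij  (Y -> X uses W^T)
        y_new = act(b_y + x_new @ W)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                    # activations constant: network has settled
        x, y = x_new, y_new
    return x, y

x_out, y_out = bam_recall(np.array([1, -1, 1]))
```

Here the loop stops as soon as neither layer's activations change, which is the stability condition stated at the start of this section.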
• The memory (storage) capacity is min(m, n), where
      n = number of units in the X-layer
      m = number of units in the Y-layer