} so that, when presented a new pattern x, the system outputs the stored memory that is most similar to x

Such a system is called content-addressable memory
Part VII 2
Architecture
The Hopfield net consists of N McCulloch-Pitts neurons,
recurrently connected among themselves
Definition
Each neuron is defined as

x_i(t + 1) = sgn( Σ_{j=1..N} w_ij x_j(t) + b_i )

where

sgn(v) = +1 if v ≥ 0, −1 otherwise

Without loss of generality, let b_i = 0
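As a concrete illustration, the update rule above can be sketched in Python (a minimal sketch with made-up names; `W` is the weight matrix and states are ±1):

```python
def sgn(v):
    """Threshold activation: +1 if v >= 0, -1 otherwise (bias b_i = 0)."""
    return 1 if v >= 0 else -1

def update_neuron(x, W, i):
    """One asynchronous step: x_i(t+1) = sgn(sum_j W[i][j] * x_j(t))."""
    return sgn(sum(W[i][j] * x[j] for j in range(len(x))))
```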
Storage phase
To store M fundamental memories ξ_1, …, ξ_M, the Hopfield model uses the
outer-product rule, a form of Hebbian learning:

w_ij = (1/N) Σ_{μ=1..M} ξ_{μ,i} ξ_{μ,j}
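In code, the outer-product rule might look like the following sketch (the function name is my own; the diagonal is zeroed, a common convention for no self-connections):

```python
def store(memories):
    """Hebbian outer-product rule: W[i][j] = (1/N) * sum_mu xi_mu[i] * xi_mu[j].

    The diagonal is zeroed (no self-connections), a common convention.
    """
    N = len(memories[0])
    W = [[0.0] * N for _ in range(N)]
    for xi in memories:
        for i in range(N):
            for j in range(N):
                if i != j:
                    W[i][j] += xi[i] * xi[j] / N
    return W
```

Note that the resulting weights are symmetric by construction, which matters for the energy-function argument later.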
One memory case

With a single stored memory ξ, the rule gives w_ij = (1/N) ξ_i ξ_j. Hence

Σ_j w_ij ξ_j = (1/N) ξ_i Σ_j ξ_j² = ξ_i,  i.e.,  sgn( Σ_j w_ij ξ_j ) = ξ_i

Therefore the memory is stable
One memory case (cont.)
Actually, for any input pattern x, as long as the Hamming
distance between x and ξ is less than N/2, the net converges
to ξ. Otherwise it converges to −ξ
Think about it
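One way to see this numerically (a self-contained sketch; the pattern and names are made up):

```python
def sgn(v):
    return 1 if v >= 0 else -1

def recall(x, W, max_sweeps=20):
    """Asynchronous sweeps until no neuron changes (a fixed point)."""
    x = list(x)
    N = len(x)
    for _ in range(max_sweeps):
        changed = False
        for i in range(N):
            new = sgn(sum(W[i][j] * x[j] for j in range(N)))
            if new != x[i]:
                x[i], changed = new, True
        if not changed:
            break
    return x

# Store a single memory xi with the outer-product rule (zero diagonal).
xi = [1, -1, 1, 1, -1]
N = len(xi)
W = [[0.0 if i == j else xi[i] * xi[j] / N for j in range(N)] for i in range(N)]

near = [-1, -1, 1, 1, -1]     # Hamming distance 1 < N/2: recalls xi
far = [-v for v in xi]        # Hamming distance N > N/2: recalls -xi
```

Here `recall(near, W)` returns `xi`, while `recall(far, W)` stays at the reversed memory `-xi`. (N is odd, so ties at distance exactly N/2 cannot occur.)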
Multiple memories
The stability (alignment) condition for any memory ν is

sgn( h_{ν,i} ) = ξ_{ν,i}  for all i

where

h_{ν,i} = Σ_j w_ij ξ_{ν,j} = (1/N) Σ_j Σ_μ ξ_{μ,i} ξ_{μ,j} ξ_{ν,j}
        = ξ_{ν,i} + (1/N) Σ_j Σ_{μ≠ν} ξ_{μ,i} ξ_{μ,j} ξ_{ν,j}

The second term is the crosstalk term
Multiple memories (cont.)
If |crosstalk| < N, the memory is stable. In general, the
fewer the stored memories, the more likely they are all stable
Example 2 in book (see blackboard)
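The crosstalk term itself is easy to compute directly (a sketch; the function name and indexing convention are my own):

```python
def crosstalk(memories, nu, i):
    """Raw crosstalk for bit i of memory nu:
    C = sum_{mu != nu} sum_j xi_mu[i] * xi_mu[j] * xi_nu[j].
    Bit i of memory nu is guaranteed stable whenever |C| < N."""
    N = len(memories[0])
    return sum(memories[mu][i] * memories[mu][j] * memories[nu][j]
               for mu in range(len(memories)) if mu != nu
               for j in range(N))
```

For an orthogonal pair such as `[[1, 1, 1, 1], [1, 1, -1, -1]]` the crosstalk vanishes for every bit, so both memories are stable.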
Memory capacity
Define

C_{ν,i} ≡ −ξ_{ν,i} · (1/N) Σ_j Σ_{μ≠ν} ξ_{μ,i} ξ_{μ,j} ξ_{ν,j}

Then:

C_{ν,i} < 0: stable (the crosstalk reinforces the memory)
0 ≤ C_{ν,i} < 1: stable
C_{ν,i} > 1: unstable (bit i of memory ν flips)

What if C_{ν,i} = 1?
Memory capacity (cont.)
Consider random memories where each element takes +1 or
−1 with equal probability. We measure

P_error = Prob( C_{ν,i} > 1 )

To compute the capacity M_max, decide on an error criterion.

For random patterns, C_{ν,i} is (1/N times) a sum of
N(M − 1) random numbers of +1 or −1. For large N(M − 1), it can
be approximated by a Gaussian distribution (central limit
theorem) with zero mean and variance

σ² = (M − 1)/N ≈ M/N
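A quick Monte Carlo check of that variance (an illustrative sketch: it samples the idealized sum of N(M−1) independent ±1 terms divided by N, with made-up values of N and M):

```python
import random

random.seed(1)
N, M = 100, 21
trials = 2000

def sample_c():
    """One sample of (1/N) * sum of N*(M-1) independent +-1 terms."""
    return sum(1 if random.getrandbits(1) else -1
               for _ in range(N * (M - 1))) / N

samples = [sample_c() for _ in range(trials)]
variance = sum(c * c for c in samples) / trials   # mean is 0 by symmetry
# variance comes out close to (M - 1) / N = 0.2, i.e. roughly M/N
```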
Memory capacity (cont.)
So

P_error = (1/(√(2π) σ)) ∫_1^∞ exp( −x²/(2σ²) ) dx

        = (1/2) [ 1 − erf( 1/(√2 σ) ) ]

        = (1/2) [ 1 − erf( √(N/(2M)) ) ]

where erf is the error function, erf(u) = (2/√π) ∫_0^u exp(−t²) dt
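Since Python's math module provides erf, this expression is one line to evaluate (a sketch; the sample values agree with the capacity table given below):

```python
import math

def p_error(M, N):
    """P_error = 0.5 * (1 - erf(sqrt(N / (2 * M)))), using sigma^2 = M/N."""
    return 0.5 * (1.0 - math.erf(math.sqrt(N / (2.0 * M))))

# For example, M/N = 0.185 gives P_error close to 0.01,
# and M/N = 0.105 gives P_error close to 0.001.
```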
Memory capacity (cont.)

[Figure: plot of the error probability P_error]
Memory capacity (cont.)
P_error    M_max/N
0.001      0.105
0.0036     0.138
0.01       0.185
0.05       0.37
0.1        0.61

So for P_error < 0.01, M_max = 0.185N, an upper bound
Memory capacity (cont.)

What if 1% of the bits flip? An avalanche effect: retrieval errors
can trigger further errors. The real upper bound is M_max = 0.138N
to prevent the avalanche effect
Memory capacity (cont.)
The above analysis is for one bit. If we want perfect retrieval
of the entire pattern with probability > 0.99, approximately we need

P_error < 0.01/N

For this case,

M_max = N / (2 log N)
Memory capacity (cont.)
But real patterns are not random (they could be encoded), and
the capacity is worse for correlated patterns

At one favorable extreme, if the memories are mutually orthogonal,

ξ_μ · ξ_ν = 0 for μ ≠ ν,

then the crosstalk term is 0 and M_max = N.
This is the maximum number of orthogonal patterns in N dimensions
Memory capacity (cont.)
But in reality a useful system stores somewhat fewer than N memories;
with M = N orthogonal memories the weight matrix (1/N) Σ_μ ξ_μ ξ_μ^T
becomes the identity, so every state is a fixed point and the memory
is useless as it does not evolve
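The degenerate M = N case can be checked directly: with N mutually orthogonal ±1 patterns (here the rows of a 4×4 Hadamard matrix, keeping the diagonal), the outer-product weight matrix collapses to the identity. A sketch:

```python
# Four mutually orthogonal +-1 patterns: rows of a 4x4 Hadamard matrix.
H = [[1,  1,  1,  1],
     [1, -1,  1, -1],
     [1,  1, -1, -1],
     [1, -1, -1,  1]]
N = 4

# Outer-product rule with the diagonal kept: W = (1/N) * sum_mu xi xi^T.
W = [[sum(xi[i] * xi[j] for xi in H) / N for j in range(N)] for i in range(N)]
# W is the identity matrix, so sgn(W x) = x for every state x:
# the net "stores" N patterns but can no longer evolve to correct errors.
```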
Coding illustration

[Figure: coding illustration]
Energy function (Lyapunov function)
The existence of an energy (Lyapunov) function for a
dynamical system ensures its stability

The energy function for the Hopfield net is

E(x) = −(1/2) Σ_i Σ_j w_ij x_i x_j

Theorem: Given symmetric weights, w_ij = w_ji, the energy
function does not increase as the Hopfield net evolves
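The energy function is straightforward to evaluate. As a sketch (single stored memory, made-up pattern and names), the stored memory sits strictly lower on the energy surface than a one-bit-corrupted version of it:

```python
def energy(x, W):
    """E(x) = -(1/2) * sum_i sum_j W[i][j] * x_i * x_j."""
    N = len(x)
    return -0.5 * sum(W[i][j] * x[i] * x[j]
                      for i in range(N) for j in range(N))

xi = [1, -1, 1, 1, -1]
N = len(xi)
W = [[0.0 if i == j else xi[i] * xi[j] / N for j in range(N)] for i in range(N)]

probe = [-1, -1, 1, 1, -1]   # xi with one bit flipped
# energy(xi, W) is about -(N-1)/2 = -2, below energy(probe, W) at about -0.4
```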
Energy function (cont.)
Let x'_k = −x_k be the state of neuron k after an update, with all
other neurons unchanged. Then

ΔE = E(x') − E(x)
   = −(1/2) Σ_j w_kj x'_k x_j − (1/2) Σ_i w_ik x_i x'_k
     + (1/2) Σ_j w_kj x_k x_j + (1/2) Σ_i w_ik x_i x_k
   = 2 x_k Σ_j w_kj x_j     (using w_ij = w_ji and x'_k = −x_k)
   < 0,

since x_k and Σ_j w_kj x_j have different signs (otherwise neuron k
would not have flipped)
Energy function (cont.)
Thus, E(x) decreases every time a neuron flips. Since E is
bounded below, the Hopfield net is always stable
Remarks:
Useful concepts from dynamical systems: attractors, basins of
attraction, energy (Lyapunov) surface or landscape
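The theorem is easy to check empirically: track E(x) along a trajectory of asynchronous updates and confirm it never increases (a self-contained sketch with made-up memories):

```python
import random

def sgn(v):
    return 1 if v >= 0 else -1

def energy(x, W):
    N = len(x)
    return -0.5 * sum(W[i][j] * x[i] * x[j]
                      for i in range(N) for j in range(N))

# Symmetric weights from two made-up memories (outer-product rule).
memories = [[1, -1, 1, -1, 1, 1], [1, 1, -1, -1, 1, -1]]
N = 6
W = [[0.0] * N for _ in range(N)]
for xi in memories:
    for i in range(N):
        for j in range(N):
            if i != j:
                W[i][j] += xi[i] * xi[j] / N

random.seed(0)
x = [random.choice([1, -1]) for _ in range(N)]
energies = [energy(x, W)]
for _ in range(3):                         # a few asynchronous sweeps
    for i in range(N):
        x[i] = sgn(sum(W[i][j] * x[j] for j in range(N)))
        energies.append(energy(x, W))
# The recorded energies form a non-increasing sequence.
```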
[Figure: energy contour map]

[Figure: 2-D energy landscape]

[Figure: memory recall illustration]
Remarks (cont.)
Bipolar neurons can be extended to continuous-valued
neurons by using a hyperbolic tangent activation function, and
the discrete update can be extended to continuous-time
dynamics (good for analog VLSI implementation)
The concept of energy minimization has been applied to
optimization problems (neural optimization)
Spurious states
Not all local minima (stable states) correspond to
fundamental memories. Typically, the reversed memories −ξ_μ,
linear combinations of an odd number of memories, or other
uncorrelated patterns can also be attractors
Such attractors are called spurious states
Spurious states (cont.)
Spurious states tend to have smaller basins of attraction and
to occur higher on the energy surface than the fundamental memories

[Figure: energy landscape with shallow local minima for spurious states]
Kinds of associative memory
Autoassociative (e.g., the Hopfield net)

Heteroassociative: store pairs <x_μ, y_μ> explicitly;
examples include matrix memory and holographic memory
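As a sketch of the heteroassociative (matrix memory) idea, with made-up pairs: store W = (1/N) Σ_μ y_μ x_μ^T and recall with y = sgn(W x). When the x patterns are orthogonal, recall is exact:

```python
def sgn(v):
    return 1 if v >= 0 else -1

# Two stored pairs <x_mu, y_mu>; the x patterns are orthogonal.
xs = [[1, 1, 1, 1], [1, -1, 1, -1]]
ys = [[1, -1], [-1, -1]]
N = len(xs[0])

# Matrix memory: W[k][i] = (1/N) * sum_mu y_mu[k] * x_mu[i]
W = [[sum(y[k] * x[i] for x, y in zip(xs, ys)) / N for i in range(N)]
     for k in range(len(ys[0]))]

def recall(x):
    """Heteroassociative recall: y_k = sgn(sum_i W[k][i] * x_i)."""
    return [sgn(sum(W[k][i] * x[i] for i in range(N))) for k in range(len(W))]
```

Presenting either stored key `x_mu` recovers its paired `y_mu` exactly, because the crosstalk between orthogonal keys vanishes.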