Associative Memory : Soft Computing Course Lecture 21 - 24, notes, slides

Associative Memory
Soft Computing

Associative Memory (AM), topics : description, content-addressability,
working of AM, auto and hetero associative memory, encoding or
memorization, retrieval or recollection, errors and noise, performance
measures - memory capacity and content-addressability. Associative
memory models : network architectures - linear associator, Hopfield
model and bidirectional associative memory (BAM).
www.csetube.in
Associative Memory
Soft Computing

Topics
(Lectures 21, 22, 23, 24 - 4 hours)                                Slides

1. Associative memory description : working, auto and hetero
   associative memory, performance measures - memory capacity
   and content-addressability.                                     03-12

2. Associative memory models : network architectures - linear
   associator, Hopfield model and bidirectional associative
   memory; encoding and retrieval of stored patterns.              13-20

3. Auto-associative memory (auto-correlators) : how to store
   patterns, how to retrieve stored and noisy patterns.            21-24

4. Bidirectional hetero-associative memory : BAM operations,
   retrieval, energy function, multiple training encoding
   strategy.                                                       25-41

5. References                                                      42

02
Associative Memory

What is Associative Memory ?

An associative memory is a content-addressable structure that maps a
set of input patterns to a set of output patterns.

There are two classes of associative memory : auto-associative and
hetero-associative.

An auto-associative memory retrieves a previously stored pattern that
most closely resembles the current pattern.

In a hetero-associative memory, the retrieved pattern is, in general,
different from the input pattern not only in content but possibly also
in type and format.

Neural networks are used to implement these associative memory models.

03
SC - AM description

1. Associative Memory

An associative memory is a content-addressable structure that maps a
set of input patterns to a set of output patterns.

A content-addressable structure is a type of memory organization in
which the memory is accessed by its content, as opposed to an explicit
address as in a traditional computer memory. The retrieved pattern may
differ from the input pattern not only in content but possibly also in
type and format.

04
SC - AM description

Example : content-addressable memory.

Stored names : Albert Einstein, Charles Darwin, Blaise Pascal,
Marco Polo, Neil Armstrong, Sigmund Freud, Thomas Edison,
Christopher Columbus.

Fig. Presented with the corrupted input pattern "Crhistpher Columbos",
the content-addressable memory recalls the stored name
"Christopher Columbus".

05
SC - AM description

[Continued from previous slide]

The patterns are stored either in an auto-associative or in a
hetero-associative memory.

Auto-associative memory : here y[1], y[2], y[3], . . . , y[M] are the M
stored pattern vectors, and each pattern is associated with itself.

Hetero-associative memory : here the memory stores associated pattern
pairs {c(1), y(1)}, {c(2), y(2)}, . . . , {c(M), y(M)}, where the input
and output vectors may differ in content, type and format.

The linear associator is the simplest artificial neural associative
memory. The Hopfield model and the Bidirectional Associative Memory
(BAM) model follow different neural network architectures to memorize
information.

06
SC - AM description

Example

An associative memory is a storehouse of associated patterns which
are encoded in some form.

When the storehouse is triggered or excited with a pattern, then the
associated pattern pair is recalled or output as a stored pattern.

The figure below illustrates the working of an associative memory.
The associated pattern pairs are ( , ), ( , +), (7 , 4).

Fig. Working of an associative memory : Input Pattern, Recalled
Pattern; the association is represented by an arrow symbol.

07
SC - AM description
An
auto-associative correlator,
that
most
closely
ub
e.
in
Examples
Input
pattern
presented
Presented
distorted
pattern
w.
cs
et
Recall of
associated
pattern
ww
Hetero-associative memory
Recall of
perfect
pattern
Auto-associative memory
www.csetube.in
www.csetube.in
SC - AM description

2. Working of Associative Memory

An associative memory is characterized by its memorization (encoding),
retrieval (decoding), tolerance of errors and noise, and its
performance measures - memory capacity and content-addressability.

Encoding or memorization

Building an associative memory means constructing a connection weight
matrix W such that, when an input pattern is presented, the stored
pattern associated with the input pattern is retrieved.

The weight matrix Wk for an associated pattern pair (Xk , Yk) is
computed as

    (wij)k = (xi)k (yj)k   for i = 1, 2, . . . , m  and  j = 1, 2, . . . , n

where (xi)k is the i-th component of pattern Xk and (yj)k is the j-th
component of pattern Yk.

Construction of the connection weight matrix W is accomplished by
summing the individual correlation matrices :

            p
    W  =  α Σ   Wk
           k=1

where α is a proportionality or normalizing constant.

09
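The outer-product encoding above can be sketched in Python. This is a
minimal illustration, not part of the original notes; the single bipolar
pattern pair used below is hypothetical and α is taken as 1.

```python
def encode(pairs, alpha=1.0):
    """Connection weight matrix W = alpha * sum_k Wk, where each
    Wk is the outer product (wij)k = (xi)k * (yj)k of a pattern pair."""
    m, n = len(pairs[0][0]), len(pairs[0][1])
    W = [[0.0] * n for _ in range(m)]
    for X, Y in pairs:
        for i in range(m):
            for j in range(n):
                W[i][j] += alpha * X[i] * Y[j]
    return W

# One hypothetical bipolar pattern pair (X, Y):
W = encode([((1, -1, 1), (-1, 1))])
# Each row i of W equals xi * Y, i.e. the outer product of the pair.
```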
SC - AM description

Retrieval or recollection

After memorization, the process of retrieving a stored pattern, given
an input pattern, is called decoding.

Given an input pattern X, the decoding or recollection is accomplished
by first computing the net input to the output units

                m
    input j  =  Σ   xi wij      for j = 1, 2, . . . , n
               i=1

where input j is the weighted sum of the inputs to output unit j, and
then determining the units output using a bipolar output function :

    Yj  =   +1    if input j ≥ θj
            -1    otherwise

where θj is the threshold value of output neuron j.

10
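The decoding rule can likewise be sketched. This is a minimal
illustration, not from the original notes; the stored weight matrix is
the hypothetical one from the encoding sketch and all thresholds θj are
taken as 0.

```python
def decode(W, X, theta=None):
    """Recall: input_j = sum_i xi*wij; Yj = +1 if input_j >= theta_j else -1."""
    m, n = len(W), len(W[0])
    theta = theta if theta is not None else [0.0] * n
    out = []
    for j in range(n):
        s = sum(X[i] * W[i][j] for i in range(m))   # net input to unit j
        out.append(1 if s >= theta[j] else -1)
    return out

# W encodes the hypothetical pair X = (1, -1, 1) -> Y = (-1, 1):
W = [[-1, 1], [1, -1], [-1, 1]]
Y = decode(W, (1, -1, 1))   # presenting the stored input recalls Y
```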
SC - AM description
input
pattern
may
contain
corrupted
input
pattern
is
presented,
the
network
will
many
processing
are
elements
robust
and
performing
distributed computations.
ww
w.
cs
et
ub
e.
in
11
www.csetube.in
fault
tolerant
highly
because
parallel
and
www.csetube.in
SC - AM description

Performance Measures

The memory capacity and content-addressability are the performance
measures of an associative memory for correct retrieval. These two
measures are related to each other.

Memory capacity refers to the maximum number of associated pattern
pairs that can be stored and correctly retrieved.

Content-addressability is the ability of the network to retrieve the
correct stored pattern from a noisy or partial input.

12
SC - AM models

2. Associative Memory Models

An associative memory is implemented using neural networks.

The simplest associative memory model is the linear associator, which
is a feed-forward type of network. It has a very low memory capacity
and is therefore not much used.

The popular models are the Hopfield model and the Bi-directional
Associative Memory (BAM) model.

13
SC - AM models

The network architectures of these memory models and their working are
as follows.

The linear associator is a feed-forward network in which the output is
produced in a single feed-forward computation. It can be used as an
auto-associator as well as a hetero-associator, but it possesses a very
low memory capacity and is therefore not much used.

The Hopfield model is a single-layer recurrent network : the outputs
are fed back to the inputs and the computation is repeated until the
system becomes stable. Hopfield networks are ensembles of simple
processing units that show a fairly complex collective computational
behavior.

The BAM model is similar to the linear associator, but the connections
are bi-directional between the two layers, and the model can therefore
perform both auto-associative and hetero-associative recall of stored
information.

The network architectures of these three models are described in the
next few slides.

14
SC - AM models

2.1 Linear Associator Model

The linear associator model is a feed-forward type network used to
memorize associated pattern pairs. It consists of two layers of
processing units, one serving as the input layer while the other as
the output layer. The inputs are directly connected to the outputs via
a set of weighted links; the architecture is similar to that of a
single-layer feed-forward network.

15
SC - AM models

Fig. Linear associator model : the n input units x1 , x2 , . . . , Xn
connected to the m output units y1 , y2 , . . . , Ym through the
connection weights wij ( w11 , w12 , w21 , w22 , . . . , w1m , w2m ,
wn1 , wn2 , wnm ).

All n input units are connected to all m output units via the
connection weight matrix

    W = [ wij ]n x m

where wij denotes the weight of the synaptic connection from the i-th
input unit to the j-th output unit.

Building an associative memory is constructing the connection weight
matrix W such that, when an input pattern is presented, the stored
pattern associated with the input pattern is retrieved; the weight
matrix stores the different associated pattern pairs.

[Continued in next slide]

16
SC - AM models

[Continued from previous slide]

Encoding : The process of constructing the connection weight matrix is
called encoding. During encoding, the weight values of the correlation
matrix Wk for an associated pattern pair (Xk , Yk) are computed as :

    (wij)k = (xi)k (yj)k

where (xi)k is the i-th component of pattern Xk and (yj)k is the j-th
component of pattern Yk, for i = 1, 2, ..., m and j = 1, 2, ..., n.

Weight matrix : Construction of the connection weight matrix W is
accomplished by summing the individual correlation matrices :

            p
    W  =  α Σ   Wk
           k=1

where α is the constant of proportionality for normalizing, usually
set to 1/p to store p different associated pattern pairs.

Decoding : The process of retrieving a stored pattern, given an input
pattern X, is called decoding; it is accomplished by first computing
the net input

                m
    input j  =  Σ   xi wij
               i=1

where input j stands for the weighted sum of the input or activation
value of node j, for j = 1, 2, ..., n, and then determining the units
output using a bipolar output function :

    Yj  =   +1    if input j ≥ θj
            -1    otherwise

Note : The output units behave like linear threshold units that
compute a weighted sum of the input and produce -1 or +1 depending on
whether the weighted sum is below or above a certain threshold value.

Performance : The linear associator possesses a very low memory
capacity and is therefore not much used.

17
SC - AM models
pairs,
memory
are
means
stored
in
patterns
memory.
rather
Hopfield
than
model
associated
is
one-layer
W13
W23
W31
W21
connection
weights wij
W43
W42
W32
W12
W41
2
I1
I2
I3
V3
V2
outputs V
neurons
I4
inputs I
V4
e.
V1
W34
in
W14
alternate view
ub
et
w.
connection
cs
unit is connected to every other unit in the network but not to itself.
ww
ij
=wji
for i, j = 1, 2, . . . . . , m.
each
input j =
xi w i j + I j
i=1
for j = 1, 2, . . ., m.
X k Yk
where XK = YK
www.csetube.in
www.csetube.in
SC - AM models

[Continued from previous slide]

Weight Matrix : Construction of the weight matrix W is accomplished by
summing the individual correlation matrices,

            p
    W  =  α Σ   Wk
           k=1

where α is a normalizing constant.

Decoding : Retrieving a stored pattern in the Hopfield model is
accomplished by first computing the net input

                m
    input j  =  Σ   xi wij
               i=1

where input j stands for the weighted sum of the input or activation
value of node j, for j = 1, 2, ..., n, and xi is the i-th component of
the presented pattern, and then determining the output using the
bipolar output function

    Yj  =   +1    if input j ≥ θj
            -1    otherwise

The output units behave like linear threshold units that compute a
weighted sum of the input and produce -1 or +1 depending on whether
the weighted sum is below or above the threshold value θj.

19
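The repeat-until-stable recall of the Hopfield model can be sketched as
follows. This is a simplified synchronous sketch, not from the original
notes: the external inputs Ij are taken as zero, the threshold as 0, and
the single stored pattern is hypothetical.

```python
def hopfield_recall(W, x, theta=0.0, max_steps=100):
    """Repeatedly update x_j <- sign(sum_i x_i * w_ij) until the
    state stops changing, i.e. a stable state is reached."""
    x = list(x)
    for _ in range(max_steps):
        nxt = [1 if sum(x[i] * W[i][j] for i in range(len(x))) >= theta else -1
               for j in range(len(x))]
        if nxt == x:          # stable state reached
            return x
        x = nxt
    return x

# Store one hypothetical bipolar pattern A via the outer product A^T A:
A = (1, -1, 1, -1)
W = [[A[i] * A[j] for j in range(4)] for i in range(4)]
recalled = hopfield_recall(W, (1, -1, 1, 1))   # noisy version of A
```

Presenting a one-bit-corrupted version of A settles back to A, which is
the content-addressability behavior described earlier.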
SC - AM models

2.3 Bidirectional Associative Memory (BAM) Model

Kosko extended the Hopfield network by incorporating an additional
layer to perform recurrent auto-associations as well as
hetero-associations on the stored memories. The network structure of
the bidirectional associative memory model is similar to that of the
linear associator, but the connections are bidirectional, i.e.

    wij = wji ,   for i = 1, 2, . . . , n  and  j = 1, 2, . . . , m.

Fig. BAM model : the n input units x1 , x2 , . . . , Xn connected to
the m output units y1 , y2 , . . . , Ym through the bidirectional
connection weights wij ( w11 , w12 , w21 , w22 , . . . , w1m , w2m ,
wn1 , wn2 , wnm ).

A single associated pattern pair is stored by computing the
correlation matrix Wk = XkT Yk . Construction of the connection weight
matrix M, to store p different associated pattern pairs
simultaneously, is accomplished by summing the individual correlation
matrices :

            p
    M  =  α Σ   Wk
           k=1

where α is a proportionality or normalizing constant.

20
SC - AM auto correlator

3. Auto-associative Memory (Auto-correlators)

In the previous section, the structure of the Hopfield model was
explained. It is an auto-associative memory model, which means
patterns, rather than associated pattern pairs, are stored in memory.
In this section, the working of an auto-correlator is illustrated :
how patterns are stored, how a stored pattern is retrieved, and how a
noisy pattern is recognized.

21
SC - AM auto correlator

How to store patterns ? Example.

Consider the three bipolar patterns A1 , A2 , A3 to be stored as an
auto-correlator :

    A1 = ( -1,  1, -1,  1 )
    A2 = (  1,  1,  1, -1 )
    A3 = ( -1, -1, -1,  1 )

Note that the outer product of two vectors U and V is

                      U1                    U1V1  U1V2  U1V3
    U ⊗ V  =  UT V  =  U2   [ V1 V2 V3 ]  =  U2V1  U2V2  U2V3
                      U3                    U3V1  U3V2  U3V3
                      U4                    U4V1  U4V2  U4V3

Thus, the connection matrix T that stores the three bipolar patterns
A1 , A2 , A3 is

                 3
    T = [t i j] = Σ  AiT Ai
                i=1

        |  1 -1  1 -1 |   |  1  1  1 -1 |   |  1  1  1 -1 |   |  3  1  3 -3 |
      = | -1  1 -1  1 | + |  1  1  1 -1 | + |  1  1  1 -1 | = |  1  3  1 -1 |
        |  1 -1  1 -1 |   |  1  1  1 -1 |   |  1  1  1 -1 |   |  3  1  3 -3 |
        | -1  1 -1  1 |   | -1 -1 -1  1 |   | -1 -1 -1  1 |   | -3 -1 -3  3 |

22
SC - AM auto correlator

How to retrieve a stored pattern ? Example.

The previous slide shows the connection matrix T of the three bipolar
patterns A1 , A2 , A3 :

                 3           |  3  1  3 -3 |
    T = [t i j] = Σ  AiT Ai = |  1  3  1 -1 |
                i=1          |  3  1  3 -3 |
                             | -3 -1 -3  3 |

The recall equation, for a presented pattern (a1 , a2 , . . . , ap), is

    aj new = f ( α j , aj old )   where  α j = Σ i  ai t i j ,
                                         for j = 1 , 2 , . . . , p

and f is the two-parameter bipolar threshold function : f (α , β) = +1
if α > 0 ; -1 if α < 0 ; β (the previous value) if α = 0.

Present the stored pattern A2 , i.e. ai = ( 1, 1, 1, -1 ). Then, for
each column j :

    α 1  =  1x3 + 1x1 + 1x3 + (-1)x(-3)  =  10
    α 2  =  1x1 + 1x3 + 1x1 + (-1)x(-1)  =   6
    α 3  =  1x3 + 1x1 + 1x3 + (-1)x(-3)  =  10
    α 4  =  1x(-3) + 1x(-1) + 1x(-3) + (-1)x3  =  -10

    aj new = f ( α j , aj old )  for j = 1 , 2 , . . . , p :
    ( 10 , 1 ) → 1 ,  ( 6 , 1 ) → 1 ,  ( 10 , 1 ) → 1 ,  ( -10 , -1 ) → -1

Therefore ( a1 new , a2 new , a3 new , a4 new ) = ( 1, 1, 1, -1 ) = A2 ;
the stored pattern is recalled correctly. Similarly, presenting
A3 = ( -1, -1, -1, 1 ) recalls ( -1, -1, -1, 1 ) = A3.

23
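The storage and recall computations of this example can be reproduced
directly. The sketch below builds the connection matrix T from A1, A2,
A3 and applies the two-parameter threshold f described above; it is an
illustration written for these notes, not code from the original.

```python
def outer_sum(patterns):
    """Connection matrix T = sum_i Ai^T Ai (sum of outer products)."""
    p = len(patterns[0])
    return [[sum(A[i] * A[j] for A in patterns) for j in range(p)]
            for i in range(p)]

def recall(T, a):
    """aj_new = f(alpha_j, aj_old) with alpha_j = sum_i ai * tij;
    f(alpha, old) = +1 if alpha > 0, -1 if alpha < 0, old if alpha == 0."""
    p = len(a)
    out = []
    for j in range(p):
        alpha = sum(a[i] * T[i][j] for i in range(p))
        out.append(1 if alpha > 0 else -1 if alpha < 0 else a[j])
    return out

A1, A2, A3 = (-1, 1, -1, 1), (1, 1, 1, -1), (-1, -1, -1, 1)
T = outer_sum([A1, A2, A3])
# T equals [[3,1,3,-3],[1,3,1,-1],[3,1,3,-3],[-3,-1,-3,3]] as on the slide,
# and presenting any stored pattern recalls that same pattern.
```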
SC - AM auto correlator

How to recognize a noisy pattern ? Example.

Consider a vector A' = ( 1, 1, 1, 1 ), which is a distorted
presentation of one among the stored patterns.

To find the proximity of the noisy vector to the stored patterns, use
the Hamming distance (HD). The HD of a vector X = (x1 , x2 , . . . , xn)
from another vector Y = (y1 , y2 , . . . , yn) of the same dimension is

                 n
    HD (X , Y) = Σ  | xi - yi |
                i=1

The HDs of A' from each of the stored patterns A1 , A2 , A3 are :

    HD (A' , A1)  =  |1-(-1)| + |1-1| + |1-(-1)| + |1-1|  =  4
    HD (A' , A2)  =  2
    HD (A' , A3)  =  6

Therefore the noisy vector A' is closest to A2.

Computation of the recall equation, presenting A' = ( 1, 1, 1, 1 ),
yields for each column j :

    α 1  =  1x3 + 1x1 + 1x3 + 1x(-3)  =   4
    α 2  =  1x1 + 1x3 + 1x1 + 1x(-1)  =   4
    α 3  =  1x3 + 1x1 + 1x3 + 1x(-3)  =   4
    α 4  =  1x(-3) + 1x(-1) + 1x(-3) + 1x3  =  -4

    aj new = f ( α j , aj old )  for j = 1 , 2 , . . . , p :
    ( 4 , 1 ) → 1 ,  ( 4 , 1 ) → 1 ,  ( 4 , 1 ) → 1 ,  ( -4 , 1 ) → -1

Therefore ( a1 new , a2 new , a3 new , a4 new ) = ( 1, 1, 1, -1 ), which
is A2 ; the network recalls the stored pattern closest to the noisy
input.

24
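The Hamming distances and the recall of the noisy vector can be checked
numerically. This sketch (written for these notes) reuses the stored
patterns A1, A2, A3 and the connection matrix T of this example.

```python
def hamming_distance(x, y):
    """HD(X, Y) = sum |xi - yi| over the components, as defined above."""
    return sum(abs(a - b) for a, b in zip(x, y))

A1, A2, A3 = (-1, 1, -1, 1), (1, 1, 1, -1), (-1, -1, -1, 1)
A_noisy = (1, 1, 1, 1)

hds = [hamming_distance(A_noisy, A) for A in (A1, A2, A3)]   # [4, 2, 6]

# Recall through the auto-correlator matrix T = sum_i Ai^T Ai:
T = [[sum(A[i] * A[j] for A in (A1, A2, A3)) for j in range(4)]
     for i in range(4)]
alphas = [sum(A_noisy[i] * T[i][j] for i in range(4)) for j in range(4)]
recalled = [1 if a > 0 else -1 if a < 0 else A_noisy[j]
            for j, a in enumerate(alphas)]
# recalled equals A2, the stored pattern nearest (in HD) to the noisy input
```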
SC - Bidirectional hetero AM

4. Bidirectional Hetero-associative Memory

The Hopfield one-layer unidirectional auto-associators have been
discussed in the previous section. Kosko (1987) extended this network
to a two-layer bidirectional structure called Bidirectional
Associative Memory (BAM), which can achieve hetero-association.

Definition : If the associated pattern pairs (X , Y) are different,
and if the model recalls a pattern Y given a pattern X, or vice-versa,
then it is termed a hetero-associative memory.

25
SC - Bidirectional hetero AM
with
elements Ai
and
the
other layer
in bipolar mode,
ON = 1 and OFF =
ON = 1 and OFF
and
= -1
M0 =
i=1
[ Xi ] [ Yi ]
e.
in
ub
et
E= -M
ww
the operations
w.
to retrieve
cs
www.csetube.in
deletion of
www.csetube.in
SC - Bidirectional hetero AM

Retrieval : Given a pattern pair (α , β), the nearest stored pair
(Xi , Yi) is retrieved by starting with (α , β) as the initial
condition and computing the sequence

    β'  = Φ ( α M ) ,    α'  = Φ ( β' MT ) ,    β'' = Φ ( α' M ) ,  . . .

until equilibrium is reached. Here Φ is the threshold function

    Φ (F) = G = ( g1 , g2 , . . . , gr ) ,   F = ( f1 , f2 , . . . , fr )

    gi  =   1                          if  fi > 0
            0 (binary) , -1 (bipolar)  if  fi < 0
            previous g i               if  fi = 0

Kosko has proved that this process will converge for any correlation
matrix M.

27
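The bidirectional retrieval procedure above can be sketched as follows.
This is a minimal illustration written for these notes: a single
hypothetical bipolar pair (X, Y) is stored in M, and the iteration runs
until the bidirectional stable state is reached.

```python
def phi(F, prev):
    """Threshold: gi = 1 if fi > 0, -1 (bipolar) if fi < 0, previous gi if fi == 0."""
    return [1 if f > 0 else -1 if f < 0 else p for f, p in zip(F, prev)]

def matvec(v, M):      # v M
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

def matvec_T(v, M):    # v M^T
    return [sum(v[j] * M[i][j] for j in range(len(v))) for i in range(len(M))]

def bam_recall(M, alpha, beta, max_steps=100):
    """beta' = phi(alpha M), alpha' = phi(beta' M^T), ... until equilibrium."""
    alpha, beta = list(alpha), list(beta)
    for _ in range(max_steps):
        beta_new = phi(matvec(alpha, M), beta)
        alpha_new = phi(matvec_T(beta_new, M), alpha)
        if alpha_new == alpha and beta_new == beta:
            return alpha, beta
        alpha, beta = alpha_new, beta_new
    return alpha, beta

# Store one hypothetical pair via M = X^T Y, then retrieve from X:
X, Y = (1, -1, 1), (-1, 1)
M = [[X[i] * Y[j] for j in range(2)] for i in range(3)]
af, bf = bam_recall(M, X, [1, 1])   # start with X and an arbitrary beta
```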
SC - Bidirectional hetero AM

Addition and deletion of pattern pairs

Given a set of pattern pairs (X1 , Y1) , . . . , (Xn , Yn) and its
correlation matrix M, a new pair (X' , Y') can be added, or an
existing pair (Xj , Yj) can be deleted from the memory model.

Addition : to add a new pair (X' , Y') to the existing correlation
matrix M, the new correlation matrix Mnew is given by

    Mnew = X1T Y1 + X2T Y2 + . . . . + XnT Yn + X'T Y'

Deletion : to delete an existing pair (Xj , Yj) from the correlation
matrix M, the new correlation matrix Mnew is given by

    Mnew = M - ( XjT Yj )

Note : The addition and deletion of information is similar to the
functioning of the system as a human memory, exhibiting learning and
forgetfulness.

28
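Addition and deletion reduce to matrix addition and subtraction of
outer products, and deleting a pair that was previously added restores
the original matrix exactly. A minimal sketch with hypothetical pairs:

```python
def outer(X, Y):
    """Correlation matrix X^T Y of one pattern pair."""
    return [[x * y for y in Y] for x in X]

def mat_add(A, B, sign=1):
    """Elementwise A + sign*B (sign = -1 deletes a stored pair)."""
    return [[a + sign * b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# Hypothetical stored pair and a new pair to add:
X1, Y1 = (1, -1), (-1, 1)
Xp, Yp = (-1, 1), (1, 1)

M = outer(X1, Y1)
M_added = mat_add(M, outer(Xp, Yp))                    # Mnew = M + X'^T Y'
M_deleted = mat_add(M_added, outer(Xp, Yp), sign=-1)   # Mnew = M - X'^T Y'
# deleting the added pair restores the original correlation matrix
```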
SC - Bidirectional hetero AM

Energy function for BAM

Adding new patterns to the memory must not destroy the previously
stored patterns. The stability of a BAM can be proved by identifying
an energy function E with each state (A , B).

For an auto-associative memory, the energy function is

    E (A)  =  - A M AT

For a bidirectional hetero-associative memory, the energy function is

    E (A , B)  =  - A M BT

and for the particular case A = B it corresponds to the Hopfield
auto-associative energy function.

We wish to retrieve the nearest (Ai , Bi) pair when a pair (α , β) is
presented as initial condition to the BAM. The neurons change their
states until a bidirectional stable state (Af , Bf) is reached. Kosko
has shown that such a stable state is reached for any matrix M and
that it corresponds to a local minimum of the energy function. Each
cycle of decoding lowers the energy E, where the energy of a point
(α , β) is given by E = - α M βT.

However, if the energy E = - Ai M BiT evaluated at the pair (Ai , Bi)
does not constitute a local minimum, then the pair cannot be recalled,
even though one starts with α = Ai. Thus Kosko's encoding method does
not ensure that the stored pairs are at a local minimum.

29
SC - Bidirectional hetero AM

Example 1 : Retrieval of an associated pair.

The working of Kosko's BAM for retrieval of an associated pair is
illustrated below. Consider N = 3 associated bipolar pattern pairs
(X1 , Y1) , (X2 , Y2) , (X3 , Y3), the Xi of dimension 6 and the Yi of
dimension 5, among them

    X3 = ( -1  -1   1  -1   1   1 )      Y3 = ( -1   1   1   1  -1 )

and let M = X1T Y1 + X2T Y2 + X3T Y3 be their 6 x 5 correlation matrix.

Suppose we start with α = X3 and hope to retrieve the associated pair
Y3. The calculations for the retrieval of Y3 yield :

    α M        =  ( -1 -1 1 -1 1 1 ) M  =  ( -6  6  6  6 -6 )
    Φ (α M)    =  β'   =  ( -1  1  1  1 -1 )
    β' MT      =  ( -5 -5  5 -3  7  5 )
    Φ (β' MT)  =  ( -1 -1  1 -1  1  1 )  =  α'
    α' M       =  ( -1 -1 1 -1 1 1 ) M  =  ( -6  6  6  6 -6 )
    Φ (α' M)   =  β''  =  ( -1  1  1  1 -1 )  =  β'

Since β'' = β', the process has stabilized; hence (αf , βf) =
(X3 , Y3), i.e. the associated pattern pair (X3 , Y3) is correctly
recalled.

30
SC - Bidirectional hetero AM

Example 2 : Kosko's BAM for retrieval of associated pairs.

Consider the three binary pattern pairs :

    A1 = ( 1 0 0 1 1 1 0 0 0 )      B1 = ( 1 1 1 0 0 0 0 1 0 )
    A2 = ( 0 1 1 1 0 0 1 1 1 )      B2 = ( 1 0 0 0 0 0 0 0 1 )
    A3 = ( 1 0 1 0 1 1 0 1 1 )      B3 = ( 0 1 0 1 0 0 1 0 1 )

Convert these to bipolar patterns :

    X1 = (  1 -1 -1  1  1  1 -1 -1 -1 )   Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )
    X2 = ( -1  1  1  1 -1 -1  1  1  1 )   Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )
    X3 = (  1 -1  1 -1  1  1 -1  1  1 )   Y3 = ( -1  1 -1  1 -1 -1  1 -1  1 )

The 9 x 9 correlation matrix M is calculated as

    M  =  X1T Y1 + X2T Y2 + X3T Y3

          | -1  3  1  1 -1 -1  1  1 -1 |
          |  1 -3 -1 -1  1  1 -1 -1  1 |
          | -1 -1 -3  1 -1 -1  1 -3  3 |
          |  3 -1  1 -3 -1 -1 -3  1 -1 |
       =  | -1  3  1  1 -1 -1  1  1 -1 |
          | -1  3  1  1 -1 -1  1  1 -1 |
          |  1 -3 -1 -1  1  1 -1 -1  1 |
          | -1 -1 -3  1 -1 -1  1 -3  3 |
          | -1 -1 -3  1 -1 -1  1 -3  3 |

31
SC - Bidirectional hetero AM

[Continued from previous slide]

Retrieval : Start with α = X2, hoping to retrieve the associated pair
(X2 , Y2), where

    X2 = ( -1  1  1  1 -1 -1  1  1  1 )
    Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )

The calculations yield :

    α M        =  (   5 -19 -13  -5   1   1  -5 -13  13 )
    Φ (α M)    =  β'   =  (  1 -1 -1 -1  1  1 -1 -1  1 )
    β' MT      =  ( -11  11   5   5 -11 -11  11   5   5 )
    Φ (β' MT)  =  ( -1  1  1  1 -1 -1  1  1  1 )  =  α'  =  X2
    α' M       =  (   5 -19 -13  -5   1   1  -5 -13  13 )
    Φ (α' M)   =  β''  =  (  1 -1 -1 -1  1  1 -1 -1  1 )  =  β'

Here F = β' = ( 1 -1 -1 -1 1 1 -1 -1 1 ).
But β' is not Y2. Thus the pattern pair (X2 , Y2) is not recalled
correctly by Kosko's decoding process.

32
SC - Bidirectional hetero AM

[Continued from previous slide]

Check with the energy function : Compute the energy at the point
(X2 , Y2)

    E2  =  - X2 M Y2T  =  - 71

and the energy at the point (X2 , F)

    EF  =  - X2 M FT   =  - 75

which is lower than E2, confirming the hypothesis that (X2 , Y2) is
not at a local minimum of the energy surface and so cannot be recalled
correctly.

Note : A pattern pair (Xi , Yi) can be recalled if and only if this
pair is at a local minimum of the energy surface.

33
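The energy comparison on this slide can be verified numerically. The
sketch below rebuilds the correlation matrix M of Example 2 from the
bipolar pattern pairs and evaluates E = - A M BT at both points.

```python
X1 = (1, -1, -1, 1, 1, 1, -1, -1, -1)
X2 = (-1, 1, 1, 1, -1, -1, 1, 1, 1)
X3 = (1, -1, 1, -1, 1, 1, -1, 1, 1)
Y1 = (1, 1, 1, -1, -1, -1, -1, 1, -1)
Y2 = (1, -1, -1, -1, -1, -1, -1, -1, 1)
Y3 = (-1, 1, -1, 1, -1, -1, 1, -1, 1)

# M = X1^T Y1 + X2^T Y2 + X3^T Y3
M = [[sum(X[i] * Y[j] for X, Y in ((X1, Y1), (X2, Y2), (X3, Y3)))
      for j in range(9)] for i in range(9)]

def energy(A, B):
    """E(A, B) = - A M B^T"""
    return -sum(A[i] * M[i][j] * B[j] for i in range(9) for j in range(9))

F = (1, -1, -1, -1, 1, 1, -1, -1, 1)   # the pattern actually recalled from X2
E2 = energy(X2, Y2)   # energy at (X2, Y2)
EF = energy(X2, F)    # energy at (X2, F), lower than E2
```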
SC - Bidirectional hetero AM
Kosko extended
the unidirectional
Xi
retrieve
the
nearest
pair
equations. However, Kosko's encoding method does not ensure that the stored
pairs are at local minimum and hence, results in incorrect recall.
which
pair weight
for
T
Xi
Yi
as
Yi
Xi
where qi
is viewed
It denotes the
minimum number of times for using a pattern pair (Xi , Yi) for training to
e.
let us
ub
in
Xi
Yi
cs
P = (q 1)
et
w.
ww
the existing correlation matrix. As a result the energy E' can reduced to
an arbitrarily low value by a suitable choice of q. This also ensures that
the energy at (Ai , Bi) does not exceed at points which are one Hamming
distance away from this location.
The new value of the energy function E evaluated at (Ai , Bi) then becomes
T
The
next
few
slides
(q 1) Ai Xi
explains
Yi Bi
Multiple training encoding strategy for the recall of three pattern pairs
(X1 , Y1 ) , (X1 , Y1 ) , (X1 , Y1 ) using one and same augmentation matrix
M . Also an algorithm to summarize the complete
www.csetube.in
process of multiple
www.csetube.in
SC - Bidirectional hetero AM

Example : Multiple training encoding strategy.

The pattern pairs are the same as in Example 2 :

    A1 = ( 1 0 0 1 1 1 0 0 0 )      B1 = ( 1 1 1 0 0 0 0 1 0 )
    A2 = ( 0 1 1 1 0 0 1 1 1 )      B2 = ( 1 0 0 0 0 0 0 0 1 )
    A3 = ( 1 0 1 0 1 1 0 1 1 )      B3 = ( 0 1 0 1 0 0 1 0 1 )

    X1 = (  1 -1 -1  1  1  1 -1 -1 -1 )   Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )
    X2 = ( -1  1  1  1 -1 -1  1  1  1 )   Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )
    X3 = (  1 -1  1 -1  1  1 -1  1  1 )   Y3 = ( -1  1 -1  1 -1 -1  1 -1  1 )

To correctly recall the pair (X2 , Y2), this pair is trained one extra
time; the augmented correlation matrix M then becomes

    M  =  X1T Y1 + 2 X2T Y2 + X3T Y3

          | -2  4  2  2  0  0  2  2 -2 |
          |  2 -4 -2 -2  0  0 -2 -2  2 |
          |  0 -2 -4  0 -2 -2  0 -4  4 |
          |  4 -2  0 -4 -2 -2 -4  0  0 |
       =  | -2  4  2  2  0  0  2  2 -2 |
          | -2  4  2  2  0  0  2  2 -2 |
          |  2 -4 -2 -2  0  0 -2 -2  2 |
          |  0 -2 -4  0 -2 -2  0 -4  4 |
          |  0 -2 -4  0 -2 -2  0 -4  4 |

35
SC - Bidirectional hetero AM

[Continued from previous slide]

Retrieval : Start with α = X2. The calculations for the recall of
(X2 , Y2) with the augmented matrix M yield :

    α M        =  (  14 -28 -22 -14  -8  -8 -14 -22  22 )
    Φ (α M)    =  β'   =  (  1 -1 -1 -1 -1 -1 -1 -1  1 )
    β' MT      =  ( -16  16  18  18 -16 -16  16  18  18 )
    Φ (β' MT)  =  ( -1  1  1  1 -1 -1  1  1  1 )  =  α'  =  X2
    α' M       =  (  14 -28 -22 -14  -8  -8 -14 -22  22 )
    Φ (α' M)   =  β''  =  (  1 -1 -1 -1 -1 -1 -1 -1  1 )  =  β'

Here F = β' = Y2. Thus the pattern pair (X2 , Y2) is correctly
recalled, using the augmented correlation matrix M.

36
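The recall of (X2 , Y2) under the augmented matrix can be verified
numerically. The sketch below (written for these notes) builds the
weighted correlation matrix with pair weights q = (1, 2, 1) and applies
the first half-cycle β' = Φ(α M).

```python
def phi(F, prev):
    """Threshold: 1 if fi > 0, -1 if fi < 0, previous value if fi == 0."""
    return [1 if f > 0 else -1 if f < 0 else p for f, p in zip(F, prev)]

X1 = (1, -1, -1, 1, 1, 1, -1, -1, -1); Y1 = (1, 1, 1, -1, -1, -1, -1, 1, -1)
X2 = (-1, 1, 1, 1, -1, -1, 1, 1, 1);   Y2 = (1, -1, -1, -1, -1, -1, -1, -1, 1)
X3 = (1, -1, 1, -1, 1, 1, -1, 1, 1);   Y3 = (-1, 1, -1, 1, -1, -1, 1, -1, 1)

# Augmented matrix M = X1^T Y1 + 2 X2^T Y2 + X3^T Y3:
q = (1, 2, 1)
pairs = ((X1, Y1), (X2, Y2), (X3, Y3))
M = [[sum(w * X[i] * Y[j] for w, (X, Y) in zip(q, pairs))
      for j in range(9)] for i in range(9)]

aM = [sum(X2[i] * M[i][j] for i in range(9)) for j in range(9)]
beta = phi(aM, [0] * 9)   # the fi == 0 case does not arise here
# beta equals Y2: the doubly-trained pair is now recalled correctly
```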
SC - Bidirectional hetero AM

[Continued from previous slide]

Note : The previous slide showed that the pattern pair (X2 , Y2) is
correctly recalled using the augmented correlation matrix

    M  =  X1T Y1 + 2 X2T Y2 + X3T Y3

but the same matrix M cannot correctly recall the other pattern pair
(X1 , Y1), as shown below :

    X1 = (  1 -1 -1  1  1  1 -1 -1 -1 )
    Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )

Start with α = X1. The calculations for the recall of (X1 , Y1) yield :

    α M        =  (  -6  24  22   6   4   4   6  22 -22 )
    Φ (α M)    =  β'   =  ( -1  1  1  1  1  1  1  1 -1 )
    β' MT      =  (  16 -16 -18 -18  16  16 -16 -18 -18 )
    Φ (β' MT)  =  (  1 -1 -1 -1  1  1 -1 -1 -1 )  =  α'
    α' M       =  ( -14  28  22  14   8   8  14  22 -22 )
    Φ (α' M)   =  β''  =  ( -1  1  1  1  1  1  1  1 -1 )  =  β'

Here F = β' is not Y1. Thus the pattern pair (X1 , Y1) is not
correctly recalled using the augmented correlation matrix M. The
correlation matrix needs to be further augmented.

37
SC - Bidirectional hetero AM

[Continued from previous slide]

The pair (X1 , Y1) cannot be recalled under the same augmentation
matrix M that is able to recall (X2 , Y2). However, this can be solved
by multiple training of (X1 , Y1), which gives the further-augmented
correlation matrix

    M  =  2 X1T Y1 + 2 X2T Y2 + X3T Y3

          | -1  5  3  1 -1 -1  1  3 -3 |
          |  1 -5 -3 -1  1  1 -1 -3  3 |
          | -1 -3 -5  1 -1 -1  1 -5  5 |
          |  5 -1  1 -5 -3 -3 -5  1 -1 |
       =  | -1  5  3  1 -1 -1  1  3 -3 |
          | -1  5  3  1 -1 -1  1  3 -3 |
          |  1 -5 -3 -1  1  1 -1 -3  3 |
          | -1 -3 -5  1 -1 -1  1 -5  5 |
          | -1 -3 -5  1 -1 -1  1 -5  5 |

Now observe in the next slide that all three pairs can be correctly
recalled.

38
SC - Bidirectional hetero AM

[Continued from previous slide]

Recall of (X1 , Y1) :

    X1 = (  1 -1 -1  1  1  1 -1 -1 -1 )   Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )

    α M        =  (   3  33  31  -3  -5  -5  -3  31 -31 )
    Φ (α M)    =  β'   =  (  1  1  1 -1 -1 -1 -1  1 -1 )
    β' MT      =  (  13 -13 -19  23  13  13 -13 -19 -19 )
    Φ (β' MT)  =  (  1 -1 -1  1  1  1 -1 -1 -1 )  =  α'  =  X1
    α' M       =  (   3  33  31  -3  -5  -5  -3  31 -31 )
    Φ (α' M)   =  β''  =  (  1  1  1 -1 -1 -1 -1  1 -1 )  =  β'  =  Y1

Recall of (X2 , Y2) :

    X2 = ( -1  1  1  1 -1 -1  1  1  1 )   Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )

    α M        =  (   7 -35 -29  -7  -1  -1  -7 -29  29 )
    Φ (α M)    =  β'   =  (  1 -1 -1 -1 -1 -1 -1 -1  1 )
    β' MT      =  ( -15  15  17  19 -15 -15  15  17  17 )
    Φ (β' MT)  =  ( -1  1  1  1 -1 -1  1  1  1 )  =  α'  =  X2
    α' M       =  (   7 -35 -29  -7  -1  -1  -7 -29  29 )
    Φ (α' M)   =  β''  =  (  1 -1 -1 -1 -1 -1 -1 -1  1 )  =  β'  =  Y2

Recall of (X3 , Y3) :

    X3 = (  1 -1  1 -1  1  1 -1  1  1 )   Y3 = ( -1  1 -1  1 -1 -1  1 -1  1 )

    α M        =  ( -13  17  -1  13  -5  -5  13  -1   1 )
    Φ (α M)    =  β'   =  ( -1  1 -1  1 -1 -1  1 -1  1 )
    β' MT      =  (   1  -1  17 -13   1   1  -1  17  17 )
    Φ (β' MT)  =  (  1 -1  1 -1  1  1 -1  1  1 )  =  α'  =  X3
    α' M       =  ( -13  17  -1  13  -5  -5  13  -1   1 )
    Φ (α' M)   =  β''  =  ( -1  1 -1  1 -1 -1  1 -1  1 )  =  β'  =  Y3

Thus all three pattern pairs are correctly recalled.

39
SC - Bidirectional hetero AM

[Continued from previous slide]

Thus, the multiple training encoding strategy ensures the correct
recall of a pair for a suitable augmentation of M. The generalization
of the correlation matrix, for the correct recall of all training
pairs, is written as

          N
    M  =  Σ  qi XiT Yi      where the qi are positive real numbers.
         i=1

This modified correlation matrix is called the generalized correlation
matrix.

40
SC - Bidirectional hetero AM

Algorithm (for the multiple training encoding strategy)

To summarize, the complete process of multiple training encoding is
given below as an algorithm.

Algorithm Mul_Tr_Encode ( N , X i , Y i , q i )   where

    N : number of stored pattern pairs
    X = ( X 1 , X 2 , . . . . , X N ) , Y = ( Y 1 , Y 2 , . . . . , Y N )
        are the bipolar pattern pairs, where
        X i = ( x i 1 , x i 2 , . . . , x i n ) and
        Y i = ( y i 1 , y i 2 , . . . , y i n )
    q : the weight vector ( q 1 , q 2 , . . . . , q N )

    Step 1   Initialize the correlation matrix M to the null matrix :
             M ← [0]

    Step 2   Compute the correlation matrix M as :
             For i ← 1 to N
                 M ← M ⊕ [ q i ∗ Transpose ( X i ) ⊗ ( Y i ) ]
             end
             (symbols : ⊕ matrix addition, ⊗ matrix multiplication,
              ∗ scalar multiplication)

    Step 3   Read the input bipolar pattern A

    Step 4   Compute A_M where A_M ← A ⊗ M

    Step 5   Apply the threshold function Φ to A_M to get B' ,
             ie B' ← Φ ( A_M )

    Step 6   Output B' , the pattern associated with A

end

41
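The algorithm above can be implemented directly. The sketch below
(written for these notes) applies Mul_Tr_Encode and the recall step to
the three pattern pairs of the example, with pair weights q = (2, 2, 1);
all three pairs are then recalled correctly.

```python
def phi(F, prev):
    """Threshold: 1 if fi > 0, -1 (bipolar) if fi < 0, previous gi if fi == 0."""
    return [1 if f > 0 else -1 if f < 0 else p for f, p in zip(F, prev)]

def mul_tr_encode(X, Y, q):
    """Steps 1-2: M <- [0]; then M <- M + qi * Transpose(Xi) (x) Yi per pair."""
    n, p = len(X[0]), len(Y[0])
    M = [[0] * p for _ in range(n)]
    for qi, Xi, Yi in zip(q, X, Y):
        for i in range(n):
            for j in range(p):
                M[i][j] += qi * Xi[i] * Yi[j]
    return M

def recall(M, A):
    """Steps 3-5: A_M <- A (x) M; B' <- phi(A_M)."""
    A_M = [sum(A[i] * M[i][j] for i in range(len(A))) for j in range(len(M[0]))]
    return phi(A_M, [0] * len(A_M))   # the fi == 0 case does not arise here

X = [(1, -1, -1, 1, 1, 1, -1, -1, -1),
     (-1, 1, 1, 1, -1, -1, 1, 1, 1),
     (1, -1, 1, -1, 1, 1, -1, 1, 1)]
Y = [(1, 1, 1, -1, -1, -1, -1, 1, -1),
     (1, -1, -1, -1, -1, -1, -1, -1, 1),
     (-1, 1, -1, 1, -1, -1, 1, -1, 1)]

M = mul_tr_encode(X, Y, q=(2, 2, 1))
results = [recall(M, Xi) for Xi in X]   # each Xi recalls its own Yi
```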