This document discusses autocorrelators and heterocorrelators, specifically Kosko's discrete bidirectional associative memory (BAM) model.
Autocorrelators store patterns as a connection matrix that is the sum of the outer products of the patterns. Their recall equation performs vector-matrix multiplication between the input and connection matrix.
Kosko's BAM extends unidirectional autocorrelators to be bidirectional. It stores paired patterns (A,B) and recalls by iteratively updating the A and B vectors until an equilibrium is reached. The correlation matrix is calculated from the training pairs, and an energy function characterizes stored patterns as minima of the energy landscape.
1. Autocorrelators
A seminar on Autocorrelators
Presented by: Ranjit R. Banshpal, M.Tech 1st Sem (CSE), Roll No. 18
Dept. of Computer Science & Engineering
G.H. Raisoni College of Engineering, Nagpur
2013-2014
2. Autocorrelators
Autocorrelators are better known under the title of Hopfield
Associative Memory (HAM).
A first-order autocorrelator obtains its connection matrix by
multiplying each element of a pattern with every other element of that same pattern.
A first-order autocorrelator stores m bipolar patterns
A1, A2, ..., Am by summing together m outer products:
T = Σ (i=1..m) Ai^T Ai
3. Here, T = [tij] is a (p x p) connection matrix, and each Ai is a bipolar vector in {-1, 1}^p.
The autocorrelator's recall equation is a vector-matrix
multiplication followed by a pointwise threshold.
The recall equation is given by
aj_new = f( Σ (i=1..p) ai tij , aj ), for j = 1, 2, ..., p
where Ai = (a1, a2, ..., ap) and the two-parameter
bipolar threshold function is
          1, if α > 0
f(α, β) = β, if α = 0
         -1, if α < 0
4. Working of an autocorrelator
Consider the following patterns:
A1 = (-1, 1, -1, 1)
A2 = (1, 1, 1, -1)
A3 = (-1, -1, -1, 1)
The connection matrix is
     [  3   1   3  -3 ]
T =  [  1   3   1  -1 ]
     [  3   1   3  -3 ]
     [ -3  -1  -3   3 ]
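As a check, the connection matrix above can be reproduced with a short sketch in plain Python (list-based for clarity; a minimal illustration, not a library implementation):

```python
# Build the first-order autocorrelator connection matrix
# T = sum over patterns of the outer product A^T A.
A1 = [-1, 1, -1, 1]
A2 = [1, 1, 1, -1]
A3 = [-1, -1, -1, 1]

def connection_matrix(patterns):
    p = len(patterns[0])
    T = [[0] * p for _ in range(p)]
    for A in patterns:
        for i in range(p):
            for j in range(p):
                T[i][j] += A[i] * A[j]
    return T

T = connection_matrix([A1, A2, A3])
for row in T:
    print(row)
# Matches the matrix on the slide, e.g. first row [3, 1, 3, -3]
```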
5. Recognition of stored patterns
The autocorrelator is presented the stored pattern
A2 = (1, 1, 1, -1).
With the help of the recall equation:
a1_new = f( 3 + 1 + 3 + 3, 1 ) = f( 10, 1 ) = 1
a2_new = f( 1 + 3 + 1 + 1, 1 ) = f( 6, 1 ) = 1
a3_new = f( 3 + 1 + 3 + 3, 1 ) = f( 10, 1 ) = 1
a4_new = f( -3 - 1 - 3 - 3, -1 ) = f( -10, -1 ) = -1
The recalled vector equals A2, so the stored pattern is a fixed point.
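The recall step can be sketched in plain Python using the connection matrix from the worked example (a minimal illustration of the recall equation and the two-parameter threshold):

```python
def f(alpha, beta):
    # Two-parameter bipolar threshold: beta breaks ties when alpha == 0.
    if alpha > 0:
        return 1
    if alpha < 0:
        return -1
    return beta

def recall(A, T):
    # One pass of the recall equation: threshold each column sum of A . T,
    # using the old component as the tie-breaking parameter.
    p = len(A)
    return [f(sum(A[i] * T[i][j] for i in range(p)), A[j]) for j in range(p)]

T = [[3, 1, 3, -3], [1, 3, 1, -1], [3, 1, 3, -3], [-3, -1, -3, 3]]
A2 = [1, 1, 1, -1]
print(recall(A2, T))  # [1, 1, 1, -1] -- the stored pattern is a fixed point
```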
6. Recognition of noisy patterns
Consider the vector A = (1, 1, 1, 1), which is a
distorted presentation of one of the stored patterns.
With the help of a Hamming distance measure, we
can find the stored pattern nearest to the noisy vector.
The Hamming distance (HD) of vector X from vector Y,
given X = (x1, x2, ..., xn) and Y = (y1, y2, ..., yn), is given
by
HD(X, Y) = Σ (i=1..n) | xi - yi |
Here HD(A, A1) = 4, HD(A, A2) = 2, and HD(A, A3) = 6, so the
noisy vector is associated with the nearest stored pattern, A2.
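A short sketch of the nearest-pattern search by Hamming distance, using the three stored patterns from the worked example:

```python
def hamming_distance(x, y):
    # For bipolar vectors, |xi - yi| is 0 on a match and 2 on a mismatch.
    return sum(abs(xi - yi) for xi, yi in zip(x, y))

stored = {"A1": [-1, 1, -1, 1], "A2": [1, 1, 1, -1], "A3": [-1, -1, -1, 1]}
A = [1, 1, 1, 1]  # distorted input

dists = {name: hamming_distance(A, v) for name, v in stored.items()}
nearest = min(dists, key=dists.get)
print(dists, nearest)  # A2 is the closest stored pattern
```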
7. Heterocorrelators: Kosko's discrete BAM
Bidirectional associative memory (BAM) is a two-layer
nonlinear neural network.
Kosko extended the unidirectional autocorrelator to a bidirectional
process.
Recall is robust to moderate noise in the input pattern.
There are N training pairs
{(A1, B1), (A2, B2), ..., (Ai, Bi), ..., (AN, BN)}, where
Ai = (ai1, ai2, ..., ain)
Bi = (bi1, bi2, ..., bip)
8. Here, aij or bij is either in the ON or OFF state.
In binary mode, ON = 1 and OFF = 0;
in bipolar mode, ON = 1 and OFF = -1.
The correlation matrix is computed from the bipolar forms Xi, Yi of the training pairs:
M = Σ (i=1..N) Xi^T Yi
Recall equations:
Starting with (α, β) as the initial condition, we determine the
finite sequence (α', β'), (α'', β''), ... until an equilibrium
point (αF, βF) is reached.
Here,
β' = φ(αM)
α' = φ(β'M^T)
9. φ(F) = G = (g1, g2, ..., gn), where F = (f1, f2, ..., fn) and
gi = 1,                           if fi > 0
gi = 0 (binary) or -1 (bipolar),  if fi < 0
gi = previous gi,                 if fi = 0
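The correlation matrix and the bidirectional recall loop can be sketched in plain Python. The two training pairs below are illustrative examples, not from the slides; the sketch shows a noisy A-side input settling to the stored pair:

```python
def phi(F, prev):
    # Bipolar threshold; ties (f == 0) keep the previous output component.
    return [1 if f > 0 else (-1 if f < 0 else g) for f, g in zip(F, prev)]

def matvec(v, M):
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

def transpose(M):
    return [list(col) for col in zip(*M)]

def bam_matrix(pairs):
    # M = sum over training pairs of the outer product X^T Y.
    n, p = len(pairs[0][0]), len(pairs[0][1])
    M = [[0] * p for _ in range(n)]
    for X, Y in pairs:
        for i in range(n):
            for j in range(p):
                M[i][j] += X[i] * Y[j]
    return M

pairs = [([1, -1, 1], [1, -1]), ([-1, 1, -1], [-1, 1])]  # illustrative pairs
M = bam_matrix(pairs)

alpha, beta = [1, 1, 1], [1, 1]  # noisy version of the first X pattern
for _ in range(5):               # iterate the recall equations to equilibrium
    beta = phi(matvec(alpha, M), beta)
    alpha = phi(matvec(beta, transpose(M)), alpha)
print(alpha, beta)  # settles to the stored pair ([1, -1, 1], [1, -1])
```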
10. Addition and Deletion of Pattern Pairs
Given a set of pattern pairs (Xi, Yi), for i = 1, 2, ..., n, encoded in
the correlation matrix M, a new pair (X, Y) can be added, or an
existing pair (Xj, Yj) can be deleted from the memory model.
In the case of addition: M_new = M + X^T Y
In the case of deletion: M_new = M - Xj^T Yj
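A minimal sketch of incremental addition and deletion of pairs, assuming the standard update M ± X^T Y; deleting a pair exactly undoes adding it:

```python
def outer(X, Y):
    # Outer product X^T Y as a nested list.
    return [[x * y for y in Y] for x in X]

def add_pair(M, X, Y):
    # Addition: M_new = M + X^T Y
    D = outer(X, Y)
    return [[M[i][j] + D[i][j] for j in range(len(M[0]))] for i in range(len(M))]

def delete_pair(M, X, Y):
    # Deletion: M_new = M - X^T Y
    D = outer(X, Y)
    return [[M[i][j] - D[i][j] for j in range(len(M[0]))] for i in range(len(M))]

M = outer([1, -1, 1], [1, -1])            # memory holding one pair
M2 = add_pair(M, [-1, 1, 1], [1, 1])      # add a second (illustrative) pair
M3 = delete_pair(M2, [-1, 1, 1], [1, 1])  # delete it again
print(M3 == M)  # True: deletion restores the original memory
```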
11. Energy function for BAM
The value of the energy function for a
particular stored pattern has to occupy a minimum
point in the energy landscape.
Adding new patterns must not destroy
previously stored patterns.
12. Hopfield proposed an energy function
E(A) = -AMA^T
Kosko proposed an energy function
E(A, B) = -AMB^T
The energy for any point (α, β) is given by
E = -αMβ^T
Evaluating E at the coordinates of a stored pair
(Ai, Bi) yields a local minimum of the energy landscape.
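The energy minimum at a stored pair can be checked numerically. This sketch reuses an illustrative correlation matrix for the pairs ([1,-1,1],[1,-1]) and ([-1,1,-1],[-1,1]) (not from the slides) and compares a stored point against a perturbed one:

```python
def energy(A, M, B):
    # Kosko's BAM energy: E(A, B) = -A M B^T
    return -sum(A[i] * M[i][j] * B[j]
                for i in range(len(A)) for j in range(len(B)))

M = [[2, -2], [-2, 2], [2, -2]]             # illustrative correlation matrix
stored = energy([1, -1, 1], M, [1, -1])     # energy at a stored pair
noisy = energy([1, 1, 1], M, [1, -1])       # energy at a perturbed A-side point
print(stored, noisy)  # -12 -4: the stored pair sits lower in the landscape
```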
13. Working of Kosko's BAM
Step 1:
Convert each training pair (Ai, Bi) to its bipolar form (Xi, Yi).
Step 2:
Calculate the matrix M as
M = Σ (i=1..N) Xi^T Yi
Step 3:
Retrieve the associated pair by iterating the recall equations
β' = φ(αM)
α' = φ(β'M^T)
until an equilibrium point is reached.
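Steps 1 and 2 above can be sketched for a single hypothetical binary pair (the pair itself is an assumption for illustration):

```python
def to_bipolar(v):
    # Step 1: map binary ON/OFF (1/0) to bipolar (1/-1).
    return [1 if b == 1 else -1 for b in v]

A, B = [1, 0, 1], [0, 1]          # hypothetical binary training pair
X, Y = to_bipolar(A), to_bipolar(B)

# Step 2: correlation matrix for this single pair, M = X^T Y.
M = [[x * y for y in Y] for x in X]
print(X, Y, M)  # [1, -1, 1] [-1, 1] [[-1, 1], [1, -1], [-1, 1]]
```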