Bidirectional associative memory


Bidirectional associative memory (BAM) is a class of artificial neural networks that can be regarded as generalized Hopfield networks. BAM belongs to the group of feedback (recurrent) neural networks.

Structure

A BAM network consists of an input layer of $n$ artificial neurons and an output layer of $m$ artificial neurons. The two layers are connected to each other in both directions, and the weights are symmetric. This leads to an $m \times n$ matrix $W$ for the weights directed from the input layer to the output layer; the weights from the output layer back to the input layer correspond to the transposed matrix $W^{\mathsf{T}}$.
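As a small illustration of this coupling, both directions can be represented by a single matrix and its transpose; the layer sizes below are arbitrary example values, not taken from the article:

```python
import numpy as np

# Illustration only: n and m are arbitrary example sizes.
n, m = 4, 3                  # number of neurons in the input and output layer
W = np.zeros((m, n))         # weights directed from the input layer to the output layer
W_back = W.T                 # weights from the output layer back to the input layer
                             # (the transposed matrix, i.e. the same parameters viewed backwards)
```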

Training phase

In the training phase, the network learns to associate an $n$-dimensional vector $x_i$ with an $m$-dimensional vector $y_i$. For this purpose, both vectors are applied to the input layer and the output layer, and the weight matrix for this pair can be calculated in a single learning step. The following applies:

$W_i = y_i \, x_i^{\mathsf{T}}$

Finally, all weight matrices are added to obtain the resulting weight matrix:

$W = \sum_i W_i = \sum_i y_i \, x_i^{\mathsf{T}}$
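This outer-product rule can be written in a few lines of NumPy; the patterns, their bipolar (+1/−1) coding and all variable names below are assumptions made for illustration rather than part of the article:

```python
import numpy as np

# Hypothetical training pairs, encoded as bipolar (+1/-1) vectors -- the coding
# is an assumption made here for illustration; the article does not fix it.
x_patterns = [np.array([ 1, -1,  1, -1]),   # n-dimensional inputs (n = 4)
              np.array([-1, -1,  1,  1])]
y_patterns = [np.array([ 1,  1, -1]),       # m-dimensional outputs (m = 3)
              np.array([-1,  1,  1])]

# One learning step per pair: W_i = y_i x_i^T (outer product), then sum all W_i.
W = sum(np.outer(y, x) for x, y in zip(x_patterns, y_patterns))
print(W.shape)   # (3, 4), i.e. an m x n matrix
```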

Pattern recall

During a recall, a noisy input vector $x$ is applied and the network is simply allowed to run, i.e. the neurons of the output layer calculate their new state and pass it back to the input layer. Then the process starts again from the beginning, until the steadily decreasing energy of the network has reached a local minimum. The associated output vector $y$ can then be extracted. The alternating updates can be written as

$y^{\text{new}} = f(W\,x)$

and

$x^{\text{new}} = f(W^{\mathsf{T}}\,y^{\text{new}}),$

where $f$ is the activation function of the neurons (typically the sign function for bipolar patterns).
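A minimal recall loop, assuming bipolar patterns and the sign function as activation (names such as `recall` and `max_steps` are illustrative, not from the article), might look like this:

```python
import numpy as np

def sgn(v, previous):
    """Sign activation; where the net input is exactly 0, the previous state is kept."""
    out = np.sign(v)
    out[out == 0] = previous[out == 0]
    return out

def recall(W, x_noisy, max_steps=100):
    """Alternate between the two layers until the state (and thus the energy) is stable."""
    x = np.asarray(x_noisy, dtype=float).copy()
    y = sgn(W @ x, np.ones(W.shape[0]))          # input layer -> output layer
    for _ in range(max_steps):
        x_new = sgn(W.T @ y, x)                  # output layer -> input layer (transposed weights)
        y_new = sgn(W @ x_new, y)                # input layer -> output layer
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                                # a local minimum of the energy E = -y^T W x
        x, y = x_new, y_new
    return x, y
```

Starting from the noisy vector, each pass through the loop can only lower (or keep) the network's energy, which is why the iteration settles in a local minimum.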

Literature

  • Gerhard Schöneburg, Nikolaus Hansen, Andreas Gawelczyk: Neural Networks. Markt & Technik Verlag, Haar 1990, ISBN 3-89090-329-0.
  • Andreas Zell: Simulation of Neural Networks. R. Oldenbourg Verlag, Munich 1997, ISBN 3-486-24350-0.