Training noise adaptation in attractor neural networks
K Y M Wong and D Sherrington, Dept. of Phys., Imperial Coll., London, UK
Journal:
Journal of Physics A: Mathematical and General
Date:
1990-02-21
Abstract:
The authors consider synaptic neural networks which minimise the output error on the stored patterns when the input patterns are ensembles of noisy versions of them, with overlap m_t with the clean patterns. When m_t is infinitesimally less than 1, the network automatically attains maximal stability, confirming the usefulness of training noise in enhancing memory associativity. When m_t drops below 1, the field distribution has two bands for large m_t and a single continuous band for small m_t. Errorless retrieval is impossible for training noise of order N^0. As the training noise increases, the retrieval overlap deteriorates, although memory associativity does increase for sufficiently low storage.
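The following is a minimal sketch, not the authors' formulation, of the general idea of training with noisy pattern ensembles: noisy versions of each stored pattern (with mean overlap m_t) are presented as inputs, and couplings are adjusted whenever a neuron's output disagrees with the clean pattern. The function names (`noisy_version`, `train_with_noise`, `retrieval_overlap`) and the perceptron-style update rule are illustrative assumptions.

```python
# Illustrative sketch: training an attractor network on noisy pattern ensembles.
# The perceptron-style per-neuron update is an assumption, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def noisy_version(pattern, m_t, rng):
    """Flip each spin with probability (1 - m_t)/2, giving mean overlap m_t."""
    flips = rng.random(pattern.size) < (1.0 - m_t) / 2.0
    return np.where(flips, -pattern, pattern)

def train_with_noise(patterns, m_t, epochs=200, noisy_per_pattern=10):
    """Adjust couplings so each neuron outputs its clean bit on noisy inputs."""
    P, N = patterns.shape
    J = np.zeros((N, N))
    for _ in range(epochs):
        for xi in patterns:
            for _ in range(noisy_per_pattern):
                x = noisy_version(xi, m_t, rng)
                h = J @ x                      # local fields
                wrong = (h * xi) <= 0          # neurons with output error
                # Hebbian-like correction only where the output is wrong
                J[wrong] += np.outer(xi[wrong], x) / N
        np.fill_diagonal(J, 0.0)               # no self-couplings
    return J

def retrieval_overlap(J, pattern, m_probe, steps=20):
    """Iterate deterministic dynamics from a noisy probe; return final overlap."""
    s = noisy_version(pattern, m_probe, rng)
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1
    return float(s @ pattern) / pattern.size

N, P = 200, 10
patterns = rng.choice([-1, 1], size=(P, N))
J = train_with_noise(patterns, m_t=0.8)
print(retrieval_overlap(J, patterns[0], m_probe=0.7))
```

Sweeping m_t in such a toy setup lets one observe the qualitative trade-off described in the abstract: larger training noise widens the basins of attraction (better associativity at low storage) at the cost of retrieval quality.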