
Author K Y M Wong
Author D Sherrington
Issue date 1990-02-21
Description The authors consider synaptic neural networks which minimise the output error of the stored patterns when the input patterns are ensembles of their noisy versions with overlap m_t with the clean patterns. When m_t is infinitesimally less than 1, the network automatically attains maximal stability, confirming the usefulness of training noises in enhancing memory associativity. When m_t drops below 1, the field distribution has two bands for large m_t, and one continuous band for small m_t. Errorless retrieval is impossible for training noises of the order N^0. With the increase in training noise, the retrieval overlap deteriorates, although memory associativity does increase for sufficiently low storage.
Format application/pdf
Publisher Institute of Physics Publishing
Title Training noise adaptation in attractor neural networks
Type lett
DOI 10.1088/0305-4470/23/4/009
Print ISSN 0305-4470
Journal Journal of Physics A: Mathematical and General
Volume 23
First page L175
Last page L182
Affiliation K Y M Wong; Dept. of Phys., Imperial Coll., London, UK
Affiliation D Sherrington; Dept. of Phys., Imperial Coll., London, UK
Issue 4
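
The description above refers to training on ensembles of noise-corrupted patterns with overlap m_t. The following is a minimal illustrative sketch of that general idea, not the authors' method or analysis: a perceptron-style rule is trained on corrupted versions of stored +/-1 patterns and tested on freshly corrupted inputs. All names and parameter values (N, P, m_t, corrupt, samples_per_pattern) are assumptions chosen for the example.

```python
# Hedged sketch: training a single output unit on noisy versions of stored
# patterns, where each training input has expected overlap m_t with the
# clean pattern. This only illustrates the training-noise setup.
import numpy as np

rng = np.random.default_rng(0)

N = 200                   # assumed number of input sites
P = 20                    # assumed number of stored patterns
m_t = 0.8                 # training overlap with the clean patterns
epochs = 200
samples_per_pattern = 5

patterns = rng.choice([-1.0, 1.0], size=(P, N))   # clean +/-1 patterns
targets = rng.choice([-1.0, 1.0], size=P)          # desired output per pattern

def corrupt(xi, m):
    """Flip each site independently so the expected overlap with xi is m."""
    flip = rng.random(xi.shape) < (1.0 - m) / 2.0
    return np.where(flip, -xi, xi)

w = np.zeros(N)
for _ in range(epochs):
    for mu in range(P):
        for _ in range(samples_per_pattern):
            x = corrupt(patterns[mu], m_t)
            # Zero-margin perceptron update on the noisy example.
            if np.sign(w @ x) != targets[mu]:
                w += targets[mu] * x / N

# Proxy for retrieval quality: accuracy on freshly corrupted patterns.
m_test = 0.8
trials = 1000
correct = 0
for _ in range(trials):
    mu = rng.integers(P)
    x = corrupt(patterns[mu], m_test)
    correct += np.sign(w @ x) == targets[mu]
print(f"test accuracy at m_test={m_test}: {correct / trials:.3f}")
```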
