Phase transitions in simple learning
J A Hertz, A Krogh and G I Thorbergsson (Nordita, Copenhagen, Denmark)
Journal:
Journal of Physics A: Mathematical and General
Date:
1989-06-21
Abstract:
The authors investigate learning in the simplest type of layered neural network, the one-layer perceptron. The learning process is treated as a problem in statistical dynamics. The quantities of interest include the relaxation time (the learning time) and the capacity, and how these depend on noise and on constraints on the weights. The relaxation time is calculated as a function of the noise level and of the number p of associations to be learned. Three different cases are considered for input patterns that are random and uncorrelated. In the first, where the connection weights are constrained to satisfy $N^{-1}\sum_i \omega_i^2 = S^2$, there is a critical value of p (< N) separating regimes of perfect and imperfect learning at zero noise. In contrast, the second model, unconstrained learning, exhibits a different kind of transition at p = N, and noise plays no role. In the third model, where the constraint is imposed only on the thermal fluctuations, there is a line of phase transitions terminating at p = N and zero noise. Learning with correlated input patterns is also considered; the most important difference there is the emergence of a second relaxation time, which the authors interpret as the time it takes to learn a prototype of the patterns.
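The spherically constrained model lends itself to a short numerical illustration. The sketch below is not the paper's method (the authors' results are obtained analytically); it is a minimal Python rendering, assuming a quadratic training cost, Langevin-style noisy gradient descent, and projection back onto the sphere $N^{-1}\sum_i \omega_i^2 = S^2$. All parameter values and the relaxation-time estimate are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200      # number of input units / weights
p = 100      # number of random, uncorrelated associations (p < N)
S = 1.0      # spherical constraint: (1/N) * sum_i w_i^2 = S^2
eta = 0.05   # learning rate
T = 0.01     # noise level ("temperature" of the Langevin dynamics)
steps = 2000

# Random uncorrelated +/-1 input patterns and target outputs.
xi = rng.choice([-1.0, 1.0], size=(p, N))
zeta = rng.choice([-1.0, 1.0], size=p)

# Initialise the weights on the constraint sphere.
w = rng.standard_normal(N)
w *= S * np.sqrt(N) / np.linalg.norm(w)

errors = []
for t in range(steps):
    # Linear outputs and quadratic training error (ADALINE-style cost,
    # an assumption made here for the sketch).
    out = xi @ w / np.sqrt(N)
    errors.append(0.5 * np.mean((out - zeta) ** 2))
    # Langevin step: gradient descent plus thermal noise of strength T.
    grad = xi.T @ (out - zeta) / (p * np.sqrt(N))
    w -= eta * grad
    w += np.sqrt(2.0 * eta * T) * rng.standard_normal(N)
    # Enforce the spherical constraint by projecting back to the sphere.
    w *= S * np.sqrt(N) / np.linalg.norm(w)

# Crude relaxation-time estimate: first step at which the error has
# covered 95% of the gap between its initial and final values.
final = errors[-1]
thresh = final + 0.05 * (errors[0] - final)
tau = next((t for t, e in enumerate(errors) if e <= thresh), steps)
print(f"plateau error ~ {final:.4f}, relaxation time ~ {tau} steps")
```

Rerunning the sketch while varying p relative to N, or the noise level T, shows qualitatively how the plateau error and the relaxation time change across the regimes the abstract describes; quantitative statements require the paper's statistical-dynamical calculation.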