Dynamics of Generalization in Linear Perceptrons

Part of Advances in Neural Information Processing Systems 3 (NIPS 1990)


Authors

Anders Krogh, John Hertz

Abstract

We study the evolution of the generalization ability of a simple linear perceptron with N inputs which learns to imitate a "teacher perceptron". The system is trained on p = αN binary example inputs and the generalization ability measured by testing for agreement with the teacher on all 2^N possible binary input patterns. The dynamics may be solved analytically and exhibits a phase transition from imperfect to perfect generalization at α = 1. Except at this point the generalization ability approaches its asymptotic value exponentially, with critical slowing down near the transition; the relaxation time is ∝ (1 - √α)^(-2). Right at the critical point, the approach to perfect generalization follows a power law ∝ t^(-1/2). In the presence of noise, the generalization ability is degraded by an amount ∝ (√α - 1)^(-1) just above α = 1.
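
A minimal numerical sketch of the setup described above (not the paper's analytic solution): a linear "student" perceptron is trained by gradient descent to imitate a random "teacher" on p = αN binary example inputs. Since testing on all 2^N binary patterns is infeasible for realistic N, generalization error is estimated here on a large random sample of fresh binary inputs; the network size, learning rate, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200               # number of inputs (illustrative)
alpha = 0.5           # training-set size p = alpha * N
p = int(alpha * N)
eta = 0.01            # learning rate (discrete stand-in for the continuous-time dynamics)
steps = 2000

teacher = rng.standard_normal(N) / np.sqrt(N)    # "teacher perceptron" weights
X = rng.choice([-1.0, 1.0], size=(p, N))         # binary example inputs
y = X @ teacher                                  # teacher outputs on the training set

w = np.zeros(N)                                  # student starts from zero weights

# fresh random binary patterns used to estimate the generalization error
X_test = rng.choice([-1.0, 1.0], size=(5000, N))
y_test = X_test @ teacher

for t in range(steps):
    # gradient descent on the quadratic training error (1/2p) * |Xw - y|^2
    grad = X.T @ (X @ w - y) / p
    w -= eta * grad
    if t % 200 == 0:
        gen_err = np.mean((X_test @ w - y_test) ** 2)
        print(f"t={t:5d}  estimated generalization error = {gen_err:.4f}")
```

Running this with alpha below 1 shows the generalization error levelling off at a nonzero value, while values of alpha above 1 (with noiseless teacher outputs) drive it toward zero, consistent with the phase transition at α = 1 described in the abstract.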