Dynamics of Supervised Learning with Restricted Training Sets and Noisy Teachers

Part of Advances in Neural Information Processing Systems 12 (NIPS 1999)



Anthony Coolen, C. Mace


We generalize a recent formalism for the dynamics of supervised learning in layered neural networks, in the regime where data recycling is inevitable, to the case of noisy teachers. Our theory yields reliable predictions for the evolution in time of training and generalization errors, and extends the class of mathematically solvable learning processes in large neural networks to situations where overfitting can occur.
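The setting described above can be illustrated numerically. The sketch below is a hypothetical minimal simulation, not the paper's formalism: a perceptron student learns by recycling examples from a fixed training set of size P = alpha*N, where the teacher's labels are flipped with some noise probability. Tracking error on the training set versus a fresh test set exposes the gap between training and generalization error that the theory is built to predict. All parameter values (`N`, `alpha`, `noise`, `eta`, `steps`) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50          # input dimension
alpha = 2.0     # ratio of training examples to dimension (restricted set)
P = int(alpha * N)
noise = 0.1     # probability the noisy teacher flips a label
eta = 0.05      # learning rate
steps = 2000    # number of recycled-example updates

# Noisy teacher: a fixed perceptron whose labels are flipped with prob `noise`
B = rng.standard_normal(N)
X = rng.standard_normal((P, N)) / np.sqrt(N)
clean = np.sign(X @ B)
flip = rng.random(P) < noise
y = np.where(flip, -clean, clean)

# Fresh test set, labeled by the clean teacher rule, to estimate
# the generalization error
X_test = rng.standard_normal((2000, N)) / np.sqrt(N)
y_test = np.sign(X_test @ B)

J = np.zeros(N)  # student weights
for t in range(steps):
    i = rng.integers(P)              # recycle examples from the fixed set
    if np.sign(X[i] @ J) != y[i]:    # perceptron rule: update on mistakes
        J += eta * y[i] * X[i]

train_err = np.mean(np.sign(X @ J) != y)
gen_err = np.mean(np.sign(X_test @ J) != y_test)
print(f"training error: {train_err:.3f}, generalization error: {gen_err:.3f}")
```

Because the training set is finite and its labels are noisy, the student can fit the corrupted labels better than it generalizes to the clean rule, which is the overfitting regime the abstract refers to.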