Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)
Noboru Murata, Shuji Yoshizawa, Shun-ichi Amari
Learning curves show how a neural network improves as the number of training examples increases, and how this improvement is related to the network's complexity. The present paper clarifies the asymptotic properties of two learning curves and the relation between them: one concerns the predictive (generalization) loss, the other the training loss. The result gives a natural definition of the complexity of a neural network. Moreover, it provides a new criterion for model selection.
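The qualitative behavior described above — a training loss that sits below the predictive loss, with a gap that shrinks as the number of examples grows and that scales with model complexity — can be seen even in a toy simulation. The sketch below is purely illustrative and is not the paper's construction: it uses ordinary least-squares regression, with the parameter count `d` standing in for network complexity; all names and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d, sigma2 = 5, 1.0                      # parameter count (complexity proxy), noise variance
theta = rng.standard_normal(d)          # true parameters of the teacher model

def losses(t, trials=200):
    """Average training and predictive MSE over random datasets of size t."""
    tr = te = 0.0
    for _ in range(trials):
        # Draw a training set of t examples and fit by least squares.
        X = rng.standard_normal((t, d))
        y = X @ theta + np.sqrt(sigma2) * rng.standard_normal(t)
        w, *_ = np.linalg.lstsq(X, y, rcond=None)
        tr += np.mean((y - X @ w) ** 2)
        # Estimate the predictive loss on fresh data from the same distribution.
        Xs = rng.standard_normal((2000, d))
        ys = Xs @ theta + np.sqrt(sigma2) * rng.standard_normal(2000)
        te += np.mean((ys - Xs @ w) ** 2)
    return tr / trials, te / trials

# Training loss stays below predictive loss; the gap shrinks as t grows.
results = {t: losses(t) for t in (20, 40, 80)}
for t, (tr, te) in results.items():
    print(f"t={t:3d}  train={tr:.3f}  predictive={te:.3f}  gap={te - tr:.3f}")
```

In this linear toy case the two curves approach the noise floor from opposite sides at a rate governed by `d/t`, which is the kind of complexity-dependent asymptotic behavior the abstract refers to.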