SIMPLIFYING NEURAL NETS BY DISCOVERING FLAT MINIMA

Part of Advances in Neural Information Processing Systems 7 (NIPS 1994)


Authors

Sepp Hochreiter, Jürgen Schmidhuber

Abstract

We present a new algorithm for finding low complexity networks with high generalization capability. The algorithm searches for large connected regions of so-called "flat" minima of the error function. In the weight-space environment of a "flat" minimum, the error remains approximately constant. Using an MDL-based argument, flat minima can be shown to correspond to low expected overfitting. Although our algorithm requires the computation of second order derivatives, it has backprop's order of complexity. Experiments with feedforward and recurrent nets are described. In an application to stock market prediction, the method outperforms conventional backprop, weight decay, and "optimal brain surgeon".
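As a rough illustration of what the abstract means by a "flat" minimum (this is not the paper's algorithm, which adds a second-order, MDL-motivated flatness penalty to the training objective), the sketch below probes a fitted weight vector with small random perturbations and reports the largest observed error increase. At a flat minimum the error stays nearly constant over a comparatively large perturbation radius. The toy linear model, data, and the helper names used here are hypothetical.

```python
import numpy as np

def mse(w, X, y):
    """Error of a tiny linear model y ~= X @ w (stand-in for a trained net)."""
    return float(np.mean((X @ w - y) ** 2))

def error_increase_under_perturbation(w, X, y, radius, n_probes=100, seed=0):
    """Largest observed error increase when weights are perturbed uniformly
    within a box of the given radius around w.  Small increases over a large
    radius indicate a 'flat' region of weight space."""
    rng = np.random.default_rng(seed)
    base = mse(w, X, y)
    worst = 0.0
    for _ in range(n_probes):
        delta = rng.uniform(-radius, radius, size=w.shape)
        worst = max(worst, mse(w + delta, X, y) - base)
    return worst

if __name__ == "__main__":
    # Hypothetical regression data; least squares gives a (near-)minimum of the error.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ true_w + 0.1 * rng.normal(size=200)
    w_fit = np.linalg.lstsq(X, y, rcond=None)[0]
    for r in (0.01, 0.05, 0.1):
        print(f"radius={r}: worst error increase "
              f"{error_increase_under_perturbation(w_fit, X, y, radius=r):.6f}")
```

In the paper's setting the analogous quantity is estimated from second-order derivatives rather than by sampling, which is what keeps the algorithm at backprop's order of complexity.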