Part of Advances in Neural Information Processing Systems 11 (NIPS 1998)
Sepp Hochreiter, Jürgen Schmidhuber
This paper reveals a previously ignored connection between two important fields: regularization and independent component analysis (ICA). We show that at least one representative of a broad class of algorithms (regularizers that reduce network complexity) extracts independent features as a by-product. This algorithm is Flat Minimum Search (FMS), a recent general method for finding low-complexity networks with high generalization capability. FMS works by minimizing both training error and required weight precision. According to our theoretical analysis, the hidden layer of an FMS-trained autoassociator attempts to code each input with a sparse code using as few simple features as possible. In experiments the method extracts optimal codes for difficult versions of the "noisy bars" benchmark problem by separating the underlying sources, whereas ICA and PCA fail. Real-world images are coded with fewer bits per pixel than by ICA or PCA.
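The intuition behind minimizing "required weight precision" can be illustrated with a toy sketch. This is not the authors' FMS algorithm (whose penalty term is derived in the paper); the redundant parametrization, noise scale, and comparison procedure below are illustrative assumptions. The point it demonstrates: a solution sitting in a flat minimum tolerates large weight perturbations, so its weights can be stored with few bits, whereas a sharp minimum demands high precision.

```python
import numpy as np

# Toy illustration (not FMS itself): fit y = x with the redundant model
# y_hat = w1 * w2 * x. Both (w1, w2) = (1, 1) and (0.01, 100) achieve
# zero training error, but the first lies in a much flatter region of
# weight space, so it survives low-precision (perturbed) weights.

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 1.0 * x  # target: overall weight product should equal 1

def mse(w1, w2):
    return np.mean((y - w1 * w2 * x) ** 2)

def loss_under_weight_noise(w1, w2, eps=0.01, trials=1000):
    # Average loss after jittering each weight with noise of scale eps,
    # a stand-in for rounding the weights to limited precision.
    deltas = rng.normal(scale=eps, size=(trials, 2))
    return np.mean([mse(w1 + d1, w2 + d2) for d1, d2 in deltas])

flat = loss_under_weight_noise(1.0, 1.0)
sharp = loss_under_weight_noise(0.01, 100.0)
print(f"flat solution:  {flat:.5f}")
print(f"sharp solution: {sharp:.5f}")
# The flat solution's loss barely moves; the sharp one's explodes,
# because perturbing w1 there is amplified by the large w2.
```

A regularizer that penalizes this perturbation sensitivity steers training toward the flat solution, which is the sense in which FMS trades training error against weight precision.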