Bayesian Unsupervised Learning of Higher Order Structure

Part of Advances in Neural Information Processing Systems 9 (NIPS 1996)


Authors

Michael Lewicki, Terrence J. Sejnowski

Abstract

Multilayer architectures such as those used in Bayesian belief networks and Helmholtz machines provide a powerful framework for representing and learning higher order statistical relations among inputs. Because exact probability calculations with these models are often intractable, there is much interest in finding approximate algorithms. We present an algorithm that efficiently discovers higher order structure using EM and Gibbs sampling. The model can be interpreted as a stochastic recurrent network in which ambiguity in lower-level states is resolved through feedback from higher levels. We demonstrate the performance of the algorithm on benchmark problems.
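To give a concrete sense of the Gibbs-sampling component the abstract describes, the following is a minimal illustrative sketch, not the authors' actual algorithm: one Gibbs sweep over the binary hidden units of a generic two-layer sigmoid belief network. The network parameterization (`W`, `b`, `c`), the function name `gibbs_sweep`, and all numerical details are assumptions for illustration; the paper's model and EM procedure differ in detail.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def log_bernoulli(v, p):
    # Elementwise log p(v | p) for binary observations v.
    eps = 1e-12
    return v * np.log(p + eps) + (1 - v) * np.log(1 - p + eps)

def gibbs_sweep(h, v, W, b, c, rng):
    """One Gibbs sweep over the hidden units of a hypothetical
    two-layer sigmoid belief network with generative model
    p(h_j = 1) = sigmoid(c_j) and p(v_i = 1 | h) = sigmoid(W h + b)_i.

    Each hidden unit is resampled from its conditional given the
    visible data and the current states of the other hidden units,
    so evidence from the data layer feeds back into the hidden layer.
    """
    for j in range(h.size):
        h1, h0 = h.copy(), h.copy()
        h1[j], h0[j] = 1.0, 0.0
        p1 = sigmoid(W @ h1 + b)  # visible probabilities with h_j = 1
        p0 = sigmoid(W @ h0 + b)  # visible probabilities with h_j = 0
        # Log-odds of h_j = 1 given v and the remaining hidden units:
        # prior term plus the likelihood ratio of the visible data.
        log_odds = c[j] + np.sum(log_bernoulli(v, p1) - log_bernoulli(v, p0))
        h[j] = float(rng.random() < sigmoid(log_odds))
    return h
```

In a full learning loop of the kind the abstract alludes to, such sweeps would supply approximate posterior samples over hidden states, and an EM-style step would then update `W`, `b`, and `c` from those samples.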