Part of Advances in Neural Information Processing Systems 10 (NIPS 1997)
Christopher Bishop, Neil Lawrence, Tommi Jaakkola, Michael Jordan
Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on mixtures of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean field theory as the number of mixture components is increased.
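
For concreteness, the mean-field bound referred to above, and its mixture extension, can be sketched as follows; the notation (H for hidden variables, V for visible variables, mixing coefficients \alpha_m) is assumed here for illustration rather than quoted from the paper's body. Jensen's inequality gives a lower bound on the log likelihood for any approximating distribution Q,

\[
  \ln P(V) \;\ge\; \sum_{H} Q(H) \,\ln \frac{P(H, V)}{Q(H)} \;\equiv\; \mathcal{L}(Q),
\]

where simple mean field theory restricts Q to the factorial (hence unimodal) form

\[
  Q(H) \;=\; \prod_{i} Q_i(h_i),
\]

while the richer class considered here takes Q to be a mixture of M such factorial distributions,

\[
  Q_{\mathrm{mix}}(H) \;=\; \sum_{m=1}^{M} \alpha_m \, Q_m(H),
  \qquad \alpha_m \ge 0, \quad \sum_{m=1}^{M} \alpha_m = 1,
\]

so that the bound \mathcal{L}(Q_{\mathrm{mix}}) is optimized jointly over the mixing coefficients \alpha_m and the parameters of each factorial component Q_m.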