Part of Advances in Neural Information Processing Systems 10 (NIPS 1997)
Applications of Gaussian mixture models occur frequently in the fields of statistics and artificial neural networks. One of the key issues arising from any mixture model application is how to estimate the optimum number of mixture components. This paper extends the Reversible-Jump Markov Chain Monte Carlo (MCMC) algorithm to the case of multivariate spherical Gaussian mixtures using a hierarchical prior model. Using this method the number of mixture components is no longer fixed but becomes a parameter of the model which we shall estimate. The Reversible-Jump MCMC algorithm is capable of moving between parameter subspaces which correspond to models with different numbers of mixture components. As a result, a sample from the full joint distribution of all unknown model parameters is generated. The technique is then demonstrated on a simulated example and a well-known vowel dataset.
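To illustrate the idea of trans-dimensional moves between models with different numbers of components, the following is a minimal sketch, not the paper's algorithm. It runs simplified birth/death moves on a one-dimensional mixture with known, equal component spreads and equal weights; the new mean in a birth move is drawn from its prior so that the proposal density cancels against the prior in the acceptance ratio. All data, constants, and function names are hypothetical, and label-permutation and move-probability corrections from the full Reversible-Jump acceptance ratio are omitted.

```python
import math
import random

random.seed(0)

# Hypothetical simulated 1-D data from a two-component mixture
# (not the paper's dataset).
data = [random.gauss(-2.0, 0.5) for _ in range(50)] + \
       [random.gauss(3.0, 0.5) for _ in range(50)]

SIGMA = 0.5     # known, fixed component spread (spherical case in 1-D)
TAU = 5.0       # prior scale for component means
K_MAX = 10      # upper bound on the number of components
LAMBDA = 3.0    # Poisson prior mean on the number of components k

def log_likelihood(mus):
    # equal-weight Gaussian mixture log-likelihood
    k = len(mus)
    ll = 0.0
    for x in data:
        s = sum(math.exp(-0.5 * ((x - m) / SIGMA) ** 2) for m in mus)
        ll += math.log(s / (k * SIGMA * math.sqrt(2 * math.pi)) + 1e-300)
    return ll

def log_prior_k(k):
    # (unnormalised, truncated) Poisson(LAMBDA) prior on k
    return k * math.log(LAMBDA) - math.lgamma(k + 1)

mus = [0.0]  # start with a single component
for it in range(2000):
    k = len(mus)
    if random.random() < 0.5 and k < K_MAX:
        # birth move: propose a new mean from its prior, so the
        # proposal density cancels the prior term in the ratio
        new_mu = random.gauss(0.0, TAU)
        log_alpha = (log_likelihood(mus + [new_mu]) - log_likelihood(mus)
                     + log_prior_k(k + 1) - log_prior_k(k))
        if math.log(random.random()) < log_alpha:
            mus.append(new_mu)
    elif k > 1:
        # death move: delete a uniformly chosen component
        # (the reverse of the birth move above)
        j = random.randrange(k)
        proposal = mus[:j] + mus[j + 1:]
        log_alpha = (log_likelihood(proposal) - log_likelihood(mus)
                     + log_prior_k(k - 1) - log_prior_k(k))
        if math.log(random.random()) < log_alpha:
            mus = proposal
    # a full sampler would also update the means (and, in the paper's
    # setting, weights and variances) with fixed-dimension moves

print(len(mus))  # current sampled number of components
```

Because each sweep may change the dimension of the parameter vector, recording `len(mus)` across iterations yields a sample from the posterior over the number of components, which is the quantity the Reversible-Jump construction makes accessible.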