Unsupervised and Supervised Clustering: The Mutual Information between Parameters and Observations

Part of Advances in Neural Information Processing Systems 11 (NIPS 1998)


Authors

Didier Herschkowitz, Jean-Pierre Nadal

Abstract

Recent works in parameter estimation and neural coding have demonstrated that optimal performance is related to the mutual information between parameters and data. We consider the mutual information in the case where the dependency on the parameter (a vector θ) of the conditional p.d.f. of each observation (a vector ξ) is through the scalar product θ·ξ only. We derive bounds and asymptotic behaviour for the mutual information and compare with results obtained on the same model with the "replica technique".
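
For concreteness, the quantity in question can be sketched as follows; the number of observations p and the notation D = {ξ¹, ..., ξᵖ} for the data set are assumptions introduced here for illustration, not taken from the abstract itself.

\[
  P(\xi^{\mu}\mid\theta) = f(\theta\cdot\xi^{\mu}), \qquad
  I(\theta; D) = \int d\theta\, dD\; P(\theta)\, P(D\mid\theta)\,
      \ln\frac{P(D\mid\theta)}{P(D)} ,
\]

i.e. each observation's conditional density depends on θ only through the scalar product θ·ξ, and I(θ; D) is the standard mutual information between the parameter vector and the full set of observations.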