Information Bottleneck for Gaussian Variables

Part of Advances in Neural Information Processing Systems 16 (NIPS 2003)

Gal Chechik, Amir Globerson, Naftali Tishby, Yair Weiss


The problem of extracting the relevant aspects of data was addressed through the information bottleneck (IB) method, by (soft) clustering one variable while preserving information about another - relevance - variable. An interesting question addressed in the current work is the extension of these ideas to obtain continuous representations that preserve relevant information, rather than discrete clusters. We give a formal definition of the general continuous IB problem and obtain an analytic solution for the optimal representation for the important case of multivariate Gaussian variables. The obtained optimal representation is a noisy linear projection to eigenvectors of the normalized correlation matrix Σ_{x|y} Σ_x^{-1}, which is also the basis obtained in Canonical Correlation Analysis. However, in Gaussian IB, the compression tradeoff parameter uniquely determines the dimension, as well as the scale of each eigenvector. This introduces a novel interpretation where solutions of different ranks lie on a continuum parametrized by the compression level. Our analysis also provides an analytic expression for the optimal tradeoff - the information curve - in terms of the eigenvalue spectrum.
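As an illustration of the construction the abstract describes, the following NumPy sketch estimates the matrix Σ_{x|y} Σ_x^{-1} from synthetic jointly Gaussian data and inspects its eigenvalue spectrum. The data, dimensions, noise level, and tradeoff value β are invented for illustration; the rule that an eigenvector becomes active only once β exceeds 1/(1 − λ_i) follows the paper's characterization, but this snippet is a sketch, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic jointly Gaussian pair (x, y); all sizes are illustrative.
n, dx, dy = 10000, 5, 3
A_true = rng.normal(size=(dy, dx))
x = rng.normal(size=(n, dx))
y = x @ A_true.T + 0.5 * rng.normal(size=(n, dy))

# Empirical covariances.
Sxx = np.cov(x, rowvar=False)
Syy = np.cov(y, rowvar=False)
Sxy = (x - x.mean(0)).T @ (y - y.mean(0)) / (n - 1)

# Conditional covariance: Sigma_{x|y} = Sxx - Sxy Syy^{-1} Syx.
Sx_given_y = Sxx - Sxy @ np.linalg.solve(Syy, Sxy.T)

# Spectrum of Sigma_{x|y} Sigma_x^{-1}; its left eigenvectors span the
# optimal Gaussian IB projection (the same basis as CCA). Eigenvalues
# lie in [0, 1]: small values mean directions well predicted by y.
M = Sx_given_y @ np.linalg.inv(Sxx)
eigvals = np.sort(np.linalg.eigvals(M.T).real)

# For a tradeoff parameter beta, eigenvector i enters the projection
# only when beta > 1 / (1 - lambda_i), so the rank grows with beta.
beta = 10.0
active = eigvals < 1.0 - 1.0 / beta
print("eigenvalues:", np.round(eigvals, 3))
print("active dimensions at beta=%.1f:" % beta, int(active.sum()))
```

Sorting the eigenvalues makes the continuum of solutions visible: sweeping β from 1 upward adds eigenvectors one at a time, in order of increasing λ_i, which is the rank-continuum interpretation the abstract mentions.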