Part of Advances in Neural Information Processing Systems 16 (NIPS 2003)
Jakob Verbeek, Sam Roweis, Nikos Vlassis
We propose a non-linear Canonical Correlation Analysis (CCA) method which works by coordinating or aligning mixtures of linear models. In the same way that CCA extends the idea of PCA, our work extends recent methods for non-linear dimensionality reduction to the case where multiple embeddings of the same underlying low dimensional coordinates are observed, each lying on a different high dimensional manifold. We also show that a special case of our method, when applied to only a single manifold, reduces to the Laplacian Eigenmaps algorithm. As with previous alignment schemes, once the mixture models have been estimated, all of the remaining parameters of our model can be computed in closed form, without local optima in the learning. Experimental results illustrate the viability of the approach as a non-linear extension of CCA.
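For context, the classical *linear* CCA that the abstract generalizes can be sketched as follows. This is a standard textbook formulation (whitening each view and taking the SVD of the whitened cross-covariance), not the paper's alignment-based method; the ridge term `reg` and all variable names are illustrative assumptions.

```python
import numpy as np

def linear_cca(X, Y, d, reg=1e-6):
    """Return top-d canonical directions (Wx, Wy) and canonical correlations.

    X: (n, p) and Y: (n, q) are centered, paired observations of the same
    n items. `reg` is a small ridge term for numerical stability (an
    added assumption, not part of the textbook derivation).
    """
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whiten each view via Cholesky factors, then SVD the whitened
    # cross-covariance; singular values are the canonical correlations.
    Lx = np.linalg.cholesky(Cxx)
    Ly = np.linalg.cholesky(Cyy)
    T = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T
    U, s, Vt = np.linalg.svd(T)
    Wx = np.linalg.solve(Lx.T, U[:, :d])
    Wy = np.linalg.solve(Ly.T, Vt[:d].T)
    return Wx, Wy, s[:d]

# Usage: two linearly related "views" generated from shared 1-D coordinates.
rng = np.random.default_rng(0)
z = rng.standard_normal((500, 1))                      # shared coordinates
X = z @ rng.standard_normal((1, 5)) + 0.1 * rng.standard_normal((500, 5))
Y = z @ rng.standard_normal((1, 4)) + 0.1 * rng.standard_normal((500, 4))
X -= X.mean(axis=0)
Y -= Y.mean(axis=0)
Wx, Wy, corr = linear_cca(X, Y, d=1)
```

When the two views are non-linear functions of the shared coordinates, a single linear projection per view no longer suffices; the paper's contribution is to handle that case by aligning mixtures of such local linear models.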