Information Diffusion Kernels

Part of Advances in Neural Information Processing Systems 15 (NIPS 2002)


Authors

Guy Lebanon, John Lafferty

Abstract

A new family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. Based on the heat equation on the Riemannian manifold defined by the Fisher information metric, information diffusion kernels generalize the Gaussian kernel of Euclidean space, and provide a natural way of combining generative statistical modeling with non-parametric discriminative learning. As a special case, the kernels give a new approach to applying kernel-based learning algorithms to discrete data. Bounds on covering numbers for the new kernels are proved using spectral theory in differential geometry, and experimental results are presented for text classification.
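To illustrate the idea in the multinomial special case, the following is a minimal sketch (not the authors' code): points on the probability simplex are compared by their geodesic distance under the Fisher information metric, d(θ, θ′) = 2 arccos(Σᵢ √(θᵢ θ′ᵢ)), and that distance is plugged into a Gaussian-style heat-kernel form exp(−d²/4t). The function name and the choice of diffusion time t are illustrative assumptions.

```python
import numpy as np

def multinomial_diffusion_kernel(theta, theta_prime, t=0.5):
    """Approximate diffusion (heat) kernel on the multinomial simplex.

    Sketch only: uses the geodesic distance under the Fisher information
    metric, d = 2 * arccos(sum_i sqrt(theta_i * theta'_i)), inside the
    Gaussian-like form exp(-d^2 / (4t)). The diffusion time t is a
    free smoothing parameter (value here is an arbitrary choice).
    """
    theta = np.asarray(theta, dtype=float)
    theta_prime = np.asarray(theta_prime, dtype=float)
    # Clip guards against arccos domain errors from floating-point noise.
    inner = np.clip(np.sum(np.sqrt(theta * theta_prime)), -1.0, 1.0)
    d = 2.0 * np.arccos(inner)
    return np.exp(-d ** 2 / (4.0 * t))

# Identical distributions sit at geodesic distance 0, so the kernel is 1.
p = np.array([0.5, 0.3, 0.2])
print(multinomial_diffusion_kernel(p, p))  # → 1.0
```

For text classification, θ and θ′ would typically be (smoothed) term-frequency vectors of two documents, so the kernel compares documents through the geometry of the multinomial family rather than through raw Euclidean distance.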