Part of Advances in Neural Information Processing Systems 17 (NIPS 2004)
Jerry Zhu, Jaz Kandola, Zoubin Ghahramani, John Lafferty
We present an algorithm based on convex optimization for constructing kernels for semi-supervised learning. The kernel matrices are derived from the spectral decomposition of graph Laplacians, and combine labeled and unlabeled data in a systematic fashion. Unlike previous work using diffusion kernels and Gaussian random field kernels, a nonparametric kernel approach is presented that incorporates order constraints during optimization. This results in flexible kernels and avoids the need to choose among different parametric forms. Our approach relies on a quadratically constrained quadratic program (QCQP), and is computationally feasible for large datasets. We evaluate the kernels on real datasets using support vector machines, with encouraging results.
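To make the abstract's recipe concrete, the following is a minimal sketch (not the authors' code) of the general idea: build a graph Laplacian over labeled and unlabeled points, take its spectral decomposition, and learn nonnegative, order-constrained spectral weights by maximizing alignment with the labels, which yields a small convex QCQP. The RBF graph construction, the alignment objective, the unit norm constraint, and names such as `order_constrained_kernel` are illustrative assumptions, and the solver here is cvxpy rather than any specific implementation used in the paper.

```python
# Sketch: order-constrained spectral kernel for semi-supervised learning.
# Assumes numpy and cvxpy are installed; details (graph construction,
# objective normalization) are illustrative, not taken from the paper.
import numpy as np
import cvxpy as cp

def order_constrained_kernel(X, y_labeled, n_labeled, gamma=1.0):
    """X: (n, d) array; the first n_labeled rows have labels y_labeled in {-1, +1}."""
    n = X.shape[0]

    # Weighted graph from an RBF affinity (one common choice; kNN graphs are another).
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-gamma * sq_dists)
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W          # combinatorial graph Laplacian

    # Spectral decomposition: eigh returns eigenvalues in ascending order,
    # so the smoothest eigenvectors come first.
    _, Phi = np.linalg.eigh(L)

    # Target matrix on the labeled block.
    y = np.asarray(y_labeled, dtype=float)
    T = np.outer(y, y)

    # One nonnegative weight per eigenvector; smoother eigenvectors must
    # receive no smaller weight than rougher ones (order constraints).
    mu = cp.Variable(n, nonneg=True)
    Phi_l = Phi[:n_labeled]
    K_labeled = Phi_l @ cp.diag(mu) @ Phi_l.T          # labeled block of the kernel

    objective = cp.Maximize(cp.sum(cp.multiply(K_labeled, T)))   # alignment with labels
    constraints = [mu[:-1] >= mu[1:],                  # order constraints on the spectrum
                   cp.sum_squares(K_labeled) <= 1]     # quadratic norm constraint (QCQP)
    cp.Problem(objective, constraints).solve()

    # Full kernel over labeled and unlabeled points, usable in an SVM.
    return Phi @ np.diag(mu.value) @ Phi.T
```

The learned kernel matrix can then be passed to a standard SVM (e.g. with a precomputed-kernel interface) for transductive prediction on the unlabeled points; whether this toy construction matches the paper's exact formulation is not guaranteed, it only illustrates the QCQP-with-order-constraints structure described above.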