Olivier Bousquet, Daniel Herrmann
We investigate data-based procedures for selecting the kernel when learning with Support Vector Machines. We provide generalization error bounds by estimating the Rademacher complexities of the corresponding function classes. In particular, we obtain a complexity bound for function classes induced by kernels with given eigenvectors, i.e., we allow the spectrum to vary while keeping the eigenvectors fixed. This bound is only a logarithmic factor larger than the complexity of the function class induced by a single kernel. However, optimizing the margin over such classes leads to overfitting. We therefore propose a suitable way of constraining the class. We use an efficient algorithm to solve the resulting optimization problem, present preliminary experimental results, and compare them to an alignment-based approach.
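To make the kernel family concrete, the following is a minimal illustrative sketch (not the paper's algorithm or its constrained class): given a base kernel matrix with eigendecomposition K = U diag(lambda) U^T, the eigenvectors U are kept fixed while the spectrum is re-weighted, and an SVM is trained on the resulting precomputed kernel. The data, the base kernel, and the particular re-weighting are arbitrary choices for illustration.

```python
# Hypothetical sketch: vary the spectrum of a kernel while keeping its
# eigenvectors fixed, then train an SVM on the precomputed kernel.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.where(X[:, 0] + 0.1 * rng.normal(size=100) > 0, 1, -1)

# Base kernel matrix (an RBF kernel, chosen arbitrarily for this example).
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists / 2.0)

# Eigendecomposition K = U diag(lams) U^T; U holds the fixed eigenvectors.
lams, U = np.linalg.eigh(K)

# Vary the spectrum while keeping U fixed. Any nonnegative choice of
# new_lams yields a valid (positive semidefinite) kernel matrix.
new_lams = np.maximum(lams, 0.0) ** 0.5  # one arbitrary re-weighting

# Reassemble the kernel: K_new = U diag(new_lams) U^T.
K_new = (U * new_lams) @ U.T

clf = SVC(kernel="precomputed").fit(K_new, y)
print("training accuracy:", clf.score(K_new, y))
```

As the abstract notes, optimizing the margin freely over all such spectra would overfit; a practical procedure must constrain the admissible spectra (e.g., through a norm bound), which is the role of the constrained class proposed in the paper.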