Selective Labeling via Error Bound Minimization

Part of Advances in Neural Information Processing Systems 25 (NIPS 2012)


Authors

Quanquan Gu, Tong Zhang, Jiawei Han, Chris Ding

Abstract

In many practical machine learning problems, the acquisition of labeled data is often expensive and/or time consuming. This motivates us to study the following problem: given a label budget, how should we select data points to label so that learning performance is optimized? We propose a selective labeling method based on an analysis of the generalization error of Laplacian regularized Least Squares (LapRLS). In particular, we derive a deterministic generalization error bound for LapRLS trained on subsampled data, and propose to select the subset of data points to label by minimizing this upper bound. Since the minimization is a combinatorial problem, we relax it to the continuous domain and solve it by projected gradient descent. Experiments on benchmark datasets show that the proposed method outperforms the state-of-the-art methods.
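The optimization strategy in the abstract (relax a binary selection vector to the continuous domain under a budget constraint, then run projected gradient descent and round) can be sketched as follows. This is a minimal illustration, not the paper's method: the objective below is a placeholder smooth surrogate (the trace of a regularized inverse), whereas the paper minimizes its derived LapRLS error bound; the projection onto the capped simplex {a : 0 <= a_i <= 1, sum a_i = budget} and the final rounding step are standard choices assumed here.

```python
import numpy as np

def project_capped_simplex(v, budget):
    """Project v onto {a : 0 <= a_i <= 1, sum(a) = budget} by bisecting on a shift t."""
    lo, hi = v.min() - 1.0, v.max()
    for _ in range(100):
        t = 0.5 * (lo + hi)
        # sum(clip(v - t, 0, 1)) is decreasing in t
        if np.clip(v - t, 0.0, 1.0).sum() > budget:
            lo = t
        else:
            hi = t
    return np.clip(v - 0.5 * (lo + hi), 0.0, 1.0)

def select_points(K, budget, lam=1e-2, lr=0.1, iters=200):
    """Relax-and-round subset selection by projected gradient descent.

    K: symmetric kernel matrix (n x n). Placeholder objective:
    f(a) = tr((K diag(a) K + lam I)^{-1}); the actual paper minimizes
    its LapRLS generalization error bound instead.
    """
    n = K.shape[0]
    a = np.full(n, budget / n)          # feasible uniform start
    for _ in range(iters):
        M = K @ np.diag(a) @ K + lam * np.eye(n)
        Minv = np.linalg.inv(M)
        G = Minv @ Minv
        # grad_i = -k_i^T M^{-2} k_i, with k_i the i-th column of K
        grad = -np.einsum('ji,jk,ki->i', K, G, K)
        a = project_capped_simplex(a - lr * grad, budget)
    # Round the relaxed solution: label the budget points with largest a_i
    return np.argsort(-a)[:budget]
```

The capped-simplex projection keeps the iterate feasible (fractional "selection weights" summing to the budget), and the final sort implements the usual rounding of a continuous relaxation back to a discrete label set.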