beta-risk: a New Surrogate Risk for Learning from Weakly Labeled Data

Part of Advances in Neural Information Processing Systems 29 (NIPS 2016)


Authors

Valentina Zantedeschi, Rémi Emonet, Marc Sebban

Abstract

During the past few years, the machine learning community has paid increasing attention to developing new methods for learning from weakly labeled data. This field covers different settings such as semi-supervised learning, learning with label proportions, multi-instance learning, noise-tolerant learning, etc. This paper presents a generic framework for dealing with these weakly labeled scenarios. We introduce the beta-risk as a generalized formulation of the standard empirical risk based on surrogate margin-based loss functions. This risk allows us to express the reliability of the labels and to derive different kinds of learning algorithms. We specifically focus on SVMs and propose a soft-margin beta-SVM algorithm which behaves better than the state of the art.
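As a rough illustration of the kind of generalization the abstract describes (the notation here is ours, chosen for exposition, and does not reproduce the paper's exact definitions): writing $\ell$ for a surrogate margin-based loss and $\beta_{ik} \ge 0$ for the reliability that example $x_i$ carries label $k$, a reliability-weighted empirical risk over $m$ examples and $K$ classes can be sketched as

$\hat{R}_\beta(h) = \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \beta_{ik}\, \ell\big(h(x_i), k\big),$

which reduces to the standard empirical risk in the fully supervised case where $\beta_{i y_i} = 1$ and all other $\beta_{ik} = 0$.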