Divergences, surrogate loss functions and experimental design

Part of Advances in Neural Information Processing Systems 18 (NIPS 2005)


Authors

XuanLong Nguyen, Martin J. Wainwright, Michael Jordan

Abstract

In this paper, we provide a general theorem that establishes a correspondence between surrogate loss functions in classification and the family of f-divergences. Moreover, we provide constructive procedures for determining the f-divergence induced by a given surrogate loss, and conversely for finding all surrogate loss functions that realize a given f-divergence. Next we introduce the notion of universal equivalence among loss functions and corresponding f-divergences, and provide necessary and sufficient conditions for universal equivalence to hold. These ideas have applications to classification problems that also involve a component of experiment design; in particular, we leverage our results to prove consistency of a procedure for learning a classifier under decentralization requirements.
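As a point of reference, here is a minimal sketch of the two objects the correspondence connects, using standard definitions; the symbols $f$, $\mu$, $\pi$, $\phi$, and $\gamma$ are notational choices made here, not taken from the abstract. An f-divergence between measures $\mu$ and $\pi$ has the form

\[
I_f(\mu, \pi) \;=\; \int f\!\left( \frac{d\mu}{d\pi} \right) \, d\pi,
\]

for a convex function $f : (0, \infty) \to \mathbb{R}$, while a margin-based surrogate loss $\phi$ penalizes a discriminant function $\gamma$ on a labeled example $(x, y)$ with $y \in \{-1, +1\}$ through $\phi\big(y\,\gamma(x)\big)$. The correspondence studied in the paper relates the optimal risk under such a surrogate $\phi$ to an f-divergence between class-conditional measures; well-known instances include the hinge loss inducing the variational distance and the exponential loss inducing the Hellinger distance.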