Part of Advances in Neural Information Processing Systems 20 (NIPS 2007)
Joseph K. Bradley, Robert E. Schapire
We study boosting in the filtering setting, where the booster draws examples from an oracle instead of using a fixed training set and so may train efficiently on very large datasets. Our algorithm, which is based on a logistic regression technique proposed by Collins, Schapire, & Singer, requires fewer assumptions to achieve bounds equivalent to or better than previous work. Moreover, we give the first proof that the algorithm of Collins et al. is a strong PAC learner, albeit within the filtering setting. Our proofs demonstrate the algorithm's strong theoretical properties for both classification and conditional probability estimation, and we validate these results through extensive experiments. Empirically, our algorithm proves more robust to noise and overfitting than batch boosters in conditional probability estimation and proves competitive in classification.
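To make the filtering setting concrete, the sketch below illustrates the general paradigm: rather than reweighting a fixed training set, the booster draws examples from an oracle and uses rejection sampling so that examples the current ensemble misclassifies are more likely to pass the filter. This is a minimal illustration only, not the paper's algorithm; the synthetic oracle, the stump weak learner, and all function names here are hypothetical. The logistic acceptance weight is inspired by the Collins, Schapire, & Singer logistic-regression view of boosting, but the details (weights, step sizes) are simplified for illustration.

```python
import math
import random

def example_oracle():
    # Hypothetical synthetic oracle: label is sign(x1 + x2), flipped
    # with probability 0.1 to simulate label noise.
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    y = 1 if x[0] + x[1] > 0 else -1
    if random.random() < 0.1:
        y = -y
    return x, y

def train_stump(samples):
    # Weak learner: pick the (feature, threshold, sign) stump with the
    # fewest mistakes on the accepted sample.
    best = None
    for dim in (0, 1):
        for thresh in [s[0][dim] for s in samples]:
            for sign in (1, -1):
                err = sum(1 for x, y in samples
                          if (sign if x[dim] > thresh else -sign) != y)
                if best is None or err < best[0]:
                    best = (err, dim, thresh, sign)
    _, dim, thresh, sign = best
    return lambda x: sign if x[dim] > thresh else -sign

def boost_by_filtering(rounds=10, batch=200):
    ensemble = []  # list of (alpha, hypothesis) pairs

    def F(x):
        return sum(a * h(x) for a, h in ensemble)

    for _ in range(rounds):
        accepted = []
        while len(accepted) < batch:
            x, y = example_oracle()
            # Filtering step: accept with a logistic weight, so examples
            # the current ensemble gets wrong pass the filter more often.
            w = 1.0 / (1.0 + math.exp(y * F(x)))
            if random.random() < w:
                accepted.append((x, y))
        h = train_stump(accepted)
        err = sum(1 for x, y in accepted if h(x) != y) / len(accepted)
        err = min(max(err, 1e-6), 1 - 1e-6)   # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
    return lambda x: 1 if F(x) > 0 else -1
```

Because each round touches only the examples that survive the filter, the booster never materializes the full dataset in memory, which is the source of the efficiency on very large datasets that the abstract refers to.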