Part of Advances in Neural Information Processing Systems 17 (NIPS 2004)
Neil Lawrence, Michael Jordan
We present a probabilistic approach to learning a Gaussian Process classifier in the presence of unlabeled data. Our approach involves a "null category noise model" (NCNM) inspired by ordered categorical noise models. The noise model reflects an assumption that the data density is lower between the class-conditional densities. We illustrate our approach on a toy problem and present comparative results for the semi-supervised classification of handwritten digits.
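
To illustrate the flavour of a null category noise model, the following is a minimal sketch of an ordered-categorical, probit-style likelihood with an unobserved middle category. The particular parameterization (the width parameter a, the function name ncnm_probs, and the use of scipy.stats.norm) is an illustrative assumption, not the paper's exact formulation.

    import numpy as np
    from scipy.stats import norm

    def ncnm_probs(f, a=0.5):
        """Illustrative null-category noise model probabilities for latent f.

        Three ordered categories: y = -1, y = 0 (the 'null' category, which
        is never observed), and y = +1. The null category occupies a band of
        width 2*a around f = 0, so points with labels (or constrained to be
        non-null, as unlabeled points are) are discouraged from sitting near
        the decision boundary.
        """
        f = np.asarray(f, dtype=float)
        p_neg = norm.cdf(-(f + a))          # y = -1
        p_pos = norm.cdf(f - a)             # y = +1
        p_null = 1.0 - p_neg - p_pos        # y = 0, unobserved null category
        return p_neg, p_null, p_pos

    # Near the boundary (f = 0) most probability mass falls in the null
    # category; since that category is never observed, the posterior over the
    # latent function is pushed away from regions of high data density.
    print(ncnm_probs(0.0))   # roughly (0.31, 0.38, 0.31)
    print(ncnm_probs(2.0))   # mass concentrated on y = +1

Under this kind of likelihood, conditioning unlabeled points on "not null" is what encodes the assumption that the data density is low between the class-conditional densities.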