Part of Advances in Neural Information Processing Systems 14 (NIPS 2001)
Pascal Vincent, Yoshua Bengio
Guided by an initial idea of building a complex (non-linear) decision surface with maximal local margin in input space, we give a possible geometrical intuition as to why K-Nearest Neighbor (KNN) algorithms often perform more poorly than SVMs on classification tasks. We then propose modified K-Nearest Neighbor algorithms to overcome the perceived problem. The approach is similar in spirit to Tangent Distance, but with invariances inferred from the local neighborhood rather than from prior knowledge. Experimental results on real-world classification tasks suggest that the modified KNN algorithms often give a dramatic improvement over standard KNN and perform as well as or better than SVMs.
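One natural reading of "invariances inferred from the local neighborhood" is to classify a test point by its distance to the affine hull (local hyperplane) spanned by its K nearest neighbors within each class, rather than by the distance to the neighbors themselves. The sketch below illustrates that idea only; it is not the paper's exact algorithm. It assumes plain Euclidean neighbor selection and an unregularized least-squares projection, and the function names `local_hyperplane_distance` and `hknn_predict` are our own.

```python
import numpy as np

def local_hyperplane_distance(x, neighbors):
    """Distance from point x to the affine hull of its K neighbors.

    The hull is parameterized as mu + V @ beta, where mu is the neighbor
    centroid and the columns of V are the centered neighbors. The
    minimizing beta is found by least squares.
    """
    mu = neighbors.mean(axis=0)
    V = (neighbors - mu).T                     # d x K matrix of local directions
    beta, *_ = np.linalg.lstsq(V, x - mu, rcond=None)
    return np.linalg.norm(x - mu - V @ beta)   # residual orthogonal to the hull

def hknn_predict(x, X_train, y_train, K=5):
    """Assign x to the class whose K nearest neighbors span the closest
    local hyperplane (smallest residual distance)."""
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # K nearest neighbors of x within class c (plain Euclidean distance)
        idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:K]
        d = local_hyperplane_distance(x, Xc[idx])
        if d < best_dist:
            best_class, best_dist = c, d
    return best_class

# Toy usage: two Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(hknn_predict(np.array([3.5, 4.2]), X, y, K=5))  # expected: 1
```

The design intuition is that directions spanned by the centered neighbors approximate local invariances of the class, so moving the test point along them costs nothing, which is the connection to Tangent Distance drawn above. Note that once K exceeds the input dimensionality plus one the affine hull can pass through any point, so a small K (or some form of regularization, not shown here) is assumed.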