Koby Crammer, Ran Gilad-Bachrach, Amir Navot, Naftali Tishby
Prototype-based algorithms are commonly used to reduce the computational complexity of Nearest-Neighbour (NN) classifiers. In this paper we discuss theoretical and algorithmic aspects of such algorithms. On the theoretical side, we present margin-based generalization bounds that suggest that these kinds of classifiers can be more accurate than the 1-NN rule. Furthermore, we derive a training algorithm that selects a good set of prototypes using large-margin principles. We also show that the twenty-year-old Learning Vector Quantization (LVQ) algorithm emerges naturally from our framework.
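For readers unfamiliar with LVQ, the following is a minimal sketch of the classic LVQ1 update rule (attract the nearest prototype on a correct match, repel it on a mismatch); it illustrates the general prototype-based setting the abstract refers to, not the specific large-margin algorithm derived in the paper, and all function names and parameters are illustrative.

```python
import math

def lvq1(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """Sketch of the classic LVQ1 rule: the nearest prototype is pulled
    toward a same-label example and pushed away from a differently
    labelled one."""
    P = [list(p) for p in prototypes]
    for _ in range(epochs):
        for x, label in zip(X, y):
            # find the nearest prototype under Euclidean distance
            j = min(range(len(P)), key=lambda k: math.dist(P[k], x))
            # attract (+) on label match, repel (-) on mismatch
            sign = 1.0 if proto_labels[j] == label else -1.0
            P[j] = [p + sign * lr * (xi - p) for p, xi in zip(P[j], x)]
    return P

def predict(P, proto_labels, x):
    # classify by the label of the nearest prototype (1-NN over prototypes)
    j = min(range(len(P)), key=lambda k: math.dist(P[k], x))
    return proto_labels[j]
```

Because classification at test time consults only the (small) prototype set rather than the full training set, the 1-NN search cost drops accordingly, which is the computational motivation stated above.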