NeurIPS 2020

Extrapolation Towards Imaginary 0-Nearest Neighbour and Its Improved Convergence Rate

Meta Review

The paper presents a new nonparametric learning method, which combines elements of k-nearest neighbors with elements of local regression estimation. It recovers the optimal rates for classification with smooth regression functions under Tsybakov noise, previously established for a local polynomial regression method, while using a predictor representation with far fewer parameters, as in a simple weighted k-NN predictor. The reviewers favor accepting the paper, though with some reservations: they would prefer more space dedicated to presenting the new techniques, and more investigation into the strengths of this particular method relative to standard techniques such as weighted k-NN and local polynomial regression.
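To make the idea concrete, here is a minimal illustrative sketch (not the authors' exact estimator) of extrapolating towards an imaginary 0-NN: compute plain k-NN estimates at several scales k, then linearly extrapolate them to a zero-radius limit to reduce bias. The function name, the choice of k values, and the use of the mean squared neighbor distance as the regressor are assumptions made for illustration only.

```python
import numpy as np

def multiscale_knn_extrapolation(X_train, y_train, x_query, ks=(5, 10, 20, 40)):
    """Illustrative sketch: extrapolate multiscale k-NN estimates
    towards an imaginary 0-NN (zero-radius) limit.

    NOTE: this is a hypothetical simplification, not the paper's
    actual estimator or weighting scheme.
    """
    # Distances from the query to all training points
    dists = np.linalg.norm(X_train - x_query, axis=1)
    order = np.argsort(dists)

    estimates, radii2 = [], []
    for k in ks:
        idx = order[:k]
        estimates.append(y_train[idx].mean())    # plain k-NN estimate at scale k
        radii2.append((dists[idx] ** 2).mean())  # squared-radius proxy for the bias

    # Linear fit of estimate vs. squared radius; the intercept plays the
    # role of the (imaginary) 0-NN value at radius zero.
    A = np.vstack([np.ones(len(ks)), radii2]).T
    coef, *_ = np.linalg.lstsq(A, np.array(estimates), rcond=None)
    return coef[0]
```

On constant data the extrapolation is exact, since every k-NN estimate already equals the target value; on smooth nonconstant targets, the intercept removes the leading bias term that grows with the neighborhood radius.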