Part of Advances in Neural Information Processing Systems 26 (NIPS 2013)
Samory Kpotufe, Francesco Orabona
We consider the problem of maintaining the data structures of a partition-based regression procedure in a setting where the training data arrive sequentially over time. We prove that it is possible to maintain such a structure in time $O(\log n)$ at any time step $n$ while achieving a near-optimal regression rate of $\tilde{O}(n^{-2/(2+d)})$ in terms of the unknown metric dimension $d$. Finally, we prove a new regression lower bound that is independent of a given data size, and hence is more appropriate for the streaming setting.
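To make the streaming setup concrete, below is a minimal illustrative sketch of a partition-based regressor maintained under sequential insertions. This is not the authors' algorithm; the tree structure, the midpoint split rule, and the leaf capacity `max_leaf_size` are assumptions made only to illustrate per-insertion cost proportional to the tree depth and prediction by averaging responses within the cell containing the query point.

```python
# Illustrative sketch (not the paper's method): a tree-structured partition
# over an axis-aligned box, refined locally as data arrive one at a time.
import numpy as np


class _Node:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi        # cell bounds (axis-aligned box)
        self.xs, self.ys = [], []        # points stored while this is a leaf
        self.split_dim = None            # None while the node is a leaf
        self.split_val = None
        self.left = self.right = None


class StreamingPartitionRegressor:
    def __init__(self, lo, hi, max_leaf_size=8):
        self.root = _Node(np.asarray(lo, float), np.asarray(hi, float))
        self.max_leaf_size = max_leaf_size  # illustrative leaf capacity

    def insert(self, x, y):
        """Insert one (x, y) pair; cost is O(tree depth) plus an occasional local split."""
        x = np.asarray(x, float)
        node = self.root
        while node.split_dim is not None:            # descend to the leaf cell
            node = node.left if x[node.split_dim] <= node.split_val else node.right
        node.xs.append(x)
        node.ys.append(y)
        if len(node.xs) > self.max_leaf_size:        # refine the partition locally
            self._split(node)

    def _split(self, node):
        d = int(np.argmax(node.hi - node.lo))        # split widest side at its midpoint
        v = 0.5 * (node.lo[d] + node.hi[d])
        node.split_dim, node.split_val = d, v
        left_hi, right_lo = node.hi.copy(), node.lo.copy()
        left_hi[d], right_lo[d] = v, v
        node.left = _Node(node.lo.copy(), left_hi)
        node.right = _Node(right_lo, node.hi.copy())
        for x, y in zip(node.xs, node.ys):           # redistribute stored points
            child = node.left if x[d] <= v else node.right
            child.xs.append(x)
            child.ys.append(y)
        node.xs, node.ys = [], []

    def predict(self, x):
        """Average the responses stored in the leaf cell containing x."""
        x = np.asarray(x, float)
        node = self.root
        while node.split_dim is not None:
            node = node.left if x[node.split_dim] <= node.split_val else node.right
        return float(np.mean(node.ys)) if node.ys else 0.0


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reg = StreamingPartitionRegressor(lo=[0, 0], hi=[1, 1])
    for _ in range(2000):                            # training data arrive sequentially
        x = rng.random(2)
        reg.insert(x, np.sin(4 * x[0]) + 0.1 * rng.standard_normal())
    print(reg.predict([0.3, 0.7]))
```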