
The Space Complexity of Approximating Logistic Loss

Part of Advances in Neural Information Processing Systems 37 (NeurIPS 2024) Main Conference Track


Authors

Gregory Dexter, Petros Drineas, Rajiv Khanna

Abstract

We provide space complexity lower bounds for data structures that approximate logistic loss up to $\epsilon$-relative error on a logistic regression problem with data $\mathbf{X} \in \mathbb{R}^{n \times d}$ and labels $\mathbf{y} \in \{-1, 1\}^d$. The space complexity of existing coreset constructions depends on a natural complexity measure $\mu_\mathbf{y}(\mathbf{X})$. We give an $\tilde{\Omega}(\frac{d}{\epsilon^2})$ space complexity lower bound in the regime $\mu_\mathbf{y}(\mathbf{X}) = O(1)$ that shows existing coresets are optimal in this regime up to lower order factors. We also prove a general $\tilde{\Omega}(d \cdot \mu_\mathbf{y}(\mathbf{X}))$ space lower bound when $\epsilon$ is constant, showing that the dependency on $\mu_\mathbf{y}(\mathbf{X})$ is not an artifact of mergeable coresets. Finally, we refute a prior conjecture that $\mu_\mathbf{y}(\mathbf{X})$ is hard to compute by providing an efficient linear programming formulation, and we empirically compare our algorithm to prior approximate methods.
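In the coreset literature (e.g. Munteanu et al., 2018), the complexity measure is commonly defined as $\mu_\mathbf{y}(\mathbf{X}) = \sup_{\beta \neq 0} \|(\mathbf{Z}\beta)^+\|_1 / \|(\mathbf{Z}\beta)^-\|_1$, where $\mathbf{Z} = \mathrm{diag}(\mathbf{y})\mathbf{X}$ and $(\cdot)^+, (\cdot)^-$ denote positive and negative parts. Under that definition, the following is a minimal sketch of one way to cast the supremum as a linear program via a Charnes–Cooper-style normalization (fix the denominator to 1 and maximize the numerator); it is an illustration of the general idea, not necessarily the exact formulation in the paper, and `mu_lp` is a hypothetical helper name:

```python
import numpy as np
from scipy.optimize import linprog

def mu_lp(X, y):
    """Estimate mu_y(X) = sup_{beta != 0} ||(Z beta)^+||_1 / ||(Z beta)^-||_1
    by solving one LP. Variables are [beta (d), p (n), m (n)] with
    Z beta = p - m, p, m >= 0, and the normalization sum(m) = 1."""
    n, d = X.shape
    Z = y[:, None] * X  # rows y_i * x_i^T

    # Objective: maximize sum(p)  <=>  minimize -sum(p).
    c = np.concatenate([np.zeros(d), -np.ones(n), np.zeros(n)])

    # Equality constraints: Z beta - p + m = 0 (n rows), sum(m) = 1 (1 row).
    A_eq = np.block([
        [Z, -np.eye(n), np.eye(n)],
        [np.zeros((1, d)), np.zeros((1, n)), np.ones((1, n))],
    ])
    b_eq = np.concatenate([np.zeros(n), [1.0]])

    # beta is free; p and m are nonnegative.
    bounds = [(None, None)] * d + [(0, None)] * (2 * n)

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    if res.status == 0:
        return -res.fun
    # An unbounded LP corresponds to linearly separable data (mu = infinity).
    return np.inf
```

The normalization works because the ratio is invariant to scaling $\beta$: along any direction $\hat\beta$, the largest feasible scaling makes the denominator constraint tight, at which point the objective equals exactly $\|(\mathbf{Z}\hat\beta)^+\|_1 / \|(\mathbf{Z}\hat\beta)^-\|_1$, so the LP optimum recovers the supremum whenever it is finite.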