Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity

Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)


Authors

Haim Kaplan, Yishay Mansour, Uri Stemmer, Eliad Tsfadia

Abstract

We present a differentially private learner for halfspaces over a finite grid $G$ in $\mathbb{R}^d$ with sample complexity $\approx d^{2.5}\cdot 2^{\log^*|G|}$, which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a $d^2$ factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: Given a feasible collection of $m$ linear constraints of the form $Ax\geq b$, the task is to {\em privately} identify a solution $x$ that satisfies {\em most} of the constraints. Our algorithm is iterative, where each iteration determines the next coordinate of the constructed solution $x$.
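
To make the iterative structure concrete, here is a minimal Python sketch of one way a coordinate-by-coordinate private selection could look. It is an illustrative assumption, not the paper's algorithm: the per-round score (number of constraints satisfied by the current prefix, with the remaining coordinates zeroed), the candidate grid, and the function name `private_feasibility_sketch` are all hypothetical choices made only for the example. Each round applies the exponential mechanism over a sensitivity-1 count.

```python
import numpy as np

def private_feasibility_sketch(A, b, grid, eps, rng=None):
    """Toy sketch: build x one coordinate at a time, choosing each
    coordinate from a finite grid via the exponential mechanism.

    The score for a candidate value is the number of constraints
    A x >= b satisfied by the current prefix (later coordinates are
    still zero). This score is a crude stand-in for the paper's
    actual per-round objective.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, d = A.shape
    x = np.zeros(d)
    for j in range(d):
        scores = []
        for v in grid:
            x[j] = v
            # Changing one constraint changes this count by at most 1,
            # so the score has sensitivity 1.
            scores.append(np.sum(A[:, : j + 1] @ x[: j + 1] >= b))
        scores = np.array(scores, dtype=float)
        # Exponential mechanism: P[v] proportional to exp(eps * score / 2).
        logits = (eps / 2.0) * (scores - scores.max())
        probs = np.exp(logits)
        probs /= probs.sum()
        x[j] = grid[rng.choice(len(grid), p=probs)]
    return x

# Usage: a small feasible system with a planted solution.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 3))
x_true = np.array([0.5, -0.25, 0.75])
b = A @ x_true - 0.1          # x_true satisfies all constraints with slack
grid = np.linspace(-1, 1, 9)  # finite candidate grid per coordinate
x_priv = private_feasibility_sketch(A, b, grid, eps=1.0, rng=rng)
print(np.mean(A @ x_priv >= b))  # fraction of constraints satisfied
```

In this sketch each of the $d$ rounds is an $\varepsilon$-DP exponential mechanism, so the rounds compose to roughly $d\varepsilon$ overall; the paper's construction is more careful about both the per-round score and this composition, which is where the improved sample complexity comes from.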