Part of Advances in Neural Information Processing Systems 37 (NeurIPS 2024) Main Conference Track
Yiwen Kou, Zixiang Chen, Quanquan Gu, Sham M. Kakade
The k-sparse parity problem is a classical problem in computational complexity and algorithmic theory, serving as a key benchmark for understanding computational classes. In this paper, we solve the k-sparse parity problem with sign stochastic gradient descent, a variant of stochastic gradient descent (SGD), on two-layer fully-connected neural networks. We demonstrate that this approach can efficiently solve the k-sparse parity problem on a d-dimensional hypercube (k ≤ O(√d)) with a sample complexity of Õ(d^{k−1}) using 2^{Θ(k)} neurons, matching the established Ω(d^k) lower bound for Statistical Query (SQ) models. Our theoretical analysis begins by constructing a good neural network capable of correctly solving the k-parity problem. We then demonstrate how a neural network trained with sign SGD can effectively approximate this good network, solving the k-parity problem with small statistical error. To the best of our knowledge, this is the first result that matches the SQ lower bound for solving the k-sparse parity problem using gradient-based methods.
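To make the training procedure concrete, below is a minimal sketch of sign SGD on a two-layer fully-connected ReLU network for the k-sparse parity task on the hypercube. The width, step size, loss, initialization, and batch size are illustrative assumptions for exposition, not the paper's exact construction or its guarantees.

```python
import numpy as np

# Minimal sketch (not the paper's exact construction): sign SGD on a
# two-layer fully-connected ReLU network for the k-sparse parity problem.
# Width, step size, loss, and initialization are illustrative choices.

rng = np.random.default_rng(0)

d, k = 20, 3            # input dimension and sparsity of the parity
width = 2 ** (k + 2)    # hidden width on the order of 2^{Theta(k)}
lr = 0.05               # step size for sign SGD
steps = 5000
support = rng.choice(d, size=k, replace=False)  # unknown relevant coordinates

def sample_batch(n):
    """Uniform inputs on the hypercube {-1, +1}^d with k-sparse parity labels."""
    x = rng.choice([-1.0, 1.0], size=(n, d))
    y = np.prod(x[:, support], axis=1)
    return x, y

# Two-layer network f(x) = a^T relu(W x); second layer fixed to +-1/width.
W = rng.normal(scale=1.0 / np.sqrt(d), size=(width, d))
a = rng.choice([-1.0, 1.0], size=width) / width

def forward(x):
    h = np.maximum(W @ x.T, 0.0)   # (width, n) hidden activations
    return a @ h                    # (n,) network outputs

for t in range(steps):
    x, y = sample_batch(32)
    h = np.maximum(W @ x.T, 0.0)
    out = a @ h
    # Hinge-style loss max(0, 1 - y * out); gradient w.r.t. first-layer weights
    active = (y * out < 1.0).astype(float)   # examples still contributing loss
    grad_out = -(y * active)                 # d(loss)/d(out), per example
    grad_h = np.outer(a, grad_out) * (h > 0)  # (width, n), ReLU mask
    grad_W = grad_h @ x / x.shape[0]          # (width, d) stochastic gradient
    # Sign SGD update: step by lr times the sign of the stochastic gradient
    W -= lr * np.sign(grad_W)

# Evaluate classification accuracy on fresh samples
x_test, y_test = sample_batch(2000)
acc = np.mean(np.sign(forward(x_test)) == y_test)
print(f"test accuracy: {acc:.3f}")
```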