Sun Dec 8th through Sat Dec 14th, 2019, at Vancouver Convention Center
This paper analyzes the convergence rate of single-hidden-layer ReLU neural networks across Fourier frequencies. It shows that lower frequencies are learned first, and that bias terms make it possible to learn odd frequencies. The restriction to spherical data is limiting, but the analysis and conclusions (in particular the convergence rates) are novel and interesting. I recommend acceptance.
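The low-frequencies-first behavior the paper proves can be reproduced in a toy setting. The sketch below is not the paper's experiment: it fits a mixture of one low and one high Fourier frequency on a 1D grid with a one-hidden-layer ReLU network, training only the output layer (a random-features simplification of the paper's setting). The width, step size, step count, and the chosen frequencies (k=1 and k=8) are all assumptions made for this illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n, width = 256, 512
x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
y = np.sin(x) + np.sin(8 * x)          # low (k=1) plus high (k=8) frequency

# Fixed random hidden layer with biases; only the output weights `a` train.
w = rng.normal(size=width)
b = rng.normal(size=width)
h = np.maximum(0.0, np.outer(x, w) + b) / np.sqrt(width)  # (n, width) features
a = np.zeros(width)

def freq_error(pred, k):
    """Magnitude of the k-th Fourier coefficient of the residual y - pred."""
    return abs(np.exp(-1j * k * x) @ (y - pred)) / n

loss0 = 0.5 * np.mean((h @ a - y) ** 2)

lr = 0.1
for _ in range(2000):
    grad = h.T @ (h @ a - y) / n       # gradient of 0.5 * MSE in `a`
    a -= lr * grad

pred = h @ a
loss_final = 0.5 * np.mean((pred - y) ** 2)
low_err, high_err = freq_error(pred, 1), freq_error(pred, 8)
```

After training, `low_err` should be well below `high_err`: gradient descent drives down the residual's low-frequency component much faster, which is the qualitative phenomenon the paper quantifies with explicit per-frequency rates.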