NeurIPS 2020

Efficient Variational Inference for Sparse Deep Learning with Theoretical Guarantee

Meta Review

The reviewers agree that this is interesting, rigorous, and novel work exploring sparsity in deep neural networks. While the assumptions required for consistency are not as general as one would hope, the paper lays the groundwork for new research directions. One weakness the reviewers would like to see addressed is the limited discussion of related approaches in the Bayesian neural network (BNN) literature, and of how this approach might extend to more complex models.