NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 5904
Title: Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates

This paper improves upon state-of-the-art information-theoretic generalization bounds for iterative algorithms using PAC-Bayes theory. When particularized to SGLD, this machinery yields a generalization bound that scales with the trace of the gradient covariance along the algorithm's trajectory. This is a topic of current interest, and the techniques in this paper will certainly be useful for further research.
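
To make the trajectory-dependent quantity concrete, the following is a minimal, hypothetical sketch (not the authors' code) of SGLD on a toy squared-loss problem that accumulates the trace of the per-example gradient covariance at each iterate; all names, hyperparameters, and the toy model are illustrative assumptions.

```python
# Hypothetical illustration: SGLD on a toy linear-regression loss, tracking
# the trace of the empirical per-example gradient covariance along the path.
# This is only a sketch of the kind of quantity such a bound scales with.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: n samples, d features, linear model with squared loss.
n, d = 200, 10
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def per_example_grads(w):
    """Gradient of the squared loss for each example, shape (n, d)."""
    residuals = X @ w - y          # (n,)
    return residuals[:, None] * X  # (n, d)

w = np.zeros(d)
step_size, temperature, batch_size, T = 1e-3, 1.0, 32, 500
trace_sum = 0.0  # accumulates tr(Cov of per-example gradients) over iterates

for t in range(T):
    grads = per_example_grads(w)  # (n, d)
    # Trace of the empirical covariance of per-example gradients at w_t.
    trace_sum += np.var(grads, axis=0).sum()

    # SGLD update: minibatch gradient step plus Gaussian noise.
    batch = rng.choice(n, size=batch_size, replace=False)
    g = grads[batch].mean(axis=0)
    noise = rng.normal(size=d) * np.sqrt(2.0 * step_size / temperature)
    w = w - step_size * g + noise

print(f"sum of gradient-covariance traces along the trajectory: {trace_sum:.3f}")
```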