NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 5136
Title: Data-dependent Sample Complexity of Deep Neural Networks via Lipschitz Augmentation

This paper develops a new norm-based generalization error bound for deep neural networks with smooth activations; unlike most previous work, the bound depends only polynomially on the depth. The bound is tighter than previously obtained bounds in the same direction and offers new insight into generalization error analysis for deep learning. In particular, it connects the analyses of deep and shallow networks more naturally than existing research. The paper would benefit from a more intuitive exposition of the bound and more comparisons with existing bounds on real datasets.