Sun, Dec 8 through Sat, Dec 14, 2019, at the Vancouver Convention Center
This paper presents a risk bound for data-dependent hypothesis classes, stated in terms of a notion of stability of the hypothesis class and a newly proposed extension of Rademacher complexity to data-dependent classes. The paper is clearly written, and the results are interesting and mathematically sound. The unification of complexity-based and stability-based analyses for learning with data-dependent hypothesis classes is a significant contribution. The main theoretical result (Theorem 2) applies to a wide range of learning algorithms and is thus relevant to a large body of machine learning work. A nice analysis is presented for bagging, stochastic strongly convex optimisation, and distillation. Hence, we recommend acceptance.
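For context, the quantity being extended is the standard empirical Rademacher complexity of a fixed hypothesis class; a sketch of the usual definition follows (the paper's data-dependent variant presumably replaces the fixed class with a sample-dependent one, and its exact form is not reproduced here):

```latex
% Standard empirical Rademacher complexity of a fixed class H
% on a sample S = (z_1, ..., z_n), with i.i.d. Rademacher signs sigma_i:
\hat{\mathfrak{R}}_S(\mathcal{H})
  = \mathbb{E}_{\sigma}\!\left[\,\sup_{h \in \mathcal{H}}
      \frac{1}{n} \sum_{i=1}^{n} \sigma_i\, h(z_i)\right]
% The data-dependent setting replaces the fixed class \mathcal{H}
% with a sample-dependent class \mathcal{H}(S); controlling the
% resulting bound is where the stability notion enters.
```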