NeurIPS 2020

Minimax Lower Bounds for Transfer Learning with Linear and One-hidden Layer Neural Networks


Meta Review

This paper addresses the problem of inductive transfer with one-hidden-layer neural networks or linear models and establishes minimax lower bounds for these models. The three reviewers and the AC agree that this is a well-written paper studying an important problem. The proposed fine-grained minimax rate for transfer learning is a nice contribution to the field. Although the setting is somewhat simple, this work is inspiring for the study of inductive transfer with neural networks. There remain some minor concerns about the organization of the paper and the evaluation of the proposed lower bound, which should be fully addressed in the camera-ready version. One reviewer with a negative score pointed out that while the findings apply to shallow neural networks, they rely on rather strict constraints, which limits the possibility of extending them to more complex neural networks. The AC agrees with this concern but believes the paper is worth accepting as a first step toward the challenging goal of theoretically understanding inductive transfer. The authors are certainly encouraged to further explore the boundaries of this research direction.