NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 3751
Title: Efficient Smooth Non-Convex Stochastic Compositional Optimization via Stochastic Recursive Gradient Descent


This paper was discussed at length among the reviewers and myself. After an extensive discussion, and thanks to the authors' rebuttal, the reviewers were convinced that the proposed algorithm and its analysis are novel, interesting, and worthy of publication at NeurIPS. However, the reviewers also noted the mismatch between the motivating examples in the introduction and the assumptions in the analysis. This must be fixed. Note that it is not enough to state that the assumptions hold in the "domain of optimization," because there is no guarantee that such a domain is bounded. So, please carefully take the reviewers' comments into account when preparing the camera-ready version.