NeurIPS 2020

Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems


Meta Review

The reviewers all agree that the improved complexity result is new for the class of stochastic nonconvex-strongly-concave minimax problems, and that the claim of optimal complexity was clarified in the rebuttal. The main weakness pointed out by the reviewers is the seeming lack of novelty, since the algorithm combines the existing techniques of SGDA and the SARAH gradient estimator; however, the authors' rebuttal makes a convincing case that this combination requires genuine innovation in the minimax setting. Another weakness is that the SREDA algorithm is rather complex: it has multi-level loops and can be hard to tune in practice. Overall, I recommend acceptance based on the value of the theoretical improvement. The authors should carefully address the remaining concerns of the reviewers in the revision, especially by clarifying the claim of optimal complexity. In addition, the experimental settings should be explained more clearly in the main paper, including the formulation of the distributionally robust optimization problem and the hyperparameter choices.