NeurIPS 2020

Continuous Submodular Maximization: Beyond DR-Submodularity


Meta Review

The paper was discussed among the reviewers after the rebuttal phase. Most reviewers found the rebuttal helpful and clarifying, and updated their scores and reviews. The consensus is to accept the paper as a poster at NeurIPS. One reviewer commented that while "the setting in this work is restricted and specialized, overall, the paper provides a comprehensive study of this setting, and after some minor edits to improve the readability, it is in a good shape to be accepted."

One concern was "the importance of studying (non-DR) continuous submodular functions in practice," which the authors' rebuttal helped address. The authors should discuss these motivating applications (from the rebuttal) in the paper itself, and attempt to include even more examples and applications to strengthen the motivation for studying this specific class of problems.

Another reviewer concern was that an existing paper (Soma-Yoshida 2018) gives a better result; however, the authors' rebuttal pointed out a flaw in that paper. In the discussions following the rebuttal, the reviewers found this point valid.

Additional comment: the 1/2-approximation guarantee for monotone DR-submodular maximization via gradient ascent is likely older than the 2017 paper cited in Related Work (possibly appearing in earlier work by Chekuri et al.); the authors should look into this and update the citation if needed.