Submodular optimization under a stochastic oracle is an important direction and has been studied in the maximization setting. This paper considers submodular minimization under stochastic oracle access and provides an upper bound on the sample complexity. Based on my own reading, I believe the paper will be of interest to the NeurIPS audience and makes an interesting contribution. However, I urge the authors to include the missing references on stochastic submodular maximization to make the distinction clearer:

(i) Gradient Methods for Submodular Maximization
(ii) Stochastic Submodular Maximization: The Case of Coverage Functions
(iii) Conditional Gradient Method for Stochastic Submodular Maximization: Closing the Gap
(iv) Submodular Optimization Under Noise

Another point that was brought up during the discussions was the sample complexity lower bound. In the maximization setting, this question has recently been resolved in the paper "Stochastic Conditional Gradient++". Providing a similar result in the minimization setting would be of great interest.

All in all, I recommend this paper for acceptance at NeurIPS, as it adds to our understanding of such fundamental optimization problems.