NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 537
Title: An Accelerated Decentralized Stochastic Proximal Algorithm for Finite Sums


This paper addresses optimization under communication constraints via decentralized proximal algorithms. Optimization under communication constraints is an important area of machine learning, and there is still much to be gained from rigorous research in this area. The statistical literature does not concern itself much with inter- and intra-processor communication, nor does the optimization (operations research) literature; machine learning is a natural community for advancing this research. This paper uses a randomization scheme for communicating information between nodes to achieve impressive convergence rates. The authors derive theoretical bounds on the estimation error induced by the communication constraint and demonstrate improvements over state-of-the-art approaches in empirical experiments.

The results appear to be technically sound and significant. The reviewers noted value in the estimates of synchronization times and in the combination of acceleration and asynchrony. One reviewer noted an important dependency on a parameter in the analysis, and the authors took steps to address that aspect in their response. However, that review remained unchanged after the author response and was not consistent with the other two reviews.

While the main contributions of this paper are theoretical rather than empirical, the theoretical ideas are important and, in my view, will drive research in this field forward. I strongly recommend this paper for acceptance.