Accelerating Stochastic Composition Optimization

Part of Advances in Neural Information Processing Systems 29 (NIPS 2016)


Authors

Mengdi Wang, Ji Liu, Ethan Fang

Abstract

Consider the stochastic composition optimization problem, where the objective is a composition of two expected-value functions. We propose a new stochastic first-order method, the accelerated stochastic compositional proximal gradient (ASC-PG) method, which updates based on queries to the sampling oracle at two different timescales. ASC-PG is the first proximal gradient method for the stochastic composition problem that can handle nonsmooth regularization penalties. We show that ASC-PG converges faster than the best known algorithms, and that it achieves the optimal sample-error complexity in several important special cases. We further demonstrate the application of ASC-PG to reinforcement learning and conduct numerical experiments.
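The problem has the form min_x f(E_w[g_w(x)]) + R(x), where R is a possibly nonsmooth penalty with a computable proximal operator. Below is a minimal Python sketch of a two-timescale stochastic compositional proximal gradient iteration in the spirit of ASC-PG: the variable x takes a proximal step along a chain-rule gradient estimate, while an auxiliary variable y tracks the inner expectation E[g(x)] at a slower timescale via an extrapolated query point. The step-size schedules, the l1 penalty, and the toy linear-quadratic instance are illustrative assumptions, not the paper's exact setup or rates.

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def asc_pg_sketch(sample_g, sample_jac_g, sample_grad_f, x0, lam,
                  iters=2000, a=1.0, b=0.5):
    """Two-timescale stochastic compositional proximal gradient (sketch).

    Minimizes f(E_w[g_w(x)]) + lam * ||x||_1 given noisy oracles:
      sample_g(x)      -- unbiased sample of g(x) = E_w[g_w(x)]
      sample_jac_g(x)  -- unbiased sample of the Jacobian of g at x
      sample_grad_f(y) -- unbiased sample of grad f at y
    The exponents a, b are assumed schedules; the paper derives the
    choices that yield the stated convergence guarantees.
    """
    x = np.asarray(x0, dtype=float).copy()
    y = sample_g(x)                      # running estimate of g(x)
    for k in range(1, iters + 1):
        alpha = 1.0 / k**a               # step size for the x update
        beta = min(1.0, 1.0 / k**b)      # slower averaging for y
        grad = sample_jac_g(x).T @ sample_grad_f(y)  # chain-rule estimate
        x_new = prox_l1(x - alpha * grad, alpha * lam)
        z = x + (x_new - x) / beta       # extrapolated query point
        y = (1.0 - beta) * y + beta * sample_g(z)
        x = x_new
    return x

if __name__ == "__main__":
    # Toy instance (assumed): g_w(x) = A x + noise, f(y) = 0.5 ||y - c||^2.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 10))
    c = rng.standard_normal(5)
    sample_g = lambda x: A @ x + 0.01 * rng.standard_normal(5)
    sample_jac_g = lambda x: A           # Jacobian is deterministic here
    sample_grad_f = lambda y: y - c
    x_hat = asc_pg_sketch(sample_g, sample_jac_g, sample_grad_f,
                          np.zeros(10), lam=0.1)
    obj = 0.5 * np.sum((A @ x_hat - c) ** 2) + 0.1 * np.sum(np.abs(x_hat))
    print("final objective (noise-free):", round(float(obj), 4))
```

Note the design choice that motivates the auxiliary variable: a single sample of g_w(x) plugged directly into grad f would make the chain-rule estimate biased, since f is nonlinear; the slow running average y mitigates this bias, and the extrapolated point z is what distinguishes the accelerated scheme from the basic two-timescale SCGD update.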