NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
The theory of serial composite minimization is well developed for a variety of loss functions and regularizers. The same cannot be said of parallel minimization in the "lock-free" or "asynchronous" setting popularized by the Hogwild! paper of Recht et al. This paper advances the state of the art in asynchronous parallel minimization of composite convex objectives, i.e., the sum of a differentiable loss and a possibly non-differentiable regularizer.

The "out of the box" in the title refers to an important feature of the proposed algorithm, ASYNCADA: it does not require the constraint set to be a "box" (a Cartesian product of intervals). The authors also provide an algorithm, HEDGEHOG, which is an asynchronous version of exponentiated gradient for optimization over the simplex.

The reviewers felt that this work, while not particularly groundbreaking, is an important step forward in an area of current interest in ML. The authors promised a number of changes that the reviewers agreed are important, and the authors should remember to make those changes in the final version.
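For context, the serial building blocks that the paper parallelizes are standard. Below is a minimal sketch of one serial proximal-gradient step for a composite objective (assuming an l1 regularizer for illustration; the paper covers general regularizers) and one exponentiated-gradient step over the simplex, the update that HEDGEHOG runs asynchronously. Function names are illustrative, not taken from the paper.

```python
import numpy as np

def prox_l1(x, lam):
    """Soft-thresholding: the proximal operator of lam * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def proximal_gradient_step(x, grad, step, lam):
    """One serial step for minimizing loss(x) + lam * ||x||_1:
    gradient step on the smooth loss, then the regularizer's prox."""
    return prox_l1(x - step * grad, step * lam)

def exponentiated_gradient_step(p, grad, step):
    """One exponentiated-gradient (entropic mirror-descent) step:
    multiplicative update followed by renormalization onto the simplex."""
    w = p * np.exp(-step * grad)
    return w / w.sum()
```

The asynchronous versions studied in the paper apply such updates from many threads without locking, so individual coordinates of `x` or `p` may be read and written at stale values.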