Split LBI: An Iterative Regularization Path with Structural Sparsity

Part of Advances in Neural Information Processing Systems 29 (NIPS 2016)


Authors

Chendi Huang, Xinwei Sun, Jiechao Xiong, Yuan Yao

Abstract

An iterative regularization path with structural sparsity is proposed in this paper, based on variable splitting and the Linearized Bregman Iteration, and hence called \emph{Split LBI}. Despite its simplicity, Split LBI outperforms the popular generalized Lasso in both theory and experiments. A theory of path consistency is presented, showing that, equipped with proper early stopping, Split LBI may achieve model selection consistency under a family of Irrepresentable Conditions which can be weaker than the necessary and sufficient condition for the generalized Lasso. Furthermore, $\ell_2$ error bounds at minimax optimal rates are also given. The utility and benefit of the algorithm are illustrated by applications to both traditional image denoising and a novel example of partial order ranking.
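To make the variable-splitting idea concrete, here is a minimal NumPy sketch of a Split-LBI-style iteration for the least-squares loss with structural sparsity imposed on $D\beta$ through an augmented variable $\gamma$. The update form (gradient descent on the split loss for $\beta$, a Linearized Bregman step with soft-thresholding for $\gamma$) follows the standard splitting scheme; the step-size heuristic, parameter defaults, and function names are assumptions for illustration, not the authors' reference implementation.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def split_lbi(X, y, D, kappa=10.0, nu=1.0, alpha=None, n_iter=2000):
    """Sketch of a Split-LBI-style iteration (hypothetical parameterization).

    Splits the structural-sparsity term via an augmented variable gamma and
    runs Linearized Bregman updates on the split loss
        L(beta, gamma) = ||y - X beta||^2 / (2n) + ||D beta - gamma||^2 / (2 nu).
    Returns the regularization path of (beta, gamma) iterates; early stopping
    along this path plays the role of the regularization parameter.
    """
    n, p = X.shape
    m = D.shape[0]
    if alpha is None:
        # Assumed step-size heuristic: small enough relative to the Lipschitz
        # constant of the gradient of L so the iteration stays stable.
        lip = np.linalg.norm(X, 2) ** 2 / n + np.linalg.norm(D, 2) ** 2 / nu
        alpha = 1.0 / (kappa * lip)
    beta = np.zeros(p)
    gamma = np.zeros(m)
    z = np.zeros(m)          # dual variable driving the Bregman path
    path = []
    for _ in range(n_iter):
        # Gradients of the split loss at the current (beta, gamma).
        residual = X @ beta - y
        grad_beta = X.T @ residual / n + D.T @ (D @ beta - gamma) / nu
        grad_gamma = (gamma - D @ beta) / nu
        # Gradient-descent step on beta; Bregman step + shrinkage on gamma.
        beta = beta - kappa * alpha * grad_beta
        z = z - alpha * grad_gamma
        gamma = kappa * soft_threshold(z, 1.0)
        path.append((beta.copy(), gamma.copy()))
    return path
```

A small usage example with $D = I$ (plain sparsity): the path starts from the null model ($\gamma = 0$) and fits progressively less-regularized estimates, so model selection amounts to choosing a stopping time along the path.

```python
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.1 * rng.standard_normal(n)
path = split_lbi(X, y, np.eye(p), n_iter=3000)
```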