Dynamic Tensor Product Regression

Part of Advances in Neural Information Processing Systems 35 (NeurIPS 2022) Main Conference Track

Authors

Aravind Reddy, Zhao Song, Lichen Zhang

Abstract

In this work, we initiate the study of \emph{Dynamic Tensor Product Regression}. One has matrices $A_1 \in \mathbb{R}^{n_1 \times d_1}, \ldots, A_q \in \mathbb{R}^{n_q \times d_q}$ and a label vector $b \in \mathbb{R}^{n_1 \cdots n_q}$, and the goal is to solve the regression problem with the design matrix $A$ being the tensor product of the matrices $A_1, A_2, \ldots, A_q$, i.e., $\min_{x \in \mathbb{R}^{d_1 \cdots d_q}} \|(A_1 \otimes \cdots \otimes A_q)x - b\|_2$. At each time step, one matrix $A_i$ receives a sparse change, and the goal is to maintain a sketch of the tensor product $A_1 \otimes \cdots \otimes A_q$ so that the regression solution can be updated quickly. Recomputing the solution from scratch in each round is extremely expensive, so it is important to develop algorithms that can quickly update the solution with the new design matrix. Our main result is a dynamic tree data structure in which any update to a single matrix can be propagated quickly throughout the tree. We show that our data structure can be used to solve dynamic versions of not only Tensor Product Regression, but also Tensor Product Spline Regression (a generalization of ridge regression), and for maintaining low-rank approximations of the tensor product.
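
To make the setting concrete, here is a minimal NumPy sketch of the static problem and of the naive recompute-from-scratch baseline that the abstract describes as expensive; the sizes, random data, and single-entry update are illustrative assumptions, and this is the baseline the paper's dynamic tree data structure is designed to avoid, not the paper's algorithm.

```python
import numpy as np

# Illustrative problem sizes (names n1, d1, n2, d2 follow the abstract, with q = 2).
n1, d1 = 8, 3
n2, d2 = 6, 2
rng = np.random.default_rng(0)

A1 = rng.standard_normal((n1, d1))
A2 = rng.standard_normal((n2, d2))
b = rng.standard_normal(n1 * n2)

# Naive baseline: materialize the tensor (Kronecker) product and solve
# min_x ||(A1 ⊗ A2) x - b||_2 from scratch.
A = np.kron(A1, A2)                       # shape (n1*n2, d1*d2)
x, *_ = np.linalg.lstsq(A, b, rcond=None)

# A sparse change to a single matrix: update one entry of A1.
A1[2, 1] += 0.5

# Without a dynamic data structure, one must rebuild the full product
# and re-solve -- exactly the recomputation the paper avoids.
A_new = np.kron(A1, A2)
x_new, *_ = np.linalg.lstsq(A_new, b, rcond=None)
```

Note how the materialized design matrix has $n_1 n_2$ rows and $d_1 d_2$ columns, so both its size and the cost of each re-solve grow multiplicatively with $q$; this is why maintaining a small sketch under sparse updates is the key object of study.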