Part of Advances in Neural Information Processing Systems 10 (NIPS 1997)
Stefan Schaal, Sethu Vijayakumar, Christopher Atkeson
If globally high-dimensional data has locally only low-dimensional distributions, it is advantageous to perform a local dimensionality reduction before further processing the data. In this paper we examine several techniques for local dimensionality reduction in the context of locally weighted linear regression. As possible candidates, we derive local versions of factor analysis regression, principal component regression, principal component regression on joint distributions, and partial least squares regression. After outlining the statistical bases of these methods, we perform Monte Carlo simulations to evaluate their robustness with respect to violations of their statistical assumptions. One surprising outcome is that locally weighted partial least squares regression offers the best average results, thus outperforming even factor analysis, the theoretically most appealing of our candidate techniques.
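To make the best-performing candidate concrete, the following is a minimal sketch of univariate partial least squares regression (PLS1) via sequential deflation, the unweighted core of the locally weighted variant studied in the paper. The function names (`pls1_fit`, `pls1_predict`) and the NumPy formulation are illustrative assumptions, not the authors' implementation; a full locally weighted version would additionally apply a kernel weight to each training point around the query.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit univariate partial least squares (PLS1) by sequential deflation.

    Illustrative sketch (not the paper's code). Returns per-component
    weight vectors W, loadings P, scalar coefficients b, and the
    training means used for centering.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean          # residuals, deflated each round
    W, P, b = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr                        # direction most covariant with y
        w /= np.linalg.norm(w)
        t = Xr @ w                           # latent score along that direction
        tt = t @ t
        p = Xr.T @ t / tt                    # loading: regress columns of X on t
        coef = (yr @ t) / tt                 # 1-D regression of y on t
        Xr = Xr - np.outer(t, p)             # deflate X by the explained part
        yr = yr - coef * t                   # deflate y likewise
        W.append(w); P.append(p); b.append(coef)
    return W, P, b, x_mean, y_mean

def pls1_predict(x, W, P, b, x_mean, y_mean):
    """Predict the output for a single query point x."""
    xr = np.asarray(x, dtype=float) - x_mean
    y_hat = y_mean
    for w, p, coef in zip(W, P, b):
        t = xr @ w                           # project onto stored direction
        y_hat += coef * t
        xr = xr - t * p                      # deflate the query consistently
    return y_hat
```

With as many components as the rank of the (centered) inputs, PLS1 reproduces the ordinary least-squares fit; with fewer components it regresses only on the directions most covariant with the output, which is what makes it robust when the data are locally low-dimensional.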