Dual-Tree Fast Gauss Transforms

Part of Advances in Neural Information Processing Systems 18 (NIPS 2005)

Authors

Dongryeol Lee, Andrew Moore, Alexander Gray

Abstract

In previous work we presented an efficient approach to computing kernel summations which arise in many machine learning methods such as kernel density estimation. This approach, dual-tree recursion with finite-difference approximation, generalized existing methods for similar problems arising in computational physics in two ways appropriate for statistical problems: toward distribution sensitivity and general dimension, partly by avoiding series expansions. While this proved to be the fastest practical method for multivariate kernel density estimation at the optimal bandwidth, it is much less efficient at larger-than-optimal bandwidths. In this work, we explore the extent to which the dual-tree approach can be integrated with multipole-like Hermite expansions in order to achieve reasonable efficiency across all bandwidth scales, though only for low dimensionalities. In the process, we derive and demonstrate the first truly hierarchical fast Gauss transforms, effectively combining the best tools from discrete algorithms and continuous approximation theory.

1 Fast Gaussian Summation

Kernel summations are fundamental in both statistics/learning and computational physics.
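As a concrete reference point, the following is a minimal sketch of the naive O(NM) discrete Gauss transform, G(y_j) = sum_i w_i exp(-||y_j - x_i||^2 / h^2), which is the computation that fast methods like those in this paper accelerate. The function name, the bandwidth parameter h, and the example data are illustrative, not from the paper.

    import numpy as np

    def naive_gauss_transform(sources, targets, weights, h):
        # Naive O(N*M) evaluation of the discrete Gauss transform:
        #   G(y_j) = sum_i w_i * exp(-||y_j - x_i||^2 / h^2)
        # sources: (N, d), targets: (M, d), weights: (N,)
        diffs = targets[:, None, :] - sources[None, :, :]   # (M, N, d)
        sq_dists = np.sum(diffs ** 2, axis=-1)              # (M, N)
        return np.exp(-sq_dists / h ** 2) @ weights         # (M,)

    # Example: 1000 sources and targets in 3 dimensions
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))   # source points
    Y = rng.normal(size=(1000, 3))   # target (query) points
    w = np.ones(1000)                # uniform weights (KDE up to normalization)
    G = naive_gauss_transform(X, Y, w, h=0.5)

The quadratic cost of this direct evaluation in the number of points is what motivates both the dual-tree and the fast-Gauss-transform families of algorithms.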