Learning Informative Statistics: A Nonparametric Approach

Part of Advances in Neural Information Processing Systems 12 (NIPS 1999)


Authors

John W. Fisher III, Alexander Ihler, Paul Viola

Abstract

We discuss an information theoretic approach for categorizing and modeling dynamic processes. The approach can learn a compact and informative statistic which summarizes past states to predict future observations. Furthermore, the uncertainty of the prediction is characterized nonparametrically by a joint density over the learned statistic and the present observation. We discuss the application of the technique to both noise-driven dynamical systems and random processes sampled from a density which is conditioned on the past. In the first case we show results in which both the dynamics of a random walk and the statistics of the driving noise are captured. In the second case we present results in which a summarizing statistic is learned on noisy random telegraph waves with differing dependencies on past states. In both cases the algorithm yields a principled approach for discriminating processes with differing dynamics and/or dependencies. The method is grounded in ideas from information theory and nonparametric statistics.
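The abstract's central idea, estimating a joint density over a learned summary statistic and the present observation, can be illustrated with a minimal sketch. This is not the paper's algorithm: the statistic here is a fixed linear projection of a past window (the paper learns it), the density is a plain Parzen (Gaussian kernel) estimate, and the AR(1) process, window length, and bandwidth are all illustrative assumptions.

```python
import numpy as np

def summary_statistic(past, w):
    # Hypothetical summary statistic: a linear projection of the past window.
    # (In the paper this mapping is learned; here the weights are fixed.)
    return past @ w

def parzen_joint_density(points, query, bandwidth=0.3):
    # Parzen (kernel) estimate of the joint density over
    # (statistic, observation) pairs, using isotropic Gaussian kernels.
    diffs = points[None, :, :] - query[:, None, :]      # (Q, N, d)
    sq = np.sum(diffs ** 2, axis=-1)                    # squared distances
    d = points.shape[1]
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)  # Gaussian normalizer
    return np.exp(-sq / (2.0 * bandwidth ** 2)).mean(axis=1) / norm

rng = np.random.default_rng(0)

# Toy noise-driven process: an AR(1) random walk with Gaussian driving noise.
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.9 * x[t - 1] + 0.1 * rng.standard_normal()

window = 3  # assumed length of the summarized past
past = np.stack([x[t - window:t] for t in range(window, len(x))])
obs = x[window:]                       # present observation for each window
w = np.full(window, 1.0 / window)      # placeholder (unlearned) weights

s = summary_statistic(past, w)
pts = np.column_stack([s, obs])        # samples of (statistic, observation)
dens = parzen_joint_density(pts, pts[:5])  # density at a few sample points
```

In the paper the projection weights would be adapted so that the statistic is maximally informative about the next observation; this sketch only shows how the resulting joint density could be evaluated nonparametrically.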