Just-In-Time Learning for Fast and Flexible Inference

Part of Advances in Neural Information Processing Systems 27 (NIPS 2014)


Authors

S. M. Ali Eslami, Daniel Tarlow, Pushmeet Kohli, John Winn

Abstract

Much research in machine learning has centered on the search for inference algorithms that are both general-purpose and efficient. The problem is extremely challenging, and general inference remains computationally expensive. We seek to address this problem by observing that in most specific applications of a model, only a small subset of all possible inference computations is actually needed. Motivated by this, we introduce just-in-time learning, a framework for fast and flexible inference that learns to speed up inference at run-time. Through a series of experiments, we show how this framework allows us to combine the flexibility of sampling with the efficiency of deterministic message-passing.
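To make the idea concrete, the sketch below illustrates one plausible reading of the framework under stated assumptions: an expensive, sampling-based "oracle" computes a factor's outgoing message, while a cheap regressor is trained on-the-fly from the oracle's answers and replaces it whenever its own uncertainty is low. All names (`JITMessagePredictor`, `send_message`, the k-nearest-neighbour uncertainty proxy, the thresholds) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np


class JITMessagePredictor:
    """Illustrative sketch (not the paper's code): cache oracle-computed
    outgoing messages and, once enough examples exist, answer future
    requests with a cheap learned prediction when its uncertainty is low;
    otherwise fall back to the oracle and grow the training set."""

    def __init__(self, oracle, uncertainty_threshold=0.05, min_train=20):
        self.oracle = oracle            # expensive sampling-based message computation
        self.threshold = uncertainty_threshold
        self.min_train = min_train
        self.X, self.Y = [], []         # incoming-message features -> outgoing message

    def _predict_with_uncertainty(self, x):
        # Hypothetical uncertainty estimate: 5-nearest-neighbour regression,
        # with the spread of the neighbours' targets as a confidence proxy.
        X, Y = np.asarray(self.X), np.asarray(self.Y)
        idx = np.argsort(np.linalg.norm(X - x, axis=1))[:5]
        return Y[idx].mean(axis=0), Y[idx].std(axis=0).max()

    def send_message(self, incoming):
        x = np.asarray(incoming, dtype=float)
        if len(self.X) >= self.min_train:
            pred, unc = self._predict_with_uncertainty(x)
            if unc < self.threshold:
                return pred             # fast path: learned message
        y = np.asarray(self.oracle(x), dtype=float)  # slow path: oracle
        self.X.append(x)
        self.Y.append(y)
        return y


# Toy oracle: Monte Carlo moments of y = x**2 when x is Gaussian with the
# (mu, sigma) carried by the incoming message.
def sampling_oracle(msg, n=10000):
    mu, sigma = msg
    samples = np.random.normal(mu, sigma, n) ** 2
    return [samples.mean(), samples.var()]


predictor = JITMessagePredictor(sampling_oracle)
for mu in np.random.uniform(-2.0, 2.0, 200):
    predictor.send_message([mu, 1.0])   # early calls hit the oracle, later ones the regressor
```

The design point this toy captures is the run-time trade-off from the abstract: the oracle preserves the flexibility of sampling, while the learned fast path supplies the efficiency of deterministic message-passing once the predictor has seen enough similar incoming messages.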