Part of Advances in Neural Information Processing Systems 20 (NIPS 2007)
Andreas Krause, Brendan McMahan, Carlos Guestrin, Anupam Gupta
In many applications, one has to actively select among a set of expensive observations before making an informed decision. Often, we want to select observations which perform well when evaluated with an objective function chosen by an adversary. Examples include minimizing the maximum posterior variance in Gaussian Process regression, robust experimental design, and sensor placement for outbreak detection. In this paper, we present the Submodular Saturation algorithm, a simple and efficient algorithm with strong theoretical approximation guarantees for the case where the possible objective functions exhibit submodularity, an intuitive diminishing returns property. Moreover, we prove that better approximation algorithms do not exist unless NP-complete problems admit efficient algorithms. We evaluate our algorithm on several real-world problems. For Gaussian Process regression, our algorithm compares favorably with state-of-the-art heuristics described in the geostatistics literature, while being simpler, faster, and providing theoretical guarantees. For robust experimental design, our algorithm performs favorably compared to SDP-based algorithms.
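For reference, the two notions the abstract appeals to can be stated formally. The following is a standard formulation sketched here for orientation, not a quotation of the paper's own notation; the ground set V, the budget k, and the candidate objectives F_1, ..., F_m are generic symbols introduced for this sketch.

    % Submodularity (diminishing returns): an additional observation s helps
    % less once a larger set of observations has already been selected.
    F(A \cup \{s\}) - F(A) \;\ge\; F(B \cup \{s\}) - F(B)
        \qquad \text{for all } A \subseteq B \subseteq \mathcal{V},\ s \in \mathcal{V} \setminus B .

    % Adversarial observation selection: choose at most k observations that
    % perform well under the worst of the candidate objectives.
    \max_{A \subseteq \mathcal{V},\ |A| \le k} \;\; \min_{i \in \{1,\dots,m\}} F_i(A) .

In this reading, the adversary picks which objective F_i the selected set A is judged by, so a good set must do well simultaneously for all of them.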