
Sampling from Structured Log-Concave Distributions via a Soft-Threshold Dikin Walk

Part of Advances in Neural Information Processing Systems 36 (NeurIPS 2023) Main Conference Track


Authors

Oren Mangoubi, Nisheeth K. Vishnoi

Abstract

Given a Lipschitz or smooth convex function $f: K \to \mathbb{R}$ for a bounded polytope $K := \{\theta \in \mathbb{R}^d : A\theta \le b\}$, where $A \in \mathbb{R}^{m \times d}$ and $b \in \mathbb{R}^m$, we consider the problem of sampling from the log-concave distribution $\pi(\theta) \propto e^{-f(\theta)}$ constrained to $K$. Interest in this problem derives from its applications to Bayesian inference and differential privacy. We present a generalization of the Dikin walk to this setting that requires at most $O((md + dL^2R^2) \times md^{\omega-1}\log(w/\delta))$ arithmetic operations to sample from $\pi$ within error $\delta > 0$ in the total variation distance from a $w$-warm start. Here $L$ is the Lipschitz constant of $f$, $K$ is contained in a ball of radius $R$ and contains a ball of smaller radius $r$, and $\omega \approx 2.37$ is the matrix-multiplication constant. This improves on the running time of prior works for a range of structured settings important for the aforementioned inference and privacy applications. Technically, we depart from previous Dikin walks by adding to a barrier function for $K$ a soft-threshold regularizer derived from the Lipschitz or smoothness properties of $f$; this allows our version of the Dikin walk to propose updates that have a high Metropolis acceptance ratio for $f$ while remaining inside the polytope $K$.
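
To make the proposal mechanism concrete, below is a minimal sketch of one step of a Metropolis-filtered Dikin-style walk in which the log-barrier Hessian for $K$ is regularized before being used as the proposal covariance. The simple isotropic term eta * I stands in for the paper's soft-threshold regularizer, and the names f, step_size, and eta are illustrative assumptions; this is not the authors' implementation.

# Sketch of one step of a regularized Dikin walk with a Metropolis filter.
# The isotropic regularizer eta * I and all parameter names are illustrative
# assumptions standing in for the paper's soft-threshold regularizer.
import numpy as np

def barrier_hessian(A, b, theta, eta):
    """Hessian of the log-barrier for {A theta <= b}, plus eta * I."""
    slack = b - A @ theta                      # strictly positive in the interior of K
    scaled = A / slack[:, None]                # rows a_i / (b_i - a_i^T theta)
    return scaled.T @ scaled + eta * np.eye(A.shape[1])

def log_proposal_density(H, center, point, step_size):
    """log N(point; center, step_size^2 * H^{-1}), up to a constant."""
    diff = point - center
    _, logdet = np.linalg.slogdet(H)
    return 0.5 * logdet - (diff @ H @ diff) / (2.0 * step_size**2)

def dikin_step(theta, f, A, b, step_size, eta, rng):
    """One Metropolis-filtered step targeting pi propto exp(-f) on K."""
    H = barrier_hessian(A, b, theta, eta)
    # Propose theta' ~ N(theta, step_size^2 * H^{-1}) via a Cholesky solve.
    L_chol = np.linalg.cholesky(H)
    z = rng.standard_normal(theta.shape)
    proposal = theta + step_size * np.linalg.solve(L_chol.T, z)
    if np.any(A @ proposal >= b):              # reject proposals outside K
        return theta
    H_prop = barrier_hessian(A, b, proposal, eta)
    # Metropolis-Hastings ratio; the state-dependent Gaussian proposal
    # is accounted for in both directions since H varies with the state.
    log_ratio = (f(theta) - f(proposal)
                 + log_proposal_density(H_prop, proposal, theta, step_size)
                 - log_proposal_density(H, theta, proposal, step_size))
    if np.log(rng.uniform()) < min(0.0, log_ratio):
        return proposal
    return theta

A full sampler would iterate dikin_step from a $w$-warm start; the paper's analysis ties the step size and the regularizer to $L$, $R$, and the smoothness of $f$, which this sketch does not attempt to reproduce.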