Part of Advances in Neural Information Processing Systems 35 (NeurIPS 2022) Main Conference Track
Xiang Cheng, Jingzhao Zhang, Suvrit Sra
We study the task of efficiently sampling from a Gibbs distribution $d\pi^* = e^{-h}\,d\mathrm{vol}_g$ over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this algorithm involves computing exponential maps in random Gaussian directions and is efficiently implementable in practice. The key to our analysis of Langevin MCMC is a bound on the discretization error of the geometric Euler-Maruyama scheme, assuming $\nabla h$ is Lipschitz and $M$ has bounded sectional curvature. Our error bound matches the error of the Euclidean Euler-Maruyama scheme in terms of its stepsize dependence. Combined with a contraction guarantee for the geometric Langevin diffusion under Kendall-Cranston coupling, we prove that the Langevin MCMC iterates lie within $\epsilon$-Wasserstein distance of $\pi^*$ after $\tilde{O}(\epsilon^{-2})$ steps, which matches the iteration complexity of Euclidean Langevin MCMC. Our results apply in general settings where $h$ can be nonconvex and $M$ can have negative Ricci curvature. Under the additional assumptions that the Riemannian curvature tensor has bounded derivatives and that $\pi^*$ satisfies a $CD(\cdot,\infty)$ condition, we analyze the stochastic gradient version of Langevin MCMC and bound its iteration complexity by $\tilde{O}(\epsilon^{-2})$ as well.
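To make the update rule concrete, here is a minimal sketch of the geometric Euler-Maruyama step, $x_{k+1} = \exp_{x_k}\!\big(-\eta\,\nabla h(x_k) + \sqrt{2\eta}\,\xi_k\big)$ with $\xi_k$ Gaussian in the tangent space. It assumes $M$ is the unit sphere $S^{d-1}$, where the exponential map has a closed form; the helper names (`sphere_exp`, `project_tangent`, `geometric_langevin_mcmc`) and the example potential $h(x) = \langle a, x\rangle$ are hypothetical illustrations under these assumptions, not the paper's implementation.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: move from x along tangent vector v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def project_tangent(x, u):
    """Project an ambient vector u onto the tangent space at x (|x| = 1)."""
    return u - np.dot(u, x) * x

def geometric_langevin_mcmc(grad_h, x0, step_size, n_steps, rng):
    """Geometric Euler-Maruyama sketch: at each step, take an exponential map
    in the direction of the (tangent) gradient drift plus sqrt(2*eta) times a
    standard Gaussian projected to the tangent space."""
    x = x0
    for _ in range(n_steps):
        drift = -step_size * project_tangent(x, grad_h(x))
        noise = np.sqrt(2.0 * step_size) * project_tangent(
            x, rng.standard_normal(x.shape)
        )
        x = sphere_exp(x, drift + noise)
    return x

# Illustrative run: sample from pi* proportional to exp(-<a, x>) on S^2.
rng = np.random.default_rng(0)
a = np.array([2.0, 0.0, 0.0])
grad_h = lambda x: a  # ambient gradient; projected to the tangent space above
x0 = np.array([0.0, 0.0, 1.0])
sample = geometric_langevin_mcmc(grad_h, x0, step_size=0.01, n_steps=2000, rng=rng)
print(sample, np.linalg.norm(sample))  # iterate stays on the sphere
```

The sphere is chosen only because its exponential map is explicit; on a general $M$ satisfying the paper's curvature assumptions, `sphere_exp` would be replaced by the manifold's exponential map (or a retraction approximating it), with everything else unchanged.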