Stochastic Variance-Reduced Hamilton Monte Carlo Methods

02/13/2018
by Difan Zou, et al.

We propose a fast stochastic Hamilton Monte Carlo (HMC) method for sampling from a smooth and strongly log-concave distribution. At the core of our proposed method is a variance reduction technique inspired by recent advances in stochastic optimization. We show that, to achieve ϵ accuracy in 2-Wasserstein distance, our algorithm requires Õ(n + κ^2 d^{1/2}/ϵ + κ^{4/3} d^{1/3} n^{2/3}/ϵ^{2/3}) gradient complexity (i.e., number of component gradient evaluations), which outperforms the state-of-the-art HMC and stochastic gradient HMC methods in a wide regime. We also extend our algorithm to sampling from smooth and general log-concave distributions, and prove the corresponding gradient complexity. Experiments on both synthetic and real data demonstrate the superior performance of our algorithm.
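To make the idea concrete, here is a minimal sketch (not the paper's algorithm or experimental setup) of SVRG-style variance reduction inside stochastic-gradient HMC leapfrog steps. It assumes a finite-sum potential f(x) = (1/n) Σ_i f_i(x) and, purely for illustration, takes f_i(x) = ½‖x − a_i‖², so the target is a Gaussian centered at the mean of the a_i; all names and parameter choices are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 2
a = rng.normal(size=(n, d))          # toy per-component "data points"

def grad_i(x, idx):
    """Per-component gradients ∇f_i(x) = x − a_i for indices in idx."""
    return x[None, :] - a[idx]

def full_grad(x):
    """Exact gradient of f(x) = (1/n) Σ_i ½‖x − a_i‖²."""
    return x - a.mean(axis=0)

def svrg_grad(x, snap_x, snap_g, batch):
    """Variance-reduced estimator: ∇f(x̃) + mean_i[∇f_i(x) − ∇f_i(x̃)].

    (For this quadratic toy f_i the correction is exact; in general it is an
    unbiased, low-variance estimate of ∇f(x).)
    """
    return snap_g + grad_i(x, batch).mean(axis=0) - grad_i(snap_x, batch).mean(axis=0)

def svr_hmc(num_epochs=50, inner=20, leapfrog=5, eta=0.05, batch_size=10):
    x = np.zeros(d)
    samples = []
    for _ in range(num_epochs):
        # One full gradient per epoch at the snapshot point x̃.
        snap_x, snap_g = x.copy(), full_grad(x)
        for _ in range(inner):
            v = rng.normal(size=d)                 # resample momentum
            batch = rng.integers(0, n, batch_size)
            v -= 0.5 * eta * svrg_grad(x, snap_x, snap_g, batch)
            for _ in range(leapfrog - 1):          # leapfrog integration
                x += eta * v
                batch = rng.integers(0, n, batch_size)
                v -= eta * svrg_grad(x, snap_x, snap_g, batch)
            x += eta * v
            batch = rng.integers(0, n, batch_size)
            v -= 0.5 * eta * svrg_grad(x, snap_x, snap_g, batch)
            samples.append(x.copy())
    return np.array(samples)

samples = svr_hmc()
print(samples[200:].mean(axis=0))  # should be close to a.mean(axis=0)
```

The key cost structure mirrors the complexity bound above: one full pass over the n components per epoch (the n term) plus cheap minibatch corrections in the inner loop.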


Related research:

- A New Framework for Variance-Reduced Hamiltonian Monte Carlo (02/09/2021)
- On the Theory of Variance Reduction for Stochastic Gradient Monte Carlo (02/15/2018)
- Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo (11/02/2019)
- Complexity of zigzag sampling algorithm for strongly log-concave distributions (12/21/2020)
- Stochastic Bias-Reduced Gradient Methods (06/17/2021)
- Projected Stochastic Gradient Langevin Algorithms for Constrained Sampling and Non-Convex Learning (12/22/2020)
- Oracle lower bounds for stochastic gradient sampling algorithms (02/01/2020)