Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo

11/02/2019
by Bao Wang, et al.

As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, SGLD typically suffers from a slow convergence rate due to the large variance introduced by the stochastic gradient. To alleviate this drawback, we leverage the recently developed Laplacian smoothing (LS) technique and propose a Laplacian smoothing stochastic gradient Langevin dynamics (LS-SGLD) algorithm. We prove that, for sampling from both log-concave and non-log-concave densities, LS-SGLD achieves a strictly smaller discretization error in 2-Wasserstein distance, although its mixing rate can be slightly slower. Experiments on both synthetic and real datasets verify our theoretical results and demonstrate the superior performance of LS-SGLD on machine learning tasks including posterior sampling, Bayesian logistic regression, and training Bayesian convolutional neural networks. The code is available at <https://github.com/BaoWangMath/LS-MCMC>.
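To make the idea concrete, here is a minimal sketch of one LS-SGLD update, assuming the standard Laplacian smoothing operator A_σ = I − σD (with D the one-dimensional discrete Laplacian under periodic boundary conditions), inverted in the Fourier domain. The function name, step size, and choice of σ are illustrative, not taken from the paper's released code.

```python
import numpy as np

def ls_sgld_step(x, stoch_grad, step_size=1e-3, sigma=1.0,
                 rng=np.random.default_rng()):
    """One LS-SGLD update (sketch).

    The Laplacian-smoothing operator A_sigma = I - sigma * D is
    circulant, so applying A_sigma^{-1} (and A_sigma^{-1/2}) reduces
    to a pointwise division by its eigenvalues in the Fourier domain.
    Smoothing the stochastic gradient this way damps its high-frequency
    components and hence its variance.
    """
    d = x.shape[0]
    # Eigenvalues of A_sigma in the discrete Fourier basis:
    # 1 + 2*sigma*(1 - cos(2*pi*k/d)), for k = 0, ..., d-1.
    eig = 1.0 + 2.0 * sigma * (1.0 - np.cos(2.0 * np.pi * np.arange(d) / d))

    # Apply A_sigma^{-1} to the stochastic gradient via FFT.
    g = stoch_grad(x)
    smoothed_g = np.real(np.fft.ifft(np.fft.fft(g) / eig))

    # Gaussian noise with covariance 2 * step_size * A_sigma^{-1}:
    # draw z ~ N(0, I) and apply A_sigma^{-1/2} in the Fourier domain.
    z = rng.standard_normal(d)
    noise = np.real(np.fft.ifft(np.fft.fft(z) / np.sqrt(eig)))

    return x - step_size * smoothed_g + np.sqrt(2.0 * step_size) * noise
```

For example, iterating `ls_sgld_step(x, lambda v: v, ...)` runs the sampler on a standard Gaussian target, since the negative log-density gradient of N(0, I) is simply x.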
