Stochastic Gradient MCMC with Repulsive Forces

11/30/2018
by Víctor Gallego, et al.

We propose a unifying view of two distinct families of Bayesian inference algorithms: stochastic gradient MCMC (SG-MCMC) and Stein variational gradient descent (SVGD). We show that SVGD with an added noise term can be framed as a multiple-chain SG-MCMC method. Instead of treating each parallel chain independently of the others, the proposed algorithm introduces a repulsive force between particles, preventing them from collapsing onto one another. Experiments on both synthetic distributions and real datasets demonstrate the benefits of the proposed scheme.
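To make the idea concrete, here is a minimal, illustrative sketch (not the exact update rule from the paper): several SGLD chains evolve in parallel on a toy 2-D Gaussian target, and an SVGD-style RBF-kernel term is added so that particles repel each other rather than moving independently. The target density, kernel bandwidth, step size, and particle count are all assumptions made for the example.

```python
import numpy as np

def grad_log_p(x):
    # Gradient of the log-density of a standard 2-D Gaussian (toy target).
    return -x

def repulsive_sgld_step(X, eps=1e-2, h=1.0, rng=np.random):
    """One parallel-chain SGLD update with an added SVGD-style repulsive term."""
    n, d = X.shape
    diffs = X[:, None, :] - X[None, :, :]            # x_i - x_j, shape (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)           # pairwise squared distances
    K = np.exp(-sq_dists / (2.0 * h ** 2))           # RBF kernel values
    # Kernel-gradient term: pushes particle i away from nearby particles j.
    repulsion = (diffs / h ** 2 * K[:, :, None]).sum(axis=1) / n
    drift = grad_log_p(X)                            # independent Langevin drift per chain
    noise = rng.normal(size=(n, d))                  # injected SGLD noise
    return X + eps * (drift + repulsion) + np.sqrt(2.0 * eps) * noise

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    particles = 0.1 * rng.normal(size=(50, 2))       # start nearly collapsed
    for _ in range(2000):
        particles = repulsive_sgld_step(particles, rng=rng)
    print("sample mean:", particles.mean(axis=0))
    print("sample covariance:\n", np.cov(particles.T))
```

Dropping the repulsion term recovers ordinary independent SGLD chains; the kernel interaction is what keeps the particles spread over the target rather than clustered together.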


