Deterministic Langevin Monte Carlo with Normalizing Flows for Bayesian Inference

05/27/2022
by Uros Seljak et al.

We propose a general-purpose Bayesian inference algorithm for expensive likelihoods that replaces the stochastic term in the Langevin equation with a deterministic density gradient term. The particle density is evaluated from the current particle positions using a Normalizing Flow (NF), which is differentiable and generalizes well in high dimensions. We take advantage of NF preconditioning and NF-based Metropolis-Hastings updates for faster and unbiased convergence. We show on various examples that the method is competitive with state-of-the-art sampling methods.
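The deterministic Langevin update described above moves each particle along the gradient of the log target minus the gradient of the log of the estimated particle density, x <- x + eps * (grad log p(x) - grad log q_t(x)), so that the noise term of standard Langevin dynamics is replaced by a deterministic repulsion. A minimal 1D sketch follows; here a Gaussian fit to the particles stands in for the paper's Normalizing Flow density estimate, and all function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def grad_log_target(x):
    # Standard normal target: log p(x) = -x^2/2 + const, so grad log p(x) = -x.
    return -x

def grad_log_density_fit(x, particles):
    # Stand-in for the NF density q_t: fit a Gaussian to the particles and
    # differentiate its log-density analytically (illustrative assumption).
    mu, var = particles.mean(), particles.var() + 1e-6
    return -(x - mu) / var

def dlmc_step(particles, eps=0.1):
    # One deterministic Langevin step: the stochastic noise term is replaced
    # by the negative gradient of the estimated particle log-density.
    drift = grad_log_target(particles) - grad_log_density_fit(particles, particles)
    return particles + eps * drift

rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, scale=0.5, size=2000)  # start far from the target
for _ in range(200):
    particles = dlmc_step(particles)
# particles should now approximate the standard normal target
print(particles.mean(), particles.std())
```

With a Gaussian density fit and a Gaussian target this flow drives the particle mean to 0 and the variance to 1; the paper's NF estimate plays the same role for non-Gaussian particle distributions, and its Metropolis-Hastings correction removes the bias that a crude density fit like this one would leave.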


Related research

05/08/2018 · Subsampling Sequential Monte Carlo for Static Bayesian Models
Our article shows how to carry out Bayesian inference by combining data ...

01/26/2023 · Coin Sampling: Gradient-Based Bayesian Inference without Learning Rates
In recent years, particle-based variational inference (ParVI) methods su...

02/25/2021 · Stein Variational Gradient Descent: many-particle and long-time asymptotics
Stein variational gradient descent (SVGD) refers to a class of methods f...

02/02/2019 · Meta Particle Flow for Sequential Bayesian Inference
We present a particle flow realization of Bayes' rule, where an ODE-base...

03/16/2021 · Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-Differentiable Priors
The use of non-differentiable priors in Bayesian statistics has become i...

02/23/2022 · Efficient CDF Approximations for Normalizing Flows
Normalizing flows model a complex target distribution in terms of a bije...

11/23/2021 · Forget-SVGD: Particle-Based Bayesian Federated Unlearning
Variational particle-based Bayesian learning methods have the advantage ...
