Generative Particle Variational Inference via Estimation of Functional Gradients

03/01/2021
by Neale Ratzlaff et al.

Recently, particle-based variational inference (ParVI) methods have gained interest because they directly minimize the Kullback-Leibler (KL) divergence and do not suffer from the approximation errors introduced by the evidence lower bound (ELBO). However, many ParVI approaches do not allow arbitrary sampling from the posterior, and the few that do allow such sampling yield samples of suboptimal quality. This work proposes a new method for learning to approximately sample from the posterior distribution. We construct a neural sampler that is trained with the functional gradient of the KL divergence between the empirical sampling distribution and the target distribution, assuming the gradient resides within a reproducing kernel Hilbert space (RKHS). Our generative ParVI (GPVI) approach maintains the asymptotic performance of ParVI methods while offering the flexibility of a generative sampler. Through carefully constructed experiments, we show that GPVI outperforms previous generative ParVI methods such as amortized SVGD, and is competitive with ParVI as well as gold-standard approaches like Hamiltonian Monte Carlo for fitting both exactly known and intractable target distributions.
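To make the training signal concrete, the sketch below pairs a small neural sampler with a kernelized RKHS estimate of the functional gradient of KL(q || p). For the gradient it falls back on the standard SVGD (kernelized Stein) form rather than the paper's own estimator, so it is closer in spirit to amortized SVGD than to GPVI proper; the `Sampler` module, `target_log_prob` callable, and all hyperparameters are illustrative assumptions, not details from the paper.

```python
# Minimal sketch, assuming PyTorch. The functional gradient uses the
# SVGD (kernelized Stein) form as a stand-in for the paper's estimator.
import torch
import torch.nn as nn

class Sampler(nn.Module):
    """Neural sampler g_theta: noise z -> approximate posterior sample x."""
    def __init__(self, noise_dim, out_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, z):
        return self.net(z)

def rbf_kernel(x, y, eps=1e-8):
    # Pairwise RBF kernel with the median heuristic for the bandwidth.
    d2 = torch.cdist(x, y) ** 2
    h = d2.detach().median() / (2.0 * torch.log(torch.tensor(float(x.shape[0]) + 1.0)))
    return torch.exp(-d2 / (h + eps))

def functional_gradient(particles, target_log_prob):
    # SVGD-form RKHS estimate of the (negative) functional gradient of KL(q || p):
    #   phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    x = particles.detach().requires_grad_(True)
    score = torch.autograd.grad(target_log_prob(x).sum(), x)[0]  # grad log p per particle
    k = rbf_kernel(x, x.detach())            # gradients flow through the first argument only
    repulsion = -torch.autograd.grad(k.sum(), x)[0]  # equals sum_j grad_{x_j} k(x_j, x_i) for RBF
    return (k.detach().t() @ score + repulsion) / x.shape[0]

def train_step(sampler, optimizer, target_log_prob, n=256, noise_dim=2):
    # One update: push generator outputs along the estimated functional gradient.
    z = torch.randn(n, noise_dim)
    x = sampler(z)
    phi = functional_gradient(x, target_log_prob)
    # Surrogate loss whose gradient w.r.t. theta matches -phi via the chain rule.
    loss = -(x * phi.detach()).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In use, one would call `train_step` in a loop against an unnormalized `target_log_prob` (e.g., a Bayesian posterior density known up to a constant), then draw arbitrarily many samples by pushing fresh noise through the trained sampler, which is exactly the flexibility over fixed-particle ParVI that the abstract highlights.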


Related research

03/09/2019 · NeuTra-lizing Bad Geometry in Hamiltonian Monte Carlo Using Neural Transport
Hamiltonian Monte Carlo is a powerful algorithm for sampling from diffic...

12/07/2016 · Measuring the non-asymptotic convergence of sequential Monte Carlo samplers using probabilistic programming
A key limitation of sampling algorithms for approximate inference is tha...

04/14/2020 · Particle-based Energetic Variational Inference
We introduce a new variational inference framework, called energetic var...

09/21/2017 · Perturbative Black Box Variational Inference
Black box variational inference (BBVI) with reparameterization gradients...

06/10/2015 · Neural Adaptive Sequential Monte Carlo
Sequential Monte Carlo (SMC), or particle filtering, is a popular class ...

07/10/2023 · Law of Large Numbers for Bayesian two-layer Neural Network trained with Variational Inference
We provide a rigorous analysis of training by variational inference (VI)...

08/09/2021 · Pathfinder: Parallel quasi-Newton variational inference
We introduce Pathfinder, a variational method for approximately sampling...
