DPVI: A Dynamic-Weight Particle-Based Variational Inference Framework

12/02/2021
by Chao Zhang, et al.

The recently developed Particle-based Variational Inference (ParVI) methods drive the empirical distribution of a set of fixed-weight particles towards a given target distribution π by iteratively updating the particles' positions. However, the fixed-weight restriction greatly limits the empirical distribution's approximation ability, especially when the number of particles is small. In this paper, we propose to dynamically adjust the particles' weights according to a Fisher-Rao reaction flow. We develop a general Dynamic-weight Particle-based Variational Inference (DPVI) framework based on a novel continuous composite flow that evolves the positions and weights of the particles simultaneously. We show that the mean-field limit of our composite flow is a Wasserstein-Fisher-Rao gradient flow of a certain dissimilarity functional ℱ, which decreases ℱ faster than the Wasserstein gradient flow underlying existing fixed-weight ParVIs. By using different finite-particle approximations in our general framework, we derive several efficient DPVI algorithms. The empirical results demonstrate the superiority of our derived DPVI algorithms over their fixed-weight counterparts.
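To make the composite flow concrete, here is a minimal toy sketch (not the paper's exact algorithm) of one dynamic-weight update step in NumPy: positions move along a weighted SVGD-style direction (a finite-particle Wasserstein transport step), while weights are rescaled multiplicatively by a centered estimate of the KL first variation, which is the Fisher-Rao reaction part. The RBF kernel, bandwidth `h`, 1-D standard-normal target, and KDE-based density estimate are all illustrative assumptions.

```python
import numpy as np

def log_target(x):
    # Unnormalized log-density of a 1-D standard normal target (illustrative choice).
    return -0.5 * x**2

def grad_log_target(x):
    return -x

def rbf(x, y, h):
    # RBF kernel k(x, y) = exp(-(x - y)^2 / (2 h^2)).
    return np.exp(-(x - y) ** 2 / (2 * h**2))

def dpvi_toy_step(pos, w, step_pos=0.1, step_w=0.1, h=0.5):
    """One toy composite step: weighted transport of positions + Fisher-Rao
    reaction on weights. pos: (n,) particle positions, w: (n,) weights summing to 1."""
    # --- Transport part: weighted SVGD-style direction (illustrative) ---
    K = rbf(pos[:, None], pos[None, :], h)                 # K[i, j] = k(x_i, x_j)
    s = grad_log_target(pos)                               # score at each particle
    G = (pos[None, :] - pos[:, None]) / h**2 * K           # grad_{x_i} k(x_i, x_j)
    phi = w @ (K * s[:, None] + G)                         # weighted drift + repulsion
    pos = pos + step_pos * phi

    # --- Reaction part: Fisher-Rao weight update (illustrative) ---
    # First variation of KL at x_i is approximated by log rho(x_i) - log pi(x_i),
    # with rho estimated by a weighted kernel density over the particles.
    K = rbf(pos[:, None], pos[None, :], h)
    log_rho = np.log(K @ w + 1e-12)
    delta = log_rho - log_target(pos)
    delta = delta - np.sum(w * delta)      # center so total mass is preserved
    w = w * np.exp(-step_w * delta)        # shrink weights where rho over-covers pi
    w = w / w.sum()                        # renormalize to a probability vector
    return pos, w
```

Iterating `dpvi_toy_step` from, say, 20 particles at `np.linspace(1, 3, 20)` with uniform weights pulls the weighted empirical mean toward the target mean 0 while reweighting particles that sit where the estimated density over- or under-covers the target. This is only a sketch of the dynamic-weight idea under the stated assumptions; the paper's actual algorithms use different finite-particle approximations of the Wasserstein-Fisher-Rao flow.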


research
10/20/2022

On Representations of Mean-Field Variational Inference

The mean field variational inference (MFVI) formulation restricts the ge...
research
09/29/2020

Unbalanced Sobolev Descent

We introduce Unbalanced Sobolev Descent (USD), a particle descent algori...
research
02/28/2023

Particle-based Online Bayesian Sampling

Online optimization has gained increasing interest due to its capability...
research
05/27/2023

Provably Fast Finite Particle Variants of SVGD via Virtual Particle Stochastic Approximation

Stein Variational Gradient Descent (SVGD) is a popular variational infer...
research
12/21/2020

Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization

We consider the optimization problem of minimizing a functional defined ...
research
02/25/2021

Stein Variational Gradient Descent: many-particle and long-time asymptotics

Stein variational gradient descent (SVGD) refers to a class of methods f...
research
07/04/2018

Accelerated First-order Methods on the Wasserstein Space for Bayesian Inference

We consider doing Bayesian inference by minimizing the KL divergence on ...
