Unbalanced Sobolev Descent

09/29/2020
by Youssef Mroueh, et al.

We introduce Unbalanced Sobolev Descent (USD), a particle descent algorithm for transporting a high dimensional source distribution to a target distribution that does not necessarily have the same mass. We define the Sobolev-Fisher discrepancy between distributions and show that it relates to advection-reaction transport equations and the Wasserstein-Fisher-Rao metric between distributions. USD transports particles along gradient flows of the witness function of the Sobolev-Fisher discrepancy (advection step) and reweighs the mass of particles with respect to this witness function (reaction step). The reaction step can be thought of as a birth-death process of the particles with rate of growth proportional to the witness function. When the Sobolev-Fisher witness function is estimated in a Reproducing Kernel Hilbert Space (RKHS), under mild assumptions we show that USD converges asymptotically (in the limit of infinite particles) to the target distribution in the Maximum Mean Discrepancy (MMD) sense. We then give two methods to estimate the Sobolev-Fisher witness with neural networks, resulting in two Neural USD algorithms. The first one implements the reaction step with mirror descent on the weights, while the second implements it through a birth-death process of particles. We show on synthetic examples that USD transports distributions with or without conservation of mass faster than previous particle descent algorithms, and finally demonstrate its use for molecular biology analyses where our method is naturally suited to match developmental stages of populations of differentiating cells based on their single-cell RNA sequencing profile. Code is available at https://github.com/ibm/usd.
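The advection-reaction structure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation (see the repository above for that): it substitutes a simple kernel MMD witness for the Sobolev-Fisher witness, and the function names (`witness`, `usd_step`) and step sizes are hypothetical choices for this sketch. The advection step moves weighted source particles along the gradient of the witness; the reaction step multiplies each particle's mass by an exponential of the witness, a mirror-descent-style reweighting with growth rate proportional to the witness.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian kernel matrix between particle sets x (n, d) and y (m, d).
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def witness(x, src, w_src, tgt, w_tgt, sigma=1.0):
    # MMD-style witness (stand-in for the Sobolev-Fisher witness):
    # positive where target mass exceeds weighted source mass.
    return gaussian_kernel(x, tgt, sigma) @ w_tgt - gaussian_kernel(x, src, sigma) @ w_src

def witness_grad(src, w_src, tgt, w_tgt, sigma=1.0):
    # Analytic gradient of the witness evaluated at each source particle.
    def grad_term(x, y, w):
        k = gaussian_kernel(x, y, sigma)          # (n, m)
        diff = y[None, :, :] - x[:, None, :]      # (n, m, d)
        return ((k * w[None, :])[:, :, None] * diff).sum(1) / sigma**2
    return grad_term(src, tgt, w_tgt) - grad_term(src, src, w_src)

def usd_step(src, w_src, tgt, w_tgt, lr=0.5, reaction_rate=0.5, sigma=1.0):
    # Advection: move particles along the gradient flow of the witness.
    src = src + lr * witness_grad(src, w_src, tgt, w_tgt, sigma)
    # Reaction: grow/shrink particle mass at a rate proportional to the
    # witness (no renormalization, so total mass is not conserved).
    w_src = w_src * np.exp(reaction_rate * witness(src, src, w_src, tgt, w_tgt, sigma))
    return src, w_src
```

Because the weights are never renormalized, total source mass can grow or shrink toward the target's mass, which is the "unbalanced" aspect; normalizing the weights after each reaction step would recover the mass-conserving variant.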

