Generative Learning With Euler Particle Transport

12/11/2020
by Yuan Gao, et al.
We propose an Euler particle transport (EPT) approach for generative learning. The proposed approach is motivated by the problem of finding an optimal transport map from a reference distribution to a target distribution, characterized by the Monge-Ampère equation. Interpreting the infinitesimal linearization of the Monge-Ampère equation from the perspective of gradient flows in measure spaces leads to a stochastic McKean-Vlasov equation. We use the forward Euler method to solve this equation. The resulting forward Euler map pushes forward a reference distribution to the target. This map is the composition of a sequence of simple residual maps, which are computationally stable and easy to train. The key task in training is the estimation of the density ratios or differences that determine the residual maps. We estimate the density ratios (differences) based on the Bregman divergence with a gradient penalty using deep density-ratio (difference) fitting. We show that the proposed density-ratio (difference) estimators do not suffer from the "curse of dimensionality" if the data are supported on a lower-dimensional manifold. Numerical experiments with multi-mode synthetic datasets and comparisons with existing methods on real benchmark datasets support our theoretical results and demonstrate the effectiveness of the proposed method.
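To illustrate the core idea of composing simple residual maps via forward Euler steps, here is a minimal, hypothetical sketch in one dimension. The abstract's method estimates the density ratio with a deep network trained under a Bregman divergence; this toy replaces that estimator with the closed-form log-density-ratio gradient between two unit-variance Gaussians (reference N(0, 1), illustrative target N(4, 1)), so the velocity field reduces to a constant shift. The target mean, step size, and step count are all assumptions for the demonstration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=2000)  # particles drawn from the reference N(0, 1)
step, n_steps, target_mean = 0.1, 100, 4.0

for _ in range(n_steps):
    # Velocity field: gradient of the log density ratio between the target
    # and the current particle distribution. For two unit-variance Gaussians
    # this gradient is (target_mean - x) - (current_mean - x), i.e. the
    # constant shift target_mean - current_mean.
    v = target_mean - x.mean()
    # One forward Euler step: a residual map x -> x + step * v(x).
    x = x + step * v

# The composed residual maps push the particle cloud toward the target.
print(x.mean(), x.std())
```

Each loop iteration is one residual map in the composition; in the actual method the velocity `v` would come from a learned density-ratio (or difference) estimator rather than a closed-form expression.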


