Deep Generative Learning via Variational Gradient Flow

01/24/2019
by   Gao Yuan, et al.

We propose a general framework for learning deep generative models via Variational Gradient Flow (VGrow) on probability spaces. The evolving distribution, which asymptotically converges to the target distribution, is governed by a vector field equal to the negative gradient of the first variation of the f-divergence between the two distributions. We prove that the evolving distribution coincides with the pushforward distribution obtained through the infinitesimal-time composition of residual maps, i.e., perturbations of the identity map along the vector field. The vector field depends on the density ratio between the pushforward distribution and the target distribution, which can be consistently estimated from a binary classification problem. Within this framework we establish connections between VGrow and other popular methods, such as VAEs, GANs and flow-based models, yielding new insights into deep generative learning. We also evaluate several commonly used f-divergences, including the Kullback-Leibler, Jensen-Shannon and Jeffreys divergences, as well as our newly discovered "logD" divergence, which serves as the objective function of the logD-trick GAN. Experimental results on benchmark datasets demonstrate that VGrow generates high-fidelity images in a stable and efficient manner, achieving performance competitive with state-of-the-art GANs.
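To make the mechanics described above concrete, the following is a minimal sketch of one VGrow-style particle update for the Kullback-Leibler case, written in PyTorch. Every name here (vgrow_kl_step, the classifier D, real_batch, step_size, n_critic) is an illustrative assumption rather than the authors' released code, and the subsequent step that refits a deep generator to the moved particles is omitted.

```python
# Illustrative sketch only: one VGrow-style particle update for the KL case.
# Names and hyperparameters are assumptions, not the authors' implementation.

import torch
import torch.nn as nn

def vgrow_kl_step(particles, real_batch, D, d_optimizer, step_size=0.1, n_critic=5):
    """Move `particles` one small step along the negative gradient of the
    first variation of KL(p_particles || p_data).

    For the KL divergence the first variation reduces (up to a constant) to
    log r(x), where r = p_particles / p_data; log r is recovered from the
    logit of a binary classifier D trained to separate particles from data.
    """
    # 1) Fit the density-ratio estimator: D outputs a logit, and
    #    sigmoid(logit) approximates P(sample came from the particles).
    bce = nn.BCEWithLogitsLoss()
    for _ in range(n_critic):
        d_optimizer.zero_grad()
        logits_fake = D(particles.detach())
        logits_real = D(real_batch)
        loss = bce(logits_fake, torch.ones_like(logits_fake)) + \
               bce(logits_real, torch.zeros_like(logits_real))
        loss.backward()
        d_optimizer.step()

    # 2) The vector field is the negative gradient of the first variation;
    #    for KL this is -grad_x log r(x) = -grad_x logit(D(x)).
    x = particles.clone().requires_grad_(True)
    first_variation = D(x).sum()          # sum() yields per-sample gradients
    grad_x, = torch.autograd.grad(first_variation, x)

    # 3) Residual map: a perturbation of the identity along the vector field.
    return (particles - step_size * grad_x).detach()
```

The classifier plays the role of the density-ratio estimator: with particles labeled 1 and data labeled 0, its logit approximates log r(x), so stepping particles against the gradient of that logit is one instance of the perturbation-of-identity (residual map) described in the abstract; other f-divergences would change only how the logit enters the first variation.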


Related research

12/01/2020  Refining Deep Generative Models via Wasserstein Gradient Flows
Deep generative modeling has seen impressive advances in recent years, t...

02/09/2020  Out-of-Distribution Detection with Distance Guarantee in Deep Generative Models
Recent research has shown that it is challenging to detect out-of-distri...

09/04/2017  Continuous-Time Flows for Deep Generative Models
Normalizing flows have been developed recently as a method for drawing s...

10/10/2016  Generative Adversarial Nets from a Density Ratio Estimation Perspective
Generative adversarial networks (GANs) are successful deep generative mo...

12/02/2019  KernelNet: A Data-Dependent Kernel Parameterization for Deep Generative Modeling
Learning with kernels is an often resorted tool in modern machine learni...

06/19/2021  Deep Generative Learning via Schrödinger Bridge
We propose to learn a generative model via entropy interpolation with a ...

07/12/2021  Active Divergence with Generative Deep Learning – A Survey and Taxonomy
Generative deep learning systems offer powerful tools for artefact gener...
