Continuous-Time Flows for Deep Generative Models

09/04/2017
by Changyou Chen, et al.

Normalizing flows have been developed recently as a method for drawing samples from an arbitrary distribution. This method is attractive due to its intrinsic ability to approximate a target distribution arbitrarily well. In practice, however, normalizing flows consist of only a finite number of deterministic transformations, and thus there are no guarantees on the approximation accuracy. In this paper we study the problem of learning deep generative models with continuous-time flows (CTFs), a family of diffusion-based methods that are able to asymptotically approach a target distribution. We discretize the CTF to make training feasible, and develop theory on the approximation error. A framework is then adopted to distill knowledge from a CTF into an efficient inference network. We apply the technique to deep generative models, including a CTF-based variational autoencoder and an adversarial-network-like density estimator. Experiments on various tasks demonstrate the superiority of the proposed CTF framework over existing techniques.
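To make the idea of discretizing a continuous-time flow concrete, the sketch below applies an Euler-Maruyama discretization to a Langevin diffusion, the canonical diffusion whose stationary distribution is the target. This is an illustrative assumption, not the paper's exact construction: the target here is a standard Gaussian (so the score is simply `-x`), and the step size `eps` and step count are arbitrary choices.

```python
import numpy as np

def langevin_step(x, grad_log_p, eps, rng):
    # One Euler-Maruyama step of the Langevin diffusion
    #   dX_t = (1/2) * grad log p(X_t) dt + dW_t,
    # whose stationary distribution is p. The discretization
    # introduces an O(eps) bias, which is the kind of
    # approximation error the paper analyzes.
    noise = rng.standard_normal(x.shape)
    return x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * noise

# Illustrative target: standard Gaussian, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000) * 3.0 + 5.0  # initialize far from the target
for _ in range(2000):
    x = langevin_step(x, lambda z: -z, 1e-2, rng)

# After many small steps the samples approximately follow N(0, 1).
print(f"mean={x.mean():.2f}, std={x.std():.2f}")
```

Each deterministic transformation in a finite normalizing flow is replaced here by a stochastic step of the diffusion; running the chain longer drives the samples arbitrarily close to the target, which is the asymptotic property finite flows lack.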
