
Continuous-Time Flows for Deep Generative Models

by   Changyou Chen, et al.

Normalizing flows have recently been developed as a method for drawing samples from an arbitrary distribution. The method is attractive due to its intrinsic ability to approximate a target distribution arbitrarily well. In practice, however, a normalizing flow consists of only a finite number of deterministic transformations, so there are no guarantees on the approximation accuracy. In this paper we study the problem of learning deep generative models with continuous-time flows (CTFs), a family of diffusion-based methods that asymptotically approach a target distribution. We discretize the CTF to make training feasible, and develop theory on the resulting approximation error. A framework is then adopted to distill knowledge from a CTF into an efficient inference network. We apply the technique to deep generative models, including a CTF-based variational autoencoder and an adversarial-network-like density estimator. Experiments on various tasks demonstrate the superiority of the proposed CTF framework over existing techniques.
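To make the idea of a discretized diffusion-based flow concrete, here is a minimal sketch (not the paper's actual method) of the standard Euler-Maruyama discretization of a Langevin diffusion, whose stationary distribution is the target. The target (a standard normal) and all step-size and iteration settings are illustrative assumptions:

```python
import math
import random

def grad_log_p(x):
    # Score of a standard normal target: d/dx log p(x) = -x.
    # In a learned model this would come from the model's density.
    return -x

def langevin_samples(n_steps=20000, step=0.05, x0=5.0, seed=0):
    # Euler-Maruyama discretization of the Langevin diffusion
    #   dX_t = (1/2) grad log p(X_t) dt + dW_t,
    # which asymptotically approaches the target distribution p.
    rng = random.Random(seed)
    x = x0
    out = []
    for _ in range(n_steps):
        x = x + 0.5 * step * grad_log_p(x) + math.sqrt(step) * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

samples = langevin_samples()
burned = samples[5000:]  # discard burn-in before estimating moments
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

After burn-in, the empirical mean and variance of the chain should be close to those of the target (0 and 1 here), with a discretization bias controlled by the step size; this is the kind of approximation error the paper's theory quantifies for CTFs.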



