Score-based Generative Modeling in Latent Space

06/10/2021
by Arash Vahdat, et al.

Score-based generative models (SGMs) have recently demonstrated impressive results in terms of both sample quality and distribution coverage. However, they are usually applied directly in data space and often require thousands of network evaluations for sampling. Here, we propose the Latent Score-based Generative Model (LSGM), a novel approach that trains SGMs in a latent space, relying on the variational autoencoder framework. Moving from data to latent space allows us to train more expressive generative models, apply SGMs to non-continuous data, and learn smoother SGMs in a smaller space, resulting in fewer network evaluations and faster sampling. To enable training LSGMs end-to-end in a scalable and stable manner, we (i) introduce a new score-matching objective suitable to the LSGM setting, (ii) propose a novel parameterization of the score function that allows the SGM to focus on the mismatch of the target distribution with respect to a simple Normal distribution, and (iii) analytically derive multiple techniques for variance reduction of the training objective. LSGM obtains a state-of-the-art FID score of 2.10 on CIFAR-10, outperforming all existing generative results on this dataset. On CelebA-HQ-256, LSGM is on a par with previous SGMs in sample quality while outperforming them in sampling time by two orders of magnitude. In modeling binary images, LSGM achieves state-of-the-art likelihood on the binarized OMNIGLOT dataset.
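
Point (ii) of the abstract can be read as follows: rather than predicting the full latent score, the network predicts only a correction to the analytic score of a standard Normal prior, so that a latent distribution already close to Normal leaves little for the network to learn. The sketch below is a minimal PyTorch illustration under that reading; MixedScoreNet, correction_net, and the mixing weight alpha are illustrative names for this sketch, not the paper's actual implementation.

```python
# Minimal sketch of a "mixed" score parameterization: analytic Normal score
# plus a learned correction. All names and the exact mixing rule are
# assumptions made for illustration.
import torch
import torch.nn as nn

class MixedScoreNet(nn.Module):
    def __init__(self, latent_dim: int, hidden: int = 256):
        super().__init__()
        # Small MLP that predicts a correction to the analytic Normal score,
        # conditioned on the latent sample and the diffusion time.
        self.correction_net = nn.Sequential(
            nn.Linear(latent_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, latent_dim),
        )
        # Learnable mixing coefficient between the Normal score and the correction.
        self.alpha = nn.Parameter(torch.zeros(1))

    def forward(self, z_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Analytic score of a standard Normal prior: grad_z log N(z; 0, I) = -z.
        normal_score = -z_t
        # Learned correction, conditioned on the diffusion time t (shape: [batch]).
        h = torch.cat([z_t, t[:, None]], dim=-1)
        correction = self.correction_net(h)
        # Mix the two terms: when alpha ~ 0 the model falls back to the Normal
        # prior score, so the network only accounts for the residual mismatch.
        a = torch.sigmoid(self.alpha)
        return (1.0 - a) * normal_score + a * correction

# Example usage on random latents.
net = MixedScoreNet(latent_dim=128)
z_t = torch.randn(16, 128)
t = torch.rand(16)
score = net(z_t, t)  # shape (16, 128)
```

The design intent sketched here is that the residual formulation keeps the score model close to the tractable Normal case, which is one way to interpret the abstract's claim of smoother SGMs and fewer network evaluations in latent space.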


Related research

06/25/2019 · Perceptual Generative Autoencoders
Modern generative models are usually designed to match target distributi...

06/15/2020 · Exponential Tilting of Generative Models: Improving Sample Quality by Training and Sampling from Latent Energy
In this paper, we present a general method that can improve the sample q...

06/01/2022 · Elucidating the Design Space of Diffusion-Based Generative Models
We argue that the theory and practice of diffusion-based generative mode...

05/30/2019 · One-element Batch Training by Moving Window
Several deep models, esp. the generative, compare the samples from two d...

09/26/2022 · Quasi-Conservative Score-based Generative Models
Existing Score-based Generative Models (SGMs) can be categorized into co...

02/21/2019 · Latent Translation: Crossing Modalities by Bridging Generative Models
End-to-end optimization has achieved state-of-the-art performance on man...

02/02/2023 · Target specific peptide design using latent space approximate trajectory collector
Despite the prevalence and many successes of deep learning applications ...
