Latent Space Refinement for Deep Generative Models

06/01/2021
by Ramon Winterhalder, et al.

Deep generative models are becoming widely used across science and industry for a variety of purposes. A common challenge is achieving a precise implicit or explicit representation of the data probability density. Recent proposals have suggested using classifier weights to refine the learned density of deep generative models. We extend this idea to all types of generative models and show how latent space refinement via iterated generative modeling can circumvent topological obstructions and improve precision. This methodology also applies to cases where the target model is non-differentiable and has many internal latent dimensions that must be marginalized over before refinement. We demonstrate our Latent Space Refinement (LaSeR) protocol on a variety of examples, focusing on the combination of Normalizing Flows and Generative Adversarial Networks.
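The classifier-weight refinement the abstract refers to can be illustrated with a minimal toy sketch (not the paper's code): train a classifier to separate real data from generated samples, then reweight each generated sample by the likelihood ratio w(x) = p(x) / (1 - p(x)), where p is the classifier's probability that x is real. The Gaussian toy distributions, learning rate, and iteration count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed for illustration): "data" is N(1, 1),
# an imperfect "generator" produces N(0, 1).
data = rng.normal(1.0, 1.0, 5000)
gen = rng.normal(0.0, 1.0, 5000)

# Train a 1D logistic-regression classifier: label 1 = data, label 0 = generated.
x = np.concatenate([data, gen])
y = np.concatenate([np.ones_like(data), np.zeros_like(gen)])

w_, b_ = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w_ * x + b_)))
    # Gradient of the mean binary cross-entropy loss.
    w_ -= 0.1 * np.mean((p - y) * x)
    b_ -= 0.1 * np.mean(p - y)

# Likelihood-ratio weights for the generated samples:
# w(x) = p / (1 - p) approximates p_data(x) / p_gen(x).
p_gen = 1.0 / (1.0 + np.exp(-(w_ * gen + b_)))
weights = p_gen / (1.0 - p_gen)

# The weighted generated sample should match the data mean more closely.
print("unweighted mean:", np.mean(gen))
print("weighted mean:  ", np.average(gen, weights=weights))
print("data mean:      ", np.mean(data))
```

LaSeR goes a step further than this reweighting: it pulls the correction back into the latent space and fits a second generative model there, so the refined model produces unweighted samples.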


