Classification Representations Can be Reused for Downstream Generations

Contrary to the convention of using supervision for class-conditioned generative modeling, this work explores and demonstrates the feasibility of reusing a supervised representation space, learned by a discriminative classifier, for the downstream task of sample generation. Unlike generative modeling approaches that aim to model the manifold distribution, we directly represent the given data manifold in the classification space and leverage properties of the latent space representations to generate new representations that are guaranteed to belong to the same class. Interestingly, such representations allow for controlled sample generation for any given class from existing samples and do not require enforcing a prior distribution. We show that these latent space representations can be manipulated (via convex combinations of n samples, n ≥ 2) to yield meaningful sample generations. Experiments on image datasets of varying resolutions demonstrate that the downstream generations achieve higher classification accuracy than existing conditional generative models while remaining competitive in terms of FID.
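The core manipulation described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the classifier's latent representations for a class occupy a (approximately) convex region, so any convex combination of same-class latent vectors stays in that region; the function name and the use of Dirichlet-sampled weights are choices made here for illustration.

```python
import numpy as np

def convex_combination(latents, rng=None):
    """Generate a new latent representation as a random convex
    combination of n >= 2 same-class latent vectors.

    The weights are non-negative and sum to 1, so the result lies
    inside the convex hull of the inputs; under the convexity
    assumption on the class region, it keeps the same class label.
    """
    rng = np.random.default_rng() if rng is None else rng
    latents = np.asarray(latents)                    # shape (n, d), n >= 2
    weights = rng.dirichlet(np.ones(len(latents)))   # convex weights on the simplex
    return weights @ latents                         # new latent, shape (d,)

# Example: blend three latent vectors from one class (toy 4-d vectors).
z = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
z_new = convex_combination(z)
```

The new latent `z_new` would then be passed to a decoder or generator head to produce the actual sample; that stage is omitted here.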


