PluGeN: Multi-Label Conditional Generation From Pre-Trained Models

09/18/2021
by Maciej Wołczyk et al.

Modern generative models achieve excellent quality in a variety of tasks, including image and text generation and chemical molecule modeling. However, existing methods often lack the essential ability to generate examples with requested properties, such as the age of the person in a photo or the weight of a generated molecule. Incorporating such additional conditioning factors would require rebuilding the entire architecture and optimizing the parameters from scratch. Moreover, it is difficult to disentangle selected attributes so that edits affect only one attribute while leaving the others unchanged. To overcome these limitations we propose PluGeN (Plugin Generative Network), a simple yet effective generative technique that can be used as a plugin to pre-trained generative models. The idea behind our approach is to transform the entangled latent representation using a flow-based module into a multi-dimensional space where the values of each attribute are modeled as an independent one-dimensional distribution. In consequence, PluGeN can generate new samples with desired attributes as well as manipulate labeled attributes of existing examples. Thanks to the disentangled latent representation, we are even able to generate samples with combinations of attributes that are rare or absent from the dataset, such as a young person with gray hair, a man with make-up, or a woman with a beard. We combined PluGeN with GAN and VAE models and applied it to conditional generation and manipulation of images and to chemical molecule modeling. Experiments demonstrate that PluGeN preserves the quality of the backbone models while adding the ability to control the values of labeled attributes.
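The core mechanism described above — an invertible flow that maps the backbone's entangled latent code into a space where each labeled attribute occupies its own dimension — can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it replaces the trained normalizing flow with a random orthogonal matrix (which is trivially invertible) purely to show the edit-in-disentangled-space, map-back workflow; the names `to_disentangled`, `to_backbone`, and `edit_attribute` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for PluGeN's flow module: a random orthogonal matrix, so the
# inverse map is simply the transpose. The actual method trains a
# normalizing flow so that each labeled attribute becomes an independent
# one-dimensional distribution in the transformed space.
D = 8                                  # latent dimension of the frozen backbone
Q, _ = np.linalg.qr(rng.normal(size=(D, D)))

def to_disentangled(z):
    """Map the backbone's entangled latent z into the disentangled space."""
    return z @ Q

def to_backbone(w):
    """Invert the flow: map a disentangled code back to the backbone latent."""
    return w @ Q.T

def edit_attribute(z, attr_idx, new_value):
    """Set one labeled attribute, leaving every other dimension unchanged."""
    w = to_disentangled(z)
    w = w.copy()
    w[attr_idx] = new_value            # attributes live in separate 1-D dims
    return to_backbone(w)

z = rng.normal(size=D)                 # latent code of an existing example
z_edit = edit_attribute(z, attr_idx=0, new_value=2.0)
```

Feeding `z_edit` back through the (unchanged) backbone decoder would then yield a sample whose first labeled attribute takes the requested value, while the remaining disentangled dimensions — and hence the other attributes — are untouched.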


