OpenGAN: Open Set Generative Adversarial Networks

by Luke Ditria et al.

Many existing conditional Generative Adversarial Networks (cGANs) are limited to conditioning on pre-defined and fixed class-level semantic labels or attributes. We propose an open set GAN architecture (OpenGAN) that is conditioned per-input sample with a feature embedding drawn from a metric space. Using a state-of-the-art metric learning model that encodes both class-level and fine-grained semantic information, we are able to generate samples that are semantically similar to a given source image. The semantic information extracted by the metric learning model transfers to out-of-distribution novel classes, allowing the generative model to produce samples that are outside of the training distribution. We show that our proposed method is able to generate 256×256 resolution images from novel classes that are of similar visual quality to those from the training classes. In lieu of a source image, we demonstrate that random sampling of the metric space also results in high-quality samples. We show that interpolation in the feature space and latent space results in semantically and visually plausible transformations in the image space. Finally, the usefulness of the generated samples to the downstream task of data augmentation is demonstrated. We show that classifier performance can be significantly improved by augmenting the training data with OpenGAN samples on classes that are outside of the GAN training distribution.
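The core idea described above is to condition the generator per sample on a feature embedding produced by a pretrained metric-learning encoder, rather than on a fixed class label. A minimal sketch of that conditioning mechanism is shown below; all names, dimensions, and the linear "networks" are illustrative stand-ins, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def metric_embed(image, W):
    # Hypothetical stand-in for a pretrained metric-learning encoder:
    # projects a flattened image into a unit-normalised embedding space.
    e = W @ image.ravel()
    return e / np.linalg.norm(e)

def generate(z, e, V):
    # The generator is conditioned per sample by combining the noise
    # vector z with the source image's embedding e (the OpenGAN-style
    # conditioning input), instead of a discrete class label.
    cond = np.concatenate([z, e])
    return np.tanh(V @ cond)  # toy "image" output in [-1, 1]

# Dimensions are illustrative only.
img_dim, emb_dim, z_dim, out_dim = 64, 16, 8, 32
W = rng.standard_normal((emb_dim, img_dim))
V = rng.standard_normal((out_dim, z_dim + emb_dim))

source = rng.standard_normal(img_dim)  # a "source image"
e = metric_embed(source, W)            # embedding carries its semantics
z = rng.standard_normal(z_dim)         # latent noise gives sample diversity
sample = generate(z, e, V)
```

Because the conditioning vector is a point in a continuous metric space rather than a one-hot label, the same `generate` call works unchanged for embeddings of novel, out-of-distribution classes, and interpolating between two embeddings `e1` and `e2` yields intermediate conditioning vectors, which is what enables the semantic interpolations and random sampling of the metric space described in the abstract.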


