Bayesian Conditional Generative Adversarial Networks

06/17/2017
by Ehsan Abbasnejad, et al.

Traditional GANs use a deterministic generator function (typically a neural network) to transform a random noise input z into a sample x that the discriminator seeks to distinguish. We propose a new class of GANs, Bayesian Conditional Generative Adversarial Networks (BC-GANs), that use a random generator function to transform a deterministic input y' into a sample x. BC-GANs extend traditional GANs to a Bayesian framework and naturally handle unsupervised, supervised, and semi-supervised learning problems. Experiments show that the proposed BC-GANs outperform the state of the art.
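As a rough illustration of the core idea (not the authors' implementation), the sketch below shows a small PyTorch-style conditional generator whose weights are sampled from a learned Gaussian on every forward pass, so a deterministic input y' (here, a class label) can still produce different samples x. The label-embedding conditioning, the Gaussian weight posterior, and all layer sizes are illustrative assumptions.

```python
# Minimal sketch of a Bayesian conditional generator: instead of a
# deterministic network fed random noise z, the generator's *weights* are
# random (sampled per forward pass) and the input y' is deterministic.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianLinear(nn.Module):
    """Linear layer whose weights are drawn from a learned Gaussian
    (reparameterization trick), giving a random generator function."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_logstd = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_logstd = nn.Parameter(torch.full((out_features,), -3.0))
        nn.init.xavier_normal_(self.w_mu)

    def forward(self, x):
        # Sample fresh weights on each call: W = mu + sigma * eps
        w = self.w_mu + self.w_logstd.exp() * torch.randn_like(self.w_mu)
        b = self.b_mu + self.b_logstd.exp() * torch.randn_like(self.b_mu)
        return F.linear(x, w, b)


class BayesianConditionalGenerator(nn.Module):
    """Maps a deterministic conditioning input y' (a class label) to a
    sample x; all stochasticity comes from the random weights, not from
    an input noise vector z."""

    def __init__(self, num_classes=10, hidden=128, out_dim=784):
        super().__init__()
        self.embed = nn.Embedding(num_classes, 32)  # deterministic input y'
        self.fc1 = BayesianLinear(32, hidden)
        self.fc2 = BayesianLinear(hidden, out_dim)

    def forward(self, y):
        h = F.relu(self.fc1(self.embed(y)))
        return torch.tanh(self.fc2(h))


# Usage: repeated calls with the same label give different samples,
# because each call draws a new generator function rather than new noise.
gen = BayesianConditionalGenerator()
labels = torch.tensor([3, 3, 3])
x1, x2 = gen(labels), gen(labels)  # x1 != x2 despite identical inputs
```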

Related research

Bayesian GAN (05/26/2017)
Generative adversarial networks (GANs) can implicitly learn rich distrib...

ChainGAN: A sequential approach to GANs (11/20/2018)
We propose a new architecture and training methodology for generative ad...

OOGAN: Disentangling GAN with One-Hot Sampling and Orthogonal Regularization (05/26/2019)
Exploring the potential of GANs for unsupervised disentanglement learnin...

Semi-supervised Learning with GANs: Manifold Invariance with Improved Inference (05/24/2017)
Semi-supervised learning methods using Generative Adversarial Networks (...

Conditional Sampling With Monotone GANs (06/11/2020)
We present a new approach for sampling conditional measures that enables...

Linking generative semi-supervised learning and generative open-set recognition (03/21/2023)
This study investigates the relationship between semi-supervised learnin...

Improving Detection of Credit Card Fraudulent Transactions using Generative Adversarial Networks (07/07/2019)
In this study, we employ Generative Adversarial Networks as an oversampl...