Flow-GAN: Bridging implicit and prescribed learning in generative models

05/24/2017
by Aditya Grover, et al.

Evaluating the performance of generative models for unsupervised learning is inherently challenging due to the lack of well-defined and tractable objectives. This is particularly difficult for implicit models such as generative adversarial networks (GANs), which perform extremely well in practice for tasks such as sample generation but sidestep the explicit characterization of a density. We propose Flow-GANs, a generative adversarial network with the generator specified as a normalizing flow model, which permits exact likelihood evaluation. We then learn a Flow-GAN using a hybrid objective that integrates adversarial training with maximum likelihood estimation. We demonstrate empirically on the MNIST and CIFAR-10 datasets that Flow-GANs learn generative models that attain low generalization error as measured by held-out log-likelihoods while generating high-quality samples. Finally, we present a simple yet hard-to-beat baseline for GANs based on Gaussian mixture models.
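To make the hybrid objective concrete, the sketch below illustrates the general idea under stated assumptions: an invertible generator with a tractable change-of-variables log-likelihood, a standard GAN discriminator, and a weight `lambda_mle` trading off the adversarial and maximum-likelihood terms. This is not the authors' code; the `AffineFlow` model, the variable names, and the specific loss weighting are hypothetical placeholders chosen for brevity.

```python
# Minimal sketch of a Flow-GAN-style hybrid objective (illustrative, not the paper's implementation).
import math
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Tiny invertible generator: x = z * exp(log_scale) + shift (stand-in for a normalizing flow)."""
    def __init__(self, dim):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def forward(self, z):
        # Sampling direction: map latent z to data space x.
        return z * torch.exp(self.log_scale) + self.shift

    def log_prob(self, x):
        # Exact log-likelihood via the change-of-variables formula.
        z = (x - self.shift) * torch.exp(-self.log_scale)
        log_pz = -0.5 * (z ** 2).sum(dim=1) - 0.5 * z.shape[1] * math.log(2 * math.pi)
        log_det = -self.log_scale.sum()  # log |det dz/dx|
        return log_pz + log_det

dim, batch = 2, 64
gen = AffineFlow(dim)
disc = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, 1))
bce = nn.BCEWithLogitsLoss()
lambda_mle = 1.0  # hybrid weight (hyperparameter, assumed)

x_real = torch.randn(batch, dim) * 2 + 1  # stand-in for a batch of real data
z = torch.randn(batch, dim)
x_fake = gen(z)

# Discriminator loss: standard GAN binary cross-entropy on real vs. generated samples.
d_loss = bce(disc(x_real), torch.ones(batch, 1)) + \
         bce(disc(x_fake.detach()), torch.zeros(batch, 1))

# Generator loss: adversarial term plus lambda_mle times the negative log-likelihood of real data.
g_loss = bce(disc(x_fake), torch.ones(batch, 1)) - lambda_mle * gen.log_prob(x_real).mean()
```

In an actual Flow-GAN, the affine map above would be replaced by a deep normalizing flow (e.g., NICE or Real NVP), so that the same network supports both sampling for the adversarial term and exact likelihood evaluation for the maximum-likelihood term.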

