
Training Generative Reversible Networks

by Robin Tibor Schirrmeister, et al.
Universitätsklinikum Freiburg

Generative models with an encoding component, such as autoencoders, currently receive great interest. However, training autoencoders is typically complicated by the need to train a separate encoder and decoder and to constrain them to be inverses of each other. Here, we propose to use by-design reversible neural networks (RevNets) as a new class of generative models. We investigate the generative performance of RevNets on the CelebA dataset, showing that generative RevNets can indeed generate coherent faces of similar quality to those of variational autoencoders. This first attempt to use RevNets as a generative model still slightly underperforms recent advanced generative models with an autoencoder component on CelebA, but this gap may diminish with further optimization of the training setup of generative RevNets. In addition to the experiments on CelebA, we show a proof-of-principle experiment on the MNIST dataset suggesting that adversary-free trained RevNets can discover meaningful latent dimensions without pre-specifying the number of dimensions of the sampling distribution. In summary, this study shows that RevNets enable generative applications with an encoding component while removing the need to train separate encoder and decoder models.
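The key property the abstract relies on is that a RevNet is invertible by construction, so the same network serves as encoder (forward pass) and decoder (inverse pass). A minimal sketch of this idea, using an additive coupling block in NumPy, is shown below; the tanh subnetworks `F` and `G` and their random weights are illustrative stand-ins for learned residual functions, not the paper's actual architecture.

```python
# Sketch of one reversible (additive-coupling) block, as used in RevNets.
# The input is split into two halves; each half is updated with a residual
# function of the other, so the inverse can be computed exactly.
import numpy as np

rng = np.random.default_rng(0)
W_f = rng.normal(size=(4, 4))  # hypothetical weights for subnetwork F
W_g = rng.normal(size=(4, 4))  # hypothetical weights for subnetwork G

def F(h):
    # Stand-in residual function; any function works, invertibility
    # comes from the coupling structure, not from F itself.
    return np.tanh(h @ W_f)

def G(h):
    return np.tanh(h @ W_g)

def forward(x1, x2):
    # "Encoder" direction: data -> latent code.
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def inverse(y1, y2):
    # "Decoder" direction: exact inverse of forward, by construction.
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

x1, x2 = rng.normal(size=(2, 4))
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
assert np.allclose(r1, x1) and np.allclose(r2, x2)  # reconstruction is exact
```

Because the inverse is exact, there is no separate decoder to train and no reconstruction constraint to enforce, which is precisely the training simplification the abstract highlights.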


