
Training Generative Reversible Networks

06/05/2018
by   Robin Tibor Schirrmeister, et al.
Universitätsklinikum Freiburg

Generative models with an encoding component, such as autoencoders, currently receive great interest. However, training an autoencoder is typically complicated by the need to train a separate encoder and decoder that must be constrained to invert each other. Here, we propose to use reversible-by-design neural networks (RevNets) as a new class of generative models. We investigate the generative performance of RevNets on the CelebA dataset, showing that generative RevNets can indeed generate coherent faces of similar quality to those of Variational Autoencoders. This first attempt to use RevNets as a generative model still slightly underperforms recent advanced generative models that use an autoencoder component on CelebA, but this gap may diminish with further optimization of the training setup for generative RevNets. In addition to the experiments on CelebA, we present a proof-of-principle experiment on the MNIST dataset suggesting that adversary-free trained RevNets can discover meaningful dimensions without pre-specifying the number of latent dimensions of the sampling distribution. In summary, this study shows that RevNets enable generative applications with an encoding component while removing the need to train separate encoder and decoder models.
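The key property the abstract relies on is that a RevNet block is invertible by construction, so the encoder and decoder are the same network run in opposite directions. The following is a minimal sketch of one additive-coupling block in the style of RevNets (NumPy, with toy `F` and `G` sub-networks chosen here purely for illustration); note that the block is exactly invertible regardless of whether `F` and `G` themselves are invertible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sub-networks F and G. Any functions work: the invertibility of
# the coupling block does not depend on F or G being invertible.
W_f = rng.standard_normal((4, 4))
W_g = rng.standard_normal((4, 4))

def F(x):
    return np.tanh(x @ W_f)

def G(x):
    return np.tanh(x @ W_g)

def forward(x1, x2):
    """One additive-coupling RevNet block: (x1, x2) -> (y1, y2)."""
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def inverse(y1, y2):
    """Exact inverse, recovered from the outputs alone."""
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

# Round-trip check: decoding the encoding recovers the input exactly
# (up to floating-point error), with no separately trained decoder.
x1 = rng.standard_normal((3, 4))
x2 = rng.standard_normal((3, 4))
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
print(np.allclose(r1, x1) and np.allclose(r2, x2))  # True
```

Because the inverse is exact by construction, there is no reconstruction objective needed to keep an encoder and decoder reciprocal, which is the property the paper exploits for generative training.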

