Regularized Autoencoders via Relaxed Injective Probability Flow

02/20/2020
by Abhishek Kumar et al.

Invertible flow-based generative models are an effective method for learning to generate samples, while allowing for tractable likelihood computation and inference. However, the invertibility requirement restricts models to have the same latent dimensionality as the inputs. This imposes significant architectural, memory, and computational costs, making them more challenging to scale than other classes of generative models such as Variational Autoencoders (VAEs). We propose a generative model based on probability flows that does away with the bijectivity requirement on the model and only assumes injectivity. This also provides another perspective on regularized autoencoders (RAEs), with our final objectives resembling RAEs with specific regularizers that are derived by lower bounding the probability flow objective. We empirically demonstrate the promise of the proposed model, improving over VAEs and AEs in terms of sample quality.
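To make the relaxation concrete: for an injective decoder g: R^k → R^d with k < d, the bijective change-of-variables log-determinant is replaced by ½ log det(J_gᵀ J_g), where J_g is the d × k Jacobian of g. The sketch below illustrates this injective density formula with a linear map g(z) = Wz under a standard-normal prior; the linear decoder and all variable names are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Injective change of variables: for injective g: R^k -> R^d (k < d),
#   log p_X(g(z)) = log p_Z(z) - 0.5 * log det(J_g(z)^T J_g(z)),
# where J_g is the d x k Jacobian of g. For a linear map g(z) = W z,
# the Jacobian is simply W (constant in z).

rng = np.random.default_rng(0)
d, k = 5, 2
W = rng.standard_normal((d, k))  # hypothetical decoder Jacobian, full column rank

def log_density_injective(z, W):
    """log-density of x = W z on the image manifold, standard-normal prior on z."""
    log_pz = -0.5 * (z @ z + len(z) * np.log(2 * np.pi))
    _, logdet = np.linalg.slogdet(W.T @ W)  # log det(J^T J), well-defined for full rank
    return log_pz - 0.5 * logdet

z = rng.standard_normal(k)
print(log_density_injective(z, W))
```

When W has orthonormal columns, J_gᵀ J_g is the identity and the correction term vanishes, recovering the prior log-density; lower-bounding this log-determinant term is what yields the RAE-style regularizers mentioned above.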


