Gaussian Auto-Encoder

11/12/2018
by Jarek Duda, et al.

Evaluating the distance between a sample distribution and a target distribution, usually Gaussian, is a difficult task required to train generative Auto-Encoders. After the original Variational Auto-Encoder (VAE), which uses KL divergence, superiority has been claimed for distances based on the Wasserstein metric (WAE, SWAE) and for the L_2 distance between KDE Gaussian-smoothed samples over all 1D projections (CWAE). This article likewise derives formulas for the L_2 distance of a KDE Gaussian-smoothed sample, but this time directly using multivariate Gaussians, and additionally optimizes a position-dependent covariance matrix with a mean-field approximation, for application in a purely Gaussian Auto-Encoder (GAE).
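To make the core quantity concrete: the squared L_2 distance between a sample smoothed by Gaussian kernels and the standard normal target has a closed form, because products of Gaussians integrate to Gaussians. The sketch below assumes isotropic smoothing with a single bandwidth `eps` (the paper's position-dependent covariance optimization is not reproduced here); the function name and interface are illustrative, not from the paper.

```python
import numpy as np

def l2_gaussian_distance(sample, eps=0.5):
    """Squared L2 distance between the KDE smoothing of `sample`
    (kernels N(x_i, eps*I)) and the standard normal N(0, I).

    Uses the identity  integral N(x; a, A) N(x; b, B) dx = N(a; b, A+B),
    so all three terms of ||f - g||^2 = <f,f> - 2<f,g> + <g,g>
    reduce to Gaussian evaluations.
    """
    n, d = sample.shape
    sq = np.sum(sample**2, axis=1)
    # <f,f>: average over all pairs of smoothed kernels, covariance 2*eps*I
    pair = sq[:, None] + sq[None, :] - 2.0 * sample @ sample.T  # ||x_i - x_j||^2
    ff = (4 * np.pi * eps) ** (-d / 2) * np.exp(-pair / (4 * eps)).mean()
    # <f,g>: cross term of each kernel with N(0, I), covariance (1+eps)*I
    fg = (2 * np.pi * (1 + eps)) ** (-d / 2) * np.exp(-sq / (2 * (1 + eps))).mean()
    # <g,g>: constant self-overlap of the standard normal target
    gg = (4 * np.pi) ** (-d / 2)
    return ff - 2.0 * fg + gg
```

A sample drawn from N(0, I) should yield a value near zero, while a shifted sample yields a larger one; the quantity is differentiable in the sample points, which is what makes it usable as an Auto-Encoder training loss.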

