Towards Better Data Augmentation using Wasserstein Distance in Variational Auto-encoder

09/30/2021
by Zichuan Chen, et al.

A variational auto-encoder (VAE) compresses data into latent attributes and generates new data of different varieties. The VAE trained with KL divergence has been considered an effective technique for data augmentation. In this paper, we propose using the Wasserstein distance as the measure of distributional similarity for the latent attributes, and show that it yields a superior theoretical evidence lower bound (ELBO) compared with that of KL divergence under mild conditions. Through multiple experiments, we demonstrate that the new loss function exhibits better convergence properties and generates artificial images that could better aid image classification tasks.
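Since the full paper is not reproduced on this page, the following is only a minimal PyTorch sketch of the general idea: a VAE whose latent regularizer is the closed-form 2-Wasserstein distance between the diagonal-Gaussian posterior and a standard-normal prior, substituted for the usual KL term. The architecture, hyperparameters, and names (WassersteinVAE, w2_to_standard_normal) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WassersteinVAE(nn.Module):
    """Toy VAE (hypothetical, for illustration) whose latent regularizer is
    the closed-form 2-Wasserstein distance between the diagonal-Gaussian
    posterior N(mu, diag(std^2)) and the standard-normal prior N(0, I),
    in place of the usual KL term."""

    def __init__(self, in_dim=784, hidden=400, latent=20):
        super().__init__()
        self.enc = nn.Linear(in_dim, hidden)
        self.fc_mu = nn.Linear(hidden, latent)
        self.fc_log_var = nn.Linear(hidden, latent)
        self.dec1 = nn.Linear(latent, hidden)
        self.dec2 = nn.Linear(hidden, in_dim)

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, log_var = self.fc_mu(h), self.fc_log_var(h)
        std = torch.exp(0.5 * log_var)
        z = mu + std * torch.randn_like(std)  # reparameterization trick
        recon = torch.sigmoid(self.dec2(F.relu(self.dec1(z))))
        return recon, mu, std

def w2_to_standard_normal(mu, std):
    # For diagonal Gaussians the 2-Wasserstein distance has a closed form:
    # W2^2(N(mu, diag(std^2)), N(0, I)) = ||mu||^2 + sum_i (std_i - 1)^2.
    return (mu.pow(2) + (std - 1.0).pow(2)).sum(dim=1)

def loss_fn(recon, x, mu, std, beta=1.0):
    # Reconstruction term plus the Wasserstein regularizer replacing KL.
    rec = F.binary_cross_entropy(recon, x, reduction="sum")
    return rec + beta * w2_to_standard_normal(mu, std).sum()
```

Reverting to a standard VAE would amount to replacing w2_to_standard_normal with the analytic KL term for a diagonal Gaussian against N(0, I), i.e. 0.5 * (mu**2 + std**2 - 1 - 2 * std.log()).sum(dim=1); the rest of the training loop is unchanged.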


Related research:

11/12/2018 · Gaussian Auto-Encoder
Evaluating distance between sample distribution and the wanted one, usua...

02/26/2023 · Key-Exchange Convolutional Auto-Encoder for Data Augmentation in Early Knee OsteoArthritis Classification
Knee OsteoArthritis (KOA) is a prevalent musculoskeletal condition that ...

05/31/2021 · Consistency Regularization for Variational Auto-Encoders
Variational auto-encoders (VAEs) are a powerful approach to unsupervised...

06/10/2019 · An Image Clustering Auto-Encoder Based on Predefined Evenly-Distributed Class Centroids and MMD Distance
In this paper, we propose an end-to-end image clustering auto-encoder al...

10/29/2019 · Bridging the ELBO and MMD
One of the challenges in training generative models such as the variatio...

06/28/2022 · AS-IntroVAE: Adversarial Similarity Distance Makes Robust IntroVAE
Recently, introspective models like IntroVAE and S-IntroVAE have excelle...

09/13/2017 · Sketch-pix2seq: a Model to Generate Sketches of Multiple Categories
Sketch is an important media for human to communicate ideas, which refle...
