Improving Variational Autoencoder for Text Modelling with Timestep-Wise Regularisation

11/02/2020
by Ruizhe Li, et al.

The Variational Autoencoder (VAE) is a popular and powerful model for text modelling, capable of generating diverse sentences. However, when applied to text, the VAE often suffers from posterior collapse (also known as KL loss vanishing): the approximate posterior collapses to the prior, so the model ignores the latent variables entirely and degrades to a plain language model during generation. This issue is particularly prevalent in RNN-based VAE models. In this paper, we propose a simple, generic architecture called Timestep-Wise Regularisation VAE (TWR-VAE), which effectively avoids posterior collapse and can be applied to any RNN-based VAE model. We demonstrate the effectiveness and versatility of our model on two tasks: language modelling and dialogue response generation.
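The core idea is easy to sketch. The following is a minimal, hypothetical PyTorch illustration, not the authors' implementation: the encoder RNN parameterises a posterior at every timestep, and the KL term is averaged across all timesteps rather than computed only at the final hidden state, so the regularisation signal is spread over the whole sequence. The class name TWRVAEEncoder and the dimension defaults are illustrative assumptions.

import torch
import torch.nn as nn

class TWRVAEEncoder(nn.Module):
    # Hypothetical sketch of timestep-wise KL regularisation for an RNN-based VAE.
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512, latent_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)      # posterior mean per timestep
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)  # posterior log-variance per timestep

    def forward(self, tokens):
        # Hidden states for every timestep: (batch, seq_len, hidden_dim)
        h, _ = self.rnn(self.embed(tokens))
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # KL(q(z_t | x) || N(0, I)) at every timestep, averaged over the sequence,
        # instead of a single KL term on the final hidden state as in a vanilla RNN-VAE.
        kl_per_step = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        kl_loss = kl_per_step.mean()
        # Reparameterised sample from the final timestep feeds the decoder.
        std = (0.5 * logvar[:, -1]).exp()
        z = mu[:, -1] + std * torch.randn_like(std)
        return z, kl_loss

Because every timestep contributes a KL penalty, the optimiser cannot drive a single posterior to the prior to zero out the regulariser, which is one intuition for why this construction resists posterior collapse.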


Related research

11/13/2019 · A Stable Variational Autoencoder for Text Modelling
Variational Autoencoder (VAE) is a powerful method for learning represen...

04/27/2020 · A Batch Normalized Inference Network Keeps the KL Vanishing Away
Variational Autoencoder (VAE) is widely used as a generative model to ap...

03/17/2019 · Topic-Guided Variational Autoencoders for Text Generation
We propose a topic-guided variational autoencoder (TGVAE) model for text...

02/08/2017 · A Hybrid Convolutional Variational Autoencoder for Text Generation
In this paper we explore the effect of architectural choices on learning...

04/04/2019 · Riemannian Normalizing Flow on Variational Wasserstein Autoencoder for Text Modeling
Recurrent Variational Autoencoder has been widely used for language mode...

04/30/2020 · APo-VAE: Text Generation in Hyperbolic Space
Natural language often exhibits inherent hierarchical structure ingraine...

10/22/2022 · Recurrence Boosts Diversity! Revisiting Recurrent Latent Variable in Transformer-Based Variational AutoEncoder for Diverse Text Generation
Variational Auto-Encoder (VAE) has been widely adopted in text generatio...
