Recurrence Boosts Diversity! Revisiting Recurrent Latent Variable in Transformer-Based Variational AutoEncoder for Diverse Text Generation

10/22/2022
by   Jinyi Hu, et al.

The Variational Auto-Encoder (VAE) has been widely adopted in text generation. Among its many variants, the recurrent VAE learns token-wise latent variables, each conditioned on the preceding ones, which better captured sequential variability in the RNN era. However, it is unclear how to incorporate such recurrent dynamics into the now-dominant Transformer, whose computation is parallel rather than sequential. In this work, we propose TRACE, a Transformer-based recurrent VAE. TRACE imposes recurrence on segment-wise latent variables, where the text is split into arbitrarily separated segments, and constructs the posterior distribution with residual parameterization. In addition, we design an acceleration method based on approximating idempotent matrices, which permits parallelism while preserving the conditional dependence among latent variables. We show that TRACE enhances the entanglement between each segment and the preceding latent variables, and we derive a non-zero lower bound on the KL term, providing a theoretical guarantee of generation diversity. Experiments on two unconditional and one conditional generation task show that TRACE achieves significantly improved diversity while maintaining satisfactory generation quality.
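The core mechanism, segment-wise latent variables with each posterior conditioned on the preceding latent via residual parameterization, can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' code: the class and module names (RecurrentLatentChain, prior_net, resid_net), the assumption that segment encodings are already computed by a Transformer encoder, and the exact residual form (posterior mean and log-variance expressed as offsets from the prior's) are all assumptions made for the sketch.

```python
# Hedged sketch of segment-wise recurrent latents with residual
# parameterization (illustrative; not the TRACE implementation).
import torch
import torch.nn as nn

class RecurrentLatentChain(nn.Module):
    """For each text segment t, draws z_t conditioned on z_{t-1}:
    prior p(z_t | z_{t-1}) and posterior q(z_t | x_t, z_{t-1}),
    with the posterior parameterized as a residual on the prior."""

    def __init__(self, seg_dim: int, z_dim: int):
        super().__init__()
        self.prior_net = nn.Linear(z_dim, 2 * z_dim)            # z_{t-1} -> prior (mu, logvar)
        self.resid_net = nn.Linear(seg_dim + z_dim, 2 * z_dim)  # (x_t, z_{t-1}) -> residual offsets
        self.z_dim = z_dim

    def forward(self, seg_feats: torch.Tensor):
        # seg_feats: (batch, num_segments, seg_dim) segment encodings
        B, T, _ = seg_feats.shape
        z_prev = seg_feats.new_zeros(B, self.z_dim)
        zs, kls = [], []
        for t in range(T):
            mu_p, logvar_p = self.prior_net(z_prev).chunk(2, dim=-1)
            d_mu, d_logvar = self.resid_net(
                torch.cat([seg_feats[:, t], z_prev], dim=-1)).chunk(2, dim=-1)
            # Residual parameterization: posterior = prior + learned offset,
            # so the posterior stays anchored to the conditional prior.
            mu_q, logvar_q = mu_p + d_mu, logvar_p + d_logvar
            z = mu_q + torch.randn_like(mu_q) * (0.5 * logvar_q).exp()
            # Closed-form Gaussian KL(q || p), summed over latent dims.
            kl = 0.5 * (logvar_p - logvar_q
                        + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp()
                        - 1).sum(-1)
            zs.append(z)
            kls.append(kl)
            z_prev = z
        return torch.stack(zs, dim=1), torch.stack(kls, dim=1)
```

Note that this sketch computes the latent chain with an explicitly sequential loop; the acceleration method described in the abstract (approximating idempotent matrices) exists precisely to avoid this sequential bottleneck, and is not reproduced here.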


Related research

07/13/2022
Fuse It More Deeply! A Variational Transformer with Layer-Wise Latent Variable Inference for Text Generation
The past several years have witnessed Variational Auto-Encoder's superio...

11/02/2020
Improving Variational Autoencoder for Text Modelling with Timestep-Wise Regularisation
The Variational Autoencoder (VAE) is a popular and powerful model applie...

05/24/2019
mu-Forcing: Training Variational Recurrent Autoencoders for Text Generation
It has been previously observed that training Variational Recurrent Auto...

03/26/2019
Improve Diverse Text Generation by Self Labeling Conditional Variational Auto Encoder
Diversity plays a vital role in many text generating applications. In re...

04/01/2021
WakaVT: A Sequential Variational Transformer for Waka Generation
Poetry generation has long been a challenge for artificial intelligence....

08/10/2021
Regularized Sequential Latent Variable Models with Adversarial Neural Networks
The recurrent neural networks (RNN) with richly distributed internal sta...

11/14/2022
Evade the Trap of Mediocrity: Promoting Diversity and Novelty in Text Generation via Concentrating Attention
Recently, powerful Transformer architectures have proven superior in gen...
