Paraphrase Generation with Latent Bag of Words

01/07/2020
by Yao Fu, et al.

Paraphrase generation is a longstanding and important problem in natural language processing. Recent progress in deep generative models has also shown promising results for text generation with discrete latent variables. Inspired by variational autoencoders with discrete latent structures, we propose a latent bag-of-words (BOW) model for paraphrase generation. We ground the semantics of a discrete latent variable in the BOW of the target sentences, and we use this latent variable to build a fully differentiable content-planning and surface-realization model. Specifically, we use the source words to predict their neighbors and model the target BOW with a mixture of softmaxes. We use Gumbel top-k reparameterization to perform differentiable subset sampling from the predicted BOW distribution, then retrieve the embeddings of the sampled words and use them to augment the decoder and guide its generation search space. Our latent BOW model not only enhances the decoder but also exhibits clear interpretability. We demonstrate the model's interpretability with respect to (i) the unsupervised learning of word neighbors and (ii) the step-by-step generation procedure. Extensive experiments show the transparent and effective generation process of this model.
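To make the pipeline concrete, below is a minimal PyTorch-style sketch of the latent BOW bottleneck the abstract describes: per-source-word neighbor softmaxes, their mixture as the target BOW distribution, Gumbel top-k subset sampling, and retrieval of the sampled embeddings for the decoder. This is an illustrative assumption, not the authors' released code; in particular, the class and parameter names (`LatentBOW`, `neighbor_proj`) and the score-weighted embedding relaxation are hypothetical choices. The Gumbel top-k step relies on a known fact: perturbing log-probabilities with i.i.d. Gumbel(0, 1) noise and keeping the k largest scores samples a size-k subset without replacement.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LatentBOW(nn.Module):
    """Sketch of a latent bag-of-words bottleneck: each source word predicts
    a softmax over the vocabulary (its "neighbors"), the target BOW is the
    mixture of those softmaxes, and Gumbel top-k draws a differentiable
    size-k subset whose embeddings are handed to the decoder."""

    def __init__(self, vocab_size: int, emb_dim: int, k: int):
        super().__init__()
        self.k = k
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Hypothetical per-word neighbor predictor; the paper predicts a
        # neighbor distribution per source word, but may parameterize it
        # differently.
        self.neighbor_proj = nn.Linear(emb_dim, vocab_size)

    def forward(self, src_ids: torch.Tensor, src_mask: torch.Tensor):
        # src_ids:  (B, L) source token ids
        # src_mask: (B, L) 1.0 for real tokens, 0.0 for padding
        src_emb = self.embed(src_ids)                                     # (B, L, D)
        neighbor_probs = F.softmax(self.neighbor_proj(src_emb), dim=-1)   # (B, L, V)

        # Mixture of softmaxes: average the per-word neighbor distributions
        # over valid source positions to get the target BOW distribution.
        w = src_mask.unsqueeze(-1)
        bow_probs = (neighbor_probs * w).sum(dim=1) / w.sum(dim=1)        # (B, V)

        # Gumbel top-k reparameterization: add i.i.d. Gumbel(0, 1) noise to
        # the log-probabilities and keep the k largest perturbed scores,
        # i.e., sample a size-k subset without replacement.
        gumbel = -torch.log(-torch.log(torch.rand_like(bow_probs) + 1e-20) + 1e-20)
        scores = torch.log(bow_probs + 1e-20) + gumbel
        topk_scores, topk_ids = scores.topk(self.k, dim=-1)               # (B, k)

        # Retrieve the sampled word embeddings. Weighting them by their
        # renormalized scores is one simple relaxation that keeps gradients
        # flowing into bow_probs despite the discrete top-k indices.
        weights = F.softmax(topk_scores, dim=-1).unsqueeze(-1)            # (B, k, 1)
        plan_emb = self.embed(topk_ids) * weights                         # (B, k, D)

        # bow_probs is trained against the actual BOW of the target sentence
        # (grounding the latent variable); plan_emb augments the seq2seq
        # decoder, e.g., as extra attention memory.
        return bow_probs, topk_ids, plan_emb
```

In this reading, a BOW loss between `bow_probs` and the words of the target sentence grounds the latent variable as the abstract states, while the Gumbel top-k relaxation keeps the path from the neighbor predictions through the sampled subset to the decoder fully differentiable.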


Related research

02/01/2021 · GraphDF: A Discrete Flow Model for Molecular Graph Generation
We consider the problem of molecular graph generation using deep models...

06/12/2018 · Gaussian mixture models with Wasserstein distance
Generative models with both discrete and continuous latent variables are...

05/29/2019 · Latent Space Secrets of Denoising Text-Autoencoders
While neural language models have recently demonstrated impressive perfo...

03/03/2022 · Deep Latent-Variable Models for Text Generation
Text generation aims to produce human-like natural language output for d...

06/16/2019 · Fixing Gaussian Mixture VAEs for Interpretable Text Generation
Variational auto-encoder (VAE) with Gaussian priors is effective in text...

10/07/2020 · Narrative Text Generation with a Latent Discrete Plan
Past work on story generation has demonstrated the usefulness of conditi...

02/06/2018 · Improving Variational Encoder-Decoders in Dialogue Generation
Variational encoder-decoders (VEDs) have shown promising results in dial...
