Towards Diverse Paraphrase Generation Using Multi-Class Wasserstein GAN

09/30/2019
by   Zhecheng An, et al.

Paraphrase generation is an important and challenging natural language processing (NLP) task. In this work, we propose a deep generative model to generate diverse paraphrases. Our model is based on an encoder-decoder architecture, with an additional transcoder that converts a sentence into its paraphrasing latent code. The transcoder is conditioned on an explicit pattern embedding variable, so diverse paraphrases can be generated by sampling this variable. We use a Wasserstein GAN to align the distributions of real and generated paraphrase samples, and we propose a multi-class extension to the Wasserstein GAN that allows our generative model to learn from both positive and negative samples. The generated paraphrase distribution is forced closer to the positive real distribution and pushed away from the negative distribution in Wasserstein distance. We evaluate our model on two datasets with both automatic metrics and human evaluation. Results show that our model generates fluent and reliable paraphrases that outperform the state-of-the-art, while also providing reasonable variability and diversity.
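The multi-class objective described above can be illustrated with a small sketch. The abstract does not give the exact loss, so the following is only an assumed form: the dual (Kantorovich-Rubinstein) estimate of the Wasserstein-1 distance from a 1-Lipschitz critic's scores, combined so that the generator minimizes the distance to the positive (true paraphrase) distribution while keeping the distance to the negative distribution above a margin. The function names and the hinge-with-margin formulation are hypothetical, not taken from the paper.

```python
import numpy as np

def wasserstein_estimate(critic_real, critic_fake):
    """Dual-form estimate of the Wasserstein-1 distance:
    W ~= E[f(real)] - E[f(fake)] for a 1-Lipschitz critic f.
    Inputs are arrays of critic scores on samples from each distribution."""
    return np.mean(critic_real) - np.mean(critic_fake)

def multiclass_generator_loss(critic_gen, critic_pos, critic_neg, margin=1.0):
    """Assumed multi-class generator objective: pull generated samples
    toward the positive (real paraphrase) distribution and push them
    away from the negative distribution, via a hinge with a margin."""
    w_pos = wasserstein_estimate(critic_pos, critic_gen)  # distance to positives (minimize)
    w_neg = wasserstein_estimate(critic_neg, critic_gen)  # distance to negatives (keep large)
    return w_pos + max(0.0, margin - w_neg)
```

In a full training loop the critic scores would come from a neural network constrained to be (approximately) 1-Lipschitz, e.g. via gradient penalty, and the generator would backpropagate through this loss; the sketch only shows how the two distances combine.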

