D-PAGE: Diverse Paraphrase Generation

08/13/2018
by Qiongkai Xu, et al.

In this paper, we investigate the diversity aspect of paraphrase generation. Prior deep learning models either rely on decoding strategies or add random noise to the input to vary their outputs. We propose a simple method, Diverse Paraphrase Generation (D-PAGE), which extends neural machine translation (NMT) models to generate diverse paraphrases with implicit rewriting patterns. Our experimental results on two real-world benchmark datasets demonstrate that our model generates outputs at least one order of magnitude more diverse than the baselines, as measured by a new evaluation metric, Jeffrey's Divergence. We have also conducted extensive experiments to understand various properties of our model, with a focus on diversity.
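Jeffrey's Divergence is the standard symmetrized Kullback-Leibler divergence, J(P, Q) = KL(P||Q) + KL(Q||P). The abstract does not specify which distributions the paper compares (that is an assumption here; diversity metrics of this kind are often computed over n-gram or token distributions of generated outputs), but the quantity itself can be sketched in a few lines:

```python
import math

def jeffreys_divergence(p, q, eps=1e-12):
    """Symmetrized KL divergence: J(P, Q) = KL(P||Q) + KL(Q||P).

    p, q: probability distributions over the same support, as sequences
    of floats summing to 1. A small eps guards against log(0); the exact
    smoothing used in the paper is not stated in the abstract.
    """
    return sum((pi - qi) * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))

# Two toy distributions over a shared 3-symbol vocabulary
# (hypothetical data, for illustration only):
p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
print(jeffreys_divergence(p, q))  # positive; 0 iff the distributions match
```

The divergence is symmetric in its arguments and grows as the two output distributions diverge, which is why a higher value indicates more diverse generations.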


