Joint Copying and Restricted Generation for Paraphrase

11/28/2016
by Ziqiang Cao, et al.

Many natural language generation tasks, such as abstractive summarization and text simplification, are paraphrase-oriented. In these tasks, copying and rewriting are the two main writing modes, a fact that most previous sequence-to-sequence (Seq2Seq) models, which use a single decoder, neglect. In this paper, we develop a novel Seq2Seq model that fuses a copying decoder and a restricted generative decoder. The copying decoder finds the positions to be copied based on a typical attention model. The generative decoder produces words restricted to a source-specific vocabulary. To combine the two decoders and determine the final output, we develop a predictor of the current writing mode (copying or rewriting), which can be guided by the actual writing mode observed in the training data. We conduct extensive experiments on two different paraphrase datasets. The results show that our model outperforms state-of-the-art approaches in terms of both informativeness and language quality.
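The fusion of the two decoders can be pictured with a small numerical sketch. This is a minimal illustration under assumptions, not the paper's implementation: in the actual model the attention scores, generation logits, and copying probability come from trained neural components, whereas here they are toy numpy arrays, and all function and variable names (joint_output_distribution, p_copy, restricted_ids, and so on) are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def joint_output_distribution(attn_scores, gen_logits, src_ids,
                              restricted_ids, p_copy, vocab_size):
    """Fuse a copying decoder and a restricted generative decoder.

    attn_scores    -- unnormalized attention scores over source positions
    gen_logits     -- generative-decoder logits over the full vocabulary
    src_ids        -- vocabulary id of the token at each source position
    restricted_ids -- ids forming the source-specific vocabulary
    p_copy         -- mode-predictor probability of copying at this step
    """
    # Copying decoder: normalized attention weights, scattered onto the
    # vocabulary ids of the source tokens they point at.
    attn = softmax(np.asarray(attn_scores, dtype=float))
    copy_dist = np.zeros(vocab_size)
    for pos, tok in enumerate(src_ids):
        copy_dist[tok] += attn[pos]

    # Restricted generative decoder: softmax taken only over the
    # source-specific vocabulary; all other words get zero probability.
    gen_dist = np.zeros(vocab_size)
    logits = np.asarray(gen_logits, dtype=float)
    gen_dist[restricted_ids] = softmax(logits[restricted_ids])

    # Mode predictor: interpolate the two decoders' distributions.
    return p_copy * copy_dist + (1.0 - p_copy) * gen_dist

# Toy example: a 10-word vocabulary, a 3-token source sentence.
dist = joint_output_distribution(
    attn_scores=[2.0, 0.5, 0.1],     # copy decoder favors source position 0
    gen_logits=np.random.randn(10),  # generative decoder over full vocab
    src_ids=[4, 7, 2],               # vocabulary ids of the source tokens
    restricted_ids=[1, 2, 4, 7, 9],  # source-specific vocabulary
    p_copy=0.6,                      # mode predictor leans toward copying
    vocab_size=10,
)
print(dist.argmax(), dist.sum())     # most likely word id; sums to 1.0
```

In the paper the interpolation weight is produced by a learned predictor that can be supervised with the observed writing mode; the constant p_copy here merely stands in for that prediction.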

Related research

02/25/2019 · Pretraining-Based Natural Language Generation for Text Summarization
In this paper, we propose a novel pretraining-based encoder-decoder fram...

06/12/2019 · Keeping Notes: Conditional Natural Language Generation with a Scratchpad Mechanism
We introduce the Scratchpad Mechanism, a novel addition to the sequence-...

03/05/2018 · Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation
Most recent approaches use the sequence-to-sequence model for paraphrase...

08/02/2017 · Deep Recurrent Generative Decoder for Abstractive Text Summarization
We propose a new framework for abstractive text summarization based on a...

05/31/2021 · Reinforced Generative Adversarial Network for Abstractive Text Summarization
Sequence-to-sequence models provide a viable new approach to generative ...

10/02/2018 · Structured Multi-Label Biomedical Text Tagging via Attentive Neural Tree Decoding
We propose a model for tagging unstructured texts with an arbitrary numb...

01/01/2023 · Inflected Forms Are Redundant in Question Generation Models
Neural models with an encoder-decoder framework provide a feasible solut...
