Global Encoding for Abstractive Summarization

05/10/2018
by Junyang Lin et al.

In neural abstractive summarization, the conventional sequence-to-sequence (seq2seq) model often suffers from repetition and semantic irrelevance. To tackle this problem, we propose a global encoding framework, which controls the information flow from the encoder to the decoder based on the global information of the source context. The framework consists of a convolutional gated unit that performs global encoding to improve the representations of source-side information. Evaluations on both LCSTS and the English Gigaword demonstrate that our model outperforms the baseline models, and further analysis shows that it is capable of reducing repetition.
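
To illustrate the core idea of gating encoder representations with global source context, below is a minimal PyTorch sketch of a convolutional gated unit. It is an assumption-laden illustration, not the authors' released implementation: the module name ConvGatedUnit, the kernel size, and the use of mean pooling as the global-context signal are simplifications introduced here for clarity.

# Hypothetical sketch of a convolutional gated unit (CGU) applied to encoder states.
# The names and design choices here are illustrative, not the paper's exact module.
import torch
import torch.nn as nn

class ConvGatedUnit(nn.Module):
    """Gates each encoder hidden state with a signal computed from local
    convolutional context and a pooled representation of the whole source."""

    def __init__(self, hidden_size: int, kernel_size: int = 3):
        super().__init__()
        # 1-D convolution over the time dimension captures local n-gram context.
        self.conv = nn.Conv1d(hidden_size, hidden_size,
                              kernel_size, padding=kernel_size // 2)
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, enc_states: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, src_len, hidden_size)
        local = self.conv(enc_states.transpose(1, 2)).transpose(1, 2)
        # Mean pooling stands in for the global source context (a simplification).
        global_ctx = enc_states.mean(dim=1, keepdim=True).expand_as(enc_states)
        # Sigmoid gate filters the information flow from encoder to decoder.
        g = torch.sigmoid(self.gate(torch.cat([local, global_ctx], dim=-1)))
        return enc_states * g


if __name__ == "__main__":
    cgu = ConvGatedUnit(hidden_size=512)
    states = torch.randn(4, 30, 512)   # batch of 4 sources, 30 tokens each
    gated = cgu(states)
    print(gated.shape)                 # torch.Size([4, 30, 512])

The gated states would replace the raw encoder outputs fed to the decoder's attention, so that source positions judged irrelevant in the global context contribute less to each decoding step.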


Related research

04/24/2017 · Selective Encoding for Abstractive Sentence Summarization
We propose a selective encoding model to extend the sequence-to-sequence...

07/04/2016 · Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization
In this work, we introduce temporal hierarchies to the sequence to seque...

03/25/2020 · Learning Syntactic and Dynamic Selective Encoding for Document Summarization
Text summarization aims to generate a headline or a short summary consis...

09/12/2018 · Closed-Book Training to Improve Summarization Encoder Memory
A good neural sequence-to-sequence summarization model should have a str...

10/06/2017 · A Semantic Relevance Based Neural Network for Text Summarization and Text Simplification
Text summarization and text simplification are two major ways to simplif...

06/12/2019 · BiSET: Bi-directional Selective Encoding with Template for Abstractive Summarization
The success of neural summarization models stems from the meticulous enc...

08/01/2023 · Tackling Hallucinations in Neural Chart Summarization
Hallucinations in text generation occur when the system produces text th...
