
Neural Abstractive Text Summarization with Sequence-to-Sequence Models

12/05/2018
by Tian Shi, et al.
Virginia Polytechnic Institute and State University

In the past few years, neural abstractive text summarization with sequence-to-sequence (seq2seq) models has gained a lot of popularity. Many interesting techniques have been proposed to improve seq2seq models, making them capable of handling different challenges, such as saliency, fluency, and human readability, and of generating high-quality summaries. Generally speaking, most of these techniques differ in one of three categories: network structure, parameter inference, and decoding/generation. There are also other concerns, such as efficiency and parallelism for training a model. In this paper, we provide a comprehensive literature and technical survey of different seq2seq models for abstractive text summarization from the viewpoints of network structures, training strategies, and summary generation algorithms. Many of these models were first proposed for language modeling and generation tasks, such as machine translation, and were later applied to abstractive text summarization; therefore, we also provide a brief review of these models. As part of this survey, we have also developed an open-source library, the Neural Abstractive Text Summarizer (NATS) toolkit, for abstractive text summarization. An extensive set of experiments has been conducted on the widely used CNN/Daily Mail dataset to examine the effectiveness of several different neural network components. Finally, we benchmark two models implemented in NATS on two recently released datasets, namely Newsroom and Bytecup.
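To make the architecture the survey is about concrete, below is a minimal sketch of an attentional encoder-decoder summarizer in PyTorch. This is not the NATS implementation; the module names, dimensions, and design choices here (a bidirectional GRU encoder, additive attention, teacher forcing during training) are illustrative assumptions only.

# A minimal sketch of an attentional seq2seq summarizer (not the NATS code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional GRU encoder reads the source article.
        self.encoder = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        # Unidirectional GRU decoder generates the summary token by token.
        self.decoder = nn.GRUCell(emb_dim + 2 * hid_dim, hid_dim)
        self.attn = nn.Linear(2 * hid_dim + hid_dim, 1)   # additive attention score
        self.out = nn.Linear(hid_dim + 2 * hid_dim, vocab_size)
        self.bridge = nn.Linear(2 * hid_dim, hid_dim)      # map encoder state to decoder state

    def forward(self, src, tgt):
        # src: (batch, src_len), tgt: (batch, tgt_len) token ids
        enc_out, _ = self.encoder(self.embed(src))          # (B, S, 2H)
        h = torch.tanh(self.bridge(enc_out[:, -1]))         # initial decoder state (B, H)
        logits = []
        for t in range(tgt.size(1)):
            emb = self.embed(tgt[:, t])                     # teacher forcing: feed gold token
            # Additive attention over all encoder states.
            scores = self.attn(torch.cat(
                [enc_out, h.unsqueeze(1).expand(-1, enc_out.size(1), -1)], dim=-1)
            ).squeeze(-1)                                   # (B, S)
            alpha = F.softmax(scores, dim=-1)
            context = torch.bmm(alpha.unsqueeze(1), enc_out).squeeze(1)  # (B, 2H)
            h = self.decoder(torch.cat([emb, context], dim=-1), h)
            logits.append(self.out(torch.cat([h, context], dim=-1)))
        return torch.stack(logits, dim=1)                   # (B, tgt_len, vocab)

if __name__ == "__main__":
    model = Seq2SeqSummarizer(vocab_size=10000)
    src = torch.randint(0, 10000, (2, 40))   # a toy batch of 2 "articles"
    tgt = torch.randint(0, 10000, (2, 12))   # a toy batch of 2 "summaries"
    print(model(src, tgt).shape)             # torch.Size([2, 12, 10000])

In practice such a model is trained with token-level cross-entropy against reference summaries and decoded with greedy or beam search rather than teacher forcing; the techniques surveyed in the paper (pointer/copy mechanisms, coverage, alternative training and decoding strategies) extend this basic skeleton.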


Related Research

05/28/2019
LeafNATS: An Open-Source Toolkit and Live Demo System for Neural Abstractive Text Summarization
Neural abstractive text summarization (NATS) has received a lot of atten...

10/11/2019
Keyphrase Generation: A Multi-Aspect Survey
Extractive keyphrase generation research has been around since the ninet...

07/07/2017
Text Summarization Techniques: A Brief Survey
In recent years, there has been an explosion in the amount of text data f...

05/03/2023
Backdoor Learning on Sequence to Sequence Models
Backdoor learning has become an emerging research area towards building ...

04/05/2023
To Asymmetry and Beyond: Structured Pruning of Sequence to Sequence Models for Improved Inference Efficiency
Sequence-to-sequence language models can be used to produce abstractive ...

11/13/2017
Faithful to the Original: Fact Aware Neural Abstractive Summarization
Unlike extractive summarization, abstractive summarization has to fuse d...

02/05/2018
Diverse Beam Search for Increased Novelty in Abstractive Summarization
Text summarization condenses a text to a shorter version while retaining...

Code Repositories

pycorrector

pycorrector is a toolkit for text error correction. It provides implementations of Kenlm, Seq2Seq_Attention, BERT, MacBERT, ELECTRA, ERNIE, Transformer, and other models, ready to use out of the box.

NATS

Neural Abstractive Text Summarization with Sequence-to-Sequence Models

LeafNATS

Learning Framework for Neural Abstractive Text Summarization

pycorrector

A typo-correction project that calls the pycorrector API and applies rule-based correction.

politicalsynthesis

HackMIT 2019!!!