Stepwise Extractive Summarization and Planning with Structured Transformers

10/06/2020
by Shashi Narayan et al.

We propose encoder-centric stepwise models for extractive summarization using structured transformers – HiBERT and Extended Transformers. We enable stepwise summarization by injecting the previously generated summary into the structured transformer as an auxiliary sub-structure. Our models are not only efficient at modeling the structure of long inputs, but also do not rely on task-specific redundancy-aware modeling, making them general-purpose extractive content planners for different tasks. When evaluated on CNN/DailyMail extractive summarization, the stepwise models achieve state-of-the-art ROUGE scores without any redundancy-aware modeling or sentence filtering. The same holds for RotoWire table-to-text generation, where our models surpass previously reported metrics for content selection, planning, and ordering, highlighting the strength of stepwise modeling. Of the two structured transformers we test, stepwise Extended Transformers provide the best performance across both datasets and set a new standard for these tasks.
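
The stepwise idea in the abstract can be sketched in a few lines: at every step the document is re-encoded together with the summary extracted so far, and the highest-scoring remaining sentence is added. The sketch below is a minimal illustration under assumed interfaces, not the paper's implementation: the score_fn callable and the toy_score scorer are stand-ins for the HiBERT / Extended Transformer encoders, which consume the summary-so-far as an auxiliary sub-structure.

    # Minimal sketch of stepwise extraction (illustrative only).
    # `score_fn(sentences, summary_so_far)` is an assumed interface that
    # returns one relevance score per document sentence; in the paper this
    # role is played by a structured transformer that also encodes the
    # summary selected so far.
    from typing import Callable, List

    def stepwise_extract(
        sentences: List[str],
        score_fn: Callable[[List[str], List[str]], List[float]],
        max_steps: int = 3,
    ) -> List[int]:
        """Greedily select one sentence per step, re-scoring the document
        conditioned on the partial summary (the 'stepwise' part)."""
        selected: List[int] = []
        for _ in range(min(max_steps, len(sentences))):
            summary = [sentences[i] for i in selected]
            scores = score_fn(sentences, summary)
            # Each sentence may be extracted at most once.
            best = max(
                (i for i in range(len(sentences)) if i not in selected),
                key=lambda i: scores[i],
            )
            selected.append(best)
        return selected

    if __name__ == "__main__":
        doc = [
            "The storm closed every road into town.",
            "Officials opened two emergency shelters.",
            "A local bakery stayed open all night.",
            "Roads are expected to reopen on Friday.",
        ]

        # Toy stand-in scorer: favour long sentences, penalise word overlap
        # with the current summary (a crude proxy for the redundancy handling
        # that the real models learn implicitly).
        def toy_score(sents: List[str], summary: List[str]) -> List[float]:
            joined = " ".join(summary)
            return [len(s) - 5.0 * sum(w in joined for w in s.split()) for s in sents]

        print(stepwise_extract(doc, toy_score, max_steps=2))

Because the partial summary is re-injected into the encoder at every step, redundancy is handled by the conditioning itself rather than by a separate filtering heuristic, which is the property the abstract highlights.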
