The Current State of Summarization

05/08/2023
by Fabian Retkowski, et al.

With the explosive growth of textual information, summarization systems have become increasingly important. This work concisely surveys the current state of the art in abstractive text summarization. As part of this, we outline the ongoing paradigm shift towards pre-trained encoder-decoder models and large autoregressive language models. We then examine the challenges of evaluating summarization systems and the potential of instruction-tuned models for zero-shot summarization. Finally, we provide a brief overview of how summarization systems are currently being integrated into commercial applications.


