Features in Extractive Supervised Single-document Summarization: Case of Persian News

09/06/2019
by Hosein Rezaei, et al.

Text summarization has been one of the most challenging areas of research in NLP. Much effort has been made to overcome this challenge using either abstractive or extractive methods. Extractive methods are more popular due to their simplicity compared with the more elaborate abstractive methods. In extractive approaches, the system does not generate sentences. Instead, it learns to score the sentences within the text using textual features and then selects those with the highest rank. The core objective is therefore ranking, and ranking depends heavily on the document itself. This dependency has gone unnoticed in many state-of-the-art solutions. In this work, document-level features are integrated into the feature vector of every sentence. In this way, the system becomes informed about the context, which increases the precision of the learned model and consequently produces summaries that are both comprehensive and brief.
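The idea described above can be illustrated with a minimal sketch: each sentence gets a feature vector, document-level features are appended to every sentence vector, and a linear scorer ranks the sentences. All feature choices, function names, and weights below are illustrative assumptions, not the paper's actual feature set or model.

```python
# Minimal sketch of document-aware extractive ranking.
# Features and weights are hypothetical, chosen only for illustration.

def sentence_features(sentence, position, doc_len):
    """Per-sentence features: length and a positional score."""
    words = sentence.split()
    return [
        float(len(words)),          # sentence length in words
        1.0 - position / doc_len,   # earlier sentences score higher
    ]

def document_features(sentences):
    """Document-level features appended to every sentence vector."""
    words_per_sent = [len(s.split()) for s in sentences]
    return [
        float(len(sentences)),                       # number of sentences
        sum(words_per_sent) / len(words_per_sent),   # mean sentence length
    ]

def rank_sentences(sentences, weights):
    """Score each sentence with a linear model over the combined vector."""
    doc_vec = document_features(sentences)
    scored = []
    for i, s in enumerate(sentences):
        vec = sentence_features(s, i, len(sentences)) + doc_vec
        score = sum(w * f for w, f in zip(weights, vec))
        scored.append((score, i, s))
    return sorted(scored, reverse=True)

def summarize(sentences, weights, k=2):
    """Select the k highest-ranked sentences, kept in document order."""
    top = sorted(rank_sentences(sentences, weights)[:k], key=lambda t: t[1])
    return [s for _, _, s in top]
```

Because the document features are identical for every sentence in one document, they do not change the ranking within that document directly; their role, as the abstract suggests, is to let a model trained across many documents condition its scoring on the document's context.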
