Earlier Isn't Always Better: Sub-aspect Analysis on Corpus and System Biases in Summarization

08/30/2019
by   Taehee Jung, et al.

Despite recent developments in neural summarization systems, the underlying logic behind their improvements and their corpus-dependency remain largely unexplored. The position of sentences in the original text, for example, is a well-known bias in news summarization. Following the claim that summarization is a combination of sub-functions, we define three sub-aspects of summarization: position, importance, and diversity, and conduct an extensive analysis of the biases of each sub-aspect with respect to the domains of nine different summarization corpora (e.g., news, academic papers, meeting minutes, movie scripts, books, posts). We find that while position exhibits substantial bias in news articles, this is not the case, for example, with academic papers and meeting minutes. Furthermore, our empirical study shows that different types of summarization systems (e.g., neural-based) are composed of different degrees of the sub-aspects. Our study provides useful lessons regarding consideration of underlying sub-aspects when collecting a new summarization dataset or developing a new system.
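The position bias discussed above can be made concrete with a small, hypothetical sketch: one simple way to quantify it is the average normalized position of the sentences a system extracts for its summary, with values near 0 indicating a lead (beginning-of-document) bias. The function name and formula here are illustrative, not the paper's exact metric.

```python
def mean_normalized_position(selected_indices, num_sentences):
    """Average position of extracted summary sentences in the source,
    normalized to [0, 1] (0 = first sentence, 1 = last sentence).
    Values close to 0 suggest a lead bias, as seen in news corpora."""
    if not selected_indices or num_sentences < 2:
        return 0.0
    return sum(i / (num_sentences - 1) for i in selected_indices) / len(selected_indices)

# A news-like extractive summary built from the first three sentences
# of a 10-sentence article scores low, reflecting a strong lead bias.
print(mean_normalized_position([0, 1, 2], 10))  # → 0.111...
```

Under a measure like this, the abstract's finding would show up as news corpora scoring near 0 while academic papers and meeting minutes score closer to a uniform spread.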


