Sequence Level Contrastive Learning for Text Summarization

09/08/2021
by Shusheng Xu, et al.

Contrastive learning models have achieved great success in unsupervised visual representation learning: they maximize the similarities between feature representations of different views of the same image while minimizing the similarities between feature representations of views of different images. In text summarization, the output summary is a shorter form of the input document, and the two have similar meanings. In this paper, we propose a contrastive learning model for supervised abstractive text summarization, where we view a document, its gold summary, and its model-generated summaries as different views of the same mean representation and maximize the similarities between them during training. We improve over a strong sequence-to-sequence text generation model (i.e., BART) on three different summarization datasets. Human evaluation also shows that our model achieves better faithfulness ratings than its counterpart trained without contrastive objectives.
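The abstract does not spell out the exact objective, but the idea can be illustrated with a minimal sketch of an InfoNCE-style, sequence-level contrastive loss. Here `doc_emb` and `summ_emb` are assumed to be pooled representations (for example, mean-pooled encoder/decoder states of a model such as BART) of each document and its paired summary, whether gold or model-generated; the function name, pooling choice, and temperature are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn.functional as F

def sequence_contrastive_loss(doc_emb: torch.Tensor,
                              summ_emb: torch.Tensor,
                              temperature: float = 0.1) -> torch.Tensor:
    """Illustrative InfoNCE-style loss over pooled sequence representations.

    doc_emb:  (batch, dim) pooled document representations
    summ_emb: (batch, dim) pooled representations of the paired summaries
              (gold or model-generated views of the same content)

    Each (document, summary) pair from the same example is a positive;
    summaries paired with other documents in the batch act as negatives.
    """
    doc_emb = F.normalize(doc_emb, dim=-1)
    summ_emb = F.normalize(summ_emb, dim=-1)

    # Cosine similarities between every document and every summary in the batch.
    logits = doc_emb @ summ_emb.t() / temperature   # shape: (batch, batch)
    targets = torch.arange(doc_emb.size(0), device=doc_emb.device)

    # Symmetric cross-entropy: document-to-summary and summary-to-document.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))
```

In a full training setup such a term would presumably be added to the standard maximum-likelihood summarization loss of the sequence-to-sequence model; the relative weighting and which representations are pooled are details the abstract does not specify.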

Related research

10/05/2020 - Unsupervised Reference-Free Summary Quality Evaluation via Contrastive Learning
  Evaluation of a document summarization system has been a critical factor...

12/02/2021 - CO2Sum: Contrastive Learning for Factual-Consistent Abstractive Summarization
  Generating factual-consistent summaries is a challenging task for abstra...

08/26/2021 - Enhanced Seq2Seq Autoencoder via Contrastive Learning for Abstractive Text Summarization
  In this paper, we present a denoising sequence-to-sequence (seq2seq) aut...

08/26/2021 - Alleviating Exposure Bias via Contrastive Learning for Abstractive Text Summarization
  Encoder-decoder models have achieved remarkable success in abstractive t...

09/29/2022 - COLO: A Contrastive Learning based Re-ranking Framework for One-Stage Summarization
  Traditional training paradigms for extractive and abstractive summarizat...

04/07/2023 - On the Importance of Contrastive Loss in Multimodal Learning
  Recently, contrastive learning approaches (e.g., CLIP (Radford et al., 2...

04/02/2023 - Constructive Assimilation: Boosting Contrastive Learning Performance through View Generation Strategies
  Transformations based on domain expertise (expert transformations), such...
