CO2Sum: Contrastive Learning for Factual-Consistent Abstractive Summarization

12/02/2021
by Wei Liu, et al.

Generating factually consistent summaries is a challenging task for abstractive summarization. Previous works mainly encode factual information or perform post-correction/ranking after decoding. In this paper, we provide a factual-consistency solution from the perspective of contrastive learning, which is a natural extension of previous works. We propose CO2Sum (Contrastive for Consistency), a contrastive learning scheme that can be easily applied to sequence-to-sequence models for factual-consistent abstractive summarization, showing that a model can be made fact-aware without modifying its architecture. CO2Sum applies contrastive learning on the encoder, which helps the model be aware of the factual information contained in the input article, or on the decoder, which encourages the model to generate factually correct summaries. Moreover, these two schemes are orthogonal and can be combined to further improve faithfulness. Comprehensive experiments on public benchmarks demonstrate that CO2Sum improves the faithfulness of large pre-trained language models and achieves results competitive with other strong factual-consistent summarization baselines.
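To make the idea concrete, the sketch below shows one common way a contrastive term can be attached to a sequence-to-sequence summarizer: an InfoNCE-style loss over pooled encoder or decoder representations, where fact-corrupted summaries serve as negatives. This is a minimal illustration under those assumptions; the function and variable names are hypothetical, and the paper's exact formulation may differ.

import torch
import torch.nn.functional as F

def contrastive_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss (hypothetical sketch, not the paper's exact objective).

    anchor:    (batch, dim)        pooled encoder or decoder states
    positive:  (batch, dim)        representation of the factual-consistent pair
    negatives: (batch, n_neg, dim) representations of fact-corrupted pairs
    """
    # Cosine-similarity comparison via L2-normalized representations.
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = (anchor * positive).sum(-1, keepdim=True)        # (batch, 1)
    neg_sim = torch.einsum("bd,bnd->bn", anchor, negatives)    # (batch, n_neg)

    # Treat the positive as class 0 in a softmax over all candidates.
    logits = torch.cat([pos_sim, neg_sim], dim=-1) / temperature
    labels = torch.zeros(anchor.size(0), dtype=torch.long, device=anchor.device)
    return F.cross_entropy(logits, labels)

# In practice the contrastive term would be added to the usual
# negative log-likelihood training objective, e.g.:
#   loss = nll_loss + lambda_cl * contrastive_loss(anchor, positive, negatives)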

Related research

09/08/2021 · Sequence Level Contrastive Learning for Text Summarization
Contrastive learning models have achieved great success in unsupervised ...

09/19/2021 · CLIFF: Contrastive Learning for Improving Faithfulness and Factuality in Abstractive Summarization
We study generating abstractive summaries that are faithful and factuall...

07/10/2023 · Improving Factuality of Abstractive Summarization via Contrastive Reward Learning
Modern abstractive summarization models often generate summaries that co...

08/26/2021 · Enhanced Seq2Seq Autoencoder via Contrastive Learning for Abstractive Text Summarization
In this paper, we present a denoising sequence-to-sequence (seq2seq) aut...

05/15/2023 · A Hierarchical Encoding-Decoding Scheme for Abstractive Multi-document Summarization
Pre-trained language models (PLMs) have accomplished impressive achievem...

02/18/2022 · CLSEG: Contrastive Learning of Story Ending Generation
Story Ending Generation (SEG) is a challenging task in natural language ...

02/01/2023 · Leveraging Task Dependency and Contrastive Learning for Case Outcome Classification on European Court of Human Rights Cases
We report on an experiment in case outcome classification on European Co...
