VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?

10/08/2021
by Hieu Nguyen, et al.

Text summarization is a challenging task within natural language processing that involves text generation from lengthy input sequences. While this task has been widely studied in English, there is very limited research on summarization for Vietnamese text. In this paper, we investigate the robustness of transformer-based encoder-decoder architectures for Vietnamese abstractive summarization. Leveraging transfer learning and self-supervised learning, we validate the performance of the methods on two Vietnamese datasets.
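As a rough illustration of the setup described above, the sketch below fine-tunes a pretrained Transformer encoder-decoder for Vietnamese abstractive summarization using the Hugging Face transformers library. This is a minimal sketch, not the authors' code: the checkpoint name (VietAI/vit5-base), the toy article/summary pairs, and the training hyperparameters are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch: fine-tuning a pretrained seq2seq Transformer for
# Vietnamese abstractive summarization (assumed setup, not the paper's code).
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
    DataCollatorForSeq2Seq,
)
from datasets import Dataset

checkpoint = "VietAI/vit5-base"  # assumed Vietnamese encoder-decoder checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Toy article/summary pairs standing in for a Vietnamese summarization dataset.
raw = Dataset.from_dict({
    "document": ["<Vietnamese article text> ..."],
    "summary": ["<reference summary> ..."],
})

def preprocess(batch):
    # Tokenize source articles and target summaries separately.
    model_inputs = tokenizer(batch["document"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="viesum-finetune",
    per_device_train_batch_size=4,
    num_train_epochs=3,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

# Generate an abstractive summary for a new document after fine-tuning.
inputs = tokenizer("<Vietnamese article text>", return_tensors="pt",
                   truncation=True).to(model.device)
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

In this transfer-learning style of pipeline, the pretrained checkpoint supplies the self-supervised language knowledge, and only the summarization objective is learned from the downstream Vietnamese data.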


Related research:

ViT5: Pretrained Text-to-Text Transformer for Vietnamese Language Generation (05/13/2022)
Leveraging ParsBERT and Pretrained mT5 for Persian Abstractive Text Summarization (12/21/2020)
Summaformers @ LaySumm 20, LongSumm 20 (01/10/2021)
Tackling Hallucinations in Neural Chart Summarization (08/01/2023)
Applying Transformer-based Text Summarization for Keyphrase Generation (09/08/2022)
Repurposing Decoder-Transformer Language Models for Abstractive Summarization (09/01/2019)
Enriching and Controlling Global Semantics for Text Summarization (09/22/2021)
