VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?

10/08/2021
by   Hieu Nguyen, et al.

Text summarization is a challenging natural language processing task that requires generating fluent text from lengthy input sequences. While the task has been widely studied for English, research on summarizing Vietnamese text remains very limited. In this paper, we investigate the robustness of transformer-based encoder-decoder architectures for Vietnamese abstractive summarization. Leveraging transfer learning and self-supervised learning, we validate the performance of these methods on two Vietnamese datasets.
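The abstract does not name its evaluation metric, but summarization systems are conventionally scored with ROUGE, which measures n-gram overlap between a generated summary and a reference. As a minimal sketch (assuming ROUGE-1, the unigram variant, is used; function and variable names here are illustrative, not from the paper):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall
    between a candidate summary and a reference summary."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Overlap counts each unigram at most as often as it appears in both.
    overlap = sum((ref_counts & cand_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)
```

Production evaluations typically also report ROUGE-2 (bigrams) and ROUGE-L (longest common subsequence), and apply language-appropriate tokenization rather than whitespace splitting.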
