Indian Language Summarization using Pretrained Sequence-to-Sequence Models

03/25/2023
by Ashok Urlana, et al.

The ILSUM shared task focuses on text summarization for two major Indian languages, Hindi and Gujarati, along with English. In this task, we experiment with various pretrained sequence-to-sequence models to identify the best-performing model for each language. This paper presents a detailed overview of the models and our approaches, with which we secured first rank across all three sub-tasks (English, Hindi, and Gujarati). We also extensively analyze the impact of k-fold cross-validation when working with a limited data size, and we run experiments on combinations of the original and a filtered version of the data to determine the efficacy of the pretrained models.
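The abstract mentions k-fold cross-validation as a way to make the most of limited training data. The paper's exact setup is not shown here, but the general technique can be sketched as follows; `train_and_score` is a hypothetical stand-in for fine-tuning a summarization model on a fold's training split and evaluating it on the held-out split:

```python
def kfold_splits(n_items, k):
    """Yield (train_idx, val_idx) index pairs for k roughly equal folds."""
    indices = list(range(n_items))
    fold_size, remainder = divmod(n_items, k)
    start = 0
    for fold in range(k):
        # The first `remainder` folds absorb one extra item each.
        size = fold_size + (1 if fold < remainder else 0)
        val_idx = indices[start:start + size]
        train_idx = indices[:start] + indices[start + size:]
        yield train_idx, val_idx
        start += size

def cross_validate(data, k, train_and_score):
    """Average the validation score over k folds.

    `train_and_score(train, val)` is assumed to fine-tune a model on
    `train` and return a scalar evaluation score on `val`.
    """
    scores = []
    for train_idx, val_idx in kfold_splits(len(data), k):
        train = [data[i] for i in train_idx]
        val = [data[i] for i in val_idx]
        scores.append(train_and_score(train, val))
    return sum(scores) / k
```

Because every example serves as validation data exactly once, the averaged score is a less noisy estimate of model quality than a single small held-out split, which is why the technique is attractive when the dataset is small.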


Related research

- AraBART: a Pretrained Arabic Sequence-to-Sequence Model for Abstractive Summarization (03/21/2022)
- Sequence to sequence pretraining for a less-resourced Slovenian language (07/28/2022)
- Multilingual Sequence-to-Sequence Models for Hebrew NLP (12/19/2022)
- Sequence-to-Sequence Resources for Catalan (02/14/2022)
- Experiments with LVT and FRE for Transformer model (04/26/2020)
- Document Ranking with a Pretrained Sequence-to-Sequence Model (03/14/2020)
- Conversational Question Reformulation via Sequence-to-Sequence Architectures and Pretrained Language Models (04/04/2020)
