Pagsusuri ng RNN-based Transfer Learning Technique sa Low-Resource Language

10/13/2020
by Dan John Velasco, et al.

Low-resource languages such as Filipino suffer from data scarcity, which makes it challenging to develop NLP applications for the Filipino language. Transfer Learning (TL) techniques alleviate this problem in low-resource settings. In recent years, transformer-based models have proven effective on low-resource tasks, but their high compute and memory requirements make them less accessible. For this reason, there is a need for a cheaper but effective alternative. This paper makes three contributions. First, we release a pre-trained AWD-LSTM language model for Filipino, intended as a foundation for building NLP applications in the Filipino language. Second, we benchmark AWD-LSTM on the Hate Speech classification task and show that it performs on par with transformer-based models. Third, we analyze the performance of AWD-LSTM in the low-resource setting using degradation tests and compare it with transformer-based models.
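For context, a common way to apply a pre-trained AWD-LSTM to a classification task is the two-stage ULMFiT recipe: first adapt the language model to the task corpus, then reuse its encoder in a classifier. Below is a minimal sketch using the fastai library; it is not the authors' released code, and the file name hate_speech.csv, the column names, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of AWD-LSTM transfer learning (ULMFiT-style) with fastai.
# Assumptions: a CSV named "hate_speech.csv" with "text" and "label" columns.
import pandas as pd
from fastai.text.all import (
    AWD_LSTM, TextDataLoaders, accuracy,
    language_model_learner, text_classifier_learner,
)

df = pd.read_csv("hate_speech.csv")

# Stage 1: adapt the pre-trained language model to the task corpus
# (next-word prediction, so no labels are needed). fastai's default
# pre-trained AWD-LSTM weights are English; a Filipino model such as the
# one released with this paper would be loaded via `pretrained_fnames`.
dls_lm = TextDataLoaders.from_df(df, text_col="text", is_lm=True)
lm = language_model_learner(dls_lm, AWD_LSTM, drop_mult=0.3)
lm.fine_tune(4, 1e-2)
lm.save_encoder("ft_encoder")

# Stage 2: reuse the adapted encoder for hate speech classification.
dls_clf = TextDataLoaders.from_df(
    df, text_col="text", label_col="label", text_vocab=dls_lm.vocab
)
clf = text_classifier_learner(dls_clf, AWD_LSTM, metrics=accuracy)
clf.load_encoder("ft_encoder")
clf.fine_tune(4, 1e-2)

# Degradation test (illustrative): refit the classifier on shrinking
# fractions of the training data and track how accuracy falls off.
for frac in (1.0, 0.5, 0.25, 0.1):
    sub = df.sample(frac=frac, random_state=0)
    dls_sub = TextDataLoaders.from_df(
        sub, text_col="text", label_col="label", text_vocab=dls_lm.vocab
    )
    learner = text_classifier_learner(dls_sub, AWD_LSTM, metrics=accuracy)
    learner.load_encoder("ft_encoder")
    learner.fine_tune(2, 1e-2)
    print(frac, learner.validate())
```

The same two-stage loop works with any pre-trained encoder; the only coupling is that the classifier dataloaders must share the language model's vocabulary.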

Related research:

- 05/05/2020 · Establishing Baselines for Text Classification in Low-Resource Languages. "While transformer-based finetuning techniques have proven effective in t..."
- 10/22/2020 · Investigating the True Performance of Transformers in Low-Resource Languages: A Case Study in Automatic Corpus Creation. "Transformers represent the state-of-the-art in Natural Language Processi..."
- 09/17/2021 · Boosting Transformers for Job Expression Extraction and Classification in a Low-Resource Setting. "In this paper, we explore possible improvements of transformer models in..."
- 03/07/2023 · A Challenging Benchmark for Low-Resource Learning. "With promising yet saturated results in high-resource settings, low-reso..."
- 11/17/2021 · Green CWS: Extreme Distillation and Efficient Decode Method Towards Industrial Application. "Benefiting from the strong ability of the pre-trained model, the researc..."
- 01/11/2020 · A Continuous Space Neural Language Model for Bengali Language. "Language models are generally employed to estimate the probability distr..."
- 01/28/2020 · A Study of Pyramid Structure for Code Correction. "We demonstrate the implementations of pyramid encoders in both multi-lay..."
