Low-Resource Adaptation of Neural NLP Models

11/09/2020
by Farhad Nooralahzadeh, et al.

Real-world applications of natural language processing (NLP) are challenging. NLP models rely heavily on supervised machine learning and require large amounts of annotated data, and these resources are mostly available for language data that exists in large quantities, such as English newswire. In real-world applications, however, textual resources vary across several dimensions, such as language, dialect, topic, and genre, and annotated data of sufficient amount and quality is hard to come by. The objective of this thesis is to investigate methods for dealing with such low-resource scenarios in information extraction and natural language understanding. To this end, we study distant supervision and sequential transfer learning in various low-resource settings, developing and adapting neural NLP models to explore a number of research questions concerning NLP tasks with minimal or no training data.
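
The abstract does not specify the models or datasets used, but the sequential transfer learning it refers to typically amounts to fine-tuning an encoder pretrained on large corpora using a small task-specific labelled set. The sketch below is a minimal illustration under that assumption; the checkpoint name (bert-base-multilingual-cased), the two-label task, the toy examples, and the hyperparameters are placeholders, not details taken from the thesis.

```python
# Minimal sketch of sequential transfer learning for a low-resource task:
# a pretrained multilingual encoder is fine-tuned on a tiny labelled set.
# All data and hyperparameters below are hypothetical placeholders.
import torch
from torch.optim import AdamW
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=2
)

# Tiny illustrative training set standing in for a low-resource corpus.
texts = ["an example sentence", "another short example"]
labels = torch.tensor([0, 1])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few epochs often suffice on very small data
    optimizer.zero_grad()
    outputs = model(**batch, labels=labels)  # loss computed against labels
    outputs.loss.backward()
    optimizer.step()
```

The point of the sketch is the sequencing: the expensive pretraining step is reused as-is, and only the final supervised step runs on the scarce target-task data.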


