Task-Adaptive Pre-Training for Boosting Learning With Noisy Labels: A Study on Text Classification for African Languages

06/03/2022
by   Dawei Zhu, et al.

For high-resource languages like English, text classification is a well-studied task, and modern NLP models routinely achieve more than 90% accuracy on standard English benchmarks (Xie et al., 2019; Yang et al., 2019; Zaheer et al., 2020). However, text classification in low-resource languages remains challenging due to the lack of annotated data. Although methods like weak supervision and crowdsourcing can ease the annotation bottleneck, the annotations they produce contain label noise, and models trained on noisy labels may not generalize well. To this end, a variety of noise-handling techniques have been proposed to alleviate the negative impact of annotation errors (for extensive surveys see Hedderich et al., 2021; Algan and Ulusoy, 2021). In this work, we experiment with a group of standard noise-handling methods on text classification tasks with noisy labels. We study both simulated noise and realistic noise induced by weak supervision. Moreover, we find that task-adaptive pre-training techniques (Gururangan et al., 2020) are beneficial for learning with noisy labels.
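To make the simulated-noise setting concrete, below is a minimal sketch of the uniform (class-independent) label-flipping scheme commonly used to corrupt clean training labels. The function name `inject_uniform_noise` and its parameters are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def inject_uniform_noise(labels, noise_rate, num_classes, seed=0):
    """Flip each label, with probability `noise_rate`, to a different
    class chosen uniformly at random (symmetric label noise)."""
    rng = np.random.default_rng(seed)
    noisy = np.asarray(labels).copy()
    flip = rng.random(len(noisy)) < noise_rate
    for i in np.where(flip)[0]:
        # Pick a wrong class uniformly among the remaining classes.
        candidates = [c for c in range(num_classes) if c != noisy[i]]
        noisy[i] = rng.choice(candidates)
    return noisy

# Example: corrupt roughly 30% of the labels of a 4-class toy dataset.
clean = np.array([0, 1, 2, 3, 0, 1, 2, 3])
noisy = inject_uniform_noise(clean, noise_rate=0.3, num_classes=4)
```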
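And here is a minimal sketch of task-adaptive pre-training in the sense of Gururangan et al. (2020): continuing masked-language-model pre-training on the unlabeled in-task corpus before fine-tuning on the (noisy) labeled data. The model name, the `task_texts` placeholder corpus, and all hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

model_name = "xlm-roberta-base"            # assumption: any multilingual encoder
task_texts = ["unlabeled task document 1",  # assumption: the unlabeled in-task corpus
              "unlabeled task document 2"]

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Tokenize the unlabeled task corpus; no labels are needed for MLM.
encodings = tokenizer(task_texts, truncation=True, max_length=128)
train_dataset = [{"input_ids": ids} for ids in encodings["input_ids"]]

# The collator masks 15% of tokens on the fly, as in standard MLM.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tapt-checkpoint",
                           num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=train_dataset,
    data_collator=collator,
)
trainer.train()

# The adapted encoder is then fine-tuned on the noisy labeled data.
model.save_pretrained("tapt-checkpoint")
```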


