Handling Noisy Labels for Robustly Learning from Self-Training Data for Low-Resource Sequence Labeling

03/28/2019
by Debjit Paul et al.

In this paper, we address the problem of effectively self-training neural networks in a low-resource setting. Self-training is frequently used to automatically increase the amount of training data, but in a low-resource scenario it is less effective because the annotations created by self-labeling unlabeled data are unreliable. We propose to combine self-training with noise handling on the self-labeled data. Estimating noise directly on the combined set of clean training data and self-labeled data can corrupt the clean data and therefore performs worse. We thus propose the Clean and Noisy Label Neural Network, which trains on clean and noisy self-labeled data simultaneously by explicitly modelling clean and noisy labels separately. In our experiments on Chunking and NER, this approach is more robust than the baselines. Complementary to this explicit approach, noise can also be handled implicitly with the help of an auxiliary learning task. Our method combines with such implicit noise handling better than the baseline methods do, and together they yield the best overall performance.
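The core idea of modelling clean and noisy labels separately can be illustrated with a noisy-channel sketch: the base classifier produces clean label distributions, and for self-labeled examples those distributions are passed through a learned label-confusion matrix, so the noise is absorbed by the confusion matrix rather than corrupting the clean classifier. The sketch below is a minimal, hypothetical numpy illustration of this idea; the names, shapes, and the linear stand-in for the sequence encoder are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
num_labels = 3

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Base classifier logits for a batch of 4 tokens
# (a stand-in for a BiLSTM sequence labeler).
logits = rng.normal(size=(4, num_labels))
p_clean = softmax(logits)            # used for gold-labeled tokens

# Row-stochastic confusion matrix: row i holds P(noisy label | clean label i).
# Initialized near the identity, i.e. "self-labels are mostly correct".
confusion = softmax(np.eye(num_labels) * 5.0)
p_noisy = p_clean @ confusion        # used for self-labeled (noisy) tokens

def cross_entropy(probs, labels):
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

gold = np.array([0, 1])          # two gold-annotated tokens
self_labeled = np.array([2, 0])  # two self-labeled, possibly noisy tokens

# Joint objective: clean CE on gold data plus noisy CE on self-labeled data,
# trained simultaneously so the clean head never fits the noise directly.
loss = cross_entropy(p_clean[:2], gold) + cross_entropy(p_noisy[2:], self_labeled)
```

In training, the confusion matrix would be updated jointly with the classifier; because the noisy loss only reaches the classifier through the confusion matrix, systematic self-labeling errors can be explained away by that matrix instead of distorting the clean predictions.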

