Adaptive Self-training for Few-shot Neural Sequence Labeling

10/07/2020
by Yaqing Wang, et al.

Neural sequence labeling is an important technique employed for many Natural Language Processing (NLP) tasks, such as Named Entity Recognition (NER), slot tagging for dialog systems, and semantic parsing. Large-scale pre-trained language models obtain very good performance on these tasks when fine-tuned on large amounts of task-specific labeled data. However, such large-scale labeled datasets are difficult to obtain for several tasks and domains due to the high cost of human annotation as well as privacy and data access constraints for sensitive user applications. This is exacerbated for sequence labeling tasks that require such annotations at the token level. In this work, we develop techniques to address the label scarcity challenge for neural sequence labeling models. Specifically, we develop self-training and meta-learning techniques for few-shot training of neural sequence taggers, namely MetaST. While self-training serves as an effective mechanism to learn from large amounts of unlabeled data, meta-learning helps in adaptive sample re-weighting to mitigate error propagation from noisy pseudo-labels. Extensive experiments on six benchmark datasets, including two massive multilingual NER datasets and four slot tagging datasets for task-oriented dialog systems, demonstrate the effectiveness of our method with around 10% improvement over state-of-the-art systems for the 10-shot setting.
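To make the recipe concrete, below is a minimal, self-contained PyTorch sketch of self-training with per-token loss re-weighting for sequence labeling. It is not the authors' MetaST implementation: MetaST learns the sample weights via meta-learning on a small clean labeled set, whereas this stand-in simply uses teacher confidence as the weight. The Tagger model, its sizes, and the synthetic batch are illustrative assumptions.

```python
# Minimal self-training loop with confidence-based re-weighting for
# sequence labeling. Sketch only: MetaST meta-learns the weights on a
# small labeled set; here teacher confidence is used as a stand-in.

import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, NUM_LABELS, EMB, HID = 1000, 5, 32, 64

class Tagger(nn.Module):
    """Toy BiLSTM token classifier standing in for a pre-trained encoder."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HID, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * HID, NUM_LABELS)

    def forward(self, tokens):                       # tokens: (B, T)
        out, _ = self.lstm(self.emb(tokens))
        return self.head(out)                        # (B, T, NUM_LABELS)

teacher, student = Tagger(), Tagger()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

# Synthetic "unlabeled" batch for illustration.
unlabeled = torch.randint(0, VOCAB, (8, 12))

# 1) Teacher assigns hard pseudo-labels and per-token confidences.
teacher.eval()
with torch.no_grad():
    probs = F.softmax(teacher(unlabeled), dim=-1)
    conf, pseudo = probs.max(dim=-1)                 # both (B, T)

# 2) Student trains on pseudo-labels with a re-weighted token-level loss,
#    so low-confidence (likely noisy) pseudo-labels contribute less.
student.train()
logits = student(unlabeled)
loss_tok = F.cross_entropy(
    logits.reshape(-1, NUM_LABELS), pseudo.reshape(-1), reduction="none"
)
loss = (conf.reshape(-1) * loss_tok).mean()
opt.zero_grad()
loss.backward()
opt.step()
print(f"weighted pseudo-label loss: {loss.item():.4f}")
```

In the full method, this teacher-label/student-update cycle repeats, the weights are optimized so that the re-weighted update improves loss on the few-shot labeled set, and the teacher is periodically refreshed from the student.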


Related research

02/17/2023 · Uncertainty-aware Self-training for Low-resource Neural Sequence Labeling
Neural sequence labeling (NSL) aims at assigning labels for input langua...

04/29/2017 · Semi-supervised sequence tagging with bidirectional language models
Pre-trained word embeddings learned from unlabeled text have become a st...

12/29/2020 · Few-Shot Named Entity Recognition: A Comprehensive Study
This paper presents a comprehensive study to efficiently build named ent...

09/13/2017 · Empower Sequence Labeling with Task-Aware Neural Language Model
Linguistic sequence labeling is a general modeling approach that encompa...

06/20/2019 · Few-Shot Sequence Labeling with Label Dependency Transfer
Few-shot sequence labeling faces a unique challenge compared with the ot...

09/17/2021 · Self-training with Few-shot Rationalization: Teacher Explanations Aid Student in Few-shot NLU
While pre-trained language models have obtained state-of-the-art perform...

02/15/2022 · Debiased Pseudo Labeling in Self-Training
Deep neural networks achieve remarkable performances on a wide range of ...
