Empower Sequence Labeling with Task-Aware Neural Language Model

09/13/2017
by Liyuan Liu, et al.

Linguistic sequence labeling is a general modeling approach that encompasses a variety of problems, such as part-of-speech tagging and named entity recognition. Recent advances in neural networks (NNs) make it possible to build reliable models without handcrafted features. However, in many cases, it is hard to obtain sufficient annotations to train these models. In this study, we develop a novel neural framework to extract abundant knowledge hidden in raw texts to empower the sequence labeling task. Besides word-level knowledge contained in pre-trained word embeddings, character-aware neural language models are incorporated to extract character-level knowledge. Transfer learning techniques are further adopted to mediate the different components and guide the language model towards the key knowledge. Compared to previous methods, this task-specific knowledge allows us to adopt a more concise model and conduct more efficient training. Unlike most transfer learning methods, the proposed framework does not rely on any additional supervision; instead, it extracts knowledge from the self-contained order information of the training sequences. Extensive experiments on benchmark datasets demonstrate the effectiveness of leveraging character-level knowledge and the efficiency of co-training. For example, on the CoNLL03 NER task, model training completes in about 6 hours on a single GPU, reaching an F1 score of 91.71±0.10 without using any extra annotation.
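At the representation level, the combination described above can be sketched as follows: each word's pre-trained embedding (word-level knowledge) is concatenated with a hidden state produced by a character-aware model run over the word's characters (character-level knowledge), and the result is fed to the sequence labeler. This is a minimal numpy sketch with toy dimensions and a stand-in recurrence; the function names, dimensions, and the toy character features are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions for illustration, not from the paper)
WORD_DIM, CHAR_DIM = 4, 3
sentence = ["CoNLL", "NER"]

# Pre-trained word embeddings: word-level knowledge
word_emb = {w: rng.normal(size=WORD_DIM) for w in sentence}

def char_lm_state(word, dim=CHAR_DIM):
    """Stand-in for the final hidden state of a character-aware
    language model run over the word's characters.  A real model
    would be a trained character-level RNN; this toy recurrence
    only mimics the shape of its output."""
    state = np.zeros(dim)
    for ch in word:
        # mix the previous state with a crude per-character feature
        state = np.tanh(0.5 * state + 0.1 * (ord(ch) % 7) * np.ones(dim))
    return state

def word_representation(word):
    """Concatenate word-level and character-level knowledge,
    forming the input representation for the sequence labeler."""
    return np.concatenate([word_emb[word], char_lm_state(word)])

reps = np.stack([word_representation(w) for w in sentence])
print(reps.shape)  # one row per word, WORD_DIM + CHAR_DIM columns
```

In the actual framework the character-level component is trained jointly as a language model on the raw training text, so the character states carry task-relevant context rather than the fixed toy features used here.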


