Hybrid semi-Markov CRF for Neural Sequence Labeling

05/10/2018
by Zhi-Xiu Ye, et al.

This paper proposes hybrid semi-Markov conditional random fields (SCRFs) for neural sequence labeling in natural language processing. Unlike conventional conditional random fields (CRFs), SCRFs assign labels to segments rather than to individual words, extracting features from and modeling transitions between segments. In this paper, we improve existing SCRF methods by exploiting word-level and segment-level information simultaneously. First, word-level labels are used to derive the segment scores in SCRFs. Second, a CRF output layer and an SCRF output layer are integrated into a unified neural network and trained jointly. Experimental results on the CoNLL 2003 named entity recognition (NER) shared task show that our model achieves state-of-the-art performance when no external knowledge is used.
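The core idea, deriving segment scores from word-level label scores and decoding over segments, can be illustrated with a short sketch. The Python code below is not the authors' implementation: names such as hybrid_segment_score, seg_score, and MAX_SEG_LEN, as well as the random inputs, are illustrative assumptions. It scores a candidate segment as the sum of the word-level label scores of its words plus a purely segment-level term, then runs semi-Markov Viterbi decoding over all possible segmentations.

import numpy as np

MAX_SEG_LEN = 6                      # assumed cap on segment length
LABELS = ["PER", "LOC", "ORG", "MISC", "O"]


def hybrid_segment_score(word_scores, i, j, y, seg_score):
    # Hybrid score for labeling words i..j (inclusive) with segment label y:
    # the sum of the word-level label scores inside the span (word-level part)
    # plus a purely segment-level term indexed by segment length.
    return word_scores[i:j + 1, y].sum() + seg_score[j - i, y]


def semi_markov_viterbi(word_scores, seg_score, trans, start):
    # word_scores: (T, L) word-level label scores (e.g. from a BiLSTM)
    # seg_score:   (MAX_SEG_LEN, L) segment-level scores indexed by length - 1
    # trans:       (L, L) segment-to-segment label transition scores
    # start:       (L,) scores for the label of the first segment
    T, L = word_scores.shape
    best = np.full((T + 1, L), -np.inf)  # best[j, y]: best score of any labeling
    back = {}                            # of words 0..j-1 whose last segment has label y
    for j in range(1, T + 1):
        for d in range(1, min(MAX_SEG_LEN, j) + 1):
            i = j - d                    # candidate segment covers words i..j-1
            for y in range(L):
                seg = hybrid_segment_score(word_scores, i, j - 1, y, seg_score)
                if i == 0:
                    s, prev = start[y] + seg, None
                else:
                    prev = int(np.argmax(best[i] + trans[:, y]))
                    s = best[i, prev] + trans[prev, y] + seg
                if s > best[j, y]:
                    best[j, y] = s
                    back[(j, y)] = (i, prev)
    # Walk the back pointers to recover the best segmentation and labels.
    y = int(np.argmax(best[T]))
    segments, j = [], T
    while j > 0:
        i, prev = back[(j, y)]
        segments.append((i, j - 1, LABELS[y]))
        j, y = i, (prev if prev is not None else y)
    return segments[::-1], float(best[T].max())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, L = 8, len(LABELS)
    segs, score = semi_markov_viterbi(rng.normal(size=(T, L)),
                                      rng.normal(size=(MAX_SEG_LEN, L)),
                                      rng.normal(size=(L, L)),
                                      rng.normal(size=(L,)))
    print(score, segs)

In the full model described in the abstract, a word-level CRF output layer and this segment-level SCRF output layer share the same encoder and are trained jointly; the sketch above covers only the segment-level scoring and decoding.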


Related research

04/19/2016  Exploring Segment Representations for Neural Segmentation Models
            Many natural language processing (NLP) tasks can be generalized into seg...

03/12/2018  A Feature-Rich Vietnamese Named-Entity Recognition Model
            In this paper, we present a feature-based named-entity recognition (NER)...

09/19/2022  Duration modeling with semi-Markov Conditional Random Fields for keyphrase extraction
            Existing methods for keyphrase extraction need preprocessing to generate...

03/30/2021  Locally-Contextual Nonlinear CRFs for Sequence Labeling
            Linear chain conditional random fields (CRFs) combined with contextual w...

11/18/2015  Segmental Recurrent Neural Networks
            We introduce segmental recurrent neural networks (SRNNs) which define, g...

10/19/2018  Weak Semi-Markov CRFs for NP Chunking in Informal Text
            This paper introduces a new annotated corpus based on an existing inform...
