
Lex-BERT: Enhancing BERT based NER with lexicons

by Wei Zhu, et al.

In this work, we present Lex-BERT, which incorporates lexicon information into Chinese BERT for named entity recognition (NER) tasks in a natural manner. Instead of using word embeddings and a newly designed transformer layer as in FLAT, we identify the boundaries of words in the sentence with special tokens, and the modified sentence is encoded directly by BERT. Our model introduces no new parameters and is more efficient than FLAT. In addition, we do not require any word embeddings to accompany the lexicon collection. Experiments on Ontonotes and ZhCrossNER show that our model outperforms FLAT and other baselines.
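The core idea, inserting boundary markers around lexicon matches so the modified sentence can be fed straight to BERT, can be sketched as below. The marker tokens (`[s]`/`[e]`) and the greedy longest-match-first scan are illustrative assumptions for this sketch, not the paper's published marker scheme or matching strategy.

```python
def insert_boundary_markers(sentence, lexicon, start_tok="[s]", end_tok="[e]"):
    """Wrap every lexicon match in a Chinese character sequence with
    marker tokens, yielding a token list a BERT tokenizer could consume.

    Greedy left-to-right, longest-match-first scan; an illustrative
    sketch of the boundary-marking idea, not Lex-BERT's exact scheme.
    """
    out, i, n = [], 0, len(sentence)
    max_len = max((len(w) for w in lexicon), default=0)
    while i < n:
        match = None
        # Try the longest candidate span first at position i.
        for span in range(min(max_len, n - i), 0, -1):
            if sentence[i:i + span] in lexicon:
                match = sentence[i:i + span]
                break
        if match:
            # Surround the matched word's characters with marker tokens.
            out += [start_tok, *match, end_tok]
            i += len(match)
        else:
            # No lexicon word starts here; emit the single character.
            out.append(sentence[i])
            i += 1
    return out
```

For example, with the lexicon {"北京大学", "海淀区"}, the sentence "北京大学在海淀区" becomes `[s] 北 京 大 学 [e] 在 [s] 海 淀 区 [e]`; since the markers are ordinary vocabulary tokens, no new model parameters are needed.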




A More Efficient Chinese Named Entity Recognition base on BERT and Syntactic Analysis

We propose a new Named entity recognition (NER) method to effectively ma...

FLAT: Chinese NER Using Flat-Lattice Transformer

Recently, the character-word lattice structure has been proved to be eff...

Exploring Cross-sentence Contexts for Named Entity Recognition with BERT

Named entity recognition (NER) is frequently addressed as a sequence cla...

NFLAT: Non-Flat-Lattice Transformer for Chinese Named Entity Recognition

Recently, Flat-LAttice Transformer (FLAT) has achieved great success in ...

Unified Named Entity Recognition as Word-Word Relation Classification

So far, named entity recognition (NER) has been involved with three majo...

Merge and Label: A novel neural network architecture for nested NER

Named entity recognition (NER) is one of the best studied tasks in natur...

Lexicon Enhanced Chinese Sequence Labeling Using BERT Adapter

Lexicon information and pre-trained models, such as BERT, have been comb...