Lex-BERT: Enhancing BERT based NER with lexicons

01/02/2021
by Wei Zhu, et al.

In this work, we present Lex-BERT, which incorporates lexicon information into Chinese BERT for named entity recognition (NER) tasks in a natural manner. Instead of using word embeddings and a newly designed transformer layer as in FLAT, we mark the boundaries of words in a sentence with special tokens, and the modified sentence is encoded directly by BERT. Our model introduces no new parameters and is more efficient than FLAT. Moreover, it does not require word embeddings to accompany the lexicon collection. Experiments on Ontonotes and ZhCrossNER show that our model outperforms FLAT and other baselines.
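
The marking step itself is easy to picture. The sketch below, using the Hugging Face transformers library, wraps lexicon-matched spans in special boundary tokens and then encodes the modified sentence directly with a standard BERT tokenizer. The marker choice, the toy lexicon, and the reuse of BERT's reserved [unused] vocabulary slots (one way to add markers without creating new parameters) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' released code) of lexicon-guided
# word-boundary marking before BERT encoding.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")

# Assumption: map the start/end markers onto BERT's reserved [unused]
# vocabulary slots, so no new embedding rows (parameters) are created.
START, END = "[unused1]", "[unused2]"
# Registering them as special tokens keeps the tokenizer from splitting
# them; since they already exist in the vocab, nothing new is added.
tokenizer.add_special_tokens({"additional_special_tokens": [START, END]})

def mark_word_boundaries(sentence: str, lexicon: set) -> str:
    """Wrap every lexicon match in `sentence` with boundary markers."""
    spans = []
    for word in lexicon:
        start = sentence.find(word)
        while start != -1:
            spans.append((start, start + len(word)))
            start = sentence.find(word, start + 1)
    # Insert markers from right to left so earlier offsets stay valid.
    # (Overlapping matches are not handled; this is a toy matcher.)
    for start, end in sorted(spans, reverse=True):
        sentence = (sentence[:start] + START +
                    sentence[start:end] + END + sentence[end:])
    return sentence

lexicon = {"北京大学"}  # toy lexicon, for illustration only
marked = mark_word_boundaries("我在北京大学读书", lexicon)
inputs = tokenizer(marked, return_tensors="pt")  # encoded directly by BERT
print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]))
# ['[CLS]', '我', '在', '[unused1]', '北', '京', '大', '学', '[unused2]', ...]
```

The paper's markers may also carry word-type information from the lexicon; the single generic start/end pair here is purely illustrative.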

Related research

A More Efficient Chinese Named Entity Recognition base on BERT and Syntactic Analysis (01/11/2021)
We propose a new Named entity recognition (NER) method to effectively ma...

FLAT: Chinese NER Using Flat-Lattice Transformer (04/24/2020)
Recently, the character-word lattice structure has been proved to be eff...

Exploring Cross-sentence Contexts for Named Entity Recognition with BERT (06/02/2020)
Named entity recognition (NER) is frequently addressed as a sequence cla...

NFLAT: Non-Flat-Lattice Transformer for Chinese Named Entity Recognition (05/12/2022)
Recently, Flat-LAttice Transformer (FLAT) has achieved great success in ...

Unified Named Entity Recognition as Word-Word Relation Classification (12/19/2021)
So far, named entity recognition (NER) has been involved with three majo...

Merge and Label: A novel neural network architecture for nested NER (06/30/2019)
Named entity recognition (NER) is one of the best studied tasks in natur...

Lexicon Enhanced Chinese Sequence Labeling Using BERT Adapter (05/15/2021)
Lexicon information and pre-trained models, such as BERT, have been comb...
