NFLAT: Non-Flat-Lattice Transformer for Chinese Named Entity Recognition

05/12/2022
by   Shuang Wu, et al.

Recently, the Flat-LAttice Transformer (FLAT) has achieved great success in Chinese Named Entity Recognition (NER). FLAT performs lexical enhancement by constructing flat lattices, which mitigates the difficulties posed by blurry word boundaries and the lack of word semantics. In FLAT, the positions of the starting and ending characters are used to connect each matching word. However, this method tends to match more words when dealing with long texts, resulting in long input sequences and thus significantly increasing the memory and computational costs of the self-attention module. To address this issue, we propose a novel lexical enhancement method, InterFormer, which effectively reduces computational and memory costs by constructing non-flat lattices. Furthermore, with InterFormer as the backbone, we implement NFLAT for Chinese NER. NFLAT decouples lexicon fusion from context feature encoding. Compared with FLAT, it avoids unnecessary attention calculations over "word-character" and "word-word" pairs, reducing memory usage by about 50% during training. The experimental results obtained on several well-known benchmarks demonstrate the superiority of the proposed method over state-of-the-art hybrid (character-word) models.
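The cost argument above can be made concrete with a back-of-the-envelope sketch. The function names, sentence length, and word count below are illustrative assumptions, not values from the paper; the sketch only counts attention pairs to show why decoupling lexicon fusion from context encoding shrinks the attention computation.

```python
# Hedged sketch: counting attention pairs for a flat-lattice encoder
# (FLAT-style) vs. a decoupled design (NFLAT-style), assuming a sentence
# of n characters that matches m lexicon words.

def flat_attention_pairs(n_chars: int, n_words: int) -> int:
    """FLAT concatenates characters and matched words into one flat
    sequence, so self-attention spans (n + m)^2 token pairs, including
    the "word-character" and "word-word" pairs NFLAT argues are
    unnecessary for context encoding."""
    total = n_chars + n_words
    return total * total

def nflat_attention_pairs(n_chars: int, n_words: int) -> int:
    """NFLAT-style decoupling: an inter-attention step fuses lexicon
    information into the characters (n * m "character-word" pairs),
    then context encoding runs self-attention over characters only
    (n^2 pairs)."""
    return n_chars * n_words + n_chars * n_chars

# Illustrative example: a 100-character sentence matching 60 words.
flat = flat_attention_pairs(100, 60)    # (100 + 60)^2 = 25600
nflat = nflat_attention_pairs(100, 60)  # 100*60 + 100^2 = 16000
print(f"flat: {flat} pairs, decoupled: {nflat} pairs")
```

The gap widens as more words are matched, which is exactly the long-text regime where FLAT's flat sequences become expensive.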


Related research:

FLAT: Chinese NER Using Flat-Lattice Transformer (04/24/2020)
Recently, the character-word lattice structure has been proved to be eff...

MECT: Multi-Metadata Embedding based Cross-Transformer for Chinese Named Entity Recognition (07/12/2021)
Recently, word enhancement has become very popular for Chinese Named Ent...

Porous Lattice-based Transformer Encoder for Chinese NER (11/07/2019)
Incorporating lattices into character-level Chinese named entity recogni...

Lex-BERT: Enhancing BERT based NER with lexicons (01/02/2021)
In this work, we present Lex-BERT, which incorporates the lexicon info...

Simplify the Usage of Lexicon in Chinese NER (08/16/2019)
Recently, many works have tried to utilize word lexicons to augment the...

SLK-NER: Exploiting Second-order Lexicon Knowledge for Chinese NER (07/16/2020)
Although character-based models using lexicon have achieved promising re...

Raw-to-End Name Entity Recognition in Social Media (08/14/2019)
Taking word sequences as the input, typical named entity recognition (NE...
