FLAT: Chinese NER Using Flat-Lattice Transformer

04/24/2020
by   Xiaonan Li, et al.

Recently, the character-word lattice structure has proven effective for Chinese named entity recognition (NER) by incorporating word information. However, because the lattice structure is complex and dynamic, most existing lattice-based models struggle to exploit the parallel computation of GPUs and usually have low inference speed. In this paper, we propose FLAT: Flat-LAttice Transformer for Chinese NER, which converts the lattice structure into a flat structure consisting of spans. Each span corresponds to a character or a latent word together with its position in the original lattice. With the power of the Transformer and a well-designed position encoding, FLAT can fully leverage the lattice information and has excellent parallelization ability. Experiments on four datasets show that FLAT outperforms other lexicon-based models in both performance and efficiency.
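The core conversion described above can be illustrated with a short sketch. This is not the authors' code; the function name, the lexicon-matching input, and the example sentence are illustrative assumptions. Each character becomes a span whose head and tail indices coincide, and each lexicon-matched word becomes a span covering its character range, so the whole lattice flattens into one sequence:

```python
def flatten_lattice(chars, lexicon_words):
    """Flatten a character-word lattice into spans (illustrative sketch).

    Each span is a (token, head, tail) triple: characters occupy a single
    position (head == tail), while words span their character range in the
    original sentence. Lexicon matching itself is assumed to have been
    done already and is passed in as (word, start_index) pairs.
    """
    # Characters first, in sentence order.
    spans = [(ch, i, i) for i, ch in enumerate(chars)]
    # Then the matched lexicon words, each anchored by head/tail indices.
    for word, start in lexicon_words:
        spans.append((word, start, start + len(word) - 1))
    return spans


# Example sentence: "重庆人和药店" (Chongqing Renhe Pharmacy), with
# hypothetical lexicon matches 重庆, 重庆人, and 人和药店.
chars = list("重庆人和药店")
words = [("重庆", 0), ("重庆人", 0), ("人和药店", 2)]
print(flatten_lattice(chars, words))
```

The head/tail indices attached to every span are what the paper's relative position encoding consumes: the Transformer's self-attention can then relate any two spans (character-character, character-word, or word-word) through the distances between their heads and tails, which is what makes the flat sequence equivalent in information to the original lattice while remaining fully parallelizable.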


