
A More Efficient Chinese Named Entity Recognition Based on BERT and Syntactic Analysis

by Xiao Fu et al.

We propose a new named entity recognition (NER) method that makes effective use of the results of part-of-speech (POS) tagging, Chinese word segmentation (CWS), and parsing, while avoiding NER errors caused by POS tagging errors. This paper first uses the Stanford natural language processing (NLP) toolkit to annotate large-scale untagged data, reducing the dependence on tagged data; then a new NLP model, the g-BERT model, is designed to compress the Bidirectional Encoder Representations from Transformers (BERT) model in order to reduce computational cost; finally, the model is evaluated on a Chinese NER dataset. The experimental results show that the computational cost of the g-BERT model is reduced by 60% compared with that of the BERT model.
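The claimed 60% reduction comes from compressing BERT's transformer encoder. A back-of-the-envelope FLOP estimate illustrates how shrinking the layer count, hidden size, and feed-forward width cuts compute; the BERT-base configuration (12 layers, hidden 768, FFN 3072) is public, but the compressed configuration below is a hypothetical example, since the abstract does not give g-BERT's architecture details.

```python
# Rough per-layer FLOP estimate for a transformer encoder (multiplications
# and additions counted as 2 FLOPs per multiply-accumulate).
def layer_flops(seq_len: int, hidden: int, ffn: int) -> int:
    # Q/K/V/output projections: 4 matmuls of (seq_len x hidden) @ (hidden x hidden)
    attn_proj = 4 * 2 * seq_len * hidden * hidden
    # attention scores + weighted value sum: 2 matmuls involving seq_len^2
    attn_mix = 2 * 2 * seq_len * seq_len * hidden
    # feed-forward network: two matmuls, hidden -> ffn -> hidden
    ffn_cost = 2 * 2 * seq_len * hidden * ffn
    return attn_proj + attn_mix + ffn_cost

def model_flops(layers: int, seq_len: int, hidden: int, ffn: int) -> int:
    return layers * layer_flops(seq_len, hidden, ffn)

# BERT-base vs. a HYPOTHETICAL compressed configuration (not the paper's g-BERT).
bert = model_flops(12, 128, 768, 3072)
compressed = model_flops(6, 128, 512, 2048)
print(f"estimated reduction: {1 - compressed / bert:.0%}")
```

The exact percentage depends entirely on the chosen configuration; the point is that encoder FLOPs scale roughly linearly in depth and quadratically in hidden size, so modest reductions in both compound quickly.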
