
A More Efficient Chinese Named Entity Recognition based on BERT and Syntactic Analysis

01/11/2021
by   Xiao Fu, et al.

We propose a new named entity recognition (NER) method that effectively makes use of the results of part-of-speech (POS) tagging, Chinese word segmentation (CWS), and parsing, while avoiding NER errors caused by POS tagging errors. This paper first uses the Stanford natural language processing (NLP) toolkit to annotate large-scale untagged data, so as to reduce the dependence on tagged data; then a new NLP model, the g-BERT model, is designed to compress the Bidirectional Encoder Representations from Transformers (BERT) model in order to reduce the amount of computation; finally, the model is evaluated on a Chinese NER dataset. The experimental results show that the amount of computation in the g-BERT model is reduced by 60% compared with that in the BERT model.
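The abstract does not spell out how g-BERT compresses BERT. A common way to shrink a BERT-sized model is knowledge distillation, where a smaller student is trained to match the teacher's temperature-softened output distribution; whether g-BERT uses this exact objective is an assumption here, and the temperature value below is purely illustrative. A minimal sketch of the distillation loss:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over the last axis, numerically stabilized.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) between softened distributions, scaled by T^2
    # as in standard knowledge distillation (Hinton et al.'s formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(np.mean(kl) * temperature ** 2)
```

The loss is zero when the student reproduces the teacher's logits exactly and grows as their distributions diverge, so minimizing it transfers the teacher's behavior into the cheaper student.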
