
Applications of BERT Based Sequence Tagging Models on Chinese Medical Text Attributes Extraction

by   Gang Zhao, et al.

We convert the Chinese medical text attribute extraction task into a sequence tagging or machine reading comprehension task. Based on BERT pre-trained models, we tried not only the widely used LSTM-CRF sequence tagging model but also other sequence models, such as CNN, UCNN, WaveNet, and SelfAttention, which reach performance similar to LSTM-CRF. This sheds light on traditional sequence tagging models. Since different sequence tagging models emphasize substantially different aspects of the input, ensembling them adds diversity to the final system. By doing so, our system achieves good performance on the Chinese medical text attribute extraction task (subtask 2 of CCKS 2019 task 1).
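To make the task framing concrete, the sketch below shows how an attribute span in a Chinese medical sentence might be encoded as BIO tags for sequence tagging, and how tag sequences from several taggers could be combined by per-token majority voting. The abstract does not specify the exact tag scheme or ensembling method used, so the BIO encoding, the `part` attribute label, and the voting scheme here are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter

def spans_to_bio(tokens, spans):
    """Encode (start, end, attribute) token spans as BIO tags.

    `spans` uses half-open token intervals [start, end); labels outside
    any span are "O". This is one common framing of attribute extraction
    as sequence tagging (an assumption, not the paper's exact scheme).
    """
    tags = ["O"] * len(tokens)
    for start, end, attr in spans:
        tags[start] = f"B-{attr}"
        for i in range(start + 1, end):
            tags[i] = f"I-{attr}"
    return tags

def ensemble_vote(predictions):
    """Combine tag sequences from multiple taggers by per-token majority vote."""
    return [Counter(column).most_common(1)[0][0] for column in zip(*predictions)]

# Character-level tokens, as is typical for Chinese BERT models.
tokens = list("肿瘤位于左肺上叶")  # "the tumor is located in the left upper lobe"
gold = spans_to_bio(tokens, [(4, 8, "part")])  # "左肺上叶" tagged as an anatomical part
```

A CRF, CNN, or self-attention head would each emit one tag sequence per sentence; feeding those sequences to `ensemble_vote` is one simple way the diversity across heads can be exploited.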



