Fine-tuning BERT for Joint Entity and Relation Extraction in Chinese Medical Text

by Kui Xue, et al.

Entity and relation extraction is a necessary step in structuring medical text. However, the feature extraction ability of the bidirectional long short-term memory network used in existing models does not achieve the best effect. Meanwhile, language models have achieved excellent results in more and more natural language processing tasks. In this paper, we present a focused attention model for the joint entity and relation extraction task. Our model integrates the well-known BERT language model into joint learning through a dynamic range attention mechanism, thus improving the feature representation ability of the shared parameter layer. Experimental results on coronary angiography texts collected from Shuguang Hospital show that the F1-scores of the named entity recognition and relation classification tasks reach 96.89, better than state-of-the-art methods by 1.65.
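The abstract does not specify how the dynamic range attention mechanism works. A minimal sketch of one plausible reading, in which each query token is restricted to attending within a per-token key range over the shared encoder's output, might look like the following; all function names, shapes, and the range format are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def dynamic_range_attention(Q, K, V, ranges):
    """Scaled dot-product attention where query token i may only attend
    to keys in the half-open interval ranges[i] = (start, end).

    This is a hypothetical reading of the paper's 'dynamic range
    attention mechanism'; Q, K, V are (n, d) arrays from a shared
    encoder layer, and ranges is a list of n (start, end) pairs.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # (n, n) raw attention scores
    mask = np.full_like(scores, -np.inf)   # disallow everything by default
    for i, (start, end) in enumerate(ranges):
        mask[i, start:end] = 0.0           # allow only the token's range
    scores = scores + mask
    # numerically stable softmax over the key axis
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)               # exp(-inf) -> 0 outside range
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                     # (n, d) attended representations
```

If token 0's range is restricted to (0, 1), all of its attention mass falls on key 0, so its output row equals V[0]; widening a range lets the token mix in more context, which is one way a joint model could focus entity and relation features on different spans.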







BERT-Based Multi-Head Selection for Joint Entity-Relation Extraction

In this paper, we report our method for the Information Extraction task ...

Improving Biomedical Pretrained Language Models with Knowledge

Pretrained language models have shown success in many natural language p...

A Sui Generis QA Approach using RoBERTa for Adverse Drug Event Identification

Extraction of adverse drug events from biomedical literature and other t...

A Trigger-Sense Memory Flow Framework for Joint Entity and Relation Extraction

Joint entity and relation extraction framework constructs a unified mode...

Joint Extraction of Entity and Relation with Information Redundancy Elimination

To solve the problem of redundant information and overlapping relations ...

Span-based Joint Entity and Relation Extraction with Transformer Pre-training

We introduce SpERT, an attention model for span-based joint entity and r...