
BERT-based Ranking for Biomedical Entity Normalization

by Zongcheng Ji, et al.

Developing high-performance entity normalization algorithms that can alleviate the term variation problem is of great interest to the biomedical community. Although deep learning-based methods have been successfully applied to biomedical entity normalization, they often depend on traditional context-independent word embeddings. Bidirectional Encoder Representations from Transformers (BERT), BERT for Biomedical Text Mining (BioBERT) and BERT for Clinical Text Mining (ClinicalBERT) were recently introduced to pre-train contextualized word representation models using bidirectional Transformers, advancing the state-of-the-art for many natural language processing tasks. In this study, we proposed an entity normalization architecture by fine-tuning the pre-trained BERT/BioBERT/ClinicalBERT models and conducted extensive experiments to evaluate the effectiveness of the pre-trained models for biomedical entity normalization using three different types of datasets. Our experimental results show that the best fine-tuned models consistently outperformed previous methods and advanced the state-of-the-art for biomedical entity normalization, with up to 1.17
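The ranking setup described in the abstract, fine-tuning a BERT-style model to score (mention, candidate-name) pairs and return the top-scoring dictionary name, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `toy_pair_score` Jaccard-overlap scorer is a stand-in assumption for the fine-tuned BERT sentence-pair classifier, and the mention and candidate names are invented examples.

```python
# Sketch of candidate ranking for biomedical entity normalization.
# In the BERT-based setup, a fine-tuned sentence-pair classifier would
# score each (mention, candidate) pair; here a simple token-overlap
# scorer stands in for that model, purely for illustration.

def toy_pair_score(mention: str, candidate: str) -> float:
    """Placeholder for a fine-tuned BERT sentence-pair classifier score
    (Jaccard overlap of lowercased tokens)."""
    m, c = set(mention.lower().split()), set(candidate.lower().split())
    return len(m & c) / max(len(m | c), 1)

def normalize(mention: str, candidates: list[str], score=toy_pair_score) -> str:
    """Rank candidate dictionary names and return the top-scoring one."""
    return max(candidates, key=lambda c: score(mention, c))

# Hypothetical example: map a free-text mention to a controlled-vocabulary name.
mention = "heart attack"
candidates = ["myocardial infarction", "cardiac arrest", "attack of the heart"]
print(normalize(mention, candidates))
```

Note that the toy lexical scorer picks the candidate with the most shared surface tokens; the point of using a contextualized model like BERT/BioBERT/ClinicalBERT is precisely to rank semantically equivalent names (e.g. "myocardial infarction") highly even when they share no tokens with the mention.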



Related research:

- BioBERT: pre-trained biomedical language representation model for biomedical text mining — "Biomedical text mining has become more important than ever as the number..."
- Drug and Disease Interpretation Learning with Biomedical Entity Representation Transformer — "Concept normalization in free-form texts is a crucial step in every text..."
- AILAB-Udine@SMM4H 22: Limits of Transformers and BERT Ensembles — "This paper describes the models developed by the AILAB-Udine team for th..."
- Biomedical Entity Representations with Synonym Marginalization — "Biomedical named entities often play important roles in many biomedical ..."
- Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT — "Infusing factual knowledge into pre-trained models is fundamental for ma..."
- Improved Biomedical Word Embeddings in the Transformer Era — "Biomedical word embeddings are usually pre-trained on free text corpora ..."