Med-BERT: pre-trained contextualized embeddings on large-scale structured electronic health records for disease prediction

05/22/2020
by   Laila Rasmy, et al.

Deep learning (DL) based predictive models built on electronic health records (EHRs) deliver impressive performance in many clinical tasks. Large training cohorts, however, are often required to achieve high accuracy, hindering the adoption of DL-based models in scenarios with limited training data. Recently, bidirectional encoder representations from transformers (BERT) and related models have achieved tremendous success in the natural language processing domain. Pre-training BERT on a very large corpus generates contextualized embeddings that can boost the performance of models trained on smaller datasets. We propose Med-BERT, which adapts the BERT framework to pre-train contextualized embedding models on the structured diagnosis data of an EHR dataset of 28,490,650 patients. Fine-tuning experiments are conducted on two disease-prediction tasks from two clinical databases: (1) prediction of heart failure in patients with diabetes and (2) prediction of pancreatic cancer. Med-BERT substantially improves prediction accuracy, boosting the area under the receiver operating characteristic curve (AUC) by 2.02-7.12%. In particular, for tasks with very small fine-tuning training sets (300-500 samples), Med-BERT boosts the AUC by more than 20%, equivalent to the AUC achieved with a 10 times larger training set. We believe that Med-BERT will benefit disease-prediction studies with small local training datasets, reduce data-collection expenses, and accelerate the pace of artificial intelligence aided healthcare.
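To make the setup concrete, the sketch below shows one plausible way to serialize a patient's structured diagnosis records into the token-id and visit-id sequences a BERT-style EHR model consumes. The vocabulary handling, special tokens, and function names are illustrative assumptions, not Med-BERT's actual preprocessing pipeline.

```python
# Hypothetical sketch: turning structured diagnosis records into the
# kind of input a BERT-style EHR model consumes. All names and special
# tokens here are assumptions for illustration, not Med-BERT's code.

def build_vocab(patients):
    """Map every diagnosis code seen in the cohort to an integer id.
    Ids 0 and 1 are reserved for padding and a [CLS]-style token."""
    vocab = {"[PAD]": 0, "[CLS]": 1}
    for visits in patients:
        for visit in visits:
            for code in visit:
                if code not in vocab:
                    vocab[code] = len(vocab)
    return vocab

def serialize_patient(visits, vocab):
    """Flatten a patient's visits into (token_ids, visit_ids).
    visit_ids play the role of segment embeddings: they tell the
    model which codes co-occurred in the same encounter."""
    token_ids = [vocab["[CLS]"]]
    visit_ids = [0]
    for i, visit in enumerate(visits, start=1):
        for code in visit:
            token_ids.append(vocab[code])
            visit_ids.append(i)
    return token_ids, visit_ids

# Example: a diabetic patient with two encounters (ICD-10 codes).
patient = [["E11.9", "I10"], ["E11.9", "I50.9"]]
vocab = build_vocab([patient])
tokens, visits = serialize_patient(patient, vocab)
print(tokens)  # [1, 2, 3, 2, 4]
print(visits)  # [0, 1, 1, 2, 2]
```

In a fine-tuning run, sequences like these would be fed to the pre-trained encoder with a small classification head on top, which is what lets the 300-500-sample tasks above benefit from the large pre-training cohort.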

Related research

09/20/2023
CPLLM: Clinical Prediction with Large Language Models
We present Clinical Prediction with Large Language Models (CPLLM), a met...

06/02/2019
Pre-training of Graph Augmented Transformers for Medication Recommendation
Medication recommendation is an important healthcare application. It is ...

01/25/2020
Further Boosting BERT-based Models by Duplicating Existing Layers: Some Intriguing Phenomena inside BERT
Although Bidirectional Encoder Representations from Transformers (BERT) ...

07/15/2020
Predicting Clinical Diagnosis from Patients Electronic Health Records Using BERT-based Neural Networks
In this paper we study the problem of predicting clinical diagnoses from...

10/22/2022
Spectrum-BERT: Pre-training of Deep Bidirectional Transformers for Spectral Classification of Chinese Liquors
Spectral detection technology, as a non-invasive method for rapid detect...

04/24/2023
FineEHR: Refine Clinical Note Representations to Improve Mortality Prediction
Monitoring the health status of patients in the ICU is crucial for provi...

12/09/2021
Transferring BERT-like Transformers' Knowledge for Authorship Verification
The task of identifying the author of a text spans several decades and w...
