Pre-training of Graph Augmented Transformers for Medication Recommendation

06/02/2019
by   Junyuan Shang, et al.

Medication recommendation is an important healthcare application. It is commonly formulated as a temporal prediction task, so most existing works only utilize longitudinal electronic health records (EHRs) from a small number of patients with multiple visits, ignoring the large number of patients with a single visit (selection bias). Moreover, important hierarchical knowledge, such as the diagnosis hierarchy, is not leveraged in the representation learning process. To address these challenges, we propose G-BERT, a new model that combines the power of Graph Neural Networks (GNNs) and BERT (Bidirectional Encoder Representations from Transformers) for medical code representation and medication recommendation. We use GNNs to represent the internal hierarchical structures of medical codes. We then integrate the GNN representation into a transformer-based visit encoder and pre-train it on EHR data from patients with only a single visit. The pre-trained visit encoder and representation are then fine-tuned for downstream predictive tasks on longitudinal EHRs from patients with multiple visits. G-BERT is the first model to bring the language-model pre-training scheme into the healthcare domain, and it achieves state-of-the-art performance on the medication recommendation task.
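The two ideas the abstract combines (ontology-aware code embeddings and a pooled visit representation) can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: the toy ontology, code names, dimensions, and the mean-based aggregation are all placeholders for G-BERT's learned GNN over the diagnosis hierarchy and its transformer visit encoder.

```python
import random

# Toy ontology: each leaf medical code maps to its ancestors in the
# diagnosis hierarchy (codes here are illustrative, not from the paper).
ONTOLOGY = {
    "250.00": ["250.0", "250"],  # a diabetes leaf and its ancestors
    "401.9": ["401"],            # hypertension
}

DIM = 8
random.seed(0)
all_codes = sorted({c for leaf, parents in ONTOLOGY.items()
                    for c in [leaf] + parents})
# One randomly initialized vector per code; in G-BERT these are learned.
emb = {c: [random.gauss(0, 1) for _ in range(DIM)] for c in all_codes}

def mean(vectors):
    """Element-wise average of a list of equal-length vectors."""
    return [sum(xs) / len(vectors) for xs in zip(*vectors)]

def ontology_embedding(leaf):
    """Average a leaf's vector with its ancestors' vectors: a crude
    stand-in for GNN message passing over the code hierarchy."""
    return mean([emb[leaf]] + [emb[a] for a in ONTOLOGY[leaf]])

def visit_representation(codes):
    """Mean-pool ontology-aware code embeddings for one visit, standing
    in for the transformer visit encoder that G-BERT pre-trains."""
    return mean([ontology_embedding(c) for c in codes])

v = visit_representation(["250.00", "401.9"])
print(len(v))  # 8
```

In the actual model, the single-visit patients excluded by prior work supply the pre-training corpus for the visit encoder, which is then fine-tuned on multi-visit sequences for medication prediction.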


