BiteNet: Bidirectional Temporal Encoder Network to Predict Medical Outcomes
Electronic health records (EHRs) are longitudinal records of a patient's interactions with healthcare systems. A patient's EHR data is organized as a three-level hierarchy: at the top, the patient journey, i.e., all diagnoses and treatments experienced over a period of time; below that, the individual visit, a set of medical codes recorded at a particular encounter; and at the bottom, the medical code itself, a single coded clinical record. As EHRs amass in the millions, the potential benefits these data hold for medical research and medical outcome prediction are staggering, including, for example, predicting future hospital admissions, diagnosing illnesses, and determining the efficacy of treatments. Each of these analytics tasks requires a method for transforming the hierarchical patient journey into a vector representation for downstream prediction. This representation should capture both the sequence of timestamped visits and the set of medical codes within each visit, as both are crucial to downstream prediction tasks; more expressive representations therefore translate directly into better learning performance. To this end, we propose a novel self-attention mechanism that captures the contextual dependencies and temporal relationships within a patient's healthcare journey. An end-to-end bidirectional temporal encoder network (BiteNet) then learns representations of patient journeys based solely on the proposed attention mechanism. We evaluate our method on two supervised prediction tasks and two unsupervised clustering tasks with a real-world EHR dataset. The empirical results demonstrate that BiteNet produces higher-quality representations than state-of-the-art baseline methods.
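To make the hierarchy and the bidirectional temporal encoding concrete, the PyTorch-style sketch below embeds medical codes, pools them into visit vectors, adds a coarse embedding of the time gap between visits, and applies non-causal (bidirectional) self-attention over the visit sequence before a prediction head. This is an illustrative sketch under stated assumptions, not the authors' released implementation: the class name JourneyEncoder, the interval-bucket embedding, the masked mean pooling, all hyperparameters, and the use of the stock nn.MultiheadAttention (in place of the paper's proposed attention mechanism) are assumptions made for the example.

```python
# Illustrative sketch (not the authors' code): a bidirectional temporal
# self-attention encoder over a patient journey, in the spirit of BiteNet.
# Layer choices, pooling, and hyperparameters are assumptions for the example.
import torch
import torch.nn as nn


class JourneyEncoder(nn.Module):
    def __init__(self, num_codes: int, d_model: int = 128, num_heads: int = 4,
                 max_interval_days: int = 365, num_classes: int = 2):
        super().__init__()
        # Medical-code embeddings; index 0 is reserved for padding.
        self.code_emb = nn.Embedding(num_codes, d_model, padding_idx=0)
        # Coarse embedding of the (bucketed) time gap preceding each visit.
        self.interval_emb = nn.Embedding(max_interval_days + 1, d_model)
        # Bidirectional (non-causal) self-attention across the visit sequence.
        self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, codes, intervals):
        # codes:     (batch, n_visits, n_codes_per_visit), padded with 0
        # intervals: (batch, n_visits), days since the previous visit, bucketed
        code_mask = (codes != 0).unsqueeze(-1).float()
        # Visit representation: masked mean over the code set of each visit.
        visit = (self.code_emb(codes) * code_mask).sum(2) / code_mask.sum(2).clamp(min=1)
        visit = visit + self.interval_emb(intervals)   # inject temporal information
        visit_pad = codes.sum(-1) == 0                 # visits that are all padding
        ctx, _ = self.attn(visit, visit, visit, key_padding_mask=visit_pad)
        ctx = self.norm(ctx + visit)
        # Journey representation: masked mean over visits, then predict.
        keep = (~visit_pad).unsqueeze(-1).float()
        journey = (ctx * keep).sum(1) / keep.sum(1).clamp(min=1)
        return self.head(journey)


# Toy usage: 2 patients, up to 3 visits, up to 4 codes per visit.
codes = torch.randint(1, 100, (2, 3, 4))
codes[1, 2] = 0                                        # patient 2 has only 2 visits
intervals = torch.tensor([[0, 30, 90], [0, 14, 0]])
logits = JourneyEncoder(num_codes=100)(codes, intervals)
print(logits.shape)  # torch.Size([2, 2])
```

Because the attention over visits is unmasked in the temporal direction, each visit representation can draw on both earlier and later visits in the journey, which is the "bidirectional" aspect the abstract refers to; a causal mask would restrict it to past context only.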