Model-Attentive Ensemble Learning for Sequence Modeling

02/23/2021
by   Victor D. Bourgin, et al.

Medical time-series datasets have unique characteristics that make prediction tasks challenging. Most notably, patient trajectories often contain longitudinal variation in their input-output relationships, generally referred to as temporal conditional shift. Designing sequence models capable of adapting to such time-varying distributions remains an open problem. To address this, we present Model-Attentive Ensemble learning for Sequence modeling (MAES). MAES is a mixture of time-series experts that leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions. We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
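The core idea of the abstract, a pool of experts whose predictions are combined by attention-derived gates computed from the current input, can be illustrated with a toy example. This is a minimal sketch under our own assumptions, not the authors' MAES implementation: the experts here are plain linear models, the class name `AttentiveMixture` and all parameter names are ours, and the gating is standard scaled dot-product attention between a query from the input and a learned key per expert.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class AttentiveMixture:
    """Toy attention-gated mixture of linear experts (illustrative only).

    Each expert is a linear model; an attention mechanism compares a query
    derived from the current input against a learned key per expert and
    uses the resulting weights to mix the experts' predictions.
    """

    def __init__(self, n_experts, d_in, d_key, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_experts, d_in))      # one linear expert per row
        self.keys = rng.normal(size=(n_experts, d_key))  # learned key per expert
        self.Wq = rng.normal(size=(d_in, d_key))         # maps input to a query

    def predict(self, x):
        # x: (d_in,) summary of the current sequence dynamics.
        q = x @ self.Wq                                   # query vector, (d_key,)
        scores = self.keys @ q / np.sqrt(self.keys.shape[1])
        gates = softmax(scores)                           # attention weights over experts
        expert_preds = self.W @ x                         # each expert's scalar prediction
        return float(gates @ expert_preds), gates

mix = AttentiveMixture(n_experts=4, d_in=8, d_key=16)
y, gates = mix.predict(np.ones(8))
```

Because the gates depend on the input, different regions of a patient trajectory can activate different experts, which is the mechanism the abstract describes for handling temporal conditional shift; in the actual model the experts would be sequence models trained jointly with the gating network.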
