Model-Attentive Ensemble Learning for Sequence Modeling

by Victor D. Bourgin et al.

Medical time-series datasets have unique characteristics that make prediction tasks challenging. Most notably, patient trajectories often contain longitudinal variations in their input-output relationships, generally referred to as temporal conditional shift. Designing sequence models capable of adapting to such time-varying distributions remains an open problem. To address this, we present Model-Attentive Ensemble learning for Sequence modeling (MAES). MAES is a mixture of time-series experts that leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions. We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
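The core mechanism described above, an attention-based gate that weights the predictions of several experts, can be sketched in a minimal NumPy example. This is a hypothetical illustration, not the authors' implementation: the experts here are simple linear predictors, and `expert_keys`, `query_proj`, and the scaled dot-product gating are illustrative assumptions standing in for the paper's learned components.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical setup: K linear "experts", each mapping a length-T
# input window to a scalar prediction (stand-ins for RNN experts).
T, K, D = 8, 3, 4            # window length, number of experts, attention dim

expert_weights = rng.normal(size=(K, T))   # per-expert linear predictor
expert_keys    = rng.normal(size=(K, D))   # learned expert embeddings (assumed)
query_proj     = rng.normal(size=(T, D))   # maps the input window to a query

def maes_style_predict(x):
    """Attention-gated mixture prediction for one input window x of shape (T,)."""
    preds  = expert_weights @ x                  # (K,) each expert's prediction
    q      = x @ query_proj                      # (D,) query from current input
    scores = expert_keys @ q / np.sqrt(D)        # (K,) scaled dot-product scores
    alpha  = softmax(scores)                     # (K,) gating weights, sum to 1
    return float(alpha @ preds), alpha           # adaptively weighted prediction

x = rng.normal(size=T)
y_hat, alpha = maes_style_predict(x)
```

Because the gating weights `alpha` depend on the current input window, the mixture can shift credit between experts as the sequence dynamics change, which is the behavior the abstract attributes to MAES under temporal conditional shift.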

