Deep Self-Organization: Interpretable Discrete Representation Learning on Time Series

by Vincent Fortuin, et al.

Human professionals are often required to make decisions based on complex multivariate time series measurements in an online setting, e.g. in health care. Since human cognition is not optimized to work well in high-dimensional spaces, these decisions benefit from interpretable low-dimensional representations. However, many representation learning algorithms for time series data are difficult to interpret. This is due to non-intuitive mappings from data features to salient properties of the representation, and to non-smoothness over time. To address this problem, we propose to couple a variational autoencoder to a discrete latent space and introduce a topological structure through the use of self-organizing maps. This allows us to learn discrete representations of time series, which give rise to smooth and interpretable embeddings with superior clustering performance. Furthermore, to allow for a probabilistic interpretation of our method, we integrate a Markov model in the latent space. This model uncovers the temporal transition structure, improves clustering performance even further, and provides additional explanatory insights as well as a natural representation of uncertainty. We evaluate our model on static (Fashion-)MNIST data, a time series of linearly interpolated (Fashion-)MNIST images, a chaotic Lorenz attractor system with two macro states, as well as on a challenging real-world medical time series application. In the latter experiment, our representation uncovers meaningful structure in the acute physiological state of a patient.
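The core ingredient the abstract describes, a discrete latent space whose codes are arranged on a topological grid, can be illustrated with a plain self-organizing map update in NumPy. The sketch below is not the authors' SOM-VAE model (there is no encoder, decoder, or Markov transition model here, and all function names and hyperparameters are illustrative); it only shows the self-organization mechanism: each sample is assigned to its nearest codebook vector, and that winner plus its neighbors on a 2-D grid are pulled toward the sample, which is what makes nearby discrete codes represent similar states.

```python
import numpy as np

def train_som_grid(data, k=4, epochs=20, lr=0.5, seed=0):
    """Fit a k x k self-organizing map to `data` of shape (n_samples, dim).

    For each sample, the nearest codebook vector (the "winner") and its
    neighbors on the 2-D grid are moved toward the sample. The neighborhood
    is measured on the grid, not in data space, which induces the
    topological smoothness of the discrete representation.
    """
    rng = np.random.default_rng(seed)
    codes = rng.normal(size=(k * k, data.shape[1]))
    # 2-D grid coordinate of each codebook index.
    coords = np.array([(i // k, i % k) for i in range(k * k)], dtype=float)
    for epoch in range(epochs):
        # Neighborhood radius and learning rate both shrink over training.
        sigma = max(0.5, (k / 2.0) * (1.0 - epoch / epochs))
        step = lr * (1.0 - epoch / epochs)
        for x in data:
            winner = int(((codes - x) ** 2).sum(axis=1).argmin())
            # Gaussian neighborhood weight based on grid distance to winner.
            d2 = ((coords - coords[winner]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))[:, None]
            codes += step * h * (x - codes)
    return codes

def quantize(data, codes):
    """Return the discrete code index assigned to each sample."""
    d = ((data[:, None, :] - codes[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)
```

In the full model, `data` would be replaced by encoder outputs and the assignment made differentiable through the paper's loss terms; estimating a transition matrix between the resulting code indices over time would then play the role of the Markov model mentioned above.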




Related papers:

- Variational PSOM: Deep Probabilistic Clustering with Self-Organizing Maps
- Unsupervised Representation Learning for Time Series with Temporal Neighborhood Coding
- Explaining Deep Classification of Time-Series Data with Learned Prototypes
- SOM-CPC: Unsupervised Contrastive Learning with Self-Organizing Maps for Structured Representations of High-Rate Time Series
- Encoding Time-Series Explanations through Self-Supervised Model Behavior Consistency
- Learning Disentangled Representations for Time Series
- Graph Spectral Embedding for Parsimonious Transmission of Multivariate Time Series
