Generative time series models using Neural ODE in Variational Autoencoders

by M. L. Garsdal et al.

In this paper, we implement neural ordinary differential equations (NODEs) in a variational autoencoder (VAE) setting for generative time series modeling. An object-oriented approach was taken in the code to ease development and research, and all code used in the paper can be found here: The original results were first reproduced and the reconstructions compared to a baseline Long Short-Term Memory (LSTM) autoencoder. The model was then extended with an LSTM encoder and challenged with more complex data consisting of time series in the form of spring oscillations. The model showed promise and reconstructed true trajectories for all complexities of data with a smaller RMSE than the baseline model. However, while the decoder captured the dynamic behavior of the time series for known data, the model was unable to produce extrapolations that followed the true trajectory well for any complexity of spring data. In a final experiment, the model was also presented with 68 days of solar power production data and reconstructed it as well as the baseline did, even when very little data was available. Finally, the model's training time was compared to the baseline's: for small amounts of data, the NODE method trained significantly more slowly than the baseline, while for larger amounts of data it trained equally fast or faster. The paper concludes with a future work section describing the many natural extensions of the work presented here, such as further investigating the importance of the input data, including extrapolation in the baseline model, or testing more specific model setups.
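The latent-ODE structure described above can be sketched as follows. This is a minimal, hedged illustration with randomly initialised weights standing in for trained parameters, and with assumed toy dimensions; the paper's actual architecture (e.g. its LSTM encoder) is not reproduced here. Reconstruction and extrapolation differ only in the time grid handed to the decoder, which is why extrapolation is a natural test for this model class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions for illustration, not the paper's settings)
obs_dim, latent_dim, hidden_dim, T = 2, 4, 16, 25

# Randomly initialised weights stand in for trained parameters
W_enc = rng.normal(scale=0.1, size=(obs_dim * T, 2 * latent_dim))
W_f1 = rng.normal(scale=0.1, size=(latent_dim, hidden_dim))
W_f2 = rng.normal(scale=0.1, size=(hidden_dim, latent_dim))
W_dec = rng.normal(scale=0.1, size=(latent_dim, obs_dim))

def encode(x):
    """Map a flattened trajectory to mean and log-variance of z(t0)."""
    h = x.reshape(-1) @ W_enc
    return h[:latent_dim], h[latent_dim:]

def latent_ode(z):
    """Learned latent dynamics dz/dt = f(z), here a small tanh MLP."""
    return np.tanh(z @ W_f1) @ W_f2

def decode(z0, ts, dt=0.05):
    """Integrate the latent ODE with fixed-step Euler, then map z(t) to x(t)."""
    zs, z, t = [], z0, ts[0]
    for t_next in ts:
        while t < t_next:
            z = z + dt * latent_ode(z)  # Euler step
            t += dt
        zs.append(z)
    return np.stack(zs) @ W_dec

# Reconstruction: encode, reparameterise, decode over the observed grid
ts = np.linspace(0.0, 5.0, T)
x = np.stack([np.sin(ts), np.cos(ts)], axis=1)  # toy spring-like trajectory
mu, logvar = encode(x)
z0 = mu + np.exp(0.5 * logvar) * rng.normal(size=latent_dim)
x_rec = decode(z0, ts)

# Extrapolation beyond the observed window: simply extend the time grid
ts_ext = np.linspace(0.0, 10.0, 2 * T)
x_ext = decode(z0, ts_ext)
print(x_rec.shape, x_ext.shape)  # (25, 2) (50, 2)
```

With untrained weights the outputs are of course meaningless; the sketch only shows how a single latent initial state, evolved by a learned ODE, yields observations on an arbitrary time grid.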




Modeling Financial Time Series using LSTM with Trainable Initial Hidden States

Extracting previously unknown patterns and information in time series is...

On the balance between the training time and interpretability of neural ODE for time series modelling

Most machine learning methods are used as a black box for modelling. We ...

BreizhCrops: A Satellite Time Series Dataset for Crop Type Identification

This dataset challenges the time series community with the task of satel...

Time Series Compression Based on Adaptive Piecewise Recurrent Autoencoder

Time series account for a large proportion of the data stored in financi...

Causal Recurrent Variational Autoencoder for Medical Time Series Generation

We propose causal recurrent variational autoencoder (CR-VAE), a novel ge...

VConstruct: Filling Gaps in Chl-a Data Using a Variational Autoencoder

Remote sensing of Chlorophyll-a is vital in monitoring climate change. C...

Rethinking 1D-CNN for Time Series Classification: A Stronger Baseline

For time series classification task using 1D-CNN, the selection of kerne...
