Neural CDEs for Long Time Series via the Log-ODE Method

09/17/2020
by James Morrill, et al.

Neural Controlled Differential Equations (Neural CDEs) are the continuous-time analogue of an RNN, just as Neural ODEs are analogous to ResNets. However, just like RNNs, Neural CDEs can be difficult to train on long time series. Here, we propose to apply a technique drawn from stochastic analysis, namely the log-ODE method. Instead of using the original input sequence, our procedure summarises the information over local time intervals via the log-signature map and uses the resulting shorter stream of log-signatures as the new input. This represents a length/channel trade-off. In doing so, we demonstrate efficacy on problems of length up to 17k observations and observe significant training speed-ups, improvements in model performance, and reduced memory requirements compared to the existing algorithm.
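The following is a minimal sketch of the length/channel trade-off described in the abstract, assuming the open-source signatory package (a PyTorch library for signature and log-signature computation). The window size, truncation depth, and tensor shapes are illustrative choices, not the paper's settings, and the helper name logsignature_windows is hypothetical.

```python
import torch
import signatory


def logsignature_windows(x, depth=3, window=50):
    """Summarise a long path over local time windows via the log-signature map.

    x: tensor of shape (batch, length, channels), the original long time series.
    Returns a tensor of shape (batch, length // window, logsig_channels):
    a much shorter stream, with more channels per step.
    """
    batch, length, channels = x.shape
    num_windows = length // window
    pieces = []
    for i in range(num_windows):
        # Include the window's right endpoint so consecutive windows share a point.
        piece = x[:, i * window:(i + 1) * window + 1, :].contiguous()
        # Log-signature of this window: shape (batch, logsig_channels).
        pieces.append(signatory.logsignature(piece, depth))
    # Stack windows back into a (short) stream dimension.
    return torch.stack(pieces, dim=1)


# Illustrative usage: a 3-channel series of length 17,000 becomes a stream of
# 340 steps, each with signatory.logsignature_channels(3, 3) = 14 channels.
x = torch.randn(32, 17000, 3)
logsig_stream = logsignature_windows(x)
print(logsig_stream.shape)  # torch.Size([32, 340, 14])
```

The shorter, wider stream of log-signatures is what would then be fed to the Neural CDE in place of the raw observations, which is where the reported speed-ups and memory savings come from.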


Related research

04/19/2022
LORD: Lower-Dimensional Embedding of Log-Signature in Neural Rough Differential Equations
The problem of processing very long time-series data (e.g., a length of ...

07/08/2019
Latent ODEs for Irregularly-Sampled Time Series
Time series with non-uniform intervals occur in many applications, and a...

01/11/2023
Learnable Path in Neural Controlled Differential Equations
Neural controlled differential equations (NCDEs), which are continuous a...

08/22/2019
Learning stochastic differential equations using RNN with log signature features
This paper contributes to the challenge of learning a function on stream...

06/21/2021
Neural Controlled Differential Equations for Online Prediction Tasks
Neural controlled differential equations (Neural CDEs) are a continuous-...

01/02/2022
Randomized Signature Layers for Signal Extraction in Time Series Data
Time series analysis is a widespread task in Natural Sciences, Social Sc...

05/27/2020
Discretize-Optimize vs. Optimize-Discretize for Time-Series Regression and Continuous Normalizing Flows
We compare the discretize-optimize (Disc-Opt) and optimize-discretize (O...
