
An Analytical Approach to Improving Time Warping on Multidimensional Time Series

by Jörg P. Bachmann, et al.
Humboldt-Universität zu Berlin

Dynamic time warping (DTW) is one of the most widely used distance functions for comparing time series, e.g., in nearest-neighbor classifiers. Yet, fast state-of-the-art algorithms only compare 1-dimensional time series efficiently. One of these state-of-the-art algorithms uses a lower bound (LB_Keogh), introduced by E. Keogh, to prune DTW computations. We introduce LB_Box as a canonical extension of LB_Keogh to multi-dimensional time series. We evaluate its performance conceptually and experimentally and show that an alternative to LB_Box is necessary for multi-dimensional time series. We also propose a new algorithm for the dog-keeper distance (DK), an alternative distance function to DTW, and show that it outperforms DTW with LB_Box by more than one order of magnitude on multi-dimensional time series.
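To make the pruning idea concrete, the following is a minimal sketch (not taken from the paper) of 1-dimensional DTW and the LB_Keogh lower bound it uses for pruning: LB_Keogh measures how far a query falls outside the running min/max envelope of a candidate under a Sakoe-Chiba band of radius r, and lower-bounds the band-constrained DTW distance, so candidates whose bound already exceeds the best distance found so far can be skipped without computing the full DTW table. The function names `dtw` and `lb_keogh` are illustrative, not from any library.

```python
import numpy as np

def dtw(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming DTW for 1-D series
    with squared point-wise costs; returns the square root of the total cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            # Extend the cheapest of the three admissible warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(np.sqrt(D[n, m]))

def lb_keogh(q, c, r):
    """Keogh lower bound of the band-constrained DTW distance between
    query q and candidate c, with Sakoe-Chiba band radius r."""
    lb = 0.0
    for i, qi in enumerate(q):
        window = c[max(0, i - r): i + r + 1]
        lo, hi = window.min(), window.max()
        # Only points of q outside the envelope [lo, hi] contribute.
        if qi > hi:
            lb += (qi - hi) ** 2
        elif qi < lo:
            lb += (qi - lo) ** 2
    return float(np.sqrt(lb))
```

Because LB_Keogh collapses the candidate into a per-position envelope, its obvious multi-dimensional analogue replaces the interval [lo, hi] by an axis-aligned bounding box per position; this is the LB_Box construction whose weakness in higher dimensions the paper analyzes.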


High Dimensional Time Series Generators

Multidimensional time series are sequences of real valued vectors. They ...

Warping Resilient Time Series Embeddings

Time series are ubiquitous in real world problems and computing distance...

Parameterizing the cost function of Dynamic Time Warping with application to time series classification

Dynamic Time Warping (DTW) is a popular time series distance measure tha...

TC-DTW: Accelerating Multivariate Dynamic Time Warping Through Triangle Inequality and Point Clustering

Dynamic time warping (DTW) plays an important role in analytics on time ...

Faster Retrieval with a Two-Pass Dynamic-Time-Warping Lower Bound

The Dynamic Time Warping (DTW) is a popular similarity measure between t...

Regular multidimensional stationary time series

The aim of this paper is to give a relatively simple, usable sufficient ...

Semi-Metrification of the Dynamic Time Warping Distance

The dynamic time warping (dtw) distance fails to satisfy the triangle in...