Asymmetric Learning Vector Quantization for Efficient Nearest Neighbor Classification in Dynamic Time Warping Spaces

03/24/2017
by Brijnesh Jain, et al.

The nearest neighbor method combined with the dynamic time warping (DTW) distance is one of the most popular approaches to time series classification. It suffers, however, from high storage and computation costs on large training sets. To address both drawbacks, this article extends learning vector quantization (LVQ) from Euclidean spaces to DTW spaces. The proposed LVQ scheme uses asymmetric weighted averaging as its update rule. Empirical results show that asymmetric generalized LVQ (GLVQ) outperforms other state-of-the-art prototype generation methods for nearest neighbor classification.
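The paper's exact asymmetric weighted averaging rule is given in the full text; as a rough illustration under assumed details, the sketch below shows an LVQ1-style update in a DTW space for univariate series: each prototype element is pulled toward (same class) or pushed away from (different class) the sample elements aligned to it by the optimal warping path. The function names, the simple per-element rule, and the learning rate are illustrative choices, not the authors' method.

```python
import numpy as np

def dtw_path(x, y):
    """Compute an optimal DTW warping path between two univariate
    series by dynamic programming; returns a list of index pairs (i, j)."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from the end cell to recover one optimal path.
    i, j = n, m
    path = [(i - 1, j - 1)]
    while i > 1 or j > 1:
        steps = []
        if i > 1 and j > 1:
            steps.append((D[i - 1, j - 1], i - 1, j - 1))
        if i > 1:
            steps.append((D[i - 1, j], i - 1, j))
        if j > 1:
            steps.append((D[i, j - 1], i, j - 1))
        _, i, j = min(steps)
        path.append((i - 1, j - 1))
    return path[::-1]

def lvq1_dtw_update(prototype, sample, same_class, lr=0.1):
    """One LVQ1-style update in DTW space (illustrative, not the paper's
    rule): move each prototype element toward the aligned sample elements
    if the labels agree, away from them otherwise."""
    path = dtw_path(prototype, sample)
    p = prototype.copy()
    sign = 1.0 if same_class else -1.0
    for i, j in path:
        p[i] += sign * lr * (sample[j] - p[i])
    return p
```

Because the update follows the warping alignment rather than positional indices, prototypes adapt even when training series are locally stretched or compressed in time.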


Related research

- 03/04/2019, Making the Dynamic Time Warping Distance Warping-Invariant: The literature postulates that the dynamic time warping (dtw) distance c...
- 09/22/2019, Classification in asymmetric spaces via sample compression: We initiate the rigorous study of classification in quasi-metric spaces....
- 08/29/2018, Semi-Metrification of the Dynamic Time Warping Distance: The dynamic time warping (dtw) distance fails to satisfy the triangle in...
- 02/14/2013, A Latent Source Model for Nonparametric Time Series Classification: For classifying time series, a nearest-neighbor approach is widely used ...
- 06/06/2016, shapeDTW: shape Dynamic Time Warping: Dynamic Time Warping (DTW) is an algorithm to align temporal sequences w...
- 06/09/2022, Neural Bregman Divergences for Distance Learning: Many metric learning tasks, such as triplet learning, nearest neighbor r...
- 04/11/2021, Sublinear Time Nearest Neighbor Search over Generalized Weighted Manhattan Distance: Nearest Neighbor Search (NNS) over generalized weighted distance is fund...
