Using Machine Learning to Augment Dynamic Time Warping Based Signal Classification

06/14/2022
by Arvind Seshan, et al.

Modern applications such as voice recognition rely on the ability to compare signals to pre-recorded ones in order to classify them. However, this comparison typically needs to ignore differences due to signal noise, temporal offset, signal magnitude, and other external factors. The Dynamic Time Warping (DTW) algorithm quantifies this similarity by finding corresponding regions between the signals and non-linearly warping one signal by stretching and shrinking it. Unfortunately, searching through all "warps" of a signal to find the best corresponding regions is computationally expensive. The FastDTW algorithm improves performance but sacrifices accuracy by only considering small signal warps. My goal is to improve the speed of DTW while maintaining high accuracy. My key insight is that in any particular application domain, signals exhibit specific types of variation. For example, the accelerometer signal measured for two different people would differ based on their stride length and weight. My system, called Machine Learning DTW (MLDTW), uses machine learning to learn the types of warps that are common in a particular domain. It then uses the learned model to improve DTW performance by limiting the search of potential warps appropriately. My results show that compared to FastDTW, MLDTW is at least as fast and reduces errors by 60%. These improvements will significantly impact a wide variety of applications (e.g., health monitoring) and enable more scalable processing of multivariate, higher-frequency, and longer signal recordings.
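The abstract does not include code, so the sketch below is only an illustration of the search-restriction idea it describes: standard DTW with a fixed Sakoe-Chiba band, i.e., the kind of hand-set warp limit that MLDTW would replace with a learned, domain-specific constraint. The function name `dtw_distance` and the `window` parameter are illustrative assumptions, not the paper's API.

```python
import numpy as np

def dtw_distance(x, y, window=None):
    """DTW distance between two 1-D signals with an optional band constraint.

    x, y   : 1-D numpy arrays (the two signals to compare)
    window : half-width of the allowed warping band; None means unconstrained.
    """
    n, m = len(x), len(y)
    if window is None:
        window = max(n, m)
    # Widen the band enough to reach the final cell when lengths differ.
    window = max(window, abs(n - m))

    # cost[i, j] = minimal cumulative cost of aligning x[:i] with y[:j].
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0

    for i in range(1, n + 1):
        lo = max(1, i - window)
        hi = min(m, i + window)
        for j in range(lo, hi + 1):
            d = (x[i - 1] - y[j - 1]) ** 2
            # Best of the three DTW moves: match, insertion, deletion.
            cost[i, j] = d + min(cost[i - 1, j - 1],
                                 cost[i - 1, j],
                                 cost[i, j - 1])
    return np.sqrt(cost[n, m])

# Example: two versions of the same signal, one slightly stretched in time.
t = np.linspace(0, 2 * np.pi, 200)
a = np.sin(t)
b = np.sin(1.1 * t)
print(dtw_distance(a, b, window=20))
```

Narrowing the band cuts the cost of the dynamic program from O(nm) to roughly O(n·window), which is why a fixed band (or FastDTW's coarse-to-fine search) trades accuracy for speed; the paper's claim is that learning where the band should be, per domain, recovers most of that accuracy.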


