
Foundations of Sequence-to-Sequence Modeling for Time Series
The availability of large amounts of time series data, paired with the p...

Historical Inertia: An Ignored but Powerful Baseline for Long Sequence Time-series Forecasting
Long sequence time-series forecasting (LSTF) has become increasingly pop...

Measuring Congruence on High Dimensional Time Series
A time series is a sequence of data items; typical examples are videos, ...

Differentiable Divergences Between Time Series
Computing the discrepancy between time series of variable sizes is notor...

Recurrent Deep Divergence-based Clustering for simultaneous feature learning and clustering of variable length time series
The task of clustering unlabeled time series and sequences entails a par...

Reconciling the Gaussian and Whittle Likelihood with an application to estimation in the frequency domain
In time series analysis there is an apparent dichotomy between time and ...

Routine Modeling with Time Series Metric Learning
Traditionally, the automatic recognition of human activities is performe...
Data Smashing 2.0: Sequence Likelihood (SL) Divergence For Fast Time Series Comparison
Recognizing subtle historical patterns is central to modeling and forecasting problems in time series analysis. Here we introduce and develop a new approach to quantify deviations in the underlying hidden generators of observed data streams, resulting in a new efficiently computable universal metric for time series. The proposed metric is universal in the sense that we can compare and contrast data streams regardless of where and how they are generated and without any feature engineering step. The approach proposed in this paper is conceptually distinct from our previous work on data smashing, and vastly improves discrimination performance and computing speed. The core idea here is the generalization of the notion of KL divergence, often used to compare probability distributions, to a notion of divergence between time series. We call this the sequence likelihood (SL) divergence, which may be used to measure deviations within a well-defined class of discrete-valued stochastic processes. We devise efficient estimators of SL divergence from finite sample paths and subsequently formulate a universal metric useful for computing distance between time series produced by hidden stochastic generators.
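The core idea — generalizing KL divergence to discrete-valued stochastic processes and estimating it from finite sample paths — can be sketched with a simple first-order Markov approximation. This is an illustrative assumption, not the paper's actual estimator: the names `markov_model`, `kl_rate`, and `sl_divergence`, and the additive smoothing, are all hypothetical choices for the sketch.

```python
# Sketch: a symmetrized KL-rate divergence between two symbol streams,
# assuming each stream is well-approximated by a first-order Markov chain.
from collections import Counter
import math
import random

def markov_model(seq, alphabet, alpha=0.5):
    """Estimate smoothed first-order transition probabilities P(b | a)."""
    pair_counts = Counter(zip(seq, seq[1:]))
    state_counts = Counter(seq[:-1])
    return {
        (a, b): (pair_counts[(a, b)] + alpha)
                / (state_counts[a] + alpha * len(alphabet))
        for a in alphabet for b in alphabet
    }

def kl_rate(seq, p, q):
    """Average log-likelihood ratio of seq under model p vs. model q (nats/symbol)."""
    return sum(math.log(p[t] / q[t]) for t in zip(seq, seq[1:])) / (len(seq) - 1)

def sl_divergence(x, y):
    """Symmetrized divergence between two streams via their fitted models."""
    alphabet = sorted(set(x) | set(y))
    px, py = markov_model(x, alphabet), markov_model(y, alphabet)
    return 0.5 * (kl_rate(x, px, py) + kl_rate(y, py, px))

random.seed(0)
x = [random.choice("ab") for _ in range(2000)]                     # uniform generator
y = ["a" if random.random() < 0.9 else "b" for _ in range(2000)]   # biased generator
print(sl_divergence(x, x))  # identical streams -> 0.0
print(sl_divergence(x, y))  # distinct hidden generators -> clearly positive
```

As in the abstract, the comparison needs no feature engineering: the divergence is computed directly from the raw symbol streams, and two samples from the same hidden generator score near zero while samples from different generators do not.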