Maximally Divergent Intervals for Anomaly Detection

10/21/2016
by Erik Rodner, et al.

We present new methods for batch anomaly detection in multivariate time series. Our methods are based on maximizing the Kullback-Leibler divergence between the data distribution inside an interval of the time series and the distribution outside it. An empirical analysis shows the benefits of our algorithms over methods that treat each time step independently of the others and do not optimize over all possible intervals.
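The idea in the abstract can be sketched in a few lines: fit a simple model to the points inside a candidate interval and another to the points outside it, score the interval by the KL divergence between the two, and return the highest-scoring interval. The sketch below is a hedged, univariate simplification (the paper treats the multivariate case); the Gaussian model, the brute-force interval search, and all function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    """KL(p || q) between two univariate Gaussians."""
    return 0.5 * (np.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def most_divergent_interval(x, min_len=5, max_len=50):
    """Brute-force search for the interval [a, b) whose Gaussian fit
    diverges most (in KL) from a Gaussian fit to the rest of the series.
    Illustrative only: O(n * max_len) score evaluations."""
    n = len(x)
    best_score, best_interval = -np.inf, None
    for a in range(n - min_len + 1):
        for b in range(a + min_len, min(a + max_len, n) + 1):
            inside = x[a:b]
            outside = np.concatenate([x[:a], x[b:]])
            if len(outside) < 2:
                continue
            # Small variance floor keeps the KL finite for flat segments.
            score = gaussian_kl(inside.mean(), inside.var() + 1e-8,
                                outside.mean(), outside.var() + 1e-8)
            if score > best_score:
                best_score, best_interval = score, (a, b)
    return best_interval, best_score
```

On a series with an injected level shift, the returned interval should overlap the anomalous region; this is what distinguishes interval-level detection from scoring each time step in isolation.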


research
01/09/2018

Greenhouse: A Zero-Positive Machine Learning System for Time-Series Anomaly Detection

This short paper describes our ongoing research on Greenhouse - a zero-p...
research
09/04/2020

Multivariate Time-series Anomaly Detection via Graph Attention Network

Anomaly detection on multivariate time-series is of great importance in ...
research
03/03/2020

CRATOS: Cognition of Reliable Algorithm for Time-series Optimal Solution

Anomaly detection of time series plays an important role in reliability ...
research
04/19/2018

Detecting Regions of Maximal Divergence for Spatio-Temporal Anomaly Detection

Automatic detection of anomalies in space- and time-varying measurements...
research
05/26/2021

Anomaly Detection in Predictive Maintenance: A New Evaluation Framework for Temporal Unsupervised Anomaly Detection Algorithms

The research in anomaly detection lacks a unified definition of what rep...
research
09/14/2021

Anomaly Attribution of Multivariate Time Series using Counterfactual Reasoning

There are numerous methods for detecting anomalies in time series, but t...
research
10/17/2022

tegdet: An extensible Python Library for Anomaly Detection using Time-Evolving Graphs

This paper presents a new Python library for anomaly detection in unsupe...
