Low Latency Anomaly Detection and Bayesian Network Prediction of Anomaly Likelihood

11/11/2016
by Derek Farren, et al.

We develop a supervised machine learning model that detects anomalies in systems in real time. The model processes unbounded streams of data into time series, which then form the basis of a low-latency anomaly detection model. We further extend our initial goal of anomaly detection to simultaneous anomaly prediction. We approach this challenging problem by developing a Bayesian Network framework that captures information about the parameters of the lagged regressors calibrated in the first part of our approach, and uses this structure to learn local conditional probability distributions.
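The detection stage described above can be illustrated with a minimal sketch: a detector that predicts each new stream value from a fixed set of lagged regressors and flags the point as anomalous when the residual exceeds a threshold. The class name, weights, and threshold here are illustrative assumptions, not the paper's actual implementation.

```python
from collections import deque


class LaggedAnomalyDetector:
    """Toy low-latency detector: predicts each value from lagged
    regressors and flags large residuals as anomalies.
    (A sketch of the general technique, not the paper's model.)"""

    def __init__(self, weights, threshold):
        self.weights = list(weights)              # lagged-regressor coefficients
        self.lags = deque(maxlen=len(weights))    # sliding window of recent values
        self.threshold = threshold                # residual cutoff for an anomaly

    def update(self, x):
        """Consume one stream value; return True if it is anomalous."""
        if len(self.lags) < self.lags.maxlen:
            self.lags.append(x)                   # still warming up the window
            return False
        pred = sum(w * v for w, v in zip(self.weights, self.lags))
        self.lags.append(x)
        return abs(x - pred) > self.threshold


# With a single lag and weight 1.0, the predictor is "last value";
# a sudden jump from 1.0 to 3.0 is flagged.
det = LaggedAnomalyDetector(weights=[1.0], threshold=0.5)
flags = [det.update(x) for x in [1.0, 1.0, 1.0, 3.0]]
```

In the paper's second stage, the calibrated regressor parameters would feed a Bayesian Network whose local conditional distributions give the likelihood of a future anomaly, rather than a hard threshold.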


research · 07/08/2016
Real-Time Anomaly Detection for Streaming Analytics
Much of the world's data is streaming, time-series data, where anomalies ...

research · 07/08/2022
Signed Network Embedding with Application to Simultaneous Detection of Communities and Anomalies
Signed networks are frequently observed in real life with additional sig...

research · 11/13/2019
Anomaly Detection in Large Scale Networks with Latent Space Models
We develop a real-time anomaly detection algorithm for directed activity...

research · 02/02/2018
Representation Learning for Resource Usage Prediction
Creating a model of a computer system that can be used for tasks such as...

research · 10/31/2017
Why (and How) Networks Should Run Themselves
The proliferation of networked devices, systems, and applications that w...

research · 02/22/2019
Bayesian Anomaly Detection and Classification
Statistical uncertainties are rarely incorporated in machine learning al...

research · 11/11/2019
RAD: On-line Anomaly Detection for Highly Unreliable Data
Classification algorithms have been widely adopted to detect anomalies f...
