Data Augmentation by AutoEncoders for Unsupervised Anomaly Detection

12/21/2019
by Kasra Babaei, et al.

This paper proposes an autoencoder (AE) for improving the performance of one-class classifiers (OCCs) in anomaly detection. Traditional OCCs perform poorly under certain conditions, such as high dimensionality and sparsity, and the size of the training set also plays an important role in their performance. Autoencoders have been widely used to obtain useful latent variables from high-dimensional datasets. In the proposed approach, the AE derives meaningful features from high-dimensional datasets while simultaneously performing data augmentation. The augmented data are then used to train the OCC algorithms. Experimental results show that the proposed approach enhances the performance of OCC algorithms and outperforms other well-known approaches.
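A minimal sketch of the general pipeline the abstract describes is given below, assuming PyTorch for the autoencoder and scikit-learn's OneClassSVM as the one-class classifier. The architecture, the latent-noise augmentation step, and all hyperparameters are illustrative assumptions, not the paper's exact method.

```python
# Sketch: train an AE on (assumed normal) data, augment by perturbing latent
# codes and decoding them, then fit a one-class classifier on the augmented set.
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import OneClassSVM

class AE(nn.Module):
    def __init__(self, in_dim, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                     nn.Linear(64, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def augment_and_fit(X, n_aug=2, noise=0.05, epochs=50):
    X_t = torch.tensor(X, dtype=torch.float32)
    model = AE(X.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Train the autoencoder to reconstruct the training data.
    for _ in range(epochs):
        opt.zero_grad()
        recon, _ = model(X_t)
        loss = loss_fn(recon, X_t)
        loss.backward()
        opt.step()

    # Data augmentation (illustrative): perturb latent codes with Gaussian
    # noise and decode them back to input space.
    with torch.no_grad():
        _, z = model(X_t)
        augmented = [X]
        for _ in range(n_aug):
            z_noisy = z + noise * torch.randn_like(z)
            augmented.append(model.decoder(z_noisy).numpy())
    X_aug = np.vstack(augmented)

    # Train a one-class classifier on the augmented data.
    occ = OneClassSVM(gamma="auto", nu=0.1).fit(X_aug)
    return model, occ

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(500, 30)).astype(np.float32)  # placeholder data
    _, occ = augment_and_fit(X_train)
    print(occ.predict(X_train[:5]))  # +1 = inlier, -1 = anomaly
```

In this sketch the augmented set is simply the original data plus decoded noisy latent codes; the paper's actual augmentation and feature-extraction procedure may differ.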

12/21/2019

AEGR: A simple approach to gradient reversal in autoencoders for network anomaly detection

Anomaly detection is referred to as a process in which the aim is to det...
03/08/2021

Anomaly Detection Based on Selection and Weighting in Latent Space

With the high requirements of automation in the era of Industry 4.0, ano...
08/23/2018

DOPING: Generative Data Augmentation for Unsupervised Anomaly Detection with GAN

Recently, the introduction of the generative adversarial network (GAN) a...
04/30/2021

Data Augmentation in High Dimensional Low Sample Size Setting Using a Geometry-Based Variational Autoencoder

In this paper, we propose a new method to perform data augmentation in a...
08/16/2022

Role of Data Augmentation in Unsupervised Anomaly Detection

Self-supervised learning (SSL) has emerged as a promising alternative to...
09/16/2019

Distance Assessment and Hypothesis Testing of High-Dimensional Samples using Variational Autoencoders

Given two distinct datasets, an important question is if they have arise...
12/09/2020

Detection of Adversarial Supports in Few-shot Classifiers Using Feature Preserving Autoencoders and Self-Similarity

Few-shot classifiers excel under limited training samples, making it use...