Self-Supervised Anomaly Detection by Self-Distillation and Negative Sampling

01/17/2022
by   Nima Rafiee, et al.

Detecting whether examples belong to a given in-distribution or are Out-Of-Distribution (OOD) requires identifying features specific to the in-distribution. In the absence of labels, these features can be learned by self-supervised techniques under the generic assumption that the most abstract features are those which are statistically most over-represented in comparison to other distributions from the same domain. In this work, we show that self-distillation of the in-distribution training set, together with contrasting against negative examples derived from shifting transformations of auxiliary data, strongly improves OOD detection. We find that this improvement depends on how the negative samples are generated. In particular, negative samples that keep the statistics of low-level features while changing the high-level semantics yield higher average detection performance. Furthermore, good negative sampling strategies can be identified from the sensitivity of the OOD detection score. The effectiveness of our approach is demonstrated across a diverse range of OOD detection problems, setting new benchmarks for unsupervised OOD detection in the visual domain.
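
To make the idea concrete, the sketch below illustrates, under assumed names, how one training step might combine the two ingredients described in the abstract: a self-distillation term between two augmented views of in-distribution data, and a contrastive term against negatives produced by a semantics-shifting transformation of auxiliary images (here, purely for illustration, a 90-degree rotation, which preserves low-level statistics while altering semantics). The encoder names, the weighting `lam`, and the temperature `tau` are assumptions for exposition, not the authors' exact implementation.

```python
# Minimal sketch (assumed, not the paper's code): self-distillation on
# in-distribution data plus a contrastive term against negatives built
# by applying a semantics-shifting transformation to auxiliary images.

import torch
import torch.nn.functional as F


def make_negatives(aux_images: torch.Tensor) -> torch.Tensor:
    """Shifting transformation: keep low-level statistics, change semantics.

    A 90-degree rotation of the H/W axes of an NCHW batch is one simple
    example of such a transformation (illustrative choice).
    """
    return torch.rot90(aux_images, k=1, dims=(2, 3))


def training_step(student, teacher, view_a, view_b, aux_images,
                  lam: float = 1.0, tau: float = 0.1) -> torch.Tensor:
    """One step combining self-distillation with negative-sample contrast.

    view_a, view_b: two augmentations of the same in-distribution batch.
    aux_images:     a batch drawn from an auxiliary dataset.
    """
    negatives = make_negatives(aux_images)

    z_a = F.normalize(student(view_a), dim=-1)
    with torch.no_grad():                      # teacher is not back-propagated
        z_b = F.normalize(teacher(view_b), dim=-1)
    z_neg = F.normalize(student(negatives), dim=-1)

    # Self-distillation: pull the two views of in-distribution data together.
    distill_loss = (1 - (z_a * z_b).sum(dim=-1)).mean()

    # Negative contrast: push in-distribution embeddings away from negatives.
    sim_neg = z_a @ z_neg.t() / tau            # (in-batch, negatives) similarities
    contrast_loss = torch.logsumexp(sim_neg, dim=-1).mean()

    return distill_loss + lam * contrast_loss
```

At test time, an OOD score can then be derived from the similarity between a sample's embedding and the in-distribution representation; the abstract notes that the sensitivity of this score also serves to identify good negative sampling strategies.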


