Contextual Affinity Distillation for Image Anomaly Detection

07/06/2023
by   Jie Zhang, et al.

Previous works on unsupervised industrial anomaly detection mainly focus on local structural anomalies such as cracks and color contamination. While they achieve high detection performance on this kind of anomaly, they struggle with logical anomalies that violate long-range dependencies, such as a normal object placed in the wrong position. In this paper, building on previous knowledge distillation works, we propose to use two students (local and global) to better mimic the teacher's behavior. The local student, which is used in previous studies, mainly focuses on structural anomaly detection, while the global student attends to logical anomalies. To further encourage the global student to capture long-range dependencies, we design a global context condensing block (GCCB) and propose a contextual affinity loss for student training and anomaly scoring. Experimental results show that the proposed method requires no cumbersome training techniques and achieves new state-of-the-art performance on the MVTec LOCO AD dataset.
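To illustrate the general idea behind a contextual affinity loss, here is a minimal NumPy sketch. It compares pairwise cosine affinities between spatial positions of teacher and student feature maps; the feature shapes, the choice of cosine similarity, and the mean-absolute-difference reduction are assumptions for illustration and may differ from the paper's exact formulation.

```python
import numpy as np

def contextual_affinity(feats):
    """Pairwise cosine affinity between spatial positions.

    feats: array of shape (N, C) -- N flattened spatial positions,
    C feature channels. Returns an (N, N) affinity matrix.
    """
    f = feats / (np.linalg.norm(feats, axis=1, keepdims=True) + 1e-8)
    return f @ f.T

def contextual_affinity_loss(teacher_feats, student_feats):
    """Mean absolute difference between the teacher's and the
    student's contextual affinity matrices (illustrative reduction)."""
    a_t = contextual_affinity(teacher_feats)
    a_s = contextual_affinity(student_feats)
    return np.abs(a_t - a_s).mean()
```

Because the loss is computed over all position pairs rather than per position, a student that only matches the teacher locally but misrepresents long-range relationships is still penalized, which is the intuition the abstract gives for catching logical anomalies.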


Related research

- 01/30/2023 · FractalAD: A simple industrial anomaly segmentation method using fractal anomaly generation and backbone knowledge distillation — "Although industrial anomaly detection (AD) technology has made significa..."
- 03/10/2023 · Learning Global-Local Correspondence with Semantic Bottleneck for Logical Anomaly Detection — "This paper presents a novel framework, named Global-Local Correspondence..."
- 10/14/2022 · Asymmetric Student-Teacher Networks for Industrial Anomaly Detection — "Industrial defect detection is commonly addressed with anomaly detection..."
- 11/06/2019 · Uninformed Students: Student-Teacher Anomaly Detection with Discriminative Latent Embeddings — "We introduce a simple, yet powerful student-teacher framework for the ch..."
- 01/26/2022 · Anomaly Detection via Reverse Distillation from One-Class Embedding — "Knowledge distillation (KD) achieves promising results on the challengin..."
- 06/16/2023 · MixedTeacher: Knowledge Distillation for fast inference textural anomaly detection — "For a very long time, unsupervised learning for anomaly detection has be..."
- 08/06/2022 · HaloAE: An HaloNet based Local Transformer Auto-Encoder for Anomaly Detection and Localization — "Unsupervised anomaly detection and localization is a crucial task as it ..."
