Adversarial vulnerability of powerful near out-of-distribution detection

01/18/2022
by Stanislav Fort, et al.

There has recently been significant progress in detecting out-of-distribution (OOD) inputs to neural networks, driven primarily by large models pretrained on large datasets and by an emerging use of multi-modality. We show a severe adversarial vulnerability of even the strongest current OOD detection techniques: with a small, targeted perturbation to the input pixels, we can easily change an image's assignment from in-distribution to out-of-distribution, and vice versa. In particular, we demonstrate severe adversarial vulnerability on the challenging near OOD CIFAR-100 vs CIFAR-10 task, as well as on the far OOD CIFAR-100 vs SVHN task. We study the adversarial robustness of several post-processing techniques, including the simple Maximum Softmax Probability (MSP) baseline, the Mahalanobis distance, and the newly proposed Relative Mahalanobis distance. By comparing the loss of OOD detection performance at various perturbation strengths, we demonstrate the benefit of using ensembles of OOD detectors and of the Relative Mahalanobis distance over other post-processing methods. In addition, we show that even strong zero-shot OOD detection using CLIP and multi-modality suffers from a severe lack of adversarial robustness. Our code is available at https://github.com/stanislavfort/adversaries_to_OOD_detection
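To make the attacked detector concrete, the following is a minimal sketch (not the authors' released code, which is in the repository linked above) of the setup the abstract describes: an MSP-based OOD score, and a small L-infinity-bounded, PGD-style perturbation that pushes that score down so an in-distribution image is flagged as OOD. The classifier `model`, the image batch `x`, and the budget `eps` are placeholder assumptions for illustration only.

```python
# Sketch: attacking an MSP-based OOD detector (assumes a PyTorch classifier
# `model` returning logits and inputs `x` with pixel values in [0, 1]).
import torch
import torch.nn.functional as F

def msp_score(model, x):
    """MSP OOD score: high = looks in-distribution, low = looks OOD."""
    with torch.no_grad():
        probs = F.softmax(model(x), dim=-1)
    return probs.max(dim=-1).values

def attack_msp(model, x, eps=2 / 255, steps=10, step_size=0.5 / 255):
    """PGD-style attack that lowers the MSP score within an L-inf ball of radius eps."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        probs = F.softmax(model(x_adv), dim=-1)
        loss = probs.max(dim=-1).values.sum()
        grad, = torch.autograd.grad(loss, x_adv)
        with torch.no_grad():
            x_adv = x_adv - step_size * grad.sign()           # step against gradient: push MSP down
            x_adv = x.clone() + (x_adv - x).clamp(-eps, eps)  # project back into the eps-ball
            x_adv = x_adv.clamp(0.0, 1.0)                     # keep valid pixel range
    return x_adv.detach()
```

Increasing eps or the number of steps strengthens the attack; flipping the sign of the update instead raises the MSP score, which corresponds to the reverse direction the abstract mentions, making a truly OOD input look in-distribution.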

Related research

06/16/2021: A Simple Fix to Mahalanobis Distance for Improving Near-OOD Detection
Mahalanobis distance (MD) is a simple and popular post-processing method...

08/04/2022: A New Kind of Adversarial Example
Almost all adversarial attacks are formulated to add an imperceptible pe...

09/30/2020: DVERGE: Diversifying Vulnerabilities for Enhanced Robust Generation of Ensembles
Recent research finds CNN models for image classification demonstrate ov...

05/30/2021: Improving Entropic Out-of-Distribution Detection using Isometric Distances and the Minimum Distance Score
Current out-of-distribution detection approaches usually present special...

09/30/2022: Your Out-of-Distribution Detection Method is Not Robust!
Out-of-distribution (OOD) detection has recently gained substantial atte...

09/04/2023: On the use of Mahalanobis distance for out-of-distribution detection with neural networks for medical imaging
Implementing neural networks for clinical use in medical applications ne...

12/16/2021: Towards Robust Neural Image Compression: Adversarial Attack and Model Finetuning
Deep neural network based image compression has been extensively studied...
