Learn what you can't learn: Regularized Ensembles for Transductive Out-of-distribution Detection

12/10/2020
by Alexandru Tifrea, et al.

Machine learning models are often used in practice if they achieve good generalization results on in-distribution (ID) holdout data. When deployed in the wild, they should also be able to detect samples they cannot predict well. We show that current out-of-distribution (OOD) detection algorithms for neural networks produce unsatisfactory results in a variety of OOD detection scenarios, e.g. when OOD data consists of unseen classes or corrupted measurements. This paper studies how such "hard" OOD scenarios can benefit from adjusting the detection method after observing a batch of the test data. This transductive setting is relevant when the advantage of even a slightly delayed OOD detection outweighs the financial cost of additional tuning. We propose a novel method that uses an artificial labeling scheme for the test data and regularization to obtain ensembles of models that produce contradictory predictions only on the OOD samples in a test batch. We show via comprehensive experiments that our approach is indeed able to significantly outperform both inductive and transductive baselines on difficult OOD detection scenarios, such as unseen classes on CIFAR-10/CIFAR-100, severe corruptions (CIFAR-C), and strong covariate shift (ImageNet vs ObjectNet).
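The abstract only describes the method at a high level: each ensemble member sees the unlabeled test batch under a different artificial labeling, the fit is regularized, and test points on which the members disagree are flagged as OOD. Below is a minimal, hypothetical sketch of that idea, not the paper's actual recipe; the linear backbone, the number of members, the "one artificial class per member" labeling, and the regularization strength are all illustrative assumptions.

```python
# Hedged sketch of the transductive ensemble-disagreement idea from the
# abstract. Assumptions (not from the paper): a linear classifier backbone,
# each ensemble member labels the whole test batch with a single artificial
# class, and regularization = strong L2 penalty plus few training epochs.
import numpy as np
from sklearn.linear_model import SGDClassifier

def disagreement_ood_scores(X_train, y_train, X_test, n_members=3, seed=0):
    """Return a per-test-point disagreement score (higher = more OOD-like)."""
    classes = np.unique(y_train)
    preds = []
    for m in range(n_members):
        # Artificial labeling scheme (assumption): assign the entire test
        # batch to a different class for each ensemble member.
        fake_label = classes[m % len(classes)]
        X_aug = np.vstack([X_train, X_test])
        y_aug = np.concatenate([y_train, np.full(len(X_test), fake_label)])
        # Regularized fit (assumption): ID test points should be pulled back
        # to their true class, while OOD points keep the arbitrary label.
        clf = SGDClassifier(loss="log_loss", alpha=1e-2, max_iter=20,
                            tol=None, random_state=seed + m)
        clf.fit(X_aug, y_aug)
        preds.append(clf.predict(X_test))
    preds = np.stack(preds)  # shape: (n_members, n_test)
    # Disagreement score: number of distinct labels predicted per test point.
    return np.array([len(set(col)) for col in preds.T])
```

In this sketch, test points on which all members agree would be treated as ID, while a disagreement count above one would mark a point as an OOD candidate; the actual scoring rule and thresholding used in the paper may differ.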


