
Exploring Covariate and Concept Shift for Detection and Calibration of Out-of-Distribution Data

by Junjiao Tian, et al.

Moving beyond testing on in-distribution data, works on Out-of-Distribution (OOD) detection have recently increased in popularity. A recent attempt to categorize OOD data introduces the notions of near and far OOD detection; that is, prior works define characteristics of OOD data in terms of detection difficulty. We propose instead to characterize the spectrum of OOD data using two types of distribution shift: covariate shift and concept shift, where covariate shift corresponds to a change in style, e.g., noise, and concept shift indicates a change in semantics. This characterization reveals that sensitivity to each type of shift is important to both the detection and the confidence calibration of OOD data. Consequently, we investigate score functions that capture sensitivity to each type of dataset shift, and methods that improve them. To this end, we theoretically derive two score functions for OOD detection, a covariate shift score and a concept shift score, based on a decomposition of the KL-divergence, and propose a geometrically-inspired method (Geometric ODIN) to improve OOD detection under both shifts using only in-distribution data. Additionally, the proposed method naturally leads to an expressive post-hoc calibration function which yields state-of-the-art calibration performance on both in-distribution and out-of-distribution data. We are the first to propose a method that works well across both OOD detection and calibration and under different types of shift. Specifically, we improve on the previous state-of-the-art in OOD detection by a relative 7% vs. SVHN and achieve the best calibration performance, 0.084 Expected Calibration Error, on the corrupted CIFAR100C dataset.
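Expected Calibration Error (ECE), the metric reported above, bins predictions by confidence and measures the gap between average confidence and accuracy in each bin. A minimal sketch of the standard equal-width-bin ECE (the function name and bin count are illustrative, not from the paper):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Standard ECE: bin predictions by confidence, then take the
    bin-size-weighted average of |accuracy - mean confidence|."""
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            acc = correct[mask].mean()          # empirical accuracy in this bin
            conf = confidences[mask].mean()     # average predicted confidence
            ece += mask.sum() / n * abs(acc - conf)
    return ece
```

A perfectly calibrated model (e.g., 80% of predictions at confidence 0.8 are correct) yields an ECE of 0; the paper's reported 0.084 on CIFAR100C means an average 8.4-point gap between confidence and accuracy.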




Related research:

- Improved Predictive Uncertainty using Corruption-based Calibration
- A Geometric Perspective towards Neural Calibration via Sensitivity Decomposition
- Full-Spectrum Out-of-Distribution Detection
- Types of Out-of-Distribution Texts and How to Detect Them
- Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data
- Adaptive Calibrator Ensemble for Model Calibration under Distribution Shift
- A benchmark with decomposed distribution shifts for 360 monocular depth estimation