When Does Confidence-Based Cascade Deferral Suffice?

07/06/2023
by Wittawat Jitkrittum, et al.

Cascades are a classical strategy to enable inference cost to vary adaptively across samples, wherein a sequence of classifiers is invoked in turn. A deferral rule determines whether to invoke the next classifier in the sequence or to terminate prediction. One simple deferral rule employs the confidence of the current classifier, e.g., the maximum predicted softmax probability. Despite being oblivious to the structure of the cascade – e.g., not modelling the errors of downstream models – such confidence-based deferral often works remarkably well in practice. In this paper, we seek to better understand the conditions under which confidence-based deferral may fail, and when alternate deferral strategies can perform better. We first present a theoretical characterisation of the optimal deferral rule, which precisely identifies the settings in which confidence-based deferral may suffer. We then study post-hoc deferral mechanisms, and demonstrate that they can significantly improve upon confidence-based deferral in settings where (i) downstream models are specialists that only work well on a subset of inputs, (ii) samples are subject to label noise, and (iii) there is distribution shift between the train and test sets.
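The confidence-based deferral rule described above can be sketched in a few lines. This is an illustrative implementation, not the authors' code: the function and threshold names are assumptions, each "model" is simply a callable mapping an input to logits, and the cascade defers to the next model whenever the current model's maximum softmax probability falls below its threshold.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cascade_predict(models, x, thresholds):
    """Confidence-based cascade deferral (illustrative sketch).

    models: list of callables, cheapest first; each maps x -> logit vector.
    thresholds: one confidence threshold per model except the last.
    Defers to the next model while max softmax probability < threshold.
    """
    for model, tau in zip(models[:-1], thresholds):
        probs = softmax(model(x))
        if probs.max() >= tau:
            # Current model is confident enough: terminate prediction here.
            return int(np.argmax(probs)), model
    # All earlier models deferred: fall back to the final (largest) model.
    probs = softmax(models[-1](x))
    return int(np.argmax(probs)), models[-1]
```

Note that this rule never consults the downstream models when deciding to defer, which is exactly the obliviousness the paper analyses: it can be suboptimal when, e.g., the downstream model is a specialist that is no better than the current one on the given input.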


