The Box Size Confidence Bias Harms Your Object Detector

12/03/2021
by   Johannes Gilg, et al.

Countless applications depend on accurate predictions with reliable confidence estimates from modern object detectors. It is well known, however, that neural networks, including object detectors, produce miscalibrated confidence estimates. Recent work even suggests that detectors' confidence predictions are biased with respect to object size and position, but it is still unclear how this bias relates to the performance of the affected object detectors. We formally prove that the conditional confidence bias harms the expected performance of object detectors and empirically validate these findings. Specifically, we show how to modify histogram binning calibration so that it not only avoids performance impairment but also improves performance through conditional confidence calibration. We further find that the confidence bias is also present in detections generated on the detector's training data, which we leverage to perform our de-biasing without using additional data. Moreover, Test-Time Augmentation magnifies this bias, which results in even larger performance gains from our calibration method. Finally, we validate our findings on a diverse set of object detection architectures and show improvements of up to 0.6 mAP and 0.8 mAP50 without extra data or training.
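
To make the idea of conditional confidence calibration concrete, the following is a minimal sketch of histogram binning applied separately per box-size group. It is not the authors' released implementation: the COCO-style area thresholds, the bin count, and names such as calibrate_by_box_size are illustrative assumptions; the true-positive labels are assumed to come from matching detections against ground truth at a fixed IoU threshold.

```python
import numpy as np

def histogram_binning(confidences, is_true_positive, n_bins=15):
    """Map each confidence bin to the empirical precision of its detections."""
    confidences = np.asarray(confidences, dtype=float)
    is_true_positive = np.asarray(is_true_positive, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    bin_ids = np.clip(np.digitize(confidences, edges) - 1, 0, n_bins - 1)
    per_bin = np.zeros(n_bins)
    for b in range(n_bins):
        mask = bin_ids == b
        if mask.any():
            # Calibrated confidence for this bin = fraction of true positives in it.
            per_bin[b] = is_true_positive[mask].mean()
    return edges, per_bin

def calibrate_by_box_size(conf, area, is_tp,
                          area_edges=(0.0, 32**2, 96**2, np.inf), n_bins=15):
    """Fit one histogram-binning calibrator per box-size group (assumed COCO-style areas)."""
    conf, area, is_tp = map(np.asarray, (conf, area, is_tp))
    calibrators = []
    for lo, hi in zip(area_edges[:-1], area_edges[1:]):
        in_group = (area >= lo) & (area < hi)
        calibrators.append(histogram_binning(conf[in_group], is_tp[in_group], n_bins))
    return area_edges, calibrators

def apply_calibration(conf, area, area_edges, calibrators):
    """Replace a detection's confidence with the calibrated value of its size group."""
    group = int(np.clip(np.digitize(area, area_edges) - 1, 0, len(calibrators) - 1))
    edges, per_bin = calibrators[group]
    b = int(np.clip(np.digitize(conf, edges) - 1, 0, len(per_bin) - 1))
    return per_bin[b]
```

In this sketch, the calibrators could be fitted on detections produced on the detector's own training images, in line with the paper's observation that the bias is already present there, so no additional data is required; fitting one calibrator per size group is what makes the binning conditional on box size rather than a single global mapping.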

