Class Adaptive Network Calibration

11/28/2022
by Bingyuan Liu, et al.

Recent studies have revealed that, beyond conventional accuracy, calibration should also be considered for training modern deep neural networks. To address miscalibration during learning, some methods have explored different penalty functions as part of the learning objective, alongside a standard classification loss, with a hyper-parameter controlling the relative contribution of each term. Nevertheless, these methods share two major drawbacks: 1) the scalar balancing weight is the same for all classes, hindering the ability to address different intrinsic difficulties or imbalance among classes; and 2) the balancing weight is usually fixed without an adaptive strategy, which may prevent reaching the best compromise between accuracy and calibration, and requires a hyper-parameter search for each application. We propose Class Adaptive Label Smoothing (CALS) for calibrating deep networks, which learns class-wise multipliers during training, yielding a powerful alternative to common label smoothing penalties. Our method builds on a general Augmented Lagrangian approach, a well-established technique in constrained optimization, but we introduce several modifications to tailor it for large-scale, class-adaptive training. Comprehensive evaluation and multiple comparisons on a variety of benchmarks, including standard and long-tailed image classification, semantic segmentation, and text classification, demonstrate the superiority of the proposed method. The code is available at https://github.com/by-liu/CALS.
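The idea of class-wise multipliers updated in an Augmented-Lagrangian style can be sketched as follows. This is an illustrative toy implementation, not the authors' code: the confidence-margin constraint, the function names (`class_adaptive_loss`, `update_multipliers`), and the update rule are assumptions chosen to convey the mechanism, where each class k carries its own multiplier `lambdas[k]` that grows while that class's samples still violate the calibration constraint.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def class_adaptive_loss(logits, labels, lambdas, margin=0.1):
    """Cross-entropy plus a class-weighted calibration penalty (toy sketch).

    The penalty activates when the max softmax confidence exceeds
    (1 - margin), in the spirit of margin-based label-smoothing penalties;
    lambdas[k] scales the penalty for samples of class k.
    """
    probs = softmax(logits)
    n = len(labels)
    ce = -np.log(probs[np.arange(n), labels] + 1e-12).mean()
    # Per-sample constraint violation: confidence above the margin.
    viol = np.maximum(probs.max(axis=1) - (1.0 - margin), 0.0)
    penalty = (lambdas[labels] * viol).mean()
    return ce + penalty, viol

def update_multipliers(lambdas, labels, viol, rho=1.0):
    """Augmented-Lagrangian-style ascent on the class-wise multipliers:
    raise lambda for classes whose samples still violate the constraint,
    clipped at zero so multipliers stay non-negative."""
    new = lambdas.copy()
    for k in np.unique(labels):
        new[k] = max(lambdas[k] + rho * viol[labels == k].mean(), 0.0)
    return new
```

In a training loop, the model weights would be updated by gradient descent on `class_adaptive_loss` while `update_multipliers` is applied periodically (e.g., once per epoch), so over-confident or harder classes automatically receive a stronger calibration penalty without a global hyper-parameter search.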


Related research:
- 09/10/2021: Balancing Methods for Multi-label Text Classification with Long-Tailed Class Distribution
- 11/30/2021: The Devil is in the Margin: Margin-based Label Smoothing for Network Calibration
- 04/01/2021: Improving Calibration for Long-Tailed Recognition
- 09/11/2021: Class-Distribution-Aware Calibration for Long-Tailed Visual Recognition
- 08/23/2023: ACLS: Adaptive and Conditional Label Smoothing for Network Calibration
- 08/22/2022: Towards Calibrated Hyper-Sphere Representation via Distribution Overlap Coefficient for Long-tailed Learning
- 10/22/2020: Posterior Re-calibration for Imbalanced Datasets
