Class-Balanced Distillation for Long-Tailed Visual Recognition

04/12/2021
by Ahmet Iscen, et al.

Real-world imagery is often characterized by a significant imbalance in the number of images per class, leading to long-tailed distributions. A simple and effective approach to long-tailed visual recognition is to learn feature representations and a classifier separately, with instance sampling and class-balanced sampling, respectively. In this work, we introduce a new framework based on the key observation that a feature representation learned with instance sampling is far from optimal in a long-tailed setting. Our main contribution is a new training method, referred to as Class-Balanced Distillation (CBD), that leverages knowledge distillation to enhance feature representations. CBD allows the feature representation to evolve in the second training stage, guided by the teacher learned in the first stage. The second stage uses class-balanced sampling in order to focus on under-represented classes. This framework can naturally accommodate the use of multiple teachers, unlocking the information from an ensemble of models to enhance recognition capabilities. Our experiments show that the proposed technique consistently outperforms the state of the art on long-tailed recognition benchmarks such as ImageNet-LT, iNaturalist17 and iNaturalist18. They also show that, unlike most existing work, our method does not sacrifice the accuracy of head classes to improve the performance of tail classes.
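The two ingredients of the second training stage described above (class-balanced sampling, and a student that fits both the ground-truth labels and the predictions of one or more teachers) can be sketched in NumPy. This is a minimal illustration, not the paper's exact formulation: the function names, the softmax temperature T, and the mixing weight alpha are assumptions chosen for clarity.

```python
import numpy as np

def class_balanced_indices(labels, n_samples, rng):
    """Stage-2 sampling: pick a class uniformly at random, then pick
    an instance of that class, so tail classes are seen as often as
    head classes."""
    labels = np.asarray(labels)
    classes = np.unique(labels)
    per_class = {c: np.flatnonzero(labels == c) for c in classes}
    chosen = rng.choice(classes, size=n_samples)
    return np.array([rng.choice(per_class[c]) for c in chosen])

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cbd_loss(student_logits, teacher_logits_list, labels, alpha=0.5, T=2.0):
    """Cross-entropy on the labels plus a KL distillation term toward the
    averaged, temperature-softened predictions of a teacher ensemble.
    alpha and T are illustrative hyperparameters."""
    n = len(labels)
    p_student = softmax(student_logits)
    ce = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    # Average the softened predictions of the (frozen) teacher ensemble.
    q = np.mean([softmax(t / T) for t in teacher_logits_list], axis=0)
    p_soft = softmax(student_logits / T)
    kd = np.sum(q * (np.log(q + 1e-12) - np.log(p_soft + 1e-12)), axis=1).mean()
    return (1 - alpha) * ce + alpha * (T ** 2) * kd
```

With a 90/10 class imbalance, `class_balanced_indices` draws the tail class roughly half the time, which is exactly the re-balancing effect the second stage relies on; the ensemble case falls out of simply passing several teachers' logits in `teacher_logits_list`.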

Related research

Self Supervision to Distillation for Long-Tailed Visual Recognition (09/09/2021)
Deep learning has achieved remarkable progress for visual recognition on...

Long-Tailed Classification with Gradual Balanced Loss and Adaptive Feature Generation (02/28/2022)
The real-world data distribution is essentially long-tailed, which poses...

GistNet: a Geometric Structure Transfer Network for Long-Tailed Recognition (05/01/2021)
The problem of long-tailed recognition, where the number of examples per...

Decoupled Training for Long-Tailed Classification With Stochastic Representations (04/19/2023)
Decoupling representation learning and classifier learning has been show...

Semi-supervised Long-tailed Recognition using Alternate Sampling (05/01/2021)
Main challenges in long-tailed recognition come from the imbalanced data...

A novel three-stage training strategy for long-tailed classification (04/20/2021)
The long-tailed distribution datasets poses great challenges for deep le...

Long-Tailed Classification by Keeping the Good and Removing the Bad Momentum Causal Effect (09/28/2020)
As the class size grows, maintaining a balanced dataset across many clas...
