Long-Tailed Visual Recognition via Self-Heterogeneous Integration with Knowledge Excavation

04/03/2023
by   Yan Jin, et al.

Deep neural networks have made remarkable progress in recent decades. However, because real-world data often exhibits a long-tailed distribution, vanilla deep models tend to be heavily biased toward the majority classes. To address this problem, state-of-the-art methods usually adopt a mixture of experts (MoE) in which different experts focus on different parts of the long-tailed distribution. Experts in these methods share the same model depth, which neglects the fact that different classes may prefer to be fit by models of different depths. To this end, we propose a novel MoE-based method called Self-Heterogeneous Integration with Knowledge Excavation (SHIKE). We first propose Depth-wise Knowledge Fusion (DKF), which fuses features from different shallow parts and the deep part of one network for each expert, making the experts more diverse in terms of representation. Based on DKF, we further propose Dynamic Knowledge Transfer (DKT) to reduce the influence of the hardest negative class, which has a non-negligible impact on the tail classes in our MoE framework. As a result, classification accuracy on long-tailed data can be significantly improved, especially for the tail classes. SHIKE achieves state-of-the-art performance of 56.3% on CIFAR100-LT (IF100), and likewise sets state-of-the-art results on ImageNet-LT, iNaturalist 2018, and Places-LT.
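The core idea of DKF can be illustrated with a small sketch: a shared backbone exposes its intermediate (shallow) features, and each expert fuses a different shallow stage with the deepest stage. The code below is a hypothetical toy illustration of this fusion scheme, not the authors' implementation; the backbone, feature shapes, and concatenation-based fusion are all simplifying assumptions.

```python
# Hypothetical sketch of Depth-wise Knowledge Fusion (DKF): each expert
# combines an intermediate (shallow) feature with the final (deep) feature
# of a shared backbone. Names and operations are illustrative only.

def backbone(x):
    """Toy 3-stage backbone; each stage doubles every value.
    Returns the intermediate feature produced by every stage."""
    feats = []
    h = x
    for _ in range(3):
        h = [2 * v for v in h]
        feats.append(h)
    return feats  # feats[-1] is the deepest feature

def dkf_expert(stage_feats, shallow_idx):
    """One expert fuses its assigned shallow stage with the deepest stage.
    Concatenation stands in for the paper's learned fusion module."""
    return stage_feats[shallow_idx] + stage_feats[-1]

x = [1.0, -0.5]
feats = backbone(x)
experts = [dkf_expert(feats, i) for i in range(3)]
# Each expert sees a different mix of shallow and deep information,
# so the ensemble stays diverse even though the backbone is shared.
```

Because every expert draws on a different depth of the shared network, diversity comes from the architecture itself ("self-heterogeneous") rather than from training separate backbones.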


Related research

05/22/2022 · Learning Muti-expert Distribution Calibration for Long-tailed Video Classification
Most existing state-of-the-art video classification methods assume the t...

10/05/2020 · Long-tailed Recognition by Routing Diverse Distribution-Aware Experts
Natural data are often long-tail distributed over semantic classes. Exis...

04/13/2023 · Transfer Knowledge from Head to Tail: Uncertainty Calibration under Long-tailed Distribution
How to estimate the uncertainty of a given model is a crucial problem. C...

01/06/2020 · Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification
In real-world scenarios, data tends to exhibit a long-tailed, imbalanced...

05/05/2023 · Towards Effective Collaborative Learning in Long-Tailed Recognition
Real-world data usually suffers from severe class imbalance and long-tai...

07/20/2021 · Test-Agnostic Long-Tailed Recognition by Test-Time Aggregating Diverse Experts with Self-Supervision
Existing long-tailed recognition methods, aiming to train class-balance ...

11/24/2022 · Minority-Oriented Vicinity Expansion with Attentive Aggregation for Video Long-Tailed Recognition
A dramatic increase in real-world video volume with extremely diverse an...
