Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification

01/06/2020
by Liuyu Xiang, et al.

In real-world scenarios, data tends to exhibit a long-tailed, imbalanced distribution. Developing algorithms to deal with such long-tailed distributions is therefore indispensable in practical applications. In this paper, we propose a novel self-paced knowledge distillation framework, termed Learning From Multiple Experts (LFME). Our method is inspired by the observation that deep Convolutional Neural Networks (CNNs) trained on less imbalanced subsets of the entire long-tailed distribution often yield better performance than their jointly-trained counterparts. We refer to these models as 'Expert Models', and the proposed LFME framework aggregates the knowledge from multiple 'Experts' to learn a unified student model. Specifically, the framework involves two levels of self-paced learning schedules, Self-paced Expert Selection and Self-paced Instance Selection, so that knowledge is adaptively transferred from the multiple 'Experts' to the 'Student'. To verify the effectiveness of the proposed framework, we conduct extensive experiments on two long-tailed benchmark classification datasets. The experimental results demonstrate that our method achieves superior performance compared to state-of-the-art methods, and that it can be easily plugged into state-of-the-art long-tailed classification algorithms for further improvement.
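To make the training objective concrete, the following is a minimal PyTorch sketch of a multi-expert distillation loss of the kind the abstract describes. It is not the authors' reference implementation: the function name lfme_distillation_loss, the temperature T, and the way expert_weights and instance_weights enter the loss are illustrative assumptions. In the paper those weights come from the two self-paced schedules; here they are simply taken as inputs.

import torch
import torch.nn.functional as F

def lfme_distillation_loss(student_logits, expert_logits_list, targets,
                           expert_weights, instance_weights, T=2.0):
    """LFME-style objective (sketch): cross-entropy on ground-truth labels
    plus a temperature-softened KD term aggregated over several experts.

    expert_weights:   one scalar per expert (self-paced expert selection).
    instance_weights: one scalar per sample (self-paced instance selection).
    Both are placeholder inputs; the paper's curriculum schedules that
    produce them are not reproduced here.
    """
    # Hard-label supervision for the student.
    ce = F.cross_entropy(student_logits, targets)

    # Soft-label supervision distilled from each expert.
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    kd = student_logits.new_zeros(())
    for w_e, expert_logits in zip(expert_weights, expert_logits_list):
        p_expert = F.softmax(expert_logits.detach() / T, dim=1)
        # Per-sample KL(expert || student), re-weighted per instance.
        kl = F.kl_div(log_p_student, p_expert, reduction="none").sum(dim=1)
        kd = kd + w_e * (instance_weights * kl).mean() * (T * T)
    return ce + kd

# Toy usage: two experts, uniform expert weights, all instances selected.
student_logits = torch.randn(8, 10)
expert_logits_list = [torch.randn(8, 10), torch.randn(8, 10)]
targets = torch.randint(0, 10, (8,))
loss = lfme_distillation_loss(student_logits, expert_logits_list, targets,
                              expert_weights=[0.5, 0.5],
                              instance_weights=torch.ones(8))

In a full pipeline, each expert would first be trained on one of the less imbalanced subsets of the long-tailed data, and the expert and instance weights would be annealed over the course of student training rather than held fixed as in this toy example.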


Related research

04/22/2021 · Relational Subsets Knowledge Distillation for Long-tailed Retinal Diseases Recognition
In the real world, medical datasets often exhibit a long-tailed data dis...

08/31/2023 · Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts
Graph classification, aiming at learning the graph-level representations...

04/03/2023 · Long-Tailed Visual Recognition via Self-Heterogeneous Integration with Knowledge Excavation
Deep neural networks have made huge progress in the last few decades. Ho...

03/28/2021 · Distilling Virtual Examples for Long-tailed Recognition
In this paper, we tackle the long-tailed visual recognition problem from...

04/13/2021 · Improving Long-Tailed Classification from Instance Level
Data in the real world tends to exhibit a long-tailed label distribution...

06/29/2023 · NCL++: Nested Collaborative Learning for Long-Tailed Visual Recognition
Long-tailed visual recognition has received increasing attention in rece...

05/05/2023 · Towards Effective Collaborative Learning in Long-Tailed Recognition
Real-world data usually suffers from severe class imbalance and long-tai...
