CUDA: Curriculum of Data Augmentation for Long-Tailed Recognition

02/10/2023
by Sumyeong Ahn, et al.

Class imbalance problems frequently occur in real-world tasks, and conventional deep learning algorithms are known to suffer performance degradation on imbalanced training datasets. To mitigate this problem, many approaches balance the given classes by re-weighting or re-sampling training samples. These re-balancing methods increase the impact of minority classes and reduce the influence of majority classes on the output of models. However, the extracted representations may be of poor quality owing to the limited number of minority samples. To address this limitation, several methods have been developed that enrich the representations of minority samples by leveraging the features of majority samples. Despite extensive recent studies, no deep analysis has been conducted on which classes should be augmented and how strongly. In this study, we first investigate the correlation between the degree of augmentation and class-wise performance, and find that a proper degree of augmentation must be allocated to each class to mitigate class imbalance problems. Motivated by this finding, we propose CUDA: CUrriculum of Data Augmentation for long-tailed recognition, a simple and efficient novel curriculum designed to find the appropriate per-class strength of data augmentation. CUDA can simply be integrated into existing long-tailed recognition methods. We present experimental results showing that CUDA achieves better generalization performance than state-of-the-art methods on various imbalanced datasets, including CIFAR-100-LT, ImageNet-LT, and iNaturalist 2018.
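To make the curriculum idea concrete, below is a minimal Python sketch of a per-class augmentation-strength schedule of the kind the abstract describes. The class and method names (CurriculumAugmentation, update, strength_for), the threshold-based update rule, and the toy numbers are illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch: keep one augmentation "strength" level per class and adjust it
# from class-wise accuracy, so each class gets an appropriate degree of augmentation.
# All names and the update rule here are assumptions for illustration.
import random

class CurriculumAugmentation:
    def __init__(self, num_classes, max_strength=10, threshold=0.6):
        self.strength = [0] * num_classes   # current augmentation level per class
        self.max_strength = max_strength    # hardest allowed augmentation level
        self.threshold = threshold          # accuracy needed to raise a class's level

    def update(self, per_class_accuracy):
        """Raise a class's strength when the model handles its current level well,
        lower it otherwise."""
        for c, acc in enumerate(per_class_accuracy):
            if acc >= self.threshold:
                self.strength[c] = min(self.strength[c] + 1, self.max_strength)
            else:
                self.strength[c] = max(self.strength[c] - 1, 0)

    def strength_for(self, label):
        """Sample an augmentation level no harder than the class's current strength."""
        return random.randint(0, self.strength[label])

# Toy usage: poorly learned (minority) classes keep weak augmentation,
# well-learned (majority) classes are pushed toward stronger augmentation.
curriculum = CurriculumAugmentation(num_classes=3)
curriculum.update([0.9, 0.7, 0.3])
print(curriculum.strength)          # -> [1, 1, 0]
print(curriculum.strength_for(0))   # a level in {0, 1}
```

In such a sketch, the sampled level would typically index into an ordered set of augmentation operations (weaker to stronger) applied to each training image of that class before it is fed to the model.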

