Identifying and Compensating for Feature Deviation in Imbalanced Deep Learning
We investigate learning a ConvNet classifier with class-imbalanced data. We find that a ConvNet over-fits significantly to the minor classes, which lack sufficient training instances, even when it is trained with vanilla empirical risk minimization (ERM). We conduct a series of analyses and argue that feature deviation between the training and test instances is the main cause. We propose to incorporate class-dependent temperatures (CDT) in training a ConvNet: CDT forces the minor-class instances to have larger decision values during training, so as to compensate for the effect of feature deviation at test time. We validate our approach on several benchmark datasets and achieve promising results. Our studies further suggest that class-imbalanced data affects traditional machine learning and recent deep learning in very different ways. We hope that our insights inspire new ways of thinking about class-imbalanced deep learning.
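The abstract describes the CDT mechanism only at a high level. The sketch below is one plausible PyTorch instantiation, assuming per-class temperatures of the form a_c = (N_max / N_c)^gamma, so that classes with fewer training instances receive larger temperatures and must therefore produce larger raw decision values to reach the same softmax probability. The names CDTLoss, samples_per_class, and gamma are illustrative, not taken from the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CDTLoss(nn.Module):
    """Cross-entropy with class-dependent temperatures (a sketch).

    Each class logit is divided by a temperature that grows as the
    class's training-set size shrinks, so minor-class instances are
    forced to attain larger decision values during training.
    """

    def __init__(self, samples_per_class, gamma=0.3):
        super().__init__()
        counts = torch.as_tensor(samples_per_class, dtype=torch.float)
        # a_c = (N_max / N_c) ** gamma: equals 1.0 for the largest class
        # and grows for minor classes; gamma is a tunable hyper-parameter.
        self.register_buffer("temperatures", (counts.max() / counts) ** gamma)

    def forward(self, logits, targets):
        # Temperatures are applied only in training; at test time the
        # raw logits are used, compensating for feature deviation.
        return F.cross_entropy(logits / self.temperatures, targets)

# Usage: three classes with 1000, 100, and 10 training instances.
criterion = CDTLoss(samples_per_class=[1000, 100, 10], gamma=0.3)
logits = torch.randn(8, 3)
targets = torch.randint(0, 3, (8,))
loss = criterion(logits, targets)
```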