Distribution Learning Based on Evolutionary Algorithm Assisted Deep Neural Networks for Imbalanced Image Classification

07/26/2022
by Yudi Zhao, et al.

To address the quality-diversity trade-off of generated images in imbalanced classification tasks, we investigate over-sampling methods at the feature level rather than the data level and focus on searching the latent feature space for optimal distributions. On this basis, we propose an iMproved Estimation Distribution Algorithm based Latent featUre Distribution Evolution (MEDA_LUDE) algorithm, in which a joint learning procedure makes the latent features both optimized by deep neural networks and evolved by the evolutionary algorithm. We explore the effect of the Large-margin Gaussian Mixture (L-GM) loss function on distribution learning and design a specialized fitness function based on the similarities among samples to increase diversity. Extensive experiments on benchmark imbalanced datasets validate the effectiveness of the proposed algorithm, which generates images with both quality and diversity. Furthermore, the MEDA_LUDE algorithm is applied in an industrial setting, where it successfully alleviates the class imbalance issue in fabric defect classification.
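As a rough illustration of the distance-based loss the abstract builds on, the sketch below implements a simplified large-margin Gaussian Mixture (L-GM) loss in PyTorch, assuming identity covariances and uniform class priors. The class name, hyperparameter values, and framework choice are illustrative assumptions, not details taken from the paper, and the evolutionary (EDA) component and similarity-based fitness function are not sketched here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LGMLoss(nn.Module):
    """Simplified L-GM loss sketch: class-conditional Gaussians with identity
    covariance and uniform priors; a margin factor (1 + alpha) enlarges the
    distance to the true-class mean before the softmax."""

    def __init__(self, num_classes: int, feat_dim: int,
                 alpha: float = 0.3, lam: float = 0.1):
        super().__init__()
        # Learnable class means in the latent feature space.
        self.means = nn.Parameter(0.1 * torch.randn(num_classes, feat_dim))
        self.alpha = alpha  # margin factor (assumed value)
        self.lam = lam      # weight of the likelihood regularizer (assumed value)

    def forward(self, feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Halved squared Euclidean distance to every class mean: shape (B, K).
        dist = 0.5 * torch.cdist(feats, self.means).pow(2)
        onehot = F.one_hot(labels, num_classes=self.means.size(0)).float()
        # Apply the margin to the true-class distance only.
        logits = -dist * (1.0 + self.alpha * onehot)
        cls_loss = F.cross_entropy(logits, labels)
        # Likelihood term: pull each feature toward its own class mean.
        lkd_loss = (dist * onehot).sum(dim=1).mean()
        return cls_loss + self.lam * lkd_loss

# Usage sketch: feats would be the latent features produced by the backbone.
criterion = LGMLoss(num_classes=10, feat_dim=64)
feats = torch.randn(32, 64)
labels = torch.randint(0, 10, (32,))
loss = criterion(feats, labels)
```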

Related research

G-SMOTE: A GMM-based synthetic minority oversampling technique for imbalanced learning (10/24/2018)
Imbalanced Learning is an important learning algorithm for the classific...

Rethinking Feature Distribution for Loss Functions in Image Classification (03/08/2018)
We propose a large-margin Gaussian Mixture (L-GM) loss for deep neural n...

Multi-loss ensemble deep learning for chest X-ray classification (09/29/2021)
Class imbalance is common in medical image classification tasks, where t...

Max-margin Class Imbalanced Learning with Gaussian Affinity (01/23/2019)
Real-world object classes appear in imbalanced ratios. This poses a sign...

Inducing Neural Collapse in Deep Long-tailed Learning (02/24/2023)
Although deep neural networks achieve tremendous success on various clas...

Pseudo-Feature Generation for Imbalanced Data Analysis in Deep Learning (07/17/2018)
We generate pseudo-features by multivariate probability distributions ob...

RankSim: Ranking Similarity Regularization for Deep Imbalanced Regression (05/30/2022)
Data imbalance, in which a plurality of the data samples come from a sma...
