Progressive Boosting for Class Imbalance

06/05/2017
by Roghayeh Soleymani, et al.

Pattern recognition applications often suffer from skewed data distributions between classes, which may vary during operations with respect to the design data. Two-class classification systems designed using skewed data tend to recognize the majority class better than the minority class of interest. Several data-level techniques have been proposed to alleviate this issue by up-sampling minority samples or under-sampling majority samples. However, random under-sampling may discard informative samples, while adding synthetic positive samples through up-sampling increases training complexity. In this paper, a new ensemble learning algorithm called Progressive Boosting (PBoost) is proposed that progressively inserts uncorrelated groups of samples into a Boosting procedure to avoid loss of information while generating a diverse pool of classifiers. Base classifiers in this ensemble are generated from one iteration to the next using subsets from a validation set that grows gradually in size and imbalance. Consequently, PBoost is more robust to unknown and variable levels of skew in operational data, and has lower computational complexity than Boosting ensembles in the literature. In PBoost, a new loss factor is proposed to avoid biasing performance towards the negative class. Using this loss factor, the weight updates of samples and the contribution of each classifier to the final predictions are set based on the ability to recognize both classes. Using the proposed loss factor instead of standard accuracy can avoid biasing performance in any Boosting ensemble. The proposed approach was validated and compared using synthetic data, videos from the FIA dataset that emulate face re-identification applications, and the KEEL collection of datasets. Results show that PBoost can outperform state-of-the-art techniques in terms of both accuracy and complexity over different levels of imbalance and overlap between classes.
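Since the abstract only outlines the idea, the following minimal Python sketch illustrates the general flavor of progressively inserting groups of majority-class samples into a boosting loop that uses a class-balanced loss factor. It is not the authors' exact PBoost algorithm: the function name `progressive_boost_sketch`, the use of shallow decision trees as base learners, the way negative groups are partitioned, and the specific balanced loss are all assumptions made for illustration.

```python
# Illustrative sketch only: a simplified boosting loop in the spirit of PBoost,
# not the paper's exact algorithm. Partitioning scheme, base learner, and the
# balanced loss factor below are assumptions made for demonstration.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def progressive_boost_sketch(X_pos, X_neg, n_groups=5, rng=None):
    """Train a small ensemble by progressively adding groups of negatives."""
    rng = np.random.default_rng(rng)
    # split the majority (negative) class into disjoint, shuffled groups
    neg_groups = np.array_split(rng.permutation(len(X_neg)), n_groups)

    classifiers, alphas = [], []
    X_seen = X_pos.copy()                      # minority samples are always kept
    y_seen = np.ones(len(X_pos), dtype=int)
    w = np.ones(len(X_pos)) / len(X_pos)       # sample weights

    for group in neg_groups:
        # progressively insert a new, previously unused group of negatives
        X_new = X_neg[group]
        X_seen = np.vstack([X_seen, X_new])
        y_seen = np.concatenate([y_seen, np.zeros(len(X_new), dtype=int)])
        w = np.concatenate([w, np.full(len(X_new), 1.0 / len(X_new))])
        w /= w.sum()

        clf = DecisionTreeClassifier(max_depth=3).fit(X_seen, y_seen, sample_weight=w)
        pred = clf.predict(X_seen)

        # balanced loss factor (assumption): average of the weighted error on
        # each class, so the majority class cannot dominate the update
        err_pos = w[(y_seen == 1) & (pred != 1)].sum() / max(w[y_seen == 1].sum(), 1e-12)
        err_neg = w[(y_seen == 0) & (pred != 0)].sum() / max(w[y_seen == 0].sum(), 1e-12)
        loss = np.clip(0.5 * (err_pos + err_neg), 1e-12, 1 - 1e-12)

        alpha = 0.5 * np.log((1 - loss) / loss)  # AdaBoost-style contribution
        w *= np.exp(alpha * (pred != y_seen))    # up-weight misclassified samples
        w /= w.sum()

        classifiers.append(clf)
        alphas.append(alpha)

    return classifiers, np.array(alphas)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_pos = rng.normal(1.0, 1.0, size=(50, 2))    # minority class
    X_neg = rng.normal(-1.0, 1.0, size=(500, 2))  # majority class
    clfs, alphas = progressive_boost_sketch(X_pos, X_neg, n_groups=5, rng=0)
```

At prediction time, one would combine the base classifiers with a vote weighted by each classifier's alpha, so that classifiers which recognize both classes well contribute more to the final decision; this mirrors the role the abstract assigns to the proposed loss factor, although the combination rule here is again a simplifying assumption.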

