Cost-Sensitive Feature Selection by Optimizing F-Measures

04/04/2019
by Chang Xu, et al.

Feature selection improves the performance of general machine learning tasks by extracting an informative subset from high-dimensional features. Conventional feature selection methods usually ignore the class imbalance problem, so the selected features are biased towards the majority class. Considering that the F-measure is a more reasonable performance measure than accuracy for imbalanced data, this paper presents an effective feature selection algorithm that addresses the class imbalance issue by optimizing F-measures. Since F-measure optimization can be decomposed into a series of cost-sensitive classification problems, we investigate cost-sensitive feature selection by generating and assigning different costs to each class, with rigorous theoretical guidance. After solving a series of cost-sensitive feature selection problems, the features corresponding to the best F-measure are selected. In this way, the selected features fully represent the properties of all classes. Experimental results on popular benchmarks and challenging real-world data sets demonstrate the significance of cost-sensitive feature selection in the imbalanced data setting and validate the effectiveness of the proposed method.
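The loop described in the abstract can be illustrated with a minimal sketch: sweep a set of candidate class-cost assignments, solve a cost-sensitive feature selection subproblem for each, and keep the feature subset that achieves the best F-measure. The sketch below uses an L1-regularized logistic regression as a generic stand-in for the paper's cost-sensitive subproblem; the cost grid, model choice, and function name are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: sweep class-cost assignments, solve a cost-sensitive sparse
# selection subproblem for each, and keep the subset with the best F-measure.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split


def cost_sensitive_feature_selection(X, y, cost_ratios=(1, 2, 5, 10, 20)):
    # Hold out a validation split to score the F-measure of each cost setting.
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0
    )
    best = {"f1": -1.0, "features": None, "cost": None}
    for c in cost_ratios:
        # Assign a higher misclassification cost to the minority (positive) class;
        # the L1 penalty induces sparsity, so nonzero weights act as selected features.
        model = LogisticRegression(
            penalty="l1", solver="liblinear", C=0.1, class_weight={0: 1, 1: c}
        )
        model.fit(X_tr, y_tr)
        selected = np.flatnonzero(model.coef_.ravel())
        if selected.size == 0:
            continue
        f1 = f1_score(y_val, model.predict(X_val))
        if f1 > best["f1"]:
            best = {"f1": f1, "features": selected, "cost": c}
    # Return the cost setting and feature subset with the highest validation F-measure.
    return best
```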
