A GA-like Dynamic Probability Method With Mutual Information for Feature Selection

10/21/2022
by Gaoshuai Wang, et al.

Feature selection plays a vital role in improving classifier performance. However, current methods struggle to identify complex interactions among the selected features. To remove these hidden negative interactions, we propose a GA-like dynamic probability (GADP) method with mutual information that has a two-layer structure. The first layer applies a mutual information method to obtain a primary feature subset. The second layer, the GA-like dynamic probability algorithm, mines further supportive features from these candidate features. The GA-like method is a population-based algorithm, so its working mechanism is similar to that of a GA. Unlike popular works that focus on improving the GA's operators to enhance search ability and reduce convergence time, we abandon the GA's operators altogether and use a dynamic probability, derived from the performance of each chromosome, to determine feature selection in the next generation. The dynamic probability mechanism significantly reduces the number of parameters in the GA, making it easier to use. Because each gene's probability is independent, chromosome diversity in GADP is greater than in a traditional GA, which gives GADP a wider search space and lets it select relevant features more effectively and accurately. To verify our method's superiority, we evaluate it under multiple conditions on 15 datasets. The results demonstrate that the proposed method outperforms the alternatives and generally achieves the best accuracy. We also compare the proposed model with popular heuristic methods such as PSO, FPA, and WOA, over which it still holds an advantage.
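
The abstract only outlines the two-layer idea, so the following is a minimal sketch of how such a pipeline could look, assuming a mutual-information pre-filter and a k-NN classifier as the fitness evaluator. The function names (mi_filter, gadp_select), the population size, the number of generations, and the specific probability-update rule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score


def mi_filter(X, y, k):
    """Layer 1 (assumed): keep the k features with the highest mutual information."""
    mi = mutual_info_classif(X, y, random_state=0)
    return np.argsort(mi)[::-1][:k]


def gadp_select(X, y, n_pop=30, n_gen=50, seed=None):
    """Layer 2 (assumed): GA-like search without crossover/mutation operators.

    Each gene keeps an independent inclusion probability that is pulled
    toward the genes of better-performing chromosomes.
    """
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    prob = np.full(n_feat, 0.5)          # independent per-gene probabilities
    best_mask, best_fit = None, -np.inf

    for _ in range(n_gen):
        # Sample a population of binary chromosomes from the current probabilities.
        pop = rng.random((n_pop, n_feat)) < prob
        pop[~pop.any(axis=1), 0] = True  # avoid empty feature subsets

        # Fitness: cross-validated accuracy of a k-NN classifier on the subset.
        fitness = np.array([
            cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()
            for mask in pop
        ])

        # Track the best chromosome seen so far.
        i_best = fitness.argmax()
        if fitness[i_best] > best_fit:
            best_fit, best_mask = fitness[i_best], pop[i_best].copy()

        # Dynamic probability update (illustrative): weight each chromosome by
        # its fitness so genes present in strong chromosomes become more likely.
        w = fitness - fitness.min() + 1e-12
        w /= w.sum()
        prob = 0.5 * prob + 0.5 * (w @ pop)

    return best_mask, best_fit
```

In this sketch the two layers would be chained by first restricting X to the columns returned by mi_filter and then running gadp_select on that reduced matrix; the key design point mirrored from the abstract is that there are no crossover or mutation operators, only per-gene probabilities updated from chromosome performance.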
