High-Order Conditional Mutual Information Maximization for dealing with High-Order Dependencies in Feature Selection

07/18/2022
by   Francisco Souza, et al.

This paper presents a novel feature selection method based on conditional mutual information (CMI). The proposed High-Order Conditional Mutual Information Maximization (HOCMIM) incorporates high-order dependencies into the feature selection procedure and has a straightforward interpretation due to its bottom-up derivation. HOCMIM is derived from the chain expansion of the CMI and formulated as a maximization problem, which is solved with a greedy search procedure that speeds up the entire feature selection process. Experiments are run on 20 benchmark datasets, and HOCMIM is compared against eighteen state-of-the-art feature selection algorithms using the results of two supervised learning classifiers (Support Vector Machine and K-Nearest Neighbors). HOCMIM achieves the best results in terms of accuracy and is faster than its high-order feature selection counterparts.
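To make the idea concrete, the following is a minimal sketch of greedy feature selection driven by conditional mutual information on discrete data. It is not the paper's HOCMIM algorithm (which uses a chain expansion to capture higher-order terms); it only illustrates the general greedy CMI-maximization loop the abstract refers to, and the helper names (`entropy`, `cmi`, `greedy_cmi_selection`) are illustrative choices, not from the paper.

```python
import numpy as np
from collections import Counter

def entropy(cols):
    # Empirical Shannon entropy (in bits) of the joint distribution
    # of the given list of discrete columns.
    joint = list(zip(*cols))
    counts = Counter(joint)
    n = len(joint)
    probs = np.array([c / n for c in counts.values()])
    return float(-np.sum(probs * np.log2(probs)))

def cmi(x, y, z_cols):
    # Conditional mutual information via the entropy identity:
    # I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z).
    # With no conditioning set, this reduces to plain MI.
    if not z_cols:
        return entropy([x]) + entropy([y]) - entropy([x, y])
    return (entropy([x] + z_cols) + entropy([y] + z_cols)
            - entropy([x, y] + z_cols) - entropy(z_cols))

def greedy_cmi_selection(X, y, k):
    # Greedily pick, at each step, the feature with the largest CMI
    # with the target conditioned on the features already selected.
    selected = []
    while len(selected) < k:
        z_cols = [X[:, j].tolist() for j in selected]
        remaining = [j for j in range(X.shape[1]) if j not in selected]
        best = max(remaining,
                   key=lambda j: cmi(X[:, j].tolist(), y.tolist(), z_cols))
        selected.append(best)
    return selected
```

On a toy dataset where feature 0 equals the target and the other features are uninformative, the greedy loop selects feature 0 first, since its CMI with the target equals the target's full entropy. In practice this plug-in estimator only works for low-dimensional discrete data; the conditioning set grows with each selected feature, which is exactly why methods in this family rely on approximations or, as in HOCMIM, an efficient greedy formulation.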


