Efficient Approximate Solutions to Mutual Information Based Global Feature Selection

06/23/2017
by Hemanth Venkateswara, et al.

Mutual Information (MI) is often used for feature selection when developing classifier models, but estimating the MI for a subset of features is often intractable. We demonstrate that, under assumptions of conditional independence, the MI for a subset of features can be expressed in terms of the Conditional Mutual Information (CMI) between pairs of features. Selecting the features with the highest CMI, however, turns out to be a hard combinatorial problem. In this work, we apply two distinct global methods, the Truncated Power Method (TPower) and Low Rank Bilinear Approximation (LowRank), to solve the feature selection problem. These algorithms provide very good approximations to the NP-hard CMI-based feature selection problem. We experimentally demonstrate the effectiveness of these procedures across multiple datasets and compare them with existing MI-based global and iterative feature selection procedures.
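To illustrate the flavor of the TPower approach described above, the following is a minimal sketch, not the paper's implementation. It assumes the pairwise CMI scores have already been assembled into a symmetric matrix Q, and approximately maximizes the quadratic objective x^T Q x under a sparsity constraint (keep at most k features) by power iteration with hard truncation. The function name `tpower_select`, the diagonal-based initialization, and the toy matrix are all illustrative choices, not from the paper.

```python
import numpy as np

def tpower_select(Q, k, n_iter=50):
    """Truncated power iteration: approximately maximize x^T Q x
    subject to ||x||_0 <= k and ||x||_2 = 1, where Q is a symmetric
    (assumed positive semidefinite) matrix of pairwise feature scores.
    Returns the indices of the k selected features."""

    def truncate(v):
        # Keep the k largest-magnitude entries, zero the rest, renormalize.
        idx = np.argsort(np.abs(v))[-k:]
        t = np.zeros_like(v)
        t[idx] = v[idx]
        n = np.linalg.norm(t)
        return t / n if n > 0 else t

    # Initialize on the k features with the largest individual scores
    # (the diagonal of Q) rather than at random, for determinism.
    x = truncate(np.diag(Q).astype(float))
    for _ in range(n_iter):
        x = truncate(Q @ x)
    return sorted(np.flatnonzero(x).tolist())

# Toy example: features 0-2 score highly both individually and pairwise.
Q = 0.1 * np.eye(8)
Q[np.ix_([0, 1, 2], [0, 1, 2])] += 1.0
print(tpower_select(Q, 3))  # → [0, 1, 2]
```

In practice Q may be shifted by adding a multiple of the identity to make it positive semidefinite before iterating, which does not change the constrained maximizer's support.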

