Simple stopping criteria for information theoretic feature selection

11/29/2018
by Shujian Yu et al.

Information theoretic feature selection aims to select the smallest feature subset that maximizes the mutual information between the selected features and the class labels. Despite the simplicity of this objective, several open problems remain in optimizing it, including the automatic determination of the optimal subset size (i.e., the number of features), or equivalently a stopping criterion when a greedy search strategy is adopted. In this letter, we suggest two stopping criteria that simply monitor the conditional mutual information (CMI) among groups of variables. Using the recently developed multivariate matrix-based Renyi's α-entropy functional, we show that the CMI among groups of variables can be estimated directly, without any decomposition or approximation, making our criteria easy to implement and seamlessly integrable into any existing information theoretic feature selection method that uses a greedy search strategy.
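To make the idea concrete, here is a minimal sketch of greedy forward selection with a CMI-based stopping criterion. Note the assumptions: this sketch uses a simple plug-in (counting) estimator for discrete variables rather than the paper's matrix-based Renyi α-entropy functional, and the threshold `eps`, along with the function names `entropy`, `cmi`, and `greedy_select`, are illustrative choices, not the authors' implementation.

```python
import numpy as np
from collections import Counter

def entropy(rows):
    # Plug-in Shannon entropy (in bits) of joint samples given as tuples.
    counts = np.array(list(Counter(rows).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def cmi(x, y, z_cols):
    # Conditional mutual information via the identity
    # I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z).
    z = list(zip(*z_cols)) if z_cols else [()] * len(x)
    xz = [(xi,) + zi for xi, zi in zip(x, z)]
    yz = [(yi,) + zi for yi, zi in zip(y, z)]
    xyz = [(xi, yi) + zi for xi, yi, zi in zip(x, y, z)]
    return entropy(xz) + entropy(yz) - entropy(xyz) - entropy(z)

def greedy_select(X, y, eps=0.01):
    # Forward selection: at each step add the feature with the highest
    # CMI given the already-selected set; stop when no remaining feature
    # contributes more than eps bits of conditional information.
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        scores = [(cmi(X[:, f].tolist(), list(y),
                       [X[:, s].tolist() for s in selected]), f)
                  for f in remaining]
        best_score, best_f = max(scores)
        if best_score < eps:  # stopping criterion: information has saturated
            break
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```

With a label that is fully determined by feature 0 and an uninformative second feature, the loop selects feature 0 and then stops, since the remaining CMI is zero.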


