Kernel Feature Selection via Conditional Covariance Minimization

07/04/2017
by Jianbo Chen, et al.

We propose a framework for feature selection that employs kernel-based measures of independence to find a subset of covariates that is maximally predictive of the response. Building on past work in kernel dimension reduction, we formulate our approach as a constrained optimization problem involving the trace of the conditional covariance operator, and additionally provide some consistency results. We then demonstrate on a variety of synthetic and real data sets that our method compares favorably with other state-of-the-art algorithms.
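The criterion described above can be illustrated with a small sketch. A standard regularized empirical estimate of the trace of the conditional covariance operator is (up to scaling) eps * Tr[G_Y (G_X + n*eps*I)^(-1)], where G_X and G_Y are centered Gram matrices over the candidate feature subset and the response. The RBF kernel, fixed bandwidth, regularization value, and greedy subset search below are illustrative assumptions for exposition, not the paper's exact algorithm.

```python
import numpy as np

def rbf_gram(Z, sigma=1.0):
    # Centered RBF (Gaussian) Gram matrix for the rows of Z.
    sq = np.sum(Z ** 2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T) / (2.0 * sigma ** 2))
    n = len(Z)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return H @ K @ H

def cond_cov_trace(X_sub, Y, eps=1e-3):
    # Regularized empirical trace of the conditional covariance operator
    # (up to scaling): eps * Tr[G_Y (G_X + n*eps*I)^(-1)].
    # Smaller values mean the selected features are more predictive of Y.
    n = len(Y)
    G_X = rbf_gram(X_sub)
    G_Y = rbf_gram(Y)
    return eps * np.trace(G_Y @ np.linalg.inv(G_X + n * eps * np.eye(n)))

def greedy_select(X, Y, m):
    # Greedy forward selection: repeatedly add the feature whose
    # inclusion yields the smallest trace criterion.
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(m):
        best = min(remaining,
                   key=lambda j: cond_cov_trace(X[:, selected + [j]], Y))
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    # Toy data: the response depends only on features 0 and 1.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    y = (X[:, 0] + X[:, 1] ** 2).reshape(-1, 1)
    print(greedy_select(X, y, 2))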


Related research:

- 12/02/2019 · On Distance and Kernel Measures of Conditional Independence. Measuring conditional independence is one of the important tasks in stat...
- 02/01/2014 · Markov Blanket Ranking using Kernel-based Conditional Dependence Measures. Developing feature selection algorithms that move beyond a pure correlat...
- 02/21/2018 · Learning to Explain: An Information-Theoretic Perspective on Model Interpretation. We introduce instancewise feature selection as a methodology for model i...
- 10/09/2020 · Causal Feature Selection with Dimension Reduction for Interpretable Text Classification. Text features that are correlated with class labels, but do not directly...
- 05/10/2012 · A Generalized Kernel Approach to Structured Output Learning. We study the problem of structured output learning from a regression per...
- 08/22/2019 · Applications of Nature-Inspired Algorithms for Dimension Reduction: Enabling Efficient Data Analytics. In [1], we have explored the theoretical aspects of feature selection an...
- 04/26/2013 · Learning Densities Conditional on Many Interacting Features. Learning a distribution conditional on a set of discrete-valued features...
