Taming Nonconvexity in Kernel Feature Selection—Favorable Properties of the Laplace Kernel

06/17/2021
by Feng Ruan, et al.

Kernel-based feature selection is an important tool in nonparametric statistics. Despite many practical applications of kernel-based feature selection, there is little statistical theory available to support the method. A core challenge is that the objective functions of the optimization problems used to define kernel-based feature selection are nonconvex. The literature has only studied the statistical properties of the global optima, which is a mismatch, given that the gradient-based algorithms available for nonconvex optimization can only guarantee convergence to local minima. Studying the full landscape associated with kernel-based methods, we show that feature selection objectives using the Laplace kernel (and other ℓ_1 kernels) come with statistical guarantees that other kernels, including the ubiquitous Gaussian kernel (and other ℓ_2 kernels), do not possess. Based on a sharp characterization of the gradient of the objective function, we show that ℓ_1 kernels eliminate unfavorable stationary points that appear when using an ℓ_2 kernel. Armed with this insight, we establish statistical guarantees for ℓ_1-kernel-based feature selection that do not require reaching the global minimum. In particular, we establish model-selection consistency of ℓ_1-kernel-based feature selection in recovering main effects and hierarchical interactions in the nonparametric setting with n ∼ log p samples.
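
For intuition, a common way to set up such an objective (a generic sketch, not necessarily the formulation used in the paper) attaches a nonnegative weight w_j to each of the p features inside the kernel, e.g.

    k_w(x, x') = exp(−∑_j w_j |x_j − x'_j|)      (Laplace-type, ℓ_1 kernel)
    k_w(x, x') = exp(−∑_j w_j (x_j − x'_j)^2)    (Gaussian-type, ℓ_2 kernel)

A kernel-based loss is then minimized over w with a gradient method, and the selected features are read off as the support {j : w_j > 0} of the point the algorithm converges to. The Python sketch below illustrates this kind of pipeline with a weighted Laplace kernel and a ridge-type loss; the objective, penalty, and step sizes are illustrative assumptions, not the paper's method.

    # Hypothetical sketch: gradient-based feature selection with a weighted Laplace kernel.
    import numpy as np

    def laplace_kernel(X, w):
        # K[i, j] = exp(-sum_k w_k * |X[i, k] - X[j, k]|)
        D = np.abs(X[:, None, :] - X[None, :, :])
        return np.exp(-(D * w).sum(axis=-1))

    def objective(w, X, y, lam=0.1):
        # Ridge-type in-sample loss plus an l1 penalty on the feature weights.
        K = laplace_kernel(X, w)
        alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
        resid = y - K @ alpha
        return (resid @ resid) / len(y) + lam * w.sum()

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 10))                       # p = 10 candidate features
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=60)

    w, eps, lr = np.full(10, 0.5), 1e-4, 0.2
    for _ in range(200):
        # Finite-difference gradient; the projection keeps the weights nonnegative.
        g = np.array([(objective(w + eps * e, X, y) - objective(w - eps * e, X, y)) / (2 * eps)
                      for e in np.eye(10)])
        w = np.clip(w - lr * g, 0.0, None)

    print("selected features:", np.where(w > 1e-3)[0])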

Related research

A Self-Penalizing Objective Function for Scalable Interaction Detection (11/24/2020)
We tackle the problem of nonparametric variable selection with a focus o...

On the Self-Penalization Phenomenon in Feature Selection (10/12/2021)
We describe an implicit sparsity-inducing mechanism based on minimizatio...

Sparse Feature Selection in Kernel Discriminant Analysis via Optimal Scoring (02/12/2019)
We consider the two-group classification problem and propose a kernel cl...

Feature Selection for Value Function Approximation Using Bayesian Model Selection (01/31/2012)
Feature selection in reinforcement learning (RL), i.e. choosing basis fu...

Efficient Sparse Group Feature Selection via Nonconvex Optimization (05/23/2012)
Sparse feature selection has been demonstrated to be effective in handli...

Learning from DPPs via Sampling: Beyond HKPV and symmetry (07/08/2020)
Determinantal point processes (DPPs) have become a significant tool for ...

Variational Autoencoder Kernel Interpretation and Selection for Classification (09/10/2022)
This work proposed kernel selection approaches for probabilistic classif...
