Sparse Feature Selection in Kernel Discriminant Analysis via Optimal Scoring

02/12/2019
by Alexander F. Lapanowski, et al.

We consider the two-group classification problem and propose a kernel classifier based on the optimal scoring framework. Unlike previous approaches, we provide theoretical guarantees on the expected risk consistency of the method. We also allow for feature selection by imposing structured sparsity using weighted kernels. We propose fully automated methods for selection of all tuning parameters, and in particular adapt kernel shrinkage ideas for ridge parameter selection. Numerical studies demonstrate the superior classification performance of the proposed approach compared to existing nonparametric classifiers.
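
To make the pipeline concrete, here is a minimal sketch of ridge-regularized kernel optimal scoring for two groups, using a weighted Gaussian kernel so that per-feature weights can encode the structured sparsity described above. It is an illustration under assumptions, not the authors' implementation: the 0/1 label coding, the specific weighted-kernel form, and the fixed tuning parameters `lam` and `sigma` are choices made here, and the paper's sparsity penalty on the weights and its automated tuning-parameter selection are omitted.

```python
import numpy as np

def weighted_gaussian_kernel(X, Z, w, sigma=1.0):
    """Gaussian kernel with per-feature weights w >= 0 (assumed form):
    a zero weight removes that feature from every kernel evaluation."""
    Xw, Zw = X * w, Z * w
    sq = ((Xw[:, None, :] - Zw[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def fit_kernel_optimal_scoring(X, y, w, lam=1e-2, sigma=1.0):
    """Two-group kernel optimal scoring with a ridge penalty.
    y is coded 0/1; lam and sigma are fixed here, not auto-tuned."""
    n = len(y)
    n0, n1 = (y == 0).sum(), (y == 1).sum()
    # Optimal scores: zero mean and unit sample variance across groups.
    theta = np.where(y == 0, np.sqrt(n1 / n0), -np.sqrt(n0 / n1))
    K = weighted_gaussian_kernel(X, X, w, sigma)
    # Double-center the training kernel matrix.
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Solve argmin_a (1/n)||theta - Kc a||^2 + lam * a' Kc a.
    alpha = np.linalg.solve(Kc + n * lam * np.eye(n), theta)
    proj = Kc @ alpha
    centroids = np.array([proj[y == 0].mean(), proj[y == 1].mean()])
    return dict(X=X, w=w, sigma=sigma, alpha=alpha,
                K_col_means=K.mean(axis=0), K_mean=K.mean(),
                centroids=centroids)

def predict(model, Xnew):
    """Classify by the nearer class centroid of the discriminant score."""
    Kx = weighted_gaussian_kernel(Xnew, model["X"], model["w"], model["sigma"])
    # Center the cross-kernel consistently with the training kernel.
    Kxc = (Kx - model["K_col_means"][None, :]
              - Kx.mean(axis=1, keepdims=True) + model["K_mean"])
    scores = Kxc @ model["alpha"]
    return (np.abs(scores - model["centroids"][0])
            > np.abs(scores - model["centroids"][1])).astype(int)
```

Setting w = np.ones(p) recovers an ordinary Gaussian kernel, while zeroing an entry of w removes that feature from the fit entirely, which is the mechanism the weighted-kernel sparsity exploits.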

Related research

- 03/30/2021: A General Framework of Nonparametric Feature Selection in High-Dimensional Data
  Nonparametric feature selection in high-dimensional data is an important...

- 06/17/2021: Taming Nonconvexity in Kernel Feature Selection—Favorable Properties of the Laplace Kernel
  Kernel-based feature selection is an important tool in nonparametric sta...

- 06/27/2012: Feature Selection via Probabilistic Outputs
  This paper investigates two feature-scoring criteria that make use of es...

- 03/21/2022: Feature Selection for Vertex Discriminant Analysis
  We revisit vertex discriminant analysis (VDA) from the perspective of pr...

- 01/06/2022: Sparsity-based Feature Selection for Anomalous Subgroup Discovery
  Anomalous pattern detection aims to identify instances where deviation f...

- 10/12/2021: On the Self-Penalization Phenomenon in Feature Selection
  We describe an implicit sparsity-inducing mechanism based on minimizatio...

- 07/12/2020: Simultaneous Feature Selection and Outlier Detection with Optimality Guarantees
  Sparse estimation methods capable of tolerating outliers have been broad...
