Alternating direction method of multipliers for penalized zero-variance discriminant analysis

01/21/2014
by   Brendan P. W. Ames, et al.

We consider the task of classification in the high-dimensional setting where the number of features of the given data is significantly greater than the number of observations. To accomplish this task, we propose a heuristic, called sparse zero-variance discriminant analysis (SZVD), for simultaneously performing linear discriminant analysis and feature selection on high-dimensional data. This method combines classical zero-variance discriminant analysis, where discriminant vectors are identified in the null space of the sample within-class covariance matrix, with penalization applied to induce sparse structures in the resulting vectors. To approximately solve the resulting nonconvex problem, we develop a simple algorithm based on the alternating direction method of multipliers. Further, we show that this algorithm is applicable to a larger class of penalized generalized eigenvalue problems, including a particular relaxation of the sparse principal component analysis problem. Finally, we establish theoretical guarantees for convergence of our algorithm to stationary points of the original nonconvex problem, and empirically demonstrate the effectiveness of our heuristic for classifying simulated data and data drawn from applications in time-series classification.
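The abstract does not spell out the ADMM updates, but the splitting it describes can be illustrated with a small two-class sketch in Python/NumPy: one ADMM block keeps the iterate in the null space of the within-class covariance (the classical zero-variance constraint) on the unit sphere, while the other block handles the ℓ1 penalty through its proximal operator (soft-thresholding). The function name `szvd_sketch`, the simple mean-separation objective, and all parameter values below are illustrative assumptions for exposition, not the authors' exact algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the sparsity-inducing z-update).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def szvd_sketch(X, y, lam=0.1, rho=1.0, iters=200):
    """Two-class toy (not the paper's method): find a sparse unit vector
    in the null space of the within-class scatter that separates the
    class means, via an ADMM splitting x = z where x carries the
    null-space/sphere constraints and z carries the l1 penalty."""
    c0, c1 = np.unique(y)
    W = np.zeros((X.shape[1], X.shape[1]))
    means = []
    for c in (c0, c1):
        Xc = X[y == c]
        means.append(Xc.mean(axis=0))
        D = Xc - means[-1]
        W += D.T @ D                          # within-class scatter
    d = means[1] - means[0]                   # between-class direction
    U, s, _ = np.linalg.svd(W)
    N = U[:, s < 1e-8 * max(s.max(), 1.0)]    # orthonormal basis of null(W)
    P = N @ N.T                               # projector onto null(W)
    x = P @ d
    x /= np.linalg.norm(x)
    z, u = x.copy(), np.zeros_like(x)
    for _ in range(iters):
        # x-update: argmin of -d^T x + (rho/2) * ||x - (z - u)||^2 over
        # {x in null(W), ||x|| = 1}, i.e. a normalized projection.
        v = P @ (d + rho * (z - u))
        nv = np.linalg.norm(v)
        if nv > 1e-12:
            x = v / nv
        # z-update: proximal step for (lam/rho) * ||.||_1.
        z = soft_threshold(x + u, lam / rho)
        # Dual update on the consensus constraint x = z.
        u = u + x - z
    return z
```

On a toy dataset where one feature is constant within each class but differs between classes (so the discriminating direction lies in the null space of W), the returned vector concentrates its weight on that feature. Since the overall problem is nonconvex, this kind of splitting is a heuristic; the paper's contribution includes the convergence guarantees to stationary points that such a scheme lacks by default.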


