Grassmann Iterative Linear Discriminant Analysis with Proxy Matrix Optimization

04/16/2021
by Navya Nagananda, et al.

Linear Discriminant Analysis (LDA) is commonly used for dimensionality reduction in pattern recognition and statistics. It is a supervised method that seeks a reduced-dimensional subspace in which the classes are maximally discriminated, and which can then be used for classification. In this work, we present a Grassmann Iterative LDA method (GILDA) based on Proxy Matrix Optimization (PMO). PMO uses automatic differentiation and stochastic gradient descent (SGD) on the Grassmann manifold to arrive at the optimal projection matrix. Our results show that GILDA outperforms the prevailing manifold optimization method.
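To make the idea concrete, the sketch below illustrates one way such an approach can be set up: an unconstrained proxy matrix is optimized with automatic differentiation and SGD, and a QR factorization maps it to an orthonormal basis, i.e., a point on the Grassmann manifold, while an LDA trace-ratio criterion is maximized. This is only an illustrative sketch under these assumptions, not the authors' implementation; all names (lda_scatter, gilda_sketch, X, y, d, num_steps, lr) are hypothetical placeholders.

```python
# Illustrative sketch (not the paper's code): LDA-style subspace search with
# autodiff + SGD, keeping iterates on the Grassmann manifold via QR retraction.
import torch

def lda_scatter(X, y):
    """Return between-class (Sb) and within-class (Sw) scatter matrices."""
    D = X.shape[1]
    mean_all = X.mean(dim=0)
    Sb = torch.zeros(D, D)
    Sw = torch.zeros(D, D)
    for c in y.unique():
        Xc = X[y == c]
        mc = Xc.mean(dim=0)
        diff = (mc - mean_all).unsqueeze(1)
        Sb += Xc.shape[0] * diff @ diff.T
        Sw += (Xc - mc).T @ (Xc - mc)
    Sw += 1e-6 * torch.eye(D)  # small ridge for numerical stability
    return Sb, Sw

def gilda_sketch(X, y, d, num_steps=200, lr=1e-2):
    """Gradient-based search for a d-dimensional discriminant subspace."""
    Sb, Sw = lda_scatter(X, y)
    # Proxy matrix: an unconstrained parameter whose QR factor gives an
    # orthonormal basis, i.e. a representative of a point on Gr(d, D).
    W = torch.randn(X.shape[1], d, requires_grad=True)
    opt = torch.optim.SGD([W], lr=lr)
    for _ in range(num_steps):
        Q, _ = torch.linalg.qr(W)  # orthonormal representative on the manifold
        # Trace-ratio LDA criterion: maximize between-class over within-class scatter.
        loss = -torch.trace(Q.T @ Sb @ Q) / torch.trace(Q.T @ Sw @ Q)
        opt.zero_grad()
        loss.backward()
        opt.step()
    Q, _ = torch.linalg.qr(W.detach())
    return Q  # projection matrix: X @ Q reduces the data to d dimensions
```

In this toy setup the gradient flows through the QR factorization back to the proxy matrix, so the unconstrained SGD update implicitly moves the point on the Grassmann manifold; the actual PMO procedure in the paper may differ in how the proxy is retracted and in the objective used.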
