Optimal Projections for Gaussian Discriminants

04/07/2020
by David P. Hofmeyr, et al.

The problem of obtaining optimal projections for discriminant analysis with Gaussian class densities is studied. Unlike most existing approaches, the optimisation focuses on the multinomial likelihood based on posterior probability estimates, which directly captures the discriminability of the classes. In addition to classification, the more commonly considered problem in this context, the unsupervised clustering counterpart is also addressed. Finding optimal projections offers utility for dimension reduction and regularisation, as well as instructive visualisation for better model interpretability. Practical applications of the proposed approach show considerable promise for both classification and clustering. Code implementing the proposed method is available as an R package from https://github.com/DavidHofmeyr/OPGD.
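As a rough illustration of the objective described above, the R sketch below projects the data onto a candidate projection matrix, fits Gaussian class densities on the projected sample, and evaluates the multinomial log-likelihood of the class labels under the resulting posterior probabilities. This is not the interface of the OPGD package; the function and variable names (projected_loglik, V, X, y) are illustrative assumptions, and the numerical optimisation shown in the trailing comment is only one plausible way to search for a maximiser.

```r
## Minimal sketch of the projected multinomial likelihood objective.
## Assumed inputs (illustrative, not the OPGD API):
##   X: n x d data matrix, y: factor of class labels, V: d x p projection matrix.
library(mvtnorm)  # for dmvnorm()

projected_loglik <- function(V, X, y) {
  P <- X %*% V                                # n x p projected data
  classes <- levels(y)
  n <- nrow(X)
  dens <- matrix(0, n, length(classes))
  for (k in seq_along(classes)) {
    Pk <- P[y == classes[k], , drop = FALSE]
    ## class prior times Gaussian density fitted to the projected class sample
    dens[, k] <- mean(y == classes[k]) *
      dmvnorm(P, mean = colMeans(Pk), sigma = cov(Pk))
  }
  post <- dens / rowSums(dens)                # posterior class probabilities
  ## multinomial log-likelihood of the observed labels
  sum(log(post[cbind(seq_len(n), as.integer(y))]))
}

## A projection maximising this objective could be sought numerically, e.g.
## with optim() over the entries of V (here for a 2-dimensional projection):
# V0  <- matrix(rnorm(ncol(X) * 2), ncol(X), 2)
# fit <- optim(V0, function(v) -projected_loglik(matrix(v, ncol(X), 2), X, y))
```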
