A Category Space Approach to Supervised Dimensionality Reduction

10/27/2016
by Anthony O. Smith, et al.

Supervised dimensionality reduction has emerged as an important theme in the last decade. Despite the plethora of models and formulations, there is a lack of a simple model that projects the set of patterns into a space defined by the classes (or categories). To this end, we set up a model in which each class is represented as a 1D subspace of the vector space formed by the features. Assuming the number of classes does not exceed the number of features, the model results in multi-class supervised learning in which the features of each class are projected into the corresponding class subspace. Class discrimination is automatically guaranteed by imposing orthogonality on the 1D class subspaces. The resulting optimization problem, formulated as the minimization of a sum of quadratic functions on a Stiefel manifold, is non-convex (due to the constraints), yet has a structure that lets us identify when a global minimum has been reached. After formulating a version with standard inner products, we extend the formulation, together with the optimization approach, to reproducing kernel Hilbert spaces in a straightforward manner. Results and comparisons with multi-class Fisher linear (and kernel) discriminants and principal component analysis (linear and kernel) showcase the relative merits of this approach to dimensionality reduction.
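The core idea of the abstract can be sketched numerically: learn one direction per class, with all directions jointly orthonormal (a point on the Stiefel manifold), so that each class's samples project strongly onto their own direction. The sketch below is an illustrative assumption, not the paper's actual algorithm; it uses plain gradient ascent on the per-class projected energy with a QR retraction to stay on the manifold, and the function name `fit_class_subspaces` and all hyperparameters are hypothetical.

```python
import numpy as np

def fit_class_subspaces(X, y, n_iter=200, lr=0.01, seed=0):
    """Illustrative sketch (not the paper's method): find one direction
    per class (columns of W, jointly orthonormal) that maximizes the
    projected energy of each class's samples onto its own direction,
    i.e. minimizes the within-class reconstruction residual subject to
    the Stiefel (orthonormality) constraint."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    d, k = X.shape[1], len(classes)
    # Per-class scatter matrices S_c = sum_{i in class c} x_i x_i^T,
    # so w_c^T S_c w_c is the energy of class c along direction w_c.
    S = [X[y == c].T @ X[y == c] for c in classes]
    # Random orthonormal initialization via QR: a point on the Stiefel manifold.
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for _ in range(n_iter):
        # Euclidean gradient of sum_c w_c^T S_c w_c, column by column.
        G = np.column_stack([2.0 * S[j] @ W[:, j] for j in range(k)])
        W = W + lr * G
        # QR retraction: project the updated W back onto the Stiefel manifold.
        W, _ = np.linalg.qr(W)
    return W

# Toy usage: two well-separated classes in a 5D feature space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([5, 0, 0, 0, 0], 0.1, (20, 5)),
               rng.normal([0, 5, 0, 0, 0], 0.1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
W = fit_class_subspaces(X, y)
print(np.allclose(W.T @ W, np.eye(2), atol=1e-8))  # columns stay orthonormal
```

Because the constraint set (orthonormal frames) is non-convex, this kind of retraction-based ascent only illustrates the geometry; the paper's contribution is precisely a structure that certifies when a global minimum of the constrained quadratic objective has been reached.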


