Intrinsic Dimension Estimation via Nearest Constrained Subspace Classifier

02/08/2020
by Liang Liao, et al.

We consider the problems of classification and intrinsic dimension estimation on image data. We propose a new subspace-based classifier for supervised classification and intrinsic dimension estimation. The distribution of the data in each class is modeled by a union of a finite number of affine subspaces of the feature space. The affine subspaces share a common dimension, which is assumed to be much smaller than the dimension of the feature space. The subspaces are found by regression based on the L0-norm. The proposed method generalizes the classical NN (Nearest Neighbor) and NFL (Nearest Feature Line) classifiers and is closely related to the NS (Nearest Subspace) classifier. With an accurately estimated dimension parameter, the proposed classifier generally outperforms its competitors in classification accuracy. We also propose a fast version of the classifier that uses a neighborhood representation to reduce computational complexity. Experiments on publicly available datasets corroborate these claims.
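For illustration only, the sketch below shows the generic nearest-affine-subspace idea that this family of classifiers builds on: each class is modeled by a low-dimensional affine subspace (fit here with ordinary PCA, not the paper's L0-norm regression or its union-of-subspaces model), and a test point is assigned to the class whose subspace yields the smallest residual. The class name `NearestAffineSubspaceSketch` and the parameter `dim` are illustrative assumptions, not part of the paper.

```python
import numpy as np


class NearestAffineSubspaceSketch:
    """Toy nearest-affine-subspace classifier (PCA-based sketch, not the paper's method)."""

    def __init__(self, dim=2):
        self.dim = dim        # assumed common subspace dimension
        self.models = {}      # label -> (mean, orthonormal basis rows)

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        for label in np.unique(y):
            Xc = X[y == label]
            mean = Xc.mean(axis=0)
            # Top principal directions span an affine subspace through `mean`.
            _, _, Vt = np.linalg.svd(Xc - mean, full_matrices=False)
            self.models[label] = (mean, Vt[: self.dim])
        return self

    def _residual(self, x, mean, basis):
        # Distance from x to the affine subspace {mean + basis.T @ c}.
        centered = x - mean
        projection = basis.T @ (basis @ centered)
        return np.linalg.norm(centered - projection)

    def predict(self, X):
        X = np.asarray(X, float)
        labels = list(self.models)
        preds = []
        for x in X:
            dists = [self._residual(x, *self.models[lab]) for lab in labels]
            preds.append(labels[int(np.argmin(dists))])
        return np.array(preds)


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Two synthetic classes lying near different 2-D planes in R^10.
    def make_class(n, basis, offset):
        coeffs = rng.normal(size=(n, 2))
        return coeffs @ basis + offset + 0.05 * rng.normal(size=(n, 10))

    B0, B1 = rng.normal(size=(2, 10)), rng.normal(size=(2, 10))
    X = np.vstack([make_class(100, B0, 0.0), make_class(100, B1, 3.0)])
    y = np.array([0] * 100 + [1] * 100)

    clf = NearestAffineSubspaceSketch(dim=2).fit(X, y)
    print("training accuracy:", (clf.predict(X) == y).mean())
```

In the same spirit as the paper's intrinsic dimension estimation, one could sweep `dim` over a range and keep the value that maximizes held-out classification accuracy; the paper's actual estimator and its union-of-subspaces, L0-norm formulation are more involved than this sketch.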


