Classification via local manifold approximation

03/03/2019
by Didong Li, et al.

Classifiers label data as belonging to one of a set of groups based on input features. It is challenging to obtain accurate classification performance when the feature distributions in the different classes are complex, with nonlinear, overlapping and intersecting supports. This is particularly true when training data are limited. To address this problem, this article proposes a new type of classifier based on obtaining a local approximation to the support of the data within each class in a neighborhood of the feature to be classified, and assigning the feature to the class having the closest support. This general algorithm is referred to as LOcal Manifold Approximation (LOMA) classification. As a simple and theoretically supported special case having excellent performance in a broad variety of examples, we use spheres for local approximation, obtaining a SPherical Approximation (SPA) classifier. We illustrate substantial gains for SPA over competitors on a variety of challenging simulated and real data examples.
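The SPA idea described above can be sketched in a few lines: for each class, take the k training points nearest the query, fit a sphere to them by least squares, and assign the query to the class whose fitted sphere it lies closest to. The following is a minimal illustrative sketch, not the authors' implementation; the algebraic sphere fit (Coope's linearization, solving ||x||² = 2x·c + t for the center c and t = r² − ||c||²) and the function names are assumptions for illustration.

```python
import numpy as np

def fit_sphere(X):
    """Least-squares sphere fit to the rows of X (Coope's linearization):
    solve ||x||^2 = 2 x.c + t, where t = r^2 - ||c||^2."""
    A = np.hstack([2 * X, np.ones((len(X), 1))])
    y = (X ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    center, t = sol[:-1], sol[-1]
    radius = np.sqrt(t + center @ center)
    return center, radius

def spa_classify(x, X_train, y_train, k=10):
    """Assign x to the class whose locally fitted sphere is nearest to x."""
    best_label, best_dist = None, np.inf
    for label in np.unique(y_train):
        Xc = X_train[y_train == label]
        # k nearest neighbours of x within this class define the local fit
        idx = np.argsort(((Xc - x) ** 2).sum(axis=1))[: min(k, len(Xc))]
        center, radius = fit_sphere(Xc[idx])
        # distance from x to the sphere's surface
        dist = abs(np.linalg.norm(x - center) - radius)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

On two concentric circles, for example, a query point just outside the inner circle is assigned to the inner class because its locally fitted sphere passes much closer to the query than the outer class's does, even though the two supports would confound a linear classifier.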

Related research

- A SPA-based Manifold Learning Framework for Motor Imagery EEG Data Classification (07/30/2021)
- Inferring Manifolds From Noisy Data Using Gaussian Processes (10/14/2021)
- Predicting Out-of-Domain Generalization with Local Manifold Smoothness (07/05/2022)
- Learning When Training Data are Costly: The Effect of Class Distribution on Tree Induction (06/22/2011)
- Robustness to Label Noise Depends on the Shape of the Noise Distribution in Feature Space (06/02/2022)
- Cost-sensitive Hierarchical Clustering for Dynamic Classifier Selection (12/14/2020)
- Discriminative Features via Generalized Eigenvectors (10/07/2013)
