Principal Polynomial Analysis

01/31/2016
by Valero Laparra, et al.

This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance by means of curves instead of straight lines. In contrast to previous approaches, PPA reduces to performing simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA exhibits a number of interesting analytical properties. First, PPA is a volume-preserving map, which in turn guarantees the existence of the inverse. Second, this inverse can be obtained in closed form. Invertibility is an important advantage over other learning methods, because it makes it possible to interpret the identified features in the input domain, where the data has physical meaning, and to evaluate the performance of dimensionality reduction in sensible (input-domain) units. Volume preservation also allows easy computation of information-theoretic quantities, such as the reduction in multi-information after the transform. Third, the analytical nature of PPA leads to a clear geometrical interpretation of the manifold: it allows the computation of Frenet-Serret frames (local features) and of generalized curvatures at any point of the space. Fourth, the analytical Jacobian allows the computation of the metric induced by the data, thus generalizing the Mahalanobis distance. These properties are demonstrated theoretically and illustrated experimentally. The performance of PPA is evaluated in dimensionality and redundancy reduction on both synthetic and real datasets from the UCI repository.
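To make the construction concrete, below is a minimal, illustrative sketch of a PPA-style forward transform in Python/NumPy. It is not the authors' reference implementation: the function name ppa_fit_transform, the deflation scheme, and the fixed polynomial degree are assumptions made here solely to show how each step reduces to a leading PCA projection followed by a simple univariate polynomial regression on the orthogonal residual.

```python
import numpy as np

def ppa_fit_transform(X, degree=3):
    """Illustrative PPA-style forward transform (a sketch, not the paper's code).

    Per step: (1) take the leading PCA direction of the current data,
    (2) project samples onto it to obtain a scalar coordinate, (3) fit a
    univariate polynomial predicting the orthogonal residual from that
    coordinate, and (4) subtract the prediction so the next step operates
    on the "straightened" residual.
    """
    X = X - X.mean(axis=0)           # center the data
    n, d = X.shape
    coords = np.zeros((n, d))
    model = []                       # per-step (direction, complement basis, poly coefficients)

    for k in range(d - 1):
        # leading principal direction of the current residual data
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        alpha = Vt[0]                # unit vector in the current space
        t = X @ alpha                # scalar coordinate along the (curved) axis
        E = Vt[1:].T                 # basis of the orthogonal complement
        R = X @ E                    # residual coordinates in that complement
        # simple univariate regression: polynomial in t for each residual column
        V = np.vander(t, degree + 1)                 # polynomial design matrix
        W, *_ = np.linalg.lstsq(V, R, rcond=None)    # least-squares fit
        R = R - V @ W                # remove the curved trend (deflation)
        coords[:, k] = t
        model.append((alpha, E, W))
        X = R                        # recurse on the straightened residual

    coords[:, d - 1] = X[:, 0]       # final one-dimensional residual
    return coords, model
```

A hypothetical call would be coords, model = ppa_fit_transform(X, degree=3), where coords[:, 0] plays the role of the first curvilinear component; because every step only fits and subtracts a polynomial of the current scalar coordinate, the stored model can be replayed in reverse to recover the input, mirroring the closed-form invertibility claimed in the abstract.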


