Incomplete Pivoted QR-based Dimensionality Reduction

07/12/2016
by Amit Bermanis et al.

High-dimensional big data appears in many research fields such as image recognition, biology and collaborative filtering. Often, the exploration of such data by classic algorithms encounters difficulties due to the 'curse of dimensionality' phenomenon. Therefore, dimensionality reduction methods are applied to the data prior to its analysis. Many of these methods are based on principal component analysis, which is statistically driven: they map the data into a low-dimensional subspace that preserves significant statistical properties of the high-dimensional data. As a consequence, such methods do not directly address the geometry of the data, reflected by the mutual distances between multidimensional data points. Thus, operations such as classification, anomaly detection or other machine learning tasks may be adversely affected. This work provides a dictionary-based framework for geometrically driven data analysis that includes dimensionality reduction, out-of-sample extension and anomaly detection. It embeds high-dimensional data in a low-dimensional subspace. This embedding preserves the original high-dimensional geometry of the data up to a user-defined distortion rate. In addition, it identifies a subset of landmark data points that constitutes a dictionary for the analyzed dataset. The dictionary enables a natural extension of the low-dimensional embedding to out-of-sample data points, which gives rise to a distortion-based criterion for anomaly detection. The suggested method is demonstrated on synthetic and real-world datasets and achieves good results for classification, anomaly detection and out-of-sample tasks.
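The abstract does not spell out the algorithm, but its core idea, selecting landmark points with an incomplete column-pivoted QR factorization and embedding the data into the subspace those landmarks span, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic dataset, the function name `qr_dictionary`, and the stopping rule (truncate once the residual diagonal of R falls below a user-defined fraction of its leading entry) are all assumptions made for the example.

```python
import numpy as np
from scipy.linalg import qr

# Synthetic data: 500 points in R^50 lying near a 5-dimensional subspace,
# plus small noise. The pivoted QR should recover roughly 5 landmarks.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 50))
X += 0.01 * rng.standard_normal(X.shape)

def qr_dictionary(X, eps):
    """Illustrative sketch: pick landmark rows of X via column-pivoted QR
    on X^T, keeping pivots while |R[k,k]| > eps * |R[0,0]| (the
    user-defined distortion threshold)."""
    # Column pivoting orders the points greedily by residual norm, so the
    # diagonal of R is non-increasing in magnitude.
    Q, R, piv = qr(X.T, mode='economic', pivoting=True)
    d = np.abs(np.diag(R))
    # First index where the residual drops below the threshold.
    k = max(int(np.searchsorted(-d, -eps * d[0])), 1)
    landmarks = piv[:k]
    # Orthonormal basis for the span of the landmark points; embedding the
    # data is then a projection onto that subspace.
    B, _ = np.linalg.qr(X[landmarks].T)   # shape (50, k)
    Y = X @ B                             # low-dimensional coordinates
    return landmarks, B, Y

landmarks, B, Y = qr_dictionary(X, eps=1e-2)
```

Because the embedding is an orthogonal projection onto the landmark span, low-dimensional distances never exceed the original ones, and for data concentrated near a low-dimensional subspace they are preserved up to roughly the noise level; an out-of-sample point is embedded by the same projection `x_new @ B`, and a large residual `norm(x_new - B @ (B.T @ x_new))` flags it as a candidate anomaly.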

