
Perturbation Bounds for Procrustes, Classical Scaling, and Trilateration, with Applications to Manifold Learning

by   Ery Arias-Castro, et al.

One of the common tasks in unsupervised learning is dimensionality reduction, where the goal is to find meaningful low-dimensional structures hidden in high-dimensional data. Sometimes referred to as manifold learning, this problem is closely related to the problem of localization, which aims at embedding a weighted graph into a low-dimensional Euclidean space. Several methods have been proposed for both localization and manifold learning; nonetheless, the robustness properties of most of them are little understood. In this paper, we obtain perturbation bounds for classical scaling and trilateration, which are then applied to derive performance bounds for Isomap, Landmark Isomap, and Maximum Variance Unfolding. A new perturbation bound for Procrustes analysis plays a key role.
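To make the two central procedures concrete, here is a minimal sketch of classical scaling (classical MDS) and orthogonal Procrustes alignment. This is an illustration under simplifying assumptions (exact squared distances, no noise), not the paper's analysis; the function names `classical_scaling` and `procrustes_align` are our own.

```python
import numpy as np

def classical_scaling(D2, d):
    """Classical scaling: embed n points in R^d from an n x n matrix
    D2 of squared pairwise distances."""
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ D2 @ J                      # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]              # keep the top-d eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

def procrustes_align(X, Y):
    """Orthogonal Procrustes: the rotation/reflection R minimizing
    ||X R - Y||_F for centered configurations X and Y."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Example: recover a planar configuration from its pairwise distances.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2))
X -= X.mean(axis=0)                            # center the configuration
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
Xhat = classical_scaling(D2, 2)
R = procrustes_align(Xhat, X)
print(np.allclose(Xhat @ R, X, atol=1e-6))     # True: recovery up to rotation/reflection
```

With exact distances the embedding is recovered exactly up to an orthogonal transform; the paper's perturbation bounds quantify how this degrades when the distance matrix is only approximately correct.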
