
Perturbation Bounds for Procrustes, Classical Scaling, and Trilateration, with Applications to Manifold Learning

10/22/2018
by   Ery Arias-Castro, et al.

One common task in unsupervised learning is dimensionality reduction, where the goal is to find meaningful low-dimensional structure hidden in high-dimensional data. Sometimes referred to as manifold learning, this problem is closely related to localization, which aims to embed a weighted graph into a low-dimensional Euclidean space. Several methods have been proposed for localization, and also for manifold learning; nonetheless, the robustness of most of them is poorly understood. In this paper, we obtain perturbation bounds for classical scaling and trilateration, which are then applied to derive performance bounds for Isomap, Landmark Isomap, and Maximum Variance Unfolding. A new perturbation bound for Procrustes analysis plays a key role.
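To make the two central objects of the abstract concrete, here is a minimal NumPy sketch (not the paper's code) of classical scaling, which recovers a point configuration from pairwise distances, and of orthogonal Procrustes alignment, which finds the rotation/reflection best matching one configuration to another. Function names and the toy data are illustrative assumptions.

```python
import numpy as np

def classical_scaling(D, d):
    """Classical (Torgerson) scaling: double-center the squared distance
    matrix D to estimate a Gram matrix, then embed into R^d using the
    top-d eigenpairs."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # estimated Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]            # keep the d largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def procrustes_align(X, Y):
    """Orthogonal Procrustes: the orthogonal Q minimizing ||X Q - Y||_F,
    assuming both point sets are already centered."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy check: recover a centered planar configuration from its distances,
# then align the (rotation-ambiguous) embedding back to the original.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2))
X -= X.mean(axis=0)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
X_hat = classical_scaling(D, 2)
err = np.linalg.norm(X_hat @ procrustes_align(X_hat, X) - X)
```

With noiseless distances the recovery is exact up to an orthogonal transform, so `err` is numerically zero; the paper's perturbation bounds quantify how this error degrades when `D` is only approximately a Euclidean distance matrix, as with the graph distances used by Isomap.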

Related research

- 05/20/2016: Dimensionality Reduction on SPD Manifolds: The Emergence of Geometry-Aware Methods. Representing images and videos with Symmetric Positive Definite (SPD) ma...
- 11/26/2021: Conditional Manifold Learning. This paper addresses a problem called "conditional manifold learning", w...
- 06/24/2009: On landmark selection and sampling in high-dimensional data analysis. In recent years, the spectral analysis of appropriately defined kernel m...
- 12/19/2019: Bounded Manifold Completion. Nonlinear dimensionality reduction or, equivalently, the approximation o...
- 12/28/2020: Manifold learning with arbitrary norms. Manifold learning methods play a prominent role in nonlinear dimensional...
- 12/03/2018: Measuring the Robustness of Graph Properties. In this paper, we propose a perturbation framework to measure the robust...