How can classical multidimensional scaling go wrong?

10/21/2021
by Rishi Sonthalia, et al.

Given a matrix D describing the pairwise dissimilarities of a data set, a common task is to embed the data points into Euclidean space. The classical multidimensional scaling (cMDS) algorithm is a widespread method for doing this. However, theoretical analysis of the algorithm's robustness, and an in-depth study of its performance on non-Euclidean metrics, has been lacking. In this paper, we derive a formula, based on the eigenvalues of a matrix obtained from D, for the Frobenius norm of the difference between D and the metric D_cmds returned by cMDS. This error analysis leads us to the conclusion that when the derived matrix has a significant number of negative eigenvalues, ‖D - D_cmds‖_F, after initially decreasing, will eventually increase as we increase the dimension. Hence, counterintuitively, the quality of the embedding degrades as the embedding dimension grows. We empirically verify that the Frobenius norm increases with the dimension for a variety of non-Euclidean metrics. We also show on several benchmark datasets that this degradation in the embedding causes the classification accuracy of both simple (e.g., 1-nearest neighbor) and complex (e.g., multi-layer neural net) classifiers to decrease as the embedding dimension increases. Finally, our analysis leads us to a new, efficiently computable algorithm that returns a matrix D_l that is at least as close to the original distances as D_t (the Euclidean metric closest to D in ℓ_2 distance). While D_l is not a metric, when given as input to cMDS instead of D, it empirically yields solutions whose distance to D does not increase with the dimension, and whose classification accuracy degrades less than that of the cMDS solution.
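To make the behavior described in the abstract concrete, here is a minimal sketch of the cMDS pipeline and of how ‖D - D_cmds‖_F can be tracked as the embedding dimension grows. This is not the authors' code: the `cmds` helper, the ℓ_1 (city-block) toy data, and the dimension grid are illustrative assumptions; the construction itself is the standard double-centering and eigendecomposition form of classical MDS.

```python
# Sketch: classical MDS and the Frobenius error ||D - D_cmds||_F vs. dimension.
# The city-block toy data below is a hypothetical non-Euclidean example, not
# the paper's benchmark data.
import numpy as np
from scipy.spatial.distance import pdist, squareform

def cmds(D, dim):
    """Embed points into `dim` dimensions from a dissimilarity matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)             # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:dim]         # top `dim` eigenpairs
    lams = np.clip(vals[idx], 0.0, None)       # negative eigenvalues -> 0
    return vecs[:, idx] * np.sqrt(lams)        # n x dim embedding

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
D = squareform(pdist(X, metric="cityblock"))   # an l1 metric; generally non-Euclidean

for dim in (2, 5, 10, 20, 40):
    emb = cmds(D, dim)
    D_cmds = squareform(pdist(emb))            # Euclidean distances of the embedding
    print(dim, np.linalg.norm(D - D_cmds))     # Frobenius norm ||D - D_cmds||_F
```

On an exactly Euclidean input the printed error is non-increasing in the dimension; per the paper's analysis, when the double-centered matrix B has a substantial negative spectrum (as it typically does for ℓ_1 data like the above), the error can start to grow again after its initial decrease.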
