Mathematical Analysis on Out-of-Sample Extensions

04/19/2018
by Jianzhong Wang, et al.

Let X̄ = X ∪ Z be a data set in R^D, where X is the training set and Z is the test set. Many unsupervised learning algorithms based on kernel methods construct a dimensionality reduction (DR) embedding Φ: X → R^d (d ≪ D) for a given training set, mapping the high-dimensional data X to its low-dimensional feature representation Y = Φ(X). However, these algorithms do not straightforwardly produce a DR of the test set Z. An out-of-sample extension method provides a DR of Z by extending the existing embedding Φ, rather than re-computing a DR embedding for the whole set X̄. Among the various out-of-sample DR extension methods, those based on Nyström approximation are particularly attractive. Many papers have developed such out-of-sample extension algorithms and demonstrated their validity through numerical experiments, but the mathematical theory of the DR extension still needs further development. Using the theory of reproducing kernel Hilbert spaces (RKHS), this paper develops a preliminary mathematical analysis of out-of-sample DR extension operators. It treats an out-of-sample DR extension operator as an extension of the identity operator on the RKHS defined on X; the Nyström-type DR extension then turns out to be an orthogonal projection. The paper also presents conditions under which the DR extension is exact and gives an estimate of the extension error.
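To make the Nyström-type extension concrete, the sketch below implements the classical Nyström out-of-sample formula for a kernel-PCA-style spectral embedding. It is only an illustration of the general technique discussed in the abstract, not the paper's RKHS construction: the Gaussian kernel, the bandwidth sigma, and the helper names (gaussian_kernel, fit_embedding, nystrom_extension) are assumptions made for this example.

import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_embedding(X, d, sigma=1.0):
    """Kernel-PCA-style DR embedding Phi on the training set X.

    Returns the training embedding Y (n x d) together with the top-d
    eigenvectors/eigenvalues of the kernel matrix, which the Nystrom
    extension reuses.
    """
    K = gaussian_kernel(X, X, sigma)          # n x n kernel matrix on X
    eigvals, eigvecs = np.linalg.eigh(K)      # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:d]       # keep the top-d spectrum
    lam, V = eigvals[idx], eigvecs[:, idx]
    Y = V * np.sqrt(lam)                      # training embedding Y = Phi(X)
    return Y, V, lam

def nystrom_extension(Z, X, V, lam, sigma=1.0):
    """Nystrom-type out-of-sample extension of Phi to the test set Z.

    Each test point z is embedded via
        Phi_k(z) = (1 / sqrt(lam_k)) * sum_i k(z, x_i) * V[i, k],
    i.e. the kernel column of z is projected onto the training eigenvectors.
    """
    K_zx = gaussian_kernel(Z, X, sigma)       # m x n cross-kernel between Z and X
    return K_zx @ (V / np.sqrt(lam))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))            # training set in R^D with D = 10
    Z = rng.normal(size=(20, 10))             # test set
    Y, V, lam = fit_embedding(X, d=2, sigma=2.0)
    Y_Z = nystrom_extension(Z, X, V, lam, sigma=2.0)
    print(Y.shape, Y_Z.shape)                 # (200, 2) (20, 2)

A quick consistency check: if a test point coincides with a training point x_j, the formula returns exactly the training embedding Y[j], which mirrors the abstract's view of the extension operator as an extension of the identity on the RKHS defined on X.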


