Mathematical Analysis on Out-of-Sample Extensions

04/19/2018
by Jianzhong Wang et al.

Let X̄=X∪Z be a data set in R^D, where X is the training set and Z is the test set. Many unsupervised learning algorithms based on kernel methods have been developed to provide a dimensionality reduction (DR) embedding Φ: X→R^d (d≪D) for a given training set, mapping the high-dimensional data X to its low-dimensional feature representation Y=Φ(X). However, these algorithms do not straightforwardly produce a DR of the test set Z. An out-of-sample extension method provides the DR of Z by extending the existing embedding Φ, instead of re-computing the DR embedding for the whole set X̄. Among the various out-of-sample DR extension methods, those based on the Nyström approximation are particularly attractive. Many papers have developed such out-of-sample extension algorithms and demonstrated their validity through numerical experiments. However, the mathematical theory of the DR extension still needs further development. Utilizing the theory of reproducing kernel Hilbert spaces (RKHS), this paper develops a preliminary mathematical analysis of out-of-sample DR extension operators. It treats an out-of-sample DR extension operator as an extension of the identity on the RKHS defined on X; the Nyström-type DR extension then turns out to be an orthogonal projection. The paper also presents conditions for exact DR extension and gives an estimate of the extension error.
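To make the Nyström-type extension concrete, the following is a minimal numpy sketch assuming a kernel-eigenmap (kernel-PCA-style) training embedding with a Gaussian kernel; the function names, the kernel choice, and the parameters (gamma, d) are illustrative assumptions, not the paper's construction. Applying the extension to a training point reproduces its training embedding exactly, which mirrors the abstract's view of the extension operator as an extension of the identity on the RKHS.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Pairwise Gaussian (RBF) kernel matrix between rows of A and B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))

def fit_embedding(X, d, gamma=1.0):
    """Kernel-eigenmap training embedding from the top-d eigenpairs of K."""
    K = rbf_kernel(X, X, gamma)
    eigvals, eigvecs = np.linalg.eigh(K)          # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:d]           # keep the d largest eigenpairs
    lam, U = eigvals[idx], eigvecs[:, idx]
    Y = U * np.sqrt(lam)                          # Y = Phi(X); rows are embeddings
    return lam, U, Y

def nystrom_extend(Z, X, lam, U, gamma=1.0):
    """Nystrom out-of-sample extension: Phi(z)_k = k(z, X) @ u_k / sqrt(lam_k)."""
    K_zx = rbf_kernel(Z, X, gamma)
    return (K_zx @ U) / np.sqrt(lam)

# Usage sketch: the extension agrees with Phi on the training set itself,
# illustrating that it extends the identity on the RKHS defined on X.
X = np.random.default_rng(0).normal(size=(200, 10))   # training set in R^10
Z = np.random.default_rng(1).normal(size=(50, 10))    # test set
lam, U, Y = fit_embedding(X, d=3)
Y_Z = nystrom_extend(Z, X, lam, U)                    # DR of the test set
assert np.allclose(nystrom_extend(X, X, lam, U), Y)   # exact on X
```

In this sketch, extending a test point amounts to projecting its kernel section k(z,·) onto the span of the top-d eigenfunctions, which is the orthogonal-projection structure the paper analyzes.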
