Foundations of Coupled Nonlinear Dimensionality Reduction

09/29/2015
by Mehryar Mohri, et al.

In this paper we introduce and analyze the learning scenario of coupled nonlinear dimensionality reduction, which combines two major steps of the machine learning pipeline: projection onto a manifold and subsequent supervised learning. First, we present new generalization bounds for this scenario and, second, we introduce an algorithm that follows from these bounds. The generalization error bound is based on a careful analysis of the empirical Rademacher complexity of the relevant hypothesis set. In particular, we show an upper bound on the Rademacher complexity that is in O(√(Λ_r/m)), where m is the sample size and Λ_r is an upper bound on the Ky-Fan r-norm of the associated kernel matrix. We give both upper and lower bound guarantees in terms of that Ky-Fan r-norm, which strongly justifies the definition of our hypothesis set. To the best of our knowledge, these are the first learning guarantees for the problem of coupled dimensionality reduction. Our analysis and learning guarantees further apply to several special cases, such as that of using a fixed kernel with supervised dimensionality reduction, or that of unsupervised learning of a kernel for dimensionality reduction followed by a supervised learning algorithm. Based on this theoretical analysis, we suggest a structural risk minimization algorithm consisting of the coupled fitting of a low-dimensional manifold and a separation function on that manifold.
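The quantity driving the bound, the Ky-Fan r-norm of a positive semidefinite kernel matrix, is simply the sum of its r largest eigenvalues. The sketch below, which is illustrative and not the paper's algorithm (the function names and the Gaussian-kernel choice are assumptions for the example), computes this norm for a kernel matrix and the resulting O(√(Λ_r/m)) complexity scale:

```python
import numpy as np

def gaussian_kernel_matrix(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def ky_fan_r_norm(K, r):
    # For a PSD matrix, the Ky-Fan r-norm is the sum of the r largest eigenvalues
    eigvals = np.linalg.eigvalsh(K)  # ascending order
    return float(np.sum(eigvals[-r:]))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))          # m = 50 samples in 3 dimensions
K = gaussian_kernel_matrix(X, gamma=0.5)

m = K.shape[0]
lam_r = ky_fan_r_norm(K, r=5)
complexity_scale = np.sqrt(lam_r / m)  # order of the Rademacher complexity bound
```

For r = m the Ky-Fan norm equals the trace of K, so constraining it for small r rewards kernels whose spectral mass concentrates in a low-dimensional subspace, which is exactly the manifold-projection intuition behind the hypothesis set.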


