Statistical Consistency of Kernel PCA with Random Features

06/20/2017
by Bharath Sriperumbudur, et al.

Kernel methods are powerful learning methodologies that provide a simple way to construct nonlinear algorithms from linear ones. Despite their popularity, they suffer from poor scalability in big data scenarios. Various approximation methods, including random feature approximation, have been proposed to alleviate the problem. However, the statistical consistency of most of these approximate kernel methods is not well understood, except for kernel ridge regression, where the random feature approximation has been shown to be not only computationally efficient but also statistically consistent, with a minimax optimal rate of convergence. In this paper, we investigate the efficacy of random feature approximation in the context of kernel principal component analysis (KPCA) by studying the statistical behavior of approximate KPCA. We show that approximate KPCA is either computationally efficient or statistically efficient (i.e., achieves the same convergence rate as KPCA), but not both. In the context of KPCA, therefore, random feature approximation provides computational efficiency at the cost of statistical efficiency.
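To make the setup concrete, here is a minimal numpy sketch of approximate KPCA as studied in this line of work: random Fourier features approximate a Gaussian kernel, and PCA on the resulting feature map stands in for eigendecomposition of the Gram matrix. All data sizes and parameter values (n, d, m, sigma) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n points in d dimensions (illustrative values)
n, d = 200, 5
X = rng.standard_normal((n, d))

# Random Fourier features approximating the Gaussian (RBF) kernel
# k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)); m = number of random features
sigma, m = 1.0, 100
W = rng.standard_normal((d, m)) / sigma
b = rng.uniform(0.0, 2.0 * np.pi, m)
Z = np.sqrt(2.0 / m) * np.cos(X @ W + b)   # n x m feature map; E[Z Z^T] = K

# Approximate KPCA: PCA on the centered random-feature map.
# Works with an m x m covariance instead of the n x n kernel matrix,
# which is the source of the computational savings when m << n.
Zc = Z - Z.mean(axis=0)
cov = Zc.T @ Zc / n
evals, evecs = np.linalg.eigh(cov)          # ascending order
top = evecs[:, ::-1][:, :2]                 # leading 2 principal directions
scores = Zc @ top                           # projections of the data

# Exact KPCA for comparison: eigendecompose the centered Gram matrix
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / (2.0 * sigma**2))
H = np.eye(n) - np.ones((n, n)) / n
kvals = np.linalg.eigvalsh(H @ K @ H)[::-1]        # exact spectrum, descending

# The spectrum of Zc Zc^T approximates that of the centered Gram matrix;
# the paper's question is how fast this approximation converges as m grows.
approx_kvals = np.linalg.eigvalsh(Zc @ Zc.T)[::-1]
print(np.round(kvals[:3], 2), np.round(approx_kvals[:3], 2))
```

The trade-off the paper analyzes is visible here: taking m small makes the eigendecomposition cheap but degrades how well the approximate spectrum and subspace track the exact KPCA solution.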


Related research

- 05/19/2021: Statistical Optimality and Computational Efficiency of Nyström Kernel PCA
  "Kernel methods provide an elegant framework for developing nonlinear lea..."
- 06/18/2012: A Linear Approximation to the chi^2 Kernel with Geometric Convergence
  "We propose a new analytical approximation to the χ^2 kernel that converg..."
- 09/14/2018: Revisiting Random Binning Features: Fast Convergence and Strong Parallelizability
  "Kernel method has been developed as one of the standard approaches for n..."
- 09/21/2014: Approximation errors of online sparsification criteria
  "Many machine learning frameworks, such as resource-allocating networks, ..."
- 02/11/2020: Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features
  "Despite their success, kernel methods suffer from a massive computationa..."
- 06/06/2015: Optimal Rates for Random Fourier Features
  "Kernel methods represent one of the most powerful tools in machine learn..."
- 09/22/2016: Randomized Independent Component Analysis
  "Independent component analysis (ICA) is a method for recovering statisti..."
