Statistically and Computationally Efficient Variance Estimator for Kernel Ridge Regression

09/17/2018 · by Meimei Liu, et al.

In this paper, we propose a random projection approach to estimating the noise variance in kernel ridge regression. Our approach yields a consistent estimator of the true variance while being computationally more efficient than computing the standard estimator with the full kernel matrix. The variance estimator is optimal for a large family of kernels, including cubic splines and Gaussian kernels. A simulation study supports our theory.
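
As a rough illustration of the idea (not the authors' exact estimator), the sketch below fits a sketched kernel ridge regression with a Gaussian random projection and forms a residual-based variance estimate with an effective-degrees-of-freedom correction. The kernel choice, bandwidth, sketch dimension, regularization value, and the specific degrees-of-freedom correction are all assumptions made for illustration.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth=0.2):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Z."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Z**2, axis=1)[None, :]
                - 2.0 * X @ Z.T)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def sketched_krr_variance(X, y, lam=1e-3, sketch_dim=100, bandwidth=0.2, seed=0):
    """Illustrative noise-variance estimate for kernel ridge regression
    using a Gaussian random projection (sketch) of the dual problem.

    Assumptions (not taken from the paper): the estimate is the residual
    sum of squares divided by (n - effective degrees of freedom) of the
    sketched smoother, and the sketch is a dense Gaussian matrix.
    """
    n = len(y)
    rng = np.random.default_rng(seed)

    K = gaussian_kernel(X, X, bandwidth)                          # n x n kernel matrix
    S = rng.normal(size=(sketch_dim, n)) / np.sqrt(sketch_dim)    # s x n sketch
    KS = K @ S.T                                                  # n x s

    # Sketched dual ridge problem:
    #   min_b ||y - K S' b||^2 + n * lam * b' (S K S') b,
    # whose normal equations are (S K^2 S' + n*lam*S K S') b = S K y.
    M = KS.T @ KS + n * lam * (S @ KS)                            # s x s
    b = np.linalg.solve(M, KS.T @ y)
    y_hat = KS @ b

    # Effective degrees of freedom: trace of the sketched hat matrix
    #   A = K S' M^{-1} S K, i.e. tr(M^{-1} S K^2 S').
    df = np.trace(np.linalg.solve(M, KS.T @ KS))

    return np.sum((y - y_hat) ** 2) / (n - df)

# Toy usage on synthetic data with true noise variance 0.5^2 = 0.25.
rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * rng.normal(size=500)
print(sketched_krr_variance(X, y))
```

In this toy example the printed value should fall near the true noise variance of 0.25; the paper's contribution concerns how small the projection dimension can be while keeping such an estimator consistent and optimal.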


Related research

01/29/2021 · Statistical Inference after Kernel Ridge Regression Imputation under item nonresponse
Imputation is a popular technique for handling missing data. We consider...

05/01/2012 · A Randomized Mirror Descent Algorithm for Large Scale Multiple Kernel Learning
We consider the problem of simultaneously learning to linearly combine a...

06/17/2020 · Kernel Alignment Risk Estimator: Risk Prediction from Training Data
We study the risk (i.e., generalization error) of Kernel Ridge Regression...

12/17/2019 · Extrinsic Kernel Ridge Regression Classifier for Planar Kendall Shape Space
Kernel methods have had great success in the statistics and machine learning...

02/17/2018 · Nonparametric Testing under Random Projection
A common challenge in nonparametric inference is its high computational...

04/19/2019 · Risk Convergence of Centered Kernel Ridge Regression with Large Dimensional Data
This paper carries out a large dimensional analysis of a variation of kernel ridge regression...

04/21/2022 · Variance estimation in pseudo-expected estimating equations for missing data
Missing data is a common challenge in biomedical research. This fact, al...
