Debiased Kernel Methods

02/22/2021
by Rahul Singh, et al.

I propose a practical procedure based on bias correction and sample splitting to calculate confidence intervals for functionals of generic kernel methods, i.e., nonparametric estimators learned in a reproducing kernel Hilbert space (RKHS). For example, an analyst may desire confidence intervals for functionals of kernel ridge regression. I propose a bias correction that mirrors kernel ridge regression. The framework encompasses (i) evaluations over discrete domains, (ii) derivatives over continuous domains, (iii) treatment effects of discrete treatments, and (iv) incremental treatment effects of continuous treatments. For the target quantity, whether it is (i)-(iv), I prove root-n consistency, Gaussian approximation, and semiparametric efficiency via finite-sample arguments. I show that the classic assumptions of RKHS learning theory also suffice for valid inference.
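To make the procedure concrete, the following is a minimal sketch of a cross-fitted, bias-corrected estimator for case (iii), the treatment effect of a binary treatment, with kernel ridge regression as the nuisance learner. The doubly robust score, the two-fold scheme, the RBF kernel, and the helper `debiased_ate` are illustrative assumptions for this sketch, not the paper's exact construction.

```python
# Hedged sketch: cross-fitted, bias-corrected ATE estimation in the spirit
# of the abstract. The AIPW-style score and fold scheme are assumptions
# made for illustration, not the paper's exact bias correction.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def debiased_ate(X, D, Y, alpha=1.0, gamma=1.0, n_folds=2, seed=0):
    """Cross-fitted doubly robust ATE with kernel ridge nuisances.

    X: (n, p) covariates, D: (n,) binary treatment, Y: (n,) outcome.
    Returns the point estimate and a 95% confidence interval.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(Y)), n_folds)
    scores = []
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        # Outcome regression gamma(d, x), fit on the training fold only.
        reg = KernelRidge(alpha=alpha, kernel="rbf", gamma=gamma).fit(
            np.column_stack([X[train], D[train]]), Y[train])
        # Propensity pi(x) = P(D=1 | x), also by kernel ridge, clipped
        # away from 0 and 1 to stabilize the correction term.
        prop = KernelRidge(alpha=alpha, kernel="rbf", gamma=gamma).fit(
            X[train], D[train])
        pi = np.clip(prop.predict(X[test]), 0.01, 0.99)
        g1 = reg.predict(np.column_stack([X[test], np.ones(len(test))]))
        g0 = reg.predict(np.column_stack([X[test], np.zeros(len(test))]))
        # Bias-corrected (doubly robust) score on the held-out fold.
        psi = (g1 - g0
               + D[test] * (Y[test] - g1) / pi
               - (1 - D[test]) * (Y[test] - g0) / (1 - pi))
        scores.append(psi)
    psi = np.concatenate(scores)
    est = psi.mean()
    se = psi.std(ddof=1) / np.sqrt(len(psi))
    return est, (est - 1.96 * se, est + 1.96 * se)

if __name__ == "__main__":
    # Synthetic check with a known effect: the true ATE is 2.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 2))
    D = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
    Y = 2.0 * D + X.sum(axis=1) + rng.normal(size=500)
    est, ci = debiased_ate(X, D, Y)
    print(f"ATE estimate {est:.2f}, 95% CI {ci}")
```

Averaging the corrected score over held-out folds is what delivers the root-n confidence interval: the correction term removes the first-order regularization bias of the kernel ridge plug-in, and sample splitting keeps the nuisance estimates independent of the fold on which the score is evaluated.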
