Non-asymptotic Optimal Prediction Error for RKHS-based Partially Functional Linear Models

09/10/2020
by   Xiaoyu Lei, et al.

Under the framework of reproducing kernel Hilbert spaces (RKHS), we consider penalized least-squares estimation for partially functional linear models (PFLM), whose predictor contains both a functional part and a traditional multivariate part, with the multivariate part allowed a divergent number of parameters. From a non-asymptotic point of view, we focus on rate-optimal upper and lower bounds for the prediction error. An exact upper bound for the excess prediction risk is established in non-asymptotic form under a general assumption on the effective dimension of the model, from which we also derive prediction consistency when the number of multivariate covariates p grows slowly with the sample size n. Our new finding implies a trade-off between the number of non-functional predictors and the effective dimension of the kernel principal components needed to ensure prediction consistency in the increasing-dimensional setting. The analysis in our proof hinges on a spectral condition for the sandwich operator formed by the covariance operator and the reproducing kernel, and on concentration inequalities for random elements in Hilbert space. Finally, we derive a non-asymptotic minimax lower bound under a regularity assumption on the Kullback-Leibler divergence between models.
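To make the model concrete, the snippet below is a minimal numerical sketch of penalized least squares for a PFLM, not the paper's procedure: the response is y_i = z_i'β + ∫ x_i(t) f(t) dt + ε_i, the functional predictor is discretized on a grid, the first-order Sobolev kernel k(s,t) = min(s,t) stands in for the reproducing kernel, and the multivariate coefficients β (unpenalized) and the RKHS part (ridge-penalized) are solved jointly. All names, the kernel choice, and the simulated data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, m = 200, 3, 50           # sample size, scalar covariates, grid points

# Simulated data: y_i = z_i' beta + integral of x_i(t) f(t) dt + noise
t = np.linspace(0.0, 1.0, m)
Z = rng.normal(size=(n, p))                 # multivariate part
X = rng.normal(size=(n, m))                 # discretized functional predictors
beta_true = np.array([1.0, -0.5, 2.0])
f_true = np.sin(2 * np.pi * t)
y = Z @ beta_true + (X @ f_true) / m + rng.normal(scale=0.1, size=n)

# Gram matrix induced by the kernel: G_ij approximates the double integral
# of x_i(s) k(s, u) x_j(u) ds du, with k(s, u) = min(s, u) (Sobolev-1 kernel).
Kgrid = np.minimum.outer(t, t)              # m x m kernel matrix on the grid
G = (X @ Kgrid @ X.T) / m**2                # n x n Gram matrix

# Penalized least squares: minimize ||y - Z b - G c||^2 + n*lam * c' G c,
# where the representer theorem writes the functional fit as G c.
lam = 1e-3
A = np.block([[Z.T @ Z,           Z.T @ G],
              [G @ Z,   G @ G + n * lam * G]])
rhs = np.concatenate([Z.T @ y, G @ y])
sol = np.linalg.lstsq(A, rhs, rcond=None)[0]
beta_hat, c_hat = sol[:p], sol[p:]

print("beta_hat:", np.round(beta_hat, 2))   # should be near (1.0, -0.5, 2.0)
```

The β block is left unpenalized while only the RKHS component is shrunk, mirroring the semiparametric structure of the model; the trade-off discussed in the abstract shows up here as the interplay between p and how fast the eigenvalues of G decay.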


