
Optimal prediction for kernel-based semi-functional linear regression

by Keli Guo et al.

In this paper, we establish minimax optimal rates of convergence for prediction in a semi-functional linear model that consists of a functional component and a less smooth nonparametric component. Our results reveal that the smoother functional component can be learned at the minimax rate as if the nonparametric component were known. More specifically, a double-penalized least squares method is adopted to estimate both the functional and nonparametric components within the framework of reproducing kernel Hilbert spaces. By virtue of the representer theorem, an efficient algorithm that requires no iterations is proposed to solve the corresponding optimization problem, with the regularization parameters selected by the generalized cross-validation criterion. Numerical studies demonstrate the effectiveness of the method and corroborate the theoretical analysis.
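The non-iterative estimator described above can be illustrated with a minimal sketch. By the representer theorem, both components are expansions over kernel sections, so the double-penalized least squares criterion ||y - K1*a - K2*b||^2 + lam1*a'K1*a + lam2*b'K2*b reduces to one linear system in the coefficient vectors. The kernel choices below (a linear integral kernel for the functional part, a Gaussian kernel for the nonparametric part), the simulated data, and the fixed regularization parameters are illustrative assumptions, not the paper's exact setup; GCV tuning is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (illustrative, not the paper's design): n curves X_i
# observed on a grid of p points, plus a scalar covariate Z_i.
n, p = 100, 50
t = np.linspace(0, 1, p)
X = rng.standard_normal((n, p)).cumsum(axis=1) / np.sqrt(p)  # rough random curves
Z = rng.uniform(0, 1, n)
beta = np.sin(2 * np.pi * t)                # true slope function (assumed)
g_true = np.cos(2 * np.pi * Z)              # true nonparametric part (assumed)
y = X @ beta / p + g_true + 0.1 * rng.standard_normal(n)

# Gram matrices: linear (integral) kernel for the functional component,
# Gaussian kernel for the nonparametric component -- both are assumptions.
K1 = X @ X.T / p
K2 = np.exp(-((Z[:, None] - Z[None, :]) ** 2) / (2 * 0.1 ** 2))

lam1, lam2 = 1e-2, 1e-2  # in practice chosen by generalized cross-validation

# Stationarity of the double-penalized criterion
#   ||y - K1 a - K2 b||^2 + lam1 a'K1 a + lam2 b'K2 b
# is satisfied by any solution of the block linear system below,
# so the estimator is obtained in one solve, with no iterations.
A = np.block([[K1 + lam1 * np.eye(n), K2],
              [K1, K2 + lam2 * np.eye(n)]])
ab = np.linalg.solve(A, np.concatenate([y, y]))
a, b = ab[:n], ab[n:]

fitted = K1 @ a + K2 @ b  # functional fit + nonparametric fit
r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
print("in-sample R^2:", r2)
```

Subtracting the two block equations shows that lam1*a = lam2*b at the solution, so the system is nonsingular whenever K1 + lam1*I is positive definite, even if the Gram matrices themselves are rank-deficient.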

