Adaptive Ridge-Penalized Functional Local Linear Regression

09/17/2021
by Wentian Huang, et al.

We introduce an original method of multidimensional ridge penalization for functional local linear regression. Nonparametric regression of functional data extends its multivariate counterpart and is known to be sensitive to the choice of J, the dimension of the projection subspace of the data. In the multivariate setting, a roughness penalty helps reduce variance; however, among the limited works treating roughness penalties in the functional setting, most use only a single scalar tuning parameter. Our approach proposes a class of data-adaptive ridge penalties, in which the model automatically adjusts the structure of the penalty to the data set. The penalty has J free parameters, allowing a different roughness penalty level for each of the J basis directions, and the optimal tuning parameters, those minimizing the estimated mean squared error (MSE) of prediction, can be found by quadratic programming. The method's gains in prediction accuracy and variance reduction with finite data are demonstrated through multiple simulation scenarios and two real-data examples, and its asymptotic performance is derived and compared to that of unpenalized functional local linear regression.
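To make the idea concrete, here is a minimal sketch, not the authors' implementation, of a ridge-penalized local linear fit on projection scores. It assumes the functional predictors have already been projected onto a J-dimensional basis (e.g. FPCA scores), and the per-direction penalties `lams` are supplied rather than tuned; in the paper they would be chosen by quadratic programming to minimize the estimated MSE of prediction. The function name and arguments are illustrative.

```python
import numpy as np

def local_linear_ridge(scores, y, x0, h, lams):
    """Ridge-penalized local linear estimate at x0 (illustrative sketch).

    scores : (n, J) projection scores of the functional predictors
    y      : (n,) scalar responses
    x0     : (J,) evaluation point in score space
    h      : kernel bandwidth
    lams   : (J,) per-direction ridge penalties (assumed given, not tuned here)
    """
    n, J = scores.shape
    d = scores - x0                       # scores centered at the target point
    # Gaussian kernel weights on the distance in score space
    w = np.exp(-0.5 * (np.linalg.norm(d, axis=1) / h) ** 2)
    X = np.hstack([np.ones((n, 1)), d])   # local linear design: intercept + slopes
    W = np.diag(w)
    # Penalize only the J slope coordinates, with one lambda per direction;
    # the intercept (the fitted value at x0) is left unpenalized.
    P = np.diag(np.concatenate([[0.0], lams]))
    beta = np.linalg.solve(X.T @ W @ X + P, X.T @ W @ y)
    return beta[0]                        # intercept = fitted value at x0

# Usage: a linear toy model in J = 3 score dimensions
rng = np.random.default_rng(0)
S = rng.normal(size=(200, 3))
y = S @ np.array([1.0, -0.5, 0.2]) + 0.1 * rng.normal(size=200)
yhat = local_linear_ridge(S, y, np.zeros(3), h=1.0, lams=np.array([0.1, 0.1, 0.1]))
```

Setting all entries of `lams` to a single common value recovers the scalar-penalty schemes the abstract contrasts with; the data-adaptive structure is precisely the freedom to set these J values separately.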
