
Weighted Orthogonal Components Regression Analysis

by Xiaogang Su, et al.
The University of Texas at El Paso

In the multiple linear regression setting, we propose a general framework, termed weighted orthogonal components regression (WOCR), which encompasses many known methods as special cases, including ridge regression and principal components regression. WOCR exploits the monotonicity inherent in orthogonal components to parameterize the weight function. This formulation allows for efficient determination of tuning parameters and is therefore computationally advantageous. Moreover, WOCR offers insights for deriving new, better-performing variants. Specifically, we advocate weighting components by their correlations with the response, which leads to enhanced predictive performance. Both simulation studies and real data examples are provided to assess and illustrate the advantages of the proposed methods.
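The idea of weighting orthogonal components can be sketched in a few lines of NumPy. The snippet below is a minimal illustration, not the authors' implementation: it obtains orthogonal components via the SVD, computes each component's correlation with the response, and shrinks the component-wise least-squares coefficients with a logistic weight function of the absolute correlation (the logistic form and its `gamma` and `cut` parameters are assumptions for illustration; the paper's actual weight parameterization may differ). Note that constant weights of the form s_j^2/(s_j^2 + lambda) would recover ridge regression, and 0/1 weights would recover principal components regression, which is the sense in which both are special cases.

```python
import numpy as np

def wocr_fit(X, y, gamma=5.0, cut=0.3):
    """Hypothetical WOCR-style fit: weight orthogonal components by
    a monotone function of their correlation with the response."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    # Orthogonal components via the thin SVD: Xc = U S V^T
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = U * S  # component scores; columns are mutually orthogonal
    # Absolute correlation of each component with the response
    r = np.array([abs(np.corrcoef(Z[:, j], yc)[0, 1]) for j in range(Z.shape[1])])
    # Monotone weight in |r| -- a logistic sketch (assumed form, not from the paper)
    w = 1.0 / (1.0 + np.exp(-gamma * (r - cut)))
    # Component-wise OLS coefficients, shrunk by the weights,
    # then mapped back to the original predictor space
    alpha = (Z.T @ yc) / S**2
    beta = Vt.T @ (w * alpha)
    return beta, y_mean - x_mean @ beta

# Usage on simulated data
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, -1.0, 0.0, 0.5]) + rng.normal(scale=0.5, size=100)
beta_hat, intercept = wocr_fit(X, y)
```

Because the components are orthogonal, the coefficient of each one can be computed and shrunk independently, which is what makes tuning-parameter selection cheap in this family of methods.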
