Cholesky-based multivariate Gaussian regression

02/26/2021
by Thomas Muschinski, et al.

Multivariate Gaussian regression is embedded into a general distributional regression framework, with flexible additive predictors determining all distributional parameters. While this is relatively straightforward for the means of the multivariate dependent variable, it is more challenging for the full covariance matrix Σ due to two main difficulties: (i) ensuring the positive-definiteness of Σ and (ii) regularizing the high model complexity. Both challenges are addressed by parameterizing Σ via its basic or its modified Cholesky decomposition, respectively. Unlike a decomposition into variances and a correlation matrix, the Cholesky decomposition guarantees positive-definiteness for any predictor values, regardless of the distributional dimension. All distributional parameters can therefore be linked to flexible predictors without the joint constraints that would substantially complicate other parameterizations. Moreover, this approach permits regularization of the flexible additive predictors through penalized maximum likelihood or Bayesian estimation, as for other distributional regression models. Finally, the Cholesky decomposition allows the number of parameters to be reduced when the components of the multivariate dependent variable have a natural order (typically time) and a maximum lag can be assumed for the dependencies among the components.
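The key point of the parameterization can be illustrated with a minimal sketch (not the authors' implementation): any unconstrained real-valued parameter vector, such as one produced by additive predictors, is mapped to a valid covariance matrix by filling a Cholesky factor whose diagonal is made positive via exp. The function name and parameter layout below are illustrative assumptions.

```python
import numpy as np

def cov_from_unconstrained(theta, d):
    """Map an unconstrained parameter vector to a positive-definite
    d x d covariance matrix via its Cholesky factor L.

    Illustrative layout: the first d entries of theta are the logs of
    the (strictly positive) diagonal of L; the remaining d*(d-1)/2
    entries fill the strict lower triangle. Sigma = L @ L.T is then
    positive definite for any real-valued theta, so no joint
    constraints on the predictors are needed.
    """
    L = np.zeros((d, d))
    L[np.diag_indices(d)] = np.exp(theta[:d])   # positive diagonal
    L[np.tril_indices(d, k=-1)] = theta[d:]     # unconstrained off-diagonals
    return L @ L.T

# Any predictor-driven theta yields a valid covariance:
rng = np.random.default_rng(0)
d = 3
theta = rng.normal(size=d + d * (d - 1) // 2)
Sigma = cov_from_unconstrained(theta, d)
assert np.all(np.linalg.eigvalsh(Sigma) > 0)  # positive definite
```

Under a maximum-lag assumption for naturally ordered components, the same construction reduces the parameter count by simply fixing the off-diagonal entries of L beyond that lag to zero.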

