Unconstrained representation of orthogonal matrices with application to common principal components
Many statistical problems involve the estimation of a (d × d) orthogonal matrix Q. Such estimation is often challenging due to the orthonormality constraints on Q. To cope with this problem, we propose a very simple decomposition of orthogonal matrices, which we abbreviate as the PLR decomposition. It produces a one-to-one correspondence between Q and a (d × d) unit lower triangular matrix L whose d(d-1)/2 entries below the diagonal are unconstrained real values. Once the decomposition is applied, regardless of the objective function under consideration, any classical unconstrained optimization method can be used to find the minimum (or maximum) of the objective function with respect to L. For illustrative purposes, we apply the PLR decomposition in common principal components analysis (CPCA) for the maximum likelihood estimation of the common orthogonal matrix when a multivariate leptokurtic-normal distribution is assumed in each group. Compared to the commonly used normal distribution, the leptokurtic-normal has an additional parameter governing the excess kurtosis; this makes the estimation of Q in CPCA more robust against mild outliers. The usefulness of the PLR decomposition in leptokurtic-normal CPCA is illustrated by two biometric data analyses.
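As an illustrative sketch only (not necessarily the paper's exact construction), the correspondence described above can be mimicked in a few lines of NumPy: the d(d-1)/2 unconstrained values fill the strictly lower triangle of a unit lower triangular matrix L, and an orthogonal matrix is then recovered as Q = L R, with R upper triangular chosen so that Q is orthogonal (equivalently, the orthogonal factor of a QR factorization of L, with a sign convention to make the map well defined). The helper names below are hypothetical.

```python
import numpy as np

def unit_lower_triangular(theta, d):
    """Build a d x d unit lower triangular matrix L whose d(d-1)/2
    strictly-lower-triangular entries are the unconstrained values theta."""
    L = np.eye(d)
    L[np.tril_indices(d, k=-1)] = np.asarray(theta, dtype=float)
    return L

def orthogonal_from_params(theta, d):
    """Map unconstrained parameters to an orthogonal matrix.

    Sketch: Q is the orthogonal factor of a QR factorization of L,
    i.e. Q = L R with R upper triangular; column signs are fixed so
    the triangular factor has a positive diagonal, making the map
    well defined (an assumed convention, not taken from the paper)."""
    L = unit_lower_triangular(theta, d)
    Q, R = np.linalg.qr(L)
    Q = Q * np.sign(np.diag(R))  # flip column signs where diag(R) < 0
    return Q

# Example: d = 3, so three free parameters
d = 3
theta = np.array([0.5, -1.2, 0.3])
Q = orthogonal_from_params(theta, d)
print(np.allclose(Q.T @ Q, np.eye(d)))  # True: Q is orthogonal
```

Under this sketch, any unconstrained optimizer (e.g., quasi-Newton methods) can be run directly on theta, with Q reconstructed inside the objective. Conversely, for an orthogonal matrix admitting an LU factorization without pivoting, L is recoverable as its unit lower triangular factor; how the full PLR decomposition handles pivoting (the "P" factor) is not detailed in the abstract.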