Variations of Orthonormal Basis Matrices of Subspaces
An orthonormal basis matrix X of a subspace 𝒳 is known not to be unique unless certain normalization conditions are imposed. One such condition is to require that X^T D be positive semi-definite, where D is a constant matrix of apt size. This condition arises naturally in multi-view subspace learning models, in which X serves as a projection matrix and is determined by a maximization problem over the Stiefel manifold whose objective function contains, and increases with, tr(X^T D). This paper is concerned with bounding the change in the orthonormal basis matrix X as the subspace 𝒳 varies, under the requirement that X^T D stay positive semi-definite. The results are useful in the convergence analysis of the NEPv approach (nonlinear eigenvalue problem with eigenvector dependency) for solving the maximization problem.
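As a concrete illustration (not taken from the paper), the following NumPy sketch shows one standard way such a normalization can be realized: given any orthonormal basis matrix X of a subspace and a matrix D of compatible size, multiplying X on the right by the orthogonal polar factor of X^T D yields another orthonormal basis of the same subspace whose product with D is symmetric positive semi-definite. The sizes n, k and the name normalize_basis are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def normalize_basis(X, D):
    """Return an orthonormal basis of span(X) whose product with D is PSD.

    If X^T D = U @ diag(s) @ Vt is an SVD, then Q = U @ Vt is orthogonal and
    (X @ Q)^T D = Vt.T @ diag(s) @ Vt, which is symmetric positive semi-definite.
    """
    U, s, Vt = np.linalg.svd(X.T @ D)
    Q = U @ Vt                 # orthogonal polar factor of X^T D
    return X @ Q               # still orthonormal, spans the same subspace

# Small demo with a random subspace and a random D (hypothetical sizes).
rng = np.random.default_rng(0)
n, k = 8, 3
X, _ = np.linalg.qr(rng.standard_normal((n, k)))   # orthonormal basis of a random subspace
D = rng.standard_normal((n, k))

Xn = normalize_basis(X, D)
M = Xn.T @ D
print(np.allclose(Xn.T @ Xn, np.eye(k)))           # Xn is still orthonormal
print(np.allclose(M, M.T))                         # Xn^T D is symmetric
print(np.linalg.eigvalsh((M + M.T) / 2).min() >= -1e-12)  # and positive semi-definite
```

When X^T D is nonsingular, this normalized basis is unique, which is what makes the condition a usable normalization in the maximization problem described above.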