Operator-valued formulas for Riemannian Gradient and Hessian and families of tractable metrics in optimization and machine learning
We provide an explicit formula for the Levi-Civita connection and Riemannian Hessian when the tangent space at each point of a Riemannian manifold is embedded in an inner product space with a non-constant metric. Together with a classical formula for projection, this allows us to evaluate the Riemannian gradient and Hessian for several families of metrics extending existing ones on classical manifolds: a family of metrics on Stiefel manifolds connecting both the constant and canonical ambient metrics, with closed-form geodesics; a family of quotient metrics on a manifold of positive-semidefinite matrices of fixed rank, considered as a quotient of a product of a Stiefel manifold and a positive-definite matrix manifold with affine-invariant metrics; and a large family of new metrics on flag manifolds. We show that, in many instances, this method allows us to apply symbolic calculus to derive formulas for the Riemannian gradient and Hessian. The method greatly extends the list of potential metrics that could be used in manifold optimization and machine learning.
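To give a flavor of the gradient computations the abstract refers to, the following is a minimal numpy sketch of Riemannian gradients on a Stiefel manifold at the two endpoint metrics mentioned (the constant embedded metric and the canonical metric). The formulas used are the standard textbook projections for these two metrics, not the paper's operator-valued formulas for the general metric family; all names are illustrative.

```python
import numpy as np

def sym(a):
    """Symmetric part of a square matrix."""
    return 0.5 * (a + a.T)

def stiefel_grad_embedded(y, egrad):
    """Riemannian gradient under the constant (embedded Euclidean) metric:
    orthogonal projection of the ambient gradient onto the tangent space at y."""
    return egrad - y @ sym(y.T @ egrad)

def stiefel_grad_canonical(y, egrad):
    """Riemannian gradient under the canonical metric on the Stiefel manifold."""
    return egrad - y @ egrad.T @ y

rng = np.random.default_rng(0)
n, p = 6, 3
# a random point on St(n, p): y.T @ y = I
y, _ = np.linalg.qr(rng.standard_normal((n, p)))

# illustrative cost f(Y) = 0.5 * ||Y - A||_F^2 with ambient gradient Y - A
a = rng.standard_normal((n, p))
egrad = y - a

g_e = stiefel_grad_embedded(y, egrad)
g_c = stiefel_grad_canonical(y, egrad)

# both gradients are tangent at y: Y^T xi + xi^T Y = 0
print(np.allclose(y.T @ g_e + g_e.T @ y, 0, atol=1e-10))
print(np.allclose(y.T @ g_c + g_c.T @ y, 0, atol=1e-10))
```

The tangency check at the end verifies that both candidate gradients satisfy the defining constraint of the tangent space at `y`, which is the minimal sanity check one would run before plugging such formulas into a Riemannian optimizer.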