Multilevel Riemannian optimization for low-rank problems

05/14/2020
by Marco Sutti, et al.

Large-scale optimization problems arising from the discretization of PDE-based problems sometimes admit solutions that can be well approximated by low-rank matrices. In this paper, we exploit this low-rank approximation property by solving the optimization problem directly over the set of low-rank matrices. In particular, we introduce a new multilevel algorithm in which the optimization variable is constrained to the Riemannian manifold of fixed-rank matrices. In contrast to most other multilevel low-rank algorithms, where the rank is chosen adaptively on each level, we keep the ranks (and thus the computational complexity) fixed throughout the iterations. Furthermore, classical implementations of line searches based on the Wolfe conditions limit the attainable numerical accuracy to about the square root of the machine epsilon. Here, we propose an extension to Riemannian manifolds of the line search of Hager and Zhang, which uses approximate Wolfe conditions that allow computing a solution with accuracy on the order of the machine epsilon. Numerical experiments demonstrate the computational efficiency of the proposed framework.
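The accuracy claim above rests on a derivative-only test: near a minimizer, the classical sufficient-decrease condition subtracts nearly equal function values and loses about half the significant digits, whereas the Hager-Zhang approximate Wolfe conditions compare only directional derivatives. The sketch below (a hypothetical Euclidean simplification, not the paper's Riemannian implementation; the function name and default parameters are illustrative) shows the test for a step size `a` along a descent direction, where `phi(a) = f(x + a*d)`:

```python
def approximate_wolfe(phi0, dphi0, phi_a, dphi_a,
                      delta=0.1, sigma=0.9, eps=1e-6):
    """Check the Hager-Zhang approximate Wolfe conditions at step size a.

    phi0, dphi0  -- value and directional derivative of phi at a = 0
    phi_a, dphi_a -- value and directional derivative of phi at the trial step
    delta, sigma -- Wolfe parameters with 0 < delta < 1/2, delta <= sigma < 1
    eps          -- tolerance for the function-value safeguard

    Classical sufficient decrease, phi(a) <= phi(0) + delta*a*dphi(0),
    cancels catastrophically near a minimizer.  Hager and Zhang replace it
    with a derivative-only interval test, applied when phi(a) <= phi(0) + eps:
        (2*delta - 1) * dphi(0) >= dphi(a) >= sigma * dphi(0)
    """
    if phi_a > phi0 + eps:  # safeguard: step must not increase phi much
        return False
    return (2.0 * delta - 1.0) * dphi0 >= dphi_a >= sigma * dphi0
```

On a simple quadratic `phi(a) = (a - 1)^2`, a step extremely close to the minimizer `a = 1` passes this test even though the function values `phi(0)` and `phi(a)` would be indistinguishable to the classical test at that resolution.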


