
Local convergence of alternating low-rank optimization methods with overrelaxation

11/29/2021
by   Ivan V. Oseledets, et al.

The local convergence of alternating optimization methods with overrelaxation for low-rank matrix and tensor problems is established. The analysis is based on the linearization of the method, which takes the form of an SOR iteration for a positive semidefinite Hessian and can be studied in the corresponding quotient geometry of equivalent low-rank representations. In the matrix case, the optimal relaxation parameter for accelerating the local convergence can be determined from the convergence rate of the standard method. This result relies on a version of Young's SOR theorem for positive semidefinite 2 × 2 block systems.
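As a rough illustration of the class of methods analyzed here (not the authors' implementation), the sketch below applies block-wise overrelaxation to alternating least squares for the best rank-r approximation of a matrix A, minimizing ||A - X Y^T||_F. The function name, the fixed iteration count, and the choice omega = 1.3 are illustrative assumptions; omega = 1 recovers plain ALS. For context, classical SOR theory (Young's theorem) gives the optimal parameter omega* = 2 / (1 + sqrt(1 - rho^2)) in terms of the convergence rate rho of the unrelaxed iteration; the abstract states an analogous result for positive semidefinite 2 × 2 block systems.

```python
# Minimal sketch of alternating least squares (ALS) with overrelaxation
# for the best rank-r approximation of A, i.e. minimizing ||A - X @ Y.T||_F
# over X (m x r) and Y (n x r). Illustrative only; omega = 1 is plain ALS.
import numpy as np

def als_overrelaxed(A, r, omega=1.3, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.standard_normal((m, r))
    Y = rng.standard_normal((n, r))
    for _ in range(iters):
        # Exact block minimizer in X for fixed Y: A Y (Y^T Y)^{-1},
        # then overrelax the step by a factor omega.
        X_new = np.linalg.solve(Y.T @ Y, Y.T @ A.T).T
        X = X + omega * (X_new - X)
        # Exact block minimizer in Y for fixed X: A^T X (X^T X)^{-1},
        # again followed by the overrelaxed update.
        Y_new = np.linalg.solve(X.T @ X, X.T @ A).T
        Y = Y + omega * (Y_new - Y)
    return X, Y

# Usage: approximate a random 50 x 40 matrix at rank 5 and report the error.
A = np.random.default_rng(1).standard_normal((50, 40))
X, Y = als_overrelaxed(A, r=5)
print(np.linalg.norm(A - X @ Y.T))
```

The per-block relaxation X + omega * (X_new - X) is the direct analogue of scalar SOR: each alternating step is an exact minimization in one block, and omega > 1 extrapolates beyond it to accelerate the locally linear convergence.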

