Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method

09/07/2020
by Christian Kümmerle, et al.

We propose an iterative algorithm for low-rank matrix completion that can be interpreted both as an iteratively reweighted least squares (IRLS) algorithm and as a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective. It combines the favorable data efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. Our method attains a local quadratic convergence rate for a number of samples close to the information-theoretic limit. In numerical experiments we show that, unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with condition numbers of up to 10^10 from a small number of samples.
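To make the IRLS viewpoint concrete, the sketch below shows a textbook-style IRLS scheme for matrix completion: it alternates a weight update built from the SVD of the current iterate (smoothing a non-convex rank surrogate with a shrinking parameter eps) with a weighted least-squares solve that fits the observed entries. This is a minimal illustration with hypothetical parameter names, not the paper's algorithm, which uses a more refined weight operator and solvers to reach the reported scalability and conditioning robustness.

```python
import numpy as np

def irls_matrix_completion(M_obs, mask, rank_eps=1e-2, lam=1e3, n_iter=50):
    """Minimal IRLS sketch for low-rank matrix completion (illustrative only).

    Alternates:
      1) a weight update W = U diag(1/sqrt(s^2 + eps^2)) U^T from the SVD of
         the current iterate, with a monotonically shrinking smoothing eps;
      2) a weighted least-squares solve, column by column, that balances the
         low-rank-promoting term x^T W x against fitting the observed entries.

    mask is a 0/1 array marking observed entries; M_obs is zero elsewhere.
    """
    d1, d2 = M_obs.shape
    X = (mask * M_obs).astype(float)          # initialize with observed entries
    eps = 1.0
    for _ in range(n_iter):
        # Weight update: small singular values get large weights, which
        # pushes the iterate toward low rank (smoothed rank surrogate).
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        eps = min(eps, rank_eps + s.min())    # shrink the smoothing parameter
        W = U @ np.diag(1.0 / np.sqrt(s**2 + eps**2)) @ U.T
        # Weighted least squares: for each column j solve the normal equation
        #   (W + lam * D_j) x_j = lam * D_j m_j,   D_j = diag(mask[:, j]).
        for j in range(d2):
            D = np.diag(lam * mask[:, j])
            X[:, j] = np.linalg.solve(W + D, D @ M_obs[:, j])
    return X
```

Each outer iteration solves a strictly convex quadratic exactly, so the per-column systems are small and well-posed; the penalty weight `lam` trades off data fit against the rank surrogate and is a tuning assumption of this sketch.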


Related research

- A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples (06/03/2021)
- Optimal Low-Rank Matrix Completion: Semidefinite Relaxations and Eigenvector Disjunctions (05/20/2023)
- Denoising and Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares (11/19/2018)
- Accelerating SGD for Highly Ill-Conditioned Huge-Scale Online Matrix Completion (08/24/2022)
- Learning Transition Operators From Sparse Space-Time Samples (12/01/2022)
- Introducing the Huber mechanism for differentially private low-rank matrix completion (06/16/2022)
- Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses (07/03/2019)
