Asymptotic Log-Det Rank Minimization via (Alternating) Iteratively Reweighted Least Squares

06/28/2021
by   Sebastian Krämer, et al.

The affine rank minimization (ARM) problem is well known both for its applications and for the fact that it is NP-hard. One of the most successful, yet arguably underrepresented, approaches is iteratively reweighted least squares (IRLS), more specifically IRLS-0. Despite comprehensive empirical evidence that it overall outperforms nuclear norm minimization and related methods, it is still not understood to a satisfying degree. In particular, the significance of a slow decrease of the regularization parameter γ appearing therein poses interesting questions. While the ARM problem is commonly equated with matrix recovery, we here consider it in its own right. We investigate the particular structure and global convergence property behind the asymptotic minimization of the log-det objective function on which IRLS-0 is based. We expand on local convergence theorems, now with an emphasis on the decline of γ, and provide representative examples as well as counterexamples, such as a diverging IRLS-0 sequence, that clarify theoretical limits. We present a data-sparse, alternating realization AIRLS-p (related to prior work under the name SALSA) that, along with the rest of this work, serves as a basis and introduction to the more general tensor setting. In conclusion, numerical sensitivity experiments are carried out that reconfirm the success of IRLS-0 and demonstrate that in surprisingly many cases, a slower decay of γ still leads to a solution of the ARM problem, up to the point that the exact theoretical phase transition for generic recoverability can be observed. Likewise, this suggests that non-convexity is less substantial and problematic for the log-det approach than it might initially appear.
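To make the scheme in the abstract concrete, the following is a minimal sketch of an IRLS-0 iteration for matrix completion, one instance of the affine rank minimization problem. It is not the authors' implementation; the function name, parameters, and the geometric decay schedule for γ are illustrative assumptions. Each step minimizes the weighted least-squares surrogate tr(XᵀWX) with W = (XXᵀ + γI)⁻¹, subject to X matching the observed entries, while γ is slowly decreased — the role of that slow decay being exactly what the paper studies.

```python
import numpy as np

def irls0_completion(M, mask, gamma0=1.0, decay=0.9, iters=200):
    """Hypothetical IRLS-0 sketch for matrix completion (an ARM instance).

    Repeatedly minimizes tr(X^T W X) with weight W = (X X^T + gamma I)^{-1},
    subject to X agreeing with M on the observed mask, while gamma decays.
    """
    m, n = M.shape
    X = np.where(mask, M, 0.0)  # feasible start: observed entries, zeros elsewhere
    gamma = gamma0
    for _ in range(iters):
        W = np.linalg.inv(X @ X.T + gamma * np.eye(m))
        # The quadratic surrogate decouples over columns, so each
        # constrained column problem can be solved exactly.
        for j in range(n):
            obs = mask[:, j]
            if obs.all():
                continue
            free = ~obs
            # minimize x^T W x with x[obs] fixed:
            # x[free] = -W_ff^{-1} W_fo x[obs]
            X[free, j] = -np.linalg.solve(
                W[np.ix_(free, free)],
                W[np.ix_(free, obs)] @ M[obs, j],
            )
        gamma *= decay  # slow geometric decay of the regularization parameter
    return X
```

On an easy low-rank instance (e.g. a rank-1 matrix with a few unobserved entries), this sketch recovers the missing entries; on harder instances, the abstract's observation applies — a slower decay of γ tends to help the iteration reach a minimum-rank solution.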

