Randomly Initialized Alternating Least Squares: Fast Convergence for Matrix Sensing

04/25/2022
by Kiryung Lee et al.

We consider the problem of reconstructing rank-one matrices from random linear measurements, a task that arises in a variety of problems in signal processing, statistics, and machine learning. In this paper, we focus on the Alternating Least Squares (ALS) method. While this algorithm has been studied in a number of previous works, most of them only show convergence from an initialization close to the true solution and thus require a carefully designed initialization scheme. However, random initialization has often been preferred by practitioners as it is model-agnostic. We show that ALS with random initialization converges to the true solution with ε-accuracy in O(log n + log(1/ε)) iterations using only a near-optimal number of samples, where we assume the measurement matrices to be i.i.d. Gaussian and where n denotes the ambient dimension. Key to our proof is the observation that the trajectory of the ALS iterates depends only very mildly on certain entries of the random measurement matrices. Numerical experiments corroborate our theoretical predictions.
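To make the setting concrete, the following sketch illustrates ALS with random initialization for rank-one Gaussian matrix sensing: measurements y_i = u*^T A_i v* with i.i.d. Gaussian A_i, and each ALS step solves a linear least-squares problem in one factor while the other is held fixed. The dimensions, sample size, and iteration count below are illustrative choices and are not taken from the paper.

```python
import numpy as np

# Minimal sketch of ALS for rank-one matrix sensing with random initialization.
# Model: y_i = <A_i, u* v*^T> = u*^T A_i v*, with i.i.d. Gaussian A_i.
# n, m, and the number of iterations are illustrative, not values from the paper.

rng = np.random.default_rng(0)
n, m = 50, 400                                  # ambient dimension, number of measurements
u_star = rng.standard_normal(n)
v_star = rng.standard_normal(n)
A = rng.standard_normal((m, n, n))              # i.i.d. Gaussian measurement matrices
y = np.einsum('i,mij,j->m', u_star, A, v_star)  # y_i = u*^T A_i v*

# Random (model-agnostic) initialization.
u = rng.standard_normal(n)
v = rng.standard_normal(n)

for _ in range(30):
    # Fix v, solve least squares in u: the i-th row of the design matrix is (A_i v)^T.
    Gu = np.einsum('mij,j->mi', A, v)
    u, *_ = np.linalg.lstsq(Gu, y, rcond=None)
    # Fix u, solve least squares in v: the i-th row of the design matrix is (A_i^T u)^T.
    Gv = np.einsum('mij,i->mj', A, u)
    v, *_ = np.linalg.lstsq(Gv, y, rcond=None)

# Relative error of the recovered rank-one matrix u v^T.
X_hat, X_star = np.outer(u, v), np.outer(u_star, v_star)
print(np.linalg.norm(X_hat - X_star) / np.linalg.norm(X_star))
```

With m on the order of a few multiples of n, the relative error typically drops to machine precision within a few dozen iterations, consistent with the O(log n + log(1/ε)) iteration count stated above.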
