On the Convergence of Projected-Gradient Methods with Low-Rank Projections for Smooth Convex Minimization over Trace-Norm Balls and Related Problems

02/05/2019
by Dan Garber, et al.

Smooth convex minimization over the unit trace-norm ball is an important optimization problem in machine learning, signal processing, statistics, and other fields, underlying many tasks in which one wishes to recover a low-rank matrix from certain measurements. While first-order methods for convex optimization enjoy optimal convergence rates, in the worst case they require computing a full-rank SVD on each iteration in order to compute the projection onto the trace-norm ball. These full-rank SVD computations, however, prohibit the application of such methods to large problems. A simple and natural heuristic for reducing the computational cost is to approximate the projection using only a low-rank SVD. This raises the question of whether, and under what conditions, this simple heuristic can indeed yield provable convergence to the optimal solution. In this paper we show that any optimal solution is the center of a Euclidean ball inside which the projected-gradient mapping has rank at most the multiplicity of the largest singular value of the gradient at that solution. Moreover, the radius of this ball scales with the spectral gap of that gradient. We show how this readily implies local convergence (i.e., from a "warm-start" initialization) of standard first-order methods using only low-rank SVD computations. We also quantify the effect of "over-parameterization", i.e., using SVD computations of higher rank, on the radius of this ball, showing that it can increase dramatically with moderately larger rank. We further extend our results to the settings of optimization with trace-norm regularization and optimization over bounded-trace positive semidefinite matrices. Our theoretical investigation is supported by concrete empirical evidence demonstrating the convergence of first-order methods with low-rank projections on real-world datasets.


