Fast Stochastic Algorithms for Low-rank and Nonsmooth Matrix Problems

09/27/2018
by   Dan Garber, et al.

Composite convex optimization problems that include both a nonsmooth term and a low-rank promoting term have important applications in machine learning and signal processing, such as when one wishes to recover an unknown matrix that is simultaneously low-rank and sparse. However, such problems are highly challenging to solve at large scale: the low-rank promoting term prohibits efficient implementations of proximal methods for composite optimization, and even of simple subgradient methods. On the other hand, methods tailored to low-rank optimization, such as conditional gradient-type methods, are often applied to a smooth approximation of the nonsmooth objective and are slow, since their runtime scales with both the large Lipschitz parameter of the smoothed gradient and with 1/ϵ. In this paper we develop efficient algorithms for stochastic optimization of a strongly convex objective that includes both a nonsmooth term and a low-rank promoting term. In particular, to the best of our knowledge, we present the first algorithm that enjoys all of the following critical properties for large-scale problems: i) (nearly) optimal sample complexity; ii) each iteration requires only a single low-rank SVD computation; and iii) the overall number of thin-SVD computations scales only with 1/ϵ (as opposed to poly(1/ϵ) in previous methods). We also give an algorithm for the closely related finite-sum setting. At the heart of our results lies a novel combination of a variance-reduction technique with the use of a weak proximal oracle, which is key to obtaining all three properties simultaneously.
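To make the computational point concrete, below is a minimal, hypothetical sketch of a stochastic conditional gradient (Frank-Wolfe-type) iteration over a nuclear-norm ball. It is not the algorithm of this paper; it only illustrates why conditional gradient-type methods need just one thin (here, rank-1) SVD of a gradient estimate per iteration, rather than a full-rank projection or prox step. The names grad_oracle and radius, and the step-size rule, are illustrative assumptions.

import numpy as np
from scipy.sparse.linalg import svds

def fw_step_nuclear_ball(G, radius):
    """Linear-minimization oracle over the nuclear-norm ball:
    argmin_{||S||_* <= radius} <G, S> = -radius * u1 @ v1^T,
    where (u1, v1) are the top singular vectors of G.
    Only a rank-1 (thin) SVD of the gradient is required."""
    u, s, vt = svds(G, k=1)            # top singular pair of G
    return -radius * np.outer(u[:, 0], vt[0, :])

def stochastic_fw(grad_oracle, X0, radius, T):
    """Hypothetical driver: stochastic Frank-Wolfe over
    {X : ||X||_* <= radius}; grad_oracle(X) returns a stochastic
    (e.g., mini-batch or variance-reduced) gradient estimate."""
    X = X0
    for t in range(1, T + 1):
        G = grad_oracle(X)             # stochastic gradient estimate
        S = fw_step_nuclear_ball(G, radius)
        eta = 2.0 / (t + 1)            # standard FW step size
        X = (1 - eta) * X + eta * S    # convex combination stays feasible
    return X

This sketch omits the two ingredients the abstract highlights, variance reduction and a weak proximal oracle, which are what bring the total number of such thin-SVD computations down to scaling with 1/ϵ.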

Related Research

On the Efficient Implementation of the Matrix Exponentiated Gradient Algorithm for Low-Rank Matrix Optimization (12/18/2020)
Convex optimization over the spectrahedron, i.e., the set of all real n×...

Hybrid Conditional Gradient - Smoothing Algorithms with Applications to Sparse and Low Rank Regularization (04/14/2014)
We study a hybrid conditional gradient - smoothing algorithm (HCGS) for ...

On the Convergence of Stochastic Gradient Descent with Low-Rank Projections for Convex Low-Rank Matrix Problems (01/31/2020)
We revisit the use of Stochastic Gradient Descent (SGD) for solving conv...

Proximal algorithms for constrained composite optimization, with applications to solving low-rank SDPs (03/01/2019)
We study a family of (potentially non-convex) constrained optimization p...

Faster Projection-Free Augmented Lagrangian Methods via Weak Proximal Oracle (10/25/2022)
This paper considers a convex composite optimization problem with affine...

Scalable Robust Matrix Recovery: Frank-Wolfe Meets Proximal Methods (03/29/2014)
Recovering matrices from compressive and grossly corrupted observations ...
