Efficient and Practical Stochastic Subgradient Descent for Nuclear Norm Regularization

06/27/2012
by Haim Avron, et al.

We describe novel subgradient methods for a broad class of matrix optimization problems involving nuclear norm regularization. Unlike existing approaches, our method executes very cheap iterations by combining low-rank stochastic subgradients with efficient incremental SVD updates, made possible by highly optimized and parallelizable dense linear algebra operations on small matrices. Our practical algorithms always maintain a low-rank factorization of iterates that can be conveniently held in memory and efficiently multiplied to generate predictions in matrix completion settings. Empirical comparisons confirm that our approach is highly competitive with several recently proposed state-of-the-art solvers for such problems.
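To make the idea concrete, here is a minimal NumPy sketch of stochastic subgradient descent for nuclear-norm-regularized matrix completion. It is not the paper's algorithm: sizes, step size, and the rank cap are illustrative assumptions, and where the paper applies cheap incremental SVD updates this sketch simply recomputes a truncated SVD each iteration. The iterate is nevertheless kept as a low-rank factorization `U @ diag(s) @ Vt`, and the subgradient combines a sparse stochastic residual with `lam * U @ Vt`, a standard subgradient of the nuclear norm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy matrix-completion instance (hypothetical sizes, not from the paper).
n, m, true_rank = 30, 20, 3
M = rng.standard_normal((n, true_rank)) @ rng.standard_normal((true_rank, m))
mask = rng.random((n, m)) < 0.5          # observed entries Omega

lam, eta, rank_cap = 0.1, 0.05, 5        # illustrative hyperparameters

# Iterate stored as a truncated SVD factorization X = U @ diag(s) @ Vt.
U = np.zeros((n, rank_cap))
s = np.zeros(rank_cap)
Vt = np.zeros((rank_cap, m))

for t in range(300):
    X = (U * s) @ Vt
    # Stochastic subgradient: residual on a random subsample of the observed
    # entries (sparse part) plus lam * U @ Vt (nuclear-norm subgradient).
    batch = mask & (rng.random((n, m)) < 0.5)
    G = np.where(batch, X - M, 0.0) + lam * (U @ Vt)
    # Re-truncate via a full SVD for clarity; the paper instead maintains the
    # factorization with efficient *incremental* SVD updates on small matrices.
    Uf, sf, Vtf = np.linalg.svd(X - eta * G, full_matrices=False)
    U, s, Vt = Uf[:, :rank_cap], sf[:rank_cap], Vtf[:rank_cap, :]

X = (U * s) @ Vt
train_err = np.linalg.norm((X - M)[mask]) / np.linalg.norm(M[mask])
```

Because the iterate never exists except as its factors, predictions for individual entries can be formed by multiplying a row of `U * s` with a column of `Vt`, which is what makes this representation convenient in matrix completion settings.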


Related research

- Efficient Low-Rank Matrix Learning by Factorizable Nonconvex Regularization (08/14/2020)
- Krylov Methods for Low-Rank Regularization (10/23/2019)
- Double Weighted Truncated Nuclear Norm Regularization for Low-Rank Matrix Completion (01/07/2019)
- A Rank-Corrected Procedure for Matrix Completion with Fixed Basis Coefficients (10/13/2012)
- Fast Low-Rank Matrix Learning with Nonconvex Regularization (12/03/2015)
- Adaptive and Implicit Regularization for Matrix Completion (08/11/2022)
- Efficiently Using Second Order Information in Large l1 Regularization Problems (03/27/2013)
