Fast Low-Rank Matrix Learning with Nonconvex Regularization

12/03/2015
by Quanming Yao, et al.

Low-rank modeling has many important applications in machine learning, computer vision, and social network analysis. While the matrix rank is often approximated by the convex nuclear norm, nonconvex low-rank regularizers have demonstrated better recovery performance; however, the resulting optimization problem is much more challenging. A recent state-of-the-art approach is based on the proximal gradient algorithm, but it requires an expensive full SVD in each proximal step. In this paper, we show that for many commonly used nonconvex low-rank regularizers, a cutoff can be derived that automatically thresholds the singular values obtained from the proximal operator. This allows the SVD to be approximated efficiently by the power method. Moreover, the proximal operator can be reduced to that of a much smaller matrix projected onto this leading subspace. Convergence at a rate of O(1/T), where T is the number of iterations, can be guaranteed. Extensive experiments are performed on matrix completion and robust principal component analysis. The proposed method achieves significant speedups over the state of the art, and the matrix solution obtained is more accurate and of lower rank than that produced by the traditional nuclear-norm regularizer.
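The core trick sketched above, approximating the proximal step by first finding the leading singular subspace with a block power method and then solving the prox on the small projected matrix, can be illustrated as follows. This is a minimal NumPy sketch, not the authors' implementation: for simplicity it uses the nuclear-norm soft threshold as the singular-value shrinkage rule (a nonconvex regularizer would replace only that one line with its own thresholding function), and all function names are illustrative.

```python
import numpy as np

def power_iteration_subspace(X, rank, n_iter=20, seed=0):
    """Approximate the leading left singular subspace of X by block power iteration."""
    rng = np.random.default_rng(seed)
    Q = rng.standard_normal((X.shape[1], rank))
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(X @ Q)    # orthonormal basis for range(X Q)
        Q, _ = np.linalg.qr(X.T @ Q)  # refine using the transpose
    Q, _ = np.linalg.qr(X @ Q)
    return Q  # (m, rank): orthonormal, spans approx. the leading left subspace

def approx_prox_nuclear(X, lam, rank):
    """Approximate prox of lam * nuclear norm at X.

    Instead of a full SVD of X, take the SVD of the small (rank x n)
    projected matrix Q^T X and soft-threshold its singular values; with a
    nonconvex regularizer the soft threshold would be replaced by the
    regularizer-specific shrinkage with the derived cutoff.
    """
    Q = power_iteration_subspace(X, rank)
    B = Q.T @ X                           # small projected matrix
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    s_thr = np.maximum(s - lam, 0.0)      # soft-thresholding (nuclear-norm prox)
    return Q @ (U * s_thr) @ Vt
```

When `X` is (approximately) low rank and `rank` is chosen at or above the number of singular values surviving the threshold, the projected prox agrees with the one computed from a full SVD, at the cost of a few thin QR factorizations and one small SVD.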

Related research:

- 08/01/2017 · Large-Scale Low-Rank Matrix Learning with Nonconvex Regularizers
  Low-rank modeling has many important applications in computer vision and...

- 08/14/2020 · Efficient Low-Rank Matrix Learning by Factorizable Nonconvex Regularization
  Matrix learning is at the core of many machine learning problems. To enc...

- 05/06/2022 · Low-rank Tensor Learning with Nonconvex Overlapped Nuclear Norm Regularization
  Nonconvex regularization has been popularly used in low-rank matrix lear...

- 03/16/2017 · Accelerated and Inexact Soft-Impute for Large-Scale Matrix and Tensor Completion
  Matrix and tensor completion aim to recover a low-rank matrix / tensor f...

- 10/27/2020 · Learning Low-Rank Document Embeddings with Weighted Nuclear Norm Regularization
  Recently, neural embeddings of documents have shown success in various l...

- 09/06/2013 · Practical Matrix Completion and Corruption Recovery using Proximal Alternating Robust Subspace Minimization
  Low-rank matrix completion is a problem of immense practical importance....

- 06/27/2012 · Efficient and Practical Stochastic Subgradient Descent for Nuclear Norm Regularization
  We describe novel subgradient methods for a broad class of matrix optimi...
