Accelerating SGD for Highly Ill-Conditioned Huge-Scale Online Matrix Completion

08/24/2022
by Gavin Zhang, et al.

The matrix completion problem seeks to recover a d × d ground truth matrix of low rank r ≪ d from observations of its individual elements. Real-world matrix completion is often a huge-scale optimization problem, with d so large that even the simplest full-dimension vector operations with O(d) time complexity become prohibitively expensive. Stochastic gradient descent (SGD) is one of the few algorithms capable of solving matrix completion on a huge scale, and it can also naturally handle streaming data over an evolving ground truth. Unfortunately, SGD experiences a dramatic slow-down when the underlying ground truth is ill-conditioned; it requires at least O(κ log(1/ϵ)) iterations to get ϵ-close to a ground truth matrix with condition number κ. In this paper, we propose a preconditioned version of SGD that preserves all the favorable practical qualities of SGD for huge-scale online optimization while also making it agnostic to κ. For a symmetric ground truth and the Root Mean Square Error (RMSE) loss, we prove that the preconditioned SGD converges to ϵ-accuracy in O(log(1/ϵ)) iterations, with a rapid linear convergence rate as if the ground truth were perfectly conditioned with κ = 1. In our numerical experiments, we observe a similar acceleration for ill-conditioned matrix completion under the 1-bit cross-entropy loss, as well as pairwise losses such as the Bayesian Personalized Ranking (BPR) loss.
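To make the preconditioned update concrete, below is a minimal NumPy sketch of the kind of scaled SGD step the abstract describes for the symmetric RMSE setting: each sampled entry (i, j) triggers a plain stochastic gradient step on rows i and j of the factor X, right-multiplied by the r × r preconditioner (XᵀX)⁻¹. The dimensions, step size, and variable names are illustrative assumptions, and recomputing the preconditioner every iteration is a simplification for clarity, not the paper's implementation, which keeps the per-iteration cost low.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, kappa = 500, 3, 1e4                    # dimension, rank, condition number

# Ill-conditioned symmetric ground truth M* = U diag(s) U^T with cond = kappa
U, _ = np.linalg.qr(rng.standard_normal((d, r)))
s = np.geomspace(kappa, 1.0, r)
M_star = (U * s) @ U.T

X = rng.standard_normal((d, r))              # factor iterate, model M ~ X X^T
lr = 0.5                                     # hypothetical step size

for t in range(200_000):
    i, j = rng.choice(d, size=2, replace=False)   # observe one random entry
    resid = X[i] @ X[j] - M_star[i, j]            # RMSE-loss residual at (i, j)

    # Plain SGD would take X[i] -= lr * resid * X[j] (and symmetrically for j),
    # whose convergence degrades by a factor of kappa. Right-multiplying each
    # row update by P = (X^T X)^{-1} is the preconditioning step.
    P = np.linalg.inv(X.T @ X + 1e-12 * np.eye(r))
    X[i], X[j] = X[i] - lr * resid * (X[j] @ P), X[j] - lr * resid * (X[i] @ P)

    if t % 50_000 == 0:
        err = np.linalg.norm(X @ X.T - M_star) / np.linalg.norm(M_star)
        print(f"iter {t:>7d}  relative error {err:.3e}")
```

Because the r × r preconditioner rescales each row update by the local geometry of the factorization, the contraction rate in a sketch like this should be roughly independent of κ, which is the behavior the paper proves for the symmetric RMSE case.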


Related research

06/03/2021
A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples
We propose an iterative algorithm for low-rank matrix completion that ca...

09/07/2020
Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method
We propose an iterative algorithm for low-rank matrix completion that ca...

04/03/2020
Orthogonal Inductive Matrix Completion
We propose orthogonal inductive matrix completion (OMIC), an interpretab...

12/19/2022
Rank-1 Matrix Completion with Gradient Descent and Small Random Initialization
The nonconvex formulation of matrix completion problem has received sign...

02/18/2018
Optimizing Spectral Sums using Randomized Chebyshev Expansions
The trace of matrix functions, often called spectral sums, e.g., rank, l...

05/26/2016
Provable Efficient Online Matrix Completion via Non-convex Stochastic Gradient Descent
Matrix completion, where we wish to recover a low rank matrix by observi...

08/01/2018
Matrix completion and extrapolation via kernel regression
Matrix completion and extrapolation (MCEX) are dealt with here over repr...
