Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms

05/22/2019
by Mahesh Chandra Mukkamala, et al.

Matrix factorization is a popular non-convex objective that is mostly tackled with alternating minimization schemes. These suffer from the major drawback that the solution is biased towards one of the optimization variables. Non-alternating schemes are a remedy; however, because the gradient in matrix factorization problems lacks Lipschitz continuity, their convergence cannot be guaranteed by standard theory. A recently developed remedy relies on the concept of Bregman distances, which generalize the standard Euclidean distance. We exploit this theory by proposing a novel Bregman distance for matrix factorization problems that, at the same time, allows for simple, closed-form update steps. As a result, for non-alternating schemes such as the recently introduced Bregman Proximal Gradient (BPG) method and its inertial variant, Convex-Concave Inertial BPG (CoCaIn BPG), convergence of the whole sequence to a stationary point is proved for matrix factorization. In several experiments, we observe superior performance of our non-alternating schemes in terms of both speed and objective value at the limit point.
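To make the idea concrete, here is a minimal sketch of one non-alternating BPG step for the objective f(U, V) = 0.5·||A − U Vᵀ||²_F. It assumes a quartic-plus-quadratic Bregman kernel of the form h(U, V) = c1·(r²/2)² + c2·(r²/2) with r² = ||U||²_F + ||V||²_F, whose mirror map can be inverted in closed form via a cubic equation; the constants c1, c2, and step size tau below are illustrative choices, not the paper's tuned values.

```python
import numpy as np

def bpg_step(A, U, V, tau=0.2, c1=3.0, c2=None):
    """One Bregman Proximal Gradient step for f(U, V) = 0.5*||A - U V^T||_F^2.

    Sketch with the kernel h(U, V) = c1*(r^2/2)^2 + c2*(r^2/2),
    r^2 = ||U||_F^2 + ||V||_F^2. Constants are illustrative, not tuned.
    """
    if c2 is None:
        c2 = np.linalg.norm(A)          # tie the quadratic weight to the data scale
    # Gradient of f: residual times the other factor
    R = U @ V.T - A
    gU, gV = R @ V, R.T @ U
    # grad h(x) = (c1*||x||^2 + c2) * x, with x = (U, V) stacked
    r2 = np.sum(U * U) + np.sum(V * V)
    scale = c1 * r2 + c2
    PU = scale * U - tau * gU           # P = grad h(x) - tau * grad f(x)
    PV = scale * V - tau * gV
    # Invert grad h: the new iterate is s*P, where s > 0 is the unique real
    # root of c1*||P||^2*s^3 + c2*s - 1 = 0 (this is the closed-form update)
    p2 = np.sum(PU * PU) + np.sum(PV * PV)
    roots = np.roots([c1 * p2, 0.0, c2, -1.0])
    s = roots[np.argmin(np.abs(roots.imag))].real
    return s * PU, s * PV

# Usage: fit a rank-2 factorization of a random low-rank matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))
U = 0.1 * rng.standard_normal((6, 2))
V = 0.1 * rng.standard_normal((5, 2))
for _ in range(500):
    U, V = bpg_step(A, U, V)
print(np.linalg.norm(A - U @ V.T))  # residual shrinks from roughly ||A||_F
```

Note that both factors U and V are updated jointly in a single step, rather than one at a time as in alternating minimization, which is what avoids the bias towards either variable.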

Related research:

- 03/05/2019, Inertial Block Mirror Descent Method for Non-Convex Non-Smooth Optimization: In this paper, we propose inertial versions of block coordinate descent ...
- 10/08/2019, Bregman Proximal Framework for Deep Linear Neural Networks: A typical assumption for the analysis of first order optimization method...
- 06/29/2019, Approximate matrix completion based on cavity method: In order to solve large matrix completion problems with practical comput...
- 05/18/2017, A Non-monotone Alternating Updating Method for A Class of Matrix Factorization Problems: In this paper we consider a general matrix factorization model which cov...
- 10/19/2017, Efficient Robust Matrix Factorization with Nonconvex Loss: Robust matrix factorization (RMF), which uses the ℓ_1-loss, often outper...
- 10/19/2017, Scalable Robust Matrix Factorization with Nonconvex Loss: Robust matrix factorization (RMF), which uses the ℓ_1-loss, often outper...
