An Accelerated Stochastic Gradient for Canonical Polyadic Decomposition

09/28/2021
by Ioanna Siaminou, et al.

We consider the problem of structured canonical polyadic decomposition (CPD). When the problem is very large, stochastic gradient approaches are viable alternatives to classical methods such as Alternating Optimization and All-at-Once Optimization. We extend a recent stochastic gradient approach by employing an acceleration step (Nesterov momentum) in each iteration. We compare our approach with state-of-the-art alternatives on both synthetic and real-world data and find it to be very competitive.
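As a concrete illustration of the acceleration step mentioned in the abstract, below is a minimal NumPy sketch of a Nesterov-accelerated stochastic gradient update for a rank-R CPD of a dense 3-way tensor. The function name `nesterov_sgd_cpd`, the entry-wise sampling, and all hyperparameters (batch size, step size, momentum `beta`) are illustrative assumptions; they are not the authors' algorithm, which targets structured/constrained CPD with its own sampling and step-size choices.

```python
# Illustrative sketch only: Nesterov-accelerated stochastic gradient for a
# rank-R CPD of a dense 3-way tensor, with entry-wise sampling (assumed here).
import numpy as np

def nesterov_sgd_cpd(X, rank, n_iters=2000, batch=256, lr=0.1, beta=0.9, seed=0):
    """Fit X ~ [[A, B, C]] with entry-sampled stochastic gradients and
    Nesterov momentum (sketch, not the paper's exact algorithm)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = 0.1 * rng.standard_normal((I, rank))
    B = 0.1 * rng.standard_normal((J, rank))
    C = 0.1 * rng.standard_normal((K, rank))
    vA, vB, vC = np.zeros_like(A), np.zeros_like(B), np.zeros_like(C)

    for _ in range(n_iters):
        # Nesterov look-ahead: gradients are evaluated at the shifted factors.
        Al, Bl, Cl = A + beta * vA, B + beta * vB, C + beta * vC

        # Sample a mini-batch of tensor entries (i, j, k).
        ii = rng.integers(0, I, batch)
        jj = rng.integers(0, J, batch)
        kk = rng.integers(0, K, batch)

        # Residuals of the look-ahead model on the sampled entries.
        pred = np.sum(Al[ii] * Bl[jj] * Cl[kk], axis=1)
        res = pred - X[ii, jj, kk]

        # Accumulate entry-wise gradients of 0.5 * res**2 into the factor rows.
        gA, gB, gC = np.zeros_like(A), np.zeros_like(B), np.zeros_like(C)
        np.add.at(gA, ii, res[:, None] * Bl[jj] * Cl[kk])
        np.add.at(gB, jj, res[:, None] * Al[ii] * Cl[kk])
        np.add.at(gC, kk, res[:, None] * Al[ii] * Bl[jj])

        # Momentum update followed by the factor update.
        vA = beta * vA - (lr / batch) * gA
        vB = beta * vB - (lr / batch) * gB
        vC = beta * vC - (lr / batch) * gC
        A, B, C = A + vA, B + vB, C + vC

    return A, B, C

# Toy usage on a synthetic rank-3 tensor (hyperparameters may need tuning).
rng = np.random.default_rng(1)
At, Bt, Ct = rng.standard_normal((20, 3)), rng.standard_normal((25, 3)), rng.standard_normal((30, 3))
X = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)
A, B, C = nesterov_sgd_cpd(X, rank=3)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
print('relative error:', np.linalg.norm(X - Xhat) / np.linalg.norm(X))
```

The look-ahead evaluation of the gradient at A + beta*vA is what distinguishes Nesterov momentum from classical heavy-ball momentum, which evaluates the gradient at the current iterate.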


