Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems

02/11/2020
by Filip Hanzely, et al.

We propose ASVRCD, an accelerated version of stochastic variance reduced coordinate descent. Like other variance reduced coordinate descent methods such as SEGA and SVRCD, our method can handle problems with a non-separable, non-smooth regularizer while accessing only a random block of partial derivatives in each iteration. However, ASVRCD incorporates Nesterov's momentum, which yields better iteration complexity guarantees than both SEGA and SVRCD. As a by-product of our theory, we show that a variant of the method of Allen-Zhu (2017) is a special case of ASVRCD, recovering the optimal oracle complexity for finite-sum objectives.
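To make the mechanics concrete, below is a minimal sketch in the spirit of the abstract: a SEGA-style table of partial derivatives gives a variance-reduced coordinate gradient estimate, combined with a basic Nesterov-style extrapolation step. This is not the authors' exact ASVRCD algorithm (in particular, it omits the proximal handling of the non-smooth regularizer); the function names, step sizes, and the toy least-squares problem are illustrative assumptions.

```python
import numpy as np

def accelerated_sega_sketch(grad_coord, x0, n_coords, lr=1e-3, momentum=0.5,
                            iters=5000, seed=0):
    """Toy accelerated coordinate method with a SEGA-like gradient table.

    grad_coord(x, i) must return the i-th partial derivative of the smooth part.
    This is a simplified illustration, not the ASVRCD method from the paper.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    y = x.copy()                        # extrapolation point (momentum sequence)
    h = np.zeros(n_coords)              # running table of partial derivatives
    for _ in range(iters):
        i = rng.integers(n_coords)      # sample one coordinate (a block in general)
        di = grad_coord(y, i)           # fresh partial derivative at the extrapolated point
        g = h.copy()                    # variance-reduced estimate: table + correction
        g[i] += n_coords * (di - h[i])  # unbiased correction on the sampled coordinate
        h[i] = di                       # update the table entry
        x_new = y - lr * g              # gradient step (a prox step would handle a regularizer)
        y = x_new + momentum * (x_new - x)  # Nesterov-style extrapolation
        x = x_new
    return x

if __name__ == "__main__":
    # Toy smooth objective: f(x) = 0.5 * ||A x - b||^2 (assumed test problem)
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)
    grad_coord = lambda x, i: A[:, i] @ (A @ x - b)
    x_hat = accelerated_sega_sketch(grad_coord, np.zeros(10), n_coords=10)
    print(float(np.linalg.norm(A.T @ (A @ x_hat - b))))  # gradient norm should be small
```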


Related research

06/06/2017
Limitations on Variance-Reduction and Acceleration Schemes for Finite Sum Optimization
We study the conditions under which one is able to efficiently apply var...

12/09/2022
Cyclic Block Coordinate Descent With Variance Reduction for Composite Nonconvex Optimization
Nonconvex optimization is central in solving many machine learning probl...

09/09/2018
SEGA: Variance Reduction via Gradient Sketching
We propose a randomized first order optimization method--SEGA (SkEtched ...

11/13/2016
Accelerated Variance Reduced Block Coordinate Descent
Algorithms with fast convergence, small number of data access, and low p...

12/20/2013
Accelerated, Parallel and Proximal Coordinate Descent
We propose a new stochastic coordinate descent method for minimizing the...

06/10/2018
Dissipativity Theory for Accelerating Stochastic Variance Reduction: A Unified Analysis of SVRG and Katyusha Using Semidefinite Programs
Techniques for reducing the variance of gradient estimates used in stoch...
