Limitations on Variance-Reduction and Acceleration Schemes for Finite Sum Optimization

06/06/2017
by Yossi Arjevani, et al.

We study the conditions under which one is able to efficiently apply variance-reduction and acceleration schemes to finite sum optimization problems. First, we show that, perhaps surprisingly, the finite sum structure by itself is not sufficient for obtaining a complexity bound of Õ((n + L/μ) ln(1/ϵ)) for L-smooth and μ-strongly convex individual functions: one must also know which individual function is being referred to by the oracle at each iteration. Next, we show that for a broad class of first-order and coordinate-descent finite sum algorithms (including, e.g., SDCA, SVRG, and SAG), it is not possible to get an 'accelerated' complexity bound of Õ((n + √(nL/μ)) ln(1/ϵ)) unless the strong convexity parameter is given explicitly. Lastly, we show that when this class of algorithms is used for minimizing L-smooth and convex finite sums, the optimal complexity bound is Õ(n + L/ϵ), assuming that (on average) the same update rule is used in every iteration, and Õ(n + √(nL/ϵ)) otherwise.
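To make the role of the individual-function index concrete, here is a minimal SVRG sketch for minimizing F(x) = (1/n) Σᵢ fᵢ(x). This is a standard textbook variant, not the paper's construction; the function names (svrg, grad_i), the step size 1/(10L), and the inner-loop length are illustrative choices borrowed from common SVRG analyses. Note that the variance-reduced update explicitly uses the sampled index i, which is exactly the oracle information the paper's first result shows to be necessary.

```python
import numpy as np

def svrg(grad_i, x0, n, L, mu, epochs=20, seed=0):
    """Minimal SVRG sketch for minimizing F(x) = (1/n) * sum_i f_i(x).

    grad_i(i, x) returns the gradient of the individual function f_i at x.
    Illustrative parameter choices; not the paper's construction.
    """
    rng = np.random.default_rng(seed)
    eta = 1.0 / (10.0 * L)              # a common SVRG step-size choice
    m = max(1, int(20 * L / mu))        # inner-loop length ~ condition number
    snapshot = np.array(x0, dtype=float)
    for _ in range(epochs):
        # Full gradient at the snapshot ("anchor") point.
        full_grad = np.mean([grad_i(i, snapshot) for i in range(n)], axis=0)
        x = snapshot.copy()
        for _ in range(m):
            i = rng.integers(n)         # oracle reveals *which* f_i is queried
            # Unbiased, variance-reduced gradient estimate: the correction
            # term vanishes as x and snapshot approach the minimizer.
            g = grad_i(i, x) - grad_i(i, snapshot) + full_grad
            x = x - eta * g
        snapshot = x
    return snapshot

# Toy usage: least squares, f_i(x) = 0.5 * (a_i @ x - b_i) ** 2
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
b = A @ np.ones(5)
x_hat = svrg(lambda i, x: (A[i] @ x - b[i]) * A[i],
             x0=np.zeros(5), n=50,
             L=float(np.max(np.sum(A ** 2, axis=1))),  # smoothness of each f_i
             mu=0.1)                                   # assumed strong convexity
```

Observe that this plain variant only needs L to set its step size; accelerated variants typically also set a momentum parameter using √(μ/L), i.e., they require the strong convexity parameter explicitly, which connects to the paper's second result.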


Related research

02/11/2020 · Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems
We propose an accelerated version of stochastic variance reduced coordin...

07/07/2020 · An Accelerated DFO Algorithm for Finite-sum Convex Functions
Derivative-free optimization (DFO) has recently gained a lot of momentum...

05/29/2019 · A unified variance-reduced accelerated gradient method for convex optimization
We propose a novel randomized incremental gradient algorithm, namely, VA...

02/15/2023 · Continuized Acceleration for Quasar Convex Functions in Non-Convex Optimization
Quasar convexity is a condition that allows some first-order methods to ...

06/18/2020 · Stochastic Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization
In this paper, we introduce a simplified and unified method for finite-s...

06/03/2019 · Towards Unified Acceleration of High-Order Algorithms under Hölder Continuity and Uniform Convexity
In this paper, through a very intuitive vanilla proximal method perspec...

02/09/2020 · On the Complexity of Minimizing Convex Finite Sums Without Using the Indices of the Individual Functions
Recent advances in randomized incremental methods for minimizing L-smoot...
