On the Complexity of Minimizing Convex Finite Sums Without Using the Indices of the Individual Functions

02/09/2020
by Yossi Arjevani, et al.

Recent advances in randomized incremental methods for minimizing L-smooth μ-strongly convex finite sums have culminated in tight complexity bounds of Õ((n+√(n L/μ))log(1/ϵ)) and O(n+√(nL/ϵ)) for μ>0 and μ=0, respectively, where n denotes the number of individual functions. Unlike incremental methods, stochastic methods for finite sums do not rely on explicit knowledge of which individual function is being addressed at each iteration, and as such must perform at least Ω(n^2) iterations to obtain O(1/n^2)-optimal solutions. In this work, we exploit the finite noise structure of finite sums to derive a matching O(n^2) upper bound under the global oracle model, showing that this lower bound is indeed tight. Following a similar approach, we propose a novel adaptation of SVRG which is both compatible with stochastic oracles and achieves complexity bounds of Õ((n^2+n√(L/μ))log(1/ϵ)) and O(n√(L/ϵ)), for μ>0 and μ=0, respectively. Our bounds hold w.h.p. and match in part existing lower bounds of Ω̃(n^2+√(nL/μ)log(1/ϵ)) and Ω̃(n^2+√(nL/ϵ)), for μ>0 and μ=0, respectively.
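For context, the baseline method the paper adapts is classical SVRG, which alternates full-gradient snapshots with variance-reduced stochastic steps. The sketch below shows standard SVRG on a finite sum (1/n)·Σᵢ fᵢ(x); note that it queries the oracle with an explicit index i, which is precisely what the paper's stochastic-oracle setting disallows, so this is only an illustrative baseline, not the paper's index-free variant. The function names and hyperparameter choices here are illustrative assumptions, not from the paper.

```python
import numpy as np

def svrg(grad_i, x0, n, lr=0.1, epochs=20, inner_steps=None, seed=0):
    """Standard SVRG for min_x (1/n) * sum_i f_i(x).

    grad_i(i, x): gradient of the i-th individual function at x.
    Classical SVRG uses the index i explicitly; the paper studies
    the harder setting where this index is unavailable.
    """
    rng = np.random.default_rng(seed)
    m = inner_steps if inner_steps is not None else 2 * n  # common inner-loop length
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(epochs):
        # Full-gradient snapshot at the anchor point.
        x_snap = x.copy()
        full_grad = sum(grad_i(i, x_snap) for i in range(n)) / n
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced gradient estimate: unbiased, with variance
            # that shrinks as x approaches the snapshot point.
            g = grad_i(i, x) - grad_i(i, x_snap) + full_grad
            x -= lr * g
    return x
```

As a quick sanity check, on the quadratic fᵢ(x) = ½(x − aᵢ)² the iterates converge to the mean of the aᵢ, the minimizer of the finite sum.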


Related Research

Tight Lower Complexity Bounds for Strongly Convex Finite-Sum Optimization (10/17/2020)
Finite-sum optimization plays an important role in the area of machine l...

A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization (08/22/2019)
This paper studies the lower bound complexity for the optimization probl...

Tight Complexity Bounds for Optimizing Composite Objectives (05/25/2016)
We provide tight upper and lower bounds on the complexity of minimizing ...

The Convergence Rate of SGD's Final Iterate: Analysis on Dimension Dependence (06/28/2021)
Stochastic Gradient Descent (SGD) is among the simplest and most popular...

The Randomized k-Server Conjecture is False! (11/10/2022)
We prove a few new lower bounds on the randomized competitive ratio for ...

Limitations on Variance-Reduction and Acceleration Schemes for Finite Sum Optimization (06/06/2017)
We study the conditions under which one is able to efficiently apply var...

Tight Regret Bounds for Noisy Optimization of a Brownian Motion (01/25/2020)
We consider the problem of Bayesian optimization of a one-dimensional Br...
