Efficient Convex Optimization Requires Superlinear Memory

03/29/2022
by Annie Marsden et al.

We show that any memory-constrained, first-order algorithm which minimizes d-dimensional, 1-Lipschitz convex functions over the unit ball to 1/poly(d) accuracy using at most d^(1.25 - δ) bits of memory must make at least Ω̃(d^(1 + (4/3)δ)) first-order queries (for any constant δ ∈ [0, 1/4]). Consequently, the performance of such memory-constrained algorithms is a polynomial factor worse than the optimal Õ(d) query bound for this problem, which is achieved by cutting plane methods that use Õ(d^2) memory. This resolves a COLT 2019 open problem of Woodworth and Srebro.
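The tradeoff in the abstract contrasts two regimes: cutting plane methods attain the optimal Õ(d) query bound but store Õ(d^2) bits, while classical first-order methods such as projected subgradient descent use only O(d) memory at the cost of polynomially more queries. As a concrete illustration of the low-memory end, here is a minimal sketch (not from the paper) of projected subgradient descent on a 1-Lipschitz convex function over the unit ball; the toy objective f(x) = ‖x − c‖ and all parameter choices are illustrative assumptions:

```python
import numpy as np

def projected_subgradient(subgrad, d, T, R=1.0):
    """Projected subgradient descent over the Euclidean ball of radius R.

    Stores only the current iterate and a running average, i.e. O(d)
    memory -- the low-memory end of the memory/query tradeoff. For a
    1-Lipschitz objective it needs on the order of 1/eps^2 queries,
    polynomially more than the O~(d) of cutting-plane methods.
    """
    x = np.zeros(d)
    avg = np.zeros(d)
    for t in range(1, T + 1):
        g = subgrad(x)                 # one first-order query
        eta = R / np.sqrt(t)           # standard step size for Lipschitz f
        x = x - eta * g
        nrm = np.linalg.norm(x)
        if nrm > R:
            x = x * (R / nrm)          # project back onto the ball
        avg += (x - avg) / t           # running average of iterates
    return avg

# Toy 1-Lipschitz objective f(x) = ||x - c||, minimized at c.
d = 50
rng = np.random.default_rng(0)
c = rng.normal(size=d)
c *= 0.5 / np.linalg.norm(c)           # place the optimum inside the ball

def subgrad(x):
    diff = x - c
    n = np.linalg.norm(diff)
    return diff / n if n > 1e-12 else np.zeros(d)

x_hat = projected_subgradient(subgrad, d, T=5000)
print(np.linalg.norm(x_hat - c))       # small: error decays like 1/sqrt(T)
```

The lower bound in the paper says this memory/query gap is inherent: no first-order method with substantially subquadratic memory can match the Õ(d) query complexity.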


Related research:

- Memory-Query Tradeoffs for Randomized Convex Optimization (06/21/2023)
- Open Problem: Polynomial linearly-convergent method for geodesically convex optimization? (07/24/2023)
- Memory-Constrained Algorithms for Convex Optimization via Recursive Cutting-Planes (06/16/2023)
- Open Problem: The Oracle Complexity of Convex Optimization with Limited Memory (07/01/2019)
- Minimizing Sum of Non-Convex but Piecewise log-Lipschitz Functions using Coresets (07/23/2018)
- ReSQueing Parallel and Private Stochastic Convex Optimization (01/01/2023)
- Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity (08/07/2022)
