Memory-Query Tradeoffs for Randomized Convex Optimization

06/21/2023
by Xi Chen et al.

We show that any randomized first-order algorithm that minimizes a d-dimensional, 1-Lipschitz convex function over the unit ball must either use Ω(d^(2-δ)) bits of memory or make Ω(d^(1+δ/6-o(1))) queries, for any constant δ ∈ (0,1) and when the precision ϵ is quasipolynomially small in d. Our result implies that cutting plane methods, which use Õ(d^2) bits of memory and Õ(d) queries, are Pareto-optimal among randomized first-order algorithms, and that quadratic memory is required to achieve optimal query complexity for convex optimization.
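To make the quadratic-memory regime concrete, here is a minimal, illustrative sketch (not from the paper) of a classic cutting-plane method, the ellipsoid method, in Python. The names `ellipsoid_minimize`, `f`, and `subgrad` are hypothetical. The point of the sketch is the state it carries between oracle queries: one d-vector and one d×d matrix, i.e. Θ(d^2) numbers, which is the memory footprint the abstract identifies as Pareto-optimal. (The ellipsoid method itself makes O(d^2 log(1/ϵ)) queries; the Õ(d) query bound quoted above is achieved by center-of-mass-style cutting-plane methods.)

```python
import numpy as np

def ellipsoid_minimize(f, subgrad, d, radius=1.0, iters=500):
    """Minimize a convex f over the ball of the given radius with the
    ellipsoid method, a textbook cutting-plane scheme (illustrative
    sketch; assumes d >= 2).

    State kept between queries: the center c (d floats) and the matrix
    P (d*d floats), i.e. Theta(d^2) numbers -- the quadratic memory the
    abstract attributes to cutting-plane methods. Each iteration makes
    one query to the first-order oracle `subgrad`.
    """
    c = np.zeros(d)
    P = radius ** 2 * np.eye(d)   # ellipsoid {x : (x-c)^T P^{-1} (x-c) <= 1}
    best_x, best_f = c.copy(), f(c)
    for _ in range(iters):
        g = subgrad(c)            # one first-order query
        gnorm = np.sqrt(g @ P @ g)
        if gnorm < 1e-12:         # c is (numerically) a minimizer
            break
        Pg = P @ (g / gnorm)
        # Replace the ellipsoid by the minimum-volume ellipsoid that
        # contains the half cut off by {x : g^T (x - c) <= 0}, which
        # must contain the minimizer by convexity of f.
        c = c - Pg / (d + 1)
        P = (d ** 2 / (d ** 2 - 1.0)) * (P - (2.0 / (d + 1)) * np.outer(Pg, Pg))
        fc = f(c)
        if fc < best_f:
            best_x, best_f = c.copy(), fc
    return best_x

# Example: f(x) = ||x - a||_2 is 1-Lipschitz, and its minimizer over
# the unit ball is a itself (here ||a|| < 1).
d = 10
a = np.full(d, 0.2)
x = ellipsoid_minimize(lambda x: np.linalg.norm(x - a),
                       lambda x: (x - a) / max(np.linalg.norm(x - a), 1e-12),
                       d)
```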

Related research

Efficient Convex Optimization Requires Superlinear Memory (03/29/2022)
We show that any memory-constrained, first-order algorithm which minimiz...

Quadratic Memory is Necessary for Optimal Query Complexity in Convex Optimization: Center-of-Mass is Pareto-Optimal (02/09/2023)
We give query complexity lower bounds for convex optimization and the re...

Open Problem: The Oracle Complexity of Convex Optimization with Limited Memory (07/01/2019)
We note that known methods achieving the optimal oracle complexity for f...

On Zeroth-Order Stochastic Convex Optimization via Random Walks (02/11/2014)
We propose a method for zeroth order stochastic convex optimization that...

Memory-Constrained Algorithms for Convex Optimization via Recursive Cutting-Planes (06/16/2023)
We propose a family of recursive cutting-plane algorithms to solve feasi...

Optimal Binary Autoencoding with Pairwise Correlations (11/07/2016)
We formulate learning of a binary autoencoder as a biconvex optimization...

Open Problem: Polynomial linearly-convergent method for geodesically convex optimization? (07/24/2023)
Let f : ℳ → ℝ be a Lipschitz and geodesically convex function defined on a d...
