Information-constrained optimization: can adaptive processing of gradients help?

04/02/2021
by Jayadev Acharya, et al.

We revisit first-order optimization under local information constraints such as local privacy, gradient quantization, and computational constraints that limit access to a few coordinates of the gradient. In this setting, the optimization algorithm is not allowed to directly access the complete output of the gradient oracle; it only receives limited information about it, subject to the local information constraints. We study the role of adaptivity in processing the gradient output to obtain this limited information from it. We consider optimization for both convex and strongly convex functions and obtain tight or nearly tight lower bounds on the convergence rate when adaptive gradient processing is allowed. Prior work was restricted to convex functions and allowed only nonadaptive processing of gradients. For both of these function classes and for the three information constraints mentioned above, our lower bounds imply that adaptive processing of gradients cannot outperform nonadaptive processing in most regimes of interest. We complement these results by exhibiting a natural optimization problem under information constraints for which adaptive processing of gradients strictly outperforms nonadaptive processing.
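As a purely illustrative sketch of the setting (not the constructions or algorithms from the paper), the Python snippet below runs projected SGD on a toy strongly convex quadratic in which the optimizer never sees the raw stochastic gradient: it only receives a 1-bit-per-coordinate, unbiasedly quantized message. Because the quantizer is fixed in advance and ignores all past messages, this is an example of nonadaptive gradient processing; the quantizer, step sizes, and objective here are assumptions made for the example.

    import numpy as np

    np.random.seed(0)

    def quantize(g, bits=1, bound=1.0):
        # Unbiased stochastic quantizer: clip each coordinate to [-bound, bound]
        # and round randomly to one of 2**bits levels. The optimizer only ever
        # sees this message, never g itself; the rule is fixed in advance,
        # so the processing is nonadaptive.
        levels = 2 ** bits - 1
        g = np.clip(g, -bound, bound)
        scaled = (g + bound) / (2 * bound) * levels
        low = np.floor(scaled)
        q = low + (np.random.rand(*g.shape) < (scaled - low))  # randomized rounding
        return q / levels * (2 * bound) - bound                 # E[return] = clipped g

    def constrained_sgd(grad_oracle, x0, steps=2000, lr=0.2, bits=1):
        # Projected SGD over the unit box, driven only by quantized messages.
        x, avg = x0.copy(), np.zeros_like(x0)
        for t in range(steps):
            msg = quantize(grad_oracle(x), bits=bits)           # constrained channel
            x = np.clip(x - lr / np.sqrt(t + 1) * msg, -1.0, 1.0)
            avg += x
        return avg / steps                                      # averaged iterate

    # Toy objective f(x) = 0.5 * ||x - x_star||^2 with a noisy gradient oracle.
    d = 8
    x_star = 0.3 * np.ones(d)
    oracle = lambda x: (x - x_star) + 0.1 * np.random.randn(d)
    print("estimate:", np.round(constrained_sgd(oracle, np.zeros(d)), 3))

An adaptive scheme would instead be allowed to change the quantizer (e.g., its clipping range or levels) at each step based on the messages received so far; the paper's results compare the best achievable convergence rates of these two modes of processing.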


07/17/2020

Adaptive Gradient Methods for Constrained Convex Optimization

We provide new adaptive first-order methods for constrained convex optim...
04/08/2018

Distributed Non-Convex First-Order Optimization and Information Processing: Lower Complexity Bounds and Rate Optimal Algorithms

We consider a class of distributed non-convex optimization problems ofte...
08/22/2019

Finite Precision Stochastic Optimization -- Accounting for the Bias

We consider first order stochastic optimization where the oracle must qu...
04/19/2022

Making Progress Based on False Discoveries

We consider the question of adaptive data analysis within the framework ...
06/08/2021

Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks

We consider the task of minimizing the sum of smooth and strongly convex...
02/01/2021

Distributed Zero-Order Optimization under Adversarial Noise

We study the problem of distributed zero-order optimization for a class ...