Federated Zeroth-Order Optimization using Trajectory-Informed Surrogate Gradients

08/08/2023
by Yao Shu, et al.

Federated optimization, an emerging paradigm with wide real-world applications such as federated learning, enables multiple clients (e.g., edge devices) to collaboratively optimize a global function. The clients do not share their local datasets and typically share only their local gradients. However, gradient information is unavailable in many applications of federated optimization, which gives rise to the paradigm of federated zeroth-order optimization (ZOO). Existing federated ZOO algorithms suffer from query and communication inefficiency, which can be attributed to (a) their reliance on a substantial number of function queries for gradient estimation and (b) the significant disparity between their realized local updates and the intended global updates. To address this, we (a) introduce trajectory-informed gradient surrogates that exploit the history of function queries made during optimization for accurate and query-efficient gradient estimation, and (b) develop an adaptive gradient correction technique based on these surrogates to mitigate the aforementioned disparity. Building on these components, we propose the federated zeroth-order optimization using trajectory-informed surrogate gradients (FZooS) algorithm for query- and communication-efficient federated ZOO. FZooS achieves theoretical improvements over existing approaches, which are corroborated by our real-world experiments such as federated black-box adversarial attack and federated non-differentiable metric optimization.
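To make the query-efficiency argument concrete, the sketch below contrasts a classical zeroth-order gradient estimate, which spends fresh function queries at every step, with an illustrative trajectory-informed surrogate that fits a locally weighted linear model to past queries and reads the gradient off its slope. This is a minimal, hypothetical Python sketch of the general idea under those assumptions, not the FZooS construction itself; the function names and the linear-surrogate choice are illustrative and do not appear in the paper.

```python
import numpy as np

def zo_gradient_finite_diff(f, x, num_queries=20, mu=1e-3, rng=None):
    """Classical zeroth-order gradient estimate via random finite differences.
    Every call spends num_queries + 1 fresh function evaluations."""
    rng = rng if rng is not None else np.random.default_rng()
    d = x.shape[0]
    fx = f(x)
    grad = np.zeros(d)
    for _ in range(num_queries):
        u = rng.standard_normal(d)
        grad += (f(x + mu * u) - fx) / mu * u
    return grad / num_queries

def surrogate_gradient_from_history(history_x, history_y, x, ridge=1e-6):
    """Illustrative trajectory-informed surrogate gradient: fit a locally
    weighted linear model to *past* queries and return its slope, so the
    estimate reuses the optimization trajectory instead of new queries."""
    X = np.asarray(history_x, dtype=float)        # (n, d) past query points
    y = np.asarray(history_y, dtype=float)        # (n,)   past function values
    diffs = X - x                                 # center at current iterate
    w = np.exp(-np.sum(diffs ** 2, axis=1))       # nearby queries weigh more
    Z = np.hstack([np.ones((X.shape[0], 1)), diffs])   # [intercept | offsets]
    W = np.diag(w)
    coef = np.linalg.solve(Z.T @ W @ Z + ridge * np.eye(Z.shape[1]), Z.T @ W @ y)
    return coef[1:]                               # drop intercept, keep slope

# Example (illustrative): reuse the query trajectory of a simple quadratic.
f = lambda z: float(np.sum((z - 1.0) ** 2))
rng = np.random.default_rng(0)
hist_x = [rng.standard_normal(3) for _ in range(30)]
hist_y = [f(z) for z in hist_x]
x_t = np.zeros(3)
print(surrogate_gradient_from_history(hist_x, hist_y, x_t))  # ≈ ∇f(x_t) = (-2, -2, -2)
```

The point of the contrast is the query budget: the finite-difference estimator pays for new evaluations at every iterate, whereas a surrogate fitted to the trajectory produces a gradient estimate from queries that have already been made.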


