Efficient Algorithms for Federated Saddle Point Optimization

02/12/2021
by Charlie Hou, et al.

We consider strongly convex-concave minimax problems in the federated setting, where the communication constraint is the main bottleneck. When clients are arbitrarily heterogeneous, a simple Minibatch Mirror-prox achieves the best performance. As the clients become more homogeneous, using multiple local gradient updates at the clients significantly improves upon Minibatch Mirror-prox by communicating less frequently. Our goal is to design an algorithm that can harness the benefit of similarity in the clients while recovering the Minibatch Mirror-prox performance under arbitrary heterogeneity (up to log factors). We give the first federated minimax optimization algorithm that achieves this goal. The main idea is to combine (i) SCAFFOLD (an algorithm that performs variance reduction across clients for convex optimization) to erase the worst-case dependency on heterogeneity and (ii) Catalyst (a framework for acceleration based on modifying the objective) to accelerate convergence without amplifying client drift. We prove that this algorithm achieves our goal, and include experiments to validate the theory.
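
To make the recipe concrete, below is a minimal sketch in Python (not the authors' released code) of the drift-correction ingredient in the Euclidean case, where mirror-prox reduces to the extragradient method: each client runs local extragradient steps on a synthetic strongly convex-concave quadratic, with SCAFFOLD-style control variates (c_i, c) correcting the client drift. The quadratic client objectives, full participation, step sizes, and all variable names are illustrative assumptions; the paper's algorithm additionally wraps such inner loops in the Catalyst acceleration framework, which is omitted here.

import numpy as np

# Illustrative sketch only: synthetic clients, full participation,
# deterministic gradients; not the paper's implementation.
rng = np.random.default_rng(0)
n_clients, d = 4, 3
rounds, local_steps, lr = 200, 5, 0.05

def make_client():
    # Client objective f_i(x, y) = 0.5 x'Ax + x'Cy - 0.5 y'By + a'x - b'y,
    # strongly convex in x and strongly concave in y.
    MA = rng.standard_normal((d, d))
    MB = rng.standard_normal((d, d))
    A = np.eye(d) + 0.1 * MA @ MA.T
    B = np.eye(d) + 0.1 * MB @ MB.T
    C = 0.3 * rng.standard_normal((d, d))
    a, b = rng.standard_normal(d), rng.standard_normal(d)
    # Saddle-point operator F_i(z) = (grad_x f_i, -grad_y f_i) is the
    # affine map z -> M z + q, with strongly monotone M.
    M = np.block([[A, C], [-C.T, B]])
    q = np.concatenate([a, b])
    return M, q

clients = [make_client() for _ in range(n_clients)]
M_bar = np.mean([M for M, _ in clients], axis=0)
q_bar = np.mean([q for _, q in clients], axis=0)
z_star = np.linalg.solve(M_bar, -q_bar)  # saddle point of the average objective

z = np.zeros(2 * d)                                # global iterate (x, y)
c_i = [np.zeros(2 * d) for _ in range(n_clients)]  # client control variates
c = np.zeros(2 * d)                                # server control variate

for _ in range(rounds):
    z_locals = []
    for i, (M, q) in enumerate(clients):
        z_loc = z.copy()
        for _ in range(local_steps):
            # Extragradient step on the drift-corrected operator
            # F_i(.) - c_i + c, as in SCAFFOLD.
            g = M @ z_loc + q - c_i[i] + c
            z_half = z_loc - lr * g               # extrapolation point
            g_half = M @ z_half + q - c_i[i] + c
            z_loc = z_loc - lr * g_half           # corrected update
        # Control-variate refresh in the style of SCAFFOLD's "option II".
        c_i[i] = c_i[i] - c + (z - z_loc) / (local_steps * lr)
        z_locals.append(z_loc)
    z = np.mean(z_locals, axis=0)   # full participation: server averages
    c = np.mean(c_i, axis=0)

print("distance to saddle point:", np.linalg.norm(z - z_star))

Without the correction (c_i and c held at zero), the same loop is plain local extragradient, whose fixed point under heterogeneous clients generally differs from the saddle point of the average objective; that gap is the client drift the control variates are designed to remove.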
