Communication-Efficient Gradient Descent-Ascent Methods for Distributed Variational Inequalities: Unified Analysis and Local Updates

06/08/2023
by Siqi Zhang, et al.

Distributed and federated learning algorithms and techniques have been associated primarily with minimization problems. However, with the growing importance of minimax optimization and variational inequality problems in machine learning, designing efficient distributed/federated learning approaches for these problems is becoming increasingly necessary. In this paper, we provide a unified convergence analysis of communication-efficient local training methods for distributed variational inequality problems (VIPs). Our approach is based on a general key assumption on the stochastic estimates that allows us to propose and analyze several novel local training algorithms under a single framework for solving a class of structured non-monotone VIPs. We present the first local gradient descent-ascent algorithms with provable improved communication complexity for solving distributed variational inequalities on heterogeneous data. The general algorithmic framework recovers state-of-the-art algorithms and their sharp convergence guarantees when the setting is specialized to minimization or minimax optimization problems. Finally, we demonstrate the strong performance of the proposed algorithms compared to state-of-the-art methods when solving federated minimax optimization problems.
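To make the local-update template concrete, below is a minimal sketch (not the authors' exact algorithm) of local stochastic gradient descent-ascent with periodic averaging, the communication-saving pattern the paper analyzes. The quadratic client objectives, client count, step size, and noise level are illustrative assumptions chosen so the global saddle point is at the origin.

    import numpy as np

    # Hypothetical toy problem: client m holds the strongly-convex-strongly-concave
    # objective f_m(x, y) = 0.5*a_m*x^2 + b_m*x*y - 0.5*c_m*y^2, so the saddle
    # point of the average objective (1/M) * sum_m f_m is (0, 0).
    rng = np.random.default_rng(0)
    M, R, K = 10, 200, 5          # clients, communication rounds, local steps per round
    lr = 0.05                     # step size for both the descent (x) and ascent (y) updates
    a = rng.uniform(1.0, 2.0, M)  # heterogeneous client curvatures
    b = rng.uniform(-1.0, 1.0, M)
    c = rng.uniform(1.0, 2.0, M)

    x = np.full(M, 3.0)           # per-client copies of the min variable
    y = np.full(M, -2.0)          # per-client copies of the max variable

    for r in range(R):
        for _ in range(K):        # local updates: no communication during these steps
            gx = a * x + b * y + 0.01 * rng.standard_normal(M)  # stochastic grad_x f_m
            gy = b * x - c * y + 0.01 * rng.standard_normal(M)  # stochastic grad_y f_m
            x, y = x - lr * gx, y + lr * gy  # simultaneous descent-ascent step
        x[:] = x.mean()           # one communication round: average the iterates
        y[:] = y.mean()

    print(f"distance to saddle point after {R} rounds: {np.hypot(x[0], y[0]):.2e}")

Communicating only every K local steps cuts the number of communication rounds by roughly a factor of K relative to synchronizing after every gradient step, which is the trade-off the paper's communication-complexity bounds quantify.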


