Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods

02/15/2022
by Aleksandr Beznosikov, et al.

Stochastic Gradient Descent-Ascent (SGDA) is one of the most prominent algorithms for solving min-max optimization problems and variational inequalities (VIPs) arising in various machine learning tasks. The success of the method has led to several advanced extensions of classical SGDA, including variants with arbitrary sampling, variance reduction, coordinate randomization, and distributed variants with compression, which have been extensively studied in the literature, especially in recent years. In this paper, we propose a unified convergence analysis that covers a large variety of stochastic gradient descent-ascent methods, which so far required different intuitions, have different applications, and were developed separately in various communities. The key to our unified framework is a parametric assumption on the stochastic estimates. Via our general theoretical framework, we either recover the sharpest known rates for the known special cases or tighten them. Moreover, to illustrate the flexibility of our approach, we develop several new variants of SGDA, such as a new variance-reduced method (L-SVRGDA), new distributed methods with compression (QSGDA, DIANA-SGDA, VR-DIANA-SGDA), and a new method with coordinate randomization (SEGA-SGDA). Although variants of the new methods are known for solving minimization problems, they have never been considered or analyzed for solving min-max problems and VIPs. We also demonstrate the most important properties of the new methods through extensive numerical experiments.
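To ground the baseline the paper builds on, here is a minimal sketch of SGDA on a toy strongly-convex-strongly-concave quadratic saddle-point problem, f(x, y) = (a/2)||x||^2 + x^T B y - (c/2)||y||^2, whose unique saddle point is (0, 0). The problem instance, the additive Gaussian gradient noise, the step size, and the iteration budget are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal SGDA sketch on f(x, y) = (a/2)||x||^2 + x^T B y - (c/2)||y||^2.
# The unique saddle point is (x*, y*) = (0, 0).
rng = np.random.default_rng(0)
d, a, c = 5, 1.0, 1.0
B = rng.standard_normal((d, d))

def stoch_grad(x, y, noise=0.1):
    """Unbiased stochastic estimates of grad_x f and grad_y f
    (exact gradients plus illustrative Gaussian noise)."""
    gx = a * x + B @ y + noise * rng.standard_normal(d)
    gy = B.T @ x - c * y + noise * rng.standard_normal(d)
    return gx, gy

x, y = rng.standard_normal(d), rng.standard_normal(d)
step = 0.02  # small constant step; must be tuned to the problem's smoothness
for _ in range(5000):
    gx, gy = stoch_grad(x, y)
    x = x - step * gx  # descent step on the minimization variable
    y = y + step * gy  # ascent step on the maximization variable

print(np.linalg.norm(x), np.linalg.norm(y))  # both should be close to 0
```

With a constant step size, plain SGDA only reaches a noise-dominated neighborhood of the saddle point; that residual variance is exactly what extensions such as the paper's variance-reduced L-SVRGDA are designed to remove.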


