
DIPPA: An improved Method for Bilinear Saddle Point Problems

03/15/2021
by   Guangzeng Xie, et al.

This paper studies bilinear saddle point problems min_x max_y g(x) + x^⊤ A y − h(y), where the functions g and h are smooth and strongly convex. When gradient and proximal oracles for g and h are accessible, optimal algorithms have already been developed in the literature <cit.>. However, the proximal operator is not always easy to compute, especially in constrained zero-sum matrix games <cit.>. This work proposes a new algorithm that requires access only to the gradients of g and h. Our algorithm achieves a complexity upper bound of 𝒪̃( ‖A‖_2/√(μ_x μ_y) + √(κ_x κ_y (κ_x + κ_y)) ), which has optimal dependency on the coupling condition number ‖A‖_2/√(μ_x μ_y) up to logarithmic factors.
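To illustrate the access model the abstract describes, the sketch below solves a toy instance of min_x max_y g(x) + x^⊤ A y − h(y) using only gradient evaluations of g and h. This is plain simultaneous gradient descent-ascent, not the DIPPA algorithm itself; the quadratic choices of g and h, the random matrix A, and the step size are illustrative assumptions, chosen so the unique saddle point is (0, 0).

```python
import numpy as np

# Toy bilinear saddle-point problem min_x max_y g(x) + x^T A y - h(y)
# with g(x) = (mu_x/2)||x||^2 and h(y) = (mu_y/2)||y||^2, so the unique
# saddle point is (x*, y*) = (0, 0). Plain gradient descent-ascent,
# shown only to illustrate the gradient-oracle access model.
rng = np.random.default_rng(0)
n, m = 5, 4
mu_x, mu_y = 1.0, 1.0
A = rng.standard_normal((n, m))

def grad_x(x, y):
    # gradient in x of g(x) + x^T A y - h(y)
    return mu_x * x + A @ y

def grad_y(x, y):
    # gradient in y of g(x) + x^T A y - h(y)
    return A.T @ x - mu_y * y

x = rng.standard_normal(n)
y = rng.standard_normal(m)
# conservative step size relative to the coupling strength ||A||_2
eta = 0.1 / (1.0 + np.linalg.norm(A, 2))
for _ in range(2000):
    x, y = x - eta * grad_x(x, y), y + eta * grad_y(x, y)

print(np.linalg.norm(x), np.linalg.norm(y))  # both near 0
```

For strongly-convex-strongly-concave objectives like this one, descent-ascent with a small enough step contracts toward the saddle point; the paper's contribution is an accelerated scheme whose rate depends on the coupling condition number ‖A‖_2/√(μ_x μ_y) rather than on this simple scheme's slower contraction.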


Related research:

06/11/2020 · Improved Algorithms for Convex-Concave Minimax Optimization
08/22/2019 · A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization
04/21/2022 · Optimal Scaling for the Proximal Langevin Algorithm in High Dimensions
03/15/2021 · Lower Complexity Bounds of Finite-Sum Optimization Problems: The Results and Construction
06/17/2022 · Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization
04/20/2018 · On the Location of the Minimizer of the Sum of Strongly Convex Functions
10/05/2020 · Average-case Acceleration for Bilinear Games and Normal Matrices