
# DIPPA: An improved Method for Bilinear Saddle Point Problems

This paper studies bilinear saddle point problems min_x max_y g(x) + x^⊤Ay - h(y), where the functions g and h are smooth and strongly convex. When gradient and proximal oracles for g and h are available, optimal algorithms have already been developed in the literature <cit.>. However, the proximal operator is not always easy to compute, especially in constrained zero-sum matrix games <cit.>. This work proposes a new algorithm that only requires access to the gradients of g and h. Our algorithm achieves a complexity upper bound 𝒪̃(‖A‖_2/√(μ_x μ_y) + √(κ_x κ_y (κ_x + κ_y))), which has optimal dependency on the coupling condition number ‖A‖_2/√(μ_x μ_y) up to logarithmic factors.
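Since the setting only assumes first-order (gradient) access to g and h, a small self-contained sketch may help make that oracle model concrete. The snippet below is not DIPPA itself: it builds a toy strongly-convex-strongly-concave quadratic instance of min_x max_y g(x) + x^⊤Ay - h(y) and runs plain simultaneous gradient descent-ascent on it. The matrices Gx, Gy, A, the linear terms, and the step size eta are all illustrative assumptions, not quantities from the paper.

```python
# Illustrative sketch (NOT the paper's DIPPA method): set up a toy instance of
#   min_x max_y  g(x) + x^T A y - h(y)
# with strongly convex quadratics g, h, and run plain simultaneous gradient
# descent-ascent using only the gradient oracles. All matrices, linear terms,
# and the step size `eta` are made-up illustration values.
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 15

# Strongly convex quadratics: g(x) = 0.5 x^T Gx x + bx^T x,  h(y) = 0.5 y^T Gy y + by^T y.
mu_x = mu_y = 1.0
Bx = rng.standard_normal((n, n)) / np.sqrt(n)
By = rng.standard_normal((m, m)) / np.sqrt(m)
Gx = Bx.T @ Bx + mu_x * np.eye(n)                 # Hessian of g (eigenvalues >= mu_x)
Gy = By.T @ By + mu_y * np.eye(m)                 # Hessian of h (eigenvalues >= mu_y)
A = rng.standard_normal((n, m)) / np.sqrt(n * m)  # coupling matrix
bx, by = rng.standard_normal(n), rng.standard_normal(m)

def grad_x(x, y):
    """Gradient of g(x) + x^T A y - h(y) with respect to x."""
    return Gx @ x + bx + A @ y

def grad_y(x, y):
    """Gradient of g(x) + x^T A y - h(y) with respect to y."""
    return A.T @ x - (Gy @ y + by)

# Simultaneous gradient descent-ascent; it converges linearly on this instance
# because the problem is strongly-convex-strongly-concave and eta is small,
# though at a slower rate than the proximal-point-based schemes discussed above.
x, y = np.zeros(n), np.zeros(m)
eta = 0.05
for _ in range(5000):
    gx, gy = grad_x(x, y), grad_y(x, y)
    x, y = x - eta * gx, y + eta * gy

residual = np.linalg.norm(np.concatenate([grad_x(x, y), grad_y(x, y)]))
print(f"first-order residual at the computed point: {residual:.2e}")
```

Here κ_x and κ_y in the bound above are the usual condition numbers of g and h (smoothness over strong-convexity modulus), so the toy instance corresponds to small, well-conditioned κ_x, κ_y.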
