Faster Projection-Free Augmented Lagrangian Methods via Weak Proximal Oracle

10/25/2022
by   Dan Garber, et al.

This paper considers a convex composite optimization problem with affine constraints, which includes problems that take the form of minimizing a smooth convex objective function over the intersection of (simple) convex sets, or regularized with multiple (simple) functions. Motivated by high-dimensional applications in which exact projection/proximal computations are not tractable, we propose a projection-free augmented Lagrangian-based method, in which primal updates are carried out using a weak proximal oracle (WPO). In an earlier work, the WPO was shown to be more powerful than the standard linear minimization oracle (LMO) that underlies conditional gradient methods (aka Frank-Wolfe methods). Moreover, the WPO is computationally tractable for many high-dimensional problems of interest, including those motivated by recovery of low-rank matrices and tensors, and optimization over polytopes that admit efficient LMOs. The main result of this paper shows that under a certain curvature assumption (which is weaker than strong convexity), our WPO-based algorithm achieves an ergodic convergence rate of O(1/T) for both the objective residual and the feasibility gap. This result, to the best of our knowledge, improves upon the O(1/√T) rate of existing LMO-based projection-free methods for this class of problems. Empirical experiments on a low-rank and sparse covariance matrix estimation task and the Max Cut semidefinite relaxation demonstrate the superiority of our method over state-of-the-art LMO-based Lagrangian methods.
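To make the oracle distinction concrete, the sketch below contrasts the two primitives on the unit spectrahedron {X ⪰ 0, tr(X) = 1}, a feasible set typical of the low-rank matrix problems mentioned above. The LMO returns a rank-1 extreme point (a leading eigenvector computation); the "weak" proximal step is sketched as a rank-r truncated approximation of the exact Euclidean projection. All function names are ours, and this rank-r surrogate is only an illustration in the spirit of a WPO, not the paper's formal oracle definition.

```python
import numpy as np

def lmo_spectrahedron(G):
    """LMO over {X PSD, tr(X) = 1}: argmin_X <G, X> is v v^T,
    where v is the eigenvector of G's smallest eigenvalue."""
    _, V = np.linalg.eigh(G)       # eigenvalues in ascending order
    v = V[:, 0]                    # eigenvector of the smallest eigenvalue
    return np.outer(v, v)          # rank-1 extreme point of the spectrahedron

def simplex_projection(v):
    """Euclidean projection of v onto the probability simplex
    (standard sort-based algorithm)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def wpo_spectrahedron(Y, r):
    """Illustrative weak proximal step (our construction): approximate the
    projection of a symmetric Y onto the spectrahedron using only its top-r
    eigenpairs, i.e. a rank-r surrogate of the exact (full-spectrum) projection."""
    w, V = np.linalg.eigh(Y)
    w_r, V_r = w[-r:], V[:, -r:]          # keep the top-r eigenpairs
    lam = simplex_projection(w_r)         # project eigenvalues onto the simplex
    return (V_r * lam) @ V_r.T            # rank <= r, PSD, trace 1
```

The point of the contrast: the LMO needs one extreme eigenpair per call but drives Frank-Wolfe-type rates, whereas a (weak) proximal step like the rank-r surrogate returns a richer low-rank iterate per call, which is the kind of extra power the O(1/T) rate above exploits.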


Related research

- 04/09/2018 — Frank-Wolfe Splitting via Augmented Lagrangian Method
  Minimizing a function over an intersection of convex sets is an importan...
- 05/11/2020 — Inexact and Stochastic Generalized Conditional Gradient with Augmented Lagrangian and Proximal Step
  In this paper we propose and analyze inexact and stochastic versions of ...
- 10/02/2019 — Global exponential stability of primal-dual gradient flow dynamics based on the proximal augmented Lagrangian: A Lyapunov-based approach
  For a class of nonsmooth composite optimization problems with linear equ...
- 01/12/2020 — The Proximal Method of Multipliers for a Class of Nonsmooth Convex Optimization
  This paper develops the proximal method of multipliers for a class of no...
- 09/27/2018 — Fast Stochastic Algorithms for Low-rank and Nonsmooth Matrix Problems
  Composite convex optimization problems which include both a nonsmooth te...
- 10/27/2020 — Faster Lagrangian-Based Methods in Convex Optimization
  In this paper, we aim at unifying, simplifying, and improving the conver...
- 06/06/2015 — Classification and regression using an outer approximation projection-gradient method
  This paper deals with sparse feature selection and grouping for classifi...
