Zeroth and First Order Stochastic Frank-Wolfe Algorithms for Constrained Optimization

07/14/2021
by   Zeeshan Akhtar, et al.

This paper considers stochastic convex optimization problems with two sets of constraints: (a) deterministic constraints on the domain of the optimization variable, which are difficult to project onto; and (b) deterministic or stochastic constraints that admit efficient projection. Problems of this form arise frequently in the context of semidefinite programming, as well as when various NP-hard problems are solved approximately via semidefinite relaxation. Since projection onto the first set of constraints is difficult, it becomes necessary to explore projection-free algorithms, such as the stochastic Frank-Wolfe (FW) algorithm. The second set of constraints, on the other hand, cannot be handled in the same way and must be incorporated as an indicator function within the objective, which complicates the application of FW methods. Similar problems have been studied before and solved with first-order stochastic FW algorithms by applying homotopy and Nesterov's smoothing techniques to the indicator function. This work improves upon these existing results and puts forth momentum-based first-order methods that yield improved convergence rates, on par with the best known rates for problems without the second set of constraints. Zeroth-order variants of the proposed algorithms are also developed and again improve upon the state-of-the-art rate results. The efficacy of the proposed algorithms is tested on relevant applications: sparse matrix estimation, clustering via semidefinite relaxation, and the uniform sparsest cut problem.
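To make the approach concrete, the following is a minimal sketch of a momentum-based stochastic Frank-Wolfe loop in which the easy-to-project set is folded into the objective through a smoothed indicator (Nesterov smoothing with a homotopy schedule on the smoothing parameter). This is not the paper's exact algorithm: the callables stoch_grad, lmo, and proj_easy are hypothetical placeholders, and the momentum, step-size, and smoothing schedules are generic assumptions for illustration only.

```python
import numpy as np

# Assumed problem interface (names are illustrative, not from the paper):
#   stoch_grad(x) -- unbiased stochastic gradient of the objective f at x
#   lmo(d)        -- linear minimization oracle over the "hard" set X:
#                    returns argmin_{v in X} <d, v>
#   proj_easy(x)  -- Euclidean projection onto the "easy" set C
# The smoothed indicator of C is taken as (1/(2*mu)) * dist(x, C)^2,
# whose gradient is (x - proj_easy(x)) / mu (standard Nesterov smoothing).

def momentum_stochastic_fw(x0, stoch_grad, lmo, proj_easy,
                           T=1000, rho0=0.5, eta0=1.0, mu0=1.0):
    """Sketch of a momentum-based stochastic Frank-Wolfe method.

    Schedules for rho (momentum), eta (step size), and mu (smoothing)
    are placeholder choices, not the rates analyzed in the paper.
    """
    x = x0.copy()
    d = np.zeros_like(x)  # recursive (momentum-averaged) gradient estimate
    for t in range(1, T + 1):
        rho = rho0 / t ** (2.0 / 3.0)   # momentum weight (assumed schedule)
        eta = eta0 / t                   # FW step size (assumed schedule)
        mu = mu0 / np.sqrt(t)            # homotopy: smoothing shrinks over time

        # Gradient of the smoothed indicator of the easy set C.
        g_pen = (x - proj_easy(x)) / mu

        # Momentum averaging of the stochastic gradient of the full objective.
        d = (1.0 - rho) * d + rho * (stoch_grad(x) + g_pen)

        # Projection-free step: linear minimization oracle over the hard set X.
        v = lmo(d)
        x = x + eta * (v - x)
    return x
```

In a zeroth-order variant, stoch_grad would be replaced by a finite-difference gradient estimator built from function evaluations only; the rest of the loop structure stays the same.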


