Towards Gradient Free and Projection Free Stochastic Optimization

10/08/2018
by Anit Kumar Sahu, et al.

This paper focuses on the problem of constrained stochastic optimization. A zeroth-order Frank-Wolfe algorithm is proposed which, in addition to retaining the projection-free nature of the vanilla Frank-Wolfe algorithm, is also gradient free. Under convexity and smoothness assumptions, we show that the proposed algorithm converges to the optimal objective function value at a rate of O(1/T^{1/3}), where T denotes the iteration count. In particular, the primal sub-optimality gap is shown to have a dimension dependence of O(d^{1/3}), which is the best known dimension dependence among all zeroth-order optimization algorithms that use one directional derivative per iteration. For non-convex functions, the Frank-Wolfe gap is shown to be O(d^{1/3} T^{-1/4}). Experiments on black-box optimization setups demonstrate the efficacy of the proposed algorithm.
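The paper's exact update rules are not reproduced here, but the zeroth-order Frank-Wolfe template it builds on can be sketched as follows: each iteration spends one directional derivative (two function queries) on a gradient surrogate, recursively averages it to tame the estimation noise, and then takes a projection-free step toward a linear-minimization-oracle (LMO) vertex. The smoothing radius, averaging schedule, and l1-ball constraint below are illustrative assumptions, not the paper's specific choices.

```python
import numpy as np

def zo_frank_wolfe(f, x0, lmo, T, delta=1e-3, seed=0):
    """Sketch of a zeroth-order Frank-Wolfe loop (illustrative, not the
    paper's exact variant): one random directional derivative per step,
    recursively averaged into a gradient surrogate, then an LMO update."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    g = np.zeros_like(x)
    for t in range(1, T + 1):
        z = rng.standard_normal(x.size)            # random Gaussian direction
        dd = (f(x + delta * z) - f(x)) / delta     # one directional derivative
        rho = min(1.0, 4.0 / (t + 3) ** (2.0 / 3.0))  # assumed averaging schedule
        g = (1.0 - rho) * g + rho * dd * z         # smoothed gradient estimate
        v = lmo(g)                                 # linear minimization oracle
        gamma = 2.0 / (t + 2)                      # classic Frank-Wolfe step size
        x = (1.0 - gamma) * x + gamma * v          # projection-free convex update
    return x

def lmo_l1_ball(g, radius=1.0):
    """argmin_{||v||_1 <= radius} <g, v>: a signed vertex of the l1 ball."""
    i = int(np.argmax(np.abs(g)))
    v = np.zeros_like(g)
    v[i] = -radius * np.sign(g[i])
    return v
```

Because each update is a convex combination of feasible points, the iterates remain in the constraint set without any projection, which is the appeal of Frank-Wolfe in this black-box setting.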
