On the Complexity of Finding Small Subgradients in Nonsmooth Optimization

09/21/2022
by Guy Kornowski et al.

We study the oracle complexity of producing (δ,ϵ)-stationary points of Lipschitz functions, in the sense proposed by Zhang et al. [2020]. While there exist dimension-free randomized algorithms that produce such points within O(1/(δϵ^3)) first-order oracle calls, we show that no dimension-free rate can be achieved by a deterministic algorithm. On the other hand, we point out that this rate can be derandomized for smooth functions with merely a logarithmic dependence on the smoothness parameter. Moreover, we establish several lower bounds for this task that hold for any randomized algorithm, with or without convexity. Finally, we show how the convergence rate of finding (δ,ϵ)-stationary points can be improved when the function is convex, a setting we motivate by proving that in general no finite-time algorithm can produce points with small subgradients even for convex functions.
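
For context, here is a minimal sketch of the notion in question, paraphrasing (rather than quoting) the definition of Zhang et al. [2020] in terms of the Goldstein δ-subdifferential of a Lipschitz function f:

\[
  \partial_\delta f(x) \;:=\; \operatorname{conv}\Bigl(\,\bigcup_{\lVert y - x\rVert \le \delta} \partial f(y)\Bigr),
  \qquad
  x \text{ is } (\delta,\epsilon)\text{-stationary}
  \;\Longleftrightarrow\;
  \operatorname{dist}\bigl(0,\ \partial_\delta f(x)\bigr) \le \epsilon,
\]

where ∂f denotes the Clarke subdifferential. In words, x is (δ,ϵ)-stationary if some convex combination of subgradients taken at points within distance δ of x has norm at most ϵ; taking δ = 0 recovers the usual notion of an ϵ-stationary point.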

