Deterministic Nonsmooth Nonconvex Optimization

02/16/2023
by Michael I. Jordan, et al.

We study the complexity of optimizing nonsmooth nonconvex Lipschitz functions by producing (δ,ϵ)-stationary points. Several recent works have presented randomized algorithms that produce such points using Õ(δ^-1 ϵ^-3) first-order oracle calls, independent of the dimension d. It has been an open problem whether a similar result can be obtained via a deterministic algorithm. We resolve this open problem, showing that randomization is necessary to obtain a dimension-free rate. In particular, we prove a lower bound of Ω(d) for any deterministic algorithm. Moreover, we show that, unlike smooth or convex optimization, access to function values is required for any deterministic algorithm to halt within any finite time. On the other hand, we prove that if the function is even slightly smooth, then the dimension-free rate of Õ(δ^-1 ϵ^-3) can be obtained by a deterministic algorithm with merely a logarithmic dependence on the smoothness parameter. Motivated by these findings, we turn to study the complexity of deterministically smoothing Lipschitz functions. Although efficient black-box randomized smoothing procedures exist, we begin by showing that no deterministic procedure can smooth functions in a meaningful manner, resolving an open question. We then bypass this impossibility result for the structured case of ReLU neural networks. To that end, in a practical white-box setting in which the optimizer is granted access to the network's architecture, we propose a simple, dimension-free, deterministic smoothing procedure that provably preserves (δ,ϵ)-stationary points. Our method applies to a variety of architectures of arbitrary depth, including ResNets and ConvNets. Combined with our algorithm, this yields the first deterministic, dimension-free algorithm for optimizing ReLU networks, circumventing our lower bound.
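
To make the two smoothing regimes in the abstract concrete, here is a minimal Python sketch, not the paper's implementation: a Monte Carlo estimator for classical randomized (uniform-ball) smoothing, and a deterministic white-box smoothing of a toy ReLU MLP obtained by swapping each ReLU for a softplus surrogate. The function names (`randomized_smoothing_grad`, `smoothed_relu_mlp`) and the choice of softplus as the smooth surrogate are illustrative assumptions; the paper's construction for general architectures is more involved.

```python
import numpy as np

def randomized_smoothing_grad(f, x, delta, num_samples=1000, rng=None):
    """Black-box randomized smoothing of a Lipschitz function f.

    The smoothed surrogate is f_delta(x) = E_{u ~ Unif(ball)}[f(x + delta*u)].
    Its gradient admits the classical sphere-sampling identity
        grad f_delta(x) = (d/delta) * E_{v ~ Unif(sphere)}[f(x + delta*v) * v],
    estimated here by Monte Carlo. The sampling is precisely the randomness
    that, per the paper, no deterministic black-box procedure can replicate.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    grad = np.zeros(d)
    for _ in range(num_samples):
        v = rng.normal(size=d)
        v /= np.linalg.norm(v)  # uniform direction on the unit sphere
        grad += (d / delta) * f(x + delta * v) * v
    return grad / num_samples

def softplus(z, beta):
    """Smooth surrogate for ReLU: softplus_beta(z) = log(1 + exp(beta*z)) / beta.

    Satisfies 0 <= softplus_beta(z) - relu(z) <= log(2)/beta for all z, so the
    approximation error shrinks as beta grows, and only log(beta) needs to
    enter the rate -- consistent with the logarithmic smoothness dependence
    mentioned in the abstract."""
    return np.logaddexp(0.0, beta * z) / beta  # numerically stable log(1+e^{beta*z})

def smoothed_relu_mlp(x, layers, beta):
    """White-box deterministic smoothing of a toy fully connected ReLU network:
    keep the weights, replace each ReLU activation by softplus_beta.
    `layers` is a list of (W, b) pairs; the final layer is linear."""
    h = x
    for W, b in layers[:-1]:
        h = softplus(W @ h + b, beta)
    W, b = layers[-1]
    return W @ h + b
```

Note that only the first routine draws random samples; the second transformation is entirely deterministic, with per-activation error at most log(2)/β. How that error propagates with depth depends on the layers' Lipschitz constants, which is where a white-box analysis of the architecture comes in.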

