On the Complexity of Deterministic Nonsmooth and Nonconvex Optimization

by Michael I. Jordan, et al.

In this paper, we present several new results on minimizing a nonsmooth and nonconvex function under a Lipschitz condition. Recent work has shown that while the classical notion of Clarke stationarity is computationally intractable to achieve up to some sufficiently small constant tolerance, randomized first-order algorithms can find a (δ, ϵ)-Goldstein stationary point with a complexity bound of Õ(δ^-1ϵ^-3), which is independent of the dimension d ≥ 1 <cit.>. However, deterministic algorithms have not been fully explored, leaving several open problems in nonsmooth nonconvex optimization. Our first contribution is to demonstrate that randomization is necessary for a dimension-independent guarantee: we prove a lower bound of Ω(d) for any deterministic algorithm with access to both first-order (gradient) and zeroth-order (function value) oracles. Furthermore, we show that the zeroth-order oracle is essential for a finite-time convergence guarantee, by proving that any deterministic algorithm with access only to the first-order oracle cannot find an approximate Goldstein stationary point within a finite number of iterations, up to a sufficiently small constant parameter and tolerance. Finally, we propose a deterministic smoothing approach under the arithmetic circuit model, in which the resulting smoothness parameter is exponential in a certain parameter M > 0 (e.g., the number of nodes in the representation of the function), and we design a new deterministic first-order algorithm that achieves a dimension-independent complexity bound of Õ(Mδ^-1ϵ^-3).
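To make the Goldstein-stationarity notion concrete, the sketch below illustrates the standard randomized-smoothing idea behind the Õ(δ^-1ϵ^-3) randomized algorithms the abstract references: averaging gradients at random points in the δ-ball around x yields a convex combination of nearby (sub)gradients, i.e., an estimate of an element of the δ-Goldstein subdifferential. This is an illustrative sketch of the general technique, not the paper's deterministic algorithm; the function, oracle, and parameter choices are assumptions for the example.

```python
import numpy as np

def goldstein_direction(grad, x, delta, n_samples=64, rng=None):
    """Estimate an element of the delta-Goldstein subdifferential at x
    by averaging (sub)gradients at points sampled uniformly from the
    delta-ball around x (randomized-smoothing heuristic; illustrative
    only, not the paper's deterministic method)."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    # Uniform sample from the delta-ball: random direction, radius ~ r^(1/d).
    u = rng.normal(size=(n_samples, d))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    r = delta * rng.random(n_samples) ** (1.0 / d)
    pts = x + r[:, None] * u
    # Average of gradients at nearby points = convex combination of
    # subgradients within the delta-ball.
    return np.mean([grad(p) for p in pts], axis=0)

# Example: f(x) = |x_1| + 0.5 * ||x||^2 is nonsmooth at x_1 = 0.
# A subgradient oracle (hypothetical, for illustration):
grad = lambda x: np.concatenate([[np.sign(x[0])], np.zeros(len(x) - 1)]) + x

x = np.array([1.0, -2.0])
g = goldstein_direction(grad, x, delta=0.1, rng=np.random.default_rng(0))
```

If ||g|| ≤ ϵ for such a convex combination, x is a (δ, ϵ)-Goldstein stationary point; otherwise -g serves as a descent direction, which is the engine of the randomized methods discussed above.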




