Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization

09/12/2022
by Tianyi Lin, et al.

Nonsmooth nonconvex optimization problems arise broadly in machine learning and business decision making, yet two core challenges impede the development of efficient solution methods with finite-time convergence guarantees: the lack of a computationally tractable optimality criterion and the lack of computationally powerful oracles. The contributions of this paper are two-fold. First, we establish a relationship between the celebrated Goldstein subdifferential (Goldstein, 1977) and uniform smoothing, thereby providing the basis and intuition for the design of gradient-free methods that guarantee finite-time convergence to a set of Goldstein stationary points. Second, we propose the gradient-free method (GFM) and its stochastic variant (SGFM) for solving a class of nonsmooth nonconvex optimization problems, and we prove that both return a (δ,ϵ)-Goldstein stationary point of a Lipschitz function f at an expected convergence rate of O(d^{3/2} δ^{-1} ϵ^{-4}), where d is the problem dimension. Two-phase versions of GFM and SGFM are also proposed and proven to achieve improved large-deviation results. Finally, we demonstrate the effectiveness of 2-SGFM on training ReLU neural networks with the MNIST dataset.
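To make the mechanism concrete, below is a minimal Python sketch (not the authors' implementation) of the uniform-smoothing idea behind GFM: a two-point zeroth-order estimator whose expectation equals the gradient of the smoothed surrogate f_δ(x) = E_{w ~ unit ball}[f(x + δw)], used as the descent direction. The test objective, the function names (smoothed_grad_estimate, gfm), and all hyperparameters are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a gradient-free (zeroth-order) method via uniform smoothing.
# Only function values of f are queried; no subgradients are needed.

import numpy as np

rng = np.random.default_rng(0)


def f(x):
    """Illustrative nonsmooth nonconvex Lipschitz objective (difference of norms)."""
    return np.abs(x).sum() - 0.5 * np.linalg.norm(x)


def smoothed_grad_estimate(f, x, delta):
    """Two-point estimator g = (d / (2*delta)) * (f(x + delta*w) - f(x - delta*w)) * w,
    with w uniform on the unit sphere; E[g] equals the gradient of the
    uniform smoothing f_delta(x) = E_{v ~ unit ball}[f(x + delta * v)]."""
    d = x.size
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)  # uniform random direction on the unit sphere
    return (d / (2.0 * delta)) * (f(x + delta * w) - f(x - delta * w)) * w


def gfm(f, x0, delta=0.05, step=0.01, iters=5000):
    """Plain descent on the smoothed surrogate using the zeroth-order estimator.
    Step size and iteration count are illustrative, not the paper's tuned choices."""
    x = x0.copy()
    for _ in range(iters):
        x -= step * smoothed_grad_estimate(f, x, delta)
    return x


x = gfm(f, x0=rng.standard_normal(10))
print(f(x))
```

The connection to the abstract's first contribution: the gradient of the uniform smoothing lies in the Goldstein δ-subdifferential of f, so driving this smoothed gradient below ϵ in norm is what yields a (δ,ϵ)-Goldstein stationary point.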


