Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond

06/27/2019
by Oliver Hinder, et al.

In this paper, we provide near-optimal accelerated first-order methods for minimizing a broad class of smooth nonconvex functions that are strictly unimodal on all lines through a minimizer. This function class, which we call the class of smooth quasar-convex functions, is parameterized by a constant γ ∈ (0, 1], where γ = 1 encompasses the classes of smooth convex and star-convex functions, and smaller values of γ indicate that the function can be "more nonconvex." We develop a variant of accelerated gradient descent that computes an ϵ-approximate minimizer of a smooth γ-quasar-convex function with at most O(γ^{-1} ϵ^{-1/2} log(γ^{-1} ϵ^{-1})) total function and gradient evaluations. We also derive a lower bound of Ω(γ^{-1} ϵ^{-1/2}) on the number of gradient evaluations required by any deterministic first-order method in the worst case, showing that, up to a logarithmic factor, no deterministic first-order algorithm can improve upon ours.
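For intuition, γ-quasar-convexity about a minimizer x* requires f(x*) ≥ f(x) + (1/γ)⟨∇f(x), x* − x⟩ for all x. The sketch below (a hypothetical helper, not the paper's code) checks this inequality numerically at sample points for a simple convex function, where γ = 1 suffices:

```python
import numpy as np

def is_gamma_quasar_convex(f, grad_f, x_star, points, gamma, tol=1e-12):
    """Check f(x*) >= f(x) + (1/gamma) * <grad f(x), x* - x> at sample points.

    A necessary numerical check at finitely many points, not a proof
    over the whole domain.
    """
    fx_star = f(x_star)
    return all(
        fx_star >= f(x) + (1.0 / gamma) * np.dot(grad_f(x), x_star - x) - tol
        for x in points
    )

# f(x) = ||x||^2 is convex, hence 1-quasar-convex (star-convex) about x* = 0:
# the inequality reads 0 >= ||x||^2 - 2||x||^2 = -||x||^2, which always holds.
f = lambda x: float(np.dot(x, x))
grad_f = lambda x: 2.0 * x
rng = np.random.default_rng(0)
pts = [rng.standard_normal(3) for _ in range(100)]
print(is_gamma_quasar_convex(f, grad_f, np.zeros(3), pts, gamma=1.0))  # True
```

Smaller γ weakens the inequality (the gradient term is scaled up before it must dominate), which is the sense in which small-γ functions may be "more nonconvex."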


