Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods

01/05/2023
by Ben Adcock, et al.

Sharpness is an almost generic assumption in continuous optimization that bounds the distance from minima by objective function suboptimality. It leads to the acceleration of first-order methods via restarts. However, sharpness involves problem-specific constants that are typically unknown, and previous restart schemes reduce convergence rates. Moreover, such schemes are challenging to apply in the presence of noise or approximate model classes (e.g., in compressive imaging or learning problems), and typically assume that the first-order method used produces feasible iterates. We consider the assumption of approximate sharpness, a generalization of sharpness that incorporates an unknown constant perturbation to the objective function error. This constant offers greater robustness (e.g., with respect to noise or relaxation of model classes) for finding approximate minimizers. By employing a new type of search over the unknown constants, we design a restart scheme that applies to general first-order methods and does not require the first-order method to produce feasible iterates. Our scheme maintains the same convergence rate as when assuming knowledge of the constants. The rates of convergence we obtain for various first-order methods either match the optimal rates or improve on previously established rates for a wide range of problems. We showcase our restart scheme on several examples and point to future applications and developments of our framework and theory.
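
For orientation, the following display gives one common way to write these conditions; it is an illustrative convention only, and the paper's exact constants and normalization may differ. Here \widehat{X} denotes the set of minimizers, \widehat{f} the optimal value, d(x, \widehat{X}) the distance from x to \widehat{X}, \alpha > 0 and \beta \ge 1 the (typically unknown) sharpness constants, and \eta \ge 0 the perturbation:

    d(x, \widehat{X})^{\beta} \le \tfrac{1}{\alpha}\,\bigl( f(x) - \widehat{f} \bigr)          (sharpness)
    d(x, \widehat{X})^{\beta} \le \tfrac{1}{\alpha}\,\bigl( f(x) - \widehat{f} + \eta \bigr)   (approximate sharpness)

Because \alpha, \beta, and \eta are unknown in practice, the restart scheme described in the abstract searches over candidate values of these constants rather than assuming they are given.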

