Higher-Order Accelerated Methods for Faster Non-Smooth Optimization

06/04/2019
by Brian Bullins, et al.

We provide improved convergence rates for various non-smooth optimization problems via higher-order accelerated methods. In the case of ℓ_∞ regression, we achieve an O(ϵ^-4/5) iteration complexity, breaking the O(ϵ^-1) barrier present in previous methods. We arrive at a similar rate for ℓ_1-SVM, going beyond what is attainable by first-order methods with prox-oracle access for non-smooth, non-strongly convex problems. We further show how to achieve even faster rates by introducing higher-order regularization. Our results rely on recent advances in near-optimal accelerated methods for higher-order smooth convex optimization. In particular, we extend Nesterov's smoothing technique to show that the standard softmax approximation is not only smooth in the usual sense, but also higher-order smooth. With this observation in hand, we provide, to the best of our knowledge, the first example of higher-order acceleration techniques yielding faster rates for non-smooth optimization.
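To make the smoothing step concrete, here is a minimal NumPy sketch of the standard softmax (log-sum-exp) approximation to the ℓ_∞ norm that arises in ℓ_∞ regression; the abstract's observation is that this surrogate is not only smooth but higher-order smooth. The function names and the example value of μ below are illustrative, not taken from the paper.

```python
import numpy as np

def softmax_linf(y, mu):
    """Smooth approximation of the l_inf norm ||y||_inf.

    smax_mu(y) = mu * log( sum_i [exp(y_i/mu) + exp(-y_i/mu)] )
    satisfies ||y||_inf <= smax_mu(y) <= ||y||_inf + mu*log(2n),
    so a smaller mu gives a tighter approximation at the cost of
    larger smoothness constants.
    """
    z = np.concatenate([y, -y]) / mu
    # log-sum-exp computed stably by factoring out the max entry
    m = z.max()
    return mu * (m + np.log(np.exp(z - m).sum()))

def smoothed_linf_regression(A, b, mu):
    """Objective x -> smax_mu(Ax - b), a smooth surrogate for the
    non-smooth l_inf regression objective ||Ax - b||_inf."""
    return lambda x: softmax_linf(A @ x - b, mu)

# Example: the smoothed value exceeds the true norm by at most mu*log(2n).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
x = rng.standard_normal(10)
f = smoothed_linf_regression(A, b, mu=0.01)
print(f(x), np.abs(A @ x - b).max())
```

Roughly speaking, decreasing μ tightens the μ·log(2n) approximation gap while inflating the (higher-order) smoothness constants, and balancing this trade-off against the iteration bound of a higher-order accelerated method is what yields rates like the O(ϵ^-4/5) claimed above.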


