Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements

07/16/2021
by   Kimon Antonakopoulos, et al.

We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function - as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, existing order-optimal adaptive methods - like UniXGrad or AcceleGrad - cannot be applied, especially in the presence of randomness and uncertainty. The proposed method - which we call adaptive mirror descent (AdaMir) - aims to close this gap by simultaneously achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
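To convey the flavor of adaptive mirror descent, here is a minimal sketch of an entropic mirror-descent loop on the probability simplex with an AdaGrad-style step size that needs no Lipschitz constant. This is an assumed illustrative instantiation (the mirror map, step-size rule, and function names are choices made here), not the paper's exact AdaMir update.

```python
import numpy as np

def adamir_sketch(grad, x0, n_steps=500):
    """Illustrative adaptive mirror descent on the probability simplex.

    Uses the negative-entropy mirror map (KL Bregman divergence) and an
    AdaGrad-style step size accumulated from observed gradients.  A sketch
    only; not the exact AdaMir update from the paper.
    """
    x = np.asarray(x0, dtype=float)
    sq_sum = 0.0
    for _ in range(n_steps):
        g = grad(x)
        sq_sum += float(g @ g)
        eta = 1.0 / np.sqrt(1.0 + sq_sum)  # adaptive step, no Lipschitz constant
        # Entropic mirror step: x_{t+1} proportional to x_t * exp(-eta * g_t)
        w = x * np.exp(-eta * g)
        x = w / w.sum()
    return x

# Example: minimize the linear objective f(x) = <c, x> over the simplex;
# the iterates concentrate mass on the coordinate with the smallest cost.
c = np.array([3.0, 1.0, 2.0])
x_star = adamir_sketch(lambda x: c, np.ones(3) / 3)
```

On this toy linear problem the accumulated multiplicative weights drive nearly all the mass onto the cheapest coordinate; the point of the adaptive step size is that it is tuned from the observed gradients rather than from a global smoothness constant, which is what fails to exist for the singular objectives discussed above.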


Related research

02/22/2020 · Private Stochastic Convex Optimization: Efficient Algorithms for Non-smooth Objectives
In this paper, we revisit the problem of private stochastic convex optim...

10/22/2020 · Adaptive extra-gradient methods for min-max optimization and games
We present a new family of min-max optimization algorithms that automati...

07/07/2017 · Non-smooth Non-convex Bregman Minimization: Unification and new Algorithms
We propose a unifying algorithm for non-smooth non-convex optimization. ...

04/06/2022 · Black-Box Min–Max Continuous Optimization Using CMA-ES with Worst-case Ranking Approximation
In this study, we investigate the problem of min-max continuous optimiza...

05/11/2018 · Fast Rates of ERM and Stochastic Approximation: Adaptive to Error Bound Conditions
Error bound conditions (EBC) are properties that characterize the growth...

07/20/2023 · From Adaptive Query Release to Machine Unlearning
We formalize the problem of machine unlearning as design of efficient un...

04/22/2019 · Provable Bregman-divergence based Methods for Nonconvex and Non-Lipschitz Problems
The (global) Lipschitz smoothness condition is crucial in establishing t...
