Parameter-free projected gradient descent

05/31/2023
by Evgenii Chzhen et al.

We consider the problem of minimizing a convex function over a closed convex set using Projected Gradient Descent (PGD). We propose a fully parameter-free version of AdaGrad that adapts both to the distance between the initialization and the optimum and to the sum of the squared norms of the subgradients. Our algorithm handles projection steps and, compared with classical PGD, involves no restarts, no reweighting along the trajectory, and no additional gradient evaluations. It also achieves optimal convergence rates for cumulative regret, up to logarithmic factors. We extend our approach to stochastic optimization and conduct numerical experiments supporting the developed theory.
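To make the setting concrete, below is a minimal sketch of AdaGrad-norm projected subgradient descent on an illustrative problem (minimizing ||Ax - b||_1 over a Euclidean ball). The step size eta_t = D / sqrt(sum of squared subgradient norms up to step t) still requires a scale parameter D, a stand-in for the unknown distance from the initialization to the optimum; removing the need to tune such a quantity is precisely what a parameter-free method aims for. The problem instance, function names, and the choice D = 1 are illustrative assumptions, not taken from the paper.

```python
# A hedged sketch of AdaGrad-norm projected subgradient descent.
# This is NOT the authors' parameter-free algorithm: the scale D below
# must be tuned here, whereas their method adapts to it automatically.
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball {x : ||x|| <= radius}."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def adagrad_pgd(subgrad, x0, project, D=1.0, n_steps=1000):
    """Projected subgradient descent with step size
    eta_t = D / sqrt(sum_{s<=t} ||g_s||^2)."""
    x = x0.copy()
    avg = x0.copy()          # running average of the iterates
    sq_sum = 0.0             # accumulated squared subgradient norms
    for t in range(1, n_steps + 1):
        g = subgrad(x)
        sq_sum += np.dot(g, g)
        eta = D / np.sqrt(sq_sum + 1e-12)
        x = project(x - eta * g)
        avg += (x - avg) / (t + 1)
    return avg

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(50, 10)), rng.normal(size=50)
    f = lambda x: np.abs(A @ x - b).sum()
    subgrad = lambda x: A.T @ np.sign(A @ x - b)   # a subgradient of f
    x_hat = adagrad_pgd(subgrad, np.zeros(10), project_ball, D=1.0, n_steps=5000)
    print("f(x_hat) =", f(x_hat))
```

The returned point is the running average of the iterates, a standard choice when the guarantees are stated in terms of cumulative regret.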

Related research

11/29/2022 | Mirror descent of Hopfield model
Mirror descent is a gradient descent method that uses a dual space of pa...

02/25/2010 | Less Regret via Online Conditioning
We analyze and evaluate an online gradient descent algorithm with adapti...

02/27/2022 | Thinking Outside the Ball: Optimal Learning with Gradient Descent for Generalized Linear Stochastic Convex Optimization
We consider linear prediction with a convex Lipschitz loss, or more gene...

05/25/2023 | DoWG Unleashed: An Efficient Universal Parameter-Free Gradient Descent Method
This paper proposes a new easy-to-implement parameter-free gradient-base...

09/01/2022 | Optimal Regularized Online Convex Allocation by Adaptive Re-Solving
This paper introduces a dual-based algorithm framework for solving the r...

04/06/2018 | Reconstructing Point Sets from Distance Distributions
We study the problem of reconstructing the locations u_i of a set of po...

03/28/2023 | First-order optimization on stratified sets
We consider the problem of minimizing a differentiable function with loc...
