Comparing different subgradient methods for solving convex optimization problems with functional constraints

01/04/2021
by Thi Lan Dinh, et al.

We provide a dual subgradient method and a primal-dual subgradient method for convex optimization problems with functional constraints, with complexity 𝒪(ε^-2) and 𝒪(ε^-2r) for all r > 1, respectively. They build on the recent work of Metel and Takeda [arXiv:2009.12769, 2020, pp. 1-12] and on Boyd's method [Lecture notes of EE364b, Stanford University, Spring 2013-14, pp. 1-39]. The efficiency of our methods is illustrated numerically in comparison with existing methods.
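The paper's specific algorithms are not reproduced here, but the problem class — minimizing a convex objective subject to a convex functional constraint with 𝒪(ε^-2) subgradient complexity — is classically handled by Polyak's "switching" subgradient scheme. The sketch below is a generic illustration of that scheme on a hypothetical toy problem, not the authors' method; the function names and step-size rule are assumptions.

```python
import numpy as np

def switching_subgradient(f, df, g, dg, x0, eps, n_iters):
    # Classic "switching" subgradient scheme for
    #   min f(x)  s.t.  g(x) <= 0,  with f and g convex.
    # Productive steps (constraint satisfied up to eps) move along a
    # subgradient of f; otherwise we move along a subgradient of g.
    # Reaching an eps-solution takes O(eps^-2) iterations.
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), np.inf
    for _ in range(n_iters):
        if g(x) <= eps:                  # productive step
            s = df(x)
            if f(x) < best_f:            # track best eps-feasible point
                best_x, best_f = x.copy(), f(x)
        else:                            # restore feasibility
            s = dg(x)
        nrm = np.linalg.norm(s)
        if nrm == 0:
            break
        x = x - (eps / nrm) * s          # step of Euclidean length eps
    return best_x, best_f

# Hypothetical toy problem: min |x - 2|  s.t.  x - 1 <= 0.
# The constrained optimum is x* = 1 with f* = 1.
xs, fs = switching_subgradient(
    f=lambda x: abs(x[0] - 2),
    df=lambda x: np.array([np.sign(x[0] - 2)]),
    g=lambda x: x[0] - 1,
    dg=lambda x: np.array([1.0]),
    x0=[0.0], eps=0.01, n_iters=500)
```

After 500 iterations the returned point is within a few multiples of ε of the optimum x* = 1; the dual and primal-dual methods compared in the paper are more refined instances of this subgradient framework.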


