Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization

11/09/2017
by Ahmet Alacaoglu, et al.

We propose a new randomized coordinate descent method for a convex optimization template with broad applications. Our analysis relies on a novel combination of four ideas applied to the primal-dual gap function: smoothing, acceleration, homotopy, and coordinate descent with non-uniform sampling. As a result, our method is the first coordinate descent algorithm whose convergence rate guarantees are the best known under a variety of common structural assumptions on the template. We provide numerical evidence supporting the theoretical results, with a comparison to state-of-the-art algorithms.
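To make the four ingredients concrete, here is a minimal sketch in the spirit of the abstract, not the authors' algorithm: it applies Moreau-envelope (dual) smoothing, Lipschitz-proportional non-uniform coordinate sampling, and a homotopy schedule on the smoothing parameter to the toy instance min_x lam*||x||_1 + ||Ax - b||_1. Acceleration is omitted for brevity, and the problem choice, step sizes, and the 0.9 homotopy factor are assumptions made for this example.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*|.| (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def smoothed_cd_sketch(A, b, lam, beta0=1.0, epochs=50, seed=0):
    """Illustrative smoothed coordinate descent (NOT the paper's method).

    The nonsmooth term h(u) = ||u - b||_1 is replaced by its Moreau
    envelope h_beta, whose gradient is the clipped residual. Coordinates
    are sampled with probability proportional to their Lipschitz
    constants ||A[:, i]||^2 / beta (non-uniform sampling), and beta is
    shrunk between epochs (homotopy).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    Ax = A @ x                        # maintained incrementally
    col_sq = (A ** 2).sum(axis=0)     # ||A[:, i]||^2
    beta = beta0
    for _ in range(epochs):
        L = col_sq / beta             # coordinate Lipschitz constants
        p = L / L.sum()               # non-uniform sampling distribution
        for _ in range(n):
            i = rng.choice(n, p=p)
            grad_h = np.clip((Ax - b) / beta, -1.0, 1.0)  # grad of Moreau envelope
            g_i = A[:, i] @ grad_h
            x_new = soft_threshold(x[i] - g_i / L[i], lam / L[i])  # prox-gradient step
            Ax += A[:, i] * (x_new - x[i])
            x[i] = x_new
        beta *= 0.9                   # homotopy: tighten the smoothing
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true
    x = smoothed_cd_sketch(A, b, lam=0.1)
    print("objective:", 0.1 * np.abs(x).sum() + np.abs(A @ x - b).sum())
```

Smoothing makes each coordinate step a cheap prox-gradient update, while the shrinking beta trades smoothness for fidelity to the original nonsmooth objective; the paper additionally couples these with acceleration to obtain its rate guarantees.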


Related research

- 03/21/2017: Stochastic Primal Dual Coordinate Method with Non-Uniform Sampling Based on Optimality Violations
  We study primal-dual type stochastic optimization algorithms with non-un...
- 03/07/2017: Faster Coordinate Descent via Adaptive Importance Sampling
  Coordinate descent methods employ random partial updates of decision var...
- 04/16/2022: Beyond L1: Faster and Better Sparse Models with skglm
  We propose a new fast algorithm to estimate any sparse generalized linea...
- 01/26/2022: A dual approach for federated learning
  We study the federated optimization problem from a dual perspective and ...
- 07/13/2020: Random extrapolation for primal-dual coordinate descent
  We introduce a randomly extrapolated primal-dual coordinate descent meth...
- 04/19/2013: Inexact Coordinate Descent: Complexity and Preconditioning
  In this paper we consider the problem of minimizing a convex function us...
- 02/26/2021: Fast Cyclic Coordinate Dual Averaging with Extrapolation for Generalized Variational Inequalities
  We propose the Cyclic cOordinate Dual avEraging with extRapolation (CODE...