Efficient Greedy Coordinate Descent for Composite Problems

10/16/2018
by Sai Praneeth Karimireddy, et al.

Coordinate descent with random coordinate selection is the current state of the art for many large-scale optimization problems. However, greedy selection of the steepest coordinate on smooth problems can yield convergence rates independent of the dimension n, requiring up to n times fewer iterations. In this paper, we consider greedy updates that are based on subgradients for a class of non-smooth composite problems, which includes L1-regularized problems, SVMs and related applications. For these problems we provide (i) the first linear rates of convergence independent of n, and show that our greedy update rule provides speedups similar to those obtained in the smooth case. This was previously conjectured to be true for a stronger greedy coordinate selection strategy. Furthermore, we show that (ii) our new selection rule can be mapped to instances of maximum inner product search, allowing us to leverage standard nearest-neighbor algorithms to speed up the implementation. We demonstrate the validity of the approach through extensive numerical experiments.
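
For intuition, below is a minimal NumPy sketch of greedy coordinate descent on an L1-regularized least-squares (Lasso) problem, using a subgradient-based selection rule in the spirit of the one studied here. The function name `greedy_cd_lasso`, its parameters, and the specific prox update are illustrative assumptions, not the paper's exact algorithm. Note how the per-coordinate scores are inner products between the columns of A and the current residual, which is the structure that lets the selection step be cast as a maximum inner product search.

```python
# Minimal sketch (assumptions, not the paper's algorithm) of greedy coordinate
# descent for   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
# with a subgradient-based (GS-s style) greedy coordinate rule.
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * |.| (applied elementwise)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def greedy_cd_lasso(A, b, lam, iters=1000, tol=1e-10):
    # Assumes every column of A is nonzero.
    n = A.shape[1]
    x = np.zeros(n)
    r = A @ x - b                      # residual, updated incrementally
    L = np.sum(A * A, axis=0)          # coordinate-wise Lipschitz constants ||A_i||^2

    for _ in range(iters):
        # Smooth-part gradient: g_i = <A_i, r>. Computing these scores is an
        # inner-product search over the columns of A, where MIPS / nearest-
        # neighbor data structures could be plugged in.
        g = A.T @ r

        # Minimum-norm subgradient of the composite objective per coordinate.
        s = np.where(x != 0, g + lam * np.sign(x), soft_threshold(g, lam))

        i = np.argmax(np.abs(s))       # greedy (steepest) coordinate
        if np.abs(s[i]) < tol:
            break                      # approximately stationary

        # Exact minimization over coordinate i (prox / soft-thresholding step).
        x_new_i = soft_threshold(x[i] - g[i] / L[i], lam / L[i])
        r += (x_new_i - x[i]) * A[:, i]
        x[i] = x_new_i
    return x
```

In this sketch the greedy rule scans all n scores each iteration; the point of the MIPS reduction is precisely to avoid that full scan by answering the argmax query with an approximate nearest-neighbor index over the columns of A.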
