Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection

06/01/2015
by Julie Nutini, et al.

There has been significant recent work on the theory and application of randomized coordinate descent algorithms, beginning with the work of Nesterov [SIAM J. Optim., 22(2), 2012], who showed that a random-coordinate selection rule achieves the same convergence rate as the Gauss-Southwell selection rule. This result suggests that we should never use the Gauss-Southwell rule, as it is typically much more expensive than random selection. However, the empirical behaviours of these algorithms contradict this theoretical result: in applications where the computational costs of the selection rules are comparable, the Gauss-Southwell selection rule tends to perform substantially better than random coordinate selection. We give a simple analysis of the Gauss-Southwell rule showing that, except in extreme cases, its convergence rate is faster than that of random coordinate selection. Further, we (i) show that exact coordinate optimization improves the convergence rate for certain sparse problems, (ii) propose a Gauss-Southwell-Lipschitz rule that gives an even faster convergence rate given knowledge of the Lipschitz constants of the partial derivatives, (iii) analyze the effect of approximate Gauss-Southwell rules, and (iv) analyze proximal-gradient variants of the Gauss-Southwell rule.
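To make the selection rules concrete: the Gauss-Southwell rule picks the coordinate whose partial derivative is largest in magnitude, i_k = argmax_i |grad_i f(x_k)|, while the Gauss-Southwell-Lipschitz rule weights this choice by the coordinate-wise Lipschitz constants, i_k = argmax_i |grad_i f(x_k)| / sqrt(L_i). The following is a minimal sketch, not the paper's experimental code, comparing random, GS, and GSL selection on a small synthetic quadratic; the test problem, dimensions, step sizes, and iteration count are illustrative assumptions.

import numpy as np

# Coordinate descent on the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# comparing three coordinate-selection rules. Assumed setup, for illustration only.
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)        # positive definite Hessian
b = rng.standard_normal(n)
L = np.diag(A).copy()          # coordinate-wise Lipschitz constants L_i

def cd(select, iters=2000):
    """Run coordinate descent with the given selection rule; return f(x)."""
    x = np.zeros(n)
    for _ in range(iters):
        g = A @ x - b          # full gradient of f at x
        i = select(g)
        x[i] -= g[i] / L[i]    # coordinate step with stepsize 1/L_i
    return 0.5 * x @ A @ x - b @ x

random_rule = lambda g: rng.integers(n)                    # uniform random coordinate
gs_rule     = lambda g: np.argmax(np.abs(g))               # argmax_i |grad_i f(x)|
gsl_rule    = lambda g: np.argmax(np.abs(g) / np.sqrt(L))  # argmax_i |grad_i f(x)| / sqrt(L_i)

for name, rule in [("random", random_rule), ("GS", gs_rule), ("GSL", gsl_rule)]:
    print(name, cd(rule))

On problems like this, where evaluating all partial derivatives is cheap, the greedy rules typically reach a lower objective value than random selection in the same number of iterations, which is the empirical behaviour the paper's analysis explains.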
