Parallel coordinate descent for the Adaboost problem

10/07/2013
by Olivier Fercoq, et al.

We design a randomised parallel version of Adaboost based on previous studies of parallel coordinate descent. The algorithm exploits the fact that the logarithm of the exponential loss has a coordinate-wise Lipschitz continuous gradient in order to define the step lengths. We prove convergence of this randomised Adaboost algorithm and derive a theoretical parallelisation speedup factor. Finally, we provide numerical experiments on learning problems of various sizes showing that the algorithm is competitive with competing approaches, especially for large-scale problems.
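
To make the step-length rule concrete, here is a minimal NumPy sketch of a randomised parallel coordinate descent step on f(x) = log((1/m) sum_i exp(-(A x)_i)), where A[i, j] holds the margin of weak learner j on example i. This is an illustrative sketch, not the authors' exact algorithm: the function name, the coordinate-wise Lipschitz bound L_j <= max_i A[i, j]^2, and the damping factor beta = tau are assumptions made for the example.

    import numpy as np

    def parallel_cd_adaboost(A, n_iters=500, tau=8, seed=0):
        # Hypothetical sketch: randomised parallel coordinate descent on the
        # log of the exponential loss, f(x) = log((1/m) * sum_i exp(-(A x)_i)).
        # A[i, j] = y_i * h_j(z_i) is the margin of weak learner j on example i.
        rng = np.random.default_rng(seed)
        m, n = A.shape
        x = np.zeros(n)
        margins = A @ x                      # maintain A x incrementally
        L = np.max(A ** 2, axis=0) + 1e-12   # coordinate-wise Lipschitz upper bounds
        beta = float(min(tau, n))            # crude stand-in for the parallelisation factor

        for _ in range(n_iters):
            # Softmax weights of the examples, w_i proportional to exp(-margin_i);
            # shifting by the smallest margin keeps the exponentials bounded.
            w = np.exp(-(margins - margins.min()))
            w /= w.sum()

            # Sample tau coordinates uniformly at random and update them "in parallel".
            J = rng.choice(n, size=int(min(tau, n)), replace=False)
            grad_J = -(A[:, J].T @ w)        # partial derivatives of f at the chosen coordinates
            step = -grad_J / (beta * L[J])   # step length 1 / (beta * L_j) per coordinate
            x[J] += step
            margins += A[:, J] @ step

        return x

In the paper the damping of the step lengths depends on the structure of the loss and on the number of processors; the constant beta = tau above is only a conservative placeholder for that parallelisation factor.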

Related research

08/15/2018  An Analysis of Asynchronous Stochastic Accelerated Coordinate Descent
Gradient descent, and coordinate descent in particular, are core tools i...

11/13/2018  Parallel Stochastic Asynchronous Coordinate Descent: Tight Bounds on the Possible Parallelism
Several works have shown linear speedup is achieved by an asynchronous p...

06/27/2012  Scaling Up Coordinate Descent Algorithms for Large ℓ_1 Regularization Problems
We present a generic framework for parallel coordinate descent (CD) algo...

11/18/2019  SySCD: A System-Aware Parallel Coordinate Descent Algorithm
In this paper we propose a novel parallel stochastic coordinate descent ...

12/20/2013  Accelerated, Parallel and Proximal Coordinate Descent
We propose a new stochastic coordinate descent method for minimizing the...

03/21/2022  Faster Randomized Block Sparse Kaczmarz by Averaging
The standard randomized sparse Kaczmarz (RSK) method is an algorithm to ...

12/17/2012  Feature Clustering for Accelerating Parallel Coordinate Descent
Large-scale L1-regularized loss minimization problems arise in high-dime...
