Online Asynchronous Distributed Regression

07/16/2014
by Gérard Biau, et al.

Distributed computing offers a high degree of flexibility to accommodate modern learning constraints and the ever-increasing size of datasets involved in massive data issues. Drawing inspiration from the theory of distributed computation models developed in the context of gradient-type optimization algorithms, we present a consensus-based asynchronous distributed approach for nonparametric online regression and analyze some of its asymptotic properties. Substantial numerical evidence involving up to 28 parallel processors is provided on synthetic datasets to assess the excellent performance of our method, both in terms of computation time and prediction accuracy.
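The abstract only outlines the approach, so the sketch below is a rough illustration of the general idea rather than the authors' algorithm: each simulated processor holds a local regression model, a randomly activated processor takes a stochastic-gradient step on a fresh sample (here a simple parametric linear model stands in for the nonparametric estimator), and it then averages its parameters with a randomly chosen peer in a gossip-style consensus step. All names and constants (N_PROC, STEP, the pairwise averaging weight) are illustrative assumptions.

```python
# Minimal sketch of consensus-based asynchronous online regression,
# simulated on a single machine. Not the authors' method; a linear model
# and uniform random gossip are assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)
N_PROC, DIM, STEP = 8, 5, 0.05          # illustrative constants
true_w = rng.normal(size=DIM)           # target regression vector
w = np.zeros((N_PROC, DIM))             # one parameter vector per processor

def sample():
    """Draw one (x, y) pair from a noisy linear model."""
    x = rng.normal(size=DIM)
    y = x @ true_w + 0.1 * rng.normal()
    return x, y

for t in range(20000):
    i = rng.integers(N_PROC)            # asynchrony: one processor wakes up
    x, y = sample()
    grad = (w[i] @ x - y) * x           # squared-loss gradient on one sample
    w[i] -= STEP * grad                 # local online update
    j = rng.integers(N_PROC)            # gossip with a random peer
    mix = 0.5 * (w[i] + w[j])           # pairwise consensus averaging
    w[i] = w[j] = mix

print("max deviation from true_w:", np.abs(w - true_w).max())
```

The consensus step keeps the locally updated models from drifting apart, which is the mechanism the abstract alludes to; in a real deployment the averaging would happen over a communication network rather than shared memory.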
