Efficient Decentralized Deep Learning by Dynamic Model Averaging

07/09/2018
by Michael Kamp, et al.

We propose an efficient protocol for decentralized training of deep neural networks from distributed data sources. The protocol handles the different phases of model training equally well and adapts quickly to concept drift. This reduces communication by an order of magnitude compared to state-of-the-art approaches that communicate periodically. Moreover, we derive a communication bound that scales well with the hardness of the serialized learning problem. The reduction in communication comes at almost no cost, as the predictive performance remains virtually unchanged; indeed, the protocol retains the loss bounds of periodic averaging schemes. An extensive empirical evaluation confirms a major improvement in the trade-off between model performance and communication, which could benefit numerous decentralized learning applications such as autonomous driving, or voice recognition and image classification on mobile phones.
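The core idea behind dynamic model averaging, communicating only when local models have drifted far enough from the last synchronized state, can be illustrated with a minimal sketch. The Python/NumPy snippet below assumes a simple synchronous setting with a central averaging step; the function and parameter names (dynamic_averaging_round, grad_fns, delta) are illustrative, and the divergence test is a simplification of the general idea rather than the paper's exact protocol.

```python
import numpy as np

def local_sgd_step(w, grad_fn, lr=0.01):
    """One local SGD step on a worker's private data; grad_fn is a
    stand-in for the gradient of the worker's local loss."""
    return w - lr * grad_fn(w)

def dynamic_averaging_round(workers, reference, grad_fns, delta, lr=0.01):
    """One round of divergence-triggered ("dynamic") model averaging.

    Each worker trains locally. Models are communicated and averaged
    only when the mean squared divergence from the last synchronized
    reference model exceeds the threshold delta; otherwise the round
    is communication-free. (Illustrative simplification, not the
    paper's exact condition.)
    """
    # Local training: every worker updates on its own data shard.
    workers = [local_sgd_step(w, g, lr) for w, g in zip(workers, grad_fns)]

    # Divergence test against the last synchronized model.
    divergence = np.mean([np.sum((w - reference) ** 2) for w in workers])

    if divergence > delta:
        # Trigger synchronization: average all models and redistribute.
        avg = np.mean(workers, axis=0)
        workers = [avg.copy() for _ in workers]
        reference = avg
    return workers, reference

# Toy usage: 4 workers on quadratic losses with different optima.
rng = np.random.default_rng(0)
targets = [rng.normal(size=10) for _ in range(4)]
grad_fns = [lambda w, t=t: w - t for t in targets]  # grad of 0.5*||w - t||^2
workers = [np.zeros(10) for _ in range(4)]
reference = np.zeros(10)
for _ in range(100):
    workers, reference = dynamic_averaging_round(
        workers, reference, grad_fns, delta=0.5)
```

In easy phases of training the test rarely fires and the workers train without communicating; under concept drift the divergence grows quickly and triggers synchronization, which is what yields the adaptive communication behavior described in the abstract.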


Related Research

10/16/2018
Collaborative Deep Learning Across Multiple Data Centers
Valuable training data is often owned by independent organizations and l...

08/05/2021
Decentralized Federated Learning with Unreliable Communications
Decentralized federated learning, inherited from decentralized learning,...

07/05/2015
Experiments on Parallel Training of Deep Neural Network using Model Averaging
In this work we apply model averaging to parallel training of deep neura...

06/26/2021
Decentralized Composite Optimization in Stochastic Networks: A Dual Averaging Approach with Linear Convergence
Decentralized optimization, particularly the class of decentralized comp...

03/04/2021
Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices
Training deep neural networks on large datasets can often be accelerated...

11/28/2019
Communication-Efficient Distributed Online Learning with Kernels
We propose an efficient distributed online learning protocol for low-lat...

07/13/2022
SmartPubSub: Content-based Pub-Sub on IPFS
The InterPlanetary File System (IPFS) is a hypermedia distribution proto...
