Learning@home: Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts

02/10/2020
by Maksim Riabinin, et al.

Many recent breakthroughs in deep learning were achieved by training increasingly large models on massive datasets. However, training such models can be prohibitively expensive. For instance, the Megatron-LM language model with 8.3B parameters was trained on a GPU cluster worth $25 million. As a result, most researchers cannot afford to train state-of-the-art models and contribute to their development. Hypothetically, a researcher could crowdsource the training of large neural networks with thousands of regular PCs provided by volunteers. The raw computing power of ten thousand $2,500 desktops dwarfs that of a $25M server pod, but one cannot utilize that power efficiently with conventional distributed training methods. In this work, we propose Learning@home: a neural network training paradigm designed to handle millions of poorly connected participants. We analyze the performance, reliability, and architectural constraints of this paradigm and compare it against existing distributed training techniques.
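The cost arithmetic behind the claim above (ten thousand desktops at $2,500 each matches the $25M price of the server pod) can be made concrete with a rough back-of-the-envelope estimate. The Python sketch below is purely illustrative: every per-device throughput and per-server cost figure is an assumption chosen for this example, not a number taken from the paper.

# Back-of-the-envelope comparison of aggregate raw compute.
# All per-device and per-server figures below are assumptions for illustration
# (approximate public FP32 peak specs), not values from the paper.

VOLUNTEER_GPU_TFLOPS = 8.0      # assumed: consumer GPU in a ~$2,500 desktop
VOLUNTEER_COUNT = 10_000        # ten thousand volunteer PCs, as in the abstract

SERVER_POD_COST = 25_000_000    # $25M, as in the abstract
SERVER_NODE_COST = 100_000      # assumed: price of one 8-GPU server node
GPUS_PER_SERVER = 8
SERVER_GPU_TFLOPS = 15.7        # assumed: one NVIDIA V100, FP32 peak

volunteer_total = VOLUNTEER_GPU_TFLOPS * VOLUNTEER_COUNT
server_gpu_count = SERVER_POD_COST // SERVER_NODE_COST * GPUS_PER_SERVER
server_total = SERVER_GPU_TFLOPS * server_gpu_count

print(f"Volunteer fleet peak: {volunteer_total:,.0f} TFLOPS")
print(f"$25M server pod peak: {server_total:,.0f} TFLOPS")

Under these assumed numbers the volunteer fleet comes out ahead on raw peak throughput, but peak FLOPS ignore the slow, unreliable connections between volunteers; that communication constraint is exactly what makes conventional distributed training inefficient here and what the proposed paradigm is designed to handle.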
