Survey on Large Scale Neural Network Training

02/21/2022
by Julia Gusak, et al.

Modern Deep Neural Networks (DNNs) require significant memory to store weights, activations, and other intermediate tensors during training. Hence, many models do not fit on a single GPU device or can be trained only with a small per-GPU batch size. This survey provides a systematic overview of the approaches that enable more efficient DNN training. We analyze techniques that save memory and make good use of computation and communication resources on architectures with one or several GPUs. We summarize the main categories of strategies and compare strategies within and across categories. Along with approaches proposed in the literature, we discuss available implementations.
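One of the simplest remedies the abstract alludes to, when memory limits force a small per-GPU batch size, is gradient accumulation: process the batch as several micro-batches and sum their suitably weighted gradients before updating. For a loss that averages over samples, the accumulated gradient equals the full-batch gradient. A minimal pure-Python sketch with a hypothetical 1-D linear model (all names here are illustrative, not from the survey):

```python
# Gradient accumulation sketch: a toy 1-D linear model with MSE loss.
# The claim being illustrated: summing micro-batch gradients, each scaled
# by its share of the batch, reproduces the full-batch mean gradient.

def grad_mse(w, xs, ys):
    """Gradient of (1/n) * sum_i (w*x_i - y_i)^2 with respect to w."""
    n = len(xs)
    return sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys)) / n

def accumulated_grad(w, xs, ys, micro_batch_size):
    """Accumulate gradients over micro-batches instead of one large batch."""
    n = len(xs)
    total = 0.0
    for i in range(0, n, micro_batch_size):
        mb_x = xs[i:i + micro_batch_size]
        mb_y = ys[i:i + micro_batch_size]
        # Scale each micro-batch gradient by len(mb)/n so the running sum
        # equals the mean gradient over the full batch.
        total += grad_mse(w, mb_x, mb_y) * len(mb_x) / n
    return total

xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [1.1, 2.0, 2.9, 4.2, 5.1, 5.8]
w = 0.3

full = grad_mse(w, xs, ys)
accum = accumulated_grad(w, xs, ys, micro_batch_size=2)
assert abs(full - accum) < 1e-9  # identical up to floating-point error
```

The same idea underlies the `gradient_accumulation_steps`-style options found in many training frameworks: memory scales with the micro-batch, while the effective batch size stays large.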

Related research

02/25/2016
vDNN: Virtualized Deep Neural Networks for Scalable, Memory-Efficient Neural Network Design
The most widely used machine learning frameworks require users to carefu...

02/02/2023
A Survey on Efficient Training of Transformers
Recent advances in Transformers have come with a huge requirement on com...

04/26/2018
Profile-guided memory optimization for deep neural networks
Recent years have seen deep neural networks (DNNs) becoming wider and de...

06/11/2019
Automatic Model Parallelism for Deep Neural Networks with Compiler and Hardware Support
The deep neural networks (DNNs) have been enormously successful in tasks...

01/18/2022
An efficient and flexible inference system for serving heterogeneous ensembles of deep neural networks
Ensembles of Deep Neural Networks (DNNs) have achieved qualitative predic...

06/17/2023
Breaking On-device Training Memory Wall: A Systematic Survey
On-device training has become an increasingly popular approach to machin...

07/31/2018
Cutting Down Training Memory by Re-fowarding
Deep Neural Networks (DNNs) require huge GPU memory when training on mode...
