Micro Batch Streaming: Allowing the Training of DNN Models Using a Large Batch Size on Small Memory Systems

10/24/2021
by DoangJoo Synn, et al.

The size of deep learning models has grown substantially over the past decade. Such models are difficult to train with a large batch size, because commodity machines do not have enough memory to hold both the model and a large batch of data. The batch size is one of the hyper-parameters of the training process, and it is limited by the memory that remains on the target machine after the model has been loaded. A smaller batch size usually degrades training performance. This paper proposes a framework called Micro-Batch Streaming (MBS) to address this problem. MBS provides a batch streaming algorithm that splits a batch into micro-batches sized to fit in the remaining memory and streams them sequentially to the target machine, together with a loss normalization algorithm based on gradient accumulation that preserves training performance. The goal of this method is to let deep learning models train with mathematically determined optimal batch sizes that cannot fit into the memory of the target system.
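Below is a minimal sketch of the micro-batching and loss-normalization idea the abstract describes, assuming a PyTorch-style training loop; the function name train_step_mbs and the parameter micro_batch_size are illustrative, not taken from the paper.

```python
import torch

def train_step_mbs(model, optimizer, loss_fn, batch_x, batch_y, micro_batch_size):
    """One training step over a logical batch too large for device memory.

    The batch is split into micro-batches that fit in the remaining memory
    and streamed through the model sequentially; gradients are accumulated
    across micro-batches before a single optimizer step.
    """
    optimizer.zero_grad()
    num_micro = (batch_x.size(0) + micro_batch_size - 1) // micro_batch_size
    for i in range(num_micro):
        xs = batch_x[i * micro_batch_size:(i + 1) * micro_batch_size]
        ys = batch_y[i * micro_batch_size:(i + 1) * micro_batch_size]
        loss = loss_fn(model(xs), ys)
        # Normalize the loss so the accumulated gradient approximates the
        # gradient of one full-batch step (assumes a mean-reduced loss and
        # roughly equal-sized micro-batches).
        (loss / num_micro).backward()
    optimizer.step()
```

Because gradients from each micro-batch are summed during accumulation, dividing each micro-batch loss by the number of micro-batches keeps the effective update close to a single full-batch update, which is how the streamed micro-batches can reproduce the behavior of a large batch size.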

