E2-Train: Energy-Efficient Deep Network Training with Data-, Model-, and Algorithm-Level Saving

10/29/2019
by Yue Wang, et al.

Convolutional neural networks (CNNs) are increasingly being deployed on edge devices, so many efforts have been made towards efficient CNN inference on resource-constrained platforms. This paper explores an orthogonal direction: how to conduct more energy-efficient training of CNNs, so as to enable on-device training. We strive to reduce the energy cost during training by dropping unnecessary computations at three complementary levels: stochastic mini-batch dropping at the data level, selective layer update at the model level, and sign prediction for low-cost, low-precision back-propagation at the algorithm level. Extensive simulations and ablation studies, with real energy measurements from an FPGA board, confirm the superiority of the proposed strategies and demonstrate remarkable energy savings for training. For example, when training ResNet-74 on CIFAR-10, we achieve aggressive energy savings of over 90%. When training ResNet-110 on CIFAR-100, an over 84% energy saving is achieved without degrading inference accuracy.
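To make the data- and model-level ideas concrete, below is a minimal PyTorch-style sketch of stochastic mini-batch dropping combined with per-iteration layer freezing. It is only an illustration under assumed names (train_one_epoch, batch_drop_prob, layer_skip_prob), not the authors' implementation; the paper's actual selective layer update and its sign-prediction back-propagation involve additional mechanics not shown here.

# Minimal sketch (illustrative, not the paper's released code) of
# stochastic mini-batch dropping (data level) and a simple form of
# selective layer update via per-iteration layer freezing (model level).
import random
import torch

def train_one_epoch(model, train_loader, optimizer, criterion,
                    batch_drop_prob=0.5, layer_skip_prob=0.3, device="cpu"):
    model.train()
    layers = list(model.children())
    for inputs, targets in train_loader:
        # Data-level saving: with probability batch_drop_prob, skip this
        # mini-batch entirely, so neither forward nor backward pass runs.
        if random.random() < batch_drop_prob:
            continue

        # Model-level saving: freeze a random subset of layers for this
        # iteration so their weights receive no gradient update.
        frozen = [lyr for lyr in layers if random.random() < layer_skip_prob]
        for lyr in frozen:
            for p in lyr.parameters():
                p.requires_grad_(False)

        inputs, targets = inputs.to(device), targets.to(device)
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()

        # Unfreeze so the next iteration can sample a different subset.
        for lyr in frozen:
            for p in lyr.parameters():
                p.requires_grad_(True)

In this sketch, skipped mini-batches cost no computation at all, while frozen layers still participate in the forward pass but receive no weight update for that iteration.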

