Enabling Binary Neural Network Training on the Edge

02/08/2021
by Erwei Wang, et al.

The ever-growing computational demands of increasingly complex machine learning models frequently necessitate the use of powerful cloud-based infrastructure for their training. Binary neural networks are known to be promising candidates for on-device inference due to their extreme compute and memory savings over higher-precision alternatives. However, their existing training methods require the concurrent storage of high-precision activations for all layers, generally making learning on memory-constrained devices infeasible. In this paper, we demonstrate that the backward propagation operations needed for binary neural network training are strongly robust to quantization, thereby making on-the-edge learning with modern models a practical proposition. We introduce a low-cost binary neural network training strategy exhibiting sizable memory footprint and energy reductions while inducing little to no accuracy loss vs Courbariaux & Bengio's standard approach. These resource savings are enabled primarily by retaining activations exclusively in binary format. Used as a drop-in replacement for that standard approach, our method delivers coincident memory requirement and energy consumption reductions of 2–6× while reaching similar test accuracy in comparable time, across a range of small-scale models trained to classify popular datasets. We also demonstrate from-scratch ImageNet training of a binarized ResNet-18, achieving a 3.12× memory reduction. Such savings will allow unnecessary cloud offloading to be avoided, reducing latency, increasing energy efficiency and safeguarding privacy.
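The key observation is that, in a binary neural network, the activations entering each binary layer are already constrained to ±1, so only these binary values, rather than high-precision copies, need to be kept around for backward propagation. The snippet below is a minimal PyTorch-style sketch of that idea, not the authors' implementation: the custom `BinaryLinear` function (an illustrative name) saves its ±1 input in a compact integer format and reuses it to form the weight gradient, with the usual straight-through estimator passing gradients to the latent full-precision weights.

```python
import torch

class BinaryLinear(torch.autograd.Function):
    """Fully connected layer whose inputs are assumed to be already
    binarized to +/-1. Only this binary activation (stored here as int8;
    real savings would come from bit-packing) is kept for the backward
    pass, rather than a high-precision copy."""

    @staticmethod
    def forward(ctx, x_bin, weight):
        # x_bin:   (batch, in_features) with entries in {-1, +1}
        # weight:  (out_features, in_features) latent full-precision weights
        ctx.save_for_backward(x_bin.to(torch.int8), weight)
        w_bin = torch.sign(weight)            # binarize weights on the fly
        return x_bin @ w_bin.t()

    @staticmethod
    def backward(ctx, grad_out):
        x_int8, weight = ctx.saved_tensors
        x_bin = x_int8.to(grad_out.dtype)     # recover the +/-1 activations
        w_bin = torch.sign(weight)
        grad_x = grad_out @ w_bin             # gradient w.r.t. the binary input
        grad_w = grad_out.t() @ x_bin         # weight gradient from binary activations only
        return grad_x, grad_w                 # straight-through to the latent weights


# Illustrative usage: gradients flow while only binary activations are stored.
x = torch.sign(torch.randn(32, 128))              # already-binary activations
w = torch.randn(10, 128, requires_grad=True)      # latent full-precision weights
y = BinaryLinear.apply(x, w)
y.sum().backward()
print(w.grad.shape)                               # torch.Size([10, 128])
```

In a full network, each layer's sign activation feeds the next binary layer, so the same storage trick can be applied layer by layer; only transient, per-layer quantities need higher precision during training.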


