ResBinNet: Residual Binary Neural Network

11/03/2017
by Mohammad Ghasemzadeh, et al.

Recent efforts to train lightweight binary neural networks offer promising execution and memory efficiency. This paper introduces ResBinNet, a composition of two interlinked methodologies that address the slow convergence and limited accuracy of binary convolutional neural networks. The first method, residual binarization, learns a multi-level binary representation for the features within a given neural network layer. The second method, temperature adjustment, gradually binarizes the weights of a particular layer during training. The two methods jointly learn a set of soft-binarized parameters that improve both the convergence rate and the accuracy of binary neural networks. We corroborate the applicability and scalability of ResBinNet by implementing a prototype hardware accelerator. The accelerator is reconfigurable in the numerical precision of the binarized features, offering a trade-off between runtime and inference accuracy.
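For intuition, here is a minimal NumPy sketch of the two ideas described above. The function names, the per-level scales `gammas`, and the tanh-based soft sign are illustrative assumptions for this sketch, not the paper's exact formulation.

import numpy as np

def residual_binarize(x, gammas):
    # Multi-level binary approximation: each level binarizes the residual
    # left over by the previous levels, so the feature is represented as a
    # sum of scaled +/-1 terms:
    #   e_1 = x,  b_i = gamma_i * sign(e_i),  e_{i+1} = e_i - b_i
    residual = x
    approx = np.zeros_like(x)
    for gamma in gammas:                 # one scale per binarization level
        b = gamma * np.sign(residual)
        approx = approx + b
        residual = residual - b
    return approx

def soft_binarize(w, gamma, temperature):
    # Temperature-adjusted soft sign: gamma * tanh(temperature * w) tends
    # to gamma * sign(w) as the temperature grows, so raising the
    # temperature over training gradually hardens the weights toward
    # binary values (assumed tanh form for this sketch).
    return gamma * np.tanh(temperature * w)

# Toy usage: a 3-level feature approximation, and a weight vector pushed
# toward +/- gamma as the temperature rises.
x = np.array([0.9, -0.4, 0.1])
print(residual_binarize(x, gammas=[0.5, 0.25, 0.125]))
for t in (1.0, 5.0, 50.0):
    print(soft_binarize(np.array([0.3, -0.7]), gamma=1.0, temperature=t))

Note how the number of residual levels plays the same role as the accelerator's reconfigurable precision: more levels give a closer approximation of the full-precision features at the cost of more binary operations per inference.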
