Signed Binary Weight Networks

11/25/2022
by Sachit Kuhar, et al.

Efficient inference of Deep Neural Networks (DNNs) is essential to making AI ubiquitous. Two important algorithmic techniques have shown promise for enabling efficient inference: sparsity and binarization. At the hardware-software level, these techniques translate into weight sparsity and weight repetition, enabling the deployment of DNNs under critically low power and latency budgets. We propose a new method, called signed-binary networks, that improves efficiency further by exploiting weight sparsity and weight repetition together while maintaining similar accuracy. Our method achieves accuracy comparable to binary networks on the ImageNet and CIFAR10 datasets and can lead to 69% sparsity. We observe real speedups when deploying these models on general-purpose devices and show that this high percentage of unstructured sparsity can lead to a further reduction in energy consumption on ASICs.
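To make the idea concrete, here is a minimal sketch of a signed-binary-style weight quantizer. It is an illustration based only on the abstract, not the paper's actual algorithm: we assume each filter's weights are mapped to just two values, zero and a single signed scale (so nonzero weights repeat and zeroed weights give unstructured sparsity). The per-filter sign choice and the scaling factor `alpha` below are our own assumptions.

```python
import numpy as np

def signed_binary_quantize(filter_weights):
    """Hypothetical sketch: quantize one filter to {0, +alpha} or {0, -alpha}.

    Restricting a filter to two values yields weight repetition (one repeated
    nonzero value) and weight sparsity (the zeros) at the same time, which is
    the combination the abstract describes. The sign/scale heuristics here
    are assumptions for illustration; the paper's exact scheme may differ.
    """
    w = np.asarray(filter_weights, dtype=np.float64)
    sign = 1.0 if w.mean() >= 0 else -1.0   # assumed per-filter sign choice
    alpha = np.abs(w).mean()                # assumed scaling factor
    # Keep a weight only when its sign agrees with the filter sign; else zero it.
    mask = np.sign(w) == sign
    return np.where(mask, sign * alpha, 0.0)
```

Applied to a small filter such as `[0.5, -0.2, 0.3, 0.1]`, every surviving weight becomes the same value (here `+alpha`) and the rest become zero, so downstream hardware can exploit both the repeated value and the skipped zeros.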

Related research

- Deep Compressed Pneumonia Detection for Low-Power Embedded Devices (11/04/2019): Deep neural networks (DNNs) have been expanded into medical fields and t...
- On the fly Deep Neural Network Optimization Control for Low-Power Computer Vision (09/04/2023): Processing visual data on mobile devices has many applications, e.g., em...
- SparCE: Sparsity aware General Purpose Core Extensions to Accelerate Deep Neural Networks (11/07/2017): Deep Neural Networks (DNNs) have emerged as the method of choice for sol...
- Accelerating Deep Neural Networks via Semi-Structured Activation Sparsity (09/12/2023): The demand for efficient processing of deep neural networks (DNNs) on em...
- Sense: Model Hardware Co-design for Accelerating Sparse Neural Networks (02/01/2022): Sparsity is an intrinsic property of neural networks (NNs). Many software r...
- Dynamic Sparse Graph for Efficient Deep Learning (10/01/2018): We propose to execute deep neural networks (DNNs) with dynamic and spars...
- Parameterized Structured Pruning for Deep Neural Networks (06/12/2019): As a result of the growing size of Deep Neural Networks (DNNs), the gap ...
