On the Universal Approximation Property and Equivalence of Stochastic Computing-based Neural Networks and Binary Neural Networks

03/14/2018
by   Yanzhi Wang, et al.

Large-scale deep neural networks are both memory-intensive and computation-intensive, placing stringent demands on computing platforms. Hardware acceleration of deep neural networks has been extensively investigated in both industry and academia. Binary neural networks (BNNs) and stochastic computing-based neural networks (SCNNs) are particularly appealing for hardware implementation because they can be realized almost entirely with binary operations. Despite these obvious hardware advantages, such approximate computing techniques have been questioned with regard to their accuracy and universal applicability. It is also important to understand the relative pros and cons of SCNNs and BNNs, both in theory and in actual hardware implementations. To address these concerns, we prove in this paper that "ideal" SCNNs and BNNs satisfy the universal approximation property with probability 1 (owing to their stochastic behavior). The proof first establishes the property for SCNNs via the strong law of large numbers, and then uses SCNNs as a "bridge" to extend it to BNNs. Building on the universal approximation property, we further prove that SCNNs and BNNs exhibit the same energy complexity; that is, their asymptotic energy consumption is identical as the network size grows. We also provide a detailed analysis of the pros and cons of SCNNs and BNNs for hardware implementation and conclude that SCNNs are better suited to hardware.
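To give a sense of the intuition behind the strong-law-of-large-numbers argument, the following minimal sketch illustrates basic stochastic computing (not the paper's specific construction): two values in [0, 1] are encoded as unipolar bit streams, their product is computed with a single bitwise AND, and the estimate converges to the exact product as the stream length grows. All function names and the example values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def to_stream(x, length, rng):
    """Encode a value x in [0, 1] as a unipolar stochastic bit stream:
    each bit is 1 with probability x."""
    return rng.random(length) < x

def from_stream(stream):
    """Decode a unipolar bit stream back to a value: the fraction of 1s."""
    return stream.mean()

rng = np.random.default_rng(0)
a, b = 0.6, 0.7

for length in (64, 1024, 65536):
    # In stochastic computing, multiplication of unipolar streams is a bitwise AND.
    product_stream = to_stream(a, length, rng) & to_stream(b, length, rng)
    estimate = from_stream(product_stream)
    print(f"length={length:6d}  estimate={estimate:.4f}  exact={a*b:.4f}")
```

By the strong law of large numbers, the fraction of 1s in the AND-ed stream converges almost surely to a*b as the stream length increases, which mirrors the "with probability 1" flavor of the universal approximation result stated above.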


