Power Optimizations in MTJ-based Neural Networks through Stochastic Computing

08/17/2017
by Ankit Mondal, et al.

Artificial Neural Networks (ANNs) have found widespread applications in tasks such as pattern recognition and image classification. However, hardware implementations of ANNs using conventional binary arithmetic units are computationally expensive, energy-intensive, and have large area overheads. Stochastic Computing (SC) is an emerging paradigm that replaces these conventional units with simple logic circuits and is particularly suitable for fault-tolerant applications. Spintronic devices, such as Magnetic Tunnel Junctions (MTJs), are capable of replacing CMOS in memory and logic circuits. In this work, we propose an energy-efficient use of MTJs, which exhibit probabilistic switching behavior, as Stochastic Number Generators (SNGs); this forms the basis of our NN implementation in the SC domain. Further, the error resilience of target NN applications allows us to introduce Approximate Computing, a framework wherein the accuracy of computations is traded off for substantial reductions in power consumption. We propose approximating the synaptic weights in our MTJ-based NN implementation, in ways enabled by the properties of our MTJ-SNG, to achieve energy efficiency. We design an algorithm that performs such approximations within a given error tolerance in a single-layer NN in an optimal way, owing to the convexity of the problem formulation. We then use this algorithm to develop a heuristic approach for approximating multi-layer NNs. To give a perspective on the effectiveness of our approach, we quantify the power savings brought about by the proposed algorithm.
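To make the SC paradigm described above concrete, the sketch below shows the core idea in software: a value in [0, 1] is encoded as the probability of a 1 in a bitstream, and multiplication of two uncorrelated unipolar streams reduces to a bitwise AND. This is a minimal illustration only; it uses a pseudo-random generator in place of the paper's MTJ-based SNG, and all function names here are hypothetical.

```python
import random

def to_bitstream(p, n, rng):
    # Unipolar SC encoding: each bit is 1 with probability p.
    # An SNG produces such a stream; the paper realizes it with the
    # probabilistic switching of an MTJ (modeled here with a PRNG).
    return [1 if rng.random() < p else 0 for _ in range(n)]

def sc_multiply(x, y):
    # For uncorrelated unipolar streams, a single AND gate per bit
    # computes the product of the encoded values.
    return [a & b for a, b in zip(x, y)]

def value(stream):
    # Decode: the fraction of 1s estimates the encoded value.
    return sum(stream) / len(stream)

rng = random.Random(0)
n = 10_000
a = to_bitstream(0.6, n, rng)
b = to_bitstream(0.5, n, rng)
est = value(sc_multiply(a, b))  # close to 0.6 * 0.5 = 0.3
```

Longer streams shrink the estimation error (roughly as 1/sqrt(n)), which is the accuracy/latency trade-off inherent to stochastic computing.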


Related research

- Correlation Manipulating Circuits for Stochastic Computing (03/01/2018)
- On the Universal Approximation Property and Equivalence of Stochastic Computing-based Neural Networks and Binary Neural Networks (03/14/2018)
- Energy-Efficient Hybrid Stochastic-Binary Neural Networks for Near-Sensor Computing (06/07/2017)
- Neural Network Design for Energy-Autonomous AI Applications using Temporal Encoding (10/15/2019)
- Design Challenges of Neural Network Acceleration Using Stochastic Computing (06/08/2020)
- Machine learning using magnetic stochastic synapses (03/03/2023)
- Significance Driven Hybrid 8T-6T SRAM for Energy-Efficient Synaptic Storage in Artificial Neural Networks (02/27/2016)
