Fuse and Mix: MACAM-Enabled Analog Activation for Energy-Efficient Neural Acceleration

08/17/2022
by Hanqing Zhu, et al.

Analog computing has been recognized as a promising low-power alternative to its digital counterparts for neural network acceleration. However, conventional analog computing mostly operates in a mixed-signal manner, and the cost of tedious analog/digital (A/D) conversion significantly limits the overall system's energy efficiency. In this work, we devise an efficient analog activation unit based on magnetic tunnel junction (MTJ)-based analog content-addressable memory (MACAM), which simultaneously realizes nonlinear activation and A/D conversion in a fused fashion. To compensate for the nascent and therefore currently limited representation capability of MACAM, we propose to mix our analog activation unit with the digital activation dataflow. A fully differentiable framework, SuperMixer, is developed to search for an optimized activation workload assignment that adapts to various activation energy constraints. The effectiveness of our proposed methods is evaluated on a silicon photonic accelerator. Compared to a standard activation implementation, our mixed activation system with the searched assignment can achieve competitive accuracy with >60
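
The paper does not include code here; as a rough illustration of the mixed analog/digital activation idea, the following is a minimal PyTorch sketch. All names (MixedActivation, alpha, the quantized-ReLU stand-in for the MACAM transfer curve, the energy costs) are hypothetical assumptions, not the authors' implementation. It relaxes a per-channel analog-vs-digital assignment into a sigmoid-weighted mixture and penalizes the expected activation energy, loosely mirroring how a differentiable search such as SuperMixer could trade accuracy against an energy budget.

```python
import torch
import torch.nn as nn


class MixedActivation(nn.Module):
    """Hypothetical sketch: softly mix a coarse 'analog' activation
    (a stand-in for a MACAM-based unit with limited resolution) with an
    exact digital ReLU, using a learnable per-channel assignment."""

    def __init__(self, num_channels, analog_levels=8,
                 analog_cost=1.0, digital_cost=4.0):
        super().__init__()
        # Unconstrained logits; sigmoid(alpha) ~ probability of using the analog unit.
        self.alpha = nn.Parameter(torch.zeros(num_channels))
        self.analog_levels = analog_levels
        self.analog_cost = analog_cost      # assumed relative energy per analog activation
        self.digital_cost = digital_cost    # assumed relative energy per digital activation + A/D

    def analog_act(self, x):
        # Crude stand-in for a MACAM transfer curve: a clipped, few-level
        # quantized ReLU (limited representation capability).
        y = torch.clamp(x, 0.0, 1.0)
        q = torch.round(y * (self.analog_levels - 1)) / (self.analog_levels - 1)
        # Straight-through estimator so gradients flow through the quantizer.
        return y + (q - y).detach()

    def forward(self, x):
        # x: (batch, channels, ...); mix the two activation paths per channel.
        p = torch.sigmoid(self.alpha).view(1, -1, *([1] * (x.dim() - 2)))
        y = p * self.analog_act(x) + (1.0 - p) * torch.relu(x)
        # Expected activation energy under the soft assignment.
        prob = torch.sigmoid(self.alpha)
        energy = (prob * self.analog_cost + (1.0 - prob) * self.digital_cost).sum()
        return y, energy


# Toy usage: penalize energy alongside a task loss (weights are illustrative).
act = MixedActivation(num_channels=16)
x = torch.randn(4, 16, 8)
y, energy = act(x)
loss = y.pow(2).mean() + 1e-3 * energy
loss.backward()
```

In a real flow, the soft assignment probabilities would presumably be annealed or thresholded into a hard per-channel (or per-layer) analog/digital assignment that meets the target activation energy budget; the above only illustrates the differentiable relaxation.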

