Reliability-Aware Deployment of DNNs on In-Memory Analog Computing Architectures

10/02/2022
by Md Hasibul Amin, et al.

Conventional in-memory computing (IMC) architectures consist of analog memristive crossbars that accelerate matrix-vector multiplication (MVM), and digital functional units that realize nonlinear vector (NLV) operations in deep neural networks (DNNs). These designs, however, require energy-hungry signal conversion units, which can dissipate more than 95% of the total power of the system. In-Memory Analog Computing (IMAC) circuits, on the other hand, remove the need for signal converters by realizing both MVM and NLV operations in the analog domain, leading to significant energy savings. However, they are more susceptible to reliability challenges such as interconnect parasitics and noise. Here, we introduce a practical approach for deploying the large matrices in DNNs onto multiple smaller IMAC subarrays, alleviating the impacts of noise and parasitics while keeping the computation in the analog domain.
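As a rough illustration of this deployment idea, the sketch below emulates in NumPy how a large weight matrix can be partitioned into smaller blocks, with each block computing a partial MVM the way an individual IMAC subarray would, and the partial results accumulated. This is a minimal functional sketch, not the paper's deployment algorithm; the 64x64 tile size and the tiled_mvm name are assumptions chosen for the example.

    import numpy as np

    def tiled_mvm(W, x, tile_rows=64, tile_cols=64):
        """Emulate partitioning a large weight matrix across smaller
        IMAC subarrays: each (tile_rows x tile_cols) block performs a
        partial matrix-vector product, and the partial results are
        summed. Tile sizes and names are illustrative assumptions."""
        m, n = W.shape
        y = np.zeros(m)
        for i in range(0, m, tile_rows):
            for j in range(0, n, tile_cols):
                # Each block stands in for one crossbar subarray; in
                # actual IMAC hardware the partial sums would be
                # accumulated in the analog domain, not with digital
                # addition as simulated here.
                y[i:i + tile_rows] += W[i:i + tile_rows, j:j + tile_cols] @ x[j:j + tile_cols]
        return y

    # Sanity check: the tiled deployment matches the monolithic MVM.
    W = np.random.randn(256, 300)
    x = np.random.randn(300)
    assert np.allclose(tiled_mvm(W, x), W @ x)

The motivation for the smaller subarrays is physical rather than mathematical: shorter rows and columns mean each analog signal traverses less interconnect, which limits parasitic voltage drops and noise accumulation along the crossbar wires.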

Related Research

01/29/2022
Interconnect Parasitics and Partitioning in Fully-Analog In-Memory Computing Architectures
Fully-analog in-memory computing (IMC) architectures that implement both...

06/27/2019
Mixed-Signal Charge-Domain Acceleration of Deep Neural Networks through Interleaved Bit-Partitioned Arithmetic
Low-power potential of mixed-signal design makes it an alluring option t...

08/29/2023
OSA-HCIM: On-The-Fly Saliency-Aware Hybrid SRAM CIM with Dynamic Precision Configuration
Computing-in-Memory (CIM) has shown great potential for enhancing effici...

09/02/2021
An Electro-Photonic System for Accelerating Deep Neural Networks
The number of parameters in deep neural networks (DNNs) is scaling at ab...

11/27/2019
Representable Matrices: Enabling High Accuracy Analog Computation for Inference of DNNs using Memristors
Analog computing based on memristor technology is a promising solution t...

08/17/2022
Fuse and Mix: MACAM-Enabled Analog Activation for Energy-Efficient Neural Acceleration
Analog computing has been recognized as a promising low-power alternativ...

07/12/2023
Non-Ideal Program-Time Conservation in Charge Trap Flash for Deep Learning
Training deep neural networks (DNNs) is computationally intensive but ar...
