Interconnect Parasitics and Partitioning in Fully-Analog In-Memory Computing Architectures

01/29/2022
by Md Hasibul Amin, et al.

Fully-analog in-memory computing (IMC) architectures that implement both matrix-vector multiplication and non-linear vector operations within the same memory array have shown promising performance benefits over conventional IMC systems, owing to the removal of energy-hungry signal conversion units. However, keeping the computation of an entire deep neural network (DNN) in the analog domain introduces potential sensitivity to interconnect parasitics. In this paper, we investigate the effect of wire parasitic resistance and capacitance on the accuracy of DNN models deployed on fully-analog IMC architectures. Moreover, we propose a partitioning mechanism that alleviates the impact of the parasitics while keeping the computation in the analog domain, by dividing large arrays into multiple smaller partitions. SPICE circuit simulation results for a 400×120×84×10 DNN model deployed on a fully-analog IMC circuit show 94.84% accuracy on an MNIST classification application with 16, 8, and 8 horizontal partitions and 8, 8, and 1 vertical partitions for the first, second, and third layers of the DNN, respectively, which is comparable to the 97% accuracy of a digital implementation on CPU. These accuracy benefits come at the cost of higher power consumption, due to the extra circuitry required to handle partitioning.
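The partitioning idea can be illustrated numerically: a large crossbar's matrix-vector multiplication is split into sub-array blocks, with horizontal partitions dividing the input rows and vertical partitions dividing the output columns, and the partial results summed. The sketch below is a minimal functional model of this decomposition (function names and the use of NumPy are illustrative assumptions, not the paper's simulation framework), shown for the first layer's 400×120 weight array with 16 horizontal and 8 vertical partitions.

```python
import numpy as np

def partitioned_mvm(W, x, h_parts, v_parts):
    """Functional sketch of partitioned crossbar MVM.

    W: weight matrix of shape (inputs, outputs); x: input vector.
    The array is split into h_parts row blocks (horizontal partitions)
    and v_parts column blocks (vertical partitions); each sub-array
    produces a partial dot product, and partials across horizontal
    partitions are summed to form the output.
    """
    rows, cols = W.shape
    row_blocks = np.array_split(np.arange(rows), h_parts)
    col_blocks = np.array_split(np.arange(cols), v_parts)
    y = np.zeros(cols)
    for rb in row_blocks:
        for cb in col_blocks:
            # each sub-array computes a partial output for its columns
            y[cb] += W[np.ix_(rb, cb)].T @ x[rb]
    return y

# first DNN layer in the paper: 400 inputs -> 120 outputs,
# 16 horizontal and 8 vertical partitions
W = np.random.randn(400, 120)
x = np.random.randn(400)
y_part = partitioned_mvm(W, x, h_parts=16, v_parts=8)
assert np.allclose(y_part, W.T @ x)  # mathematically identical result
```

Mathematically the partitioned result equals the unpartitioned product; the benefit in hardware comes from the shorter wires within each sub-array, which reduces the parasitic voltage drops that this model does not capture.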

Related research

- 10/02/2022: Reliability-Aware Deployment of DNNs on In-Memory Analog Computing Architectures. "Conventional in-memory computing (IMC) architectures consist of analog m..."
- 04/21/2022: MRAM-based Analog Sigmoid Function for In-memory Computing. "We propose an analog implementation of the transcendental activation fun..."
- 10/02/2022: A Python Framework for SPICE Circuit Simulation of In-Memory Analog Computing Circuits. "With the increased attention to memristive-based in-memory analog comput..."
- 05/23/2023: Bulk-Switching Memristor-based Compute-In-Memory Module for Deep Neural Network Training. "The need for deep neural network (DNN) models with higher performance an..."
- 04/18/2023: IMAC-Sim: A Circuit-level Simulator For In-Memory Analog Computing Architectures. "With the increased attention to memristive-based in-memory analog comput..."
- 09/17/2023: Analog Content-Addressable Memory from Complementary FeFETs. "To address the increasing computational demands of artificial intelligen..."
- 05/07/2022: Impact of L1 Batch Normalization on Analog Noise Resistant Property of Deep Learning Models. "Analog hardware has become a popular choice for machine learning on reso..."
